Jul 21, 2015
 

Outer Places – July 21st, 2015

 


 

Are we on our way to bionic replacement organs? British scientists have just successfully implanted a bionic eye for the first time, allowing an 80-year-old legally blind man to see the world around him with a video camera.

Eighty-year-old Ray Flynn suffers from an age-related condition called macular degeneration, which caused him to lose the central vision in one eye. He can still see out of the periphery of that eye, but there’s a large black hole in the center of his vision that has grown over the years. There is no treatment for the condition, but with the bionic implant and a pair of special video glasses, he can now make out obstacles in front of him.

The implant, called the Argus II, consists of an array of electrodes attached directly to the eye. A miniature video camera mounted on special glasses transmits images to the electrodes in the form of electrical pulses, and the electrodes in turn stimulate the remaining cells in the retina to send the information to the brain.
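
As a rough illustration of the kind of processing involved, the sketch below downsamples a grayscale camera frame onto a coarse electrode grid and maps brightness to stimulation levels. The grid size, the averaging, the linear brightness-to-stimulation mapping, and the function name are illustrative assumptions, not the Argus II’s actual (proprietary) video processing.

```python
import numpy as np

def frame_to_stimulation(frame, grid=(6, 10), max_level=1.0):
    """Reduce a grayscale camera frame to per-electrode stimulation levels.

    Illustrative sketch only: the grid size and the brightness-to-stimulation
    mapping are assumptions, not the device's real signal chain.
    """
    h, w = frame.shape
    gh, gw = grid
    # Average the pixels that fall inside each electrode's patch of the image.
    patchwise = frame[: h - h % gh, : w - w % gw].reshape(gh, h // gh, gw, w // gw)
    levels = patchwise.mean(axis=(1, 3))
    # Normalize brightness to [0, 1] and scale to a stimulation amplitude.
    span = levels.max() - levels.min()
    levels = (levels - levels.min()) / (span + 1e-9)
    return levels * max_level

# A synthetic 240x320 frame containing a bright vertical bar, the kind of
# high-contrast pattern used in the post-surgery tests.
frame = np.zeros((240, 320))
frame[:, 140:180] = 255.0
print(frame_to_stimulation(frame).round(2))
```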

In a test two weeks after the surgery, Flynn could detect horizontal and vertical lines on a computer. The implant even allows him to see with his eyes closed, as the doctors had him close his eyes during the test in order to ensure the information was coming from the camera.

 


 

The Argus II is not yet widely available in the UK, although the researchers hope that after a few more successful tests they will be able to get it onto the NHS. “Mr. Flynn’s progress is truly remarkable; he is seeing the outline of people and objects very effectively,” said lead surgeon Paulo Stanga, of the University of Manchester. “I think this could be the beginning of a new era for patients with sight loss.”

 

Flynn has already begun to return to activities that require his central sight, such as gardening and watching television.

“Your eyes are the most precious thing,” said Flynn. “My brain is still trying to catch up and work out what is going on, but I have been told it will continue to get better.”

Over time, Stanga claims, Flynn should get accustomed to the implant and be able to interpret the images sent to his brain more clearly, so he can see almost exactly what the video camera is seeing.

This technology could have an enormous impact on the quality of life of people with macular degeneration, and while it is specific to forms of sight loss that leave some healthy retinal cells, the successful implantation of a bionic eye, or any bionic organ, is a major breakthrough.

“We hope these patients will develop some central visual function which they can work in alongside and complement their peripheral vision,” said Stanga. “We are very excited by this trial and hope that this technology might help people, including children with other forms of sight loss.”

For better and for worse (but mostly for better), this could be the beginning of a whole new landscape for medical treatment. Once we start fusing organic bodily functions with machines, we’ll be on the road towards turning the humans of the future into full-fledged cyborgs. If we could create other bionic organs, like bionic hearts, lungs, and kidneys, then we could extend the human lifespan by many, many years.

Jul 21, 2015
 

Outer Places – July 21st, 2015

 

 

 


Did we just get one step closer to the singularity? Chinese researchers claim they’ve developed an AI that scored higher than the average human on the verbal portion of an intelligence test.

According to the authors, the verbal section of an IQ test is more difficult for an AI to complete accurately than the quantitative section, because the verbal portion requires knowledge of the nuances of polysemous words (words with multiple meanings) and the relationships between them. Usually, computer programs utilize word embedding techniques that allow the AI to learn just one vector per word, which is insufficient to pass this sort of test.

 

In this study, the researchers claim to have invented a new “knowledge-powered” type of word embedding that allows the AI to adjust its strategy depending on what type of question is being asked and to take into account the relationships between different words. Using this new method, the AI was not only able to complete the IQ test satisfactorily but also scored slightly higher than the average human.

The authors wrote in their paper: “The results are highly encouraging, indicating that with appropriate uses of the deep learning technologies, we could be a further small step closer to the human intelligence.”
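
To make the single-vector limitation concrete, here is a toy sketch of how a verbal analogy question might be answered when each word is allowed several sense vectors, which is the general idea behind multi-sense embeddings. The vectors, the word list, and the scoring rule are invented for illustration and are not the authors’ model.

```python
import numpy as np
from itertools import product

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-9))

def solve_analogy(senses_a, senses_b, senses_c, candidates):
    """Answer 'a is to b as c is to ?' when each word may have several
    sense vectors. For every combination of senses we form the offset
    b - a + c and pick the candidate whose best-matching sense is closest."""
    best_word, best_score = None, -2.0
    for a, b, c in product(senses_a, senses_b, senses_c):
        target = b - a + c
        for word, senses in candidates.items():
            score = max(cosine(target, s) for s in senses)
            if score > best_score:
                best_word, best_score = word, score
    return best_word

# Toy 3-d vectors; 'bank' gets two senses to stand in for a polysemous word.
senses = {
    "king":  [np.array([1.0, 0.9, 0.1])],
    "queen": [np.array([1.0, 0.1, 0.9])],
    "man":   [np.array([0.2, 0.9, 0.1])],
    "woman": [np.array([0.2, 0.1, 0.9])],
    "bank":  [np.array([0.9, 0.0, 0.0]), np.array([0.0, 0.0, 0.9])],
}
print(solve_analogy(senses["king"], senses["queen"], senses["man"],
                    {w: senses[w] for w in ("woman", "bank")}))  # -> "woman"
```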

 

These results are, in fact, groundbreaking: the most successful previous attempt at an AI IQ test measured the program as having the intelligence of a human four-year-old, while this AI was measured as being as intelligent as the average human. That being said, there are several qualifications to consider. First, many experts believe that the IQ test itself is flawed, as it only measures a normative type of intelligence and doesn’t take into account other kinds of mental ability, such as creativity or emotional intelligence. Some dispute it as a measure of intelligence altogether, since intelligence is a difficult concept to define and test bias has been demonstrated against certain marginalized groups. But even if it’s not exactly a measure of intelligence, most experts contend that it is correlated with intelligence, so these results should still be taken very seriously.

 

 


This study has so far only been published on an online database and has not yet been accepted by a scientific journal. Still, experts not involved in the study have praised the findings, even as they caution that this technology is in its infancy and we’re nowhere near HAL 9000 yet. Hannaneh Hajishirzi, an electrical engineer and computer scientist at the University of Washington in Seattle, told Business Insider that the researchers in this study “got interesting results in terms of comparison with humans on this test of verbal questions,” but that “we’re still far away from making a system that can reason like humans.”

Robert Sloan, a computer scientist at the University of Illinois at Chicago, similarly acknowledged that this AI was a small step forward, but claimed there was no guarantee the program would perform as well as a human on an open-ended test, as opposed to a multiple-choice one. In the field of artificial intelligence research, Sloan said, “the places where so far we’ve seen very little progress have to do with open dialogue and social understanding.” For example, a normal human child would be expected to understand that if he or she sees an adult lying in the street, he or she should call for help. “Right now, there’s no way you could write a computer program to do that.”

Jul 21, 2015
 

Outer Places – July 21st, 2015

 


Is immortality within our reach? Maybe not yet, but we are definitely trying. While the new film “Self/Less” offers an interesting science-fiction take on achieving immortality, various advances have been taking place in the very real scientific community. We may have a long way to go before we can transfer our consciousness into Ryan Reynolds’ body, but science is working pretty hard on some fascinating real-world approaches to immortality:

 

 

Anti-Aging Genetic Engineering

Maybe someday anti-aging research will really reverse aging and keep us young forever, but until that day, current discoveries are at least helping to slow down specific aspects of the aging process. This spring, scientists at UC Berkeley found that a drug called an Alk5 kinase inhibitor helped restore brain and muscle tissue to more youthful levels in tests on mice by acting on stem cells. The Alk5 kinase inhibitor limits the release of TGF-beta1, a chemical that restricts a stem cell’s ability to repair the body and tends to be over-produced as people age. By restricting its release, it is hoped that the Alk5 kinase inhibitor can keep people healthier in old age by delaying the onset of aging-related diseases such as Alzheimer’s, increasing quality of life and cutting down medical costs.

The inhibitor is currently in trials as an anticancer agent, and the hope is that one day death will come not as the result of a prolonged, painful disease but through quicker, more natural means such as cardiac arrest or stroke. Here’s what Irina Conboy, one of the scientists at UC Berkeley, said about the motivations behind the team’s efforts:

 

“The goals of my colleagues and I are not to live forever. Instead of becoming old and becoming a burden on society, we can age ourselves more with integrity.”

Regenerative Medicine

One of the main goals of regenerative medicine has been developing the ability to produce hematopoietic stem cells (HSCs) suitable for blood cell transplants, better known as bone marrow transplants. These transplants are limited by the challenge of finding a good match and by the rarity of naturally occurring HSCs, but in 2014 researchers at the Harvard Stem Cell Institute reprogrammed mature blood cells in mice back into HSCs, reversing the normal progression from stem cell to progenitor to mature effector cell.

Tests have not yet been performed on human subjects, but the progress seen so far is enough to make Stuart Orkin, of the Harvard Stem Cell Institute, feel very confident about the future.

 

“This discovery could have a radical effect on transplantation. You could have gene-matched donors, you could use a patient’s own cells to create iHSCs. It’s a long way off, but this is a good step in the right direction.”

But that’s not the only advance in stem cell research. This year, scientists at the Salk Institute discovered a type of stem cell whose identity is tied to its location in the developing embryo rather than to its time-related stage of development. These region-selective pluripotent stem cells (rsPSCs) are easier to grow in the laboratory, offer advantages for gene editing, and, unlike conventional stem cells, have the ability to integrate into modified mouse embryos.

As Jun Wu, a postdoctoral researcher, describes, understanding the spatial characteristics of the stem cells “could be crucial to generate functional and mature cell types for regenerative medicine.” It could well be that in the near future, parts of the body that have degenerated due to age could be regenerated at will by the introduction of these fascinating stem cells.

Nanomedicine

We have previously featured nanobots in medicine, but there are many more theoretical uses of nanomedicine that could someday affect our lifespan. According to Frank Boehm, author of “Nanomedical Device and Systems Design: Challenges, Possibilities, Visions,” a conceptual Vascular Cartographic Scanning Nanodevice could scan the entire human vasculature down to the capillary level and transfer the image to a Pixel Matrix display, holograph, or virtual reality system, allowing for a detailed inspection of the system to find aneurysm risks, especially in the brain.

A nanodevice imbued with data on toxins and pathogens could be used to enhance the human immune system by recognizing and destroying invasive agents. Nanotechnology could also be used to remove lipofuscin, a waste product that accumulates in lysosomes, impairing cell function and contributing to age-related conditions. All of these technologies are speculative, but nanobots are already lengthening our lives in tests to fight cancer, and many believe such technologies are truly the future of the medical industry.

Digital Immortality

At Transhuman Vision 2014, Randal Koene, a neuroscientist and neuro-engineer, described his plan to upload his brain to a computer by “mapping the brain, reducing its activity to computations, and reproducing these computations in code.” While it sounds remarkably like that awful Johnny Depp movie, Transcendence, Koene and many other neuroscientists believe that our memories, emotions, consciousness, and more are just the sum of electrochemical signals jumping from synapse to synapse.

Computer programmers have already created artificial neural networks that can form associations and learn through pattern recognition, though they don’t yet possess the complexity of the human brain. However, if our consciousness is simply the product of brain activity, and if technology can record and analyze that activity, then it could conceivably be reduced to computations. Advances have already been made in animal tests: in 2011, a team from the University of Southern California and Wake Forest University created the first artificial neural implant, a device that produces electrical activity that causes a rat to react as though the signal came from its own brain.
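
For a flavor of what “forming associations through pattern recognition” means in practice, here is a minimal, self-contained sketch of a classic associative memory (a Hopfield-style network). It is purely illustrative; it has nothing like the complexity of a brain, and it is not the implant described above.

```python
import numpy as np

class HopfieldMemory:
    """A tiny associative memory: store binary patterns, then recall the
    closest stored pattern from a corrupted cue. Purely illustrative."""

    def __init__(self, size):
        self.W = np.zeros((size, size))

    def store(self, patterns):
        # Hebbian learning: strengthen connections between co-active units.
        for p in patterns:
            p = np.asarray(p, dtype=float)
            self.W += np.outer(p, p)
        np.fill_diagonal(self.W, 0.0)

    def recall(self, cue, steps=10):
        s = np.asarray(cue, dtype=float)
        for _ in range(steps):
            s = np.where(self.W @ s >= 0, 1.0, -1.0)  # threshold each unit
        return s

# Store two +1/-1 patterns, then recall one from a version with a flipped bit.
a = np.array([1, -1, 1, -1, 1, -1, 1, -1])
b = np.array([1, 1, 1, 1, -1, -1, -1, -1])
memory = HopfieldMemory(8)
memory.store([a, b])
noisy = a.copy()
noisy[0] = -1                                   # corrupt the cue
print(np.array_equal(memory.recall(noisy), a))  # True: the pattern is recovered
```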

Cyborgization

While it may sound the most sci-fi of all these scenarios, cyborg technology is already a part of our lives. We have artificial retinas, cochlear implants, pacemakers, and even deep-brain implants to alleviate the symptoms of Parkinson’s. In fact, the list of real world cyborg technologies is seemingly endless, so much so that we’ve had to reduce it to bullet form. Below you’ll find a few ways that humans and electronics have merged in beautiful harmony:

– Neurobridge technology reconnected a paralyzed man’s brain to his body.

– The Eyeborg: Canadian filmmaker Rob Spence lost his right eye in a shotgun accident and replaced it with a video camera that transmits what he’s seeing to a computer.

– Programmer Amal Graafstra has inserted radio-frequency identification chips in his hands connected to scanners on his doors and laptop, eliminating the need for keys or passwords.

 

– “Transhumanists” advocate for cyborgization, genetic engineering, and synthetic biology to enhance our intelligence, health, and lifespans, transforming humanity into a “post-human” stage.

Current advances in anti-aging, regenerative medicine, nanomedicine, digital immortality, and cyborgization may only be focusing on prolonging life at the moment. But these technologies have already improved our lives, and as the possibility of immortality is played out on the movie screen, we can see the world of fiction slowly melding with our own reality.

Jul 21, 2015
 

Science Daily – July 20, 2015

 


 

A pioneering new technique to produce high-quality, low-cost graphene could pave the way for the development of the first truly flexible ‘electronic skin’ that could be used in robots.

Researchers from the University of Exeter have discovered an innovative new method to produce the wonder material graphene significantly more cheaply and easily than previously possible.

The research team, led by Professor Monica Craciun, have used this new technique to create the first transparent and flexible touch-sensor that could enable the development of artificial skin for use in robot manufacturing. Professor Craciun, from Exeter’s Engineering department, believes the new discovery could pave the way for “a graphene-driven industrial revolution” to take place.

She said: “The vision for a ‘graphene-driven industrial revolution’ is motivating intensive research on the synthesis of high quality and low cost graphene. Currently, industrial graphene is produced using a technique called Chemical Vapour Deposition (CVD). Although there have been significant advances in recent years in this technique, it is still an expensive and time consuming process.”

 


The Exeter researchers have now discovered a new technique, which grows graphene in an industrial cold wall CVD system, a state-of-the-art piece of equipment recently developed by UK graphene company Moorfield.

This so-called nanoCVD system is based on a concept already used for other manufacturing purposes in the semiconductor industry. For the first time, it shows the semiconductor industry a way to potentially mass-produce graphene with existing facilities rather than building new manufacturing plants. The new technique grows graphene 100 times faster than conventional methods, reduces costs by 99%, and produces material with enhanced electronic quality.

These research findings are published in the scientific journal, Advanced Materials.

Dr Jon Edgeworth, Technical Director at Moorfield said: “We are very excited about the potential of this breakthrough using Moorfield’s technology and look forward to seeing where it can take the graphene industry in the future.”

Professor Seigo Tarucha from the University of Tokyo, coordinator of the Global Center of Excellence for Physics at Tokyo University and director of the Quantum Functional System Research Group at the RIKEN Center for Emergent Matter Science, said: “The ability to manufacture high quality, large area graphene (at a low cost) is essential for advancing this exciting material from pure science and proof-of-concept into the realm of conventional and quantum electronic applications. After starting the collaboration with Professor Craciun’s group, we are using Exeter CVD grown graphene instead of the exfoliated material in our graphene-based devices, whenever possible.”

The research team used this new technique to create the first graphene-based transparent and flexible touch sensor. The team believes that the sensors can be used not just to create more flexible electronics, but also a truly-flexible electronic skin that could be used to revolutionise robots of the future.

Dr Thomas Bointon, from Moorfield Nanotechnology and former PhD student in Professor Craciun’s team at Exeter added: “Emerging flexible and wearable technologies such as healthcare electronics and energy-harvesting devices could be transformed by the unique properties of graphene. The extremely cost efficient procedure that we have developed for preparing graphene is of vital importance for the quick industrial exploitation of graphene.”

At just one atom thick, graphene is the thinnest substance capable of conducting electricity. It is very flexible and is one of the strongest known materials. The race has been on for scientists and engineers to adapt graphene for flexible electronics.

Professor Saverio Russo, co-author and also from the University of Exeter, added: “This breakthrough will nurture the birth of new generations of flexible electronics and offers exciting new opportunities for the realization of graphene-based disruptive technologies. ”

In 2012, the teams of Professor Craciun and Professor Russo, from the University of Exeter’s Centre for Graphene Science, discovered that sandwiching molecules of ferric chloride between two graphene layers creates a whole new material, dubbed GraphExeter, which is the best known transparent material able to conduct electricity. The same team have recently discovered that GraphExeter is also more stable than many transparent conductors commonly used by, for example, the display industry.

Jul 21, 2015
 

io9 – July 20th, 2015


The prospect of uploading your brain into a supercomputer is an exciting one — your mind can live on forever and expand its capacity in ways that are hard to imagine. But it leaves out one crucial detail: your mind still needs a body to function properly, even in a virtual world. Here’s what we’ll have to do to emulate a body in cyberspace.

We are not just our brains. Conscious awareness arises from more than just raw calculations. As physical creatures who emerged from a material world, our brains allow us to survey and navigate through it; bodies are the fundamental medium for perception and action. Without an environment — along with the ability to perceive and work within it — there can be no subjective awareness. Brains need bodies, whether the brain resides in a vertebrate, a robot, or, in the future, an uploaded mind.

In the case of an uploaded mind, however, the body doesn’t have to be real. It just needs to be an emulation of one. Or, more specifically, it needs to be a virtual body that confers all the critical functions of a corporeal body, such that an uploaded or emulated mind can function optimally within its given virtual environment. It’s an open question whether uploading is possible at all, but if it is, the potential benefits are many. Knowing which particular features of the body need to be reconstructed in digital form, however, is not a simple task. So, to help me work through this futuristic thought experiment, I recruited the help of neuroscientist Anders Sandberg, a researcher at the University of Oxford’s Future of Humanity Institute and co-author of Whole Brain Emulation: A Roadmap. Sandberg has spent a lot of time thinking about how to build an emulated brain, but for the purposes of this article, we looked exclusively at those features outside the brain that need to be digitally re-engineered.

Emulated Embodied Cognition

Traditionally, this area of research is called embodied cognition. But given that we’re speculating about the realm of 1s and 0s, it would be more accurate to call it virtual or emulated embodied cognition. Thankfully, many of the concepts that apply to embodied cognition apply to this discussion as well. Philosophers and scientists have known for some time that the brain needs a body. In his 1950 article, “Computing Machinery and Intelligence,” AI pioneer Alan Turing wrote:

It can also be maintained that it is best to provide the machine with the best sense organs that money can buy, and then teach it to understand and speak English. That process could follow the normal teaching of a child. Things would be pointed out and named, etc.

Turing was talking about robots, but his insights are applicable to the virtual realm as well. Similarly, roboticist Rodney Brooks has said that robots could be made more effective if they plan, process, and perceive as little as possible. He recognized that constraint in these areas would likewise constrain capacity, thus making the behavior of robots more controllable by their creators (i.e., computational intelligence governed by a bottom-up approach instead of by superfluous and complicated internal algorithms and representations).

Indeed, though cognition happens primarily (if not exclusively) in the brain, the body transmits critical information to it in order to fuel subjective awareness. A fundamental premise of embodied cognition is the idea that the motor system influences cognition, along with sensory perception and chemical and microbial factors. We’ll take a look at each of these in turn as we build our virtual body.

More recently, AI theorist Ben Goertzel has tried to create a cognitive architecture for robotic and virtual embodied cognition, which he calls OpenCog. His open-source intelligence framework seeks to define the variables that will give rise to human-equivalent artificial general intelligence. Though Goertzel’s primary concern is giving an AI a sense of embodiment and environment, his ideas fit in nicely with whole brain emulation as well.

The Means of Perception

A key aspect of the study of embodied cognition is the notion that physicality is a precondition of our intelligence. To a non-trivial extent, our subjective awareness is influenced by the motor and sensory feedback fed to the brain by our physical bodies. Consequently, our virtual bodies will need to account for motor control in a virtual environment, while also providing for all the senses: sight, smell, hearing, touch, and taste. Obviously, the digital environment will have to produce these stimuli if they’re to be perceived by a virtual agent.

For example, we use our tactile senses a fair bit to interact with the world. “If objects do not vibrate as a response to our actions, we lose much of our sense of what they are,” says Sandberg. “Similarly, we use the difference in sound due to the shape of our ears to tell what directions they come from.” So, in a virtual reality environment, directional hearing could be handled using clever sound processing rather than by simulating the mechanics of the outer ear. Sandberg says we’ll likely need some exceptionally high-resolution simulations of the parts of the world we interact with.

As an aside, he says this is also a concern when thinking about the treatment of virtual lab animals. Not giving virtual mice or dogs a good sense of smell would impair their virtual lives, since they are very smell-oriented creatures. While we know a bit about how to simulate smells, we don’t know much about how things smell to rodents — and the rodent sense of smell can be tricky to measure.
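
To give a sense of what “clever sound processing” might look like, the sketch below fakes directional hearing with only an interaural time delay and a level difference between the two ears, instead of modelling the outer ear. The delay formula, the 6 dB level cap, and the function name are illustrative assumptions; real systems typically convolve the source with measured head-related transfer functions (HRTFs).

```python
import numpy as np

def spatialize(mono, azimuth_deg, sr=44100, head_radius=0.0875, c=343.0):
    """Crude binaural rendering from azimuth alone: apply an interaural time
    delay and level difference rather than simulating the outer ear.
    Simplified stand-in for HRTF-based spatial audio."""
    az = np.radians(azimuth_deg)
    itd = head_radius / c * (abs(az) + abs(np.sin(az)))   # interaural delay, seconds
    delay = int(round(itd * sr))
    gain = 10 ** (abs(np.sin(az)) * 6 / 20)               # near ear up to ~6 dB louder
    near = mono * gain
    far = np.concatenate([np.zeros(delay), mono])[: len(mono)]
    left, right = (far, near) if azimuth_deg >= 0 else (near, far)
    return np.stack([left, right], axis=1)                # (n_samples, 2) stereo buffer

# A 0.5 s, 440 Hz tone placed 60 degrees to the listener's right.
t = np.arange(0, 0.5, 1 / 44100)
stereo = spatialize(np.sin(2 * np.pi * 440 * t), azimuth_deg=60)
print(stereo.shape)  # (22050, 2)
```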