Erik Sorto, 34, has been paralysed from the neck down for the past 13 years. However, thanks to a ground-breaking clinical trial, he has been able to smoothly drink a bottle of beer using a robotic arm controlled with his mind. He is the first patient to have had a neural prosthetic device implanted in a region of the brain thought to control intentions. The technology created surprisingly natural movements and has the potential to work for multiple robotic limbs.
Tragically, through illness or injury, millions of individuals have lost the ability to sense and move their bodies. In recent years, a handful of studies have shown that it is possible to record brain activity from such individuals and use this information to restore movement capabilities. Signals recorded from primary motor cortex – a part of the brain that is necessary for the control of movement – have been used to control external devices such as a cursor on a computer screen, and even different kinds of robotic arms.
However, these kinds of devices often produce delayed, shaky movement, suggesting that these brain signals may not be the best ones to use. As research in this area moves forward, new possibilities are being explored. In the latest study, recordings from neurons – brain cells – in a part of the brain called the posterior parietal cortex, which is believed to represent action intentions, produced far more intuitive movement – even enabling Sorto to play rock, paper, scissors.
Sorto drinks a beer on his own for the first time in over a decade. Credit: Keck Medicine of USC
Reading intentions
Before we choose to act, we must first intend to act. For the routine actions of everyday life, such as grasping our favourite coffee mug or flicking on a light switch, our action intentions unfold so effortlessly, and within fractions of a second, that they typically go unnoticed.
Despite these fleeting and effortless qualities, a wealth of evidence from basic psychology and neuroscience suggests that our action intentions constitute an incredibly rich source of information, including desired outcomes and predicted sensory and motor consequences. While it is generally well accepted that this information is useful for the control of actions, the hypothesis is inherently difficult to evaluate.
In the new study, the information recorded from neurons in the posterior parietal cortex was used to control a computer interface and visual display. The researchers first used functional magnetic resonance imaging to identify the relevant brain areas in Sorto, who had suffered a spinal cord injury from a gunshot 10 years before the study and, as a result, is unable to move or feel his arms, legs, and torso. This allowed them to see which areas of his brain were active when he imagined reaching and grasping movements.
With the recording instruments in place, the results showed that Sorto could voluntarily control the activity of single neurons. Some of these cells responded in a remarkably specific way: a cell might increase or decrease its activity when Sorto imagined moving his hand to his mouth, but not when he imagined moving it to his chin or to his ear.
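To give a flavour of how such selective activity could be turned into a control signal, here is a minimal Python sketch of an "intention decoder". It is purely illustrative: the cell counts, firing rates, movement labels and the nearest-pattern matching are all invented for this example and are not the decoding method used in the study.

```python
# Illustrative sketch only: a toy "intention decoder" that maps firing rates
# from a few hypothetical neurons to an imagined movement target.
import numpy as np

# Hypothetical average firing rates (spikes/s) for three recorded cells,
# measured while the participant imagines each movement during calibration.
calibration = {
    "hand_to_mouth": np.array([42.0, 8.0, 15.0]),
    "hand_to_chin":  np.array([12.0, 30.0, 14.0]),
    "hand_to_ear":   np.array([10.0, 9.0, 38.0]),
}

def decode_intention(firing_rates):
    """Return the calibrated target whose firing-rate pattern is closest
    (in Euclidean distance) to the newly observed pattern."""
    distances = {
        target: np.linalg.norm(firing_rates - pattern)
        for target, pattern in calibration.items()
    }
    return min(distances, key=distances.get)

# A new observation in which cell 1 is highly active and the others are quiet,
# so the decoder infers the "hand to mouth" intention.
observed = np.array([40.0, 10.0, 16.0])
print(decode_intention(observed))  # -> hand_to_mouth
```

The point of the sketch is simply that cells tuned to different imagined movements produce distinguishable activity patterns, and a decoder only needs to tell those patterns apart.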
Key to an entire robotic body?
The activity of some cells was specifically related to either intended arm movements or intended eye movements. Of the cells selective for arm movements, some responded to intended movements of either arm, while others responded only to intended right-arm but not left-arm movements, or vice versa. The authors suggest that information from cells representing either hand may be important for controlling actions involving both hands – which also means these signals might be capable of controlling a robotic interface with multiple appendages.
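If some recorded cells really do carry information about which effector an intention concerns, one could in principle route each decoded command to the matching robotic appendage. The sketch below is hypothetical – the effector labels and the RoboticLimb class are invented for illustration and do not describe the study's hardware or software.

```python
# Illustrative sketch only: dispatching decoded intentions to one of several
# robotic appendages based on which effector the intention refers to.
from dataclasses import dataclass

@dataclass
class RoboticLimb:
    name: str

    def move_towards(self, target: str) -> None:
        print(f"{self.name} moving towards {target}")

# One robotic appendage per effector the decoder can distinguish.
appendages = {
    "left_arm":  RoboticLimb("left robotic arm"),
    "right_arm": RoboticLimb("right robotic arm"),
}

def dispatch(decoded_effector: str, decoded_target: str) -> None:
    """Send a decoded intention to whichever appendage it refers to."""
    limb = appendages.get(decoded_effector)
    if limb is None:
        raise ValueError(f"No appendage registered for {decoded_effector!r}")
    limb.move_towards(decoded_target)

# A decoder output of ("right_arm", "cup") would drive only the right limb.
dispatch("right_arm", "cup")
```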
These signals may also be flexible. Consider the different ways we use hand-held tools of various shapes and sizes – even ones that move and are therefore controlled differently – to perform essentially the same actions and achieve the same goals. For example, we can write our name on paper with a pencil or in the sand with a stick: the goal of the action is the same, but the tools used to achieve it differ.
To accomplish this, the control system must be able to accommodate the changing properties of tools and the environment. Perhaps, because the recorded cells appear to reflect action intentions rather than the details of particular movements, the information they capture is similarly malleable. This may prove essential if the same brain signals are to control a wide range of devices and perform a variety of actions.
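One way to picture "the same intention driving different devices" is a shared software interface that each device implements in its own way. This is only a conceptual sketch under that assumption – the class and method names are hypothetical and not part of the study.

```python
# Illustrative sketch only: the decoded goal stays the same; each device
# supplies its own low-level control behind a common interface.
from abc import ABC, abstractmethod

class IntentionDriven(ABC):
    """Anything that can act on a high-level decoded goal."""

    @abstractmethod
    def act_on(self, goal: str) -> None:
        ...

class RoboticArm(IntentionDriven):
    def act_on(self, goal: str) -> None:
        print(f"Robotic arm reaching to achieve: {goal}")

class ScreenCursor(IntentionDriven):
    def act_on(self, goal: str) -> None:
        print(f"Cursor moving to achieve: {goal}")

def execute(goal: str, device: IntentionDriven) -> None:
    # Only the device-specific implementation of act_on differs.
    device.act_on(goal)

execute("grasp the bottle", RoboticArm())
execute("grasp the bottle", ScreenCursor())
```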
Fans of the Star Wars films will recall the scene in The Empire Strikes Back in which Luke Skywalker feels his new cybernetic hand for the first time. While the technology is still a long way from that, for patients who are unable to move most of their body this progress is a tremendous step forward – and it certainly has the potential to revolutionise research in the field.