
Patient Shows New Dexterity with a Mind-Controlled Robot Arm

With twice as many electrodes in her brain as previous study participants, a paralyzed woman can move a robotic arm with unprecedented flexibility.
December 17, 2012

A woman who is completely paralyzed below the neck has regained the ability to reach out and interact with the world around her thanks to the most advanced brain-computer interface for operating a robotic arm so far.

In February, surgeons implanted two four-millimeter-by-four-millimeter electrode arrays into the participant’s motor cortex, the region of the brain that initiates movements. Each chip has 96 electrodes and is wired through the skull to a computer that translates her thoughts into signals for the robotic arm. The work, performed by researchers from the University of Pittsburgh, is reported in the latest issue of The Lancet.

Mind control: Jan Scheuermann drives a robotic prosthetic limb with her thoughts, which are recorded by electrodes in her brain and then interpreted by a computer.

The work is the latest advance to show how brain-controlled interface technology can restore some movement to quadriplegics. In May of this year, researchers at Brown University described how a paralyzed patient could use a robotic limb to perform basic tasks, including giving herself a drink of coffee (see “Brain Chip Helps Quadriplegics Move Robotic Arms with Their Thoughts”). The participant in the new study has twice as many electrodes in her brain as the woman in the Brown study and can demonstrate more complex hand movements with her robotic limb.

“We are reproducing more of a natural and realistic movement of the arm and hand,” says Andrew Schwartz, a neuroscientist at the University of Pittsburgh and the senior author on the study.

Some experts, however, caution that it’s hard to draw conclusions about the technology’s potential from a single case.

Miguel Nicolelis, a brain-machine interface researcher at Duke University, notes that recording from more neurons makes it possible to improve precision and complexity in the movements of connected devices. However, he adds that it is hard to say just how many neurons the Pittsburgh team was actually recording from. “There is little documentation of the brain signal,” says Nicolelis of the Lancet paper describing the work. “It would be really great if they had reached the 200-neuron mark, but there appears to be no documentation of that.”

The Lancet study describes the progress of the woman as she operated the robot arm over 13 weeks. After the electrodes were implanted into her brain, she began her training by watching the arm move and imagining that she was controlling it. All the while, the computer was recording neural activity in her motor cortex, and this information was used to better decode her intentions into movements of the robot arm. “Then we started to give her some control,” says Jennifer Collinger, a biomedical engineer at Pittsburgh and the first author on the study. “That generates a feedback loop—she can see whether what she is thinking is moving the arm in the right direction or not. Eventually, we took off those training wheels and gave her full control.”
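The calibration loop described above, in broad strokes, amounts to fitting a decoder that maps recorded firing rates to intended arm velocity, then running that decoder in closed loop while the participant watches and corrects. The paper does not publish its decoding algorithm in code form, so the following is only a toy sketch of the general idea using simulated data and a ridge-regression decoder; the channel count (192, for two 96-electrode arrays) is taken from the article, while the sample counts, noise level, and regularization strength are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the calibration phase: while the participant watches
# the arm move, firing rates from the 192 implanted electrodes (two
# 96-electrode arrays) are logged alongside the arm's known 3-D velocity.
n_samples, n_channels = 500, 192
true_tuning = rng.normal(size=(n_channels, 3))      # unknown neural tuning
rates = rng.normal(size=(n_samples, n_channels))    # simulated firing rates
velocity = rates @ true_tuning + 0.1 * rng.normal(size=(n_samples, 3))

# Fit a linear decoder (ridge regression) mapping rates -> velocity.
lam = 1.0  # regularization strength (illustrative)
W = np.linalg.solve(
    rates.T @ rates + lam * np.eye(n_channels),
    rates.T @ velocity,
)

# Closed-loop use: each new sample of firing rates yields a velocity
# command for the robotic arm; the participant sees the result and
# adjusts her intent, closing the visual feedback loop.
new_rates = rng.normal(size=(1, n_channels))
command = new_rates @ W  # one 3-D velocity command
print(command.shape)
```

Real decoders in this line of work are recalibrated over many sessions and handle spike detection, binning, and drift, none of which this sketch attempts; it only shows the shape of the rates-to-velocity mapping the training procedure learns.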

By the second day of use, the participant was able to move the arm in three dimensions on her own. With practice, she was able to move cubes and other objects around a table and even pick up a two-pound rock. The woman continues to work with the researchers. She recently was able to pick up a piece of chocolate and feed herself, says Schwartz.

Like a spinal cord, the robotic arm used in the study has some ability to control its own movement. Years of study in primates on how the motor cortex coördinates hand movements helped the team develop the technology that could translate the participant’s thoughts into more “fluid and natural” movements, says Grégoire Courtine, a neuroscientist at the Swiss Federal Institute of Technology Lausanne in Switzerland.

“When animals move, they follow certain sets of rules, and it turns out we can pick that up in the neural signals that we record from the motor cortex,” says Schwartz.

The arm, which was developed under a Defense Advanced Research Projects Agency contract, has 17 motors that control 26 joints in what is the most sophisticated artificial limb system in the world. “The arm was designed to be able to mimic a human limb,” says Michael McLoughlin, program manager for the Modular Prosthetic Limb project, which is based at Johns Hopkins University in Maryland. The Johns Hopkins team has built six of the robotic limbs that are in use by different research groups in the U.S., says McLoughlin.

A crucial next step for the Pittsburgh team will be incorporating sensory feedback into the prosthetic. The arm has over 100 sensors, says McLoughlin, capable of detecting vibration, pressure, temperature, and more. The team is also working on a wireless version of the brain-machine interface so that participants no longer need electronics protruding from their heads.

The researchers also hope to recruit more participants to work with the prosthetic, and to continue to improve the technology so that one day the “laboratory oddity can be translated into therapeutic use,” says Schwartz.
