Artificial intelligence

A sensor-filled “skin” could give prosthetic hands a better sense of touch

A person wearing a prosthetic hand. Mario Tama/Getty

The “electronic skin,” inspired by the nervous system, can sense temperature, pressure, or humidity. It could be used to give prosthetic limbs a more complex sense of touch.

Humans are amazing: Your body is a sensing machine, thanks to the roughly 45 miles of nerves that connect your skin, brain, and muscles. A team from the National University of Singapore has now used that nervous system as inspiration to create a "skin" for robots that could one day improve their ability to detect and understand their environment.

How it works: Sheets of silicon were covered with 240 sensors that can pick up contact, pressure, temperature, and humidity. The sensors transmit all of this data simultaneously to a single decoder, and the system should still work when scaled up to 10,000 sensors, according to Benjamin Tee, a coauthor of the study, which was published in Science Robotics today.

What’s new: Flexible robotic “skin” has been tested in previous studies, but this system is the first to let many sensors feed back to a single receiver, allowing it to act as a whole system rather than a bunch of individual electrodes, Tee says. Crucially, it keeps working even if individual receptors are damaged, making it more resilient than previous iterations.
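One way to picture the many-sensors-to-one-receiver idea is a shared line on which each sensor superimposes a unique pulse pattern, and a decoder that recovers which sensors fired by correlation. The sketch below is purely illustrative: the random sparse signatures, the sensor count per line, and all parameters are assumptions for demonstration, not the coding scheme the study actually uses. Note how losing any one sensor leaves the rest of the system decodable, matching the resilience claim.

```python
import random

random.seed(0)

N_SENSORS = 240   # sensor count in the study; said to scale toward 10,000
SIG_LEN = 512     # length of each pulse signature (illustrative choice)
DENSITY = 0.05    # fraction of time slots in which a sensor pulses

# Hypothetical scheme: assign each sensor a unique sparse pulse signature.
# (The real system uses designed asynchronous pulse trains, not random codes.)
signatures = [
    [1 if random.random() < DENSITY else 0 for _ in range(SIG_LEN)]
    for _ in range(N_SENSORS)
]

def transmit(active):
    """Active sensors superimpose their pulses onto one shared line."""
    line = [0] * SIG_LEN
    for i in active:
        for t, pulse in enumerate(signatures[i]):
            line[t] += pulse
    return line

def decode(line, threshold=0.9):
    """Recover firing sensors by correlating the line with each signature."""
    firing = set()
    for i, sig in enumerate(signatures):
        weight = sum(sig)
        if weight == 0:
            continue  # degenerate all-zero signature; skip
        score = sum(v for v, p in zip(line, sig) if p) / weight
        if score >= threshold:
            firing.add(i)
    return firing

active = {3, 77, 180}
decoded = decode(transmit(active))
# Every active sensor is guaranteed recovered: all of its own pulses
# are present on the line, so its correlation score is at least 1.
assert active <= decoded
```

Because each sensor's code is independent, a damaged sensor simply stops contributing pulses; the decoder still recovers every other sensor's events from the same shared line.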

What it could enable: Essentially, this would give robots something approaching the human sense of touch, says Shan Luo, assistant professor of robotics at the University of Liverpool. A robot equipped with an electronic skin would be able to manipulate tools in a warehouse more dexterously and could interact with humans more safely, he adds. The team is now working with engineers and neuroscientists to help restore a sense of touch to prosthetic hands. This might "let people feel less of a sense of loss," says Tee.

The bigger picture: Robot dexterity was picked out as one of Tech Review’s 10 breakthrough technologies this year, and that’s because of the growing need to improve how robots navigate the physical world, especially if we’re going to work alongside machines, or let them help us with household chores.


