The loss of a limb following an injury, accident or disease can greatly reduce quality of life, making it harder for people to engage in daily activities. Yet recent technological advances have opened exciting new possibilities for more comfortable, smarter and more intuitive prosthetic limbs, which could allow users to complete a wider range of tasks with ease.
Many smart prosthetics developed over the past decade are operated via myoelectric signals: electrical signals produced by muscles and picked up by sensors attached to the wearer's skin. While some of these systems have proved very effective, they require users to consciously produce specific muscle signals to perform desired movements, which can be both physically and mentally demanding.
Researchers at Memorial University of Newfoundland in Canada recently developed a new automated method to control the movements of prosthetic hands that does not rely on myoelectric or other biological signals. Their proposed control system, outlined in a paper posted to the preprint server arXiv, is built around a machine learning model trained on video footage of prosthetic hands completing specific tasks, allowing it to autonomously plan and execute the movements required to tackle a given task.
“The idea for this paper came from our desire to make prosthetic hands easier to use,” Xianta Jiang, senior author of the paper, told Tech Xplore. “Traditional systems rely on muscle signals, which can be hard to control and tiring for users. We wanted to explore whether an autonomous system—like a robot that can ‘see’ and ‘feel’ the world—could take over some of that effort.”
The main objective of the recent study by Jiang and his colleagues was to create a prosthetic hand that could autonomously perceive its surroundings and perform grasping tasks while requiring minimal effort from the wearer. Instead of planning movements based on biological signals or commands sent by the user, the researchers' control system relies on data collected by a small camera mounted on the prosthetic wrist, along with sensors detecting both touch and motion.
“These inputs are combined using artificial intelligence (AI), specifically a learning technique called imitation learning,” explained Kaijie Shi, first author of the paper. “The AI model learns from past demonstrations—basically watching how objects should be picked up, held, and released. The hand then uses this knowledge to make decisions in real-time. What’s unique is that the system doesn’t rely on muscle signals; it works by ‘understanding’ the object and the task, making it more natural and intuitive for the user.”
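Imitation learning in its simplest form is behavioral cloning: a policy is trained by supervised learning on recorded state-action pairs from expert demonstrations, then maps new sensor readings to actions on its own. The sketch below is a toy illustration of that idea under stated assumptions, not the authors' model: it invents a 6-dimensional "sensor state" (mock camera features plus touch/motion channels) and a synthetic expert rule in which a positive touch reading triggers a grasp, then fits a logistic policy to the demonstrations.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_demos(n):
    # Hypothetical sensor state: 4 mock camera features + 2 touch/motion
    # channels. The "expert" closes the hand (action 1) when the touch
    # channel (index 4) is positive -- a stand-in for real demonstrations.
    states = rng.normal(size=(n, 6))
    actions = (states[:, 4] > 0.0).astype(float)
    return states, actions

X, y = make_demos(500)

# Behavioral cloning as logistic regression: fit a policy to the
# demonstrated state-action pairs by gradient descent on the log loss.
w = np.zeros(6)
b = 0.0
lr = 0.5
for _ in range(300):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # policy's grasp probability
    w -= lr * (X.T @ (p - y)) / len(y)       # gradient of mean log loss
    b -= lr * np.mean(p - y)

# The cloned policy now decides grasps on states it never saw.
Xt, yt = make_demos(200)
pred = (1.0 / (1.0 + np.exp(-(Xt @ w + b))) > 0.5).astype(float)
accuracy = np.mean(pred == yt)
```

In the real system, the linear policy would be replaced by a deep network consuming camera images and tactile streams, but the training principle, supervised learning on demonstrations rather than explicit muscle commands, is the same.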
To test their newly developed control system, the researchers deployed it on a real prosthetic hand and performed a series of experiments in a real-world setting. They found that even when trained on only a few videos showing the same person handling a limited set of objects, their system allowed the prosthetic hand to grasp desired items with a high success rate.
“Our system performed grasp-and-release tasks autonomously with over 95% success,” said Jiang. “This is a major step toward making prosthetic hands that work automatically and reliably in everyday settings. Practically, this means future prosthetic users could benefit from a device that helps them complete common tasks—like picking up a cup or opening a door—without needing to constantly think about every movement.”
The researchers plan to continue improving the imitation learning-based approach they developed and test it in a wider range of experiments, also involving individuals who would benefit from more advanced prosthetic systems. In the future, they hope that their system will contribute to the advancement of commercially available prosthetic hands, reducing the effort required to operate them.
“Next, we want to test the system with actual prosthetic users and gather feedback from them,” added Jiang. “We also plan to improve the system’s ability to adapt to different environments and more complex tasks, such as handling soft or oddly shaped objects. Another goal is to explore how this technology can be used in other assistive devices, like exoskeletons for stroke recovery.”
More information:
Kaijie Shi et al, Towards Biosignals-Free Autonomous Prosthetic Hand Control via Imitation Learning, arXiv (2025). DOI: 10.48550/arxiv.2506.08795
Project page: sites.google.com/view/autonomous-prosthetic-hand
© 2025 Science X Network
Citation:
New system reliably controls prosthetic hand movements without relying on biological signals (2025, June 18)
retrieved 18 June 2025
from https://techxplore.com/news/2025-06-reliably-prosthetic-movements-biological.html
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no
part may be reproduced without the written permission. The content is provided for information purposes only.