'Seeing' and 'feeling': Controlling prosthetic hands with AI
Annually, more than one million people around the world experience the loss of a limb, which can have a significant impact on their daily lives.
Prosthetic arms have great potential to improve the lives of amputees. However, current prosthetic hands offer limited functionality, and their interfaces lack intuitive control.
Dr. Xianta Jiang, an assistant professor of computer science in the Faculty of Science at Memorial, was recently awarded $237,750 by the Government of Canada’s New Frontiers in Research Fund (NFRF).
The funding is to create an artificial intelligence interface for controlling prosthetic hands, enabling amputees to operate an artificial limb as easily as their intact hand without requiring surgery.
The project team includes co-principal investigator Dr. Ting Zou and co-applicant Dr. Stephen Czarnuch, both with the Faculty of Engineering and Applied Science, and co-applicant Dr. Vinicius Prado da Fonseca, also in the Department of Computer Science, Faculty of Science.
“Current state-of-the-art, non-invasive prosthesis control systems use pattern recognition techniques driven by surface muscle signals,” said Dr. Jiang. “This requires the user to carefully exert distinct muscle signal patterns to perform different gestures. However, in real-life situations, intact people rarely have to think about their hand gestures when grabbing an object; instead, the fingers and the hand are naturally configured to the proper posture when the hand reaches and touches a target object.”
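The conventional approach Dr. Jiang describes can be illustrated with a minimal sketch: time-domain features are extracted from windows of surface EMG and fed to a gesture classifier. Everything below (feature choices, the nearest-centroid classifier, channel counts, gesture labels) is an illustrative assumption, not the team's actual implementation.

```python
import numpy as np

def emg_features(window: np.ndarray) -> np.ndarray:
    """Classic time-domain sEMG features per channel:
    mean absolute value (MAV) and waveform length (WL)."""
    mav = np.mean(np.abs(window), axis=0)
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    return np.concatenate([mav, wl])

class NearestCentroidGestures:
    """Toy pattern-recognition classifier: one feature centroid per gesture."""
    def fit(self, X, y):
        self.centroids_ = {c: np.mean([x for x, t in zip(X, y) if t == c], axis=0)
                           for c in sorted(set(y))}
        return self

    def predict(self, x):
        return min(self.centroids_, key=lambda c: np.linalg.norm(x - self.centroids_[c]))

# Synthetic demo: two gestures with different overall muscle-activation levels
rng = np.random.default_rng(0)

def fake_window(scale: float) -> np.ndarray:
    # 200 samples x 4 hypothetical EMG channels
    return rng.normal(0.0, scale, size=(200, 4))

X = [emg_features(fake_window(s)) for s in [0.2] * 10 + [1.0] * 10]
y = ["rest"] * 10 + ["grasp"] * 10
clf = NearestCentroidGestures().fit(X, y)
print(clf.predict(emg_features(fake_window(1.0))))  # high activation classifies as "grasp"
```

The sketch shows why this paradigm demands effort from the user: each gesture must produce a reliably distinct muscle signal pattern for the classifier to separate.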
Surgery to connect an artificial hand’s sensors to nerves in the amputee’s residual arm, known as targeted muscle reinnervation, is expensive and carries a risk of infection.
Dr. Jiang and his colleagues were inspired to propose a novel bio-inspired natural prosthetic hand control interface that features the addition of miniature cameras and tactile sensors to the prosthetic hands.
“These additions will enable the robotic hand to ‘see’ the target and ‘feel’ the environment and the object during the reach-and-grasp process, and automatically drive the hand towards grasping with little control effort from the user. The amputees only need to decide whether to proceed or retrieve the robotic hand.”
The project is highly interdisciplinary, involving the manufacturing and integration of tactile sensors, computer vision and the development of machine learning algorithms.
In addition to bringing together researchers from computer science and engineering, with expertise spanning mechanical, electrical and computer engineering and computer science, the project also includes collaborators from rehabilitation science, kinesiology and psychology.
They aim to meet three primary goals: to enable prosthetic hands with vision and haptic functions using computer vision and tactile sensing techniques; to explore the best prosthetic hand control strategies to achieve high-accuracy movement with minimum control effort from the user; and to develop an easy and natural prosthetic hand control interface by fusing multiple inputs from computer vision, touch sensing and muscle signals.
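The third goal, fusing vision, touch and muscle signals into one natural interface, can be sketched as a simple shared-control loop: vision proposes a grasp preshape, tactile sensing reports contact, and the decoded muscle signal supplies only a proceed/withdraw intent. The sensor names, thresholds and grasp types below are assumptions for illustration, not the project's actual design.

```python
from dataclasses import dataclass

@dataclass
class SensorState:
    object_class: str    # from the prosthesis-mounted camera's vision model (assumed)
    contact_force: float # Newtons, from fingertip tactile sensors (assumed)
    emg_intent: str      # "proceed" or "withdraw", decoded from muscle signals

# Vision selects a grasp preshape; a lookup table stands in for a learned model
PRESHAPE = {"mug": "cylindrical", "card": "pinch", "ball": "spherical"}

def control_step(state: SensorState, grip_threshold: float = 1.5) -> str:
    """One step of the shared-control loop: the user only signals intent,
    while the controller handles posture and grip force automatically."""
    if state.emg_intent == "withdraw":
        return "open-and-retract"
    preshape = PRESHAPE.get(state.object_class, "power")
    if state.contact_force < grip_threshold:
        return f"close-{preshape}"  # keep closing until contact is stable
    return "hold"                   # tactile feedback says the grasp is secure

print(control_step(SensorState("mug", 0.3, "proceed")))   # close-cylindrical
print(control_step(SensorState("mug", 2.0, "proceed")))   # hold
print(control_step(SensorState("mug", 2.0, "withdraw")))  # open-and-retract
```

The point of the design is the division of labour: the high-bandwidth decisions (finger posture, grip force) come from the hand's own sensors, while the user contributes only the low-effort go/stop signal.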
“We will attach the developed interface to both commercially available and customized prosthetic hands and test with amputees in collaboration with our local rehabilitation center,” said Dr. Jiang.
They hope the project will deliver an affordable and easy-to-use natural control interface that will decrease the rejection rate of prosthetic hand use in real life and benefit amputees throughout Canada and the world.
“Receiving this funding is a tremendous source of encouragement for us to explore our innovative ideas, which may entail high risk but also high reward, as is the defining characteristic of the NFRF funding,” said Dr. Jiang.
The NFRF competition results were announced April 25 and include more than $200 million in support of Canadian-led research.
Dr. Jiang was successful in the NFRF 2022 Exploration competition, which funded 128 research projects bringing disciplines together in novel ways to form bold, innovative perspectives.