Lifting a glass, making a fist, entering a phone number with the index finger: it is amazing what cutting-edge robotic hands can already do thanks to biomedical technology. However, things that work in the laboratory often run into stumbling blocks when put into practice in daily life.
The problem is the vast diversity of individual intentions, surroundings and the objects found in them, which makes a one-size-fits-all solution all but impossible.
A team at FAU is investigating how intelligent prostheses can be improved
The idea is for interactive artificial intelligence to help prostheses recognize human intent more reliably, register their surroundings and continue to develop and improve over time. The project will receive 6 million euros in EU funding, of which 467,000 euros go to FAU.
“We are literally working at the interface between humans and machines,” explains Prof. Dr. Claudio Castellini, professor of medical robotics at FAU. “The technology behind prosthetics for upper limbs has come on in leaps and bounds over the past decades.” Using surface electromyography, for example, skin electrodes at the remaining stump of the arm can detect the slightest muscle movements. These biosignals can be converted and transferred to the prosthetic limb as electrical impulses. “The wearer controls their artificial hand themselves using the stump. Methods taken from pattern recognition and interactive machine learning also allow people to teach their prosthetic their own individual needs when making a gesture or a movement.”
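To make the principle more concrete: surface EMG control is typically framed as a pattern-recognition problem, in which simple time-domain features are extracted from short windows of the electrode signals and mapped to gestures by a classifier. The following minimal Python sketch illustrates that general idea with synthetic data and scikit-learn; the feature set, channel count and gesture labels are illustrative assumptions, not the FAU team's actual pipeline.

```python
# Illustrative sketch only: a simple sEMG gesture classifier, not the FAU/IntelliMan pipeline.
# Assumes windows of multi-channel surface EMG and synthetic gesture labels.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def emg_features(window: np.ndarray) -> np.ndarray:
    """Classic time-domain EMG features per channel: mean absolute value,
    waveform length and root mean square."""
    mav = np.mean(np.abs(window), axis=0)
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    rms = np.sqrt(np.mean(window ** 2, axis=0))
    return np.concatenate([mav, wl, rms])

# Synthetic stand-in data: 300 windows of 200 samples x 8 electrode channels,
# with 3 gesture classes (e.g. rest, fist, point).
windows = rng.normal(size=(300, 200, 8))
labels = rng.integers(0, 3, size=300)

X = np.array([emg_features(w) for w in windows])
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
clf.fit(X, labels)

# At run time, each new EMG window would be featurized the same way and the
# predicted class mapped to a prosthesis command.
print(clf.predict(emg_features(windows[0]).reshape(1, -1)))
```

In a real prosthesis, the predicted class for each incoming window would then be translated into a grip or wrist command.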
At present, advanced robotic prosthetics have not yet reached optimal standards in terms of comfort, function and control, which is why many people with missing limbs still often prefer purely cosmetic prostheses with no additional functions. The new EU Horizon project “AI-Powered Manipulation System for Advanced Robotic Service, Manufacturing and Prosthetics” (IntelliMan) therefore focuses on how robotic prostheses can interact with their environment even more effectively and purposefully.
Researchers at FAU concentrate in particular on improving the control of both real and virtual prosthetic upper limbs. The focus is on what is known as intent detection. Prof. Castellini and his team are continuing their work on recording and analyzing human biosignals, and are designing novel machine learning algorithms aimed at detecting each person's individual movement patterns. User studies with participants both with and without physical disabilities are used to validate the results. FAU also leads the project area “Shared autonomy between humans and robots”, which examines the safety of the results.
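Interactive machine learning in this context means that the intent detector keeps learning from the wearer during use rather than being trained once and then frozen. Below is a possible sketch of that idea, assuming a scikit-learn model that is updated incrementally whenever the user corrects a misrecognized movement; the data, feature dimension and correction hook are hypothetical.

```python
# Illustrative sketch only: incremental ("interactive") updates to an intent
# detector when the wearer corrects a misrecognized gesture. Data and labels
# are synthetic placeholders, not results from the FAU user studies.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(1)
n_features = 24
classes = np.array([0, 1, 2])  # e.g. rest / grasp / point

# Linear model trained with stochastic gradient descent; supports partial_fit.
model = SGDClassifier()

# Initial calibration session: a small batch of labeled feature vectors.
X_init = rng.normal(size=(60, n_features))
y_init = rng.integers(0, 3, size=60)
model.partial_fit(X_init, y_init, classes=classes)

def on_user_correction(features: np.ndarray, intended_class: int) -> None:
    """Whenever the user signals that the prosthesis guessed wrong,
    fold the corrected example into the model immediately."""
    model.partial_fit(features.reshape(1, -1), [intended_class])

# Simulated online use: the wearer corrects three misclassified movements.
for _ in range(3):
    on_user_correction(rng.normal(size=n_features), intended_class=1)
```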
Prof. Castellini heads the “Assistive Intelligent Robotics” lab (AIROB) at FAU, which focuses on controlling assistive robotics for the upper and lower limbs as well as on functional electrical stimulation. “We are exploiting the potential offered by intent detection to control assistive and rehabilitative robotics,” explains the researcher. “This covers wearable robots worn on the body such as prostheses and exoskeletons, but also robot arms and simulations using virtual reality.” The professorship focuses in particular on processing biosignals from various sensor modalities and on machine learning methods for intent detection, in other words on research directly at the interface between humans and machines.
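Working with several sensor modalities often comes down to fusing their features before classification. The snippet below sketches such a feature-level fusion with hypothetical EMG and inertial-measurement features; the modalities, dimensions and classifier choice are assumptions for illustration, not a description of the AIROB lab's actual methods.

```python
# Illustrative sketch only: feature-level fusion of two sensor modalities
# (hypothetical EMG and inertial-measurement features) for intent detection.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)
n_samples = 200

emg_features = rng.normal(size=(n_samples, 24))   # e.g. time-domain EMG features
imu_features = rng.normal(size=(n_samples, 6))    # e.g. accelerometer + gyroscope
labels = rng.integers(0, 3, size=n_samples)       # intended movement classes

# Concatenate the per-modality features into one vector per time window.
X_fused = np.hstack([emg_features, imu_features])

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_fused, labels)

# The same fused representation could in principle drive a physical prosthesis,
# an exoskeleton joint, or a virtual-reality hand in simulation.
```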