Scientists at EPFL have developed a new way to control a robotic arm meant for amputees. The prosthetic arm combines automation and individual finger control to improve manipulation and grasp.
This proof of concept, which merges robotics and neuroscience, was tested on both able-bodied subjects and amputees. The results were promising, and the findings appear in today's issue of Nature Machine Intelligence.
The technology merges concepts from two different disciplines; combining them in this way had not been attempted before in robotic hand control. The approach could contribute significantly to the emerging field of shared-control neuroprosthetics.
One of the concepts, from the field of neuroengineering, involves decoding intended finger movement from muscular activity in the amputee's stump, enabling individual control of the digits on the prosthetic hand. The other, borrowed from robotics, lets the robotic hand grip objects and sustain contact with them.
The arm is controlled through algorithms that decode the user's intention and translate it into finger motion on the prosthetic hand. To train the algorithm, the user performs a series of hand movements while sensors placed on the stump record the electrical activity of the muscles. Once the user's intent is detected and decoded, the resulting signals drive the individual digits of the prosthetic hand.
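The article does not detail the decoding method, but the training-then-decoding loop it describes can be illustrated with a minimal sketch. Everything here is hypothetical: the sensor values, the movement classes, and the nearest-centroid decoder are illustrative stand-ins, not the EPFL team's actual algorithm.

```python
# Hypothetical sketch of EMG-based intent decoding (not the EPFL code).
# Sensor readings, movement classes, and the decoder are illustrative.
import math

# "Teaching" phase: feature vectors (mean rectified EMG per sensor)
# recorded while the user performs known movements.
TRAINING = {
    "open_hand":   [(0.1, 0.8, 0.2), (0.2, 0.7, 0.1)],
    "close_index": [(0.9, 0.1, 0.3), (0.8, 0.2, 0.2)],
    "close_fist":  [(0.7, 0.6, 0.9), (0.8, 0.7, 0.8)],
}

def centroid(samples):
    """Average feature vector for one movement class."""
    n = len(samples)
    return tuple(sum(s[i] for s in samples) / n for i in range(len(samples[0])))

CENTROIDS = {label: centroid(samples) for label, samples in TRAINING.items()}

def decode_intent(emg_features):
    """Return the movement class whose centroid is nearest to the reading."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(CENTROIDS, key=lambda label: dist(CENTROIDS[label], emg_features))

# A reading near the "close_fist" training samples decodes to that intent.
print(decode_intent((0.75, 0.65, 0.85)))
```

In practice such decoders work on windowed features of multichannel EMG and use far richer models; the sketch only shows the shape of the train-then-decode pipeline the article describes.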
The scientists also developed an algorithm that takes over automatically when the user tries to grab an object: it instructs the prosthetic hand to close its fingers once contact with the object is detected.
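This hand-over between user intent and automation is the core of shared control. A minimal sketch of one control cycle, under assumed behavior (the thresholds, signal ranges, and function name are invented for illustration, not taken from the study):

```python
# Hypothetical shared-control step: decoded user intent drives the fingers
# until contact is sensed, then automation closes and holds the grasp.
# Thresholds and signal ranges (0.0 = open, 1.0 = fully closed) are assumed.

RELEASE_THRESHOLD = 0.2  # below this, the user clearly intends to let go

def shared_control_step(user_command, contact_sensed):
    """Choose the finger-closure command for one control cycle.

    user_command:   closure level requested by the decoded intent (0.0-1.0)
    contact_sensed: True once the hand is touching an object
    """
    if contact_sensed and user_command > RELEASE_THRESHOLD:
        # Automation sustains the grasp, ignoring small fluctuations
        # in the decoded intent.
        return 1.0
    # Otherwise the user's decoded intent drives the fingers directly,
    # which also lets the user release the object by opening the hand.
    return user_command
```

Before contact, `shared_control_step(0.4, False)` simply passes the user's command through; after contact, `shared_control_step(0.4, True)` commands a full grasp, and only a deliberate "open" intent releases the object.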
This feature was adapted from an earlier study of robotic arms designed to infer the shape of objects and use that information to grasp them without relying on visual input. The engineers still need to refine the algorithm before the prosthetic can be brought to market; in the meantime, it is being tested on robots.