New AI system reduces the mental effort of using bionic hands

A new system using artificial intelligence allows bionic hands to grasp objects with a delicacy and precision that usually require intense concentration from the user. By equipping prosthetic fingers with sensors that detect both pressure and proximity, engineers have created a device that shares control between the human user and the machine. This collaborative approach improves grip stability while reducing the mental effort required to perform daily tasks, according to a study published in Nature Communications.

The human hand is a marvel of biological engineering, capable of both the force needed to crush a can and the finesse needed to hold a grape without bruising it. This versatility stems from a tight feedback loop between the hand and the brain. The skin contains sensors that detect the slightest touch, and the brain maintains an internal model of where the fingers are in space.

Modern prosthetic limbs have advanced considerably in their mechanical design, often looking quite lifelike. Yet controlling them remains a significant challenge. Most commercial bionic hands lack any sense of touch. They operate based on electromyography, or EMG, which records electrical signals from muscles in the residual limb.

To close the hand, the user contracts one muscle; to open it, they contract another. This process relies on visual feedback. The user must watch the prosthetic hand intently to judge distances and grip strength. If they look away, they might drop the object or crush it. This constant need for visual attention creates what researchers call a high cognitive burden.
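
To make this concrete, a minimal sketch of two-site proportional EMG control might look like the following. The channel names, gain, and deadband here are illustrative assumptions, not any device's actual firmware:

```python
import numpy as np

def emg_to_hand_velocity(flexor_window, extensor_window,
                         gain=2.0, deadband=0.05):
    """Map raw EMG windows to a hand open/close velocity command.

    Hypothetical two-site proportional controller: the mean absolute
    value (MAV) of the flexor channel drives closing, the extensor
    channel drives opening, and their difference sets the speed.
    The deadband suppresses noise when the muscles are at rest.
    """
    close_drive = np.mean(np.abs(flexor_window))
    open_drive = np.mean(np.abs(extensor_window))
    command = close_drive - open_drive
    if abs(command) < deadband:
        return 0.0  # treat weak signals as no intent
    # +1.0 means close at full speed, -1.0 means open at full speed.
    return float(np.clip(gain * command, -1.0, 1.0))
```

Note that nothing in this loop tells the controller where the object is or how hard the fingers are pressing; the user's eyes must supply that information.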

Because of this mental strain and the difficulty of controlling fine motor movements, many individuals abandon their electronic prostheses. They often return to simpler body-powered hooks or stop using a prosthesis entirely. Marshall A. Trout, a postdoctoral researcher in the Utah NeuroRobotics Lab and a lead author of the study, noted the disconnect between appearance and function.

“As lifelike as bionic arms are becoming, controlling them is still not easy or intuitive,” Trout said. “Nearly half of all users will abandon their prosthesis, previous research shows, often citing their poor controls and cognitive burden.”

To address these limitations, a team of researchers at the University of Utah sought to restore a form of automated reflex to the bionic hand. The study was led by Trout and Jacob A. George, a professor in the Department of Electrical and Computer Engineering at the University of Utah. Their goal was not to remove the human from the loop but to create a system where the machine handles the low-level adjustments while the human dictates the high-level intent.

The researchers retrofitted a commercially available prosthetic hand, the TASKA, with custom-made fingertips. Inside each silicone fingertip, they embedded a barometric pressure sensor and an optical proximity sensor. The pressure sensor detects physical contact, while the proximity sensor emits infrared light to detect objects up to a few centimeters away.
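
As a rough illustration, a single fingertip's reading could be represented like this. The field names and the contact threshold are assumptions for the sketch, not the paper's actual data format:

```python
from dataclasses import dataclass

@dataclass
class FingertipSample:
    """One reading from a sensorized fingertip (illustrative fields)."""
    pressure: float   # barometric sensor; rises as the silicone tip deforms
    proximity: float  # reflected infrared intensity; rises as an object nears

def in_contact(sample: FingertipSample, baseline: float,
               threshold: float = 50.0) -> bool:
    """Flag contact when pressure exceeds the at-rest baseline
    by more than an assumed threshold."""
    return sample.pressure - baseline > threshold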

This sensor array allows the hand to perceive its environment in a limited but useful way. The fingers can “see” an object before touching it. They can also feel when they have made contact. To process this data, the team developed an artificial neural network, a type of machine learning algorithm modeled loosely on the human brain.

The neural network was trained to interpret the sensor data and predict the exact position each finger needed to reach to contact an object. This allows the fingers to conform to the shape of an item automatically. If a user reaches for a ball, the fingers curve to match the sphere. If they reach for a rectangular block, the fingers adjust accordingly.
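
The study's exact architecture is not described here; as a rough sketch, a small feedforward network of this general kind could map fingertip sensor readings to a predicted contact position for each finger. The layer sizes and input layout are assumptions:

```python
import torch
import torch.nn as nn

class ContactPositionNet(nn.Module):
    """Toy feedforward network: fingertip sensor values in, a
    predicted contact position per finger out. Sizes are
    illustrative, not the study's actual architecture."""

    def __init__(self, n_fingers: int = 5, features_per_finger: int = 2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_fingers * features_per_finger, 64),
            nn.ReLU(),
            nn.Linear(64, 64),
            nn.ReLU(),
            # One output per finger: the position at which that
            # finger is predicted to touch the object.
            nn.Linear(64, n_fingers),
        )

    def forward(self, sensor_readings: torch.Tensor) -> torch.Tensor:
        return self.net(sensor_readings)
```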

The core innovation of the study is a concept called shared control. In previous attempts at assisted grasping, systems often acted like a switch. The human would initiate a movement, and then the computer would take over completely. This often left users feeling a lack of agency, or control, over the device.

In this new framework, the control is blended continuously. The user provides the primary command to close or open the hand using their muscle signals. Simultaneously, the AI adjusts the position of the fingers based on the sensor readings. The result is a cooperative effort. The machine ensures the fingers align with the object and stop upon contact, while the human controls the overall timing and firmness of the grasp.
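
A minimal sketch of this kind of continuous blending follows; the weighting and update rule are assumptions, not the study's published control law:

```python
def blended_finger_position(current_pos: float,
                            user_velocity: float,
                            predicted_contact_pos: float,
                            assist_weight: float = 0.5,
                            dt: float = 0.01) -> float:
    """Blend human intent with machine assistance at every time step.

    The user's EMG-derived velocity command moves the finger, while
    an assistive term steers it toward the contact position predicted
    from the proximity sensors. Because the two terms are mixed
    continuously, neither side ever takes over completely.
    """
    machine_velocity = predicted_contact_pos - current_pos  # proportional pull
    velocity = ((1.0 - assist_weight) * user_velocity
                + assist_weight * machine_velocity)
    return current_pos + velocity * dt
```

The key design choice is that the blend runs at every time step rather than handing off control, which preserves the user's sense of agency.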

“What we don’t want is the user fighting the machine for control. In contrast, here the machine improved the precision of the user while also making the tasks easier,” Trout said. “In essence, the machine augmented their natural control so that they could complete tasks without having to think about them.”

To validate this system, the researchers recruited nine participants with intact limbs and four participants with transradial amputations, in which the limb is amputated between the elbow and the wrist. The participants performed a series of standardized tests designed to measure dexterity and control.

One such test involved moving a “fragile” object. The object was equipped with sensors to detect if it was squeezed too hard. If the grip force exceeded a certain threshold, the object was considered broken.
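
In code, the pass/fail logic of such a test reduces to a threshold check along the lines of this sketch (the threshold value is illustrative):

```python
def object_survived(grip_force_trace, break_threshold=10.0):
    """Return False if any sampled grip force exceeded the
    assumed 'break' threshold during the trial."""
    return all(force <= break_threshold for force in grip_force_trace)
```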

When using the shared control system, participants were much less likely to break the object compared to using standard human-controlled methods. The sensors allowed the hand to stop closing the moment it made contact, preventing the user from accidentally crushing the target.

Another test assessed grip security. Participants had to pick up and hold a large sphere. With standard control, the lack of sensory feedback meant users often applied uneven pressure, causing the ball to slip. With shared control, the independent fingers adjusted themselves to maintain contact with the surface of the ball. This resulted in fewer drops and longer holding times.

The researchers also measured the cognitive burden placed on the users. While performing the grasping tasks, participants had to respond to a small vibration on their collarbone or hip by pressing a button. This is known as a detection-response task.

If a person is thinking hard about controlling the hand, their reaction time to the vibration slows down. The study found that when using shared control, participants responded faster to the vibration. This suggests that the AI assistance freed up mental resources, allowing the users to pay attention to other things while still grasping objects effectively.
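
The standard analysis of a detection-response task pairs each vibration with the first button press that follows it. A sketch of that calculation, not necessarily the study's exact pipeline, might look like this:

```python
import statistics

def mean_reaction_time(stimulus_onsets, button_presses):
    """Average latency from each vibration onset to the next button
    press; longer latencies indicate a heavier cognitive load."""
    latencies = []
    presses = iter(sorted(button_presses))
    press = next(presses, None)
    for onset in sorted(stimulus_onsets):
        # Skip any presses that happened before this vibration.
        while press is not None and press < onset:
            press = next(presses, None)
        if press is None:
            break  # no responses left to match
        latencies.append(press - onset)
        press = next(presses, None)  # each press answers one stimulus
    return statistics.mean(latencies) if latencies else float("nan")
```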

The benefits were particularly evident for the amputee participants. They attempted activities of daily living that are notoriously difficult with standard prostheses. One participant attempted to pick up a disposable foam cup, bring it to his mouth, mime taking a sip, and set it back down.

These cups are flimsy. Squeezing them just a little too hard crushes them, spilling the contents. With his standard control method, the participant crushed or dropped the cup in almost every attempt. When using the shared control system, he successfully completed the task the majority of the time.

The system also enabled participants to pick up small, delicate items like an egg or a piece of paper without damaging them. The machine control acted as a safety net, modulating the force applied by the fingers even if the user sent a strong muscle signal.

“By adding some artificial intelligence, we were able to offload this aspect of grasping to the prosthesis itself,” George said. “The end result is more intuitive and more dexterous control, which allows simple tasks to be simple again.”

While the results are promising, the study does have limitations. The experiments were conducted in a controlled laboratory setting. Real-world environments are messier and less predictable. The sensors might encounter interference from bright sunlight or dirt, although the study notes the distributed nature of the sensors provides some redundancy.

Additionally, the current study focused on specific grasping tasks. Future research will need to determine if this shared control approach remains effective for a wider variety of objects and manipulations, such as turning a doorknob or tying shoelaces. The researchers also plan to investigate how the system performs over longer periods of time as users become more accustomed to the assistance.

The team views this work as a stepping stone toward even more advanced bionic systems. They are investigating ways to provide sensory feedback directly to the user’s nervous system.

“The study team is also exploring implanted neural interfaces that allow individuals to control prostheses with their mind and even get a sense of touch coming back from this,” George said. “Next steps, the team plans to blend these technologies, so that their enhanced sensors can improve tactile function and the intelligent prosthesis can blend seamlessly with thought-based control.”

The study, “Shared human-machine control of an intelligent bionic hand improves grasping and decreases cognitive burden for transradial amputees,” was authored by Marshall A. Trout, Fredi R. Mino, Connor D. Olsen, Taylor C. Hansen, Masaru Teramoto, David J. Warren, Jacob L. Segil, and Jacob A. George.
