Bicep-Sensing Robot Helps Humans Lift Objects

A team of roboticists from MIT has created a system that allows robots to lift objects by sensing the electrical signals from a person’s biceps / Photo by: improvisor via 123RF

 

A team of roboticists from the Massachusetts Institute of Technology has created a system that allows robots to lift objects by sensing the electrical signals from a person’s biceps as the human moves their arm up and down. This is according to science and technology magazine Popular Science.

 

MIT’s Bicep-Sensing Bot: How It Functions

The bicep-sensing bot functions using electrodes, electrical conductors that carry current to or from a nonmetallic medium such as a living body or a piece of equipment. In their research, the MIT team used a living body as that medium: they attached the electrodes to a person’s upper arm and connected them to the robot with wires. 

The bicep-sensing bot functions using electrodes, electrical conductors that carry current to or from a medium such as a living body or a piece of equipment / Photo by: luchschen via 123RF

 

MIT doctoral candidate Joseph DelPreto, a member of the research team whose work focuses on human-robot interaction, explained that they hope the robotic system will make it easier for robots and people to join forces on physical tasks. The MIT roboticists said that for a human and a robot to work together well, they need good communication. In their bicep-sensing bot, that communication stems directly from the human muscles: the robot monitors the activity of the muscle, “senses” or reads how the arm is moving, and then helps the human with the physical task.

Two Ways the Robot Responds to Muscle Signals

There are two ways the robot responds to the signals it reads from the human muscle. First, it senses the electromyography (EMG) signals from the biceps as the person moves their arm up and down, and the robot mirrors that movement. In a separate study titled “Techniques of EMG signal analysis: detection, processing, classification, and applications,” M.B.I. Raez of Malaysia-based Multimedia University’s Faculty of Engineering and colleagues wrote that the EMG signal is a biomedical signal that measures the electrical currents generated in muscles during contraction, representing neuromuscular activity.

In the MIT research, the robot reads these EMG signals from the muscles. The person whose upper arm is fitted with the electrodes can likewise flex their biceps without actually moving the arm: simply tensing or relaxing the muscle is enough to tell the robotic arm to move up or down.
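To make this concrete, here is a minimal sketch, not the team’s published code, of how a smoothed biceps EMG level could be mapped to an up, down, or hold command. The read_emg_sample and send_velocity helpers and all threshold values are hypothetical stand-ins for real sensor and robot interfaces.

```python
import numpy as np

# Hypothetical helpers standing in for real EMG-acquisition and robot APIs.
def read_emg_sample():
    """Return one raw biceps EMG sample (volts); placeholder for a DAQ call."""
    return np.random.randn() * 1e-4

def send_velocity(vz):
    """Command a vertical end-effector velocity (m/s); placeholder for a robot API."""
    print(f"commanded vertical velocity: {vz:+.3f} m/s")

def envelope(samples):
    """Rectify and average a window of EMG samples to estimate muscle activation."""
    return np.mean(np.abs(samples))

TENSE_THRESHOLD = 2e-4   # activation above this -> move up (assumed value)
RELAX_THRESHOLD = 5e-5   # activation below this -> move down (assumed value)
WINDOW = 64              # samples per processing window (assumed)

def control_step():
    window = np.array([read_emg_sample() for _ in range(WINDOW)])
    activation = envelope(window)
    if activation > TENSE_THRESHOLD:
        send_velocity(+0.05)   # biceps tensed: raise the load
    elif activation < RELAX_THRESHOLD:
        send_velocity(-0.05)   # biceps relaxed: lower the load
    else:
        send_velocity(0.0)     # hold position

if __name__ == "__main__":
    for _ in range(5):
        control_step()
```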

The second way the robot responds to muscle signals is by interpreting even subtle motions. This is made possible by artificial intelligence, the branch of computer science focused on creating machines that perceive and act in ways that resemble human behavior. MIT’s system can also tell the robot to lift objects in a more nuanced way. For example, the person can slightly flick their wrist down once or up twice, and the robot will follow this movement. DelPreto stated that to make this possible, they used a neural network, an artificial intelligence model that learns from data. It does so by interpreting the electromyography signals coming from the biceps and triceps of the person wearing the electrodes, analyzing what it sees 80 times every second. The neural network then tells the robot how it should move.
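As an illustration of this idea only, the sketch below shows how a small neural network might classify short two-channel EMG windows (biceps and triceps) into wrist gestures, with the classification step repeated roughly 80 times per second. The network shape, window size, and gesture labels are assumptions, not details taken from the MIT paper.

```python
import torch
import torch.nn as nn

WINDOW_SAMPLES = 50          # assumed samples per channel per window
N_CHANNELS = 2               # biceps and triceps electrodes
GESTURES = ["none", "wrist_up", "wrist_down"]

class GestureNet(nn.Module):
    """Tiny fully connected network mapping an EMG window to a gesture class."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(N_CHANNELS * WINDOW_SAMPLES, 64),
            nn.ReLU(),
            nn.Linear(64, len(GESTURES)),
        )

    def forward(self, x):
        return self.net(x)

model = GestureNet()
model.eval()

# One classification step; in a real system this loop would run about 80x per second.
with torch.no_grad():
    emg_window = torch.randn(1, N_CHANNELS, WINDOW_SAMPLES)  # stand-in for live EMG data
    logits = model(emg_window)
    gesture = GESTURES[int(logits.argmax(dim=1))]
    print("detected gesture:", gesture)  # would be translated into a robot command
```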

“Sharing the Load” With the Robot

DelPreto and his team believe that their system can help people doing physical labor, such as construction and factory workers, who routinely lift heavy or bulky objects. With the help of robotics, a worker and a robot can act as a team, sharing the load via muscle activity. In the first paper describing the system, DelPreto and Daniela Rus likewise noted that robots can be “valuable” assistants to humans and can increase productivity. Before their research, they saw a communication barrier between robots and humans trying to accomplish a physical task together, because no interface yet allowed the robot to interpret a person’s commands and intentions accurately. Standard ways of interacting with such technologies rely on hand gestures, or on placing radar units, cameras, and lasers in the surroundings, as autonomous cars do. Yet none of these systems can sense or measure a person’s reflexes. 

The MIT group realized that since people naturally generate muscle activity during physical movement and interaction, that activity itself could be used to address these challenges, so they built a muscle-based control interface. In particular, their work uses an algorithm that continuously reads a “lifting setpoint” from the upper-arm activity. DelPreto added that the more closely the robot and the human work together, the more effective their synergy becomes.
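As a rough illustration of such a setpoint algorithm, and not the authors’ actual method, the sketch below nudges a target lifting height up or down as the measured upper-arm activation rises above or falls below a resting level. The gains, limits, and read_upper_arm_emg helper are assumed values for the example.

```python
import numpy as np

Z_MIN, Z_MAX = 0.2, 1.2      # allowed end-effector heights in meters (assumed)
GAIN = 0.002                  # meters of setpoint change per unit of activation (assumed)
REST_ACTIVATION = 0.3         # activation level treated as "hold steady" (assumed)

def read_upper_arm_emg():
    """Placeholder returning a smoothed upper-arm activation level in [0, 1]."""
    return float(np.clip(np.random.rand(), 0.0, 1.0))

def update_setpoint(z_setpoint):
    """Nudge the lifting setpoint up or down based on current muscle activation."""
    activation = read_upper_arm_emg()
    z_setpoint += GAIN * (activation - REST_ACTIVATION)   # tense -> raise, relax -> lower
    return float(np.clip(z_setpoint, Z_MIN, Z_MAX))

z = 0.7  # initial setpoint in meters
for _ in range(10):           # in practice this loop runs continuously
    z = update_setpoint(z)
print(f"current lifting setpoint: {z:.3f} m")
```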

The team believes that their system can help people doing physical labor, such as construction and factory workers, who routinely lift heavy or bulky objects / Photo by: Katarzyna Białasiewicz via 123RF