Controlling robots with your mind isn’t as far-fetched as it sounds. Researchers at the Massachusetts Institute of Technology’s Computer Science and Artificial Intelligence Laboratory (CSAIL) have developed an interface that reads the brainwaves of human operators, allowing them to direct machines to perform tasks just by thinking about them.

“We’d like to move away from a world where people have to adapt to the constraints of machines,” CSAIL director Daniela Rus told MIT News. “Approaches like this show that it’s very much possible to develop robotic systems that are a more natural and intuitive extension of us.”

The system monitors the operator using a combination of electroencephalography (EEG), which detects electrical activity in the brain via electrodes attached to the scalp, and electromyography (EMG), which measures the electrical signals produced by motor neurons.

Neither EEG nor EMG is especially precise on its own. But by merging the two signal streams, the team achieved a much higher degree of accuracy than either technique could provide in isolation.
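To illustrate why fusing two imperfect signals can help, here is a minimal Python sketch of one simple fusion rule: adding the log-odds of two independent estimates, a naive-Bayes-style combination. This is only an illustration under that assumption, not the CSAIL team's actual method, and the probabilities are made up.

```python
import math

def log_odds(p: float) -> float:
    # Convert a probability into log-odds so independent evidence can be added.
    return math.log(p / (1.0 - p))

def fuse_eeg_emg(p_eeg: float, p_emg: float) -> float:
    """Fuse two noisy estimates that the operator has flagged an error,
    one derived from EEG and one from EMG, by summing their log-odds.
    When both channels lean the same way, the fused probability is more
    extreme (more confident) than either reading alone."""
    fused = log_odds(p_eeg) + log_odds(p_emg)
    return 1.0 / (1.0 + math.exp(-fused))

# Two weakly confident readings reinforce each other:
print(fuse_eeg_emg(0.65, 0.70))  # ~0.81, higher than either input on its own
```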

“By looking at both muscle and brain signals, we can start to pick up on a person’s natural gestures, along with their snap decisions about whether something is going wrong,” Joseph DelPreto, a Ph.D. candidate and lead author on the paper introducing the project, said. “This helps make communicating with a robot more like communicating with another person.”

The team’s algorithm parsed the signals for “error-related potentials” (ErrPs), a pattern of neural activity that occurs naturally when people notice mistakes. As soon as the system detected an ErrP, indicating that the robot under its control was about to make an error, it halted the robot so the operator could issue a correction through a hand gesture-based menu interface.
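A rough Python sketch of what such a supervision loop might look like appears below. The names errp_detected, read_gesture, and Robot are hypothetical stand-ins for the trained EEG/EMG classifiers and robot controller the researchers describe; the thresholds and toy data are invented purely for illustration.

```python
def errp_detected(eeg_window: float) -> bool:
    """Placeholder ErrP detector: the real system would run a classifier
    trained on the operator's EEG; here we just threshold one summary value."""
    return eeg_window > 0.8

def read_gesture(emg_window: float) -> str:
    """Placeholder gesture decoder mapping EMG activity onto one of the
    menu options the operator can select with a hand gesture."""
    return "left target" if emg_window < 0.5 else "right target"

class Robot:
    """Stand-in for the robot controller; only the two calls the loop needs."""
    def halt(self) -> None:
        print("robot paused, waiting for correction")

    def apply_correction(self, choice: str) -> None:
        print(f"robot redirected to: {choice}")

def supervise(robot: Robot, eeg_stream, emg_stream) -> None:
    # Watch EEG while the robot works; the moment an ErrP-like signal appears,
    # pause the robot and apply whatever correction the operator selects
    # through the EMG gesture menu.
    for eeg_window, emg_window in zip(eeg_stream, emg_stream):
        if errp_detected(eeg_window):
            robot.halt()
            robot.apply_correction(read_gesture(emg_window))

# Toy run: the third window triggers the stop-and-correct cycle.
supervise(Robot(), eeg_stream=[0.1, 0.3, 0.9], emg_stream=[0.6, 0.4, 0.2])
```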

“This work combining EEG and EMG feedback enables natural human-robot interactions for a broader set of applications than we’ve been able to do before using only EEG feedback,” Rus said. “By including muscle feedback, we can use gestures to command the robot spatially, with much more nuance and specificity.”

Human-supervised robots corrected errors more than 97 percent of the time, the researchers found, compared with the control group’s 70 percent. More impressive still, the system worked just as well on people who had never used it before.

The team imagines the system could be useful for workers with language disorders or limited mobility.
