Robot avatar body controlled by thought alone

July 5, 2012 | Source: New Scientist

These areas of the motor cortex are activated when thinking about moving parts of your body (credit: A. Kheddar/New Scientist)

For the first time, a person lying in an fMRI machine has controlled a robot hundreds of kilometers away using thought alone.

“The ultimate goal is to create a surrogate, like in Avatar, although that’s a long way off yet,” says Abderrahmane Kheddar, director of the joint robotics laboratory at the National Institute of Advanced Industrial Science and Technology in Tsukuba, Japan.

Teleoperated robots, those that can be remotely controlled by a human, have been around for decades. Kheddar and his colleagues are going a step further. “True embodiment goes far beyond classical telepresence, by making you feel that the thing you are embodying is part of you,” says Kheddar. “This is the feeling we want to reach.”

To attempt this feat, researchers with the international Virtual Embodiment and Robotic Re-embodiment project used fMRI to scan the brain of university student Tirosh Shapira as he imagined moving different parts of his body. He attempted to direct a virtual avatar by thinking of moving his left or right hand or his legs.

The scanner works by measuring changes in blood flow to the brain’s primary motor cortex. Using these signals, the team created an algorithm that could distinguish each imagined movement (see diagram). The commands were then sent via an internet connection to a small robot at the Béziers Technology Institute in France.

The set-up allowed Shapira to control the robot in near real time with his thoughts, while a camera on the robot’s head allowed him to see from the robot’s perspective. When he thought of moving his left or right hand, the robot moved 30 degrees to the left or right. Imagining moving his legs made the robot walk forward.
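The control scheme described above is, in effect, a small lookup from decoded motor-imagery labels to robot actions. The sketch below illustrates that mapping; the label names, angle values, and function interface are illustrative assumptions, not the team's actual software.

```python
def label_to_command(label):
    """Map a decoded thought-of-movement label to a robot action.

    Hypothetical interface: returns an (action, value) pair, where
    'turn' carries degrees (negative = left) and 'walk' a step flag.
    """
    mapping = {
        "left_hand": ("turn", -30),   # imagined left hand -> turn 30 degrees left
        "right_hand": ("turn", 30),   # imagined right hand -> turn 30 degrees right
        "legs": ("walk", 1),          # imagined leg movement -> walk forward
    }
    # An unrecognized or ambiguous decoding produces no movement.
    return mapping.get(label, ("idle", 0))
```

In a closed loop like the one Shapira used, a classifier would emit one such label per scan interval and the resulting command would be streamed to the robot, while the head camera feed closes the loop back to the operator.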

To test the extent of his feelings of embodiment, the researchers also surprised him with a mirror (see “On the inside, looking out”). “I really felt like I was there,” Shapira says. “At one point the connection failed. One of the researchers picked the robot up to see what the problem was and I was like, ‘Oi, put me down!’”

The brain is very easily fooled into incorporating an external entity as its own. Over a decade ago, psychologists discovered that they could convince people that a rubber hand was their own just by putting it on a table in front of them and stroking it in the same way as their real hand. “We’re looking at what kinds of sensory illusions we can incorporate at the next stage to increase this sense of embodiment,” says Kheddar. One such illusion might involve stimulating muscles to create the sensation of movement (see “Feeling is believing”).

The researchers are also fine-tuning their algorithm to look for patterns of brain activity, rather than simply areas that are active. This will allow each thought process to control a greater range of movements. “For example, you could think of moving your fingers at different speeds and we could correspond that with different speeds of walking or turning,” says Cohen, who presented the results of the embodiment trials at BioRob 2012 in Rome, Italy, last week.
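The shift described here, from asking "which area is active?" to asking "what is the pattern of activity?", is the difference between thresholding one region and comparing a whole activity vector against learned templates. A minimal sketch of the pattern-based idea, assuming made-up activity vectors and class names (this is not the team's algorithm):

```python
def nearest_centroid(sample, centroids):
    """Classify an activity pattern by its closest mean pattern.

    'sample' is a vector of voxel activations; 'centroids' maps each
    imagined-movement class to its average training pattern.
    """
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda c: sq_dist(sample, centroids[c]))

# Fabricated example patterns: two speeds of imagined finger movement
# that could, in principle, drive two walking speeds.
centroids = {
    "slow_fingers": [0.2, 0.8, 0.1],
    "fast_fingers": [0.9, 0.3, 0.4],
}
```

Because the decision uses the whole vector rather than a single active region, one body part can yield several distinguishable commands, which is what would let finger speed map onto walking or turning speed.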

So far, only healthy people have embodied the surrogate. Next, the researchers hope to collaborate with groups such as Adrian Owen’s at the University of Western Ontario in Canada to test their surrogate on people who are paralyzed or locked in.

“I think it is very impressive and in the broadest sense reflects where it is that we are trying to get to in enabling communication in patients who are deemed to be locked in or even vegetative,” says Owen. He cautions, though, that there is a long way to go before the technology is able to provide long-term help for patients.

Electroencephalogram (EEG) technology, which uses electrodes attached to the scalp to record electrical activity in the brain, is likely to prove more practical than fMRI, he says, since it is cheaper and more comfortable to use for extended periods of time. Although EEG has been used to control robots, the readings are not yet as clear as those from fMRI. Nevertheless, this demonstration is “an interesting glimpse of what might be possible in the future,” Owen says.
