Yesterday a partially paralyzed man directed a robot by thought alone. From a hospital 100 kilometers (60 miles) away, his brain signals were transmitted to a lab in Lausanne.
In a stunning demonstration, Mark-Andre Duc steered a foot-tall robot just by thinking about lifting his fingers, a movement he can no longer perform in real life after a fall paralyzed his legs and fingers. A simple head cap picked up the electrical signals, and a laptop converted these EEG readings into instructions which it transmitted to the remote robot.
Some readers might remember that in 2009 Mattel claimed to have a toy called MindFlex which could do the same. Well, not steer a robot, but by sheer willpower you could maneuver a little blue ball. At least that was the claim, and the tech press was all over it at the time. You can still buy it, but the German magazine Der Spiegel ran an experiment showing the effect was nearly random, so it's not exactly comparable to this very precise robot steering.
There has been earlier scientific research into mind control of devices, and it has been shown to work, but so far it has involved either able-bodied subjects or invasive brain implants. In all cases the experiments were carried out in a lab environment with reasonable, but not perfect, success rates.
This breakthrough experiment was done over a large distance, with Mark-Andre Duc in a hospital, and remote device control by thought like this has long been the holy grail of the field. Duc says controlling the robot wasn't hard on a good day, "but when I'm in pain it becomes more difficult."
Swiss professor José del R. Millán explains that background noise caused by pain, or even a wandering mind, has emerged as a major challenge in research on so-called brain-computer interfaces since they were first tested on humans more than a decade ago.
Concentration is key for this kind of thought control, so experiments are usually done with healthy subjects in a perfect laboratory setting to avoid all distractions. Trying it from a remote hospital, without direct visual feedback on the robot and while suffering incidental pain, is the hardest case. The moment you're distracted, the brain stops sending clear signals and the task is disrupted.
To solve this problem, Millán's team has worked hard on filtering out all disruptions. In brain-computer interfacing (BCI) even blinking your eyes disturbs the signal, so a main challenge is to get rid of all the natural signals the brain emits. So far researchers have struggled to filter brain activity precisely enough to focus solely on the signal used to control a device.
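To give a feel for what "filtering out disruptions" means in practice, here is a minimal sketch of one classic trick, amplitude-threshold artifact rejection: EEG segments containing a blink-sized voltage spike are simply discarded before any control signal is decoded. This is our own illustration with a made-up threshold, not the EPFL team's actual pipeline.

```python
# Sketch of amplitude-threshold artifact rejection (illustrative only,
# not the actual EPFL method). Eye blinks produce voltage spikes far
# larger than ordinary EEG activity, so epochs whose peak amplitude
# exceeds a cutoff are dropped before classification.

BLINK_THRESHOLD_UV = 100.0  # hypothetical cutoff in microvolts

def reject_artifacts(epochs, threshold=BLINK_THRESHOLD_UV):
    """Keep only epochs whose absolute peak stays below the threshold."""
    return [ep for ep in epochs if max(abs(s) for s in ep) < threshold]

# The second epoch contains a blink-like 150 µV spike and is removed.
clean = reject_artifacts([[3.0, -5.2, 4.1], [8.0, 150.0, -20.0]])
# -> [[3.0, -5.2, 4.1]]
```

Real systems do much more than this (band-pass filtering, spatial filters, statistical classifiers), but the principle is the same: throw away or suppress everything that isn't the intended control signal.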
The human mind is extremely good at multitasking, so it constantly switches attention. One of the methods the Swiss team used was to have the robot stick to the last instruction given: a few milliseconds of distraction simply meant it kept following the last command. The little round robot only changed course when Mark-Andre made a new left or right decision.
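The "stick to the last instruction" idea can be sketched as a tiny control loop, where `None` stands for a moment of distraction in which the decoder produces no confident command. This is our own illustration of the principle described above, not the team's actual controller.

```python
# Sketch of a hold-last-command control loop (illustrative only).
# When the brain-signal decoder yields no confident command (None),
# the robot keeps executing the previous one instead of stopping.

def drive(decoded_commands, default="forward"):
    """Yield the command the robot actually follows at each tick."""
    last = default
    for cmd in decoded_commands:
        if cmd is not None:
            last = cmd  # a clear signal overrides the held command
        yield last      # otherwise the last command is repeated

path = list(drive(["left", None, None, "right", None]))
# -> ["left", "left", "left", "right", "right"]
```

The design choice is what makes the system forgiving: brief lapses in concentration no longer derail the robot, because only a new, clearly decoded decision changes its course.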
Team leader Millán states that just a few days of training is sufficient to steer a robot through several rooms and that the results show "mental control was only marginally worse than manual control on the same task." The goal is to turn these small video-equipped robots into the hands, or rather the minds, of persons with impaired movement so that they can go to places they can't reach by wheelchair.
Are we the only ones who immediately want one just to satisfy our lust for gadgets?
Swiss research: http://www.escif.org/files/documents/members_downloads/Congress/escif10_millan.pdf
Mattel's toy: http://www.spiegel.de/wissenschaft/technik/0,1518,761169,00.html
Author: +Max Huijgen