Brain-Controlled Robot: It's Alive!
The concept of “thought control,” using a brain-computer interface to control a robot (or other device), takes another step forward with this research from the University of Washington.
Rajesh Rao, associate professor of computer science and engineering, and his students have demonstrated that an individual can “order” a robot to move to specific locations and pick up specific objects merely by generating the proper brain waves that reflect the individual’s instructions. The results were presented last week at the Current Trends in Brain-Computer Interfacing meeting in Whistler, B.C.
“This is really a proof-of-concept demonstration,” Rao says. “It suggests that one day we might be able to use semi-autonomous robots for such jobs as helping disabled people or performing routine tasks in a person’s home.”
The controlling individual, in this case a graduate student in Rao’s lab, wears a cap dotted with 32 electrodes. The electrodes pick up brain signals from the scalp using a technique called electroencephalography. The person watches the robot’s movements on a computer screen via two cameras, one mounted on the robot and another above it.
Right now, the “thought commands” are limited to a few basic instructions. A person can instruct the robot to move forward, choose one of two available objects, pick it up, and bring it to one of two locations. Preliminary results show 94 percent accuracy in choosing the correct object.
I previously wrote about DARPA’s work on BrainGate, the implanted neural interface, and monkeys controlling robot arms. Both of those projects are precursors to thought-controlled prosthetic arms or to overcoming neurodegenerative disease (or to Doc Ock, for those of you focusing on the scarier military side).
This project differs in that it uses a removable cap that picks up a “dirty” brain signal, opening up the potential for everyday use, including games, industrial applications, and control of surgical tools.
“One of the important things about this demonstration is that we’re using a ‘noisy’ brain signal to control the robot,” Rao says. “The technique for picking up brain signals is non-invasive, but that means we can only obtain brain signals indirectly from sensors on the surface of the head, and not where they are generated deep in the brain. As a result, the user can only generate high-level commands such as indicating which object to pick up or which location to go to, and the robot needs to be autonomous enough to be able to execute such commands.”
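That division of labor, where a noisy EEG classifier emits only a small set of discrete intents and the robot expands each one into an autonomous routine, can be sketched roughly as follows. This is purely an illustrative assumption of how such a system might be organized; the names, intents, and step lists are hypothetical and do not come from the UW software.

```python
# Hypothetical sketch: a brain-computer interface classifies noisy EEG into
# one of a few high-level intents, and the robot autonomously expands each
# intent into low-level steps. All names here are illustrative, not from
# the actual UW system.

from enum import Enum


class Intent(Enum):
    MOVE_FORWARD = "move_forward"
    PICK_OBJECT_A = "pick_object_a"
    PICK_OBJECT_B = "pick_object_b"
    GO_LOCATION_1 = "go_location_1"
    GO_LOCATION_2 = "go_location_2"


# Each high-level intent maps to a pre-programmed sequence of robot steps.
# The user never steers directly; the robot executes these on its own.
ROUTINES = {
    Intent.MOVE_FORWARD: ["step forward"],
    Intent.PICK_OBJECT_A: ["locate object A", "approach", "grasp", "lift"],
    Intent.PICK_OBJECT_B: ["locate object B", "approach", "grasp", "lift"],
    Intent.GO_LOCATION_1: ["plan path to location 1", "navigate", "stop"],
    Intent.GO_LOCATION_2: ["plan path to location 2", "navigate", "stop"],
}


def execute(intent: Intent) -> list:
    """Expand a classified intent into the autonomous steps the robot runs."""
    return ROUTINES[intent]


if __name__ == "__main__":
    # A single classified intent triggers a whole autonomous routine.
    for step in execute(Intent.PICK_OBJECT_A):
        print(step)
```

The key design point mirrored here is that the noisy, low-bandwidth signal only has to distinguish a handful of intents; all the fine motor detail lives in the robot's routines.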
Further progress is, of course, also being made by gamers, as reported here. Several PhDs I spoke with in the VR development field told me that significant progress in their field is now being driven by gamers as much as by basic science researchers. That does not surprise me.