The Real Scoop on Honda’s Brain Controlled Asimo Robot


Many of our readers may have seen the announcement yesterday that Honda has enabled its famous Asimo robot to be controlled by the thoughts of a nearby human.  The story certainly makes a sensational headline, and a few years ago, we admit, it really would have been an amazing story.  But this is 2009, and technology is moving fast, so fast that the announcement from Honda seems like old news.  About the only interesting thing we can see in the announcement is that some really cool images were created that can spur our imagination and whet our appetites for the much better stuff that is in the pipeline.  Let's look at the pictures first, then more discussion:

Above: Closeup shots of the head gear that reads blood flow in the brain and electrical signals
Above: An individual uses thoughts to send four simple commands to a nearby Asimo robot

Honda has demonstrated a person sending four simple commands to the robot simply by thinking. The commands are “lift right hand”, “lift left hand”, “move legs”, and “stick out tongue”. Pretty cool, you say? Not really! Earlier this year we reported on a team at Carnegie Mellon that is able to read substantially more than four thoughts from an individual, with at least the same accuracy Honda is claiming.  The Carnegie Mellon team was not using these extracted thoughts to control a robot or anything else, but they easily could have.  The hard part is extracting thoughts from a person’s head.  Once a thought has been extracted, it is a simple matter to automate an action based on it.
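To see why the automation step is the easy part, here is a minimal sketch: once a brain-machine interface has decoded a thought into one of a handful of labels, sending the corresponding command to the robot is just a lookup. All names below (the labels, the `dispatch` function, the command strings) are hypothetical, not Honda's actual software.

```python
# Hypothetical mapping from decoded thought labels to the four
# commands Honda demonstrated. The hard part is producing the
# label on the left; everything after that is trivial.
COMMAND_MAP = {
    "right_hand": "lift right hand",
    "left_hand": "lift left hand",
    "legs": "move legs",
    "tongue": "stick out tongue",
}

def dispatch(decoded_thought: str) -> str:
    """Translate a decoded thought label into a robot command string."""
    # Ignore anything the decoder was not trained to recognize.
    return COMMAND_MAP.get(decoded_thought, "no-op")

print(dispatch("legs"))     # a recognized thought maps to a command
print(dispatch("whistle"))  # an unrecognized thought is ignored
```

The lookup is deliberately boring: the entire engineering challenge in these demonstrations lives in the signal-processing and classification stage that produces the label, not in acting on it.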

Not only have we already seen better non-invasive thought extraction than the Asimo demonstration, but earlier this year we also saw something far more powerful: invasive, direct access to the brain through implanted electrodes.  In a previous story we saw that a monkey implanted with electrodes was able to control the walking movement of a robot simply by thinking about walking.  The monkey demonstration was far more advanced than the Honda demonstration, allowing a modulated spectrum of commands to be sent, such as slow down, speed up, and stop.  This goes far beyond sending a simple command such as “walk”.

The stories referenced above are just the tip of the iceberg.  All across the world researchers are rapidly breaking down the barrier between our once private thoughts and the outside world.  Of course we are a long way from completely reading someone’s entire mind, but on the flip side reading four simple thoughts from a person’s mind is already old news.

Don’t get us wrong about Honda though!  We are thrilled to see Honda joining the brain interface party with investment, ideas, prototypes, and more.  Honda’s success is no small feat and is a valid contribution to the field.  But we also want readers to have some perspective on where the Honda demonstration fits in with what the rest of the industry is doing.

The field of interfacing with the human brain is exploding.  It is going to be an amazing journey and the rewards will be great.  Honda says it wants to “open the car trunk simply by thinking”.  We say forget the trunk!  How about allowing a paralyzed person to walk again once we can extract their thoughts and re-route them to a pair of robotic legs?  Now that is what we’re talking about!

Singularity Hub chronicles technological progress by highlighting the breakthroughs and issues shaping the future as well as supporting a global community of smart, passionate, action-oriented people who want to change the world.
