Body Suit Controls Robot With Gestures (Video)

[fuRo WIND device — the white boxes are position sensors that allow the WIND system to turn your gestures into robotic commands.]

Of all the ways you could command a robot, turning your body into a game controller sounds like the most fun. The Future Robotics Technology Center (fuRo), part of the Chiba Institute of Technology, has developed an upper-body suit that does just that. The Wireless Intelligent Networked Device (WIND) uses several small sensors, each with 3D positioning, to translate user motion into robotic commands. WIND communicates with a robot via Bluetooth, eliminating the need for a direct wired connection. All sensor information is processed by a System in Package (SiP) core, which consolidates a PC’s worth of robot-command capability into a single chip. The fuRo system uses gestures, not one-to-one motion capture, to dictate commands. In other words, the user doesn’t raise a hand when she wants the robot to raise a hand; she raises a hand when she wants the robot to dance. It’s a very cool-looking control scheme in action. Check out the videos below to see for yourself.
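Because WIND maps poses to symbolic gestures rather than mirroring them, the control loop amounts to classifying a snapshot of sensor positions into a gesture label, then looking up a whole robot behavior for that label. Here is a minimal sketch of that idea in Python; the sensor names, thresholds, gesture labels, and command set are illustrative assumptions of mine, not fuRo's actual protocol:

```python
# Hypothetical sketch of WIND-style gesture-to-command mapping.
# Sensor names, thresholds, and commands are illustrative, not fuRo's design.
from dataclasses import dataclass

@dataclass
class SensorReading:
    """3D position of one body-worn sensor, in meters."""
    sensor_id: str
    x: float
    y: float
    z: float  # height above the floor

def classify_gesture(readings: dict) -> str:
    """Reduce a snapshot of sensor positions to a discrete gesture label.

    The key point from the article: the output is a symbolic gesture,
    not a joint-by-joint copy of the user's pose.
    """
    right = readings["right_wrist"]
    left = readings["left_wrist"]
    head = readings["head"]
    if right.z > head.z and left.z > head.z:
        return "both_hands_up"
    if right.z > head.z:
        return "right_hand_up"
    return "neutral"

# One gesture triggers one whole robot behavior, not a mirrored pose:
# raising both hands tells the robot to dance, not to raise its hands.
GESTURE_TO_COMMAND = {
    "both_hands_up": "DANCE",
    "right_hand_up": "WAVE",
    "neutral": "IDLE",
}

def command_for(readings: dict) -> str:
    """Full pipeline: sensor snapshot -> gesture label -> robot command."""
    return GESTURE_TO_COMMAND[classify_gesture(readings)]
```

In a real system the command string would then be sent to the robot over Bluetooth; the point of the sketch is only the indirection between pose and behavior.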

Gesture controls are really hitting it big this year. We’ve seen gestures replace TV remotes – why press a button when waving your hand works just as well? The Acceleglove is a wearable device that translates hand movement into commands for a robot or records American Sign Language (ASL). Yet WIND is the first such system I’ve seen that uses the entire upper body for gesture control, which greatly expands the possible vocabulary of movements. It also looks like a lot of fun, judging by the users in the video.

Plasticpals provided the following video, which shows a live demonstration of WIND and fuRo’s Morph 3 robot. Skip ahead to 1:05 to see how elaborate physical movements on the user’s part produce equally elaborate movement by the robot. However, the controller’s and robot’s movements are often quite different. Again, this isn’t one-to-one motion control.

Which is, frankly, a little disappointing to me. If I shrug my shoulders, I want my robot to shrug its shoulders, not strike a funny pose. Gesture controls are cool when it comes to computers and TVs, yet when there’s a humanoid shape in front of me, I want it to mimic my human shape. That’s just a visceral reaction on my part, but I think it’s one most other robot enthusiasts would share.

Now, I don’t want that criticism to detract from what fuRo has accomplished, because they’ve done something great here. SiP technology could really change the scale of robotics, letting more control happen in less space. And WIND is a remarkable way to translate body movement into commands, and it could lead to some really cool innovations in human-computer interfaces. Getting the entire upper body into the command environment could free up one’s hands for separate tasks, or even allow body posture to signal for an automated response. Slouching? You could have your robot yell at you to straighten up. Have you begun to contort and grab your left arm? Your robot could call for aid because you’re having a heart attack.

It will be interesting to see where fuRo takes the WIND concept. Cyberdyne’s nerve signal sensing for its exoskeleton and Microsoft’s Project Natal highlight how full body motion capture could change robotics and gaming. WIND will have to find a niche outside of those arenas. Even if full body gestures don’t catch on, I suspect that the technology that enables WIND will find good use in other areas. Multiple position sensor signals routed through a central processor – sounds like the basis for the craziest Wii accessory ever.

[image credit: fuRo]