
Robot See, Robot Do

Aaron Saenz
May 22, 2009
[Image: Hawk robot waving its hand]

You don't need to go through years of school and computer science classes to learn how to program robots. With the new Hawk robot from Dr. Robot, programming is as easy as playing with an action figure. The Hawk has two long arms with a full range of motion and looks something like an orange robot butler. Like some bizarre puppetry act, you program the large robot by moving the limbs of a smaller Hawk bot (I call him Mini-Jeeves). The larger Hawk mimics the motions and records them for playback later. It's the simplest robot programming technique I've seen.
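The record-and-playback idea behind that puppetry trick is simple enough to sketch in a few lines of Python. The snippet below is only an illustration, not Dr. Robot's actual software: mini_bot.read_joint_angles() and hawk.set_joint_angles() are hypothetical calls standing in for whatever interface the real controller exposes, and the sampling rate is an arbitrary choice.

```python
import time

# Sketch of "puppet" programming by demonstration, assuming a hypothetical
# interface: mini_bot.read_joint_angles() returns the joint angles of the
# hand-posed miniature, and hawk.set_joint_angles() drives the full-size
# robot's arms to matching positions. These are not published Dr. Robot APIs.

SAMPLE_HZ = 20  # how often the miniature's pose is sampled while recording


def record(mini_bot, duration_s):
    """Sample the miniature's joint angles for duration_s seconds."""
    frames = []
    start = time.time()
    while time.time() - start < duration_s:
        frames.append((time.time() - start, mini_bot.read_joint_angles()))
        time.sleep(1.0 / SAMPLE_HZ)
    return frames


def play_back(hawk, frames):
    """Replay a recorded sequence of timestamped joint angles on the big robot."""
    start = time.time()
    for timestamp, angles in frames:
        # Wait until the frame's original timestamp before commanding the pose.
        delay = timestamp - (time.time() - start)
        if delay > 0:
            time.sleep(delay)
        hawk.set_joint_angles(angles)
```

In practice the same loop could drive the full-size robot live while recording, which is presumably how the Hawk mirrors the miniature in real time.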

The real-life Dr. Robot, Dr. Haipeng Xie, has been providing us with a constant stream of interesting and practical robots from his Ontario-based company. Hawk, though, is the first I know of whose interface is itself rather revolutionary. By translating movement into programming, he's managed to take robotics in a very user-friendly direction. The Hawk robot can be a kinesthetic learner, a trait shared by most toddlers (just ask elementary school teachers). Check out the demonstration video straight from the Dr. Robot website after the break:


The Hawk bot is designed to be something of an educational tool or promotional bot. It can be programmed to serve water, act like a butler, or hand out fliers. With two built-in cameras, though, it may also serve as a casual patrolman and security robot. In fact, if you can think of an action that a two-armed, human-like robot can perform, chances are there's a video of Hawk doing it. I think my favorite is watching it try to play the drums (is that the Zelda theme I hear in the background?):

I'm not sure if Dr. Robot will keep progressing toward even more intuitive interfaces, but someone will. The next step would be using cameras to recognize human position and movement and mimic them as well. Maybe some of the image recognition software from ACE could be ported over? The model manipulation is a good first step, but one day it would be great to just have the robot watch you and learn what it needs to do. Hopefully it won't pick up any of our bad habits in the process.
