“People just behave like people, even when interacting with a robot.” — Cynthia Breazeal 2010
Forget the automated factories, the Mars rovers, and the vacuum cleaners: Cynthia Breazeal wants you to know that robots should actually be social machines. Head of the Personal Robotics Group at the MIT Media Lab, Breazeal is a world-renowned expert in robotics. In the late 1990s, she and her team built Kismet, one of the first advanced robots ever made specifically to interact with humans on a social level. More than a decade later, Breazeal is still pushing the boundaries of where machines and people come together. In her latest talk at TED Women in Washington DC this past year, Breazeal gave a 14-minute overview of some of the incredible projects coming out of the Personal Robotics Group. Don’t miss the video of her talk below, followed by two more video examples of her work. Breazeal’s exploration of how robots and humans can bond socially heralds the arrival of the age of personal robotics.
Cynthia Breazeal’s team at MIT has its hands in many different projects, but they all seem to share a common understanding – robots are really about people. From the first days of Kismet (1:20) to the more advanced emotional awareness of Leonardo (2:10), the Personal Robotics Group has designed and constructed some of the most socially capable machines on the planet. People will talk to these robots, listen to their advice, and even miss them when they are gone. At 4:15 in the talk below, Breazeal discusses how psychologists are using robots to help them understand the nuances of human behavior. With body language, a key advantage bots have over computers, social machines can augment digital communication (5:18), or form the personal connections needed to help people change their habits (8:30). In Breazeal’s vision, robots may even come to have a valuable role in educating our youngest children (11:15). In this field of robotics, the greatest successes aren’t measured in speed, strength, or coordination, but rather in the trust and connections these machines engender among the humans who interact with them.
Huggable, a robotic teddy bear from the PRG, is able to track objects and interact with humans via voice. In the video below, you can see it encourage a user to color a picture. Brief shots of the computer interface showcase its tracking skills and its body awareness.
Graduate student Ryan Wistort’s project at the PRG was TOFU, a robot that used tried-and-true animatronics techniques to create a stretchable, squishable character with a lot of personality.
Here’s Tofulandia, a ‘mixed reality’ system that allows for interactions between fully digital and fully physical (robotic) versions of Tofu.
Clearly there are going to be uses for robots that don’t require them to have a high level of social skills. When you’re assembling car parts 24/7, you don’t need a lovable face or a cuddly fur exterior. Yet I think Breazeal’s work highlights the vast potential that robots could have when aimed towards social interactions. With driven research, we could have machines that closely mimic our expressions and appearance, able to act as stand-ins for humans. There are teams all over the world, like MIT’s Personal Robotics Group or Kokoro in Japan, working to make these kinds of replicants. Who knows, with enough advances in artificial intelligence, maybe they wouldn’t simply look like us; maybe they would think like us as well. We’re still many years (decades?) away from having real-world versions of C-3PO and R2-D2 in our homes, but the concept of a robot pal arriving in my lifetime seems more and more likely every day. Thanks, Prof. Breazeal.
[image credits: TED (modified), Personal Robotics Group at MIT Media Lab]
[sources: TED, PRG at MIT]