Hanson Discusses Robots That Show Emotion in TED Video

David Hanson (right) wants to create robots that can understand and display emotions.

Are the robots of the future going to be cold-hearted automatons or loving, empathetic androids? David Hanson, founder and head of Hanson Robotics, is working to make sure that robots know how humans are feeling, and to teach them to mimic those emotions. Over the past eight years, Hanson has created more than 20 lifelike synthetic faces, disturbingly real replicants that seem to talk and respond as if they were human. Check out the robotics genius’s awesome facial hair in his quick five-minute presentation video from TED 2009 below.

I often debate with people about whether or not computers and robots will ever reach or exceed human intelligence. Many believe that the human mind and spirit are simply too complex and beautiful to be replicated in a machine. Our emotional intelligence, they say, is beyond the reach of any artificial intelligence. I think Hanson’s presentation points to the possibility that robots will in fact be able to achieve some level of emotional intelligence. His robotic ‘characters’ can follow human faces and mimic their expressions. Working with the Machine Perception Lab at UC San Diego, Hanson plans to create robots that can correlate key movements of your face with emotional states. These are the first steps toward emotional acuity, the same steps human infants demonstrate every day. Along with projects like iCub, Hanson’s work could help build robots that learn like children. Hopefully they will develop into robots that know how to care for others.
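The talk doesn’t get into the software behind this, but the basic loop Hanson describes, find a face, track it, and map its expression onto the robot’s own, is easy to picture in code. The sketch below is a toy illustration only, built on OpenCV’s stock Haar cascades; the smile heuristic, the `mirror_expression` function, and the servo-style output are hypothetical stand-ins, not Hanson Robotics’ or UC San Diego’s actual pipeline.

```python
# Toy sketch of a "follow a face, mirror its expression" loop.
# NOT Hanson Robotics' software; the expression heuristic and the
# actuator mapping are hypothetical placeholders.
import cv2

# Haar cascade models that ship with the opencv-python package.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
smile_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_smile.xml")


def mirror_expression(frame):
    """Return a (hypothetical) command mirroring the dominant face's expression."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    if len(faces) == 0:
        return {"expression": "neutral", "gaze": None}

    # "Following a face" here just means tracking the largest face in view.
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])
    roi = gray[y:y + h, x:x + w]
    smiling = len(smile_cascade.detectMultiScale(
        roi, scaleFactor=1.7, minNeighbors=20)) > 0

    # A real android would drive dozens of facial motors; this is a stand-in.
    return {
        "expression": "smile" if smiling else "neutral",
        "gaze": (x + w // 2, y + h // 2),  # point the eyes at the face's center
    }


if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # default webcam
    ok, frame = cap.read()
    if ok:
        print(mirror_expression(frame))
    cap.release()
```

Even this crude version makes the point of the talk concrete: perceiving an expression and reproducing it are separate problems, and the hard part Hanson is chasing is tying the two together convincingly in real time.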

We mentioned Hanson’s TED talk, and the appearance of the Einstein replicant, in our first story about the robotics company, so it’s great to finally be able to watch the video. However, I’m really disappointed they only gave him five minutes of stage time. You can tell from Hanson’s fast-paced explanations that he had a lot more to say. I certainly have more questions: Have the algorithms used for the Philip K. Dick android been improved in the last four years? Do any of the replicants make their own choices about which facial expressions to use when, or is everything tightly scripted? How much wax does it take to get that ‘stache to stay so pointy?

Of course, the great thing about Hanson Robotics is its dedication to this work, which means we’re likely to see many more emotionally advanced ‘character’ robots. It will be interesting to see if Hanson, or any robotics engineer, can successfully merge learning machines with expressive animatronics. After all, we don’t just want robots that have emotional intelligence; we also want robots with whom we can identify. Singularity Hub has explored many different kinds of human-computer interfaces, but it may be that the most successful interplay between machines and people will also be the most basic: face-to-face conversation.

[photo credit: Hanson Robotics]
