Anticipating Your Needs: Emotional Intelligence Is the Key to Smarter Gadgets

It’s weekend rush hour. You’re stuck in traffic. You got cut off. You’re increasingly frustrated. Your heart rate and blood pressure skyrocket.

As you’re fuming in traffic, an inconspicuous device on your wrist silently tracks your vitals and sends them wirelessly to the smart appliances in your home.

An algorithm detects you’re angry. The thermostat slightly lowers the room temperature and starts a light breeze. The fridge shifts a cold brew to the front within easy reach. The TV displays that Netflix show you’ve been binging on.

You storm through the door, grab the beer, collapse on the couch and immediately feel better.

Welcome to the future of the Internet of Things, where machines understand what you want and how you feel.

The Clippy Fiasco

Why would we want machines to have emotional intelligence?

According to Dr. Rosalind Picard, director of the Affective Computing Group at the MIT Media Lab, it’s because emotions are an integral part of human-computer interaction.

Studies repeatedly show that humans tend to treat computers as if they were people, even when the computer doesn’t have a face or lifelike body, Picard explains in a seminal IEEE paper. In a survey called “Rage Against the Machine,” over 70% of 1,250 UK participants admitted to swearing at their computers, despite being fully aware that the machines are inanimate.

But our current “smart” machines are only cognitively smart: they learn, memorize and interpolate. When it comes to emotions, they’re completely blind.

Virtual assistants are an obvious example. Maybe you remember Clippy, the universally despised and ill-fated Microsoft Office assistant with a perpetually sanguine cyberpersonality. Clippy had the annoying tendency to pop up and offer unsolicited advice without knowing the writer’s intent. Instead of helping, the smiling, dancing cartoon often frustrated users even more.

It’s not that Clippy isn’t intelligent, wrote Picard. It’s a genius about Office, but an idiot about people. It can’t tell that you are frustrated, ignores any signs of increasing annoyance and is stupid about displaying emotions.

Clippy may seem like a dated example, but the problem it represents is far from obsolete. Computers are experts at analyzing our social communications and entertainment preferences (Netflix recommendations, anyone?). Yet even the most sophisticated system can’t tell if it has upset an important customer or lost a long-time user.

“As technology is increasingly applied to situations where it must interact with everyone,” writes Picard, “it is all the more vital that it does so in a way that is courteous and respectful of people’s feelings.”

You might be thinking: boo-hoo, who cares about overly sensitive people? Picard, along with others in the affective computing field, respectfully disagrees.

It’s not just about making human-machine interaction easier, stresses Picard. Having emotional intelligence means that our non-feeling machines will be able to dissect our inner states simply by observation. Think Vulcan.

Imagine the possibilities, said Dr. Andrew Moore, dean of the Carnegie Mellon School of Computer Science. Virtual psychologists may help diagnose depression by analyzing facial expressions. Software may inform teachers whether students are engaged and fine-tune the lesson plan accordingly. Marketers may better assess interest in ads and products. GPS systems may clarify directions if they sense the driver is confused. Fitness trackers could know when to drill you to work harder and when to ease off. Health monitors may inform doctors of a patient’s discomfort.

And yes, the Amazon Echo in your living room may offer suggestions and play music that cater to your mood.

“We’ve been working a long time on such capabilities and it seems that we are on the brink of some major breakthroughs,” wrote Moore. “I anticipate that 2016 will be a watershed year and…emotion will become a powerful new channel for interacting with our machines.”

Emotion Analytics

Unlike objective datasets, human emotions are devilishly finicky things.

Even cognitive psychologists haven’t figured out a great way to classify emotions, laughed Picard. Boredom, for example, isn’t considered a “basic emotion,” but we all know how integral it is to our lives.

So far, research in the area has focused on quantifying the telltale signs we display — consciously or subconsciously — in different emotional states.

Gait, posture, heart rate, skin conductance and tone of voice are all windows into a person’s inner emotional landscape. Scientists have begun tapping this wealth of information. Beyond Verbal, a startup based in Israel, analyzes people’s vocal inflections to determine their emotional states. Microsoft’s Kinect tracks its players’ heartbeats and physical movements to better understand how they’re feeling as they play.
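To picture how signals like heart rate and skin conductance might map to an emotional state, here is a minimal rule-based sketch. The thresholds, signal names, and labels are all illustrative assumptions; real affective-computing systems learn these mappings from data rather than hand-coding them:

```python
# Toy rule-based sketch of inferring a coarse emotional state from
# physiological signals. Thresholds and labels are illustrative only;
# real systems train models on labeled sensor data.

def infer_state(heart_rate_bpm, skin_conductance_us, voice_pitch_hz):
    """Map raw signals to a coarse arousal-based label."""
    arousal = 0
    if heart_rate_bpm > 100:
        arousal += 1
    if skin_conductance_us > 8.0:   # elevated sweat-gland activity
        arousal += 1
    if voice_pitch_hz > 220:        # raised vocal pitch
        arousal += 1
    if arousal >= 2:
        return "agitated"
    elif arousal == 1:
        return "alert"
    return "calm"

print(infer_state(115, 9.5, 240))  # high on all three signals
print(infer_state(70, 4.0, 180))   # baseline readings
```

Even this crude version hints at the core difficulty the researchers describe: several distinct emotions (anger, excitement, fear) can produce the same high-arousal readings.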

Yet by far the most fruitful systems analyze facial expressions.

Only recently has the task become even remotely possible, explained Moore. For one, we have better cameras that tease out minute changes in our facial muscles. For another, we now have large facial expression video datasets and powerful machine learning algorithms to help us out.

At Carnegie Mellon University, researchers have developed IntraFace, a small piece of software that tracks facial features and analyzes expressions. The software is so efficient it runs smoothly on a smartphone.
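The recipe Moore describes, labeled expression data plus a learning algorithm, can be sketched in miniature. The feature vectors and labels below are synthetic stand-ins for the landmark measurements a tool like IntraFace extracts (e.g., mouth-corner lift, brow height); this is a toy nearest-centroid classifier, not the actual method:

```python
# Minimal nearest-centroid classifier over toy "facial feature" vectors.
# Real systems train on thousands of labeled video frames with far
# richer features and models; this only illustrates the shape of the task.
import math

train = {
    "smile":   [[0.9, 0.1], [0.8, 0.2], [0.95, 0.15]],
    "frown":   [[0.1, 0.9], [0.2, 0.8], [0.15, 0.85]],
    "neutral": [[0.5, 0.5], [0.45, 0.55], [0.55, 0.45]],
}

# "Training" = averaging each class's feature vectors into a centroid.
centroids = {
    label: [sum(col) / len(vecs) for col in zip(*vecs)]
    for label, vecs in train.items()
}

def classify(features):
    """Return the expression label whose centroid is closest."""
    return min(centroids, key=lambda lbl: math.dist(features, centroids[lbl]))

print(classify([0.85, 0.2]))  # lands near the "smile" cluster
```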

Other groups are combining multiple signals to analyze emotions with higher accuracy. Dr. Jeff Cohn at the University of Pittsburgh, for example, uses facial expressions and tone of voice to determine whether anti-depression treatments are working in depressed patients. Aldebaran’s Pepper, an “emotionally smart” robotic sales clerk, jokes with its customers through integrated HD cameras and microphones and gauges their reactions as it tries to make a sale.
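One common way to combine modalities, conceptually similar to pairing facial expressions with tone of voice, is "late fusion": each channel produces its own estimate, and the estimates are merged. The weights and threshold below are hypothetical, not taken from Cohn's work:

```python
# Toy late-fusion sketch: each modality independently estimates the
# probability that the subject is distressed, and a weighted average
# combines them. Weights and threshold are illustrative assumptions.

def fuse(face_score, voice_score, face_weight=0.6):
    """Combine per-modality distress probabilities (0..1) into one score."""
    return face_weight * face_score + (1 - face_weight) * voice_score

combined = fuse(face_score=0.8, voice_score=0.5)
print(round(combined, 2))  # 0.6*0.8 + 0.4*0.5 = 0.68
print("flag for clinician" if combined > 0.6 else "no action")
```

The appeal of fusion is robustness: a flat voice alone is ambiguous, but a flat voice combined with a downcast expression is far more informative.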

Then there’s the tantalizing possibility that machine learning could help us uncover better ways to quantify and analyze emotions using markers we aren’t even aware of.

One group is working to detect fleeting changes in facial expressions that often reveal our deepest emotion, even if we try to hide it. Last year, they reported their algorithm outperformed humans in spotting microexpressions, and they hope to use their new technology in lie detection.
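One way to picture what "spotting a microexpression" means computationally: scan a per-frame expression score for brief spikes that revert to baseline within a few frames. The function below is purely illustrative; real systems operate on high-frame-rate video with learned features, not a single hand-tuned score:

```python
# Illustrative microexpression spotter: flag frames where an expression
# score jumps sharply and falls back within a short window (a "fleeting"
# change), as opposed to a sustained expression. Parameters are toy values.

def find_micro_spikes(scores, jump=0.4, window=3):
    """Return frame indices where the score jumps by `jump` over the
    previous frame and reverts near baseline within `window` frames."""
    hits = []
    for i in range(1, len(scores) - window):
        if scores[i] - scores[i - 1] >= jump:
            # Did it revert close to the pre-spike baseline?
            if any(abs(scores[i + k] - scores[i - 1]) < 0.1
                   for k in range(1, window + 1)):
                hits.append(i)
    return hits

frames = [0.1, 0.1, 0.6, 0.12, 0.1, 0.1, 0.1]
print(find_micro_spikes(frames))  # [2]: a brief spike at frame 2
```

A sustained smile would trigger the jump test but fail the reversion test, which is exactly the distinction between an ordinary expression and a microexpression.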

As it stands, there’s a lot of room for error. Some emotions — like boredom or loneliness — are hard to pick up based on physiological signs alone. Others may share similar symptoms and be hard to tease apart.

Of course, the slew of ethical and privacy concerns is mind-boggling.

People feel queasy having their emotions uncovered and stored somewhere in a database, explains Picard. And for good reason: this data is incredibly private and revealing.

The data could also fuel discrimination. For example, could an employer reprimand a worker if emotion profiling shows constant daydreaming on the job? Could insurance companies demand higher fees if a customer’s “mood data” suggests illness or depression?

Our Internet of Things can’t understand us emotionally just yet. But the time to start talking about it is now.

Image Credit: Shutterstock.com

Shelly Fan

Shelly Xuelai Fan is a neuroscientist at the University of California, San Francisco, where she studies ways to make old brains young again. In addition to research, she's also an avid science writer with an insatiable obsession with biotech, AI and all things neuro. She spends her spare time kayaking, bike camping and getting lost in the woods.

Discussion — 3 Responses

  • DSM January 17, 2016 on 3:47 pm

    I wonder how long it will be before a facial expression system saves a person’s life by detecting a sudden change in the symmetry of their facial muscle control, which is an indicator that they have suffered a stroke? There is not a lot we can do long after the fact, but if we can have systems that monitor people for emotions they could also summon appropriate medical treatment in time for it to make a real difference and thereby reduce death or disability rates.

    Emotional monitoring could also help in other areas where behavioural changes are indicative of potentially dangerous pathological cognitive conditions, such as depression or psychosis.

  • almostvoid January 19, 2016 on 12:21 am

    AIs will never comprehend humans. Esp by facial expressions. Schadenfreude is a good one. It’ll never figure that one. As for the example of cars-traffic that is so last century. Future cities will be built for people first. Period. We need all the natural land we can spare to spare. A dead eco system will be the end of us all. And in that scenario cars and their users are a menace. Back to the present: since humans have such unstable minds meaning our expressions change from neutral to visceral rage such as going homicidal which I believe Amerikans are good at will AI thwart homicidal tendencies? Now that would be a plus.

  • Rebekah Yasmine January 22, 2016 on 10:19 pm

    I understand that tech people love to play with and develop and tweak their machines and then charge us a lot of money for their gadgets, but we don’t NEED machines to understand people, we need PEOPLE to understand people. And they used to be able to. Now it seems like they’re on their gadgets so much that they don’t talk to real people anymore and are losing their social skills. So gadgets have made people lose their social skills so we have to design gadgets that have social skills to make up for the social skills we’ve lost? It’s dystopian. Tech should be designed to be tools that are actually useful instead of trying to engineer people so they have to adapt to whatever gadget is being pushed on the market. Is technology trying to make people obsolete? Like I can’t grab whatever I want to drink from the fridge myself and turn on whatever I want to watch myself. Why would I need a clairvoyant robot butler? As a teacher I’m perfectly capable of seeing when a student isn’t engaged and fine-tuning the lesson accordingly. And my relationship with the student helps motivate the student. The tech industry needs new priorities.