With Emotion Recognition Algorithms, Computers Know What You’re Thinking


Back when Google was first getting started, there were plenty of skeptics who didn’t think a list of links could ever turn a profit. That was before advertising came along and gave Google a way to pay its bills — and then some, as it turned out. Thanks in part to that fortuitous accident, in today’s Internet market, advertising isn’t just an also-ran with new technologies: Marketers are bending innovation to their needs as startups chase prospective revenue streams.

A handful of companies are developing algorithms that can read the human emotions behind nuanced and fleeting facial expressions to maximize advertising and market research campaigns. Major corporations including Procter & Gamble, PepsiCo, Unilever, Nokia and eBay have already used the services.

Companies building the emotion-detecting algorithms include California-based Emotient, which released its product Facet this summer; Massachusetts-based Affectiva, which will debut its Affdex mobile software development kit in early 2014; and U.K.-based Realeyes, which has been moving into emotion detection since launching in 2006 as a provider of eye-movement user-experience research.

They’ve all developed the ability to identify emotions by taking massive data sets — videos of people reacting to content — and putting them through machine learning systems. (The startups have built on a system for coding facial expressions developed in the 1970s for human analysts to carry out.) Machine learning is the most straightforward approach to artificial intelligence, but because it is largely limited to recognizing patterns in the data it has been trained on, it isn’t likely to give rise to nuanced artificial intelligence on its own.
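The training approach described above can be sketched in miniature. This is purely illustrative: the action-unit vectors, emotion labels, and nearest-centroid classifier are all invented stand-ins — the real systems train far more sophisticated models on millions of labeled video frames.

```python
import math

# Toy training data: each sample pairs a made-up vector of facial
# "action unit" intensities with an emotion label a human coder assigned.
training_data = [
    ((0.9, 0.1, 0.0), "joy"),       # e.g. cheek raise, lip-corner pull
    ((0.8, 0.2, 0.1), "joy"),
    ((0.1, 0.9, 0.1), "sadness"),   # e.g. brow lowering, lip-corner droop
    ((0.0, 0.8, 0.2), "sadness"),
    ((0.1, 0.1, 0.9), "surprise"),  # e.g. brow raise, jaw drop
    ((0.2, 0.0, 0.8), "surprise"),
]

def train(samples):
    """Average the feature vectors for each label into a centroid."""
    sums, counts = {}, {}
    for vec, label in samples:
        acc = sums.setdefault(label, [0.0] * len(vec))
        for i, value in enumerate(vec):
            acc[i] += value
        counts[label] = counts.get(label, 0) + 1
    return {label: tuple(v / counts[label] for v in acc)
            for label, acc in sums.items()}

def classify(centroids, vec):
    """Return the label whose centroid lies nearest the input vector."""
    return min(centroids, key=lambda label: math.dist(vec, centroids[label]))

centroids = train(training_data)
print(classify(centroids, (0.85, 0.15, 0.05)))  # prints "joy"
```

The point of the sketch is the workflow, not the model: human-coded examples go in, and the system learns statistical patterns it can apply to new, unlabeled faces.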

Yet, with the ability to capture, in video freeze-frame, fleeting expressions that are too quick for a human to definitively identify, the algorithms may already be smart enough to provide more information on what people are thinking than has ever before been available.

“The unguarded expressions that flit across our faces aren’t always the ones we want other people to readily identify. We rely to some extent on the transience of those facial expressions,” Ginger McCall, a lawyer and privacy advocate based in Washington, D.C., told the New York Times recently.

Here’s how the systems work. Emotient’s Facet breaks facial expressions down into 44 distinct movements. The company, a University of California San Diego spinoff, says its algorithm can identify joy, sadness, anger, surprise, fear, disgust and contempt. It offers the service as a software development kit, or SDK, for others to build into their apps. For instance, a video game could pick up the pace if the player appears bored.
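That video-game scenario might look something like the following. Everything here is hypothetical — `EmotionDetector`, its `scores()` method, and the emotion names are invented stand-ins, not Emotient’s actual SDK API.

```python
class EmotionDetector:
    """Invented stand-in for an emotion-detection SDK that returns
    per-frame emotion scores. A real SDK would analyze webcam frames;
    this stub just replays canned data."""
    def __init__(self, frames):
        self._frames = iter(frames)

    def scores(self):
        return next(self._frames)

def adjust_pace(current_pace, scores, threshold=0.6):
    """Speed the game up when boredom dominates; ease off on frustration."""
    if scores.get("boredom", 0.0) > threshold:
        return current_pace + 1
    if scores.get("frustration", 0.0) > threshold:
        return max(1, current_pace - 1)
    return current_pace

detector = EmotionDetector([
    {"boredom": 0.8, "frustration": 0.1},   # player looks bored: speed up
    {"boredom": 0.2, "frustration": 0.7},   # now frustrated: slow down
    {"boredom": 0.3, "frustration": 0.2},   # engaged: hold steady
])
pace = 3
for _ in range(3):
    pace = adjust_pace(pace, detector.scores())
print(pace)  # prints 3: up to 4, back to 3, then unchanged
```

The game logic never sees a face — it consumes only the emotion scores the SDK emits each frame, which is what makes this kind of integration attractive to app developers.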

Affdex is a cloud-based software application that subscribers interact with using a dashboard that allows them to generate reports, export data and bookmark videos. It identifies both emotions and the intensity of the user’s attention.

Affdex originated in academic research on autism conducted by founder Rana el Kaliouby. Many people with autism cannot read emotion in facial expressions the way most people can, and el Kaliouby initially sought to use emotion-labeling computers to assist them. Marketers quickly saw the system’s utility for their own work, and el Kaliouby spun it off into a company.

Realeyes relies on similar algorithms, but the company recruits viewers who fall within a particular target audience to vet the content. As a result, it takes 48 hours to get results, whereas the other services operate in real time.

In addition to the advertising arms race to deliver the most “relevant” content to bleary-eyed web surfers, the presence of a webcam on nearly every laptop and mobile device has propelled this use of artificial intelligence. Not all of those webcams provide an image clear enough for the algorithms to decode, but the companies are betting on continued improvements.

We’ve all heard about webcams being hijacked — could this technology be used on unsuspecting users? That depends. It is ostensibly illegal to record an unwitting user via webcam, but that doesn’t make it impossible.

“It would be very difficult for the companies to ensure consent of third parties, especially if the technology becomes mobile — that is, if it is used as part of a mobile application. We’ve already seen instances where similar technology has been used in televisions without consumer consent,” privacy advocate McCall told Singularity Hub.

And if a mounted camera on private property did the recording, emotion detection would likely be legal. Job interviews may have just gotten a lot harder — just be careful not to frown.

Cameron Scott
Cameron received degrees in Comparative Literature from Princeton and Cornell universities. He has worked at Mother Jones, SFGate and IDG News Service and been published in California Lawyer and SF Weekly. He lives, predictably, in SF.