Wednesday, April 24, 2024

Machines are getting freakishly good at recognizing human emotions


Until very recently, we’ve had to interact with computers on their own terms. To use them, humans had to learn inputs designed to be understood by the computer, whether that meant typing commands or clicking icons with a mouse. But things are changing. The rise of A.I. voice assistants like Siri and Alexa makes it possible for machines to understand humans as they would ordinarily communicate in the real world. Now researchers are reaching for the next Holy Grail: computers that can understand emotions.

Whether it’s Arnold Schwarzenegger’s T-800 robot in Terminator 2 or Data, the android character in Star Trek: The Next Generation, the inability of machines to understand and properly respond to human emotions has long been a common sci-fi trope. However, real-world research shows that machine learning algorithms are actually getting impressively good at recognizing the bodily cues we use to hint at how we’re feeling inside. And it could lead to a whole new frontier of human-machine interactions.


Don’t get us wrong: Machines aren’t yet as astute as your average human when it comes to recognizing the various ways we express emotions. But they’re getting a whole lot better. In a recent test carried out by researchers at Dublin City University, University College London, the University of Bremen, and Queen’s University Belfast, both people and algorithms were asked to recognize an assortment of emotions from human facial expressions.

The emotions included happiness, sadness, anger, surprise, fear, and disgust. While humans still outperformed machines overall (with an average accuracy of 73%, compared to 49% to 62% depending on the algorithm), the scores racked up by the various systems tested showed how far they have come in this regard. Most impressively, happiness and sadness were two emotions the machines guessed more accurately than humans did, simply by looking at faces. That’s a significant milestone.

Emotions matter

Researchers have long been interested in finding out whether machines can identify emotion from still images or video footage. But it is only relatively recently that a number of startups have sprung up to take this technology mainstream. The recent study tested commercial machine classifiers for facial emotion recognition developed by Affectiva, CrowdEmotion, FaceVideo, Emotient, Microsoft, MorphCast, Neurodatalab, VicarVision, and VisageTechnologies. All of these are leaders in the growing field of affective computing, a.k.a. teaching computers to recognize emotions.

The test was carried out on 938 videos, including both posed and spontaneous emotional displays. With six emotion types, the chance of a correct random guess by an algorithm would be around 17%, or one in six.
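As a rough back-of-the-envelope illustration (not part of the study itself), here is how the reported accuracy figures stack up against that chance level. The only numbers used are the ones quoted above; the comparison script is ours.

```python
# Illustrative sketch: chance level for a six-way emotion guess,
# compared with the accuracy figures quoted in the article.
NUM_EMOTIONS = 6
chance_level = 1 / NUM_EMOTIONS  # ~0.167, i.e. roughly 17%

reported_accuracy = {
    "humans (average)": 0.73,
    "best algorithm": 0.62,
    "weakest algorithm": 0.49,
}

for label, accuracy in reported_accuracy.items():
    # Express each score as a multiple of random guessing.
    print(f"{label}: {accuracy:.0%} (about {accuracy / chance_level:.1f}x chance)")
```

Even the weakest algorithm lands at roughly three times chance, which is why the researchers treat the results as meaningful progress rather than noise.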

Damien Dupré, an Assistant Professor at Dublin City University’s DCU Business School, told Digital Trends that the work is important because it comes at a time when emotion recognition technology is increasingly being relied upon.

“Since machine learning systems are becoming easier to develop, a lot of companies are now providing systems for other companies: mainly marketing and automotive companies,” Dupré said. “Whereas [making] a mistake in emotion recognition for academic research is, most of the time, harmless, stakes are different when implanting an emotion recognition system in a self-driving car, for example. Therefore we wanted to compare the results of different systems.”


The idea of controlling a car using emotion-driven facial recognition sounds, frankly, terrifying, especially if you’re the kind of person prone to emotional outbursts on the road. Fortunately, that’s not exactly how it’s being used. For instance, emotion recognition company Affectiva has explored the use of in-car cameras to identify emotion in drivers. The technology could one day be used to spot things like drowsiness or road rage, which might prompt a semi-autonomous car to take the wheel if a driver is deemed unfit to drive.
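To make the idea concrete, a driver-monitoring system of this kind boils down to thresholding a handful of state scores. The sketch below is purely hypothetical: the score names, thresholds, and decision rule are our own illustrative assumptions, not Affectiva’s actual API or any carmaker’s logic.

```python
# Hypothetical sketch of how a semi-autonomous car might act on
# emotion/state scores from a driver-facing camera.
from dataclasses import dataclass


@dataclass
class DriverState:
    drowsiness: float  # 0.0 (fully alert) to 1.0 (asleep)
    anger: float       # 0.0 (calm) to 1.0 (road rage)


# Illustrative thresholds; real systems would tune these carefully.
DROWSINESS_THRESHOLD = 0.8
ANGER_THRESHOLD = 0.9


def should_assist(state: DriverState) -> bool:
    """Return True if the car should intervene (e.g. take the wheel)."""
    return (state.drowsiness > DROWSINESS_THRESHOLD
            or state.anger > ANGER_THRESHOLD)


print(should_assist(DriverState(drowsiness=0.85, anger=0.2)))  # True
print(should_assist(DriverState(drowsiness=0.10, anger=0.3)))  # False
```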

Researchers at the University of Texas at Austin, meanwhile, have developed technology that curates an “ultra-personal” music playlist that adapts to each user’s changing moods. A paper describing the work, titled “The Right Music at the Right Time: Adaptive Personalized Playlists Based on Sequence Modeling,” was published this month in the journal MIS Quarterly. It describes using emotion analysis that predicts not just which songs will appeal to users based on their mood, but the best order in which to play them, too.
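The paper’s approach relies on sequence modeling, which is beyond the scope of a news piece, but the core idea of matching the next track to a listener’s current mood can be sketched very simply. Everything below, including the track names, the (valence, energy) mood vectors, and the greedy selection rule, is an illustrative stand-in rather than the researchers’ actual model.

```python
# Illustrative sketch: greedily pick the next song whose mood profile
# best matches the listener's current mood. Not the paper's method.
import math


def cosine(a, b):
    """Cosine similarity between two mood vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0


# Hypothetical (valence, energy) profiles per track.
catalog = {
    "Track A": (0.9, 0.8),  # upbeat
    "Track B": (0.2, 0.3),  # mellow
    "Track C": (0.6, 0.5),  # middle ground
}


def next_track(current_mood, already_played):
    """Choose the unplayed track closest to the listener's mood."""
    candidates = {t: v for t, v in catalog.items() if t not in already_played}
    return max(candidates, key=lambda t: cosine(catalog[t], current_mood))


print(next_track(current_mood=(0.8, 0.7), already_played={"Track C"}))  # Track A
```

The interesting part of the actual research is that it also predicts the best order in which to play songs as moods shift, not just the single best match at one moment.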


There are other potential applications for emotion recognition technology, too. Amazon, for instance, has very recently begun to incorporate emotion tracking of voices for its Alexa assistant, allowing the A.I. to recognize when a user is showing frustration. Further down the line, there’s the possibility this could even lead to full-on emotionally responsive artificial agents, like the one in Spike Jonze’s 2013 movie Her.

In the recent study, emotion sensing was based on images of faces. However, as some of these applications show, there are other ways that machines can “sniff out” the right emotion at the right time.


“People are generating a lot of non-verbal and physiological data at any given moment,” said George Pliev, founder and managing partner at Neurodata Lab, one of the companies whose algorithms were tested for the facial recognition study. “Apart from the facial expressions, there are voice, speech, body movements, heart rate, and respiration rate. A multimodal approach states that behavioral data should be extracted from different channels and analyzed simultaneously. The data coming from one channel will verify and balance the data received from the other ones. For example, when facial information is for some reason unavailable, we can analyze the vocal intonations or look at the gestures.”
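The multimodal approach Pliev describes is, at its simplest, a weighted combination of per-channel emotion scores that degrades gracefully when one channel drops out. The sketch below illustrates that idea only; the channel weights and score format are assumptions on our part, not Neurodata Lab’s actual pipeline.

```python
# Sketch of multimodal late fusion: combine per-channel emotion scores and
# fall back gracefully when a channel (e.g. the face) is unavailable.
from typing import Dict, Optional

# Illustrative channel weights; a real system would learn or calibrate these.
CHANNEL_WEIGHTS = {"face": 0.5, "voice": 0.3, "gesture": 0.2}


def fuse(channel_scores: Dict[str, Optional[Dict[str, float]]]) -> Dict[str, float]:
    """Weighted average of emotion scores over whichever channels reported."""
    fused: Dict[str, float] = {}
    total_weight = 0.0
    for channel, scores in channel_scores.items():
        if scores is None:  # channel unavailable, e.g. face not visible
            continue
        weight = CHANNEL_WEIGHTS.get(channel, 0.0)
        total_weight += weight
        for emotion, score in scores.items():
            fused[emotion] = fused.get(emotion, 0.0) + weight * score
    return {e: s / total_weight for e, s in fused.items()} if total_weight else {}


# Face unavailable, so the estimate leans on voice and gestures instead.
print(fuse({
    "face": None,
    "voice": {"anger": 0.7, "happiness": 0.1},
    "gesture": {"anger": 0.6, "happiness": 0.2},
}))
```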

Challenges ahead?

However, there are challenges, as all involved agree. Emotions are not always easy to identify, even for the people experiencing them.

“If you wish to teach A.I. how to detect cars, faces or emotions, you should first ask people what do these objects look like,” Pliev continued. “Their responses will represent the ground truth. When it comes to identifying cars or faces, almost 100% of people asked would be consistent in their replies. But when it comes to emotions, things are not that simple. Emotional expressions have many nuances and depend on context: cultural background, individual differences, the particular situations where emotions are expressed. For one person, a particular facial expression would mean one thing, while another person may consider it differently.”
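Pliev’s point about ground truth can be made concrete with a small, hypothetical example: the “true” label for a clip is usually whatever most human annotators say, and for emotions that majority can be far from unanimous. The annotation data below is invented purely for illustration.

```python
# Illustration of shaky emotion ground truth: majority-vote labeling
# with imperfect annotator agreement. The annotations are made up.
from collections import Counter

annotations = ["anger", "disgust", "anger", "surprise", "anger"]

label, votes = Counter(annotations).most_common(1)[0]
agreement = votes / len(annotations)
print(f"majority label: {label}, agreement: {agreement:.0%}")  # anger, 60%
```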

Dupré agrees with the sentiment. “Can these systems [be guaranteed] to recognize the emotion actually felt by someone?” he said. “The answer is not at all, and they will never be! They are only recognizing the emotion that people are deciding to express — and most of the time that doesn’t correspond to the emotion felt. So the take-away message is that [machines] will never read … your own emotion.”

Still, that doesn’t mean the technology isn’t going to be useful, or that it won’t become a big part of our lives in the years to come. And even Damien Dupré leaves slight wiggle room when it comes to his own prediction that machines will never truly read our emotions: “Well, never say never,” he noted.

The research paper, “Emotion recognition in humans and machine using posed and spontaneous facial expression,” is available to read online.

