We Need Computers with Empathy
An emerging trend in artificial intelligence is to get computers to
detect how we’re feeling and respond accordingly. They might even
help us develop more compassion for one another.
I was rehearsing a speech for an AI conference recently when
I happened to mention Amazon Alexa, at which point Alexa
woke up and announced: "Playing Selena Gomez." I had to yell
“Alexa, stop!” a few times before she even heard me.
But Alexa was oblivious to my annoyance. Like the majority
of virtual assistants and other technology out there, she’s clueless about what we’re feeling.
We’re now surrounded by hyper-connected smart devices
that are autonomous, conversational, and relational, but they’re
completely devoid of any ability to tell how annoyed or happy
or depressed we are. And that’s a problem.
What if, instead, these technologies—smart speakers,
autonomous vehicles, television sets, connected refrigerators,
mobile phones—were aware of your emotions? What if they
sensed nonverbal behavior in real time? Your car might notice
that you look tired and offer to take the wheel. Your fridge might
work with you on a healthier diet. Your wearable fitness tracker
and TV might team up to get you off the couch. Your bathroom
mirror could sense that you’re stressed and adjust the lighting
while turning on the right mood-enhancing music. Mood-aware
technologies would make personalized recommendations and
encourage people to do things differently, better, or faster.
Today, an emerging category of AI—artificial emotional
intelligence, or emotion AI—is focused on developing algorithms that can identify not only basic human emotions such
as happiness, sadness, and anger but also more complex cognitive states such as fatigue, attention, interest, confusion, distraction, and more. (As for whether machines could not only
detect emotions but also experience their own, see “How We
Feel About Robots That Feel,” page 74.)
My company, Affectiva, is among those working to build
such systems. We’ve compiled a vast corpus of data consisting
of six million face videos collected in 87 countries, allowing an
AI engine to be tuned for real expressions of emotion in the wild
and to account for cultural differences in emotional expression.
Using computer vision, speech analysis, and deep learning,
we classify facial and vocal expressions of emotion. Quite a few
open challenges remain; for example, how do you train such multi-modal systems?
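To make the idea of multi-modal classification concrete, here is a minimal sketch of "late fusion," one common way to combine separate face and voice models: each modality produces its own scores, which are then blended into a single emotion prediction. All names, emotion labels, and weights below are illustrative assumptions for this sketch, not Affectiva's actual pipeline or API.

```python
import math

# Illustrative label set; real systems cover many more states.
EMOTIONS = ["happiness", "sadness", "anger", "fatigue", "confusion"]

def softmax(logits):
    """Turn raw classifier scores into a probability distribution."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def fuse(face_logits, voice_logits, face_weight=0.6):
    """Late fusion: weighted average of per-modality probabilities.

    face_weight is a hypothetical tuning knob; in practice such
    weights would be learned from data.
    """
    face = softmax(face_logits)
    voice = softmax(voice_logits)
    return [face_weight * f + (1 - face_weight) * v
            for f, v in zip(face, voice)]

def classify(face_logits, voice_logits):
    """Return the emotion label with the highest fused probability."""
    probs = fuse(face_logits, voice_logits)
    return EMOTIONS[max(range(len(probs)), key=probs.__getitem__)]

# Toy scores standing in for the outputs of real face and voice models.
print(classify([2.0, 0.1, 0.1, 0.0, 0.0],
               [1.5, 0.2, 0.1, 0.0, 0.3]))  # → happiness
```

The open question the text raises is visible even in this toy: how the two modalities should be weighted, and whether fusing them late (as here) or training one joint model end to end works better, are active research problems.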
Rana el Kaliouby