How We Feel About Robots That Feel

As robots become smart enough to detect our feelings and respond appropriately, they could have something like emotions of their own. But that won’t necessarily make them more like humans.

By Louisa Hall

Culture and Human-Robot Interaction in Militarized Spaces: A War Story
By Julie Carpenter

How Emotions Are Made: The Secret Life of the Brain
By Lisa Feldman Barrett
Houghton Mifflin Harcourt, 2017

The U.S. Army Robotic and Autonomous Systems Strategy

Octavia, a humanoid robot designed to
fight fires on Navy ships, has mastered
an impressive range of facial expressions.
When she’s turned off, she looks like a
human-size doll. She has a smooth white
face with a snub nose. Her plastic eyebrows sit evenly on her forehead like two
little capsized canoes.
When she’s on, however, her eyelids
fly open and she begins to display emotion. She can nod her head in a gesture of
understanding; she can widen her eyes
and lift both her eyebrows in a convincing
semblance of alarm; or she can cock her
head to one side and screw up her mouth,
replicating human confusion. To comic
effect, she can even arch one eyebrow and
narrow the opposite eye while tapping her
metal fingers together, as though plotting
acts of robotic revenge.
But Octavia’s range of facial expressions isn’t her most impressive trait.
What’s amazing is that her emotional
affect is an accurate response to her interactions with humans. She looks pleased,
for instance, when she recognizes one of
her teammates. She looks surprised when a
teammate gives her a command she wasn’t
expecting. She looks confused if someone
says something she doesn’t understand.
She can show appropriate emotional affect because she processes massive amounts of information about her environment. She can see, hear, and touch. She takes visual stock of her surroundings using the two cameras built into her eyes and analyzes characteristics like facial features, complexion, and clothing. She can detect people’s voices, using four microphones and a voice-recognition program called Sphinx. She can identify 25 different objects by touch, having learned them by using her fingers to physically manipulate them into various possible positions and shapes.
Taken together, these perceptual skills form a part of her “embodied cognitive architecture,” which, according to her creators at the Navy Center for Applied Research in Artificial Intelligence, allows her to “think and act in ways similar to people.”
That’s an exciting claim, but it’s not necessarily shocking. We’re accustomed to the idea of machines acting like people. Automatons created in 18th-century France could dance, keep time, and play the drums, the dulcimer, or the piano. As a kid growing up in the 1980s, I for some reason coveted a doll advertised for the ability to pee in her pants.

We’re even accustomed to the idea of machines thinking in ways that remind us of humans. Many of our long-cherished high-water marks for human cognition—the ability to beat a grandmaster at chess, for example, or to compose a metrically accurate sonnet—have been met and surpassed by computers.

Octavia’s actions, however—the fearful widening of her eyes, the confused furrow of her plastic eyebrows—seem to go a step further. They imply that in addition to thinking the way we think, she’s also feeling human emotions.

That’s not really the case: Octavia’s emotional affect, according to Gregory Trafton, who leads the Intelligent Systems Section at the Navy AI center, is merely meant to demonstrate the kind of thinking she’s doing and make it easier for people to interact with her. But it’s not always possible to draw a line between thinking and feeling. As Trafton acknowledges, “It’s clear that people’s thoughts and emotions are different but impact each other.”