When Carpenter asked one serviceman to describe his feelings about a robot that had been destroyed, he responded:
I mean, it wasn’t obviously … anywhere close to being on the same level as, like, you know, a buddy of yours getting wounded or seeing a member getting taken out or something like that. But there was still a certain loss, a sense of loss from something happening to one of your robots.
Another serviceman compared his robot to a pet dog:
I mean, you took care of that thing as well as you did your team members. And you made sure it was cleaned up, and made sure the batteries were always charged. And if you were not using it, it was tucked safely away as best could be because you knew if something happened to the robot, well then, it was your turn, and nobody likes to think that.
Yet another man explained why his teammate gave their robot a human name:
Towards the end of our tour we were spending more time outside the wire sleeping in our trucks than we were inside. We’d sleep inside our trucks outside the wire for a good five to six days out of the week, and it was three men in the truck, you know, one laid across the front seats; the other lays across the turret. And we can’t download sensitive items and leave them outside the truck. Everything has to be locked up, so our TALON was in the center aisle of our truck and our junior guy named it Danielle so he’d have a woman to cuddle with at night.
These men all stress that the robots are tools, not living creatures with feelings. Still, they give their robots human names and tuck them in safely at night. They joke about that impulse, but there’s a slightly disturbing dissonance in the jokes. The servicemen Carpenter interviewed seem to feel stuck between two positions: they understand the absurdity of caring for an emotionless robot that is designed to be expendable, but they nevertheless experience the temptation to care, at least a little bit.
Once Carpenter had published her initial interviews, she received more communication from men and women in the military who had developed real bonds with their robots. One former explosive ordnance disposal technician wrote:
As I am an EOD technician of eight years and three deployments, I can tell you that I found your research extremely interesting. I can completely agree with the other techs you interviewed in saying that the robots are tools and as such I will send them into any situation regardless of the possible danger.
However, during a mission in Iraq in 2006, I lost a robot that I had named “Stacy 4” (after my wife who is an EOD tech as well). She was an excellent robot that never gave me any issues, always performing flawlessly. Stacy 4 was completely destroyed and I was only able to recover very small pieces of the chassis. Immediately following the blast that destroyed Stacy 4, I can still remember the feeling of anger, and lots of it. “My beautiful robot was killed …” was actually the statement I made to my team leader. After the mission was complete and I had recovered as much of the robot as I could, I cried at the loss of her. I felt as if I had lost a dear family member. I called my wife that night and told her about it too. I know it sounds dumb but I still hate thinking about it. I know that the robots we use are just machines and I would make the same decisions again, even knowing the outcome.
I value human life. I value the relationships I have with real people. But I can tell you that I sure do miss Stacy 4, she was a good robot.
If these are the kinds of testimonials that can be gathered from soldiers interacting with faceless machines like PackBots and TALONs, what would you hear from soldiers deployed with robots like Octavia, who can see and hear and touch and anticipate their human teammates’ states of mind?
In popular conversations about the ethics of giving feelings to robots, we tend to focus on the effects of such technological innovation on the robots themselves. Movies and TV shows from Blade Runner to Westworld attend to the trauma that would be inflicted on feeling robots by humans using them for their entertainment. But there is also the inverse to consider: the trauma inflicted on the humans who bond with robots and then send them to certain death.
What complicates all this even further is that if a robot like Octavia ends up feeling human emotions, those feelings won’t only be the result of the cognitive architecture she’s given to start with. If they’re anything like our emotions, they’ll evolve in the context of her relationships with her teammates and her place in the world she inhabits. If her unique robot life, for instance, is spent getting sent into fires by her human companions, or trundling off alone down desert roads laced with explosive devices, her emotions will be different from those experienced by a more sheltered robot, or a more sheltered human. Regardless of the recognizable emotional expressions she makes, if she spends her life in inhumane situations, her emotions might not be recognizably human.
Louisa Hall, a writer in New York, is the author of Speak, a 2015 novel about