WIRED: I wanted to talk about navigating relationships with home or companion robots, especially when it comes to empathy and actually developing pretty complex relationships. What can we learn from what we’ve been doing for thousands of years with pets?
KD: One of the things that we’ve learned by looking at the history of pets and other emotional relationships we’ve developed with animals is that there isn’t anything inherently wrong with it—which is something that people often leap to immediately with robots. They’re immediately like, “It’s wrong. It’s fake. It’s going to take away from human relationships.” So I think that comparing robots to animals is an immediate conversation-shifter, where people are like, “If it’s more like a pet rabbit, then maybe it’s not going to take away my child’s friends.”
One of the other things we’ve learned is that animals, even in the companionship realm, are actually really useful in health and education. There are therapy methods that have really been able to improve people’s lives through emotional connections to animals. And it shows that there actually may be some potential for robots to help in a similar, yet different, way—again, as kind of a new breed. It’s a new tool, it’s something new that we might be able to harness and use to our benefit.
One of the things that was important to me though to put in the book is that robots and animals are not the same. Unlike animals, robots can tell others your secrets. And robots are created by corporations. There’s a lot of issues that I think we tend to not see—or forget about—because we’re so focused on this human replacement aspect. There are a lot of issues with putting this technology into the capitalist society we live in, and just letting companies have free reign over how they use these emotional connections.
WIRED: Say you have a home robot for a kid. In order to unlock some sort of feature, you have to pay extra money. But the kid has already developed a relationship with that robot. You could argue that’s exploiting emotions—exploiting the bond that the child has developed with the robot—in order to get you to pay more.
KD: It’s kind of like the whole in-app purchases scandal that happened a while back, but it’ll be that on steroids. Because now you have this emotional connection, where it’s not just the kid wanting to play a game on the iPad, but the kid actually has a relationship with the robot.
For kids, I’m actually less worried because we have so many watchdog organizations that are out there looking for new technologies that are trying to exploit children. And there are laws that actually protect kids in a lot of countries. But the interesting thing to me is that it’s not just kids—you can exploit anyone this way. We know that adults are susceptible to revealing more personal information to a robot than they would willingly enter into a database. Or if your sex robot has compelling enough purchases, that might be a way to really exploit consumers’ willingness to pay. And so I think there needs to be broad consumer protection. For reasons of privacy, for reasons of emotional manipulation, I think it’s extremely plausible that people might shell out money to keep a robot “alive,” for example, and that companies might try to exploit that.
WIRED: So what does the relationship between robots and humans look like in the near future?
KD: Roomba is one of the very simple examples, where you have a robot that’s not very complex, but it’s in people’s homes, moving around on its own. And people have named their Roombas. There are so many other cases, like military robots. Soldiers were working with these bomb disposal units and started treating them like pets. They would give them names, they would give them Medals of Honor, they would have funerals with gun salutes. They really related to them in ways similar to how animals have been an emotional support for soldiers in intense situations throughout history.