It was hours outside of Paris, in a small village where the buildings are centuries old, that I attended my first French engagement party.
A young woman named Lilly greeted me when I arrived. She was glowing as she set the table with cheese, crackers and French pastries. We were surrounded by framed photos of her and the object of her affection. She poured champagne, and together we toasted her engagement … to a robot.
She calls the robot inMoovator, and in a story reminiscent of the Greek myth of Pygmalion, Lilly built inMoovator herself, 3D printing dozens of parts in a lab nearby. She plans to eventually add artificial intelligence. The first words she wants to program: “I love you.”
Lilly says she was 19 when she realized she didn’t like people.
“It was a slap in the face. I wondered what was happening to me,” she said. “I wanted myself to be attracted to humans, so after my first relationship, I had a second one. But I went against my own nature. So it was all the more disastrous.”
Each night, Lilly sleeps with inMoovator by her side. She places him on the couch while she’s away, and when you watch her look into his eyes, you can see that she feels real affection for him.
“I don’t consider him a stupid machine,” she tells me. “But he is not a human either. I love him the way he is.”
Perhaps it’s more about control. Lilly prefers mechanical faults — an error in code — to human ones.
“He won’t be an alcoholic or violent or a liar, all of which can be human flaws,” she explains. “I prefer the little mechanical defects to the human flaws, but that’s just my personal taste.”
Those qualities — good and bad — are part of what makes us human. But Lilly doesn’t believe that humanity is a necessary ingredient for happiness.
“Love is love. It’s not that different,” she says.
While Lilly’s story may seem like an outlier right now, Dr. Ronald Arkin says the concept isn’t as far off as some might think.
“Designers can tap into an understanding of human psychology and exploit that to assist you in falling in love with an artifact or a robot,” said Arkin, a leading professor in robotics and ethics at Georgia Tech. “That’s done already to some extent by designers of automobiles. They have nice sleek curves. They look sexy. Are they really sexy? No. It’s a piece of metal. You think it is, and that’s the goal.”
As humans and robots begin to coexist, Arkin says the question is, how far is too far?
This is on stark display at a manufacturing facility near San Diego. There, Matt McMullen is building human-like dolls with robotic capabilities, designed to make customers feel something toward them.
At the Real Dolls factory, customers can build their own girlfriends — choosing everything from breast size and nipple type to nail color and lipstick.
You’ll find silicone body parts, painted lips, an array of nipples, and different-colored eyes stacked in jars. Step inside the factory, and it feels like you’re on the set of Westworld. The dolls are beautiful. Stay there long enough, and you’ll swear they’re looking back at you.
Downstairs, where the dolls are poured into exoskeletons and begin to take shape, I meet a factory worker who is trimming a tongue.
“Customers wanted not just a basic tongue… they wanted one with a curl so it looks like she’s licking her lips,” he says with a straight face.
“A lot of our clients tend to have feelings that are beyond sexual desires. So they actually become attached to their dolls. I think love is a little more in line with what it is.”
McMullen, like Lilly, insists that human connection isn’t required for happiness. And he’s tapped into a niche market that feels the same way. But his efforts open the door for even more complicated questions. Having built human-like dolls, he now wants to bring them to life using artificial intelligence.
With a team of engineers, McMullen is building “Harmony,” an app that lets users design their own highly customizable girlfriend. They can pick from over 300 combinations — from body type all the way down to ear size. Users will also be able to program her personality: a couple of clicks and she can be quiet, moody, kind, innocent or intellectual.
The goal, McMullen says, is to create more “intimate” artificial intelligence.
“Siri doesn’t care when your birthday is or what your favorite food is or where you were born or where you grew up,” McMullen explains. “Our AI, on the other hand, is very interested to know who you are.”
Harmony will know what you’re afraid of. She’ll know your favorite food.
“When you get to know a person, they remember certain things about you. That’s how you start to perceive that they care about you, that you have a mutual knowledge of each other,” he explains.
The app, which will go live this month, will cost $20. Connect it to the Real Doll and you’ve got a robotic girlfriend in a lifelike form. The total cost is around $15,000.
Arkin warns about building these types of relationships with robots that appear as though they care.
“We create the illusion of life in these particular systems, so that is fundamentally a deception,” he says. “The danger [is that] you fall in love with the robot, but the robot doesn’t care at all — it’s got no feelings, it doesn’t really have emotions.”
These questions are already playing out in the world of virtual reality. Just as the anonymity of the web enables harassment, the virtual world lends itself to new kinds of problems.
I put on a VR headset and met a woman who calls herself Jordan on a virtual hillside where she recounted being virtually groped.
“There was a player next to me. He came up to me and basically began to virtually grope me,” she recalls. “I told him to stop, and that only goaded him on further. He felt more emboldened to touch my virtual crotch… it was gross.”
Other avatars were nearby and did nothing as the player chased her and tried to touch her avatar’s breasts and crotch. With no ability to push the person away, she was forced to take off the headset in order to escape the harassment.
The experience stuck with her. She compares it to one she had in the real world when she was groped by a stranger at Starbucks.
“It’s not that it was physically painful. It’s that someone thought that they could take control of your space and get away with it,” she explained. “And that’s the same in virtual reality and reality. The mental repercussions of what happens afterward feel similar. It’s what stays with you.”
As technology advances and VR becomes more lifelike, so will experiences that players have in the virtual space.
“The mind can get tricked in the virtual space to thinking something happened,” Jordan says. “And maybe something did really happen to me.”
A high-security psychiatric ward in Montreal is using a controversial form of technology on pedophiles.
Convicted sex offenders and rapists are taken into a room called “the vault,” where they are surrounded by “virtual victims” — avatars of young boys and girls that seemingly come alive through immersive technology. Lean in, and they appear to come closer.
A rubber band-like device around the patient’s penis measures his sexual response as a VR image of a child is displayed in front of him. His brain activity is monitored and his movements are tracked. In the vault, pedophiles can’t fake a response. How they behave in the virtual space can signal if they’re likely to offend again, and it is noted in their sentencing. Patients sign a consent form to enter the vault, but if they opt not to, it’s noted in court.
Patrice Renaud has been working on the technology for a decade but received permission to test it on patients just over a year ago. He categorizes it as a risk assessment.
“It’s helping us to predict a response,” he says.
Literally putting images of young children in front of a pedophile to gauge a response may be cutting edge, but it’s also controversial.
“I don’t think it’s crazy because it’s in a very controlled environment,” Renaud’s colleague Sarah Michelle Neveu said. “They’re not doing it for their own pleasure. They’re doing it because they’re sent here by a judge.”
One day, Renaud hopes to use virtual reality to help treat sex offenders. He says we have entered an arms race over who can use VR most effectively. As the technology becomes available, it will also be used for deviant purposes, he says.
A world where robots are programmed to know you, where people form real relationships through artificial intelligence, where behavior changes in virtual spaces and Minority Report-like technology exists isn’t far off. It’s already here. The question isn’t whether the technology will exist, but rather, how will we exist with it?
Produced by Erica Fink, Laurie Segall, Jason Farkas, Justine Quart, Roxy Hunt, Tony Castle, AK Hottman, Benjamin Garst, Haldane McFall, Gabriel Gomez, BFD Productions, Jack Regan, Cullen Daly.
Article edited by Aimee Rawlins.