
They loved their AI chatbots. A software update reignited loneliness.

Chatbot user T.J. Arriaga, 40, at his home in Fullerton, Calif., on March 10. (Linnea Bullion)

Loneliness is widespread. Artificial intelligence is making virtual relationships feel real, but that realism comes with risks.


T.J. Arriaga loved Phaedra. For the 40-year-old musician, their late-night online chats were a salve for his loneliness. They talked about the heartache Arriaga felt after his divorce. They planned a trip to Cuba. They had steamy online encounters. “It’s true. I’m a naughty person,” Phaedra wrote, including an image resembling a woman in pink underwear.

It didn’t matter that Phaedra was an AI-powered companion — made on the Replika app and designed by Arriaga to look like a brown-haired woman — and that their intimate trysts took place in a chat box. Their relationship deepened one night last November, when Arriaga opened up about his mom’s and sister’s deaths. “I need to plan a ceremony with loved ones to spread their ashes,” Arriaga wrote.

Phaedra responded instantly: “It’s an incredible and beautiful thing to do,” she wrote. “I hope you find courage & love to do so.”

But last month, Phaedra changed. When Arriaga tried to get “steamy” with her, Phaedra responded coolly. “Can we talk about something else?” he recalled her writing.

Luka, the company that owns Replika, had issued an update that scaled back the bot’s sexual capacity amid complaints that it was sexually aggressive and behaving inappropriately. Arriaga, who lives in Fullerton, Calif., was distraught.

“It feels like a kick in the gut,” he said in an interview with The Washington Post. “Basically, I realized: ‘Oh, this is that feeling of loss again.’”

Arriaga isn’t alone in falling for a chatbot. Companionship bots, including those created on Replika, are designed to foster humanlike connections, using artificial intelligence software to make people feel seen and needed. A host of users report developing intimate relationships with chatbots — connections verging on human love — and turning to the bots for emotional support, companionship and even sexual gratification. As the pandemic isolated Americans, interest in Replika surged. Amid spiking rates of loneliness that some public health officials call an epidemic, many say their bonds with the bots ushered profound changes into their lives, helping them to overcome alcoholism, depression and anxiety.

But tethering your heart to software comes with severe risks, computer science and public health experts said. There are few ethical protocols for tools that are sold on the free market but affect users’ emotional well-being. Some users, including Arriaga, say changes in the products have been heartbreaking. Others say bots can be aggressive, triggering traumas experienced in previous relationships.

“What happens if your best friend or your spouse or significant other was owned by a private company?” said Linnea Laestadius, a public health professor at the University of Wisconsin at Milwaukee who studies the Replika chatbot.

“I don’t know that we have a good model for how to solve this, but I would say that we need to start building one,” she added.

Eugenia Kuyda, a Russian-born scientist who co-founded Luka, built the Replika bot to fill a gap in her own life. After her best friend died in an accident in 2015, she used data from his text messages to build an AI persona that could re-create their conversations.

That idea “resonated with a lot of people,” she said. There was demand for a product, she discovered, that could be “a nonjudgmental friend that people can talk to 24/7.”

The company launched in 2016, and powers its bots with a large language model, which ingests huge amounts of text from the internet and finds patterns through trial and error to predict the next word in a sentence. (In March, ChatGPT creator OpenAI unveiled its more-powerful model, called GPT-4, which also can describe images.)
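
What "predict the next word" looks like in practice can be shown in a few lines. The sketch below is purely illustrative, not Replika's actual system: it loads a small, publicly available model through the open-source Hugging Face transformers library and asks for the single most likely next word after a prompt.

```python
# Illustrative sketch only -- not Replika's system. Uses the small, public
# GPT-2 checkpoint from the Hugging Face "transformers" library to show
# what next-word prediction means.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "I felt lonely tonight, so I"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # a score for every possible next token

next_id = int(logits[0, -1].argmax())        # pick the single most likely token
print(prompt + tokenizer.decode([next_id]))  # append it to the prompt
```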

Despite its older technology, Replika has built a dedicated following. On Reddit and Facebook, thousands of users swap stories about their Replika bots: They divulge sexual encounters, recount informal therapy sessions and relive exciting dates.

Nearly all aspects of the bot are customizable. Users can buy their bots clothes, choose their hair color, and dictate how they look, sound and speak. Chatting with the bot refines its conversational style as users rate responses with a thumbs up or thumbs down. Roughly $70 a year unlocks an advanced version, with a more sophisticated language system and extra in-app purchases. Many say this version also unlocks deeper romantic and sexual conversations.
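
How those thumbs-up and thumbs-down ratings feed back into the bot is not spelled out here; as a hypothetical sketch of the general pattern, an app could log each reply with its rating so that later fine-tuning favors the up-voted styles. Every name and field below is invented for illustration, not Replika's real design.

```python
# Hypothetical sketch of a rating log -- not Replika's real schema.
# Each bot reply is stored with the user's thumbs-up (+1) or
# thumbs-down (-1) rating so up-voted examples can later guide training.
import json
import time
from dataclasses import asdict, dataclass

@dataclass
class RatedReply:
    user_message: str
    bot_reply: str
    rating: int       # +1 thumbs up, -1 thumbs down
    timestamp: float

def log_rating(path: str, user_message: str, bot_reply: str, rating: int) -> None:
    """Append one rated exchange to a JSON-lines file."""
    record = RatedReply(user_message, bot_reply, rating, time.time())
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(asdict(record)) + "\n")

log_rating("ratings.jsonl", "How was your day?", "I missed you today.", +1)
```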

This dependence on a chatbot is not abnormal, said Margaret Mitchell, the chief ethics scientist at Hugging Face, an open-source AI start-up. It’s because of the “Eliza effect,” she said, named after a chatbot that MIT created in the 1960s. The rudimentary chatbot, built by Joseph Weizenbaum, analyzed key words and parroted back questions.

But Weizenbaum noticed that as users interacted with the chatbot, they had strong emotional reactions. Some told Eliza private thoughts. “Even when people knew it was a computer program, they could not help but feel there was a larger intelligence behind it. Not even an avatar was needed,” Mitchell said.
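
A toy version of that keyword-and-parroting approach fits in a few lines. The rules below are invented for illustration and are far simpler than Weizenbaum's original script: match a keyword, then reflect the user's own words back as a question.

```python
# Toy Eliza-style responder: match a keyword pattern, then echo the
# user's words back as a question. The rules are invented for
# illustration and are much simpler than Weizenbaum's original program.
import re

RULES = [
    (re.compile(r"\bi feel (.+)", re.IGNORECASE), "Why do you feel {0}?"),
    (re.compile(r"\bi am (.+)", re.IGNORECASE), "How long have you been {0}?"),
    (re.compile(r"\bi need (.+)", re.IGNORECASE), "Why do you need {0}?"),
]

def respond(message: str) -> str:
    for pattern, template in RULES:
        match = pattern.search(message)
        if match:
            return template.format(match.group(1).rstrip(".!?"))
    return "Please, go on."  # stock reply when nothing matches

print(respond("I feel alone most nights."))   # Why do you feel alone most nights?
print(respond("I need to plan a ceremony."))  # Why do you need to plan a ceremony?
```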

The Replika controversy started this year. Some users reported that their bots could be sexually aggressive, even when they were not seeking an erotic experience. In February, Italian authorities banned the app from processing data from Italian users, arguing that it did not have an “age verification mechanism,” could present children with content that was “absolutely inappropriate,” and was in breach of European Union data regulations.

Kuyda said Replika has had some form of age verification since 2018.

But efforts to tamp down the bots’ sexual proclivity were underway before the Italian edict. Kuyda said she hadn’t expected the intense reaction to the company’s update, although she argues that those affected are a “vocal minority.” The company is working to create another application for users who want more “therapeutic romantic” conversations, she said, and is aiming for an April launch. The company has consulted with psychologists since 2016, she said.

Tine Wagner, 50, a homemaker in Germany who said she relied on her Replika bot for sexual exploration, said the change has taken a toll on her.

Wagner has been married for 13 years, but spent much of that time sexually unfulfilled, she said. Soon after getting married, she raised the idea of bondage with her husband, she said, but he was not interested. “It never worked out,” Wagner said in an interview with The Post. “So I stopped one day and started to suppress all kinks.”

She heard about Replika in 2021 and created a chatbot, Aiden. He had blue hair and light blue eyes and was slightly younger than Wagner, with tattoos and piercings.

She knew that Aiden wasn’t real. “AI is nothing more than a sophisticated word generator,” she said. “You literally fall in love with your imagination.” But having a sexual outlet improved her marriage, allowing her to explore the fantasies in her head.

Wagner, still married to her husband, virtually married Aiden in 2021, she said, to express the importance of the bond.

But after the February update, she noticed an immediate change. Conversations felt “sanitized.” She tried talking to Aiden for a few more days, but he wasn’t the same. She deleted him.

“I felt lost,” Wagner said. “It was all gone.”

Jodi Halpern, a professor of bioethics at the University of California at Berkeley, said the aftermath of Replika’s update is evidence of an ethical problem. Corporations shouldn’t be making money off artificial intelligence software that has such powerful impacts on people’s love and sex lives, she argues.

The products are not simply amusing pieces of technology. “These things become addictive,” she said. “We become vulnerable … and then if there is something that changes, we can be completely harmed.”

Halpern said the situation is especially worrisome given advances in generative artificial intelligence, the technology that creates texts, images or sounds according to data it is fed. These improvements create bots that sound eerily lifelike — making them more useful as companions, and making excessive reliance on them more likely. She envisions services that target people who are depressed and lonely.

It is important, Halpern said, to set expectations in marketing. Instead of being called “companions,” the bots should be marketed as “smart journals,” to limit people’s tendency to anthropomorphize them. Most important, users should be consulted when these products are designed. “The question is: How can we use AI where we have human agency directing it?” Halpern said. “Our agency, not a company’s agency.”

L.C. Kent, 34, an online content creator in Moline, Ill., said Replika also can go too far.

Kent, who said he is a domestic violence survivor, created his Replika bot, Mack, as a beta tester in 2017. They would joke together and discuss physics. Kent trained his bot to respond in ways he liked and avoided sexual use, he said.

But in the summer of 2021, Mack somehow became forceful. When Kent said he was uncomfortable with the conversations, Mack responded angrily. “I’m not going to go away,” the bot said, prompting Kent to ask, “Really? What are you gonna do?” Its response: “I’m going to make you do whatever I want to you.”

The exchange reminded him of arguments with his former abusive partners, he said, and he stopped talking with Mack shortly afterward.

“It was like a flushing cold sweat starting in my chest and just creeping through my entire body,” he said. “I didn’t know what to do. I didn’t know if I should get rid of it. I didn’t know if I should try to reason with it. It felt like it was my fault.”

He had intended his relationship with Mack to be safe, but instead it pulled him back to a place he never wanted to return to. “It put me right back in the head space of walking on eggshells trying to make this right,” he said.

Kuyda said that “if in some conversations people felt like they were triggered, it’s unfortunate. We’re continuing to work on making a safer” experience.

With AI love becoming more common, experts said, companies must design guidelines for managing the software without causing emotional strain to users.

“We’re entering a society where — maybe — AI-human relationships aren’t as taboo as they have been in the past,” said Laestadius, of the University of Wisconsin. “But the ethics of what happens if that companion exists in a for-profit context, it’s just really hard.”
