
How AI Chatbots Are Helping Some People Have Hard Conversations

Todd Mitchem kept struggling to have honest, productive conversations with his son. “He’s 14,” Mr. Mitchem, 52, said with a laugh. “Teenagers are so difficult to connect with.”

Every time he tried to bring up a sensitive issue, his son would give vague answers or run away, preferring to avoid serious talks altogether.

In the past when Mr. Mitchem needed parenting help, he would read a book or pose a question to the men’s support group he meets with weekly.

But recently he has turned to ChatGPT. And he’s not alone: Others are turning to artificial intelligence chatbots to figure out what to say in situations that feel high-stakes. They are using the tool to talk or read to their children, to approach bosses, to provide difficult feedback, to write wedding vows or to pen love letters.

Unlike turning to friends or even professionals for help, the bot, said Mr. Mitchem, gives what feels like objective advice. “The bot is giving me responses based on analysis and data, not human emotions,” he said.

ChatGPT, the new virtual tool powered by OpenAI, sources its information from a wide range of online material including books, news articles, scientific journals, websites and even message boards, allowing users to have humanlike conversations with a chatbot.

“It’s giving you what the collective hive mind on the internet would say,” said Irina Raicu, who directs the internet ethics program at Santa Clara University. (Other companies, including Google and Microsoft, have their own versions of this technology, and Microsoft’s, called Bing A.I., recently made headlines for aggressively declaring its love to The New York Times journalist Kevin Roose.)

Mr. Mitchem, who lives in Denver and is the executive vice president of learning and product for a leadership training company, opened his conversation by typing, in summary: “I need some friendly advice.”

“OK, no problem,” ChatGPT responded, according to Mr. Mitchem. “What is your name?”

In the course of their conversation, ChatGPT told Mr. Mitchem that he is a good father for even wondering how to approach a conversation with his son about the decision to join a basketball team. “It said something like, ‘It’s cool if you don’t get it right, but it’s awesome that you are trying.’”

Mr. Mitchem said the bot then continued: “Teenage boys, when they are growing up, are trying to force their independence. Remember when you talk to him, he needs to know that you trust his decisions.”

The next day Mr. Mitchem approached his son and tried out the advice. “I said to him, ‘You need to make this decision, you are 14, and I trust you will make a good one,’” Mr. Mitchem said. “My son goes, ‘Wow, that’s awesome. I’ll let you know what I decide.’”

“We left on a positive note,” Mr. Mitchem said. “It totally worked.”

For Naif Alanazi, a 35-year-old Ph.D. student at Kent State University, bedtime is a sacred ritual for him and his 4-year-old daughter, Yasmeen. “I have to work all day,” he said. “This is our special time.”

His Saudi Arabian family has a deep tradition of telling oral stories. Wanting to continue it, he used to try to concoct new, thrilling tales each evening. “Do you know how difficult it is to come up with something new every day?” he asked, laughing.

Now, however, he lets the bot do the work.

Every night he asks ChatGPT to create a story that involves people (his daughter’s teacher, for instance) and places (school, the park) from her day, along with a cliffhanger at the end so he can continue the story the next night. “Sometimes I ask it to add a value she needs to learn, like honesty or being kind to others,” he said.

“Being able to give her something that is more than a generic story, something that can increase our bond and show her that I am interested in her daily life,” he said, “it makes me feel so much closer to her.”

Anifa Musengimana, 25, who is in graduate school for international marketing in London, is certain that chatbots can help make the tedium of online dating more interesting. “I am having a lot of repetitive conversations on these apps,” she said. “The app can give me fun ideas of what to talk about, and maybe I’ll find better people to date.”

“If I get intriguing answers, I will be drawn in,” she said.

She said she would tell her match she was using the tool. “I would want a guy who finds it funny,” she said. “I wouldn’t want a guy who is so serious that he gets mad at me for doing it.”

Some are using chatbots to enhance the relationships they already have.

James Gregson, 40, a creative director who lives in Avon, Conn., has been using ChatGPT to draft love letters to his wife.

“I am not a poet, I am not a songwriter, but I can take topics on things my wife might like and put it into a song or poem,” he said.

He also believes in full disclosure: “I am going to give her one, but I am going to tell her who wrote it,” he said. “I am not trying to con her.”

Jessica Massey, 29, a finance analyst at Cisco Systems, who lives in Buffalo, has been writing draft emails to her boss using ChatGPT. “I wanted to test out its capabilities to see if there was a different way A.I. would word what I was thinking in my head,” she wrote in an email. (One person interviewed confessed to consulting ChatGPT to help prepare for their interview for this story. Another admitted to using it for employee reviews.)

Ms. Massey used the bot to write an email to her boss explaining why the company should pay for a certain professional certification. The bot gave her pretty boilerplate language, she said. She hasn’t sent it yet, but plans to once she changes “the verbiage a bit to make it sound more like me.”

Ms. Massey, however, has a rule about relying on a chatbot: “Disclose it at the end of your work or don’t use it at all.”

Scholars who study technology and ethics have mixed feelings about using ChatGPT for highly personal communication.

“We shouldn’t automatically reject tools that might help people deal with a difficult conversation,” said Michael Zimmer, the director of the Center for Data, Ethics and Society at Marquette University. He equates it to buying a Hallmark card for a birthday or anniversary. “We’ve all accepted doing that because the words on the card align with something I believe,” he said.

However, Ms. Raicu, from Santa Clara University, worries about people using ChatGPT for personal communication. She doesn’t like the idea that there is a “right” and “wrong” way to communicate. “I think the right words depend on who the people are who are communicating and the context,” she said. “There is no formula for a lot of this stuff.”

Ms. Raicu said that using ChatGPT for personal communication may undermine trust: “People might ask, ‘Do I really know who I am talking to?’”

