How AI can help with mental health





For at least 60 years, technologists have hunted for a mental health holy grail: a computer that listens to our problems and helps us.

We keep failing at making an artificial-intelligence Sigmund Freud, and there is both value and risk in leaning on technology to improve our mental well-being. Let’s talk it over. (Imagine me saying that in my most cliched therapist voice.)

Projects such as Woebot and Koko have used artificial intelligence to augment elements of talk therapy. Their predecessors included Eliza, a 1960s MIT software program that unintentionally became an early attempt at a computer shrink.

Mental health experts told me that there are no magic technology fixes for our individual or collective mental health struggles. Instead, the experts said AI and other technologies may do the most good when we don’t expect them to do too much.

AI Sigmund Freud is not coming anytime soon, and it might be terrible for you if it did exist. What might be more helpful are relatively simple, targeted technologies in mental health care, including telemedicine, telephone hotlines and AI for self-help skill building or clinician training.

Kevin Rushton is working on one such project for Mental Health America. The advocacy organization operates an AI assistant that is, essentially, a chatbot self-improvement workbook.

You type in negative ideas you have about yourself, and the AI helps you practice reworking them into something more productive.

Instead of thinking that you’ll get fired at work because you messed up one project, you might be guided to consider that everyone makes mistakes and that it’s probably not fatal for your career.

“Learning to reframe things on your own is a skill that people need to learn to improve their mental health,” said Rushton, program manager of digital solutions for Mental Health America.

If people try to use the AI assistant as a computer therapist or to vent about a problem, the software is designed to respond with something positive but not give advice, Rushton said.

Some experts in technology and mental health care bristle at the suggestion that AI can do more than operate in narrow uses such as an interactive workbook.

“We know we can feel better from writing in a diary or talking aloud to ourselves or texting with a machine. That is not therapy,” said Hannah Zeavin, author of “The Distance Cure: A History of Teletherapy” and a professor at Indiana University. “Not all help is help.”

But Zeavin and others I spoke with said it’s no wonder we keep trying to computerize therapy and other mental health services. Existing mental health care is expensive, inaccessible for many people, often of poor quality and uncomfortably intimate.

Alison Darcy, founder of Woebot Health, the company behind the chatbot of the same name, said digital therapeutic tools aren’t trying to replace human therapists.

Darcy said there needs to be a broader discussion of what technology can do differently to “engage people in ways and at times that clinicians can’t.”

Benjamin F. Miller, a psychologist and the former president of the Well Being Trust, a foundation focusing on mental and spiritual health, imagines AI being useful in training professionals or amateurs who want to provide mental health help.

Or, he said, AI might be useful for automating the rigorous record-keeping required in mental health — although automating physicians’ notes has a spotty track record.

I also asked Miller what you should do if you feel you need mental health care and don’t know where to start.

He said if you feel comfortable doing so, ask for advice from a trusted person who is familiar with the health-care system, such as a primary care physician.

If that doesn’t feel like a good option, consider opening up to someone else you trust, like a pastor, a school principal or the person who cuts your hair, Miller said. They may not know how to help you or what to say, but the act of reaching out can be an important first step.

“Opening up to people that you feel like you can trust is a powerful tool to start that journey,” he said.

Lindsey Bever, a Washington Post colleague who writes about mental health, recently published a guide for people struggling during a shortage of mental health professionals. She wrote that group therapy sessions, support groups and supportive friends can be helpful, particularly for people waiting to find a therapist.

Apps such as Insight Timer, Calm and Headspace can help some people reduce stress and anxiety, Lindsey wrote. And Zeavin said Trans Lifeline, a peer hotline, has a good track record.

Miller also said we cannot expect technology to be a substitute for or a shortcut to the human ties that are a bedrock of our health.

“There is nothing magical about creating meaningful, healthy relationships, but it does heal,” he said.


Normally, I wouldn’t describe lying as a “win.” But just this once …

My colleague Heather Kelly wrote recently about why more video streaming services are asking for your children’s birthdays. The request is related to the growing number of legal requirements to block kids from apps or limit what they can do with them.

Heather’s advice is to fib and not give your kid’s exact birth date. It’s a piece of information that could be used for fraud if it falls into the wrong hands.

Read more from Heather: Tech companies want your kid’s birth date. Should you tell them?

Brag about YOUR one tiny win! Tell us about an app, gadget or tech trick that made your day a little better. We might feature your advice in a future edition of The Tech Friend.
