
We keep trying to make AI therapists. It does not work.


This article is a preview of The Tech Friend newsletter. Sign up here to get it in your inbox every Tuesday and Friday.

For at least 60 years, technologists have been hunting a holy grail for mental health: a computer that listens to our problems and helps us.

We keep failing to make a Sigmund Freud artificial intelligence, and there is both value and risk in leaning on technology to improve our mental well-being. Let’s talk about it. (Imagine me saying that in my most clichéd therapist voice.)

Projects such as Woebot and Koko have used artificial intelligence to enhance elements of talk therapy. Their predecessors included Eliza, a 1960s MIT software program that inadvertently became an early attempt at a computerized shrink.

Mental health experts told me there are no magical technological solutions to our individual or collective mental health struggles. Instead, the experts said AI and other technologies can do the most good if we don’t expect them to do too much.

AI Sigmund Freud isn’t coming anytime soon, and it could be terrible for you if it existed. Perhaps more useful would be relatively simple and targeted technologies in mental health, including telemedicine, telephone hotlines, and AI for building self-help skills or training clinicians.

Kevin Rushton is working on such a project for Mental Health America. The advocacy group has an AI assistant that is essentially a chatbot self-improvement workbook.

You type in negative ideas you have about yourself, and the AI helps you practice turning them into something more productive.

Instead of thinking you’re going to get fired from your job for botching one project, you might think that everyone makes mistakes and it’s probably not fatal to your career.

“Just learning to reframe things is a skill people need to learn to improve their mental health,” said Rushton, digital solutions program manager for Mental Health America.

When people try to use the AI assistant as a computer therapist or to vent about a problem, the software is designed to respond with something positive but not provide advice, Rushton said.

Some technology and mental health experts are skeptical of the suggestion that AI can do more than work in narrow applications, such as an interactive workbook.

“We know we can feel better by writing in a journal or talking to ourselves out loud or texting with a machine. That’s not therapy,” said Hannah Zeavin, author of “The Distance Cure: A History of Teletherapy” and a professor at Indiana University. “Not all help is help.”

But Zeavin and others I spoke to said it’s no wonder we keep trying to automate therapy and other mental health services. Existing mental health care is expensive, out of reach for many people, often poor in quality and uncomfortably intimate.

Alison Darcy, founder of Woebot Health, the company behind the chatbot of the same name, said digital therapeutic tools are not trying to replace human therapists.

Darcy said there needs to be a broader discussion about what technology can do differently to “engage people in ways and at times that clinicians can’t”.

Benjamin F. Miller, a psychologist and former president of the Well Being Trust, a foundation that focuses on mental and spiritual health, said AI could be useful in training professionals or laypeople who want to provide mental health services.

Or, he said, AI could be useful for automating the burdensome administrative work required in mental health care — although automating physician notes has a spotty track record.

I also asked Miller what to do if you feel you need mental health care and don’t know where to start.

He said that if you feel comfortable doing so, you can seek advice from a trusted person who is familiar with the healthcare system, such as a general practitioner.

If that doesn’t feel like a good option, consider opening up to someone else you trust, such as a pastor, school principal, or the person who cuts your hair, Miller said. They may not know how to help you or what to say, but reaching out can be an important first step.

“Opening up to people you feel you can trust is a powerful tool for starting that journey,” he said.

Lindsey Bever, a Washington Post fellow who writes about mental health, recently published a guide for those struggling with a shortage of mental health professionals. She wrote that group therapy sessions, support groups, and supportive friends can be helpful, especially for people waiting to find a therapist.

Apps like Insight Timer, Calm and Headspace may help some people reduce stress and anxiety, Lindsey wrote. And Zeavin said Trans Lifeline, a peer hotline, has a good track record.

Miller also said we can’t expect technology to be a replacement for, or shortcut to, the human bonds that are the foundation of our health.

“There’s nothing magical about creating meaningful, healthy relationships, but it does heal,” he said.


Normally I wouldn’t describe lying as a ‘victory’. But this one time…

My colleague Heather Kelly recently wrote about why more video streaming services are asking for your kids’ birthdays. The requests stem from a growing number of legal requirements to deny children access to apps or limit what they can do with them.

Heather’s advice is to lie and not give your child’s exact date of birth. It is a piece of information that can be used for fraud if it falls into the wrong hands.

Read more from Heather: Tech companies want your child’s date of birth. Should you tell them?

Brag about YOUR one small victory! Tell us about an app, gadget or tech trick that made your day a little better. We may include your advice in a future edition of The Tech Friend.
