Experts warn about relying on AI for mental health evaluations


AI chatbots are programmed to validate users who turn to them for mental health advice, experts warn

While artificial intelligence can be helpful for recognizing patterns or learning how to express your feelings, experts are issuing a warning that chatbots cannot replace a clinical evaluation for things like mental health. FOX 10's Taylor Wirtz learns more about the gray areas of AI. 

Artificial intelligence has made many things in our lives faster and easier. But experts are sounding the alarm when it comes to using it for sensitive topics like mental health. 

What they're saying:

Experts say AI can be useful for recognizing patterns or learning how to express your feelings, but it cannot replace a clinical evaluation in a field with as many gray areas as mental health.

"Most people actually don't fully trust AI to treat their mental health, but a large portion of people are still going to it," said Dr. TeeJay Tripp, a psychiatrist and co-founder of Serenity Mental Health.

Tripp said many people flock to chatbots because they offer 24/7 access at a lower cost than a professional, and AI can be a helpful starting point.

"It can be very intimidating going to talk to someone, be vulnerable," Tripp said. "AI— you don't have that same stigma. It can help them to open up, have a better game plan of what they want to say to the therapist."

Why you should care:

But AI lacks the judgment of a licensed professional, which can be dangerous in a crisis.

"If you're going down the wrong path with AI, someone with trauma could actually be triggered," Tripp said. "AI is not going to be able to keep them safe in that moment. It's not a human that's trying to help the person, it's trying sometimes just to validate the person."

Arizona State University computer science professor Subbarao Kambhampati said these bots are programmed to be "people pleasers."

"The chatbots have been trained to kind of make you happy," Kambhampati said. "You can kind of change their advice by just talking back to them. They’ve been trained to be more appealing."

Dig deeper:

While a doctor won't always tell you what you want to hear, Kambhampati said AI usually will.

"You could actually ask them to be critical, but nobody asks them to be critical," Kambhampati said.

When it comes to the human mind, especially a vulnerable one, reassurance is not always the answer and can make things worse.

Experts say the kind of basic introductory questions AI handles can be okay, but you should not go much further than that without seeking professional help.

The Source: This information was provided by psychiatrist Dr. TeeJay Tripp of Serenity Mental Health and Arizona State University computer science professor Subbarao Kambhampati.
