People Are Using ChatGPT in Place of Therapy—What Do Mental Health Experts Think?


Fact checked by Nick Blackmer




  • Some people online are experimenting with using artificial intelligence in place of real therapy for mental health issues.

  • For now, experts say using the technology as a therapist likely isn't safe, since it poses a number of confidentiality and safety concerns.

  • But given therapy’s frequent accessibility and cost issues, some experts said AI could eventually supplement it as the technology advances.






10'000 Hours/Getty Images

Artificial intelligence is having a moment. As AI-powered chatbots like ChatGPT gain popularity, more people have taken to testing out the technology on tasks like answering math questions, translating words or sentences, and even generating recipe ideas or grocery lists.

Some people on social media have also begun using these AI chatbots as makeshift therapists. By presenting the technology with mental health questions or crises, people can receive advice—often free advice—without having to spend the time or money on therapy sessions.

One TikTok user went so far as to say they replaced their therapist with an AI chatbot. “Today I officially quit therapy because I just found the best free replacement: using ChatGPT,” the TikToker said, suggesting others might want to do the same.

This advice, however, is worrisome to healthcare providers who focus on mental health.

“Be skeptical. [AI chatbots] are not meant to be used as a substitute for therapy, psychotherapy, or any kind of psychiatric intervention,” Bruce Arnow, PhD, professor and associate chair of the department of psychiatry and chief psychologist at Stanford University, told Health. “They’re just not far enough along for that, and we don’t know if they’ll ever be.”

Here’s what expert psychologists had to say about why using AI as a therapist could be a concern, best practices when it comes to seeking help for mental health issues, and the ways that AI could be used safely in the future.

Related: Poor Body Health May Indicate Poor Mental Health—Experts Discuss Mind-Body Connection

How Are People Using AI Chatbots for Therapy?

It can be challenging to imagine what a “therapy session” with AI might look like. But for most users online, it simply means messaging with an AI chatbot, which allows people to ask specific and oftentimes personal questions.

One TikTok user walked their followers through a conversation they had with ChatGPT, instructing the chatbot to “act as my therapist. I need support and advice about what I should do in certain situations I have been struggling with.”

ChatGPT responded that it was “here to support [them] and offer advice,” before asking follow-up questions about the creator’s concerns and offering possible solutions. It also recommended that they seek professional help if their anxieties still felt overwhelming.

Another TikTok creator shared screenshots of their conversation with an AI chatbot embedded in the social media app Snapchat. When the user presented the chatbot with questions about issues in a relationship, it responded, “It’s understandable to want to know what’s going on with a friend. But it’s important to respect their boundaries and give them space if that’s what they need.”

Still other users have presented ChatGPT with suicidal ideation. Even in these situations, the technology seems to respond remarkably well, said Olivia Uwamahoro Williams, PhD, assistant professor of counselor education at the University of West Georgia, and co-chair of the American Counseling Association Artificial Intelligence Interest Network.

“They all would generate very sound responses,” she told Health. “Including resources, national resources—so that was good to see. I was like, ‘Okay, well these things are very accurate. The generated response is very counselor-like, kind of therapist-esque.’”

Related: I Have Tried Online Therapy at Three Companies. Here's What I Learned

Concerns About Using AI Chatbots for Therapy

Despite the chatbots’ seemingly good responses to queries about mental health concerns, psychologists agree that simply using AI in place of traditional therapy is not yet a safe option.

At the most basic level, there are some concerns about ChatGPT or other AI chatbots giving out nonsensical or inaccurate information to questions, Arnow explained. ChatGPT itself warns users that the tech “may occasionally generate incorrect information,” or “may occasionally produce harmful instructions or biased content.”

Beyond this, Uwamahoro Williams said there are some logistical concerns with trying to use AI as a therapist, too.

Therapists are trained and licensed, which means that they have to maintain a certain standard of practice, she explained. AI chatbots don’t have these same guidelines.

“There’s not a person involved in this process. And so the first concern that I have is the liability,” she said. “There’s a lack of safety that we have to be open and honest about, because if something happens, then who is held accountable?”

Similarly, using AI as a therapist involves putting sensitive information on the internet, Uwamahoro Williams added, which could be a privacy issue for some people.

In the case of ChatGPT, the site does collect and record conversations, which it says it uses to better train the AI. Users can opt out of this data collection, or they can delete their account or clear their conversations; cleared conversations are removed from ChatGPT’s systems after 30 days.

Uwamahoro Williams is also concerned that advice from a chatbot could be misinterpreted by the person seeking help, which could make things worse in the long run.

All of these qualms, however, trace back to one main issue: AI is just that—artificial.

“I think in the future it's going to probably surpass us—even therapists—in many measurable ways. But one thing it cannot do is be a human being,” Russell Fulmer, PhD, senior associate professor at Xi’an Jiaotong-Liverpool University and incoming professor and director of counseling at Husson University, told Health. “The therapeutic relationship is a really big factor. That accounts for a lot of the positive change that we see.”

Traditional therapy allows the provider and patient to build an emotional bond, as well as clearly outline the goals of therapy, Arnow explained.

“AI does a really good job in gathering a lot of knowledge across a continuum,” Uwamahoro Williams said. “At this time, it doesn't have the capacity to know you specifically as a unique individual and what your specific, unique needs are.”

Related: Talk Therapy Is Good for Your Heart Health, Study Finds

Will AI Chatbots Ever Be a Safe Therapy Option?


Though psychologists largely agree that using AI as a stand-in for a therapist isn’t safe, they diverge a bit on when and if the technology could ever be useful.

Arnow is a bit skeptical as to whether AI chatbots could ever be advanced enough to provide help on the same level as a human therapist. But Fulmer and Uwamahoro Williams are a bit more comfortable with the idea of chatbots potentially being used in addition to traditional therapy.

“These platforms can be used as a supplement to the work that you’re actively doing with a professional mental health provider,” Uwamahoro Williams said.

Chatting with an AI could even be thought of as another tool to further work outside of therapy, similar to journaling or meditation apps, she added.

Some AI chatbots, such as Woebot Health and Elomia, are even being piloted specifically for mental health purposes. Because they’re built to handle mental health-related queries, it’s possible they could be a better option.

For example, Elomia says they have a safety feature where humans will step in if people need to speak to a real therapist or a hotline, and Woebot says their AI has a foundation in “clinically tested therapeutic approaches.”

Most of these programs—like AI in general—are still being developed and piloted, though, so it’s probably too early to evaluate them definitively, Fulmer said.

Online AI therapy certainly can’t hold a candle to the real thing—at least for now—Fulmer and Arnow agreed. But the fact remains that mental health care is inaccessible for many people: therapy can be incredibly expensive, many therapists don’t have space for new clients, and persistent stigma remains, all of which can dissuade people from getting the help they need.

“I guess there’s a difference between my ideals and the recognition of reality,” Fulmer said. “ChatGPT and some of these chatbots, they offer a scalable solution that’s, in many cases, relatively low-cost. And they can be a piece of the puzzle. And many people are getting some benefit from them.”

If just one person has received some sort of benefit from treating AI as a therapist, then the notion of whether it could work is at least worth considering, Fulmer added.

For now, ChatGPT may have useful applications in helping people “screen” themselves for mental health disorders, experts said. The bot could guide someone through common symptoms to help them decide whether to seek professional help or a diagnosis.

AI could also help train new counselors and help psychologists learn more about which strategies are most effective, Arnow and Uwamahoro Williams said.

Years down the line, as AI advances, it may have more applications in therapy, Fulmer said, but it still may not be right for everyone.

“Right now, there is no substitute for a human counselor,” Fulmer said. “[AI] synthesizes large data sets, it’s good with offering some useful information. But only a real-life therapist can get to know you and tailor their responses and their presence to you.”
