AI has become part of one mental health provider's treatment tools. How does it work?

OCONOMOWOC – Imagine a young woman is admitted to an inpatient hospital for suicidal ideation, depression and anxiety. She's fearful and uncertain about her future.

But every few days, she's able to answer a series of questions targeting anxiety and depression. The program calculates her risks and, comparing her answers over time, predicts the best plan to get her on a healthy path. Her clinicians then analyze the data, offer treatments and the cycle repeats. The young woman continues to log her answers, her mood dramatically lightening over time.

That, broadly, has been the experience of patients at Rogers Behavioral Health since one of its residential treatment facilities in Oconomowoc launched an artificial intelligence program a year ago. At Rogers, AI powers two critical models: one that predicts the risk of suicide and one that predicts how a patient will respond to treatment. It's also taking center stage at a time when clinicians continue to find themselves overbooked and short-staffed.

The concern with AI as a tool is that it has a terrible public relations problem, some of it deserved, said Joe Austerweil, an associate professor of psychology with an affiliate appointment in computer science at the University of Wisconsin-Madison.

One of Austerweil's favorite movies depicting the use of AI is the 2012 film "Robot & Frank," starring Frank Langella and Susan Sarandon. Langella's character, Frank, has dementia, and his son buys him a robot companion to help take care of him. It's quickly revealed that Frank was a jewel thief in his past life, and he introduces the robot to a life of crime.

The whimsical film has inspired profound questions about identity, technology and philosophy — what exactly do we want from our relationship to machines?

"The robot gets promoted to do some of the things that Frank used to do. Well, Frank used to be a thief. So does the robot help this guy put on heists?" Austerweil said. "That's actually a realistic quandary that people in moral philosophy and moral psychology are studying."

Essentially, the biggest problem with AI is that humans are the weakest link, Austerweil said. And the biggest threat from AI, at least as far as Austerweil is concerned, comes in the form of misinformation and propaganda campaigns, often built on deepfake technology, which manipulates video and audio to make known public figures appear to say and do things in a convincing way.

In other words, humans are training AI to perform psychological heists on public consciousness. Recently, sexually explicit deepfake photos of Taylor Swift emerged on the internet. Others have depicted both leading presidential candidates in fabricated video and audio.

"It can be very hard to tell, even for a large number of folks well-versed in the technology," Austerweil said.

Of course, nothing fraudulent is happening at Rogers. And while Austerweil isn't familiar with Rogers' AI technology, he's not too concerned about its role in mental health.

"Getting people to use mental health support more is such a big issue that, if this makes a small dent in that problem, it's a huge breakthrough and major success," Austerweil said.

Artificial intelligence can be used to harness data otherwise missing from behavioral health. Evidence shows it improves outcomes.

At Rogers, the role of AI serves to augment clinical human work, according to Dr. Brian Kay, its chief strategy officer, who said the clinic has been developing an in-house AI program over the last decade. Rogers' AI program monitors the progress of its long-term residential patients, who are an ideal population on which to pilot the technology due to the length of stay and treatment intensity, Kay said.

Being able to engage with the AI software throughout a patient's time at a long-term residential facility, Kay said, allows clinicians to observe progress and learn what treatments reduce behavioral symptoms. As the AI suggests treatment strategies, human clinicians are able to talk through issues with the patient, guided by data.

And with each new dataset, the AI learns. Before it launched, the program was trained on data from close to 30,000 patients. In 2023, it added 2,500 more patients to its training data. It's a way to validate clinicians' hypotheses about what's working and what's not, Kay said.
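
Rogers hasn't published the internals of its model, but the retraining cycle Kay describes follows a familiar supervised-learning pattern: historical questionnaire scores and outcomes go in, a prediction of treatment response comes out, and each new cohort of patients becomes additional training data. A minimal sketch of that pattern, using synthetic data and invented features rather than anything from Rogers' system:

```python
# Hypothetical sketch of a treatment-response model retrained as new patient
# data arrives. Feature names, the label rule and the data are invented for
# illustration; this is not Rogers Behavioral Health's actual system.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def make_cohort(n: int):
    """Synthetic stand-in for intake screener totals (depression, anxiety,
    sleep disturbance) plus a did-the-patient-respond-to-treatment label."""
    X = rng.integers(0, 28, size=(n, 3)).astype(float)
    # Invented rule: higher depression and worse sleep lower the odds of response.
    logits = 1.5 - 0.08 * X[:, 0] - 0.05 * X[:, 2]
    y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)
    return X, y

# A historical cohort (roughly the ~30,000 patients used before launch)
# plus a new year of data (roughly the 2,500 patients added in 2023).
X_hist, y_hist = make_cohort(30_000)
X_new, y_new = make_cohort(2_500)

# Retrain on the combined data, holding out a slice to check generalization.
X_all = np.vstack([X_hist, X_new])
y_all = np.concatenate([y_hist, y_new])
X_train, X_test, y_train, y_test = train_test_split(
    X_all, y_all, test_size=0.2, random_state=0)

model = GradientBoostingClassifier().fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Held-out AUC after retraining: {auc:.3f}")
```

The point of the sketch is the loop itself: as more patient records arrive, the model is refit and checked against held-out cases before its predictions are trusted.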

"We view AI as a tool to augment our clinicians. It is not a tool that will replace clinicians or treatment, but may be helpful in identifying patterns that a human may not easily see," Kay said.

Lest anyone start to imagine a robot takeover, Skynet from "The Terminator" or HAL's watchful eye from "2001: A Space Odyssey," the use of AI, at least in Rogers' context, is part of an existing science-driven approach to mental health called measurement-based care.

According to the American Psychological Association, measurement-based care tracks behavioral health screening tools and other routine assessments to inform treatment decisions and engage patients on their journey to healing.

With AI handling the work of scoring mental health screenings and analyzing the data, clinicians are freed up to work more immediately, and more accurately, with patients.
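
The arithmetic behind measurement-based care is deliberately simple; the value comes from tracking it consistently. The sketch below follows one common screener, a PHQ-9-style depression total scored 0 to 27, and flags when a patient's scores shift by a clinically meaningful amount. The thresholds, class names and example scores are illustrative only, not Rogers' software and not a clinical tool.

```python
# Illustrative measurement-based-care check: track repeated PHQ-9-style
# depression screener totals (0-27) and flag meaningful change. Thresholds
# follow common PHQ-9 conventions, but this is a sketch, not a clinical tool.
from dataclasses import dataclass

SEVERITY = [(20, "severe"), (15, "moderately severe"), (10, "moderate"),
            (5, "mild"), (0, "minimal")]
MEANINGFUL_CHANGE = 5  # a roughly 5-point PHQ-9 shift is often treated as meaningful

@dataclass
class ScreenerTrend:
    patient_id: str
    scores: list[int]  # totals in the order they were collected

    def severity(self) -> str:
        latest = self.scores[-1]
        return next(label for cutoff, label in SEVERITY if latest >= cutoff)

    def flag(self) -> str:
        if len(self.scores) < 2:
            return "baseline only"
        change = self.scores[-1] - self.scores[0]
        if change >= MEANINGFUL_CHANGE:
            return "deteriorating - review treatment plan with patient"
        if change <= -MEANINGFUL_CHANGE:
            return "responding to treatment"
        return "no meaningful change yet"

trend = ScreenerTrend("patient-001", [14, 16, 18, 21])
print(trend.severity(), "|", trend.flag())
# severe | deteriorating - review treatment plan with patient
```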

AI may be especially useful for predicting suicidality in patients, Kay said. Sleep disturbances, in particular, offer insights for clinicians. The program also draws on the Columbia Suicide Severity Rating Scale, considered the gold standard by behavioral health professionals, to interview patients about their risk levels.
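
The C-SSRS itself is a short structured interview; what software adds is consistent triage of the answers. A simplified sketch of one common way the screener's yes/no items can be mapped to risk tiers appears below; the tier labels and field names are illustrative, not Rogers' implementation and not clinical guidance.

```python
# Simplified illustration of triaging answers to the Columbia Suicide Severity
# Rating Scale (C-SSRS) screener into risk tiers. The mapping loosely mirrors a
# common triage convention; it is NOT clinical guidance or Rogers' implementation.
def cssrs_triage(answers: dict[str, bool]) -> str:
    """answers: yes/no responses keyed by a simplified name for each screener item."""
    if (answers.get("ideation_with_intent")
            or answers.get("ideation_with_plan_and_intent")
            or answers.get("behavior_past_3_months")):
        return "high risk - immediate clinical follow-up"
    if answers.get("ideation_with_method") or answers.get("behavior_lifetime"):
        return "moderate risk - same-day clinical review"
    if answers.get("wish_to_be_dead") or answers.get("nonspecific_active_ideation"):
        return "low risk - monitor and reassess at next check-in"
    return "no ideation or behavior reported"

print(cssrs_triage({"wish_to_be_dead": True, "ideation_with_method": True}))
# moderate risk - same-day clinical review
```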

"Measurement-based care takes away the, for lack of a better term, fluffiness that's in behavioral health," Kay said. "This gives the ability to measure somebody's depression symptoms today compared to yesterday."

But measurement-based care remains an underutilized resource in behavioral health, Kay said. Less than 20% of behavioral health practitioners use it, even though study after study demonstrates it improves mental health outcomes in patients.

In the absence of measurement-based care, research shows, clinicians struggle to identify when patients aren't responding to treatment and, more concerning, fail to note when a patient's mental health is deteriorating.

Kay explained that Rogers gave its homegrown AI program aggregated data from the clinic's various services and the program was able to pick up key patterns. Who got better through treatment and why? Who didn't get better through treatment and why?

"If we can identify if someone falls into one of those patterns, we can change their treatment plan early on in collaboration with the patient," Kay said. "It gives them the best chances to respond to treatment during the course of their stay."

You can think of Rogers' AI software as a GPS determining the best route out of traffic. It doesn't take the place of the driver, but by analyzing the different routes and real-time traffic patterns, it offers a path forward that gets you home faster.

Using that metaphor, the clinician simply has a better idea of where they're going by using AI. The alternative might look like driving a curved road in the dark with only your headlights to guide you forward.

"Our patients comment that the use of data in their treatment is actually very helpful, and it helps spur on conversations with clinicians," Kay said, referencing the power of measurement-based care. "It has very much been a patient-satisfier in that way."

AI may have some powerful benefits, but the public is increasingly suspicious of the technology

The arrival of Rogers' AI software comes at an inflection point. AI is everywhere, from search engines like Microsoft Bing to Spotify's AI DJ, and it isn't going away. With its ubiquity, however, comes caution as Americans grow increasingly leery of the technology.

Consider recent data from Pew Research Center, which surveyed U.S. adults in 2021 about their views on artificial intelligence. Nearly half of the respondents said they were "equally excited and concerned" about the increased role of AI in daily life.

Within two years, that attitude dramatically changed. The 45% who said in 2021 they were "equally concerned and excited" dropped to 36% by 2023, while the 37% who were "more concerned than excited" about AI in 2021 grew to more than half, at 52%, in 2023.

The health care industry's increasing reliance on AI was a particular source of discomfort, with six in every 10 U.S. adults saying they're uncomfortable with the idea. And Americans were split on whether AI would improve, worsen or make no difference in their health outcomes, with slightly more saying AI would better their health.

One widely expressed viewpoint concerns how AI will interfere with the patient-provider relationship.

"Part of it, I think, is the mystery of AI. AI is a black box, right? So you put information in, it kind of does some different things that may not be easily seen. And then it outputs a number," Kay said. "A lot of providers, they look at it like: 'How am I supposed to use something I don't exactly understand?'"

AI in therapy may seem suspect to some, but is it worse than waiting six months to a year to see a human clinician? A UW-Madison psychologist who teaches machine learning says no.

There may be a lot to worry about when it comes to AI, but Austerweil said the danger is likely not going to come from behavioral health care.

"One way I like to sum it up is … rather than us having to learn to use machines, the machines are learning how to interact with us," Austerweil said. "There's a lot of dangers and risks (with AI), and they're already there, a large amount of them. But we're not close to Skynet coming."

On the contrary, leveraging AI to broaden the public's access to therapy can only be a benefit, especially at a time when wait times to see a counselor are, in Austerweil's words, "abysmal."

AI can be an especially helpful tool when diagnosing neurodiverse patients — such as those with autism — whose conditions are often misdiagnosed. Some of that is the result of "masking," when an autistic person suppresses their behaviors to be perceived as neurotypical. The urge to camouflage may not necessarily be there if a person is expressing their feelings to a machine. In other words, the fear of being judged all but disappears.

"A lot of people on the autism spectrum don't feel comfortable as much with people, generally," Austerweil said. "That's important because folks with autism spectrum disorders often have comorbid social anxiety."

The ability to speak to a machine can help any individual wary of therapy. Shame and stigma can interfere with how someone presents their mental health to a physician. Austerweil said expressing some of those struggles with a machine instead of a human can allow people to share more information than they otherwise might.

It's less taboo to tell a computer you're in a dark place, which might explain why so many people express more details about their mental states on social media.

AI can relieve the staffing burdens felt in residential settings

Measurement-based care has a host of benefits, but reviewing forms that patients fill out a few times a week can be exhausting for a clinician who is already overbooked. Rather than leave the clinician to sift through stacks of patient forms, AI can sort and flag the critical data a clinician needs.

Relieving some of those burdens is especially important for mental health providers, who "have to do more with less," Kay said. According to a 2022 report from the Kaiser Family Foundation, nearly 35% of Wisconsin adults reported an unmet need for counseling or therapy, amid surging rates of anxiety and depressive disorders, substance use disorders and suicide.

Rogers is also piloting a new ambient listening component of its software during patient-clinician conversations at a handful of its facilities, from outpatient to residential. The AI model distills the key takeaways of each conversation for the clinician to review, which frees up the clinician's time.
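
Rogers hasn't said which models power the pilot, but the general ambient-listening pattern, transcribing a session and condensing the transcript into a draft note for the clinician to review, can be sketched with off-the-shelf tools. The model and transcript below are placeholders, not the clinic's actual pipeline.

```python
# Hypothetical sketch of the ambient-listening pattern: condense a session
# transcript into a draft note for clinician review. The model and transcript
# are placeholders; this is not Rogers' actual pipeline.
from transformers import pipeline  # pip install transformers torch

summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

transcript = (
    "Clinician: How has your sleep been since we adjusted the plan last week? "
    "Patient: Better, about six hours a night, but I still wake up anxious most "
    "mornings, and I skipped group twice because of it. "
    "Clinician: Let's keep the current schedule and add a morning check-in."
)

draft_note = summarizer(transcript, max_length=60, min_length=20, do_sample=False)
# The clinician reviews and edits the draft before anything enters the chart.
print(draft_note[0]["summary_text"])
```

In a setup like this, the automation only produces a draft; the documentation of record still passes through a human.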

"That is one of the biggest pieces from a staff side of things: ensuring that we're documenting conversations and it's done accurately," Kay said. "That's, frankly, a larger time consumer for our staff."

Rogers is leading the trend of using AI in behavioral health spaces. Are other providers following suit?

USA TODAY NETWORK-Wisconsin reached out to a number of mental health providers throughout Wisconsin to learn whether they have any plans to add AI to their repertoire of tools.

UW Health said it's currently not using AI, but that it might be a future possibility.

Debbie Patz, vice president of Bellin Psychiatric Center, said that it hasn't made the shift quite yet. Medical experience and training would still need to be coupled with AI for patient care, Patz said, and there are some things a trained expert would still need to do in person.

For example, a patient in crisis may report feeling fine but exhibit signs such as profuse sweating, pacing and trouble focusing. AI could, of course, play a role, she said, but certain pieces still need skilled and trained professionals.

Rogers is only getting started with AI. Biometrics is a new frontier: patients wear a proximity-based, Bluetooth-enabled bracelet during their residential stay, and the bracelet syncs to tablets carried by staff. Staff will not only be able to observe and document patient safety checks on a tablet but also see how much sleep a patient gets throughout the night.

"AI gets a bad name and it's very overhyped," Kay said. "But can it be a really good tool to help in certain situations? Absolutely."

Natalie Eilbert covers mental health issues for USA TODAY NETWORK-Wisconsin. She welcomes story tips and feedback. You can reach her at neilbert@gannett.com or view her Twitter profile at @natalie_eilbert. If you or someone you know is dealing with suicidal thoughts, call the National Suicide Prevention Lifeline at 988 or text "Hopeline" to the National Crisis Text Line at 741-741.

This article originally appeared on Green Bay Press-Gazette: AI has entered mental health care in Wisconsin. Are patients ready?