
How AI could help with the shortage of mental health professionals

Posted 4:37 PM, Apr 11, 2024; last updated 5:34 PM, Apr 11, 2024

Can AI help with the shortage of therapists the country is experiencing?

The short answer: There’s potential.

“I think the need to aid in therapy is definitely there, but those just, the risks at the same time are great,” said H. Andrew Schwartz, an associate professor at Stony Brook University and director of the Human Language Analysis Lab.

Schwartz was an author of research released this year examining how large language models could change the future of behavioral health care.

He said there are pros and cons to using AI in the therapy space.


While its best use may not be to replace the act of therapy between a client and clinician, there are other aspects of the job it can help with, "such as summarizing therapy sessions,” Schwartz said.

Another use could be in the assessment process.

“There's also less risk because it's not an interactive task, there’s less risk of giving someone bad, risky instructions,” he said.

Around 1 in 5 adults experience mental illness each year, according to the Substance Abuse and Mental Health Services Administration.

One study projected a shortage of between 14,280 and 31,091 psychiatrists by 2024.

Health companies have begun to experiment with using AI to help with the shortage of mental health care providers, but it hasn’t always gone well.

Last year, the National Eating Disorders Association took down its AI-powered chatbot after it "may have given information that was harmful," the organization wrote in an Instagram post.

“I think they have enormous potential for helping therapists and helping more people get treatment, especially personalized treatment for their unique situation. On the other hand, to get there, to get it to have the most help the fastest, I think we’re many years away from doing that in a way that's risk-free,” Schwartz said.

