Artificial intelligence is just as good as humans at identifying red-flag language in text messages from people with mental illness, a recent study found. But with many clinicians concerned about being replaced by AI, should they worry about these discoveries?
Replacement won’t be an issue, said Justin Tauscher, lead author of the University of Washington Medicine study mentioned above. On the contrary, AI tools are there to support clinicians, not to replace them.
“The hope is that these tools and the potential interventions that could be developed… will help give clinicians greater insight into how they might better support their work,” Tauscher said in an interview. “So rather than replacing clinicians or changing the way their work is done, it’s just adding to the work that can be done and adding to the tools that someone has.”
Another industry expert agreed with Tauscher’s comments. AI is simply a support tool for clinicians, said Robin Farmanfarmaian, an entrepreneur who works with AI startups and author of “How AI Can Democratize Healthcare.”
“AI will never replace doctors,” she said in an interview. “It is the doctors who use AI who will replace the doctors who do not use it. AI is a tool… the app can do one thing very well. So in the case of the study, it can only detect the few things it was trained to look for, while the therapist is looking at a lot more.”
The study, published in the journal Psychiatric Services, collected thousands of text messages from 39 patients in a 12-week randomized controlled trial. Clinicians rated the texts for cognitive distortions, or thought patterns that increase depression and anxiety. The researchers also used natural language processing, meaning they trained computers to identify cognitive distortions. Specifically, clinicians and computers looked for five types of cognitive distortion in the language: mental filtering, jumping to conclusions, catastrophizing, “should” statements, and overgeneralization.
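The study trained a classifier on clinician-labeled messages. As a toy illustration of what flagging these five distortion categories might look like, here is a minimal keyword-based sketch; the patterns and function names are invented for demonstration and are not the study's actual model, which used supervised machine learning rather than hand-written rules:

```python
import re

# Illustrative keyword patterns for the five distortion types named in the
# study. These regexes are invented for demonstration purposes only; the
# study's classifier learned its signals from labeled data.
DISTORTION_PATTERNS = {
    "should_statements": r"\b(should|must|ought to|have to)\b",
    "overgeneralization": r"\b(always|never|everyone|nobody|everything)\b",
    "catastrophizing": r"\b(disaster|ruined|terrible|worst)\b",
    "jumping_to_conclusions": r"\b(they think|they hate|i know (he|she|they))\b",
    "mental_filtering": r"\b(nothing good|only bad|all that matters)\b",
}

def flag_distortions(message: str) -> list:
    """Return the distortion categories whose patterns match the message."""
    text = message.lower()
    return [name for name, pattern in DISTORTION_PATTERNS.items()
            if re.search(pattern, text)]

print(flag_distortions("I should never have tried; everything is ruined."))
```

A rule-based pass like this only catches surface wording; the appeal of a trained classifier is that it can pick up distorted thinking that no fixed keyword list anticipates.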
This is the first study to use natural language processing for text messages between patients with severe mental illness and clinicians, Tauscher said.
“This was the first, at least to our knowledge, to attempt to train a classifier for cognitive distortions on text message exchanges between people with severe mental illness and their clinicians… Just knowing that we can take advantage of certain natural language processing techniques to better understand what is going on between a patient and their clinician and therapeutic intervention is really exciting.”
Text messages have become a common part of mental health treatment, Tauscher said. Usage started in simpler ways, such as appointment reminders, medication reminders and statements of support, he said. Eventually, text messaging in mental health settings became real therapeutic conversations between clinicians and patients, including letting people know about mental health resources in the community or checking in on clients between sessions.
“We’re learning a lot more about how to use text messages to actually deliver therapeutic interventions,” Tauscher said. “Things like challenging thoughts, relaxation, identifying cognitive distortions. Real back-and-forth text messaging has proven to be useful in this way, and some clients really like this type of modality because it’s convenient. They don’t necessarily need to go to an office for it, and there’s something about not having a face-to-face meeting that is helpful for some clients.”
Now that the researchers have these findings, they want to expand their work. Going forward, they want to test their model on a different group of clinicians and patients, as well as look for other examples of cognitive distortion, Tauscher said.
Most importantly, Tauscher wants to show clinicians how technology can support and enhance their work, not replace them.
“In mental health, language is one of the most important ways we deliver care, and it contains a lot of information that can be used to inform how we deliver our services… We can use advances in technology, from natural language processing and artificial intelligence, to capture the information carried in our language and improve how mental health care is delivered to people over time,” Tauscher said.
Photo: metamorworks, Getty Images