Artificial intelligence (AI) can analyze vast amounts of data more quickly than humans and draw accurate conclusions, making for myriad business applications. Now, some technologists and mental healthcare clinics are using AI to inform mental health therapy, hoping it can pinpoint the most effective strategies to help patients recover. For businesses, this could mean better staff retention, higher productivity and greater value for money from employee assistance programs.
UK-based mental health clinic Ieso and US-based mental health technology firm Lyssn are among those experimenting with AI-augmented therapy.
Why mental health is harder to treat
I spoke with Valentin Tablan, Chief AI Officer at Ieso. He says while physical healthcare has MRIs, CAT scanners and hundreds of blood tests to diagnose conditions, “In mental healthcare, we’ve been slower to adopt technology. But that’s changing. We can start using artificial intelligence and natural language processing techniques to understand therapy-specific language and gain insights from that.”
Ieso uses AI to analyze transcripts of patient-therapist conversations to identify which elements of therapy produce the best outcomes, flag cases that may need a more experienced therapist and evaluate the quality of care. The company is also working on a cognitive behavioral therapy (CBT) companion app that supports patients between therapy sessions.
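To picture what that analysis involves, here is a minimal Python sketch of one plausible building block: classifying therapist utterances by technique, so techniques can later be correlated with patient outcomes. The utterances, labels and model choice are invented for illustration; Ieso’s actual pipeline has not been made public in this detail.

```python
# Illustrative sketch: label therapist turns by technique so that
# techniques can later be correlated with outcomes. The tiny dataset
# and model are assumptions, not Ieso's actual system.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled therapist utterances (real systems train on
# thousands of annotated session transcripts).
utterances = [
    "What evidence do you have that the thought is true?",
    "Let's plan one small activity you could try this week.",
    "That sounds really difficult. I hear how hard this is.",
    "Which situations tend to trigger these thoughts?",
    "Could we schedule a walk for tomorrow morning?",
    "It makes sense that you would feel overwhelmed.",
]
labels = [
    "cognitive_restructuring", "behavioural_activation", "empathic_reflection",
    "cognitive_restructuring", "behavioural_activation", "empathic_reflection",
]

# A simple bag-of-words classifier: enough to show the shape of the approach.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
model.fit(utterances, labels)

print(model.predict(["What would you say to a friend who had that thought?"]))
```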
Lyssn also uses AI to evaluate the quality of psychotherapy and give therapists actionable feedback.
Practitioners hope to increase recovery rates and bring the quality of mental healthcare on par with physical healthcare, where recovery rates are higher. Ieso has seen recovery rates increase 2 percent since introducing its AI program – small but encouraging.
AI no replacement for therapists
“Psychotherapy’s quality is usually measured by experts observing a conversation. Our AI is trained to replicate this human judgment, reducing the cost and time to evaluate the conversation,” says Zac Imel, chief science officer and co-founder of Lyssn.
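Conceptually, that is a supervised learning problem: train a model on sessions that experts have already rated, then let it score new sessions automatically. Below is a minimal sketch under that assumption; the session features and expert ratings are invented for illustration and do not reflect Lyssn’s actual models or rating scales.

```python
# Illustrative sketch: learn to replicate expert quality ratings from
# per-session features. Features and ratings are invented assumptions.
import numpy as np
from sklearn.linear_model import Ridge

# Hypothetical per-session features: [empathic reflections, open questions,
# therapist talk-time ratio], paired with an expert rating from 1 to 5.
X = np.array([
    [12, 8, 0.45],
    [3, 2, 0.80],
    [9, 10, 0.50],
    [1, 1, 0.90],
])
y = np.array([4.5, 2.0, 4.0, 1.5])  # expert quality ratings

# Fit a simple regressor, then score an unseen session automatically.
scorer = Ridge(alpha=1.0).fit(X, y)
new_session = np.array([[7, 6, 0.55]])
print(f"Predicted quality rating: {scorer.predict(new_session)[0]:.1f}")
```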
These solutions aren’t meant to replace traditional therapy but to provide an evidence-based way to improve therapy’s quality and accessibility.
More accessible, AI-powered forms of therapy include chatbots. Chatbots simulate human conversation, using AI that learns from examples of real conversations – in this case, often therapy session transcripts – to replicate, with some accuracy, how a therapist would respond to a patient. They can support patients between sessions, collect useful data and provide emotional support to people who cannot access therapy because of stigma, time constraints or cost.
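One simple way such a chatbot can work – an illustrative assumption, not how any particular product is built – is retrieval: given a new message, find the most similar patient utterance in a corpus of past conversations and return the therapist reply that was paired with it.

```python
# Illustrative retrieval-based chatbot: answer a new message with the
# therapist reply paired to the most similar known patient message.
# The tiny corpus is invented for demonstration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical (patient message, therapist reply) pairs from transcripts.
pairs = [
    ("I can't stop worrying about work.",
     "It sounds like work is weighing on you. What worries you most?"),
    ("I feel like nothing I do matters.",
     "That sounds painful. Can you tell me about a recent example?"),
    ("I haven't been sleeping well lately.",
     "Poor sleep can make everything harder. How long has this been going on?"),
]

vectorizer = TfidfVectorizer()
prompts = vectorizer.fit_transform([p for p, _ in pairs])

def respond(message: str) -> str:
    """Return the reply paired with the most similar known prompt."""
    sims = cosine_similarity(vectorizer.transform([message]), prompts)
    return pairs[sims.argmax()][1]

print(respond("Work stress is keeping me up at night."))
```

Production systems go much further, with far larger datasets, generative or hybrid models and clinical safety layers on top.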
The UK’s National Health Service (NHS) says up to 8 million people in England want to access mental health treatment but do not qualify for it. High costs, long waiting lists and a reluctance to unburden themselves to a stranger also keep people away from therapy.
“One barrier to providing care to more people is that CBT, the most common evidence-based treatment for depression, is hard to teach. Having tools to speed up the training would help,” Dr. Paola Pedrelli, Assistant Professor of Psychology at Harvard Medical School, told me. Additionally, supervisors usually evaluate therapists in training and give them feedback, but the time and resources aren’t always available. AI could bridge that gap.
Pedrelli is part of a team that worked on a project using machine-learning algorithms to diagnose and monitor major depressive disorder. “Sometimes treatment takes months to work. Tools to speed the process would be beneficial,” she added.
AI’s advantage over the human brain
AI’s biggest advantage over the human brain is that it is fast and reliable at analyzing data, said Imel of Lyssn. AI can quickly analyze, process and gain insight from large data sets, making it a powerful tool in mental healthcare, where markers of recovery are often vague.
“With repetitive tasks, humans get tired and can drift over time. And if you show a human the same conversation twice, they might evaluate it differently. With AI – same data in, same answer out,” Imel says.
How AI-powered mental health can help employees
AI-based therapy tools – Ieso’s CBT companion app, Ellie (designed by the Institute for Creative Technologies (ICT) at the University of Southern California), Woebot, The Trevor Project’s Crisis Contact Simulator and others – cannot replace human therapists, but they can help improve access and reduce cost.
In a randomized, controlled trial of 70 participants, Woebot users displayed a significant decrease in depression symptoms compared with the control group. Both groups showed a decrease in anxiety.
Those experiencing mental distress or stress at work could be directed to therapy bots as a bridge between needing therapy and getting access to it. Chatbots can also gather data about how a patient feels and suggest clinical intervention where required. Organizations considering giving employees access to therapy bots should consult an expert and ensure these tools are offered alongside human therapy, not as a substitute for it.
Privacy and security in AI-powered therapy
Alongside benefits, AI-powered therapy programs come with obvious cybersecurity and privacy risks. Sensitive and private conversations between patients and therapists could be exposed or manipulated. Iron-clad security is the first step to protecting patient privacy.
Tablan and Imel both stressed how important data security is to their companies and described how they keep patient data safe, including multiple layers of security and robust security certifications. “Most people working at Ieso can’t access this data. It’s accessible to computers, but few humans,” said Tablan.
All data is encrypted, and we go beyond basic security standards. We also de-identify data by removing metadata and in-session identifiable information, so we cannot link conversations with speakers.
Zac Imel, Co-founder and Chief Science Officer, Lyssn
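De-identification of this kind can be sketched in a few lines: scrub obvious direct identifiers from a transcript before anyone, human or machine, analyzes it. The patterns below are illustrative assumptions only; real pipelines combine named-entity recognition, metadata scrubbing and human review.

```python
# Illustrative de-identification pass: replace direct identifiers with
# placeholder tokens. These simple regexes are assumptions for the sketch;
# they are not Lyssn's actual pipeline and would miss many identifiers.
import re

REDACTIONS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "[EMAIL]"),             # email addresses
    (re.compile(r"\+?\d[\d\s().-]{7,}\d"), "[PHONE]"),               # phone numbers
    (re.compile(r"\b(?:Mr|Mrs|Ms|Dr)\.? [A-Z][a-z]+\b"), "[NAME]"),  # titled names
]

def deidentify(text: str) -> str:
    """Replace direct identifiers with placeholder tokens."""
    for pattern, placeholder in REDACTIONS:
        text = pattern.sub(placeholder, text)
    return text

print(deidentify("Dr. Smith, call me on 020 7946 0123 or jane@example.com."))
# -> "[NAME], call me on [PHONE] or [EMAIL]."
```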
Even with the strongest security practices, risks to patient privacy remain, since much still depends on the humans who can access the data. But those involved believe the potential benefits of AI-powered therapy outweigh the risks.
AI-augmented, not AI-based therapy
There’s much to suggest unconscious bias is as present in the therapy room as in other parts of life. Black and minority ethnic people are less likely to find counseling fit for purpose, and lesbian, gay and bisexual people report greater dissatisfaction with counseling and psychotherapy than their heterosexual counterparts. Since AI programs can absorb the biases in the data used to train them, there are legitimate concerns that AI-based outcomes could further marginalize some patients.
Dr. Tonya Davis, a faculty member at Northwestern University in Illinois, suggests “unconscious biases – things hidden just beneath the surface of our awareness” might affect the training of AI chatbots. “I think that if there is a deliberate effort to identify, understand and resolve biases beneath the surface of our awareness, we could be one step closer to addressing the cultural challenges in person-to-person interactions, thereby minimizing these challenges in the development of AI.”
Pedrelli said, “There have been cases where AI has provided inappropriate directions in the context of therapy. And AI is still unable to detect subtle emotional changes.”
However, AI tools should only augment diagnosis and therapy; they shouldn’t be relied on to make unilateral decisions about patient care.
“The tools we’ve built look over our quality of care – they never make any decisions automatically,” said Tablan. “They just present information to a qualified clinician, who has access to the transcripts and can make up their own mind.”
Imel doesn’t describe what Lyssn does as “AI-based therapy.” Instead, he says, “This is about using technology to augment the human talent of therapists. So, it’s human-based therapy, augmented by AI.”