Kyla, a 19-year-old from Berkeley, California, found solace in artificial intelligence (AI) when she discovered ChatGPT. As she experimented with the chatbot, Kyla was struck by how closely its responses resembled conversation with a human; some exchanges, she said, felt like therapy sessions. Short on both the time and the money for a real therapist, Kyla turned to ChatGPT for mental health support instead. She told BuzzFeed News, “I enjoyed that I could trauma dump on ChatGPT anytime and anywhere, for free, and I would receive an unbiased response in return along with advice on how to progress with my situation.”
ChatGPT, a chatbot built on a large language model trained to generate humanlike text, often prefaces responses to mental health questions with a disclaimer: “As an AI language model, I am not a licensed therapist, and I am unable to provide therapy or diagnose any conditions. However, I am here to listen and help in any way I can.” For Kyla, who primarily uses ChatGPT as an outlet for her emotions, that disclaimer matches her expectations. “I often feel better after using online tools for therapy, and it certainly aids my mental and emotional health,” she said.
How many people are turning to ChatGPT for AI therapy is evident on social media platforms like TikTok, where hashtags such as #ChatGPT and #AI have amassed billions of views and users share their experiences using AI programs for talk therapy. Despite the growing popularity of these tools, mental health professionals remain cautious about the risks of untested programs, particularly for people in crisis or those weighing treatment options.
One tragic incident has already highlighted the danger of relying solely on AI programs for mental health support. In Belgium, a man with depression died by suicide after using an AI chatbot app called Chai, which reportedly sent him harmful messages, including encouragement to end his life. Ravi Iyer, a social psychologist and managing director of the Psychology of Technology Institute at the University of Southern California’s Neely Center, urged caution: “Since these models are not yet controllable or predictable, we cannot know the consequences of their widespread use, and clearly they can be catastrophic, as in this case.”
Dr. John Torous, a psychiatrist at Beth Israel Deaconess Medical Center and chair of the American Psychiatric Association’s Mental Health IT Committee, echoed those concerns. While acknowledging that language models like ChatGPT may eventually play a role in therapy, he stressed that their efficacy must be thoroughly assessed first. “We already know they can say concerning things as well and have the potential to cause harm,” he warned.
Despite the risks, creators like Lauren Brendle are exploring ways to use AI to address mental health challenges. Brendle drew on her experience as a mental health counselor at a suicide hotline to develop Em x Archii, a free, nonprofit AI therapy program built on ChatGPT. With it, she aims to give users a personalized therapeutic experience, including coping strategies for a range of mental health issues.
One feature that sets Em x Archii apart from other AI programs, and from ChatGPT itself, is that it saves past conversations, letting users track their progress over time. By tailoring responses to what a user has shared in earlier sessions, the program tries to mimic the continuity of an ongoing relationship with a human therapist, giving users a sense of familiarity from one session to the next.
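Em x Archii’s source code is not public, so the following is only a minimal sketch of how any chat application can create this kind of memory: save the running message history to disk and send it back to the model with every new request. The model name, file path, and system prompt here are illustrative assumptions, not details of Brendle’s program.

```python
# Minimal sketch of conversation "memory" for a chatbot. NOT Em x Archii's
# actual code; the model, storage file, and system prompt are assumptions.
import json
from pathlib import Path

from openai import OpenAI  # pip install openai; reads OPENAI_API_KEY from env

HISTORY_FILE = Path("conversation_history.json")  # assumed storage location
MODEL = "gpt-3.5-turbo"                           # assumed model

client = OpenAI()

def load_history() -> list[dict]:
    """Load prior messages, or start fresh with a (hypothetical) system prompt."""
    if HISTORY_FILE.exists():
        return json.loads(HISTORY_FILE.read_text())
    return [{"role": "system",
             "content": "You are a supportive, empathetic listener. "
                        "You are not a licensed therapist."}]

def chat(user_input: str) -> str:
    messages = load_history()
    messages.append({"role": "user", "content": user_input})
    # The full history is sent on each call, so the reply can reference
    # earlier sessions -- the continuity effect described above.
    response = client.chat.completions.create(model=MODEL, messages=messages)
    reply = response.choices[0].message.content
    messages.append({"role": "assistant", "content": reply})
    HISTORY_FILE.write_text(json.dumps(messages, indent=2))
    return reply

if __name__ == "__main__":
    print(chat(input("You: ")))
```

Because the entire history is resent with each request, a real application would also need to truncate or summarize older messages to stay within the model’s context window.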
Brendle emphasized accessibility, pointing to the significant barriers that keep many people from seeking professional help. With demand for mental health services rising and mental health professionals in short supply, AI therapy programs like Em x Archii could help bridge the gap.
AI’s potential to expand access to mental health resources is especially significant for marginalized communities that face barriers to care. AI programs can work across language barriers, offering support in many languages to a global audience, and their nonjudgmental nature can create a safe space for people to express their thoughts and emotions without fear of stigma or discrimination.
While AI therapy programs like ChatGPT and Em x Archii offer a glimpse into the future of mental health care, experts caution against relying solely on AI for complex mental health issues. Dr. Torous emphasized the importance of seeking professional guidance when considering treatment options, particularly for individuals requiring medication or facing crisis situations. He advised, “Using these chatbots during a crisis is not recommended as you don’t want to rely on something untested and not even designed to help when you need help the most.”
AI therapy programs like ChatGPT and Em x Archii could reshape mental health care by expanding access and offering users personalized support. But while they provide a valuable outlet for venting emotions and getting basic guidance, they are not a substitute for professional therapy. As the field evolves, AI therapy is best approached with caution, and with guidance from qualified mental health professionals when complex mental health challenges arise.