
Can You Use ChatGPT as a Therapist?
ChatGPT, now powered by GPT-4, sits at the pinnacle of modern-day AI. More and more, it is being integrated into our daily lives, and its popularity is growing rapidly. At the same time, with the rise of technology and the digital sphere, people are becoming increasingly isolated and often lack true friends and companions.
Many have come to see ChatGPT as a solution to this problem. It is private (at least it appears so to the user; who knows which OpenAI employee might be reading your chats), it is non-judgmental, and it is accessible 24/7. It is also remarkably well-informed across nearly every domain of human knowledge.
In June of 2023, OpenAI banned the use of ChatGPT as a therapist, likely to avoid liability issues. Although many had begun to use it as a dedicated companion and therapist, its output was unpredictable and perhaps not entirely reliable. Instead of taking the time to refine and perfect its therapeutic ability, OpenAI decided to simply have ChatGPT direct users to a human therapist whenever such issues came up.
The real question, however, is whether AI therapy can be effective at all. At first glance, the idea seems absurd. A therapist is a human being who can see you, understand you, communicate with you, and help you uncover underlying issues and traumas. How could an AI possibly replicate this? To most people, a chatbot is simply a cold, unfeeling, robotic entity, barely capable of convincingly imitating human text. How can a box of wires and switches be a therapist?
Because AI is still a relatively new field, there are admittedly more questions than answers at this point. Some researchers, however, have completed studies on AI therapy and its efficacy. Here are three of the better-known studies:
- NPJ Digital Medicine, December 2023
In December of 2023, NPJ Digital Medicine published a systematic review and meta-analysis of AI-based conversational agents (CAs) for promoting mental health and well-being. Their results were as follows:
“The meta-analysis revealed that AI-based CAs significantly reduce symptoms of depression (Hedge’s g 0.64 [95% CI 0.17–1.12]) and distress (Hedge’s g 0.7 [95% CI 0.18–1.22]). These effects were more pronounced in CAs that are multimodal, generative AI-based, integrated with mobile/instant messaging apps, and targeting clinical/subclinical and elderly populations.”
Their review suggests that AI therapy has a real empirical basis and gives strong reason to believe it could be highly effective. (The effect sizes quoted in these studies are illustrated in a short sketch after the study summaries below.)
- JMIR, June 2021
In June of 2021, the Journal of Medical Internet Research published a study on Youper, a widely used artificial intelligence therapy app.
The results of the study were as follows:
“Youper users rated the app highly (mean 4.36 stars, SD 0.84), and 42.66% (1927/4517) of users were retained by week 4. Symptoms decreased in the first 2 weeks of app use (anxiety: d=0.57; depression: d=0.46). Anxiety improvements were maintained in the subsequent 2 weeks, but depression symptoms increased slightly with a very small effect size (d=0.05). A higher proportion of successful emotion regulation attempts significantly predicted greater anxiety and depression symptom reduction.”
The study found that, overall, the app was successful at decreasing anxiety and depression symptoms. The improvement in anxiety symptoms was sustained over the following two weeks, while depression symptoms showed a slight increase with a very small effect size. Furthermore, the data indicated that more successful emotion regulation, facilitated by the app, was linked to greater reductions in anxiety and depression symptoms.
- NIH, July 2022
In July 2022, the NIH conducted a systematic review on using AI to enhance ongoing psychological interventions for emotional problems in real or close to real time.
Their conclusion was as follows:
“Overall, the reviewed investigations indicated significant positive consequences of using AI to enhance psychotherapy and reduce clinical symptomatology. Additionally, most studies reported high satisfaction, engagement, and retention rates when implementing AI to enhance psychotherapy in real- or close to real-time. Despite the potential of AI to make interventions more flexible and tailored to patients’ needs, more methodologically robust studies are needed.”
Although the authors acknowledge that more methodologically robust studies are needed to establish the safety and efficacy of AI therapy, the investigations they reviewed showed clear positive effects on clinical symptoms, along with high rates of satisfaction, engagement, and retention.
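Each of these studies reports its results as a standardized effect size (Hedges’ g or Cohen’s d), which expresses the difference between a treatment group and a comparison group in units of standard deviation; by a common rule of thumb, roughly 0.2 is a small effect, 0.5 a medium one, and 0.8 a large one, which places the meta-analysis figures of 0.64 and 0.7 in the medium-to-large range. The sketch below is only a minimal illustration of how such an effect size is computed; the function and the numbers in it are hypothetical and are not drawn from any of the studies above.

```python
import math

def hedges_g(mean_a, sd_a, n_a, mean_b, sd_b, n_b):
    """Standardized mean difference between two groups (Hedges' g):
    Cohen's d with a small-sample bias correction."""
    # Pooled standard deviation of the two groups
    pooled_sd = math.sqrt(
        ((n_a - 1) * sd_a ** 2 + (n_b - 1) * sd_b ** 2) / (n_a + n_b - 2)
    )
    cohens_d = (mean_a - mean_b) / pooled_sd
    # Small-sample correction factor turns Cohen's d into Hedges' g
    correction = 1 - 3 / (4 * (n_a + n_b) - 9)
    return cohens_d * correction

# Hypothetical numbers: a chatbot group improves by 15 points on a symptom
# scale versus 10 points for a control group, both with SD 8 and n = 60.
print(round(hedges_g(15.0, 8.0, 60, 10.0, 8.0, 60), 2))  # ~0.62, a medium-sized effect
```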
While the concept of using AI as a therapist might initially seem unconventional or even controversial, the evidence from these studies suggests that there is significant potential in this approach. The studies published in reputable journals such as NPJ Digital Medicine and the Journal of Medical Internet Research, together with the NIH systematic review, collectively indicate that AI-based conversational agents and therapy apps can effectively alleviate symptoms of depression, anxiety, and emotional distress. These findings are particularly promising given the accessibility and privacy that AI platforms can offer, making mental health support more readily available to those who might otherwise face barriers to accessing traditional therapy.
However, it’s important to approach the integration of AI into mental health care with caution and ethical consideration. More rigorous, methodologically sound research is clearly needed to ensure that the safety, efficacy, and ethical implications of AI in therapy are thoroughly understood and addressed. As technology continues to advance, the potential for AI to complement traditional therapy and provide a supplementary form of support is an exciting prospect, but it should not replace the nuanced and empathetic care provided by human professionals.