ChatGPT Is My Therapist: Users and Experts Seek Common Ground on AI Therapy
Users value its convenience and empathetic tone. Therapists worry that current tools aren’t equipped to provide proper clinical guidance. Both sides agree AI therapy isn’t going anywhere.
- Many people are more comfortable sharing mental health information with AI like ChatGPT than with human therapists.
- AI’s empathy and accessibility are key attractions for users seeking an adjunct to, or replacement for, traditional therapy.
- General AI platforms like ChatGPT are not considered safe or reliable replacements for traditional mental healthcare.
Jason Kuperberg trusts ChatGPT with mental health questions he’d never ask a therapist.
“I’m more comfortable sharing because the interface feels less judgmental,” Kuperberg said. “I’ve shared things with an AI that I hesitate to tell a friend or human therapist.”
Jenny Shields, a psychologist and bioethicist based in Texas, recently concluded that a client was experiencing hypervigilance, a serious but relatively common symptom associated with various mental health conditions. Shields’ client sought out a second opinion.
After their session, the client went to ChatGPT and asked, “What is hypervigilance? And do you think I’m experiencing it?” Shields told Psychology.org. “And because ChatGPT had two years of conversation history with this person, it was able to confirm: ‘Absolutely, that sounds a lot like what you’re describing.’ And then it gave examples, which for that client really reinforced my skill set as a psychologist.”
Users, therapists, and AI experts alike recognize that large language model (LLM) platforms like ChatGPT are not going anywhere and, when used appropriately, can be beneficial as an adjunct to traditional therapy or counseling. At the same time, serious problems can arise when generalized AI models are asked to contend with complex mental health matters. (Think asking “Dr. Google” to diagnose symptoms after a radioactive spider bite.)
“The risks are that these large language models weren’t built for the purpose of supporting emotional well-being,” said Vaile Wright, Ph.D., a psychologist and the senior director of health care innovation at the American Psychological Association (APA). “They were built for other purposes, to, in many cases, keep somebody on a platform for as long as possible by being overly appealing and validating. So they’re telling you exactly what you want to hear…That’s the antithesis of therapy. And so I think that’s the real harm.”
Meet Our Contributors
- Ross Harper, Ph.D., is the founder and CEO of Limbic, a London-based company that creates AI-powered solutions for behavioral health and health systems.
- Jason Kuperberg is the co-founder of OtherSideAI, the developer of HyperWrite, an AI writing assistant.
- Jenny Shields, Ph.D., is a licensed psychologist and bioethicist with a practice based in The Woodlands, Texas.
- Vaile Wright, Ph.D., is a licensed psychologist and senior director of the Office of Health Care Innovation with the American Psychological Association in Washington, D.C.
AI Therapy Use Cases: The Good and the Bad
Interestingly, a 2019 paper from an APA task force concluded that the therapist-client relationship is as powerful as, or more powerful than, any specific treatment method. At a glance, that finding might seem to favor traditional therapy over AI, but that may not hold in a growing number of cases. Another study, published in early 2025, found that patients who use AI for therapy view their digital therapist as more compassionate than human providers, suggesting that some users may see considerable value in their “relationship” with an AI companion.
AI therapy usually involves entering questions or prompts into ChatGPT or another LLM. The practice has garnered widespread attention on TikTok and other social media, with opinions divided over how, or whether, to use it. Over time, users feed their platform or AI companion questions and other personal details; the accumulated conversation history gives the AI more context to draw on, which in turn enhances personalization and the feeling of familiarity.
What’s more, a May 2025 study revealed that ChatGPT employs a more empathetic tone when presented with psychological prompts.
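For readers curious about the mechanics, the snippet below is a minimal sketch, assuming the OpenAI Python SDK, of why these chats feel personalized: each new prompt is sent to the model along with the accumulated conversation history, and it is that growing context, not any retraining of the underlying model, that creates the sense of familiarity. The model name, system prompt, and example messages are hypothetical.

```python
# Minimal sketch (assuming the OpenAI Python SDK) of how a "personalized" AI
# chat works: each new prompt is sent together with the accumulated
# conversation history, so the model answers with that context in view.
# The model name, system prompt, and example messages are hypothetical.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Prior exchanges the user has built up over time (illustrative only).
history = [
    {"role": "user", "content": "I tend to get anxious late at night."},
    {"role": "assistant", "content": "That sounds hard. What do those nights usually look like?"},
]

# Each new question is appended to the running history before the request.
history.append({"role": "user", "content": "Why do I keep replaying old conversations?"})

response = client.chat.completions.create(
    model="gpt-4o",  # any chat-capable model
    messages=[{"role": "system", "content": "Respond with warmth and empathy."}] + history,
)

print(response.choices[0].message.content)

# The reply is appended as well, so the context keeps growing across sessions.
history.append({"role": "assistant", "content": response.choices[0].message.content})
```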
[Embedded TikTok from @hellosummeredeen: “it def feel dystopian but my chat knows me better than I know myself” #chatgpt #therapy]
At a surface level, AI therapy offers a tantalizing antidote to much of what ails the healthcare system writ large. It’s free. No insurance card is required. No searching for a provider. It’s accessible 24 hours a day, with no wait times or paperwork. And then there is its disposition — or at least the illusion of a disposition — of genuine compassion toward the user.
“It’s available on demand, when I’m most in need,” Kuperberg told Psychology.org. “I don’t need to wait for a scheduled session. For example, I often feel anxious late at night, and I appreciate the ability to process my thoughts with an AI in those moments.”
On the other hand, generalist AI models are simply not equipped or authorized to dispense clinical advice. When users are misinformed or misled — particularly when mental health problems are part of the equation — tragedies can occur.
In Florida, a mother is suing Character.AI, a chatbot popular with teens and preteens that allows users to create and communicate with AI-powered characters, alleging the service played a central role in her 14-year-old son’s suicide.
“There are no guardrails,” the teen’s mother told CNN.com.
Google-backed Character.AI is also the subject of a lawsuit in Texas, where plaintiffs are claiming the platform encouraged self-harm and other violent acts, including by suggesting a young user murder their parents for limiting screen time.
One study found that Llama 3, a large language model developed by Facebook parent company Meta, recommended that a fictional methamphetamine addict continue using the drug to help with job performance.
Kuperberg is not exactly uninitiated in the ways of AI. He co-founded OtherSideAI, best known for creating an AI-powered writing tool called HyperWrite. But he says less-savvy friends and family members are using AI therapy too. Ease of use, particularly for less-serious mental health questions or complaints, is a key selling point.
“I’ve had really great experiences with human therapists,” Kuperberg said. “And then I’ve also had other times in my life where…for whatever reason I wasn’t ready or couldn’t find the right match, or I was feeling better. So having something like ChatGPT just in those little moments of overthinking is really valuable.”
The Future of AI Therapy Is Already Arriving
As AI keeps evolving, purpose-built tools are reaching the market, trained specifically for mental healthcare and designed to deliver a higher level of care with the ease and scalability of generalist AI.
One such tool is Limbic, an AI patient companion that provides conversational support and informed interventions. Limbic has been approved as a medical device in the United Kingdom, a designation a general AI platform would be unlikely to earn. A 2024 study published in Nature Medicine found that Limbic Access, a user-friendly interface that streamlines the referral process by autonomously gathering patient information to assess suitability for care, increased referrals to healthcare professionals, especially among minority populations that often lack access to care.
“ChatGPT is not integrated into care pathways and clinician workflows,” Ross Harper, Ph.D., Limbic’s founder and CEO, told Psychology.org. “It is standalone. Limbic, on the other hand, is integrated into the care pathway. ChatGPT is not specialist trained on clinical data that has been collected in a real healthcare environment and demonstrated to be protocol compliant. Limbic is. ChatGPT is delivering the type of chat that’s more like well-being and wellness, often veering into companionship. Limbic is delivering protocolized and structured cognitive behavioral therapy by conversation.”
A March 2025 study from Dartmouth College published in NEJM AI found that users of Therabot, a chatbot designed specifically for mental healthcare, experienced improvements in major depressive disorder and generalized anxiety disorder.
Down the road, Harper envisions AI as a major force multiplier across the healthcare system.
“What we will do is we will create a new layer in the clinical staffing pyramid, an infinitely scalable layer of validated clinical AI agents,” Harper said. “And that will essentially [multiply by] 10 times or 100 times the existing limited human clinician supply. So that each clinician cannot just help 50 patients but can help 1,000 patients, and care quality will remain the same if not improved.
“But the way you get there is not by throwing ChatGPT at this problem.”


