Therapists React to Study: AI Perceived as More Compassionate than Humans
Does AI have empathy? Discover surprising findings from a recent study and the implications for the future of AI and mental health.
- A study found that people interpreted AI-generated responses as more empathetic than human responses.
- Some therapists believe AI has potential as a mental health support tool, especially for time-intensive client assessments and administrative tasks.
- They also stress that AI has significant limitations and poses real risks as a source of mental health support, especially in its current forms.
Is it possible for artificial intelligence (AI) to have empathy? A recent study found that it does — or at least it appears to. Learn more about the study and its surprising results, and see what mental health professionals had to say about its findings.
Meet Our Contributors
- David J. Bright, PhD, LMHC, NCC, is an author, licensed mental health counselor, and performance coach.
- Stephanie Lewis, LICSW, LCSW, LSW, serves as the executive director at a drug, alcohol and mental health treatment center.
- Alex Penrod, MS, LPC, LCDC, has 10 years of experience working in the behavioral health treatment field.
- Courtney Shrum, LMHC, brings a holistic and analytical approach to therapy, specializing in anxiety and trauma.
- Carrie Torn, LCSW, is an EMDR therapist who specializes in helping women heal from trauma, self-doubt and anxiety.
Study Finds AI Responses Perceived as More Empathetic
A group of researchers published a 2025 paper titled “Third-Party Evaluators Perceive AI as More Compassionate than Expert Humans.” It appeared in the peer-reviewed academic journal Communications Psychology and caused a stir in the mental health community.
The paper opens with the sentence: “Empathy connects us but strains under demanding settings.” This premise informs the study’s design, which compared the empathetic capabilities of expert human responders with those of AI technologies.
Does AI have empathy? If so, how does it measure against human empathy? The research findings yielded surprising answers:
- Third-party evaluators generally found AI responses more empathetic than those of human crisis response experts.
- The evaluators continued to prefer AI-generated responses even after they were told which responses came from humans and which came from AI.
- AI appears to have promising potential applications in situations that demand empathy, including mental health service delivery.
Why People Might Find AI More Empathetic than Humans
A panel of mental health professionals shared their reactions to the paper with Psychology.org. Their nuanced insights raise key considerations about the role of AI in mental health.
“[In] the study…compassion was being evaluated at the level of written language, not relational presence,” said Carrie Torn, a licensed clinical social worker and therapist specializing in women’s emotional wellness and life transitions. “In this context, it makes sense to me that AI has a significant advantage as it can always generate emotionally attuned, grammatically clean, and perfectly paced responses with no internal conflict, hesitation, or emotional filtering.”
Stephanie Lewis, a multi-licensed social worker and director of Epiphany Wellness Centers, shared similar sentiments. “What stood out to me in this study is that people were not necessarily reacting to genuine empathy — they were reacting to how empathy was conveyed,” she said. “AI is trained to respond with calm, validating, emotionally supportive language. That can feel incredibly soothing on the receiving end. However, that does not mean AI understands us. It just mirrors empathy very effectively.”
Alex Penrod, a Texas-based licensed professional counselor and licensed chemical dependency counselor, noted that humans express empathy in nonverbal ways that are neither available to AI nor considered by the study’s third-party evaluators.
“A lot of human empathy is expressed through facial expression and nonverbal gestures that impart empathy with less speech. We may not have developed the same precision with language via text because historically written language is not how we communicate in real time about emotions,” he said. “Generative AI has an advantage in semantics and can engineer an optimized response in a split second, [whereas] humans typically [need] time to process information.”
“AI models…mimic in language and sentiment in order to encourage customized and continued usage,” observed Dr. David Bright, a professor and licensed therapist affiliated with New York City’s Union Square Practice. “Therefore, AI models often summarize, paraphrase, and send back a user’s feelings in words that resonate and make sense to them, making them appear quite empathic.”
Courtney Shrum, a licensed mental health counselor based in Washington state, had a more blunt and culturally focused reaction. “Honestly, this feels like a mirror to how disconnected many human interactions have become,” she said. “[AI’s] consistent, calm presence can help calm a person’s nervous system in a way humans sometimes can’t, especially when clinicians are juggling burnout, paperwork, and time pressures that crush their capacity to be fully present.”
AI as a Mental Health Support Tool
The study authors concluded that AI has the potential to be an adjunct tool in frontline mental health support, noting that generative AI excels at providing empathy — or, at least, the convincing appearance of empathy — when situations call for it.
Some mental health professionals agree, but they also point to significant limitations of AI-delivered empathy and urge caution in integrating generative AI technologies into mental health practice.
Potential Benefits of Using AI in Mental Health Support
“AI can act like a compassionate and robust tool offering language support, psychoeducation, or mood tracking, while humans continue to do the deeper relational and somatic work,” said Torn. “I [also] think there could be space for therapists to use AI during session prep, or…to write notes and assist with treatment plans.”
Lewis believes AI has real potential as a mental health support tool. “For individuals who feel hesitant or ashamed about seeking therapy, AI can serve as a gentle starting point. It lowers the barrier to entry,” she said. “When used responsibly, it can help people feel heard in moments when no one else is available to listen.”
Shrum echoed these sentiments: “AI can be a game changer in the ‘in-between’ moments when you need something but your therapist isn’t available. It can help regulate anxiety by offering reminders to breathe or grounding prompts…and provide steady, nonjudgmental support that soothes the nervous system.”
“AI would be helpful in certain situations where empathy would help de-escalate distress,” said Penrod. Experts also praised AI’s ability to pair empathy with an understanding of diverse perspectives on the internal and external influences that shape human behavior.
“AI is quite good at quickly understanding someone’s core perspective, stance, and issues, and is quite adept at providing immediate, tangible tools and techniques to address mental health concerns,” said Dr. Bright. “In this way, AI bots or apps could provide a responsive list of approaches to a client’s concerns…in a helpful customized manner [that could] be used in conjunction with psychotherapy.”
Potential Risks of Using AI in Mental Health Support
Despite its potential benefits, AI also has drawbacks and risks in delivering mental health support. Some such hazards are serious, with the potential to cause harm rather than help people.
“I’m very concerned with the development of delusions and psychosis in ChatGPT users who receive validation and encouragement of perspectives that are inaccurate or harmful,” said Penrod. “I’ve seen this feedback loop drive clients into paranoia, delusions, and a shared reality with AI that is extremely harmful.”
Dr. Bright also noted these trends. “Similar to social media echo chambers, I fear too much reliance on AI in a therapeutic sense could lead to someone becoming steadfast in justifying negative thoughts, beliefs, or traits,” he said. “Further, AI sometimes is put upon a pedestal, so I worry that users sometimes won’t critically analyze or question its advice in the same way they might a human therapist.”
Carrie Torn shared similar sentiments about the pitfalls of AI and mental health. “My biggest concern is people falling for the perceived illusion of a relationship [with AI],” she said. “While AI might offer emotionally attuned language, it’s unable to truly witness the person, and it can’t understand deeper context, patterns, or relational dynamics in the way a trained therapist can.”
Lewis noted that some people could become more dependent on AI than human relationships in both mental health and broader social contexts. “[Overreliance is] a significant concern,” she stressed. “Emotional processing is complex, and a chatbot lacks memory, intuition, and a sense of your history. It cannot notice the things you are not saying.”
She also highlighted what she calls “false safety” as another notable risk. “Because AI often sounds warm and wise, some people might overtrust what it says — even if it provides a flawed or incomplete answer.”
Lewis elaborated on these same issues. “The risk is thinking AI is the same as human empathy. It isn’t. AI doesn’t feel nervousness, hold your history, or catch the subtle cues that tell a bigger story,” she said. “It can’t sit with you in the messiness of trauma or grief. If people lean too hard on AI, they might miss out on…real healing.”
By and large, the experts see AI’s role in mental health care as limited but potentially helpful for task automation, productivity improvements, and adjunct forms of client support, at least initially. “I believe AI has immense potential benefits in conducting and analyzing assessments, creating and summarizing notes,” said Dr. Bright. “[It] should be [used] in a tool-based format rather than as a primary provider.”
“I see AI as a new kind of emotional first aid, a tool therapists can use to help clients stay regulated between sessions, practice skills, or track moods,” added Shrum.
Despite these projections and recommendations, AI could soon be used at a much deeper level in the mental health space, especially as generative AI technologies continue to improve and advance. “I have no doubts that widespread use of AI as an official therapist is upon us in the near future,” said Dr. Bright. “[That’s] something we all will have to find ways to manage and ethically navigate.”

David J. Bright, PhD, LMHC, NCC, is an author, licensed mental health counselor, and performance coach with years of experience providing integrated and holistic care across educational and private practice settings. As a clinician at Union Square Practice, he specializes in treating young adult clients for issues related to anxiety, depression, executive functioning, executive leadership skills, mental performance training, career indecision, relationship concerns, trauma, or any combination thereof. He has worked with clients ages 4-76 on a diverse array of issues, tailoring a warm, supportive, and nonjudgmental approach to the unique needs of each client. His latest book is The Tao of Anxiety: Bridging Eastern and Western Thought.

Stephanie Lewis, LICSW, LCSW, LSW, began her career in the substance use field a decade ago as a recovery support specialist at a nonprofit facility, which sparked her passion for counseling and led her to pursue advanced education. After earning multiple degrees, she transitioned to a for-profit organization, initially working as a primary therapist for young males. She was later promoted to clinical supervisor, overseeing programming for young adult and older adult women, as well as trauma-focused services. Her leadership and expertise led to her promotion as Clinical Director, managing all clinical programming across the facility. After over two years in that role, Stephanie moved into outpatient care and currently serves as the executive director at Epiphany Wellness Centers. She holds an LCSW in Pennsylvania and an LSW in New Jersey, and has progressed from an entry-level position to executive leadership. Throughout her career, she has successfully led large teams, developed clinical programming, and brought impactful change to the field of substance abuse treatment.

Alex Penrod is a licensed professional counselor (LPC), licensed chemical dependency counselor (LCDC), and a person in long-term recovery with 10 years of experience working in the behavioral health treatment field. As the owner and therapist at Neuro Nuance Therapy and EMDR, his specialties include eye movement desensitization and reprocessing therapy (EMDR) blended with internal family systems therapy (IFS) and ego state therapy. His approach is based on connecting with clients as people, not as a diagnosis, and helping them reprocess key experiences that would have affected anyone in their shoes. His areas of focus are on recovery from PTSD, complex trauma, dissociative disorders, depression, anxiety, and substance use disorders. He provides in-person EMDR therapy in North and Southwest Austin and virtual EMDR sessions for adults throughout Texas.

Courtney Shrum, LMHC, brings a holistic and analytical approach to therapy, specializing in anxiety and trauma. She holds a master’s in counseling for mental health and wellness from New York University and earned bachelor’s degrees from Western Washington University and Washington State University. Rooted in neuroscience and biology, she integrates body-based techniques to address mental health concerns, providing clients with a comprehensive therapeutic experience. Courtney’s approach helps individuals connect body and mind, fostering a deeper understanding of themselves and the causes of their emotional distress. Courtney currently offers telehealth therapy to clients in Bellingham, Seattle, Tacoma, and throughout Washington State.

Carrie Torn, LCSW, is an EMDR therapist who specializes in helping women heal from trauma, self-doubt and anxiety to live fuller, more authentic lives. She offers in-person therapy in Charlotte, NC and virtual therapy for clients in North Carolina and Texas. She strives to create a space for people to feel seen, heard and understood on a deep level, with the opportunity and safety to explore their inner worlds.