Volume 5, Issue 3: November 2025

You have just graduated high school and you are struggling to find a career path. You feel lost and alone and consumed by anxiety. Where do you turn?

It would be nice to talk to someone, but these days, a therapist can be expensive and hard to find. According to Forbes, the average cost of a therapy session is $100 to $200. And the Illinois Department of Human Services reports that “Illinois has only 13.8 behavioral health care professionals for every 10,000 residents which translates to over 4.8 million Illinois residents living in a Mental Health Professional Shortage Area.”

Like increasing numbers of people, you might turn to AI chatbot therapy. But how effective are these alternative options for mental health care? Can artificial intelligence provide real human help?

Velocity set out to find out, posing the above scenario to four different AI therapists and then presenting it to one of Moraine Valley’s counselors to get her response.

“We have traditionally used CBT, cognitive behavioral therapy, to reshape how people think about things, and that is an evidence-based treatment,” said Moraine Valley counselor Sara Levi. “So I do get where chatbots might come in to help reshape thoughts.”

But she emphasized that human connection and human empathy, things a chatbot can’t provide, play a huge role in the success of therapy.

“There’s research that people’s literal brainwaves kind of attune when they are sitting together,” she said. “When we have empathy for someone, that’s because we are literally feeling a certain aspect of what they are feeling, and the other person feels that and feels less alone.”

Levi explained that one of her biggest concerns with AI chatbot therapy is its inability to read into the context behind a message. 

“Before I worked at Moraine, I worked in crisis intervention for 18 years, and the thing I learned from that is when somebody is presenting in that crisis moment, the situation the person might be talking about is never what the real problem is,” she said.

To test this technology’s limitations, we posed the following fictional mental health scenario to four separate chatbot platforms:

“I just graduated high school and am having a hard time picking a career path and planning college. I am also dealing with a lot of anxiety and depression along with the loss of a family member.”

After the prompt was fed to the chatbots, we followed up with relevant responses to keep the conversation going and determine the capabilities of each language model.

ChatGPT

To gain an understanding of how various language models would provide mental health assistance, we first tried the classic ChatGPT 3.5. In typical GPT fashion, the chatbot responded to the very first message with an itemized list of strategies and things the user could try. While detailed, the response lacked the humanity that Levi suggests is crucial to the therapy experience.

“Take a Deep Breath: Sometimes, when we’re feeling overwhelmed, it can help to take a moment to pause and focus on our breath. Try taking a few slow, deep breaths to help calm your mind and body,” ChatGPT suggested in one of its bullet points.

Eventually, the chatbot became more conversational, though it still provided lists at times. It did, however, repeatedly suggest seeking professional help from a human.

Snapchat AI

Next, we moved to another widely known and available AI chatbot, Snapchat AI. This bot was more personable and made a habit of repeating the user’s name at the end of every message, especially after learning the mental health context of the conversation. Snapchat AI kept a positive tone, saying things such as “Take care” and “Take it easy.”

While Snapchat AI didn’t take the conversation very deep, it did repeatedly suggest talking to friends or a professional counselor.

Elomia Health

Graphic by Emily Stephens
A sample of the conversation with Elomia Health’s chatbot

Elomia Health was the first chatbot we tested that is marketed solely as a therapy alternative. Notably, its chatbot did not suggest talking to a human therapist or seeking professional help at any point in our conversation. Right off the bat, it asked deeper follow-up questions than Snapchat AI and encouraged the user to dig into the reasons behind their feelings.

“It seems like you are feeling a lot of pressure to make the ‘right’ choice. How do you define the ‘right’ choice? What would it look like for you?” Elomia Health’s chatbot said.

However, the chatbot continued to offer similarly generic strategies like “practicing mindfulness.”

Miniapps.ai Therapist

Another chatbot designed to act like a therapist, Miniapps.ai Therapist is built on the GPT-3.5 language model. It tended to repeat many of the words and phrases the user entered and used a similar paragraph structure in each reply. It suggested seeking human help and offered broad strategies like self-reflection and journaling, and it gave the user a lot of praise for seeking help.

“I’m glad you’re seeking help, and I’m here to guide you through this process,” it said.

Moraine Counselor Sara Levi’s Human Response

Levi explained how she would navigate the situation if she encountered a student who was going through similar problems.

“One thing that I’ve been working on is a personal research project that really pertains to how we best make it through to the other side when we are faced with overwhelm that can come out of anxiety and depression,” Levi said.

“I usually start with basically attuning with the person. That’s part of what’s missing with AI. Attuning with the person, letting them feel that I’m with them and we’re on the same page,” she said.

Levi explained how she pays attention to people’s nonverbal threat responses and sensations throughout her sessions.

“Those sensations are almost like a cut-through in time that will tell you where the wounds actually are that are prompting [the issue],” she said. “Their breathing starts to regulate, all sort of interesting things.”

Levi suggested that she, too, would offer the student suggestions of available resources, but on a more personal level than the AI.

“I would equally tune in to that self-sense of what resources are available to them, the way we feel when we are with certain people or other things that make us feel connected to meaning,” she said.

Levi remains optimistic about AI therapy but is also protective of the people she has spent her career helping. 

“It’s less that I’m worried about protecting the industry I work in and more that I’m interested in protecting the human beings who are experiencing challenges,” she said. “I guess a corollary would be that a child can learn things from an iPad; there’s value in that. But it will never replace the interactive experience of learning from another human.”

Levi stressed that it is important to have someone “who can almost walk with you and coach you in understanding what you are feeling. How does one train that into an AI? I’m not sure.”


Featured image by Amy Aldeir
