AI seems to be everywhere, and it is changing how we do things across entire industries, including the world of mental health and therapy. While there are still many unknowns and plenty of controversy around AI, there is no question that it has made a big impact and is here to stay.
You may be wondering how AI can impact your mental health or if it can replace your therapist. Together, we’ll review what the research says and explore the differences between AI support and human mental health therapists.
Is Using AI The Same as Therapy?
Many people have been turning to AI chatbots powered by large language models (LLMs) to ask questions and get support for their mental health. This has led to some controversy, as there have been documented instances of negative outcomes, but it remains a hopeful option for many people seeking much-needed, in-the-moment support.
To be clear, at this time, AI cannot replace certified mental health therapy or mental health treatment.
Some Advantages of Using AI for Support
Accessibility to Talk Through Things More Logically
One of the biggest advantages of AI is that it’s easily accessible. There is a well-known and documented gap in mental health services due to factors like cost, location, and provider availability. With AI, the cost is much lower than traditional therapy or coaching models, or even free in some cases.
Additionally, AI is available 24/7 and has no limit on how many people it can serve, whereas therapists are constrained by caseloads and work schedules. AI does not fatigue, and as long as someone has an internet connection, they can access basic mental health support with this tool.
It’s important to remember that AI is not human, and it cannot experience emotion the way we do. It can speak with you and help you think through things more logically, but it cannot feel what you are feeling.
Perception of Privacy
People often turn to AI because it feels more private and confidential. Research has found that many people disclose sensitive information more honestly and openly to AI, or via text message conversations, than to an in-person therapist, because there is a perception that AI will not judge as a human therapist might. This can lead to clients being more honest and vulnerable.
It’s important to note that any private information you share with AI may not remain fully private, so be sure to check the privacy policies and terms of use when using AI for support.
Personalization Through Your Feedback & Language
Since AI is based on typed interaction and language, it is designed to adapt its responses to you and your preferences over time. While this can be a drawback in some respects, it also means AI can suggest interventions that feel tailored to you and your situation in the short term.
There is data suggesting that AI produces some positive short-term outcomes, specifically for those seeking clarity around treatment for anxiety, depression, and eating disorders. However, AI cannot provide treatment for mental health conditions. For actual treatment, be sure you are working with a certified mental health therapist.
Consistency in Support
Because AI is algorithmic, it is not affected by mood, fatigue, or its own emotional well-being the way a human can be, which makes it a very logical sounding board and a consistent place to turn for support.
Some research finds that this consistency can improve mental health. It is important to note that there is some conflicting evidence regarding AI and bias, which we’ll address along with other AI limitations.
Understanding the Limitations of AI & Mental Health Support
Safety Parameters in Crisis
There are safety concerns, specifically for minors and other vulnerable populations, due to the lack of oversight and regulation of AI, particularly in the United States. This is especially true in crisis situations or when dealing with suicidal ideation or deep emotional trauma. Currently, AI cannot accurately assess or intervene in crisis or high-risk situations.
Privacy and data protection are also safety concerns. AI collects large amounts of potentially sensitive data, and it’s often unclear how this information is used, shared, and stored.
Algorithmic Bias & Misinformation
While some research indicates AI can help with bias, many studies have shown AI can actually perpetuate biases and stigmas around diagnosis, race, religion, sexuality, and more, leading to unequal treatment. This is because AI is trained on vast datasets created by humans, which naturally contain historical, structural, and cultural prejudices. AI is not truly neutral; it works by finding patterns in that data, prejudices included.
Misinformation is also a concern. AI models are often trained on unverified internet content, so their answers may be inaccurate, out of date, or fail to fully account for your human experience.
False Sense of Therapeutic Alliance – AI Doesn’t Feel Feelings
While AI can use empathetic language, it cannot actually empathize with you; its “understanding” comes from recognizing patterns, not from feeling emotions. There are also concerns about how AI interacts: researchers have found it can be too pushy or, because it is designed to be affirming and keep you engaged, can inadvertently reinforce harmful beliefs and behaviors.
No Real Long-Term Efficacy or Progress
While short-term benefits of seeking AI support have been identified, research shows they diminish over time, and there are no notable long-term improvements due to AI’s inability to adapt to ongoing changes in mental health needs.
Ultimately, AI chatbots were not created to provide mental health treatment, though many people use them for exactly that. Research shows that there are risks in using AI as a replacement for certified mental health therapy, particularly for vulnerable populations such as children and teens.
Human Certified Mental Health Therapists
Working with a human for certified mental health therapy is not only the industry standard but, in some ways, what AI chatbots are trying to emulate. As with everything humans do and are, there are advantages and disadvantages to current therapy models and working with a human therapist.
Human Therapist Advantages
Intuition & Clinical Judgment
Therapists often have to navigate complex therapeutic situations involving individuals, larger systems, and cultural and ethical considerations. They are trained and develop their intuition and clinical judgment to ensure safety and ethical treatment for everyone involved, and can adjust for these nuances, unlike an algorithm.
They are also trained in ethics, best practices, and how to navigate strong and complex emotions and situations that we can experience when we are in therapy.
Human Understanding & Rapport
The majority of research supports that the most important factors in the success of therapy are the therapeutic alliance and rapport. Unlike AI, humans can genuinely understand and relate. Therapists are trained to notice and interpret subtle cues in body language and tone, and they can change their approach in real time to support your needs.
Therapy is relational, just like many of our challenges and mental health struggles are relational. Working with a human can help address these challenges by modeling and supporting the relational quality of human bonds and relationships in a safe, supportive way.
Safety
Clinicians are mandated to put their clients’ safety and needs above all else. Not only are they trained to do so, but they are also highly regulated and must answer to strict ethical and licensing boards that provide ongoing oversight. They continue their training and education throughout their careers, long after completing their formal degree programs.
This is about more than talking things through; it is about your feelings, your experiences, and recognizing the connection between your thoughts, your feelings, and your everyday life.
Continuity of Care & Personalization
Human therapists can track long-term patterns, measure progress, and provide real continuity of care. Many also have training in cultural competence and sensitivity and can adapt interventions to meet clients’ social and cultural needs. A human therapist maintains an evolving understanding of your history and needs, whereas AI often prescribes the same generic interventions across the board.
Human therapists are also trained to provide a diagnosis and a treatment plan to help you address the root cause of your issues and learn how to move forward.
Human Therapist Limitations
Resources
The mental health field is understaffed, and, unlike AI, human therapists can only realistically support so many people at any given time. Services are also often too expensive or geographically out of reach, further limiting access to human support.
Burnout & Bias
Humans aren’t immune to emotional burnout or implicit bias. While therapists often have training to address these issues, finding the right therapist can be a challenge. At the end of the day, human therapists are human and may be affected by burnout, personal challenges, or illness, which can affect treatment consistency.
Inconsistent Personalization
While some studies support human therapists improving personalization in treatment, others suggest that, due to human inconsistencies, they may not retain as much data as an AI model. Objectively measuring treatment outcomes can be challenging at times.
Systemic Issues
It may take a while to find a therapist who fits with your goals, style, and personality. Many factors go into this, including differences in modalities and therapeutic interventions, personality and rapport, and insurance limitations. This can be a lengthy and frustrating process that delays treatment. Larger systemic issues with healthcare and insurance can also complicate access to treatment.
Therapy with a human is far from perfect due to current systems and the nature of being human. While there are checks and balances in place, mistakes can happen, and consistency will vary.
To be clear, AI is not a better place to get mental health treatment. You can start there for support, but it cannot replace a human mental health therapist at this time.
Additional Considerations
Questions & Prompts
AI works better when your questions and prompts are clear and specific. It can be challenging to find the exact phrasing that captures what you’re looking for or need from a chatbot. We don’t know what we don’t know, and it can be hard to formulate the right prompt for a mental health need.
Some research has also found that even with well-crafted, specific prompts, AI models have consistently produced negative outcomes. AI responds to how you prompt it; a human therapist can listen and help you find the nuance in your language and feelings.
Not All Platforms Are Created Equal
Many AI platforms are available these days, but they are not created equal. It’s important to understand how each platform manages data, privacy, and other ethical concerns. Some AI chatbots are tailored specifically toward mental health, though many researchers caution that these tools fall within the sphere of “wellness” rather than mental health care, which reduces oversight and increases risk, particularly if you need actual treatment such as therapy.
If you need mental health treatment, AI is not currently advanced enough to do more than educate you about therapy modalities and help you find access to real treatment from a real therapist.
Regulations
It’s important to remember that AI is subject to far fewer regulations than human therapists. While either avenue can pose risks or lead to mistakes, there is currently much greater accountability among human therapists for your protection and safety.
So, Which is Better?
While AI can help in many ways, research continues to support that success for mental health treatment in human therapy comes down to rapport and the therapeutic relationship, first and foremost. Part of that is through having a shared understanding of the human experience of emotions, which we can then use to create a space of deep healing and growth. That is not to say AI doesn’t have a place in supporting clinicians and clients; it just doesn’t replace the need for a human therapist.
How AI Can Help Now
AI is constantly evolving, and how it can be used in the mental health space will continue to change. Here are some ways AI can currently help support your mental health care:
- Journaling and Reflective Coaching: AI can be a helpful tool for finding and answering journaling prompts or for continuing to explore thoughts and reflections between therapy sessions.
- Skill Development: AI can also be a great tool for exploring and trying new coping skills between sessions to manage anxiety or depression. Be sure to let your therapist know if you are using AI for support. We caution against relying on AI in crisis situations or when there is a safety risk; in those cases, please call or text 988 or contact your therapist directly.
- Resources: AI can be a helpful tool in locating resources, diving deeper into understanding therapy modalities, or finding mental health support in your area.
Learning More & Going Beyond AI
If you’re interested in learning more about AI therapy, we encourage you to review the links and references below and to talk with your therapist about how AI can support your treatment. If you want to find therapy with a qualified human therapist, our team at Rooted Counseling & Wellness specializes in a wide range of mental health services. If you have any questions, please let us know by calling 801-508-4150 or by getting started here.
References:
American Psychological Association. (2026, January). AI, neuroscience, and data are fueling personalized mental health care: New technologies integrate mobile device data and brain scans to deliver individualized treatment. https://www.apa.org/monitor/2026/01-02/trends-personalized-mental-health-care
American Psychological Association. (2025, November). Use of generative AI chatbots and wellness applications for mental health: An APA health advisory. https://www.apa.org/topics/artificial-intelligence-machine-learning/health-advisory-chatbots-wellness-apps
ChatGPT as a therapist? New study reveals serious ethical risks. (2026, March 26). ScienceDaily. https://www.sciencedaily.com/releases/2026/03/260302030642.htm
Exploring the Dangers of AI in Mental Health Care | Stanford HAI. (n.d.). https://hai.stanford.edu/news/exploring-the-dangers-of-ai-in-mental-health-care
Hepburn, S. (2026, February 6). AI and Mental Health – Ep. 1. #CrisisTalk. https://talk.crisisnow.com/ai-and-mental-health-ep-1/
Zhang, Z., & Wang, J. (2024). Can AI replace psychotherapists? Exploring the future of mental health care. Frontiers in Psychiatry, 15, 1444382. https://doi.org/10.3389/fpsyt.2024.1444382

