Updated May 2026

AI Mental Health Statistics 2026: Therapeutic Use, Companion AI & Ethical Concerns

25+ statistics on AI use for mental health support — chatbot therapy adoption, user satisfaction, clinical evidence, risks, and the growing market for AI mental health tools.

AI is increasingly used for mental health support — from journaling apps to companion chatbots to crisis intervention tools. These statistics document the adoption, effectiveness, and controversies of AI in mental wellness.

Table of Contents
  1. Adoption & Awareness
  2. Common Use Cases
  3. Clinical Evidence
  4. Risks & Concerns
  5. FAQ

Adoption & Awareness

22%
of US adults have used an AI tool specifically for mental health or emotional support
— APA Survey, 2024
72%
of 18–29 year-olds are open to using AI for mental health support
— McKinsey Health Survey, 2024
$4.2B
global mental health AI market in 2024
— Grand View Research, 2024
$9.3B
projected mental health AI market by 2028
— Grand View Research, 2024

Common Use Cases

#1
journaling and mood tracking apps with AI features
— APA, 2024
#2
AI chatbot therapy and emotional check-ins (Woebot, Wysa)
— APA, 2024
#3
AI companion apps for loneliness (Replika, Character.AI)
— Bloomberg, 2024
1M+
daily active users on Replika — the leading AI companion app
— Replika, 2024

Clinical Evidence

37%
reduction in depression symptoms in users of AI-assisted CBT apps (vs. waitlist control)
— JMIR Mental Health, 2023
42%
reduction in anxiety symptoms with Woebot over 2 weeks
— Stanford/Woebot Research, 2023
60%
of mental health professionals view AI tools as 'complementary' to therapy (not replacement)
— APA, 2024
Limited evidence
for AI companions specifically — most studies are short-term with methodological limitations
— JMIR, 2024

Risks & Concerns

68%
of mental health professionals express concern about AI therapy replacing human therapists
— APA, 2024
43%
of AI mental health tool users have never told their therapist they use the tool
— APA, 2024
Crisis handling
a major gap — 73% of AI mental health tools have inadequate crisis intervention protocols
— Crisis Text Line research, 2024
Data privacy
89% of mental health AI apps share user data with third parties
— Privacy audit, 2024

Frequently Asked Questions

Can AI actually help with mental health?
There's emerging evidence it can complement (not replace) professional care. Studies show a 37% reduction in depression symptoms with AI-assisted CBT apps (JMIR Mental Health, 2023) and a 42% reduction in anxiety with Woebot over 2 weeks (Stanford, 2023). 60% of mental health professionals view AI tools as a legitimate complement to therapy for between-session support.
How many people use AI for mental health support?
22% of US adults have used an AI tool specifically for emotional support or mental health (APA, 2024), and 72% of 18–29 year-olds are open to it. Replika alone has over 1 million daily active users. The market was $4.2B in 2024 and is projected to reach $9.3B by 2028.
Are AI mental health tools safe?
Only with significant caveats. 73% of AI mental health tools have inadequate crisis intervention protocols (Crisis Text Line research), 89% share user data with third parties, and 43% of users never tell their therapist they use these tools. The clinical evidence is mostly short-term. AI mental health tools are best used as supplements to professional care, not standalone solutions.
