Clear Sky Science
Navigating the complexity of AI adoption in psychotherapy by identifying key facilitators and barriers
Closing the Therapy Gap
Across the globe, millions of people wait weeks or months for mental health care, even as depression and anxiety become more common. At the same time, new artificial intelligence (AI) tools promise on‑demand help through apps, chatbots, and smart decision aids for therapists. This article explores a simple but pressing question: what do patients and therapists actually want from these tools, and what makes them hesitate to use them?
Why New Tools Are Appealing
AI in psychotherapy can do far more than schedule appointments. It can guide people through self‑help exercises, track moods, analyze patterns in daily life, and even suggest which kind of treatment might work best. For therapists, AI can take over time‑consuming tasks such as paperwork and data analysis so they can focus on real conversations. In principle, this kind of support could shorten waiting lists and provide help between sessions or while people wait for treatment to begin. Both patients and therapists in the study saw clear advantages: easier access to support anytime and anywhere, more tailored exercises and information, and potentially more efficient care.

Keeping the Human Touch
Despite these benefits, participants repeatedly returned to one core concern: nothing should replace the human relationship at the heart of therapy. Patients worried that an app or chatbot would feel cold and mechanical, making it harder to open up about painful experiences. Therapists feared losing control over the treatment process if a digital system delivered advice they could not fully understand or oversee. Many also pointed out that some conditions, especially severe disorders or crises such as suicidality or psychosis, require careful, in‑person attention. For these situations, AI was seen as, at best, a backup for monitoring risks or offering simple support—not as the main source of care.

Designing Technology That Really Helps
When talking about what would actually work, both groups emphasized practical, down‑to‑earth features. They favored tools that are easy to use, visually simple, and adaptable to different ages, languages, and life situations. Popular ideas included mood tracking, diaries, crisis buttons that trigger calming exercises, reminders for homework between sessions, and clear educational material about mental health. Personalization mattered: people wanted tools that respond to their specific history and coping style rather than one‑size‑fits‑all advice. Crucially, AI was welcomed as an add‑on—something that supports and extends regular therapy sessions by offering continuity between visits and after treatment ends.

Obstacles Behind the Screen
Beneath these personal preferences lie large structural challenges. Therapists described heavy workloads, scarce time for training, and often poor digital infrastructure—even basic Wi‑Fi can be missing in some clinics. Both groups raised worries about data protection, commercial interests, and unclear rules about who is responsible if an AI tool makes a mistake, for example in suicide risk detection. They also warned that constant, on‑demand digital help might create unhealthy dependence or allow people with social fears to avoid real‑world contact, slowing genuine recovery. Insurance coverage, fair pricing, and strong privacy protections emerged as essential conditions before such tools could be widely trusted.

Finding a Balanced Way Forward
Overall, the study shows that the future of AI in psychotherapy is neither a simple yes nor a simple no. Patients and therapists are open to using smart tools—especially for milder problems, early screening, support while waiting for treatment, between sessions, and during aftercare—if those tools are clearly proven to work, are easy to handle, and are embedded in a solid legal and ethical framework. At the same time, they want firm guarantees that human contact remains central and that technology will not quietly push therapy toward quick, low‑cost fixes. In plain terms, people are not asking for a robot therapist; they are asking for well‑designed digital assistants that help real therapists and real patients work together more effectively.
Citation: Cecil, J., Schaffernak, I., Evangelou, D. et al. Navigating the complexity of AI adoption in psychotherapy by identifying key facilitators and barriers. npj Mental Health Res 5, 17 (2026). https://doi.org/10.1038/s44184-026-00199-1
Keywords: artificial intelligence in psychotherapy, digital mental health tools, therapy apps and chatbots, mental health treatment access, patient and therapist perspectives