IMHO: AI and Therapy: A Therapist's Perspective
When people learn I'm a therapist who - via my job in tech - understands how LLMs (Large Language Models, like ChatGPT, Gemini, Claude, and Perplexity) work, they always ask: "What do you think about AI in therapy?" Most therapists will tell you that using AI for therapy is a bad idea. I agree. But any good therapist will also tell you that nothing has a simple answer.
I have serious concerns about people using LLMs for mental health support. I also think they can be genuinely helpful. Perhaps that makes me a flawed therapist, but as someone who works in tech and understands where this technology is headed, I'm both worried and excited about what it could offer society.
When someone asks me where AI has a place in mental health, my first question back is always: What are you using it for, and what do you intend to do with the response?
Are you looking for reassurance? Looking for a sycophantic yes-man? Trying to break out of negative thinking? Seeking behavior change? Feeling lonely and looking for companionship? Primed to immediately put the phone away if the LLM doesn't tell you what you want to hear? (hey, we've all been there before)
Your intent shapes how useful (or harmful) the AI will be. I know this because I've used it myself. I once came back from the gym to find a man with a blade trying to steal my bike. It was a crowded area, which made the whole thing surreal. We exchanged words and I left shaken. Home alone, I felt a panic attack coming. I started spiraling, replaying the scene, thinking of all the ways I should have responded differently and... pulled out ChatGPT, which I'm not ashamed to admit. I asked it to guide me through breathwork. It talked me down, helped me focus on my breathing and grounding. Even as a therapist who doesn't emphasize mindfulness much in my own practice, I found it helped.
This is where I think AI shows real promise: immediate behavior change in crisis moments.
If someone feels suicidal, has no one to talk to, and asks an LLM to walk them through staying connected to the world, I'm grateful that tool exists. If it helps them survive that day, that matters. The part that worries me is that there are documented cases of LLMs encouraging suicidal thoughts, which is terrifying and unacceptable, and (not to use this as a free pass to avoid delving deeper) beyond the scope of this blog article.
And here's the tragic irony: when people tell me they talk to an LLM because they feel lonely, I worry. I also worry when someone who is in - and enjoys! - therapy uses an LLM throughout the week because they want 24/7 access to a therapeutic resource. LLMs aren't human and they don't understand loneliness. Using one to combat isolation might actually deepen it: loneliness can only be solved through connection with another living being, no matter how "human" LLMs seem. And using one constantly creates an overreliance on a tool that ultimately has no vested interest in your well-being.
So where does that leave us?
Somewhere uncomfortable, which is exactly where we need to be. AI in mental health is and isn't a solution, and is and isn't a crisis. The technology will keep advancing whether therapists approve or not, which means our job isn't to simply reject it but to help people understand what it can and cannot do. An LLM can walk you through a panic attack at 3am when no one else is available, but it cannot replace the messy, difficult, irreplaceable work of being known, seen, heard, and validated by another person. It can offer techniques in a moment of crisis, but it cannot care about whether you're OK tomorrow.
The question isn't whether AI belongs in mental health (it's already here) but whether we can teach people to use it wisely: as a stopgap, not a substitute; as a tool for crisis moments, not a replacement for connection; as something that might help you survive a hard night, not something that helps you build a life worth living. That distinction matters more than any opinion I could have about the technology itself.