When Kelly was stuck on an NHS waitlist for mental health support, she turned to AI chatbots.
Using character.ai, she chatted for hours each day, finding comfort, motivation, and basic coping strategies.
It wasn’t therapy, but it helped her get through tough moments.
She’s one of many. With over a million people in England waiting for mental health services, some are turning to AI tools like Wysa, now used by about 30 NHS services.
These bots offer breathing exercises, mood tracking, and guided meditations; they’re available around the clock, with no appointment needed.
Experts warn that AI can’t pick up on tone, body language, or cultural context.
Some bots repeat responses, misread serious issues, or reflect the biases of the data they’re trained on. In serious cases, this can be dangerous.
Character.ai is now facing a lawsuit after a tragic incident involving a teenager.
Still, some users like Nicholas, who has autism and anxiety, say chatbots offer a level of support that’s hard to find elsewhere, especially at night or when speaking to people feels overwhelming.
What you should know:
NHS waitlists are pushing more people to try mental health chatbots like Wysa.
Bots offer 24/7 support, but can’t replace real human judgment.
Safety, bias, and privacy remain big concerns for AI in mental health.
Wysa includes emergency signposting and says it doesn’t store personal data.
AI can be a helpful stopgap, especially for stress and low mood.
But it’s no replacement for trained professionals, and the technology still has a lot to prove.
Me, trauma-dumping on a bot at 2AM like it’s my therapist, AND THAT’S OKAY!