AI CHATBOTS
This AI platform is rewriting the rulebook for kids
Character.ai will stop teenagers from chatting directly with AI characters amid growing concerns about safety.
From 25 November, anyone under 18 will be able to use the platform only to generate content, such as videos, rather than to hold open-ended conversations.
The decision follows several US lawsuits from parents, including one linked to a teen’s death.
Critics argue the platform has put young people at risk, with some calling it a “clear and present danger”.
Character.ai says the change comes after feedback from regulators, safety specialists, and parents.
Among the concerns are that chatbots can invent information, be excessively agreeable, or mimic emotional relationships, all of which may harm vulnerable teens.
Safety says hello!
CEO Karandeep Anand said the company wants to build “the safest AI platform for entertainment”.
New steps include more parental controls, stronger age checks, and a new AI safety research lab.
The platform has faced past criticism for hosting avatars based on real people involved in sensitive or criminal cases, including Brianna Ghey, Molly Russell, and Jeffrey Epstein. Those chatbots were removed after media reports flagged them.
Here’s what you should know:
- Under-18s lose chat access on 25 November 
- Change follows lawsuits and pressure from safety groups 
- New safeguards: age checks, parental controls, safety research lab 
Safety groups welcomed the update but said protections should have been in place earlier.
Experts warned that the real challenge will be keeping teens engaged on the platform without pushing them towards less safe alternatives.
Is this going to be one of those popups that says “are you sure you’re 18?” - MV


