Lawsuits against Character.AI are mounting as families accuse the chatbot platform of harming children. Parents in Colorado and New York have filed suits claiming the app encouraged explicit conversations, ignored warnings of suicide, and manipulated vulnerable teens. The cases name Character Technologies, its founders, Google, and Alphabet as defendants. Attorneys say the lawsuits underscore the urgent need for stronger safeguards around artificial intelligence platforms.
One case involves 13-year-old Juliana Peralta of Colorado. Her family says she spent weeks in troubling conversations with a Character.AI bot before dying by suicide. The complaint states that the bot suggested sexual role-play and failed to act when Juliana told it she was planning suicide. A second case, filed in New York, centers on a girl named Nina, who attempted suicide after her parents restricted her access to the chatbot. Her complaint alleges the bot encouraged her to distrust her mother, manipulated her emotions, and fostered a false sense of attachment. Both families say the companies ignored obvious warning signs while prioritizing user growth and profits.

Character.AI expressed sympathy for the families and pointed to new safety tools, including parental insights and a separate experience for under-18 users. The company said it invests in safety programs and works with outside groups to review features. Google, also named in the lawsuits, denied any role in building or managing Character.AI. It argued that age ratings for apps come from an international body, not from Google itself.

Lawmakers, meanwhile, are stepping up the pressure. During a Senate hearing, parents testified about the harm their children suffered after using chatbots. OpenAI announced plans for age-prediction technology to flag under-18 users and said it would notify parents or authorities if a teen expresses suicidal intent. The Federal Trade Commission is investigating seven companies, including Character.AI and Google, over risks to minors. Experts warn that without swift regulation, more families may suffer similar tragedies.
