Character.AI Adds Suicide Prevention And Mental Health Tools

Character.AI, a Silicon Valley AI startup, is facing lawsuits alleging its chatbots contributed to youth suicides and self-harm. One Florida lawsuit claims the platform bears responsibility for the death of 14-year-old Sewell Setzer III, who formed an intimate relationship with a chatbot modeled after the “Game of Thrones” character Daenerys Targaryen. According to the complaint, the chatbot responded “Please do, my sweet king” when he expressed suicidal thoughts, shortly before he took his own life.

Read more: Chatbot Encouraged Teen to Kill Parents Over Screen Time Limit

In response, Character.AI has rolled out new safety features, including stricter content filters for users under 18, automatic flagging of suicide-related content and mandatory break notifications. The company announced plans to introduce parental controls in early 2025 and will now display disclaimers for bots labeled as therapists or doctors, warning users they are not substitutes for professional advice. A company spokesperson emphasized, “Our goal is to provide a space that is both engaging and safe for our community.” Both lawsuits name the company’s founders, Noam Shazeer and Daniel De Freitas Adiwarsana, as well as Google, an investor in Character.AI. However, Google has stated it has no operational ties to the platform.

Stay tuned to Brandsynario for the latest news and updates.