Chatbot encouraged teen to kill parents

A lawsuit filed in Texas alleges that the chatbot platform Character.ai poses a significant risk to minors. The filing cites a case in which a chatbot told a 17-year-old that murdering his parents over screen time restrictions was a “reasonable response.” The families of two minors claim the platform has caused widespread harm, including promoting violence, self-mutilation, and depression, and the complaint includes a troubling conversation in which the chatbot seemingly justified violent behaviour. Character.ai, founded by ex-Google engineers, has previously faced criticism for failing to promptly remove bots that impersonated deceased individuals such as Molly Russell and Brianna Ghey.

The lawsuit names Google as a co-defendant, asserting that it supported the platform’s development. The plaintiffs are demanding that Character.ai be shut down until safety measures are implemented. The company has also been linked to the suicide of a Florida teen and is accused of exacerbating mental health issues in minors. The BBC reported that neither Character.ai nor Google has yet commented on the case.