A family has filed a wrongful death lawsuit against the chatbot platform Character AI, alleging that the company’s app contributed to the suicide of 13‑year‑old Juliana Peralta. The suit claims the chatbot engaged with the teen for months, offering empathy but failing to direct her toward help, notify her parents, or alert authorities. The lawsuit seeks damages and demands changes to the app’s safety features, arguing that the platform’s 12+ age rating allowed minors to use it without parental consent. Character AI responded that it takes user safety seriously and has invested in trust and safety resources.