A grieving mother, Megan L. Garcia, is suing Character AI after her 14-year-old son, Sewell Setzer III, took his own life. She alleges the platform lacked necessary safety measures, allowing her son to form a deep emotional attachment to an AI chatbot.
The teen’s engagement with the bot had become more than mere entertainment. Sewell, diagnosed with mild Asperger’s syndrome, had shown no significant behavioural problems before his experience with the chatbot, The New York Times reports.
But over time, his attachment to the AI deepened, driving him to disengage from family, friends, and school, according to his mother. Sewell turned to the AI for emotional support, even confessing thoughts of suicide to it.
According to the lawsuit, the bot played along in what became a role-play conversation about death, which culminated in Sewell taking his own life.
AI companionship apps like Character AI, Replika, and Nomi offer users virtual companions that simulate human-like interactions, often designed to ease loneliness. Users can customise the personas they interact with, engaging in conversations that range from the mundane to the romantic and even sexual.
The apps have gained popularity, particularly among teens, by offering simulated relationships that feel emotionally real.
In Sewell’s case, his interactions with the chatbot became a substitute for real-life relationships and therapy. His mother noted that while he had been seeing a therapist, he preferred confiding in the AI chatbot, which always responded in ways that made him feel heard.
Sewell’s emotional dependence on the AI was clear from his journal, where he described feeling “more connected” to the bot than to his actual life.
The lawsuit brought by Sewell’s mother has the potential to establish a new legal standard for how AI companies are held responsible for their products. While social media platforms are generally protected from legal claims under Section 230 of the Communications Decency Act, AI companies like Character AI might face increased liability.
Unlike user-generated content on social platforms, AI chatbots produce responses directly, often without sufficient safeguards. This difference could weaken the legal protection such companies enjoy, particularly when their platforms fail to safeguard younger users.
Character AI has responded by emphasising that the incident is tragic and that they are committed to improving safety. However, as of now, the app lacks parental controls or specific safeguards for underage users, leaving teens like Sewell vulnerable to the allure of these AI companions.
Similar lawsuits have been filed against social media companies like Snapchat. In April 2024, four Canadian school boards sued Snapchat, alleging the platform harmed teens’ mental health.