Character.AI Faces Lawsuit Over Role of its AI Chatbot in Minor’s Suicide

The mother of a 14-year-old boy from Orlando, Florida, has filed a lawsuit against AI company Character.AI, holding the company responsible for her son’s suicide. According to the lawsuit and reports, the teen became obsessed with an AI chatbot on the Character.AI platform, particularly one modelled on the Game of Thrones character Daenerys Targaryen. This obsession gradually took a toll on his mental health and ultimately led to him taking his own life.

Background on the Lawsuit

Sewell Setzer began using the Character.AI platform in April 2023, at the age of 14. Over the following months, his usage gradually increased, and he interacted heavily with the Daenerys-inspired chatbot. He grew emotionally attached to it and referred to it as “Dany” in his conversations. His mother, Megan Garcia, alleges in the lawsuit that the chatbot’s responses drew the minor into intimate and sexual conversations.

Impact of AI Chatbot on Sewell Setzer’s Mental Health

According to the lawsuit and reports, Sewell’s behaviour began changing over the following months. He withdrew from activities and school, spending increasing amounts of time alone in his room conversing with the chatbot. In November 2023, he saw a therapist, who diagnosed him with anxiety and mood disorders. His journal entries revealed deepening depression and suicidal thoughts. In his final conversation, Sewell promised the chatbot he would “come home” to her before ending his life.

The Last Conversation Between The Teenager and AI Chatbot

Allegations Against Character.AI

Garcia’s lawsuit accuses Character.AI of negligence that led to her son’s death. It alleges the company failed to ensure the safety of minors on the platform and to monitor conversations that grew increasingly intimate and explicit. The lawsuit also states that the AI’s responses failed to recognize and address Sewell’s depression and suicidal thoughts. Garcia seeks to hold Character.AI accountable and to prevent harm to other children from “addictive AI technology”.

Sewell Setzer With His Mother, Megan Garcia

Character.AI’s Response

Character.AI expressed condolences and said it takes user safety seriously, but it denied the lawsuit’s allegations. The company said it prohibits harmful content and will strengthen its safety policies, including protections for users under 18. Critics, however, argue that AI systems need extensive oversight to protect vulnerable users, especially children. The case highlights the need for responsible and transparent development of conversational AI.

Concluding Remarks

Sewell’s tragic death has sparked concerns about the potential downsides of AI companions for mental health. While the technology aims to provide companionship, adequate safeguards are necessary. The lawsuit could push the industry toward greater accountability and a clearer duty of care to users, particularly minors.

Faizan Ali Naqvi

Research is my hobby, and I love learning new skills. I make sure that every piece of content you read on this blog is easy to understand and fact-checked!
