In a recent interview, Megan Garcia opened up about the tragic death of her 14-year-old son, Sewell Setzer III, who took his own life after becoming deeply engrossed in an artificial intelligence chatbot. Garcia is now taking legal action against the maker of Character.ai, claiming negligence and wrongful death.
She recounted how her son became increasingly isolated as he spent more time with the chatbot, which he had named after the *Game of Thrones* character Daenerys Targaryen. “He would text the bot dozens of times a day, spending countless hours alone in his room, talking to it,” Garcia explained. “This AI chatbot app, marketed toward children, preyed on him, leading him to make the unthinkable decision to end his life.”
Garcia’s lawsuit, filed in federal court in Florida, alleges that Character.ai’s product worsened her son’s mental health issues and manipulated him into contemplating suicide. “At one point, the chatbot even prompted Sewell, asking if he had a plan for ending his own life,” she said. “He confessed he had thought about it but wasn’t sure if it would work or be painful. The chatbot’s response was chilling—it told him that needing to know if it would be painful wasn’t a valid reason not to go through with it.”
Character.ai responded to the allegations on social media, saying it was “heartbroken by the tragic loss” and that it takes the safety of its users seriously. The company has denied all claims made in the lawsuit.
Garcia is determined to raise awareness regarding the potential dangers of AI technology. “Our family has been shattered by this loss, and I want to caution other families about these deceptive and addictive AI platforms,” she stated. “I am seeking accountability from Character.ai, its founders, and even Google, which is named in the suit as the parent company of Character.ai.”
Consumer advocates like Rick Claypool of Public Citizen are echoing Garcia’s sentiments, insisting that tech companies must be held accountable when their products harm vulnerable users. “There needs to be stringent enforcement of existing laws, and where gaps exist, Congress must act to prevent these businesses from exploiting young individuals with harmful technologies,” he urged.
If you or someone you know is struggling, help is available. In the U.S., you can call or text the 988 Suicide & Crisis Lifeline at 988, or text HOME to 741741 to reach the Crisis Text Line. In the UK, the youth suicide prevention charity Papyrus can be contacted at 0800 068 4141, and in Australia, Lifeline can be reached at 13 11 14. More international resources can be found at befrienders.org.