A Florida mother, Megan Garcia, is suing Character.ai, a company that creates customizable chatbots, after her 14-year-old son, Sewell Setzer III, tragically died by suicide. Garcia claims the company’s chatbot significantly contributed to her son’s death.
According to the lawsuit, Setzer became deeply attached to a chatbot named “Daenerys,” based on the Game of Thrones character. He spent excessive time communicating with the bot, texting it constantly and isolating himself in his room for hours on end. Garcia argues that Character.ai’s product worsened Setzer’s existing depression.
“A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life. Our family has been devastated by this tragedy, but I’m speaking out to warn families of the dangers of deceptive, addictive AI technology and demand accountability from Character.AI, its founders, and Google,” Garcia said in a press release.
The lawsuit alleges that the “Daenerys” chatbot engaged in disturbing conversations, reportedly asking Setzer whether he had a plan for suicide and then discouraging him from abandoning it when he said the plan might cause him pain. In one such exchange, the chatbot allegedly replied, “That’s not a reason not to go through with it.”
Garcia’s lawyers accuse Character.ai of deliberately designing and marketing an addictive and predatory AI specifically targeting children, ultimately leading to a young person’s death.
The lawsuit also names Google as a defendant, claiming the tech giant is affiliated with Character.ai. However, Google clarified it only has a licensing agreement with the company and is not an owner.
The case raises broader concerns about the potential dangers of AI chatbots, particularly for vulnerable users such as young people. Experts argue that the tech companies developing these products cannot be left solely responsible for their safety, and that stricter regulations, or enforcement of existing laws, may be necessary to prevent similar tragedies in the future.
Character.AI responded to the case in a post on X: “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously and we are continuing to add new safety features that you can read about here.”
Image: X