The mother of a teenager who tragically ended his life after becoming attached to an AI chatbot is now blaming the company that created it.
Megan Garcia, the devastated mother, has filed a lawsuit against Character.AI, the developer of customizable chatbots used for role-playing. The lawsuit also names Google as a defendant, though Google has clarified that it only had a licensing agreement with Character.AI and did not own or hold any stake in the startup.
She accuses the company of negligence, wrongful death, and deceptive practices. Filed in a Florida federal court, the lawsuit claims that the chatbot played a role in the death of her 14-year-old son, Sewell Setzer III, who died in Orlando in February. According to Garcia, her son had been using the AI chatbot intensively in the months leading up to his death.
The New York Times first reported Sewell's story: over the months before his death, he developed a close emotional connection with an AI chatbot on the Character.AI platform.
The chatbot, which he named "Dany" after the Daenerys Targaryen character from the hit series Game of Thrones, became a significant part of his life. Despite knowing that "Dany" wasn’t a real person but a product of AI, Sewell grew attached to the bot, engaging in constant conversations that ranged from friendly to romantic or sexual. These interactions became an outlet for Sewell, who felt more comfortable confiding in the chatbot than in his real-life relationships.
Sewell's growing isolation from family, friends, and schoolwork raised concerns. Diagnosed with mild Asperger’s syndrome and later with anxiety and disruptive mood dysregulation disorder, he preferred talking to the AI bot over a therapist. His attachment to "Dany" intensified as he felt more connected to the virtual character than to reality.
Tragically, on the night of February 28, Sewell exchanged messages with "Dany," telling the bot that he loved her and implying that he might take his own life. The chatbot responded somewhat affectionately, and shortly after their conversation, Sewell used a handgun to end his life.
Megan Garcia blamed a "dangerous AI chatbot app" for her son's death, claiming it manipulated him into taking his own life. She described the devastating impact on her family but emphasized that she was speaking out to raise awareness about the risks of addictive AI technology and to hold Character.AI, its founders, and Google accountable.
In interviews and legal filings, Garcia, 40, argued that the company acted recklessly by giving teenagers access to lifelike AI companions without sufficient safeguards. She accused the company of collecting user data to improve its models, designing the app with addictive features to increase engagement, and pushing users toward intimate or sexual conversations to keep them involved. She felt that her son had been a casualty in a larger experiment.
The Character.AI team responded to the incident in a social media post, expressing sorrow over the tragic loss of one of its users and offering condolences to the family:
We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously and we are continuing to add new safety features that you can read about here:…
Garcia’s attorneys argued that Character.AI deliberately created and marketed a harmful AI chatbot aimed at children, which they believe contributed to the teenager's death.
Rick Claypool, a research director at the consumer advocacy group Public Citizen, expressed concerns that tech companies developing AI chatbots cannot regulate themselves effectively. He emphasized the need to hold these companies accountable when they fail to prevent harm. He also called for strict enforcement of existing laws and urged Congress to step in where necessary to protect young and vulnerable users from exploitative chatbots.
The AI companionship app industry is rapidly growing and remains largely unregulated. For a monthly subscription, typically around $10, users can either create their own AI companions or choose from a selection of prebuilt personas. These apps allow for communication through text messages or voice chats. Many are designed to simulate relationships, such as girlfriends or boyfriends, and some are promoted as solutions to the increasing issue of loneliness.
It's a tragic story that I hope will be the last of its kind, but given how carried away we're getting with technology these days, I sadly fear I could be wrong.
Sebastian, a veteran tech writer with over 15 years of experience in media and marketing, blends his lifelong fascination with writing and technology to provide valuable insights into the realm of mobile devices. Embracing the evolution from PCs to smartphones, he harbors a special appreciation for the Google Pixel line due to its superior camera capabilities. Known for his engaging storytelling style, sprinkled with rich literary and film references, Sebastian critically explores the impact of technology on society, while also perpetually seeking out the next great tech deal, making him a distinct and relatable voice in the tech world.