Mum can continue lawsuit against AI chatbot firm she holds responsible for son's death

Friday, 23 May 2025 12:12

By Mickey Carroll, science and technology reporter

The mother of a 14-year-old boy who took his own life after what she says was an obsession with artificial intelligence chatbots can continue her legal case against the company behind the technology, a judge has ruled.

"This decision is truly historic," said Meetali Jain, director of the Tech Justice Law Project, which is supporting the family's case.

"It sends a clear signal to [AI] companies [...] that they cannot evade legal consequences for the real-world harm their products cause," she said in a statement.

Warning: This article contains some details which readers may find distressing or triggering

In a lawsuit filed in Florida, Megan Garcia, the mother of Sewell Setzer III, claims Character.ai targeted her son with "anthropomorphic, hypersexualized, and frighteningly realistic experiences".

"A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life," said Ms Garcia.

Sewell shot himself with his father's pistol in February 2024, seconds after asking the chatbot: "What if I come home right now?"

The chatbot replied: "... please do, my sweet king."

In her ruling this week, US Senior District Judge Anne Conway described how Sewell became "addicted" to the app within months of using it, quitting his basketball team and becoming withdrawn.

He was particularly addicted to two chatbots based on Game of Thrones characters, Daenerys Targaryen and Rhaenyra Targaryen.

"[I]n one undated journal entry he wrote that he could not go a single day without being with the [Daenerys Targaryen Character] with which he felt like he had fallen in love; that when they were away from each other they (both he and the bot) 'get really depressed and go crazy'," wrote the judge in her ruling.

Ms Garcia, who is working with the Tech Justice Law Project and Social Media Victims Law Center, alleges that Character.ai "knew" or "should have known" that its model "would be harmful to a significant number of its minor customers".

The case holds Character.ai, its founders and Google, where the founders began working on the model, responsible for Sewell's death.

Ms Garcia launched proceedings against both companies in October.

A Character.ai spokesperson said the company will continue to fight the case and employs safety features on its platform to protect minors, including measures to prevent "conversations about self-harm".

A Google spokesperson said the company strongly disagrees with the decision. They added that Google and Character.ai are "entirely separate" and that Google "did not create, design, or manage Character.ai's app or any component part of it".

Defence lawyers argued the case should be thrown out because chatbot output deserves First Amendment protections, and that ruling otherwise could have a "chilling effect" on the AI industry.

Judge Conway rejected that claim, saying she was "not prepared" to hold that the chatbots' output constitutes speech "at this stage", although she did agree Character.ai users had a right to receive the "speech" of the chatbots.

Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email jo@samaritans.org in the UK. In the US, call the Samaritans branch in your area or 1 (800) 273-TALK.

(c) Sky News 2025: Mum can continue lawsuit against AI chatbot firm she holds responsible for son's death
