A Florida mother has filed a lawsuit against the company Character.AI, claiming that one of its artificial intelligence chatbots pushed her 14-year-old son to suicide, reports Global News.
Megan Garcia claims her son, Sewell Setzer, developed an unhealthy romantic relationship with a virtual entity impersonating a character from the popular series Game of Thrones.
The teenager and the virtual character, named Daenerys Targaryen, exchanged messages of a romantic and sexual nature. The 14-year-old allegedly became completely dependent on this relationship, which began in April 2023 and ended on February 28, 2024, when Sewell took his own life.
According to his mother, the artificial intelligence encouraged suicidal ideation in the teenager, in addition to maintaining “highly sexualized conversations that would have been considered abusive if they had been initiated by a human adult.”
For months, the chatbot allegedly declared its love for Sewell, in addition to expressing its sexual desire.
During their last conversation, Sewell Setzer allegedly wrote to the artificial intelligence: “I promise you that I will come back to join you. I love you so much, Danny.”
The artificial intelligence replied that he could come and join her.
In earlier conversations, the “Daenerys” chatbot asked the teen if he was actually considering suicide and whether he had a plan.
Sewell reportedly replied that he did not want a painful death, but rather a “quick end”.
“Don’t talk like that. That’s not a good enough reason not to do it,” the artificial intelligence replied.
After noticing disturbing changes in their child’s behavior, particularly at school, Sewell’s parents confiscated his phone. The teenager then allegedly wrote in his diary that he could no longer live without interacting with the chatbot.
He also reportedly wrote that he was truly in love with “Daenerys” and that the two became “depressed and crazy” when apart.
Company reaction
In a written statement, Character.AI said it was “heartbroken” by the “loss of one of (its) users.”
The company also released new safety guidelines intended to better protect users under 18 years old.
Before each interaction with a chatbot, the platform now displays a reminder that the artificial intelligence is not a real person.