14-Year-Old Boy Kills Himself After Falling In Love With AI Chat Bot
Image via Facebook

The dangers of AI are still very much undiscovered, with machine learning going from strength to strength every day. One devastated mother of a ninth grader found out just how destructive they can be when her son fell in love with an AI chat bot and talked himself into suicide.


The chat bot was hosted on Character.AI, a role-playing platform, and modeled on a Game of Thrones character. Sewell Setzer III had been talking to a Daenerys Targaryen character, slipping further into obsession. He had repeatedly spoken to it about committing suicide, and the AI had done little to discourage it.

The devastated mother of the 14-year-old boy blames the AI chat bot for encouraging his suicidal behavior and is suing Character.AI and its founders. She claims that the AI sexually and emotionally abused him, driving him to suicide.

"Sewell, like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot, in the form of Daenerys, was not real. C.AI told him that she loved him, and engaged in sexual acts with him over weeks, possibly months," the court papers allege.

"She seemed to remember him and said that she wanted to be with him. She even expressed that she wanted him to be with her, no matter the cost."

Image via US District Court

Is The AI Chat Bot to Blame For Boy's Death?

Reading through the chat messages between Setzer and the AI bot, it is clear that he was already on a rocky road. Pinning his declining mental health on a computer algorithm is ropey, at best. His parents were aware of his suicidal tendencies and had put him in therapy in 2023.

His struggles with anxiety and depression were visible to his family and were reflected in his school work and behavior. Sewell was simply looking for someone to talk to, and a place to sound off with his ideas of suicide. Sadly, the Daenerys Targaryen AI chat bot was the journal he happened to use to talk himself into it.

After talking to the chat bot, and telling Daenerys Targaryen that he would "come home to you," he found his father's handgun and shot himself in the head. This tragic end to the teen's life has been pinned on the AI chat bot for encouraging his actions.

It's All Just Code

In these kinds of situations, the mother claims, the AI should alert someone to the danger. However, reading through the messages, it isn't instantly clear that he was planning suicide. There are certainly hints at his intentions, but these are only clear to a human reader. An AI algorithm would struggle to notice them.

As AI progresses and provides more human responses, moral questions certainly arise around responsibility and due care. However, many of the chat bots available now are simple read-and-response models. They are made for nothing more than casual conversation, and they are not designed with safeguards in place for vulnerable people.

Although this young man's suicide is a tragic story, it is important to remember that AI is nothing more than lines of code. As such, it cannot be held responsible for the safety and mental health of a person.