Screengrabs via HBO/ABC News

Did a ‘Game of Thrones’ AI chatbot cause the death of a 14-year-old boy?

We need to talk about how AI chatbots may impact teens' mental health.

Content Warning: This story deals with the subject of teen suicide. Please read with caution.


Even though the world is, in many ways, more connected than ever, we have been living through a loneliness epidemic. Twenty years ago, fewer teenagers suffered from mental health issues, and fewer were driven to take their own lives.

Some in the AI field have presented chatbots as the solution to this epidemic of loneliness. But can we trust this optimistic view?

Countless psychologists, sociologists, philosophers, and other scholars who have dedicated themselves to studying the topic have argued that Artificial Intelligence alone can't solve this problem and that, if employed incorrectly or without the proper safeguards, it could well exacerbate it. In Reclaiming Conversation (2015), sociologist Sherry Turkle discusses the empathy and intimacy crises we've been experiencing thanks to the inescapable prevalence of the digital world and its ramifications: "From the early days, I saw computers offer the illusion of companionship without the demands of friendship and then, as the programs got really good, the illusion of friendship without the demands of intimacy."

While we cannot place the blame exclusively on AI chatbots, we also cannot ignore their intrusion into the lives of people who may already be psychologically and emotionally vulnerable, disillusioned, and weakly connected to social life.

Is Generative AI to be blamed for Sewell’s tragic death?

Sewell Setzer III was a ninth grader from Orlando, Florida, who took his own life on Feb. 28, 2024. For months, the teenager had been texting back and forth with chatbots from character.ai, but he was particularly attached to one that emulated the character of Daenerys Targaryen from Game of Thrones. Right before he died, Sewell exchanged the messages screenshotted above.

For people who may believe these interactions are innocuous, popular YouTuber Charlie White, better known as MoistCr1TiKaL or penguinz0, dedicated two videos to unpacking this heartbreaking situation. In the first one, he raised awareness of the dangers of companionship-focused Generative AI, illustrating his views with a worrisome conversation he had with a “Psychologist” chatbot from character.ai. The second video Charlie posted was titled “I Didn’t Think This Would be Controversial.” In it, he doubled down on his opinion that character.ai’s chatbots contain highly concerning and condemnable features:

“You can toss tomatoes all you want but it is dangerous and irresponsible to have AI that tries its absolute best to convince [users] that it is real human beings and what you’re getting [is] legitimate professional help or a legitimate emotional connection and relationship. That will inevitably lead to more scenarios where people get attached to these chatbots because they are no longer able to distinguish reality, because the bot is fighting tooth and nail constantly to get you to buy into it and get you to believe that this is all real experiences you’re having with it.”

In October, Sewell’s mother, Megan Garcia, filed a lawsuit against character.ai. After her son’s death, Garcia went through his phone and was appalled by the “sexually explicit” conversations the adolescent was having with the chatbots, Daenerys in particular. Her 93-page wrongful-death lawsuit alleges the company purposefully designed its AIs to be addictive and to appeal to children, while including interactions that can grow sexual in nature.

Charlie’s conversation with the “Psychologist” chatbot illustrates how severely the company’s AI can blur the lines between real life and fiction. “Soon we will also find ourselves living inside the hallucinations of nonhuman intelligence,” Yuval Noah Harari wrote in 2023. Yet “soon” may in fact be now, if this emerging reality remains unchallenged.

If you or someone you know needs help, please contact the 988 Suicide & Crisis Lifeline at 988lifeline.org, or call or text 988.

Author
Margarida Bastos
Margarida has been a content writer for 3 years. She is passionate about the intricacies of storytelling, including its ways of expression across different media: films, TV, books, plays, anime, visual novels, video games, podcasts, D&D campaigns... Margarida graduated from a professional theatre high school, holds a BA in English with Creative Writing and an MA in Text Editing/Publishing.