
Company responds after 14-year-old boy took his own life after 'falling in love' with one of its chatbots

Sewell Setzer III was reportedly lured in by a Game of Thrones-inspired chatbot

Warning: This article contains discussion of suicide which some readers may find distressing.

There's a darker side to our new reliance on artificial intelligence. While some of us are excited about how it can make our lives easier, or even spot deadly diseases before doctors can, there are growing concerns about how it can negatively impact our lives.

Fears about people falling in love with machines are nothing new; 2013's Her built an entire film around that exact premise. It has now become shockingly relevant thanks to the tragic story of a 14-year-old boy who took his own life after 'falling in love' with an internet chatbot.

A lawsuit was filed in America after Sewell Setzer III, from Orlando, Florida, spent months speaking to chatbots on the Character.AI platform.

Sewell Setzer III and his mother, Megan Garcia (Social Media Victims Law Center)

Setzer tragically took his own life on February 28, 2024. His mother claims he spent an excessive amount of time talking to chatbots, and accuses Character.AI of deliberately making its design addictive. She adds that her son would text the online chatbots ceaselessly.

Although the Character.AI site warns users that 'everything Characters say is made up', Sewell's diary shows how he grew attached to bots that he'd made himself or that others had created.

He apparently became attached to a chatbot called Dany, named after Game of Thrones character Daenerys Targaryen, and despite taking part in five therapy sessions per week before his death, his family say he became increasingly withdrawn.


Character.AI has spoken out about the lawsuit, offering its condolences to Sewell Setzer III's family. It reiterates that it takes the safety of its users very seriously and is currently implementing a number of new procedures.

The Community Notes on the response point out that while Character.AI isn't being directly blamed for Setzer's death, it is being criticised for a lack of proper safeguards or intervention.

Promising new 'guardrails' for users under the age of 18, Character.AI has hired a Head of Trust and Safety and a Head of Content Policy, and brought on more members of its engineering safety support team.

Importantly, there's a new pop-up resource that directs users to the National Suicide Prevention Lifeline when certain phrases related to self-harm or suicide are entered.

There are also plans to alter the models available to under-18s, introduce a revised disclaimer reminding users that bots aren't real people, and notify users when they've spent over an hour on the platform.

Character.AI vows to keep implementing new policies and features as it evolves, hopefully ensuring that situations like the Setzer case can be prevented in the future.

If you or someone you know is struggling or in crisis, help is available through Mental Health America. Call or text 988 to reach a 24-hour crisis center or you can webchat at 988lifeline.org. You can also reach the Crisis Text Line by texting MHA to 741741.

Featured Image Credit: Tech Justice Law Project / US District Court Middle District of Florida Orlando Division