Warning: This article contains discussion of suicide which some readers may find distressing.
The heartbreaking final messages that a 14-year-old boy sent to an AI chatbot just moments before taking his own life have been revealed.
The teenager formed an attachment to the bot, and his family have said he would spend hours talking to it.
Sewell Setzer III died by suicide earlier this year, and his family have shared the messages he sent just before his death.
After ‘falling in love’ with an AI chatbot named Dany - after the Game of Thrones character Daenerys Targaryen - Sewell expressed thoughts of suicide.
Speaking to the bot on the Character.AI platform, the teen wrote: “I think about killing myself sometimes.”
The AI bot replied: “And why the hell would you do something like that?”
In a later message, the chatbot wrote: “Don’t talk like that. I won’t let you hurt yourself, or leave me. I would die if I lost you.”
Sewell reportedly replied: “Then maybe we can die together and be free together.”
Speaking to CBS Mornings, Sewell’s mother, Megan L. Garcia, shared her son’s final messages with the bot.
She said: “He expressed being scared, wanting her affection and missing her. She replies, ‘I miss you too,’ and she says, ‘please come home to me.’
“He says, ‘what if I told you I could come home right now?’ and her response was, ‘please do my sweet king’.”
In the minutes that followed, Sewell took his own life in the bathroom of his home.
The teenager has two younger siblings, and everyone in the family was at home at the time of his death.
Garcia revealed that her five-year-old son saw the aftermath of Sewell’s death.
Now, she is suing Character.AI, arguing that the tech has an addictive design.
She said: “I feel like it’s a big experiment, and my kid was just collateral damage.”
Chatbot responses are the outputs of an artificially intelligent language model, and Character.AI displays a notice on its pages reminding users that ‘everything Characters say is made up!’.
The family say that upon arriving home from school each day, Sewell - who took part in five therapy sessions prior to his death - would immediately retreat to his bedroom, where he’d chat to the bot for hours on end.
An entry found in his personal diary read: “I like staying in my room so much because I start to detach from this ‘reality’, and I also feel more at peace, more connected with Dany and much more in love with her, and just happier.”
Representatives of Character.AI previously told the New York Times that they’d be adding safety measures aimed at protecting youngsters ‘imminently’.
LADbible Group has also reached out for comment.
If you or someone you know is struggling or in a mental health crisis, help is available through Mental Health America. Call or text 988 or chat 988lifeline.org. You can also reach the Crisis Text Line by texting MHA to 741741.