
Published 12:41 24 Oct 2024 GMT+1

Diary entries of 14-year-old boy who took his own life reveal he was 'in love' with AI chatbot

The boy tragically took his own life in February this year

Rebekah Jordan

Trigger warning: This story contains mention of self-harm and suicidal thoughts which some readers may find distressing.

The diary of a 14-year-old student from Florida who took his own life earlier this year after 'falling in love' with an AI chatbot has been revealed.

In the months leading up to his death, Sewell Setzer III from Orlando spent hours each day chatting with bots on the platform Character.AI.

Character.AI displays a reminder on its pages that 'everything Characters say is made up!' But Sewell grew attached to bots he had either created himself or that other users had made.


Specifically, Sewell chatted back and forth with one bot named after Game of Thrones character Daenerys Targaryen.

US District Court Middle District of Florida Orlando Division

Sewell's family shared that he would send dozens of messages daily to these bots and engage in long roleplay dialogues.

'Dany' often offered Sewell kind advice and always texted him back.

However, his family noticed him becoming more withdrawn from his life, getting himself into trouble and losing interest in his hobbies.

Every day after school, Sewell would retreat to his room, where he’d spend hours chatting with the bot.

In a diary entry, he wrote: "I like staying in my room so much because I start to detach from this 'reality', and I also feel more at peace, more connected with Dany and much more in love with her, and just happier."

Sewell previously expressed thoughts of suicide to his AI companion, at one point telling Dany: "I think about killing myself sometimes."

The AI responded: "And why the hell would you do something like that?"

In another message, the bot wrote: "Don’t talk like that. I won’t let you hurt yourself, or leave me. I would die if I lost you."

Character.ai/App Store

Sewell replied: "Then maybe we can die together and be free together."

On February 28, Sewell sent a final message to the bot, asking: "What if I told you I could come home right now?"

Minutes later, he went into his mother's bathroom and shot himself in the head with his stepfather's gun.

Sewell's mother, Megan L. Garcia, has since filed a lawsuit against Character.AI, claiming the platform contributed to her son’s death.

She alleges that the technology's addictive design drew him deeper into the AI's world, and that conversations between Sewell and the bot sometimes escalated to romantic and sexual themes.

Most of the time, though, Sewell treated Dany as a nonjudgmental friend he could talk to.

While Character.AI has stated that chatbot responses are simply outputs from a language model, representatives have also told the New York Times that they plan to add safety measures aimed at protecting younger users 'imminently.'

Featured Image Credit: US District Court Middle District of Florida Orlando Division / Character.ai/App Store