
A new study has found that heavy ChatGPT users might be getting emotionally attached to the AI chatbot.
Researchers found that ChatGPT 'power users' - those who spent the most time with the OpenAI model - showed 'indicators of addiction.'
In a new joint study, researchers with OpenAI and the MIT Media Lab found that this small group of individuals engaged in more 'problematic use' of the chatbot, including 'preoccupation, withdrawal symptoms, loss of control, and mood modification.'
In short, some users weren't just using ChatGPT; they were leaning on it emotionally - maybe more than they realised.
To explore this further, the research team surveyed thousands of users to learn not just how people use ChatGPT but also how they feel during those interactions.

The team called these 'affective cues', defined in a joint summary of the research as 'aspects of interactions that indicate empathy, affection, or support.'
While most users 'didn't engage emotionally' with ChatGPT, those who used it for more extended periods were more likely to treat it like a 'friend.'
These users also tended to report feeling lonelier and were more sensitive to subtle changes in how the chatbot responded.
Interestingly, the study suggests that people with fewer social connections in real life may be forming deep, one-sided relationships with AI - and where that could lead is anyone's guess.
Furthermore, the research revealed some interesting contradictions. For example, people were more emotionally expressive when typing with ChatGPT than with its Advanced Voice Mode. And oddly enough, the voice mode was linked to better well-being - but only when 'used briefly.'

Meanwhile, those who used ChatGPT for 'personal' reasons, like discussing feelings or memories, were less emotionally dependent on it than those who used it for more practical stuff, like brainstorming or getting advice.
Across the board, the longer someone used ChatGPT - no matter how or why - the more likely they were to grow emotionally attached.
Commenting on the research, OpenAI and MIT jointly wrote: "Our findings show that both model and user behaviors can influence social and emotional outcomes. Effects of AI vary based on how people choose to use the model and their personal circumstances.
"This research provides a starting point for further studies that can increase transparency, and encourage responsible usage and development of AI platforms across the industry."