Artificial intelligence (AI) research has surged forward over the last few years, flooding the market with new products. Alongside the more famous chatbots and AI assistants, like ChatGPT and Microsoft Copilot, there's also a whole heap of products built around more niche ideas.
For one thing, some companies have started selling chatbots that can mimic loved ones who have passed away, and experts have suggested that this could be a dangerous path to go down.
While it might sound comforting to know that you could talk to a projection of a dead family member's personality after they're gone, as a way to remember them, researchers from Cambridge University have called this sort of idea "high-risk".
This is because they believe it could cause real psychological harm to the person talking to the chatbot in the long run, while also ignoring the rights of the deceased person in question, who might not have been thrilled by the idea.
AI researcher Dr Tomasz Hollanek from Cambridge’s Leverhulme Centre for the Future of Intelligence said: "It is vital that digital afterlife services consider the rights and consent not just of those they recreate, but those who will have to interact with the simulations. These services run the risk of causing huge distress to people if they are subjected to unwanted digital hauntings from alarmingly accurate AI recreations of those they have lost."
His choice of word, "hauntings", is a pretty strong one, suggesting that this sort of chatbot could make it almost impossible to move on and accept the loss of our loved ones. Hollanek went on to say: "The potential psychological effect, particularly at an already difficult time, could be devastating."
Hollanek and his team have published a research paper on the topic, warning that the risk goes beyond people struggling to move on - unscrupulous companies could do much worse with these chatbots.
Once they have the ability to mimic someone who's passed away, there's nothing to stop these companies from targeting people with advertising or commercial offers using the tone and appearance of their dead loved ones, which is the sort of idea you might expect to see in a dark science fiction movie.
This would be a lot like getting digitally "stalked by the dead", according to the paper, and could again have really bad psychological knock-on consequences, since being bombarded with notifications can be stressful even in normal circumstances.
The paper does affirm that if people want to leave behind AI personalities when they die, that may well be their right, but it concludes that "the rights of both data donors and those who interact with AI afterlife services should be equally safeguarded". That sounds fair to us.