A 12-year-old girl has been left “traumatized” after a deepfake image was shared by school bullies on social media.
The family of the young girl have accused the police of not doing enough to protect their daughter after bullies posted a deepfake image of her online.
Deepfakes are synthetic media that digitally manipulate an image to replace one person's likeness convincingly with that of another.
Deepfake pornography has become a growing problem since the technology emerged.
The edited photo was shared on Snapchat and West Yorkshire Police in the UK initially told the family that nothing could be done because the social media platform is based in the US.
The parents of the 12-year-old contacted police using the non-emergency number earlier this year, after finding out that the image was circulating online.
An officer visited their home, but nine days later the family received a text message saying that the case had been closed and no suspect had been identified.
After a complaint, the police admitted that they had made “mistakes” and have since apologized to the family.
They say that the incident is now being “thoroughly investigated”.
Speaking to the BBC, the girl’s mother said: “It has honestly been the most horrendous thing to go through.
“This image is being shared by children who presumably think it is funny but it is basically child pornography.
“How can someone make a fake pornographic picture of a 12-year-old girl for people to share again and again - and police do nothing at all?”
She went on to add: “It just felt to me that they [the police] just weren't bothered at all.
“We called back several times to find out why they weren't investigating and my husband went to the police station.
“At one point we were told the log said they didn't have our phone number - but we'd been sent that text message.
“We thought at least they might contact my daughter's school before the summer holidays to try and stop the image being spread further, but they didn't and now it is all over her social media.”
A West Yorkshire Police spokesperson said: “We acknowledge that this matter was not handled in a satisfactory manner and our method of communication does not reflect the appropriate level of victim care.
“The officer in the case has been advised accordingly and we have since spoken with the victim’s family to assure them this is being thoroughly investigated.
“Further information about this matter has since come to light. Our inquiries remain ongoing.”
A spokesperson for Snapchat said: “Any activity that involves the sexual exploitation of a child, including the sharing of explicit deepfake images, is abhorrent and illegal, and we have zero tolerance for it on Snapchat.
“If we find this content through our proactive detection technology, or if it is reported to us, we will remove it immediately and take appropriate action. We also work with the police, safety experts and NGOs to support any investigations.”