A former Facebook content moderator has described the 'traumatizing' content they had to sift through while working at the social media giant, in a YouTube video many have deemed 'horrifying'.
The early days of social media were all about sharing your life, whether it be pictures of your dog, your lunch, or your sunny holidays, and there was perhaps no better platform for that than Facebook.
Originally set up as a means to connect college students, it eventually expanded out to everyone - allowing them to show their life to their friends and family, and perhaps rebuild friendships that were lost to time.
There was unfortunately a much darker side to the social media platform though, and it was down to content moderators to sift through 'traumatizing' content that included hate speech, pornography, and brutal violence.
One former employee of Facebook spoke to VICE anonymously in a YouTube video, documenting their experience in the role and the 'horrors' that they faced on a daily basis, and it's an experience that you really have to hear to believe.
Boiling down the experience to the basics, they describe the job as follows: "I would basically just come in, find out what my target is for the day, press the button, just go. And the first piece of content, boom, it's just there on the screen in front of you.
You take a look, you make a decision, you press a couple of buttons, it goes away, and the next one loads. And the next one. And the next one."
While that might sound like any old boring office job, it's far from a pleasant or even mundane experience when you're faced with piece after piece of illegal and often deplorable content.
"You are seeing dead bodies," they describe, "and murders, or people celebrating having killed somebody, dogs being barbequed alive."
Part of the trouble, beyond seeing content that is deeply traumatizing, is the nature of the evaluation itself. As the former moderator explains: "I think it would be easier to deal with the images if you weren't having to think about them so deeply."
Facebook allegedly has incredibly strict, concretely defined moderation rules that draw fine distinctions between concepts like harm, injury, and violence, and that determine whether something falls under hate speech. It is up to the moderators to watch potentially traumatizing content closely and deliberately in order to evaluate it against these rules.
This particular former moderator has been diagnosed with PTSD stemming from their time at Facebook and the content they had to deal with on a daily basis, and at the time of the interview had a lawsuit pending against the social media company over the treatment they faced.
Facebook has previously paid $52 million in a settlement to current and former moderators as compensation for issues faced on the job, and only recently have reports surfaced in The Guardian that 140 Kenyan Facebook moderators have been diagnosed with PTSD.
It certainly raises questions about the ethics of a job like this, and about the responsibility of social media companies to design moderation systems that don't inflict this kind of harm on the individuals doing the work.