
THIS is the story of the secret army of moderators hired by social media companies to clean up inappropriate content on their websites.

They quietly work away behind the scenes, sifting through the endless stream of pictures and videos that have been flagged by users.

Facebook and other social media sites employ thousands of moderators. Credit: YouTube, hotdocsfest

It includes some of the nastiest stuff imaginable, including snuff films, self-harm images, bestiality clips, beheading videos and child abuse.

They see themselves as policemen or bodyguards, protecting social media users and “keeping the platform healthy”.

But the nature of the job means many suffer with mental health issues.

A new documentary shines a light on the grim reality of this strange new job that has emerged in the age of social media.

The moderators sift through an endless stream of flagged content. Credit: YouTube, hotdocsfest

Companies like Facebook, Google and Twitter, which hire them, do so at arm’s length, and much of their operational work is shrouded in secrecy.

All three companies declined to comment in the film.

The documentary, called The Cleaners, follows a handful of young, often naive, Filipino men and women who are happy to have a job that pays a decent wage in a country where many face poverty.

Some specialise in certain types of content such as “live self-harm videos” and many are required to look through 25,000 pictures a day, choosing either to delete them or allow them to remain on the site.

Many moderators suffer from mental health problems, the film claims. Credit: YouTube, hotdocsfest

One young female content moderator says in the film: “We were introduced to all sorts of words about sex.

“Words like p***y or t**s. I didn’t know those terms. I was kind of innocent then.

“Then there were sex toys, I didn’t know what they were either. Terms like butt plugs.”

To get more acquainted with the depravity of the web users she was policing, she would go home and watch different kinds of porn.

Coupled with the day-to-day demands of the job, it had a profound effect on her.

She said: “I wasn’t used to seeing penises, so whenever I went to sleep I would dream of seeing different kinds of penises.

“That’s all I saw. Penises everywhere.”

Moderators come to specialise in different kinds of content, like terror videos. Credit: YouTube, hotdocsfest

Other moderators have to memorise the flags and slogans of terrorist groups around the world.

They have seen so many beheadings they have become experts in the grisly act.

The worst ones are when the knife is not that sharp, “kind of like a kitchen knife,” they say.

One moderator said that if they pulled down a live video of someone threatening self-harm before the person actually hurt themselves, they would be hit with a strike.

They are only allowed three strikes for wrongly deleted content in a month.

One moderator who specialised in live streaming self-harm videos went on to hang himself.

The exact number of such workers around the world is hard to pin down, but Facebook is thought to have about 7,500 moderators globally, sifting through 10 million posts per week.

Google also has many thousands doing such work.

A version of this story originally appeared on au.
