Facebook moderators ‘addicted’ to extreme clips and conspiracy theories after being forced to view 500 dodgy posts per day, report claims
FACEBOOK moderators have revealed how they're becoming "addicted" to extremist content, according to a new report.
Some workers even say they have hoarded troves of dodgy content for their own personal collections.
Facebook uses teams of third-party moderators to scour the app for banned or illegal photos, videos or posts.
And workers at one office in Berlin said they had developed addictions to extreme content.
Even their political and social views were influenced after being forced to view fake news and hate speech constantly.
Employees also scoured texts sent via Facebook Messenger to uncover sexual abuse against children.
Algorithms flag up these rogue messages, which are then reviewed by moderators.
"You understand something more about this sort of dystopic society we are building every day," said
"We have rich white men from, from the US, writing to children from the Philippines.
"They try to get sexual photos in exchange for $10 or $20 (£8 to £16)."
Moderators were reportedly expected to review 1,000 items during an eight-hour shift.
That's roughly one issue every 30 seconds.
According to recent reports, this number has now been reduced to between 400 and 500 items per day.
In a statement given to The Sun, Facebook said: "Content moderators do vital work to keep our community safe, and we take our responsibility to ensure their well-being incredibly seriously.
"We work closely with our partners to ensure they provide the support people need, including training, psychological support and technology to limit their exposure to graphic content.
"Content moderation is a new and challenging industry, so we are always learning and looking to improve how it is managed.
"We take any reports that our high standards are not being met seriously and are working with our partner to look into these concerns."
In another statement given to The Sun, Facebook said: "The wellbeing of the people who review content for Facebook is a priority for us and we employ a range of tools to limit the amount of graphic material they have to see.
"This includes tools to ensure they are not seeing this kind of content back-to-back for long periods of time.
"In Berlin, as well as in all other review sites, the devices used by our content reviewers have also been adapted to ensure people cannot save any content they review.
"This includes disabling USB drives and the ability to print screens, and blocking external email providers. Phones and smart watches are also not allowed on the floor or in training rooms.
"Any suggestion that we allow our content reviewers to save content they have reviewed into a 'personal collection' is false."
Back in February, a report revealed how Facebook moderators were using drugs and having sex at work to cope with their PTSD symptoms.
One moderator even reportedly found himself believing in the warped conspiracy theories he had to monitor continually.
Employees at a facility in Arizona, operated by third-party contractor Cognizant, are constantly bombarded with sexual, violent and bloody content – for £11 an hour.
One terrorised employee said she started having panic attacks after watching a video of a man being stabbed to death as he screamed and begged for his life.
She had to view the clip in order to graduate to becoming a "process executive", and was made to explain to a group of other trainees why it should be removed straight after seeing it.
The moderators, whose job is to remove anything deemed offensive or inappropriate, or that contradicts Facebook's user requirements, are allowed two 15-minute breaks, a 30-minute lunch break, and nine minutes of “wellness time”.
To add to the hell employees endure, they reportedly find themselves spending their breaks – the only peace they have at work – queuing for the toilets.
For a facility that houses hundreds of workers, there are apparently three bathroom stalls, which are reportedly often occupied by people having sex.
As a means of coping and escape, employees also snuck into the facility’s nursing rooms.
It reportedly got so bad that the locks had to be removed from the nursing room doors.
The report claimed the psychological stress had turned many to drugs and alcohol, with some even smoking marijuana while working.
Cognizant employees were apparently also contending with threats from former colleagues who had been sacked.
Tackling dodgy content – a new plan
Here's the latest bid by Facebook to stamp out extreme videos online...
- Yesterday, Facebook revealed it would give UK police body cameras to train AI systems that hunt down shooter clips
- Earlier this year, twisted white supremacist Brenton Tarrant livestreamed a massacre at two mosques in New Zealand
- Some 51 people were slaughtered when the far-right fanatic gunned down worshippers
- The attack was filmed in first-person and uploaded to Facebook. The app's moderators were later criticised for not taking down footage quickly enough
- Now Facebook is hoping to get a clearer idea of what shootings look like, to help train its own detection systems
- Facebook will provide London's Metropolitan Police with body cams for use in firearms training centres
- This will allow them to capture the "volume of images needed" to train its machine learning tools
- Over time, the AI tools will learn to more accurately and quickly identify real-life first-person shooter incidents – and then remove them from the platform
Then in June, it emerged that a Facebook moderator had died from a heart attack while sifting through gruesome videos on the social media platform.
According to an investigation by The Verge, Keith Utley was just 42 years old when he died at his desk.
He had spoken out about how the grotesque videos were affecting his mental health, but he was also desperate to keep his job to support his family.
A former lieutenant commander in the Coast Guard, he was one of the 800 workers employed by Cognizant at a Facebook content moderation site in Tampa, Florida.
The stress of the job was crippling Keith, who openly expressed that he was struggling with the content he was seeing.
According to The Verge, regular exposure to such distressing scenes resulted in workers being diagnosed with post-traumatic stress disorder and other related conditions.
One of Keith's managers told The Verge: "The stress they put on him — it’s unworldly.
"I did a lot of coaching. I spent some time talking with him about things he was having issues seeing. And he was always worried about getting fired."
Night-shift worker Keith was found slumped over his desk in March last year. His colleagues raised the alarm when they noticed he was in distress and had started sliding out of his chair.
According to The Verge, two colleagues began to perform CPR, but no defibrillator was available in the building.
By the time paramedics arrived, one worker said that Keith had already begun to turn blue.
Allegedly, some staff members didn't even look up from their screens and continued to moderate.
Keith was pronounced dead in hospital. Further information about his health history or the exact circumstances of his death was not released.
He left behind his wife Joni and two young daughters.
According to insiders, workers on the day shift were informed that there had been an incident but were not told that Keith had died.
After the Christchurch massacre, Facebook promised instant bans for rogue live-streamers under a new "one-strike policy".
Facebook CEO Mark Zuckerberg also admitted that the social network should be regulated after the attack.
Facebook is also under fire after it emerged WhatsApp had been hacked – allowing crooks to install spyware on users' phones.
Do you think Facebook does enough to protect users and moderators? Let us know in the comments!
We pay for your stories! Do you have a story for The Sun Online Tech & Science team? Email us at [email protected]