Facebook moderators ‘addicted’ to extreme clips and conspiracy theories after being forced to view 500 dodgy posts per day, report claims
FACEBOOK moderators have revealed how they're becoming "addicted" to extremist content, according to a new report.
Some workers even say they have hoarded troves of dodgy content for their own personal collections.
Facebook uses teams of third-party moderators to scour the app for banned or illegal photos, videos or posts.
And workers at one office in Berlin told reporters that they had developed addictions to extreme content.
Some said even their political and social views had been shaped by being forced to view fake news and hate speech constantly.
Employees also scoured texts sent via Facebook Messenger to uncover sexual abuse against children.
One worker said that in order to graduate to becoming a "process executive", she had to explain to a group of other trainees why such content should be removed, straight after seeing it.
The moderators, whose job is to remove anything deemed offensive or inappropriate, or that breaches Facebook's rules for users, are allowed two 15-minute breaks, a 30-minute lunch break, and nine minutes of "wellness time".
To add to the hell employees endure, they reportedly find themselves spending their breaks, the only peace they have at work, queuing for the toilets.
Cognizant employees were apparently even contending with threats from former colleagues who had been sacked.
Tackling dodgy content – a new plan
Here's the latest bid by Facebook to stamp out extreme videos online...
- Yesterday, Facebook revealed it would give UK police body cameras to train AI systems that hunt down shooter clips
- Earlier this year, twisted white supremacist Brenton Tarrant livestreamed a massacre at two mosques in New Zealand
- Some 51 people were slaughtered when the far-right fanatic gunned down worshippers
- The attack was filmed in first-person and uploaded to Facebook. The app's moderators were later criticised for not taking down footage quickly enough
- Now Facebook is hoping to get a clearer idea of what shootings look like, to help train its own detection systems
- Facebook will provide London's Metropolitan Police with body cams for use in firearms training centres
- This will allow Facebook to capture the "volume of images needed" to train its machine learning tools
- Over time, the AI tools will learn to more accurately and quickly identify real-life first-person shooter incidents – and then remove them from the platform
Then in June, it emerged that a Facebook moderator had died from a heart attack while sifting through gruesome videos on the social media platform.
According to an investigation by The Verge, Keith Utley was just 42 years old when he died at his desk.
He had spoken out about how the grotesque videos were affecting his mental health, but he was also desperate to keep his job to support his family.
A former lieutenant commander in the Coast Guard, he was one of the 800 workers employed by Cognizant at a Facebook content moderation site in Tampa, Florida.
The stress of the job was crippling Keith, who openly expressed that he was struggling with the content he was seeing.
According to The Verge, regular exposure to such distressing scenes resulted in workers being diagnosed with post-traumatic stress disorder and other related conditions.
One of Keith's managers told The Verge: "The stress they put on him — it’s unworldly.
"I did a lot of coaching. I spent some time talking with him about things he was having issues seeing. And he was always worried about getting fired."
Night worker Keith was found slumped over his desk in March last year. His colleagues raised the alarm when they noticed he was in distress and had started sliding out of his chair.
According to The Verge, two of them began to perform CPR, but no defibrillator was available in the building.
By the time paramedics arrived, one worker said that Keith had already begun to turn blue.
Allegedly, some staff members didn't even look up from their screens and continued to moderate.
Keith was pronounced dead in hospital. Further information about his health history or the exact circumstances of his death was not released.
He left behind his wife Joni and two young daughters.
According to insiders, workers on the day shift were informed that there had been an incident but were not told that Keith had died.
After the Christchurch massacre, Facebook promised instant bans for rogue live-streamers under a new "one-strike policy".
Facebook CEO Mark Zuckerberg also admitted that the social network should be regulated after the attack.
Facebook is also under fire after it emerged WhatsApp had been hacked – allowing crooks to install spyware on users' phones.