Facebook moderators ‘diagnosed with PTSD’ after watching 500 clips a day featuring stabbings and child sex abuse
FACEBOOK employees in Ireland forced to watch sick clips posted to the site are suing the company for trauma.
Moderators have been left broken after examining hundreds of harrowing videos a day featuring stabbings, bestiality and child sex abuse.
Based in a facility in Dublin, operated by third-party contractor CPL Resources, employees are constantly bombarded with traumatic content – for just over minimum wage.
One terrorised ex-worker told how he viewed brutal murders within hours of taking the job.
"My first day on the job, I witnessed someone being beaten to death with a plank of wood with nails in it and repeatedly stabbed," New Jersey native Sean Burke said.
"Day two was the first time for me seeing bestiality on video — and it all escalated from there."
Two weeks later, he "started seeing actual child porn."
The moderators' job is to remove anything deemed offensive or inappropriate, or that contradicts Facebook's user requirements.
Around 15,000 moderators are employed across the globe by various third-party companies.
In Dublin, moderators are given a few short "wellness breaks" and access to psychological support to help them cope.
But one worker told Vice that moderators are expected to trawl through so much horrific content they don't have time to take advantage of support.
Daniel Valdermasson, who worked as a Facebook moderator at CPL from October 2017 to December 2018, said he didn't expect the role to be so traumatic.
Daniel watched beheadings and other sick clips on the internet in the 1990s and presumed he could handle the brutality.
"I really didn't think it was going to be affecting me but then you get exposed to true evil," Daniel said.
"For most people, I have spoken to, the problem is not seeing a grown man beheaded, it's seeing a little six-month-old baby being raped."
Chris Gray, 50, is a former moderator based in Ireland who has been diagnosed with post-traumatic stress disorder (PTSD) following his 10-month ordeal with CPL.
He's fronting a lawsuit seeking monetary compensation for the trauma he and several other ex-employees suffered.
The suit also aims to force Facebook to hand over data on what employees were exposed to and for how long on a day-to-day basis.
It's hoped exposing this data will force Facebook to do more to protect moderators in future and improve how their wellbeing is tracked on the job.
"There are 40,000 people doing this s**t," Gray said, including contractors and Facebook employees.
"If I can get them better working conditions, better care, then that also improves the quality of the content moderation decisions and the impact on society."
Tackling dodgy content – a new plan
Here's the latest bid by Facebook to stamp out extreme videos online...
- In September, Facebook revealed it would give UK police body cameras to train AI systems that hunt down shooter clips
- Earlier this year, twisted white supremacist Brenton Tarrant livestreamed a massacre at two mosques in New Zealand
- Some 51 people were slaughtered when the far-right fanatic gunned down worshippers
- The attack was filmed in first-person and uploaded to Facebook. The app's moderators were later criticised for not taking down footage quickly enough
- Now Facebook is hoping to get a clearer idea of what shootings look like, to help train its own detection systems
- Facebook will provide London's Metropolitan Police with body cams for use in firearms training centres
- This will allow Facebook to capture the "volume of images needed" to train its machine learning tools
- Over time, the AI tools will learn to more accurately and quickly identify real-life first-person shooter incidents – and then remove them from the platform
Diane Treanor, the Dublin-based lawyer leading the lawsuit, said Facebook moderators based in Berlin and Barcelona had also been in contact.
She hopes to ensure moderators have access to counsellors and healthcare workers even after they leave their roles.
Back in February, a report revealed how Facebook moderators were using drugs and having sex at work to cope with their PTSD symptoms.
One moderator even reportedly found himself believing in the warped conspiracy theories he had to monitor continually.
Based in a facility in Arizona, operated by third-party contractor Cognizant, employees were constantly bombarded with sexual, violent, and bloody content – for £11 an hour.
One terrorised employee told how she started having panic attacks after having to watch a man stabbed to death on video while he screamed and begged for his life.
After the Christchurch massacre, Facebook promised instant bans for rogue live-streamers under a new "one-strike policy".
Facebook CEO Mark Zuckerberg also admitted that the social network should be regulated after the attack.
Facebook is also under fire after it emerged WhatsApp had been hacked – allowing crooks to install spyware on users' phones.