Exclusive
YOUTUBE FAIL

YouTube takes down ‘less than half of the dangerous hate content that gets reported to it’

The Henry Jackson Society think tank ran a three-month experiment to test the Google-owned site, reporting alarming material it found there every week

YOUTUBE takes down less than half of the dangerous hate content that gets reported to it, a major study has revealed.

Even when the video-sharing giant does act to remove Islamist extremist and far-right films, it takes almost two weeks to do it.

YouTube is owned by Google, a company that has also come under fire for not removing radical videos quickly. Credit: Alamy


At the end of the period, moderators had removed just 47 of a total of 107 Islamist extremist postings that glorified terror acts.

For the Islamist hate videos they did bother to act on, it took them an average of 11 and a half days to take them down.

And just 33 of the 94 far-right videos promoting racial violence spotted by the think tank's researchers were eventually taken down - on average, after 13 and a half days.

YouTube decided not to remove controversial Adolf Hitler videos. Credit: Getty - Contributor

The revelation comes despite almost every jihadi terrorist attack on British soil being linked to radicalisation online.

Hate content that YouTube refused to remove during the experiment included a video of a man filmed slapping a Muslim teenager with bacon and shouting ‘ISIS scum’.

Another, entitled ‘Adolf Hitler was right’, praised Hitler alongside images of Jewish families being marched off to concentration camps.

It also allowed a film to stay up of a child singing to images glorifying Islamist terrorism, as well as promotional material posted to support the Taliban.

It took moderators an average of 11 and a half days to take the flagged videos down. Credit: Getty - Contributor

By the end of the three months, 121 extremist videos that had been reported were still fully viewable.

The extensive study was commissioned by Commons Home Affairs Select Committee chair Yvette Cooper to test YouTube’s repeated pledges that it acts immediately on hate reporting.

Former Labour Cabinet minister Ms Cooper dubbed the findings “simply unacceptable”, adding: “We know social media can play a role in the radicalisation of young people, drawing them in with twisted and warped ideology.

“YouTube have promised to do more, but they just aren’t moving fast enough. Google, which owns YouTube, is one of the richest and most innovative companies on the planet. They have the resources and capability to sort this and they need to do so fast.”

Dr Alan Mendoza, Executive Director of the Henry Jackson Society, added: “These ideologies can be freely disseminated and amplified online, and there is room for improvement by technology firms to provide spaces to expose and debate their inconsistencies.”

The internet giants have also failed to deliver on a demand by PM Theresa May at the UN in September that they remove all extremist content within two hours of it being posted or face crippling fines.
