Revealed
SICK CLIPS

YouTube admits it can’t stop Jihadis and paedos posting BANNED ‘child abuse and violent extremist’ videos

Google's powerhouse video site struggles to keep a lid on extreme clips being shared online

BANNED videos promoting terrorism or child abuse can make it past YouTube filters, The Sun can reveal.

If an extreme video has already been taken down, YouTube will prevent it from being uploaded ever again – but minor edits make it possible to bypass the blocks.

Terrorists can easily tweak videos to get their propaganda past YouTube's blocks. Credit: AP: Associated Press

This means terrorists and paedophiles can get their twisted videos onto YouTube and broadcast them to the world with relative ease, YouTube confirmed to The Sun.

Roughly 300 hours of video are uploaded to YouTube every minute.

This makes it very difficult to ensure extreme content isn't making its way onto the Google-owned video-sharing site.

That's why Google relies on machines to help police YouTube.

Google said it received 1.6million reports from users about child abuse in live YouTube videos between October and December 2017. Credit: Getty Images

When Google deletes a video for breaching its content policy, it keeps a mathematical fingerprint of the file – known as a hash.

These hashes are stored in a huge database of blocked content.

So when someone uploads a video to YouTube and its hash is an exact match for one in the blocked-video database, the upload is rejected.

That prevents terrorists or child abusers from re-uploading banned videos.

But The Sun understands that even minor edits – like cropping the video, tweaking its length, adding banners or tinkering with the audio – mean banned videos can get past these blocks.
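To illustrate the principle, here is a minimal Python sketch of exact-match blocking of the kind described above. It is a generic illustration, not YouTube's actual code: the fingerprint(), ban() and allow_upload() functions and the use of SHA-256 are assumptions made purely for the example.

```python
import hashlib

# Hypothetical store of fingerprints ("hashes") of previously banned videos.
banned_hashes = set()


def fingerprint(video_bytes: bytes) -> str:
    """Fingerprint the raw video file (SHA-256 is used purely for illustration)."""
    return hashlib.sha256(video_bytes).hexdigest()


def ban(video_bytes: bytes) -> None:
    """Record a removed video so exact re-uploads can be rejected."""
    banned_hashes.add(fingerprint(video_bytes))


def allow_upload(video_bytes: bytes) -> bool:
    """Reject an upload only if it is byte-for-byte identical to a banned video."""
    return fingerprint(video_bytes) not in banned_hashes


# The original clip gets banned...
original = b"...raw bytes of an extreme video..."
ban(original)
print(allow_upload(original))   # False - the exact re-upload is blocked

# ...but even a trivial edit (here, a single extra byte) produces a completely
# different hash, so the exact-match check no longer catches it.
edited = original + b"\x00"
print(allow_upload(edited))     # True - the tweaked copy slips through
```

Because any change to the file produces a completely different hash, a cropped or re-edited copy no longer matches the database entry – exactly the loophole described above.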

YouTube admitted to The Sun that its banned-video blocking systems aren't fool-proof. Credit: Alamy

YouTube will still flag these videos to its team of content reviewers for being similar to banned videos.

But the video will make it online – and there's no way for Google to stop it.

Making these edits is simple, and can be done with free basic software – including smartphone apps.

YouTube officials told The Sun that in some cases, it's useful to allow edited versions of extreme videos to make it online.

For instance, news reports containing shortened clips of extreme videos should be allowed to go online.

But Google's auto-block systems aren't sophisticated enough to automatically prevent rogue edits of banned clips from making their way onto the site.
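For comparison, here is a rough Python sketch of how similarity-based flagging can work in general, using a perceptual hash and a Hamming-distance threshold. This is a standard technique assumed for illustration, not a description of Google's internal systems: the triage() function, the 10-bit threshold and the example hashes are all made up.

```python
def hamming_distance(a: int, b: int) -> int:
    """Count the differing bits between two 64-bit perceptual hashes."""
    return bin(a ^ b).count("1")


def triage(upload_phash: int, banned_phashes: list[int]) -> str:
    """Decide what to do with an upload based on how close it is to banned content."""
    for banned in banned_phashes:
        distance = hamming_distance(upload_phash, banned)
        if distance == 0:
            return "block"            # identical fingerprint: rejected automatically
        if distance <= 10:
            return "flag_for_review"  # close match: a human reviewer decides
    return "allow"                    # no close match: the video goes live


# Example with made-up 64-bit perceptual hashes:
banned = [0x8F3A92C41D5E77B0]
print(triage(0x8F3A92C41D5E77B0, banned))  # "block"
print(triage(0x8F3A92C41D5E77B4, banned))  # "flag_for_review" (a few bits differ)
print(triage(0x0123456789ABCDEF, banned))  # "allow"
```

Under a scheme like this, an exact match is blocked outright, a near match is passed to a human reviewer while the video stays live, and everything else is allowed – which mirrors the behaviour the article describes.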

In the first YouTube Community Guidelines Enforcement Report, released today, Google revealed that it had taken down 8.3million videos between October and December 2017.

The overwhelming majority (81%) of these videos were "machine detected" – roughly 6.7million.

And according to YouTube, a quarter of these videos managed to rack up views before being removed.

YouTube by the numbers

The facts...

  • The first YouTube video was uploaded in April 2005
  • More than 1.3billion people use YouTube around the world
  • Over 300 hours of video are uploaded to YouTube every minute
  • More than 5billion videos are watched on YouTube each day
  • YouTube is an American company, but 80% of its views come from outside the US
  • More than half of YouTube views are from mobile devices – like phones or tablets
  • Eight out of 10 people aged 18 to 49 watch YouTube

Most videos flagged by humans in the last quarter of 2017 were sexually explicit clips, which accounted for 9million reports.

But users also flagged 491,000 videos that promoted terrorism in that three-month period.

Although some of those reports would have been mistaken, a number would have been genuine terrorism clips that YouTube's filters failed to catch at upload.

Google says it's getting quicker at catching these rogue clips, however.

At the beginning of 2017, 8% of the videos flagged and removed for violent extremism were taken down with fewer than 10 views.

Now, more than half of the videos removed for violent extremism have fewer than 10 views.

By comparison, Google says that 76% of all machine-flagged videos were removed before they received a single view.

That gap suggests YouTube is finding it much harder to tackle violent extremism videos than other categories of policy-breaking clips.

To improve the situation, Google has promised to boost the number of content reviewers examining dodgy content on YouTube to 10,000 staff by the end of the year.

The tech giant also says it's hired "full-time specialists with expertise in violent extremism, counter terrorism, and human rights".

The company also works with a network of "trusted flaggers" – 150 academics, government partners and NGOs who help report bad content.

Their content flags have higher priority than those from normal users, so dodgy content they report should get taken down quickly.

But Google clearly still has work to do: millions of policy-breaching videos make their way onto YouTube each year before being flagged and removed.

Last week, we exposed YouTube porn playlists full of hardcore sex videos that were "luring kids" with cartoons and gaming clips.

A spokesperson from the NSPCC slammed YouTube over the fiasco, saying it risks giving children "a distorted view of sex, body image and healthy relationships".

And this weekend, UK Health Secretary Jeremy Hunt said he wanted computer giants to impose daily screen-time limits.

He blamed "irresponsible" internet giants who were failing to make their sites safe for children.

Following Hunt's comments, Carolyn Bunting, CEO of Internet Matters, said: "Children’s internet safety is one of the most pressing concerns parents face in the digital age.

"And while there is always more that the tech industry and social media networks can do to enforce age restrictions and protect children’s rights, we need to collectively focus on wider education to address the growing behavioural issues online that can negatively impact children’s health and well-being."

Do you think Google needs to do more to make YouTube a safe place? Let us know in the comments!



