YouTube caught promoting deadly ‘how to self-harm’ tutorials to youngsters as young as 13
YOUTUBE has been caught recommending "dozens" of graphic videos relating to self-harm to children.
The hugely popular video-sharing app has been blasted for promoting dangerous clips to users as young as 13.
Google-owned YouTube has come under fire for hosting inappropriate suicide-themed content before.
But a new report has found that YouTube has failed to clean up its act.
It found videos depicting "graphic images of self-harm", which were easily accessible to youngsters on the site.
One flagged clip titled "My huge extreme self-harm scars" is still live on the site, having racked up nearly 400,000 views over the last two years.
YouTube was also found to be offering worrying search term recommendations, including "how to self-harm tutorial", "self-harming girls", and "self-harming guide".
UK ministers are currently drawing up plans for tech giants to have a legal Duty of Care for young people using social media platforms.
There's growing pressure on mega-corporations like Google and Facebook to take more responsibility for the well-being of children on their sites.
In a statement, UK health secretary Matt Hancock said: "We are masters of our own fate as a nation, and we must act to ensure that this amazing technology is used for good, not leading to young girls taking their own lives."
In a statement responding to the exposé, YouTube said it "works hard" to prevent harmful videos and recommendations from turning up on the site.
"We know many people use YouTube to find information, advice or support, sometimes in the hardest of circumstances," a spokesperson said.
"We work hard to ensure our platforms are not used to encourage dangerous behaviour.
"Because of this, we have strict policies that prohibit videos which promote self-harm and we will remove flagged videos that violate this policy.
"Our policies also prohibit autocomplete predictions for these topics, and we will remove any suggestions which don't comply with our policies."
YouTube also shows a phone number to contact suicide support charity Samaritans when users search for terms like "suicide" on the site.
But experts think YouTube simply isn't going far enough to tackle these issues.
Speaking to The Sun, Andy Burrows, Associate Head of Child Safety Online at the NSPCC, said: "It’s concerning that YouTube is actively recommending inappropriate videos to young people.
"YouTube, as with so many social media platforms, appears to be failing to abide by its own rules to keep children safe.
"The NSPCC’s Wild West Web campaign has for months been calling on Government to impose a statutory duty of care that finally forces social networks to truly protect children and be faced with tough punishments if they don’t. This is not an opportunity the Government can miss."
Last September, an investigation by The Sun revealed how YouTube was profiting from sick pranksters who post shocking fake suicide videos online.
We uncovered hundreds of shocking videos: some have millions of views, and have been live on YouTube for years.
Mental health charities warned The Sun that these videos could even inspire real suicides, due to the detailed methods shown in some clips.
One clip sees a woman fake her own death in a bathtub filled with fake blood, filming her husband's reaction when he returned home.
The woman's distressed partner weeps, cries out her name, and even steps into the bathtub to try to resuscitate her.
Another five-year-old video sees a Brit prankster fake an angry phone call before jumping into the Thames (an act that claims 25 lives a year), stunning onlookers. It has more than three million views.
Speaking to The Sun at the time, Brian Dow, managing director at Mental Health UK and co-chair of the National Suicide Prevention Alliance, said: "It really should not need stating that suicide is not a joke or a prank.
"Every day people lose parents, children, siblings and friends, and to see it trivialised in this way is both cruel and incredibly irresponsible.
"To present this very serious issue in this way can have immediate and lasting effects not only on the viewer, who might be triggered by what they see on screen, but also the victims of the ‘pranks’ that we see being performed."
At the time, a spokesperson for the Department for Digital, Culture, Media and Sport told The Sun: "Suicide is a very serious issue that affects millions of people every year.
"We would urge YouTube to consider whether videos that trivialise someone taking their own life should be on its platform."
YOU'RE NOT ALONE: WHERE TO GET HELP
If you, or anyone you know, needs help dealing with mental health problems, the following organisations provide support:
- CALM, 0800 585 858
- Childline, 0800 1111
- Heads Together
- Mind, 0300 123 3393
- Papyrus, 0800 068 41 41
- Samaritans, 116 123
That wasn't the first time YouTube had been exposed for hosting shocking content.
In December 2017, popular YouTube vlogger Logan Paul sparked controversy after filming the body of a suicide victim.
The clip, which was posted to YouTube, showed the body of a person who had recently hanged themselves in a forest in Japan.
Paul earned millions of views within hours, but was widely condemned. He eventually removed the video, issued an apology, and took a month-long break from YouTube.
The Sun has also uncovered a rogue steroids advert, a secret cache of porn, smut playlists designed to "lure kids", and webcam sex ads on YouTube.
Social media sites may soon be prosecuted for failing to protect kids from disturbing content online.
Instagram has admitted its failure to block self-harm and suicide pics.
And Facebook was recently caught paying children up to £15 a month to install a "spying app" that monitors everything they did online.
Do you think Google needs to do more to keep YouTube clean? Let us know in the comments!