CRACKING DOWN

Instagram, Facebook and YouTube to face massive fines for failing to weed out sick posts under Government plans

SOCIAL media giants Instagram, Facebook and YouTube are set to face fines of millions of pounds for showing harmful videos in a new Government crackdown.

Broadcasting watchdog Ofcom will be given new legal powers to police, investigate and fine social platforms for sharing or live-streaming “harmful” videos, including violence, child abuse and pornography.

Social media giants face a new Government crackdown over sharing harmful videos. Credit: Getty - Contributor

Sites must establish strict age verification checks and parental controls to ensure young children are not exposed to video content that “impairs their physical, mental or moral development”, it was reported.

If the conditions are not met, the regulator will be able to issue fines of £250,000 or an amount worth up to five per cent of a company’s revenues.

They will also have the power to “suspend” or “restrict” the tech giants’ services in the UK if they fail to comply with enforcement measures.

In the year to June, Facebook generated revenue of £51.5 billion, while YouTube amassed around £10 billion.

The crackdown is seen as an interim measure and comes ahead of the Government’s White Paper plans for a statutory duty of care to combat online harms.

The Culture Department said the new powers on videos were scheduled to come into effect in September next year, subject to a further consultation and a statutory order in Parliament.

The department said Ofcom also would have the power to suspend or restrict the entitlement to provide a service.

An Ofcom spokesman said: “These rules are an important first step.”

It may mean that Ofcom could take the role of “super-regulator” charged with policing the White Paper laws on online harms.

The move has come about due to “off-the-shelf” legislation set out in an EU directive that Britain and other countries agreed upon.

Importantly, the Audiovisual Media Services Directive extended regulation from TV and video-on-demand services to “video sharing” platforms.

PROTECTING CHILDREN

Andy Burrows, the NSPCC’s head of child safety online policy, said: “This is a real chance to bring in legislative protections ahead of the forthcoming Online Harms Bill and to finally hold sites to account if they put children at risk.

“The immediacy of live-streaming can make children more vulnerable to being coerced by abusers, who may capture the footage, share it and use it as blackmail.”

An NSPCC survey of 40,000 children aged seven to 16 found one in four had live-streamed, and one in eight had video-chatted with someone they had never met.

Of those who had live-streamed, one in 20 were asked to remove their clothes.

And four per cent of 11 to 17-year-olds had sent, been asked to send or received sexual content to or from an adult.

NSPCC chief executive Peter Wanless said last month: “The scale of risk children face on social networks revealed in this research cannot be ignored and tackling it with robust legislation needs to be a priority for this Government.

“Tech firms need to be forced to get a grip of the abuse taking place on sites.”

The charity says paedophiles contact large numbers of children on social media and encourage those who respond to move over to encrypted messaging or live streaming.

Damian Collins, chair of the culture committee, stressed that the White Paper must be "implemented in its entirety, so that the full extent of online harms can be tackled".

Firms must comply with “minimum standards” to protect minors from content damaging to their wellbeing, material that incites “violence or hatred” and criminal content such as child abuse images or terrorism.

Regulator Ofcom will use “appropriate information gathering” powers to order sites like Facebook or YouTube to hand over data or algorithms which many say drive content to vulnerable children.

What can tech firms do?

The NSPCC’s Wild West Web campaign has been calling for a regulator to require platforms to take proactive action to identify and prevent grooming on their sites by:

  • Using Artificial Intelligence to detect suspicious behaviour
  • Sharing data with other platforms to better understand the methods offenders use and flag suspicious accounts
  • Turning off friend suggestion algorithms for children and young people, as they make it easier for groomers to identify and target children

DEATH BY SOCIAL MEDIA

Some child exploitation and abuse experts have said that live-streaming has become a magnet for paedophiles.

Others point towards their influence in glorifying self-harm and suicide.

Molly Russell, a 14-year-old British schoolgirl, died after viewing scores of disturbing images on Instagram.

One of the images Molly viewed showed a blindfolded girl hugging a teddy bear, captioned: "This world is so cruel, and I don't wanna see it any more."

She was found dead just hours after handing in her homework - and packing a schoolbag for the next day.

Her devastating suicide note read: "I'm sorry. I did this because of me."

Molly - who went to Hatch End High School in Harrow, Middlesex - had started viewing the disturbing posts without her family's knowledge.

Her father, Ian Russell, told the BBC: "I have no doubt that Instagram helped kill my daughter. She had so much to offer and that's gone."

"She seemed to be a very ordinary teenager. She was future-looking. She was enthusiastic.

He said the images found on Molly's account show that Instagram "must look longer and harder".

He added: "The posts on those sites are so often black and white, they’re sort of fatalistic.

"Instagram needs to step up the consistency and quality of their moderation, to match their investment in technology that can proactively detect harmful content."

In January 2018, 11-year-old Ursula visited “suicide sites” on the social network before taking her own life in Halifax, West Yorks.

And in Malaysia, Davia Emilia, 16, jumped three floors to her death at her home in Sarawak in May after posting an Instagram poll asking whether she should live or die - 69 per cent of those who responded said she should die.

Molly Russell took her own life after viewing suicide posts on Instagram. Credit: PA:Press Association

Davia Emilia killed herself after posting a poll on Instagram asking if she should live or die

Tragic Ursula took her own life in January last year

LIVESTREAMING TERROR

Facebook received heavy backlash after failing to remove videos of the livestreamed Christchurch mosque terrorist killings.

And earlier this year Prime Minister Theresa May joined world leaders to demand tech giants wipe out “sickening” terror content from the internet.

Signing up to the ‘Christchurch Call’ at a major summit in Paris, she demanded Facebook do more after it was used by twisted white supremacist Brenton Tarrant to livestream his massacre.

Ms May said: “That 1.5 million copies of the video had to be removed by Facebook - and could still be found on YouTube for as long as eight hours after it was first posted - is a stark reminder that we need to do more both to remove this content, and stop it going online in the first place.”

She urged world leaders to be “ambitious and steadfast” in making sure social media is not “weaponised” by terrorists.

COMPANY PROMISES

In a statement last month, Facebook's safety chief Antigone Davis said: "Keeping young people safe on our platforms is a top priority for us.

"In addition to using technology to proactively detect grooming and prevent child sexual exploitation on our platform, we work with child protection experts, including specialist law enforcement teams like CEOP in the UK, to keep young people safe.

"99% of child nudity content is removed from our platform automatically."

YouTube also came under fire for failing to take action against far-right extremism and "drill" videos that incite gang violence.

Similarly, Instagram boss Adam Mosseri promised a crackdown on disturbing content.

Karina Newton, Instagram’s Head of Public Policy, said: “Keeping people who use Instagram safe is one of our biggest priorities, particularly the most vulnerable.

“This year we changed our policy and started building new technology to find and take action on more content, including removing it or adding sensitivity screens.

“We’re also stopping recommendations to make it harder to find, and offering people resources to help them get support.

“We’re making progress, but there is more to do.

"The new policy and technology has helped us address twice as much content, with an increase from 386k pieces of content actioned between January March - to over 834k between April and June."

YOU'RE NOT ALONE

EVERY 90 minutes in the UK a life is lost to suicide.

It doesn't discriminate, touching the lives of people in every corner of society - from the homeless and unemployed to builders and doctors, reality stars and footballers.

It's the biggest killer of people under the age of 35, more deadly than cancer and car crashes.

And men are three times more likely to take their own life than women.

Yet it's rarely spoken of, a taboo that threatens to continue its deadly rampage unless we all stop and take notice, now.

That is why The Sun launched the You're Not Alone campaign.

The aim is that by sharing practical advice, raising awareness and breaking down the barriers people face when talking about their mental health, we can all do our bit to help save lives.

Let's all vow to ask for help when we need it, and listen out for others... You're Not Alone.

If you, or anyone you know, needs help dealing with mental health problems, the following organisations provide support:

  • CALM, 0800 585 858
  • Heads Together
  • Mind, 0300 123 3393
  • Papyrus, 0800 068 41 41
  • Samaritans, 116 123
Some child exploitation and abuse experts have said that live-streaming has become a magnet for paedophiles. Credit: Getty - Contributor

It has been found that Instagram still contains scores of sick self-harm posts. Credit: Alamy