Tech firms in UK to be legally required to protect users from harmful content

Campaigners have said plans for social media companies to be legally required to protect their users from harmful content, with senior management held personally liable for any failings, will make the internet a safer place for children.

A government White Paper on online harms published today sets out plans for a regulator to ensure internet companies including social media platforms abide by a mandatory duty of care to their users.

Charities called for the reforms to be swiftly implemented amid widespread concern about the risk of sexual grooming online and the growing toll on young people’s mental health from harmful content.


Paul Masterton, the East Renfrewshire Conservative MP who has campaigned for a duty of care for internet companies since the suicide of a teenage constituent targeted by online bullies, said the measures “have the potential to make a huge difference”.

Facebook's CEO Mark Zuckerberg leaves the Elysee presidential palace, in Paris, on May 23, 2018 following a meeting with the French President on the day of the "Tech for Good" summit. Picture: ALAIN JOCARD/AFP/Getty Images

The joint proposals from the Home Office and the Department for Digital, Culture, Media and Sport (DCMS) will require firms to take more responsibility for the safety of users and more actively tackle the harm caused by content or activity on their platforms.

The regulator will have the power to issue “substantial fines, block access to sites and potentially impose liability on individual members of senior management”, the proposal says.

The government is currently consulting on whether to create a new regulator or use an existing one, such as Ofcom, to enforce the new rules.

A number of charities and campaigners have called for greater regulation to be introduced, while several reports from MPs and other groups published this year have also supported the calls for a duty of care to be implemented.

Prime Minister Theresa May said the proposals were a sign that the age of self-regulation for internet companies was over.

She said: “The internet can be brilliant at connecting people across the world – but for too long these companies have not done enough to protect users, especially children and young people, from harmful content.

“That is not good enough, and it is time to do things differently.


“We have listened to campaigners and parents, and are putting a legal duty of care on internet companies to keep people safe. Online companies must start taking responsibility for their platforms, and help restore public trust in this technology.”

Mr Masterton backed calls for a statutory duty of care after the death of Ben McKenzie, an Eastwood High School pupil who took his own life in October following what Mr Masterton called “cruel online threats and bullying on social media and his mobile phone”.

Mr Masterton said yesterday: “There are lots of positive measures here which if followed through and done properly have the potential to make a huge difference in making a safer environment for our kids online.

“But the White Paper is just the first step. The government mustn’t let this vital work slip down the agenda – it’s time to prove to tech companies we are serious about making them face up to their responsibilities.”

Meanwhile, Labour’s shadow culture secretary, Tom Watson, warned that the measures “could take years to implement”, saying: “We need action immediately to protect children and others vulnerable to harm.”

He added: “These plans also seem to stop short of tackling the overriding data monopolies causing this market failure and do nothing to protect our democracy from dark digital advertising campaigns and fake news.

“This is a start but it’s a long way from truly reclaiming the web and rooting out online harms.”

The proposed new laws will apply to any company that allows users to share or discover user-generated content or interact with each other online, the government said. They will cover companies of all sizes, from social media platforms to file-hosting sites, forums, messaging services and search engines.


It also calls for powers to be given to a regulator to force internet firms to publish annual transparency reports on the harmful content on their platforms and how they are addressing it.

Companies including Facebook and Twitter already publish reports of this nature.

Last week, Facebook boss Mark Zuckerberg told politicians in the Republic of Ireland that the company would work with governments to establish new policies, in an effort to regulate social media.

The Home Secretary, Sajid Javid, said tech firms had a “moral duty” to protect the young people they “profit from”. He said: “Despite our repeated calls to action, harmful and illegal content, including child abuse and terrorism, is still too readily available online. That is why we are forcing these firms to clean up their act once and for all.

“I made it my mission to protect our young people and we are now delivering on that promise.”

A 12-week consultation on the proposals will now take place before the government publishes its final proposals for legislation.

Peter Wanless, chief executive of children’s charity the NSPCC, which has campaigned for regulation for the past two years, said the proposals would make the UK a “world pioneer” in protecting children online.

He said: “For too long social networks have failed to prioritise children’s safety and left them exposed to grooming, abuse, and harmful content.


“It’s high time they were forced to act through this legally binding duty to protect children, backed up with hefty punishments if they fail to do so.”

And Barnardo’s chief executive Javed Khan said the measures were “an important step in the right direction”.

He said: “We particularly welcome proposals for a new independent regulator, which should ensure internet bosses make the UK one of the safest places in the world for children to be online.

“It’s only right that tech companies are penalised if they fail to keep children safe and protect them from harmful and illegal content that leads to sexual abuse and child criminal exploitation.”

However, ex-culture secretary John Whittingdale said ministers risked dragging people into a “draconian censorship regime” in their attempts to regulate internet firms.

The Conservative backbencher wrote in a Sunday newspaper: “Countries such as China, Russia and North Korea, which allow no political dissent and deny their people freedom of speech, are also keen to impose censorship online, just as they already do on traditional media.

“This mooted new UK regulator must not give the despots an excuse to claim that they are simply following an example set by Britain, where civil liberties were first entrenched in Magna Carta 800 years ago,” he added.

And Daniel Dyball, UK executive director at trade body the Internet Association, warned that the current scope of the proposals was “extremely wide”, which could hinder their implementation.


He said: “The internet industry is committed to working together with government and civil society to ensure the UK is a safe place to be online. But to do this, we need proposals that are targeted and practical to implement for platforms both big and small.”

Responding to the proposals, Facebook’s UK head of public policy Rebecca Stimson said: “We have responsibilities to keep people safe on our services and we share the government’s commitment to tackling harmful content online.”