It can’t be left up to social media companies to self police harmful content - Jo Stevens

Last week Sir David Amess was killed while he was doing the most important job of all for an MP - helping his constituents.

The suspect in the killing has been charged with murder and the preparation of terrorist acts. It is not surprising that, after the vicious killing of another MP in the course of their duties, we have seen an outpouring of concern about the spread of extremism online and the way in which our online discourse has been corroded.

This week Hope not Hate showed Labour an example of violent Islamism and far-right propaganda on TikTok which stayed up despite being reported to moderators.

This is just one of many examples of how social media companies have been given far too much time to prove again and again that self-regulation does not work.

Shadow Secretary of State for Digital, Culture, Media and Sport in the United Kingdom Jo Stevens addresses delegates on September 26, 2021 in Brighton, England. Picture: Leon Neal/Getty Images

Online lives are swamped with dangerous anti-vax content, anti-democratic slurs, racist and misogynistic abuse and more, all amplified and spread on social media. Keir Starmer described it as a cesspit this week and he’s right.

More than three years after promising “world leading” legislation, the Conservatives have finally published a draft Online Harms Bill, but it’s weak, watered down and embeds the failed and broken self-regulation in law.

Earlier this week Boris Johnson surprised me, and probably his own ministers and civil servants too, when he made two promises on the Bill. Labour has been calling for the Bill to include criminal liability for senior tech executives who breach the new law, so that it has real teeth and incentivises a culture change at the top of these powerful social media companies.

For months and months Conservative ministers have refused to agree to this, saying they wanted to wait until after the new law was in place and then review things. I'm glad the Prime Minister has now realised how farcical a position that is and accepted our arguments.

As the testimony from Facebook whistleblower Frances Haugen has shown, making money is all these companies care about. Facebook knew its app was damaging children’s mental health but chose to do nothing. Imagine if a drugs company was selling something that made kids sick or a toy firm was selling a dangerous product. They would rightly suffer severe consequences.

Another big change from the Prime Minister was his promise to bring the Bill before the House of Commons before Christmas so it can actually start its parliamentary journey to becoming law. Again this commitment only came in response to a question from Keir Starmer.

The UK has been waiting far too long for this legislation - I’m the first to say let’s crack on. But less than 24 hours later, that promise had fallen apart. Jacob Rees-Mogg refused to commit to the Prime Minister’s promise on Thursday.

Another element that's missing from the Conservatives' Bill is strong rules around legal but harmful content.

Too often social media companies say this content doesn't breach their rules, which is why we can't leave it up to them.

They fail to recognise that language can have different meanings in different parts of the UK.

While antisemitism, misogyny and Islamophobia are all too common across the country, there are particular bigotries that are more prevalent in particular places.

The scourge of sectarianism has no place in Scotland’s public life.

But tackling it means social media companies hiring people who have geographically specific knowledge to be able to deal with it.

We also need companies to make their products safer by design, including the algorithms that drive much of the traffic, so that we avoid the dangerous situation of social media companies hosting ever more extreme content.

We need a tough duty of care on the companies which covers all that abuse, because it can't be left up to social media companies to self-police.

They have already shown they can't, or won't, do it voluntarily.

There is also a concern that this Bill's two-tier approach to social media companies will actually embolden some of those with the most disturbing content.

It currently contains tougher rules for "category 1" sites like Facebook and Twitter but less stringent rules for smaller sites such as 4chan, Bitchute and Telegram. But as Hope Not Hate, the Antisemitism Policy Trust and others have warned, it is often the smaller platforms that host the most terrifying and dangerous content, with Twitter and Facebook acting as a shop window for more extreme material.

Labour believes this needs to be addressed because excluding these sites from stronger rules is playing into the hands of those who want to spread this dangerous extremism.

This Bill gives us a long-awaited chance to change our online space for good and Labour will work to stop the Conservatives wasting this opportunity.

Jo Stevens is Shadow Secretary of State for Digital, Culture, Media and Sport
