‘Fake news’ – information or news proven to be verifiably false or misleading – has become a major global concern.
As news and opinion pieces are increasingly shared with readers via online and social media channels, the speed of their dissemination has increased dramatically, as have the challenges of regulating third-party messaging presented as news and opinion.
These challenges, coupled with increasingly polarised electorates, have led to documented instances of manipulation of electoral processes in the UK and the United States, and to content being linked directly to the incitement of violence in countries such as India and Myanmar.
The demand for tougher regulatory intervention on fake news is compelling, whether such content forms part of a deliberate attempt to influence people’s opinion (political or otherwise) or merely serves to increase profitability through reader clicks.
As a result, fake news has become a major threat to democracy, to political stability, and to user confidence in the wider internet ecosystem. The speed at which false information can spread globally poses a serious challenge for governments and regulatory frameworks, and regulators worldwide are scrambling to implement measures to address the problem. The sheer size and reach of the online platforms involved only compound the difficulty.
We are at the start of a long journey, and regulators are under increasing pressure from government to move more quickly to address these issues. The major challenges lie in resolving the material issues around censorship; levelling regulation in an increasingly converged digital world; policing the process fairly and effectively; and protecting the regulatory process from undue political influence by using effective agencies that can secure the trust of all stakeholders.
A slew of recent examples has been widely reported in the UK media, underlining how far fake news or false information has spread and the impact it has had. One prominent example concerned the EU referendum, where online channels were found to have been used to serve targeted, fake information designed specifically to undermine institutions and incite anti-EU sentiment. This closely mirrors the allegations against Facebook over its role in helping to spread fake news during Donald Trump’s successful US Presidential campaign.
The Cairncross Review, which reported in 2019, investigated the UK news market and the role of digital search engines and other online platforms. Among other things, it recommended that the online platforms’ efforts to improve their users’ news experience be placed under regulatory supervision, the aim being to ensure oversight of the steps the platforms have agreed to take to improve both the quality and the accuracy of the news content they serve.
In addition, Parliament’s Digital, Culture, Media and Sport (DCMS) Committee has been investigating issues around fake news: its nature and origin, its impact, and the responsibilities of social media platforms and search engines in curbing it. In February 2019 the Committee issued its final report, calling for, among other things: a compulsory Code of Ethics for technology companies, overseen by an independent regulator with powers to take legal action against companies that breach the code; reform of the current electoral communications laws and of the rules on overseas involvement in UK elections; and powers to compel social media companies to take down known sources of harmful content, including proven sources of disinformation.
The UK Government has also established the National Security Communications Unit, tasked with “combating disinformation by state actors and others”.
Without some form of intervention, the problems caused by the spread of online falsehoods will only increase. Critics of increased regulation, however, warn that governments could use such laws to suppress free speech and to monitor citizens more easily. Governments also face other pressures when legislating, given the size of many online platforms, the economic power they wield, and their ability to direct or withdraw inward investment. Like discerning misinformation itself, the choices about how to regulate and legislate are not easy, but measures will need to be taken. Critical will be balancing fair and effective review and oversight of the discretionary powers required against the need for speedy and effective intervention. While the models and the problems may be new, the legal challenges around due process and effective redress are not. The UK now has a golden opportunity to create a regulatory framework that sets the standard globally.
Gordon Moir is a Partner, Shepherd and Wedderburn