Within ten weeks, Dr Kris Van Kerckhoven’s conspiracy theory linking 5G with the coronavirus outbreak led to arson attacks and seemingly attracted public endorsement from Hollywood stars.
John Cusack wrote on Twitter: “5G wil be proven to be very very bad for people’s health” (sic).
Mr Cusack might be a fine actor, but he is not a medical expert.
NHS England’s national medical director Professor Stephen Powis is, though, and he rightly said claims about 5G are “complete and utter rubbish” and “the worst kind of fake news”.
The spread of disinformation during the coronavirus crisis has been alarming.
We are not only fighting a global pandemic, we are fighting a virus of fake news.
No, drinking more water will not flush out the infection, nor will gargling with saltwater.
No, garlic won’t protect you.
No, and this should really go without saying, risking your life by drinking bleach will not wipe out coronavirus.
Last week, regulator Ofcom revealed that almost half of UK online adults had seen false or misleading information about coronavirus. Among people who have been exposed to falsehoods, two thirds encounter them every day.
And 40 per cent of people are finding it hard to know what is true or false about the virus.
Everyone has a duty to prevent the spread of disinformation.
Last year, the Open Knowledge Foundation campaigned for improved transparency from large social media companies about tackling fake news and disinformation. This was aimed at political adverts, but the same principle applies to Covid-19.
There is still not enough transparency about what is being taken down and why. Companies such as Twitter and Medium have rushed out updates to their rules while relying more heavily on automated systems, because their content moderators cannot work from their usual offices.
The tech giants have a responsibility, but so too do governments. There is a pressing need for governments to come together and agree international legislation covering what should and shouldn’t be allowed on these platforms, otherwise the platforms will continue to make up the rules.
Disinformation was discussed at a European Council summit last year, but the EU must make this a priority when normal politics resumes.
And there is a role for each and every one of us as well. If you see something being shared on Facebook that is demonstrably false, don’t share it – call it out.
In the same way that we are all washing our hands and social distancing to prevent the spread of the virus, so too can we help prevent spreading the virus of fake news.
The best way to tackle disinformation is to make information open, allowing journalists and researchers to provide facts to the public.
Newspapers have never been more important.
Scientists and researchers have never been more important.
Open data is a force for good in these troubled times, and ultimately this is what will lead to a vaccine.
A large number of companies, organisations and groups have already signed an Open Covid Pledge to remove barriers to the use of intellectual property.
This will allow experts to use otherwise inaccessible technology and content, with the potential to help end the pandemic and mitigate its effects.
This approach lies at the heart of the Open Knowledge Foundation’s campaign for a fair, free and open future.
In recent years, the acceptance of basic facts has eroded, with expert views dismissed and a culture of “anti-intellectualism” fostered by those on the extremes of politics. I hope the response to this international emergency is that people who should know better stop their anti-expert rhetoric.
There are tough times ahead, but we will get through them by spreading facts, not fiction.
Catherine Stihler is Chief Executive of the Open Knowledge Foundation