The suggestion that Moscow will troll Nicola Sturgeon because of her strong backing for Theresa May's anti-Kremlin stance? The alleged airbrushing of Corbyn's hat on a Newsnight backdrop to make it look more Russian? The ever-weakening health of the father and daughter at the centre of the Salisbury nerve agent attack? Alex Salmond under more pressure over his RT programme?
Vladimir Putin rolling to another sizeable victory in yesterday's presidential election? Or the hard-to-digest story published in the Observer suggesting that the personal information and preferences of 50 million people were harvested from Facebook accounts without permission, and may have been used to subvert democracy in America and Britain and to influence election results?
According to whistleblower Christopher Wylie, the Facebook data was originally obtained for what appeared to be genuine academic research into fashion trends and social preferences. The Cambridge academic Aleksandr Kogan developed a Facebook app featuring a personality quiz, and Cambridge Analytica paid people to take it. The app recorded their answers, but it also collected data from each quiz-taker's Facebook account, and from their Facebook friends' accounts as well, to identify individual patterns and build an algorithm predicting results for other Facebook users. Friends' profiles provided a testing ground to fine-tune the algorithm, and because each paid test-taker had to have a Facebook account and be an American voter to qualify for payment, tens of millions of harvested profiles could instantly be matched to electoral rolls, making the data a politically valuable resource. From an initial trial of 1,000 "seeders", the researchers obtained 160,000 profiles; eventually a few hundred thousand paid test-takers would allow data to be extracted from millions of voters. All without the knowledge or permission of those taking the quiz, or of their entirely unsuspecting friends.
Cambridge Analytica (CA) then created voter personality profiles to target individual Facebook users with tailored political messages on behalf of Donald Trump during the presidential campaign, an effort that netted the firm $6.2 million. Back then the company, owned by the hedge fund billionaire Robert Mercer and formerly headed by ex-Trump adviser Steve Bannon, boasted that its "psychographic" profiles could predict the personality and political leanings of every adult in the United States. It went on to claim a major involvement in the EU Leave campaign. The journalist Carole Cadwalladr has been tracking the story for two years: "It's an incredible revelation. Our intimate family connections, our 'likes', our crumbs of personal data [are] all sucked into a swirling black hole that's expanding and growing and is now owned by a politically motivated billionaire."
Of course Facebook insists it is clean, because it simply approved the use of the data for academic "in-app improvements" and was assured it would then be destroyed. It isn't Facebook's fault if that didn't happen.
So should we be worried? After all, data protection rules will toughen in mid-May when the EU's new General Data Protection Regulation (GDPR) takes effect, and those rules will survive Brexit. Under GDPR, "silence, pre-ticked boxes or inactivity" will no longer imply consent. Companies will have to explain clearly why they are collecting personal data, and explicit consent will be needed if it is to be made available to third-party providers (like Facebook, Google Analytics or telemarketing companies). Users will also have the right to access their own personal data and to request its removal. So will this be enough to make the online environment secure? No it won't.
There's an old adage: if it's free, you are the product. And uniquely on Facebook, so are your friends. Facebook makes money by allowing other websites to access the wealth of choices, likes, browsing history and advert clicks made by millions of its users, through tracker numbers stored as cookies on their computers. Companies then use Facebook's data to create personally targeted adverts. These days, most of us wearily accept that our browsing and shopping preferences drive the customised adverts we see. But we may not know how much further data is derived from our responses, or from our friends answering innocent-looking Facebook surveys. In the words of one media analyst: "Companies are fine-tuning your reactions like lab rats and selling on the back of it." Which is slightly creepy – but not illegal.
The real furore is over the suggestion that Facebook has entered the political domain by allowing customised political messages to be sent and monitored based on the likes and preferences of millions of users. Of course, any misuse occurred once the data was handed over to Cambridge Analytica – but did Facebook simply fail to take precautions, or was there more to it?
In the USA, Robert Mueller's investigation has traced the first Russian efforts to disrupt the US election to a meeting at which Cambridge Analytica outlined its datasets, capabilities and methodology to a Russian oil company. According to material leaked by Christopher Wylie, the presentation had little to do with "consumers" and instead focused on election disruption techniques. On the strength of this, Adam Schiff, the top Democrat on the US House intelligence committee, wants Facebook CEO Mark Zuckerberg to appear before the committee again to explain how his company came to provide private user information to an academic with links to Russia.
Meanwhile Damian Collins, chairman of the Digital, Culture, Media and Sport Committee, wants Zuckerberg over here to explain whether the Brexit vote was similarly affected and potentially compromised.
That could prove a tad embarrassing for the Prime Minister because, back in 2016, Theresa May was reportedly set to deploy Cambridge Analytica as "an army of computerised 'mind-readers' to help her win the next election". Veteran US pollster Frank Luntz said at the time: "They have figured out how to win. There are no longer any experts except Cambridge Analytica." Mind you, the miscalculated snap election suggests there may be serious limitations to polling predictions based on mined data alone.
So has Facebook been careless and sloppy – or something worse? Can the company be trusted to act honourably with massive sets of personal data, or not?
If Westminster doesn’t haul Zuckerberg over, individual voters must decide whether to keep using Facebook or quit the world’s most addictive bit of social media until a new, more ethical and transparent platform comes along. Amidst the current international panic, who saw that coming?