“This is their information, they own it… they want to share it with only a few people,” the baby-faced 25-year-old says. “I think a lot of the reaction we’ve seen is more a reflection of how much people care about Facebook, and what their relationship is with it, and how much they trust us.”
Mark Zuckerberg was already a billionaire when he gave an interview to the BBC about privacy in 2009, but in many ways the callow young man in the clip that resurfaced last week wasn’t the icon of today.
The social network he founded had just 300 million users, compared with 2.2 billion now; he wasn’t yet the subject of a Hollywood biopic; he hadn’t launched the biggest stock market listing by a tech company in US history.
Wearing a shirt and tie rather than his uniform of a sludgy t-shirt and slacks, in a cramped TV newsroom rather than on stage at a global tech conference, it’s as if Zuckerberg and Facebook were still trying to fit into the world, rather than towering over it.
A lot has changed. Last week the founder of the world’s biggest social network struck a different tone. “We have a responsibility to protect your data, and if we can’t then we don’t deserve to serve you,” he posted on Wednesday to his 105 million Facebook followers, and global media that had accused him of going to ground. “This was a breach of trust… between Facebook and the people who share their data with us and expect us to protect it. We need to fix that.”
The story of how the data of up to 50 million Facebook users was weaponised by a British political consultancy on behalf of the American right to help elect Donald Trump is a complex web of chance meetings and introductions at the frontiers of tech, politics and psychology. But it is ultimately about trust: in a brand, in how democracy is conducted, and in a way of living online that it’s easy to forget has existed for less than 20 years.
Revelations in the past week have wiped $50bn off Facebook’s value and triggered summonses for Zuckerberg to give evidence to parliamentarians on both sides of the Atlantic; the growing backlash against the company even has its own hashtag, #deletefacebook. And, according to campaigners on one side of the Brexit debate, they raise serious questions about the conduct of the campaign for the UK to leave the EU, threatening to undermine faith in the result of the 2016 referendum itself.
If the details seem too remote and technical to care about, consider this: in 2005, the year I started university, Facebook launched in the UK, its first overseas market. Anyone my age or younger has had their entire adult life (and a good chunk of their childhood) played out on social media.
I skipped Myspace, didn’t join Twitter until 2009 and have a grand total of four photos on Instagram, so for me Facebook has been the repository for the most intimate parts of my life online – photographs, friendships, my film and music preferences, and messages both public and private.
Like all social networks, Facebook keeps this information on file. If you have a Facebook profile, it’s easy to download your own archive. Doing so offers a sobering lesson in how much we share with social media giants, and how much their business model relies on selling us, their product, to their real clients: advertisers.
Scanning my own file confirms I’m a current affairs junkie, signed up for updates from The Scotsman, the Guardian, the Spectator and the Economist. Facebook knows where I’m from, so it bombards me with ads for money transfer services featuring Canadian expats in the UK. The fitness apps I’ve given permission to access my location know where I’ve been (or more frequently, not been) jogging. Facebook knows which internet dating apps I’ve used. It knows I once flew to South America with KLM. From scanning which products, brands and artists I’ve “liked”, it has targeted me for advertisers promoting gin, indie bands, newspapers and My Big Fat Greek Wedding. It’s a self-portrait I barely knew existed, but I painted it, so I can hardly argue.
How that kind of information ended up being used to exploit American voters’ deepest fears starts, bizarrely, with a Canadian high school dropout interested in fashion and liberal politics.
Aged 14, Chris Wylie successfully sued the British Columbia education ministry over its discrimination policies. Fiercely bright and resolutely nonconformist, by the age of 17 he had drifted to Ottawa and was interning for the leadership of the Canadian Liberal Party. There, he met Ken Strasma, who had just helped put Barack Obama into the White House, and who introduced Wylie to the power of data in politics. Wylie taught himself computer coding and became interested in the emerging field of psychometrics, the study of how people’s personality traits can be mapped to predict their reactions and preferences in any area of life.
In 2013, researchers at the Cambridge University Psychometric Centre published a paper that caught Wylie’s attention. One of its authors was David Stillwell, who had used a Facebook feature introduced in 2007 to develop psychological quizzes that sorted participants into the “Big Five” personality categories – openness, conscientiousness, extraversion, agreeableness and neuroticism. The quizzes were harmless fun and went viral, but the apps that powered them also asked users whether they would consent to handing over data from their Facebook profiles, including their “likes”. In total, 40 per cent did so.
By the time the findings were published, Wylie was studying at the London School of Economics and trying to use his ideas to help the Liberal Democrats. But his PhD was in fashion trend forecasting, and he would later tell the New York Times that his interest was “fashion, not fascism”.
Then he was introduced to Alexander Nix. The Old Etonian former banker was chief executive of a company called SCL Elections, part of a larger group specialising in research and communications that it touted as “psychological warfare” tools for “hearts and minds” campaigns.
Nix headhunted Wylie with a promise of “total freedom” to experiment with his ideas, believing they could be applied in the lucrative American elections market where political spending in the 2016 election cycle topped $6.5bn. He was given his opening with a recommendation to two key figures considered kingmakers of right-wing American politics.
One was Robert Mercer, a billionaire computer scientist; the other was Steve Bannon, one of the architects of the Trump presidency who was at the time looking to bring his influential right-wing blog Breitbart to the UK. With funding from Mercer, Bannon on the board, Nix in charge and Wylie handling the data, SCL Elections was effectively relaunched as Cambridge Analytica.
At first, the company lacked the tools to deliver Bannon’s ambition of reshaping American politics. Using existing means – polling, knocking on doors and asking people who are already supporters for their information – you could collect enough data for a US state-wide election, but replicating that on a national level was impossibly difficult and expensive. But Wylie knew of a data set that was big enough to build a US presidential campaign around: Stillwell’s research using Facebook.
Stillwell would not hand over his data, so Cambridge Analytica turned to another Cambridge academic, Aleksandr Kogan, whose own personality-quiz app replicated the technique – harvesting information not only from the roughly 270,000 users who installed it, but from millions of their Facebook friends as well. That data should never have been passed to a third party in the shape of Cambridge Analytica. Kogan broke cover last week to claim he was given assurances the project was legal, and has been made a “scapegoat” by both the recipients of the data and by Facebook. He says he has made no financial gain, but like Cambridge Analytica and Wylie, Kogan was last week banned from Facebook.
Wylie has described how the information was used to trial messages for the American “alt-right” which found their way into Trump’s campaign slogans. In testing, voters identified as “neurotic” were shown images of people illegally scaling US border fences, and given messages about the dangers of the “deep state”. New York Times journalist Matt Rosenberg, who has been shown the material produced by Cambridge Analytica using the data, said last week: “Most of it was fear based. I didn’t see a whole lot of hope-based [messages].”
The company denies ever having worked for Leave.EU, the unofficial campaign in the 2016 Brexit referendum. But another Cambridge Analytica whistleblower, the former business development director Brittany Kaiser, claimed last week that Nix misled MPs when he said the company had “not undertaken any paid or unpaid work for them”.
Nix was suspended last week after being caught in a hidden-camera sting offering “honey trap” tactics to prospective clients. Wylie, for his part, had left Cambridge Analytica at the end of 2014, disillusioned. He has said one set of prospective clients asked if the data could be used to identify homosexuals, so they could be targeted with messages promoting gay “cures”. It was a long way from predicting fashion trends.
But like any information downloaded from the internet, the data could not simply be recalled: it remained with Cambridge Analytica. When Facebook was contacted by Guardian journalists in 2015, it asked Cambridge Analytica to certify that the data had been deleted – but it did not contact those whose profiles had been harvested in Kogan’s study. Cambridge Analytica insists it complied with the request, but that does not appear to have satisfied the UK’s data protection watchdog, which in May last year launched an inquiry into how personal data was used in the EU referendum. In an unprecedented show of force, investigators from the Information Commissioner’s Office entered Cambridge Analytica’s offices on Friday night, but a five-day wait for a warrant has led critics to question the integrity of any evidence seized. Bannon has denied any involvement in Facebook data mining at Cambridge Analytica, but said last week: “Facebook data is for sale all over the world.”
With accounts from Wylie and other whistleblowers now in the open, Zuckerberg has promised that Facebook will tighten its rules further and inform those whose data was harvested. The man credited with inventing the World Wide Web, Sir Tim Berners-Lee, has called the controversy “a serious moment for the web’s future”.
“But I want us to remain hopeful,” he said. “The problems we see today are bugs in the system. Bugs can cause damage, but bugs are created by people and can be fixed by people.” Whether fixing the bug can rebuild lost trust remains to be seen.