Deepfake videos take fake news to dangerous new level – Martyn McLaughlin

A screenshot of a deepfake video – in which Steve Buscemi's face speaks with Jennifer Lawrence's voice on her body – that was designed to show how convincing the footage can be
The rise of ‘deepfakes’ could open up a new front in the disinformation war, writes Martyn McLaughlin.

If you have not yet watched the footage of Steve Buscemi, wearing a red, floor-length Dior Haute Couture gown and 156 carats of bling around his neck in the form of a string of Chopard diamonds, it is strangely compelling viewing.

In it, the actor speaks of his surprise at winning a Golden Globe and discusses his outfit choice for the evening, his expression veering seamlessly from one emotion to the next. Were Buscemi’s figure not already a clue as to something strange afoot, another obvious and disquieting anomaly becomes clear with the volume turned up: he speaks not with his voice, but that of Jennifer Lawrence.


The viral video, which has been viewed more than a quarter of a million times, was a novelty project on the part of its creator, designed to showcase how easily – and convincingly – genuine footage can be manipulated using readily available and relatively affordable software.

Complex in terms of the deep-learning algorithms at play, the process has a simple outcome: taking two people, it maps their eye movements, mouth details, face contours, and even the way they blink, before swapping one person’s face for the other.

The artificial intelligence technique, known as a deepfake, first gained prominence via a disturbing subculture of fake sex videos, where prominent female celebrities such as Scarlett Johansson and Emma Watson were the victims.

Only last month, Johansson said her legal attempts to curb the proliferation of deepfakes featuring her likeness had proven to be a “useless pursuit”, given how “the internet is a vast wormhole of darkness that eats itself”.


One video, falsely described as real “leaked” footage of the actor, has been watched on a popular porn site more than 1.5 million times. Johansson has been powerless to prevent the scourge from spreading due to the minefield of copyright laws worldwide. Short of bringing about a level legislative playing field worldwide and giving regulators some teeth, she and others have no option but to play a never-ending game of legal whack-a-mole, a course of action only open to those with deep pockets.

But as deepfaking becomes more widespread, with dedicated desktop tools now in circulation, it is becoming increasingly obvious that it is not only high-profile celebrities who will find themselves targeted. All of us stand to suffer.

What has been the plaything of those intent on creating deliberate hoaxes and fake porn has the power to become a weaponised Frankenstein technology for use in the disinformation war being waged online.

Until now, the majority of politicians who have featured in deepfakes have formed part of what might be subjectively described as humour-based footage. Only a few days ago, a video emerged of Mr Bean’s face superimposed on Donald Trump’s body, predictably ranting about Mexicans.


Our democratic institutions are sufficiently resilient to withstand such substandard attempts at satire. But what will the response be if and when they are subjected to something altogether more nefarious? Among a slew of fabricated stories to circulate in Brazil in the run-up to last October’s presidential elections was a video appearing to show Lula da Silva, the country’s former president, making a series of unpatriotic remarks. The footage was fairly crude in execution, but it still managed to go viral. What is most concerning about that example is not the proficiency of the video, but the way in which it was disseminated via WhatsApp, where a remarkable 49 per cent of the Brazilian population now turn to get their news and current affairs.

That is in part an indictment of the state of the Brazilian media, yet it also speaks of a wider malaise caused by the fake news which sprang up around the likes of the 2016 US elections, much of which was infamously spread via Facebook.

As a result, the world’s biggest social media platform has seen its influence as a news outlet wane as it scrambles to restore credibility. The latest Reuters Institute Digital News Report, which scrutinises the habits of over 74,000 online news consumers in 37 countries including the US and UK, shows that the proportion of people using Facebook as a source of news is falling, a trend which is true of all social media providers in general.

The beneficiaries, as shown in Brazil, are messaging apps such as WhatsApp, which allow people to shun established sites and instead rely on friends and family for their diet of news. Over the past four years, the average usage of WhatsApp for news has more than doubled to 16 per cent.

If the motivation behind such a trend is a general mistrust of social media and news outlets, it leaves millions of supposedly judicious news consumers at risk of further manipulation. With no checks or balances available in an encrypted, peer-to-peer platform, the question of what is real and what is fake may not even be considered.

It is only a matter of time before the trickle of deepfakes becomes a spate, each video marginally more convincing than that which went before. The majority will no doubt be occupied with sex and celebrity, but when these tools of harassment are repurposed so as to skew our political discourse at a time when the public is acting as its own gatekeeper, the threat will be considerable. After all, how do you convince people who don’t believe in anything?