Seeing, but not believing: The threat from Deepfake videos
Both traditional and social media lit up earlier this year following the launch of a ground-breaking campaign for the charity Malaria No More, fronted by David Beckham.
In this 55-second video ad, the football star makes a passionate appeal for support. He makes it, however, in nine different languages.
No, David Beckham has not suddenly become a polyglot. Instead, his image has been subtly manipulated to perfectly lip-sync him with the voices of others by using a technology called “deepfake”.
It is this technology which is the most vivid harbinger of the next great cyber threat to our society.
Of course, changing faces in videos is not new. Actors were digitally recreated for films like The Matrix Reloaded and The Curious Case of Benjamin Button more than a decade ago, and synthetic characters like Gollum from The Lord of the Rings, pasted onto actors wearing motion-capture suits, are now ubiquitous in cinema.
But these are laborious, painstaking works of art, handcrafted at huge expense.
Over time, new technology has been developed to make this process easier and faster.
For instance, in 2011, computer scientists at Harvard University described a new method that could replace someone’s face in a video in hours rather than weeks – as long as numerous experts in image processing were on hand.
In recent years, however, machine learning has removed the need for such human skill and craft. Deep neural networks can now rapidly produce fake imagery that is utterly believable to our eyes.
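To give a rough sense of how this works, the classic face-swap recipe trains one encoder shared between two people and a separate decoder for each person; the swap is then simply a matter of pushing person A’s face through person B’s decoder. The sketch below only illustrates that idea – the framework (PyTorch), layer sizes and names are assumptions for illustration, not the pipeline behind any of the videos mentioned here.

    # Illustrative sketch of the shared-encoder / per-identity-decoder idea behind
    # early face-swap deepfakes. All sizes and names are assumptions, not any real tool.
    import torch
    import torch.nn as nn

    class Encoder(nn.Module):
        def __init__(self, latent=256):
            super().__init__()
            self.net = nn.Sequential(
                nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64x64 -> 32x32
                nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32x32 -> 16x16
                nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16x16 -> 8x8
                nn.Flatten(),
                nn.Linear(128 * 8 * 8, latent),
            )

        def forward(self, x):
            return self.net(x)

    class Decoder(nn.Module):
        def __init__(self, latent=256):
            super().__init__()
            self.fc = nn.Linear(latent, 128 * 8 * 8)
            self.net = nn.Sequential(
                nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8x8 -> 16x16
                nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16x16 -> 32x32
                nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32x32 -> 64x64
            )

        def forward(self, z):
            return self.net(self.fc(z).view(-1, 128, 8, 8))

    encoder = Encoder()                          # one encoder shared by both identities
    decoder_a, decoder_b = Decoder(), Decoder()  # one decoder per person

    # Training (not shown) minimises a per-identity reconstruction loss, e.g.
    # loss_a = F.mse_loss(decoder_a(encoder(faces_a)), faces_a), and likewise for B.
    # The "swap" then routes person A's face through person B's decoder:
    face_a = torch.rand(1, 3, 64, 64)            # stand-in for a cropped, aligned face
    swapped = decoder_b(encoder(face_a))         # A's pose and expression, B's appearance
    print(swapped.shape)                         # torch.Size([1, 3, 64, 64])

Because the encoder is shared, it learns the pose and expression common to both faces, while each decoder learns one person’s appearance – which is why the swapped output keeps the original expression but wears the other person’s face.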
Fake news
Videos produced in this way first appeared in 2017 and quickly became known as deepfakes.
Unfortunately – and perhaps inevitably – the first application of this new technology was in pornography. Deep learning was used to drape the faces of celebrities onto the bodies of adult movie stars all too realistically, to the understandable and considerable consternation of the victims.
Since then, experiments with new applications have opened more eyes to the potential use – and abuse – of this technology. For example, in 2018, political leaders like Barack Obama and Vladimir Putin were seen making speeches they never gave.
Then, early in 2019, actor Steve Buscemi’s face gave Jennifer Lawrence’s Golden Globe acceptance speech in a video broadcast on a US comedy show.
Beckham’s flawless malaria appeal – created by the London-based firm Synthesia – marks the moment that deepfakes moved from creepy gimmick to mainstream media form.
Lies to our eyes
This technology is also being recognised by authorities as a new kind of cyber risk.
Its danger lies not only in its power to vilify and humiliate individual victims, but also to attribute abhorrent actions and words to them. As such, deepfakes have the potential to undermine the foundations of our society and our democracy.
Successful societies are built on trust – trust in each other, trust in a counterparty to a transaction, trust in sources of information, trust in institutions. The most powerful basis of trust is personal experience, followed closely by the evidence of our own eyes.
So, while conspiracy theories have flourished online, fed by bogus information and doctored photos, up until now, videos have held the line against lies. Deepfakes will breach that line.
Arms race
Deepfakes also pose a threat at a national level. At a recent US Department of Homeland Security conference, NSA research director Dr Deborah Frincke observed that attacks on critical national infrastructure are increasingly coming to resemble contemporary psychological warfare.
One such target will be trust in the identity of others over the internet. Each day, millions of people receive video calls from their friends and colleagues via Skype, WhatsApp or FaceTime. One glance or one word is enough to convince them of the caller’s identity.
In many European countries, citizens validate their identity remotely to banks and healthcare providers by making a video call, talking to an operator, and holding up their passport or identity card to show that they are genuinely present. These interactions allow consumers to access services securely and efficiently.
Deepfake technology will undermine trust in such transactions, replacing it with unease and suspicion – the germs that eat a society from the inside.
Fortunately, the same technology that is creating this mayhem will also help contain it. For example, DARPA’s Media Forensics programme is using deep learning to probe for the invisible, tell-tale signs of spoofing buried in deepfake imagery, while machine learning analyses metadata footprints to detect traces of forgery. But germs are fought with medicines, and they subsequently adapt by developing resistance.
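At its simplest, automated detection of this kind can be framed as a binary classification problem: a network is trained on known-real and known-fake faces and outputs, for each video frame, a probability that it has been manipulated. The sketch below illustrates only that general framing – the architecture, sizes and threshold are assumptions for illustration, not DARPA’s system or any particular forensic tool.

    # Minimal sketch of frame-level deepfake detection as binary classification.
    # Architecture, sizes and threshold are illustrative assumptions only.
    import torch
    import torch.nn as nn

    class FrameClassifier(nn.Module):
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 128 -> 64
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 64 -> 32
                nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1),                                      # global average pool
            )
            self.head = nn.Linear(64, 1)  # single logit: how likely the frame is fake

        def forward(self, frames):
            return self.head(self.features(frames).flatten(1)).squeeze(1)

    model = FrameClassifier()
    # Training (not shown) would use labelled real/fake faces with nn.BCEWithLogitsLoss();
    # here we just score a dummy batch of frames.
    frames = torch.rand(4, 3, 128, 128)              # stand-in for cropped video frames
    fake_probability = torch.sigmoid(model(frames))  # per-frame score in [0, 1]
    print((fake_probability > 0.5).tolist())         # flag frames above a chosen threshold

In practice, forgers then train against whatever the detectors learn to spot, which is precisely the resistance problem the germ metaphor describes.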
Beckham’s video appeal is partly necessitated by the need to develop new malaria drugs to replace those losing their effectiveness. The world is in an arms-race with the plasmodium parasite. The parallels to deepfake synthetic video are therefore entirely apt.