Technology that can change what people say and do on video is here, and it is scaring both tech experts and politicians. Falsified video has a wide range of potential applications, from standard media editing to video art, humor, online bullying and abuse, and fake news.
“The idea that someone could put another person's face on an individual's body -- that would be like a home run for anyone who wants to interfere in a political process,” Virginia Senator Mark Warner told CBS. Here is an example of just that, in which Kate McKinnon in a Saturday Night Live clip is given the face of Hillary Clinton:
Professor of Computer Science Hany Farid runs a lab at Dartmouth College that focuses on exposing digital fakes in various media, including video. He told CBS we are “not ready” to deal with fake video:
I think the nightmare situation is a fake video of a politician saying, 'I have launched nuclear weapons against a country’ … We have a 'fake news' phenomenon that is not going away. And so add to that fake images, fake audio and fake video, and you have an explosion of what I would call an information war.
In March 2018, a doctored video of Parkland school shooting survivor Emma Gonzalez went viral; it showed her ripping up the U.S. Constitution, when what she actually tore was a shooting target poster.
“Fake news” became a common phrase in 2017 and 2018, but it is used differently by different people. It sometimes refers to unverified or falsified news, such as the fake stories and social media accounts spread particularly on Facebook during the 2016 election, some of which were linked to the misuse of personal data by the firm Cambridge Analytica.
“It's clear now that we didn't do enough to prevent these tools from being used for harm as well. That goes for fake news, foreign interference in elections and hate speech, as well as developers and data privacy,” Facebook CEO Mark Zuckerberg said when he testified before a joint Senate Committee regarding these topics.
The term fake news is also often used by President Trump and others to characterize what they see as the liberal bias and unreliability of the “Fake Mainstream Media.”
“There is no Democrat or Republican that would be safe from this kind of manipulation. But, boy oh boy, we need as a country to get our act together,” Warner says of altered video tech.
The YouTube clip above comes from an account called “derpfakes.”
“Fun with deep learning and neural networks. For memes, obviously,” the Twitter profile reads.
The username is likely a play on the term “deepfakes,” which combines the words "deep learning" and "fake" to describe AI-driven image synthesis techniques. The name "deepfakes" was originally used by a redditor who swapped faces in pornography videos -- another problematic and abusive use of this technology: the creation of fake pornography and fabricated “revenge porn.” A trend began when that account started putting celebrities’ faces into such videos; the account has since been banned from Reddit.
There is now an app called “FakeApp” that makes the process so easy, it is being used to create fake explicit videos of everyday people. This ranges from what some consider a practical joke played on a friend to a form of revenge or nonconsensual porn, in which personal sexual media is shared without a person’s consent. Mary Anne Franks of the University of Miami Law School told Wired that while this type of attack may be hard to prosecute under current nonconsensual porn laws, which were not designed to cover false images, anti-defamation statutes may protect victims.
“You can make fake videos with neural networks today, but people will be able to tell that you’ve done that if you look closely, and some of the techniques involved remain pretty advanced. That’s not going to stay true for more than a year or two,” Peter Eckersley, chief computer scientist for the Electronic Frontier Foundation, told Motherboard.
Reddit and Facebook told CBS they are working on regulating false video, and Warner has called on major tech companies to cooperate with Congress to curtail false news in all its forms.