A new 'arms race': How the U.S. military is spending millions to fight fake images
Competing technology would automatically spot manipulated video
It's a video that looks convincing — former U.S. president Barack Obama speaking directly to a camera and calling current U.S. President Donald Trump "a total and complete dipshit."
But it never actually happened.
The video was produced by director Jordan Peele and BuzzFeed, with Peele providing Obama's voice, to warn people about an emerging technology that can make it seem as though people are saying or doing things they never did.
Convincing fake videos like that are just one reason a specialized team at the U.S. Department of Defense is investing tens of millions of dollars to develop competing technology that would automatically spot manipulated videos and images. The department says such manipulation technology can have an impact on national security.
Matt Turek, manager of the media forensics program at the department's Defense Advanced Research Projects Agency (DARPA), told CBC's The Fifth Estate that "in some sense it's easier to generate a manipulation now than it is to detect it."
Part of the agency's goal is to anticipate what they call "strategic surprise" and the impact technology will have on the world, Turek says. They came to the conclusion that the capability to manipulate images automatically and without skill "was probably going to arrive sooner rather than later."
Turek says the U.S. government's adversaries could be anyone at this point.
"Could be an individual, could be low resource groups, could be … more organized groups and nation states certainly. But I will point out that nation states have always had the capability to manipulate media."
Eager for a solution
DARPA's media forensics program is halfway through its four-year research mandate and has spent an estimated $68 million on this technology so far.
For digital forensics expert Hany Farid, a technological solution for spotting manipulated videos can't come fast enough.
Farid, a computer science professor at Dartmouth College in New Hampshire, is concerned about how technology that can manipulate video could potentially be misused.
"The nightmare situation is a video of Trump saying I've launched nuclear weapons against North Korea and before anybody figures out that it's fake, we're off to the races with a global nuclear meltdown," he says.
Farid doesn't think that's likely, but he also doesn't think it's out of the question.
"Certainly that technology exists today."
At DARPA's offices in Arlington, Va., Turek showed The Fifth Estate some examples of manipulated videos that DARPA's detection technology can spot.
In one example, two people appear to be sitting beside each other. But they never were. DARPA's detection technology picked up on inconsistencies in the lighting in the frame.
"You can actually see the sunlight reflecting off the back wall there, and then they were merged together to create this video," Turek says.
Another example was meant to mimic a surveillance video. In that case, DARPA's detection technology looked at motion information in the video and could automatically detect that part of it was missing.
"This frame's going to turn red at the places where the video was spliced, and so basically a series of frames was removed and that produces inconsistency in the motion signal, and that's what the automated algorithm can pick up on," says Turek.
It's not just videos. DARPA is analyzing still images, too.
In one example image, the detection technology spotted that not all of the pixels came from the same camera.
"There's sort of an outline of the airplane that you can see in this noise pattern, and so the computer can automatically pick up on that," says Turek. "Likely the airplane pixels come from a different camera than the rest of the scene."
A lot of skill
Farid says developing technology to spot fakes created by technology is an "arms race."
"The adversary will always win, you will always be able to create a compelling fake image, or video, but the ability to do that if we are successful on the forensics side is going to take more time, more effort, more skill and more risk."
While software has been released online that allows almost anyone to create manipulated video, Farid says it still takes a level of skill to develop a convincing fake using this kind of technology.
Eventually, he says, if researchers are successful in developing automated forensic technology to spot fakes, only a relatively small number of people will still be able to create them.
"That's still a risk, but it's a significantly less risk than we have today."
Farid says that in addition to developing technology to spot fakes, there could be another way to combat the spread of misinformation.
"We as consumers have to get smarter. We have to stop being so gullible. We have to get out of our echo chambers. We have to be more rational about how we digest and consume digital content online."
DARPA's media forensics program has a focus on the threat manipulated media could pose to national security.
The program would also help the U.S. military. Right now, verifying videos and images is a manual process: human analysts examine imagery such as foreign propaganda, while law enforcement agencies such as the FBI analyze video and imagery, including security footage.
The media forensics program would heavily automate that process, aiming to give analysts a tool that makes their jobs easier.
But for the general public, Turek says one of the biggest dangers these kinds of fakes could pose is the potential erosion of the idea that seeing is believing.
"I think we as a society right now have significant trust in image or video. If we see it then we have faith that it happened," he says. "And so the ability for an individual or a small group of people to make compelling manipulations really undermines trust."
Turek says that while manipulators may have the upper hand now, in the long term the detectors have the potential winning advantage "because we're coming at things from so many different angles."