When seeing isn't believing: using AI to create 'Deepfakes'

The latest in artificial intelligence allows creators to swap and meld video, which can be used for everything from a fun camera filter to creating dangerous fake news.
8:08 | 12/11/18

Video Transcript
Transcript for When seeing isn't believing: using AI to create 'Deepfakes'
We're entering an era in which it can look like our enemies are saying anything, anytime.

Reporter: This looks like President Obama. And sounds like him, too. But it isn't.

You see, I would never say these things, at least not in a public address. But someone else would. Someone like Jordan Peele.

Reporter: This is a deepfake, a video created with artificial intelligence.

I'm sorry.

Reporter: Used to convert video of real people into potentially damaging doppelgangers, appearing to say and do things they never actually did.

Moving forward, we need to be more vigilant.

Reporter: In this case, BuzzFeed and Jordan Peele used the technology to create an eye-grabbing PSA. But experts fear that in the wrong hands, deepfakes could become the next frontier in fake news and spark very real consequences. What could be so dangerous about a deepfake?

We're used to looking at an image or video and putting faith in it. And with the rise of the ability to easily manipulate those, I think that's going to put our faith in digital media in jeopardy.

Reporter: The technology caused widespread concern when Reddit users started posting fake videos, including some featuring Gal Gadot, made without her consent. The early fakes were riddled with glitches, but as the technology continues to evolve, some people worry they could become indistinguishable from the real deal, triggering widespread panic, riots or war. Now they are grabbing the attention of lawmakers.

Deepfakes are a new area, and people are going to continue to find new uses for them.

Reporter: And the Department of Defense.

A lot of times there are indicators that you can see. But it's going to get more and more challenging over time.

Reporter: Is this an arms race?

There's certainly a bit of a cat-and-mouse game going on.

Reporter: Dr. Matt Turek heads a media forensics program at the Defense Advanced Research Projects Agency, or DARPA. His task: developing algorithms to catch manipulated video.

You'll see the real video on the right-hand side. An object was removed. The car that moved through the scene was essentially erased out of the video.

Reporter: While the technology isn't flawless, there are pretty convincing fakes out there. Many of them are harmless spoofs of popular films, like Nicolas Cage in "The Sound of Music"...

♪ The hills are alive with the sound of music ♪

Reporter: ...or "The Godfather." Even "The Avengers." But the creators of videos like these are aware of the controversy. We reached out to derpfakes, who claim to be one of the early innovators. They didn't want to reveal their name, but told ABC that all of these videos have a level of controversy attached to them. Their approach is to make the public familiar with the idea that what we see is not true by default. Either way, the technology is here to stay. What have you witnessed so far that's worrisome to you?

Manipulations that may have required state-level resources, or several people and significant financial effort, can potentially be done at home by an individual.

Reporter: So you're saying it's become easy and cheap to create a deepfake.

It is significantly easier than it has been in the past.

♪ You'll always be with me ♪

Reporter: The creation of a deepfake is somewhat similar to the state-of-the-art special effects used in today's filmmaking, like the face mapping used to add the late Paul Walker's likeness to the movie "Furious 7." But deepfakes have a lot in common with technology you're probably more familiar with: the photo album on your phone that learns your friends' faces, or face swapping on Snapchat.

There's Jackie, and there's me with Jackie's face.

Reporter: Deepfakes are much more believable.
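An aside on the detection work Turek describes: one broad idea in media forensics is that a tampered region tends to be statistically inconsistent with the rest of the clip. The toy sketch below is a hypothetical illustration of that idea, not DARPA's actual method; the function name, threshold, and synthetic data are all assumptions made for this example.

```python
# Toy media-forensics heuristic: in a static shot, pixels that an edit
# erased or inserted will disagree with a background model built from
# the rest of the clip. Illustrative only; real detectors are far richer.
import numpy as np

def flag_inconsistent_regions(frames: np.ndarray, threshold: float = 25.0) -> np.ndarray:
    """frames: (num_frames, height, width) grayscale video as floats.
    Returns a per-frame boolean mask of pixels far from the clip's median."""
    background = np.median(frames, axis=0)   # per-pixel background estimate
    deviation = np.abs(frames - background)  # how far each frame strays from it
    return deviation > threshold             # large local deviations are suspicious

# Synthetic demo: a near-static scene with one region "erased" in frame 5.
rng = np.random.default_rng(0)
video = rng.normal(128.0, 2.0, size=(10, 64, 64))
video[5, 20:30, 20:30] = 0.0                 # simulate a crude object-removal edit
mask = flag_inconsistent_regions(video)
print("suspicious pixels in edited frame:", int(mask[5].sum()))
```

On the clean frames the mask stays essentially empty, while the edited patch lights up. That is exactly the kind of signal a better forger then tries to hide, which is the cat-and-mouse game Turek describes.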
I can then train on that data, take those models and create deepfakes.

Reporter: At the University of Colorado Denver, it's Jeff Smith's job to create convincing deepfakes as one of DARPA's partners.

We act as the bad guys, creating manipulated audio, video and imagery, so that the algorithms that are developed can be tested and, overall, the ability to detect manipulated media gets better.

Reporter: Take me through the steps.

It's a process of machine learning: instructing a computer to understand a face and create that face on its own. Then we actually conduct the swap itself.

Reporter: The faces.

Yep.

Reporter: I love how we're talking about advanced technology, and we're on a whiteboard.

At least it's not a chalkboard.

Reporter: He even gives us a deepfaking demo.

What does a deepfake look like? It looks like this.

Reporter: We've swapped faces.

Pretty spooky.

Reporter: I kind of like my beard, actually.

Because we look very different, we aren't the best candidates. In this video, you see a convincing deepfake coming to life. This is within the first couple of hours; it starts to come into focus. But as we zoom out after a few days, that expression is made based on Jeff's expression.

Reporter: Because the software is learning.

It's learning facial features and mapping those facial expressions to those features as that training process keeps going. The models get better and better and better, and it can take hours to days. That's right now, with today's techniques and technologies. So in another year or two, there will be either improvements to this system or new systems that can do the same thing, maybe even better, maybe faster.

Reporter: And as that happens, Jeff hopes his students can stay ahead of the curve.

So we took this video: innocuous, tripod, high quality, nothing happening. You inserted the rifle, right.

Reporter: You actually put a weapon, a gun, into that image.

It makes you more aware of what is possible with manipulated videos. So hopefully it makes them think more about it.

Reporter: If I were a student and I saw that, I would have thought it was real. By being here, how do you want to prevent that kind of fear?

I just want to help develop the software they're trying to create to detect this kind of fake media, so hopefully people aren't tricked by videos that you can easily make with the software we have.

Reporter: Is the mantra here, don't always believe what you see? And while seeing doesn't necessarily mean believing in the digital age, looking beyond face value may be more important than ever. For "Nightline," Kyra Phillips, in Denver, Colorado.
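Smith's whiteboard description (teach a model to understand each face, then conduct the swap) matches the shared-encoder, two-decoder autoencoder design behind the early deepfake tools. Here is a minimal, hypothetical PyTorch sketch of that idea; the 64x64 crop size, the network shapes, the learning rate, and the L1 loss are assumptions for illustration, not the software shown in the segment.

```python
# Minimal sketch of the classic deepfake autoencoder: one shared encoder
# learns a common face representation, and each subject gets its own decoder.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64x64 -> 32x32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32x32 -> 16x16
            nn.ReLU(),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), # 16x16 -> 8x8
            nn.ReLU(),
        )
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1),  # 8x8 -> 16x16
            nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),   # 16x16 -> 32x32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),    # 32x32 -> 64x64
            nn.Sigmoid(),
        )
    def forward(self, z):
        return self.net(z)

encoder = Encoder()
decoder_a = Decoder()  # reconstructs subject A's face
decoder_b = Decoder()  # reconstructs subject B's face
opt = torch.optim.Adam(
    list(encoder.parameters())
    + list(decoder_a.parameters())
    + list(decoder_b.parameters()),
    lr=1e-4,
)
loss_fn = nn.L1Loss()

def training_step(faces_a, faces_b):
    """Train each decoder to reconstruct its own subject through the shared encoder."""
    recon_a = decoder_a(encoder(faces_a))
    recon_b = decoder_b(encoder(faces_b))
    loss = loss_fn(recon_a, faces_a) + loss_fn(recon_b, faces_b)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# The swap: encode a frame of subject A, then decode with B's decoder,
# producing B's face with A's pose and expression.
with torch.no_grad():
    frame_a = torch.rand(1, 3, 64, 64)  # stand-in for a real aligned face crop
    swapped = decoder_b(encoder(frame_a))
```

The swap in the final lines is what produces the on-camera effect: subject B's face rendered with subject A's pose and expression. And as Smith notes, the longer the training runs, over hours to days, the sharper the result becomes.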


{"duration":"8:08","description":"The latest in artificial technology allows creators to swap and meld video, which can be used for everything from a fun camera filter to creating dangerous fake news.","mediaType":"default","section":"ABCNews/Nightline","id":"59742097","title":"When seeing isn't believing: using AI to create 'Deepfakes'","url":"/Nightline/video/believing-ai-create-deepfakes-59742097"}