Audioburst Search

The surprising danger that deepfakes pose to the presidential elections



Deepfakes, those digitally manipulated videos that look scarily real, pose a threat to the upcoming presidential election, and the real danger will surprise you. I'm Roger Cheng, and this is your Daily Charge. With me is reporter Joan Solsman. Thanks for joining me, Joan.

Yeah, it's great to be back on the Daily Charge.

So we've all heard of deepfakes, but you don't think the real risk is, say, candid footage of Joe Biden or Donald Trump saying something crazy and swaying voters. What's the real danger deepfakes pose to this election?

The deepfake experts I talked to said they're not most worried about a candidate deepfake like that, something where Donald Trump or Joe Biden is admitting to a hot-button crime or saying something really inflammatory. What they're more concerned about are two things. One is known as the liar's dividend. That's the concept that as more people learn that deepfakes exist, that there can be these completely false but highly realistic videos out there, it gives people who are caught in the act, and are guilty, more credibility when they deny a legitimate video by saying, "Oh, you've heard of deepfakes. You can't trust what you see anymore." That muddies the waters and makes it harder for people to understand and trust what is truth and what is fiction.

Yeah, that sounds very dangerous, because it damages the credibility of basically everything, right? If you can point to this one thing and say, "See, this is fake," it applies to everything, essentially.

Yeah, it makes everything harder. Our brains have been wired for so long to believe what we see. As Photoshop and other sorts of media manipulation came along, we learned to catch up and at least be more skeptical of those. But video tricks your eyes and your ears, and the AI that powers deepfakes is so sophisticated and so good at making things look real.
That's really deep wiring in your brain sending you all these signals saying, "Trust this, trust this, trust this." So when people start saying, "Hey, you can't trust that anymore," it just means it's harder for anyone to understand what's even real.

Speaking of the AI aspect of things, you have a nice breakdown of how deepfakes work. How are these videos created?

Deepfakes are created by a kind of artificial intelligence called GANs, short for generative adversarial networks. The basic way they work is with two neural networks, a kind of machine learning inspired by how the brain works. Imagine that these two neural networks are actually an artist and an art critic, locked in rooms right next to each other. The artist creates a painting, trying to make something that looks like a masterpiece, and shuffles that painting into a stack of other paintings that are genuine works by old masters. That stack moves into the art critic's room, and the critic picks out which ones he thinks are forgeries, the ones that aren't real masterpieces. That feedback goes back to the artist, and the artist gets better and better at figuring out how to make a really convincing fake masterpiece, up to the point where he or she, this artificial neural network, is able to make something that can trick the critic into thinking that what is fake is actually real. That's how these artificial intelligence systems work.

That sounds complicated, but I love this kind of quick background. How easy is it for someone to actually make a deepfake?

It depends what kind of deepfake we're talking about. There are open-source tools to make the kind of celebrity face swaps, the Elon Musk's-face-on-a-baby sort of thing. But even with those open-source tools, they're not as easy.
Those aren't as easy to make as, say, a meme or an animated GIF. You need technological savvy and know-how, you need a pretty powerful computer, and you need large data sets. Those are things that are more difficult than taking a photo and putting some white text on it. So memes are very accessible. But what we're talking about here are election deepfakes, and these are the kinds of things all the experts I talked to flagged. We have a lower hurdle to suspend disbelief when we're looking at Elon Musk's face on a baby. But when you're presented with a video of a candidate for president, or the president of the United States, luckily, human beings have kind of set a higher bar that you have to clear to actually believe it's true. What that means is that the very sophisticated, high-end deepfakes that would threaten an election are really reserved for people who work at universities or research centers with powerful computers, or for state actors, like China, that have that kind of computing power at their disposal.

So the idea here is that candidate fakes are less of a risk. What are some of the deepfakes we should be worried about?

What people are more worried about isn't necessarily these candidate fakes. It's more a deepfake that attacks your faith in the election rather than your trust in a candidate. One of the reasons is the state of our political discourse: we're very divided. I think everyone agrees we're divided, and our opinions seem more entrenched than they were before. In that environment, it's harder to convince or sway voters either way with a fake video. If you were to make a video of Donald Trump's hair flying off or something, it would only solidify your beliefs. If you liked Donald Trump, you'd say, "That's a fake; I like Donald Trump even more." If you don't like him, you'd think he looks dumb.
You'd dislike him even more. So a more cunning way to use a deepfake to disrupt the US election would be to create a deepfake of, say, an authoritative news anchor, or a governor, or an authority figure not as many people know, saying things like: we're in the age of COVID-19, it's two days before the election, martial law has been declared, you cannot go to your polling place. Or to create news anchors saying there are some sort of armed militants, some sort of supremacists, arming themselves and going to polling places in a specific neighborhood, so people there should be scared about showing up to vote. In that way you can suppress votes. You can also, after the vote, undermine people's faith in the result, if you have an authoritative figure saying something like, "We have footage of votes being switched from Trump to Biden." That could sow distrust not only going into the election but after the election, in the results themselves.

That's an interesting point, because it's not necessarily a deepfake that would make Joe Biden say something outrageous. These are actually kind of believable, right? It's a lot more cunning, a lot more nuanced, but I think that's what makes the lie a lot easier to swallow: like you're saying, it's pretty plausible-sounding.

Yeah. The other thing to keep in mind about a candidate deepfake is that it's kind of like a snake eating its own tail. A candidate deepfake would only be successful if it basically goes viral and lots of people see it. But when it goes viral, one thing the US has, say what you will about the US press corps, is a robust free press entrenched in our country. Other countries that are dictatorships or more emerging democracies,
they don't have that at their disposal quite as much as we do. So if a fake of a candidate, the president or Joe Biden, were to come out, we have the capacity here, built into our democracy, to have a force of people trying as quickly as possible to debunk it. Whereas if you were somebody who wanted to make a deepfake that could actually rob or suppress votes, it would be more successful if it doesn't go viral, if it's not something that draws the attention of an entire press corps that's entirely focused on this election. In that way it could be most successful by not going viral: staying under the radar just enough to disrupt people in, say, one or two counties that are really important in a swing state, without drawing the attention of a national press corps to debunk it.

Well, deepfakes capture a lot of attention and headlines, but they're really just one way to manipulate information, right? Are we looking at this too narrowly if we focus only on deepfakes?

Yeah. It makes sense that people would be scared of deepfakes, because, as we talked about earlier, they undermine this basic assumption that if I see it, I can believe it. That's why there's a lot of fear around deepfakes and the harm they could potentially cause. But the reality is, because of some of the things we talked about before, really sophisticated deepfakes are still inaccessible to a wide range of people. That's not true for, like you said, memes, or for slowed-down video like the viral clip that made Nancy Pelosi sound drunk. Those kinds of media manipulations, sometimes referred to as shallowfakes or cheapfakes, have the power of being cheap, easy, and still incredibly effective. That's why Edward Snowden's lawyer, an ACLU attorney, made this comparison.
He said that looking at election information manipulation by only looking at deepfakes is like looking at it through a straw. You're seeing something really scary, but you're not seeing the much bigger picture of how things could be disrupted in 2020.

Russia played a big role in clouding the 2016 election with misinformation and disinformation, and you talked about how it takes a lot of resources for these fakes to be effective. Obviously Russia is a country with a lot of resources. Should we be worried about Russia and deepfakes?

Well, I talked to one expert on the national security implications of deepfakes. His name is Clint Watts; he has testified to Congress, to senators, about just this sort of thing. He says anything's possible, but Russia and its disinformation tactics are more skilled at the art of disinformation than at the science of deepfakery. So although anything's possible, and Russia has lots of oil money and who knows what it could do, he's more interested in the potential in China or other places. China in particular has supercomputers. I think Stephen Shankland, our resident expert, always has the stats on that, but I believe China has more supercomputers than we have in the US, and that computing power is important for making these fakes. Beyond supercomputers, they've invested heavily in artificial intelligence and want to lead the world in AI, the neural network technology that powers deepfakes. That's a recipe for a lot of potential problems down the line.
Yeah, in China they have completely synthetic television personalities, deepfake news anchors, so that a very authoritative anchor can report on something without actually taking time out of his day to report on it. The fact that a country like that could do this if it wanted to means they're the ones in the best position to create a deepfake that would disrupt global geopolitics. But state actors could create other kinds of deepfakes that could cause other kinds of problems around the world. So it's just a doomsday scenario no matter how you look at it.

Well, that's glorious and very positive. Just lastly, I think we can all figure out that Elon Musk is not really a baby, but do you have any advice for how to spot a deepfake, or just how to be a little more vigilant when looking at some of the content that surfaces around the web?

Yeah, so I asked everyone I talked to, all the experts, this question, and there's no silver bullet, no little loophole you can find for recognizing that it's fake or debunking it on your own. If it's a real deepfake, then your eyes won't save you. Just watching it, you won't be able to tell that it's fake. That's the whole point of a deepfake: it's an AI creation where the power of the artificial intelligence outstrips us. Our brains are very attuned to human faces, but they're not so sharp that they can keep up with how well deepfake technology can progress. We don't have computers in our brains as powerful as the supercomputers at research universities. So the advice for normal people asking, "How do I even know if this is fake?"
It really comes down to basic hygiene about what you're exposed to. If you see a video and it seems so outlandish that it couldn't be true, then it might not be. And if you see a video that is clearly trying to appeal to some segment's inflammatory instincts, that's also a reason to be skeptical. Deepfakes just mean everyone needs to do what we should already be doing with other kinds of manipulated media: slow down and think before you share. It's hard to do, and it's even harder when we're talking about deepfakes, but it's just as important to act that way when you're presented with a really realistic video as when you're presented with a meme or a cheap slowed-down video of a "drunk" Nancy Pelosi.

Yeah, well, that's good advice in general. Whether it's an article or a video, do a little bit of homework and think through what you're actually looking at.
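The artist-and-critic analogy for generative adversarial networks described earlier can be sketched in code. This is a minimal, hypothetical toy, not any system discussed in the episode: the "artist" (generator) is a single learned offset that shifts random noise toward a target distribution, and the "critic" (discriminator) is a tiny logistic classifier trying to tell real samples from fakes. Every name and number here is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# "Artist" (generator): fake = noise + theta, where theta is learned.
# "Critic" (discriminator): D(x) = sigmoid(w*x + b), probability x is real.
theta = 0.0          # generator's only parameter
w, b = 0.1, 0.0      # discriminator parameters
lr = 0.02            # learning rate for both players
real_mean = 4.0      # the "masterpiece" distribution is N(4, 0.5)

for step in range(3000):
    real = rng.normal(real_mean, 0.5, size=32)
    z = rng.normal(0.0, 1.0, size=32)
    fake = z + theta

    # Critic update (gradient ascent): push D(real) -> 1, D(fake) -> 0.
    d_real = sigmoid(w * real + b)
    d_fake = sigmoid(w * fake + b)
    grad_w = np.mean((1 - d_real) * real) - np.mean(d_fake * fake)
    grad_b = np.mean(1 - d_real) - np.mean(d_fake)
    w += lr * grad_w
    b += lr * grad_b

    # Artist update (gradient ascent on log D(fake)): shift theta so the
    # critic scores the forgeries as real.
    d_fake = sigmoid(w * (z + theta) + b)
    grad_theta = np.mean((1 - d_fake) * w)
    theta += lr * grad_theta

# After training, the artist's offset has drifted toward the real-data
# mean (around 4), so its fakes are hard for the critic to pick out.
print(round(theta, 2))
```

The point of the sketch is the feedback loop from the analogy: the critic's judgments flow back into the artist's parameter, and the artist improves until the critic can no longer separate real from fake. Real deepfake GANs follow the same adversarial loop but with deep convolutional networks and image data instead of a single scalar.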
