Deep Fake, that’s the latest thing we have to worry about now. Deep Fake is a technical term for digitally altering video or audio of people to make it appear they did or said something they didn’t.
This is very fake, faker than “pure fake like”. Faker even than the fakeness your Facebook friend wrote about when they announced to the world: “Nuttin but snakes on here gud to no ur real frends n not all fake 1s.”, followed by cry-laugh emojis.
Deep Fake is in the news because there’s a video of what looks like Facebook’s founder Mark Zuckerberg saying some fairly earth shattering things. He appears to be saying: “Imagine this for a second: One man, with total control of billions of people’s stolen data, all their secrets, their lives, their futures. I owe it all to Spectre. Spectre showed me that whoever controls the data, controls the future.”
They’ve manipulated an existing video to make it look like Zuckerberg is a megalomaniac working for Spectre. Everyone knows this is a fake, but only because Spectre is a fictional organisation from the James Bond films. Other than that, everything else is plausible.
It’s another step along the way to not knowing what to believe any more. The next US election will be the first to witness Deep Fake technology, as candidates are shown in situations, or espousing views, that might devalue them in the eyes of the electorate.
For example the Democratic candidate might be shown in a compromising sexual position or taking bribes or sticking to a principle. Regarding Trump, it’s not beyond the bounds of possibility the other side will fight dirty with him, seeking to alienate him from his base.
Say a video were to emerge of Trump appearing thoughtful, nuanced, mannerly, arguing logically and not insulting someone: this could lead his fundamentalist followers to desert in their droves.
It’s likely a new general election will happen here within a year, so are we Deep Fake-ready? When a video emerges of a candidate candidly admitting, “Look, that’s just a promise you make in an election campaign”, will we know if they really said it?
One wonders, though, if it would make any difference. It’s not a question here of people not being able to believe what they see or hear. Look at whoever gets elected: it’s more a question of believing it and voting for them anyway, saying “I know he looked for bribes but feckit, didn’t he get my mother a medical card?”
Deep Fake could have more of an impact in our personal lives though.
Consider the Take It Handy With The Drink Lads ad campaign (I think that’s what it was called) from a few years ago, where someone wakes up to texts saying “You’re a gobshite after last night” etc. Imagine what we could be waking up to now: faked videos of us astride a police car, or telling bouncers we’ll get them sacked.
Teenagers could show their mother footage which clearly shows their father saying it’s OK for them to go and hang around The Ruined Factory vaping and doing eyeball shots.
Couples with a baby could fake recordings of conversations about whose turn it is to get up with The Child, what was said in an argument, what was promised, what was agreed about spending the weekend with whose mother.
Children hitting each other in the back seat might produce footage which falsely alleges who started it.
The only way to counteract this is to act now. Start recording yourself saying all the things you wouldn’t normally say, all the stuff that would get you into trouble. Have actual affairs and record them.
Put it in a vault.
When the fake footage emerges, at least you’ll have something to compare it to.