tv Witness LINKTV October 2, 2023 9:00am-9:31am PDT
[tom cruise impersonation] tom cruise: my fellow americans, you deserve the truth and i know you can handle it. i won't be your next president. that point has been made crystal clear. [president barack obama impersonation] barack obama: we're entering an era in which our enemies can make it look like anyone is saying anything at any point in time. like, president trump is a total and complete d--. ♪♪♪ [jeremy fernandez impersonation] jeremy fernandez: hello, welcome to "foreign correspondent." i'm jeremy fernandez. well, it took a global pandemic to force us all indoors and to increasingly rely on video screens to connect our lives. but is seeing still believing?
this program delves into the world of deepfakes. nina schick: a deepfake is essentially a piece of synthetic or fake media that's either been generated entirely by artificial intelligence or manipulated by artificial intelligence. male: malaria isn't just any disease. male: [speaking foreign language] nina: which, by the way, includes fake video of real people saying and doing things that they never did. tom: this is no stunt. jeremy: hollywood is delighted by the movie-making potential of deepfakes, but in washington the tech is increasingly viewed as dangerous and subversive. [vladimir putin impersonation] vladimir putin: you blame me for interfering with your democracy. mounir ibrahim: deepfakes plays into the hands of anybody, any state sponsor, any institution, that wants to create confusion or deceive. matt ferraro: the deepfakes are a fundamental threat to democracy and to any civilization
that relies on the truth. deepfakes could very well undermine our sense of reality. jeremy: so, are we entering a world where artificial intelligence will distort our sense of reality? perhaps, as i have a confession to make. i'm actually a deepfake and you're now watching my computer-generated avatar that hamish macdonald and i have created. jeremy: what's extraordinary is that almost anyone can now do this. all you need is a credit card and internet access. jeremy: okay, show me what you've got. hamish macdonald: time to see something that will probably blow your mind a little bit. jeremy: "jeremy fernandez deepfake." hamish: it might also mean the end of your career. jeremy: i'm jeremy fernandez. tonight's program delves into the world of deepfakes, also called synthetic media. jeremy: what do you think? hamish: it's pretty good. jeremy: it's--i have to say, i'm not entirely convinced, but i think you put the three elements together of the location, the face, the voice,
it makes a reasonably compelling case. jeremy: can we still say that seeing is believing? jeremy: the thing is, i've never said those words to camera before. this has all been generated by the program. hamish: the system was developed by two companies: synthesia and descript. we created a deepfake jeremy by providing the program with video and audio samples. but you can simply pick your avatar, language, type in the words, wait a few minutes for processing, and there it is. [jeremy impersonation] jeremy: hola. [speaking foreign language] hamish: this is all pitched as a corporate training tool, but there are some restrictions. the developers prohibit you from creating any content that's discriminatory, political, or sexually offensive, so there's a limit to what we can do with this jeremy avatar. i think it all looks pretty impressive but it actually pales in comparison to what's coming. in the united states, some of the security experts say
they are terrified by the next generation of deepfakes. ♪♪♪ ♪♪♪ ♪♪♪ chris ume: i like living the life of a digital nomad. i never imagined that i would be a digital nomad but somehow i became one. and it's pretty cool. hamish: chris ume, video effects designer, is said to be one of the world's best deepfakers. his computer servers are in belgium. he worked for the us-based animation team that produces
"south park." now, he's riding out the pandemic here in bangkok. chris: i did a few months of research, how to do it, and a half-year later i had my first deepfake and then it evolved rather quickly because i saw the potential and i just started creating crazy things. hamish: it was his latest "crazy thing" that got american national security experts downloading tiktok. [tom cruise impersonation] tom: what's up, tiktok? you guys cool if i play some sports? hamish: teaming up with tom cruise impersonator, miles fisher, and some cutting-edge technology, chris produced a succession of increasingly sophisticated deepfakes. tom: hey, listen up, sports and tiktok fans, if you like what you're seeing, just wait 'til what's coming next. chris: i love how he's not looking in the camera. he's looking a bit wonky, next to the camera.
tom: just wait 'til what's coming next. chris: miles looks a lot like tom cruise. he has a lot of similarities. that makes my work easier in a way. it saves me a lot of time. the tom cruise videos are so special because deepfakes shouldn't be able to do such things. tom: polar bear? hamish: the cut-and-paste look of early deepfakes made them easy to spot. not anymore. in response, the tech industry is now investing heavily in software detection tools, but tiktok tom reportedly beat nearly all of them. chris: it's like a cat-and-mouse game. they can't follow. and of course, the deep tom videos, they're pretty good. i'm not saying they're flawless because i see a lot of mistakes myself, and they cannot detect it. tom: i wanna show you some magic. hamish: deepfake tom's near flawless performance went viral. chris: we never exposed ourselves in the beginning.
people didn't know who created these videos. tom: it's all the real fake. chris: and you had a lot of articles talking about the end of the world and this technology is getting out of hand and anyone can do this right now. so, it was really difficult. i had to think about how to approach this. chris: i see the creative possibilities and i just want to entertain people. i wanna make people smile when they look at my content. but, of course, there will be people misusing this technology. hamish: so, how did he do it? tom: in this reel, you're gonna see my amigo, chris ume, ha, ha. he's gonna introduce to you the wonderful world of deepfakes, how ai and vfx are unlocking the future of our imaginations. chris: for deepfake you start with the source data. and with source data, i mean pictures and videos of the character you want to deepfake.
in my case, it's tom cruise. like, around 6700 source images of all his angles, of all his expressions. hamish: the key to building a deepfake is what's called the generative adversarial network. it takes two neural networks which are algorithms that mimic the way our brains work. they learn from each other and adapt, it then pits the two against each other to build a perfect deepfake. chris: it took me 2 to 2½ months in total, just to prepare it, to get it running, to get where a model is at, and after that it took 2 to 3 days to train on each figure separately and up to 24 hours of post production. but going frame by frame, working on details, working on glitches. if you compare the quality we could achieve with deepfakes a year ago and what we can achieve now, it's a day and night difference. it's unbelievable.
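the generative adversarial setup described above, two networks pitted against each other, one generating and one judging, can be illustrated with a deliberately tiny sketch. nothing here comes from chris ume's actual pipeline: the "generator" is a single learnable shift, the "discriminator" is a one-feature logistic classifier, and all numbers are made up for illustration.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    # Clamp to avoid math.exp overflow on extreme inputs.
    return 1.0 / (1.0 + math.exp(-max(min(x, 30.0), -30.0)))

# "Real" data the generator must learn to imitate: samples near 4.0.
def real_sample():
    return random.gauss(4.0, 1.0)

theta = 0.0       # generator parameter: g(z) = theta + z
w, b = 0.1, 0.0   # discriminator: D(x) = sigmoid(w*x + b)
lr = 0.05

for _ in range(3000):
    # Discriminator step: push D(real) toward 1 and D(fake) toward 0.
    pairs = ((real_sample(), 1.0), (theta + random.gauss(0.0, 1.0), 0.0))
    for x, y in pairs:
        d = sigmoid(w * x + b)
        w -= lr * (d - y) * x  # cross-entropy gradient w.r.t. w
        b -= lr * (d - y)      # ... and w.r.t. b
    # Generator step: push D(g(z)) toward 1, i.e. fool the discriminator.
    x = theta + random.gauss(0.0, 1.0)
    d = sigmoid(w * x + b)
    theta -= lr * (d - 1.0) * w  # chain rule through g(z) = theta + z

# After training, theta should have drifted toward the real mean (about 4).
print(f"learned shift: {theta:.2f}")
```

scaled up, the same adversarial loop, with deep convolutional networks in place of these one-parameter models, is what turns thousands of source images into a face that can be mapped onto new footage.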
i've been talking with certain governments as an--advising, in a good way, right? like, a lot of governments are still unknown to what deepfakes is and i'm here to explain what it is and what--which things are possible and they realize, governments realize, this could be used as a weapon. ♪♪♪ hamish: chris ume's tiktok tom startled a washington still traumatized by the chaos of the trump years. matt: i think terror is probably not too strong of a word and i think it's because people realize how dangerous they are. it is a tom cruise effect. i mean, in your mind, think about the tom cruise video and then manipulate what he says. like, just think about what he might say. donald trump: we will stop the steal. i know that everyone here will soon be marching over to the capitol building. [crowd noise]
hamish: the mob that assaulted the us capitol on january 6 this year was fueled by a mix of anger, fake news, and alternative facts, egged on by a president claiming he'd been robbed of election victory. for matt ferraro, former cia officer and disinformation specialist, it was all too close to home. matt: yeah, so this is my neighborhood. i--it's funny, during the pandemic i tried to make a point of walking every day and i would come out this way and, to see what happened on january 6, it was terrible. desecration is really the word that comes to mind. male: i will not tolerate fake news no more. matt: these people thought the election was stolen. they were tripped by lies that were as basic as could be, just written in smoke and lies.
and pretty soon, you're gonna have tailor-made video to convince you of anything. hamish: he worked for america's top spymaster, the director of national intelligence who reports directly to the president. matt ferraro fears deepfakes will open the door to a dark dystopian future. matt: the real danger is that what we saw on january 6 is just the beginning. that things are gonna get so much worse. the partisanship that we see, the visceral hatred of one another, is gonna just be made so much worse by deepfakes and the fact that anyone's gonna be able to order up their own synthetic reality, regardless of the truth. ♪♪♪ hamish: now a lawyer, he advises corporate clients on how to counter the rising tide of digital disinformation.
matt: now, what if i told you that seeing isn't believing? that what you saw with your own eyes was not, in fact, true? that what you heard with your own ears was not true, and even what you read, thinking it was by a human, was in fact written by a computer? i think that would have a fundamental effect on how you perceive the world. now, in the national security context, so there's no end to the nightmare scenarios. the fbi put out this rather extraordinary warning in march of 2021 in which they said that it was "almost certain," quote unquote, "almost certain" that in the next 12 to 18 months private businesses would be the victims of synthetic media, by that they mean deepfakes, in both cyber attacks launched by nation states and foreign adversaries and by criminals. hamish: the focus is now on threats to politics and business, but ever since the first low-quality deepfakes
emerged about five years ago, they've been weaponized against women. the figures are disturbing. the number of professionally made deepfakes is now doubling every six months. ninety-three percent of it is porn. american targets account for more than 40% of victims globally; much of it, celebrity porn of actors, musicians, social media stars. ♪♪♪ hamish: humiliation intimidates most deepfake porn victims into silence. most, but not all. noelle martin is locked in a constant battle with anonymous deepfakers on the dark web. hamish: what are the consequences of you choosing to speak out about it now? noelle martin: well, the consequences of speaking out
about the abuse has meant that they've gone from manipulating images of me to creating these deepfakes and so they've essentially weaponized this technology against me. hamish: so they punish you? they go after you because you've criticized them. noelle: yes. hamish: a 26-year-old law graduate, noelle has now been campaigning for almost a decade. it all started when, as a teenager, she was the victim of a random porn attack, her images posted in low-rent, obvious fakes. noelle: they had been targeting me, presumably since i was 17 and it was only until about a year later that i actually found out. so they've gotten images of me from my social media, but they weren't just collecting it from my social media,
they were collecting it from friends' accounts, they were looking at the bars that i went to and the photos from that event on facebook and they would collect it from there. ♪♪♪ hamish: oh, wow. noelle: that's fake me. that's a complete fabrication. so that's the level. hamish: it's escalated a lot from the beginning, though. noelle: yes, it has, yes. hamish: like, really obvious photoshopping-- noelle: yes. hamish: --to that sort of stuff. noelle: yes. it's so dehumanizing and degrading and violating. it's, like, it makes you sick and it's humiliating. hamish: you said it almost destroyed you. noelle: yes. hamish: how close did it come? noelle: very, very close. i didn't--i didn't have the right coping mechanism so i really struggled to cope.
and there was a time where, you know, i did try and hang myself and my father stopped me. hamish: noelle's advocacy helped drive the introduction of revenge porn laws in australia. but she's not stopping there. noelle: i really think there needs to be some sort of global response because it is not enough for any one country to target this issue that is global in nature and borderless. and it is not acceptable that in 2021 someone from halfway around the world can misrepresent you and there's nothing you can do, and that impacts your whole life. nina: deepfake porn is still, without a doubt, the most pernicious use case of malicious deepfakes but, rather than this being a tawdry woman's issue, i see this as a harbinger of what's to come.
nina: my name is nina schick and i focus on how exponential technologies are reshaping geopolitics and society. specifically, with deepfakes, i have advised global leaders including anders fogh rasmussen, the former nato secretary general, and joe biden who, at the time, was vp, now president of the united states. when you consider deepfakes as well as the epidemic of bad information that is inundating our societies, they absolutely can be seen as an existential threat to liberal democracies and they're already at a place where a single creator, right, you interviewed him for your documentary, chris ume, can create fake video content that's more sophisticated than anything a hollywood studio would be able to do, even with a multi-million-dollar budget or teams and teams of special effects artists. and this democratization is happening so quickly that experts suggest that by the end of the decade, so 2030,
up to 90% of online video could be synthetically generated, so, made by ai. even before they become ubiquitous, what they're already doing is undermining trust in all authentic media. this is a phenomenon known as liar's dividend and we've already started to see that at play. so, if anything can be faked, including video, then everything can be denied. male: the four officers involved have been fired, but protesters are demanding criminal charges. nina: one of the really interesting examples of the liar's dividend at play is something that i saw around the george floyd death video. this video was so visceral that it united millions of people in protest, right? male: the violence, fueled by anger over the death of 46-year-old george floyd. the fire department reports that he showed no signs of life. winnie heartstrong: hi there, i'm dr. winnie heartstrong.
i'm running for congress to represent missouri. nina: dr. winnie heartstrong, she argued that the entire george floyd video was a deepfake hoax, that the man you see in the video is a former nba player, and that ai has been used to swap george floyd's face onto this former nba player's face. winnie: there is no way that george floyd is dead. george floyd is alive, america. prove me wrong. nina: most people still believe that george floyd died in 2020. however, if your influencer of choice suggests that something like the george floyd video is a deepfake, then by 2030 many people might believe that that is indeed the truth. so that is a liar's dividend in play. winnie: so, folks, do not believe anything you see or read or watch on tv, okay? [putin impersonation] putin: america, you blame me for interfering with your democracy, but i don't have to. you're doing it to yourselves. hamish: these deepfake parodies were commissioned
by the non-profit represent us in the lead-up to the 2020 election to highlight the fragility of democracy and encourage voting. but the group says they were so realistic, american tv networks refused to run them. kim jong-un: i don't have to do anything. you're doing it to yourselves. hamish: but what happens if a deepfaked leader not only looks real but mouths far more dangerous intentions? this is the scenario that keeps former cia expert, matt ferraro, awake at night. matt: so the most plausible nightmare scenario involves a deepfake of the president, say, announcing a missile strike against north korea. so let's walk through that. tensions between the us and north korea are always high, but let's say it's time to come out of a period of particularly difficult relations between the us and north korea, maybe during a military exercise that the us is doing
with the south koreans on the korean peninsula. say that it's a deepfake of president biden announcing that because of north korean perfidy, he is launching a first strike missile attack and it's believable. it's him in front of a podium. [president joe biden impersonation] joe biden: tonight, i come to talk about crisis, to declare war. north korea's nuclear program presents serious threats to american security and the security of the world. we will respond. matt: but also imagine that it doesn't just appear on the internet. maybe it's combined with another kind of attack like a cyber intrusion. maybe for months now, hackers have been able to infiltrate the white house twitter handle. they post this deepfake video and it goes viral instantaneously, in a moment.
tens of millions of people see it and now, of course, the white house is unsure what happened. this totally blindsides them. biden: peace was violated. we will respond accordingly. god bless you all and may god protect our troops. matt: the north koreans are 80% sure it's him, maybe only 20% sure it's him. are they willing to just sit back and take a missile strike that will end their regime? probably not. [guns firing] matt: they launch a retaliatory strike against seoul which is only so many miles from the north korean border, and millions of people die and it turns out it's all a fake. matt: chris ume, he said it took him a couple of months to train the tom cruise algorithms, but imagine if you're the intelligence service of the people's liberation army, known as the 2pla. they could put 10,000 man hours against the creation
of a deepfake tomorrow and so the fact that it takes 24 hours to create a minute of video really isn't that surprising at all, not when you have the resources of a nation state. ♪♪♪ mounir: my name is mounir ibrahim. i was a us diplomat in syria at the us embassy in damascus and continued to work on the syrian conflict in a variety of contexts, including here at the united nations. i saw user-generated content, images and videos from a conflict zone, influencing the discussions and the narratives and the meetings taking place. proving that these images and videos were, in fact, true became very, very relevant to some of the world's largest issues. hamish: detecting deepfakes has become big business,
with everyone from facebook to the pentagon now working on the problem. male: so facebook is commissioning a first-of-its-kind dataset. it's part of the deepfake detection challenge. hamish: but mounir ibrahim argues detecting deepfakes is a lost cause. mounir: it is going to be impossible to detect forgery, real-time and at scale across the internet, particularly with the advancements of ai moving so rapidly. so if we can't prove what's fake, let's try and prove what's real by authenticating media and digital content as it's being created. hamish: his company, truepic, is developing an encrypted, tamper-proof security app for recording vision, a kind of watermark of authenticity. the system is already being trialed by the united nations. truepic is part of a growing coalition of tech and media companies now working towards creating a new global standard:
tamper-proof vision that can be trusted in the deepfake world. mounir: the world's first standards body on digital content provenance was created by the name of the coalition for content provenance and authenticity, the c2pa, and we're-- truepic is proud to be a founding member, along with the likes of adobe and microsoft and intel and arm and the bbc. ♪♪♪ ♪♪♪ hamish: crowds in the big apple still haven't returned to pre-covid levels. those who venture out share the visual experience
with those who still prefer to stay home. the big tech players, apple, google, youtube, have implemented bans on malicious deepfakes, but they've got no control over the dark web. us federal and state lawmakers are also moving to combat the deepfake threat. but will it be enough? mounir: i do believe that the us government is taking it very seriously. we've talked about the fbi warnings that have come out. we've seen multiple states passing deepfakes-related laws, so i do believe there is a level of seriousness. i would also say this is not a simple solution that a government can fix. they can't legislate their way out of this. [president barack obama impersonation] obama: but how we move forward in the age of information is gonna be the difference between
whether we survive or whether we become some kind of f--up dystopia. nina: actually, i am really optimistic. i believe in our ability to overcome these challenges. but i think, realistically, it will probably get worse before it gets better. noelle: well, i think it's terrifying, especially for women who are disproportionately targeted by fake pornographic material that's out there. you know, it affected me so badly, emotionally, in my wellbeing and in my health. hamish: it sounds like it's changed your entire life. ♪♪♪ matt: i think that we've only really started to scratch the surface of the bad things that can happen because of deepfakes, because we're going to move to a world in which photographic
and video evidence is a bit like paintings where you look at a painting now, an abstract painting, and you and i take our own meanings from that painting. [tom cruise impersonation] tom: some people claim i'm not actually tom cruise and i didn't actually run for president, ha, ha, ha. don't let my masks fool you. take a deep breath. chris: people are always scared of things they don't understand or they don't really know. tom: we should be safe down here. chris: they should be worried as in we should work on ways to regulate these things and to detect them as well. i think that makes sense. so that's why i think it's a good thing i created these for years because now i'm raising awareness and they realize, like, this is real. this is--it's coming.
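the capture-time authentication idea mounir ibrahim describes earlier in the program, proving what's real rather than chasing what's fake, can be sketched in a few lines. this is not truepic's product or the c2pa specification (real schemes use public-key certificates and signed manifests); it's a minimal stand-in that hashes the media bytes at capture and signs the hash with a hypothetical per-device secret, so any later edit breaks verification.

```python
import hashlib
import hmac

# Hypothetical secret held by the capture device. A real provenance scheme
# would use a public-key certificate so anyone can verify without the secret.
DEVICE_KEY = b"hypothetical-device-secret"

def sign_capture(media: bytes) -> str:
    """Hash the media at capture time and sign the hash."""
    digest = hashlib.sha256(media).digest()
    return hmac.new(DEVICE_KEY, digest, hashlib.sha256).hexdigest()

def verify(media: bytes, signature: str) -> bool:
    """Re-derive the signature; any change to the bytes breaks the match."""
    return hmac.compare_digest(sign_capture(media), signature)

frame = b"\x89raw-camera-frame-bytes"
sig = sign_capture(frame)
print(verify(frame, sig))            # True: untouched since capture
print(verify(frame + b"\x00", sig))  # False: a single altered byte fails
```

the design choice matters: detection tries to spot artifacts in an unbounded space of fakes, while authentication only has to answer a narrower question, whether these exact bytes existed, signed, at capture time.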
samantha hawley: it's only a jump across the water from great britain to northern ireland, but in some ways it's like traveling back in time. after more than 20 years of relative peace, tensions have broken out again. violence not seen for decades is back on the streets. northern ireland should be celebrating its centenary as part of the united kingdom, but people are angry.