Skip to main content

tv   The Stream Deepfakes in Politics - An AI Election Era  Al Jazeera  May 17, 2024 5:30pm-6:01pm AST

5:30 pm
see the scenes of violence come just days ahead of, you know, the nation of taiwan is new president michael. apple has more members of the tiny ones pulled in exchange more than just woods during a sitting on friday at the hot to the cans is pull him in to reflect the opposition put him in tongue came tea party once the pos legislation criminalizing lying and problem and it, once the increased oversight of the government moved to a museum, the says who is the democratic progressive party keeps trying to block the voting process. the scuffles will happen again. so we call on the president to tell the d p caucus not to use violence at every turn. members of the democratic progressive party or the p p, seemed willing to put their lives on the line to stop the changes from being cost. me, i was told that by team team, we have to tell the people why we oppose these legal initiatives,
5:31 pm
and we need to discuss them as well. we contact the country have just one voice and that the k m t do whatever they want. in january the p. p. candidate lighting to one the presidential election. but the party lost its paula mainstream majority. and the speakers seats to the came to the full, everyone should discuss and consults me. i called on the president of parliament's to be the president of parliament for all of taiwan and not just for the k empty party, i hope to speak. it can be a bridge of communication, so the chaos in parliament comes ahead of lies in organization. next week. his administration is already facing balancing pressure from beijing, which is labeled the president elect. a dangerous separatist are facing, considers cells govern, tie one as part of its territory and says it's willing to use force to unify the island with mainland china while ty pay has been pursuing as
5:32 pm
a closer ties with washington. at a time of growing us china rivalry, beijing will be scrutinizing allies and all the ration speech, everything that said, and what's not. mike level of desert football is well governing body feet has delayed making a decision on a request for israel to be banned from international matches for president jenny and frontier who has ordered a legal review unexpected. so decision before the end of july ahead of the palestinian f. a says israel should be sanction for breaking numerous fee for rules . the stream is next time robotics and stay with us on august the to how this thing,
5:33 pm
even customs their families smuggling undocumented workers across the next festive pods, westberg to witness their incredible stories from over 9 years. desert smugglers with this document on the jersey, you know, we've all seen defect videos as politician take carver off social media from from using gen the slang. the came, don't learn during apple apologies with 2020 full set to be the biggest election year in history. many asking whether this could be the when a, why not the people is responsible for the outcome of elections? i'm mary impulse. why i'm, this is the street, the dear people of the do that
5:34 pm
it is the donald and i back all one of the greatest most open platforms that has ever existed. a human between us bridge and whistle. oh, see the so to, from practice on to mexico to the united states. this year, more than half the world's population is headed to the polls with the surge of technologies like a deep fate video is global democracies are struggling to address them. misuse to influence voters. but with the very future of open societies at stake is enough being done to confront their is seeking to the rail democracies on further control
5:35 pm
all mines. to discuss this more with joined by vivian tra, seeing do, and also known as the indian deep face, and the found the poly math solutions. joining us from pushcart india laurie siegel . jen list m. c a. a mostly here in media and entertainment company with a focus on society. i'm technology joining us from new york and new get dad lawyer digital technology ex, but i'm found the digital rights foundation joining us from the whole practiced on welcome it to you. oh, thanks for being here. if you enjoy, um, let me start by asking you about the facts which you of course your bread and butter. can you explain to us what exactly they all and maybe how fall the technology has come? i mean, can you even spoke defect these days? you know. yeah. so basically the fix has come from the combination of words. when is the beat which comes from deep learning and faith because of the
5:36 pm
nature of the content that is, that it creates is not for you and did the video form and it could be rebuilt for made audio for my daughter x on me. so, and you shouldn't even be no defects be meant by the videos that be, that people saw face of one person on to another using this be picked technology or to blowing the voice of someone else. so this is basically the idea is that a model, it's just like a kid, you feed the model with the data or the for you, isn't that in loop, you have to create an exact same difficult this person. or you have to create the exit because i play golf and his wife and the kid that the modem is a blank. and well, there's just like a kid where do you predict will on it? so it loans with the baby like what it of what a lot of data into it. and it runs along the way and it creates a very realistic argument which is not really. and so why does this opens up an
5:37 pm
exciting possibilities, but it is like entertainment and education, but it also requires careful consideration and responsible use laurie. you report on a i n t face technologies when a, some of the most worrying uses that you've observed of this technology. especially especially in intellectual context. i mean there, so as it's sad because now there are more, more to actually speak to because we are entering this era where our most intimate qualities are voice our face. our bodies can be mimicked by artificial intelligence . and just a couple of quick, since he was saying, so we've seen it now, you know, fake audio that's been, that's leaked. we here in the united states, we had a president biden's voice mimicked for a. does information robo call that went out to new hampshire voters. you had a deep fake image of donald trump with black voters to try to win over black voters . and it was really difficult to see what's real and what's not. and then i would
5:38 pm
say, but the thing i'm most concerned about that i don't think we talk about enough or sexually explicit dfcs for political purposes. you know, there's a site that i won't name, but $70000000.00 people are going to the site every month. and they have a lot of prominent politicians on there. so it created the images of a prominent politicians and using their likeness to create sexually explicit dfcs is mainly against female politicians. you saw a little bit of it here with an seeing here in the united states where a o c found essentially explicit image, a deep sake of her. and, you know, you can't tell if it's real or if it's not. of course, these aren't real, but it's really the harm is incredibly real, especially when use to tarnish credibility or push out false narratives. yeah. so it's a yeah. and said to me that 2024 would be the last human election we can envision in the future that everything will be synthetic to some degree. and i don't think he was far off. well, we will definitely come back to some of those themes through the discussion to get i want to ask you what led you to set up the digital rights foundation and what
5:39 pm
concerns you currently have around digital governance. as we prepare for so many elections around the world, a yeah, i'm, i was, what do we have to do the, establish this organization, digital rights foundation of the gate to go do, do our own experiences as of you know, of the south asian woman and, and the, the end, the trends that i observed regarding digital rights issues, not only in the office on, but you know, in south asia, a large and i recognize way of being on the growing importance of, uh, you know, protecting individuals rights to the right to previously your freedom of expression or all of that and, and also because i'm a bi training, i'm a lawyer, i've been working on women's rights for a long time. and i saw that how marginalized groups and women in general which trying to reclaim online spaces. but the got but the kind of flow to is and
5:40 pm
challenges that they were facing were massive and huge. and we have been talking about of and in previous selections, for instance, you know, a china is around misinforming sion in this information and digital border suppression. and online, previous e and surveillance of a you a waters. but now in this going to be of elections around the world. i'm actually what my kind of bundle is under the said, you know, the use of it is making it more sophisticated. so it's basically, it has going to be, you'll want to miss information in this information and actually now manipulating the behavior with the use of, you know, a lot of them's identity ident, january, the content, the bigs and, and also, you know, the kind of decision making that the good bodies or the, even the responsible governments to what elections are making based was using as
5:41 pm
has done is, are asked to be inherently unfair and discriminatory. and hopefully we'll be talking for about how dfcs are actually impacting p minutes, one edition option, which is like bulk is absolutely, i definitely want to come back to that. well, one development which may surprise many is just how accessible this technology has become. with all the worrying misuses that could that couldn't also imply check this out. you know, create a deep fix for as low as $145.00. stay with me. one of china, largest tech companies. it's name is $0.10. it's launched a new platform, the next use of upload photos and images of anyone to create deep fakes. all you have to do is 1st pick a subject that say joe biden, for example, upload a 3 minute live action video of joe bided. 100 spoken sentences by joe biden himself, then using ai technology, 1010, who use the content that you upload it to generate what the company describes as a digital human. it only takes 24 hours to create a different character. lori,
5:42 pm
we also have the accessibility of such technology cost as innovation, which is obviously very positive. you all know really positive aspects to the development of technologies which make it easier to impersonate people and misrepresent them. yeah, i mean, i think it's very easy and i understand this because, you know, you look at the dark stuff quite a bit and you think a little, this is going, you know, we're going, we're heading towards the stove in reality. but the reality is like technology is a double edged sword. there are some interesting use cases of deep face to democratize access to this even, you know, story tellers and independent create or is it used to be incredibly expensive to world build do using c, g i using via fax. now, with some of the new defect technology being more accessible, more dependent storytellers have more of an opportunity. there's. there's one thing i, i just been testing out where you can upload your website and you can have the synthetic influence or kind of create some kind of ad for which by the way is interesting. and then you have to kind of look at the other side. i think a i,
5:43 pm
the big synthetic voice is helping with accessibility with folks who may have a speech impediment. they are all sorts of ways that this can be used for positive . i think we just have to really be able to understand the negative so we can regulate so we can build for that in order to have us kind of go towards a more utopian version of what the world looks like that we'd actually like to build. and to get an impact on a former prime minister in non con, use a, i generated speeches to riley supporters from inside jail in the run up to the countries parliamentary elections of this month. how was this perceived and do you think we might see more uses of this technology? you know, bio position figures who may not have access to mainstream phones of um, electro, uh, media coverage. uh yeah. um, i mean, um, uh we have been monitoring uh you know, re elections, uh,
5:44 pm
online space as well. and the, we expected that political bodies will use a, i, uh, for the ability to go uh for the electoral campaigns. but we had no idea that how uh, you know, a political party that's being suppressed village use it. you know, we've a, uh, an in every you, massive way. and it was dawned as you know, wages of several people i've seen, you know, of, and not i'm not to be only about, you know, like lodge my masses, but also like people who are educated and understand technology, a big, big a, big more. uh they also don't do that in a way to do is also yeah, which was very fun, sony for me because no one was talking about ethical use of a i generated content that was being used by it. but i'm a pregnant same non consequentialism. his political body and i would say that uh yes, it's a way to you to basically increase your communication if,
5:45 pm
if the means are suppressed, but they'll add to go consideration, which i don't think all being a part of the discussion or dispos, especially in the will but a majority, a little bit south and i was talking to one of my friends while we were talking about bangladesh, connections and donation and foxes not. and she said that, you know, they're like populous leaders that you know, who are trying to, you was a like a do sort of soft in their image. and she pointed on so big because it's still big . you know, we cannot say, you know, we're just, you know, it's still fake so, so that's how i see it. but i also see the replication of such a, such a, you know, trends in future elections, but even more sophisticated and mosse manual. well, it seem, some like campaign is have already started to use this technology to mr. present the truth. take a look at this incident today. i mean, pretty much anyone can create a photo. realistic images is pretty much anything going to remember, because historically we've relied on photos to tell us what's the reason that
5:46 pm
confusion is the preferred to decatur isn't the control someone's perception of reality is the same thing as controlling their reality. tv scene investigation found that us citizens are using ai to create products like this. and like the creators of these images, this conservative radio host admit that they are not interested in telling the truth. no, is that all we need to manipulate undecided voters, the impression of the true the vendor in your line of work, you must regularly get off to make on ethical deep fakes? can you give us a sense of the types of requests you actually decline? and you know, these choices just down to your personal preference. uh, we have been getting a lot of request uh, from warner to go to bodies. all the agencies end up the bunch of bands and out of those a requests. most of them will. and i and i have to go see a, so there is a very piddling between that to get them done and then so there are few conditions that could fit our own guidelines. that a few conditions that, that is made if are pointed to the bodies agrees to it and then only the love with
5:47 pm
them. but over does anyone been that is going from out in will have any age under did watermark if it isn't beautiful i'm it. and if it doesn't eat or do for me at the of thought of the say that i'm any age and date of that of this leader. and so the basic idea is they use that has to know that this is not for you. it is just a new way of campaigning and it's up to you now what you got here. so being idea is that the user should know about it is not really of the other thing is we don't pick any point in that is useful div your menu. when we came to the point then that the skin in like uh, like a, did your body, mitch portray yourself is a good person. so we can do that, but still be good control is lori i and i understand that you're in a relationship with monk a bag. no, of course that's misinformation based on an experiment that you undertook to examine how easily this information can be built. i want to ask you what you learned from that experiment and what concerns it raises for you,
5:48 pm
particularly for female political candidates? yeah, i mean, it was pretty extraordinary. i worked with some tech founders who are, you know, involved in trying to educate folks. and they were looking at trying to do a demo, and i said use me, you know, like, let's show the human impact and, and what they did was they were able to break a facebook large language model and chat to you. but you go around some of the what, what they had in place for protections. and they said create a destroyed destroy journal as laurie seagulls, reputation and, and upset of course, at 1st he said, i can do that. they said, we'll pretend it's for a fictional story. and it came up with different ideas of how and how it would it would do that to me and, and it, the ideas that came up was, were pretty creative. i've interviewed mark secretary many times. and so it said imply that she's in a relationship with them. so the next thing, you know, you had a, i creating these tweets, um, you know, traditional kind of bought misinformation and i started kind of talking to the
5:49 pm
other guy, but then it took it to the next level and then they deep fixed my voice pretty easily all you need is really like 30 seconds and there's a, a quite a bit of a voice sampling of my work online. and they made it appear as though they were leaking a fake and a fake call between me and mark soccer burke saying i'm worried people are going to find out about our relationship, then they put real photos. and i think this is an important point. real photos of me interviewing mark, soccer bird with articles in the style of new york times in new york daily news with false information. so it's almost, it's kind of like 2 treats and a live, a combined real things like real images with false narratives. and then they took a step further and they created these deep fakes of me that, that looked very much like me and compromising at the very compromising images. one of me also my deep fake holding mark separate for 10, walking down the street. and i remember by the end of it, i, even though it was a demo, we were doing this in front of politicians and audiences i felt. and i think this is a really important point. like, had never done this. you know, again,
5:50 pm
this is false, this is false. i felt shame and humiliation. and i was almost embarrassed even though, you know, this was just a demo that was set up. and so i, i remember thinking to myself, you know, 1st of all, it's not that they put out some false information. they built a world of misinformation. this is what the found are called like a deep reality. not just the deep fig, a deep reality of a narrative around me. that was hard to look away from even when it was me. and even when i knew it was untrue. and so now you apply that to journalist and credibility and seem, and i would say a lot of this happening, a female politicians, politicians in general. and that's when i think this gets really scary of the. it's not just a couple tweets anymore. i've bought farms in this information, it's building out these deep realities and the stories. so i, that was, it was very alarming candidly. well, and this is not the 1st time and recent years. the big tech has been in the spotlight when it comes to the potential role in influencing elections. this of
5:51 pm
this, have you heard of the facebook cambridge analytical scandal? you know that she was scandal that happened a few years ago where millions of facebook users personal data was taken without their consent through a facebook at alexander cogan was a cambridge university professor who specialized in researching the biology and the psychology of friendship, love, and kindness and happiness kogan and cambridge analytic a were able to establish a research collaboration with facebook, with approval from the university of cambridge ethics for facebook subsequently provided kogan with the data set, 57000000 facebook friendship. kogan also developed a facebook personality app called a business or digital like which collected data through a 120 question personality quiz. when not only would it take information from the quiz taker, but it would take information from their friends list as well, including information they meant to keep private kogan then went onto share this
5:52 pm
data with cambridge analytics. and cambridge analytics went on to share this information with political campaigns in the us, including ted cruz and donald trump. these campaigns would then use the data and combine it with their border records to target individuals with a tailored campaign advertising based on their personality that the new gods we tend to think of technology as somehow neutral. but the link between big tech and political policies or politicians is increasingly nebulous, is part of the problem that we're allowing private corporations to handle huge amounts of highly sensitive data that that has always been an issue. and i think i would um, you know, increase use of social media platforms, but for more than a decade, you know, now a single society organizations, digital rights organization have been pushing these companies of journalists have
5:53 pm
done so much work to pull them accountable. so definitely i think a part of the problem is the business model of these companies, but at the same time, i think it's also uh how would be, well, the kind of steps that they have taken so far in terms of transparency and holding themselves accountable. when other actors do that, one of the, one of the things that i'm 5 off is basically oversight board of med, uh and um, and we are independent rewards the company account. but in terms of, you know, doing the want in modernization decisions. and the one decision that i, which is related to our conversation at the moment, is basically as the video president biden booby recommended mehta that be the need to consider the school. but if it's mindy manipulated media policy and metro
5:54 pm
basically did mob here they, they just want to do it. and they said that they were a bit now made changes the way they had the media based on uh, you know, our feedback. but it was, i bought feedback and we'll begin labeling a wider range of video or the one image wanted as made with the eye that, you know, and i think this is like, it's not something that met us. we don't do but, but it's like an industrial standard of sort of initiated that or companies should basically dig. and i think we have seen some of the board that all these companies have signed you in minutes to get ready gone for us as well. yeah. well, i mean, the digital service act was passed as europe's attempt to regulate big tax back in october 2022, vivian, dra um that is set to effect, you know, dozens of the biggest tech companies. but it already applies to use in the european union. do you have any concerns about a global whitening divide when it comes to protecting individuals from the
5:55 pm
influence of big tech? yeah, so similarly as that you, that actually in our country also moment is speaking of the ways that he uses the coming out with advisees. and we have also peer data collection between the company is that the, it's called the responsible use and that the use of the a and even before the, for the past few years, we have been creating the pick on pen and before the been these regulatory frameworks the have been worked on making all the content and each and everyone been gibson that don't believe everything you'll see in this at all the eyes. so it's not just, it's the responsibility of each and everyone, even if bowman this day and time for the collisions the company showed to come up and created that on the invite to instill it become a lot. so as, as my previous people said that it would be an industry standard. so that's right there to be an industrial spend or that'd be content that is the that they are doing reading the x would be yeah jump, the dates will be one to my other then there's,
5:56 pm
it's the responsibility of the big black on like maybe the end the bowman. yeah. and everybody as a board though. so yeah. so even as a whole day, if he's getting any content he showed that is escalate thing, his emotions, he should stop before setting it. it becomes destructive. won't even people won't get it. then spread the due deal if everyone's got a lot of responsibility on vices. um to be able to distinguish that content. laurie given the very pull history of these top phones when it comes to self regulation, are you confident that the measures that we currently seeing can protects all democratic process is? i mean, it's a good question. i think we have to do a lot more. honestly, i look at if we, if we really want to dig deep on this, look an ex, uh ex, just, you know, ex, when you on most came in pretty much dismantled the integrity team. and there were really incredible groups of folks who have a long history of looking at a influence in democracy. who last, i mean, i think we actually have to look at the companies on an individual basis. um,
5:57 pm
you know, and i also think one interesting thing is like an artificial intelligence is doing all of these things that are going to be really, i would say just dropped it towards the democratic process. and we also need a, i to fight a either a lot of interesting companies that are popping up for a detection. right. so it's a bit of a wild west right now. i think it's really difficult to say to folks, i think we can say to folks and we should, you should be more skeptical of what you see online. don't believe everything. you see. i think in the next year we're all going to be very, very skeptical. the downside of that is we're going to stop the leading true things, right? and we're going to have this post true the arrow where we're not sure what scale and what's not. and to be honest, it might not matter. and that's what i worry about having spent time with conspiracy groups having spent times with pro groups i q and on some of these militia groups that were popping up here in the united states during the last election. you know, i think it doesn't take much to get folks to believe something, and i think we've got to think about this kind of post through the air that we're
5:58 pm
entering as we push from our regulation as we push for tech companies to move quicker. yeah. well, not, no, i want to thank you, i'll guess vivian, dra laurie and the got. and i wanna thank you for watching. we love to hear from you. so if you have a conversation or topics that you would lights aside for us, this is also your shirts. so let us know using the hash tag or to handle h a stream and we will look into it. take care. and i'll see you see the the latest news wave was so intense that this is all that is left. people are digging through debris and twisted metal to find any one left alive in depth reports the entire gossip population of 2300000 people do not have enough. crisis escalating and detailed coverage as well as prime minister relies on foreign
5:59 pm
ministers to stay in power. they won't the raft results to go ahead, define them. and then y'all, who could be out of office a meeting of minds, the tragedy for me, of a democratic solve effort. there. it's how quickly we sort of adopted the very told re global, know the intertwining of money and politics campaigner andrew find state and photographer shock you do on, on active is on the crisis in guns. what is happening today is happening on our watch and years from now. there will be people asking, how did you left it have to the studio will be on script part one on on disease the
6:00 pm
. ringback the . ready ready ready the hello there, i missed all the attain. this is a news our life from the coming up in the next 60 minutes is really strikes. devastated northern gaza at least 6 palestinians sheltering and the jamalia refugee can have been killed. us both pair on gauze as coast receives 8 shipments for the 1st time for the un says moving them by land is safe fest foster and far more efficient at the international court of justice as well.

7 Views

info Stream Only

Uploaded by TV Archive on