Skip to main content

tv   The Stream Deepfakes in Politics - An AI Election Era  Al Jazeera  May 19, 2024 5:30am-6:01am AST

5:30 am
or there's got to see them together on the contain trail gonzales, a little known but well respected former diplomat was chosen by the opposition in april after venezuela. supreme court upheld a band against my child despite the landslide victory and the opposition primaries gonzales promised. that if elected, he will insure political rights in the country and urge the military to uphold the constitution ought to you and this i will get a meeting and will be able to do so. as to the leaving the government, i guarantee a peaceful transition in which all cold cold forces will be able to exercise their rights within the framework of the constitution to the national them forces. you play a fundamental part and security of all of us being guarantors or institutions under constitution. recent surveys show that more than 50 percent of voters back gonzales and 2 thirds one to see change in
5:31 am
a country reeling from. here's an economic and political crisis. opposition figures accused and my daughters government of consequence abuses, illegal, arrests, an intimidation against deal position and making it difficult for people to register to vote on thursday. the government, close the career with telling the city of victoria to prevent the position figures from staying there before saturdays event president. my daughter is also taking to the streets to campaign for his re election poll says he faces an appeal battle in a vote that and the socialist party is 25 years in power. but many are wondering if the regina willing deeds allow fair elections in hand over the offices of state to someone else. alison, the get the address either. well, of heavyweight boxing has a new champion that ukraine's alexander who sick of schooled a narrow split decision victory, the other britons tyson ferry,
5:32 am
and re add the 37 year old is the 1st books that hold or full major heavy weight, both at the same time fury had to take a standing count in the 9th round and was judged to have struggled in the last 3 rounds, with the sick edging him, mouse on the judges, school cods. when makes it was take the 1st undisputed champion. since lennox lewis says rain ended in the year 2005 on that story and support that'll come off a little bit later on. news continues here on out you. they are off to the stream which is coming out next. there is no channel that covers world views like we do as a rolling correspondence. i am constantly on the goals covering topics from politics to environmental issues like nothing ever seen.
5:33 am
what we want to know is how does these things affect peoples? we revisit places day, even when they're no international headlines. houses are really invested in that. and that's a privilege. as a journalist, we've all seen defect videos as politicians take carver off social media from from using gen the slang. the kim, don't, we're not doing apple apologies with 2020 full set to be the biggest election year in history. many asking whether this could be the when a why not the people is responsible for the outcome of elections primary and false . why? and this is the stream, the do people of the do that it is d donald and add back all one of the greatest most open platforms that has ever existed. a human
5:34 am
between us bridge and whistle. oh, see the out. so to, from practice on to mexico to the united states, this year, more than half the world's population is headed to the polls. with the search of technologies like a deep fate video is global democracies are struggling to address the misuse to influence voters. but with the very future of open societies at state is enough being done to confront, there is seeking to the rail democracies on further control all mines. to discuss this more with joined by vivian tra, seeing do, and also known as the indian deep face. and the found the poly math solutions.
5:35 am
joining us from push call india, laurie siegel, john list m. c a, a mostly here in media and entertainment company with a focus on society. i'm text on the g joining us from new york and new get dad, lawyer digital technology x. but i'm found the digital rights foundation joining us from the whole practiced on welcome it to you. oh, thanks for being here. vivian. john. um, let me start by asking you about the fakes which you of course, your bread and butter. can you explain to us what exactly they all and maybe how fall the technology has come? i mean, can you even spoke a defect these days? you know. yeah, so basically, but the fix has come from the combination of words. when is the beat which comes from deep learning and faith because of the nature of the content that is that it creates is not for you and did the video home and it could be in vehicle made audio for my daughter. excellent. so,
5:36 am
and you shouldn't even be no defects be meant by the videos that be that people the soft face of one person on to another using this be picked technology or to blowing the voice of someone else. so this is basically, the idea is the, the model, it's just like a good your feed, the model with the data or the for you, isn't that him, look, you have to create an exact same that because this person is, you have to create the exec deliberately golf, his voice and the kid that the modem is a blank and well, there's just like a kid where do you predict will on it? so it loans with the baby like what it of what a lot of data into it. and it runs along the way and it creates a very realistic on demand, which is not really a. so why does this opens up an exciting possibilities, but it is like entertainment and education, but it also requires careful consideration and responsible use laurie. you report
5:37 am
on a i n t fake technologies when a, some of the most worrying uses that you've observed this technology, especially especially in intellectual contacts. i mean, they're so, as it's sad because now they're more or more to actually speak to because we are entering this era where our most intimate qualities are voice our face. our bodies can be mimicked by artificial intelligence. and just a couple of quick says he was saying, so we've seen it now, you know, fake audio that's been, that's leaked. we here in united states, we had a president biden's voice mimicked for this information robo call that went out to new hampshire voters. you had a deep fig image of donald trump with black voters to try to win over black voters . and it was really difficult to see what's real and what's not. and then i would say, but the thing i'm most concerned about that i don't think we talk about enough or sexually explicit dfcs for political purposes. you know, there's a site that i won't name,
5:38 am
but $70000000.00 people are going to the site every month and they have a lot of prominent politicians on there. so it created the images of prominent politicians and using their likeness to create sexually explicit dfcs is mainly against female politicians. you saw a little bit of it here with an seeing here in the united states where a, a c found essentially explicit image, a deep sake of her. and, you know, you can't tell if it's real or if it's not. of course, these aren't real, but it's really the harm is incredibly real, especially when used to tarnish credibility or push out false narratives. yeah. so it's, um, yeah. and said to me that 2024 would be the last human election we can envision in the future that everything will be synthetic to some degree. and i don't think he was far off. well, we will definitely come back to some of those themes through the discussion to get i want to ask you what led you to set up the digital rights foundation and what concerns you currently have around digital governance. as we prepare for so many elections around the world or yeah, i'm, i was,
5:39 am
what do we have to do is establish this organization digital rights foundation. i think a to go do do our own experiences as a, you know, a little south asian woman and, and the, the end, the trends that i observed regarding digital rights issues. not only in the office on, but you know, in south asia at large. and i recognize way of being on the growing importance of uh, you know, protecting individuals rights to the right to previously. you are putting them on the expression, all, all of the and also because i'm a bi training, i'm a lawyer. i've been working on women's rights for a long time, and i saw that how marginalized groups and women in general, which trying to reclaim online spaces but the got but the kind of flow to is and challenges that they were facing were massive and huge. and we have been talking about of a, in, in previous connections, for instance, you know,
5:40 am
a trying to do that. i'm missing information and this information and digital, what does oppression and online previously, and surveillance of a you a waters. but now uh, in this going to be of elections around the world. i'm actually what my kind of bundle is under the said, you know, the use of a i is making it more sophisticated to it's basically it has going to be you'll want to be simple mation. and this information is actually now manipulating the behavior of the use of, you know, a lot of them's identity ident generated content, the bigs and, and also, you know, the kind of decision making that the good bodies or the even the responsible governments to what elections are making based was using as has done is, are asked to be inherently unfair and discriminatory. and hopefully we'll be talking for about how these dates are actually investing. payment is the one
5:41 am
edition option, which is like bulk is absolutely, i definitely want to come back to that well, one development which may surprise many is just how accessible this technology has become. with all the worrying misuses that could, that couldn't also imply check this out. you know, create a deep fix for as low as $145.00. stay with me. one of the time of the largest tech companies is name is $0.10 is launched a new platform. the next use of upload photos and images of anyone to create deep fakes. all you have to do is 1st pick a subject that say joe biden, for example, upload a 3 minute live action video of joe bided. $100.00 spoken sentences by joe biden himself, then using ai technology, $0.10 to use the content that you upload it to generate what the company describes as a digital human. it only takes 24 hours to create a d put character. laurie, we also have the accessibility of such technology cost as innovation, which is obviously very positive. you all know really positive aspects to the
5:42 am
development of technologies which make it easier to impersonate people and misrepresent them. yeah, i mean, i think it's very easy and i understand that because, you know, you look at the dark stuff quite a bit and you think the little this is going, you know, we're going, we're heading towards the stove in reality. but the reality is like technology is a double edged sword. there are some interesting use cases of deep face to democratize access to is even, you know, story tellers and independent create or is it used to be incredibly expensive to world build do using c, g i using via fax. now, with some of the new defect technology being more accessible, more dependent story tellers have more of an opportunity there's. there's one thing i, i just been testing out where you can upload your website and you can have the synthetic influence or kind of create some kind of ad for which by the way is interesting. and then you have to kind of look at the other side, i think a i, the big synthetic voice is helping with accessibility with folks who may have a speech impediment. they're all sorts of ways that this can be used for positive.
5:43 am
i think we just have to really be able to understand the negative so we can regulate so we can build for that in order to have us kind of go towards a more utopian version of what the world looks like that we'd actually like to build. and to get an impact on a former prime minister in non con, use a i generated speeches to rally supporters from inside jail in the run up to the countries parliamentary elections of this month. how was this perceived and do you think we might see more uses of this technology? you know, bio position figures who may not have access to mainstream phones of um, electro, uh, media coverage. yeah, um, i mean, um we have been monitoring uh you know, re elections, uh, online space as well as the we expected that political bodies will use a i, uh for the ability to go uh for the electoral campaigns. but we had no idea that
5:44 am
how uh, you know, a political party that's being suppressed village use it, you know, be there. uh, and you know, in that really massive way. and it was dawned as you know, wages of several people i've seen, you know, of and not, i'm not what we wanted about, you know, like large amount masses. but also like people who are educated and understand technology, a big, big a, big more. uh, they also don't do that, you know, they do do is also, yeah, which was very fun, sony for me because no one was talking about ethical use of a i generated content that was being used by it. but i'm a brand new, same non consequentialism. his political body and i would say that, i guess it's a way to you to basically increase your communication if, if the means are suppressed, but they'll add to go consideration, which i don't think all being a part of the discussion or dispos, especially in the will but majority
5:45 am
a little bit south and i was talking to one of my friends while we were talking about the connections in donation and boxes not. and she said that, you know, they're like populous leaders that you know, was trying to, you was a, like, a do sort of soft in the image. and she pointed on so big because it's still big. you know, we could not say, you know, which of, you know, it's still fake so, so that's how i see it. but i also see that application of such a, such a, you know, trends in future elections. but in more sophisticated and boss manual, well, it seem, some like campaign is have already started to use this technology to mr. present the truth. take a look at this incident today. i mean, pretty much anyone can create a photo. realistic images is pretty much anything one to remember, because historically we've relied on photos to tell us, let's read it. the reason that confusion is the preferred to decatur isn't the control someone's perception of reality is the same thing as controlling their reality. bbc investigation found that us citizens are using ai to create products
5:46 am
like unlike the creators of these images, this conservative radio host admit that they are not interested in telling the truth. no, is that all we need to manipulate undecided voters, the impression of the true deandra in your line of work, you must regularly get off to make on ethical deep fakes. can you give us a sense of the types of requests to actually decline and you know, these choices just down to your personal preference. uh, we have been getting a lot of request uh, from warner to go to bodies. all the agencies and the funds have them send out of those a request. most of them will and i and i have to go see a so there is a very peddling between that together and them done that together. so there are few conditions that could fit our own guidelines that a few conditions that, that is made. if the parties agree is to it and then only be love with them. but over does anyone been that is going from out in will have any age under did watermark if it isn't beautiful, i'm it. and if it isn't the audio format of thought to say that i'm any age and
5:47 am
date of that of this leader. and so the basic idea is that use that has been know that this is not for you. it is just a new way of campaigning and it's up to you now what you're getting. so being idea is that the user should know about it. it's not a of the other thing is we don't pick any point in that is used to the menu when we came to the point then that the skin in like, uh, like a, did your body. mitch portray yourself is a good person. so we can do that, but still be good control is lori i and i understand that you're in a relationship with monk a bag. now of course that's misinformation based on an experiment that you on to took to examine how easily this information can be built. i want to ask you what you learned from that experiment and what concerns it raises for you, particularly for female political candidates? yeah, i mean, it was pretty extraordinary. i worked with the tech founders who are,
5:48 am
you know, involved to try to educate folks. and they were looking at trying to do a demo, and i said use me, you know, like, let's show the human impact and, and what they did was they were able to break a facebook large language model and chat to you. but you go around some of the what, what they had in place for protections. and they said create a destroyed destroy journal as laurie seagulls, reputation and, and upset of course at 1st and said, i can do that. they said, we'll pretend it's for a fictional story. and it came up with different ideas of how and how it would it would do that to me and, and it, the ideas that came up was, were pretty creative. i've interviewed mark secretary many times. and so it said imply that she's in a relationship with them. so the next thing, you know, you had a, i creating these tweets, you know, traditional kind of bought misinformation. and i started kind of talking to the other guy, but then it took it to the next level and then they deep fix my voice pretty easily . all you need is really like 30 seconds and there's a quite a bit of a voice sampling of my work online. and they made it appear as though they were
5:49 am
leaking a fake and a fake call between me and mark soccer burke saying i'm worried people are going to find out about our relationship, then they put real photos. and i think this is an important point. real photos of me interviewing mark, soccer bird with articles and the style of new york times in new york daily news with false information. so it's almost, it's kind of like 2 treats in a live, a combined real things like real images with false narratives. and then they took a step further and they created these deep fakes of me that, that looked very much like me and compromising at the very compromising images. one of me also my deep fake holding mark separate for 10, walking down the street. and i remember by the end of it, i, even though it was a demo, we were doing this in front of politicians in audiences. i felt. and i think this is a really important point, like, had never done this. you know, again, this is false, this is false. i felt shame and humiliation. and i was almost embarrassed. even though, you know, this was just a demo that was set up. and so i,
5:50 am
i remember thinking to myself, you know, 1st of all, it's not that they put out some false information. they built a world of misinformation. this is what the found are called like a deep reality. not just the deep fig, a deep reality of a narrative around me. that was hard to look away from even when it was me. and even when i knew it was untrue. and so now you apply that to journalist and credibility and seem, and i would say a lot of this type of female politicians, politicians in general. and that's when i think this gets really scary of the. it's not just a couple tweets anymore. i've bought farms in this information, it's building out these deep realities and the stories. so i, that was, it was very alarming candidly. well, and this is not the 1st time and recent years. the big tech has been in the spotlight when it comes to the potential role in influencing elections. this is of this, have you heard of the facebook cambridge analytical scandal? you know that she was scandal that happened a few years ago where millions of facebook users personal data was taken without
5:51 am
their consent through a facebook at alexander cogan was a cambridge university professor who specialized in researching the biology and the psychology of friendship, love, kindness, and happiness kogan and cambridge analytic a were able to establish a research collaboration with facebook, with approval from the university of cambridge ethics for facebook, subsequently provided kogan with the data set, 57000000 facebook friendship. kogan also developed a facebook personality app called a business or digital life, which collected data through a 120 question personality, when it's, when not only would it take information from the quiz taker, but it would take information from their friends list as well, including information they meant to keep private kogan then went onto share this data with cambridge analytics. and cambridge analytics went on to share this information with political campaigns in the us, including ted cruz and donald trump. these campaigns would then use the data and
5:52 am
combine it with their border records to target individuals with a tailored campaign advertising based on their personality that the new gods we tend to think of technology as somehow neutral. but the link between big tech and political policies or politicians is increasingly nebulous, is part of the problem that we're allowing private corporations to handle huge amounts of highly sensitive data that that has always been an issue. and i think i would um, you know, increase use of social media platforms. but from what, then it'd be gauge, you know, now a single society organizations, digital rights organization. i have been pushing these companies of journalists have done so much work to pull them accountable. so definitely i think a part of the problem is the business model of these companies,
5:53 am
but at the same time, i think it's also uh how would be, well, the kind of steps that they have taken so far in terms of transparency and holding themselves accountable. when other actors do that, one of the, one of the things that i'm 5 both is basically oversight board of med, uh and um and we are independent. we would the company account do, but in terms of, you know, doing the wanted modernization decisions. and the one decision that i, which is related to our conversation at the moment, is basically as the video president biden booby recommended mehta that they really need to consider the school. but if it's minute manipulated media policy and metro basically gave mob on it and they, they just wanted to read. and they said that they were a bit now made changes. the way they handled many people that did media based on uh,
5:54 am
you know, our feedback, the was i bought the back and we'll begin labeling of why don't angel we do or do you want image wanted as made with the i the, you know, and i think this is like, it's not something that met us, we don't need to but, but it's like an industrial standard of sort of initiated that or companies should basically dig. and i think we have seen some of the board that one of these companies have find you in minutes to get ready gone for us as well. yeah, well i mean, the digital service act was passed as it europe's attempt to regulate big tech back in october 2022 vivian, dra um that is set to effect, you know, dozens of the biggest tech companies. but the only applies to use is in the european union. do you have any concerns about a global whitening divide when it comes to protecting individuals from the influence of big tech? yeah, so similarly as that you did it clean up in our country. also,
5:55 am
government is speaking of the ways that he uses the coming out with advisees. and we have also created a goal in between the company is that the, it's called the responsible use and that the use of the at, and even before the, for the past 2 years, we have been creating the pick on thing. and before the been these regulatory frameworks, they have been working, working all the content and each and everyone been there to get from that. don't believe everything you'll see and does that all the guys. so it's not just, it's the responsibility of each and everyone, even if government this taking time for the collisions, the company showed the come up and created that on the invite to instill it become a lot. so as, as my previous people said that it would be an industry standard. so that's right there to be an industrial spend, or that'd be content that is the, that they are generating the x would be yeah. and the dates will be, what am i other than this? it's the responsibility of the big black on to make, made in the moment. yeah. and everybody has a vote. oh, so yeah. so even as a whole day, if he's getting any content he showed that is escalate the thing,
5:56 am
is it motions he should stop before setting it, it becomes destructive, won't even people won't get it and spread the due deal. if everyone's got a lot of responsibility on vices, um to be able to distinguish that content. laurie given the very pull history of these top phones when it comes to self regulation, are you confident that the measures that we currently seeing town protects all democratic process is? i mean, it's a good question. i think we have to do a lot more. honestly i look at if we, if we really want to dig deep on this, look an x x just you know, x when elan must came and pretty much dismantled. the integrity team. and there were really incredible groups of folks who have a long history of looking at influence and democracy. who last, i mean, i think we actually have to look at the companies on an individual basis. um, you know, and i also think one interesting thing is like an artificial intelligence is doing all of these things that are going to be really, i would say just relative towards the democratic process. and we also need a,
5:57 am
i to fight a, i, there, a lot of interesting companies that are popping up for a detection, right. so it's a bit of a wild west right now. i think it's really difficult to say to folks, and i think we can say to folks and we should, you should be more skeptical of what you see online. don't believe everything. you see. i think in the next year we're all going to be very, very skeptical. the downside of that is we're going to stop the leading true things, right? and we're going to have this post through the arrow where we're not sure what's fail and what's not. and to be honest, it might not matter. and that's what i worry about having spend time with conspiracy groups having spent times with for groups i q or non, somebody's militia groups that are popping up here in the united states during the last election. you know, i think it doesn't take much to get folks to believe something, and i think we've got to think about this kind of post truth error that we're entering as we push from more regulation as we push for tech companies to move quicker. yeah. well, not, no, i wanna thank you, i'll guess vivian,
5:58 am
dra laurie and the got. and i want to thank you for watching. we love to hear from you. so if you have a conversation or topics that you would like to aside for us, this is also your show. so let us know using the hash tag or to handle a stream, and we will look into it, take care, and i'll see you soon. the in depth analysis of the days headlines if that was a rough or offensive. where would the people go? people have no place to go. each one of 2000 people has to be displaced these 2 times frank assessments. this is a mass and blow to free speech and freedom of the press informed opinions. you can be somebody that says i'm one of the hostages, october 7th, and return. and i want to start inside story on out to 0. interrogate the narrative is the u. s. has continued support for israel affecting
5:59 am
his global standard. there's no question about it. the united states has effectively complicit the genocide challenge. the rhetoric. yes. the correct but so in the international community, can we also say that dells? the cornerstone of democracy is having a free and open democratic pro upfront without any seems in the gaza strip as is continues. there's a deliberate mission of posting and humanity in western media, and it needs to be questioned, sustain coverage that actively humanizes is really is and actively the humanize of palestinians. this is not the time for doing this kind of weight. tracking those stories examining the journalism and the effect that news coverage can have on democracies everywhere. here at the listing past, we know what's happening in our region. we know how to get to places that others
6:00 am
can know as far as instead of going on. the way that you tell the story is what can make a difference. the cottage in guys, dozens of palestinians have been killed in a series of israeli attacks on refugee camps in the north, as well as the southern city of rough or mental guise of homes and another refugee tom came under attack of these 22 people who've been killed there, the, i don't know about the sooner this is the 0 live, but from doha also coming up. why police and tennessee of crack down or in protest is calling for new election is a member of the war cabinet instructions to resign over postwar plans for garza, the.

0 Views

info Stream Only

Uploaded by TV Archive on