The Stream : Al Jazeera : April 17, 2023 10:30pm-11:01pm AST
10:30 pm
Italian police have found two tons of cocaine floating at sea off the coast of Sicily. The authorities say the drugs were stored in some 70 waterproof packages held together by fishermen's nets and equipped with a luminous signalling device. The trove is one of the largest confiscated in the country; police estimate that it's worth about 400 million euros. SpaceX has called off the first launch attempt of its giant rocket after a problem cropped up during fueling. The countdown was halted at the 40-second mark because of a stuck valve in the first-stage booster. Crowds had gathered to watch the highly anticipated launch in Texas, near the Mexican border. The company plans to try again on Wednesday. SpaceX plans to use Starship to send astronauts and cargo to the Moon, and then on to Mars.
10:31 pm
This is Al Jazeera; these are the top stories. The UN says more than 180 people have been killed in three days of fighting between rival forces in Sudan, and around another 1,800 have been injured. The UN envoy to Sudan says the body will continue efforts for further humanitarian pauses in the fighting, but he added that the warring parties are not giving the impression they want mediation for peace between them immediately. UN Secretary-General Antonio Guterres has condemned the violence in Sudan. He said the humanitarian situation there was already precarious and is now catastrophic: "The situation has already led to horrendous loss of life, including many civilians. Any further escalation could be devastating for the country and the region, and I urge all those with influence over the situation to use it in the cause of peace, to support efforts to end the violence, restore order, and return to the
10:32 pm
path of transition. The humanitarian situation in Sudan was already precarious, and it is now catastrophic." A Russian opposition activist and prominent Kremlin critic has been sentenced to 25 years in prison by a court in Moscow. Vladimir Kara-Murza was convicted of treason and of denigrating the military. World leaders and supporters immediately condemned the verdict and called for his release. Kara-Murza was arrested a year ago when he returned to Moscow after denouncing Russia's invasion of Ukraine. He has previously survived two suspected poisonings, which he blames on the Kremlin. And there's anger on the streets of France after President Emmanuel Macron's address to the nation. It was his first speech since his government's controversial pension changes became law. He said he understands the anger of the population and promised he'll hammer out an action plan to get the country back on its feet in the next 100 days.
10:33 pm
He signed the legislation on Saturday. Those are the headlines. As always, you can catch up on aljazeera.com, which has the latest on all our stories, and more details there, of course, on events in Sudan. The Stream is up next, and I'll be back straight after that with more news. Do join us if you can. Bye for now. Talk to Al Jazeera: we ask, who is really fighting this war for Russia? Is it Wagner, or is it the Russian military? We listen. We meet with global newsmakers and talk about the stories that matter, on Al Jazeera. Hello, and welcome to The Stream. What you're watching is Josh Rushing, generated by AI. On today's show, we look into the dangers of
10:34 pm
AI-powered disinformation and how it could be used to manipulate the public. Fears of a new wave of disinformation loom as sophisticated AI programs become increasingly adept at making images and text. How could they be used to distort our sense of reality, and what's being done to protect the public from misinformation and malicious actors? And what you're watching now is the real Josh Rushing. Or is it? Well, that's the point of today's show. So let's talk to some experts about that. With us to talk about the future of AI and disinformation: from Cambridge in the UK, Henry Ajder, an expert broadcaster and synthetic media researcher; in New York, Sam Gregory, executive director of Witness, an organization that looks at the intersection of video and human rights; and in Austin, Texas, Audra Nadeem,
10:35 pm
the chief operating officer at an artificial intelligence company. Hey, one more guest at the table: that's you. So if you're watching this on YouTube, we've got a live producer right there in the box waiting to get your questions to me so I can get them to the experts. So why don't we do this together, right? OK, Sam, can you start us off? I want to talk about the letter that came out today, that's news, but let's save that for just a second. Because what gets me is, it seems like, I mean, weeks ago ChatGPT was like the thing, and then a couple of weeks later we're already at GPT-4, and things seem to be changing really fast. And I know the fear with AI is that you've crossed a tipping point before you realize it. So can you just kind of catch up, maybe, people who aren't obsessed with this like me: where are we right now in AI development, and why is this important? So we've had 35 years of really foundational research that started to create the
10:36 pm
ability to do things like creating deepfakes, like the face of someone who never did something; started to create the ability to have interactions with chatbots that gave you apparently human answers; started to make it easier to falsify or clone someone's voice. And what's happened in the last year is we've seen this very rapid progress of these tools getting into the public eye and becoming accessible. So since the summer of last year, we've gone from tools like one called Midjourney, which came out in July 2022 and last week created an image of the Pope in a puffer jacket, or an apparent arrest of President Trump. And in parallel we see the advent of tools like ChatGPT that apparently allow you to communicate with what appears to be someone who is giving you an answer and maybe doesn't seem like a machine, but is actually a machine based on a large language model. Right, right. I want to bring in a video comment. This is from a professor at Princeton University,
10:37 pm
Arvind Narayanan. Check this out. "There is an open letter making the rounds today, calling for a moratorium on new AI. Unfortunately, this is a distraction from the real issues. The idea that AI is going to break free and kill us, that's sheer speculation. The real harms actually happening are totally different. Companies train these models on the back of people's data but give them nothing in exchange. We don't have to solve that today; we don't need six months: tax companies and compensate the artists. Unfortunately, we don't seem to have the political will to consider anything like that." OK guys, so I want to show the audience my laptop. This headline says Getty Images is suing Stable Diffusion for a staggering $1.8 trillion; that's $150,000 per image for all the images they took. So this is what he was talking about in the video comment, this kind of internet scraping. Henry, you can touch on that,
10:38 pm
but he began his comment by mentioning the letter that came out today, the one Elon Musk was also a part of. Can you bring us up to date on that letter, please? And then maybe you could touch on this idea that he brought in. Yes, certainly. So this letter was published today. It was an open letter featuring, I believe, over a thousand industry experts, researchers and academics, including, as you mentioned, Elon Musk, but also people like Jaan Tallinn, the co-founder of Skype, essentially asking for a moratorium, for the brakes to be put on AI development, particularly in the generative space, with a focus on models like GPT-4, but also moving on to other areas, as you mentioned: Midjourney, text-to-image, and so on. And so this has really come off the back of the massive arms race we're effectively seeing, where, as I mentioned, within the space of less than a year Midjourney is on version 5, and all of the big tech companies are piling in to try and build better and more accessible models. And perhaps, you know,
10:39 pm
the concern is that there aren't enough proactive safety and ethics considerations going on here, and the fear is it will all be reactive, too little too late. With regard to the comment from your commenter there around training: he is indeed right that these models are voracious. They consume immense amounts of data to get to the state that they're in. We're talking hundreds of millions of photos, and indeed most of the web being scraped in the context of text. And indeed many of the companies that license stock imagery, and many artists, are saying: look, this is a violation of our copyright. It's a controversial issue, with some people saying, well, look, they're not actually copying, they're sort of taking inspiration. But others are saying: look, we need to update copyright to make it fit for purpose in this new age of synthetic content, and using data in this way, without permission, without royalties,
10:40 pm
is something which we can't accept moving forwards. Audra, you're at an AI company. Are you in an arms race, and do you think there needs to be someone outside of the companies developing this stuff, stepping in with some kind of regulatory measures? Definitely. So we work on text-to-video, which is kind of the next big wave of what you're going to see being generated by generative AI. I think it's going to have to be a public-private partnership. Leaving it to the companies racing against each other is never the solution, and neither is simply imposing regulation punitively, because I don't think DC, or any of the politicians, really grasp what the technology is. And on the other hand, a lot of the people working on it might not fully understand the social and political aspects of what's going into it. So something like a public-private partnership that puts some sort of almost constitutional-law kind of framework in
10:41 pm
place. And these can vary country by country, or by geographic boundary, in how we come up with solutions. That, at least for me, is going to be the best way to go about it. I want to bring in another video comment. This is Jesse Lehrich; he's the co-founder of Accountable Tech. "The rapid escalation of the AI arms race really underscores how far behind the US has fallen when it comes to regulating tech. We've done nothing to grapple with the catastrophic societal harms social media giants have unleashed over the past 15 years. We don't even have a federal privacy law in place. And now these companies are trying to capture the next market, rushing out half-baked generative AI tools that pose entirely novel threats to the information ecosystem. We're not prepared to grapple with them, and we do need accountability; that's the job of government. But in the short run, we all need to re-learn our critical thinking skills and approach content with
10:42 pm
a healthy skepticism, especially if it seems to confirm your existing biases." So, Sam, I'm checking in on our YouTube audience right now, and it seems like the tone is fairly scared and dystopian. Is there real reason to be fearful here, or is this being blown out of proportion? I think there is real reason to be fearful. I've spent the last five years working globally on this issue, talking to people who already face these problems, right? Attacks on them with fake images; attacks using non-consensual sexual images of women, which is the biggest way people use these synthetic media deepfake models already. And in all of those five years before this letter came out, folks were saying: we need action, we need to think about how this is going to be controlled, we need to think about the safeguards that are going to be built into these systems. So I think it's right for us to be fearful. Now, at the same time, we also have to recognize that when we create a hype cycle around this, it benefits certain people. It benefits people who get away with saying you can't
10:43 pm
trust what you see online. It benefits big companies. When we say, let's put a moratorium on development, it benefits the incumbents, like Jesse said. So I think we have to be fearful, but also really think about the harms that we already know are going to be generated here. Let's not think about hypotheticals. That letter, for example, from the Future of Life Institute, was very focused on big-picture future hypothetical harms, when we already know the harms from what's happening now: the way in which social media is playing out, the way in which mis- and disinformation are playing out. So it's time to respond to those needs right now, rather than play to hypothetical fears further down the line. Henry pointed out rather astutely, though: who is it that's going to step in? Because I look at Congress in the US, and what their average age is, and this is technology that I don't think they fully get. So who can step in as regulators? Who should step in and regulate this? Yeah, I think the TikTok hearing the other day
10:44 pm
was another example, like the Zuckerberg hearing a few years back, that highlighted some of the ignorance, the lack of knowledge, around emerging technologies. And regulation is a really tricky one. I mean, we have players around the world who are considering, and are actively working to implement, legislation. Notably, in the EU, the EU AI Act would classify many of these generative technologies as what they call "high risk," which would then put further measures in place for creators of these tools in terms of how they source data and how they disclose the outputs of their tools to audiences and to people online. But then we also have countries, like China, who have introduced this year the deep synthesis law, which takes it a step further and says: we're going to actively moderate the content in terms of whether it's fake or real
10:45 pm
, and we get to say whether you are essentially a criminal for publishing fake content, which is perhaps a little bit too far, considering their track record on censorship. But I think your commenter, again, is right about where governments currently are. You know, Sam and I have been warning for years about this problem, trying to get legislation going and to get key stakeholders engaged. And, as is often the case, it takes a real shock, it takes moments like these, to wake people up. And I think the US, and the UK as well, have a hell of a lot of work to do to get in line here. Yep. And Henry, is it country by country? Can you trust a country-by-country solution for this? Because the internet seems to know no boundaries that way. Well, I'm not entirely sure what the alternative would be. I mean, the UN is the only body that I'm aware of that could potentially try to get all of its member states involved in some kind of draft legislation to cover this kind of action. But again,
10:46 pm
look at the dynamic between China and the US around chip manufacturing and the arms race, not just between companies but also between countries. I think it's very unlikely that you're going to get that kind of international consensus built that way, unfortunately. Which leads to the difficult challenge of countries trying to balance innovation with safety, which is a tough balance to strike. Let me jump in there. I think there's also a different framework that's required in how we think about it, because there are real dangers. So, for example, people get phone calls where someone has cloned the voice of somebody that they know. Educating people on having safe words, or having these secret phrases within their family, and on not sharing information on the phone with a stranger, even if they do sound like somebody you know: there's a whole conversation that needs to happen in terms of security.
10:47 pm
Also, there is revenge porn, and a lot of this is used to abuse women; more education needs to go into how the police deal with it today and how easily and quickly complaints can be made. So there are immediate things that need to be done, and then there are more policy-wide things. But at the same time, I think a huge responsibility also lies with the people building these technologies: how you philosophically deal with it, and, for example, the creators whose data you're using. How do you incentivize those creators? How do you pay those creators? How do you create more opportunities for them to build things they can monetize? A lot of those conversations are happening, but because of all the fear-mongering around it, and I agree with Sam, there's a lot of money to be made in fear-mongering and in "it'll get into the hands of the wrong people," we keep talking about that, and these images, and the more philosophical questions don't get enough of a place at the table. And I think we need the frameworks for those way
10:48 pm
more than we need to worry that the machines are going to take over the world, because we're nowhere near that. I want to stop the conversation here for a second, guys, because I want you to expand on something you said at the top, to make sure our viewers hear it. Explain this idea that you need to have a safe word with your family. Like, why? What are you talking about? I want people to get this. Yes. For example, there are capabilities available now where my voice can be used for a phone call, and somebody can call my dad and be like: hey Dad, I'm in this emergency, can you give me your bank account details, or can you give me your credit card information so I can pay for X, Y and Z? That's like the simplest scam ever. And what we need to have as a family is this conversation: OK, if this happens, what's the first question that you're going to ask? That question stays within all of us; you don't write it anywhere, don't put it on your phone, don't put it on a Google Doc. And that way,
10:49 pm
if you do receive one of these phone calls, you can check. Again, a lot of this goes back to very simple things, like how not to get conned one-on-one. But I think the need to have these conversations with our kids, with our families, is there, and we don't have enough of those conversations. Sam, you wanted to jump in. Yeah. You know, one of the things that I think is really important is that we don't place too much pressure on individuals to handle this. Just last week we ran a meeting with 30 activists and journalists from across Africa, talking about this and looking at responses to these types of AI-generated images. And when we looked at images, we weren't able to necessarily discern that they were generated. What that tells us is that instead of trying to put the pressure on each of us, for example, to look at these pictures we see online and try to guess whether they're manipulated, we really need to think about the pipeline. And this is where placing the responsibility on tech companies and distributors matters. Like, if it's
10:50 pm
a tech company that's building one of these tools, are they placing the signals within it that enable you to see that it was made with a synthetic tool? Are they providing watermarks that show it was generated with a particular set of training data? Are those durable across time, so that they're available to someone watching it later? Because otherwise we end up in this world where we start getting skeptical about every image: we look really closely, we try to apply all our media literacy, and it undermines truthful images, right? And that's the biggest worry: that we create a world where we can't trust anything, because we create that assumption. And if we place that pressure on individuals, particularly individuals around the world who don't have access to sophisticated detection tools, who exist in environments where they're not being supported by the platforms very well, then we're doomed to be in a very tough situation. So let's not place the pressure exclusively on us as the viewers; we really need to push the responsibility up the pipeline, to the technology developers and the distributors.
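To make the durability point concrete, here is a minimal sketch in Python, using the Pillow imaging library, of the weakest kind of provenance signal: a plain text tag written into a generated image's PNG metadata. The tag name and generator string are hypothetical, not from any real standard, and a simple re-save strips the tag entirely, which is exactly the fragility Sam describes; durable approaches, such as cryptographically signed manifests or watermarks embedded in the pixels themselves, aim to survive this.

```python
# Minimal sketch: a metadata-based provenance tag, and why it is not durable.
# Requires Pillow (pip install Pillow). Tag name and values are hypothetical.
from typing import Optional

from PIL import Image
from PIL.PngImagePlugin import PngInfo

TAG = "ai-provenance"  # assumed tag name, not part of any real standard


def tag_as_synthetic(src_path: str, dst_path: str) -> None:
    """Stamp a provenance tag into a PNG's text metadata."""
    meta = PngInfo()
    meta.add_text(TAG, "generator=example-model-v1; synthetic=true")
    with Image.open(src_path) as img:
        img.save(dst_path, pnginfo=meta)


def read_provenance(path: str) -> Optional[str]:
    """Return the provenance tag if present, else None."""
    with Image.open(path) as img:
        return img.info.get(TAG)


def strip_by_resave(src_path: str, dst_path: str) -> None:
    """A plain re-save drops the text chunk: the durability failure."""
    with Image.open(src_path) as img:
        img.save(dst_path)  # no pnginfo passed, so the tag disappears
```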
10:51 pm
I want to share something you're talking about here. This is a story about "the Great Cascadia earthquake of 2001." It has pictures of scared families while it's happening; the city is destroyed. I mean, look at some of these, right? This earthquake never happened. And if you really look at this photo, which is a great photo of someone on a battered beach holding a Canadian flag, and you zoom in on it, the hand is backwards on the right-hand side, right? But this looks so real. And what I'm really concerned about, Henry, is: here in the States we saw how misinformation and disinformation affected our election in 2016, and I think it affected Brexit there as well, with Cambridge Analytica. What does Cambridge Analytica look like in AI for future elections? Yeah, it's a good question. And just very briefly, to echo Sam's comment: I get a lot of journalists reaching out to me over the last few days saying, can you tell us how to spot a deepfake as an individual? And I kind of have to caveat everything I say by saying:
10:52 pm
you know, look, these images, like the ones you just showed, are getting better and better. If you look at the images Midjourney was generating back in July of last year, the new ones coming out at the moment are leaps ahead in terms of realism. And so if we put the burden on the individual to look for tell-tale signs, like the hands, which are improving all the time, it's going to give people false confidence that they can detect something, when actually it might be trained out of the models by the new versions. The election context is a really interesting one. And, you know, again, Sam and I have been hearing, every single election in the US, midterm or presidential: this is going to be the one where deepfakes cause chaos, right? A video is going to leak, or an audio clip is going to leak, the night before the election, and it's going to swing it, or it's going to cause chaos. I'm wondering, as you're talking; we're showing the Trump-getting-arrested photos, just so you know. Go ahead. Yes, yes, like that image,
10:53 pm
which luckily didn't fool many people, in contrast to the image of Pope Francis, which we may get on to. But yeah, I think this election will be different. Not necessarily because of that kind of worst-case scenario, an indistinguishable fake that even the media and experts, such as myself and others, can't detect. I think it's going to be different because we're going to see just a mass proliferation of this content, as we're seeing right now. You know, there are videos online of all of the presidents, with their voices cloned, playing Mario Kart together, and it's really quite convincing on one level, right? And there are a lot of images, as you showed, of the presidents in these kinds of kooky scenarios, a lot of memes, a lot of artistic content. And then, as you said, some lower-scale disinformation. Luckily, most of it is detectable right now, but the direction of travel and the speed of advances mean that's not something we can be as sure about as we have been in previous elections,
10:54 pm
that it won't have a serious impact, won't really confuse people, and won't potentially play into the kind of fractured information ecosystem that we are currently experiencing, as you mentioned, in the US and here in the UK. Sam, if you want to jump in. Yeah, it feels like there's a big shift that's really important to name, around the ability to do this at volume, right? This has been quite niche before. And then it's potentially also personalized: we've already seen the sense in which you can ask ChatGPT to do something "in the style of," right? So when people have done research on this, they look at this ability to create volume, and the ability to personalize, which of course can be used to target people. And now we're seeing this commercialized and made available to everyone. And so all of those make this a more vulnerable election, to organized actors, but also to lots of people who just want to have fun with the process. So we have to be careful of all of those. I also want to name that one of the biggest threats we've seen, in elections but elsewhere too, is, again, non-consensual sexual images. And we don't always know they've
10:55 pm
been used, and they are used to target women politicians, women journalists, women activists. And so that's definitely going to be happening under the surface, even if we don't see it happening visibly. So we should name that as a threat that doesn't get the headlines the way disinformation does, but is equally harmful to civic space. Yeah. YouTube is chiming in here. I'm looking at someone named Anti Cage Taro Insights; one of my favorite parts of the show is when I use people's YouTube handles and their kind of crazy names. But she has real concerns, and says: my fear of AI is that folks say it is neutral, but it's very racist, because of contracts with police and the military (I don't know if that's true), and that racism is going to be recreated in art and writing. I'm an artist; there's a lot of work that can be created in record time. Audra, can you talk for a moment about the way race, gender and identity might play into what we're getting from AI here? Yeah,
10:56 pm
definitely. So I think that's partly true and partly not. There are biases built into things like ChatGPT; there was that example where it would say all the nice things about Biden but not about Trump, and people at the back end are fixing these every single day. Again, it's humans that train it, so you are going to see somebody's biases in the data. But as more and more new versions of these models come out, there are different layers, almost like stages, being built where you can dictate what kind of voice, what kind of opinion, you want it to have. So, for example, if you want it to be Shakespearean, or if you want it to be more liberal, whatever you want it to be, you can change it and get opinions more like that. So as more and more of these models get trained, we're going to see more optionality for people to generate the things they want. And again,
10:57 pm
can it be used to create harmful content? A hundred percent. But the biases that we see are added in, built in, by the humans training the model; the models themselves are not biased, they reflect humanity. And Henry, I'm going to ask you a soft question now and give you 60 seconds to answer, because the show is going to end whether you do or not. My son said, you know, they said that chess was uniquely human and computers would never be able to beat humans at chess, and of course they did, and made it look easy. What does this tell us? What does AI tell us about being uniquely human? Is there something we learn about humanity from this? Oh, well, thank you for the generous 60 seconds. Yeah, that's a really tough question. I think, look, we've seen time and time again AI able to replicate certain aspects of human performance and capabilities in quite narrow domains. And this, I think, is one of the first cases that really flipped the narrative on its head: that, you know,
10:58 pm
low-skilled, repetitive, narrow jobs were going to be replaced by AI, and instead we're seeing creative industries getting really shaken up by this. At the same time, I do think that AI is not able to be fundamentally creative in the way we talk about humans being. I don't think there is an intentionality there, a consciousness which is acting on the world. So, yeah, some aspects can maybe be replicated, but not that. And that's it. Henry, Audra and Sam, I want to thank you for being on the show today, and all the humans that joined us out there watching. We'll see you next time on The Stream. The first commander of the Lebanese army after independence from France, who took over as president at a time of crisis in a deeply divided country. Fifty years after his death,
10:59 pm
Al Jazeera World tells his story: architect of the modern Lebanese state. Soldier, statesman, on Al Jazeera. A meeting of minds: "It's a deadly word, 'freedom,' which is used to cover so many different things." "The funny thing about mainstream economics is, if that happened in any other profession, they would all be fired." "Yeah, or not just fired, but put in prison." Musical innovator Brian Eno meets renowned economist Ha-Joon Chang. Part one: "I see a lot of hope. I see a lot of experiments going on." Studio B: Unscripted, on Al Jazeera. As people in Gaza prepare for Ramadan, the excitement is mixed with tension for many Gazans, since the Muslim holy month is a time to come together with family and friends to break fast and share meals. But for some, the memories of conflicts and wars during which they've previously celebrated are
11:00 pm
casting a shadow. What you see behind me is called the colored neighbourhood by the people here. They painted pictures on the walls of their homes to celebrate and welcome the coming of the holy month of Ramadan. Life has been unstable in Gaza for many years, with ongoing conflict and frequent outbreaks of violence. The international community has repeatedly expressed concern about the rising tensions in the occupied territories. The United Nations has also expressed concern about the potential impact of any conflict on those in Gaza, whose lives are already precarious. Hello, I'm Lauren Taylor with the top stories on Al Jazeera. The UN says more than 180 people have been killed in three days of fighting.