
tv   The Stream  Al Jazeera  April 18, 2023 11:30am-12:01pm AST

11:30 am
away, which the United Nations says is likely to lead to food shortages for children already vulnerable to malnourishment and diseases like cholera and malaria. Well, I think for a country like Mozambique, which is hit year after year by these tropical storms, tropical cyclones, a country where lots of the infrastructure is not particularly strong, like I said, it really shows the importance of investing in climate-resilient infrastructure, which will stand up. It costs a little bit more in the short term, but in the long term the value of that investment pays off. For now, the UN says it needs $809m to respond to the immediate education and health needs of children affected by the most recent flooding. In the long term, they are worried that worsening weather conditions will impact years' worth of education for millions of children. Al Jazeera, Mozambique.
11:31 am
Hello again, the headlines on Al Jazeera. These are live pictures from Khartoum, where rival forces are fighting for a fourth day despite international calls for a ceasefire, and satellite images show extensive destruction to Sudan's airports as well as government facilities. Hiba Morgan has an update from Khartoum: We can hear a lot of fighter jets flying overhead around the central parts of the capital, Khartoum, and we can hear airstrikes being launched towards the vicinity of the presidential palace, where the RSF released footage of their presence, as well as the vicinity of the general command of the army headquarters. Now that has been the scene of heavy fighting over the past four days, with control claimed by both sides. The RSF claims that it has control of the general command of the army, and the army says that that is not true, but there is fierce fighting around the facility, so most of the heavy smoke coming out from the central part of Khartoum is around that area.
11:32 am
The Kremlin has broadcast video showing Russia's President Vladimir Putin visiting two army posts in Russian-controlled areas of Ukraine. Putin spoke to military leaders in southern Kherson before travelling to eastern Luhansk. China's economy has rebounded more than expected in the first quarter of the year; government data shows its GDP rose 4.5 percent between January and March. China's economic recovery has been under scrutiny after it abandoned its strict zero-COVID strategy. Two Israelis have been injured in a shooting in occupied East Jerusalem. Israeli police say the suspected attackers opened fire on a vehicle in the Sheikh Jarrah neighbourhood, where there are tensions between Palestinians and Israeli settlers. Police have raided the main headquarters of Tunisia's biggest opposition party. This came hours after the party's leader, Rached Ghannouchi, was arrested at his home. He is the main opponent of President Kais Saied. Rescuers are still looking for at least four people in the northern
11:33 am
Indian city. They are believed to be trapped under the debris of a collapsed building; around 150 workers were asleep inside the building when it happened. More news is coming up at the top of the hour right here on Al Jazeera, but coming up next, it's The Stream. Thanks for watching, and bye for now. The latest news, as it breaks: over the years Trump has been the target of numerous lawsuits and criminal investigations, but this time is different, with detailed coverage and reports from around the world. It's an indication of how reliant Benjamin Netanyahu is on support from the far right; plans for the so-called national guard suddenly moved forward on Monday.
11:34 am
Hello and welcome to The Stream. What you're watching is Josh Rushing, generated by AI. On today's show we look into the dangers of AI-powered disinformation and how it could be used to manipulate the public. Fears of a new wave of disinformation loom as sophisticated AI programs become increasingly adept at making images and text. How could AI be used to distort our sense of reality, and what's being done to protect the public from misinformation and malicious actors? And what you're watching now is the real Josh Rushing. Or is it? Well, that's the point of today's show, so let's talk to some experts about that. With us to talk about the future of AI and disinformation: from Cambridge in the UK, Henry Ajder, an AI expert, broadcaster and synthetic media researcher; in New York, Sam Gregory,
11:35 am
executive director of WITNESS, an organization that looks at the intersection of video and human rights; and in Austin, Texas, Audra Nadine, the chief operating officer at Opus AI, an artificial intelligence company. Hey, one more guest at the table: that's you. So if you're watching this on YouTube right now, we've got a live producer right there in the box waiting to get your questions to me so I can get them to the experts. So why don't we do this together, right? OK, Sam, can you start this off? I want to talk about the letter that came out today; that's news. Let's save that for just a second, because what gets me is, it seems like just weeks ago ChatGPT was the thing, and then a couple of weeks later we're already at GPT-4, and things seem to be changing really fast. And I know the deal with AI is, the fear is that you've crossed a tipping point before you realize it. So can you just kind of catch up maybe people
11:36 am
who aren't obsessed with this like me? Where are we right now in the development? Why is this important? So we've had three to five years of really foundational research that started to create the ability to do things like create deepfakes, like the face of someone who never did something; started to create the ability to have interactions with chatbots that gave you apparently human answers; started to make it easier to falsify or clone someone's voice. And what's happened in the last year is we've seen this very rapid progress of these tools getting into the public eye and becoming accessible. So since the summer of last year, we've gone from tools like Midjourney, which came out in July 2022 and last week created the image of the Pope in a puffer jacket and an apparent arrest of President Trump. And in parallel we see the advent of tools like ChatGPT that apparently allow you to communicate with what appears to be someone who is giving you an answer and maybe doesn't appear like a machine, but it's actually
11:37 am
a machine based on a large language model. Right, right. I want to bring in a video comment; this is from a professor at Princeton University, Arvind Narayanan. Check this out: There is an open letter making the rounds today calling for a moratorium on new AI. Unfortunately, this is a distraction from the real issues. The idea that AI is going to break free and kill us, that's pure speculation. The real harms actually happening are totally different. AI companies train these models on the back of people's data but give them nothing in exchange. We could solve that today; we don't need six months. Tax the AI companies and compensate the artists. Unfortunately, we don't seem to have the political will to consider anything like that. I guess I want to show the audience my laptop. This headline says Getty Images is suing Stable Diffusion for a staggering $1.8 trillion; that's $150,000 per image for all the images
11:38 am
they took. So this is what he was talking about in the video comment, this kind of internet scraping. Henry, you can touch on that, but he began the comment by mentioning the letter today that Elon Musk was also a part of. Can you bring us up to date on that letter, please? And then maybe you could touch on this idea that he brought in. Yes, certainly. So this letter was published today; it was an open letter featuring, I believe, over a thousand industry experts, researchers and academics, including, as you mentioned, Elon Musk, but also people like Jaan Tallinn, the co-founder of Skype, essentially asking for a moratorium, or for the brakes to be put on AI development, particularly in the generative space, with a focus on models like GPT-4, but also moving on to other areas, you know, as you mentioned, Midjourney, text-to-image and so on. And so this has really come off the back of this massive arms race we're effectively seeing, where, as Sam mentioned, within the space of less than a year,
11:39 am
Midjourney is on version 5, and all of the big tech companies are piling in to try and build better and more accessible models. And perhaps, you know, the concern is that there aren't enough proactive safety and ethics considerations going on here, and the fear is it'll be reactive, but too little too late. With regards to the comment from your contributor there around training, he is indeed right that these models are voracious for data. They consume immense amounts of data to get to the state that they're in; we're talking, you know, hundreds of millions of photos, and indeed most of the web being scraped in the context of text. And indeed many of the kinds of companies that license stock imagery, and indeed artists, are saying, well, look, this is a violation of our copyright. It's a controversial issue, with some people saying, well, look, they're not actually copying, they're sort of taking inspiration. But obviously some people are saying, well, look, you know,
11:40 am
we need to update copyright to make it fit for purpose in this new age of synthetic content, and using data in this way, without permission, without royalties, is something which we can't accept moving forwards. You're at an AI company. Are you in an arms race, and do you think there needs to be someone outside of the companies developing this stuff stepping in with some kind of regulatory measures? Definitely. So we work on text-to-video, which is kind of the next big wave of what you're going to see being generated; generative AI, or text-to-video AIs, is what we call it. I think it's going to have to be a public-private partnership. Letters raising alarms while companies race ahead is never the solution; neither is just putting regulation on it, because I don't think DC, or anybody, any of the politicians, really get the grasp of what the technology is. And on the other hand, a lot of people working on it might not fully understand
11:41 am
the social and political aspects of things going into it. So some kind of public-private partnership, with some sort of almost constitutional-law kind of things in place, and these can vary country by country, or by geographic boundaries, on how we come up with solutions. At least for me, I think that would be the best way to go about it. I want to bring in another video comment. This is Jesse Lehrich; he's the co-founder of Accountable Tech: The rapid escalation in the AI arms race really underscores how far behind the US falls when it comes to regulating tech. We've done nothing to grapple with the catastrophic societal harms social media giants have unleashed over the past 15 years. We don't even have a federal privacy law in place, and now these companies are trying to capture the next market, rushing out half-baked generative AI tools that pose entirely novel threats to the information ecosystem. We're not
11:42 am
prepared to grapple with them, and we do need accountability; that's the job of government. But in the short run, we all need to relearn our critical thinking skills and approach content with a healthy skepticism, especially if it seems to confirm your existing biases. So, Sam, I'm checking in on our YouTube audience right now, and it seems like the tone is fairly scared and dystopian. Is there a real reason to be fearful here, or is this being blown out of proportion? I think there is a real reason to be fearful. I've spent the last five years working globally on this issue, talking to people who already face similar problems, right? Attacks on them with fake images; attacks using non-consensual sexual images of women, which is the biggest way people use these synthetic media models already. And in all of those five years before this letter came out, folks were saying: we need action, we need to think about how this is going to be controlled, we need to think about the safeguards that are going to be built into the systems. So I
11:43 am
think it's right for us to say: be fearful. Now at the same time, we also have to recognize that when we create a hype cycle around this, it benefits certain people. It benefits people who get away with saying you can't trust what you see online. It benefits big companies; when we say let's put a moratorium on development, it benefits the incumbents, like Jesse said. So I think we have to be fearful, but also really think about what are the harms that we already know are going to be generated here. Let's not think about hypotheticals; a lot of that, for example, in that Future of Life letter was very focused on big-picture future hypotheticals of harm, when we know the harms already from what's happening already: the way in which social media is playing out, the way in which mis- and disinformation is playing out. So it's time to respond to those needs right now, rather than play to sort of hypothetical fears further down the line. Henry pointed out rather astutely, though: who is it that's going to step in? Because, like, I look at Congress in the US and what their average age is, and this is technology that they don't,
11:44 am
I don't think they fully get. So who can step in or regulate? Who should step in and regulate this? Yeah, I think the TikTok hearing the other day was another example of it, like the Zuckerberg hearing a few years back, which sort of highlighted some of the ignorance, the lack of knowledge, around emerging technologies. And regulation is a really tricky one. I mean, we have players around the world who are considering and are actively working to implement legislation. Notably, in the EU, the EU AI Act would classify many of these generative technologies as what they call high risk, which would then put further measures in place for creators of these tools in terms of how they source data and how they disclose the outputs of their tools to audiences and to people online. But then we also have companies, countries, sorry, like China, which has introduced this year the deep synthesis law, which takes it,
11:45 am
you know, a step further and says: we're going to actively moderate the content in terms of whether it's fake or real, and we get to say if you are being, you know, essentially a criminal in publishing that content; which is, you know, perhaps a little bit too far considering their track record on censorship. But I think, coming back to it again, he's right that currently governments... you know, Sam and I have been warning for years about this problem and trying to get fuller, you know, legislation going and to get key stakeholders engaged. And, as is often the case, it takes a real shock, it takes a hype cycle, to wake people up. And I think the US has a lot of work to do, as does the UK, to kind of get in line here. Yeah. Then, Henry, country by country: can you trust a country-by-country solution for this? Because the internet seems to know no boundaries that way. Well, I'm not entirely sure what the alternative would be. I mean, the UN is the only body
11:46 am
that I'm aware of that could potentially try and get all member states involved in some kind of draft legislation to cover this kind of action. But again, look at the dynamic between China and the US around chip manufacturing and the AI arms race, not just between companies but also between countries. I think it's very unlikely that you're going to get that kind of international consensus built that way, unfortunately. Which leads to a difficult challenge of countries trying to balance innovation with safety, which is a tough balance to strike. Jump in there. So I think there's also a different framework that's required on how we think about it. So there is danger; so, for example, people get fake phone calls where someone can, you know, copy the voice of somebody that you know. And educating people on having safe words, or having these secret phrases within their family, and not sharing information with
11:47 am
a stranger on the phone, even if they do sound like somebody that you know: that is an ongoing conversation that needs to happen in terms of personal security. Also, if there is revenge porn, and a lot of this is used to abuse women, more education needs to go into how the police deal with that, how easily and quickly complaints can be made. So there are immediate things that need to be done, and then there are more policy-wide things. But at the same time, I think a huge responsibility also lies with the people building these technologies and how you philosophically deal with it. So, for example, the creators whose data you're using: how do you incentivize those creators? How do you pay those creators? How do you create more opportunities to build things that they can monetize? And I think a lot of these conversations are happening, but because of all the fearmongering around it (and I agree with Sam, there's a lot of money to be made in fearmongering) and the fear of it getting into the hands of the
11:48 am
wrong people, we end up talking about that, and these bigger and more philosophical questions never get on the table. And I think we need those frameworks way more than worrying that the machines are going to take over the world, because we're nowhere near that. So I want to stop the conversation here for a second, guys, because I want you to expand on something you said at the top, to make sure our viewers hear it. Explain this idea that you need to have a safe word with your family. Like, why? What are you talking about? I want people to get this. Yes. For example, there are some technologies available now where my voice can be used, you know, for a phone call, and somebody can call my dad and be like: hey, Dad, I'm in this, you know, emergency, can you give me your bank account details, can you give me your credit card information so I can pay for X, Y and Z? That's, like, the simplest, you know, scam ever. And what we need to have as a family is this conversation of, like: okay,
11:49 am
if this happens, what's the first question that you're going to ask? That question remains within all of you; don't write it anywhere, don't put it on your phone, don't put it on a Google doc. And that way, if you do receive one of these phone calls... again, a lot of it goes back to the very simple, like, how-not-to-get-conned 101. But I think the need to have these conversations with our kids, with our families, is there, and we don't have enough of that conversation. Sam, you were going to jump in? Yeah. You know, one of the things that I think is really important is that we don't place too much pressure on individuals to handle this. I just came back from... last week we ran a meeting with 30 activists and journalists from across Africa talking about how to respond to these types of AI-generated images. And you know, we weren't able, when we looked at images, to necessarily discern that they were AI-generated. And what that tells us is that instead of trying to put the pressure on each of us, for example, to look at these pictures we see online and try and guess whether they're
11:50 am
manipulated, we really need to think of this pipeline. And this is where placing the responsibility on tech companies and distributors matters. Like, if there is a tech company that's building one of these tools, are they placing the signals within it that enable you to see that it was made with a synthetic tool? Are they providing watermarks that show that it was generated with a particular kind of training data? Are those durable across time so that they're available to someone watching it? Because otherwise we end up in this world where we start getting skeptical about every image; we look really closely, we try and apply all our media literacy, and it undermines truthful images, right? And that's the biggest worry: that we create a world where we can't trust anything, because we create this assumption. And if we place that pressure on individuals, particularly individuals around the world who don't have access to sophisticated detection tools, who exist in environments where they're not being supported by the platforms very well, then we're doomed to be in a very tough situation. So let's not place the pressure exclusively on us as
11:51 am
the viewers; we really need to push the responsibility up the pipeline to the technology developers and the distributors. I want to show something that you're talking about here. This is a story about the Great Cascadia earthquake of 2001. It has pictures of scared families while it's happening, the cities destroyed. I mean, look at some of these, right? This earthquake never happened. And if you really look at this photo, which is a great photo of someone on, like, a battered beach holding a Canadian flag, and you zoom in on it, the hand is backward on the right-hand side, right? But this looks so real. And what I'm really concerned about, Henry, is, here in the States we saw how misinformation and disinformation affected an election in 2016; I think it affected Brexit there as well, with Cambridge Analytica. What does Cambridge Analytica look like in AI for future elections? Yeah, it's a good question. And I think, just very briefly, to echo Sam's comment,
11:52 am
you know, I've had a lot of journalists reaching out to me over the last few days saying, you know, can you tell us how to spot a deepfake as an individual? And I kind of have to caveat everything I say by saying, look, these images, like the ones you just showed, are getting better and better. If you look at the images that Midjourney was generating back in July of last year, you know, the new ones coming out at the moment are leaps ahead in terms of realism. And so if we put the burden on the individual to look for tell-tale signs, like the hands, which are improving all the time, it's going to give people false confidence that they can detect something, when actually it might be trained out of the models by the new versions. In terms of the election context, it's a really interesting one. And, you know, again, Sam and I... I think we've been hearing every single election in the US, midterm, presidential: this is going to be the one where deepfakes cause chaos, right? A video is going to leak, or an audio clip is going to leak the night before the election and it's going to swing it, or it's going to cause, you know, chaos. I'm wondering,
11:53 am
as you're talking, we're showing the Trump-getting-arrested photos, just so you know. Go ahead. Yes, yes, like the images we're seeing there, which luckily didn't fool many people, in contrast to the image of Pope Francis, which we may get on to. But yeah, I think this election will be different, not necessarily because I expect that kind of worst-case scenario, an indistinguishable fake that even the media and experts such as, you know, myself and others can't detect. But I think it's going to be different because we're going to see just a mass proliferation of this content, as we're seeing right now. You know, there are videos online of all of the presidents with their voices cloned, playing Mario Kart, and it's kind of really convincing on one level, right? And there are a lot of images being shared of the presidents in these kind of kooky scenarios, a lot of memes, a lot of artistic content, and then, as you said, some low-scale disinformation. Luckily, most of it is detectable right now, but the
11:54 am
direction of travel and the speed of advances mean that that's not something we can be as sure about as we have been in previous elections: that it won't have a serious impact and really confuse and potentially play into the kind of fractured information ecosystem that we are currently experiencing, as you mentioned, in the US and here in the UK. Sam, if you want to jump in. Yeah, it feels like there's a big shift that's really important to name, and that's around the ability to do this at volume, right? This has been quite niche before; and then potentially also to personalize it. We've already seen, you know, the sense in which you can ask ChatGPT to do something in a style, right? So when people have done research on this, they look at this ability to create volume, the ability to personalize, which of course can be used to target people, and then we're seeing this commercialized and available to people. And so all of those make this a more vulnerable election, to maybe organized actors, though most people just want to have fun with the process, like, for laughs. So we have to be careful of all of those. I
11:55 am
also want to name that one of the biggest threats that we've seen, in elections but elsewhere too, is non-consensual sexual images. And we don't always know they've been used, and they're used to target women politicians, women journalists, women activists. And so that's definitely going to be happening under the surface even if we don't see it happening visibly, and so we should name that as a threat that doesn't get the headlines like disinformation does, but it's equally harmful to all of civic space. YouTube chiming in here; I'm looking at someone named Anti Cage Taro Insights. One of my favorite parts of the show is when I use people's YouTube handles and they have kind of crazy names. But she has real concerns and says: my fear of AI is folks say it is neutral, but it's very racist because of contracts with police and military (I don't know if that's true), and that racism is going to be recreated in art and writing. I'm an artist; there's a lot of art that can be created in record time. Audra, could you talk for
11:56 am
a moment about that, the way that race and gender identity might play into what we're getting from AI here? Yeah, definitely. So I think the easiest answer is: that's not true. There are biases that are built into, like, ChatGPT; there was that example where it would say all these nice things about Biden but not Trump, and people at the back end are fixing these every single day. Again, it's humans that train these, so you are going to see some biases in the data. But as more and more new versions of these models come out, there are different, almost like, stages being built where you can dictate what kind of views, what kind of opinions, you want it to have. And so, for example, if you want it to be Shakespearean, or if you want it to be more liberal, whatever you want it to be,
11:57 am
you can change and get opinions more like that. So as more and more of these models get trained, we're going to see more optionality for people to generate the things that they want. And again, can it be used to create harmful content? A hundred percent. But all the biases that we see are kind of added to, built in, by the humans training the model; so the models themselves are not biased yet. They reflect humanity. Henry, I'm going to ask you a soft question now and give you 60 seconds to answer it, because the show is going to end whether you do or not. My son said, you know, they once said that chess was uniquely human, and computers would never be able to beat humans at chess, and of course they did, and made it look easy. What does this tell us? What does AI tell us about being uniquely human? Is there something we learn about humanity from this? Well, thank you for the generous 60 seconds. Yeah, that's a really tough question. I think, look, we've seen time and time again
11:58 am
AI able to replicate certain aspects of human performance capabilities in quite narrow domains, and this, I think, is one of the first that really flipped the narrative on its head: that, you know, low-skilled, repetitive, narrow jobs were going to be replaced by AI, and now we're seeing, you know, creative industries getting really shaken up by this. At the same time, I do think that AI is not able to be fundamentally creative in the way we talk about it with humans. I don't think there is an intentionality; there's not a consciousness which is acting on the world in that way. So, yeah, some aspects can maybe be automated. I've got to end it there, Henry, and I want to thank you, Audra and Sam for being on the show today, and all the humans that joined us out there watching. We'll see you next time on The Stream.
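A technical aside on the pipeline responsibility Sam Gregory described earlier: the idea is that a generator attaches provenance signals (who or what made a piece of media) at creation time, so that anyone downstream can verify them instead of squinting at hands and shadows. The sketch below is a deliberately simplified illustration, not the real C2PA / Content Credentials standard, which uses certificate chains and per-device signing keys rather than a shared secret; the key, the tool name, and the image bytes here are all hypothetical.

```python
import hashlib
import hmac
import json

# Hypothetical shared secret; real provenance systems use public-key
# certificate chains, not a symmetric key like this.
SECRET = b"demo-signing-key"

def make_manifest(media_bytes: bytes, tool: str) -> dict:
    """Attach a provenance claim to media: its hash plus the generating tool."""
    claim = {"sha256": hashlib.sha256(media_bytes).hexdigest(),
             "generator": tool}
    payload = json.dumps(claim, sort_keys=True).encode()
    claim["signature"] = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return claim

def verify_manifest(media_bytes: bytes, manifest: dict) -> bool:
    """Check that the claim is untampered and still matches the media bytes."""
    claim = {k: v for k, v in manifest.items() if k != "signature"}
    payload = json.dumps(claim, sort_keys=True).encode()
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, manifest["signature"])
            and claim["sha256"] == hashlib.sha256(media_bytes).hexdigest())

image = b"\x89PNG...fake image bytes"          # stand-in for real media
m = make_manifest(image, "midjourney-v5")      # hypothetical tool label
print(verify_manifest(image, m))               # True: untampered
print(verify_manifest(image + b"x", m))        # False: media altered after signing
```

The point of the sketch is the division of labour the panel calls for: the tool signs at creation time, and the viewer-side check becomes a cheap mechanical verification rather than individual forensic guesswork.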
11:59 am
Inspiring stories from around the world; human life captured in groundbreaking films from award-winning filmmakers. What is going on in New York City? On Al Jazeera. It's a billion-dollar money laundering operation; Gold Mafia is bigger than the country, with financial institutions, regulators and governments complicit. In a four-part series, Al Jazeera's investigative unit
12:00 pm
goes undercover in southern Africa. Gold Mafia, part four, on Al Jazeera. I was a hands-on journalist working in Asia and Africa; there'd be days where I'd be shooting and editing my own stories in a refugee camp with no electricity. And right now we're confronting some of the greatest challenges that humanity has ever faced, and I really believe that the only way we can do that is with compassion and generosity and compromise, because that's the only way we can try to solve any of these problems: together. That's why Al Jazeera is so important; we make those connections. A fourth day of fighting in Sudan despite several international appeals for a ceasefire.


