tv The Stream Al Jazeera September 7, 2023 5:30pm-6:01pm AST
5:30 pm
Cyclones like the one that struck Greece: such cyclones and all weather phenomena happen naturally, but with global warming they have become stronger and more frequent. The civil emergency is not yet over; forecasters say there's more rain on the way. Monica [inaudible], Al Jazeera.
Hello. These are the top stories on Al Jazeera. Emergency services in Greece have stepped up rescue efforts in several flooded villages. Two days of torrential rain have caused major damage to infrastructure; some areas received within 24 hours the rainfall they would normally see in a year. The head of Sudan's ruling council, Abdel Fattah al-Burhan, has met Qatar's Amir, Sheikh Tamim bin Hamad Al Thani, in Doha, where the two discussed bilateral relations and aid. Before his visit, the general issued a decree to dissolve the paramilitary Rapid Support Forces and went ahead
5:31 pm
and tightened his grip on the armed forces. We reassure the Sudanese people and the entire world that we are continuing to complete the transition to democratic civilian rule after the defeat of the rebellion. The priority of the current stage is how to end the rebellion and defeat it. One of the priorities is also how to stop the suffering and meet the needs of people in all parts of the country.
Russia has called Washington's decision to send depleted uranium rounds to Ukraine a criminal act. The US has included the anti-tank rounds in its latest $1bn aid package; the highly dense radioactive material is effective against armour, but its use has been controversial. Hundreds of trucks are stranded at the main border crossing between Pakistan and Afghanistan after a gun battle between security forces on Wednesday; the vehicles are backed up for kilometres on both sides of the border. Pakistan's
5:32 pm
government says it's in contact with Afghan authorities to try to defuse the situation. The number of people younger than 50 diagnosed with cancer has increased by 79 percent in the past three decades, according to a new study. The research, published in the British Medical Journal, says more than a million people aged 50 or younger die of the disease every year. Gabon's military has released the deposed president Ali Bongo from house arrest, saying he's free to leave the country. That follows a visit by the president of the Central African Republic, who was the chief negotiator for his release; Bongo was overthrown by the army last week. Those are the headlines; check out our website, aljazeera.com, for more on the top stories. The Stream is up next.
5:33 pm
Hello, and welcome to The Stream. What you're watching is Josh Rushing, generated by AI. On today's show we look into the dangers of AI-powered disinformation and how it could be used to manipulate the public. Fears of a new wave of disinformation loom as sophisticated AI programs become increasingly adept at faking images and text. How could AI be used to distort our sense of reality, and what's being done to protect the public from misinformation and malicious actors? And what you're watching now is the real Josh Rushing. Or is it?
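A note for readers of this transcript: the show refers throughout to AI programs that generate text, without showing how statistical text generation works. As a toy illustration only, far simpler than the large language models discussed on the show, here is a word-level Markov chain in Python. The function names and the tiny corpus are invented for this sketch:

```python
import random
from collections import defaultdict

def train(text, order=2):
    """Map each `order`-word context to the words observed to follow it."""
    words = text.split()
    model = defaultdict(list)
    for i in range(len(words) - order):
        context = tuple(words[i:i + order])
        model[context].append(words[i + order])
    return model

def generate(model, length=12, seed=0):
    """Sample a continuation by repeatedly looking up the last two words."""
    rng = random.Random(seed)   # fixed seed makes the sketch reproducible
    context = rng.choice(list(model))
    out = list(context)
    for _ in range(length):
        choices = model.get(tuple(out[-2:]))  # assumes order=2
        if not choices:  # dead end: this context never continued in training
            break
        out.append(rng.choice(choices))
    return " ".join(out)

corpus = ("the fake image looked real and the fake voice sounded real "
          "and the public could not tell the fake from the real thing")
model = train(corpus)
print(generate(model))
```

Real systems replace the lookup table with a neural network trained on much of the web, which is exactly the scraping dispute discussed later in the show; the mechanics of "predict the next word from context" are the same in spirit.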
5:34 pm
Well, that's the point of today's show, so let's talk to some experts about that. With us to talk about the future of AI and disinformation: from Cambridge in the UK, Henry Ajder, an expert broadcaster and synthetic media researcher; in New York, Sam Gregory, executive director of WITNESS, an organization that looks at the intersection of video and human rights; and in Austin, Texas, the chief operating officer at Opus AI, an artificial intelligence company. Hey, one more guest at the table: that's you. If you're watching this on YouTube, we've got a live producer right there in the box waiting to get your questions to me so I can get them to the experts, so why don't we do this together? Okay. Sam, can you start this off? I want to talk about the letter that came out today, that's news, but let's save that for just a second. Because what gets me is, it seems like mere weeks ago ChatGPT was the thing, and then it's like a couple of weeks later
5:35 pm
we're already at GPT-4, and things seem to be changing really fast. And I know the deal with AI is that you've crossed a tipping point before you realize it. So can you catch up, maybe, people who aren't obsessed with this like me: where are we right now in AI development, and why is this important?
So we've had four or five years of really foundational research that started to create the ability to do things like create fakes, a face, or someone doing something they never did; that started to create the ability to have interactions with chatbots that gave you plausible, human answers; that started to make it easier to falsify or clone someone's voice. And what's happened in the last year is this very rapid progress of these tools getting into the public eye and becoming accessible. Since the summer of last year we've gone from tools like Midjourney, which came out in July 2022, to, last week, images of the Pope in a puffer jacket, or of police arresting President
5:36 pm
Trump. And in parallel we see the advent of tools like ChatGPT that apparently allow you to communicate with what appears to be someone who is giving you answers; it may not seem like a machine, but it's actually a machine based on a large language model.
Right, right. I want to bring in a video comment. This is from a professor at Princeton University, Arvind Narayanan. Check this out.
There's an open letter making the rounds calling for a moratorium on new AI. Unfortunately, this is a distraction from the real issues. The idea that AI is going to break free and kill us: that's pure speculation. The real harms actually happening are totally different. Companies train these models on the backs of people's data but give them nothing in exchange. We don't have to wait to solve that; we don't need six months. Tax the companies and compensate the artists. Unfortunately, we don't seem to have the political will to consider anything like that.
5:37 pm
Right. So I want to show the audience my laptop. This headline says Getty Images is suing Stable Diffusion's maker for a staggering $1.8 trillion; that's $150,000 per image for all the images they took. So this is what he was talking about in the video comment, this kind of internet scraping. Henry, you can touch on that, but he began his comment by mentioning the letter that Elon Musk was also a part of. Can you bring us up to date on that letter, please? And then maybe you can touch on the idea he brought in.
Yeah, certainly. So this letter was published today. It was an open letter featuring, I believe, over a thousand industry experts, researchers and academics, including, as you mentioned, Elon Musk, but also people like Jaan Tallinn, the co-founder of Skype, essentially asking for a moratorium, for the brakes to be put on AI development, particularly in the generative space, not only focused on models like GPT-4 but also moving on to other areas, you know, as you mentioned,
5:38 pm
Midjourney, text-to-image, and so on. And this has really come off the back of this massive arms race we're seeing, where, as I mentioned, within the space of less than a year Midjourney is on version 5, and all of the big tech companies are piling in to try and build better or more accessible models. And perhaps the concern is that there isn't enough proactive safety and ethics consideration going on here, and the fear is that it will be reactive, too little, too late. With regard to the comment from your contributor around training: he is indeed right that these models are trained on vast troves of data; they consume immense amounts of data to get to the state they're in. We're talking hundreds of millions of photos, and indeed most of the web being scraped in the context of text. And many of the companies that license that imagery, and indeed artists themselves, say: look, this is a violation of our copyright. It's a
5:39 pm
controversial issue. Some people say, well, look, they're not actually copying the art, they're taking inspiration from it; but others are saying, look, we need to update copyright to make it fit for purpose in this new age of synthetic content, and using data in this way, without permission, without royalties, is something which we can't accept moving forward.
All right. You're at an AI company: are you in an arms race, and do you think there needs to be someone outside of the companies developing this stuff stepping in with some kind of regulatory measures?
Definitely. So we work on text-to-video, which is kind of the next big wave of what you're going to see being generated; generative AI, or creative AI, as we call it. I think it's going to have to be a public-private partnership. Raising taxes on companies, raising taxes on anyone, is never the solution. Neither is putting regulation
5:40 pm
in place haphazardly, because I don't think DC, or any of the politicians, really grasp what the technology is. And on the other hand, a lot of the people working on it know the technical side but don't understand the social and political aspects of things. So something like a public-private partnership, with some sort of forums, almost like a constitutional-law kind of thing in place, and these can vary country by country or by geographic boundaries, on how we come up with solutions: at least for me, I think that's going to be the best way to go about it.
I'm going to bring in another video comment. This is Jesse Lehrich; he's the co-founder of Accountable Tech.
The rapid escalation in the AI arms race really underscores how far behind the US has fallen when it comes to regulating tech. We've done nothing to grapple with the catastrophic societal harms social media giants have unleashed over the past 15 years. We don't even have
5:41 pm
a federal privacy law in place. And now these same companies are trying to capture the next market, rushing out half-baked generative AI tools that pose entirely novel threats to the information ecosystem. We're not prepared to grapple with them, and we do need accountability; that's the job of government. But in the short run, we all need to relearn our critical-thinking skills and approach content with a healthy skepticism, especially if it seems to confirm your existing biases.
So, Sam, I'm checking in on our YouTube audience right now, and the tone seems fairly scared and dystopian. Is there a real reason to be fearful here, or is this being blown out of proportion?
I think there's a real reason to be fearful. I've spent the last five years working globally on this issue, talking to people who already face these problems, right? Attacks on them with fake images; attacks using non-consensual sexual images of women, which is the biggest way people use these synthetic media,
5:42 pm
deepfake models, already. And over those five years, before this letter came out, folks were saying: we need action; we need to think about how this is going to be controlled; we need to think about the safeguards that are going to be built into these systems. So I think it's right for us to be fearful. At the same time, we also have to recognize that when we create a hype cycle around this, it benefits certain people. It benefits people who get away with saying you can't trust what you see online. It benefits big companies. When we say let's put a moratorium on development, it benefits the incumbents, like Jesse said. So I think we have to be fearful, but also really think about the harms that we already know are going to be generated here. Let's not think about hypotheticals; a lot of that, for example the Future of Life letter, was very focused on big-picture future hypothetical harms, when we know the harms already from what's happening: the way social media has played out, the way mis- and disinformation is flowing. So it's time to respond to those needs right now,
5:43 pm
rather than chase these sort of hypothetical fears further down the line.
Henry, our commenter pointed out, rather astutely, though: who is it that's going to step in? Because I look at Congress in the US and what their average age is, and this is technology that I don't think they fully get. So who can step in and regulate, and who should?
Yeah, I think the TikTok hearing the other day was another example of it, like the Zuckerberg hearing a few years back, which sort of highlighted some of the ignorance, the lack of knowledge, around emerging technologies. And regulation is a really tricky one. I mean, we have players around the world who are considering, or appear to be working to implement, legislation. We have, notably in the EU, the EU AI Act, which would classify many of these generative technologies as what they call high-risk, which would then put measures in place for creators of these tools in terms of how they source data and how they disclose the outputs of the
5:44 pm
tools to audiences and to people online. But then we also have countries like China, which introduced its deep synthesis provisions, which take it a step further and say: we're going to actively moderate the content in terms of whether it's faithful and real, and we get to say whether you're being, essentially, a criminal in publishing content; which is perhaps a little too far considering their track record on censorship. But I think your commenter, again, is right that current governments lack the momentum to get ahead of this problem, to get fulsome legislation going and key stakeholders engaged. As is often the case, it takes a real shock, a real fright, to wake people up, and I think the US has a lot of catching up to do, as does the UK, to get in line here.
Yeah, but
5:45 pm
Henry, is it country by country? Can you trust a country-by-country solution for this? Because the internet seems to know no boundaries that way.
Well, I'm not entirely sure what the alternative would be. I mean, the UN is the only body I'm aware of that could potentially try and get all of its member states involved in some kind of draft legislation to cover this kind of issue. But again, look at the dynamic between China and the US around chip manufacturing and the AI race, not just between companies but, we could say, between countries: I think it's very unlikely that you're going to get that kind of international consensus built that way, unfortunately. Which leaves difficult challenges for countries trying to balance innovation, right, with safety, which is a balance we want to strike.
Jump in there.
Yeah, to the point that Sam made earlier, I think there's also a different framework that's required in how we think about it. There's the micro level: for example, people get phone calls where someone can, you know, copy
5:46 pm
the voice of somebody that you know. That ends up in educating people on having safe words, or having these secret phrases within their family, and on not sharing information on the phone with a stranger, even if they do sound like somebody you know. So there's a micro-level conversation that needs to happen in terms of personal security. But also, where there is revenge porn, and a lot of this is used to abuse women, more education needs to go into how the police deal with it and how easily and quickly complaints can be made. So there are micro-level things that need to be done, and then there are more policy-wide things. But at the same time, I think a huge responsibility also lies with the people building these technologies and how you philosophically deal with it. So, for example, the creators whose data you're using: how do you incentivize those creators? How do you pay them? How do you create more opportunities for creators to build things
5:47 pm
that they can monetize? And I think a lot of these conversations are happening, but because of all the fearmongering around it, and I agree with Henry and Sam here, there's a lot of money to be made in the fearmongering, right, "it gets into the hands of the wrong people", these micro issues and more philosophical questions get knocked off the table. And I think we need frameworks for those far more than for "the machines are going to take over the world", because we're nowhere near that; it's still humans at the helm of every AI.
I'm going to stop the conversation here for a second, because I want you to expand on something you said at the top. Explain this idea that you need to have a safe word with your family. Why? What are you talking about? I want people to get this.
Yes. For example, there's tons of technology available now where my voice can be used, you know, for a phone call, and somebody can call my dad and be like: hey, Dad, I'm in an emergency, can you give me your bank account details, or can you give me your credit card
5:48 pm
information so I can pay for X, Y and Z? That's, like, the simplest scam ever. And what we need to have with our family is this conversation of: okay, if this happens, what's the first question that you're going to ask? And that question remains within the family; you don't write it anywhere, you don't put it on your phone, you don't put it on Google Docs. And that way, if you do receive one of these phone calls... again, it goes back to very simple how-not-to-get-scammed 101. But I think the need to have these conversations with our kids and our families is there, and we don't have enough of that conversation.
Now, Sam, you wanted to jump in?
Yeah. You know, one of the things I think is really important is that we don't place too much pressure on individuals to handle this. I just came back from a meeting we ran last week with 30 activists and journalists from across Africa, talking about things such as their agency in responding to these types of AI-generated images, and,
5:49 pm
you know, when we looked at images, we weren't necessarily able to discern that they were AI-generated. And what that tells us is that instead of trying to put the pressure on each of us, for example, to look at these pictures we see online and try and guess whether they're manipulated, we really need to think of this as a pipeline, and this is where placing the responsibility on tech companies and distributors matters. Is a tech company that's building one of these tools placing the signals within it that enable you to see that it was made with a synthetic tool? Are they providing watermarks that show it was generated with AI? Disclosure of the training data? All of those should ripple across the pipeline so that they're available to someone watching. Because otherwise we end up in a world where we start getting skeptical about every image: we look really closely, we try and apply all our media literacy, and it undermines truthful images, right? And that's the biggest worry, that we create a world where we can't trust anything, because we create this assumption. And if we place that pressure on individuals, particularly individuals around the world who don't have access to sophisticated
5:50 pm
detection tools, who exist in environments where they're not being supported by the platforms very well, then they're going to be in a very tough situation. So let's not place the pressure exclusively on us as the viewers; we really need to push the responsibility up the pipeline, to the technology developers and the distributors.
I want to show something related to what you're talking about here. This is a story about "the great Cascadia earthquake of 2001". It has pictures of scared families while it's happening, of cities destroyed; I mean, look at some of these, right? The earthquake never happened. And if you really look at this photo, which is a great photo of someone on a battered beach holding a Canadian flag, and you zoom in on it, the hand is backwards on the right-hand side, right? But the smoke is so real. And what I'm really concerned about, Henry, is that here in the States we saw how misinformation and disinformation affected an election in
5:51 pm
2016, and I think it affected Brexit as well, with Cambridge Analytica. What does Cambridge Analytica look like in AI for future elections?
Yeah, it's a good question. And just very briefly, to echo Sam's comment: I've had a lot of journalists reaching out to me over the last few days saying, can you tell us how to spot a deepfake as an individual? And I kind of have to caveat everything I say by noting that these images, like the ones you just showed, are getting better and better. If you look at the images Midjourney was generating back in July of last year, the new ones coming out at the moment are leagues ahead in terms of realism. And so if we put the burden on the individual to look for telltale signs, like the hands, which are improving all the time, it's going to give people false confidence that they can detect something when actually it might be trained out of the models by the new versions. And the election context is a really interesting one. You know, again, Sam and I have been hearing every single election, US midterms, presidential,
5:52 pm
that this is going to be the one where the fakes cause chaos, right? The video is going to leak the night before the election and it's going to swing it, or it's going to cause chaos.
While you guys are talking, we're showing the "Trump getting arrested" photos, just so you know. Good, go ahead.
Yes, I can see; which luckily didn't fool many people, in contrast to the image of Pope Francis, which we may get on to. But yeah, I think this election will be different. Not necessarily because of that kind of worst-case scenario, an indistinguishable fake that even the media and experts such as myself and others can't detect. I think it's going to be different because we're going to see just a massive proliferation of this content, as we're seeing right now. You know, there are videos online of the presidents with their voices cloned, playing video games, and it's kind of eerily convincing on one level,
5:53 pm
right? And there are a lot of images, as you just showed, of the presidents in these kinds of tricky scenarios, and a lot of memes, a lot of satirical content, and then, as you said, Sam, some low-scale disinformation. Luckily, most of it is detectable right now. But the direction of travel and the speed of advance mean that that's not something we can be as sure about as we have been in previous elections: that it won't have a serious impact and really confuse, and potentially play into the kind of fractured information ecosystem that we are currently experiencing, as you mentioned, in the US and here in the UK.
Sam, you want to jump in?
Yeah. It feels like there's a big shift, and it's really important to name it, around the ability to do this at volume, right? That hasn't been the case before. And then, potentially, personalization: we've already seen the sense in which you can ask ChatGPT to do something in the style of someone. So when people have done research on this, they look at this ability to create volume, the ability to personalize,
5:54 pm
which of course can be used to target people. And we're seeing this commercialized and available to people. So all of that may make this a more vulnerable election, both to organized actors and to people who just want to have fun with the process, like the Pope photos. So we have to be careful on all of those. I also want to name that one of the biggest threats we've seen, in elections as elsewhere, is non-consensual sexual images. We don't always know they've been used, and they're used to target women politicians, women journalists, women activists. That's definitely going to be happening under the surface even if we don't see it happening visibly, and we should name it as a threat that doesn't get the headlines the way disinformation does but is equally harmful to the civic space.
YouTube is chiming in here. I'm looking at someone named MT Case Taro Insights; it's one of my favorite parts of the show, when I use people's YouTube handles and have to read these kinds of crazy names. But she has real concerns. She says: my fear of AI is that folks see it as neutral, but it's very racist, because of the contracts with police and military,
5:55 pm
and, I don't know if that's true, that racism is going to be recreated in art and writing. "I'm an artist; there's a lot of fakes that can be created in record time." I guess, with that as the prompt, let's talk for a moment about the way that race and gender identity might play into what we're getting from AI here.
Yeah, definitely. So I think the easiest answer is that "neutral" is not true: there are biases built into these systems. Take ChatGPT; we know the example where it would say all the nice things about Biden but nothing about Trump. And there are people at the back end fixing these every single day; again, it's humans at the helm. So you are going to see some biases in the data. But more and more new versions of these models are coming out, and they're different, almost like they're being built in a way where
5:56 pm
you can dictate what kind of opinions you want it to have. So, for example, if you want it to be libertarian, or if you want it to be more, you know, liberal, whatever you want it to be, you can change it and get opinions more in that vein. So I think as more and more of these models get trained, we're going to see more options for people to generate the things they want. And again, that can be used to create harmful content, a hundred percent. But all the biases we see are built into the data by the humans behind it, so the models themselves are not biased yet; they reflect humanity and the data.
Henry, I'm going to ask you a philosophical question, and I'm going to give you 60 seconds to answer it, because the show is going to end whether you do or not. You know, they once said that chess was uniquely human and computers would never be able to beat humans at chess; and of course they did, and made it look easy. What does this tell us?
5:57 pm
What does AI tell us about being uniquely human? Is there something we learn about humanity from this?
Well, thank you for the generous 60 seconds. That's a really tough question. I think we've seen, time and time again, AI able to replicate certain aspects of human performance and capabilities in quite narrow domains. And this, I think, is one of the first that has really flipped the narrative on its head: it was supposed to be low-skill, repetitive work that would be replaced by AI, and instead we're seeing the creative industries getting really shaken up by this. At the same time, I do think that AI is not able to be fundamentally creative in the way we talk about humans being. I don't think there is an intentionality; there's not a consciousness which is present in the world, right? So in that way, yeah, some aspects, but maybe not others.
We'll have to leave it there, Henry. I want to thank you and Sam for being on the show today, and all the humans that joined us out there watching. We'll see you next time on The Stream.
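A note for readers: the "safe question" the guest recommended earlier is, at bottom, a shared secret plus a challenge. For readers who want the software analogue, here is a minimal challenge-response sketch in Python; the secret, the function names, and the scenario are all invented for illustration, and the family version is of course just a spoken question and answer, not code:

```python
import hashlib
import hmac
import os

def make_challenge() -> bytes:
    # A fresh random nonce, so a recorded answer cannot simply be replayed.
    return os.urandom(16)

def answer(shared_secret: bytes, challenge: bytes) -> bytes:
    # The caller proves knowledge of the secret without speaking it aloud.
    return hmac.new(shared_secret, challenge, hashlib.sha256).digest()

def verify(shared_secret: bytes, challenge: bytes, response: bytes) -> bool:
    expected = hmac.new(shared_secret, challenge, hashlib.sha256).digest()
    # Constant-time comparison avoids leaking information via timing.
    return hmac.compare_digest(expected, response)

secret = b"our-family-safe-phrase"   # hypothetical shared secret
nonce = make_challenge()
resp = answer(secret, nonce)
print(verify(secret, nonce, resp))          # genuine caller passes
print(verify(secret, nonce, b"x" * 32))     # an impostor's guess fails
```

The design point matches the guest's advice: the secret is never transmitted or written down where an attacker (or a voice-cloning scammer) could find it; only proof of knowing it is exchanged.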
5:58 pm
The beauty and the force of nature need to be harmonized with stable and sustainable goals, united with the diversity of cultures. Jakarta, Indonesia, is ready to host the 2023 ASEAN Summit; together we will prove that ASEAN matters, as the epicentrum of growth. In September, Al Jazeera visits India, host of the G20 summit,
5:59 pm
where leading economies will discuss global challenges. Generation Change meets the icons who are challenging preconceptions and using their platforms to change societies. World leaders gather in New York for the UN General Assembly, with Ukraine and climate change expected to dominate. Witness the broadcast premiere of a new series. And a celebration of sport and fierce competition is expected at the Asian Games in Hangzhou, China. September on Al Jazeera.
The world of high-frequency share trading, exposed. "Scientists invented what is basically trading machines. I lost $30 million; it was a terrifying experience." How artificial intelligence raises the stakes, and the risks, on the money markets. "This market has got faster and faster. We're opening up
6:00 pm
the possibility of instability" for the money markets. Money Bots, on Al Jazeera.
Hello, I'm Laura Kyle. This is the News Hour. Coming up in the next 60 minutes: hundreds of people have been rescued from floods in Greece after heavy rainstorms; rising water levels have left many stranded. The head of Sudan's ruling council meets Qatar's Amir in a push for support after months of fighting against a paramilitary group. And Russia condemns the US move to supply controversial anti-tank munitions to Ukraine. Plus the news.