
The Stream - Al Jazeera - September 7, 2023, 7:30am-8:00am AST

Japan has launched its moon exploration spacecraft as it aims to become the fifth nation to land on the moon. The Smart Lander for Investigating the Moon, or SLIM, blasted off from Tanegashima island in the Pacific Ocean and is expected to reach orbit around the moon early next year. Two earlier Japanese attempts have failed.

Hello again. This is Al Jazeera, and these are the headlines. Gabon's military has released deposed president Ali Bongo from house arrest, saying he's free to leave the country. Bongo was overthrown by the army last week after he was declared the winner in disputed elections. A spokesman for the Committee for the Transition and Restoration of Institutions, the CTRI, said that given his state of health, the former president of the republic, Ali Bongo Ondimba, is free to move about and may, if he wishes, travel abroad for medical check-ups.
Sudan's army chief has issued a decree dissolving the Rapid Support Forces. Abdel Fattah al-Burhan says the move is a result of the RSF's rebellion against the state. The paramilitary group and the army have been locked in a battle for control for nearly five months.

A Nigerian appeals court has dismissed the opposition's challenge against President Bola Tinubu's election victory in March. It ruled that the petition failed to prove claims of irregularities and fraud. The opposition can still make a final appeal to the Supreme Court.

Mexico is set to have its first female president next year after rival parties chose women as their candidates. Claudia Sheinbaum has been selected by the governing Morena party; she'll face Senator Xóchitl Gálvez of the opposition in the election.

Extreme weather is causing catastrophic flooding in Bulgaria, Greece and Turkey. At least 14 people have now been confirmed dead.
Forecasters are predicting even more rain in the coming days.

Brazil is bracing for more rain after a cyclone struck the southern state of Rio Grande do Sul, killing dozens of people and displacing thousands.

At least 17 people have been killed in a Russian missile attack on the town of Kostiantynivka. President Volodymyr Zelenskyy condemned the attack on the city, nearly 20 kilometres from the front line.

Fighting on the Afghanistan-Pakistan border has caused the main crossing between the two countries to close. Civilians fled after Pakistani and Afghan security forces exchanged gunfire at Torkham.

Well, those are the headlines. I'll be back with more news for you here after The Stream.

A weekly look at the world's top business stories, from global markets to economies and small businesses, and how they affect people around the world. Counting the Cost, on Al Jazeera.
Hello and welcome to The Stream. What you're watching is Josh Rushing, generated by AI. On today's show we look into the dangers of AI-powered disinformation and how it could be used to manipulate the public. Fears of a new wave of disinformation loom as sophisticated AI programs become increasingly adept at making images and text. How could AI be used to distort our sense of reality, and what's being done to protect the public from misinformation and malicious actors? And what you're watching now is the real Josh Rushing. Or is it?
Well, that's the point of today's show, so let's talk to some experts about that. With us to talk about the future of AI and disinformation: from Cambridge in the UK, Henry Ajder, an expert broadcaster and synthetic media researcher; in New York, Sam Gregory, executive director of WITNESS, an organization that looks at the intersection of video and human rights; and in Austin, Texas, the chief operating officer at Opus AI, an artificial intelligence company. Hey, one more guest at the table: that's you. If you're watching this on YouTube, we've got a live producer right there in the box waiting to get your questions to me so I can get them to the experts. So why don't we do this together, right? Okay.

Sam, can you start this off? I want to talk about the letter that came out today, that's news, but let's save that for just a second, because what gets me is that
it seems like mere weeks ago ChatGPT was the thing, and then a couple of weeks later we're already on to GPT-4, and things seem to be changing really fast. And I know the deal with AI is that the peril is you've crossed a tipping point before you realize it. So can you catch up people who maybe aren't obsessed with this like me? Where are we right now in AI development, and why is this important?

So we've had three to five years of really foundational research that started to create the ability to do things like create a fake face of someone who never existed, started to create the ability to have interactions with chatbots that gave you plausible and human answers, started to make it easier to falsify or clone someone's voice. And what's happened in the last year is we've seen very rapid progress of these tools getting into the public eye and becoming accessible. Since the summer of last year,
a tool called Midjourney, which came out in July 2022, has gone on to be used just last week to create an image of the Pope in a puffer jacket, or of police arresting former President Trump. And in parallel we see the advent of tools like ChatGPT that apparently allow you to communicate with what appears to be someone who is giving you answers. It maybe doesn't appear like a machine, but it's actually a machine based on a large language model.

Right, right. I want to bring in a video comment. This is from a professor at Princeton University, Arvind Narayanan. Check this out.

There's an open letter making the rounds calling for a moratorium on new AI. Unfortunately, this is a distraction from the real issues. The idea that AI is going to break free and kill us, that's pure speculation. The real harms actually happening are totally different. Companies train these models on the backs of people's data but give them nothing in exchange. We know how to solve that today, we don't need six months: tax the companies and compensate the artists. Unfortunately,
we don't seem to have the political will to consider anything like that.

All right. I want to show the audience my laptop. This headline says Getty Images is suing Stable Diffusion for a staggering $1.8 trillion; that's $150,000 per image for all the images they took. So this is what he was talking about in the video comment, this kind of internet scraping. Henry, you can touch on that, but he began his comment by mentioning the letter today that Elon Musk was also a part of. Can you bring us up to date on that letter, please? And then maybe you can touch on the idea that he brought in.

Yes, certainly. So this letter was published today. It was an open letter featuring, I believe, over a thousand industry experts, researchers and academics, including, as you mentioned, Elon Musk, but also people like Jaan Tallinn, the co-founder of Skype, essentially asking for a moratorium, or for the brakes to be put on AI development, particularly in the generative space, with a focus on models
like GPT-4, but also moving on to other areas, you know, as you mentioned, Midjourney, text-to-image, and so on. And this has really come off the back of this massive evolution, right? It's worth seeing where, as I mentioned, within the space of less than a year Midjourney is on version 5, and all of the big tech companies are piling in to try and build better or more accessible models. And perhaps, you know, the concern is that there isn't enough proactive safety and ethics consideration going on here, and the fear is that it will be reactive, but too little, too late. With regards to the comment from Arvind around training, he is indeed right that these models are voracious for data. They consume immense amounts of data to get to the state they're in; we're talking, you know, hundreds of millions of photos, and most of the web being scraped in the context of text. And indeed, many of the kinds of companies that license
the imagery, and artists themselves, are saying, look, this is a violation of our copyright. It's a controversial issue: some people say, well, look, the models aren't actually copying, they're taking inspiration; but others are saying, look, you know, we need to update copyright to make it fit for purpose in this new age of synthetic content, and using data in this way, without permission, without royalties, is something we can't meaningfully accept.

All right. You're at an AI company: are you in an arms race, and do you think there needs to be someone outside of the companies developing this stuff stepping in with some kind of regulatory measures?

Definitely. So we work on text-to-video, which is kind of the next big wave of what you're going to see being generated: generative AI, or creative AI, that's what we call it. I think it's going to have to be a public-private partnership. Raising taxes on companies,
or raising taxes on anyone, is never the solution. Neither is putting regulation in place haphazardly, because I don't think D.C., or any of the politicians, really grasp what the technology is. And on the other hand, a lot of the people working on it may not understand the social and political aspects going into it. So some kind of public-private partnership, with some sort of funds, almost like a constitutional-law kind of framework in place, and these can vary country by country or by geographic boundary in how we come up with solutions; that, at least for me, is going to be the best way to go about it.

I'm going to bring in another video comment. This is Jesse Lehrich; he's a co-founder of Accountable Tech.

The rapid escalation in the AI arms race really underscores how far behind the US has fallen when it comes to regulating tech. We've done nothing to grapple with the catastrophic societal harms social media giants
have unleashed over the past 15 years; we don't even have a federal privacy law in place. And now these same companies are trying to capture the next market, rushing out half-baked generative AI tools that pose entirely novel threats to the information ecosystem that we're not prepared to grapple with. And we do need accountability; that's the job of government. But in the short run we all need to relearn our critical thinking skills and approach content with a healthy skepticism, especially if it seems to confirm your existing biases.

So Sam, I'm checking in on our YouTube audience right now, and it seems like the tone is fairly scared and dystopian. Is there a real reason to be fearful here, or is this being blown out of proportion?

I think there's a real reason to be fearful. I've spent the last five years working globally on this issue, talking to people who already face similar problems, right? Attacks on them with fake images, attacks using non-consensual sexual images of women,
which is the biggest way people use these synthetic media deepfake models already. And all of those five years before this letter came out, folks were saying: we need action, we need to think about how this is going to be controlled, we need to think about the safeguards that are going to be built into these systems. So I think it's right for us to say, be fearful. Now at the same time, we also have to recognize that when we create a hype cycle around this, it benefits certain people. It benefits people who get away with saying you can't trust what you see online. It benefits big companies. When we say let's put a moratorium on development, it benefits the incumbents, like Jesse said. So I think we have to be fearful, but also really think about the harms we already know are going to be generated here. Let's not focus on hypotheticals; a lot of that, for example that Future of Life Institute letter, was very focused on big-picture, future hypothetical harms, when we know the harms already from what's happening right now: the way in which social media is playing out, the way in which mis- and disinformation is playing out. So it's
time to respond to those needs right now rather than play with hypothetical fears further down the line.

Henry, Jesse pointed out rather astutely, though: who is it that's going to step in? Because I look at Congress in the US and what their average age is, and this is technology that I don't think they fully get. So who can step in and regulate this, and who should step in to regulate it?

Yeah, I think the TikTok hearing the other day was another example of it, like the Facebook hearing a few years back, which sort of highlighted some of the ignorance, the lack of knowledge, around emerging technologies. And regulation is a really tricky one. I mean, we have players around the world who are considering, and appear to be working to implement, legislation. We have, notably, in the EU the AI Act, which would classify many of these generative technologies as what they call high risk, which would then put measures in place for creators of these tools in terms of
how they source data and how they disclose the outputs of the tools to audiences and to people online. But then we also have countries, like China, which introduced its deep synthesis provisions, which take it a step further and say: we're going to actively moderate the content in terms of whether it's faithful or real, and we get to say whether you're being, essentially, a criminal in publishing content; which is perhaps a little bit too far considering their track record on censorship. But I think the commenter, again, is right that governments currently lag. Sam and I have been banging the drum for years about this problem, trying to get fulsome legislation going and to get key stakeholders engaged. As is often the case, it takes a real shock, a real fright, to wake people up, and I think the US
has a lot of catching up to do, as does the UK, to kind of get in line here.

Yeah, but Henry, is it country by country? Can you trust a country-by-country solution for this? Because the internet seems to know no boundaries that way.

Well, I'm not entirely sure what the alternative would be. I mean, the UN is the only body that I'm aware of that could potentially try and get all of its member states involved in some kind of draft legislation to cover this kind of action. But again, look at the dynamic between China and the US around chip manufacturing and the AI arms race, not just between companies but, we could say, between countries. I think it's very unlikely that you're going to get that kind of international consensus built that way, unfortunately, which leaves difficult challenges for countries trying to balance innovation with safety, which is a tricky balance to strike.

Let me jump in there. So, yeah, to the point Sam made earlier, I think there's also a different framework that's required in how we think about it and solve it, right?
So there are immediate dangers. For example, people get phone calls where someone can copy the voice of somebody they know. We should be educating people on having safe words or secret phrases within their family, and on not sharing information with a stranger on the phone, even if they do sound like somebody you know. So there's a whole conversation that needs to happen in terms of personal security. But also, when there is revenge porn, and a lot of this is used to abuse women, more education needs to go into how the police deal with it, how easily and quickly complaints can be made. So there's a myriad of things that need to be done, and then there are more policy-wide things. But at the same time, I think a huge responsibility also lies with the people building these technologies, how you philosophically deal with it and what you put in. So, for example, the creators whose data you're using: how do you incentivize those creators? How do you pay those creators?
How do you create more opportunities for creators to build things that they can monetize? And I think a lot of these conversations are happening, but because of all the fearmongering around it (and I agree with Henry and Sam here that there's a lot of money to be made and the worry is it gets into the hands of the wrong people), we end up talking about that, and these immediate and more philosophical questions get knocked off the table. And I think we need frameworks for those way more than for "the machines are going to take over the world", because we're nowhere near that. It's still humans at the helm of all of it.

I'm going to stop the conversation here for a second, guys, because I want you to expand on something you said at the top, to make sure our viewers get it: explain this idea that you need to have a safe word with your family. Like, why? What are you talking about? I want people to get this.

Yes. For example, there's tons of technology available now where my voice can be used for a phone call, and somebody can call my dad and be like, hey Dad,
I'm in this emergency, can you give me your bank account details, or can you give me your credit card information so I can pay for X, Y and Z? That's like the simplest scam ever. And what we need to have with our family is this conversation of, okay, if this happens, what's the first question you're going to ask? And that question stays within us all; you don't write it down anywhere, you don't put it on your phone, don't put it on Google Docs. And that way, if you do receive one of these phone calls, again, it goes back to something very simple, like how-not-to-get-scammed 101. But I think the need to have these conversations with our kids, with our families, is there, and we don't have enough of that conversation now.

Sam, you wanted to jump in?

Yeah. You know, one of the things that I think is really important is that we don't place too much pressure on individuals to handle this. I just came back from a meeting we ran last week with 30 activists and journalists from across Africa talking about this,
and about their agency in responding to these types of AI-generated images. And, you know, when we looked at images, we weren't necessarily able to discern that they were AI-generated, and what that tells us is that instead of trying to put the pressure on each of us, for example, to look at these pictures we see online and try and guess whether they're manipulated, we really need to think of the pipeline. And this is where placing the responsibility on tech companies and distributors matters. If a tech company is building one of these tools, are they placing the signals within it that enable you to see that it was made with a synthetic tool? Are they providing watermarks that show it was generated with the tool, or disclosure of the training data? All of those should ripple across the pipeline so that they're available to anyone watching, because otherwise we end up in this world where we start getting skeptical about every image, we look really closely, we try and apply all our media literacy, and it undermines truthful images, right? And that's the biggest worry: that we create a world where we can't trust anything, because we create that assumption. And if
we place that pressure on individuals, particularly individuals around the world who don't have access to sophisticated detection tools, who exist in environments where they're not being supported by the platforms very well, then they're going to be in a very tough situation. So let's not place the pressure exclusively on us as the viewers; we really need to push the responsibility up the pipeline to the technology developers and the distributors.

I want to show something that you're talking about here. This is a story about the "Great Cascadia earthquake of 2001". It has pictures of scared families while it's happening, the cities destroyed. I mean, look at some of these, right? The earthquake never happened. And if you really look at this photo, which is a great photo of someone on a battered beach holding a Canadian flag, and you zoom in on it, the hand is backward on the right-hand side, right? But the smoke looks so real. And what I'm really concerned about, Henry, is here in the
States, where we saw how misinformation and disinformation affected an election in 2016; I think it affected Brexit as well, with Cambridge Analytica. What does Cambridge Analytica look like in AI for future elections?

Yeah, it's a good question, and just very briefly, to echo Sam's comment: I get a lot of journalists reaching out to me, including over the last few days, asking, can you tell us how to spot a deepfake as an individual? And I kind of have to caveat everything I say by noting that these images, like the ones you just showed, are getting better and better. If you look at the images Midjourney was generating in its earlier versions and compare them with the new ones coming out at the moment, they are leaps ahead in terms of realism. And so if we put the burden on the individual to look for the telltale signs, like the hands, which are improving all the time, it's going to give people false confidence that they can detect something, when actually it might be trained out of the models by the new versions. And in terms of the
election context, it's a really interesting one. And, you know, Sam and I have been hearing before every single election, US or otherwise, especially the presidential ones: this is going to be the one where the fakes cause chaos, right? The video is going to leak, or the clip is going to leak the night before the election and swing it, or cause, you know, chaos.

I'm wondering, as you're talking, we're showing the Trump-getting-arrested photos, just so you know. Go ahead.

Yes, yes, I can see, which luckily didn't fool many people, in contrast to the image of Pope Francis, which we may get on to. But yeah, I think this election will be different, not necessarily because I think we'll see that kind of worst-case scenario, an indistinguishable fake that even the media and experts such as myself and others can't detect, but because we're going to see just a massive proliferation of this content, as we're seeing right now. You know, there are videos online of all of the presidents with their voices cloned,
playing Mario Kart together, and it's kind of eerily convincing on one level, right? And there are a lot of images, as you just showed, of presidents in these kinds of tricky scenarios, and a lot of memes, a lot of artistic content, and then, as you said, some low-scale disinformation. Luckily, most of it is detectable right now, but the direction of travel and the speed of advance mean that that's not something we can be as sure about as we have been in previous elections, that it wouldn't have a serious impact and really confuse, and potentially play into, that kind of fractured information ecosystem that we are currently experiencing, as you mentioned, in the US and here in the UK.

Sam, you want to jump in?

Yeah. It feels like there's a big shift that's really important to name, and that's around the ability to do it at volume, right? That hasn't been quite the issue before; and then, potentially, to personalize it. We've already seen the sense in which you can ask ChatGPT to do something in a particular style.
So when people have done research on this, they look at this ability to create volume, the ability to personalize, which of course can be used to target people. And now we're seeing this commercialized and made available to people. And so all of those may make this a more vulnerable election, and that may come from organized actors as well as from people who just want to have fun with the process, like trolls. So we have to be careful about all of those. I also want to name that one of the biggest threats we've seen, and this is in elections as well, is non-consensual sexual images. We don't always know they've been used, and they're used to target women politicians, women journalists, women activists. So that's definitely going to be happening under the surface even if we don't see it happening visibly, and we should name it as a threat that doesn't get the headlines the way disinformation does, but it's equally harmful to the civic space.

Yeah. YouTube is chiming in here. I'm looking at a comment; it's one of my favorite parts of the show, when I use people's YouTube handles and have to read these kinds of crazy names. But she has real concerns: my fear of AI is folks see it as neutral,
but it's very racist because of the contracts with police and military (I don't know if that's true), and that racism is going to be recreated in art and writing. I'm an artist; there's a lot that can be created in record time, but also those extra biases added that, I guess, are going to... Can you talk for a moment about the way you think race and gender identity might play into what we're getting from AI here?

Yeah, definitely. So I think the easiest answer is that that's not true. There are biases that are built into, say, ChatGPT; we all know that example where it would say all the nice things about Biden but nothing about Trump, and the people at the back end are fixing these every single day. Again, it's humans behind it, so you are going to see some biases in the data. But as more and more new versions of these models come out, there are
different, almost like, stages being built where you can dictate what kind of bias, what kind of opinion, you want it to have. So, for example, if you want it to be more conservative, or if you want it to be more, you know, liberal, whatever you want it to be, you can change it and get opinions more in that direction. So I think as more and more of these models get trained, we're going to see more optionality for people to generate the things they want. And is it going to be used to create harmful content? A hundred percent. But all the biases we see are built into the data by the humans behind it, so the models themselves are not biased yet; they reflect humanity.

Henry, I'm going to ask you a philosophical question, and I'm going to give you 60 seconds to answer it, because the show is going to end whether you do or not. You know, they once said that chess was uniquely human and computers would never be able to
beat humans at chess, and of course they did, and made it look easy. What does this tell us, what does AI tell us, about being uniquely human? Is there something we learn about humanity from this?

Well, thank you for the generous 60 seconds. Yeah, that's a really tough question. I think we've seen time and time again AI able to replicate certain aspects of human performance and capability in quite narrow domains. And this, I think, is one of the first that has really flipped the narrative on its head: it was supposed to be low-skilled, repetitive roles that would be replaced by AI, and now we're seeing creative industries getting really shaken up by this. At the same time, I do think that AI is not able to be fundamentally creative in the way we talk about it in humans. I don't think there is an intentionality; there's not a consciousness which is acting on the world in that way.
So, yeah, sorry, I have to cut it there, Henry. I want to thank you, Henry, Sam and our guest from Opus AI for being on the show today, and all the humans that joined us out there watching. We'll see you next time on The Stream.

Drought is devastating war-torn Somalia, intensifying a hunger crisis that has claimed the lives of around 40,000 people, half of them children under five years old. But at the heart of this tragedy lie tales of the immense resilience and sheer determination of a people fleeing famine while at the mercy of escalating and unforgiving climate change. People in Power: Somalia,
A Fight for Survival, on Al Jazeera.

Is the UN fit for purpose, or, as many critics say, does it simply not get anywhere near enough done for the amount of money that is poured into it? Hard-hitting interviews and thought-provoking stories, on Al Jazeera.

In prison without trial: Al Jazeera journalists remain behind bars in Egypt, one detained since February 2020, another detained since August 2021. Al Jazeera calls for the immediate release of its journalists detained in Egypt. Journalism is not
a crime.

A push for peace over conflict on the final day of the ASEAN summit in Jakarta. Hello, this is Al Jazeera live, and also coming up: Gabon's military rulers release deposed president Ali Bongo, saying he's now allowed to leave the country.
