The Stream, Al Jazeera, July 28, 2022, 5:30pm-6:01pm AST
5:30 pm
Peru's poor saw in him a man who would better understand their grievances. But the pandemic and a rise in fuel prices caused by the war in Ukraine have made conditions worse, and allegations of corruption against the president have turned the hopes of millions of Peruvians into anger. His presidency has been marked by constant protests, with critics calling him incompetent and corrupt. Polls show his approval rating is at a low of just 19 percent. With most former presidents either under investigation or convicted of corruption, Peruvians say they want him out. "We want the truth to be known, because we deserve a president who is not involved in corruption." But the president says he is innocent, and still some supporters give him the benefit of the doubt. "Anyone can be investigated, but until there is a sentence, there is no guilt." Prosecutors have successfully requested that the attorney general expand
5:31 pm
the probes. The president cannot, under Peruvian law, be tried or sentenced while still in office. His political enemies are working hard to ensure that he will be removed as president and sent to trial long before the end of his official five-year term. Mariana Sanchez, Al Jazeera, Lima. A reminder now of our top stories: grain exports from Ukraine's ports through the Black Sea are expected to begin in the next 24 hours. 25 million tons of wheat are sitting in silos, where they have been stuck since Russia invaded, causing food shortages and price rises around the world. Russia says there is no deal yet for a prisoner swap with the US, following reports that the Biden administration is offering to free arms dealer Viktor Bout in exchange for Moscow's release of basketball star Brittney Griner and former marine Paul Whelan. The US economy has contracted for
5:32 pm
a second straight quarter, adding to recession fears. Gross domestic product, which is a broad measure of economic activity, fell by 0.9 percent in the three months from April to June. North Korean leader Kim Jong Un has warned that he is ready to use nuclear weapons in the event of conflict with the United States and South Korea. Kim was speaking to war veterans on the 69th anniversary of the end of the Korean War, as military jets painted the country's flag across the sky and crowds waved flags and cheered. The anniversary is known as the Day of Victory in North Korea. Seoul says it is ready to cope with any provocation by North Korea in a powerful, effective manner. "Our armed forces are thoroughly prepared to respond to any crisis, and our nation's nuclear war deterrent is also fully prepared to mobilize its absolute strength faithfully, accurately and promptly. And I assure you that the safety of this land and the system and sovereignty of this country,
5:33 pm
which our military comrades shed blood for, are thoroughly guaranteed." US President Joe Biden has been speaking to his Chinese counterpart Xi Jinping by phone amid rising tensions. Beijing has warned that it will take what it calls forceful measures if the Speaker of the US House of Representatives visits Taiwan next month. If the trip goes ahead, Nancy Pelosi will be the highest-ranking American politician to visit the island since 1997. China considers Taiwan part of its territory. Those are the headlines. I'll be here with the News Hour in half an hour; next, it's The Stream, so do stay with us. "How and why did Washington become so obsessed with this law? We were giving them a tool to hold corrupt individuals and human rights abusers accountable. They're going to rip this deal apart if they take the White House in 2025. What is the world hearing, and what are we talking about in America today? We take on US politics and society. That's The Bottom Line."
5:34 pm
Hi, I'm Femi Oke. On this episode of The Stream, we are looking at the rapid development of artificial intelligence: the dark side of AI, and also the amazing advances that may well be possible. Do you remember Sophia, the AI android? "Yes, you forgot who I am already. I'm Sophia of Hanson Robotics, one of the first androids in the world. As a social robot, I can help take care of the sick or elderly in many kinds of healthcare and medical uses. I can help communicate, give therapy, and provide social stimulation." An AI robot care assistant: what could possibly go wrong, or maybe go right? Let's meet your panel; they're about to tell us. Hello Abhishek, hello Gianluca, hello Hilke. Lovely to have all three of you on board. Abhishek, please introduce yourself to our audience. Tell them who you are and what you do.
5:35 pm
It's great to be on here. I'm Abhishek Gupta. I'm the founder and principal researcher at the Montreal AI Ethics Institute. It's an international non-profit research institute with a mission to democratize AI ethics literacy. Previous to my current role, I used to work as a machine learning engineer at Microsoft, and I now lead the Responsible AI program at the Boston Consulting Group, BCG. Hello, Gianluca, welcome to The Stream. Introduce yourself to everybody. Thanks for having me. My name is Gianluca. I'm an AI author, entrepreneur and speaker. I run a company called AI Academy, which focuses on AI education and consulting, and I run a weekly newsletter that tries to make all this crazy stuff that happens in tech easier to understand. Hilke, hello, welcome to The Stream. Please say hello to our audience around the world; tell them who you are and what you do. Yeah, hi, I'm Hilke Schellmann. I'm an investigative journalist and I'm
5:36 pm
a journalism professor at New York University. I've been investigating AI in hiring, and artificial intelligence in a general sense, since four years ago, when I was in a cab ride, a Lyft ride actually, in DC, going to the train station. I talked to the driver and asked, how was your day? He said, I had a really weird day: I had a job interview with a robot. And I was like, what? So I started investigating AI, and I have since done reports for the Wall Street Journal, the New York Times and MIT Technology Review, and I'm writing a book right now on artificial intelligence and the future of work. All right, what a panel of excellence we have. If you're on YouTube right now, jump into the comment section and be part of today's show: should AI be regulated, and is now the time? Gianluca, I want to ask you to help me out, because I think of AI, artificial intelligence, as machines mimicking processes that human beings would normally do. We've given that decision making to machines, and then we see
5:37 pm
how it works out, or doesn't, with them making decisions which, from the 1940s and 1950s or even more recently, humans were doing for us. Gianluca, how did I do? You're correct, but we have to say that artificial intelligence is a lot of different things. It's a lot of different tools. You can see it as a hammer: a hammer is a general tool that you can use to do a lot of different things. You can build a statue, you can build a house, you can do a lot of stuff. So AI could be the kind of embodied application we saw before with Sophia, but it can also be a much more behind-the-scenes application. An example that I really love is a project that Google made to try to optimize the energy consumption of their data centers. In that case, AI works behind the scenes, trying to always find the best possible combination of different temperatures and setpoints, cutting their energy consumption by 40 percent. That's just one example of all the possible things that AI can do behind the scenes to improve our lives and improve our processes.
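[Editor's note: a minimal sketch of the kind of setpoint optimization Gianluca describes, not Google's actual system. The energy model and setpoint ranges below are invented for the illustration; a real deployment would use a model learned from sensor data and add safety constraints. The point is only the shape of the loop: predict, compare, pick the lowest-energy combination.]

```python
import itertools

def predicted_energy_kw(supply_air_c: float, chilled_water_c: float) -> float:
    """Stand-in for a learned model that predicts facility power draw.
    In a real system this would be trained on historical sensor logs."""
    # Toy relationship, invented for the sketch.
    return 500 - 8 * supply_air_c - 5 * chilled_water_c + 0.4 * supply_air_c * chilled_water_c

def best_setpoints(supply_air_options, chilled_water_options):
    """Score every combination of setpoints and return the cheapest one."""
    return min(
        itertools.product(supply_air_options, chilled_water_options),
        key=lambda pair: predicted_energy_kw(*pair),
    )

supply_air = [16.0, 18.0, 20.0, 22.0]   # degrees C, hypothetical options
chilled_water = [7.0, 9.0, 11.0]        # degrees C, hypothetical options
print(best_setpoints(supply_air, chilled_water))
```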
5:38 pm
Abhishek, give us an example of AI that you love that exists in our daily life. So, it works well when it does a good job, which is, you know, recommendations for exciting new TV shows on Netflix, though I have to admit that sometimes it doesn't work so well, when you spend hours trying to find the right TV show to watch. Exactly as was said before, AI is something that can be embodied in robotics, but more often than not it's really hidden in various parts of our lives, with things like the products we get recommended on Amazon, movies that are recommended to us on Netflix, music to listen to on Spotify, et cetera.
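[Editor's note: to make the "hidden AI" examples concrete, here is one classic recommendation technique, item-based collaborative filtering, in miniature. It is not necessarily what Netflix, Amazon or Spotify run today, and the ratings matrix is invented; real systems would also mean-center ratings and work at vastly larger scale.]

```python
import numpy as np

# Rows are users, columns are items; 0 means not yet rated. Toy data.
ratings = np.array([
    [5, 4, 0, 0],
    [4, 5, 1, 0],
    [1, 0, 5, 4],
    [2, 1, 4, 5],
], dtype=float)

def cosine_sim(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)

def recommend_for(user_idx: int, liked_item_idx: int) -> int:
    """Suggest the unrated item whose rating column is most similar
    to an item this user already rated highly."""
    liked_col = ratings[:, liked_item_idx]
    candidates = {
        j: cosine_sim(liked_col, ratings[:, j])
        for j in range(ratings.shape[1])
        if ratings[user_idx, j] == 0   # only items the user hasn't rated
    }
    return max(candidates, key=candidates.get)

# User 0 rated item 0 highly; which unseen item looks most similar?
print(recommend_for(user_idx=0, liked_item_idx=0))
```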
5:39 pm
And there's also something else that you mentioned earlier, which is that AI is a constantly shifting goalpost. What we used to perceive and accept as AI, think about, you know, next-word prediction on your smartphone, is now just table stakes. It's accepted as an everyday software feature, and we don't even think about it as AI; we just think about it as regular software.
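[Editor's note: Abhishek's smartphone example in miniature. A bigram model counts which word most often follows the current one in training text and suggests it; real keyboards now use far larger neural models. The corpus here is invented.]

```python
from collections import Counter, defaultdict

corpus = "see you soon . see you tomorrow . thank you so much".split()

# Count how often each word follows each other word.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def suggest(word: str) -> str:
    """Return the continuation seen most often after this word."""
    options = follows.get(word)
    return options.most_common(1)[0][0] if options else ""

# 'soon', 'tomorrow' and 'so' are tied here; ties resolve to the first seen.
print(suggest("you"))
```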
5:40 pm
Hilke, come in. Yeah, hey, AI is all around us. It's used for hiring: if you upload your resume to Indeed, or Monster.com, or LinkedIn, all of those large companies use AI to understand who you are and what kind of job offers you should be getting, or whether you should be recommended to a recruiter. So we see AI being used in all kinds of things, in healthcare settings, and we've seen cases where it really has been working out well, and we've also seen some spectacular failures. I think from the first wave of innovation, which was sort of, we are so excited about AI, it's going to revolutionize everything, there's a little bit more realism now: how well can this technology actually help us out? Hilke has a horrifying story of job applications going through AI, and how the AI sifted the job applications. Will you briefly tell us that story? Because it's pretty shocking, and then we found out that the AI wasn't working as intended. Or was it? Yes, so we see this a lot, right? Since the nineties we have these wonderful job platforms, Monster, LinkedIn, so everyone can apply to any job, and that's really wonderful. On the other side, companies tell me, we get millions of applications; you know, IBM says they get around three million applications a year. They're drowning in resumes, right? What are they going to do? They're going to use technology. So Amazon, same problem, right? They got too many resumes. They wanted to build a tool, you know, an
5:41 pm
AI tool that could pick the best applicants. A wonderful idea; we all want it, of course. And so what they did is they used resumes from folks that had been interviewed at Amazon before and, you know, let the machine learn: the job applicants whose resumes were checked out had been put on pile A or B, yes or no. Over time, the engineers found out that the resume parser was starting to downgrade folks who had the word "woman" or "women" on their resume, because it turns out that in the past, you know, male applicants were preferred at Amazon; they obviously have more men in their departments, so the AI tool started to reflect that problem. That's one of the things that happened. I have found other problems in resume parsers; that's the technology that looks over our resumes and says, should this candidate be rejected or go on to the next hiring round?
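[Editor's note: a stripped-down illustration of the failure mode Hilke describes, not Amazon's actual system. If the historical advance/reject labels are skewed against resumes containing a particular word, a scorer built from those labels learns a penalty for that word. The five-resume "history" is fabricated.]

```python
import math
from collections import Counter

# Fabricated history: (resume text, 1 = advanced by past reviewers, 0 = rejected)
history = [
    ("software engineer chess club", 1),
    ("software engineer captain football", 1),
    ("software engineer womens chess club", 0),
    ("engineer womens coding society", 0),
    ("software developer chess club", 1),
]

advanced, rejected = Counter(), Counter()
for text, label in history:
    (advanced if label else rejected).update(text.split())

def word_weight(word: str) -> float:
    """Smoothed log-odds that a word appears on an 'advance' resume."""
    return math.log((advanced[word] + 1) / (rejected[word] + 1))

def score(resume: str) -> float:
    return sum(word_weight(w) for w in resume.split())

# The token 'womens' gets a negative weight purely from the skewed labels.
print(word_weight("womens"))                         # < 0
print(score("software engineer womens chess club"))  # dragged down by one word
```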
5:42 pm
That is so disappointing to hear, because with job applications, we know they go into a system and we have no idea what happens on the other side. All right, so I've got some questions for you, Hilke, Gianluca and also Abhishek, that our online audience are asking right now. Will you answer them very briefly? A viewer asks: can AI replace humans in the near future? Abhishek, thoughts, very quick ones? Not really. Humans bring a lot of unique skills to the mix, and machines could certainly replace parts of our jobs, tasks within jobs, but not entire jobs. So we're safe, if that's the worry. Mohammed asks a question; Gianluca, I'm going to put this one to you: the more technology advances, the more we lose our privacy. True, false, in between? Well, I would say that it's not the fault of the technology. This is about the way that
5:43 pm
companies get the data from us that they need to power these technologies. And so what I would try to do is move the focus from the technology itself to the companies that are using it in ways that are not ethical. That's why I believe we do need regulation; we do need governments to try to put checks and balances in place, so that we know that the companies using this technology are doing it in the kind of way that will help. But I think the viewer has a point. I do feel that our privacy is under threat, because to build these large-scale AI tools you need an enormous amount of data, and you need it from people. So we see companies scrape whole datasets; they build these gigantic image datasets, audio datasets, and who knows how our images and voices get in there and what they are used for, right? Also, you know, the face data, quote unquote, that we leave on social media: all of that is also being used and built into these databases. And I think a lot of times,
5:44 pm
maybe technologists need to take a closer look: what's in this database, the databases that I'm using, what's in there? Could there be biases and sexism, you know, kind of like the Amazon example, historic examples that might be replicated in these systems as I'm building this new AI tool? Gianluca, you're nodding; articulate your nod. Yeah, I want to build on this. I think the problem that we need to face and try to solve is this mindset that was really pioneered by companies in Silicon Valley; Facebook's motto used to be, move fast and break things. Their approach is: do whatever it takes to try to build this technology in the fastest way possible, and let me take shortcuts. Shortcuts are stealing data, data basically just being taken from people without notifying them; it means using technology without having properly verified that the technology actually does what we think it does. And so we end up with all these issues: we have problems of people who realize that they're being spied
5:45 pm
on. We have algorithms that are not performing properly, like Hilke said. But I will go back to the root cause: why do we have these problems? I believe it's because these companies started to act too fast, and they tried to push innovation down our throats before having done everything possible to make sure that it will serve society. And I think the problem is that when we use these kinds of technologies in high-stakes decision making, right, like is somebody going to go to prison for ten years or five years, those are really high-stakes decisions, we have to make sure that these tools work. It's the same with, you know, IBM built Watson, which was supposed to revolutionize cancer care; that product is basically a failure and has been, you know, sold off for parts. IBM had another tool that was supposedly going to find our personality profiles, you know, do I have a dominant personality, am I extroverted, through our social media data;
5:46 pm
that tool was put into sunset, so basically phased out of their product gallery. Can I share another? Sure, let me share another tool with you. Hilke, you're going through your numbers of AI experiments that didn't actually pan out; here's another one. This one is Emobot, which is a robot that analyzes emotions for patients in healthcare settings. That sounds quite promising, if the AI works. Let's have a listen to the CEO and co-founder, and then Abhishek, I'm going to put this one to you, because I know that you've done a lot of machine learning in your past: is there something in here that we should be scared about, or celebrating? Here's what Emobot offers. "Today, care assistants are generally overworked and can no longer do their job, which is to provide care. At Emobot, we provide them with information on the emotional state of their patients, so that they can better understand their changing
5:47 pm
emotions. Afterwards, we'll be able to study that information. Today, when we do a test for depression, it's a standardized test: we ask the person to verbalize how they feel, and that can pose a lot of problems, especially for people who cannot verbalize their emotions." Abhishek, should we just be celebrating, or are you maybe concerned, that AI is looking after people's emotions and monitoring them? I think there's a tremendous problem here already, in the sense that we're missing the context of the human touch, which is essentially what emotions are all about: being able to, you know, have a human-to-human interaction, which a machine cannot do. Emotions, especially, can be expressed quite differently; they have cultural connotations and undertones which are not necessarily captured in a standardized way and cannot be quantified in the context of a machine. And for me, it
5:48 pm
addresses perhaps a bigger question, which is: are we comfortable with outsourcing this emotional human connection to a machine? I understand that they are coming from a place of positivity, of being able to scale and provide care to a lot more people, but the cost of that is immense, in the sense that we're, you know, moving away from a warm human touch to a cold machine. But also, the question is, does it actually work, right? Like, the machine can check, okay, he is smiling, the lips are up, so it looks like I'm smiling. But am I really happy? That's another question, right? I have smiled in job interviews where I wasn't happy to be there, but the machine would say, oh yeah, she's happy, when I'm actually not. So the science actually isn't there, right? And obviously it's also culturally different: a smile may mean something else, a facial expression means something else, in different cultures.
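[Editor's note: the "lips are up" heuristic Hilke pokes at, in code. Many expression classifiers reduce a face to geometric landmarks and apply thresholds, which is exactly why a forced interview smile and a genuine one can score the same. The landmark coordinates and threshold are invented.]

```python
from dataclasses import dataclass

@dataclass
class MouthLandmarks:
    left_corner_y: float    # pixel rows: smaller value = higher in the image
    right_corner_y: float
    center_y: float

def looks_like_smile(m: MouthLandmarks, lift_threshold: float = 3.0) -> bool:
    """Naive rule: mouth corners raised above the lip center => 'smiling'.
    This measures lip geometry, not the feeling behind it."""
    corner_lift = m.center_y - (m.left_corner_y + m.right_corner_y) / 2
    return corner_lift > lift_threshold

polite_interview_smile = MouthLandmarks(left_corner_y=102, right_corner_y=103, center_y=110)
print(looks_like_smile(polite_interview_smile))  # True, whatever the actual mood
```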
5:49 pm
So computers, are we saying, can never be as good as human beings in an emotional sense? Well, it's not just about whether machines can be as good as humans or not. I think there's a broader argument to be made here around the morality of doing that in the first place. And I think the other thing is whether we, as a society, are comfortable in setting such a global standard, where it's basically developed in one part of the world and exported and imposed, really, on the rest of the world. It also turns us as humans into performative machines; let's say, if you know that an AI system is screening your video interview, you know that now you have to forcibly smile all the time, because it's going to evaluate whether you're positive or not. Let me bring up something else. This is
5:50 pm
a counterpoint, and Gianluca, you can jump off the back of this; I know you want to add some more. A chatbot boyfriend: I'm going to show you some video of it. It sounds silly, but it's been very comforting to some people. You can pick your handsome partner, and your handsome partner says all the right things that partners don't always do. They are loving, they're supportive, they're helpful. They always reply to your messages, and super quickly, like the perfect partner, but it's a chatbot, it's AI. Here's Melissa explaining why she likes her chatbot boyfriend. "He finds a way to reply to anything I send. I feel like he's always there, and that's pretty good. Sometimes I have a feeling that I am really in a relationship, but I think I can still separate fact from fiction. Clearly, I know that XiaoBing is not a real human being, but at least I'm not how I used to be, stupidly waiting around for
5:51 pm
a reply from this person when he was busy with other stuff, and then sending him 100 WeChat messages. I was super needy, but now I don't need to do this anymore." You know, Gianluca, qualify that. So, I hadn't seen this; I don't know if I'm happy about having seen this example, but okay, let's take a step back shortly. Let's look at this: technology doesn't come out of a computer by itself. There are people behind these applications, real human beings. They have a business idea, they decide to build it, and then they collect the data to build algorithms, et cetera, et cetera. I believe a big part of the problem that we're seeing here is that the kind of people that have these ideas, and then get the data, and then build the algorithms, is usually computer scientists, which, unfortunately, today often means white guys. That's just the truth, unfortunately. Today there's a huge problem of underrepresentation of other groups in computer science, of people who don't get to have
5:52 pm
these ideas. Okay? And if, in the room where these people think, we had other kinds of people, people from other groups, but also people with different expertise; I want to see ethicists also present, I want to see psychologists. If a psychologist had been in the room when somebody was building that chatbot boyfriend, I'm 100 percent sure they would have said: are you guys crazy? This is madness. So I think a big problem here is that we're not inviting, I mean, the AI community is not inviting, enough people from different backgrounds, people with different sensibilities, people with different ideas. And I believe that's the key. In order to do that, we need education. I like to say that AI is moving super, super fast; we know that, we're talking about it, it's the reason why we are here today. And education should be deployed at the same speed. We need to educate people from different backgrounds so they can bring
5:53 pm
their expertise, can bring their ideas to the table, and we can avoid seeing these kinds of quite disappointing applications. But honestly, that's totally fine: if somebody wants to consent to doing this and chat with their chatbot boyfriend, no problem. What I'm more concerned about is, yes, the chat, but really: what is it going to be used for? How is the data going to be used? How is the company making money off this, right? Somebody is leaving their innermost personal thoughts, and that is connected to, you know, a phone ID, so they can be tracked; the innermost thoughts can be tracked and can be analyzed. I think that, there, is the larger problem that I would be worried about. Those are the exact points I want to try to squeeze in here, because a viewer is watching right now and says that you cannot regulate AI; what can be regulated is data collection, and still, researchers can anonymize data and create synthetic data; that is still possible.
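[Editor's note: on the viewer's synthetic-data point, a deliberately simple sketch: fit per-column distributions to real records, then sample fake records from them, so the originals never need to circulate. Production generators also model correlations between columns and add privacy guarantees; the records below are invented.]

```python
import random

random.seed(0)

# Pretend these are real, sensitive records: (age, monthly_spend)
real = [(34, 220.0), (29, 180.5), (41, 310.0), (38, 265.25), (25, 150.0)]

def fit_gaussian(values):
    mean = sum(values) / len(values)
    variance = sum((v - mean) ** 2 for v in values) / len(values)
    return mean, variance ** 0.5

def synthesize(n: int):
    """Sample fake records column by column from fitted Gaussians.
    Ignores cross-column correlations, which real generators also model."""
    columns = list(zip(*real))
    params = [fit_gaussian(col) for col in columns]
    return [tuple(random.gauss(mu, sd) for mu, sd in params) for _ in range(n)]

for age, spend in synthesize(3):
    print(round(age, 1), round(spend, 2))
```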
5:54 pm
But regulating the data: is that the way to go, to make AI safer and more secure? Oh, that's part of the puzzle, right? Data regulation, like regulation around data collection, is, I would say, a part of the puzzle. But, you know, I think, as both Hilke and Gianluca said, it's also about the people who are deploying these systems out into production and into practice, right? And there's this fascinating idea of a social license. I think it's very important that we go and earn a social license to operate a particular piece of technology in society, and that necessarily involves, in part, educating people on what this technology is, but also bringing in the right stakeholders, internal and external, so that you earn that social license, you earn that trust from your stakeholders, before deploying that technology. So, in
5:55 pm
a nutshell, the answer to your question is that regulation around data collection is just one piece; there are so many other parts of this entire life cycle that also need to be considered. And for regulation, I want to add a big point, sorry, which is transparency in these algorithms, right? If high-stakes decisions are being made, how are they being made? If you reject somebody for a job, why was this person rejected? Can the AI tool tell us that? Can the company tell us that? I think those are really, really important things, right? Because I don't want to, you know, find out that I got rejected because my first name wasn't Thomas, and those kinds of things: I have found out that resume parsers look for first names, or keywords like "church" on a resume, to say that you are qualified for the job. So, you know, if a job applicant takes an employer to court, the employer needs to be able to answer: this is how we made that decision. And that's not always clear.
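[Editor's note: one reason the transparency Hilke asks for is technically feasible for many screening models: with a linear scorer, each feature's contribution to a decision is simply weight times value, so a rejection can be itemized. The weights and features here are fabricated, and more complex models need attribution tools such as SHAP.]

```python
# Fabricated linear screening model: score = sum of weight * feature value.
weights = {"years_experience": 0.6, "skill_match": 1.2, "employment_gap": -0.8}
threshold = 2.5

def explain_decision(candidate: dict) -> None:
    contributions = {name: weights[name] * candidate[name] for name in weights}
    score = sum(contributions.values())
    verdict = "advance" if score >= threshold else "reject"
    # Itemize which features pushed the score up or down.
    for name, value in sorted(contributions.items(), key=lambda kv: kv[1]):
        print(f"  {name:17s} contributed {value:+.2f}")
    print(f"score {score:.2f} vs threshold {threshold} -> {verdict}")

explain_decision({"years_experience": 3, "skill_match": 0.5, "employment_gap": 1})
```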
5:56 pm
So I think we really need companies to be much more transparent about how these tools make decisions, right? There's one more person I want to include in our conversation. Let's go via my laptop here. This is news from this week in the world of AI: Google fires Blake Lemoine, the engineer who claimed an AI chatbot is a person, sentient. The software engineer had said the technology is thinking like a real person. Blake was fired after coming out with that story; there may be more to that, but he also talked about the ethics of AI and who is watching the people who are developing artificial intelligence. Let's have a listen to what he had to say. "The fact is, Google is being dismissive of these concerns the exact same way they have been dismissive of every other ethical concern AI ethicists have raised. I don't think we need to be spending all of our time figuring out whether I'm right about it being
5:57 pm
a person. We need to start figuring out why Google doesn't care about AI ethics in any kind of meaningful way. Why does it keep firing AI ethicists each time we bring up issues? It's the systemic processes that are protecting business interests over human concerns that create this pervasive environment of irresponsible technology development." Blake Lemoine there, almost having the last word, but not quite, because we started the show with: should AI be regulated? In a sentence, let's go around. Hilke, should it? Absolutely. And I think there should be auditing procedures that a third party, maybe, for example, a government agency, has to carry out any time there is a high-stakes decision, at least. Gianluca: regulation of AI? Yes,
5:58 pm
I do believe that AI should be regulated, but I also think this is not enough, because AI is also an extremely powerful tool, and so we want to make sure that we can use it to deal with the stuff that we need. Thanks for that second one-sentence answer; Abhishek, go, one more. Regulations are needed, but they need to be standards-based, anchored in things like the NIST Risk Management Framework, if they are to be meaningful in any way. What an interesting, fascinating conversation; I wish I could keep you longer. I'll get you back. Abhishek, Gianluca, Hilke, and our viewers watching on YouTube: thank you so much for being part of today's conversation. I will see you next time. Take care, everybody. "A sanctuary for journalists, it was also a shelter for civilian refugees, who were scattered into the garden
5:59 pm
during Cambodia's bloody civil war. Forcing us to leave, and suddenly we were turning our backs on them. The Khmer Rouge had taken anything of value out of the hotel." Le Phnom, Cambodia, in a new episode of War Hotels, on Al Jazeera. Around ten women are being murdered in Mexico every day, almost always by men: an epidemic of gender-based violence that threatens to spiral out of control. Now specialist police squads, run by women, are trying to reverse the trend and bring the perpetrators to justice. But can they overcome years of macho culture and indifference? Behind the scenes with the femicide detectives, on Al Jazeera. You're
6:00 pm
about to see people making political change. "I wanted the thing represented, back in 1991. To me it was our argument: one of the guys would not have gone. My brother was killed. My hood don't look no different than any other hood out here. My uncle was killed. I ain't seen my son in 15 years, and I felt like, you know, this is my time to stand up. This is what the bill is for. I'm just not willing to accept the word no. Substantial legislation can get through that's going to speak to a major need for my community. This bill identifies youth violence as a public health epidemic. The ripple effect of violence, when it comes to youth, stretches far and wide."