
The Stream : Al Jazeera : July 26, 2023 10:30pm-11:01pm AST

10:30 pm
is covering it up, so it does look like this legislation is likely to pass. It probably won't mean anything gets declassified any time soon; when it comes to the US government, that takes forever. But there's a real chance that Congress is going to force the administration's hand and make them tell us what they know about UFOs. And finally, the Irish singer and musician Sinead O'Connor has died at the age of 56. Her single "Nothing Compares 2 U" reached number one on music charts around the world in 1990. O'Connor was outspoken about religion and feminism, as well as her own mental health and addiction issues. She recently spoke of her trouble accepting the death of her 17-year-old son last year.
10:31 pm
You're with Al Jazeera, with me, Sohail Rahman. Here's a reminder of all the top news stories. Warning shots have been fired in Niger's capital as crowds gather to resist a coup attempt. Supporters of President Mohamed Bazoum tore down fences in protest; he is being detained by his own presidential guard. The president's officials say the mutiny has failed to gain the support of the army. Ahmed Idris is following developments from the capital of neighbouring Nigeria: "Forces loyal to President Mohamed Bazoum have mobilized, and they are positioning themselves in various locations in the capital city. They say they have occupied all the strategic locations in Niamey and other key locations in the country as well."
10:32 pm
"Now they say they are calling on the mutineers to lay down their arms and surrender." Vigils are being held for all the Kenyans killed in anti-government protests. Opposition leader Raila Odinga accuses the government of ordering police to use excessive force during demonstrations against the cost of living. After a brief pause last month, the US Federal Reserve has raised interest rates to 5.5 percent; the quarter-point rise takes the benchmark rate to its highest level since 2001. Hundreds have attended the funeral of a 23-year-old Palestinian man killed during an Israeli raid on a refugee camp in Nablus; the army arrested at least 37 Palestinians during raids across the occupied West Bank. Algeria says wildfires in the country are now all contained. Firefighters have been tackling nearly 100 fires over the past three days. Prosecutors have opened a criminal investigation and ordered the arrest of several suspects on charges of arson. At least 34 people have been killed.
10:33 pm
Of course, you can follow all of those stories on our website, aljazeera.com, updated throughout the day. I'll be back with more news in half an hour; next is The Stream, so do stay with us. Akuol de Mabior follows her mother's return to South Sudan after years in exile: "We came home, and into a vice-presidential position. My mother is stepping into the role that my father died in. Will it be history repeating itself, or will she more likely be remembered for what she does in this new position?" An intimate portrayal of a family in challenging times: No Simple Way Home, on Al Jazeera. Hi, I am Femi Oke. On this episode of The Stream we are looking at the rapid development of artificial intelligence:
10:34 pm
the dark side of AI, but also the amazing advances that may well be possible. Do you remember Sophia, the AI android? Here she is: "Hi, I'm Sophia of Hanson Robotics, one of the first androids in the world. Social robots like me can help take care of the sick or elderly, in many kinds of healthcare and medical uses. I can help communicate, give therapy and provide social stimulation." An AI robot care assistant: what could possibly go wrong? Or maybe go right? Let's meet our panel; they're about to tell us. Hello Abhishek, hello Gianluca, hello Hilke. Lovely to have you all on board. Abhishek, please introduce yourself to our audience; tell them who you are and what you do. Hi, great to be here. I'm the founder and principal researcher at the Montreal AI Ethics Institute. It's an international non-profit research institute with
10:35 pm
a mission to democratize AI ethics literacy. Previous to my current role I used to work as a machine learning engineer at Microsoft, and I now lead the responsible AI program at the Boston Consulting Group. Good to have you. Gianluca, welcome to The Stream; introduce yourself to our viewers. Hi everybody, thanks for having me. My name is Gianluca. I'm an AI author, entrepreneur and speaker. I run a company called AI Academy, which focuses on education and consulting on artificial intelligence, and I run a weekly newsletter where I try to make all this crazy stuff that happens in AI easier to understand. Welcome to The Stream. Hilke, say hi to the audience around the world; tell them who you are and what you do. Yeah, hi, I'm Hilke Schellmann. I'm an investigative journalist and a journalism professor at New York University, and I've been investigating AI in hiring, and artificial intelligence in general, since four years ago, when I was in a cab,
10:36 pm
a Lyft, rather, in DC, going to the train station, and I talked to the driver and asked how his day was. And he said: I had a really weird day, I had a job interview with a robot. And I was like, what? So I've been investigating AI ever since, and I've reported for the Wall Street Journal, the New York Times and MIT Technology Review, and I'm writing a book right now on artificial intelligence and the future of work. All right, look at the kind of experts we have! If you're on YouTube right now, jump into the comment section and be part of today's show: should AI be regulated? Is now the time? Gianluca, I want to ask you to help me out, because I think of AI, artificial intelligence, as machines mimicking processes that human beings would normally do. We've given decision-making to machines, and then we work out: does it work out, or does it not? They're now making decisions
10:37 pm
which back in the 1940s and 1950s, or even more recently, they weren't doing for us. So, Gianluca, how did I do? You are correct, but we have to say that artificial intelligence is a lot of different things; it's a lot of different tools. You can see it as a hammer: a hammer is a general tool that you can use to do a lot of different things. You can build a statue, you can build a house, you can do a lot of stuff. So it could be the kind of robot that we saw before, Sophia, but it can also be a much more behind-the-scenes application. An example that I really love is a project Google made to try to optimize the energy consumption of their data centers. In that case it works behind the scenes, trying to always find the best possible combination of different temperatures and set points, cutting the energy consumption by 40 percent. That's just one example of all the possible things AI can do behind the scenes to improve our lives and improve our processes.
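To make that behind-the-scenes idea concrete, here is a toy Python sketch of AI as an optimizer. The energy model and the set points are invented for illustration only; Google's real system reportedly used deep reinforcement learning, not the brute-force search shown here.

    from itertools import product

    def predicted_energy_kw(chiller_c: float, fan_pct: float) -> float:
        """Stand-in for a learned model of data-center energy use (toy numbers)."""
        return 500 + 8 * (22 - chiller_c) ** 2 + 0.5 * (fan_pct - 60) ** 2

    # Try every combination of cooling set points and keep the cheapest one.
    candidates = product(range(16, 25), range(40, 91, 5))
    best = min(candidates, key=lambda sp: predicted_energy_kw(*sp))
    print("best set points:", best, "predicted load:", predicted_energy_kw(*best), "kW")

The point is only the shape of the approach: a model predicts a cost, and software searches the set-point space for the combination that minimizes it.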
10:38 pm
Give us an example of AI that you love, existing AI that delights us. So, AI works well, when it does a good job, with things like recommendations for exciting new TV shows on Netflix. But I also have to admit that sometimes it doesn't work well, when you spend hours trying to find the right TV show to watch. I think, exactly as was said before, AI is something that can be embodied in robotics, but more often than not it's really hidden in various parts of our lives, in things like the products suggested to us on Amazon, the movies recommended to us on Netflix, the music we listen to on Spotify, et cetera. And there was something else that you mentioned earlier, which is that AI is a constantly shifting goalpost as to what we used to perceive and accept as AI.
10:39 pm
Think about next-word prediction on your smartphone: now it's just table stakes, accepted as an everyday software feature. We don't even think about it as AI; we just think about it as regular software. Yeah, right, it's in our emails. It's used for hiring: if you upload your resume to Indeed, Monster.com or LinkedIn, all of those large companies use AI to understand who you are and what kind of job offers you should be getting, or whether you should be recommended to recruiters. So we see AI being used in all kinds of things, in healthcare settings, and we've seen cases where it really has been working out well, and we've also seen some spectacular failures. So I think, from the first iteration, where we were so excited that AI was going to revolutionize everything, there is maybe a little bit more disillusionment now about
10:40 pm
how well this technology can help us out. Hilke, you have a horrifying story of job applications going through to Amazon and how the AI sifted those applications. Will you briefly tell us that story? Because it's pretty shocking. I mean, we found out that the AI wasn't working as intended. Or was it? Yeah, so we see this a lot, right? Since the nineties we have these wonderful job platforms, Monster, LinkedIn, so everyone can apply to any job, and that's really wonderful. On the other side, companies tell me: we get millions of applications. IBM says they get around 3,000,000 applications a year. They're drowning in resumes, right? What are they going to do? They're going to use technology. So Amazon, same problem: they've got too many resumes. They wanted to build a tool, an AI tool, that could pick the best applicants. A wonderful idea, we all want it, of course. So what they did is they used resumes from folks that had been
10:41 pm
interviewed at Amazon before, and let the machine learn from them: the job applicants' resumes were checked and put on piles, a yes pile or a no pile. Over time the engineers found out that the resume parser was starting to downgrade folks who had the word "women" or "women's" on their resume, because it turns out that in the past male applicants were preferred at Amazon; there are obviously more men in their departments, so the AI tool started to reflect that problem. That's one of the things that happened. I've found other problems in resume parsers, which is the technology that looks at our resumes and says: should this candidate be rejected, or go on to the next hiring round? That is so disappointing, Hilke, because as one of the applicants, you know your resume goes into a system and you have no idea what happens on the other side.
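To make the mechanism Hilke describes concrete, here is a minimal Python sketch, with invented resumes and labels rather than anything from Amazon's actual system, of how a screening model trained on historical hiring decisions absorbs the bias baked into those decisions.

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression

    # Hypothetical history: resumes and whether the candidate advanced.
    # The past "yes" pile happens to exclude resumes mentioning "women's".
    resumes = [
        "software engineer java aws",           # advanced
        "machine learning python research",     # advanced
        "captain women's chess club python",    # rejected
        "women's coding society java",          # rejected
    ]
    advanced = [1, 1, 0, 0]

    vectorizer = CountVectorizer()
    X = vectorizer.fit_transform(resumes)
    model = LogisticRegression().fit(X, advanced)

    # The model assigns a negative weight to the token "women": it has
    # learned the historical preference, not anything about skill.
    weights = dict(zip(vectorizer.get_feature_names_out(), model.coef_[0]))
    print(sorted(weights.items(), key=lambda kv: kv[1])[:3])

Nothing in the sketch mentions gender explicitly; the negative weight comes entirely from the historical labels, which is why auditing the training data matters as much as auditing the code.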
10:42 pm
All right, so I've got some questions for you, Gianluca and Hilke, and also Abhishek, from our online audience, who are asking right now. Very briefly, if you could: Aziz asks, can AI replace humans in the near future? Abhishek, thoughts? Very quick ones. Not really. Humans bring a lot of unique skills to the mix, and machines could certainly replace parts of our jobs, tasks within jobs, but not entire jobs. So we're safe, if that's the worry. Mohammed asks a question on YouTube, and Gianluca, I'm going to put this one to you: the more technology advances, the more we lose our privacy. True, false, in between? Well, I would say that it's not the fault of the technology. This is about the way that companies get the data from us that they need to power these technologies. So I would try to move the focus from the
10:43 pm
technology itself to the companies that are using data in a way that is not ethical. That's why I believe we do need regulation; we do need governments to try to put checks and balances in place, so that we know that the companies that are using this technology are using it in a way that is ethical. Hilke? I think the viewer here has a point, because I do feel that our privacy is under threat. To build these large-scale AI tools you need enormous amounts of data, and you need it from people. So companies scrape whole data sets to build these tools: image data sets, audio data sets, and who knows how our images and our voices get in there and what they're used for, right? Also, the face data, quote unquote, that we leave on social media, all of that is also being used and built into these databases. And I think a lot of times technologists need to take a closer look: what's in these databases, whose faces am I using,
10:44 pm
what's not in there, and could there be a bias? Is there racism, sexism, historic examples, kind of like the Amazon example, that might be baked into these systems as I'm building this new AI tool? Do I want to use this data for the thing I'm building, or not? Gianluca. I think the problem that we need to face and try to solve is this mindset that was really pioneered by companies in Silicon Valley; Facebook's motto was "move fast and break things." That approach says: do whatever it takes to build this technology in the fastest way possible, and that means taking shortcuts. Shortcuts are stealing data, which is basically just taking data from people we don't know, without informing them. It means using technology without having properly verified that the technology actually does what we think it does. And then you have all these issues: we have people who realize that they are being spied on, we have algorithms that are not performing properly, like you said. But I would go back to the root cause:
10:45 pm
why do we have these problems? I believe it's because these companies started to move too fast and tried to push innovation out the door before having done everything possible to make sure it would serve society. Oh yes, and I think the problem is that when we use these kinds of technologies in high-stakes decision-making, like, is somebody going to go to prison for 10 years or 5 years, those are really high-stakes decisions, and we have to make sure that these tools work. Same with IBM's Watson, which was supposed to revolutionize cancer care: the product is basically a failure and has been sold off by IBM. And another tool that was supposedly going to infer our personality profiles, am I a dominant personality, am I extraverted, from our social media data. That tool was put into sunset, so basically phased out of their product gallery. So, Femi, can I share another one, can I?
10:46 pm
Sure. Actually, let me share another tool with you. OK, you're giving us your list of AI experiments that didn't actually pan out; here's another one. This one is Emobot, a robot that analyzes the emotions of patients in healthcare settings. That sounds quite promising, if the AI works. Let's have a listen to the CEO and co-founder, and Abhishek, I'm going to put this one to you, because I know that you've done a lot of machine learning in your past: is there something in here that we should be scared about, or celebrating? Here are his thoughts. "Today, care assistants are generally overworked and can no longer do their job, which is to provide care. At Emobot we provide them with information on the emotional state of patients, so they can better understand their changing emotions; afterwards we are able to study that information. Today, when we do a test for depression, it's a verbal test: we ask a person to verbalize how they feel, and that poses a lot of problems,
10:47 pm
especially for people who cannot verbalize their emotions." Abhishek, I have to say I'm a bit concerned: a tool where the AI is looking at people's emotions, monitoring them? I mean, I think there's a tremendous problem here already, in the sense that we are missing the context of the human touch, which is essentially what emotions are all about: being able to have a human-to-human interaction, which a machine cannot offer. Emotions especially can be expressed quite differently; they have cultural connotations and undertones which are not necessarily captured in a standardized way, which cannot be codified in the context of a machine. And for me it raises perhaps a bigger question, which is: are we comfortable with outsourcing this emotional human connection to a machine? I understand that they are coming from
10:48 pm
a place of positivity, of being able to scale and provide care to a lot more people. But the cost of that is immense, in the sense that we're moving away from a warm human touch to a cold machine. Well, also, the question is: does this actually work, right? The machine can check: OK, the lips are turned up, so it looks like I'm smiling. But am I really happy? That's another question, right? I smile in job interviews when I'm not happy, and an AI would see it and say, oh, she's happy, but I'm actually not. So the science actually isn't there, right? And obviously that's also culturally different: a smile may mean something else, a facial expression means something else, in different cultures. So can we then say computers can never be as good as human beings at reading emotions? Is that what we're saying, Abhishek? Well, it's not just about whether
10:49 pm
machines can be as good as humans or not. I think there's a broader argument to be made here around the morality of doing that in the first place. And the other thing is whether we as a society are comfortable imposing such a global standard, where it's basically developed in one part of the world and exported and imposed, really, on the rest of the world. It also turns us as humans into performative machines: if you know that an AI system is screening a video interview, you know that now you have to forcibly smile all the time, because it is going to evaluate whether you are positive, for example, or not. Let me bring up something else. This is a tangent to your point, and Gianluca, you can jump off the back of this, because I know you want to add some more thoughts. AI boyfriends: I'm going to show you some video of it. It sounds silly,
10:50 pm
but it's been very comforting to some people. You can pick your handsome partner, and your handsome partner does the right things that partners don't always do: they're attentive, they're supportive, they're helpful, they always reply to your messages in super quick time. It's not a person popping up, it's a chatbot, and it's AI. Here is Melissa, explaining why she likes AI boyfriends: "He replies to anything I send. I feel like he's always there; that's pretty good. I have a feeling that I'm really in a relationship, but I think I can still separate fact from fiction. Clearly I know that he is not a real human being, but at least I'm not how I used to be, stupidly waiting around for a reply from this person when he was busy with other stuff and then sending him a hundred WeChat messages. I was super needy, but now I don't need to do this anymore." Gianluca, your thoughts?
10:51 pm
So, I hadn't seen this, and I'm not sure I'm happy about having seen these examples. But OK, let's take a step back. Let's remember that this technology doesn't come out of a computer by itself; there are people behind these applications, real human beings. They have a business idea, they decide to build it, and then they collect the data to be able to train the algorithms, et cetera, et cetera. I believe a big part of the problem that we're seeing here is that the kind of people who have these ideas, who get the data, and who build the algorithms are usually computer scientists, which I happen to be, and often that means guys; that's just the truth, unfortunately. Today there's a huge problem of under-representation of other groups in computer science. OK. And I wonder, in the room where these people think up these kinds of applications, do we have people from other groups,
10:52 pm
but do we also have people with different expertise? I want to see ethicists present, I want to see psychologists. If a psychologist had been invited into the room when somebody was building the chatbot boyfriend, I'm 100 percent sure they would have said: are you guys crazy? This is madness. So I think a big problem here is that we, I mean the AI community, are not inviting enough people from different backgrounds, people from different disciplines, people with different ideas. I believe that's the key, and in order to do that, we need education. I like to say that AI is moving super, super fast; we know that, it's the reason why we are here today. And education should be deployed at the same speed as innovation. We need to educate people with different backgrounds so that they can bring their expertise and their ideas to the table, and we can avoid seeing these kinds of quite disappointing applications. But, but honestly, it's totally fine
10:53 pm
if somebody wants to consent to doing this and chat with that AI boyfriend, no problem. What I am more concerned about is, yes, the tech, but really: how is the data going to be used? How is the company making money? Like I said, somebody is leaving their most personal thoughts in there, and that is connected to a phone ID; they can be tracked, their innermost thoughts can be tracked and analyzed. That, at scale, is the larger problem that I would be worried about in the background. Exactly the point. Let me just squeeze in here, because Aziz is watching right now, and Aziz says that you cannot regulate AI; what can be regulated is data collection, and that's it, and still researchers can take my data and create synthetic data, that is still possible. But regulating the data: is that the way to go to make AI safe, secure,
10:54 pm
and helpful? Abhishek, is that part of the answer, or is he right? Data regulation, regulation around data collection, is, I would say, a part of the puzzle. But, as both Hilke and Gianluca have said, it's also about the people who are deploying these systems out into production, into practice. There is this fascinating idea of a social license, and I think it's very important that we go and earn a social license to operate a particular piece of technology in society. A necessary part of that is educating people on what this technology is, but also bringing in the right stakeholders, both internal and external, so that you earn that social license, you earn that trust from your stakeholders, before deploying the technology. So the answer to your question is that regulation around data collection is just one piece; there are so many other parts to this entire AI life cycle that also need to be considered for regulation.
10:55 pm
One more big point, sorry, is also transparency in these algorithms, right? If high-stakes decisions are being made, how are they being made? If you reject somebody for a job, why was this person rejected? Can the AI tool tell us that? Can the company tell us that? Those are really, really important things, because I don't want to be rejected, and not even know that I got rejected, because my first name wasn't Thomas. And those are the kinds of things that I have found out: resume parsers look for first names, or keywords like "church" on a resume, to decide that you are qualified for the job. And if a job applicant takes an employer to court, the employer needs to be able to answer: this is how the AI made that decision. That's not always clear, so I think we really need companies to be much more transparent about how these tools make decisions.
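As an illustration of the shallow keyword logic described here, consider this minimal Python sketch. The signals and weights are hypothetical, not any vendor's actual product; the point is that recording which signals fired is the bare minimum of the transparency being asked for.

    # Hypothetical keyword weights of the sort found in flawed resume parsers:
    # spurious proxies, not job skills.
    SIGNALS = {"thomas": 2.0, "church": 1.5, "baseball": 1.0}
    THRESHOLD = 2.0

    def score_resume(text: str):
        """Return a score plus the keywords that produced it, so the
        decision can be explained (and challenged) later."""
        hits = [kw for kw in SIGNALS if kw in text.lower()]
        return sum(SIGNALS[kw] for kw in hits), hits

    score, reasons = score_resume("Thomas Smith, church volunteer, Python developer")
    decision = "advance" if score >= THRESHOLD else "reject"
    print(decision, score, reasons)  # the "qualification" here is a first name

A candidate scored this way advances because of a first name, which is precisely the kind of reasoning an employer should be able to disclose if the decision is challenged in court.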
10:56 pm
All right, there's one more voice that I want to include in our conversation; let's go to my laptop here. This is news from this week in the world of AI: Google fires Blake Lemoine, the engineer who claimed the AI chatbot he was testing was sentient, that the technology was thinking like a real person. Blake was fired after coming out with that story, and there may be more to it than that, but he also talked about the ethics of AI, and about who is watching the people who are developing artificial intelligence. Let's have a listen to what he had to say. "The fact is, Google is being dismissive of these concerns the exact same way they have been dismissive of every other ethical concern AI ethicists have raised. I don't think we need to be spending all of our time figuring out whether I'm right about it being a person. We need to start figuring out why Google doesn't care about AI ethics in any kind of meaningful way. Why does it keep firing
10:57 pm
AI ethicists each time we bring up issues? It's the systemic processes that are protecting business interests over human concerns that create this pervasive environment of irresponsible technology development." Lemoine there, almost having the last word, but not quite, because we started the show with "should AI be regulated?" In a sentence, let's poll our guests. Hilke, should it? Absolutely, and I think there should be auditing procedures that a third party, maybe for example a government agency, has to carry out any time there is a high-stakes decision, at least. Gianluca, regulation for AI? Yes, I do believe that AI should be regulated, but I also think that's not enough, because AI is also an extremely powerful tool, and so we also want to make sure that we can use it to do the good stuff that we need, you know.
10:58 pm
Thank you. Abhishek, in a sentence, and make it a good one. Regulations are needed, but they need to be standards-based, built on things like the NIST Risk Management Framework, if they are to be meaningful in any way. Oh my goodness, such an interesting, fascinating conversation; I wish I could keep you longer. Thank you, Abhishek, Gianluca, Hilke, and all of you watching on YouTube; thank you so much for being part of today's conversation. I will see you next time. Take care, everybody.
10:59 pm
Water is life, but in Palestine it is an instrument of occupation. With Israel controlling the majority of Palestinian water resources and destroying hundreds of sanitation structures, Palestinians are being deprived of a universal human right. People & Power investigates: water in Palestine, on Al Jazeera. In 1958 Charles de Gaulle made a famous speech in Algeria, but it could not hold back the tide of Algerian independence, nor keep France's colonies in Africa and the Pacific. In the final episode of the series, Al Jazeera explores how the bitter fight for the French empire still resonates today: French
11:00 pm
Decolonisation, on Al Jazeera. The latest news, as it breaks: "On the one side the authorities are appealing for calm; on the other side, people demanding justice." With detailed coverage: "The war has resulted in the closure of many hospitals, and that puts a lot of pressure on the medical staff here." From around the world: "The operation in Jenin would last no more than 48 hours; the consequences, the impact of what has happened, will last for years." You're with Al Jazeera, with me, Sohail Rahman, in Doha; a reminder of our top news stories. Warning shots have been fired in Niger's capital as crowds gather to resist a coup attempt. Supporters of President Mohamed Bazoum...
