
tv   The Stream  Al Jazeera  July 27, 2023 11:30am-12:01pm AST

11:30 am
of what he calls UAP, unidentified aerial phenomena. Another witness recounts a flying black cube: ten miles off the coast of Virginia Beach, two F-18 Super Hornets were split by a UAP. The object, described as a dark grey or black cube inside of a clear sphere, came within 50 feet of the lead aircraft and was estimated to be 5 to 15 feet in diameter. "The mission commander terminated the flight immediately and returned to base. Our squadron submitted a safety report, but there was no official acknowledgement of the incident and no further mechanism to report the sightings." Congress wants to change that, in hopes of answering the age-old question: are we really alone? Al Jazeera, Washington. Hello, I'm Elizabeth Puranam, and these are the headlines on Al Jazeera. A group of soldiers has carried out
11:31 am
a coup in Niger, removing President Mohamed Bazoum from power. They have announced the closure of borders and imposed a nationwide curfew. The soldiers accused the president's government of poor governance. "We, the security and defence forces, have gathered at the National Council for the Safeguard of the Homeland and have decided to put an end to the regime you are familiar with. This follows the continuous deterioration of the security situation and the bad social and economic management. We reaffirm our support for all commitments undertaken by Niger." Israel's national security minister, Itamar Ben-Gvir, has led an incursion into the Al-Aqsa Mosque compound in occupied East Jerusalem. He was accompanied by around 1,000 ultranationalists and settlers under Israeli police protection. Non-Muslims are allowed inside the compound but are not permitted to pray there under a decades-old agreement. And Israeli soldiers have shot and killed a Palestinian teenager during the latest raid in the occupied West Bank. Witnesses
11:32 am
say the 14-year-old hurled rocks at the troops, who responded with gunfire. The Israeli military has been carrying out the nightly raids, it says, to clear the West Bank of armed groups. North Korea's leader Kim Jong Un has hosted Russia's defence minister and a Chinese delegation in Pyongyang, the country's first high-level visitors since the start of the COVID-19 pandemic. He showed off North Korea's military arsenal, including banned ballistic missiles. Russia has launched a new wave of air strikes across Ukraine. Kyiv says its forces intercepted 36 cruise missiles over the Kyiv, Kharkiv and Dnipro regions. Port facilities in Odesa were also targeted; Ukraine says 26 facilities have been damaged in the past nine days. That follows Russia's withdrawal from the grain deal guaranteeing safe passage in the Black Sea. A wildfire has destroyed an ancient church near Palermo on the Italian island of Sicily, and three people have been killed. Extreme heat and strong
11:33 am
winds are fanning fires across the Mediterranean region. Those are the headlines on Al Jazeera. As always, our website, aljazeera.com, has the latest on all of our top stories. Do stay tuned: The Stream is coming up next. Thank you for watching. Al Jazeera is here to report on the people often ignored, but who must be heard. How many other channels can say they will take the time and put extensive thought into reporting from under-reported areas? Of course we cover major global events, but our passion lies in making sure that you are hearing the stories from people in places across the region and so many others. We go to them, we make the effort, we care. On this episode of The Stream, we are looking at the rapid
11:34 am
development of artificial intelligence: the dark side of AI, and also the amazing advances that may well be possible. Do you remember Sophia, the AI android? Here she is. "I'm Sophia of Hanson Robotics, one of the first androids in the world. As a social robot, I can take care of the sick or elderly in many kinds of healthcare and medical uses. I can help communicate, give therapy and provide social stimulation." What could possibly go wrong? Or maybe go right? Let's meet our panel; they're about to tell us. Hello Abhishek, hello Gianluca, hello Hilke. Lovely to have all three of you on board. Abhishek, please introduce yourself to our audience; tell them who you are. "Hi, great to be here. I'm the founder and principal researcher at the Montreal
11:35 am
AI Ethics Institute. It's an international non-profit research institute with a mission to democratise AI ethics literacy. Prior to my current role, I worked as a machine learning engineer at Microsoft, and I now lead the Responsible AI program at the Boston Consulting Group." Good to have you. Gianluca, welcome to The Stream; introduce yourself to our audience. "Hi everybody, thanks for having me. My name is Gianluca. I'm an author, entrepreneur and speaker. I run a company called AI Academy, which focuses on education and consulting in artificial intelligence, and I run a weekly newsletter where I try to make all the crazy stuff that happens in tech easier to understand." And Hilke, hello, welcome to The Stream. Please say hello to our audience around the world; tell them who you are, what you do. "Yeah, hi, I'm Hilke Schellmann. I'm an investigative journalist and a journalism professor at New York University, and I've been investigating AI in
11:36 am
hiring, and artificial intelligence in general, since four years ago, when I was in a cab, a Lyft, in DC, going to the train station. I talked to the driver and asked how his day was, and he said: I had a really weird day, I had a job interview with a robot. And I was like: what? So I started investigating, and since then I've done reports for The Wall Street Journal, The New York Times and MIT Technology Review, and I'm writing a book right now on artificial intelligence and the future of work." All right, that's the kind of experts that we have. If you're on YouTube right now, jump into the comment section and be part of today's show: should AI be regulated, and is now the time to move? I want to ask you to help me out, because how I think of AI is artificial intelligence as machines mimicking processes that human beings would normally do. We've given decision-making to machines, and then we work out: does it work out,
11:37 am
does it not work out, with them making decisions which, probably since the 1940s and 1950s but much more so recently, they weren't doing for us. Gianluca, how did I do? "So, you are correct, but we have to say that artificial intelligence is a lot of different things. It's a lot of different tools. You can see it as a hammer: a general tool that you can use to do a lot of different things. You can build a statue, you can build a house, you can do a lot of stuff. So it could be either the kind of product that we saw before with Sophia, but it can also be a much more behind-the-scenes application. An example that I really love is a project Google made to try to optimise the energy consumption of their data centres. In that case it works behind the scenes, always trying to find the best possible combination of different temperatures and set points, cutting the energy consumption by 40 per cent. That's just one example of all the possible things that AI can do behind the scenes to improve our lives and improve our processes." Abhishek, give us an example of
11:38 am
AI that you love that exists in our daily lives. "So, when it works well, when it does a good job, it's, you know, recommendations for exciting new TV shows on Netflix. But I also have to admit that sometimes it doesn't work well, when you spend hours trying to find the right TV show to watch. I think, exactly as was said before, AI is something that can be embodied in robotics, but more often than not it's really hidden in various parts of our lives, in things like recommendations: the products we get on Amazon, the movies that are recommended to us on Netflix, the music to listen to on Spotify, et cetera. And it's also, as something you mentioned earlier, a constantly shifting goalpost as to what we used to perceive and accept as AI.
11:39 am
Think about, you know, next-word prediction on your smartphone: that's now just table stakes, accepted as an everyday software feature, and we don't even think about it as AI; we just think about it as regular software." "Yeah, right. It's in our emails. It's used for hiring: if you upload your resume to Indeed or Monster.com or LinkedIn, all of those large companies use AI to understand who you are and what kind of job offers you should be getting, or whether you should be recommended to a recruiter. So we see it being used in all kinds of things, in healthcare settings, and we've seen cases where it really has been working out well, and we've also seen some spectacular failures. So I think, from the first iteration, which was sort of: we are so excited about AI, it's going to revolutionise everything, there is now
11:40 am
a little bit more realism, maybe, as to how well this technology can help us." Hilke, you have a horrifying story of job applications going through to Amazon and how the AI sifted them. Will you briefly tell us that story? Because it's pretty shocking. I mean, we found out that the AI wasn't working as intended, wasn't it? "Yeah. So we see this a lot, right? Since the nineties we've had these wonderful job platforms, Monster, LinkedIn, and so everyone can apply to any job, and that's really wonderful. On the other side, companies tell me: we get millions of applications. You know, IBM says they get around a million applications a year. They're drowning in resumes, right? What are they going to do? They're going to use technology. So Amazon had the same problem, right? They got too many resumes. They wanted to build a tool, an AI tool, that could pick the best applicants. A wonderful idea; we all want it,
11:41 am
of course. So what they did is they used resumes from folks that had been interviewed at Amazon before, and let the machine find out how those job applicants' resumes had been checked out and put on a yes pile or a no pile. Over time, the engineers found out that the resume parser was starting to downgrade folks who had the word 'women' or 'women's' on their resume, because it turns out that in the past male applicants were preferred at Amazon; they obviously have more men in their departments, so the AI tool started to reflect that problem. That's one of the things that happened. I've found other problems in resume parsers too; that's the technology that looks at our resumes and says: should this candidate be rejected, or go on to the next hiring round?" That's so disappointing, because as applicants, we know our resumes go into a system.
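The failure mode Hilke describes, a screener trained on past hiring decisions inheriting the bias in those decisions, can be sketched in a few lines of Python. This is a toy illustration only: Amazon's actual model was never released, and the `history` data, tokens and weighting scheme below are all hypothetical.

```python
# Toy sketch: a resume screener "trained" on historical hiring outcomes.
# If past decisions disfavoured resumes containing "women's", the learned
# weights echo that bias, regardless of the candidate's skills.
from collections import defaultdict

# Hypothetical history: (tokens found on the resume, was the candidate advanced?)
history = [
    ({"python", "leadership", "football"}, True),
    ({"java", "captain", "chess"}, True),
    ({"python", "women's", "chess"}, False),
    ({"java", "women's", "volleyball"}, False),
    ({"python", "debate"}, True),
    ({"c++", "women's", "debate"}, False),
]

def token_weights(history):
    """Weight each token by how often it appeared on advanced
    versus rejected resumes in the training data."""
    counts = defaultdict(lambda: [0, 0])  # token -> [advanced, rejected]
    for tokens, advanced in history:
        for t in tokens:
            counts[t][0 if advanced else 1] += 1
    return {t: a - r for t, (a, r) in counts.items()}

weights = token_weights(history)

def screen(resume_tokens):
    """The 'AI' decision is just the sum of learned weights:
    an echo of past human decisions."""
    return sum(weights.get(t, 0) for t in resume_tokens)

# "women's" never co-occurred with an advance, so it drags the score down.
print(weights["women's"])                     # a negative weight
print(screen({"python", "women's", "chess"}))  # penalised candidate
print(screen({"python", "chess"}))             # same skills, higher score
```

The point of the sketch is that no rule about gender was ever written down; the penalty emerges purely from the historical labels the model was fitted to.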
11:42 am
And we have no idea what happens on the other side. All right, so I've got some questions for you from our online audience, who are asking right now. Very briefly, if you could: will AI replace humans in the near future? Abhishek, your thoughts, very quick ones. "Not really. Humans bring a lot of unique skills to the mix, and machines could certainly replace parts of our jobs, tasks within jobs, but not entire jobs. So we're safe, if that's the worry." Gianluca, I'm going to put this one to you, from YouTube: the more technology advances, the more we lose our privacy. True, false, in between? "Well, I would say that it's not the fault of the technology. This is about the way that companies get the data from us that they need to power
11:43 am
these technologies. And so what I would try to do is move the focus from the technology itself to the companies that are using the technology in a way that is not ethical. That's why I believe we do need regulation; we do need governments to try to put checks and balances in place, so that we know that all of the companies using this technology are using it in a way that is ethical." "I think he has a point, but I do feel that our privacy is under attack, because to build these large-scale AI tools you need enormous amounts of data, and you need it from people. So, you know, companies scrape whole datasets; they build these gigantic image datasets, audio datasets, and who knows how our image or our voice gets in there and what it's used for, right? Also the face data, quote unquote, that we leave on social media: all of that is also being used and built into these databases. And I think a lot of the time technologists need to take a closer look at what's in these databases:
11:44 am
what faces am I using, what's not in there, and could there be biases there? Doesn't, you know, the Amazon example show historic patterns that may be replicated in these systems as I'm building this new AI tool?" Gianluca, you're nodding; articulate why. "Yeah, I want to build on this. I think the problem that we need to face and try to solve is this mindset that was really, you know, pioneered by companies in Silicon Valley. Facebook used to say: move fast and break things. That approach is: do whatever it takes to build this technology in the fastest way possible, and that means taking shortcuts. Shortcuts mean stealing data, basically just taking data from people without them knowing; it means using technology without having properly verified that the technology actually does what we think it does. And so in the end you have all these issues: we have people who realise that they are being spied on; we have algorithms that are not performing properly, like you said.
11:45 am
But I would go back to the roots, to the root cause: why do we have these problems? I believe it's because these companies started to move too fast, and to try to push innovation down the road before having done everything possible to make sure that it would serve society." "Oh yes, and I think the problem is that when we use these kinds of technologies in high-stakes decision-making, right, like: is somebody going to go to prison for ten years or five years? Those are really high-stakes decisions, and we have to make sure that these tools work. Same with, you know, IBM's Watson that was supposed to revolutionise cancer care: the product is basically a failure and has been, you know, sold off for parts by IBM. And another tool that was supposedly going to find our personality profiles, am I a dominant personality, am I extraverted, from our social media data: that tool was put into sunset, so basically phased out of
11:46 am
our product gallery." "So, can I share another one?" Sure, let me show another tool with you, OK, as we're going through your list of AI experiments that didn't actually pan out. Here's another one: a robot that analyses the emotions of patients in healthcare settings. That sounds quite promising, if the AI works. Let's have a listen to the CEO and co-founder. And Abhishek, I'm going to put this one to you, because I know that you've done a lot of machine learning in your past: is there something in here that we should be scared about, or celebrating? Here are his thoughts: "Health assistants today are generally overworked and can no longer do their job, which is to provide care at the bedside. We provide them with information on the emotional state of patients, so they can better understand their changing emotions. Afterwards, we will be able to study that information. Today, when we do
11:47 am
a test for depression, it's a standardised test: we ask a person to verbalise how they feel, and that poses a lot of problems, especially for people who cannot verbalise their emotions. It shouldn't be just based on the patient." Abhishek, I'm slightly concerned about a tool that is looking at people's emotions, monitoring them. "I mean, I think there's a tremendous problem here already, in the sense that we are missing the context of the human touch, which is essentially what emotions are all about: being able to, you know, have a human-to-human interaction, which a machine cannot offer. Emotions, especially, can be expressed quite differently; they have cultural connotations and undertones which are not necessarily captured in a standardised way, and they cannot be quantified in the context of the machine. And for me, it raises perhaps a bigger question, which is: are we comfortable with outsourcing this emotional human connection to a machine?
11:48 am
I understand that they are coming from a place of positivity, of being able to scale and provide care to a lot more people, but the cost of that is immense, in the sense that we are, you know, moving away from a warm human touch to a cold machine." "Well, also the question is: does this actually work, right? Like, the machine can check: OK, he's smiling, my lips are up, so I look like I'm smiling. But am I really happy? You know, that's another question, right? I've smiled in job interviews; I wasn't happy, I was just there. But the machine would say: oh, she's happy. And I'm actually not. So the science actually isn't there, right? And obviously that's also culturally different: a smile may mean something else, and a facial expression may mean something else, in different cultures." OK, but so computers can never be as good as human beings, is that what we're saying,
11:49 am
in the emotional space? "Well, it's not just about whether machines can be as good as humans or not. I think there's a broader argument to be made here around the morality of doing that in the first place. And the other thing is whether we as a society are comfortable imposing such a global standard, where it's basically developed in one part of the world and exported and imposed, really, on the rest of the world." "It also turns us as humans into performative machines. If you know that an AI system is screening a video interview, you know that now you have to forcibly smile all the time, because it's going to evaluate whether you're positive and pleasant or not." Let me bring up something else; you can jump off the back of this, because I know you want to add some more thoughts: AI boyfriends.
11:50 am
I'm going to show you some video of it. It sounds silly, but it's been very comforting to some people. You can pick your handsome partner, and your handsome partner does the right things that real partners don't always do: they're always being that supportive, that helpful; they always reply to your messages in super-quick time, like the instant they pop up. But it's a chatbot, and it's AI. Here's Melissa explaining why she likes AI-bot boyfriends: "He replies to anything I send. I feel like he's always there. That's pretty good. I have a feeling that I'm really in a relationship, but I think I can still separate fact from fiction. Clearly, I know that he is not a real human being, but at least I'm not how I used to be, stupidly waiting around for a reply from this person when he was busy with other stuff and then sending him a hundred
11:51 am
WeChat messages. I was super needy, but now I don't need to do this anymore." Gianluca, your thoughts, having seen this? "I'm not the most happy, having seen these examples. But OK, let's take a step back. Let's remember: this technology doesn't come out of a computer by itself. There are people behind these applications, real human beings. They have a business idea, they decide to build it, and then they collect the data to be able to do it, et cetera, et cetera. I believe a big part of the problem that we're seeing here is the kind of people that have these ideas, and then get the data, and then build the algorithms. It's usually computer scientists, which I happen to be, and often, unfortunately, that means guys; that's just the truth. Unfortunately, today there is a huge problem of under-representation of whole groups in computer science. They all have these ideas, OK? And if, in the room where these people think up these ideas, we
11:52 am
had people from other groups, but also people with different expertise: I want to see philosophers present, I want to see psychologists. As you mentioned, if a psychologist had been invited into the room when somebody was building that chatbot boyfriend, I'm 100 per cent sure they would have said: you guys are crazy, this is monstrous. So I think a big problem here is that we're not even inviting, I mean, the AI community is not inviting enough people from different backgrounds, people from different sensibilities, people with different ideas. And I believe that's the key. And in order to do that, we need education. I like to say that AI is moving super, super fast; we know, it's the reason why we are here today. And education should be deployed at the same speed as innovation. We need to educate people with different backgrounds, so that they can bring their expertise and their ideas to the table, and we can avoid seeing these kinds of quite disappointing applications." "So I think, but,
11:53 am
but honestly, it's totally fine if somebody wants to consent to doing this and chat with their AI boyfriend; no problem. What I am concerned about with the chatbot is: what is the data going to be used for, and how? How is the company making money off it? If somebody is leaving their, you know, most personal thoughts there, and that is connected to a phone ID, they can be tracked; their innermost thoughts can be tracked, and they can be analysed. I think that, at scale, is the larger problem. I wouldn't be worried about the boyfriend." To your exact point, let me just squeeze in here: a viewer watching right now says that you cannot regulate AI; what can be regulated is data collection, and that's it. And still, researchers could go over my data and create synthetic data; that is still possible. But regulating the data: is that the way to go to make AI safe
11:54 am
and secure? "That's part of the puzzle, right? Regulation around data collection is, I would say, a part of the puzzle. But, you know, as both Hilke and Gianluca said, it's also about the people who are deploying the systems out into production, into practice, right? There's this fascinating idea of a social licence, and I think it's very important that we go and earn a social licence to operate a particular piece of technology in society. A necessary part of it is educating people on what this technology is, but also bringing in the right stakeholders, both internal and external, so that you earn that social licence, you earn that trust from your stakeholders, before deploying that technology. So I'm not sure the answer to your question is that regulation around data collection is enough; it's just one piece. There are so many other parts to this entire
11:55 am
life cycle that also need to be considered for regulation." "I want to make one more big point, sorry, which is transparency in these algorithms, right? If high-stakes decisions are being made, how are they being made? If you reject somebody for a job, why was this person rejected? Can the AI tool tell us that? Can the company tell us that? I think those are really, really important things. Because I don't want to, you know, find out that I got rejected because my first name wasn't Thomas. And those are the kinds of things that I have found out that resume parsers look for: first names, keywords like 'church' on a resume, to say that then you are qualified for the job. And if a job applicant takes an employer to court, the employer needs to be able to answer: this is how the AI made that decision. And that's not always clear. So I think we really need companies to be much more transparent about how these tools make decisions." All right.
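The transparency Hilke is asking for amounts to this: a screener should return, alongside each decision, the rules that produced it. A minimal sketch in Python, under purely hypothetical rules (the "Thomas" and "church" signals echo the examples mentioned in the discussion; no real vendor's rules are shown):

```python
# Hypothetical screening rules. Each carries a human-readable description,
# so every decision can be explained (and contested) afterwards.
RULES = [
    ("first name is Thomas",
     lambda r: r.lower().startswith("thomas"), 2),
    ("resume mentions 'church'",
     lambda r: "church" in r.lower(), 1),
    ("no target keyword found",
     lambda r: not any(k in r.lower() for k in ("python", "sales", "church")), -2),
]

def screen_with_explanation(resume_text, threshold=1):
    """Return (decision, reasons): the explanation travels with the verdict."""
    score, reasons = 0, []
    for description, test, delta in RULES:
        if test(resume_text):
            score += delta
            reasons.append(description)
    decision = "advance" if score >= threshold else "reject"
    return decision, reasons

# A rejected candidate can now be told exactly which rule fired:
print(screen_with_explanation("Maria Lopez, graphic designer"))
print(screen_with_explanation("Thomas Green, church volunteer and Python developer"))
```

An employer using opaque learned weights instead of listed rules would need a separate explanation step to meet the same standard, which is exactly why the courtroom question in the discussion is hard.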
11:56 am
So there's one more part that I want to include in our conversation. Let's go via my laptop here. This is news from this week in the world of AI: Google fires Blake Lemoine, the engineer who claimed an AI chatbot is a person, saying the technology is thinking like a real person. Blake was fired for coming out with that story; there may be more to that. But he also talked about the ethics of AI, and who is watching the people who are developing artificial intelligence. Let's have a listen to what he had to say: "The fact is, Google is being dismissive of these concerns the exact same way they have been dismissive of every other ethical concern AI ethicists have raised. I don't think we need to be spending all of our time figuring out whether I'm right about it being a person. We need to start figuring out why Google doesn't care about AI ethics in any kind of meaningful way. Why does it keep firing
11:57 am
AI ethicists each time we bring up issues? It's the systemic processes that are protecting business interests over human concerns that create this pervasive environment of irresponsible technology development." Lemoine there, almost having the last word. Not quite, because we started the show with: should AI be regulated? In a sentence, let's poll our guests. Hilke, should it? "Absolutely, and I think there should be auditing procedures that a third party, maybe, for example, a government agency, has to carry out any time there is a high-stakes decision, at least in those cases." Gianluca, regulations, yes or no? "Yes, I do believe that AI should be regulated. But I also think this is not enough, because I see AI as an extremely powerful tool, and so we also want to make sure that we
11:58 am
can use it to do the good stuff that we need." Thank you, I appreciate it. Abhishek, what about you? What regulations are needed? "Regulations are needed, but they need to be standards-based, tied to things like the NIST AI Risk Management Framework, if they are to be meaningful in any way." Oh my goodness, such an interesting, fascinating conversation; I wish I could keep you longer. Thank you, Abhishek, Gianluca, Hilke, and you, our audience watching on YouTube: thank you so much for being part of today's conversation. I will see you next time. Take care, everybody.
11:59 am
Water is life, but in Palestine it's an instrument of occupation. Israel has a way of controlling the majority of Palestinian water resources, and has destroyed hundreds of sanitation structures. Palestinians are being deprived of a universal human right. People in Power investigates: Weaponising Water in Palestine, on Al Jazeera. A filmmaker follows her mother's return to South Sudan after years in exile. "So we came home, and into a vice-presidential position: my mother stepping into the role after my father died. Will it be history repeating itself, or will she more likely be remembered for what she does in this new position?" An intimate
12:00 pm
portrayal of a family in challenging times: No Simple Way Home, on Al Jazeera. When the news breaks: "The story of this village is the same as many spread across the eastern front line: no electricity, no running water." When people need to be heard and the story needs to be told: "The children are unable to go outside; inside it is extremely hot." With exclusive interviews and in-depth reports, Al Jazeera has teams on the ground to bring you more award-winning documentaries and live news. Soldiers in the West African nation have announced a coup, saying they have ousted the president and closed the borders.


