
Nowhere To Hide  Al Jazeera  June 20, 2019 3:00pm-4:01pm +03

3:00 pm
higher fraction of all the crimes committed by Black people than committed by white people. So you turn an algorithm loose on that, it'll say, wow, Black people are really dangerous, but we can ignore white people. So what happens is that the algorithm calcifies, or embodies, the bias in the police. A really good summation of what's really going on there. Hamid Khan is one of the lead organizers of the Stop LAPD Spying Coalition, a collective that campaigns against what it believes to be growing police surveillance and criminalization of the local community. In 2018 the coalition took the Los Angeles Police Department to court, forcing it to release the details of its predictive policing program. There are two layers to predictive policing. One is
3:01 pm
a community- and location-based program where algorithms are used, and the company PredPol has developed that algorithm, which was owned by Jeff Brantingham, a professor of anthropology, and it has a long history itself, how this thing was created on the battlefields of Afghanistan and Iraq, coming directly from the border, from the war zones. And the other piece is Operation LASER, which is a person- and location-based predictive policing program. LASER stands for Los Angeles Strategic Extraction and Restoration program, and the reason why it's called LASER is that its creators said, we want to go into the community with a medical-type precision and extract tumors out of the community, like a laser. That's how they came up with the acronym, as if people are tumors to be extracted.
3:02 pm
What PredPol and LASER claim to offer is a one-stop crime prediction shop. The pitch is to tell police not just where crime will occur, but also who might commit crimes in the future. The LAPD was using these technologies to decide where to deploy their police patrols, focusing resources on so-called crime hotspots. So these are all the hotspots for a particular time period. Hotspots are created by the algorithm, by PredPol, where they use the information, long-term crime history or short-term crime history, and then they create these 500-by-500-square-foot hotspots. On what basis? How are they deciding this? So, to put it very bluntly, there's a lot of pseudoscience involved. It's being presented as if these computers are really powerful and they can predict when crime may happen. But
3:03 pm
predictive policing doesn't just flag up a place. With LASER, it also sticks to a person. The LAPD maintains something called a Chronic Offenders Bulletin. These bulletins are undisclosed reports on so-called persons of interest, people the police believe to be likely to break the law. This risk is calculated using a points-based formula, based on data from police records, field interviews and arrest reports. This is pulled together and scored by algorithmic software created by the defense contractor Palantir, a company with close ties to the US military. So how do you get yourself onto the LASER system? So these are the things that identify these risks. So if you're stopped, a field interview card is filled: one point. So if there's another incident, if the police stop you again, you've got another point. Immediately you're up to two points. This individual was stopped the same day three times, so that's three points right there. If there had been
3:04 pm
a previous arrest with a gun, five points. If you have any violent crime, five points. Parole and probation is five points, and being identified as gang affiliated, five points. When it comes to the Chronic Offenders Bulletin, points can mean prison. But it's not just about locking people up. For Hamid Khan, the data suggests increased police attention at the borders of a historically deprived area called Skid Row, which helps keep the poor contained away from the more affluent neighborhoods nearby. So this is like a beachhead, so think of it as the defense of the financial district from poor people. You know, when we talk about hotspots, you will see the dirty divide, how the proximity of extreme wealth and extreme poverty coexist right here, about two blocks from here. Absolutely.
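To make the arithmetic of the bulletin concrete, here is a minimal sketch, in Python, of a points-based risk score of the kind described above. Only the point values quoted in the program (one point per field interview card, five points each for a gun arrest, a violent crime, parole or probation status, and an alleged gang affiliation) come from the transcript; the function and field names are illustrative, not the LAPD's or Palantir's actual implementation.

# Illustrative sketch of a points-based "chronic offender" score.
# Point values follow the figures quoted in the program; field names and
# structure are hypothetical, not the real LAPD/Palantir system.
POINTS = {
    "field_interview_card": 1,   # each documented police stop
    "arrest_with_handgun": 5,
    "violent_crime_arrest": 5,
    "parole_or_probation": 5,
    "gang_affiliation_flag": 5,
}

def chronic_offender_score(record):
    """Sum points over a person's police-contact record (counts per category)."""
    return sum(POINTS[key] * record.get(key, 0) for key in POINTS)

# Example: stopped three times in one day, on probation, flagged as gang affiliated.
print(chronic_offender_score({"field_interview_card": 3,
                              "parole_or_probation": 1,
                              "gang_affiliation_flag": 1}))   # prints 13

Under a scheme like this, repeated stops alone can push someone up the list, which is exactly the compounding effect the program describes.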
3:05 pm
Hey, how you doing? Good to meet you, General. I meet Steve Richardson, who goes by his street name, General Dogon. He's a former prisoner and Skid Row resident who now works with the coalition, campaigning for greater protection for the local community. Say, you guys have been doing work on this predictive policing stuff, right? What does that look like out here on the street, to people who live here? So predictive policing rolls out in a lot of ways, because, I mean, Skid Row is ground zero for all experiments that happen, you know. This is poor folks, of course, so all of the programs, LAPD spy programs, everything that they come out with, is first tested right here. The Safer Cities Initiative started right here on Skid Row, and it actually made it the most policed community not only in America but second in the world to Baghdad. We got
3:06 pm
all kind of patrols on Skid Row. So we got the cops on motorcycles, we got regular cars, we got detail cops, there's all police in like a 15-block area, we got cops on horses right smack in the middle of the 'hood. You know, they have all these things that continue to come out here from the LAPD. About 80 percent of people here are Black, right, about 80 percent of people suffer from something, a disability, maybe physical, you know, or mental, and all of us is poor folks. The most arrested person on Skid Row was a woman, Ann Moody, arrested 108 times for violating 41.18(d), right, 41.18(d) is the municipal code that says you can't sit, sleep or lie on a public sidewalk. So her only crime was she was homeless and ain't have anywhere
3:07 pm
to go, and was forced to sleep in public space. She got arrested 108 times, just for being in public space, and it was all, you know, based on a lot of predictive stuff like that. And the point you, Dogon, and you, Hamid, are making is that this is a practice that goes way back, right? Over-policing in this community goes back decades, and then the information from that gets fed into the computer, and the computer turns around and says, well, go back and do some more of the same thing. Right, and even before the information gets in, the algorithm is designed for policing, so the algorithm will create the outcomes that an agency wants to achieve, and this is really the key point: the outcome that the agency wants to achieve in this community is cleansing and banishment.
3:08 pm
We walk further along. The tents begin to thin out, as do the local residents gathered on the sidewalk. It's obvious we're approaching the outer limits of Skid Row, the hotspot boundary Hamid had pointed out earlier. This is like a sort of storm of hotspots that a person from Skid Row would be walking into, and this is where you will have more policing waiting for people, waiting for people to give them tickets, waiting to throw them against the wall, right, to intimidate and harass and demand that they leave the neighborhood. A few weeks after we left Skid Row, the LAPD announced that it was cancelling the LASER program. The pushback had worked; police admitted the data was inconsistent. But the LAPD says the predictive policing tool PredPol is still in operation.
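As a rough illustration of the feedback loop described here, where arrest records produced by past patrol decisions are fed back into the model that directs the next patrols, the following Python sketch simulates two districts with an identical underlying offence rate, one of which starts with more recorded arrests simply because it was patrolled more heavily. This is a toy simulation written for this piece under stated assumptions, not PredPol's or Operation LASER's actual code.

import random

random.seed(0)

TRUE_RATE = 0.05          # identical underlying offence rate in both districts
STOPS_PER_PATROL = 20     # people checked by each patrol shift (assumed)

# Historical record: district A was patrolled, and therefore arrested, more.
recorded = {"district_a": 30, "district_b": 10}

for week in range(52):
    total = sum(recorded.values())
    # "Predictive" step: allocate 10 patrol shifts in proportion to recorded crime.
    patrols = {d: round(10 * recorded[d] / total) for d in recorded}
    # Data-generating step: offences are only recorded where patrols are looking.
    for d in recorded:
        checks = patrols[d] * STOPS_PER_PATROL
        recorded[d] += sum(random.random() < TRUE_RATE for _ in range(checks))

print(recorded)

Because new records can only be generated where patrols are sent, district A's recorded crime stays several times higher than district B's, and the allocation tends to drift further toward A, even though the two districts behave identically. That is the sense in which the algorithm "calcifies" the bias already present in the data.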
3:09 pm
So let's think about the incentive structures with some of the predictive policing tools that we've been talking about. What does it say about the incentives, and the problems we're going to have with these tools, that you've got counterinsurgency software essentially being used for law enforcement purposes? I hate to have such a sinister interpretation, but I think it's about opening up new markets to sell this software to, and law enforcement in the US has, you know, been a great market for lots of military technologies, quite frankly. I think that there's actually the opposite incentive: they have the incentive to get it wrong. The people selling predictive policing software have an incentive to make the sale with police, so their incentive is to make predictions that are as close as possible to what the police already believe is correct. So given that it's really hard to know if AI has been trained on representative data or not, if we have real reason to suspect, for example, that there might be bias, then isn't there a question about whether the system should be used at all? Well, I think that's the
3:10 pm
fundamental issue, is that we're seeing the deployment of all kinds of automated decision-making systems, or AI, however we want to characterize it, and we don't know the effects until after the fact. After the damage has been done is primarily how we're learning, quite frankly, about what doesn't work. And I think it goes far beyond bias. I mean, we're talking about aggregating data about us, building data profiles that foreclose certain types of opportunities to us. And what's more dangerous, I think, in the digital age is that, you know, in the 1950s, if you tried to get a mortgage, or you were Black and tried to get a mortgage at a bank and you were discriminated against, you were very clear about what was happening. That discrimination was not opaque. When it moves into a software modelling system, what instead you have is a banker who's like, you know, I'm sorry, Dr Noble, you just can't have it, and I don't
3:11 pm
really know why. And so that lack of transparency is one of the things that I think we're kind of trying to contend with here, and this just becomes a wholly normalized process we don't understand. Or, you know, the models for actuarial science, for determining whether you're going to pay more for insurance, for example, because you live in a particular zip code, don't even account for these histories of racial segregation, housing covenants, real estate covenants. So just because we look at the zip code, that doesn't tell us about this long history of discrimination that has sequestered people into particular zip codes. Those are the kinds of things that I feel like, over time, become harder and harder to see. I think one of the things that I find worrisome is that we talk about data being collected for these kinds of systems, and for the most part it was just collected for some completely different purpose; it just happens to be there. In policing, data is created by the police doing
3:12 pm
what they do. They're driving around, they're stopping people, they're occasionally arresting people and so forth. That data gets produced and then is used in a predictive policing model. It's not collected for the predictive policing model; that's a second-order effect. It's used because the data is already there, and it turns out that it is a terrible way to predict where future crime will be, because what police do is not collect a random sample of all crime; they collect the data they can see. This is true in most of the places where people are applying it. I think it is useful to detect where bias is happening, and simulation can be important, I think that's true. However, it doesn't necessarily allow people to have, again, this conversation, that I have been discriminated against. It's just sort of leaving it to the expert analysis to make that discovery, when in fact there are
3:13 pm
a whole bunch of people that wanted to be homeowners, or, you know, wanted to move house, and they don't really understand why these decisions are happening. So as a data scientist, what's your take on this? How do we build a kind of test for when it's appropriate at all to use machine learning, and when it's not? The question should be, who bears the cost when a system is wrong? So if we unpack a particular system, and we say, OK, we're building a machine learning system to serve ads, and the ad that we're serving, oh, this customer's searching for sneakers, but we served her boots instead, oh dear, we were wrong. There, no one cares. That's a meaningless, meaningless problem. The consumer couldn't care less; we get wrong ads all the time, we're trained to ignore them. Let's compare that to a system which makes a prediction about whether or not someone should get credit. In a credit-based system, if we're wrong, the consumer who should have gotten credit doesn't get it, or the consumer who should not have gotten credit does get it. In
3:14 pm
both cases, and in particular in the case where someone who should have gotten credit does not get it, that consumer bears the cost of the error. She doesn't get whatever it was that she needed the credit for, to buy a house or a car or something else. The company that failed to offer the loan may bear a small cost, but there are a lot of customers, so they don't really bear much of a cost. And so when the customer bears the harm, we can predict that the harms will be greater, because the people deploying the systems have little incentive to get it right. We know that if people of color are over-policed, or poor people are over-policed and over-arrested, they are also likely to be over-sentenced. Machine learning isn't just used to predict crime. It's also used to decide whether a person should be given bail, or how long a sentence
3:15 pm
a prisoner serves. Criminal courts in the state of Florida use a predictive sentencing program called the Correctional Offender Management Profiling for Alternative Sanctions, COMPAS. In 2016, journalists at the US news outlet ProPublica investigated COMPAS and discovered an apparent racial bias at the heart of its algorithm. ProPublica did an investigative report, and one of the things that they found in their hand review of all of the records was that African Americans were twice as likely to be predicted to commit future crime. I found it incredibly interesting, for example, the story that one of the reporters told, that there was a young Black woman who had taken a bike from one of her neighbors' front yards and kind of ridden it around, and the
3:16 pm
person who owned the bike said, bring that bike back, and so she did. But a neighbor called the police on her, and she spent 10 days in jail, and the COMPAS software gave her a score of 8 out of 10, that she was likely to commit a crime again. And they looked at a white man who had a history of violent crime, a history of being in and out of jail, and the software gave him a 3, so she was rated as more likely to reoffend. Once again, the bias in society was revealing itself in the machine. Voters beaten, polling stations stormed: 12 Catalan politicians on trial for their role in a referendum on Catalonia's independence, their political opponents in the prosecutor's
3:17 pm
seat, for a case that raises crucial questions about democracy and self-determination. But is the outcome already decided by a hostile Spanish state? The Catalonia Trials: Justice or Vengeance, on Al Jazeera. It's my privilege to name Al Jazeera English the broadcaster of the year. The cartels are fighting each other. This is the largest demonstration that's been held by Rohingya refugees since over 700,000 fled across the border. Al Jazeera English, proud recipient of the Broadcaster of the Year award. Al Jazeera World meets some extraordinary women who are making things happen their way, following their daily struggle to survive and
3:18 pm
for their families to thrive. Egypt's Women Street Sellers, on Al Jazeera. Hello from Doha with the top stories on Al Jazeera. UN special investigator Agnes Callamard has described last year's murder of Jamal Khashoggi as an extrajudicial killing for which Saudi Arabia is responsible. Her report calls for an international investigation into the responsibility of Saudi officials at the highest level, including Crown Prince Mohammed bin Salman. When Mr Khashoggi
3:19 pm
loses consciousness, there is no evidence of the people that are there in the room attempting to take care of him, attempting to resuscitate him, attempting to do something. That, to me, points to the fact that the notion that there was an accident, that the asphyxiation just happened, doesn't quite match what I have heard. There is no attempt to do something, and there is no scream, or any expression of fear over what's happening. The US says one of its military drones has been shot down over the Strait of Hormuz by an Iranian surface-to-air missile. It contradicts what Iran's Revolutionary Guard said earlier, that it shot down a US spy drone after it entered Iranian territory over the southern Hormozgan province. And the US
3:20 pm
special envoy to Iran is heading to the Gulf region. Brian Hook will hold meetings in several countries, including Saudi Arabia, the United Arab Emirates and Kuwait. President Trump and Secretary Pompeo have expressed very clearly our willingness to negotiate with Iran when the time is right, our desire for peace, our readiness to normalize relations should we reach a comprehensive deal. We have put the possibility of a much brighter future on the table for the Iranian people.
3:21 pm
The Big Picture.
3:22 pm
The risks of bias baked into machine learning aren't just confined to law and order. Upon release, prisoners must reintegrate into a world that is increasingly automated. Today, for them, as for you and me, opaque computerized systems will help decide their access to state welfare, to private finance and to housing. Take credit scores. These are shorthand for a person's financial trustworthiness. In many ways credit scores are the gatekeepers to opportunity, and increasingly they're produced by algorithms fed on data, blind to context and history. If that credit report comes back with
3:23 pm
a low score, that means this individual is supposedly a high risk, so you begin to sort of just go around in a circle. Low credit score, criminal background: can't get housing. Because you don't have housing, you can't get a job, because the job that you're applying for requires a permanent residence. You're therefore stuck in this cycle of non-opportunity. You're at the whim of a machine-driven system that decides on the basis of different criteria that are unknown to you. This is one of the darkest topics of our era. There are human biases in targeting on the battlefield, there are human biases in who gets loans, there are human biases in who is subject to arrest, and these human biases are horrible. Couldn't we fix it with algorithms that
3:24 pm
wouldn't be biased? But then it turns out the algorithms are perhaps worse. The algorithms have refined the worst of human cognition rather than the best, because we don't know how to characterize the best. I went to the Work Rebooted conference in the heart of the tech industry, San Francisco, California, to see if AI could be used to bring out the best in human endeavor. Some people are going to do well; some people could do less well. I met Ben Pring, who heads the Center for the Future of Work at Cognizant, a multinational corporation specializing in IT services. I know a lot of people are anxious about the whole notion of bias within the algorithm, and so one of the jobs we've speculated will be created is what we call an algorithm
3:25 pm
bias auditor, which could be a sort of morphing of the traditional kind of code auditor role, to make sure that there isn't unconscious bias within the algorithms that are going into production environments within big businesses, so that people can reverse-engineer decisions made by software. You do look at job opportunities opening up, but you have said that you do anticipate some job losses in certain areas. Yeah, including some that, actually, people haven't seen coming. There is a class of new software that's emerged in the last couple of years in the industry; it's called robotic process automation, and you can get a team of 500 people down to 50 people. That's the reality of what's going to happen in big business, is that a lot of that kind of white-collar, skilled, semi-skilled, mid-level, mid-skill-level work is going to be, you know, replaced by this kind of software. There's no denying that some people will be kind of left behind
3:26 pm
in that transition. So what other jobs do you think that AI might open up in five or ten years' time? So we came up with this job we call a walker-talker, which is this idea that, you know, in a lot of towns around the world, certainly where I live in Massachusetts, a lot of seniors, they're very isolated. So what if there was a platform where people in the neighborhood could log on: I've got a spare hour on a Tuesday afternoon or Saturday morning, I could go and walk and talk with a senior in my neighborhood. So people living in the kind of gig economy, living a kind of portfolio-style set of jobs, they maybe drive an Uber, they maybe drive a Lyft, they maybe rent out their house, they maybe do things on TaskRabbit. What if they could literally monetize that spare time they have to go and walk and talk with a senior? That doesn't sound like
3:27 pm
a technology-based job, but that would all be organized on an AI-infused platform. In the same way that most of the people who do care work are women and women of color. Guess what, guess who has been taking care of other people's kids since they were enslaved and brought to North America? Black women. This idea that somehow these historically oppressed, suppressed communities are now in some type of better situation because there's an app interface between them and the people who want that work done, and then we call it a fascinating new gig economy opportunity, I think is just completely nonsense. The experience of marginalized people basically foretells what's to come for the entire population: degrees of control, lessening of
3:28 pm
autonomy, a real difficulty in confronting and sometimes resisting these systems. Some say if you want to know what's to come with AI, you need to look to China. The Chinese want to be the primary innovation center for AI. It is seen as both a potential driver of more social instability, but at the same time the Chinese state thinks that AI can be used as a tool to control social unrest. China is home to 1.4 billion people. Its capital, Beijing, has more surveillance cameras than any other city in the world. Facial recognition technology is woven into everyday life: getting you into a bank or your residence, checking you out at a shopping till. With 800 million internet users and weak data protection laws, the Chinese
3:29 pm
state has access to colossal amounts of data, and China's credit scoring system aims to go far beyond finance. There is this ambitious goal to have a national, unified social credit system that would assign a score to citizens, to judge whether their behavior was politically acceptable or socially desirable. The plan is for all Chinese citizens to be brought into the social credit scoring system in 2020. It uses data on everything from financial records and traffic violations to use of birth control, and processes that data through algorithmic software to give people a score for their overall trustworthiness. A high social credit score can mean better access to jobs, loans, travel and even online dating
3:30 pm
opportunities. A low score can mean being denied some of the modern benefits of citizenship. Probably the most troubling aspect of social credit is not necessarily the social credit system itself, but actually the application of some of these facial recognition technologies to expand the surveillance state and to check the behavior of citizens. In the western region of Xinjiang, minority Uighurs have been disproportionately targeted, in terms of their location being tracked 24/7, whether they're going to mosques, which areas they're travelling to, and that has been powered, or is in the process of being empowered, by facial recognition algorithms being connected through security integrators.
3:31 pm
that has faced systemic forced dissimilation. a small fraction of the weaker resistance to the suppression have turned to violence . including attacks on civilians. leading president g jumping to embark on a so-called people's war on terror. aimed at stamping out weaker separatism and imposing a secular ideology. new an ai led technologies particularly facial recognition are the latest weapon in xi jinping crackdown. some reports have indicated that it was a database that tracked 2600000 residents. tracked where they were going and that database had labels of sensitive locations like whether they're going to a mosque or whether they were going to this particular region. so that was updated
3:32 pm
on a 24-hour basis, and that database had, I believe, more than 6 million records, so it showed it was tracking these people in real time. Uighurs are now in re-education camps, so that's a pretty significant departure from normal life, where you're forced to study in a camp and repeat party mantras. It's a stark picture of how artificial intelligence can go wrong: the Chinese government deploying AI to track and suppress its own minority populations. Facial recognition checkpoints in Xinjiang use deep learning technology to identify individual Uighurs, cross-checking them with data collected from smartphones to flag anyone not conforming to Communist Party ideology as unsafe, a threat to state security. Xinjiang has become a test bed for authoritarian AI. This harsh system of control may seem
3:33 pm
a world apart from the West, but systems like social credit actually have some parallels here. In some ways, if you think about the origins of some of the strands of social credit coming from some of the major private businesses in China, how different is it really from a kind of Experian or an Equifax, or one of these credit rating agencies that actually do collect also very granular data on Westerners? And that data is then shared with all kinds of other entities and used to make consequential decisions. In current operation, I would say that they are different. I think the difference will be when it's not just your financial behavior, when it's also your social, your political behavior that gets observed. And oftentimes the social credit system becomes a projection of our own fears about what is happening in our societies in the West, where it's not necessarily what is objectively happening in China that is
3:34 pm
important, but it's about using what's happening in China as a way to project what we're afraid of. So when I think about China's millions of Uighurs being tracked 24/7 by AI and potentially put into re-education camps, I think about the Black community in the United States, I think about predictive policing, and I think, kind of East and West, one of the problems and worries with AI is the way that it gets road-tested on communities of color and the poor. The way these technologies are being developed is not empowering people;
3:35 pm
it's empowering corporations. They are in the hands of the people who hold the data, and that data is being fed into algorithms that we don't really get to see or understand, that are opaque even to the people who wrote the program, and they're being used against us rather than for us. There's this incredible informational imbalance, isn't there, that even as a handful of companies are acquiring more and more and more detailed information about each of our intimate lives, we've got, in some ways, less and less information about them and the way that they operate. It's nonsense when you think about the way in which fraud and corruption are words that get pointed at poor people, who get tracked into these highly surveilled systems. They actually have no other option: you don't get food if you don't participate, you don't get to go to school if you're not in the system, properly tracked. And so I
3:36 pm
think these kinds of things are the questions, again, that also might need to be regulated, you know, beyond kind of the technical regulations. One of the limits there is that so much of the pressure or focus in those movements is about perfecting the technology, instead of thinking more broadly about, like, what are the values that we're trying to implement, and who are they in service of? Now, you worked a little bit with the Obama administration, didn't you, on trying to determine how we make some of these automated systems more accountable to us. Did you find that that was a useful exercise? How did that go? I think there was a genuine interest in thinking about what might be harmful, what might be helpful, what should we think about now in order to forestall or prevent particular outcomes that we can't undo further down the line. And so there was a lot of interest. I think what's happened since then is there has been this
3:37 pm
increasing crescendo from industry saying these technologies are inevitable, whether or not you like it, they're coming. And what that creates for members of the community, for citizens, consumers, is increasingly a sense of despair or resignation: well, we might not be able to do anything about it. And given that, increasingly, it looks like governments are actually punting to corporate governance structures or corporate governance bodies, it can create a sense of despondence. I'd like to pick up a slightly different but, I think, related part of that, to do with the kind of supply chain, and actually the labor, that is involved with some of this artificial intelligence, because I feel like the phrase AI sometimes kind of hides a lot of the human labor that's used to make a given system work. Right, so you've got people in Kenya who've got
3:38 pm
to label images to train software for self-driving cars, or people in Phoenix, right, on very little wages, looking at videos that would come up on YouTube, looking basically all day, every day, at a stabbing or a beheading, so that that stuff can be taken off and you and I don't see it on our social media feeds. Is that one of the hidden problems of the artificial intelligence economy, that there's a lot of human labor that is required to prop it up? Sure. You don't see the data janitors who are paid by the day, right? They're not the ones that are, you know, in our line of sight. As things like Silicon Beach expand, we don't see the sort of disaggregated, geographically dispersed nature of these AI companies, and who all is involved in making and cleaning data, right? And I think that's highly problematic, because it is contributing to this sort of magical aura that surrounds
3:39 pm
AI, that it can do all of these things efficiently and instantly, and yet there's this whole body of people that contribute to that, and the fact that in many cases their labor rights are being disrespected, I think, is also a cause for concern. Yeah, I mean, I think we know now, for example, from researchers, I think of my colleague at UCLA, Sarah Roberts, who's done all this work around commercial content moderators, bringing them out of the shadows, so that we actually understand that there are huge, dispersed, global networks of call-center-like environments where people are doing this kind of moderation that you talk about. You know, one of the reasons why I think we previously didn't know about that is because there's such a deep investment by, you know, the sector in thinking, at least in the US context, of the internet as a free speech zone, for example, and that anything goes. But of course we know that
3:40 pm
anything doesn't go. I find it always interesting when I hear the machine learning experts talk about how crude, in many ways, things like kind of visual mapping are, like, you know, is a table a table, is the cat a cat, right, still trying to figure out these really rudimentary kinds of questions. And yet when we see tech leaders in front of Congress, they say things like, we're going to take down, you know, damaging content, violent content, content, you know, live murders, live suicides, with AI, instead of protecting workers. And I think, you know, that's really interesting, because the AI is not there. The role of AI in medicine is to make better predictions, but also to make the doctors' lives better. If you just look at the camera and smile. But there are some fields where AI is already there and has the potential to do great good.
3:41 pm
It could well transform the way we practice medicine. A lot of what we're trying to do with machine learning or big data in healthcare is to predict these healthy-to-disease transitions, so really tracking your trajectory over time, right, and using that to try to predict. So let's have you sit here, and relax. Here at the Lab100 clinic at Mount Sinai Hospital in New York City, I go through an AI-driven health check that generates a heap of data. So now we're going to review your results, right? Providing a more complete understanding of my physical wellbeing. With access to this kind of information, doctors could save lives, and potentially millions of dollars along the way. One of the most mature areas in medicine is the application of AI to imaging data, and actually deep learning came from image analysis and video analysis, so it was really well tuned for that type of thing. So, an example: looking at radiology images and diagnosing a tumor, or, you know, finding
3:42 pm
a hip fracture. Those tools were already well tuned for that task, I mean, from finding cats in videos. But I think what's clear is that the AI is at least as good, and in many cases, in reality, is equivalent to a human radiologist. Radiologists might be more like airline pilots, in a way. So airline pilots are kind of there for, you know, takeoff and landing, and then the plane flies itself for the most part. But I think what radiologists are going to basically be doing is looking at the radiology image and basically rubber-stamping it for legal purposes, really, until we solve that problem with AI. AI that prevents disease, what could be better? But in a world where people have to pay for healthcare, would it be so great if private companies used your AI health profile to charge you more? The future of AI and health doesn't just depend on the tech; it depends on our values. Is health
3:43 pm
care a human right? Should a person predisposed to heart disease or cancer, because of their low income or ethnic background, have worse care than those better off? The aim should be decent standards for all, not a two-tiered system. There are so many positive potential applications of artificial intelligence that would change the world for the better. One, very obviously, is the pattern recognition that AI is good at. It has proven incredibly good at spotting malignant tumors, an incredibly powerful and inspiring medical advance that I've seen some papers on just in the past year. But the technology is going to shortly underpin all aspects of our daily lives. Very shortly, some form of machine learning, artificial intelligence, will determine whether somebody can get a loan, whether somebody gets a mortgage, whether somebody gets bail, whether somebody gets paroled. And as we've
3:44 pm
seen, it may well determine matters of life or death in a military context. So the stakes could not be higher. The quality of your decision-making absolutely depends on the quality of the material that is coming into it, and we have seen, in other uncertain human contexts such as policing, that fed on that data, machine learning algorithms have a distressing tendency to replicate and accelerate all of our pre-existing human biases. If you have a whole technological culture, infrastructure, a whole language that emphasizes the lack of human responsibility, and instead emphasizes a system where there are these artificial agents, we pretend they have agency, but what's really going on is we've set up a trick to manipulate each other, then there'll be more and more trickery and manipulation. And if we want to reduce things like drone attacks, we have to emphasize human responsibility.
3:45 pm
The drone attacks that killed my client's family showed just how much responsibility we're handing over to technology. But we can take that responsibility back. Our curiosity and drive to innovate has been pushing the bounds of what we can do with artificial intelligence for decades. As AI is used to make more and more decisions about us, from targeting to policing to social welfare, it raises huge questions. Will AI be used to target minorities, or clean up our air? Will it destroy our privacy, or treat disease? Will it make us more unequal, or fight climate change? These are not questions that should be decided in the boardroom of a software company. What happens with AI is everyone's business. The world according to AI is our world, and it's up to all of us to be sure it's a just one. Hello
3:46 pm
there, we're still seeing plenty of showers over Turkey that have been causing a few problems, giving some of us some flooding, and there's still plenty of wet weather to be found on our satellite picture, with more to come as we head through the next few days across many parts of Turkey, stretching a bit further north towards the extreme southwestern parts of Russia. As we head through into Friday, it looks like most of those showers will begin to disintegrate, but there will still be a fair amount of cloud and just the chance of a shower here and there. Further south it's pretty hot at the moment: Kabul is up at 31, and for us in Baghdad we're at 40, and the winds are then firing that hot air further south, so Kuwait is also very, very warm, and here in Doha it's hot and the wind is really blowing out there at the moment. That wind will stick around as we head through Thursday and into Friday,
3:47 pm
and will only begin to ease as we head through Friday night and into Saturday. Further south, there will be a fair amount of cloud over parts of Oman and in, say, Yemen, but this shouldn't give us anything in the way of wet weather. There will also be some cloud just grazing the coast of Mozambique; this is likely to be giving us a few showers, and one or two of them could turn out to be rather heavy. We're also expecting some showers around the coast of Madagascar, and this system is going to pick up as we head through the day on Friday, so expect some sharp showers here. They went to 43 million homes with a weapon that was 6 billion... In essence, we in the United States have privatized the ultimate
3:48 pm
public function: war. On Al Jazeera. Al Jazeera, wherever you are. Washington and Tehran both confirm an Iranian missile shot down a US military drone, but differ as to where it happened. Hello, you're watching Al Jazeera, live from our headquarters here in Doha. Also coming up: Yemen's Houthi rebels say they've attacked
3:49 pm
a major power station in Saudi Arabia with a missile. Also: The execution of Mr Khashoggi was a killing by a state. The UN special investigator says there is credible evidence linking the Saudi crown prince to the murder of Jamal Khashoggi, and calls for a comprehensive inquiry. And China's President Xi Jinping arrives in Pyongyang; it's the first visit by the leader of North Korea's closest ally in 14 years. Welcome to the program. The US says one of its military drones has been shot down in international airspace over the Strait of Hormuz by an Iranian surface-to-air missile. It's believed the drone was being used for high-altitude surveillance, but Iran is disputing where the incident took place, saying it violated Iranian airspace in the south of the country. According to a statement posted by the Revolutionary Guard, the drone was flying over Hormozgan
3:50 pm
province. There's also disagreement over the type of drone that was shot down. Iran's Revolutionary Guard has identified it as an RQ-4 Global Hawk, which you can see pictured here, but the US says it was actually a US Navy MQ-4C Triton. Let's cross over to Dorsa Jabbari in Tehran to sort of clear this all up. What more do we know about this incident, Dorsa? Well, according to the Revolutionary Guard, they issued a statement overnight. They said they shot down this US surveillance military drone over the Kuh Mobarak port city in Hormozgan province. Now, this port city is very strategically important. It is about 65 kilometres northwest of Jask; that is, of course, where those sailors were taken just a week ago when their vessels were attacked in the Gulf of Oman. The Kuh Mobarak
3:51 pm
area is just east of the entrance to the Strait of Hormuz, so all the ships that go through the Strait of Hormuz pass by the Kuh Mobarak port city, so it's very, very strategically important. The Revolutionary Guard have said that they shot this drone down because it was flying over that port city, and that is in Iranian airspace. We've also heard from various MPs that are starting to react to this news as they wake up here in the capital. They are saying that they want a full investigation; some have even suggested that Iranian officials should take this issue to the United Nations and complain officially within that body. Of course, the real worry is the implications of this incident, in line with the rising tensions in the region. Yes, certainly, it's been a very, very tense time here, and this incident is of course not going to help
3:52 pm
matters. The Iranian president has said that the future stability of the region relies on the future of the 2015 nuclear agreement, which the United States withdrew from last year, and, according to the Iranians, the parties that are left within this agreement, the Europeans, are not upholding their end of the deal, and they've been given 60 days to remedy that. We're about three weeks away from that deadline, and all these tensions we're seeing, Iranian officials are saying, are because of the US withdrawing from the nuclear agreement and waging economic war on the Iranian people and the country, and they say that this will not help matters unless the nuclear agreement is abided by and something actually comes from it, unless there is a positive reaction from the Europeans. We'll see what happens in the coming hours. For the moment, Dorsa, thanks so much for joining us from Tehran. The UN secretary general
3:53 pm
does not believe he has the power to set up a criminal investigation into the murder of Saudi journalist Jamal Khashoggi, despite a report from a special rapporteur holding the state of Saudi Arabia responsible and setting out legal grounds for Antonio Guterres to launch an international inquiry. She recommends the UN should look into the individual responsibility of Saudi Arabia's highest officials, including Crown Prince Mohammed bin Salman. The report also gives a minute-by-minute account of the journalist's final moments. Saudi Arabia has rejected the report, saying it was full of baseless allegations and clear contradictions. My colleague spoke to Agnes Callamard about her findings. The state of Saudi Arabia is first and foremost responsible for the murder. I think it's important to emphasize that the execution of Mr Khashoggi was a killing by a state. We have focused extensively on the identity of various individuals that
3:54 pm
were involved in the commission of the crime, but first and foremost we must insist on putting the responsibility for the killing towards the state of Saudi Arabia. It must bear responsibility for that killing, and we must take action in response to a killing by a state. What did you learn in these recordings about the way Jamal Khashoggi's body was disposed of? Because his remains have still not been found. The recordings need to be interpreted; they do not tell a very straightforward story about what was done to his body. I cannot deduce it from the sounds. What I can infer from the sounds, that something was done, is based on the technical knowledge of the various people I have consulted. It is well possible that Mr Khashoggi was first, well, that he was first injected with something,
3:55 pm
and then that he was actually asphyxiated with a plastic bag. This is the possibility. The nature and the extent of the dismemberment of his body I cannot comment upon; it's not possible. Allow me to add one thing, at least as far as the recordings I have heard are concerned. When Mr Khashoggi loses consciousness, there is no evidence of the people that are there in the room attempting to take care of him, attempting to resuscitate him, attempting to do something. That, to me, points to the fact that the notion that there was an accident, that the accident just happened, doesn't quite match what I have heard. There is no attempt to do something, and there is no scream, or
3:56 pm
any expression of fear over what's happening. You'll be presenting this report, I understand, on June 26th to the UN Human Rights Council, which Saudi Arabia is a member of. What do you think is going to happen, and what do you hope will happen, now that you've released this report? You know, my report makes a range of recommendations, including to Saudi Arabia. As a special rapporteur, I am committed to establishing constructive relationships with the governments that I work with. I have attempted to work with Saudi Arabia for the last six months; they have not shown an interest in doing so. My report includes a range of recommendations, including with regard to the ongoing trial, including with regard to further steps Saudi Arabia should take to demonstrate non-repetition, which is
3:57 pm
a fundamental dimension of their responsibilities, given that we have established that the responsibility of the state is involved. Our correspondent Hashem Ahelbarra is live for us in Istanbul. Hashem, since that initial report was released by Agnes Callamard on Wednesday, we've obviously had reaction from Turkey, and certainly from officials, but also, closer to home, those who were very close to Jamal Khashoggi have also been speaking. Indeed. Hatice Cengiz, his fiancee, was one of the first people to react to the findings of the United Nations special rapporteur. She said that she is looking forward to seeing the perpetrators brought to justice. Hatice was traumatized by the whole crisis from the beginning, because she was with him when he entered the consulate to retrieve documents for his upcoming marriage. She had been
3:58 pm
waiting for hours, and when he didn't emerge she knew that something terribly wrong had happened, and she contacted the Turkish authorities. She reacted basically to a tweet made by the Saudi minister of state for foreign affairs, Adel al-Jubeir, when he cast doubt on the whole findings of the United Nations special rapporteur, saying that it's full of inaccuracies and baseless allegations. Let's listen to what Hatice Cengiz had to say. When it comes to the comments from al-Jubeir and other Saudi officials, that the UN report did not come up with anything new, I would like to ask them: why didn't you put the perpetrators on public trial, and why didn't you hold them accountable? Why can't the Saudis be more open and provide us with details of the investigation and the so-called trial that is happening behind closed doors? At the official level,
3:59 pm
we've heard some strong reactions from the Turkish president, Recep Tayyip Erdogan, who said he was pleased with the findings of the inquiry, that he was looking forward to seeing the perpetrators brought to justice, and that he would make sure that they will definitely have to face prosecution. There were similar statements made by the foreign minister. This crisis has two angles, legal and political. From a legal perspective, the Turkish government has been saying that the Saudi government has been stalling the investigation and undermining the chances for the Turkish police to try to find out what exactly happened inside the consulate. From a political level, there's also an angle to the story. It comes against a backdrop of strained relations between Turkey and Saudi Arabia over different regional issues: Iran, Yemen, Syria, and also the blockade imposed on Qatar in 2017. It could be that Turkey is trying to give Saudi Arabia a chance to solve this crisis diplomatically. If that doesn't happen, I think
4:00 pm
the Turkish government will push forward towards an international criminal investigation, to start any time soon. Indeed, and we'll have to watch very carefully what comes out of Ankara. For the moment, Hashem, there with you in Istanbul, thanks very much. Staying in the region: Yemen's Houthi rebels say that they've attacked the al-Shuqaiq power station in the Saudi province of Jizan with a cruise missile, but there's been no confirmation from Riyadh. US President Donald Trump has been briefed on the details of the strike, and the White House says it's closely monitoring the situation. It's the latest in a series of strikes on Saudi cities in the past two weeks. Our correspondent Mohammed joins us now live from the Yemeni capital, Sanaa. The Houthis have been focused on attacking strategic locations in southern Saudi Arabia for over a week now. What more can you tell us about this latest attack? Well, this seems...
