
tv   [untitled]    August 23, 2021 9:30am-10:01am AST

9:30 am
American mercenaries and a bizarre two-day attempt at regime change in the Bolivarian Republic of Venezuela. People and Power: The Bay of Pigs, on Al Jazeera. Here are the top stories on Al Jazeera. US President Joe Biden has come under enormous criticism for what some say is a botched departure from Afghanistan, and the Taliban's takeover has led to chaos at Kabul airport. Thousands of Afghans are desperate to leave the country. Biden says more than 11,000 people have so far been airlifted from Kabul
9:31 am
just this weekend, though it's not clear if this number includes non-US evacuations. President Biden has also said the Taliban has so far kept its word on allowing access to the airport, but at least 20 people have died in the chaos of trying to leave the capital in recent days. Nearly 6,000 US and foreign troops are currently protecting the airport, but Biden has warned of a growing concern that the airfield could become a target of attacks. US Vice President Kamala Harris is in Singapore to reaffirm US alliances in the Pacific region. She and Prime Minister Lee Hsien Loong have held a news conference; they said they discussed the South China Sea, Myanmar, and support for the Afghanistan evacuation. Tropical Storm Henri is battering a stretch of the northeastern coast of the US, just hours after being downgraded from a hurricane, leaving more than 100,000 people in New
9:32 am
England without power. New Zealand has extended its nationwide lockdown until Friday. Prime Minister Jacinda Ardern says more certainty is needed as the country struggles to contain the spread of the Delta variant. She says it's unlikely New Zealand has reached the peak of the outbreak; authorities have recorded 107 infections since the first Delta case was identified on Tuesday. "Delta has changed the rules of the game, but we've changed our approach too, with hard and early alert levels. We've widened our contact tracing, required more mask wearing, and we are testing more people. It is absolutely possible to get on top of this. We just need to keep it up and ensure that we are not under restrictions any longer than we need to be." Indonesia's single-day count of new COVID-19 infections has fallen to its lowest level in more than two months; the country had just over 12,000 new cases on Sunday. Those are the headlines;
9:33 am
coming up next, All Hail the Algorithm. Did you know you can watch Al Jazeera English streaming live on YouTube, plus thousands of our programmes and award-winning documentaries? Subscribe at youtube.com forward slash Al Jazeera English. Trust is fundamental to all our relationships, not just with our family and friends. We trust banks with our money. We trust doctors with very personal information. But what happens to trust in a world driven by algorithms? As more and more decisions are made for us by these complex pieces of code, the question that comes up is inevitable: can we trust algorithms?
9:34 am
From Google searches to GPS navigation, algorithms are everywhere. We don't really think too much about them, but increasingly, governments, corporations and various institutions are using them to make decisions: about who gets public services and who gets denied them, how people are monitored and policed, how insurance is charged. I want to start here in Australia, where an algorithm used by the government has resulted in more than 400,000 people being in debt to the country's welfare system. It's being called the robodebt scandal. Back in 2016, a decision was made to fully automate a key part of the welfare system: the part where the earnings of low-income people are compared with the amount of government money they received. The government says it does this to ensure the right amount of financial assistance is paid. The data-matching algorithm, officially called the Online
9:35 am
Compliance Intervention, had been in place since 2011. Any discrepancies flagged by the system were previously investigated by government employees. But with full automation, those human checks were removed. The government had instituted an algorithm that essentially said: let's match two lots of data together, smash them together, and see if people have a debt. And some of the math was just bad, just plain wrong, like a spreadsheet formula mashing two cells together and hoping the cells do the job. Asher Wolf is a journalist who has been reporting on the robodebt story since it broke. She's also an activist, one of the chief organisers of the #NotMyDebt grassroots campaign. Often people didn't realise that this was automated in the first place, and it wasn't until we started getting people talking together, meeting on Twitter, that we realised what had actually happened.
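The data-matching arithmetic described here is easy to sketch. What follows is a hypothetical illustration, not the department's actual code: the reported flaw was averaging an annual income figure evenly across 26 fortnights, so a casual worker with lumpy earnings could be flagged with a "debt" despite reporting honestly. The payment rate and taper below are invented for illustration.

```python
FORTNIGHTS_PER_YEAR = 26

def entitlement(income, payment=550.0, taper=0.5):
    """Benefit payable for one fortnight: a base payment reduced by
    50 cents for every dollar of income (invented illustrative rates)."""
    return max(0.0, payment - taper * income)

def flagged_debt(reported_fortnights, annual_income):
    """Compare what was actually paid (from real fortnightly reports)
    against what the averaged annual income would imply was owed."""
    paid = sum(entitlement(i) for i in reported_fortnights)
    averaged = annual_income / FORTNIGHTS_PER_YEAR  # the flawed shortcut
    assumed = entitlement(averaged) * len(reported_fortnights)
    return max(0.0, paid - assumed)

# A casual worker: $13,000 earned, but all of it inside 10 busy fortnights.
lumpy = [1300.0] * 10 + [0.0] * 16
print(flagged_debt(lumpy, sum(lumpy)))      # 1000.0 -- a spurious "debt"
print(flagged_debt([500.0] * 26, 13000.0))  # 0.0 -- steady earner, no flag
```

A steady earner making the same $13,000 evenly produces no discrepancy, which is exactly why averaging fell hardest on casual and seasonal workers.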
9:36 am
It was almost like 100,000 people had been gaslighted into thinking they'd done the wrong thing, and their outrage when they realised that there was a fault in the actual algorithm itself. The Australian government disagrees. "We are doing more compliance checks because we want to be more thorough about how we are recovering overpayments, and we are retrieving money for the taxpayer." "More checks" is a bit of an understatement. The old system resulted in around 20,000 discrepancy notices a year, but in the early days of the new automated system, that jumped to 20,000 a week. More than a million letters have been sent out by the algorithm, sometimes disputing government payments from as far back as seven years. And what was even worse was the system was imposed on people with intellectual disabilities, with homelessness, with chronic health issues, people who were barely literate or not literate at all, people who didn't know how to use a computer,
9:37 am
people who were living in remote communities without access to the internet, people who just had no bloody clue how to deal with this sort of administrative, bureaucratic bungle. David Digna was notified he had incorrectly declared income from a teaching job while he was on a disability pension back in 2011. His robodebt: $4,088. In essence, what robodebt is is an accusation towards you that you've done the wrong thing. I said, OK, I want the detail of how you arrived at this debt, and I was told I couldn't have that. And the reason I was told was that the computer takes my personal information and then sources a piece of information here, another piece there, and it can't be provided to me because it comes in too many parts and can't be brought back together. In other words, the algorithm is inscrutable. It's totally unknowable. Even the staff don't really
9:38 am
understand it. "Can you tell me how much evidence or how much notification they provided you, proving that there was a debt?" "They didn't provide me with anything, other than that the only other thing that I had was, finally, a text message came through to say: hi, the money you owe is due today. The fact that you couldn't get any concrete evidence, 'this is how we have calculated your debt, here is what you earned, here are the hours you worked', that really shook any confidence that I had that the government would do the right thing. The fact that they couldn't prove to me that I owed the money really concerned me." You'll never find that you receive a letter in the mail, generated by an AI, that essentially says: hi, the government wants to let you know that we underpaid you by $5,000, that you should
9:39 am
have been eligible for these services, but we didn't tell you, therefore we're telling you now and we'll back-pay you. Nobody gets back-paid. In fact, you're only eligible for back pay, I think it's six weeks, with services that the government can claw back from you for many, many years. Automation, computerisation, algorithmisation, if that's even a word: they're always sold to us as such a positive thing. All upside, no downside. As Australia's Department of Human Services put it, computerised decision-making can reduce red tape, ensure decisions are consistent and create greater efficiencies for recipients and the department. The problem is, how do you challenge a system that has no face, no name, and nobody who signs the bottom of your letter to say, you know, I'm in charge of this? "Good afternoon, welcome to the Department of Human Services..." It's likely, on a good day, you end up sitting on hold for a couple of hours trying to speak to a human. The real question is,
9:40 am
how has it come about that the government has overpaid people by billions? Because really, the criminality is occurring at the government's end of the line. It's the government that's doing this. Otherwise you're saying 800,000 citizens have made mistakes, or that's the case that the system is too difficult for people to negotiate. So I'm not here shaking my fist at technology. It's not digital's fault, it's not the computer's fault. This system has been designed inadequately by government. Government's responsible for its failures, and government's really responsible for the hell they're putting all sorts of welfare recipients through, unfairly, by issuing them false debts. This is something I heard from virtually everyone I spoke to in this regard. They said: we're not against technology, it's not like algorithms are all bad. It's the people and the institutions designing these codes
9:41 am
that we can't seem to trust. And this really gets to the heart of our relationship with algorithms. They're often complex, hidden behind walls of secrecy, with no way for those whose lives are actually impacted by them to scrutinise them, because they've been kept off limits. Despite all the criticism, and even a formal inquiry, the Australian government stands by its algorithm and automation in the welfare system. "We do expect past compliance requirements to be applied. Over the last six months, in line with this approach, we've recovered $300 million for the taxpayer through that process. So the system is working and we will continue with that system." There are at least 20 different laws in Australia that explicitly enable algorithms to make decisions previously made by ministers or staff. We don't really know the full extent of how these are being applied, but there are places around the world where the use of algorithms is even more widespread, like here in the United States, where algorithms are being used to make
9:42 am
big decisions across everything from the criminal justice system to health, education and employment. The United States has a longer history of algorithm use than many other countries. Silicon Valley is a big reason for that, of course, but also there's much less regulation here on when and how private companies and governments can collect and use data. But for those studying the effects of algorithms on American society, one thing is clear: often it's the poor and marginalised who get the worst of it. I'm on my way now to Troy in New York State to meet Virginia Eubanks. She's written on everything to do with automating inequality; it's actually the title of one of her books. Virginia says America's poor and working class have long been subject to invasive surveillance and punitive policies. She writes about the prison-like poorhouses of the 19th century, where bad conditions were meant to discourage the "undeserving poor" from
9:43 am
supposedly taking advantage of the system. What I see as being part of the digital poorhouse are things like automated decision-making tools, statistical models that make risk predictions about how people are going to behave in the future, or algorithms that match people to resources. And the reason I think of them as a digital poorhouse is because the decision that we made in 1820 to build actual poorhouses was a decision that public service systems should, first and foremost, be moral thermometers: that they should act to decide who is most deserving of receiving their basic human rights. Eubanks' studies into the automation of public services in the United States point to developments in the late sixties and seventies. Along with the civil rights movement came a push for welfare rights. People were forced to live in the most inhuman
9:44 am
situations because of their poverty. African Americans and unmarried women, who were previously barred from receiving public funds, could now demand state support when they needed it. While technology was touted as a way to distribute financial aid more efficiently, it almost immediately began to serve as a tool to limit the number of people getting support. So you have this moment in history where there's a recession and backlash against social spending, and a social movement that's winning successes, that ends discriminatory treatment. And there really is no way to close the rolls. They can't close the rolls the way they had in the past, which is just to discriminate against people. And that's the moment we see these tools start to be integrated into public assistance. I think it's really important to understand that history. I think too often we think of these systems as just simple administrative upgrades, sort of natural and inevitable. But in
9:45 am
fact, they're systems that make really important, consequential political decisions for us. And they were, from the beginning, supposed to solve political problems, among them the power and the solidarity of poor and working people. In the early 1970s, close to 50 percent of those living below the poverty line in the United States received some form of cash welfare from the government. Today it's less than 10 percent. In public assistance, the assumption of many folks who have not had direct experience with these systems is that they're set up to help you succeed. They are not, in fact, set up to help you succeed. They're very complicated systems that are very diversionary, that are needlessly complex, and that are incredibly stigmatising and emotionally very difficult. So it shouldn't then surprise us that a tool that makes that system faster and more efficient and more cost-effective
9:46 am
furthers that purpose of diverting people from the resources that they need. Having algorithms make decisions such as who gets financial aid, or who owes money back to the government, has caused concern among many different groups. But what's taking things to a whole new level is the fact that algorithms are being used to actually make predictions about people. One of the most controversial examples is the Correctional Offender Management Profiling for Alternative Sanctions. It's a bit of a mouthful, but its short form is COMPAS, and it's an algorithm that's been used in courtrooms across the country to assist judges during sentencing. Now of course, algorithms can't weigh up arguments, analyse evidence, or assess remorse. But what they are being used for is to produce something known as a risk assessment score, to predict the likelihood of a defendant committing another crime in the future. The score is then used by judges to help them determine who should be released and who should be detained
9:47 am
pending trial. Now the judge has to consider a couple of factors here. There's public safety and flight risk on the one hand, but on the other, the real costs, social and financial, of detention on the defendant and on their family. Now historically, what happens is the judge looks into this defendant's eyes and tries to say: hey, you're a high-risk person, or you're a low-risk person; I trust you, or I don't trust you. Now what algorithms are helping us do is make those decisions better. The COMPAS algorithm was brought in to offset and balance out inconsistencies in human judgment, the assumption being, of course, that a piece of code would always be less biased and less susceptible to prejudice. However, COMPAS has faced several criticisms, primarily accusations of racial bias, inaccuracy and lack of transparency. In 2016, a man named Eric Loomis, sentenced to six years in prison, took his case to the Wisconsin State Supreme Court.
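COMPAS itself is proprietary, so its mechanics can only be sketched generically. The following is a minimal, invented illustration of how a recidivism risk score of this kind typically works: a statistical model trained on past outcomes maps a defendant's features to a probability, which is then bucketed into a decile score for the court. The features, weights and cut-offs below are assumptions for illustration only, not the real tool's inputs.

```python
import math

# Invented weights for illustration; the real tool's coefficients
# and inputs are not public.
WEIGHTS = {"age": -0.04, "prior_arrests": 0.25, "failed_to_appear": 0.6}
INTERCEPT = -0.5

def risk_probability(features):
    """Logistic model: map a defendant's features to a probability."""
    z = INTERCEPT + sum(WEIGHTS[k] * v for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))

def decile_score(features):
    """Bucket the probability into the 1-10 scale such tools report."""
    return min(10, max(1, 1 + int(risk_probability(features) * 10)))

defendant = {"age": 23, "prior_arrests": 3, "failed_to_appear": 1}
print(decile_score(defendant))  # 5 -- a mid-range risk score
```

The same mechanics explain the due-process complaint described next: a defendant can see the final score, but not the weights or the training data that produced it.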
9:48 am
use of COMPAS in sentencing violated his right to due process, he argued: it made it impossible for him to appeal his sentence, since the algorithm is a black box, impenetrable, unquestionable. Eric Loomis didn't get very far. The Supreme Court ruled the use of COMPAS in sentencing was legal. The verdict, however, revealed the ways in which the ever-increasing use of algorithms is being normalised. The court had a funny argument, saying that nobody knows where the decisions are coming from, and so it's OK: it's not as if the state has some particular advantage over the defendant, but that everyone is at this sort of equal playing field, and it's not that there's an informational advantage for one side or the other. Now, to me, I find that somewhat dissatisfying. I do think that in these high-stakes decisions, particularly in the criminal justice system, we don't just want to have an equal playing field of "no one knows". I think we
9:49 am
need to have the equal playing field of "everybody knows". We need to have this transparency built into the system. For the record, Equivant, the company that sells COMPAS software, has defended its algorithm. It points to research it commissioned showing that the company meets industry standards for fairness and accuracy. Whether COMPAS, or most of the privately developed algorithms, meets acceptable standards for transparency is another question. Even when they are used in the provision of public services, algorithms are often closed to the public; they cannot be scrutinised. Regardless of that, Sharad says that in certain cases he would still be comfortable being judged by a robust algorithm. So I do think it's true that many of the people in the criminal justice system are the most disadvantaged, and the reality is they probably don't have a lot of say in their futures, in their fates, and how these algorithms are going to evaluate them. Whether this would happen if more powerful people were being judged by these algorithms,
9:50 am
I don't know. Now, me personally, I would rather be judged by a well-designed algorithm than a human, in part because I believe the statistical methods for assessing risk, in fact, are better than humans in many situations. And it can, at least when it's well designed, eliminate a lot of these biases that human decision-makers often exhibit. The United States has a massive racial discrimination problem in public services. That's real. So it is really understandable when agencies want to create tools that can help them keep an eye on front-line decision-making, in order to maybe identify discriminatory decision-making and correct it. The problem is that that's not actually the point at which discrimination is entering the system. And this is one of my huge concerns about these kinds of systems: they tend to only understand
9:51 am
discrimination as something that is the result of an individual who is making an irrational decision. The systems are not as good at identifying bias that is systemic and structural. The promise of algorithms is that we can mitigate the biases that human decision-makers always have. You know, we're always responding to the way somebody looks, the way somebody acts, and even if we try as hard as we can, even if we really have good intentions and try to just focus on what matters, I think it's exceptionally difficult. Now that, again, is the promise of algorithms. The reality is much more complicated. The reality is that algorithms are trained on past human decisions, and they're built by fallible humans themselves. And so there's still this possibility that biases creep into the development and application of these algorithms. But certainly, the promise is that we can at least make the situation better than it currently is.
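One concrete way auditors probe for the systemic bias just described is to compare error rates across groups: a tool can look similarly "accurate" overall yet wrongly flag one group as high risk far more often, which is the form the best-known critiques of risk scores took. A toy sketch with a tiny invented dataset:

```python
def false_positive_rate(records, group):
    """Share of people in `group` who did NOT reoffend but were
    flagged high-risk anyway."""
    negatives = [r for r in records if r["group"] == group and not r["reoffended"]]
    if not negatives:
        return 0.0
    return sum(r["flagged"] for r in negatives) / len(negatives)

# Invented records: equal-sized groups, same overall label mix,
# but the errors are not evenly distributed.
records = [
    {"group": "A", "reoffended": False, "flagged": True},
    {"group": "A", "reoffended": False, "flagged": True},
    {"group": "A", "reoffended": False, "flagged": False},
    {"group": "A", "reoffended": True,  "flagged": True},
    {"group": "B", "reoffended": False, "flagged": False},
    {"group": "B", "reoffended": False, "flagged": False},
    {"group": "B", "reoffended": False, "flagged": True},
    {"group": "B", "reoffended": True,  "flagged": True},
]
for g in ("A", "B"):
    print(g, round(false_positive_rate(records, g), 2))  # A 0.67, B 0.33
```

Here group A is wrongly flagged twice as often as group B, a disparity that no single overall accuracy number would reveal.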
9:52 am
One of the things I'm really concerned about with these systems is that they seem to be part of a philosophy that increasingly sees human decision-making as black-box and unknowable, and computer decision-making as transparent and accountable. And that, to me, is really frightening, because of course computer decision-making is not as objective and not as unbiased as it seems at first glance. We build bias into our technologies just like we build it into our children, right? We teach our technologies to discriminate. But on the other hand, people's decision-making is actually not that opaque. We can ask people about why they're making the decisions they're making; that can be part of their professional development. And I think this idea that human decision-making is somehow unknowable is a sort of ethical abandonment of the possibility to grow and to change that we really,
9:53 am
really need as a society to truly address the systemic roots of racism and classism and sexism in our society. So it feels to me like we're saying: we'll never understand why people make discriminatory decisions, so let's just let the computer make them. And I think that's a mistake. I think that's a tragic mistake that will lead to a lot of suffering for a lot of people. So, going back to the question that started us on this journey: can we trust algorithms? Well, the biggest thing I've learned from speaking with Asher, Virginia, Sharad and many others is that I'd actually got the question wrong. It isn't really so much about "is that algorithm trustworthy?" It's more about the quality of the data that feeds them and the
9:54 am
objectives of those designing and controlling them. Human bias and human decisions: that's what we see reflected in our algorithms. And without oversight, we risk baking our prejudices and social inequalities in. Such algorithms are programmed to see the future as shaped by the past, and that past often includes stigma and bias and stereotypes and rejection and discrimination. And really, what we need is to allow for a future that's different from the old. Of course we can build better tools, algorithmic tools, and I see them everywhere that I go. But what makes the difference between a good tool and a just tool is building those tools with a broader set of values from the very beginning. So not just efficiency, not just cost savings, but dignity and self-determination and justice and fairness and accountability and
9:55 am
fair process, and all of those things that we really care about as a democracy, have to be built in at the beginning, from step one, in every single tool. We're actually getting our hands on the data, we're analysing the data. Now, one thing that we've done is we try to make as much of that data available as possible, to encourage people to look at it. One of our projects is called the Stanford Open Policing Project. We release lots of data on the criminal justice system, we release code for people to play with the data, and I encourage everyone to look at that and try to understand what's going on. And maybe they'll discover a pattern that we missed. So my biggest piece of advice is to never underestimate your own voice. You know, you might be fighting some machine,
9:56 am
some computer system that you've never been able to meet or see, that has inflicted huge, harmful suffering. But your words can make governments scared. Your voices combined can make senates and courts sit up and pay attention. Together, we can shape the way these tools are created and the ways that they impact us as a political community. If we want better outcomes from these systems, we have to claim our space as decision-makers at these tables. And we can't do that if we think that these technologies are somehow gods. They're built just the way we build our kids: we build these technologies, and we have a right to be in dialogue with them.
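The Stanford Open Policing Project mentioned above publishes raw traffic-stop records precisely so that outsiders can run checks like the one sketched below. This uses a tiny invented sample; the column names echo the spirit of the project's CSV releases but are assumptions here, not its exact schema.

```python
import csv
import io

# Tiny invented sample standing in for a downloaded CSV of stop records.
SAMPLE = """driver_race,search_conducted
white,False
white,False
white,True
black,True
black,False
black,True
"""

def search_rates(csv_text):
    """Fraction of stops that led to a search, per driver group."""
    searches, stops = {}, {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        g = row["driver_race"]
        stops[g] = stops.get(g, 0) + 1
        searches[g] = searches.get(g, 0) + (row["search_conducted"] == "True")
    return {g: searches[g] / stops[g] for g in stops}

print(search_rates(SAMPLE))
```

On real data, a persistent gap in search rates between groups is exactly the kind of pattern a citizen analyst might surface and ask officials to explain.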
9:57 am
Think of some of the biggest companies in the world today: all of them are big tech, with algorithms at their core. The more of them we use, the more data we leave. We're in the midst of a great data grab, and big tech companies' empires are rising on a wealth of information, with we, the users, as the commodity. In the second of a five-part series, Ali Rae examines whether corporations are colonising the internet: the money and power of big tech, on Al Jazeera. Hello there. It remains dry, hot and sunny for much of the Middle East and Levant this week. We are seeing temperatures up in the north sitting where we expect them to be, but down in the south, for some of the Gulf states, we are going to see them dip down slightly. For example, in Qatar, Doha is going to see the temperature dip down slightly, and the humidity will kick
9:58 am
in on Tuesday. But farther south, we will have some breezy winds keeping things cool across the coasts of Yemen and Oman. And if we do find some showers, it will be in those western mountains of Yemen, with the possibility of a storm or two. And those showers join up with the larger thunderstorms rolling across that central band of Africa. We have seen flooding in eastern parts of the Democratic Republic of Congo, edging into Uganda, and we could see more of that as those rains continue to fall rather heavily across western parts of the Democratic Republic of Congo. The storms continue around the Gulf of Guinea. As we move further south, though, it is finer and drier. We've got a bit of a brisk wind blowing across parts of Botswana; that's bringing the temperature down slightly, but there's plenty of sunshine around for South Africa, with a few showers across Lesotho. But Cape Town will see the wet weather come Tuesday.
9:59 am
The Taliban has reclaimed Afghanistan, but the US withdrawal began years earlier. In 2013 and '14, Witness followed a unit of the Afghan National Army as they commenced the onerous task of confronting the Taliban without American support: a portrayal of young men fighting for their country while knowing each day could be their last. Afghanistan's own battle, on Al Jazeera. "In countries like mine, people have been killed." The United States has privatised the ultimate public function: war. "This was a deal with Saudi Arabia; things were done differently. Saudis and other Arabs, when they came to Britain, were able to help the bomb deals along." "Your man was meeting Saddam. Isn't that interesting?"
10:00 am
Shadow, on Al Jazeera. The Taliban has taken control of Afghanistan, 20 years after it was deposed from power. The country now faces a new reality. How will that impact people, and how will the world react? With the latest news and analysis, on Al Jazeera. Chaos at Kabul airport: we'll have live updates as thousands are evacuated from Afghanistan, and we'll speak to some Afghan women who have managed to escape ahead of the Taliban takeover.
10:01 am
Welcome, I'm Peter Dobbie, you're watching Al Jazeera live. Also coming up: a storm lashes the northeast region of the United States.
