tv [untitled] September 13, 2021 9:30am-10:01am AST
9:30 am
an extraordinary film archive, spanning four decades, reveals the forgotten truth of the country's modern history. The Forbidden Reel, part one: the birth of Afghan cinema, on Al Jazeera. Hello, here are the top stories on Al Jazeera. The chief of the UN's nuclear watchdog says his talks in Iran have averted a showdown between the Islamic Republic and the West. Iran has agreed to allow inspectors to install memory cards in surveillance cameras at its sensitive nuclear sites. The coming together of the jigsaw puzzle will come when there is an agreement at the JCPOA level, but at that time we will have all this information ready and there will not have been a gap. So I think with,
9:31 am
with this agreement we have today, we are going to be able to do exactly that. A plane from Pakistan has just arrived in Kabul. It's one of the first flights into Afghanistan since the Taliban takeover. The Pakistan International Airlines plane landed at Hamid Karzai International Airport just a moment ago. The plane is loaded with vaccines and is due to return to Pakistan with passengers. Qatar's foreign minister has met with the Taliban leadership to address humanitarian and security problems in Afghanistan. Mohammed bin Abdulrahman Al Thani is the most senior official from any country to visit since the group's takeover. North Korea says it has successfully tested a new type of long-range cruise missile over the weekend. The US military said the tests conducted by Pyongyang pose a threat to its neighbours. In Brazil, political movements on both the right and the left have joined forces against President Jair Bolsonaro,
9:32 am
pushing for his impeachment. Brazilians are dissatisfied with high inflation and poverty rates, as well as his handling of the pandemic. Argentina's main opposition party is leading in the capital and the key province of Buenos Aires in midterm primary elections. The vote is seen as a key test for President Alberto Fernández and the popularity of his ruling centre-left government. Two women have kicked off their campaigns to be France's first female president. Veteran far-right politician Marine Le Pen addressed supporters from her National Rally party in the French city of Fréjus, and Paris mayor Anne Hidalgo is the favourite to win the Socialist Party's nomination. The election is next year. Those are the headlines; the news continues here on Al Jazeera after All Hail the Algorithm. I can unlock my phone with my face. You can access your bank account with your voice. And fingerprints
9:33 am
are often the key information on a national ID card. All of this, face, voice, fingerprint, are biometrics: unique algorithmic measurements of us that are revolutionising the process of identification. But biometrics are far from perfect. Their convenience and seeming infallibility come at a cost, most crucially, our privacy. Our biometrics are individually unique, so much so that they've always served as a gold standard for identification, with really high levels of accuracy and strong security. Fingerprints and DNA databases have been the mainstay for police and investigators for decades, and across many parts of the world, people who are
9:34 am
illiterate use thumbprints in place of a written signature. Stephanie Hare has been researching the growing use of biometrics. There's also your face now, which can be mapped; those are just your facial points, and that's called facial recognition technology. Your voice is biometric data. There's also something called gait analysis, which is how you walk. So those are ways that they could identify you. And another way is behavioural biometrics: that might be your online behaviour, how you use your mouse, where you click on things as you go through the internet, but even how regularly you're posting on Facebook. There's a lot that you can get just from people's ordinary life, and that's why it's so important to have this debate in society, where we are all giving our consent about whether or not we want such technology being used, and if so, under what circumstances and with what regulatory checks.
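To make the idea of behavioural biometrics a little more concrete, here is a minimal Python sketch of how a handful of interaction signals, click rhythm, typing rhythm and posting frequency, could be turned into a numeric profile and compared between sessions. The feature choices, the mismatch measure and the 0.25 cut-off are illustrative assumptions, not any vendor's actual method.

```python
from statistics import mean, pstdev

def behavioural_profile(click_gaps, key_gaps, posts_per_day):
    """Summarise raw interaction timings (in seconds) into a small feature vector.
    Illustrative only: real systems use far richer signals than these."""
    return [
        mean(click_gaps),   # average pause between mouse clicks
        mean(key_gaps),     # average pause between keystrokes
        pstdev(key_gaps),   # how irregular the typing rhythm is
        posts_per_day,      # posting habit on a social platform
    ]

def distance(profile_a, profile_b):
    """Crude mismatch score: 0.0 means identical profiles."""
    return mean(abs(a - b) / max(abs(a), abs(b), 1e-9)
                for a, b in zip(profile_a, profile_b))

enrolled = behavioural_profile([0.8, 1.1, 0.9], [0.12, 0.15, 0.11], posts_per_day=4)
session  = behavioural_profile([0.9, 1.0, 1.0], [0.13, 0.14, 0.12], posts_per_day=4)

# An operator would choose a cut-off; 0.25 here is an arbitrary illustration.
print("mismatch:", round(distance(enrolled, session), 3))
print("treated as the same user?", distance(enrolled, session) < 0.25)
```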
9:35 am
The world is on a mission: a mission to give everybody a legal identity by 2030. That was a target set by the United Nations as part of its Sustainable Development Goals campaign. A key segment of the population the UN is focusing on is the more than one billion people who currently have no way to prove their identity. The unverified include millions of refugees, trafficked children, homeless and other people who never get a chance to establish documents and create the digital footprint that's so essential for modern life. Here at Zaatari camp, the United Nations World Food Programme is using biometric technology, iris scans, to provide aid to the camp's 75,000 Syrian residents. Refugees can shop for their groceries with the blink of an eye; no need for bank cards or registration papers. The system is quite aptly named EyePay. When a shopper has their iris scanned, the World Food Programme system verifies the person's identity against the biometric database held by the UN High
9:36 am
Commissioner for Refugees, the UNHCR. It then checks the account balance, confirms the purchase and prints an EyePay receipt. All of this happens in seconds, and according to the World Food Programme, this not only makes transactions quicker but more secure. We use biometric authentication for two main reasons: one, to guarantee 100 percent accountability on the identity of the person purchasing and using the assistance that we provide, and secondly, to facilitate the redemption process for the beneficiary, by not using a card, by not using a PIN. In camps, which is an environment where beneficiaries tend to go to the market several times a month, for them, going with their own iris is easier than going with a card or a PIN.
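The checkout sequence described above, scan the iris, verify it against the UNHCR-held database, check the balance, confirm the purchase and print a receipt, can be sketched in Python. Everything here, the record layout, the matching function and the sample data, is a hypothetical stand-in; the real WFP and UNHCR systems are not public, so this only illustrates the flow, not the implementation.

```python
# Hypothetical sketch of the checkout flow described above. All names, data
# and records are invented; they do not describe the real WFP/UNHCR system.

UNHCR_DB = {"case-001": {"iris_template": "a1b2c3", "balance": 45.00}}

def match_iris(scan, template):
    # Real iris matching compares encoded templates probabilistically;
    # here we fake it with a simple equality check.
    return scan == template

def checkout(iris_scan, basket_total):
    for case_id, record in UNHCR_DB.items():           # 1. verify identity
        if match_iris(iris_scan, record["iris_template"]):
            if record["balance"] >= basket_total:       # 2. check balance
                record["balance"] -= basket_total       # 3. confirm purchase
                return (f"Receipt: case {case_id} paid {basket_total:.2f}, "
                        f"remaining {record['balance']:.2f}")  # 4. print receipt
            return "Declined: insufficient balance"
    return "Declined: identity not verified"

print(checkout("a1b2c3", 12.50))
```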
9:37 am
I come in and I can get it done by myself; I can do it for them. So you have the first part of the day, which is everything. This is fascinating, and this is super high tech that's been rolled out in what you could call a low-rights environment. Sure, people here are under the protection of the United Nations and have more rights than they would have in the war zones of the countries they fled, such as Syria. However, they also have little choice when it comes to giving up their biometrics, or opting out of biometric programmes. Taking somebody's biometric data from them, that is about the most personal data that you could take. These are not people who
9:38 am
necessarily are in a position to ask for legal representation, to have this explained to them. Second, if they don't want it, what is the alternative that they can exercise instead? Or are they using behavioural psychology, something called nudge theory, to make it where it's just easier to hand over your data, and then you get your food and your clothes and your money faster? Because that would be unethical. We're testing out, again, extremely experimental, really invasive technology on people who actually have some of the least rights and protections of anyone. Would a middle-class person living in France or Germany or the United States, the United Kingdom or Sweden consent to use their iris to pay for things, to transact? Probably not. It's easy to see the immense potential of the EyePay system to track aid disbursement, smooth out payments and reduce the risk of corruption or fraud. The World Food Programme says the benefits go even further. They are able to monitor shopping
9:39 am
habits and nutritional intake, and there's a possibility in the future that the credit histories of the refugees could help them open bank accounts or get loans. They also think they've got the security bit covered. We regulate the management of the beneficiary data through a data-sharing agreement with UNHCR, and so through that agreement we are able to access the data, sensitive data, which again does not include the name, just the case ID, phone number and location. We are confident that that data, being encrypted, is well protected. That is the reason why we are regularly doing data privacy and impact assessments on the project, to guarantee that if there are any vulnerabilities, we are able to pick up on them and address them properly before they become a risk. UNHCR remains fully committed to the biometric registration programme, so much
9:40 am
so that they are rapidly expanding it, with the aim of being active in 75 countries by 2020. There remain lots of problematic questions that have yet to be fully answered, such as: is the tech foolproof? Who has access? And how can anyone plan for the unforeseen issues to come? These are the kinds of questions that have made other aid organisations pause before jumping on board with biometric technology. In 2015, Oxfam voluntarily imposed a moratorium on the use of biometrics in its work. It stated: given the number of unknowns around most effective operation and governance models, and risks of this incredibly sensitive data falling into the wrong hands, we felt it was best not to become an early adopter. The one field in which biometrics has long been established is security and surveillance, and facial recognition is one of the most popular technologies right now. In China,
9:41 am
there's been an exponential increase in the use of facial tracking and artificial intelligence to monitor citizens. The United States also currently operates one of the largest facial recognition systems in the world, with a database of 117 million Americans, with photos typically drawn from driver's licences. And in the UK, police forces have been trialling live facial recognition since 2016 in public spaces such as shopping centres, football matches, protests, music events and crowded city spots. This green van behind me here in central London is part of the facial recognition technology trial being run by the Metropolitan Police. What it's doing is basically scanning people's faces as they walk past and then comparing them to a database of wanted offenders or suspects. For the Met Police, facial recognition could enable them to more easily protect people, prevent offences and bring offenders to justice. However, privacy groups,
9:42 am
such as Big Brother Watch, say the technology is authoritarian and lawless. The group's legal and policy officer even goes so far as to say that facial recognition is possibly the most dangerous surveillance mechanism that's ever been invented. The facial recognition technology can capture up to 300 faces a second, which could be around 18,000 faces in a minute. It's a vast number of people whom the police can identify and check against police databases, whether that's police or immigration databases. So what we're seeing is police being able to identify people in seconds. That puts so much power in the hands of the state and the police, which I think is fundamentally wrong. It's not democratically accountable, because there's no legal basis for this. So this is an intense, intrusive and authoritarian surveillance technology. While advocates of facial recognition would debate some of these criticisms, one thing is undeniable: the technology currently being used by the UK police is dangerously inaccurate.
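The vans are described as comparing each passing face against a watchlist. A toy version of that matching step is sketched below, using made-up embedding vectors, cosine similarity and an arbitrary threshold; real deployments use learned face embeddings, much higher-dimensional templates and calibrated thresholds.

```python
import math

def cosine(a, b):
    """Cosine similarity between two vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical watchlist of face embeddings (real ones have hundreds of dimensions).
WATCHLIST = {"suspect-17": [0.9, 0.1, 0.4], "suspect-42": [0.2, 0.8, 0.5]}
THRESHOLD = 0.95  # arbitrary: lower it and you get more alerts, more false positives

def check_face(embedding):
    for name, reference in WATCHLIST.items():
        if cosine(embedding, reference) >= THRESHOLD:
            return f"ALERT: possible match with {name}"
    return "no match"

print(check_face([0.88, 0.12, 0.41]))  # close to suspect-17, so it raises an alert
print(check_face([0.10, 0.10, 0.90]))  # unlike anyone on the list, so no match
```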
9:43 am
Latest figures show that 96 percent of the Met Police's so-called matches were misidentifications, and there is research showing that many facial recognition algorithms disproportionately misidentify darker skin tones and women. The causes are numerous and they vary, ranging from poor-quality CCTV images to the fact that the algorithms are often trained, so to speak, using faces that are mostly white and male. This technology looks like a really nice, quick fix to the fact that we have not got as much money to pay for human intelligence operations. So it sounds great in theory. The problem is it doesn't work very well on people who are not white men, which is quite a lot of the population on the planet.
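The 96 percent figure becomes less surprising with a simple base-rate calculation. The numbers below are invented purely for illustration: even a matcher that is rarely wrong about any single face produces mostly false alarms when almost nobody in the crowd is actually on the watchlist.

```python
# Illustrative numbers only, not the Met's actual statistics.
crowd = 100_000              # faces scanned at an event
on_watchlist = 10            # people in the crowd who really are wanted
false_positive_rate = 0.001  # matcher wrongly flags 0.1% of innocent faces
true_positive_rate = 0.9     # matcher catches 90% of genuine targets

false_alerts = (crowd - on_watchlist) * false_positive_rate  # about 100 wrong alerts
true_alerts = on_watchlist * true_positive_rate              # 9 correct alerts

share_wrong = false_alerts / (false_alerts + true_alerts)
print(f"{share_wrong:.0%} of alerts are misidentifications")  # roughly 92%
```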
9:44 am
Being arrested wrongfully means that you get put into predictive policing algorithms. Furthermore, the more often you're having contact with law enforcement, the more you are at risk of being stopped again, even erroneously, and so are people in your network, because they build the network out; it's not just about you. Proponents of facial recognition in the UK will argue that issues with accuracy can be fixed. They aren't wrong; technology can always be improved on. What's of bigger concern is that currently there are no laws governing the use of facial recognition technology in the country, whether it's the state using it or even private companies. I think what's really troubling at the moment is the technology has been rolled out without legislation and empowered regulators. This is not technology that has a very good track record of being accountable, so I can't find out who's using it, under what circumstances, what's done with the data, where it is stored, what the track record of cybersecurity is on keeping that data protected. All of these things, we have no idea. It's just being rolled out. When people feel that they're being observed all the time, that has a really chilling effect. So things like your right to protest, your right to go to a job interview, to hang out with some friends,
9:45 am
to go to church, these are things that perhaps the state doesn't have a right to keep an eye on. The Met Police have defended the trials, saying they are, quote, overt, and that members of the public are informed through posters and leaflets. But at the trial I was at, that wouldn't be the word I'd use. There were literally hundreds of people rushing through the space, and the chances of seeing the tiny signs, reading the leaflets or even understanding what the unmarked van was being used for were minimal. I stopped a few people to see what they thought of the trial. I'm not a fan of the level of invasion of privacy, yeah, but then, we live in that world. In my opinion, I think it's a good thing to have facial recognition, because as long as you're not doing anything bad, and it also helps police track people down. To be honest, the way technology is going at the moment, this will be the norm all around the world, so I think we just need to get used to it. If you've done nothing wrong, there's no issue. I think if you really believe that the state has never done anything wrong to its citizens, then you have nothing to fear from this technology. But as we know, no state has
9:46 am
a perfect track record, and we should not be putting so much power into the hands of the state or the police. Take a look around you in the world. The technology is already being used by certain countries. All you have to do is pick up a newspaper and see people who are being incarcerated in concentration camps in China right now. Biometric data is part of that. That's how they're monitoring those people and tracking them, and anyone who comes into contact with them, right? So there's your proof of concept of what could be done. Now, it's really easy to go, that would never happen here, but your government can always change, right? History is full of examples, even in liberal democracies, of times of war and times of economic difficulty when people get voted into power who change things. So you have to think about how a system is being built and what it could be used for years down the road, when there's a very different political flavour. The UK collects biometrics from another key
9:47 am
segment of the population, one that many wouldn't have even considered: children. Few are aware that schools have been recording the biometrics of children for the past 20 years. It is estimated that since 1999, approximately 70 to 80 percent of children in the UK have interacted with some sort of biometric device in school. Pippa King is a parent, a campaigner for children's rights and the creator of the Biometrics in Schools blog. I think companies are putting the tech into a school setting because you've got a compliant population. School children ask fewer questions if they're being surveilled than the general population, simply because they don't know any better. The concern I have with biometrics in schools is that way back in 1999, and throughout the whole of the next decade, the 2000s, we had an adult population that wasn't using biometrics at all, not even on phones, and suddenly we had children of three and four using their fingerprints to get on to school systems. The growth of affordable biometric technology means that fingerprints, iris scans,
9:48 am
facial recognition and infrared palm scanning have been used to speed up access to canteens, libraries, registration, payments and lockers. A big selling point, of course, has been security: biometric-enabled access is seen as a foolproof way of keeping school buildings safer. However, a big concern is how robust the systems are, who has access to the biometric data, whether there is a process for deletion, and what happens if the system is compromised. I actually sent a Freedom of Information request a few years ago, asking whether they had checked the software, whether they had checked encryption standards, whether it adhered to international standards, whether the hardware is secure, and the answer was no, we've never checked the systems; no, we don't know whether they meet national standards. It seems to be swept under the carpet, and nobody's aware of what's in schools, what's being sold to schools, who has access to it, and whether or not there's been
9:49 am
any biometric data breaches. For entire generations of British school children, questions of consent around their biometrics have been bypassed to a great extent. It was only in 2012 that a law was enacted putting in place processes for consent to be given or withheld. The overall effect of biometrics in schools, however, is that the sharing and use of some very personal data, and the implications of surveillance, are being normalised. That's millions of British children who've been taught to understand that it's no big deal to hand over your body data in order to get a service or a product. They don't understand how it could be abused, necessarily. There's no reason that they should understand it, because nobody's helping them to understand it. We haven't had a public discussion about it. The issue isn't necessarily the tech, because we've got the tech already; it's the acceptance. And if you go into schools and you desensitise and normalise the surveillance technology, then the smart cities that are coming are already, you know, baked into it. So I think there's
9:50 am
a good argument for all of us to be a little bit wary of where we use "smart", and especially when it comes to smart cities, because it is essentially surveillance. It would be one thing if extensive biometric systems were being used only by governmental, state-funded organisations like the UN. It wouldn't make the lack of accountability, or the inaccuracy, or outdated security protocols any easier to live with, but at least across many countries, governments can be questioned and pressured to give answers of some form. The reality, however, is that biometrics are increasingly being used by private companies: shopping malls, recruitment agencies, online DNA and ancestry services, and even private security companies. All of them are taking and using our biometrics, and finding out how the technology is being used, what data is being stored and with whom it's being shared, not just today but also in the future, involves a lot of guesswork, because these are not transparent systems. Even some of the most
9:51 am
seemingly benign data-driven products can pose a threat. A lot of people, for instance, are really interested in finding out about their family history, so they're handing over their DNA to companies like Ancestry.com. Our unique combination of the world's largest DNA and family tree databases can show you a more precise picture of your origins. You as a citizen have fewer rights over your DNA with a private company than you do with law enforcement, right? So put that in your head for just a moment and follow the implications through. In some countries, your DNA could be used to reveal all sorts of things, for instance predispositions to health problems that you might have. And in countries where there isn't national health insurance, and you have to pay to be insured for your health, that could be used against you, and you would never know, maybe not even how they got the data, because this is all potentially being traded by data brokers, because it's not illegal yet, because no one has regulated it. Your biometrics are
9:52 am
really powerful data, and they're being used not only to ID you and to more reliably track you, but also to judge you and to make assessments about your personality and your behaviour. There are companies that offer this exact service. Take HireVue, for example. On its website, it says it leverages AI in video to provide comprehensive candidate insights. When a candidate takes the video interview, they're creating thousands of unique points of data. A candidate's verbal and non-verbal cues give us insight into their emotional engagement, thinking and problem-solving style. According to HireVue, its services are already being used by big employers like Unilever, Vodafone and Dunkin' Donuts. Something that has increased in the real world is the sheer range of things we can do and the ways that we can use data to affect people's lives. So, for example, machine learning systems are used to recommend products in shopping scenarios,
9:53 am
but they're also used to assess people for jail sentences. And so if those algorithms have got problems, whether they be technical inaccuracies or bias within an algorithm, we need to start addressing those kinds of issues. Ryan Kelly researches computing and information systems at the University of Melbourne. He's been involved in an elaborate biometric experiment to raise awareness about the potential and the limitations of biometric analysis. It's called Biometric Mirror, and I gave it a go. Using nothing but an image of my face, the system produces a detailed report with its assessment of my age, race, level of attractiveness, and even aspects of my personality, ranging from happiness and weirdness to aggressiveness and responsibility. To teach the algorithm to do this, researchers asked human volunteers to judge thousands of photos for the same characteristics. Everything? Yes,
9:54 am
really. Now, it's easy to laugh at the results or shrug them off as just a bit of fun, but there's more to it than that. One of the reasons it's important to teach people about the limitations of artificial intelligence and these kinds of analyses is because people might assume that, because it's done by a computer, it's objective and correct. What they might not realise is that an application like Biometric Mirror actually draws on a dataset of faces that have been rated by people, and so those ratings contain human biases. One example: in the dataset that Biometric Mirror uses, anybody with a beard is classified as aggressive. So of course I am classified as an aggressive person by Biometric Mirror, even though I don't think I am, I hope. So if an application like this is deployed in the real world, immediately people are classified, perhaps unfairly, in ways that aren't accurate.
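The point about human-rated training data can be made with a toy example. The data below is fabricated: because the imaginary raters labelled every bearded face as aggressive, the rule learned from their labels reproduces that bias exactly, which is the Biometric Mirror argument in miniature.

```python
from collections import Counter

# Fabricated training set of (has_beard, rated_aggressive_by_humans) pairs.
ratings = [(True, True), (True, True), (True, True),
           (False, False), (False, True), (False, False)]

def learn_rule(data):
    """'Learn' the majority human label for each value of the beard feature."""
    rule = {}
    for feature_value in (True, False):
        labels = [label for feature, label in data if feature == feature_value]
        rule[feature_value] = Counter(labels).most_common(1)[0][0]
    return rule

rule = learn_rule(ratings)
print("bearded face judged aggressive?", rule[True])        # True: the raters' bias is copied
print("clean-shaven face judged aggressive?", rule[False])  # False
```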
9:55 am
You could imagine a scenario, for example, where you have a set of job applicants and you want to make it easy for people to filter them based on responsibility. So somebody who is responsible for that task might say, oh, I can use Biometric Mirror to identify people who are high in responsibility, without really realising that it's not an accurate thing to do, and there are various problems associated with that. Regardless of those problems, biometric technology is being developed and used at a rate that far outstrips the pace at which regulations are being created. In many senses, it feels as though we're sitting on a ticking time bomb. We don't even have an established field of ethics for technology. There are voluntary codes by companies, but these are not legally enforceable; as a citizen or a consumer you cannot use them to protect yourself in any way, so derive no comfort from that. So I think we're entering a really interesting space in terms of what it means to be human, because as we become more quantified, there's going to be such a temptation to take all the data about you and reduce you to zeros and ones. That is
9:56 am
what is coming, and whether or not you want that to happen has to be something that's discussed. We're rolling this technology out and saying that this is going to change the way that we work and live within the next five, ten, twenty years, and to me that's really worrying. We need to elevate ethics for technology right to the top of the agenda. My advice would be to know that no data is 100 percent secure. You should always be able to know who is taking your data, how it's being used, what right you have to correct or amend it if it's incorrect, and whether or not you can delete it. And that goes for law enforcement or any government branch in your country, but also any private company that you might interact with. If children are using technology, ask who owns the technology, ask them the question, are they sharing it with anybody? And I just think generally
9:57 am
being prudent, keeping your digital footprint to a minimum, is a good thing to do. Also, technology is brilliant, so do enjoy it. It is amazing, the flow of information and knowledge out there, but just be very aware that it's your data, and data is very valuable. There is a huge group of people at work behind the screen, and the power they have is that urge to keep swiping through your Twitter feed. That's design. The way we click "I agree" to the terms and conditions, that's design. And most of us never even give it a second thought; actually, that's design too. Ali Rae explores how designers are manipulating our behaviour, in the final episode of All Hail the Algorithm, on
9:58 am
Al Jazeera. It is hotter in northwest Africa than in Arabia at the moment; that hasn't happened for a while. No fifties recorded, typically low forties from Iraq through western Iran and down to the Gulf states, so we're watching for winds to pick up at this time of year. And this massive picture, to some degree, is full of dust, increasingly so, massed in the orange you can see here through Saturday, across the Gulf states and then towards Oman. Coastal Oman is still catching the edge of the monsoon wind, so in Salalah it's sometimes drizzly and it should be green, which is obviously not true for most of the region. So that's the standard picture, though Doha at 42 degrees is about 3 degrees above average. But the principal change here is the breeze, which normally should be a dry one, so it will reduce the humidity, which has been quite high in the Gulf states fairly recently. In Turkey, there are increasing chances of
9:59 am
showers. Significant rain is spreading itself just to the west of what was shown heading towards Libya, having come out of southern Italy, and the breeze is still blowing through the Aegean into the western Mediterranean; not as strong as it was, but it's there all the same, with the rain steadily heading south. Now, in more tropical Africa, around Lake Victoria, showers build by night and die by day, and then they rebuild further west, in places like, for example, Rwanda. Frank assessments, informed opinions, the critical debate and in-depth analysis of the day's global headlines: Inside Story, on Al Jazeera.
10:00 am
We travel the extra mile, where other media don't go. We go there and we give them a chance to tell their story. Afghanistan's health care system is in need of a lifeline; the UN is hoping to raise 600 million dollars to avoid a catastrophe. Hello, this is Al Jazeera, live from Doha. Also coming up: North Korea says it has successfully launched a new kind of long-range cruise missile; Japan warns of the threat it poses to peace.