tv [untitled] September 12, 2021 4:30am-5:01am AST
4:30 am
We understand the differences and the similarities of cultures across the world. So no matter where you call home, Al Jazeera will bring you the news and current affairs that matter to you. Hello from Doha, with the headlines on Al Jazeera. Commemorations have been held across the United States to mark 20 years since the September 11th attacks. US presidents past and present joined families at Ground Zero in New York to remember the nearly 3,000 people killed. This is a live look at Ground Zero right now, with the tribute in lights in the distance. A memorial was also held in Shanksville, Pennsylvania, to mark the
4:31 am
moment when United Airlines Flight 93 crashed into an empty field. There, President Joe Biden laid a wreath at the ceremony to honor those killed. He arrived after attending the commemoration in New York. Mike Hanna has more from Washington, DC. A particularly hard day for the people of America, remembering those attacks 20 years ago. President Biden shuttled between the sites of the various attacks: he was in New York, then he went to Shanksville, Pennsylvania, and finally ended his day by laying a wreath at the Pentagon. He was joined in New York by two former presidents, President Clinton and President Barack Obama, Biden being in fact the fourth president who has led remembrance services for those who died and were injured. Security officials in northern Iraq say two drones carrying explosives struck outside Erbil International Airport, where US forces are stationed. This is the third time the airport has been attacked recently. Previous
4:32 am
attacks were blamed on Iranian-backed Iraqi groups. A New York Times investigation has revealed the US may have mistakenly targeted an aid worker in a drone strike in Kabul on August 29th; 10 people, including 7 children, were killed. The US military has said it was targeting an ISIL operation that threatened the Kabul airport. Tunisia's president says he plans to suspend the constitution and offer a new version that he has written himself. President Kais Saied took control of the government and dismissed the prime minister in July. Health officials in Colombia say they are worried about a potential COVID-19 surge next month. Colombia has run out of vaccines, and many people who received their first dose are currently unable to get a second shot. It comes as the Delta variant is on the rise. Those are the headlines. We're back with a full bulletin in half an hour. Right now, it's All Hail the Algorithm. I can unlock
4:33 am
my phone with my face. You can access your bank account with your voice. And fingerprints are often the key information on a national ID card. All of this, face, voice, fingerprint, is biometrics: unique algorithmic measurements of us that are revolutionizing the process of identification. But biometrics are far from perfect. Their convenience and seeming infallibility come at a price, most crucially our privacy. Biometrics are individually unique, so much so that they've always served as a gold standard for identification, with really high levels of accuracy and strong
4:34 am
security. Fingerprints and DNA databases have been the mainstay for police and investigators for decades, and across many parts of the world, people who are illiterate use thumbprints in place of a written signature. Stephanie has been researching the growing use of biometrics. There's also your face now, which is being recorded; that's your facial print, and that's called facial recognition technology. Your voice is biometric data. There's also something called gait analysis, which is how you walk. So those are ways that they could identify you. And another way is behavioural biometrics, which might be your online behaviour: how you use your mouse, where you click on things as you go through the internet, even how regularly you're posting on Facebook. There's a lot that you can get just from people's ordinary life, and that's why it's so important to have this debate in society, so that we are all giving our consent about whether or not we want such technology being used, and if so, under what circumstances
4:35 am
and with what regulatory checks. The world is on a mission, a mission to give everybody a legal identity by 2030. That was a target set by the United Nations as part of its Sustainable Development Goals campaign. The key segment of the population the UN is focusing on: more than 1 billion people who currently have no way to prove their identity. The unverified include millions of refugees, trafficked children, homeless people and others who never get a chance to establish documents and create the digital footprint that's so essential for modern life. Here at Zaatari camp, the United Nations World Food Programme is using biometric technology, iris scans, to provide aid to the camp's 75,000 Syrian residents. Refugees can shop for their groceries with the blink of an eye; no need for a bank card or registration papers.
4:36 am
The system is quite aptly named EyePay. When a shopper has their iris scanned, the World Food Programme system verifies the person's identity against the biometric database held by the UN High Commissioner for Refugees. The UNHCR then checks the account balance, confirms the purchase and prints an EyePay receipt. All of this happens in seconds, and according to the World Food Programme, this not only makes transactions quicker but more secure.
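The programme doesn't go into the software behind this flow, but the steps just described, match the iris, check the balance, confirm the purchase, print a receipt, can be sketched roughly in code. Everything below (IRIS_DB, case_id, the checkout function) is an illustrative assumption, not the WFP's or UNHCR's actual system.

```python
# Minimal sketch of the iris-payment flow described above.
# All names and structures are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class Beneficiary:
    case_id: str      # pseudonymous ID held instead of a name
    balance: float    # remaining monthly food-assistance credit

# UNHCR's role in the flow: a biometric database mapping an iris template to a case ID.
IRIS_DB = {"iris-template-001": "CASE-12345"}
ACCOUNTS = {"CASE-12345": Beneficiary("CASE-12345", balance=40.0)}

def checkout(iris_template: str, basket_total: float) -> str:
    """Verify the shopper's identity from an iris scan and settle the purchase."""
    case_id = IRIS_DB.get(iris_template)           # 1. match the iris scan
    if case_id is None:
        return "DECLINED: identity not verified"
    account = ACCOUNTS[case_id]                     # 2. check the account balance
    if basket_total > account.balance:
        return "DECLINED: insufficient balance"
    account.balance -= basket_total                 # 3. confirm the purchase
    return f"RECEIPT: {case_id} paid {basket_total:.2f}, remaining {account.balance:.2f}"

print(checkout("iris-template-001", 12.50))         # 4. print the receipt, all in seconds
```

In a real deployment the matching step would compare a fresh scan against stored iris templates rather than look up an exact key, but the sequence of checks is the same one the narrator walks through.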
4:37 am
We use biometric authentication for two main reasons: one, to guarantee 100 percent accountability on the identity of the person purchasing and using the assistance that we provide; and secondly, to facilitate the redemption process for the beneficiary, by not using a card, by not using a PIN. In camps, which is an environment where beneficiaries tend to go to the market multiple times during the month, for them, going with their own iris is easier than going with a card that could be denied. I can get it done by myself; the whole shopping process is very fast. This is super high-tech that's been rolled out in what you could call a low-rights environment. Sure, people here are under the protection of the United Nations and have more rights than they would have in the war-torn countries they fled from, such as Syria. However, they also have little choice when it comes to giving up their biometrics, or opting out of biometric programs.
4:38 am
Taking somebody's biometric data from them: that is about the most personal data that you could take. These are not people who are necessarily in a position to ask for legal representation, or to have this explained to them. Second, if they don't want it, what is the alternative that they can exercise instead? Are they using behavioural psychology, something called nudge theory, to make it where it's just easier to hand over your data, and then you get your food and your clothes and your money faster? Because that would be unethical. We're testing out, again, extremely experimental, really invasive technology on people who actually have some of the least rights and protections of anyone. Would a middle-class person living in France or Germany or the United States, the United Kingdom or Sweden consent to use their iris to pay for things, or to transact? Probably not. It's easy to see the immense potential of the EyePay system to track
4:39 am
aid disbursement, smooth out payments and reduce the chances of corruption or fraud. The World Food Programme says the benefits go even further: they are able to monitor shopping habits and nutritional intake, and there's a possibility in the future that the credit histories of the refugees could help them open bank accounts or get loans. They also think they've got the security bit covered. For the acquisition and the management of the data, we have a data-sharing agreement with UNHCR, and through that agreement we are able to access the beneficiary data, which again does not include the name, just the case ID, phone number and location. We are confident that that data being encrypted is what protects it, and that is the reason why we are doing regular data privacy and impact assessments on the project, to guarantee that if there are any risks, we are able to track them and address them properly before they come to us.
4:40 am
The UNHCR and WFP remain fully committed to their biometric registration program, so much so that they are rapidly expanding it, with the aim of being active in 75 countries by 2020. But there remain lots of problematic questions that have yet to be fully answered, such as: is the tech foolproof? Who has access? And how can anyone plan for the unforeseen issues to come? These are the kinds of questions that have made other aid organizations pause before jumping on board with biometric technology. In 2015, Oxfam voluntarily imposed a moratorium on the use of biometrics in its work. It stated: "Given the number of unknowns around most effective operation and governance models, and risks of this incredibly sensitive data falling into the wrong hands, we felt it was best not to become an early adopter." The one field in
4:41 am
which biometric technology has long been used is security and surveillance, and facial recognition is one of the most popular technologies right now. In China, there's been an exponential increase in the use of facial tracking and artificial intelligence to monitor citizens. The United States also currently operates one of the largest facial recognition systems in the world, with a database of 117 million Americans, with photos typically drawn from drivers' licenses. And in the UK, police forces have been trialling live facial recognition since 2016 at public spaces such as shopping centres, football matches, protests, music events and crowded city spots. This green van behind me here in central London is part of the facial recognition technology trial being run by the Metropolitan Police. What it's doing is basically scanning people's faces as they walk past and then comparing them to a database of wanted offenders. The Met Police say facial
4:42 am
recognition could enable them to more easily protect people, prevent offences and bring offenders to justice. However, privacy groups such as Big Brother Watch say the technology is authoritarian and lawless. The group's legal and policy officer even goes so far as to say that facial recognition is possibly the most dangerous surveillance mechanism that's ever been invented. Facial recognition technology can capture up to 300 faces a second, which could be around 18,000 faces in a minute. It's a vast, vast number of people who the police can identify and check against databases, whether that's police or immigration. So what we're seeing is police being able to identify people in seconds. That puts so much power in the hands of the state and the police, which I think is fundamentally wrong. It's not democratically accountable, because there's no legal basis for this. So this is an intense, intrusive and authoritarian surveillance technology.
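The trial van just described works by comparing each passing face to a watchlist, and that matching step is where the accuracy problems discussed next come from. The sketch below is a generic illustration of how such a watchlist check typically works; the embeddings, threshold and names are invented for the example and are not details of the Met Police system.

```python
# Generic sketch of a live facial-recognition watchlist check.
# All values here are invented for illustration.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

# Each wanted person is stored as a numeric "faceprint" (an embedding vector).
WATCHLIST = {
    "offender-A": [0.11, 0.80, 0.59],
    "offender-B": [0.75, 0.20, 0.63],
}

MATCH_THRESHOLD = 0.95  # lowering this flags more passers-by, i.e. more false positives

def check_face(embedding):
    """Compare one passer-by's faceprint against every watchlist entry."""
    best_id, best_score = None, 0.0
    for person_id, reference in WATCHLIST.items():
        score = cosine_similarity(embedding, reference)
        if score > best_score:
            best_id, best_score = person_id, score
    if best_score >= MATCH_THRESHOLD:
        return f"ALERT: possible match with {best_id} (score {best_score:.2f})"
    return "no match"

print(check_face([0.12, 0.79, 0.60]))  # close to offender-A: flagged for an officer to review
print(check_face([0.40, 0.40, 0.40]))  # unrelated face: passes through unflagged
```

Every alert is only a candidate match that a human still has to confirm, which is why a system can flag faces at scale and yet, as reported below, have the overwhelming majority of its so-called matches turn out to be misidentifications.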
4:43 am
While advocates of facial recognition would dispute some of these assertions, one thing is undeniable: the technology currently being used by the UK police is dangerously inaccurate. The latest figures show that 96 percent of the Met Police's so-called matches were misidentifications. And there's research showing that many facial recognition algorithms disproportionately misidentify darker skin tones and women. The causes are numerous and they vary, ranging from poor-quality CCTV images to the fact that the algorithms are often trained, so to speak, using faces that are mostly white and male. This technology looks like a really nice quick fix to the fact that we have not got as much money to pay for human intelligence operations. So it sounds great in theory. The problem is it doesn't work very well on people who are not white men, which is quite a lot of the population on the planet. Being arrested wrongfully means that you get put into predictive policing algorithms, for the more
4:44 am
often you are having contact with law enforcement, the more you are at risk of being stopped again, even erroneously, and also people in your network, because they build the network out. It's never just about you. Proponents of facial recognition in the UK will argue that issues with accuracy can be fixed, and they're not wrong: technology can always be improved on. What's a bigger concern is that currently there are no laws governing the use of facial recognition technology in the country, whether it's the state using it or even private companies. I think what's really troubling at the moment is that the technology has been rolled out without legislation and without empowered regulators. This is not technology that has a very good track record of being accountable, so that I can find out who's using it, under what circumstances, what's done with the data, where it is stored, what's the track record of cybersecurity on keeping that data protected. All of these things, we have no idea. It's just being rolled out. When people feel that they're being observed all the time, that has a really
4:45 am
chilling effect. So things like your right to protest, your right to go to a job interview, to hang out with some friends, to go to church: these are things that perhaps the state doesn't have a right to keep an eye on. The Met Police have defended the trials, saying they're, quote, overt, and that members of the public are informed through posters and leaflets. But at the trial I was at, that wouldn't be the word I'd use. There were literally hundreds of people rushing through the space, and the chances of seeing the tiny signs, reading the leaflets, or even understanding what the unmarked van was being used for were minimal. I stopped a few people to see what they thought of the trial. I'm not so sure about the level of invasion of privacy, but then, that's the world we live in now. Well, in my opinion, I think it's a good thing, to be honest, because as long as you're not doing anything bad, it also helps the police track people down. The way technology is going at the moment, this will be the norm all around the world, so I think we just need to get used to it. If you've done nothing wrong,
4:46 am
there's no issue. I think if you really believe that the state has never done anything wrong to its citizens, then you have nothing to fear from this technology. But as we know, no state has a perfect track record, and we should not be putting so much power into the hands of the state and the police. Take a look around you in the world: this technology is already being used by certain countries. All you have to do is pick up a newspaper and see people who are being incarcerated in concentration camps in China right now. Biometric data is part of that; that's how they're monitoring those people and tracking them, and anyone who comes into contact with them. So there's your proof of concept of what could be done. Now, it's really easy to go, that would never happen here. But your government can always change, right? History is full of examples that even in the broadest democracies, in times of war and times of economic difficulty, people get voted into power who change things. So you have to think about how a system is being built and what it could be used for years down the road, when
4:47 am
there's a very different political flavour. The UK also collects biometrics from another key segment of the population, one that many wouldn't have even considered: children. Few are aware that schools have been recording the biometrics of children for the past 20 years. It is estimated that since 1999, approximately 70 to 80 percent of children in the UK have interacted with some sort of biometric device in school. Pippa King is a parent, a campaigner for children's rights and the creator of the Biometrics in Schools blog. I think companies are putting the tech into a school setting because you've got a compliant population, school children, and my question is whether they're being surveilled a little bit more than the general population simply because they didn't know any better. The concern I have with biometrics in schools is that, way back in 1999 and throughout the whole of the next decade, the 2000s, we had an adult population that wasn't using biometrics at all, not even
4:48 am
on phones. And suddenly we had children aged three and four using their fingerprints to get onto school systems. The growth of affordable biometric technology means that fingerprints, iris scans, facial recognition and infrared palm scanning have been used to speed up access to canteens, libraries, registration, payments and lockers. A big selling point, of course, has been security: biometric-enabled access is seen as a foolproof way of keeping school buildings safer. However, a big concern is how robust the systems are, who has access to the biometric data, whether there is a process for deletion, and what happens if the system is compromised. I also sent freedom of information requests a few years ago asking whether they check the software, whether they check encryption standards, adherence to sort of international standards on hardware, whether it's secure. Nobody knows. We've never audited the tech systems; we don't know at a national level. And this has
4:49 am
just been swept under the carpet, and nobody's aware of what's in schools, what's being sold to schools, who has access to it, and whether or not there have been any biometric data breaches. For entire generations of British school children, questions of consent around their biometrics have been bypassed to a great extent. It was only in 2012 that a law was enacted putting in place processes for consent to be given or withheld. The overall effect of biometrics in schools, however, is that the sharing and use of very personal data, and the implications of surveillance, are being normalised. That's millions of British children who've been taught to understand that it's no big deal to hand over your body data in order to get a service or a product. They don't understand how it can be abused, necessarily. There's no reason that they should understand it, because nobody's helping them to understand it. We haven't had a public discussion about it. The issue is not necessarily the tech, because we've got the tech already accepted. And if you go into schools and you desensitise and normalise the surveillance technology,
4:50 am
then by the time the smart city arrives, nobody objects to it. So I think that's a good argument for all of us to be a little bit wary of where we are with surveillance, especially when it comes to smart cities, or smart toys, and where the sensors are. It would be one thing if extensive biometric systems were only being used by governmental or state-funded organisations like the UN. It wouldn't make the lack of accountability, or the inaccuracy, or the outdated security protocols any easier to live with, but at least across many countries governments can be questioned and pressured to give answers of some form. The reality, however, is that biometrics are increasingly being used by private companies: shopping malls, recruitment agencies, online DNA and ancestry services, and even private security companies. All of them are taking and using our biometrics. And finding out how the technology is being used,
4:51 am
what data is being stored, and with whom it's being shared, not just today but also in the future, involves a lot of guesswork, because these aren't transparent systems. Even some of the seemingly benign, data-driven products can pose a threat. So lots of people, for instance, are really interested in finding out about their family history, so they're handing over their DNA to companies like ancestry.com. Now, a unique combination of the world's largest DNA and family tree databases can show you a more precise picture of your origins. You as a citizen have fewer rights over your DNA with a private company than you do with law enforcement, right? So, like, put that in your head for just a moment and follow the implications through. In some countries your DNA could be used to reveal all sorts, so for instance predispositions to health problems that you might have. And in countries where there isn't national health insurance, and you have to pay to be insured for your health, that could be used against you.
4:52 am
And you would never know, maybe not even how they got the data, because this is all potentially being traded by third-party brokers, because it's not illegal yet, because no one's regulated it. Your biometrics are a really powerful data source, and they're being used not only to ID you and to more reliably track you, but also to judge you and to make assessments about your personality and your behaviour. There are companies that offer this exact service. Take HireVue, for example. On its website, it says it leverages AI in video to provide comprehensive candidate insights. When a candidate takes the video interview, they're creating thousands of unique points of data. A candidate's verbal and non-verbal cues give us insight into their emotional engagement, thinking and problem-solving style. According to HireVue, its services are already being used by big employers like Unilever, Vodafone and Dunkin' Donuts. The use of these systems is something that's increased over many years, and in the real world, the sheer range of things we can do and the ways that we can use data to affect
4:53 am
people's lives is vast. So, for example, machine learning systems are used to make recommendations in shopping scenarios, but they're also used to assess people for jail sentences. And so if those algorithms have got problems, whether they be technical inaccuracies or bias within the algorithm, we need to start addressing those kinds of issues. Ryan Kelly is a researcher in computing and information systems at the University of Melbourne. He's been involved in an elaborate biometric experiment to raise awareness about the potential and limitations of biometric analysis. It's called Biometric Mirror, and I gave it a go. Using nothing but an image of my face, the system produces a detailed report with its assessment of my age, race, level of attractiveness, and even aspects of my personality, ranging from happiness and weirdness to aggressiveness and responsibility. To teach
4:54 am
the algorithm to do this, researchers asked human volunteers to judge thousands of photos for the same characteristics. Now, it's easy to laugh at the results and shrug them off as just a bit of fun, but there's more to it than that. One of the reasons it's important to teach people about the limitations of artificial intelligence and these kinds of analyses is that people might assume that because it's done by a computer, it's objective and correct. What they might not realise is that an application like Biometric Mirror actually draws on a dataset of faces that have been rated by people, and those ratings contain human biases. And so one example in the dataset Biometric Mirror uses is that anybody with a beard is classified as aggressive. So of course I am classified as an aggressive person by Biometric Mirror, even though I don't think, and I hope, I'm not.
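Kelly's point, that a system trained on crowd-sourced ratings simply hands those raters' prejudices back as predictions, can be shown with a toy example. The tiny dataset and the scoring function below are invented for illustration and have nothing to do with the real Biometric Mirror data.

```python
# Toy illustration of crowd-rated labels baking bias into a "prediction".
# The data below is invented; it is not the Biometric Mirror dataset.

# Each photo is described by simple features plus the average crowd rating
# for "aggressiveness" (1 = not aggressive, 5 = very aggressive).
RATED_PHOTOS = [
    {"has_beard": True,  "smiling": False, "aggressiveness": 4.6},
    {"has_beard": True,  "smiling": True,  "aggressiveness": 4.1},
    {"has_beard": False, "smiling": False, "aggressiveness": 2.2},
    {"has_beard": False, "smiling": True,  "aggressiveness": 1.8},
]

def predict_aggressiveness(has_beard: bool, smiling: bool) -> float:
    """Average the crowd ratings of photos sharing the same features.
    Whatever prejudice the raters had (here, beard == aggressive)
    comes straight back out as the system's 'assessment'."""
    matches = [p["aggressiveness"] for p in RATED_PHOTOS
               if p["has_beard"] == has_beard and p["smiling"] == smiling]
    return sum(matches) / len(matches)

# A bearded face scores as highly aggressive regardless of anything the
# person has actually done, purely because of how raters judged beards.
print(predict_aggressiveness(has_beard=True, smiling=True))   # 4.1
print(predict_aggressiveness(has_beard=False, smiling=True))  # 1.8
```

Swapping in a neural network instead of this simple averaging step changes nothing about the underlying problem: the labels themselves carry the bias.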
4:55 am
So if an application like this is deployed in the real world, people are immediately classified unfairly, in ways that aren't accurate. For example, you can imagine a scenario where you have a set of job applicants and you want to make it easy for people to filter them based on responsibility. So somebody who is responsible for that task might say, oh, I can use Biometric Mirror to identify people who are high in responsibility, without really realising that it's not an accurate thing to do, and there are various problems associated with that. Regardless of those problems, biometric technology is being developed and used at a rate that far outstrips the pace at which regulations are being created. In many senses, it feels as though we're sitting on a ticking time bomb. We don't even have an established field of ethics for technology. There are voluntary codes by companies; these are not legally enforceable. You as a citizen or a consumer cannot use these to protect you in any way, so derive no comfort from that. So I think we're entering a really interesting space in terms of what it means to be human, because as we
4:56 am
become a more quantified world, there's going to be such a temptation to take all the data about you and reduce you to zeros and ones. That is what's coming, and whether or not you want that to happen has to be something that's discussed. We're rolling this technology out and saying that it is going to change the way that we work and live within the next 5, 10, 20 years, and to me that's really worrying. We need to elevate ethics for technology right to the top of the agenda. My advice would be to know that no data is 100 percent secure. You should always be able to know who is taking your data, how it's being used, what rights you have to correct or amend it if it's incorrect, and whether or not you can delete it. And that goes for law enforcement or any government branch in your country, but
4:57 am
also any private company that you might interact with. If children are using technology, ask who owns the technology, and ask the question whether they are sharing it with anybody. And I think generally being prudent, keeping digital footprints to an absolute minimum, is a good thing to do. Also, technology is great, so do enjoy it; it's amazing, the flows of information and knowledge out there. But just be very aware that it's your data, and data is very valuable. There is a huge group of people at work behind the screen, and the power they have is massive. That urge to keep swiping through your Twitter feed? Designed. The way we click I agree to the terms and conditions, that's designed, and most of us never even give it a second thought, and actually, that's designed as well. Find out how designers are manipulating
4:58 am
behaviour, in the final episode of All Hail the Algorithm, on Al Jazeera. Now, this is unusual: Hurricane Larry rushing up, look at the speed of that thing. This is the normal satellite picture that I show you. It has really rushed past Newfoundland, still called a hurricane, causing potentially a lot of damage, although it did go through quickly. Now, after that, what it's left behind looks fairly normal: this is a curl of rain crossing the prairies of western Canada. Much
4:59 am
of the US is looking quiet, still hot in the southwest, but around the Gulf Coast things don't look so good. In fact, let's zoom in there, because by Sunday and Monday there will be rain, and significant rain I think, coming up from Mexico across the border into Texas. The orange represents the heaviest of the rain. But step back 24 hours, because, as I say, we're in the wet season: that means significant rain through this part of Mexico and a few big showers elsewhere, particularly in Sinaloa. But I think otherwise it's fairly dry down towards Colombia. South America in particular is looking quiet at the moment. There has been flooding, significant flooding, in northern Colombia, and that could carry on. Then there's a bit of a gap, and showers are returning to some parts, as you can see. Asunción at 37 degrees is a case in point: this sort of temperature is far too high for where it is. It will cool down significantly, but it'll take three days.
5:00 am
In a special investigation, 101 East visits Western Australia's only youth detention centre and travels to the remote outback town where many of the indigenous inmates come from, on Al Jazeera. Marking 20 years since the deadliest attack on US soil, solemn ceremonies are held to remember the victims of the 9/11 attacks. Hello, this is Al Jazeera, live from Doha. Also coming up: bearing the scars of the US-led