The Context, BBC News, September 26, 2024, 9:30pm-10:01pm BST
9:30 pm
for AI Decoded. Facial recognition is hard, for two reasons. Teaching a computer to process a human face is difficult enough. Matching that face to someone's identity in a database requires significant computing power, and billions of photographs, tied to accurate data, on which those computers can train. The technology has been around since the early 1970s in its most primitive form, but so unreliable was it that other biometrics, fingerprinting and retinal scanning, came to market quicker. About five years ago a company called Clearview AI claimed to have made a breakthrough. Tonight their CEO will join us live from Oakland, California. The advances in this technology are probably the biggest breakthrough in crime detection since we introduced DNA testing. This week Police Scotland
9:31 pm
announced they will be rolling out facial recognition cameras across the country. Chief Constable Jo Farrell says it would be an "abdication of her duties" not to be using it. We were given a very good demonstration of it through the summer: retrospective facial recognition tracked down many of those taking part in the riots, even those who were masked. But is it becoming too intrusive? The software that identifies our faces is now being developed for authentication; we use it on our phones instead of a code. British Telecom are currently trialling that same technology to improve cybersecurity, so that only authorised workers have access to critical systems and data. There is a lot to consider. With me in the studio, as ever, the font of all AI knowledge, Priya Lakhani, CEO at Century Tech. She has come back from New York this morning, so bear with her. I'm going to put you on the spot nonetheless. Give us a
9:32 pm
quick explainer of how this technology works. Firstly, let's say we want to spot you in an image, and then, going off a bit, we want to match that to a wanted list, Christian. Put an image of Christian here on the camera; in the background you will have him in the studio, and there is lots of noise in that studio: tables, monitors. First we want to do what we call classification, detect the fact that we have got a face here. The way we do that is to use deep learning models to classify the image and find the face. The way they traditionally do that is to look at lots of images, tag lots of faces in those images, say these are faces and these are not faces, and then you build up AI models and train them to learn where the faces are. They look at those feature sets of faces. There are some models that use unsupervised learning, where you don't have that sort of tagging and labelling. The model then learns what the specific features of a face are: the texture, the edges of what a
9:33 pm
face looks like. You have got a model that can potentially spot all of the faces in that very big studio over there. We then take your face specifically and we create, if we can play a clip, a bounding box around your face. I've got a clip to show our viewers; this is Friends. It is very short, but you can see the sort of boxes around the faces. You create these bounding boxes around the face, and the model knows that it has a face, but we want to detect whether that is your face, and whether it matches something in the database. So we normalise the image. Say you have an image of a face: you normalise it because you want to take away lots of variables that could create inaccuracies, like the lighting. You might want to turn all the images into grayscale, look at the alignment of the face. Once you have done that, and this is where it is interesting, in those images you saw lots of spots. The deep learning models
9:34 pm
will extract what it thinks are the key features of the face. Once it extracts those key features of Christian's face, and this is where it is really interesting, we turn it all into maths. We vectorise the data. You end up with Christian Fraser: what you're used to seeing as an image actually is a string of numbers. Just a string of numbers. Every image of you taken on cameras out there, modelling that face using this sort of AI, will have a slightly different string of numbers, because your position might be different, or your expression, but they will be very similar, because it is your face: a mathematical representation of your face. Then, let's say you have a wanted list. You have got a database of faces that you are looking for. They have been through a similar process, where they have been encoded into numbers. And then it is a matching exercise: is there a similarity between the string of numbers that represent your face and the strings of numbers on the wanted list?
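The pipeline described here, detect a face, normalise the image, turn it into a string of numbers, then run the matching exercise, can be sketched in miniature. This is a toy illustration only: `detect_faces` and `embed` are crude placeholders for the deep learning models Priya describes, and all names and data are invented.

```python
import math

def detect_faces(image_regions):
    """Classification step (toy): return bounding boxes for the
    regions a detector has tagged as faces amid the studio noise."""
    return [r["box"] for r in image_regions if r["is_face"]]

def normalise(pixels):
    """Normalisation step: strip variables that cause inaccuracies.
    Convert RGB to grayscale and rescale brightness into 0..1."""
    gray = [(r + g + b) / 3 for (r, g, b) in pixels]
    brightest = max(gray) or 1
    return [v / brightest for v in gray]

def embed(pixels):
    """Feature-extraction step (toy): a few summary numbers standing
    in for the vector a trained network would produce. Every photo of
    the same face yields a slightly different, but similar, vector."""
    n = len(pixels)
    mean = sum(pixels) / n
    spread = sum((p - mean) ** 2 for p in pixels) / n
    edges = sum(abs(a - b) for a, b in zip(pixels, pixels[1:])) / (n - 1)
    return [mean, spread, edges]

def cosine_similarity(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) *
                  math.sqrt(sum(b * b for b in v)))

def best_match(probe_vec, wanted_list):
    """Matching exercise: which entry's string of numbers is most
    similar to the probe's?"""
    return max(wanted_list, key=lambda e: cosine_similarity(probe_vec, e["vec"]))
```

A real deployment swaps the placeholders for trained networks and runs the comparison against millions of stored vectors, which is exactly where the computing power, and the improving chips, come in.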
9:35 pm
So that is where the computing power comes in, and where the chips come in? Because as the chips improve, those calculations will be much quicker. Yes, and remember, I really want to get the expert in this area on, but do you remember when we did an episode on semiconductors? We talked about how there are two big processes that happen: all the training on data initially, for example all the faces and images, and then we run the model. That is what you're talking about: you're running the model against that facial recognition database to be able to find the face. It is extremely impressive what Clearview AI do. Before we talk to the expert, let me show you the promo video that Clearview AI puts out. In 2019, Homeland Security Investigations were trying to identify an adult male who was in a child abuse video.
9:36 pm
The adult male was abusing a six-year-old girl and selling this abuse video on the dark web. The only clue was a photo of the adult male, who was in the background of the abuse video for just a few frames, with no other clues. The investigator on the case turned to Clearview AI. Prior to searching on Clearview AI, we must provide a reason for the search. In this example, we will choose felony sex offence. Secondly, you upload a photo of the suspect from your desktop and press the search button. As you can see, 25 results now match the uploaded photo, whereas in 2019, during Homeland Security's investigation, only one result came back from Clearview AI. This is the photo. As you can see, the suspect is in the background of the photo.
9:37 pm
Press the locate button on the top left to zoom into the photo, or use the compare button to see them side by side. In this case, the investigator clicked the link to a public social media post online, uncovering two key pieces of information: the photo was tagged in Las Vegas, and the name of the company the suspect appeared to work for. With those two clues, the investigators at Homeland Security travelled to Las Vegas to obtain the suspect's name from the employer and, with additional corroborating evidence, secured a search warrant for the suspect's computers. The search warrant revealed that the suspect had thousands of videos and photos of child abuse material on his computer. He pled guilty and is now doing 35 years in jail, and the six-year-old girl was rescued. Let's speak to Hoan Ton-That, who's
9:38 pm
chief executive officer of Clearview AI. How many cases do you think your technology has solved, and what was the big leap forward for you? Thank you so much for having me on, it is great to be here. I appreciate your interest in Clearview AI. We have now done over 2 million searches on behalf of law enforcement; that is how many searches they have run on our platform. We do not know how many crimes they have solved, but if you take even a conservative estimate, you would be in the hundreds of thousands at least. Sometimes cases require you to search multiple images, and so on. But anecdotally, as well, most recently we worked with the International Centre for Missing and Exploited Children, and we were in Ecuador, in Latin America.
In three days, these law enforcement agencies, about eight of them, went through a list of the hardest cold cases they have not solved: missing kids, kids who have been abused,
9:39 pm
and they find them on these internet forums as victims. In those three days, they made 110 identifications of missing and exploited children, and rescued over 50 of them. The impact is incredible. On the flip side, we know it is a very powerful technology, so we have limited the usage of our application to law enforcement and governments. But what is it... I said that there had been a breakthrough. With this technology, what has been the real breakthrough for facial recognition? I think it is neural networks, which are part of artificial intelligence. Previous algorithms for facial recognition would try and look at the distance between the eyes, or the distance to the eyebrows, or the nose and the eyes, things like that. But
that does not work very well if you have an image from a different angle, say a security camera. With neural networks you are able to train, it is
9:40 pm
called supervised learning, on a lot of different examples of photos to improve accuracy. So, the way we trained our algorithm was to get a lot of publicly available images. Say you have 100 images of George Clooney, 100 photos of Brad Pitt: the algorithm will learn that the black-and-white photo of Brad Pitt with the sunglasses on is the same person as one of him from 20 years ago with different hair, and so on. The algorithm learns what stays the same in a face. And so, the more data you have, the more accurate it gets. There has been some great research out there; thanks to pushes in machine learning and all this data, the research community has done a really good job to improve it.
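The training objective described here, learning "what stays the same in a face" across many labelled photos of the same person, is often expressed as a triplet loss: embeddings of the same person are pulled together, and different people pushed apart. A minimal sketch with invented three-number embeddings; this is the generic technique, not Clearview's actual training code.

```python
def squared_distance(u, v):
    return sum((a - b) ** 2 for a, b in zip(u, v))

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Training signal: the same person (positive) should sit closer
    to the anchor than a different person (negative), by at least
    `margin`. Zero loss means the embedding already separates them."""
    return max(0.0, squared_distance(anchor, positive)
                    - squared_distance(anchor, negative) + margin)

# Toy embeddings: two photos of the same actor, one of someone else.
photo_colour = [0.90, 0.10, 0.40]
photo_bw_sunglasses = [0.85, 0.12, 0.38]   # same face, different look
someone_else = [0.10, 0.80, 0.70]

# Well-separated already, so this triplet contributes zero loss.
assert triplet_loss(photo_colour, photo_bw_sunglasses, someone_else) == 0.0
```

During training, the network's weights are nudged to shrink this loss over millions of such triplets, which is why more data keeps making it more accurate.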
We built upon a lot of those algorithms. What we were able to do was bring a lot of data to train our algorithm. And so, now, when you look at all the top facial
9:41 pm
recognition algorithms, not just Clearview AI's, there are others; there is the National Institute of Standards and Technology in the US that ranks hundreds of these algorithms. We can pick a photo out of a line-up of 12 million images at a 99.85% accuracy rate, and that is across all demographics. The technology has now become much more accurate. The innovation really is artificial intelligence, and the amount of data that is out there that you can use to train these algorithms. What I am interested in is something you just said. I thought that prior tests showed there was a significant statistical difference in the performance of the models when it came to certain demographics. Yes, they ranked demographics as well. If you look at the top performing algorithms, the differentials are very small. You're looking at over 99% accuracy across all the demographics we do test.
But if you looked five years ago, that is when a lot
9:42 pm
of these algorithms did have issues with accuracy, especially on certain demographics. But today, a lot of the top algorithms are very accurate regardless of demographics. It is also things like angles... So you can get partial faces? We have talked about people with masks who were identified through the summer here during the riots. But you could have, I don't know, a third of your face, someone in a balaclava with just the eyebrows showing. You might still pick up an image? Yes, we were surprised as well. I am a software engineer by background, and when the pandemic happened we had many issues identifying people with masks. We added
we added photos with masks to our training _ photos with masks to our training data, about 3% of the photos— training data, about 3% of the photos we photoshopped masks onto, _ photos we photoshopped masks onto, it — photos we photoshopped masks onto, it was kind of incredible to see — onto, it was kind of incredible to see that now almost all the time, — to see that now almost all the time, even with the mask on, our— time, even with the mask on, our algorithm works very well in searching out of billions of images _ in searching out of billions of images. so that it technology even — images. so that it technology even surprises people who are making — even surprises people who are making it _ even surprises people who are making it. so even surprises people who are making it— making it. so when you talked oriainall making it. so when you talked originally about _
9:43 pm
So when you talked originally about having a lot of data to be able to add to this, the data, the labelling, the supervised learning that then allows you to have these really performant models: that data was taken from the internet. In terms of some of the concerns, what is Clearview doing to ensure that this technology is not being misused? Have you got safeguards in place? Yourselves, as the business, have you got safeguards in place with the law enforcement authorities that you're dealing with across the world? We want to hear a little bit more about that, given where this data has come from, where the inputs have come from, and how those potential outputs may be used. Yes, so we have a wide variety of ways to make sure that this technology is used for the best and highest purposes, solving crime. A lot of it is used by law enforcement and other users; we do restrict this to government and law enforcement agencies. We have a vetting process ourselves before we onboard any customer. We look at their human rights background, and
9:44 pm
all those kinds of things. I was going to ask you about that. I was thinking of the Skripals, who were poisoned in Salisbury; of course the Russian government would be looking for them, and wondering whether it would have access. But you have a vetting process? Yes. We will not sell to Russia or Iran. We do sell this to the Ukrainians; they have used it very effectively since the beginning of the war. There have been over 2,000 war crimes that have been solved, or where the suspects have been identified, because of Clearview; they would not have been identified otherwise. Sorry to interrupt: how safe is this database?
Because through our series, one of the things we have been talking about is that sort of cat and mouse of AI, AI accessing AI. Are you concerned at all, given the
9:45 pm
immense amount of data that you have gathered, that it is safe from the bad actors who would want access to it? That is a great question. We vet every customer: we talk to them, find out and try to verify who they are, of course, before we onboard them. Second, we make sure that there is an administrator in charge of the facial recognition programme in a particular agency. Any law enforcement officer using it, before they do a search, as you saw on the demo, has to put in a case number and a crime type. And that allows the administrators to audit, on a regular basis, which officers are using the technology and for what reasons. So that is another safeguard that really helps these agencies make sure it is used for the right purpose.
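The safeguards listed here, a mandatory case number and crime type before any search, an audit trail for administrators, and results treated only as leads, map naturally onto a thin wrapper around the search itself. This sketch is purely illustrative; the names and fields are invented, not Clearview's actual API.

```python
AUDIT_LOG = []

def audited_search(officer, case_number, crime_type, probe_photo, search_fn):
    """Refuse to search without a case number and reason, record who
    searched and why, and return results only as unverified leads."""
    if not case_number or not crime_type:
        raise ValueError("case number and crime type required before searching")
    AUDIT_LOG.append({"officer": officer, "case": case_number,
                      "reason": crime_type})
    # Match scores are deliberately not surfaced: results are leads,
    # never the sole source of evidence.
    return [{"candidate": name, "status": "lead: requires independent verification"}
            for name, _score in search_fn(probe_photo)]
```

Administrators can then review `AUDIT_LOG` on a regular basis to see which officers ran searches, and for what stated reasons.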
And they can take action as appropriate if there is any kind of misuse of the technology. I think that is another control, an innovation we have had, that sets us apart. Finally, we give
9:46 pm
training to all the people who use Clearview, to make sure that they know how to not just take the search results and go with them, but verify the information that comes back from Clearview. We do not allow the search results to be used as the only source of evidence. These are leads, and these investigators follow research to verify the identity. With those things, that is how we have been able to really get the best out of the technology and minimise a lot of the downsides. Very quickly, do humans have to review the results of that? Absolutely.
One thing we have done with our software: even if we think a match is 98 or 99%, that score is not shown in our software, for that reason. Amazing to talk to you, thank you very much for coming on the programme. Thank you so much for having me. Is this all feeling too Orwellian for you? Coming up after the break, we will get the other side of the debate: how do we protect our freedoms?
9:47 pm
9:48 pm
Welcome back. So now we know how impressive this AI technology can be; its power stretches far beyond the average search engine. This is a radical re-imagining of the public space. If, as looks likely, it is widely adopted by our police forces, then increasingly the freedom to wander without being watched will disappear. Facial recognition will bind us to our digital history in ways we have not yet imagined. It will be the end of what was previously taken for granted: the right to be publicly anonymous. With us tonight is Silkie Carlo, director of Big Brother Watch. Before we speak to her, let's watch their campaign video.
9:49 pm
This is how it works. They bring out a van. There are cameras on top of the van that scan everyone's faces as they walk past. I came to London Bridge, and I was pulled up at London Bridge in regards to this facial recognition. I did cry the entire way home. I felt so helpless. I just really wanted to find any way to prove that I wasn't a thief. So typically, on a busy day in a London area like this, they can be scanning thousands of people's faces. It's thousands of people that are effectively walking through a digital police line-up as they walk past. It's a passport-style check that sees if you are known to the police, if you are on a database. There are lots of different reasons that people can be put onto watch lists. You can be put on a watch list to protect you from harm, whatever that means. Typically what we see is an alert comes up, bam! The person is stopped. If they don't stop voluntarily, then they can be physically apprehended.
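Mechanically, the van-mounted system being described is a streaming version of the matching step from earlier in the programme: every passer-by's face embedding is compared against a watch list, and an alert fires when the similarity clears a threshold. A toy sketch; the embeddings and the 0.92 threshold are invented for illustration.

```python
import math

def cosine_similarity(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) *
                  math.sqrt(sum(b * b for b in v)))

def scan_crowd(passersby, watch_list, threshold=0.92):
    """The 'digital police line-up': every scanned face is compared
    with every watch-list entry, matched or not, and close matches
    raise an alert."""
    alerts = []
    for face in passersby:
        for entry in watch_list:
            if cosine_similarity(face["vec"], entry["vec"]) >= threshold:
                alerts.append((face["id"], entry["name"]))
    return alerts
```

Note that everyone walking past is scanned and compared, whether or not they are on any list; that blanket comparison is precisely the intrusion Big Brother Watch objects to.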
9:50 pm
Silkie Carlo is director of Big Brother Watch. Welcome to the programme. It is difficult for someone in my position, who is on television every night, to argue for more privacy. But if I were not in this job, I think, if I could choose to opt out of the database from Clearview AI, I might want to do that, but it seems very tricky. In British law and in European law, extraordinarily, you have the right to protect your own data, including your photographs. Of course, what he is doing with Clearview AI is actually not just stealing the billions of photographs from the internet that people have not consented to, there are about 30 billion in his database alone, but also extracting biometric data from them: that is information as sensitive as what is on your passport. These companies are making that deeply sensitive data available to the highest bidder. It is really just a question of what the buyer wants to do with it... You say stealing: is
9:51 pm
is there a difference?

they are two words for the same thing... because you have rights over your data. and we have fought for a long time to have the right to privacy as a protected right, a fundamental human right. the thing about facial recognition, especially when you have got companies either taking it from people on the street as they walk around, through cctv, or scraping it from the internet, is that it actually reverses the perception...

the reality is that it happens in stages. we give the images to social media, and we assume... do we assume they would be used for other purposes? i don't know. but at the same time we enjoy using google photos, which puts all our photos into a certain order and identifies people for us. so, we like all that. are you saying that we have to give up all that if we want to keep our freedom?

no, not at all, but there is a different level of protection for biometric data because it is so sensitive, it is like dna.
9:52 pm
in the same way, and i do think with facial recognition there are ways that, highly regulated, you can use it for the public benefit. but the problem is, we don't allow companies to go around scraping buses and playgrounds and high streets for people's dna and making massive databases, yet because facial recognition is so unregulated at the moment, that is exactly what is happening with it.

we are looking at the actual process. traditionally we are relying on people recognising people, on human eyewitnesses, so we are relying on human facial recognition rather than a machine. i am interested in the public interest argument. you've got the security forces: we have got the met police using this technology now in the uk, the scottish police, the welsh police. and i think it was a big brother watch quote which said police are failing to turn up to even 40% of violent shoplifting incidents. the met police started the year 1,000 officers short and will probably end the year 1,400 officers short. i don't want those officers trawling through faces trying to make those matches...
9:53 pm
it is a tool, technology. i can understand the point about the consent of the data and the input, but how do we achieve that balance where there is still a job to be done here, right? i know crime is reduced by 8%, but sexual assault has rocketed upwards. how do we use the technology to benefit us while being able to mitigate the potential risks and harms that are caused by the breach of our privacy?

we have to have regulation and laws around this. we have got laws on fingerprints, on dna, we have got laws on cctv; facial recognition is just a kind of vacuum at the moment. it is a kind of wild west for companies to go in and build these databases of billions of photos, and also, i have to say, for the police. i have been watching the police use this for seven years now, and i can give an example of what it actually looks like on the street. i go to a high crime area like croydon, where police are using facial recognition:
9:54 pm
masses of resource standing around looking at ipads. i walked past a robbery on my way to watch the police using facial recognition. and they are getting it wrong as well. that is where we stepped in, because there are injustices: people wrongly stopped, questioned, harassed by police because they have been misidentified.

there is one other issue we have not talked about, that third story i had in the introduction, where british telecom is saying we can use this for our own security. would we actually be snooped on by our employer?

absolutely, we have just released a report on that. whether
whether it is on construction sites, the — it is on construction sites, the gig _ it is on construction sites, the gig economy, people are being — the gig economy, people are being basically told that they have — being basically told that they have to — being basically told that they have to give over dna and fingerprint style data, sometimes it is literally fingerprints and increasingly facial— fingerprints and increasingly facial recognition, just to get their— facial recognition, just to get their pay— facial recognition, just to get their pay packet at the end of their pay packet at the end of the month. we have to be careful— the month. we have to be careful about what that data is used _ careful about what that data is used for — careful about what that data is used for. they have controls over— used for. they have controls over it. _ used for. they have controls over it. do _ used for. they have controls over it, do they have a choice. it over it, do they have a choice. it even — over it, do they have a choice. it even happens in schools now
9:55 pm
that children are given facial recognition to get their school lunch.

my boss is watching me... we are out of time. i need grumpy face recognition: when you're with your spouse or partner, it goes "single", "smiled"... we do not want bust-ups here, we need one of those. get in touch if you have thoughts on what we discussed. that's it for this week. as i like to remind you each week, if you enjoyed tonight's show you can catch up on the back catalogue on our youtube channel, ai decoded. if you work in ai, do get in touch. if you have got thoughts for a programme, we would love to hear from you. we will do this again same time next week. thanks for watching.

hello there, good evening. the month of september can often be quite a turbulent one, the transition from summer into autumn, and the final full week of september has been just that. look at thursday's rain: a lot of heavy rain drifting out of northern england and northern ireland. this darker blue here, a line of torrential, thundery showers with some hail in there as well.
9:56 pm
already, still with days to spare, woburn in bedfordshire has had its wettest september on record, but it's also had its wettest month ever on record, and we've got an amber weather warning which will remain in force for the next few hours, so we're just going to continue to add to those rainfall totals. a line of heavy rain will move its way south out of the midlands, down into southern england and south east england. behind it, the wind direction swings around to a northerly. a few scattered showers with elevation could turn a little wintry as well, as it turns colder from the north. so this weather front slowly eases away during the day this friday, and then behind it that colder air starts to tuck in: a real noticeable difference to the feel of our weather story. so yes, it will be a wet start across east anglia and south east england for a time, that rain clearing perhaps away from the kent coast by lunchtime, and then quite an improvement, actually, some sunshine coming through. a few scattered showers, but a brisk northerly wind making it feel quite cool out there.
9:57 pm
eight to 14 degrees, below par for this stage of the month. now, as we move out of friday into the weekend, we start with this ridge of high pressure, a quiet start to the weekend, but there's more wet weather to come as we move into sunday. with that high pressure, we could have quite a chilly start first thing on saturday morning: temperatures in sheltered glens of scotland and northeast england down below freezing, so a frost is not out of the question here. we will continue to see the wind direction swinging around to more of a northwesterly, which will drive in showers on saturday across exposed west coasts, and one or two running down through the cheshire gap as well. top temperatures on sunday at around 15 degrees, the best of the sunshine in sheltered southern and eastern areas. then, as we move out of saturday into sunday, here's that rain, and it means an unsettled start to next week, slowly improving.
9:59 pm
10:00 pm
she did carry out intimate checks. the metropolitan police say that 19 women came forward over two decades with serious sexual offence allegations against al fayed. benjamin netanyahu arrives in new york, as pressure grows at the united nations for a ceasefire between israel and hezbollah in lebanon. but despite those calls, it's been another day of heavy cross—border fighting, including a fresh strike here in the capital beirut. i'll be live with the latest. more weather warnings for parts of the country, as heavy rain sweeps in. and bouncing back — how basketball has overcome funding problems to become one of the uk's most popular sports. living in temporary accommodation.