tv Our World BBC News March 25, 2023 4:30am-5:00am GMT
4:30 am
this is bbc news, the headlines: the american social media influencer and actress gwyneth paltrow has been giving evidence on the fourth day of a trial in which she's being sued over a skiing accident at a resort in utah in 2016. ms paltrow has counter-sued, and testified that she was the victim of the incident. president biden and the canadian prime minister justin trudeau have agreed to deepen their countries' economic cooperation and to stand together against authoritarian regimes, during a state visit to ottawa. king charles's first visit to france as monarch has been postponed, as demonstrations continue across the country against president macron's changes to the country's retirement age. the king's visit to germany later in the week will still go ahead.
4:31 am
now on bbc news, our world. across america, police are increasingly using facial recognition technology to fight crime. can you open the door! we want law enforcement to have more tools at our disposal, not less. images can be fed into a database to search for matches and track people down. but critics argue the use of this technology could be inaccurate. if law enforcement knows how accurate it is, how come they're fighting so hard to keep that information from us when we ask
4:32 am
for it? why won't they share it? i'm james clayton, and i'm investigating whether the police should be using this controversial tech. the technology itself is harmful. it is too dangerous, and that is just a massive threat to civil liberties in this nation. march 2017. andrew conlon is driving with a friend. i think we made about 1.5 miles into a three mile trip. andrew is in the front passenger seat, his friend is driving. and he is probably hitting 80, 90. i am saying, you know, slow down, but it was falling on deaf ears, i don't think he heard me at all. i
4:33 am
basically reached the conclusion that somebody was going to die that night. does your fire extinguisher work? it's out? i can't put the fire out. watch out. the car has hit a tree. the driver was thrown into bushes nearby. he died from his injuries. i'm very fortunate to have walked away from that wreck with as little injury as i did. you have seen pictures of the car, there was not much left of it. in the scheme of things, to have walked away from that wreck is very fortunate. one of the first people on the scene pulled andrew out of the car. he also told police what had happened. did anybody see what happened?
4:34 am
that's not the driver? did you see where the driver went? the driver is the person in the bushes? despite the man's testimony, the police suspect andrew was in fact the driver. there was no way that is the passenger. the passenger side of the vehicle is all smashed in, there's no way the passenger survived that, he's the driver. that guy is the driver. he's saying he's not the driver, but people are already saying that he's the driver. after telling the police what happened, the man left the scene. if this guy was involved in the vehicle, he was driving. do we believe the man who pulled him out? andrew suffered injuries including broken bones in his hand and right hip.
4:35 am
upon being released from hospital, not a whole lot happened. nothing really happened until about 2.5 years later, in november 2019, when i was indicted for vehicular homicide. christopher o'brien was one of the defence lawyers who took on andrew's case. the charge was reckless driving with death. so it is called a vehicular homicide, and it is that he was driving at such speed and so recklessly that he killed the guy in his car. so it is a serious charge? it is punishable by up to 15 years in prison. i don't think the gravity of the situation really hit me because obviously i knew what happened, i was there.
and it never really occurred to me that they would be able to convince a jury that i was actually driving. andrew's only hope was for his legal team to track down the man who pulled him
4:36 am
from the car. this unknown man could prove andrew wasn't driving. we printed up pictures of him off the body cam, handed them out at every shop, every starbucks downtown, put it on social media. and no-one knew who he was. chris and his colleagues convinced a controversial facial recognition company to license their technology and help find the witness. they ran a search on this image from the police body camera footage using artificial intelligence. this ai popped him up in like 3-5 seconds. it was just... pictures just popping up, pop, pop, pop, here he is, this is the guy. it was him, every one. it was wild, it was like hitting the lottery. over four years after the crash, chris and his team finally found their key
4:37 am
witness, vince ramirez. were you surprised he was being charged with vehicular homicide? i was, i was really surprised. because i remember telling them what happened, it's like, wow, i can't believe this was going on, i thought it was case closed, all right, he wasn't the driver. but it was a shock to me, it really was. what do you remember about where he was in the car? i remember he was in the passenger seat, but he was mangled up where the outside definitely had him crushed in, with the seatbelt on, so it took me a while to take him out and get his feet and body over the centre console to get him out of the vehicle. so he was definitely in the passenger seat. yes. we knew at that point we were going to win this case. and within an hour of his giving his testimony and his deposition, the case was dropped.
4:38 am
they called me that afternoon and said, hey, it went as expected and they may drop the case. how did you feel? a huge weight lifted. after spending years facing a potential prison sentence, andrew could finally move on with his life. i am happy that he reached out to me so he can clear his name for something he didn't do. we asked the fort myers police department in florida to comment on andrew's case but they didn't respond.
the facial recognition technology that andrew's lawyers used is new york based clearview ai. the company has been fined over and over again for scraping billions of users' photos in countries like the uk, greece, italy and france.
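the search that found the witness, one body-cam frame compared against a huge photo database, is at its core a nearest-neighbour lookup over numeric face descriptors ("embeddings"). below is a minimal sketch of just that matching step, using random vectors in place of a real face-embedding model; the dimensions, the enrolled database and the planted match are all invented for illustration and are not clearview's actual system.

```python
import numpy as np

# Hypothetical face "embeddings": each face image is reduced to a numeric
# vector so that similar faces map to nearby vectors. Real systems produce
# these with a trained neural network; random unit vectors stand in here
# purely to illustrate the search step itself.
rng = np.random.default_rng(0)
database = rng.normal(size=(10_000, 128))          # 10,000 enrolled faces
database /= np.linalg.norm(database, axis=1, keepdims=True)

# The "probe" image (say, a frame of body-cam footage), embedded the same
# way. It is a noisy copy of entry 4242, so exactly one true match exists.
probe = database[4242] + rng.normal(scale=0.05, size=128)
probe /= np.linalg.norm(probe)

# Cosine similarity of the probe against every enrolled face, then rank.
scores = database @ probe
top5 = np.argsort(scores)[::-1][:5]
print("best candidate:", top5[0], "score:", round(float(scores[top5[0]]), 3))
```

at real scale the ranking is done with approximate nearest-neighbour indexes rather than a full scan, but the principle, rank everyone by similarity and surface the top candidates, is the same.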
4:39 am
clearview is perhaps the most famous and controversial facial recognition system in the world. critics say a controversial start-up poses a new and profound threat to everyone's privacy. a company called clearview ai has the largest database of facial images in the us, larger than the fbi's. does it work? yeah, it works. i'm looking over at new jersey. the company is run by this man. he is the eccentric founder of clearview ai, a privately owned company that promises the most comprehensive image search solutions in the world. i haven't played that for like a few years! i want to find out how many pictures the system could find of me. take a
4:40 am
selfie, take any place you want. as you scroll down there will be photos of that. less similar, or if you see a plus one... this is all me. clearview scours the internet for images. a powerful algorithm compares the size, shape and distance between facial features to find a potential match. i don't know where that is. it's wild how many pictures of me are just out there on the open web. the search finds pictures i have literally never seen before. i'm in the back of someone's profile picture and it still finds me. fascinating. that is once again extraordinary. i don't know where that is, and it has picked me up because i am right in the back there. it doesn't need to be a picture that you uploaded or your friends uploaded, it is just if you are in the back of a picture, you can be found. it is a really accurate technology. how many images is this scraping, how many images are in the database? the database is about
4:41 am
30 billion images. the debate is, some say this is just google for faces and others say this will change privacy as we know it. it's both. i mean, it really is google for faces, that's exactly how it works, but i do think it is a big change in the fact that people can be identified with just a photo. and what we are trying to do is figure out what is the most compelling prosocial use case, and i think for law enforcement and government usage it is a total game changer in the ability to keep all of us safe together. clearview says they have been used by hundreds of law enforcement agencies across the us. but there is no official record of which police forces use this tech. one police force we do know that uses it, though, is miami pd. we are on a ride
4:42 am
along with officer jack peris. what has been reported? it is a bailout, i don't know if it is a stolen vehicle or a robbery vehicle. i want to see how facial recognition could potentially help the cops fight crime. do you still get the adrenaline when you are going to these things? you just become smarter, because i am older and i can... (bleep). someone has reported their car is being stolen. we can do a traffic stop, and he jumped out of the car and ran. dozens of
armed police are circling the area, searching houses for the perpetrator. they believe he may be armed. so the police here think that there might be a person who is hiding in one of these houses, so they are being very careful, they are going to basically take this gate apart before they go in. they are jumping over it
4:43 am
right now. can you open the door? the cops didn't have a photo of the perpetrator, and we didn't find him. this would be an excellent one for facial recognition. if we had those cameras up, and got a picture of the offender, they could go to the victim and show a photo lineup and say, ok, here is a photo lineup, do you see the person that robbed you? oh yeah, this guy, we got off facial recognition. armando aguilar is the head of investigations at miami pd. the force pays clearview for access to their database. we
4:44 am
investigate crimes that are heinous in nature, crimes where sometimes it is difficult to get support from eyewitnesses, support from the public, for many reasons: because of perhaps loyalty to the people carrying out the crimes, or fear of retribution for cooperating with the police. is there any crime that you can't use this for? that we cannot use it for? no, as long as it is a violation of a criminal statute, our detectives and our analysts are allowed to use it. so it is shoplifting all the way up to murder. correct, yes. the bbc can reveal that clearview has been used by american law enforcement nearly a million times. and there are many other systems in use. what happens when it doesn't work?
4:45 am
in 2018, this man walked into a shop in new york and stole some socks before appearing to wave box cutters at a member of staff. caitlin jacklin is the lawyer for the man the police believed was the perpetrator. he wants to remain anonymous, but has agreed for caitlin to speak out on his behalf. because it was a theft plus a weapon, it was charged as first degree robbery. that means if you are convicted of it, it is a pretty hefty prison sentence; my client would have been looking at between five and 25 years for essentially stealing six socks. the main witness was a security guard, sometimes referred to as a loss prevention officer in the us. what we learned was that shortly after the theft happened, a detective from the nypd went and
4:46 am
met with the loss prevention officer and said he wanted to see the surveillance, and took a screen grab of the face of the person that stole the socks, and the detective told the officer, we're going to put this in facial recognition software. essentially what i got was a piece of paper that had a screenshot from a surveillance lens and my client's mugshot that said possible match. they don't do a photo lineup, they don't do any sort of better identification procedure, they just shoot him a text and say, is this the guy? imagine you are the loss prevention officer: he says yes, that's the guy, but if he didn't, he would be telling the detective your software does not work, that's not the person. it is such a suggestive way to do the id. shortly after, my client got arrested. on the day of the robbery, caitlin's client has an alibi. his
4:47 am
son was born that day; he was at the hospital. they both happened within a few hours, so they were not at the exact same time, but in order to believe that you had the right person, you have to believe that on the way to the birth of his child, my client stopped at a big box store to steal a six-pack of socks and then immediately went to the hospital for the birth of his child. it just wasn't him, he did not do it. caitlin's client was sent to jail for five months awaiting trial. he agreed to plead guilty so that he could be released. today he still maintains his innocence. the only way for him to get out and get home to his newborn baby was to take a plea, and that's exactly what he ultimately did. it's also a way for the police and the prosecutors not to really have to investigate or sit with the fact that they might have got it wrong. the nypd told us they had not taken enforcement action based solely
4:48 am
on identification of a possible facial recognition match. it's important to emphasise that clearview ai's technology was not used in this case, but it does throw up a crucial question: how accurate is facial recognition technology? clearview will also say that it's almost 100% accurate, but that's on mugshots, not photos taken outside, like cctv or body cam footage. so we thought we would put it to the test. we took a range of different photos to see whether clearview could find them. we took some pictures, progressively more and more difficult, so we have,
4:49 am
there is one of you with glasses on, one with a mask on looking at the camera. sometimes blurry ones work, sometimes they don't, it's tricky. the algorithm found me in some of the shots but not in others. none of me wearing a mask were found. when they are higher quality there is a higher chance of making it. it depends on the quality of the picture? that's why we have that warning when we say this is a lower quality image, so we take extra care when taking a look at the results. a trained investigator would be able to see this result, and it's up to them to do their research, because if you look at the first folder there is no name associated, you have to click the links and find out more. the police argued that they don't just rely on facial recognition to make an arrest. we treat a match from any of our facial recognition
4:50 am
platforms as a tip. we don't run out and make an arrest based on that tip. detectives log a 40% positive identification rate across all of our facial recognition searches. we're talking about, we got this match, detectives investigated and then reported back to our real-time crime centre that the person that you sent me is a match, was in fact the person, we established probable cause to arrest. a 40% positive identification rate is a very different figure to clearview's claim of almost 100% accuracy. clearview doesn't seem to want its technology tested in court. our view is it will be better if we don't have facial recognition as evidence in court, because the investigators are using other methods to also verify. shouldn't it be interrogated in court, if it is being used to find people? i did testify how
4:51 am
the algorithms work, and what mistakes have been made with facial recognition. none of them were made with clearview; they have come down to poor policing or investigative work, not checking or not having other people look at the results of the facial recognition, just basic stuff. had there been any mistakes from the police? not that we know of. in many places in the us, the police often don't have to disclose whether they even use facial recognition. we have no idea how many people have been arrested because of this technology. a few cities have pushed back. in san francisco, the police's use of facial recognition technology is banned. matthew works for the electronic frontier foundation, which pushed for the ban. part of the problem is not just that it is invasive, wrong, but also
4:52 am
that the police departments themselves are incredibly opaque about how and when it is used and how the company actually works, so whether you are in court or not, and whether you are questioning how the police use it, it's often very hard for defence attorneys or these organisations, concerned citizens, to find out when and how it is being used and how it works. it's hard to know how many people have been victims of mistaken identity, but the cases we do know of almost only involve african-americans. the technology is racially biased. it has been known to misidentify black people in the united states, and also because it is these communities that are most subject to massive amounts of police surveillance. government use of facial recognition is inaccurate, dangerous and a huge problem for civil liberties and rights, and also needs to be changed.
4:53 am
he accepts bias can be a problem but says the use of the technology needs to be regulated, not banned. we engage with lawmakers a lot, it is up to them to decide what the regulation should be. i think racial bias is a really important issue to look at. i am a person of mixed race myself, i am half asian and australian, so when we look at the algorithm, we are thinking about all these ethnicities, southeast asians, asians, african faces, to make sure the algorithm is not biased. and i think there are two parts to bias, bias in the policing world and bias in the algorithms, so we worked really hard to make sure that this algorithm works across all demographics. facial recognition technology may be controversial, but it's undoubtedly part of fighting crime in the future. it's up to lawmakers to strike a balance between the battle for justice and our right to privacy.
4:54 am
hello there. the month of march has been a bit of a roller-coaster, hasn't it? and friday was no exception. look at these contrasting weather conditions — a beautiful afternoon in scarborough, north yorkshire. pleasantly warm as well. different story in wiltshire. in fact, there was just shy of an inch of rain from some torrential and at times thundery downpours that moved through the country. we have actually seen quite an unsettled month for many — some areas seeing double the amount of rainfall — and the month is not out
4:55 am
with more wet weather to come before we move into april. now, as for the start of the weekend, we are going to see further showers, not quite as many as friday, but this little weather front will enhance the showers from time to time. it's going to be a mild start to the day. sunny spells and a few isolated showers during the morning become a little bit more widespread into the afternoon, but there will be some drier, brighter interludes and favoured spots for that is where we had the wettest of the weather, actually, on friday, so across southern and south west england, along with wales, 13 or 14 degrees. a line of showers from that front across east anglia, northern england and into northern ireland. a little bit more cloud but some sunshine into northern scotland but noticeably cooler as that northerly wind starts to kick in — 5—9 degrees here. now, it looks likely that we are going to see some wetter weather, though, from saturday into sunday with this area of low pressure bringing some rain once again into the south—west and so, that brings a bit of a contrast first thing on sunday morning. milder air sitting down to the south—west but eventually
4:56 am
as that rain clears, the cooler northerly flow starts to push further south across the country. so, a grey, potentially wet start across the south first thing on sunday morning, slowly easing away to brighter, sunnier skies. a few wintry showers in the far north of scotland, here sitting in the cooler air with around 4-6 degrees. further south, it will be noticeably cooler but not particularly biting with it. now, it looks likely that that cooler trend stays on monday with plenty of sunshine before more rain arrives for tuesday. and just before i say goodbye, don't forget, as we move into the early hours of sunday morning, it's the start of british summer time. we all lose an hour's sleep but we gain more daylight. take care.
5:00 am
this is bbc world news, i'm vishala sri—pathma. our top stories: more protests are planned across france as fury over president macron's pension battle escalates. from the slopes to the stand: gwyneth paltrow gives evidence in a trial where she's accused of causing a skiing accident — she insists she was the real victim. i was skiing, and looking downhill, as you do, and i was skied directly into by mr sanderson. biden and trudeau talk tough, as the us and canada pledge to stand together against authoritarian regimes. a russian child's drawing against the war in ukraine sparks a police investigation and tears a family apart.