Click, BBC News, April 24, 2021, 12:30pm-1:01pm BST
12:30 pm
hello this is bbc news with annita mcveigh. the headlines: hospitals in india struggle with overwhelming demand for beds, ventilators and oxygen — as it records the world's highest—ever daily rise in coronavirus infections for the third day in a row. the prime minister's former adviser, dominic cummings, has questioned his "competence and integrity". boris johnson has rejected the claims, which allege that he planned for donors to pay for the renovation of his flat, and considered trying to block an inquiry into a leak last year in case it involved a friend of his fiancee. indonesian rescue teams recover debris that is believed to be from the submarine that went missing off the coast of bali on wednesday. medical experts are recommending that people who lose their sense of smell — due to covid—19 — be offered smell training rather than being treated with steroids. now on bbc news, click.
12:31 pm
this week: they are biased. they discriminate. they are racist. they are the algorithms. hey, welcome to click! we're going to start with a quiz this week! we're going to play a game of guess the famous face! using possibly the freakiest faces that you've seen in a while. they are pretty disturbing, aren't they, lara? they certainly are. although i have to say that the end result is not quite as scary as the process of actually making them. ok, so this may very well be something that you cannot unsee, but here we go!
12:32 pm
ok. who is this? and who's this? and who is this? crikey! right, well, while you ponder, let me tell you that this is what happens when you ask an ai to generate fake faces based on other faces. are you ready? here come the answers. the first one is a blend of lara and me. so odd! i think it is more you than me, to be honest. i think it is more you than me! really? i don't know. ok, the next one is chris fox and omar mehtab. and this is our oz and our kitty. goodness. i think the really odd bit is actually seeing the progression from one person into another. now, this is a really weird, fun thing that has come out of the really serious issue that we are going to be talking about for the rest of the
12:33 pm
programme, and that is the fact that computers have got much, much better at recognising faces — but not all faces. especially faces that aren't white. 2020 highlighted many inequalities in how we treat each other as humans. inequalities in who could afford to shelter from the virus and who had no choice but to physically go to work. and inequalities in how we are treated by the authorities. the killing of george floyd, the protests that followed and this week's conviction of derek chauvin have reminded us all that racism still exists in our societies and we all need to work together to truly root it out. and it is against this backdrop that we are going to be looking at biases in technology — an industry which has often been criticised for coding our prejudices into its products. and systems recognising
12:34 pm
faces definitely fall into this category. these technologies are now being used in many applications, and it's well documented that they don't work as well for everyone. so i spoke to somebody who fell foul of one of these systems a couple of years ago when she was trying to get a new passport. cat hallam was using the home office's new tool to check and verify the photo was up to the job. i noticed that it was multiple attempts that i'd been trying. it could not recognise my features on my face. the first thing that it mentioned was it was looking for my eyes, it was looking for the outline of — it couldn't see the outlines of my mouth. i also held back my hair, so i tried doing that as well, just to be able to see if i could be able to assist the camera to take the photograph of me properly, or upload it properly. cat, who works as a technologist, knew that what she'd experienced was an ongoing issue, and tweeted about it. i'm glad that the response did go out because it obviously showed that there had been a problem there,
12:35 pm
detecting that, especially with ethnic minorities. two years on, ethnic minorities still face similar issues. the home office says: how does this make you feel, as somebody with dark skin, where you possibly don't feel like this has been designed properly to cater for you? there is a broad spectrum of people with various skin tones, ranging from very light individuals who are from an ethnic minority background to people who are very, very dark skinned, and that needs to be taken into consideration. it is actually about individuals or software
12:36 pm
companies that are putting out software that is not ready for the mass market. that was cat hallam. now, facial recognition is powered by artificial intelligence, a technology — possibly the technology — which is going to impact our lives more than any other in the next decade. it will underpin the self—driving cars that need to recognise pedestrians, the decisions that officers will take when policing our streets, the selection of jobs that we may or may not be offered and, very probably, the next vaccine we develop too. so it needs to get things right, right? well, craig langran from the bbc radio programme people fixing the world has been looking at how we can create facial recognition systems that work for everyone. facial recognition is slowly
12:37 pm
seeping into everything we do. while it can be a convenient way of interacting with tech for some, it has been creating problems for others. you know those passport gates at the airport? well, the thing is for me, they often don't work as well as they should. sometimes it can take me several goes before i finally get through, and sometimes they don't even seem to work at all. it's a little bit annoying and i'm never quite sure what is going on, or why. but at least the issue at the gates is not affecting my livelihood. i travelled to meet sahir — he wanted to remain anonymous, so we've given him a different name. he lost his job overnight. uber introduced this face—recognition technology. they sent messages out, saying that "occasionally we are going to ask you to take a selfie of yourself". they compare against the profile picture you have got on there. so it is quite dark and i had
12:38 pm
a cap on and i took a selfie and i got an e—mail about 11 o'clock, saying "your account has been deactivated due to not being recognised. we have chosen to end our partnership with you. i hope you understand this." you know, i was shocked at the time — i did not know what to do. sahir has since sent dozens of messages to uber eats, and he has got a similar generic response each time. he asked for his case to be reviewed and whether it would be acceptable if the mcdonald's manager where he usually picks his deliveries up from could vouch for him. he got nowhere. so all of these messages were sent by you? mmm—hmm. and that's the first message that got back? yeah, but as you can see, it's a generic message. uber says it believes the picture provided to its system: an assertion sahir denies. when he pleaded with the company to be able to appeal the decision and asked for a review
12:39 pm
of his file, uber told him the decision was final. five months on, sahir is still waiting for a response to his letter that he sent to holland, where uber's appeal team is based. since we have taken on his case, uber has agreed to share the image captured on the night of the incident. so 22 of our members have been dismissed by uber eats for substitution, and that is through a facial recognition software system that uber eats use. of that figure, 12 of them were bame and four of those were from a brazilian—portuguese heritage. we've since spoken to another union who have said that their members, too, have faced similar issues. so basically, uber launched a new system to check the real—time id. they blocked my account at that time. it's very bad for me because i have no other source of income. in a statement, uber said:
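The check described in this segment, comparing a driver's selfie with the profile photo on file, is usually built on face embeddings and a similarity threshold. The sketch below is a minimal illustration of that general approach, not Uber's actual pipeline; the embeddings are assumed to come from some trained face model, which is not shown here.

```python
# A minimal sketch of selfie-vs-profile face verification, assuming the
# general embedding-plus-threshold approach. This is NOT Uber's actual
# pipeline; the embedding vectors are assumed to come from a trained
# face model such as FaceNet or ArcFace, which is outside this sketch.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify_selfie(selfie_emb: np.ndarray, profile_emb: np.ndarray,
                  threshold: float = 0.6) -> bool:
    """Accept the selfie only if it is 'close enough' to the stored photo.

    This is where failures like Sahir's can happen: poor lighting, a cap,
    or an embedding model trained mostly on lighter-skinned faces all push
    the similarity score down, so a legitimate driver can fall below the
    threshold and be rejected automatically.
    """
    return cosine_similarity(selfie_emb, profile_emb) >= threshold
```

An automated rejection at a threshold like this is exactly the kind of life-impacting decision the programme argues should not be left to a machine without a human route of appeal.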
12:40 pm
these stories show us just how crucial it is to get this right. these systems have got to be robust and they've got to work for everyone. a key part of the problem is often within the data sets these algorithms have been trained on. they are often built from images scraped from the web — images of celebrities in the media or pictures on social networks. even our data sets, when predicting a wedding dress, will show, you know, the western christian wedding as the prediction for that, because it's very heavily influenced by western media. so as a result of that —
12:41 pm
of course, western media, it is not very diverse, does not feature people of colour in a lot of tv shows. so now some companies are hoping that ai itself could just be the answer to solving the problem, using the might of something called generative adversarial networks, or gans for short. to see how it works, we have embarked on a little experiment. these are the faces of the click team which we have fed directly into off—the—shelf gan software from nvidia. on the right is the image of the person the software already knows and we are on the left. the algorithm starts by comparing the facial features it knows to the new image that it's looking at. after hundreds of iterations, it's able to work out what makes that face look the way it does — at least mathematically, anyway. once we have this digital replica, we can start playing around with different features. we can fiddle with age, ethnicity and mix faces,
12:42 pm
but most importantly, create people that don't exist at all. this technology is used to create large databases of fake faces, which are then used to train facial recognition systems. but creating faces in this way is not enough. if you want to create something that works and treats everyone more equally, the real images you feed into the gan have to be representative of life. after all, we would not be looking straight into a camera in the real world. this is a photo shoot by generated photos in the us, a company specialising in creating gan images. it spent several months taking pictures of thousands of people. these models were specifically chosen for their diversity, but they are also being captured doing all sorts of things.
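The process described above, hundreds of iterations to find a latent code that reproduces a real face, followed by mixing codes to make new faces, can be sketched roughly as follows. This is a toy illustration under stated assumptions: the programme used NVIDIA's off-the-shelf GAN tooling, whereas the generator G below is a tiny stand-in network so the example is self-contained and runnable.

```python
# A toy sketch of GAN projection and face blending. The real experiment used
# NVIDIA's pre-trained GAN software; G here is a tiny stand-in generator so
# the example runs on its own, and the "photos" are random placeholder tensors.
import torch
import torch.nn as nn

LATENT_DIM, IMAGE_PIXELS = 64, 32 * 32
G = nn.Sequential(                      # stand-in for a real face generator
    nn.Linear(LATENT_DIM, 256), nn.ReLU(),
    nn.Linear(256, IMAGE_PIXELS), nn.Tanh(),
)

def project(target_image: torch.Tensor, steps: int = 500) -> torch.Tensor:
    """The 'hundreds of iterations' step: gradient descent on a latent vector
    until the generator's output matches the target photo, giving the digital
    replica described in the programme."""
    z = torch.randn(1, LATENT_DIM, requires_grad=True)
    optimiser = torch.optim.Adam([z], lr=0.01)
    for _ in range(steps):
        optimiser.zero_grad()
        loss = torch.mean((G(z) - target_image) ** 2)  # pixel-space loss only
        loss.backward()
        optimiser.step()
    return z.detach()

def blend(z_a: torch.Tensor, z_b: torch.Tensor, alpha: float = 0.5) -> torch.Tensor:
    """Mix two projected faces: alpha=0.5 gives the half-and-half blends from
    the quiz, and sliding alpha from 0 to 1 gives the progression from one
    person into another."""
    return G((1 - alpha) * z_a + alpha * z_b)

# Example with flattened 32x32 placeholder "photos" standing in for real images:
z_lara = project(torch.rand(1, IMAGE_PIXELS) * 2 - 1)
z_cohost = project(torch.rand(1, IMAGE_PIXELS) * 2 - 1)
fake_face = blend(z_lara, z_cohost)
```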
12:43 pm
so do you think that gans can totally eliminate bias in an area like facial recognition? totally is probably a very strong term, but i think they can mitigate significantly, yes, i think so. i would say that if you do collect more real data, if you are able to do that, then you should do that. this technique should optimise how these systems work, but what is more important are the people behind the code. the reality is that technology always reflects the biases that exist in society. and just changing facial recognition tools to make them more accurate isn't going to change that. how did it feel back in october when you realised that you were going to
12:44 pm
lose your livelihood? it was horrible, i can't even explain it. sleepless nights, things were going on in my head. what am i going to do now, how am i going to survive? stories like sahir's show us just how important it is to have people on the other side that you can easily talk to and reason with. ethical debates around how these technologies are used and deployed need to continue, and life—impacting decisions shouldn't be left to machines alone. hello and welcome to the week in tech. it was the week apple unveiled all—new ipads and a spectrum of imacs, both built around its newest m1 chip. much-trailed airtag item trackers also launched at the spring loaded event. and slipped between product launches came confirmation of an ios update that requires users to opt into ad tracking. with the default set to off,
12:45 pm
this could hit revenues for companies and developers like facebook. instagram announced it would let users filter out abusive messages. the move follows footballers speaking out about experiencing harassment through direct messages. a tesla 2019 model s crashed and killed two people, with police believing there was nobody present in the driver's seat. the firm's founder elon musk tweeted that data recovered so far showed the car's autopilot driver assistance system was not enabled. nasa flew a mini helicopter on mars in the first powered controlled flight of its kind. the autonomous drone hovered for nearly 40 seconds. and how do you follow a helicopter on mars? with a dna robot shaped like an aeroplane. where previously it took days to design these tiny devices, new software has helped scientists develop minuscule structures, complete with rotors and hinges, in just minutes.
12:46 pm
looks like that tech's taking off. ain't i a woman? face by face... earlier in the programme we saw how algorithmic bias is impacting people's lives in a very real way. joy buolamwini is an ai researcher at mit, and she has spent the last four years trying to raise awareness of the social implications and harms of ai. as i make whimsical systems to paint walls with our smiles or project inspirations on faces, but at times... i'm invisible. in 2016, joy founded the algorithmic justice league, which was inspired by her own experiences of facial recognition, and she is now the star of a brand—new documentary on netflix called coded bias. it is not just face classification, it is any data—centric technology.
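One concrete form of the check Joy goes on to describe, asking who a system works for and who it does not work for, is a disaggregated evaluation: report error rates per demographic group rather than a single headline accuracy. Here is a minimal sketch, assuming a labelled test set whose records carry self-reported group labels; the field names and groups are illustrative, not taken from any real benchmark.

```python
# A minimal sketch of a disaggregated evaluation: error rate per group
# instead of one overall accuracy figure. The record structure and group
# labels below are illustrative assumptions, not from any real benchmark.
from collections import defaultdict

def error_rates_by_group(records):
    """records: iterable of dicts with 'group', 'prediction' and 'label' keys."""
    totals, errors = defaultdict(int), defaultdict(int)
    for r in records:
        totals[r["group"]] += 1
        errors[r["group"]] += int(r["prediction"] != r["label"])
    return {group: errors[group] / totals[group] for group in totals}

# Tiny worked example; a real audit needs a large, representative sample
# per group before the per-group numbers mean anything.
test_set = [
    {"group": "darker-skinned women", "prediction": "no match", "label": "match"},
    {"group": "darker-skinned women", "prediction": "match", "label": "match"},
    {"group": "lighter-skinned men", "prediction": "match", "label": "match"},
    {"group": "lighter-skinned men", "prediction": "match", "label": "match"},
]
print(error_rates_by_group(test_set))
# {'darker-skinned women': 0.5, 'lighter-skinned men': 0.0}
```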
12:47 pm
computer: i am making predictions for your life right now. one of the first questions we should be asking is: is the technology necessary in the first place? are there alternatives? and after we ask that, if the benefits outweigh the harms, we also need to do something that i consider "algorithmic hygiene", right. and so with algorithmic hygiene, you are actually checking: who do the systems work for, and who don't they work for? there is actually continuous oversight for how they are used, because you would not just floss once or take a shower once and think you are ok. when these systems are in the real world, we have to see how they are actually being used. so what about the solution we talked about earlier, making up for the lack of ethnically diverse training data by generating fake faces for the ai to learn from? when we are thinking about how
12:48 pm
systems are being trained, and how systems are being evaluated, do the evaluation methods actually reflect the real world conditions? because if they don't, we can give ourselves a false sense of progress, which in the research i have done has been very characteristic of the field, right? we take a limited data set, we make major claims about that limited data set, that often doesn't reflect real—world conditions. do you know whether there are, or used to be any genuine technical reasons why cameras couldn't pick up the details in dark skin as well as they could in light skin? cameras are often looked at as being objective, but default settings generally reflect the priorities of the people who are creating a specific technology. if we look in the analogue space, we saw that with kodak for example, they actually
12:49 pm
changed the chemical composition of their film when chocolate companies and furniture companies complained that you can't see the difference between my dark chocolate and my milk chocolate, or the fine grain in the mahogany. and so where this is fascinating to me is, what happens in the analogue, right, is then replicated in the digital space. that's not to say we don't have differences in skin reflectance and so forth, it is to say we can choose to develop systems that account for those differences, or not. do you think these technology companies are serious about removing racial bias, or do you think they are just paying lip service? yes, companies are now saying they want to have responsible ai, they want to have ethical ai, they can be well—intentioned but we have to look at the impacts of the products they are creating, and so when we see researchers being dismissed
12:50 pm
for pointing out problems, or when we see problems being minimised, then we can see that we can't just listen to what companies are saying, we have to watch what they are doing. that was joy buolamwini. and let's talk about those companies now, from a different perspective. as the pandemic eases, there will inevitably be more people looking for jobs. over the last few years, more and more recruitment companies have been using ai to match positions with people. and it may be that cvs, references, past salaries and academic grades are not the best way to judge suitability. now some companies have been using other tools to try and uncover hidden talent. 22—year—old jess, who left school after her gcses, feels that the skills that she has are being missed in job interviews, as there is too much focus on qualifications
12:51 pm
rather than ability. i have a history in hospitality, which has obviously been hard—hit with covid, and so i spent a good few months searching and applying for hundreds of jobs, with no success. so she joined a scheme that, as part of the recruitment process, uses ai to gamify the assessment of someone's personality and behaviour traits, to match them to a job that would really suit them. the last question i remember vividly — we had to try and crack the code, so there would be a certain amount of numbers in the middle of a wheel and you had to try and click it when it hovered over the number, and it got faster and faster each time, it was an impossible thing. i actually found out in the end that they just wanted to test how long you kept it up for, kept trying, so that was interesting. the arctic shores platform looks at thousands of data points. it's ai comparing results
12:52 pm
to that of a larger dataset, aiming to push users in the direction of a job they will do well, while hopefully overcoming any potential bias or stilted job interviews. so there is like, abstract thinking and detailed thinking, and i scored all the way to abstract, which is apparently rare, so. i feel like abstract thinking in itself is never really looked at as a strength, or at least it's hard to sell it in an interview. cognitive diversity is something that businesses are starting to consider a bit more, so i think it is great to have this game that kinda picks up on that, even if you are not aware of it in yourself, it can be picked up on. it's a really great feeling to be working with other people that have more traditional qualifications, and went through more traditional means of interviewing. i also tried the process myself to see how it feels. "so skyrise corporation is advertising their office spaces to potential companies that are visiting the building.
12:53 pm
you must inflate 45 balloons for their event taking place later today." i reckon i can get three. oh no... they are not the same size. as well as testing your logic, it is a game of risk, showing how averse you are to it. something there is not a right or wrong answer to, it just shows that you may be more suited to some roles than to others. my job doesn't actually depend on this, does it? neuroscientists know what can be gleaned from these tasks. the process wasn't pressure-free either, but apparently that's also part of the test. so how did i fare? you were very measured in the risk you take, so you are less likely to be impulsive... oh, yeah, i am not impulsive at all, that is actually very accurate. interesting in an interview situation, where you are having to absorb information that is being given
12:54 pm
to you while at the same time processing what is being said, how am i going to respond to it, how is it relevant to the discussion — that is a lot of information you are holding at any one time in order to be able to do that successfully. so it shows that you are in the right role... if you ever needed telling that, lara, then it's confirmed here. that was pretty intense, you really don't want to be interrupted by a child or a housemate. laughs. so let's see, you are risk averse and can hold a lot of information in your head, i have to say that is you — and come to think of it, that's me too. what a coincidence! we must both be in the right job then. well, there you go, and long may it continue. part of the job now involves saying goodbye, so could you offload all that information that you have been holding in your head please? yes, as ever you can find the team on social media, on youtube, instagram, facebook and twitter at @bbcclick. thanks for watching and we'll see you soon. bye bye.
12:55 pm
this extraordinarily dry spell of april weather continues and for most of us there is a lot of sunshine to be had, turning a little hazy in places due to some areas of wispy, dry cloud, but through the weekend it will stay dry. it will turn a bit cooler, particularly by tomorrow, but high pressure remains in charge, keeping things largely dry. the winds around its southern flank will be quite brisk, so you will notice the strength of the wind through this afternoon around the beaches of the southern coasts and certainly across the channel islands. there'll be some areas
12:56 pm
of patchy high cloud here and there turning that sunshine hazy, but for most of us it is a fine and sunny afternoon and a relatively warm one, especially across western areas, with the coasts of north devon and cornwall, west wales, north—west england and counties of northern ireland most favoured for the highest temperatures, 18, 19, possibly 20 degrees — but, for many of us, high levels of tree pollen, so do bear that in mind if you are a hay fever sufferer. through this evening and tonight it looks like long clear spells, but generally we will see a bit more cloud developing across the eastern areas, maybe into mid wales. even there, temperatures in many places staying just above freezing, and we are most likely dry through the weekend but turning a bit colder.
12:57 pm
winds coming from the east and they will take temperatures down, highest values for parts of northern ireland, western scotland and north—west england, 17. if you are looking for rain, and i know many people are, this frontal system pushing in from the north will bring some rain across scotland on monday. maybe the odd shower into northern ireland and northern england, some spells of sunshine in the south, temperatures between 11 and 16. some of that rain in the north will push south but it will tend to fizzle and peter out, so southern areas not seeing much in the way of wet weather. generally dry through the weekend, turning cooler.