
tv   Click  BBC News  April 25, 2021 12:30pm-1:01pm BST

12:30 pm
india has recorded, in the last 24 hours, a world record number of daily coronavirus infections for a fourth day in a row. the indonesian military say that the submarine that went missing last week has been found split into three pieces and that all 53 crew on board are dead. on saturday, the navy said the vessel had sunk in the bali sea and that they'd discovered debris including personal items. the iraqi interior ministry now says at least 82 people have died and more than 100 others have been injured in a devastating fire at a hospital in baghdad treating coronavirus patients. the blaze — reportedly caused by an exploding oxygen tank — tore through an intensive care ward with patients on ventilators, unable to move. england's biggest football teams and the sport's governing bodies will stage a four—day boycott of social media from next friday to campaign for social media companies to crack down on online abuse.
12:31 pm
i will be back here at one o'clock with the latest news for you. first, click explores how biases built into everyday technologies impact people's lives and how the tech industry is trying to address it. this week: they are biased. they discriminate. they are racist. they are the algorithms. hey, welcome to click! we're going to start with a quiz this week. we're going to play a game of guess the famous face.
12:32 pm
using possibly the freakiest faces that you've seen in a while. they are pretty disturbing, aren't they, lara? they certainly are. although i have to say that the end result is not quite as scary as the process of actually making them. ok, so this may very well be something that you cannot unsee, but here we go! ok. who is this? and who's this? and who is this? crikey! right, well, while you ponder, let me tell you that this is what happens when you ask an ai to generate fake faces based on other faces. are you ready? here come the answers. the first one is a blend of lara and me. so odd! i think it is more you than me, to be honest. i think it is more you than me! really? i don't know. ok, the next one is chris fox and omar mehtab.
12:33 pm
and this is our oz and our kitty. goodness. i think the really odd bit is actually seeing the progression from one person into another. now, this is a really weird, fun thing that has come out of the really serious issue that we are going to be talking about for the rest of the programme, and that is the fact that computers have got much, much better at recognising faces — but not all faces. especially faces that aren't white. 2020 highlighted many inequalities in how we treat each other as humans. inequalities in who could afford to shelter from the virus and who had no choice but to physically go to work. and inequalities in how we are treated by the authorities. the killing of george floyd, the protests that followed and this week's conviction of derek chauvin have reminded us all that racism still exists in our societies and we all need to work together
12:34 pm
to truly root it out. and it is against this backdrop that we are going to be looking at biases in technology — an industry which has often been criticised for coding our prejudices into its products. and systems recognising faces definitely fall into this category. these technologies are now being used in many applications, and it's well documented that they don't work as well for everyone. so i spoke to somebody who fell foul of one of these systems a couple of years ago when she was trying to get a new passport. cat hallam was using the home office's new tool to check and verify the photo was up to the job. i noticed that it was multiple attempts that i'd been trying. it could not recognise my features on my face. the first thing that it mentioned was it was looking for my eyes, it was looking for the outline of — it couldn't see the outlines of my mouth. i also held back my hair,
12:35 pm
so i tried doing that as well, just to be able to see if i could be able to assist the camera to take the photograph of me properly, or upload it properly. cat, who works as a technologist, knew that what she'd experienced was an ongoing issue, and tweeted about it. i'm glad that the response did go out because it obviously showed that there had been a problem there, detecting that with especially ethnic minorities. two years on, ethnic minorities still face similar issues. the home office says: how does this make you feel, as somebody with dark skin, where you possibly don't feel like this has been designed properly to cater for you?
12:36 pm
there is a broad spectrum of people with various skin tones, ranging from very light individuals who are from an ethnic minority background to people who are very, very dark skinned, and that needs to be taken into consideration. it is actually about individuals or software companies that are putting out software that is not ready for the mass market. that was cat hallam. now, facial recognition is powered by artificial intelligence, a technology — possibly the technology — which is going to impact our lives more than any other in the next decade. it will underpin the self—driving cars that need to recognise pedestrians, the decisions that officers will take when policing our streets, the selection of jobs that we may or may not be offered and, very probably, the next vaccine we develop too.
12:37 pm
so it needs to get things right, right? well, craig langran from the bbc radio programme people fixing the world has been looking at how we can create facial recognition systems that work for everyone. facial recognition is slowly seeping into everything we do. while it can be a convenient way of interacting with tech for some, it has been creating problems for others. you know those passport gates at the airport? well, the thing is for me, they often don't work as well as they should. sometimes it can take me several goes before i finally get through, and sometimes they don't even seem to work at all. it's a little bit annoying and i'm never quite sure what is going on, or why. but at least the issue at the gates is not affecting my livelihood.
12:38 pm
i travelled to meet sahir — he wanted to remain anonymous, so we've given him a different name. he lost his job overnight. uber introduced this face—recognition technology. they sent messages out, saying that "occasionally we are going to ask you to take a selfie of yourself". they compare against the profile picture you have got on there. so it is quite dark and i had a cap on and i took a selfie and i got an e—mail about 11 o'clock, saying "your account has been deactivated due to not being recognised. we have chosen to end our partnership with you. i hope you understand this." you know, i was shocked at the time — i did not know what to do. sahir has since sent dozens of messages to uber eats, and he has got a similar generic response each time. he asked for his case to be reviewed and whether it would be acceptable if the mcdonald's manager where he usually picks his deliveries up from could vouch for him. he got nowhere. so all of these messages were sent by you? mmm—hmm. and that's the first message that got back? yeah, but as you can see,
12:39 pm
it's a generic message. uber says it believes the picture provided to its system... an assertion sahir denies. when he pleaded with the company to be able to appeal the decision and asked for a review of his file, uber told him the decision was final. five months on, sahir is still waiting for a response to his letter that he sent to holland, where uber's appeal team is based. since we have taken on his case, uber has agreed to share the image captured on the night of the incident. so 22 of our members have been dismissed by uber eats for substitution, and that is through a facial recognition software system that uber eats use. of that figure, 12 of them were bame and four of those were from a brazilian—portuguese heritage. we've since spoken to another
12:40 pm
union who have said that their members, too, have faced similar issues. so basically, uber launched a new system to check the real—time id. they have blocked my account for — on that time. it's very bad for me because i have no other source of income. in a statement, uber said: these stories show us just how crucial it is to get this right. these systems have got to be robust and they've got to work for everyone. a key part of the problem is often within the data sets these algorithms have been trained on. they are often built from images scraped from the web — images
12:41 pm
of celebrities in the media or pictures on social networks. even our data sets predicting a wedding dress will show, you know, the western christian wedding as the prediction for that, because it's very heavily influenced by western media. so as a result of that — of course, western media, it is not very diverse, does not feature people of colour in a lot of tv shows. so now some companies are hoping that ai itself could just be the answer to solving the problem, using the might of something called generative adversarial networks, or gans for short. to see how it works, we have embarked on a little experiment. these are the faces of the click team which we have fed directly into an off—the—shelf gan software from nvidia. on the right is the image of the person the software already knows and we are on the left. the algorithm starts by comparing the facial features it knows to the new
12:42 pm
image that it's looking at. after hundreds of iterations, it's able to work out what makes that face look the way it does — at least mathematically, anyway. once we have this digital replica, we can start playing around with different features. we can fiddle with age, ethnicity and mix faces, but most importantly, create people that don't exist at all. this technology is used to create large databases of fake faces, which are then used to train facial recognition systems. but creating faces in this way is not enough. if you want to create something that works and treats everyone more equally, the real images you feed into the gan have to be representative of life. after all, we would not be looking straight into a camera in the real world. this is a photo shoot by generated photos in the us, a company specialising in creating gan images. it spent several months taking pictures of thousands of people.
12:43 pm
these models were specifically chosen for their diversity, but they are also being captured doing all sorts of things. so do you think that gans can totally eliminate bias in an area like facial recognition? totally is probably a very strong term, but i think they can mitigate significantly, yes, i think so. i would say that if you do collect more real data, if you are able to do that, then you should do that. this technique should optimise how these systems work, but what is more important are the people behind the code.
12:44 pm
the reality is that technology always reflects the biases that exist in society. and just changing facial recognition tools to make them more accurate isn't going to change that. how did it feel back in october when you realised that you were going to lose your livelihood? it was horrible, i can't even explain it. sleepless nights, things were going on in my head. "what am i going to do now, how am i going to survive?" stories like sahir's show us just how important it is to have people on the other side that you can easily talk to and reason with. ethical debates around how these technologies are used and deployed need to continue, and life—impacting decisions shouldn't be left to machines alone. hello and welcome to the week in tech. it was the week apple unveiled all—new ipads and a spectrum of imacs, both built around its newest m1 chip.
12:45 pm
much—trailed airtag item trackers also launched at the spring loaded event. and slipped between product launches came confirmation of an ios update that requires users to opt into ad tracking. with the default set to off, this could hit revenues for companies and developers like facebook. a tesla 2019 model s crashed and killed two people, with police believing there was nobody present in the driver's seat. the firm's founder elon musk tweeted that data recovered so far showed the car's autopilot driver assistance system was not enabled. instagram announced it would let users filter out abusive messages. the move follows footballers speaking out about experiencing harassment through direct messages. nasa flew a mini helicopter on mars in the first powered controlled flight of its kind. the autonomous drone hovered for nearly 40 seconds.
12:46 pm
and how do you follow a helicopter on mars? with a dna robot shaped like an aeroplane. where previously it took days to design these tiny devices, new software has helped scientists develop minuscule structures, complete with rotors and hinges, in just minutes. looks like that tech's taking off. ain't i a woman? face by face... earlier in the programme we saw how algorithmic bias is impacting people's lives in a very real way. joy buolamwini is an ai researcher at mit, and she has spent the last four years trying to raise awareness of the social implications and harms of ai. as i make whimsical systems to paint walls with our smiles or project inspirations on faces, but at times... i'm invisible. in 2016, joy founded
12:47 pm
the algorithmic justice league, which was inspired by her own experiences of facial recognition, and she is now the star of a brand—new documentary on netflix called coded bias. it is not just face classification, it is any data—centric technology. computer: i am making predictions for your life right now. one of the first questions we should be asking is: is the technology necessary in the first place, are there alternatives? and after we ask that, if the benefits outweigh the harms, we also need to do something that i consider "algorithmic hygiene", right. and so with algorithmic hygiene, you are actually checking: who do the systems work for and who don't they work for? there is actually continuous oversight for how they are used, because you would not just floss once or take
12:48 pm
a shower once and think you're ok. when these systems are in the real world, we have to see how they are actually being used. so what about the solution we talked about earlier, making up for the lack of ethnically diverse training data by generating fake faces for the ai to learn from? when we're thinking about how systems are being trained, and how systems are being evaluated, do the evaluation methods actually reflect the real—world conditions? because if they don't, we can give ourselves a false sense of progress, which, in the research i have done, has been very characteristic of the field, right? we take a limited data set, we make major claims about that limited data set, that often doesn't reflect real—world conditions. do you know whether there are, or used to be, any genuine technical reasons why cameras couldn't pick out the details in dark skin as well as they could in light skin? cameras are often looked at as being objective,
12:49 pm
but default settings generally reflect the priorities of the people who are creating a specific technology. if we look in the analogue space, we saw that with kodak, for example, they actually changed the chemical composition of their film when chocolate companies and furniture companies complained that you can't see the difference between my dark chocolate and my milk chocolate, or the fine grain in the mahogany. and so where this is fascinating to me is what happens in the analogue, right, is then replicated in the digital space. that's not to say we don't have differences in skin reflectance and so forth, it is to say we can choose to develop systems that account for those differences, or not. do you think these technology companies are serious about removing racial bias, or do you think they are just paying lip service?
12:50 pm
yes, companies now are saying they want to have responsible ai, they want to have ethical ai, they can be well—intentioned but we still have to look at the impacts of the products that they are creating, and so when we see researchers being dismissed for pointing out problems, or when we see problems being minimised, then we can see that we can't just listen to what companies are saying, we have to watch what they are doing. that was joy buolamwini. and let's talk about those companies now, from a different perspective. as the pandemic eases, there will inevitably be more people looking for jobs. over the last few years, more and more recruitment companies have been using ai to match positions with people. and it may be that cvs, references, past salaries and academic grades are not the best way to judge suitability. now some companies have been using other tools to try and uncover hidden talent.
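joy's earlier idea of "algorithmic hygiene" — checking who the systems work for and who they don't work for — can be made concrete with a disaggregated evaluation: measuring accuracy per demographic group rather than in aggregate. the group names and counts below are invented purely for illustration, not real audit figures:

```python
from collections import Counter

# hypothetical audit log for a face-verification system: each entry
# is (demographic_group, was_the_prediction_correct).
audit_log = (
    [("group_a", True)] * 95 + [("group_a", False)] * 5 +
    [("group_b", True)] * 65 + [("group_b", False)] * 35
)

def disaggregated_accuracy(log):
    """accuracy per group, plus the gap between best and worst group."""
    totals, correct = Counter(), Counter()
    for group, ok in log:
        totals[group] += 1
        correct[group] += ok
    per_group = {g: correct[g] / totals[g] for g in totals}
    gap = max(per_group.values()) - min(per_group.values())
    return per_group, gap

overall = sum(ok for _, ok in audit_log) / len(audit_log)
per_group, gap = disaggregated_accuracy(audit_log)
print(f"overall accuracy: {overall:.2f}")  # 0.80 looks respectable...
print(per_group)                           # ...but hides the disparity
print(f"gap between groups: {gap:.2f}")
```

the point of the sketch: an aggregate figure of 80% masks a 30—point gap between groups, which is exactly the kind of disparity that only shows up when results are broken down by who the system is serving.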
12:51 pm
22—year—old jess, who left school after her gcses, feels that the skills that she has are being missed in job interviews, as there is too much focus on qualifications rather than ability. i have a history in hospitality, which has obviously been hard—hit with covid, and so i spent a good few months searching and applying for hundreds of jobs, with no success. so she joined a scheme that, as part of the recruitment process, uses ai to gamify the assessment of someone's personality and behaviour traits, to match them to a job that would really suit them. the last question i remember vividly — we had to try and crack the code, so there would be a certain amount of numbers in the middle of a wheel and you had to try
12:52 pm
and click it when it hovered over the number, and it got faster and faster each time, it was an impossible thing. i actually found out in the end that they just wanted to test how long you kept it up for, kept trying, so that was interesting. the arctic shores platform looks at thousands of data points. it's ai, comparing results to those of a larger dataset, aiming to push users in the direction of a job they will do well in, while hopefully overcoming any potential bias or stilted job interviews. so there is like, abstract thinking and detailed thinking, and i scored all the way to abstract, which is apparently rare, so. i feel like abstract thinking in itself is never really looked at as a strength, or at least it's hard to sell it in an interview. cognitive diversity is something that businesses are starting to consider a bit more, so i think it is great to have this game that kind of picks up on that. even if you are not aware of it in yourself, it can be picked up on.
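comparing one player's results against a larger reference dataset, as described above, is essentially norm—referenced scoring. the sketch below is a generic illustration of that idea, not arctic shores' actual method; the scores and the trait being measured are made up:

```python
import statistics

# hypothetical norm-referenced scoring: a candidate's raw game score
# is placed relative to a reference sample of previous players.
norm_sample = [42, 55, 47, 61, 50, 58, 44, 53, 49, 56]
candidate_score = 60

mean = statistics.mean(norm_sample)
stdev = statistics.stdev(norm_sample)
z = (candidate_score - mean) / stdev   # standard score vs the norm group
beaten = sum(s < candidate_score for s in norm_sample)
percentile = 100 * beaten / len(norm_sample)

print(f"z-score {z:.2f}; ahead of {percentile:.0f}% of the reference sample")
```

a real platform would do this across thousands of data points and many traits at once, but the principle is the same: the raw score means little until it is located within the distribution of everyone else's.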
12:53 pm
it's a really great feeling to be working with other people that have more traditional qualifications, and went through more traditional means of interviewing. i also tried the process myself to see how it feels. "so skyrise corporation is advertising their office spaces to potential companies that are visiting the building. you must inflate 45 balloons for their event taking place later today." i reckon i can get three. oh, no...that one went quicker. they're not the same size. as well as testing your logic, it's a game of risk, showing how averse you are to it. something there isn't a right or wrong answer to, it just shows that you may be more suited to some roles than to others. my job doesn't actually depend on this, does it? neuroscientists know what can be gleaned from these tasks. the process wasn't pressure—free either, but apparently that's also part of the test. so how did i fare?
12:54 pm
you were very measured in the risk you take, so you are less likely to be impulsive... oh, yeah, i'm not impulsive at all, that's actually very accurate. interesting in an interview situation, where you are having to absorb information that is being given to you while, at the same time, processing, "right, what is being said, how am i going to respond to it, how is it relevant to the discussion?" that is a lot of information you are holding at any one time in order to be able to do that successfully. so it shows that you are in the right role... if you ever needed telling that, lara, then it's confirmed here. oh, that was pretty intense, you really don't want to be interrupted by a child or a housemate. laughs so let's see, you're risk—averse and you can hold a lot of information in your head, i have to say that is you — and come to think of it, that's me too. what a coincidence! we must both be in the right job then.
12:55 pm
well, there you go, and long may it continue. part of the job of course now involves saying goodbye, so could you offload all that information that you have been holding in your head, please? yes, as ever you can find the team on social media, on youtube, instagram, facebook and twitter at @bbcclick. thanks for watching and we'll see you soon. bye bye. hello. for the vast majority it is turning into yet another dry and mostly sunny day. but with cool easterly winds developing, especially across england and wales, it is feeling cooler compared with yesterday. high pressure is in charge of the scene, giving dry and settled weather. winds around high pressure flow in a clockwise direction, which is pulling some rather cool
12:56 pm
air in across some parts of england and wales. the easterly wind is also pulling in some cloud, rolling in across eastern parts of england, some of it getting across into the midlands and east wales, tending to break up as it goes. there will still be sunny spells. the odd shower popping up across high ground in highland scotland. any showers which do develop, quite slow—moving. the winds will be light, stronger winds further south. quite gusty in the south—west of england and the channel islands. the lowest temperatures will be on the east coast, the highest temperatures will be in shelter further west. 16 for cardiff, 17 for liverpool and glasgow, maybe 17 in the western counties of northern ireland as well. this evening and overnight it will stay dry for the majority. but we will bring more of this cloud in across england, getting into wales. at the other end of the uk, more cloud into north—west scotland. in this clear slot, here, southern scotland, the far north of england, northern ireland, that is where we are most likely to see a touch of frost. if you are waiting for rain,
12:57 pm
this weather feature is going to bring some over the next few days, bringing some rain on monday across scotland. quite heavy bursts of rain, actually. could see the odd shower breaking out in northern ireland and maybe the north of england. further south, another predominantly sunny day. still quite windy through the channel islands, but the wind easing a touch for southern england and the south of wales. highest temperatures for parts of england and wales at 16 degrees. turning a bit chillier to the north—west. if you are waiting for rain down in the south, on tuesday, some of the wet weather will migrate southwards. it will break up into showers. quite hit and miss. some places will fall through the gaps and stay completely dry. warmer in the south—east, turning chillier to the north. the cooler air will spread southwards through the middle of the week. one or two showers but still quite a lot of dry weather around. a similar story in the north. any showers on the high ground in scotland could start to turn wintry.
12:58 pm
12:59 pm
1:00 pm
this is bbc news with the latest headlines. england's biggest football teams — and the sport's governing bodies — will stage a four—day boycott of social media from next friday — to campaign for social media companies to crack down on online abuse. india's prime minister narendra modi says the surge in coronavirus cases has shaken the nation — his comments come as the country hits a record number of new cases for the fourth day in a row. a campaign's being launched to encourage younger people to get the covid vaccine when their turn comes. figures show more than half the uk population has now received a first dose of the jab. the indonesian military say that the submarine that went missing last week has been found split into three pieces and that all on board are dead.
1:01 pm
a fire sweeps through a hospital treating coronavirus patients in baghdad, killing at least 82 people according to iraqi officials.
