
Shift | Deutsche Welle | March 27, 2021, 7:15am-7:30am CET

7:15 am
Thanks for joining us. More than a thousand years ago, Europe witnessed a huge construction boom. Christianity had firmly established itself, and both religious and secular leaders wanted to display their power, racing to create the tallest, biggest, most beautiful structures. Stone masons, builders and architects competed with each other; this is how massive churches were created. "Contest of the Cathedrals", coming soon on DW.
7:16 am
A Black hand and a white hand, each holding the same device, yet facial and image recognition software may interpret the two photos quite differently. Just how racist is AI? That's our topic today on Shift. One of the key elements of artificial intelligence is that it teaches itself. But to learn, AI needs to be fed with massive amounts of data, and that data is selected by humans. That's the way AI operates: the conclusions it draws depend on the data sets it has to work with.
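To make that concrete, here is a minimal sketch, not from the broadcast, of how a model's conclusions end up depending on its training data: a classifier trained on data in which one group is heavily underrepresented turns out noticeably less accurate for that group. All data and numbers are synthetic and purely illustrative.

```python
# A minimal sketch (not from the broadcast): a classifier is trained on
# data where group B is heavily underrepresented, then its accuracy is
# measured separately for each group.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, shift):
    # Synthetic two-class data; `shift` changes the true decision
    # boundary from group to group.
    X = rng.normal(size=(n, 2))
    y = (X[:, 0] + shift * X[:, 1] > 0).astype(int)
    return X, y

# Group A dominates the training set; group B is barely present.
Xa, ya = make_group(5000, shift=0.2)
Xb, yb = make_group(100, shift=-1.0)
model = LogisticRegression().fit(np.vstack([Xa, Xb]),
                                 np.concatenate([ya, yb]))

# Evaluate on fresh samples from each group.
for name, shift in [("group A", 0.2), ("group B", -1.0)]:
    X_test, y_test = make_group(2000, shift)
    print(name, "accuracy:", round(model.score(X_test, y_test), 3))
# Typical result: near-perfect accuracy for group A, far lower for
# group B -- the model has mostly learned the majority group's boundary.
```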
7:17 am
Here's an example of how things can go wrong. These two nearly identical images were presented to Google's Cloud Vision image recognition software, with shocking results: the thermometer held in the white hand was identified as a monocular. A bit strange, perhaps, but an understandable mistake, you might say. But the same object held in the dark-skinned hand was identified as a handgun. Google has since apologized and has removed the label "gun" from its system. But is that really enough to counter racist assumptions made by its algorithms? Black people and people of color in particular often suffer from discrimination by AI systems, as the new American documentary "Coded Bias" reveals. "During my first semester at MIT, I got computer vision software that was supposed to track my face. It didn't work until I put on this white mask. I'm thinking, all right, what's going on here? Is it the lighting conditions? Is it the angle at which I'm looking at the camera? Or is there something more?"
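For readers curious about the mechanics: requests like the one in this experiment can be made through Google's published Cloud Vision Python client. The sketch below is a plausible reconstruction, not the experimenters' actual code, and the file name is a placeholder.

```python
# An assumed reconstruction of a Cloud Vision label-detection request
# like the one described above. Requires the google-cloud-vision
# package and GOOGLE_APPLICATION_CREDENTIALS to be configured.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

# Placeholder file name for a photo of a hand holding a thermometer.
with open("hand_with_thermometer.jpg", "rb") as f:
    image = vision.Image(content=f.read())

response = client.label_detection(image=image)
for label in response.label_annotations:
    # Each label carries a confidence score; in the experiment, the
    # labels returned differed with the skin color of the hand shown.
    print(f"{label.description}: {label.score:.2f}")
```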
7:18 am
Joy Buolamwini was one of the first computer scientists to reveal that these systems are racist. The documentary, directed by Shalini Kantayya, tells her story. The film explains why AI systems are racist: many of the algorithms used in the software are developed and tested by teams consisting mainly of white men. Their view of the world is reproduced in the software, and they are less likely to notice when it's biased. "Systems are built by teams made up mostly of white people. As a result, the technology ends up less accurate when it comes to people of color." The more diverse the data that's fed into these AI systems, the better they can identify a variety of faces.
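A hedged follow-up to the earlier sketch illustrates that diversity point: when the underrepresented group is given equal weight in the training set, the accuracy gap narrows. Again, the data is synthetic and purely illustrative.

```python
# An illustrative follow-up: give group B equal representation in the
# training data and re-check its accuracy.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, shift):
    # Same synthetic setup as in the earlier sketch.
    X = rng.normal(size=(n, 2))
    y = (X[:, 0] + shift * X[:, 1] > 0).astype(int)
    return X, y

for n_b, label in [(100, "skewed data"), (5000, "balanced data")]:
    Xa, ya = make_group(5000, shift=0.2)
    Xb, yb = make_group(n_b, shift=-1.0)
    model = LogisticRegression().fit(np.vstack([Xa, Xb]),
                                     np.concatenate([ya, yb]))
    X_test, y_test = make_group(2000, shift=-1.0)  # test on group B
    print(f"{label}: group B accuracy {model.score(X_test, y_test):.3f}")
# With equal representation the model compromises between both groups,
# and group B's accuracy improves markedly.
```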
7:19 am
But the bias goes even deeper, and begins even before the data sets are created. It starts with the hardware, which was originally designed for use in cameras and movies, and which was designed with white men in mind. The light sensors, the way the devices are calibrated, the infrared technology developed in recent decades: they were all designed with white skin in mind, and with how light reflects on white skin. They ignore the fact that people have many different skin colors, which also reflect light differently. And that's why tweaking the algorithms isn't enough; a more fundamental change is needed. The people using these technologies are key here, in other words, all of us. At the end of the day, what's being programmed is not the only factor. What also matters is how we interact with the technology and what data we feed into it.
7:20 am
That's what we're missing: the fundamental debate about social inclusion and exclusion, about discrimination, and about how bias permeates our society. Joy Buolamwini is continuing her fight, making her findings public and even appearing before the U.S. Congress. She's also the founder of the Algorithmic Justice League, which is working to create less biased, more equitable artificial intelligence. When it comes to equality, we still have a long way to go, especially when algorithms are involved. Algorithms are also used in online search engines; the most obvious example is Google. Image search helps us navigate the world: if you're looking for information on something, or want to see what someone looks like, you'll probably start by googling them. But a recent experiment conducted by DW shows the algorithms often lead to skewed results.
7:21 am
Searching for women of specific nationalities online is a harsh lesson in stereotypes. Searching for Brazilian women yields young women in suggestive poses, while searching for German women brings up many politicians and athletes. DW used Google's very own Cloud Vision SafeSearch to analyze thousands of images. This image-processing software was designed to spot sexualized content; Google uses it to flag suggestive images. In a search for women from the Dominican Republic and Brazil, up to 40 percent of the results were rated as sexually suggestive. For American women, it's just 5 percent, and for German women, 4 percent. This has a major impact on how women are seen and treated, and women from certain countries are especially vulnerable. Search for Ukrainian women, for example, and 61 of the top 100 results are links to dating sites or websites with sexualized content.
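DW's analysis relied on Cloud Vision's SafeSearch ratings. A sketch of what a single such request might look like with Google's Python client follows; it is an assumed reconstruction rather than DW's actual pipeline, and the file name is a placeholder.

```python
# An assumed reconstruction of a single SafeSearch request of the kind
# DW's analysis used.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("search_result.jpg", "rb") as f:
    image = vision.Image(content=f.read())

annotation = client.safe_search_detection(image=image).safe_search_annotation
# Ratings are likelihood enums from VERY_UNLIKELY to VERY_LIKELY; an
# analysis like DW's would count images rated LIKELY or VERY_LIKELY
# on the "racy" category as sexually suggestive.
print("racy:", vision.Likelihood(annotation.racy).name)
print("adult:", vision.Likelihood(annotation.adult).name)
```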
7:22 am
Algorithms are key to our data-driven and automated world, but the images they deliver can be shockingly biased, racist and sexist. For data scientist Catherine, though, it's not really the algorithms that are to blame: "An algorithm learns because we train it, using machine learning. Whether it's deep learning with a neural network, so to speak, or shallow machine learning, we're feeding it data and we're asking it to form an opinion. And of course, when that opinion is based on data that contains unfair treatment of groups, then the algorithm will learn that unfairness."
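Her point can be shown in a few lines: if the historical decisions behind the training labels treated one group unfairly, a model trained on those labels reproduces that unfairness, even though group membership carries no real information. This is an illustrative sketch, not code from the program.

```python
# An illustrative sketch (not from the program): training labels that
# encode unfair historical decisions are learned and reproduced.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 10000
skill = rng.normal(size=n)          # a legitimate, merit-based feature
group = rng.integers(0, 2, size=n)  # group membership, irrelevant to merit

# Historical decisions: qualified people were approved, except that
# members of group 1 were often rejected regardless of skill.
qualified = skill > 0
biased_label = qualified & ~((group == 1) & (rng.random(n) < 0.5))

X = np.column_stack([skill, group])
model = LogisticRegression().fit(X, biased_label)

# Two identical applicants who differ only in group membership:
applicants = np.array([[1.0, 0], [1.0, 1]])
print(model.predict_proba(applicants)[:, 1])
# The group-1 applicant receives a clearly lower score: the model has
# "formed an opinion" from data that treated that group unfairly.
```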
7:23 am
In China, AI is being used to identify specific ethnic groups. Take the telecom giant Huawei, which took part in testing facial recognition software that could identify faces of the country's Uyghur minority. That's according to IPVM, a U.S.-based company specializing in video surveillance. The software's accuracy rate is estimated at about 70 percent, and it is reportedly being used across China. So is ethnic discrimination being automated? The camera system tested by Huawei is designed to use AI to identify people of specific ethnicities. It was trained on data sets containing people from China's Uyghur minority and non-Uyghurs, to teach it to use facial features to determine ethnicity, along with age and gender. What's even more disturbing is how China plans to use it: when the AI system believes someone is a member of the Uyghur minority, it can trigger a "Uyghur alarm" and notify the police.
7:24 am
In the Xinjiang region of northwestern China, security forces have implemented a massive surveillance network designed to clamp down on the Uyghur minority. Hundreds of thousands of Uyghurs have been detained in so-called reeducation camps. Officially, Chinese companies have denied that the software they are testing is being used for commercial purposes. But one internet giant has already admitted that its cloud division developed discriminatory AI software. Of even greater concern is the fact that it gave customers step-by-step instructions for how to use the software to target Uyghur Muslims. Paired with mobile phones and surveillance cameras, the software enables users in China to send images of people to its cloud service, to flag them as suspected Uyghurs. In other words, users are supposed to keep feeding the facial recognition system with data that enhances racial targeting. That's truly scary. Tools like that show how AI can be used to target ethnic groups, says Karen Yeung, an expert in artificial intelligence and human rights.
7:25 am
"Very powerful systems are increasingly being used to sort and organize people at a population-wide level. And even though we don't see that kind of drastic action being taken in Western industrialized societies, there are an increasing number of social systems scoring people, particularly in the criminal justice system, to make predictions based on algorithmic assessments of the kinds of risks that they pose to public safety." In the U.S., African Americans and Asian Americans were up to 100 times more likely to be misidentified by facial recognition systems than white men. That's according to a study commissioned by the U.S. government. False matches can lead to all sorts of problems, like a mistaken arrest. Yet these systems are growing increasingly popular, and not just in the U.S. That could have a serious impact on society, says Carissa Véliz from the Institute for Ethics in AI at Oxford University.
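For context, studies like the one cited measure a per-group false match rate: how often the system wrongly declares two different people a match, broken down by demographic group. The sketch below shows the arithmetic on synthetic similarity scores; the numbers are invented, not the study's data.

```python
# Synthetic illustration of a per-group false match rate (FMR): the
# share of impostor pairs (different people) that a face recognition
# system wrongly scores above its match threshold.
import numpy as np

rng = np.random.default_rng(2)

def false_match_rate(impostor_scores, threshold):
    # Fraction of impostor comparisons wrongly declared a match.
    return float(np.mean(impostor_scores > threshold))

# Assumed similarity-score distributions for impostor pairs, per group;
# a real evaluation would use labeled image pairs instead.
impostor_a = rng.normal(0.30, 0.10, size=100_000)
impostor_b = rng.normal(0.45, 0.10, size=100_000)

threshold = 0.6  # one global threshold, as deployed systems typically use
fmr_a = false_match_rate(impostor_a, threshold)
fmr_b = false_match_rate(impostor_b, threshold)
print(f"group A FMR: {fmr_a:.5f}")
print(f"group B FMR: {fmr_b:.5f}")
print(f"disparity: about {fmr_b / fmr_a:.0f}x")  # the style of gap reported
```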
7:26 am
"AI can be extremely disruptive, in a negative way, to society. It can be racist, it can be extremely unfair." To take one example: in 2016, Microsoft launched its chatbot Tay, designed to reply to Twitter users. After less than a single day of exposure, it had to be taken offline; Twitter users had taught it to parrot sexist, racist, anti-Semitic and otherwise offensive comments. Why are such biased and poorly designed AI systems being unleashed on society? Berlin-based artists and researchers are exploring that question. The data sets being fed into these AI systems are always old, so the systems end up conservative and outdated when they're used to predict or to recommend.
7:27 am
Such systems can be biased against anyone they deem to be different, and you may never realize it. It's truly a massive problem. An easy way to visualize how AI works is to imagine you're baking a cake: if you use the wrong ingredients, the cake just won't taste good. Similarly, if the data used to train AI software is poor and not representative of the diversity found in society, then it won't work properly. Except that here we're not talking about a cake, but about far more dramatic consequences.
7:28 am
And bias poses a particular threat to certain people, like people of color. It's not the technology itself that is to blame, and I don't think we should stop trying to make the world better with it. In my opinion, we need to raise awareness of how AI systems can have a racist and discriminatory impact. So before we adopt them more widely, they need to be rigorously tested according to ethical criteria. What about you? Have you ever been negatively affected by algorithmic systems? Or do you believe that you're not really affected by them? Let us know on YouTube or Facebook. Take care, and see you next time.
7:29 am
The muscle car: the Mustang, now with an electric engine. We tried it out, on DW. They've been robbed of their soul. That's what a people experiences when their heritage is taken from them. Countless cultural riches were brutally stolen from Africa and carted off to Europe by colonialists. Each artifact has blood on it, from wounds that have yet to heal.
7:30 am
What should be done with the stolen art from Africa? This is being hotly debated on both continents. "Stolen Soul", on DW. We are living during the most extraordinary time in history.
