TV: Shift, Deutsche Welle, September 8, 2019, 7:15am-7:30am CEST
7:15 am
... a dynamic meeting place and cultural venue in itself. You're watching DW News. Next up it's Shift, Living in the Digital Age, this time with a look at how YouTube's algorithm contributed to the radicalization of its users. Make sure to stay tuned for that. I will be back at the top of the hour. On behalf of me and everybody here in the newsroom in Berlin, thanks for watching. Earth, a home worth saving. Global Ideas tells stories of creative people and innovative projects around the world: ideas to protect the climate and boost green energy, solutions against global warming, in the Global Ideas environment series on Global 3000, on DW and online. When your
7:16 am
family is scattered across the globe: a family from Somalia, living around the world. One of them needs urgent assistance. "Family" starts in October on DW. Sexism, homophobia and political extremism score hits on YouTube. Even though the site has plenty of high-quality content, it's the extreme videos that attract audiences, and big profits, for the channel owners and for the site itself. Why does hate make money on YouTube, and what does that mean for the users? Our topic today on Shift.
7:17 am
YouTube's recommendation algorithms favor polarizing content, and that tends to radicalize users. Conservatives, for instance, are increasingly steered toward right-wing extremist channels, as a Brazilian study has shown; 79 million comments were analyzed for it. It seems about time that YouTube took action, and CEO Susan Wojcicki's site has announced changes to its community guidelines. YouTube is promising to remove content that violates its policies faster, give trustworthy sources a bigger voice, reduce the spread of content that is borderline rule-breaking, and raise the standards for monetization. Big words. Sounds good, but will it work in practice? Not everyone thinks it will. American video journalist Carlos Maza has criticized YouTube for not standing up for minorities. He endured continuous homophobic and racist abuse from another YouTuber, Steven Crowder, widely seen as
7:18 am
a far-right commentator. YouTube says videos like this are protected by freedom of speech. After widespread protests from the community, the video-sharing site decided to demonetize Steven Crowder's videos, meaning no advertising can be run alongside them. Meanwhile, the public dispute has only increased the YouTuber's notoriety and his number of subscribers. Far-right and extremist channels often enjoy great success on YouTube: the more extreme the content, the more comments. This has been shown in a study by Brazilian computer scientist Manoel Ribeiro. The leader of Austria's Identitarian Movement, Martin Sellner, is active on YouTube. Among his political weapons are
7:19 am
trolling and hate speech, heckling and abusing other web users. "I think these are normal tactics. Hate trolling is part of this space, and if you can't handle it, you shouldn't enter it at all." In 2019 YouTube attempted to ban Sellner and other right-wing extremist channels from the site, but with little success. Sellner set a lawyer on the case, and the next day his channel was back online. YouTubers post their banned videos on other channels or simply wait for their fans to re-upload them. YouTube can be a rough place where trolls and haters poison the atmosphere with impunity. Radical content may be taken down, but other users just put it up again right away, and people who can't stand the heat may have only one option: to delete their accounts. But does it have to come to that? Abuse on YouTube can hit very hard, and
7:20 am
it can hit any user, as the example of Brennan Gilmore from the US shows. He received death threats after conspiracy theories about him started circulating online. "Living with the constant threat of these threats I've received, it changes you psychologically. You're constantly looking over your shoulder." The former US Foreign Service officer was present at the protests in Charlottesville in 2017, when a man drove a car into a crowd of demonstrators. They were protesting against a march by right-wing extremists. Brennan Gilmore caught the attack on video; it cost a young woman her life. "When I saw what I had on video, I gave it to the police first. But I saw it was not accidental, it was not an accident. It was clear"
7:21 am
"that this guy had a million other choices but chose to drive into this crowd, and so I thought that it was important to share it for those reasons. And so I sent it out." Soon far-right websites began claiming Gilmore had staged the bloodshed. Some said he was part of a shadow government out to topple President Trump with CIA support. This conspiracy theory spread like wildfire, landing on the right-wing hard-line InfoWars page. The hate campaign rapidly infiltrated his real life. To this day he receives death threats by email: "Brennan Gilmore, you are directly involved. You are sick, sick, sick." "Hashtag down with the deep state. There is one: Brennan Gilmore's body found in the Rivanna River." "There is a river about a half mile down the road, so they're quite specific in their threats." Brennan Gilmore has decided to fight. He has filed suit against the well-known
7:22 am
right-wing conspiracy theory peddler Alex Jones and his InfoWars website. Now InfoWars and Alex Jones no longer have their own channels on YouTube. It seems the site is at least trying, and is moving faster to remove content that violates the community guidelines. Still, YouTube continues to favor extreme views: the more controversial the video, the more lucrative. That's how the YouTube algorithm sees it. Guillaume Chaslot helped to create the algorithm, so he knows it inside out. The French computer scientist left Google in 2013. He explains how YouTube's suggestions work: "They use a deep-learning algorithm, a huge artificial intelligence that can process billions of user sessions and see which videos are going to be the most likely to keep people on the platform." The key metric is
7:23 am
watch time. It's the product of clicks and average viewing duration, the total time users spend on each video. The longer users stay on the website, the better for YouTube: the company can run more ads, which add up to lots of money. The videos that have proven most effective are those with polarizing and extreme views, and the recommendation algorithms play their part. "It's like a vicious circle. You just watch one because you're curious, and then it gets recommended again and again." Jews rule the world, homosexuality is a disease, refugees rape women and steal jobs: wild theories like these are hugely popular on YouTube because an algorithm gives them greater exposure. But why exactly are we so receptive to videos like these? Why do we click on this kind of nonsense and help it to spread?
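The watch-time logic described here, clicks multiplied by average viewing duration, can be sketched as a toy ranking function. This is a simplified illustration, not YouTube's actual code; every name and number below is invented for the example:

```python
# Toy illustration of watch-time-based ranking (hypothetical, not YouTube's code).
# Watch time is modeled as clicks * average view duration; the videos with the
# highest accumulated watch time are recommended first.

def watch_time(clicks: int, avg_view_seconds: float) -> float:
    """Total watch time a video has accumulated, in seconds."""
    return clicks * avg_view_seconds

def rank_by_watch_time(videos: list[dict]) -> list[dict]:
    """Sort candidate videos so the highest watch time comes first."""
    return sorted(
        videos,
        key=lambda v: watch_time(v["clicks"], v["avg_view_seconds"]),
        reverse=True,
    )

videos = [
    {"title": "calm explainer", "clicks": 1000, "avg_view_seconds": 60.0},
    {"title": "polarizing rant", "clicks": 800, "avg_view_seconds": 300.0},
]
ranking = rank_by_watch_time(videos)
print([v["title"] for v in ranking])  # prints ['polarizing rant', 'calm explainer']
```

Note what the toy example shows: the polarizing video earns fewer clicks but far more viewing time (240,000 seconds versus 60,000), so a ranker optimizing watch time puts it on top.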
7:24 am
Oliver Quiring, a professor of communication science at the University of Mainz and an expert on media innovations, explains what goes on in our heads when we spend time on YouTube. "Humans evolved to react faster to dangerous threats and insults than to harmless stimuli, so our standard response is not neutral but highly emotional. When I read hatred and provocations on the internet, for example, even after all the research I've done on it, I can't control myself. It affects me immediately." More watch time, more hits and more comments: it's really not surprising that YouTube favors extreme content. The perceived anonymity of the internet also makes it easier for users to overcome their inhibitions and say things they would never say to someone face to face. "When I talk to you, I can see your responses. But with an abstract entity online, that's no longer the case. People lose part of their humanity,"
7:25 am
"and a big problem is that we've learned we don't have to worry about being prosecuted even if we spread hate." The YouTube algorithm appeals to our lowest instincts with content we react to emotionally and spontaneously. But does this fully explain the extremist tendencies on YouTube? The algorithms are primarily designed to increase revenues and watch time. Of course companies are supposed to earn money, but at the expense of ethics? And anyway, how would programmers go about creating ethical algorithms? Algorithms are programmed by human beings, humans whose prejudices and opinions flow into the programming. So algorithms can never be perfectly neutral; they always need ethical guidelines. Lorena Jaume-Palasí founded a nonprofit organization called The Ethical Tech Society to focus research on the
7:26 am
ethics of algorithms. "It's a myth that algorithms can be programmed to be ethical. Many of the methods we use these days are dependent on the data. It's the data that shapes the algorithm, not the programming. To put it in the terms of the study: the programmers set up the skeleton, and we users provide the meat and the muscle." So users actually provide the algorithms with the necessary data through their browsing habits. But how is this data evaluated? Lorena is demanding that the inner workings of the algorithms be opened for evaluation and regularly inspected. "The YouTube algorithms prioritize watch time and advertising revenues. Many people only watch videos that are recommended to them, and this is just what YouTube likes to see. The company is not interested in quality; its interest is primarily monetary: keeping users on the site so they'll watch as much as possible."
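The feedback loop described here, where users' browsing data shapes what gets recommended next, can be sketched as a small deterministic toy model. The engagement rates below are invented assumptions for illustration, not measured YouTube figures:

```python
# Toy model of a recommendation feedback loop (illustrative only, not YouTube's system).
# Videos are recommended in proportion to the watch time they have already earned.
# "Extreme" content is assumed to retain viewers slightly better, so even a
# perfectly balanced start tips further toward it with every round.

ENGAGE_RATE = {"extreme": 0.6, "moderate": 0.4}    # assumed fraction of impressions that become views
watch = {"extreme": 100.0, "moderate": 100.0}      # both categories start with equal watch time

for _ in range(50):
    total = sum(watch.values())
    shares = {kind: t / total for kind, t in watch.items()}  # snapshot before updating
    for kind in watch:
        impressions = 1000 * shares[kind]                # recommended in proportion to past watch time
        watch[kind] += impressions * ENGAGE_RATE[kind]   # engaged viewers add more watch time

extreme_share = watch["extreme"] / sum(watch.values())
print(f"extreme share of watch time after 50 rounds: {extreme_share:.2f}")
```

After the first round the symmetry is already broken, and each subsequent round widens the gap: the data users generate feeds straight back into what is shown next, which is the "meat and muscle" point made above.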
7:27 am
"And that's not necessarily quality, because quality doesn't always evoke an emotional reaction." But emotion is a decisive factor in YouTube's business model. Many users seek out videos that stir them up; well-researched facts are not the highest priority. "If YouTube were purely an entertainment site, such issues would not be as serious. But it has established itself as a primary source of information for society. That makes it political. It's a political decision by the owner, Google, as a corporation, since the site is optimized for monetization and not for quality." So YouTube is all about watch time. Digital ethics can only become a factor if YouTube and other companies make fundamental changes to their websites, and they're reluctant, because the current business model is working all too well. So
7:28 am
questionable content will keep on popping up in our feeds, no matter how extreme; YouTube mostly calls that freedom of speech. Is there anything we can do about it? Sadly, not much. But if we stop opening every single sensational video we see, those videos will stop being recommended to us so often. If everyone did the same, it might just push the YouTube algorithm in a generally more positive direction. What do you think? Discuss on YouTube, Facebook and dw.com. That's all for today. Take care, and see you next time.
7:29 am
The Venice Film Festival, where a director takes pride in his ponderous lines, and we can look forward to some quality movies. And we ask: is streaming good for film and bad for cinema? Venice and the feature film, on DW. He uses gunpowder to make burning art. He dusts cars. And he floods sidewalks. A Norwegian artist who works with reactive media. What other surprises does he have in
7:30 am
store? Euromaxx, in 60 minutes on DW. First, school in the jungle, then climbing lessons, and then the grand moment arrives: join a young orangutan on her journey back to freedom in our interactive documentary about an orangutan's return home.