
Shift - Deutsche Welle - September 7, 2019, 7:15pm-7:30pm CEST

7:15 pm
... more information on our website, just go to dw.com. For me and the entire team, thanks for watching.

My first ... was a sewing machine. I come from ... where women are bound by ... Something as simple as learning how to ride a bicycle isn't easy. ... the bicycle of my home, and ... finally they gave up ... and returned with the sewing machine. Sewing, I suppose, was more appropriate for girls than riding. Now I want to help those women back home escape
7:16 pm
social rules and inform them about their basic rights. My name is ... and they were born into ...

Sexism, homophobia and political extremism score hits on YouTube. Even though the site has plenty of high-quality content, it's the extreme videos that attract audiences and bring big profits, for the channel owners and for the site itself. Why does hate make money on YouTube, and what does that mean for the users? That's our topic today on Shift.

YouTube's recommendation algorithms favor polarizing content, and that tends to
7:17 pm
radicalize users. Conservatives, for instance, are increasingly steered toward right-wing extremist channels, as a Brazilian study has shown; 79 million comments were analyzed for it. It seems about time that YouTube took action, as we will discuss. The CEO of the website has announced changes to the community guidelines: YouTube is promising to remove content that violates YouTube's policies faster, give trustworthy sources a bigger voice, reduce the spread of content that is borderline rule-breaking, and raise the standards for monetization. Big words. Sounds good, but will it work in practice? Not everyone thinks it will.

American video journalist Carlos Maza has criticized YouTube for not standing up for minorities. He endured continuous homophobic and racist abuse from another YouTuber, Steven Crowder, widely seen as a far-right commentator. By the way, the good Mexican guy did not have to pick up
7:18 pm
anything. But YouTube says videos like this are protected by freedom of speech. After widespread protests from the LGBTQ community, the video-sharing site decided to demonetize Steven Crowder's videos; that means no advertising can be run together with these videos. Meanwhile, the public dispute had only increased the YouTuber's notoriety and his number of subscribers.

Far-right and extremist channels often enjoy great success on YouTube: the more extreme, the more comments. This has been shown in a study by Brazilian computer scientist Manoel Ribeiro.

The leader of Austria's Identitarian Movement, Martin Sellner, is active on YouTube. Among his political weapons are trolling and hate speech, speculating about and abusing other web users.
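The Brazilian study mentioned above traced whether people who commented on milder channels later turned up commenting on more extreme ones, based on tens of millions of comments. As a rough illustration of that kind of comment-migration measurement only, not the study's actual method or code, here is a minimal sketch in which the user IDs, channel categories and numbers are all invented:

```python
# Illustrative sketch only: a toy version of the comment-migration idea
# described above. Channel categories, user IDs and timestamps are made up;
# the real study analyzed tens of millions of YouTube comments.
from collections import defaultdict

# Each record: (user_id, channel_category, timestamp)
comments = [
    ("user_a", "mainstream_conservative", 1),
    ("user_a", "alt_right", 9),
    ("user_b", "mainstream_conservative", 2),
    ("user_b", "mainstream_conservative", 7),
    ("user_c", "alt_right", 3),
]

def migration_rate(comments, origin, destination):
    """Share of users whose first comment was on an `origin` channel
    and who later also commented on a `destination` channel."""
    history = defaultdict(list)
    for user, category, ts in comments:
        history[user].append((ts, category))

    started_in_origin = 0
    later_reached_destination = 0
    for user, events in history.items():
        events.sort()                      # order each user's comments by time
        if events[0][1] != origin:
            continue
        started_in_origin += 1
        if any(cat == destination for _, cat in events[1:]):
            later_reached_destination += 1

    return later_reached_destination / started_in_origin if started_in_origin else 0.0

print(migration_rate(comments, "mainstream_conservative", "alt_right"))  # 0.5 for this toy data
```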
7:19 pm
I think these are normal tactics. Hate and trolling are part of this space, and if you can't handle it, you shouldn't even enter it.

In August 2019, YouTube attempted to ban Martin Sellner and other right-wing extremist channels from its site, but with little success. Sellner set a lawyer on the case, and the next day his channel was back online. Others post their banned videos on other channels or simply wait for their fans to re-upload them.

YouTube can be a rough place where trolls and haters can poison the atmosphere with impunity. Radical content may be taken down, but other users only put it up again right away, and people who can't stand the heat may have only one option: to give up their accounts. But does it have to come to that? Abuse on YouTube can hit very hard, and it can hit any user, as the example of Brennan Gilmore from the US shows. He received
7:20 pm
death threats after conspiracy theories about him started circulating online. It's living, you know, sort of with the constant threat of, you know, these threats that I receive. It just changes you psychologically; you're kind of sort of constantly looking over your shoulder.

The former US foreign service officer was present at the protests in Charlottesville in 2017, when a man drove a car into a crowd of demonstrators. They were protesting against a march by right-wing extremists. Brennan Gilmore caught the attack on video; it cost a young woman her life. When I saw what I had on video, I gave it to the police first. But what I saw was not an accident. It was not an accident; it was clear, you know, that this guy had a million other choices than to drive through this crowd, and so I thought that it was
7:21 pm
important to share for those reasons, and so I sent it out. Soon, far-right websites began claiming Gilmore had staged the bloodshed; some said he was part of a shadow government out to topple President Trump with CIA support. This conspiracy theory spread like wildfire, landing on the right-wing hardline InfoWars page. The hate campaign rapidly infiltrated his real life; to this day he receives death threats by email. Brennan, you are directly involved, you are sick, sick, sick. Hashtag down with the deep state. Here's one: Brennan Gilmore's body found in the Rivanna River. The Rivanna River is a half mile down the road. They're quite specific in the threats. Brennan Gilmore has decided to fight: he has filed suit against the well-known right-wing conspiracy theory peddler Alex Jones and his InfoWars website. Now InfoWars
7:22 pm
and Alex Jones no longer have their own channels on YouTube. It seems the site is at least trying and is moving faster to remove content that violates the community guidelines. Still, YouTube continues to favor extreme views: the more controversial the video, the more lucrative. That's how the YouTube algorithm sees it.

Guillaume Chaslot helped to create the algorithm, so he knows it inside out. The French computer scientist left the company in 2013. He says the algorithms that power YouTube's suggestions use deep learning, neural networks that can process billions of user sessions and see which videos are most likely to keep people on the site. The key metric is watch time: it's the product of hit
7:23 pm
statistics and average running times, combined with the time you spend on YouTube. The longer you stay on the website, the better for YouTube: the company can run more ads, which then add up to lots of money. The videos that have proven most effective are those with polarizing and extreme views, and the recommendation algorithms play their part. It's like a vicious circle: you watch just one because you're curious, and the algorithm recommends more again and again. Jews rule the world, homosexuality is a disease, refugees rape women and steal jobs: vile theories like these are hugely popular on YouTube because an algorithm gives them greater exposure. But why exactly are we so receptive to videos like these? Why do we click on this kind of nonsense and help it to spread?
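The watch-time logic described in this segment, hits multiplied by average running time, can be made concrete with a minimal sketch. The video titles and numbers below are entirely made up; the point is only that ranking by watch time alone mechanically favors whatever keeps people watching longest, regardless of quality:

```python
# Minimal sketch of the watch-time metric described above, with invented numbers.
# Watch time is treated here as views multiplied by average viewing duration;
# a ranker that optimizes it will surface whatever keeps people watching.
videos = [
    # (title, views, average_view_minutes)
    ("calm, well-researched explainer", 10_000, 2.5),
    ("outrage-bait conspiracy clip",     8_000, 9.0),
    ("cat video",                       20_000, 1.0),
]

def watch_time_minutes(views, avg_minutes):
    # total watch time = hits x average running time actually watched
    return views * avg_minutes

ranked = sorted(videos, key=lambda v: watch_time_minutes(v[1], v[2]), reverse=True)
for title, views, avg in ranked:
    print(f"{title}: {watch_time_minutes(views, avg):,.0f} minutes of watch time")

# The polarizing clip comes out on top (72,000 minutes) despite having the
# fewest views, which is the dynamic the program describes.
```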
7:24 pm
Oliver Quiring is a professor of communication science at the University of Mainz and an expert on media innovations. He explains what goes on in our heads when we spend time on YouTube. Humans evolved to react faster to dangerous threats and insults than to harmless stimuli, so the standard response is not neutral but very emotional. When I read hatred and provocations on the internet, for example, even after all the research I've done on it, I can't control myself; it affects me immediately. More watch time, more hits and more comments: it's really not surprising that YouTube favors extreme content. The perceived anonymity of the internet also makes it easier for users to overcome their inhibitions and say things they would never say to someone face to face. When I talk to you here, I can see your response, but with an abstract entity online that's no longer the case; people lose part of their humanity. The big problem is that we've learned we don't have to worry about
7:25 pm
being prosecuted even if we spread hate. So the YouTube algorithm appeals to our lowest instincts, with content we react to emotionally and spontaneously. But does this fully explain the extremist tendencies on YouTube? The algorithms are primarily designed to increase revenues and watch time. Of course companies are supposed to earn money, but at the expense of ethics? And anyway, how would programmers go about creating ethical algorithms? Algorithms are programmed by human beings, humans whose prejudices and opinions flow into the programming, so algorithms can never be perfectly neutral. That's why you always need ethical guidelines. Lorena Jaume-Palasí founded a nonprofit organization called The Ethical Tech Society to focus her research on the ethics of algorithms. Her point is that algorithms can't be
7:26 pm
programmed to be ethical. Many of the methods we use are dependent on data: it's the data that shapes the algorithms, not just the program. The programmers set up the skeleton, and we provide the meat and the muscle. So the users actually provide the algorithms with the necessary data through their browsing habits. But how is this data evaluated? Jaume-Palasí is demanding that the inner workings of the algorithms be opened for evaluation and regularly inspected. The YouTube algorithms prioritize watch time and advertising revenues. Many people only watch videos that are recommended to them, and this is just what YouTube likes to see. The internet giant is not interested in quality; its interest is primarily monetary, keeping users on the site so they'll watch as much as possible. And that's not necessarily quality, because quality doesn't always evoke an emotional reaction.
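To make the "skeleton and meat" image above concrete, here is a deliberately simplified, hypothetical sketch, not YouTube's actual system: the same recommend() function produces harmless or radicalizing suggestions depending only on the viewing data it is fed.

```python
# Toy illustration of "the code is only the skeleton, the data is the meat":
# the identical recommend() function yields different output for different
# browsing histories. All video IDs and co-watch counts are invented.
from collections import Counter

# Hypothetical co-watch statistics: video -> videos often watched afterwards
co_watched = {
    "gardening_tips":   Counter({"compost_basics": 40, "rose_pruning": 25}),
    "political_rant":   Counter({"angrier_rant": 90, "conspiracy_clip": 70}),
    "conspiracy_clip":  Counter({"deeper_conspiracy": 120, "angrier_rant": 30}),
}

def recommend(history, k=2):
    """Suggest the k videos most frequently co-watched with the user's history."""
    scores = Counter()
    for video in history:
        scores.update(co_watched.get(video, Counter()))
    # don't recommend what the user has already seen
    scores = Counter({v: s for v, s in scores.items() if v not in history})
    return [video for video, _ in scores.most_common(k)]

print(recommend(["gardening_tips"]))                     # harmless suggestions
print(recommend(["political_rant", "conspiracy_clip"]))  # pulls further toward the extreme
```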
7:27 pm
But emotion is a decisive factor in YouTube's business model, and many users seek out videos that deliver it; well-researched facts are not the highest priority. If YouTube were purely an entertainment site, such issues would not be as serious. But it has established itself as a primary source of information for society. And so it's a political decision by the owner, Google, as a corporation, since the platform is optimized for monetization and not for quality. So YouTube is all about watch time. Digital ethics can only become a factor if YouTube and other companies make fundamental changes to their websites, and they're reluctant to, because the current business model is working all too well. So questionable content will keep popping up in our feeds, no matter how
7:28 pm
extreme. YouTube mostly calls that freedom of speech. Is there anything we can do about it? Sadly, not much. But if we stop opening every single sensational video we see, then they will stop being recommended to us so often. If everyone did the same, it might just push the YouTube algorithm in a generally more positive direction. What do you think? Discuss with us on YouTube, Facebook and dw.com. That's all for today. Take care and see you next time.
7:29 pm
On The 77 Percent we talk about the issues that ... Coming up, ... she sings about it, and it sets an example for others. Next ... This week ... for the first time ... a news conference coming up next ... the mainstream media, in 60 minutes. Literature invites us to see people in
7:30 pm
particular ... I like to see myself as ... kids find strength growing up. My objective is to show ... books on YouTube.

And welcome to The 77 Percent, the show for ... I'm ... Coming up on today's program: in Germany ... young people talk about their experiences as a part of the country ...


