
tv   Shift  Deutsche Welle  September 7, 2019 8:15am-8:31am CEST

8:15 am
Thanks for watching. Coming up next, Shift, Living in the Digital Age, looks at the algorithms websites use to keep users on the site for as long as possible. Stay tuned for that if you can. And remember, you can get all the latest headlines around the clock on our website, dw.com, or on Twitter.

Trailer: I've been digging deep into German culture, looking at its stereotypes. I'm Rachel. Join me to Meet the Germans, on DW.

8:16 am

Sexism, homophobia and political extremism score hits on YouTube. Even though the site has plenty of high-quality content, it's the extreme videos that attract audiences, and big profits for the channel owners and for the site itself. Why does hate make money on YouTube, and what does that mean for the user? Our topic today on Shift.

YouTube's recommendation algorithms favor polarizing content, and that tends to
8:17 am
radicalize users. Conservatives, for instance, are increasingly steered toward right-wing extremist channels, as a Brazilian study has shown; 79 million comments were analyzed for it. It seems about time that YouTube takes action. And indeed, the website has announced changes to its community guidelines. YouTube is promising to remove content that violates its policies faster, give trustworthy sources a bigger voice, reduce the spread of content that is borderline rule-breaking, and raise the standards for monetization. Big words. It sounds good, but will it work in practice? Not everyone thinks it will. American video journalist Carlos Maza has criticized YouTube for not standing up for minorities. He endured continuous homophobic and racist abuse from another YouTuber, Steven Crowder, widely seen as a far-right commentator. For a long time, the platform did nothing.
8:18 am
But YouTube says videos like this are protected by freedom of speech. After widespread protests from the community, the video-sharing site decided to demonetize Steven Crowder's videos. That means no advertising can be run together with these videos. Meanwhile, the public dispute had only increased that YouTuber's notoriety, and his number of subscribers. Far-right and extremist channels often enjoy great success on YouTube: the more extreme, the more comments. This has been shown in a study by Brazilian computer scientist Manoel Ribeiro. The leader of Austria's Identitarian Movement, Martin Sellner, is active on YouTube. Among his political weapons are trolling and hate speech, heckling and abusing other web users.
8:19 am
These are normal tactics, he says: hate and trolling are part of this space, and if you can't handle it, you shouldn't even enter it. In August 2019, YouTube attempted to ban Martin Sellner and other right-wing extremist channels from its site, but with little success. He set a lawyer on the case, and the next day his channel was back online. Others post their banned videos on other channels, or simply wait for their fans to re-upload them. YouTube can be a rough place, where trolls and haters can poison the atmosphere with impunity. Radical content may be taken down, but other users only put it up again right away. And people who can't stand the heat may have only one option: to close their accounts. But does it have to come to that? Abuse on YouTube can hit very hard, and it can hit any user, as the example of Brennan Gilmore from the US shows. He received
8:20 am
death threats after conspiracy theories about him started circulating online. Living with the constant threat of these threats that I've received just changes you psychologically, he says. You're kind of constantly looking over your shoulder. The former U.S. foreign service officer was present at the protests in Charlottesville in 2017, when a man drove a car into a crowd of demonstrators. They were protesting against a march by right-wing extremists. Brennan Gilmore caught the attack on video. It cost a young woman her life. I gave the video to the police first, Gilmore says. But when I saw it was not an accident, that this guy had a million other choices than to drive through this crowd, I thought that it was
8:21 am
important to share for those reasons, and so I sent it out. Soon, far-right websites began claiming Gilmore had staged the bloodshed. Some said he was part of a shadow government out to topple President Trump with CIA support. This conspiracy theory spread like wildfire, landing on the hard-line right-wing InfoWars page. The hate campaign rapidly infiltrated his real life. To this day he receives death threats by e-mail: Brennan Gilmore, you are directly involved. You are sick, sick, sick. Hashtag down with the deep state. Here's one: Brennan Gilmore's body found in the Rivanna River. They're quite specific in their threats, he says. Brennan Gilmore has decided to fight back, and he has filed suit against the well-known right-wing conspiracy theory peddler Alex Jones and his InfoWars website. Now InfoWars

8:22 am

and Alex Jones no longer have their own channels on YouTube. It seems the site is at least trying, and is moving faster to remove content that violates the community guidelines. Still, YouTube continues to favor extreme views: the more controversial the video, the more lucrative. That's how the YouTube algorithm sees it. Guillaume Chaslot helped to create that algorithm, so he knows it inside out. The French computer scientist left Google in 2013. The algorithms YouTube uses are based on deep learning, he explains: systems that can process millions of user sessions and see which videos are most likely to keep people watching the longest. The key metric is watch time: the product of views and average viewing time.
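A minimal sketch of that watch-time metric in Python. The function name and the numbers are illustrative assumptions for this article, not anything from YouTube's actual systems:

```python
def watch_time_minutes(views: int, avg_view_minutes: float) -> float:
    """Watch time as described here: number of views times average viewing time."""
    return views * avg_view_minutes

# Two hypothetical videos with identical view counts: the one that keeps
# viewers hooked longer racks up far more of the metric the site optimizes.
calm = watch_time_minutes(10_000, 2.0)        # 20,000.0 minutes
polarizing = watch_time_minutes(10_000, 8.0)  # 80,000.0 minutes
assert polarizing == 4 * calm
```

A ranking system that optimizes this single number will, all else being equal, favor whichever content maximizes time on site, which is exactly the dynamic described here.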
8:23 am
In other words, watch time is the total time users spend on YouTube. The longer users stay on the website, the better for YouTube: the company can run more ads, which then add up to lots of money. The videos that have proven most effective are those with polarizing and extreme content, and the recommendation algorithms play their part. It's like a vicious circle: you just watch one because you're curious, and then it gets recommended again and again. Jews rule the world, homosexuality is a disease, refugees rape women and steal jobs: weird theories like these are hugely popular on YouTube because an algorithm gives them greater exposure. But why exactly are we so receptive to videos like these? Why do we click on this kind of nonsense and help it to spread? Oliver Quiring is a professor of communication science at the University of Mainz and an expert on media innovations. He explains what goes on in our heads when we spend time on YouTube.

8:24 am

Humans evolved to react faster to dangerous threats and insults than to harmless stimuli, he says. So our standard response is not neutral but very emotional. When I read hatred and provocations on the internet, even after all the research I've done on it, I can't control myself; it affects me immediately. More watch time, more hits and more comments: it's really not surprising that YouTube favors extreme content. The perceived anonymity of the internet also makes it easier for users to overcome their inhibitions and say things they would never say to someone face to face. When I talk to you in person, I can see your response, Quiring says, but with an abstract entity online that's no longer the case. Users lose part of their humanity. And the big problem is that we've learned we don't have to worry about
8:25 am
being prosecuted, even if we spread hate. The YouTube algorithm appeals to our lowest instincts, with content we react to emotionally and spontaneously. But does this fully explain the extremist tendencies on YouTube? The algorithms are primarily designed to increase revenues and watch time. Of course companies are supposed to earn money, but at the expense of ethics? And anyway, how would programmers go about creating ethical algorithms? Algorithms are programmed by human beings, humans whose prejudices and opinions flow into the programming. So algorithms can never be perfectly neutral; they always need ethical guidelines. Lorena Jaume-Palasí founded a nonprofit organization called The Ethical Tech Society to focus research on the ethics of algorithms. Algorithms, she says, cannot simply be programmed to be
8:26 am
ethical. Many of the methods we use are dependent on data, she explains. It's the data that shapes the algorithms: the programmers set up the skeleton, and we users provide the meat and the muscle. So the users actually provide the algorithms with the necessary data through their browsing habits. But how is this data evaluated? Jaume-Palasí is demanding that the inner workings of the algorithms be opened for evaluation and regularly inspected. The YouTube algorithms prioritize watch time and advertising revenues. Many people only watch videos that are recommended to them, and this is just what YouTube likes to see. YouTube is not interested in quality, she says; its interest is primarily monetary, keeping users on the site so they'll watch as much as possible. And quality doesn't always evoke an
8:27 am
emotional reaction. But emotion is a decisive factor in YouTube's business model. Many users seek out videos that stir them up; well-researched facts are not the highest priority. If YouTube were purely an entertainment site, such issues would not be as serious. But it has established itself as a primary source of information for society. And so it's a political decision by the owner, Google, as a corporation, that the site is optimized for monetization and not for quality. So YouTube is all about watch time. Ethics can only become a factor if YouTube and other companies make fundamental changes to their websites, and they're reluctant, because the current business model is working all too well. So questionable content will keep on popping up in our feeds, no matter how extreme. YouTube

8:28 am

mostly calls that freedom of speech. Is there anything we can do about it? Sadly, not much. But if we stop opening every single sensational video we see, then they will stop being recommended to us so often. If everyone did the same, it might just push the YouTube algorithm in a generally more positive direction. What do you think? Tell us on YouTube, Facebook or dw.com. That's all for today. Take care, and see you next time.
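The vicious circle described in this program, where each click makes similar videos more likely to be recommended, which in turn produces more clicks, can be illustrated with a toy simulation. Everything below (the two categories, the click probabilities, the score-update rule) is a simplified assumption for illustration, not YouTube's real algorithm:

```python
import random

def simulate(clicks_extreme_prob: float, rounds: int = 2000, seed: int = 42) -> float:
    """Toy feedback loop: each click on a category raises that category's
    recommendation score, so it gets recommended more often in later rounds.
    Returns the final share of recommendation weight held by extreme content."""
    rng = random.Random(seed)
    scores = {"extreme": 1.0, "moderate": 1.0}
    for _ in range(rounds):
        total = scores["extreme"] + scores["moderate"]
        # Recommend a category in proportion to its current score.
        category = "extreme" if rng.random() < scores["extreme"] / total else "moderate"
        # Assumed behavior: users click extreme recommendations more often.
        click_prob = clicks_extreme_prob if category == "extreme" else 0.5
        if rng.random() < click_prob:
            scores[category] += 1.0
    return scores["extreme"] / (scores["extreme"] + scores["moderate"])

# If users click extreme videos even somewhat more often than moderate ones,
# the loop drifts toward recommending mostly extreme content.
share = simulate(clicks_extreme_prob=0.7)
```

Real recommender systems are vastly more complex, but this self-reinforcing dynamic, engagement driving exposure driving more engagement, is the mechanism the critics quoted above point to.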
8:29 am
Trailer: Does this music get your heart racing? Then you've come to the right place to keep up with it.

Trailer: Natural riches, precious resources and a rewarding investment. Farmland has been called Ethiopia's green gold. The country has an abundant supply and leases it to international agribusiness giants. The government is after export revenues, the corporations high profit margins.
8:30 am
But not everyone benefits from the booming business. The selling-out of a country: Dead Donkey starts September 18th on DW.

Trailer: Drive it! The DW motor magazine. This week we look at an exclusive off-roader, the new Mercedes G; a cushy SUV, the Citroën C5 Aircross; and a snappy comeback, Ford's new Focus.
