
Shift | Deutsche Welle | September 8, 2019, 11:15am-11:31am CEST

Secret peace talks with Afghanistan's Taliban planned at Camp David have been called off. The move came after the insurgent group claimed responsibility for an attack in Kabul that killed 12 people, including an American soldier, on Thursday. You're watching DW News. Next up, Shift asks why extreme videos score so big on YouTube, living in the digital age. You can always find the headlines around the clock on our website, dw.com. Thanks for your company.

Every journey begins with the first step, and every language with the first word. Niko is in Germany to learn German. Why not learn with him? It's simple, online, on your mobile, and free: DW's e-learning course Nico's Weg. Speak German.

With your family scattered across the globe, how do you find your way back to your roots? A family from Somalia lives spread around the world, and some among them need urgent assistance. The documentary starts in October on DW.

Sexism, homophobia and political extremism score hits on YouTube. Even though the site has plenty of high-quality content, it's the extreme videos that attract audiences, and big profits for the channel owners and for the site itself. Why does hate make money on YouTube, and what does that mean for users? That's our topic today on Shift.
YouTube's recommendation algorithms favor polarizing content, and that tends to radicalize users. Conservatives, for instance, are increasingly steered toward right-wing extremist channels, as a Brazilian study that analyzed 79 million comments has shown. It seems about time that YouTube took action, so Susan Wojcicki, CEO of the website, has announced changes to the community guidelines. YouTube is promising to remove content that violates its policies faster, give trustworthy sources a bigger voice, reduce the spread of content that is borderline rule-breaking, and raise the standards for monetization. That sounds good, but will it work in practice? Not everyone thinks it will. American video journalist Carlos Maza has criticized YouTube for not standing up for minorities. He endured continuous homophobic and racist abuse from another YouTuber, Steven Crowder, widely seen as a far-right commentator. YouTube says videos like this are protected by freedom of speech. After widespread protests from the LGBTQ community, the video-sharing site decided to demonetize Steven Crowder's videos; that means no advertising can be run alongside them. Meanwhile, the public dispute had only increased the YouTuber's notoriety and his number of subscribers. Far-right and extremist channels often enjoy great success on YouTube: the more extreme, the more comments. This has been shown in a study by Brazilian computer scientist Manoel Ribeiro. The leader of Austria's Identitarian Movement, Martin Sellner, is active on YouTube.
Among his political weapons are trolling and hate speech: heckling and abusing other web users. "Hate and trolling are part of this space, and if you can't handle it, you shouldn't enter it." In August 2019, YouTube attempted to ban Martin Sellner and other right-wing extremist channels from the site, but with little success. He set a lawyer on the case, and the next day his channel was back online. Others post their banned videos on other channels or simply wait for their fans to re-upload them. YouTube can be a rough place, where trolls and haters poison the atmosphere with impunity. Radical content may be taken down, but other users put it up again right away, and people who can't stand the heat may have only one option: to leave their
accounts. But does it have to come to that? Abuse on YouTube can hit very hard, and it can hit any user, as the example of Brennan Gilmore from the US shows. He received death threats after conspiracy theories about him started circulating online. "Living under the constant threat of the threats I've received just changes you psychologically. You're constantly looking over your shoulder." The former US Foreign Service officer was present at the protests in Charlottesville in 2017, when a man drove a car into a crowd of demonstrators. They were protesting against a march by right-wing extremists. Brennan Gilmore caught the attack on video; it cost a young woman her life. "When I saw what I had on video, I gave it to the police first. But what I saw was not an accident. It was clear
that this guy had a million other choices besides driving through this crowd, and so I thought it was important to share it for those reasons. So I sent it out." Soon, far-right websites began claiming Gilmore had staged the bloodshed. Some said he was part of a shadow government out to topple President Trump with CIA support. This conspiracy theory spread like wildfire, landing on the right-wing hardline InfoWars page. The hate campaign rapidly infiltrated his real life, and to this day he receives death threats by email: "Brennan Gilmore, if you are directly involved, you are sick, sick, sick." "Hashtag down with the deep state. Here's one: Brennan Gilmore's body is found in the Rivanna River, or the Rivanna Reservoir half a mile down the road." They're quite specific in their threats. Brennan Gilmore has decided to fight back: he has filed suit against the well-known right-wing
conspiracy theory peddler Alex Jones and his InfoWars website. Now InfoWars and Alex Jones no longer have their own channels on YouTube. It seems the site is at least trying, and is moving faster to remove content that violates the community guidelines. Still, YouTube continues to favor extreme views: the more controversial the video, the more lucrative. That's how the YouTube algorithm sees it. Guillaume Chaslot helped to create that algorithm, so he knows it inside out. The French computer scientist left Google in 2013. He says the systems that steer YouTube are built on deep learning and reinforcement learning: artificial intelligence that processes billions of user sessions to see which videos are most likely to keep people watching.
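The session-driven optimization Chaslot describes can be sketched as a toy comparison between two ranking variants, scored by how long a simulated user keeps watching. Everything here (the video names, the viewing minutes, the two variants) is invented for illustration and is not YouTube's actual system:

```python
# Toy sketch: try two ranking variants on a simulated user session and
# keep whichever maximizes watch time. All names and numbers are invented.

# Each video mapped to the minutes a typical viewer watches before leaving.
CATALOG = {
    "calm_explainer": 2.0,
    "polarizing_rant": 9.0,
    "cat_clip": 1.0,
}

def session_watch_time(recommendations, patience=2):
    """Minutes watched if a user follows the top `patience` picks."""
    return sum(CATALOG[video] for video in recommendations[:patience])

# Variant A: a neutral baseline that ignores engagement (alphabetical).
variant_a = sorted(CATALOG)
# Variant B: rank by expected watch time, longest first.
variant_b = sorted(CATALOG, key=CATALOG.get, reverse=True)

minutes_a = session_watch_time(variant_a)   # 2.0 + 1.0 = 3.0
minutes_b = session_watch_time(variant_b)   # 9.0 + 2.0 = 11.0

# The variant that keeps people watching longer "wins" the test,
# regardless of what kind of content that happens to favor.
winner = "B" if minutes_b > minutes_a else "A"
print(winner, minutes_a, minutes_b)
```

The point of the sketch is that nothing in the objective asks what the winning content is; the long polarizing video wins simply because it holds attention.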
"The key metric is watch time. It's the product of two statistics multiplied together: the number of hits and the average viewing time, that is, the time spent on each video." The longer users stay on the website, the better for YouTube: the company runs more ads, which add up to lots of money. The videos that have proven most effective are those with polarizing and extreme content, and the recommendation algorithms play their part. "It's like a vicious circle. You just watch one because you're outraged or curious, and it gets recommended to you again and again." Jews rule the world, homosexuality is a disease, refugees rape women and steal jobs: wild theories like these are hugely popular on YouTube because an algorithm gives them greater exposure. But why exactly are we so receptive to videos like these? Why do we click on this kind of nonsense and help it spread?
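The watch-time arithmetic in the quote above, hits multiplied by average viewing time, takes only a few lines to sketch; the titles and figures are made up:

```python
# Minimal sketch of the watch-time metric described in the transcript:
# total watch time = hits * average viewing time. Data is invented.

videos = [
    # (title, hits, average minutes viewed per hit)
    ("sober news recap",     1000, 1.5),
    ("conspiracy deep-dive",  800, 8.0),
    ("hobby tutorial",        500, 4.0),
]

def watch_time(hits, avg_minutes):
    # Total minutes all viewers spent on the video.
    return hits * avg_minutes

# Sorting by total watch time mirrors the incentive in the transcript:
# more minutes watched means more ads shown, so the video that holds
# viewers longest floats to the top even with fewer hits.
ranked = sorted(videos, key=lambda v: watch_time(v[1], v[2]), reverse=True)
print([title for title, _, _ in ranked])
```

Running it puts the long polarizing video first even though the news recap has more hits, which is exactly the asymmetry the program is describing.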
Oliver Quiring is a professor of communication science at the University of Mainz and an expert on media innovations. He explains what goes on in our heads when we spend time on YouTube. "Humans evolved to react faster to dangerous threats and insults than to harmless stimuli, so our standard response is not neutral but highly emotional. When I read hatred and provocations on the internet, for example, even after all the research I've done on it, I can't control myself; it affects me immediately." More watch time, more hits, more comments: it's really not surprising that YouTube favors extreme content. The perceived anonymity of the internet also makes it easier for users to overcome their inhibitions and say things they would never say to someone face to face. "When I talk to you here, I can see your responses, but with an
abstract entity online, that's no longer the case; people lose part of their humanity. And a big problem is that we've learned we don't have to worry about being prosecuted, even if we spread hate." So the YouTube algorithm appeals to our basest instincts, with content we react to emotionally and spontaneously. But does this fully explain the extremist tendencies on YouTube? The algorithms are primarily designed to increase revenues and watch time. Of course companies are supposed to earn money, but at the expense of ethics? And anyway, how would programmers go about creating ethical algorithms? Algorithms are programmed by human beings, humans whose prejudices and opinions flow into the programming, so algorithms can never be perfectly neutral; they always need ethical guidelines. Lorena Jaume-Palasí founded a nonprofit organization called The Ethical Tech Society to focus research on the
ethics of algorithms. "Some claim that algorithms can't be programmed to be ethical. But many of the methods we use are dependent on data, and it's the data that shapes the algorithms. The programmers set up the skeleton, and we provide the meat and the muscles." So the users themselves provide the algorithms with the necessary data, through their browsing habits. But how is this data evaluated? Lorena Jaume-Palasí is demanding that the inner workings of the algorithms be opened up for evaluation and regularly inspected. The YouTube algorithms prioritize watch time and advertising revenues. Many people only watch videos that are recommended to them, and this is just what YouTube likes to see.
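The "meat and muscle" point, that browsing habits are the data shaping the algorithm, can be illustrated with a deliberately simplified feedback loop; the weights, the update rule, and the two video labels are invented and stand in for no real recommender:

```python
# Toy feedback loop: each video a user watches is fed back into the
# model and boosts that video's recommendation weight. All numbers
# are invented for illustration.

def simulate(clicked_extreme_once, rounds=5):
    # Start with a slight edge for the moderate video (a quality prior).
    weights = {"extreme": 1.0, "moderate": 1.1}
    if clicked_extreme_once:
        weights["extreme"] += 0.2   # one curious click tips the scales
    for _ in range(rounds):
        # Recommend whichever video currently has the highest weight...
        choice = max(weights, key=weights.get)
        # ...and the resulting view feeds straight back into the model.
        weights[choice] += 1.0
    return weights

curious = simulate(True)      # the extreme video snowballs
restrained = simulate(False)  # without that click, it never takes off
print(curious, restrained)
```

One early click is enough to decide which video the loop amplifies, which is the vicious circle the transcript describes: the data we hand over through our clicks becomes the preference the system then reinforces.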
The company is not interested in quality; its interest is primarily monetary: keeping users on the site so they'll watch as much as possible. "And that's not necessarily content with quality, because quality doesn't always trigger emotional reactions." But emotion is a decisive factor in YouTube's business model, and many users seek out videos that provoke them; well-researched facts are not the highest priority. "If YouTube were purely an entertainment site, such issues would not be as serious. But it has established itself as a primary source of information for society. And so it's a political decision by the owner, Google, as a corporation, that the site is optimized for monetization and not for quality." So YouTube is all about watch time. Digital ethics can only become a factor if YouTube and other companies make fundamental changes to their websites
and they're reluctant, because the current business model is working all too well. So questionable content will keep popping up in our feeds, no matter how extreme; YouTube mostly calls that freedom of speech. Is there anything we can do about it? Sadly, not much. But if we stop opening every single sensational video we see, they will stop being recommended to us so often. If everyone did the same, it might just push the YouTube algorithm in a generally more positive direction. What do you think? Discuss on YouTube, Facebook and dw.com. That's all for today. Take care, and see you next time.
Coming up on DW: The 77 Percent talks about big issues. "It was never an issue between the two of us." She sings about it and sets an example for others. Plus, Germany's capital finally gets top-flight football.
In 60 minutes on DW. Her first day of school in the jungle, her first climbing lesson, and then the grand moment arrives: join the young orangutan on her journey in our interactive documentary as she returns home. And welcome to The 77 Percent, the show for Africa's youth. Coming up on today's program: we are in Germany, where young Africans and Afro-Germans meet to talk about their experiences.
