TV — Shift — Deutsche Welle — September 9, 2019, 8:30am-8:45am CEST
8:30 am
A price for government and corporate... selling out of our country... don't use fear... starts September 18th on DW. Sexism, homophobia and political extremism score hits on YouTube. Even though the site has plenty of high-quality content, it's the extreme videos that attract audiences and big profits, for the channel owners and for the site itself. Why does hate make money on YouTube, and what does that mean for the user? That's our topic today on Shift. YouTube's recommendation algorithms favor polarizing content, and that tends to
8:31 am
radicalize users. Conservatives, for instance, are increasingly steered toward right-wing extremist channels, as a Brazilian study has shown; 79,000,000 comments were analyzed for it. It seems about time that YouTube took action. The CEO of the website has announced changes to the community guidelines. YouTube is promising to remove content that violates YouTube's policies faster, give trustworthy sources a bigger voice, reduce the spread of content that is borderline rule-breaking, and raise the standards for monetization. Big words. It sounds good, but will it work in practice? Not everyone thinks it will. American video journalist Carlos Maza has criticized YouTube for not standing up for minorities. He endured continuous homophobic and racist abuse from another YouTuber, Steven Crowder, widely seen as a far-right commentator. YouTube, by the way, did not appear to pick up on
8:32 am
anything. But YouTube says videos like this are protected by freedom of speech. After widespread protests from the community, the video-sharing site decided to demonetize Steven Crowder's videos; that means no advertising can be run together with these videos. Meanwhile, the public dispute had only increased the YouTuber's notoriety and his number of subscribers. Far-right and extremist channels often enjoy great success on YouTube: the more extreme, the more comments. This has been shown in a study by Brazilian computer scientist Manoel Ribeiro. The leader of Austria's Identitarian Movement, Martin Sellner, is active on YouTube. Among his political weapons are trolling and hate speech, lying to and abusing other web users.
8:33 am
"I think these are normal tactics. Hate and trolling are part of this space, and if you can't handle it, you shouldn't even enter it." In August 2019, YouTube attempted to ban Martin Sellner and other right-wing extremist channels from its site, but with little success. Sellner got a lawyer on the case, and the next day his channel was back online. Others post their banned videos on other channels or simply wait for their fans to re-upload them. YouTube can be a rough place where trolls and haters can poison the atmosphere with impunity. Radical content may be taken down, but other users only put it up again right away, and people who can't stand the heat may have only one option: to abandon their accounts. But does it have to come to that? Abuse on YouTube can hit very hard, and it can hit any user, as the example of Brennan Gilmore from the US shows. He received
8:34 am
death threats after conspiracy theories about him started circulating online. "Living, you know, sort of with the constant threat of these threats that I've received, it just changes you psychologically. You know, you're kind of constantly looking over your shoulder." The former US Foreign Service officer was present at the protests in Charlottesville in 2017, when a man drove a car into a crowd of demonstrators. They were protesting against a march by right-wing extremists. Brennan Gilmore caught the attack on video; it cost a young woman her life. "When I saw what I had on video, I gave it to the police first. But what I saw was not an accident. It was not an accident. It was clear, you know, that this guy had a million other choices than to drive through this crowd. And so I felt that it was
8:35 am
important to share for those reasons, and so I sent it out." Soon, far-right websites began claiming Gilmore had staged the bloodshed. Some said he was part of a shadow government out to topple President Trump with CIA support. This conspiracy theory spread like wildfire, landing on the hard-line right-wing InfoWars page. The hate campaign rapidly infiltrated his real life; to this day he receives death threats by e-mail. "Brennan Gilmore, you are directly involved. You are sick, sick, sick. Hashtag down with the deep state." Here's one: "Brennan Gilmore's body found in the Rivanna River." The Rivanna River is a half mile down the road. They're quite specific, the threats. Brennan Gilmore has decided to fight, and he's filed suit against the well-known right-wing conspiracy theory peddler Alex Jones and his InfoWars website. Now
8:36 am
InfoWars and Alex Jones no longer have their own channels on YouTube. It seems the site is at least trying, and is moving faster to remove content that violates the community guidelines. Still, YouTube continues to favor extreme views: the more controversial the video, the more lucrative. That's how the YouTube algorithm sees it. Guillaume Chaslot helped to create the algorithm, so he knows it inside out. The French computer scientist left in 2013. He says the algorithms behind YouTube's success use deep learning and can process billions of user sessions to see which videos are the most likely to keep people on the platform. The key metric is watch time: it's the product of hit statistics and average running times,
8:37 am
together with the time users spend on each video. The longer users stay on the website, the better for YouTube: the company can run more ads, which then add up to lots of money. The videos that have proven most effective are those with polarizing and extreme views, and usually the recommendation algorithms play their part. "It's like a vicious circle: you just watch one because you're curious, and then it gets recommended again and again." Jews rule the world, homosexuality is a disease, and refugees rape women and steal jobs: wild theories like these are hugely popular on YouTube, because an algorithm gives them greater exposure. But why exactly are we so receptive to videos like these? Why do we click on this kind of nonsense and help it to spread? Oliver Quiring is a professor of communication science at the University of Mainz and an expert on media innovations. He explains what goes on in our heads when we spend time on YouTube.
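To make the watch-time metric described above a little more concrete, here is a minimal illustrative sketch in Python. It is not YouTube's actual code: the Video class, the example titles and all the numbers are invented for illustration, and the real system relies on far more complex deep-learning models. The sketch only shows how ranking purely by watch time (hits multiplied by average viewing duration) favors whichever video keeps people watching longest.

```python
# Illustrative sketch only: a toy ranking by "watch time", the metric the
# program describes (hits multiplied by average viewing duration).
# All names and numbers are invented; this is not YouTube's actual code.

from dataclasses import dataclass

@dataclass
class Video:
    title: str
    clicks: int                 # how often the video was opened
    avg_view_minutes: float     # average time viewers spent on it

    @property
    def watch_time(self) -> float:
        # "Watch time" as described in the program: clicks times
        # average running time.
        return self.clicks * self.avg_view_minutes

# Hypothetical catalogue: a calm explainer vs. a polarizing rant.
videos = [
    Video("Well-researched explainer", clicks=1_000, avg_view_minutes=3.0),
    Video("Polarizing conspiracy rant", clicks=1_000, avg_view_minutes=9.0),
]

# A recommender that optimizes only for watch time ranks the emotionally
# charged video first, even with identical click counts.
for v in sorted(videos, key=lambda v: v.watch_time, reverse=True):
    print(f"{v.title}: {v.watch_time:.0f} viewer-minutes")
```

With equal click counts, the toy recommender puts the video with three times the viewing duration on top, which mirrors the dynamic the program describes.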
8:38 am
"Humans have evolved to react faster to dangerous threats and insults than to harmless stimuli, so the standard response is not neutral but very emotional. When I read hatred and provocations on the internet, for example, even after all the research I've done on it, I can't control myself; it affects me immediately." More watch time, more hits and more comments: it's really not surprising that YouTube favors extreme content. The perceived anonymity of the internet also makes it easier for users to overcome their inhibitions and say things they would never say to someone face to face. "When I talk to you here, I can see your response, but with an abstract entity online, that's no longer the case. People lose part of their humanity. And a big problem is that we've learned we don't have to worry about being prosecuted
8:39 am
even if we spread hate." So the YouTube algorithm appeals to our lowest instincts, with content we react to emotionally and spontaneously. But does this fully explain the extremist tendencies on YouTube? The algorithms are primarily designed to increase revenues and watch time. Of course companies are supposed to earn money, but at the expense of ethics? And anyway, how would programmers go about creating ethical algorithms? Algorithms are programmed by human beings, humans whose prejudices and opinions flow into the programming, so algorithms can never be perfectly neutral; they always need ethical guidelines. Lorena Jaume-Palasí founded a nonprofit organization called The Ethical Tech Society to focus research on the ethics of algorithms. She argues that algorithms can't simply be
8:40 am
programmed to be ethical. "Many of the methods we use are dependent on the data; it's the data that shapes the algorithm. The programmers set up the skeleton, and we provide the meat and the muscle." So the users actually provide the algorithms with the necessary data through their browsing habits. But how is this data evaluated? Lorena is demanding that the inner workings of the algorithms be opened up for evaluation and regularly inspected. The YouTube algorithms prioritize watch time and advertising revenues. Many people only watch videos that are recommended to them, and this is just what YouTube likes to see. "The company is not interested in quality. Their interest is primarily monetary: keeping users on the site so they'll watch as much as possible. And that is not necessarily a policy that promotes quality, because quality doesn't always evoke an emotional reaction." But emotion is
8:41 am
a decisive factor in YouTube's business model. Many users seek out videos that appeal to their emotions; well-researched facts are not the highest priority. "If YouTube were purely an entertainment site, such issues would not be as serious. But it has established itself as a primary source of information for society. And so it's a political decision by the owner, Google, as a corporation, since it's optimized for monetization and not for quality." So YouTube is all about watch time. Digital ethics can only become a factor if YouTube and other companies make fundamental changes to their websites, and they're reluctant, because the current business model is working all too well. So questionable content will keep on popping up in our feeds, no matter how extreme, because YouTube
8:42 am
mostly treats that as freedom of speech. Is there anything we can do about it? Sadly, not much. But if we stop opening every single sensational video we see, then they will stop being recommended to us so often. If everyone did the same, it might just push the YouTube algorithm in a generally more positive direction. What do you think? Tell us on YouTube, Facebook and dw.com. That's all for today. Take care, and see you next time.
8:43 am
"In the end, you're not allowed to stay here anymore; we will send you back." Are you familiar with this? With the smugglers? What's your story? We want to hear from women especially, from victims of violence. Take part and send us your story; we are trying in all ways to understand this new culture. You want to become a citizen? InfoMigrants: your platform for reliable information. Did you know that 77 percent of Africans are younger than 35? That's me, and me, and you. It's time all voices
8:44 am
are heard. The 77 Percent talks about the issues; this is where you have your say. The 77 Percent, this weekend on DW. Hardly a shift goes by without a physical assault. Police officers are experiencing violence more and more often. Now some German states are hoping body cams may help deescalate that trend. Worn on the uniforms, the cameras record the encounter and may discourage potential attackers. That's the idea, anyway.
8:45 am
Mainz, the state capital of Rhineland-Palatinate. Two police officers are going on duty. They carry a new weapon in the fight against the increasing violence and decreasing respect from the public: body cams. "The primary aim is our own safety. In other words, we wear the body cam as a preventative measure. We always carry them, and we turn them on in dangerous and tense situations, to try to influence the behavior of the people we engage with."