tv NEWS LIVE - 30 Al Jazeera October 7, 2018 10:00am-10:33am +03
10:00 am
... a video of someone ... is not necessarily going to ... you could just as well let the policy ... videos of people who died do not ... we can still share those, like, to raise awareness ... so it was part of that policy ... child abuse ... You also are the director of
10:01 am
a group called ... You recognize those images ... do you know what video they are from? That was actually sent to us from one of our supporters ... somebody inboxed us on this Facebook page asking for help because of what they'd seen, and they were obviously scared; they didn't want to call ...
10:02 am
Initially I saw a little tiny boy, he was about two or three, on the video, with a man talking to him ... and then he was hitting him and punching him ... and stamping on him and kneeing him, and then ... the video goes on ... and you're left with an absolutely sickening feeling ... you've just seen some poor tiny little boy ... I mean, you know ...
10:03 am
... and that's all you think about, this little boy and what we'd seen happening to him, and you're full of ... I reported it, and we received messages back saying it didn't violate their terms and conditions. There's no grey area ... it's so obvious, and yet it's still technically on there. Our reporter asks another moderator why he thinks some graphic content gets marked as disturbing rather than deleted. ... people all over ... it's OK for kids with warnings ... yeah, and also for ...
10:04 am
But it should just be deleted? Like I said ... it's censorship then, OK; if you start censoring too much, people lose interest ... it's all about seeing what is going on there. In the first two days after it was posted on Facebook, the video of the little boy being beaten was shared more than forty-four thousand times. From Facebook's point of view this is essentially, you know, the crack cocaine of their product, right; it's the really extreme, really dangerous form of content that attracts the most highly engaged people on the platform. Facebook understood that it was desirable to have people spend more time on site; if you're going to have an advertising-based business you need them to see the ads. What Facebook has learned is that the
10:05 am
people on the extremes are the really valuable ones, because one person on either extreme can often provoke fifty or a hundred other people, and so they will leave up as much extreme content as they can get. We put the results of our investigation to Facebook's vice-president of global policy. Shocking content does not make us more money; that's just a misunderstanding of how the system works. Shocking content surely keeps people on Facebook; that means it's more likely that they will view advertising, and that makes you money; that's why shocking content is good for Facebook. Again, I don't ... that's not our experience of the people who use it around the world. There is a minority who are prepared to abuse our systems and other internet platforms to share the most offensive kind of material, but I just don't agree that that is the experience most people want, and that's not the experience we're trying to deliver.
10:06 am
... there are a couple of ... on that page, so what we have to do is look and see if we can find any ... I've been going on for a while with no idea where to go with this ... if you report it to Facebook ... they're just trying to find out some information on the video ... yes, but there's nothing ... in our policies ... unless it's a live video ...
10:07 am
Our investigation found that unless they are streamed live, videos of physical child abuse are not usually reported to the police. So there's no escalation process, no follow-up process? ... maybe ... if it were a known child, because it would be a story ... I don't know if Facebook would do anything; how would they possibly know, unless you see something where a child is going to be at risk ... it might be escalated ... it may be a different case, but I've never heard of a video ... it's not going to be reported to the
10:08 am
police. I think the reality is no one is going to ... The boy, we found out, was in Malaysia. The child's stepfather had been arrested and jailed for twelve months. How long ago was it that you first became aware of it? I think it was December two thousand and twelve. So it's been six years now, almost, and the video is still on Facebook? Yeah. This video was used in Facebook's training as an example of the kind of child abuse
10:09 am
that should be marked as disturbing and left on the site. If that is being used as an example for moderators of what is considered acceptable, of what is tolerated on Facebook, it is truly shocking. ... In order to aid a possible identification and rescue from the physical abuse, we may not immediately remove this content from Facebook. It's perfectly possible for Facebook to take down footage of child abuse and for a copy of that material to be retained if that will assist in a law enforcement, in a police, investigation. It's difficult to conclude why that material would need to stay on the site. We know that for as long as that content remains on a social media site, it exacerbates the trauma that children might feel. Not only
10:10 am
has that child been subject to sustained physical abuse, but unfortunately that child is being re-abused by the fact that that content is there for anyone to go on to Facebook and see; unfortunately the child is being re-abused with every click. On Facebook an individual shared it, and I want to be clear that's not acceptable; it shouldn't be there, that material should have been taken down. On the site that you went to, the CPL site, you see part of the total system; those are the front-line reviewers, but behind them sits a team of child safety experts who are actually Facebook full-time staff. They will make an assessment of whether the child is at risk; they will make a decision about what to do with the content, including referring it to law enforcement agencies where that's appropriate. After we made Facebook aware of this child abuse video it was still available
10:11 am
on the platform. They told us they have now reviewed the material used to train new moderators. ... One of the moderators was reviewing a video showing two teenage girls fighting. Both girls are clearly identifiable and the video has been shared more than a thousand times. ...
10:12 am
A friend from home, you know, sent it to me as a video; she thought I needed to see it. ... They start to fight and throw each other on the floor, and the other girl ... just pounds her on the floor ... repeated punches and kicks to the head ... and she just looks out of control basically; she looks like a wild animal. ... And the whole world is watching as well ... you know how horrifying it was, how humiliating for her. The video had been posted with a caption condemning the violence. Based on the new policy, which is one of the
10:13 am
reforms proposed ... everybody condemning the use of ... Following a recent change to the policy, even experienced moderators are not sure whether the video should be deleted or marked as disturbing and left on the site. ... so if there's anything on the video from the poster, a warning or something condemning it ... Because there's a caption condemning the violence, the video gets marked as disturbing and left on the site. We showed footage from our undercover filming to the girl's mother.
10:14 am
There should really have been a question about it being taken down. There should have been a discussion ... it should at least have been questioned ... you can see ... it is horrible ... so why wasn't it? Why wasn't it? Why wasn't there a discussion about it? It's sad. I don't know ... it's someone's child ... it's not ...
10:15 am
So that's interesting, because I asked, if there is like bullying between them, can we just delete it ... if there were ... groups ... our daughter ... if there was a caption saying brilliant ... what option is there ... it's all about raising awareness ... They were watching someone's daughter being battered, but there was no emotion, no oh my god, what's happening ... I'd like them to think about the policies as if they're watching a video of their own daughter, and what decision they would make about it.
10:16 am
If a parent or guardian sees a video of their child in circumstances that they object to, they do have the right to insist that we take it down, and we do take it down where we're made aware. So you're putting the onus on the victim here, to complain to you; why aren't you taking this material down before it humiliates children? Again, if the content is shared in a way that praises or encourages that violence, it is going to come down, but where people are highlighting an issue and condemning the issue, even if the issue is painful, there are a lot of circumstances where people will say to us, look, Facebook, you should not interfere with my ability to highlight a problem that's occurred. As people watch these videos, Facebook's making money. Yes, the money we make is by people using the service and seeing ads within the news feed. When you have forty billion of sales and tens of billions of profit per
10:17 am
year, you pretty much have an obligation to do everything in your power to make sure that you're not making the world worse for the users of your product. One of the really special things about working for Al Jazeera is that even as a camerawoman I get to have so much input and contribution to a story. I feel we cover this region better than anyone else would; it shouldn't be that way, but it is, because you have a lot of people that are divided on political issues. We are with the people; we live to tell the real stories. Al Jazeera's mandate is to deliver in-depth journalism to audiences across the globe. Brother leader, or brutal dictator? With discontent spreading through North Africa, time was running
10:18 am
out for Libya's self-styled King of Kings. In the first of a two-part series, The Big Picture charts the rise and fall of Libya's leader and the events that helped fuel the violence of his final days. The Lust for Libya, on Al Jazeera. I'm Kamal Santamaria with another check of the headlines. Sources have told Al Jazeera the Saudi journalist Jamal Khashoggi may have been murdered inside his country's consulate in Istanbul. Turkish prosecutors are widening their investigation into his disappearance after a group of Saudi officials flew into Istanbul and visited the consulate on the day Khashoggi went missing. He was last seen entering the building on Tuesday. However, the Reuters news agency says
10:19 am
a source at the consulate denies reports Khashoggi was killed in the compound. In Istanbul, Turkish security officials are now dealing with the case as a murder investigation. Late on Saturday they said that they had information that fifteen Saudi nationals, among them officials, had flown in on Tuesday on two separate flights, had gone to the consulate at the same time that Khashoggi was there, and then had left. Until now there has been no disclosure of the whereabouts of his body. In other news, Donald Trump has spoken of his delight at the decision by the US Senate to confirm Brett Kavanaugh as a Supreme Court judge. Kavanaugh has now been sworn in after that contentious Senate vote. There were thirty hours of intense debate for and against the candidate, who was of course subject to an FBI investigation into sexual assault allegations. Just a few hours ago the US
10:20 am
Senate confirmed Brett Kavanaugh ... and Trump signed the judge's commission aboard Air Force One just before landing. Flattened neighborhoods in Indonesia may soon be declared mass graves following last week's earthquake and tsunami; more than sixteen hundred people are now confirmed dead and there are growing concerns about the outbreak of disease. And Brazil is gearing up for what's been described as the most divisive election in its democratic history; election officials have distributed ballot boxes and electronic voting machines ahead of Sunday's vote, and some one hundred and forty-seven million voters will choose from thirteen candidates for president. More news for you in twenty-five minutes; right now it's back to part two of
10:21 am
Inside Facebook. The consequences of war: veterans ... He served in the Marine Corps ... that just doesn't go away ... For the last couple of years Al Jazeera has followed a group of US Army veterans traumatized by war as they struggle to get their lives back. ...
10:22 am
10:23 am
If a user finds a piece of content they think is inappropriate, they can report it to Facebook. The moderator will then decide whether or not it breaks the platform's rules. ... Each of these reports is called a ticket, and the tickets build up in the queues that the moderators work on. After three and a half weeks of training, our reporter is now working through his own queue of tickets. While we were filming undercover, Mark Zuckerberg was called before the US Senate and accused of not doing enough to protect Facebook's users. We want to hear more, without delay, about what Facebook and other companies plan to do to take
10:24 am
greater responsibility for what happens on their platforms. It's not enough to just build tools; we need to make sure that they're used for good. In the last year we've basically doubled the number of people doing security and content review; we're going to have more than twenty thousand people working on security and content review by the end of this year. Recently there has been a huge ... in reported content, and we have close to a good fifteen thousand reports today; what we have to try to do, and maybe continue to do, is sit through those and move quicker to review them ... Facebook aims to assess all reported content within twenty-four hours, but it's become clear that there is a backlog of fifteen thousand tickets that have been waiting longer than that. I just want to say that the current ... it's really been recognized, not just here but at the higher level ... I know the last couple of weeks have been really harsh;
10:25 am
the backlog is crazy; I feel it's understaffed ... some are fifty-seven hours old, like over six hundred and sixty ... OK, so our oldest is eighteen hours and forty-five minutes; that's very good, because our turnaround time for this queue is twenty-four hours, so that means we have to get everything done within twenty-four hours. So within the high-risk area, would there be anything that ... say someone saying they're going to kill themselves in ten minutes? I would say yes. Any reports that related to somebody at risk of suicide would not have gone into the queue where there was a backlog; they would have gone into a high-priority queue where we were meeting the standards. We've checked since you
10:26 am
brought this issue to us, and we're very confident that even in that period where we had a backlog in the normal queue we did not have a backlog in the suicide queue. So the suicide moderation was ... yes, right, OK. Facebook told us this backlog was cleared by the sixth of April, and that they are doubling the number of people working on safety and security. ... says he's relieved for them, because they have, you know, five hundred ... I feel sorry for ...
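The passage above describes a simple prioritization scheme: user reports become tickets, tickets wait in queues with a twenty-four-hour turnaround target, and reports involving a risk of suicide bypass the normal queue for a high-priority one. As a purely illustrative sketch of that idea, here is a minimal Python example; the class names, the "suicide_risk" label and the constant are assumptions made for the example, not details of Facebook's actual system.

import heapq
from dataclasses import dataclass, field

SLA_SECONDS = 24 * 60 * 60  # the 24-hour turnaround target mentioned above

@dataclass(order=True)
class Ticket:
    created_at: float                     # when the report came in (epoch seconds)
    report_id: int = field(compare=False)
    category: str = field(compare=False)  # e.g. "graphic_violence", "suicide_risk"

class ModerationQueues:
    def __init__(self):
        self.high_priority = []  # suicide-risk reports
        self.normal = []         # everything else

    def report(self, ticket):
        # Suicide-risk reports bypass the normal queue entirely.
        target = self.high_priority if ticket.category == "suicide_risk" else self.normal
        heapq.heappush(target, ticket)

    def next_ticket(self):
        # Moderators drain the high-priority queue before touching the normal one.
        for queue in (self.high_priority, self.normal):
            if queue:
                return heapq.heappop(queue)
        return None

    def backlog(self, now):
        # Normal-queue tickets that have waited longer than the 24-hour target.
        return [t for t in self.normal if now - t.created_at > SLA_SECONDS]

Under this sketch, a backlog in the normal queue can coexist with an empty high-priority queue, which is the distinction Facebook's vice-president draws in his answer above.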
10:27 am
Our undercover reporter is being trained in how to deal with content relating to self-harm. So, some examples ... Any material promoting self-harm or suicide gets deleted. ... OK, so these are very serious, I mean ... Any material that shows self-harm but doesn't promote it is called self-harm admission, and gets left on the site without a warning. ... we are going to send a checkpoint for you, with, like, mental health resources ... With self-harm admission, the user who posted the content is sent a message called a checkpoint, containing information about mental health support services ...
10:28 am
I think that probably around sixty-five per cent of my scars I attribute to the impact social media has had on me. There's definitely a bit of a rush of adrenaline ... seeing the blood was a relief, because it reminded me that there was something inside of me and that I wasn't as empty as I felt. This one is fine, totally fine, so you just send a checkpoint ... Somebody that I met, who had Facebook, used to regularly post pictures of her self-harm before they were healed and before they were scarred, and I would look at her pictures. There was also a group that I used to follow on Facebook. It meant that I surrounded myself
10:29 am
with people that were self-harming, and it would encourage me to cut a lot more. It became a bit of a competition, that I needed to do deeper cuts and that I needed to keep up with the self-harm. ... Once he starts moderating, our reporter comes across graphic images of self-harm. ... I would leave it up ... it has no promotion ...
10:30 am
It's not appropriate; you can see the people have badly cut themselves, and there's old scars there, so this isn't just a one-off and they're cutting themselves; that's not appropriate, that's not OK. Misery loves company, so if you can get out there and you can actually see people doing the thing that kind of is a representation of their misery, there might be something attractive in that, attractive, that is, to a mind that isn't functioning how it should be. ... Is there any justification for allowing images of self-harm? Not
10:31 am
really. You know, there are healthier ways of doing that. It's not that we don't want to listen to them, it's not that we don't want to help them; in the long run it's not a healthy thing to do. You know, this needs, this level of harm needs, proper professional intervention, and putting stuff out on social media is not professional intervention. ... I just think it's something that's going to be hard for other people to understand who haven't experienced this. I think that self-harm is something that's so complex that
10:32 am
it can't be characterized in a guideline or in a rule, and it can't be understood by somebody who doesn't understand how it feels to do that. Would you have a look at images of self-harm on Facebook now? Oh no, not at all. I'm strong enough for it not to affect me, but I'm not strong enough to take the risk of it; I believe that it wouldn't affect me, I completely believe that, but I'm not prepared to take the chance. There's actually a very strong, valid interest from that person, if they're expressing distress, to be able to express their distress to their family and friends through Facebook and then get help, and we see that happen every day; individuals are provided
10:33 am
with help by their family and friends because the content stayed up; if we took it down, the family and friends would not know that that individual was at risk. ... During the training session on self-harm, the issue of underage users comes up. So we don't ... you know ... Facebook's rules state that no one under thirteen can have an account. You go by what they provide, but ... yeah, so it's like we just ignore it unless you get an actual admission that they're underage. Yes. ...