Frontline, PBS, October 31, 2018, 4:00am-5:01am PDT
>> narrator: tonight, frontline's investigation into facebook continues.
>> there is absolutely no company who has had so much influence on the information that americans consume.
>> narrator: he's the man who connected the world. but at what cost?
>> polarization was the key to the model.
>> narrator: the global threat...
>> this is an information ecosystem that just turns democracy upside down.
>> narrator: the 2016 election...
>> ...facebook getting over a billion political campaign posts.
>> narrator: and the company denials...
>> the idea that fake news on facebook influenced the election in any way i think is a pretty crazy idea.
>> ...facebook ceo mark zuckerberg will testify...
>> ...and i'm responsible for what happens here.
>> narrator: is facebook ready for the midterm elections?
>> there are a lot of questions heading into this midterm...
>> ...the midterm elections...
>> i still have questions if we're going to make sure that in 2018 and 2020 this doesn't happen again.
>> narrator: tonight on frontline, part two of "the facebook dilemma."
>> frontline is made possible by contributions to your pbs station from viewers like you. thank you. and by the corporation for public broadcasting. major support is provided by the john d. and catherine t. macarthur foundation, committed to building a more just, verdant and peaceful world. more information is available at macfound.org. the ford foundation, working with visionaries on the front lines of social change worldwide. at fordfoundation.org. additional support is provided by the abrams foundation, committed to excellence in journalism. the park foundation, dedicated to heightening public awareness of critical issues. the john and helen glessner family trust. supporting trustworthy
journalism that informs and inspires. the wyncote foundation. and by the frontline journalism fund, with major support from jon and jo ann hagler. and additional support from chris and lisa kaneb.
>> i accept your nomination for president of the united states.
>> i humbly accept your nomination for the presidency of the united states.
>> hey, everyone. we are live from my backyard, where i am smoking a brisket and some ribs and getting ready for the presidential debate tonight.
>> some of the questions for tonight's debate will be formed by conversations happening on facebook.
>> 39% of people get their election news and decision-making material from facebook.
>> facebook getting over a billion political campaign posts.
>> i love this, all the comments that are coming in. it's, like, i'm sitting here, smoking these meats and, um, and just hanging out with 85,000 people who are hanging out with me in my backyard.
>> make no mistake, everything you care about, everything i care about and i've worked for, is at stake.
>> i will beat hillary clinton, crooked hillary, i will beat her so badly, so badly.
>> i hope that all of you get out and vote. this is going to be an important one. ♪
>> tonight's broadcast will include facebook, which has become a gathering place for political conversation. (cheers and applause) ♪
>> thank you. thank you.
>> facebook is really the new town hall.
>> better conversations happen on facebook.
>> poke for a vote.
>> poke for a vote.
>> u.s.a.! u.s.a.!
>> hillary! hillary!
>> facebook is the ultimate growth stock.
>> facebook is utterly dominating this new, mobile, digital economy.
>> have you been measuring political conversation on facebook, things like the most likes, interactions, shares.
>> hillary clinton has evaded justice.
>> i thank you for giving me the opportunity to, in my view, clarify.
>> 2016 is the social election.
>> facebook getting over a billion political campaign posts.
>> narrator: 2016 began as a banner year for mark zuckerberg. his company had become one of the most popular and profitable in the world, despite an emerging dilemma that, as it was connecting billions, it was inflaming divisions.
>> people really forming these tribal identities on facebook, where you will see people getting into big fights.
>> narrator: we've been investigating warning signs that existed as facebook grew, and interviewing those inside the company who were there at the time.
>> we saw a lot of our numbers growing like crazy, as did the rest of the media and the news world in particular. and so, as a product designer, when you see your product used more, you're happy.
>> it's where we're seeing conversation happening about the election, the candidates, the issues.
>> narrator: amid all this political activity on facebook, no one used the platform more successfully than donald trump's digital media director, brad parscale.
>> i asked facebook, "i want to spend $100 million on your platform-- send me a manual." they say, "we don't have a manual." i say, "well, send me a human manual, then."
>> james jacoby: and what does the manual provide?
>> you have a manual for your car. if you didn't have that for your car, there might be things you would never learn how to use in your car, right? i spent $100 million on a platform, the most in history, it made sense for them to be there to help us make sure how we spent it right and did it right.
>> with custom audiences, you can get your ads to people you already know who are on facebook.
>> narrator: what facebook's representatives showed them was how to harness its powerful advertising tools to find and
target new and receptive audiences.
>> now i'll target my ad to friends of people who like my page.
>> what i recognized was the simple process of marketing. i needed to find the right people and the right places to show them the right message. what micro-targeting allows you to do is say, "well, these are the people that are most likely to show up to vote, and these are the right audiences we need to show up." the numbers were showing in the consumer side that people were spending more and more hours of their day consuming facebook content, so if you have any best place to show your content, it would be there. it was a place where their eyes were. that's where they were reading their local newspaper and doing things. and so we could get our message injected inside that stream. and that was a stream which was controlling the eyeballs of most places that we needed to win. ♪
>> narrator: it wasn't just politics. by this time, facebook was also dominating the news business.
>> 62% of americans say they get their news from social media sites like facebook.
>> more than a dozen developers worked with us to build social news apps, all with the goal of helping you discover and read more news.
>> narrator: facebook's massive audience enticed media organizations to publish straight into the company's news feed-- making it one of the most important distributors of news in the world.
>> i'm personally really excited about this. i think that it has the potential to not only rethink the way that we all read news, but to rethink a lot of the way that the whole news industry works.
>> narrator: but unlike traditional media companies, facebook did not see itself as responsible for ensuring the accuracy of the news and information on its site.
>> the responsibilities that they should have taken on are what used to be called editing. and editors had certain responsibilities for what was going to show up on the first page versus the last page, the relative importance of things that don't relate purely to money and don't relate purely to popularity. so they took over the role of editing without ever taking on the responsibilities of editing.
>> narrator: instead, facebook's editor was its algorithm, designed to feed users whatever was most engaging to them. inside facebook, they didn't see that as a problem.
>> jacoby: was there a realization inside facebook as to what the responsibilities would be of becoming the main distributor of news?
>> i don't think there was a lot of thinking about that, that idea. i don't think there was any, any thought that news content in particular had, had more value or had more need for protection than any of the other pieces of content on facebook.
>> narrator: andrew anker was in charge of facebook's news products team, and is one of eight former facebook insiders who agreed to talk on camera about their experiences.
>> i was surprised by a lot of things when i joined facebook. and as someone who grew up in the media world, i expected there to be more of a sense of how people interact with media and how important media can be to certain people's information diet. (applause)
>> we have a video from davida from napoli.
>> no. (laughter) you know, we're a technology company. we're not a media company.
>> the fact that so many big, well-known news brands really pushed into facebook pretty aggressively legitimized it as a place to get, kind of, information. and i think that also strangely created the opportunity for people who weren't legitimate, as well. because if the legitimate players are there, and you're not legitimate, all you need to do is set up a website and then share links to it, and your stuff on facebook is going to look similar enough that you've just gotten a huge leg up.
>> hillary clinton is the most corrupt person...
>> narrator: but as the 2016 campaign heated up...
>> and i'll tell you, some of what i heard coming from my opponent...
>> narrator: ...reporter craig silverman was sounding alarms that facebook's news feed was spreading misinformation-- what he called "fake news."
>> fake news just seemed like the right term to use. and i was trying to get people to pay attention. i was trying to get journalists to pay attention. i was trying to also get facebook and other companies like twitter to pay attention to this, as well.
>> narrator: silverman traced misinformation back to some unusual places.
>> we started to see this small cluster of websites being run, the vast majority, from one town in macedonia.
>> how popular is it?
>> about 200 people, maybe.
>> 200 people?
>> yeah.
>> are making fake news websites?
>> yeah.
>> most of them didn't really care about who won the election. they weren't in this for politics. if you put ads on these completely fake websites, and you've got a lot of traffic from facebook, that was a good way to make money.
>> there are some people who made, like, 200k or something like that.
>> 200,000 euros?
>> yeah, yeah, yeah.
>> i remember one guy, i think he was 15 or 16 years old, telling me, you know, "americans want to read about trump, so i'm writing trump stuff." trump earned them money. we saw macedonians publishing hillary clinton being indicted, the pope endorsing trump, hillary clinton selling weapons to isis, getting close to or above a million shares, likes, comments. that's an insane amount of engagement. it's more, for example, than when "the new york times" had a scoop about donald trump's tax returns. how is it that a kid in macedonia can get an article that gets more engagement than a scoop from "the new york times" on facebook?
>> jacoby: a headline during the campaign was "pope endorses trump," which was not true, but it went viral on facebook. was it known within facebook that that had gone viral?
>> um, i'm sure it was. i didn't necessarily know how viral it had gotten, and i certainly didn't believe that anybody believed it.
>> jacoby: but would that have been a red flag inside the company, that something that's
patently false was being propagated to millions of people on the platform?
>> i think if you asked the question that way, it would have been. but i think when you asked, then the next question, which is the harder and the more important question, was, which is, "so what do you do about it?", you then very quickly get into issues of not only free speech, but to what degree is it anybody's responsibility, as a technology platform or as a distributor, to start to decide when you've gone over the line between something that is clearly false from something that may or may not be perceived by everybody to be clearly false and potentially can do damage?
>> jacoby: over the course of the 2016 election, there was a lot of news about misinformation. i mean, there was, famously, "the pope endorses trump." do you remember that?
>> absolutely. i, i wasn't working on these issues at the time, but, but absolutely i, i do remember it.
>> narrator: tessa lyons was chief of staff to facebook's number two, sheryl sandberg, and is now in charge of fighting misinformation. she is one of five current officials facebook put forward to answer questions.
>> jacoby: was there any kind of sense of, like, "oh, my goodness, facebook is getting polluted with misinformation-- someone should do something about this"?
>> there certainly was, and there were people who were thinking about it. what i don't think there was a real awareness of, internally or externally, was the scope of the problem and the, the right course of action.
>> jacoby: how could it be surprising that, if you're becoming the world's information source, that there may be a problem with misinformation?
>> there was certainly awareness that there could be problems related to news or quality of news. and i think we all recognized afterwards that of all of the threats that we were considering, we focused a lot on threats that weren't misinformation and underinvested in this one. ♪
>> narrator: but there was another problem that was going unattended on facebook beyond misinformation.
>> one of the big factors that emerged in the election was what started to be called hyper-partisan facebook pages. these were facebook pages that kind of lived and died by really ginning up that partisanship-- "we're right, they're wrong." but not even just that, it was also, "they're terrible people, and we're the best." and the facebook pages were getting tremendous engagement.
>> "a million migrants are coming over the wall, and they're going to, like, rape your children," you know? that stuff is doing well.
>> and the stuff that was true would get far less shares.
>> the development of these hyper-partisan sites, i think, turned the informational commons into this trash fire. and there's some kind of parable in that for the broader effects of facebook. that the very things that divide us most cause the most engagement. (barking, laughing)
>> which means they go to the top of the news feed, which means the most people see them.
>> narrator: this worried an early facebook investor who was once close to zuckerberg.
>> i am an analyst by training and profession. and so, my job is to watch and interpret. at this point, i have a series of different examples that suggest to me that there is something wrong, systemically, with the facebook algorithms and business model. in effect, polarization was the key to the model. this idea of appealing to people's lower-level emotions, things like fear and anger, to create greater engagement and, in the context of facebook, more time on site, more sharing, and, therefore, more advertising value. i found that incredibly disturbing.
>> narrator: ten days before the election, mcnamee wrote to zuckerberg and sandberg about his concerns.
>> i mean, what i was really trying to do was to help mark and sheryl get this thing right. and their responses were more or less what i expected, which is to say that what i had seen were
isolated problems, and that they had addressed each and every one of them. i thought facebook would stand up and say, "we're going to reassess our priorities. we're going to reassess the metrics on which we run the company to try to take into account the fact that our impact is so much greater now than it used to be. and that as facebook, as a company with, you know, billions of users, we have influence on how the whole social fabric works that no one's had before." (cheers and applause)
>> i've just received a call from secretary clinton.
>> clinton has called trump to concede the election.
>> the clinton campaign is... really a somber mood here.
>> the crowd here at trump campaign headquarters...
>> narrator: trump's targeted ads on facebook paid off...
>> did things like facebook help one of the nastiest elections ever?
>> narrator: ...leading to complaints that facebook helped tilt the election...
>> facebook elected donald trump, that's basically...
>> narrator: ...which the trump campaign dismissed as anger over the results.
>> there has been mounting criticism of facebook.
>> no one ever complained about facebook for a single day until donald trump was president. the only reason anyone's upset about this is that donald trump is president and used a system that was all built by liberals. when i got on tv and told everybody after my interview of what we did at facebook, it exploded. the funny thing is, the obama campaign used it, then went on tv and newspapers, and they put it on the front of magazines, and the left and the media called them geniuses for doing that.
>> accusations that phony news stories helped donald trump win the presidency...
>> narrator: trump's victory put facebook on the spot.
>> facebook even promoted fake news into its trending...
>> narrator: and two days after the election, at a tech conference in northern california, zuckerberg spoke publicly about it for the first time.
>> well, you know, one of the things post-election, you've been getting a lot of pushback
from people who feel that you didn't filter out enough fake stories, right?
>> you know, i've seen some of the stories that you're talking about, around this election. there is a certain profound lack of empathy in asserting that the only reason why someone could have voted the way they did is because they saw some fake news. you know, personally, i think the, the idea that, you know, fake news on facebook, of which, you know, it's, it's a very small amount of, of, of the content, influenced the election in any way, i think, is a pretty crazy idea, right?
>> if i had been sitting there in an interview, i would have said, "you're lying." when he said, "we had no impact on the election," that... i remember reading that and being furious. i was, like, "are you kidding me?" like, "stop it." like, you cannot say that and not be lying. of course they had an impact, it's obvious. they were the most important distribution, news distribution. there are so many statistics about that. like, i don't know how you could possibly make that claim in public and with such a cavalier attitude. that infuriated me. and i told everybody there, saying, "you're kidding me."
>> jacoby: is he not recognizing the importance of his platform in our democracy at that point in time?
>> yes, i think he didn't understand what he had built, or didn't care to understand or wasn't paying attention, and doesn't... they really do want to pretend, as they're getting on their private planes, as they're getting... going to their beautiful homes, as they're collecting billions of dollars, they never want to acknowledge their power. they're powerful, and they have... they don't.
>> thank you so much for being here.
>> thank you, guys.
>> i think it was very easy for all of us sitting in menlo park to not necessarily understand how valuable facebook had become. i don't think any of us, mark included, appreciated how much of an effect we might have had. and i don't even know today, two years later, or almost two years later, that we really understand how much of a true effect we had.
but i think more importantly, we all didn't have the information to be saying things like that at the time. my guess is that mark now realizes that there was a lot more to the story than, than he or any of us could have imagined at that point.
>> narrator: barely two months later, in washington, an even more serious situation was developing. intelligence agencies were investigating russian interference in the election, and whether social media had played a role.
>> classical propaganda, disinformation, fake news.
>> does that continue?
>> yes.
>> in my view, we only scratched the surface. i say "we," those that assembled the intelligence community assessment that we published on the 6th of january 2017, meaning nsa, c.i.a., fbi, and my office. but i will tell you, frankly, that i didn't appreciate the full magnitude of it until well after.
>> narrator: amid growing scrutiny...
>> all right.
>> narrator: ...zuckerberg set out on a cross-country trip that he publicized by streaming on facebook.
>> so i've been going around to different states for my personal challenge for the year to see how different communities are working across the country.
>> narrator: but while he was on the road, the news was getting worse.
>> the u.s. intelligence community officially is blaming russian president...
>> ...russian president vladimir putin ordered an influence campaign aimed at the presidential election.
>> narrator: zuckerberg's chief of security, alex stamos, had been asked to see what he could find on facebook's servers.
>> we kicked off a big look into the fake news phenomenon, specifically what component of that might have a russian part in its origin.
>> narrator: they traced disinformation to what appeared to be russian government-linked sources.
>> jacoby: so, what was it like bringing that news to others in the company, and up to
mark and sheryl, for instance?
>> you know, we had a big responsibility in the security team to educate the right people about what had happened without being kind of overly dramatic. it's kind of hard as a security person to balance that, right? like, everything seems like an emergency to you. but in this case it really was, right? this really was a situation in which we saw the tip of this iceberg, and we knew there was some kind of iceberg beneath it.
>> narrator: stamos expanded his investigation to look at how the russian operation may have also used facebook's targeted advertising system.
>> so what we did is, we then decided we're going to look at all advertising and see if we can find any strange patterns that might link them to russian activity. we enlisted huge parts of the company. we kind of dragooned everybody into one big, unified team. so you have people in a war room working 70-, 80-hour weeks, billions of dollars of ads, hundreds of millions of pieces of content, and by kind of a painstaking process of going through thousands and thousands of false positives, eventually found this large cluster that we were able to link to the internet research agency of st. petersburg.
>> narrator: it was one of the same groups that had been using facebook to spread disinformation in ukraine three years earlier. this time, using fake accounts, russian operatives had paid around $100,000 to run ads that promoted political messages and enticed people to join fake facebook groups.
>> what the internet research agency wants to do is, they want to create the appearance of legitimate social movements. so they would create, for example, a pro-immigration group and an anti-immigration group. and both of those groups would be almost caricatures of what those two sides think of each other. and their goal of running ads were to find populations of
people who are open to those kinds of messages, to get them in those groups, and then deliver content on a regular basis to drive them apart. really what the russians are trying to do is to find the fault lines in u.s. society and amplify them, and to make americans not trust each other.
>> narrator: in september 2017, nearly a year after the election, zuckerberg announced on facebook what the company had found.
>> we are actively working with the u.s. government on its ongoing investigations into russian interference. we've been investigating this for many months now, and for a while, we had found no evidence of fake accounts linked to russian... linked to russia running ads. when we recently uncovered this activity, we provided that information to the special counsel. we also briefed congress. and this morning, i directed our team to provide the ads we've found to congress, as well.
>> we do know that facebook-related posts touched about 150 million americans, that's posts that originated either through russian fake accounts or through paid advertising from the russians. but the paid advertising was really a relatively small piece of the overall problem. a much bigger problem was the ability for someone to say they were james in washington, dc, but it was actually boris in st. petersburg creating a fake persona that would generate followers, and then they would seed it with the fake information and the false news and the political content. one account was set up to try to rally the muslim community in, in texas. another was an attempt to kind of rally the right wing in texas. they created an event.
>> white power!
>> stop the hate! stop the fear!
>> protest, with both sides protesting against each other. (yelling)
>> at a mosque in houston, in
2016.
>> this is america, we have the right to speak out. (yelling)
>> but for the good work of the houston police, you could have had the kind of horrible activity take place then and there that i saw unfortunately take place in charlottesville in my state last year. so the real human consequences of some of these... of some of this abuse, we've been very lucky that it hasn't actually cost people's lives.
>> narrator: facebook also found that the russians had used the site to orchestrate a pro-trump rally outside a cheesecake factory in florida and to promote an anti-trump protest in new york city just after the election.
>> hey, hey, ho, ho, donald has got to go...
>> we are under threat, and i need to defend the country that i love.
>> we are right in the middle of the protest.
>> narrator: the details of facebook's internal investigation set off alarm bells in washington.
>> we're such a ripe target for that sort of thing, and the russians know that. so the russians exploited that divisiveness, that polarization, because they had, they had messages for everybody. you know, black lives matter, white supremacists, gun control advocates, gun control opponents, it didn't matter-- they had messages for everybody.
>> jacoby: did you think it was a pretty sophisticated campaign?
>> it was. and i believe the russians did a lot to get people out to vote that wouldn't have and helped the appeal for... of donald trump.
>> jacoby: and the role that social media played in that was what?
>> it was huge.
>> i mean, it's really quite both ingenious and evil to, to attack a democratic society in that manner.
>> jacoby: but there were warning signs along the way in the trajectory of the company.
>> the company's been dealing with the negative side effects of its product for years, right? when you have two billion people on a communication platform, there's an infinite number of potentially bad things that could happen. the tough part is trying to decide where you're going to put your focus.
>> narrator: but by 2017, facebook was being accused of not focusing on other serious issues in developing, fragile democracies where the company had expanded its business. countries like the philippines, where almost all internet users are on facebook, and problems had been mounting.
>> in a year, i probably met with more than 50 different officials, high-ranking officials, including mark zuckerberg. i wanted them to know what we were seeing, i wanted them to tell me what they thought about it, and i wanted them to fix it.
>> narrator: maria ressa, who runs a prominent news website, says she had been warning facebook since 2016 that president rodrigo duterte was using a network of paid followers and fake accounts to spread lies about his policies and attack his critics.
>> the u.n.'s branded his war a crime under international law.
>> narrator: especially critics of his brutal war on drugs, which has taken an estimated 12,000 lives.
>> human rights watch has called it government-sanctioned butchery.
>> president duterte was targeting anyone who questioned the drug war, anyone who questioned the alleged extrajudicial killings. anyone on facebook who questioned that would get brutally bashed. we're protected by the constitution, we've been stripped of those protections online.
>> narrator: ressa herself would eventually come under attack.
>> there were attacks on the way i look, the way i sounded, that i should be raped, that i should be killed. we gave it a name: patriotic trolling. online, state-sponsored hate that is meant to silence, meant to intimidate. so this is an information ecosystem that just turns democracy upside-down.
>> jacoby: and where lies are prevalent.
>> where lies are truth.
>> narrator: she traced the disinformation to a network of 26 fake accounts and reported it to facebook at a meeting in singapore in august of 2016.
>> jacoby: what were you asking them to do?
>> exactly what every news group does, which is, take control and be responsible for what you create.
>> jacoby: were you given an explanation as to why they weren't acting?
>> no. no. i think facebook walked into the philippines, and they were focused on growth. what they didn't realize is that countries like the philippines... (chanting) ...countries where institutions are weak, where corruption is rampant, these countries don't have the safeguards. and what happens when you bring everyone onto a platform and do not exercise any kind of rules, right? if you don't implement those rules beforehand, you're going to create chaos.
>> jacoby: there's a problem in the philippines, we've heard about from people on the ground there, that facebook has been to some degree weaponized by the duterte regime there. what are you doing to try to stem this problem in the philippines?
>> one thing we're trying to do, any time that we think there might be a connection between violence on the ground and online speech, the first thing for us to do is actually understand the landscape.
>> narrator: monika bickert is facebook's head of global policy and worked for the justice department in southeast asia.
>> there's a fundamental question, which is, "what should our role be, and as we are identifying misinformation, should we be telling people what we're finding, should we be removing that content, should we be down-ranking that content?" and we now have a team that is focused on how to deal with exactly that sort of situation.
>> narrator: in april, facebook created a news verification program and hired ressa's organization as one of its fact-checkers, though she says
the problems are ongoing. the company ultimately took down the accounts ressa identified-- and just last week, removed dozens more.
>> i think what is happening is that this company is way in over its head in terms of its responsibilities. it's way in over its head in terms of what power it holds. it's not like you magically add facebook and horrible things happen, but you have facebook as this effective gasoline to simmering fires. (shouting)
>> narrator: elsewhere in the region...
>> buddhists are inciting hatred and violence against muslims through social media...
>> narrator: ...facebook was also being used to fan ethnic tensions with even more dire consequences.
>> violence between buddhists and muslims is continuing.
>> misinformation, disinformation, rumors, extremist propaganda, all kinds of bad content.
>> narrator: for several years, david madden, a tech entrepreneur living in myanmar, as well as journalists and activists, had been warning facebook that the muslim minority there was being targeted with hate speech. (speaking local language)
>> you would see the use of memes, of images, things that were degrading and dehumanizing, targeting the muslim community. (speaking local language)
>> narrator: the warning signs had been present as far back as 2014, when a fake news story spread on facebook.
>> reports, later proved to be false, that some muslim men had raped a buddhist woman, were shared on facebook.
>> an angry mob of about 400 surrounded the sun teashop, shouting and throwing bricks and stones.
>> narrator: two people died in the incident.
>> one buddhist and one muslim were killed in riots today.
>> i was really concerned that the seriousness of this was not understood. and so i made a presentation at
facebook headquarters in may of 2015. i was pretty explicit about the state of the problem. i drew the analogy with what had happened in rwanda, where radios had played a really key role in the execution of its genocide. and so i said, "facebook runs the risk of being in myanmar what radios were in rwanda." that this platform could be used to foment hate and to incite violence.
>> jacoby: what was the reaction to that at facebook?
>> i got an email shortly after that meeting to say that what had been discussed at that meeting had been shared internally and apparently taken very seriously.
>> narrator: the violence intensified.
>> massive waves of violence that displaced over 150,000 people.
>> narrator: and in early 2017, madden and other local activists had another meeting with facebook.
>> the objective of this meeting was, was really to be crystal-clear about just how bad the problem was, and that the processes that they had in place to try to identify and pull down problematic content, they just weren't working. and we were deeply concerned that something even worse was going to happen imminently. it was a sobering meeting. i think... i think the main response from facebook was, "we'll need to go away and dig into this and come back with something substantive." the thing was, it never came.
>> and how do you know that?
>> we can look at the evidence on the ground.
>> what we've seen here tells us a story of ethnic cleansing, of driving muslims out of myanmar.
>> narrator: the united nations would call the violence in myanmar a genocide, and found social media, and facebook in particular, had played a significant role.
>> the ultra-nationalist buddhists have their own facebooks and are really inciting a lot of violence and hatred against ethnic minorities. facebook has now turned into a beast, not what it was originally intended to be used for.
>> jacoby: i'm curious what it's like when the u.n. comes out with a report that says that facebook played a significant role in a genocide. what's that like, running content policy at facebook?
>> well, this would be important to me even if i didn't work at facebook, given my background. my background is as a federal prosecutor, and i worked specifically in asia and specifically on violent crimes against people in asia. so something like that really hits home to me.
>> jacoby: facebook was warned as early as 2015 about the potential for a really dangerous situation in myanmar. what went wrong there? why was it so slow?
>> we met with civil society organizations in myanmar far before 2015. this is an area where we've been focused. i think what we've learned over the years is, it's important for us to build the right technical tools that can help us find some of this content and also work with organizations on the ground in a real-time fashion. we are in the process of building those relationships around the world on a much deeper level, so that we can stay ahead of any kind of situation like that.
>> narrator: in the past year, facebook says it's taken down problematic accounts in myanmar, hired more language experts, and improved its policies.
>> jacoby: should there be any liability or any legal accountability for a company like facebook when something so disastrous goes wrong on your platform?
>> there's all sorts of accountability. but probably the group that holds us the most accountable are the people using the service. if it's not a safe place for them to come and communicate, they're not going to use it.
>> we are working here in menlo park in palo alto, california. to the extent that some of these issues and problems manifest in other countries around the world, we didn't have sufficient information and a pulse on what was happening in southeast asia.
>> narrator: naomi gleit is facebook's second-longest-serving employee.
>> and so one change that we've made, along with hiring so many more people, is that a lot of these people are based internationally and can give us that insight that we may not get from being here at headquarters.
>> jacoby: i'm trying to understand, you know, the choices that are made. do you regret choices going backward, decisions that were made about not taking into account risks or not measuring risks?
>> yeah, i definitely regret not having 20,000 people working on safety and security back in the day, yes. so i regret that we were too slow, that it wasn't our priority.
>> jacoby: but were those things even considered at the time? in terms of, "we could amp up safety and security, but there was some reason not to," or...
>> not really. i mean, we had a safety and security team. i think we just thought it was sufficient. i just... it's not that we were, like, "well, we could do so much more here," and decided not to. i think we... we just didn't... again, we were just a bit idealistic.
>> facebook has created this platform that in many countries, not just myanmar, has become the dominant information platform, and it has an outsized influence in lots of countries. that comes with a lot of responsibility.
>> using social media, rumors of alleged muslim wrongdoing spread fast.
>> many of those countries are wrestling with some pretty big challenges.
tensions between groups within those countries, and we have seen this explode into what mark zuckerberg would call real-world harm, others would just call violence or death, in many other markets. we're seeing it right now in india.
>> ...call me a victim of india's fake news.
>> we've seen examples of this in places like sri lanka.
>> to keep the violence from spreading, sri lanka also shut down facebook...
>> the myanmar example should be sounding an alarm at the highest level of the company, that this requires a comprehensive strategy.
>> narrator: but it would be far from myanmar, and a different kind of problem, that would cause an international uproar over facebook.
>> cambridge analytica and its mining of data on millions of americans for political purposes...
>> cambridge is alleged to have used all this data from tens of millions of facebook users...
>> the cambridge analytica scandal, facebook... (reporters speaking in different languages)
>> narrator: it was a scandal over how facebook failed to protect users' data, exposed by a whistleblower named christopher wylie.
>> christopher wylie, he was able to come forward and say, "i can prove this."
>> narrator: he said that facebook knew that a political consulting firm he'd worked for, cambridge analytica, had been using the personal data of more than 50 million users to try to influence voters.
>> at cambridge analytica, we are creating the future of political campaigning.
>> this is a company that specializes and would advertise itself as specializing in rumor campaigns.
>> political campaigns have changed.
>> seeding the internet with misinformation.
>> putting the right message in front of the right person at the right moment.
>> and that's the power of data. you can literally figure out who are the people who are most susceptible.
>> ...data about personality, so you know exactly who to target...
>> narrator: the firm gained access to the data from a third party, without facebook's permission.
>> the overwhelming majority of people who had their data
collected did not know. when data leaves facebook's servers, there is no way for facebook to track that data to know how that data is being used or to find out how many copies there are.
>> narrator: facebook eventually changed its data-sharing policies and ordered cambridge analytica to delete the data.
>> we know that facebook had known about this...
>> narrator: after wylie came forward, they banned the firm from their site, and announced they were ending another controversial practice: working directly with companies known as data brokers. but the uproar was so intense that in april 2018, mark zuckerberg was finally called before congress, in what would become a reckoning, over facebook's conduct, its business model, and its impact on democracy.
>> we welcome everyone to today's hearing on facebook's social media privacy and the use and abuse of data. i now turn to you, so proceed, sir.
>> we face a number of important issues around privacy, safety, and democracy. and you will rightfully have some hard questions for me to answer. facebook is an idealistic and optimistic company. and as facebook has grown, people everywhere have gotten a powerful new tool for making their voices heard and for building communities and businesses. but it's clear now that we didn't do enough to prevent these tools from being used for harm, as well. and that goes for fake news, for foreign interference in elections, and hate speech, as well as developers and data privacy. we didn't take a broad enough view of our responsibility, and that was a big mistake. and it was my mistake. and i'm sorry.
>> if, like me, you're following this stuff, you see years and
years and years of people begging and pleading with the company, saying, "please pay attention to this," at every channel people could find. and basically being ignored. "we hear you, you're concerned, we apologize, of course we have a responsibility, we'll do better." and the public record here is that they are a combination of unable and unwilling to grasp and deal with this complexity.
>> you may decide, or facebook may decide, it needs to police a whole bunch of speech, that i think america might be better off not having policed by one company that has a really big and powerful platform.
>> senator, i think that this is a really hard question. and i think it's one of the reasons why we struggle with it.
>> these are very, very powerful corporations. they do not have any kind of traditional democratic accountability. and while i personally know a lot of people making these decisions, if we set the norms that these companies need to decide what, who does and does not have a voice online, eventually that is going to go to a very dark place.
>> when companies become big and powerful, there is an instinct to either regulate or break up, right?
>> i think we're finding ourselves now in a position where people feel like something should be done. there's a lot of questions about what should be done, but there's no question that something should be done.
>> you don't think you have a monopoly?
>> it certainly doesn't feel like that to me.
>> okay.
>> you know, there's a lot of problems here, there, but all of these problems get worse when one company has too much power, too much information, over too many people.
>> narrator: after years of unchecked growth, the talk now is increasingly about how to rein in facebook. already, in europe, there's a tough new internet privacy law aimed at companies like facebook. inside the company, the people
we spoke to insisted facebook is still a force for good.
>> jacoby: has there ever been a minute where you've questioned the mission? you know, internally? whether anyone has taken a second to step back and say, "all right, has this blinded us in some way?" have you had a moment like that?
>> i still continue to firmly believe in the mission, but in terms of stepping back, in terms of reflecting, absolutely. but that isn't on the mission. the reflection is really about, how can we do a better job of minimizing bad experiences on facebook?
>> jacoby: why wasn't that part of the metric earlier? in terms of, how do you minimize the harm?
>> you know, it's possible that we could have done more sooner, and we haven't been as fast as we needed to be.
>> narrator: that line was repeated by all the current officials facebook put forward to answer questions.
>> we've been too slow to act on...
>> i think we were too slow...
>> we didn't see it fast enough.
>> we were too slow.
>> mark has said this, that we have been slow.
>> one of my greatest regrets in running the company is that we were slow in identifying russian information operations in 2016. and we're going to take a number of measures, from building and deploying new a.i. tools that take down fake news, to growing our security team to more than 20,000 people...
>> the goal here is to deep-dive on the nuances there...
>> narrator: the company says it's now investing resources and talent to tackle a range of problems, from the spread of hate speech to election interference.
>> even if we can't do fact-checking, if we can do more work around the programmatic aspect of it...
>> narrator: this is part of the team tackling the spread of misinformation around the world, led by tessa lyons.
>> the election integrity team has a framework for how they're thinking about secondary languages in each country. and i feel like from the misinformation side, we've mostly prioritized primary languages.
>> narrator: it's a problem the company admits it is a long way from solving.
>> the next thing is about the arabic fact-checking project. i think the main blocker here is potentially getting a fact-checker that can cover an entire region.
>> you know, i came into this
job asking myself, "how long is it going to take us to solve this?" and the answer is, this isn't a problem that you solve. it's a problem that you contain.
>> awesome. next, segue into upcoming launches.
>> narrator: in advance of next week's midterms, facebook has mobilized an election team to monitor false news stories and delete fake accounts that may be trying to influence voters. nathaniel gleicher runs the team.
>> there are going to be actors that are going to try to manipulate that public debate. how do we figure out what are the techniques they're using and how do we make it much harder?
>> jacoby: is there going to be real-time monitoring on election day of what's going on on facebook, and how are you going to actually find things that may sow distrust in the election?
>> absolutely, we're going to have a team on election day focused on that problem, and one thing that's useful here is, we've already done this in other elections.
>> jacoby: and you're confident you can do that here?
>> i think that... yes, i'm confident that we can do this here.
>> narrator: gleicher says his team continues to find foreign actors using the platform to spread disinformation.
>> iran was revealed to be a new player in worldwide disinformation campaigns, and on top of this...
>> narrator: and less than two weeks ago, federal prosecutors announced they'd found evidence that russian operatives had been trying to interfere in next week's election.
>> jacoby: what is the standard that the public should hold facebook to, in terms of solving some of these seemingly enormous problems?
>> i think the standard, the responsibility, what i'm focused on, is amplifying good and minimizing the bad. and we need to be transparent about what we're doing on both sides, and, you know, i think this is an ongoing discussion.
>> jacoby: what's an ongoing discussion?
>> how we're doing on minimizing
the bad.
>> jacoby: but we're dealing with such consequential issues, right? we're talking about integrity of our elections, we're talking about...
>> absolutely.
>> jacoby: ...in some cases, playing a role in a genocide. an ongoing conversation means what, exactly, about that? about a standard for success here?
>> i think, you know, this is the number-one priority for the company. mark has been out there, sheryl is out there, you're talking to me and a bunch of the other leaders. that's what we mean by having an ongoing conversation. this is something that we need to, as you said, this is serious, this is consequential. we take this extremely... like, we understand this responsibility, and it's not going away tomorrow.
>> jacoby: do you think facebook has earned the trust to be able to say, "trust us, we've got this"?
>> i'm not going to answer that, i'm sorry. that's just... i mean, that, everybody can make that decision for themselves.
>> jacoby: but what... do you trust them?
>> i trust the people who i worked with. i think there are some good people who are working on this. that doesn't mean i don't think we should pass laws to back that up.
>> it has not been a good week for facebook...
>> ...social media giant...
>> narrator: for facebook, the problems have been multiplying.
>> ...massive setback for facebook, the social media giant...
>> ...a massive cyber attack affecting nearly 50 million facebook users...
>> facebook continues to crack down on fake political ads and news...
>> narrator: but mark zuckerberg's quest to connect and change the world continues.
>> hey, everyone! hey. welcome to f8. this has been an intense year. i can't believe we're only four months in.
>> after all these scandals, facebook's profits just keep going up, right? so they don't really have a huge incentive to change the core problem, which is their business model.
>> we are announcing a new set of features coming soon...
>> they're not going to do it as long as they're doing so well
financially and there's no regulatory oversight. and consumer backlash doesn't really work, because i can't leave facebook-- all my friends and family around the world are there. you might not like the company, you might not like its privacy policies, you might not like the way its algorithm works, you might not like its business model, but what are you going to do?
>> now, there's no guarantee that we get this right. this is hard stuff. we will make mistakes, and they will have consequences, and we will need to fix them.
>> narrator: as he has since the beginning, he sees facebook, his invention, not as part of the problem, but the solution.
>> so if you believe, as i do, that giving people a voice is important, that building relationships is important, that creating a sense of community is important, and that doing the hard work of trying to bring the world closer together is important, then i say this: we will keep building. (cheers and applause) ♪
>> narrator: next time...
>> this is just the beginning.
>> narrator: the white power movement goes underground.
>> what do you think was going on in this house?
>> they were making bombs.
>> narrator: frontline and propublica investigate.
>> they are actively recruiting military members. does that surprise you?
>> go to pbs.org/frontline to read more about facebook from our partner, washington post reporter dana priest.
>> for facebook, the dilemma is, can they solve these serious problems without...
>> and later this month, as part of frontline's transparency project...
>> what i'm focused on is amplifying good and
minimizing the bad.
>> ...see more of the interviews in the film in context.
>> this isn't a problem that you solve, it's a problem that you contain.
>> connect to the frontline community at pbs.org/frontline.
>> frontline is made possible by contributions to your pbs station from viewers like you. thank you. and by the corporation for public broadcasting. major support is provided by the john d. and catherine t. macarthur foundation, committed to building a more just, verdant and peaceful world. more information is available at macfound.org. the ford foundation, working with visionaries on the front lines of social change worldwide. at fordfoundation.org. additional support is provided by the abrams foundation, committed to excellence in journalism. the park foundation, dedicated to heightening public awareness of critical issues. the john and helen glessner family trust. supporting trustworthy journalism that informs and inspires. the wyncote foundation. and by the frontline journalism fund, with major support from jon and jo ann hagler. and additional support from chris and lisa kaneb.
captioned by media access group at wgbh access.wgbh.org.
>> for more on this and other frontline programs, visit our website at pbs.org/frontline. ♪
>> to order frontline's "the facebook dilemma" on dvd, visit shoppbs.org or call 1-800-play-pbs. this program is also available on amazon prime video. ♪
>> you're watching pbs.
(upbeat music)
- broadcasting from the ktwu studios, the air command presents theater of the mind, radio you can see. return with us now to those thrilling radio days of yesteryear, and another margo mason mystery, "the call of the mummy." (dramatic music) 1941, egypt, the land of the pharaohs, and a newly discovered tomb.