
Frontline, PBS, October 30, 2018, 10:00pm-11:00pm PDT

>> narrator: tonight, frontline's investigation into facebook continues.
>> there is absolutely no company who has had so much influence on the information that americans consume.
>> narrator: he's the man who connected the world. but at what cost?
>> polarization was the key to the model.
>> narrator: the global threat...
>> this is an information ecosystem that just turns democracy upside down.
>> narrator: the 2016 election...
>> ...facebook getting over a billion political campaign posts.
>> narrator: and the company denials...
>> the idea that fake news on facebook influenced the election in any way, i think, is a pretty crazy idea.
>> ...facebook ceo mark zuckerberg will testify...
>> ...and i'm responsible for what happens here.
>> narrator: is facebook ready for the midterm elections?
>> there are a lot of questions heading into this midterm...
>> ...the midterm elections...
>> i still have questions if we're going to make sure that in 2018, 2020, this doesn't happen again.
>> narrator: tonight on frontline, part two of "the facebook dilemma".
>> frontline is made possible by contributions to your pbs station from viewers like you. thank you. and by the corporation for public broadcasting. major support is provided by the john d. and catherine t. macarthur foundation, committed to building a more just, verdant and peaceful world. more information is available at macfound.org. the ford foundation, working with visionaries on the front lines of social change worldwide. at fordfoundation.org. additional support is provided by the abrams foundation, committed to excellence in journalism. the park foundation, dedicated to heightening public awareness of critical issues. the john and helen glessner family trust. supporting trustworthy
journalism that informs and inspires. the wyncote foundation. and by the frontline journalism fund, with major support from jon and jo ann hagler. and additional support from chris and lisa kaneb.
>> i accept your nomination for president of the united states.
>> i humbly accept your nomination for the presidency of the united states.
>> hey, everyone. we are live from my backyard, where i am smoking a brisket and some ribs and getting ready for the presidential debate tonight.
>> some of the questions for tonight's debate will be formed by conversations happening on facebook.
>> 39% of people get their election news and decision-making material from facebook.
>> facebook getting over a billion political campaign posts.
>> i love this, all the comments that are coming in. it's, like, i'm sitting here, smoking these meats and, um, just hanging out with 85,000 people who are hanging out with me in my backyard.
>> make no mistake, everything you care about, everything i care about and i've worked for, is at stake.
>> i will beat hillary clinton, crooked hillary, i will beat her so badly, so badly.
>> i hope that all of you get out and vote. this is going to be an important one.
♪
>> tonight's broadcast will also include facebook, which has become a gathering place for political conversation.
(cheers and applause)
♪
>> thank you. thank you.
>> facebook is really the new town hall.
>> better conversations happen on facebook.
>> poke.
>> poke for a vote.
>> u.s.a.! u.s.a.!
>> hillary!
hillary!
>> facebook is the ultimate growth stock.
>> facebook is utterly dominating this new, mobile, digital economy.
>> have you been measuring political conversation on facebook, things like the most likes, interactions, shares.
>> hillary clinton has evaded justice.
>> i thank you for giving me the opportunity to, in my view, clarify.
>> 2016 is the social election.
>> facebook getting over a billion political campaign posts.
>> narrator: 2016 was a banner year for mark zuckerberg. his company had become one of the most popular and profitable in the world, despite an emerging dilemma: that, as it was connecting billions, it was inflaming divisions.
>> people were really forming these tribal identities on facebook, where you will see people getting into big fights.
>> narrator: we've been investigating warning signs that existed as facebook grew, and interviewing those inside the company who were there at the time.
>> we saw a lot of our numbers
growing like crazy, as did the rest of the media and the news world in particular. and so, as a product designer, when you see your products being used more, you're happy.
>> it's where we're seeing conversation happening about the election, the candidates, the issues.
>> narrator: amid all the political activity on facebook, no one used the platform more successfully than donald trump's digital media director, brad parscale.
>> i asked facebook, "i want to spend $100 million on your platform-- send a manual." they say, "we don't have a manual." i said, "well, send me a human manual, then."
>> james jacoby: and what does the manual provide?
>> you have a manual for your car. if you didn't have that for your car, there might be things you would never learn to use in your car, right? i spent $100 million on a platform, the most in history, it made sense for them to be there to help us make sure how we spent it right and did it right.
>> with custom audiences, you can get your ads to people you already know who are on facebook.
>> narrator: what facebook's representatives showed them was how to harness its powerful advertising tools to find and
target new and receptive audiences.
>> now i'll target my ad to friends of people who like my page.
>> what i recognized was the simple process of marketing. i needed to find the right people and the right places to show them the right message. what micro-targeting allows you to do is say, "well, these are the people that are most likely to show up to vote, and these are the right audiences we need to show up." the numbers were showing in the consumer side that people were spending more and more hours of their day consuming facebook content, so if you have any best place to show your content, it would be there. it was a place where their eyes were. that's where they were reading newspapers and doing things. and so we could get our message injected inside that stream. and that was a stream which was controlling the eyeballs of most places that we needed to win.
♪
>> narrator: it wasn't just politics. by this time, facebook was also dominating the news business.
>> 62% of americans say they get their news from social sites like facebook.
>> more than a dozen developers have worked with us to build social news apps, all with the
goal of helping you discover and read more news.
>> narrator: facebook's massive audience enticed media organizations to publish straight into the company's news feed-- making it one of the most important distributors of news in the world.
>> i'm personally really excited about this. i think that it has the potential to not only rethink the way that we all read news, but to rethink a lot of the way that the whole news industry works.
>> narrator: but unlike traditional media companies, facebook did not see itself as responsible for ensuring the accuracy of the news and information on its site.
>> the responsibilities that they should have taken on are what used to be called editing. and editors had certain responsibilities for what was going to show up on the first page versus the last page, the relative importance of things that don't relate purely to money and don't relate purely to popularity. so they took over the role of editing without ever taking on the responsibilities of editing.
>> narrator: instead, facebook's editor was its algorithm, designed to feed users whatever was most engaging to them. inside facebook, they didn't see that as a problem.
>> jacoby: was there a realization inside facebook as to what the responsibilities would be of becoming the main distributor of news?
>> i don't think there was a lot of thinking about that, that idea. i don't think there was any, any thought that news content in particular had, had more value or had more need for protection than any of the other pieces of content on facebook.
>> narrator: andrew anker was in charge of facebook's news products team, and is one of eight former facebook insiders who agreed to talk on camera about their experiences.
>> i was surprised by a lot of things when i joined facebook. and as someone who grew up in the media world, i expected there to be more of a sense of how people interact with media
and how important media can be to certain people's information.
(applause)
>> we have a video from davida from napoli.
>> no. (laughter) you know, we're a technology company. we're not a media company.
>> the fact that so many big, well-known news brands really pushed into facebook pretty aggressively legitimized it as a place to get, kind of, information. and i think that also strangely created the opportunity for people who weren't legitimate, as well. because if you're not legitimate, all you need to do is set up a website and then share links to it, and your stuff on facebook is going to look similar enough that you've just gotten a huge leg up.
>> hillary clinton is the most corrupt person...
>> narrator: but as the 2016 campaign heated up...
>> and i'll tell you, some of what i heard coming from my opponent...
>> narrator: ...reporter craig silverman was sounding alarms that facebook's news feed was spreading misinformation-- what he called "fake news."
>> fake news just seemed like the right term to use. and i was trying to get people to pay attention. i was trying to get journalists to pay attention. i was trying to also get facebook and other companies like twitter to pay attention to this, as well.
>> narrator: silverman traced misinformation back to some unusual places.
>> we started to see this small cluster of websites being run, the vast majority, from one town in macedonia.
>> how popular is it?
>> about 200 people, maybe.
>> 200 people?
>> yeah.
>> are making fake news websites?
>> yes.
>> most of them didn't really care about who won the election. they weren't in this for politics. if you put ads on these completely fake websites, and you've got a lot of traffic from facebook, that was a good way to make money.
>> there are some people who made, like, 200k or something like that.
>> 200,000 euros?
>> yeah, yeah, yeah.
>> i remember one guy, i think he was 15 or 16 years old, telling me, you know, "americans want to read about trump, so i'm writing trump stuff." trump earned them money. we saw macedonians publishing hillary clinton being indicted, the pope endorsing trump, hillary clinton selling weapons to isis, getting close to or above a million shares, likes, comments. that's an insane amount of engagement. it's more, for example, than when "the new york times" had a scoop about donald trump's tax returns. how is it that a kid in macedonia can get an article that gets more engagement than a scoop from "the new york times" on facebook?
>> jacoby: a headline during the campaign was "pope endorses trump," which was not true, but it went viral on facebook. was it known within facebook that that had gone viral?
>> um, i'm sure it was. i didn't necessarily know how viral it had gotten, and i certainly didn't believe that anybody believed it.
>> jacoby: but would that have been a red flag inside the
company, that something that's patently false was being propagated to millions of people on the platform?
>> i think if you asked the question that way, it would have been. but i think when you asked then the next question, which is the harder and the more important question, which is, "so what do you do about it?", you then very quickly get into issues of not only free speech, but to what degree is it anybody's responsibility, as a technology platform or as a distributor, to start to decide when you've gone over the line between something that is clearly false from something that may or may not be perceived by everybody to be clearly false and potentially can do damage?
>> jacoby: over the course of the 2016 election, there was a lot of news about misinformation. i mean, there was, famously, "the pope endorses trump." do you remember that?
>> absolutely. i wasn't working on the issues at the time, but, but absolutely i, i do remember it.
>> narrator: tessa lyons was chief of staff to facebook's
number two, sheryl sandberg, and is now in charge of fighting misinformation. she is one of five current officials facebook put forward to answer questions.
>> jacoby: was there any kind of sense of, like, "oh, my goodness, facebook is getting polluted with misinformation-- someone should do something about this"?
>> there certainly was, and there were people who were thinking about it. what i don't think there was a real awareness of, internally or externally, was the scope of the problem and the, the right course of action.
>> jacoby: how could it be surprising that, if you're becoming the world's information source, that there may be a problem with misinformation?
>> there was certainly awareness that there could be problems related to news or quality of news. and i think we all recognized afterwards that of all of the threats that we were considering, we focused a lot on threats that weren't misinformation and underinvested in this one.
♪
>> narrator: but there was another problem that was going unattended on facebook beyond misinformation.
>> one of the big factors that emerged in the election was what started to be called hyper-partisan facebook pages. these were facebook pages that kind of lived and died by really ginning up that partisanship-- "we're right, they're wrong." but not even just that, it was also "they're terrible people, and we're the best." and the facebook pages were getting tremendous engagement.
>> "a million migrants are coming over the wall, and they're going to, like, rape your children," you know? that stuff is doing well.
>> and the stuff that was true would get far less shares.
>> the development of these hyper-partisan sites i think turned the informational commons into this trash fire. and there's some kind of parable in that for the broader effects of facebook. that the very things that divide us most cause the most engagement.
>> (barking, laughing)
>> which means they go to the top of the news feed, which means the most people see them.
>> narrator: this worried an early facebook investor who was once close to zuckerberg.
>> i am an analyst by training and profession. and so, my job is to watch and interpret. at this point, i have a series of different examples that suggest to me that there is something wrong, systemically, with the facebook algorithms and business model. in effect, polarization was the key to the model. this idea of appealing to people's lower-level emotions, things like fear and anger, to create greater engagement, and in the context of facebook, more time on site, more sharing, and, therefore, more advertising value. i found that incredibly disturbing.
>> narrator: ten days before the election, mcnamee wrote zuckerberg and sandberg about his concerns.
>> i mean, what i was really trying to do was to help mark and sheryl get this thing right. and their responses were more or less what i expected, which is to say that what i had seen were
isolated problems, and that they had addressed each and every one of them. and i thought facebook could step up and say, "we're going to reassess our priorities. we're going to reassess the metrics on which we run the company to try to take into account the fact that our impact is so much greater now than it used to be. and that as facebook, as a company with, you know, billions of users, we have influence on how the whole social fabric works that no one's had before."
(cheers and applause)
>> i've just received a call from secretary clinton.
>> clinton has called trump to concede the election.
>> the clinton campaign is... really a somber mood here.
>> the crowd here at trump campaign headquarters...
>> narrator: trump's targeted ads on facebook paid off...
>> did things like facebook help
one of the nastiest elections ever?
>> narrator: ...leading to complaints that facebook helped to tilt the election...
>> facebook elected donald trump, that's basically...
>> narrator: ...which the trump campaign dismissed as anger over the results.
>> there has been mounting criticism of facebook...
>> no one ever complained about facebook for a single day until donald trump was president. the only reason anyone's upset about this is that donald trump is president and used a system that was all built by liberals. when i got on tv and told everybody after my interview of what we did at facebook, it exploded. the funny thing is, the obama campaign used it, then went on tv and newspapers, and they put it on the front of magazines, and the left and the media called them geniuses for doing that.
>> accusations that phony news stories helped donald trump win the presidency...
>> narrator: trump's victory put facebook on the spot.
>> facebook even promoted fake news into its trending...
>> narrator: and two days after the election, at a tech conference in northern california, zuckerberg spoke publicly about it for the first time.
>> well, you know, one of the things post-election, you've been getting a lot of pushback
from people who feel that you didn't filter out enough fake stories, right?
>> you know, i've seen some of the stories that you're talking about, around this election. there is a certain profound lack of empathy in asserting that the only reason why someone could have voted the way they did is because they saw some fake news. you know, personally, i think the, the idea that, you know, fake news on facebook, of which, you know, it's, it's a very small amount of, of, of the content, influenced the election in any way, i think, is a pretty crazy idea, right?
>> if i had been sitting there in an interview, i would have said, "you're lying." when he said, "we had no impact on the election," that... i remember reading that and being furious. i was, like, "are you kidding me?" like, "stop it." like, you cannot say that and not be lying. of course they had an impact, it's obvious. they were the most important distribution, news distribution. there are so many statistics
about that. like, i don't know how you possibly make that claim in public and with such a cavalier attitude. that infuriated me. and i texted everybody there, saying, "you're kidding me."
>> jacoby: is he not recognizing the importance of his platform in our democracy at that point in time?
>> yes, i think he didn't understand what he had built, or didn't care to understand or wasn't paying attention, and doesn't... they really do want to pretend, as they're getting in their private planes, as they're getting... going to their beautiful homes, as they're collecting billions of dollars, they never want to acknowledge their power. they're powerful, and they have... they don't.
>> thank you so much for being here.
>> thank you, guys.
>> i think it was very easy for all of us sitting in menlo park to not necessarily understand how valuable facebook had become. i don't think any of us, mark included, appreciated how much of an effect we might have had. and i don't even know today, two years later, or almost two years later, that we really understand how much of a true effect we had.
but i think more importantly, we all didn't have the information to be saying things like that at the time. my guess is, is that mark now realizes that there was a lot more to the story than, than he or any of us could have imagined at that point.
>> narrator: barely two months later, in washington, an even more serious situation was developing. american intelligence agencies were investigating russian interference in the election, and whether social media had played a role.
>> classical propaganda, disinformation, fake news.
>> does that continue?
>> yes. in my view, we only scratched the surface. i say "we," those that assembled the intelligence community assessment that we published on the 6th of january 2017, meaning nsa, c.i.a., fbi, and my office. but i will tell you, frankly, that i didn't appreciate the full magnitude of it until well after.
>> narrator: amid growing
scrutiny...
>> all right.
>> narrator: ...zuckerberg set out on a cross-country trip he publicized by streaming on facebook.
>> so i've been going around to different states for my personal challenge for the year to see how different communities are working across the country.
>> narrator: but while he was on the road, the news was getting worse.
>> the u.s. intelligence community officially is blaming russian president...
>> ...russian president vladimir putin ordered an influence campaign aimed at the presidential election.
>> narrator: zuckerberg's chief of security, alex stamos, had been asked to see what he could find on facebook's servers.
>> we kicked off a big look into the fake news phenomenon, specifically what component of that might have a russian part in its origin.
>> narrator: they traced disinformation to what appeared to be russian government-linked sources.
>> jacoby: so, what was it like bringing that news to
others in the company, and up to mark and sheryl, for instance?
>> you know, we had a big responsibility in the security team to educate the right people about what had happened without being kind of overly dramatic. it's kind of hard as a security person to balance that, right? like, everything seems like an emergency to you. but in this case it really was, right? this really was a situation in which we saw the tip of this iceberg, and we knew there was some kind of iceberg beneath it.
>> narrator: stamos expanded his investigation to look at how the russian operation may have also used facebook's targeted advertising system.
>> so what we did is, we then decided we're going to look at all advertising and see if we can find strange patterns that might link them to russian activity. so we enlisted huge parts of the company. we kind of dragooned everybody into one big, unified team. so you have people in a room working 70-, 80-hour weeks, hundreds of millions of pieces
of content, and by kind of a painstaking process of going through thousands and thousands of false positives, eventually found this large cluster that we were able to link to the internet research agency of st. petersburg.
>> narrator: it was one of the same groups that had been using facebook to spread disinformation in ukraine three years earlier. this time, using fake accounts, russian operatives had paid around $100,000 to run ads that promoted political messages and enticed people to join fake facebook groups.
>> what the internet research agency wants to do is, they want to create the appearance of legitimate social movements. so they would create, for example, a pro-immigration group and an anti-immigration group. and both of those groups would be almost caricatures of what those two sides think of each other. and their goal of running ads
was to find populations of people who are open to those kinds of messages, to get them into those groups, and then to deliver content on a regular basis to drive them apart. really what the russians are trying to do is to find these fault lines in u.s. society and amplify them, and to make americans not trust each other.
>> narrator: in september 2017, nearly a year after the election, zuckerberg announced on facebook what the company had found.
>> we are actively working with the u.s. government on its ongoing investigations into russian interference. we've been investigating this for many months now, and for a while, we had found no evidence of fake accounts linked to russian... linked to russia running ads. when we recently uncovered this activity, we provided that information to the special counsel. we also briefed congress. and this morning, i directed our team to provide the ads we've found to congress, as well.
>> we do know that facebook-related posts touched about 150 million americans,
that were posts that originated either through russian fake accounts or through paid advertising from the russians. but the paid advertising was really a relatively small piece of the overall problem. a much bigger problem was the ability for someone to say they were james in washington, dc, but it was actually boris in st. petersburg creating a fake persona that would generate followers, and then they would seed it with the fake information and the false news and the political content. one account was set up to try to rally the muslim community in, in texas. another was an attempt to kind of rally the right wing in texas. they created an event.
>> white power!
>> stop the hate! stop the fear!
>> a protest, with both sides protesting against each other.
>> (yelling)
>> at a mosque in houston, in 2016.
>> this is america, we have the right to speak out.
>> (yelling)
>> but for the good work of the houston police, you could have had the kind of horrible activity take place then and there that i saw unfortunately take place in charlottesville in my own state last year. so the real human consequences of some of these... of some of this abuse, we've been very lucky that it hasn't actually cost people's lives.
>> narrator: facebook also found that the russians had used the site to orchestrate a pro-trump rally outside of a cheesecake factory in florida and to promote an anti-trump protest in new york city just after the election.
>> hey, hey, ho, ho, donald has got to go...
>> we are under threat, and i need to defend the country that i love.
>> we are right in the middle of the protest.
>> narrator: the details of facebook's internal investigation set off alarm bells in washington.
>> we're such a ripe target for that sort of thing, and the russians know that.
so the russians exploited that divisiveness, that polarization, because they had, they had messages for everybody. you know, black lives matter, white supremacists, gun control advocates, gun control opponents, it didn't matter-- they had messages for everybody.
>> jacoby: did you think that was a pretty sophisticated campaign?
>> it was. and i believe the russians did a lot to get people out to vote that wouldn't have, and helped the appeal for... of donald trump.
>> jacoby: and the role that social media played in that was what?
>> it was huge.
>> i mean, it's really quite both ingenious and evil to attack a democratic society in that manner.
>> jacoby: but there were warning signs along the way in the trajectory of the company.
>> the company's been dealing with the negative side effects of its product for years, right? when you have two billion people on a communication platform, there's an infinite number of potentially bad things that could happen. the tough part is trying to decide where you're going to put your focus.
>> narrator: but by 2017, facebook was being accused of not focusing on other serious issues in developing, fragile democracies where the company had expanded its business-- countries like the philippines, where almost all internet users are on facebook, and problems had been mounting.
>> in a year, i probably met with more than 50 different officials, high-ranking officials, including mark zuckerberg. i wanted them to know what we were seeing, i wanted them to tell me what they thought about it, and i wanted them to fix it.
>> narrator: maria ressa, who runs a prominent news website, says she had been warning facebook since 2016 that president rodrigo duterte was using a network of paid followers and fake accounts to spread lies about his policies and attack his critics.
>> the u.n.'s branded his war a crime under international law.
>> narrator: especially critics of his brutal war on drugs, which has taken an estimated 12,000 lives.
>> human rights watch has called it government-sanctioned butchery.
>> president duterte was targeting anyone who questioned the drug war, anyone who questioned the alleged extrajudicial killings. anyone on facebook who questioned that would get brutally bashed. we're protected by the constitution, but we've been stripped of those protections online.
>> narrator: ressa herself would eventually come under attack.
>> there were attacks on the way i look, the way i sounded, that i should be raped, that i should be killed. we gave it a name: patriotic trolling. online, state-sponsored hate that is meant to silence, meant to intimidate. so this is an information ecosystem that just turns democracy upside-down.
>> jacoby: and where lies are prevalent.
>> where lies are truth.
>> narrator: she traced the disinformation to a network of 26 fake accounts and reported it to facebook at a meeting in singapore in august of 2016.
>> jacoby: what were you asking them to do?
>> exactly what every news group does, which is, take control and be responsible for what you create.
>> jacoby: were you given an explanation as to why they weren't acting?
>> no. no. i think facebook walked into the philippines, and they were focused on growth. what they didn't realize is that countries like the philippines...
>> (chanting)
>> ...countries where institutions are weak, where corruption is rampant, these countries don't have the safeguards. and what happens when you bring everyone onto a platform and do not exercise any kind of rules, right? if you don't implement those rules beforehand, you're going
to create chaos.
>> jacoby: there's a problem in the philippines, we've heard about from people on the ground there, that facebook has been to some degree weaponized by the duterte regime there. what are you doing to, to stop this problem in the philippines?
>> one thing we're trying to do, any time that we think there might be a connection between violence on the ground and online speech, the first thing for us to do is actually understand the landscape.
>> narrator: monika bickert is facebook's head of global policy and worked for the justice department in southeast asia.
>> there's a fundamental question, which is, "what should our role be, and as we are identifying misinformation, should we be telling people what we're finding, should we be removing that content, should we be down-ranking that content?" and we now have a team that is focused on how to deal with exactly that sort of situation.
>> narrator: in april, facebook created a news verification program and hired ressa's organization as one of its fact-checkers, though she says
the problems are ongoing. the company ultimately took down the accounts ressa identified-- and just last week, removed dozens more.
>> i think what is happening is that this company is way in over its head in terms of its responsibilities. it's way in over its head in terms of what power it holds. the idea isn't that it's just like you magically add facebook and horrible things happen, but you have facebook as this effective gasoline to simmering fires.
>> (shouting)
>> narrator: elsewhere in the region...
>> buddhists are inciting hatred and violence against muslims through social media...
>> narrator: ...facebook was also being used to fan ethnic tensions with even more dire consequences.
>> violence between buddhists and muslims is continuing.
>> misinformation, disinformation, rumors, extremist propaganda, all kinds of bad content.
>> narrator: for several years, david madden, a tech
entrepreneur living in myanmar, as well as journalists and activists, had been warning facebook that the muslim minority there was being targeted with hate speech.
>> (speaking local language)
>> you could see the use of memes, of images, things that were degrading and dehumanizing, targeting the muslim community.
>> (speaking local language)
>> narrator: the warning signs had been present as far back as 2014, when a fake news story spread on facebook.
>> reports, later proved to be false, that some muslim men had raped a buddhist woman, were shared on facebook.
>> an angry mob of about 400 surrounded the sun teashop, shouting and throwing bricks and stones.
>> narrator: two people died in the incident.
>> one buddhist and one muslim were killed in riots today.
>> i was really concerned that the seriousness of this was not understood. and so i made a presentation at
facebook headquarters in may of 2015. i was pretty explicit about the state of the problem. i drew the analogy with what had happened in rwanda, where radios had played a really key role in the execution of its genocide. and so i said, "facebook runs the risk of being in myanmar what radios were in rwanda." that this platform could be used to foment hate and to incite violence.
>> jacoby: what was the reaction to that at facebook?
>> i got an email shortly after that meeting to say that what had been discussed at that meeting had been shared internally and apparently taken very seriously.
>> narrator: the violence intensified.
>> massive waves of violence that displaced over 150,000 people.
>> narrator: and in early 2017, madden and other local activists had another meeting with facebook.
>> the objective of this meeting was, was really to be
crystal-clear about just how bad the problem was, that the processes that they had in place to try to identify and pull down problematic content, they just weren't working. and we were deeply concerned that something even worse was going to happen imminently. it was a sobering meeting. i think... i think the main response from facebook was, "we'll need to go away and dig into this and come back with something substantive." the thing was, it never came.
>> and how do you know that?
>> we can look at the evidence on the ground.
>> what we've seen here tells us a story of ethnic cleansing, of driving muslims out of myanmar.
>> narrator: the united nations would call the violence in myanmar a genocide, and found social media, and facebook in particular, had played a
significant role.
>> ultra-nationalist buddhists have their own facebooks and are really inciting a lot of violence and hatred against ethnic minorities. facebook has now turned into a beast, and not what it was originally intended to be.
>> jacoby: i'm curious what it's like when the u.n. comes out with a report that says that facebook played a significant role in a genocide, when you're running content policy at facebook?
>> well, this would be important to me even if i didn't work at facebook, given my background. my background is as a federal prosecutor, and i worked specifically in asia and specifically on violent crimes against people in asia. so something like that really hits home to me.
>> jacoby: facebook was warned as early as 2015 about the potential for a really dangerous situation in myanmar.
what went wrong there? why was it so slow?
>> we met with civil society organizations in myanmar far before 2015. this is an area where we've been focused. i think what we've learned over time is, it's important for us to build the right technical tools that can help us find some of this content and also work with organizations on the ground in a real-time fashion. we are in the process of building those relationships around the world on a much deeper level, so that we can stay ahead of any kind of situation like that.
>> narrator: in the past year, facebook says it's taken down problematic accounts in myanmar, hired more language experts, and improved its policies.
>> jacoby: should there be any liability or any legal accountability for a company like facebook when something so disastrous goes wrong on your platform?
>> there's all sorts of accountability. but probably the group that holds us the most accountable are the people using the service. if it's not a safe place for them to come and communicate, they are not going to use it.
>> we are working here in menlo park in palo alto, california. to the extent that some of these issues and problems manifest in other countries around the world, we didn't have sufficient information and a pulse on what was happening in southeast asia.
>> narrator: naomi gleit is facebook's second-longest-serving employee.
>> and so one change that we've made, along with hiring so many more people, is that a lot of these people are based internationally and can give us that insight that we may not get from being here at headquarters.
>> jacoby: i'm trying to understand, you know, the choices that are made. do you regret choices going backward, decisions that were made about not taking into account risks or not measuring risks?
>> yeah, i definitely think we regret not having 20,000 people working on safety and security back in the day, yes. i regret that we were too
slow, that it wasn't our priority.
>> jacoby: but were those things considered at the time? to kind of amp up safety and security, but there was some reason not to, or...
>> not really. i mean, we had a safety and security team. i think we just thought it was sufficient. it's not that we were, like, "well, we could do so much more here," and decided not to. i think we... we just didn't... again, we were just a bit idealistic.
>> facebook has created this platform that in many countries, not just myanmar, has become the dominant information platform, and it has an outsized influence in lots of countries. that comes with a lot of responsibility.
>> using social media, rumors of alleged muslim wrongdoing spread fast.
>> many of those countries are wrestling with some pretty big
challenges: tensions between groups within countries. and we have seen this explode into what mark zuckerberg would call real-world harm, what others would just call violence or death, in many other markets. we're seeing it right now in india.
>> kalloo became a victim of india's fake news.
>> we've seen examples of this in places like sri lanka.
>> to keep the violence from spreading, sri lanka also shut down facebook...
>> the myanmar example should be sounding an alarm at the highest level of the company, that this requires a comprehensive strategy.
>> narrator: but it would be far from myanmar, and a very different kind of problem, that would cause an international uproar over facebook.
>> cambridge analytica and its mining of data on millions of americans for political purposes...
>> cambridge is alleged to have used all this data from tens of millions of facebook users...
>> escándalo cambridge analytica, facebook...
>> (reporters speaking different
languages)
>> narrator: it was a scandal over how facebook failed to protect users' data, exposed by a whistleblower named christopher wylie.
>> christopher wylie, he was able to come forward and say, "i can prove this."
>> narrator: he said facebook knew that a political consulting firm he worked for, cambridge analytica, had been using the personal data of more than 50 million users to try to influence voters.
>> at cambridge analytica, we are creating the future of political campaigning.
>> this is a company that specializes and would advertise itself as specializing in rumor campaigns.
>> political campaigns have changed.
>> seeding the internet with misinformation.
>> putting the right message in front of the right person at the right moment.
>> and that's the power of data. you can literally figure out who are the people who are most susceptible.
>> ...data about personality, so you know exactly who to target...
>> narrator: the firm gained access to the data from a third party, without facebook's permission.
>> the overwhelming majority of people who had their data
collected did not know. when data leaves facebook's servers, there is no way for facebook to track that data to know how that data is being used or to find out how many copies there are.
>> narrator: facebook eventually changed its data sharing policies and ordered cambridge analytica to delete the data.
>> we know that facebook had known about this...
>> narrator: after wylie came forward, they banned the firm from their site, and announced they were ending another controversial practice: working directly with companies known as data brokers. but the uproar was so intense that in april 2018, mark zuckerberg was finally called before congress, in what would become a reckoning over facebook's conduct, its business model, and its impact on democracy.
>> we welcome everyone to today's hearing on facebook's social media privacy and the use and abuse of data. i now turn to you, so proceed, sir.
>> we face a number of important issues around privacy, safety, and democracy. and you will rightfully have some hard questions for me to answer. facebook is an idealistic and optimistic company. and as facebook has grown, people everywhere have gotten a powerful new tool for making their voices heard and for building communities and businesses. but it's clear now that we didn't do enough to prevent these tools from being used for harm, as well. and that goes for fake news, for foreign interference in elections, and hate speech, as well as developers and data privacy. we didn't take a broad enough view of our responsibility, and that was a big mistake. and it was my mistake. and i'm sorry.
>> if, like me, you're following this stuff, you see years and
years and years of people begging and pleading with the company, saying, "please pay attention to this," at every channel people could find. and basically being ignored. "we hear you, you're concerned, we apologize, of course we have a responsibility, we'll do better." and the public record here is that they are a combination of unable and unwilling to grasp and deal with this complexity.
>> you may decide, or facebook may decide, it needs to police a whole bunch of speech, that i think america might be better off not having policed by one company that has a really big and powerful platform.
>> senator, i think that this is a really hard question. and i think it's one of the reasons why we struggle with it.
>> these are very, very powerful corporations. they do not have any kind of traditional democratic accountability. and while i personally know a lot of people making these decisions, if we set the norms
that these companies need to decide what, who does and does not have a voice online, eventually that is going to go to a very dark place.
>> when companies become big and powerful, there is an instinct to either regulate or break up, right?
>> i think we're finding ourselves now in a position where people feel like something should be done. there's a lot of questions what should be done, but there's no question that something should be done.
>> you don't think you have a monopoly?
>> it certainly doesn't feel like that to me.
>> okay.
>> you know, there's a lot of problems here, there, but all of these problems get worse when one company has too much power, too much information, over too many people.
>> narrator: after years of unchecked growth, the talk now is increasingly about how to rein in facebook. already, in europe, there's a new internet privacy law aimed at companies like facebook. inside the company, the people
we spoke to insisted that facebook is still a force for good.
>> jacoby: has there ever been a minute where you've questioned the mission? you know, internally? whether anyone has taken a second to step back and say, "all right, has this blinded us in some way?" have you had a moment like that?
>> i still continue to firmly believe in the mission, but in terms of stepping back, in terms of reflecting, absolutely. but that isn't on the mission. the reflection is really about, how can we do a better job of minimizing bad experiences on facebook?
>> jacoby: why wasn't that part of the metrics earlier? in terms of, how do you minimize the harm?
>> you know, it's possible that we could have done more sooner, and we haven't been as fast as we needed to be.
>> narrator: that line was repeated by all the current officials facebook put forward to answer questions.
>> we've been too slow to act on...
>> i think we were too slow...
>> we didn't see it fast enough.
>> we were too slow.
>> mark has said this, that we have been slow.
>> one of my greatest regrets in
running the company is that we were slow in identifying the russian information operations in 2016. and we're going to take a number of measures, from building and deploying new a.i. tools that take down fake news, to growing our security team to more than 20,000 people...
>> the goal here is to deep-dive on the market nuances there...
>> narrator: the company says it's now investing resources and talent to tackle a range of problems, from the spread of hate speech to election interference.
>> even if we can't do fact-checking, if we can do more work around the programmatic aspect of it...
>> narrator: this is part of the team tackling the spread of misinformation around the world, led by lyons.
>> the elections integrity team has a framework for how they're thinking about secondary languages in each country. and i feel like from the misinformation side, we've mostly prioritized primary languages.
>> narrator: it's a problem the company admits it is a long way from solving.
>> the next thing is about the arabic fact-checking project. i think the main blocker here is potentially getting a fact-checker that can cover an entire region.
>> you know, i came into this
job asking myself, "how long is it going to take us to solve this?" and the answer is, this isn't a problem that you solve. it's a problem that you contain.
>> awesome. next, segue into upcoming launches.
>> narrator: in advance of next week's midterms, facebook has mobilized an election team to monitor false news stories and delete fake accounts that may be trying to influence voters. nathaniel gleicher runs the team.
>> there are going to be actors that are going to try to manipulate that public debate. how do we figure out what are the techniques they're using and how do we make it much harder?
>> jacoby: is there going to be real-time monitoring on election day of what's going on on facebook, and how are you going to actually find things that may sow distrust in the election?
>> absolutely, we're going to have a team on election day focused on that problem, and one thing that's useful here is, we've already done this in other elections.
>> jacoby: and you're confident you can do that here?
>> i think that... yes, i'm confident that we can do this here.
>> narrator: gleicher says his team continues to find foreign actors using the platform to spread disinformation.
>> iran was revealed to be a new player in worldwide disinformation campaigns, and on top of this...
>> narrator: and less than two weeks ago, federal prosecutors announced they'd found evidence that russian operatives had been trying to interfere in next week's election.
>> jacoby: what is the standard that the public should hold facebook to, in terms of solving some of these seemingly enormous problems?
>> i think the standard, the responsibility, what i'm focused on, is amplifying good and minimizing the bad. and we need to be transparent about what we're doing on both sides, and, you know, i think this is an ongoing discussion.
>> jacoby: what's an ongoing discussion?
>> how we're doing on minimizing
the bad.
>> jacoby: but we're dealing with such consequential issues, right? we're talking about integrity of our elections, we're talking about...
>> absolutely.
>> jacoby: ...in some cases, playing a role in a genocide. an ongoing conversation means what, exactly, about that? about a standard for success here?
>> i think, you know, this is the number-one priority for the company. mark has been out there, sheryl is out there, you're talking to me and a bunch of the other leaders. that's what we mean by having an ongoing conversation. this is something that we need to... as you said, this is serious, this is consequential. we take this extremely... like, we understand this responsibility, and it's not going away tomorrow.
>> jacoby: do you think facebook has earned the trust to be able to say, "trust us, we've got this"?
>> i'm not going to answer that, i'm sorry. that's just... i mean, that, everybody can make that decision for themselves.
>> jacoby: but what... do you trust them?
>> i trust the people who i worked with. i think there are some good
people who are working on this. that doesn't mean i don't think we should pass laws to back that up.
>> it has not been a good week for facebook...
>> ...social media giant...
>> narrator: for facebook, the problems have been multiplying.
>> ...massive setback for facebook, the social media giant...
>> ...a massive cyber attack affecting nearly 50 million facebook users...
>> facebook continues to crack down on fake political ads and news...
>> narrator: but mark zuckerberg's quest to connect and change the world continues.
>> hey, everyone! hey. welcome to f8. this has been an intense year. i can't believe we're only four months in.
>> after all these scandals, facebook's profits are just still going up, right? so they don't really have a huge incentive to change the core problem, which is their business model.
>> we are announcing a new set of features coming soon...
>> they're not going to do it as long as they're doing so well
financially and there's no regulatory oversight. and consumer backlash doesn't really work, because i can't leave facebook-- all my friends and family around the world are there. you might not like the company, you might not like its privacy policies, you might not like the way its algorithm works, you might not like its business model, but what are you going to do?
>> now, there's no guarantee that we get this right. this is hard stuff. we will make mistakes, and they will have consequences, and we will need to fix them.
>> narrator: as he has since the beginning, he sees facebook, his invention, not as part of the problem, but the solution.
>> if you believe, like i do, that giving people a voice is important, that building relationships is important, that creating a sense of community is important, and that doing the hard work of trying to bring the world closer together is important, then i say we
will keep building.
(cheers and applause)
♪
>> narrator: next time...
>> this is just the beginning.
>> narrator: the white power movement goes underground.
>> what do you think was going on in this house?
>> man: making bombs.
>> narrator: frontline and propublica investigate.
>> they are actively recruiting military members. does that surprise you?
>> go to pbs.org/frontline to read more about facebook from our partner, washington post reporter dana priest.
>> for facebook the dilemma is can they solve these serious problems without...
>> and later this month, as part of frontline's transparency project...
>> what i'm focused on is
amplifying good and minimizing the bad.
>> ...see more of the interviews in the film in context.
>> this isn't a problem that you solve, it's a problem that you contain.
>> connect to the frontline community at pbs.org/frontline.
>> frontline is made possible by contributions to your pbs station from viewers like you. thank you. and by the corporation for public broadcasting. major support is provided by the john d. and catherine t. macarthur foundation, committed to building a more just, verdant and peaceful world. more information is available at macfound.org. the ford foundation, working with visionaries on the front lines of social change worldwide. at fordfoundation.org. additional support is provided by the abrams foundation, committed to excellence in journalism. the park foundation, dedicated to heightening public awareness of critical issues. the john and helen glessner family trust. supporting trustworthy journalism that informs and inspires. the wyncote foundation.
and by the frontline journalism fund, with major support from jon and jo ann hagler. and additional support from chris and lisa kaneb.
captioned by media access group at wgbh access.wgbh.org
>> for more on this and other frontline programs, visit our website at pbs.org/frontline.
♪
>> to order frontline's "the facebook dilemma" on dvd, visit shoppbs.org or call 1-800-play-pbs. this program is also available on amazon prime video.
♪
>> you're watching pbs.