
tv   Frontline  PBS  December 18, 2018 10:00pm-11:00pm PST

10:00 pm
>> narrator: tonight, frontline's investigation into facebook continues.
>> there is absolutely no company who has had so much influence on the information that americans consume.
>> narrator: he's the man who connected the world. but at what cost?
>> polarization was the key to the model.
>> narrator: the global threat...
>> this is an information ecosystem that just turns democracy upside down.
>> narrator: the 2016 election...
>> ...facebook getting over a billion political campaign posts.
>> narrator: and the company denials...
>> the idea that fake news on facebook influenced the election in any way is a pretty crazy idea.
>> ...facebook ceo mark zuckerberg will testify...
>> ...and i'm responsible for what happens here.
10:01 pm
>> narrator: can facebook be fixed?
>> in light of recent revelations that the company may have covered up russian interference in the 2016 election...
>> the problem is too big, because facebook is too big.
>> narrator: tonight on frontline, part two of "the facebook dilemma."
>> frontline is made possible by contributions to your pbs station from viewers like you. thank you. and by the corporation for public broadcasting. major support is provided by the john d. and catherine t. macarthur foundation, committed to building a more just, verdant and peaceful world. more information is available at macfound.org. the ford foundation, working with visionaries on the front lines of social change worldwide. at fordfoundation.org. additional support is provided by the abrams foundation, committed to excellence in journalism. the park foundation, dedicated to heightening public awareness of critical issues. the john and helen glessner family trust, supporting trustworthy
10:02 pm
journalism that informs and inspires. and by the frontline journalism fund, with major support from jon and jo ann hagler. corporate support is provided by...
>> the zip code you're born into can determine your future, your school, your job, your dreams, your problems... at the y, our goal is to create opportunities no matter who you are or where you're from. the y, for a better us.
>> i accept your nomination for president of the united states.
>> i humbly accept your
10:03 pm
nomination for the presidency of the united states.
>> hey, everyone. we are live from my backyard, where i am smoking a brisket and some ribs and getting ready for the presidential debate tonight.
>> some of the questions for tonight's debate will be formed by conversations happening on facebook.
>> 39% of people get their election news and decision-making material from facebook.
>> facebook getting over a billion political campaign posts.
>> i love this, all the comments. it's, like, i'm sitting here, smoking these meats and, um, and just hanging out with 85,000 people who are hanging out with me in my backyard.
>> make no mistake, everything you care about, everything i care about and i've worked for, is at stake.
>> i will beat hillary clinton, crooked hillary, i will beat her badly, so badly.
>> i hope that all of you get out and vote.
10:04 pm
this is going to be an important one. ♪
>> ...include facebook, which has also become a gathering place for political conversation. (cheers and applause) ♪
>> thank you. thank you.
>> facebook is really the new town hall.
>> better conversations happen on facebook.
>> poke for a vote.
>> poke for a vote.
>> u.s.a.! u.s.a.!
>> hillary! hillary!
>> facebook is the ultimate growth stock.
>> facebook is utterly dominating this new, mobile, digital economy.
>> have you been measuring political conversation on facebook, things like the most likes, interactions, shares.
>> hillary clinton has evaded justice.
>> i thank you for giving me the opportunity to, in my view, clarify.
>> 2016 is the social election.
>> facebook getting over a billion political campaign posts.
>> narrator: 2016 began as a
10:05 pm
banner year for mark zuckerberg. his company had become one of the most popular and profitable in the world, despite an emerging dilemma that, as it was connecting billions, it was inflaming divisions.
>> people are really forming these tribal identities on facebook, where you will see people getting into big fights.
>> narrator: we've been investigating warning signs that existed as facebook grew, and interviewing those inside the company who were there at the time.
>> we saw a lot of our numbers growing like crazy, as did the rest of the media and the news world in particular. and so as a product designer, when you see your products being used more, you're happy.
>> it's where we're seeing conversation happening about the election, the candidates, the issues.
>> narrator: amid all this political activity on facebook, no one used the platform more successfully than donald trump's digital media director, brad parscale.
>> i asked facebook, "i want to spend $100 million on your platform-- send me a manual."
10:06 pm
they say, "we don't have a manual." i say, "well, send me a human manual, then." >> james jacoby: and what does the manual provide? >> you have a manual for your car. if you didn't have that for your car, there might be things you would never learn how to use in your car, right? i spent $100 million on a y,atform, the most in hist it made sense for them to be there to help us make sure how we spent it right and did it right. >> with custom audnces, you can get your ads to people you already know who are on facebook. >> narrator: what facebook's representatives showed them was how to harness its powerful advertising tools to find and target new and receptive audiences. >> now il target my ad to friends of people who like my page. >> what i recognized was the simple process of marketing. i needed to find the right people andhe right places to show them the right message. micro-targeting allows you to do is say, "well, these are the y people that are most lik show up to vote, and these are the right audiences we need to show up." the numbers were showing in the consumer side that people were spending more and more hours of their day consuming cebook content, so if you have any best place to show your content,
10:07 pm
it would be there. it was a place where their eyes were. that's where they were reading their local newspaper and doing things. and so we could get our message injected inside that stream. and that was a stream which was controlling the eyeballs of most places that we needed to win. ♪
>> narrator: it wasn't just politics. by this time, facebook was also dominating the news business.
>> 62% of americans say they get their news from social media sites like facebook.
>> more than a dozen developers have worked with us to build social news apps, all with the goal of helping you discover and read more news.
>> narrator: facebook's massive audience enticed media organizations to publish straight into the company's news feed-- making it one of the most important distributors of news in the world.
>> i'm personally really excited about this. i think it has the potential to not only rethink the way that we all read news, but to rethink a lot of the way that the whole news industry works.
>> narrator: but unlike traditional media companies, facebook did not see itself
10:08 pm
as responsible for ensuring the accuracy of the news and information on its site.
>> they should have taken on what used to be called editing. and editors had certain responsibilities for what was going to show up on the first page versus the last page, the relative importance of things that don't relate purely to money and don't relate purely to popularity. so they took over the role of editing without ever taking on the responsibilities of editing.
>> narrator: instead, facebook's editor was its algorithm, designed to feed users whatever was most engaging to them. inside facebook, they didn't see that as a problem.
>> jacoby: was there a realization inside facebook as to what the responsibilities would be of becoming the main distributor of news?
>> i don't think there was a lot of thinking about that, that idea. i don't think there was any, any
10:09 pm
thought that news content in particular had, had more, or had more need for protection, than any of the other pieces of content on facebook.
>> narrator: andrew anker was in charge of facebook's news products team, and is one of eight former facebook insiders who agreed to talk on camera about their experiences.
>> i expected a lot of things when i joined facebook. and as someone who grew up in the media world, i expected there to be more of a sense of how people interact with media and how important media can be to certain people's information diet. (applause)
>> we have a video from davida from napoli.
>> (laughter) you know, we're a technology company. we're not a media company.
>> the fact that so many well-known news brands really pushed into facebook pretty
10:10 pm
aggressively legitimized it as a place to get, kind of, information. and i think that also strangely created opportunity for people who weren't legitimate, as well. because if the legitimate players are there and you're not legitimate, all you need to do is set up a website and then share links to it, and your stuff on facebook is going to look similar enough that you've just gotten a huge leg up.
>> hillary clinton is the most corrupt person...
>> narrator: but as the 2016 campaign heated up...
>> and i'll tell you, some of what i heard coming from my opponent...
>> narrator: ...reporter craig silverman was sounding alarms that facebook's news feed was spreading misinformation-- what he called "fake news."
>> fake news just seemed like the right term to use. and i was trying to get people to pay attention. i was trying to get journalists to pay attention. i was trying to also get facebook and other companies like twitter to pay attention to this, as well.
>> narrator: silverman traced the misinformation back to some unusual places.
>> we started to see this small
10:11 pm
cluster of websites being run, the vast majority, from one town in macedonia.
>> how popular is it?
>> about 200 people, maybe.
>> 200 people?
>> yeah.
>> are making fake news websites?
>> yes.
>> most of them didn't really care about who won the election. they weren't in this for politics. if you put ads on these completely fake websites, and you've got a lot of traffic from facebook, that was a good way to make money.
>> there are some people who made, like, 200k or something like that.
>> 200,000 euros?
>> yeah, yeah, yeah.
>> i remember one guy, i think he was 15 or 16 years old, telling me, you know, "americans want to read about trump, so i'm writing trump stuff." trump earned them money. we saw macedonians publishing hillary clinton being indicted, the pope endorsing trump, hillary clinton selling weapons to isis, getting close to or above a million shares, likes, comments. that's an insane amount of engagement. it's more, for example, than when "the new york times" had a scoop about donald trump's tax returns.
10:12 pm
how is it that a kid in macedonia can get an article that gets more engagement than a scoop from "the new york times" on facebook?
>> jacoby: a headline during the campaign was "pope endorses trump," which was not true, but it went viral on facebook. was it known within facebook that that had gone viral?
>> um, i'm sure it was. i didn't necessarily know how viral it had gotten, and i certainly didn't believe that anybody believed it.
>> jacoby: but would that have been a red flag inside the company, that something that's patently false was being propagated to millions of people on the platform?
>> i think if you asked the question that way, it would have been. but i think when you asked then the next question, which is the harder and the more important question, which is, "so what do you do about it?", you then very quickly get into issues of not only free speech, but to what degree is it anybody's responsibility, as a technology platform or as a distributor, to start to decide when
10:13 pm
you've gone over the line between something that is clearly false and something that may or may not be perceived by everybody to be clearly false and potentially can do damage?
>> jacoby: over the course of the 2016 election, there was a lot of news about misinformation. i mean, there was, famously, "the pope endorses trump." do you remember that?
>> absolutely. i, i wasn't working on these issues at the time, but, but absolutely i, i do remember it.
>> narrator: tessa lyons was chief of staff to facebook's number two, sheryl sandberg, and is now in charge of fighting misinformation. she is one of five current officials facebook put forward to answer questions.
>> jacoby: was there any kind of sense of, like, "oh, my goodness, facebook is getting polluted with misinformation-- someone should do something about this"?
>> there certainly was, and there were people who were thinking about it. what i don't think there was a real awareness of, internally or externally, was the scope of the problem and the, the right course of action.
10:14 pm
>> jacoby: how could it be surprising that, if you're becoming the world's information source, that there may be a problem with misinformation?
>> there was certainly awareness that there could be problems related to news or quality of news. and i think we all recognized afterwards that of all of the threats that we were considering, we focused a lot on threats that weren't misinformation and underinvested in this one. ♪
>> narrator: but there was another problem that was going unattended on facebook beyond misinformation.
>> one of the big factors that emerged in the election was what started to be called hyper-partisan facebook pages. these were facebook pages that kind of lived and died by really ginning up that partisanship-- "we're right, they're wrong." but not even just that, it was also, "they're terrible people, and we're the best." and the facebook pages were getting tremendous engagement.
>> "a million migrants are coming over the wall, and they're going to, like, rape your children," you know?
10:15 pm
that stuff is doing well.
>> and the stuff that was true would get far less shares.
>> the development of these hyper-partisan sites i think turned the informational commons into this trash fire. and there's some kind of parable in that for the broader... that the very things that divide us most cause the most engagement. >> (barking, laughing)
>> which means they go to the top of the news feed, which means the most people see them.
>> narrator: this worried roger mcnamee, an early facebook investor who was once close to zuckerberg.
>> i am an analyst by training and profession. and so, my job is to watch and interpret. at this point, i have a series of different examples that suggest to me that there is something wrong, systemically, with the facebook algorithms and business model. in effect, polarization was the key to the model. this idea of appealing to people's lower-level emotions,
10:16 pm
things like fear and anger, to create greater engagement, and in the context of facebook, more time on site, more sharing, and therefore more advertising value. i found that incredibly disturbing.
>> narrator: ten days before the election, mcnamee wrote zuckerberg and sandberg about his concerns.
>> i mean, what i was really trying to do was to help mark and sheryl get this thing right, and their responses were more or less what i expected, which is to say that what i had seen were isolated problems, and that they had addressed each and every one of them. i thought facebook could stand up and say, "we're going to reassess our priorities. we're going to reassess the company to try to take into account the fact that our impact is so much greater now than it used to be. and that as facebook, as a company with, you know, billions of users, we have influence on
10:17 pm
how the whole social fabric works that no one's had before." (cheers and applause)
>> i've just received a call from secretary clinton.
>> clinton has called trump to concede the election.
>> the clinton campaign is... really a somber mood here.
>> the crowd here at trump campaign headquarters...
>> narrator: trump's targeted ads on facebook paid off...
>> did things like facebook help one of the nastiest elections ever?
>> narrator: ...leading to complaints that facebook helped tilt the election...
>> facebook elected donald trump, that's basically...
>> narrator: ...which the trump campaign dismissed as anger over the results.
>> there has been mounting criticism of facebook...
>> no one ever complained about facebook for a single day until donald trump was president. the only reason anyone's upset about this is that donald trump is president and used a system that was all built by liberals. when i got on tv and told everybody after my interview of what we did at facebook, it exploded.
10:18 pm
the funny thing is, the obama campaign used it, then went on tv and newspapers, and they put it on the front of magazines, and the left and the media called them geniuses for doing that.
>> accusations that phony news stories helped donald trump win the presidency...
>> narrator: trump's victory put facebook on the spot.
>> facebook even promoted fake news into its trending...
>> narrator: and two days after the election, at a technology conference in northern california, zuckerberg spoke publicly about it for the first time.
>> well, you know, one of the things post-election, you've been getting a lot of pushback from people who feel that you didn't filter out enough fake stories, right?
>> you know, i've seen some of the stories that you're talking about, around this election. there is a certain profound lack of empathy in asserting that the only reason why someone could have voted the way they did is because they saw some fake news. you know, personally, i think the, the idea that, you know, fake news on facebook, of
10:19 pm
which, you know, it's, it's a very small amount of, of, of the content, influenced the election in any way, i think, is a pretty crazy idea, right?
>> if i had been sitting there in an interview, i would have said, "you're lying." when he said, "we had no impact on the election," that... i remember reading that and being furious. i was, like, "are you kidding me?" like, "stop it." like, you cannot say that and not be lying. of course they had an impact, it's obvious. they were the most important distribution, news distribution. there are so many statistics about that. like, i don't know how you could possibly make that claim in public and with such a cavalier attitude. that infuriated me. and i texted everybody there, saying, "you're kidding me."
>> jacoby: is he not recognizing the importance of his platform in our democracy at that point in time?
>> yes, i think he didn't understand what he had built, or didn't care to understand or wasn't paying attention, and doesn't. they really do want to pretend, as they're getting on their private planes, as they're getting... going to their beautiful homes, as they're collecting billions of
10:20 pm
dollars, they never want to acknowledge their power. they're powerful, and they have... they don't.
>> thank you so much for being here.
>> thank you, guys.
>> i think it was very easy for all of us sitting in menlo park to not necessarily understand how valuable facebook had become. i don't think any of us, mark included, appreciated how much of an effect we might have had. and i don't even know today, two years later, or almost two years later, that we really understand how much of a true effect we had. but i think more importantly, we all didn't have the information to be saying things like that at the time. my guess is, is that mark now realizes that there was a lot more to the story than, than he or any of us could have imagined at that point.
>> narrator: barely two months later, in washington, an even more serious situation was developing. intelligence agencies were investigating russian interference in the election and whether social media had
10:21 pm
played a role.
>> classical propaganda, disinformation, fake news. does that continue?
>> yes. in my view, we only scratched the surface. i say "we," those that assembled the intelligence community assessment that we published on the 6th of january 2017, meaning nsa, c.i.a., fbi, and my office. but i will tell you, frankly, that i didn't appreciate the full magnitude of it until well after.
>> narrator: amid growing scrutiny...
>> all right.
>> narrator: ...zuckerberg set out on a cross-country trip he publicized by streaming on facebook.
>> so i've been going around to different states for my personal challenge for the year, to see how different communities are working across the country.
>> narrator: but while he was on the road, the news was getting worse.
>> the u.s. intelligence community officially is blaming russian president...
>> ...russian president vladimir putin ordered an influence campaign aimed at the
10:22 pm
presidential election.
>> narrator: zuckerberg's chief of security, alex stamos, had been asked to see what he could find on facebook's servers.
>> we kicked off a big look into the fake news phenomenon, specifically what component of that might have a russian origin.
>> narrator: they traced disinformation to what appeared to be russian government-linked sources.
>> jacoby: so, what was it like bringing that news to others in the company, and up to mark and sheryl, for instance?
>> you know, we had a big responsibility in the security team to educate the right people about what happened without being kind of overly dramatic. it's kind of hard as a security person to balance that, right? like, everything seems like an emergency to you. but in this case it really was, right? this really was a situation in which we saw the tip of this iceberg, and we knew there was some kind of iceberg beneath it.
>> narrator: stamos expanded his investigation to look at how the russian operation may have
10:23 pm
also used facebook's targeted advertising system.
>> so what we did is, we then decided we're going to look at all advertising and see if we can find any strange patterns that might link them to russian activity. so we enlisted a huge part of the company. we kind of dragooned everybody into one big, unified team. so you have people in a war room working 70-, 80-hour weeks, billions of dollars of ads, hundreds of millions of pieces of content, and by kind of a painstaking process of going through false positives, eventually found this large cluster that we were able to link to the internet research agency of st. petersburg.
>> narrator: it was one of the same groups that had been using facebook to spread disinformation in ukraine three years earlier. this time, using fake accounts, russian operatives had paid around $100,000 to run ads that
10:24 pm
promoted political messages and enticed people to join fake facebook groups.
>> what the internet research agency wants to do is, they want to create the appearance of legitimate social movements. so they would create, for example, a pro-immigration group and an anti-immigration group. and both of those groups would be almost caricatures of what those two sides think of each other. and their goal of running ads were to find populations of people who are open to those kinds of messages, to get them into those groups, and then to deliver content on a regular basis to drive them apart. really what the russians are trying to do is to find these fault lines in u.s. society and amplify them, and to make americans not trust each other.
>> narrator: in september 2017, nearly a year after the election, zuckerberg announced on facebook what the company had found.
10:25 pm
>> we are actively working with the u.s. government on its ongoing investigations into russian interference. we've been investigating this for many months now, and for a while, we had found no evidence of fake accounts linked to russian... linked to russia running ads. when we recently uncovered this activity, we provided that information to the special counsel. we also briefed congress. and this morning, i directed our team to provide the ads we've found to congress, as well.
>> we do know that facebook-related posts touched about 150 million americans, that were posts that originated either through russian fake accounts or through paid advertising from the russians. but the paid advertising was really a relatively small piece of the overall problem. a much bigger problem was the ability for someone to say they were james in washington, d.c., but it was actually boris in st. petersburg creating a fake persona that would generate followers, and then they would seed it with the fake information and the false news
10:26 pm
and the political content. one account was set up to try to rally the muslim community in, in texas. another was an attempt to kind of rally the right wing in texas. they created an event.
>> white power!
>> stop the hate! stop the fear!
>> a protest, with both sides protesting against each other. >> (yelling)
>> at a mosque in houston, in 2016.
>> this is america, we have the right to speak out. >> (yelling)
>> but for the good work of the houston police, you could have had the kind of horrible activity take place then and there that i saw unfortunately take place in charlottesville in my state last year. so the real human consequences of some of these... of some of this abuse, we've been very lucky that it hasn't actually cost people's lives.
>> narrator: facebook also found that the russians had used the site to orchestrate a
10:27 pm
pro-trump rally outside of a cheesecake factory in florida and to promote an anti-trump protest in new york city just after the election.
>> hey, hey, ho, ho, donald has got to go...
>> we are under threat, and i need to defend the country that i love.
>> we are right in the middle of the protest.
>> narrator: the details of facebook's internal investigation set off alarm bells in washington.
>> we're such a ripe target for that sort of thing, and the russians know that. so the russians exploited that divisiveness, that polarization, because they had, they had messages for everybody. you know, black lives matter, white supremacists, gun control advocates, gun control opponents, it didn't matter-- they had messages for everybody.
>> jacoby: did you think that was a pretty sophisticated campaign?
>> it was. and i believe the russians did a lot to get people out to vote that wouldn't have, and shaped the appeal for... of donald trump.
>> jacoby: and the role that
10:28 pm
social media played in that was what?
>> it was huge.
>> i mean, it's really quite both ingenious and evil to, to attack a democratic society in that manner.
>> jacoby: but there were warning signs along the way in the trajectory of the company.
>> the company's been dealing with the negative side effects of its product for years, right? when you have two billion people on a communication platform, there's an infinite number of potentially bad things that could happen. the tough part is trying to decide where you're going to put your focus.
>> narrator: but by 2017, facebook was being accused of not focusing on other serious issues in developing, fragile democracies where the company had expanded its business. countries like the philippines, where almost all internet users are on facebook, and problems had been mounting.
>> in a year, i probably met with more than 50 different officials, high-ranking officials, including mark zuckerberg. i wanted them to know what we were seeing, i wanted them to
10:29 pm
tell me what they thought about it, and i wanted them to fix it.
>> narrator: maria ressa, who runs a prominent news website, says she had been warning facebook since 2016 that president rodrigo duterte was using a network of paid followers and fake accounts to spread lies about his policies and attack his critics.
>> the u.n.'s branded his war a crime under international law.
>> narrator: especially critics of his brutal war on drugs, which has taken an estimated 12,000 lives.
>> human rights watch has called it government-sanctioned butchery.
>> president duterte was targeting anyone who questioned the drug war, anyone who questioned the alleged extrajudicial killings. anyone on facebook who questioned that would get attacked. we're protected by the constitution, but we've been stripped of those protections online.
>> narrator: ressa herself would eventually come under attack.
10:30 pm
>> there were attacks on the way i look, the way i sounded, that i should be raped, that i should be killed. we gave it a name: patriotic trolling. online, state-sponsored hate that is meant to silence, meant to intimidate. so this is an information ecosystem that just turns democracy upside-down.
>> jacoby: and where lies are truth.
>> narrator: she traced the disinformation to a network of 26 fake accounts and reported it to facebook at a meeting in 2016.
>> jacoby: what were you asking them to do?
>> exactly what every news group does, which is, take control and be responsible for what you create.
>> jacoby: were you given an explanation as to why they weren't acting?
>> no. no. i think facebook walked into
10:31 pm
the philippines, and they were focused on growth. what they didn't realize is that countries like the philippines... >> (chanting)
>> ...countries where institutions are weak, where corruption is rampant, these countries don't have the safeguards. and what happens when you bring everyone onto a platform and do not exercise any kind of rules, right? if you don't implement those rules beforehand, you're going to create chaos.
>> jacoby: there's a problem in the philippines, we've heard about from people on the ground there, that facebook has been to some degree weaponized by the duterte regime there. what are you doing to, to stem this problem in the philippines?
>> one thing we're trying to do, any time that we think there might be a connection between violence on the ground and online speech, the first thing for us to do is actually understand the landscape.
>> narrator: monika bickert is facebook's head of global policy and worked for the justice
10:32 pm
department in southeast asia.
>> there's a fundamental question, which is, "what should our role be, and as we are identifying misinformation, should we be telling people what we're finding, should we be removing that content, should we be down-ranking that content?" and we now have a team that is focused on how to deal with exactly that sort of situation.
>> narrator: in april 2018, facebook created a news verification program and hired ressa's organization as one of its fact-checkers, though she says the problems are ongoing. the company ultimately took down the accounts ressa identified-- and went on to remove dozens more.
>> i think what is happening is facebook is in over its head in terms of its responsibilities. it's way in over its head in terms of what power it holds. the idea isn't that it's just like you magically add facebook and horrible things happen, but you have facebook as this effective gasoline to simmering
10:33 pm
fires.
>> narrator: elsewhere in the region...
>> buddhists are inciting hatred and violence against muslims through social media...
>> narrator: ...facebook was also being used to fan ethnic tensions with even more dire consequences.
>> violence between buddhists and muslims is continuing.
>> misinformation, disinformation, rumors, extremist propaganda, all kinds of bad content.
>> narrator: for several years, david madden, a tech entrepreneur living in myanmar, as well as journalists and activists, had been warning facebook that the muslim minority there was being targeted with hate speech. >> (speaking local language)
>> you would see the use of memes, of images, things that were targeting the muslim community. >> (speaking local language)
>> narrator: the warning signs had been present as far back as
10:34 pm
2014, when a fake news story spread on facebook.
>> reports, later proved to be false, that some muslim men had raped a buddhist woman, were shared on facebook.
>> an angry mob of about 400 surrounded the sun teashop, shouting and throwing bricks and stones.
>> narrator: two people died in the incident.
>> one buddhist and one muslim were killed in riots today.
>> i was really concerned that the seriousness of this was not understood. and so i made a presentation at facebook headquarters in may of 2015. i was pretty explicit about the state of the problem. i drew the analogy with what had happened in rwanda, where radios had played a really key role in the execution of its genocide. and so i said, "facebook runs the risk of being in myanmar what radios were in rwanda." that this platform could be used to foment hate and to incite violence.
10:35 pm
>> jacoby: what was the reaction to that at facebook?
>> i got an email shortly after that meeting to say that what had been discussed at that meeting had been shared internally and apparently taken very seriously.
>> narrator: the violence intensified.
>> massive waves of violence that displaced over 150,000 people.
>> narrator: and in early 2017, madden and other local activists had another meeting with facebook.
>> the objective of this meeting was, was really to be crystal-clear about just how bad the problem was, and that the processes that they had in place to try to identify and pull down problematic content, they just weren't working. and we were deeply concerned that something even worse was going to happen imminently. it was a sobering meeting. i think... i think the main response from facebook was, "we'll need to go away and dig
10:36 pm
into this and come back with something substantive." the thing was, it never came.
>> and how do you know that?
>> we can look at the evidence on the ground.
>> what we've seen here tells us a story of ethnic cleansing, of driving muslims out of myanmar.
>> narrator: the united nations would call the violence in myanmar a genocide, and found social media, and facebook in particular, had played a significant role.
>> the ultra-nationalist buddhists have their own facebooks and are really inciting a lot of violence and hatred against ethnic minorities. facebook has now turned into a beast than what it was originally intended to be used.
>> jacoby: i'm curious what it's like when the u.n. comes out with a report that says that facebook played a significant role in a genocide. what's that like, running content policy at
10:37 pm
facebook?
>> well, this would be important to me even if i didn't work at facebook, given my background. my background is as a federal prosecutor, and i worked specifically in asia and specifically on violent crimes against people in asia. so something like that really hits home to me.
>> jacoby: facebook was warned as early as 2015 about the potential for a really dangerous situation in myanmar. what went wrong there? why was it so slow?
>> we met with civil society organizations in myanmar far before 2015. this is an area where we've been focused. i think what we've learned over time is, it's important for us to build the right technical tools that can help us find some of this content and also work with organizations on the ground in a real-time fashion. we are in the process of building those relationships around the world on a much deeper level, so that we can stay ahead of any kind of situation like that.
10:38 pm
>> narrator: throughout 2018, facebook says it's taken down problematic accounts in myanmar, hired more language experts, and improved its policies.
>> jacoby: should there be any liability or any legal accountability for a company like facebook when something so disastrous goes wrong on your platform?
>> there's all sorts of accountability. but probably the group that holds us the most accountable are the people using the service. if it's not a safe place for them to come and communicate, they are not going to use it.
>> we are working here in menlo park in palo alto, california. to the extent that some of these issues and problems manifest in other countries around the world, we didn't have sufficient information and a pulse on what was happening in southeast asia.
>> narrator: naomi gleit is facebook's second-longest-serving employee.
>> and so one change that we've made, along with hiring so many more people, is that a lot of these people are based internationally and can give us
10:39 pm
that insight that we may not get from being here at headquarters.
>> jacoby: i'm trying to understand, you know, the choices that are made. do you regret choices going backward, decisions that were made about not taking into account risks or not measuring risks?
>> yeah, i definitely think we regret not having 20,000 people working on safety and security back in the day, yes. so i regret that we were too slow, that it wasn't our priority.
>> jacoby: but were those things even considered at the time? to kind of amp up safety and security, but there was some reason not to or...
>> not really. i mean, we had a safety and security team. i think we just thought it was sufficient. i just... it's not that we were, like, "well, we could do so much more here," and decided not to. i think we... we just didn't... again, we were just a bit idealistic.
10:40 pm
>> facebook has created this platform that in many countries, not just myanmar, has become the dominant information platform, and it has an outsized influence in lots of countries. that comes with a lot of responsibility.
>> using social media, rumors of alleged muslim wrongdoing spread fast.
>> many of those countries are wrestling with some pretty big challenges. tensions between groups within countries, and we have seen this explode into what zuckerberg would call real-world harm, what others would just call violence or death, in many other markets. we're seeing it right now in india.
>> ...became a victim of india's fake news.
>> we've seen examples of this in places like sri lanka.
>> to keep the violence from spreading, sri lanka also shut down facebook...
>> the myanmar example should be
10:41 pm
sounding an alarm at the highest level of the company, that this requires a comprehensive strategy.
>> narrator: but it would be far from myanmar, and a very different kind of problem, that would cause an international uproar for facebook.
>> cambridge analytica and its mining of data on millions of americans for political purposes...
>> cambridge is alleged to have used all this data from tens of millions of facebook users.
>> the cambridge analytica scandal, facebook... (reporters speaking different languages)
>> narrator: it was a scandal over how facebook failed to protect users' data, exposed by a whistleblower named christopher wylie.
>> christopher wylie, he was saying, i can prove this.
>> narrator: he said that facebook knew that a political consulting firm he'd worked for, cambridge analytica, had been using the personal data of more than 50 million users to try to influence voters.
>> at cambridge analytica, we are creating the future of political campaigning.
>> this is a company that specializes and would advertise
10:42 pm
itself as specializing in rumor campaigns.
>> political campaigns have changed.
>> seeding the internet with misinformation.
>> putting the right message in front of the right person at the right moment.
>> and that's the power of data. you can literally figure out who are the people who are most susceptible.
>> ...data about personality, so you know exactly who to target...
>> narrator: the firm gained access to the data from a third party, without facebook's permission.
>> the overwhelming majority of people who had their data collected did not know. when data leaves facebook's servers, there is no way for facebook to track that data to know how that data is being used or to find out how many copies there are.
>> narrator: facebook eventually changed its data sharing policies and ordered cambridge analytica to delete the data.
>> we know that facebook had known about this...
>> narrator: after wylie came forward, they banned the firm from their site, and announced they were ending
10:43 pm
another controversial practice: working directly with companies known as data brokers. but the uproar was so intense that in april 2018, mark zuckerberg was finally called before congress, in what would become a reckoning over facebook's conduct, its business model, and its impact on democracy.
>> we welcome everyone to today's hearing on facebook's social media privacy and the use and abuse of data. i now turn to you, so proceed, sir.
>> we face a number of important issues around privacy, safety, and democracy. and you will rightfully have some hard questions for me to answer. facebook is an idealistic and optimistic company. and as facebook has grown, people everywhere have gotten a powerful new tool for making
10:44 pm
their voices heard and for building communities and businesses. but it's clear now that we didn't do enough to prevent these tools from being used for harm, as well. and that goes for fake news, for foreign interference in elections, and hate speech, as well as developers and data privacy. we didn't take a broad enough view of our responsibility, and that was a big mistake. and it was my mistake. and i'm sorry.
>> if, like me, you're following this stuff, you see years and years and years of people begging and pleading with the company, saying, "please pay attention to this," at every channel people could find. and basically being ignored. "we hear you, you're concerned, we apologize, of course we have a responsibility, we'll do better." and the public record here is of a company unable and unwilling to deal with this complexity.
10:45 pm
>> you may decide, or facebook may decide, it needs to police a whole bunch of speech, that i think america might be better off not having policed by one company that has a really big and powerful platform.
>> senator, i think that this is a really hard question. and i think it's one of the reasons why we struggle with it.
>> these are very, very powerful corporations. they do not have any kind of traditional democratic accountability. and while i personally know a lot of people making these decisions, if we set the norms that these companies need to decide what, who does and does not have a voice online, eventually that is going to go to a very dark place.
>> when companies become big and powerful, there is an instinct to either regulate or break up, right?
>> i think we're finding ourselves now in a position where people feel like something should be done. there's a lot of questions what should be done, but there's no question that something should be done.
>> you don't think you have a monopoly?
10:46 pm
>> it certainly doesn't feel like that to me.
>> okay.
>> you know, there's a lot of problems here, there, but all of these problems get worse when one company has too much power, too much information, over too many people.
>> narrator: after years of unchecked growth, the talk now is increasingly about how to rein in facebook. already, in europe, there's a new internet privacy law aimed at companies like facebook. inside the company, the people we spoke to insisted that facebook is still a force for good.
>> jacoby: has there ever been a minute where you've questioned the mission? whether anyone has taken a second to step back and say, "all right, has this blinded us in some way?" have you had a moment like that?
>> i still continue to firmly believe in the mission, but in terms of reflecting, absolutely. but that isn't on the mission. the reflection is really, how can we do a better job of
10:47 pm
minimizing bad experiences on facebook?
>> jacoby: why wasn't that part of the metric earlier? in terms of, how do you minimize the harm?
>> you know, it's possible that we could have done more sooner, and we haven't been as fast as we needed to be.
>> narrator: that line was repeated by all the current officials facebook put forward to answer questions.
>> we've been too slow to act on...
>> i think we were too slow...
>> we didn't see it fast enough.
>> we were too slow.
>> mark has said this, that we have been slow.
>> one of my greatest regrets in running the company is that we were slow in identifying the russian information operations in 2016. and we're going to take a number of measures, from building and deploying new a.i. tools that take down fake news, to growing our security team to more than 20,000 people...
>> the goal here is to dive in on the market nuances there...
>> narrator: the company says it's now investing resources and talent to tackle a range of problems, from the spread of hate speech to election interference.
>> even if we can't do fact-checking, if we can do more work around the programmatic piece of it...
10:48 pm
>> narrator: this is part of the team tackling the spread of misinformation around the world, led by tessa lyons.
>> the elections integrity team has a framework for how they're thinking about secondary languages in each country. and i feel like from the misinformation side, we've mostly prioritized primary languages.
>> narrator: it's a problem the company admits it is a long way from solving.
>> the next thing is about the arabic fact-checking project. i think the main blocker here is potentially getting a fact-checker that can cover an entire region.
>> you know, i came into this job asking myself, "how long is it going to take us to solve this?" and the answer is, this isn't a problem that you solve. it's a problem that you contain.
>> awesome. next, on to upcoming launches.
>> narrator: in advance of the 2018 midterms, facebook mobilized an election team to monitor false news stories and delete fake accounts that may have been trying to influence voters. nathaniel gleicher runs the team.
10:49 pm
>> there are going to be actors that are going to try to manipulate that public debate. how do we figure out what are the techniques they're using and how do we make it much harder?
>> jacoby: is there going to be real-time monitoring on election day of what's going on on facebook, and how are you going to actually find things that may sow distrust in the election?
>> absolutely, we're going to have a team on election day focused on that problem, and one thing that's useful here is, we've already done this in other elections.
>> jacoby: and you're confident you can do that here?
>> i think that... yes, i'm confident that we can do this here.
>> narrator: but the team continues to find foreign actors using the platform to spread disinformation.
>> iran was revealed to be a new player in worldwide disinformation campaigns, and on top of this...
>> narrator: and in october of 2018, federal prosecutors announced they'd found evidence that russian operatives had been trying to interfere in the u.s. midterm election.
>> jacoby: what is the standard
10:50 pm
that the public should hold facebook to, in terms of solving some of these seemingly enormous problems?
>> i think the standard, the responsibility, what i'm focused on, is amplifying good and minimizing the bad. and we need to be transparent about what we're doing on both sides. this is an ongoing discussion.
>> jacoby: what's an ongoing discussion?
>> how we're doing on minimizing the bad.
>> jacoby: but we're dealing with such consequential issues, right? we're talking about integrity of our elections, we're talking about...
>> absolutely.
>> jacoby: ...in some cases, playing a role in a genocide. an ongoing conversation means what, exactly, about that? about a standard for success here?
>> i think, you know, this is the number-one priority for the company. mark has been out there, sheryl is out there, you're talking to me and a bunch of the other leaders. that's what we mean by having an
10:51 pm
ongoing conversation. this is something that we need to, as you said, this is serious, this is consequential. we take this extremely... like, we understand this responsibility, and it's not...
>> jacoby: do you think facebook has earned the trust to be able to say, "trust us, we've got this"?
>> i'm not going to answer that, i'm sorry. that's just... i mean, that, everybody can make that decision for themselves.
>> jacoby: but what... do you trust them?
>> i trust the people who i worked with. i think there are some good people who are working on this. that doesn't mean i don't think we should pass laws to back that up.
>> it has not been a good week for facebook...
>> ...social media giant...
>> narrator: for facebook, the problems have been multiplying.
>> ...massive setback for facebook, the social media giant...
>> ...attack affecting nearly 50 million facebook users...
>> facebook continues to crack down on fake political ads...
>> narrator: but mark zuckerberg's quest to connect and change the world continues.
>> hey, everyone! hey.
10:52 pm
welcome to f8. this has been an intense year. i can't believe we're...
>> after all these scandals, facebook's profits are still going up, right? so they don't really have a huge incentive to change the core problem, which is their business model.
>> we are announcing a new set of features coming soon...
>> they're not going to do it as long as they're doing so well financially and there's no regulatory oversight. and consumer backlash doesn't really work, because i can't leave facebook-- all my friends and family around the world are there. you might not like the company, you might not like its privacy policies, you might not like the way its algorithm works, you might not like its business model, but what are you going to do?
>> now, there's no guarantee that we get this right. this is hard stuff. we will make mistakes, and they will have consequences, and we will need to fix them.
10:53 pm
>> narrator: as he has since the beginning, he sees facebook, his invention, not as part of the problem, but the solution.
>> so if you believe, like i do, that giving people a voice is important, that building relationships is important, that creating a sense of community is important, and that doing the hard work of trying to bring the world closer together is important, then i say this: we will keep building. (cheers and applause) ♪ ♪
♪ hold, hold on, hold on to me ♪
♪ 'cause i'm a little unsteady ♪
>> what's the situation there?
>> how do you explain that?
>> are you ready for this world
10:54 pm
that we are facing today? ♪
>> go to pbs.org/frontline for the latest developments in the facebook story. then check out the new frontline "transparency project," and see key quotes from the film in context.
>> this really was a situation in which we saw the tip of this iceberg, and we knew there was some kind of iceberg beneath it.
>> this isn't a problem that you solve, it's a problem that you contain.
>> what i'm focused on is amplifying good, and minimizing the bad.
>> connect to the frontline community on facebook, twitter, or pbs.org/frontline.
>> frontline is made possible by contributions to your pbs station from viewers like you. thank you. and by the corporation for public broadcasting. major support is provided by the john d. and catherine t. macarthur foundation, committed to building a more
10:55 pm
just, verdant and peaceful world. more information is available at macfound.org. the ford foundation, working with visionaries on the front lines of social change worldwide. at fordfoundation.org. additional support is provided by the abrams foundation, committed to excellence in journalism. the park foundation, dedicated to heightening public awareness of critical issues. the john and helen glessner family trust, supporting trustworthy journalism that informs and inspires. and by the frontline journalism fund, with major support from jon and jo ann hagler. and by the y, for a better us. captioned by media access group at wgbh access.wgbh.org
>> for more on this and other frontline programs, visit our website at pbs.org/frontline.
10:56 pm
♪ to order frontline's "the facebook dilemma" on dvd, visit shop pbs or call 1-800-play-pbs. this program is also available on amazon prime video. ♪
10:57 pm
10:58 pm
10:59 pm
11:00 pm
the corporation for public broadcasting. and by viewers like you. thank you. (music plays) uhhh, let's think of some christmas carols. we've got "o holy night." we've got "grandma got run over by a reindeer," which is my personal favorite. "jingle bells." ♪ chestnuts roasting on an open fire ♪ (laughs) god. whether you celebrate christmas, hanukkah, kwanzaa, chrismukkah, or you just get two days off from work, everybody gets wrapped up in the holidays. and i'm no different. so deck the halls, y'all. (theme music plays: the avett brothers, "will you return")