tv   Frontline  PBS  December 19, 2018 4:00am-5:01am PST

4:00 am
>> narrator: tonight: frontline's investigation into facebook continues. >> there is absolutely no company who has had so much influence on the information that americans consume. >> narrator: he's the man who connected the world. but at what cost? >> polarization was the key to the model. >> narrator: the threat... >> this is an information ecosystem that just turns democracy upside down. >> narrator: the 2016 election... >> ...facebook getting over a billion political campaign posts. >> narrator: and the company's denials. >> the idea that fake news on facebook influenced the election in any way i think is a pretty crazy idea. >> ...facebook ceo mark zuckerberg will testify... >> ...and i'm responsible for what happens here.
4:01 am
>> narrator: can facebook be fixed? >> in light of recent revelations that the company may have covered up russian interference in the 2016 election... >> the problem is too big, because facebook is too big. >> narrator: tonight on frontline, part two of "the facebook dilemma". >> frontline is made possible by contributions to your pbs station from viewers like you. thank you. and by the corporation for public broadcasting. major support is provided by the john d. and catherine t. macarthur foundation, committed to building a more just, verdant and peaceful world. more information is available at macfound.org. the ford foundation, working with visionaries on the front lines of social change worldwide. at fordfoundation.org. additional support is provided by the abrams foundation, committed to excellence in journalism. the park foundation, dedicated to heightening public awareness of critical issues. the john and helen glessner family trust. supporting trustworthy
4:02 am
journalism that informs and inspires. and by the frontline journalism fund, with major support from jon and jo ann hagler. corporate support is provided by... >> the zip code you're born into can determine your future, your school, your job, your dreams, your problems... at the y, our goal is to create opportunities no matter who you are or where you're from. the y, for a better us. >> i accept your nomination for president of the united states. >> i humbly accept your nomination for the presidency of
4:03 am
the united states. >> hey, everyone. we are live from my backyard, where i am smoking a brisket and some ribs and getting ready for the presidential debate tonight. >> some of the questions for tonight's debate will be formed by conversations happening on facebook. >> 39% of people get their election news and decision-making material from facebook. >> facebook getting over a billion political campaign posts. >> i love this, all the comments that are coming in. it's, like, i'm sitting here, smoking these meats, um, and just hanging out with 85,000 people who are hanging out with me in my backyard. >> make no mistake, everything you care about, everything i care about and have worked for, is at stake. >> i will beat hillary clinton, crooked hillary, i will beat her so badly, so badly. >> i hope that all of you get out and vote.
4:04 am
this is going to be an important one. ♪ >> tonight's broadcast will also include facebook, which has become a gathering place for political conversation. (cheers and applause) ♪ >> thank you. thank you. >> facebook is really the new town hall. >> better conversations happen on facebook. >> poke for a vote. >> u.s.a.! u.s.a.! >> hillary! hillary! >> facebook is the ultimate growth stock. >> facebook is utterly dominating this new, mobile, digital economy. >> have you been measuring political conversation on facebook, things like the most likes, interactions, shares. >> hillary clinton has evaded justice. >> thank you for giving me the opportunity to, in my view, clarify. >> 2016 is the social election. >> facebook getting over a billion political campaign posts. >> narrator: 2016 began as
4:05 am
a banner year for mark zuckerberg. his company had become one of the most popular and profitable in the world, despite an emerging dilemma that, as it was connecting billions, it was inflaming divisions. >> people really forming these tribal identities on facebook, where you will see people getting into big fights. >> narrator: we've been investigating warning signs that existed as facebook grew, and interviewing those inside the company who were there at the time. >> we saw a lot of our numbers growing like crazy, as did the rest of the media and the news world in particular. and so, as a product manager, when you see your products being used more, you're happy. >> it's where we're seeing conversation happening about the election, the candidates, the issues. >> narrator: amid all this political activity on facebook, no one used the platform more successfully than donald trump's digital media director, brad parscale. >> i asked facebook, "i want to spend $100 million on your platform-- send me a manual."
4:06 am
they say, "we don't have a manual." i say, "well, send me a human manual, then." >> james jacoby: and what does the manual provide? >> you have a manual for your car. if you didn't have that for your car, there might be things you would never learn how to use in your car, right? i spent $100 million on a platform, so it made sense for them to be there to help us make sure how we spent it right and did it right. >> with custom audiences, you can get your ads to people you already know who are on facebook. >> narrator: what facebook's representatives showed them was how to harness its powerful advertising tools to find and target new and receptive audiences. >> now i'll target my ad to friends of people who like my page. what i recognized was the simple process of marketing. i needed to find the right people and the right places to show them the right message. what micro-targeting allows you to do is say, "well, these are the people that are most likely to show up to vote, and these are the audiences we need to show up." the numbers were showing on the consumer side that people were spending more and more hours of their day consuming facebook content, so if you wanted the best place to show your content,
4:07 am
it would be there. it was a place where their eyes were. that's where they were reading their local newspaper and doing things. and so we could get our message injected inside that stream. and that was a stream which was controlling the eyeballs of most places that we needed to win. ♪ >> narrator: it wasn't just politics. by this time, facebook was also dominating the news business. >> 62% of americans say they get their news from social media sites like facebook. >> more than a dozen developers have worked with us to build social news apps, all with the goal of helping you discover and read more news. >> narrator: facebook's massive audience enticed media organizations to publish straight into the company's news feed-- making it one of the most important distributors of news in the world. >> i'm personally really excited about this. i think that it has the potential to not only rethink the way that we all read news, but to rethink a lot of the way that the whole news industry works. >> narrator: but unlike traditional media companies, facebook did not see itself
4:08 am
as responsible for ensuring the accuracy of the news and information on its site. >> the responsibilities that they should have taken on are what used to be called editing. and editors had certain responsibilities for what was going to show up on the first page versus the last page, the relative importance of things that don't relate purely to money and don't relate purely to popularity. so they took over the role of editing without ever taking on the responsibilities of editing. >> narrator: instead, facebook's editor was its algorithm, designed to feed users whatever was most engaging to them. inside facebook, they didn't see that as a problem. >> jacoby: was there a realization inside facebook as to what the responsibilities would be of becoming the main distributor of news? >> i don't think there was a lot of thinking about that, that idea. i don't think there was any, any
4:09 am
thought that news content in particular had, had more value or had more need for protection than any of the other pieces of content on facebook. >> narrator: andrew anker was in charge of facebook's news products team, and is one of eight former facebook insiders who agreed to talk on camera about their experiences. >> i was surprised by a lot of things when i joined facebook. and as someone who grew up in the media world, i expected there to be more of a sense of how people interact with media and how important media can be to certain people's information diet. (applause) >> we have video from davida from napoli. >> no. (laughter) you know, we're a technology company. we're not a media company. >> the fact that so many big, well-known news brands really pushed into facebook pretty aggressively legitimized it as
4:10 am
a place to get, kind of, information. and i think that also strangely created the opportunity for people who weren't legitimate, as well. because if the legitimate players are there, and you're not legitimate, all you need to do is set up a website and then share links to it, and your stuff on facebook is going to look similar enough that you've just gotten a huge leg up. >> hillary clinton is the most corrupt person... >> narrator: but as the 2016 campaign heated up... >> and i'll tell you, some of what i heard coming from my opponent... >> narrator: ...reporter craig silverman was sounding alarms that facebook's news feed was spreading misinformation-- what he called "fake news." >> fake news just seemed like the right term for it, and i was trying to get people to pay attention. i was trying to get journalists to pay attention. i was trying to also get facebook and other companies like twitter to pay attention to this, as well. >> narrator: silverman traced misinformation back to some unusual places. >> we started to see this small
4:11 am
cluster of websites being run, the vast majority, from one town in macedonia. >> how popular is it? >> about 200 people, maybe. >> 200 people? >> yeah. >> are making fake news websites? >> yes. >> most of them didn't really care about who won the election. they weren't in this for politics. if you put ads on these completely fake websites, and you've got a lot of traffic from facebook, that was a good way to make money. >> there are some people who made, like, 200k or something like that. >> 200,000 euros? >> yeah, yeah, yeah. >> i remember one guy, i think he was 15 or 16 years old, telling me, you know, "americans want to read about trump, so i'm writing trump stuff." trump earned him money. we saw macedonians publishing hillary clinton being indicted, the pope endorsing trump, hillary clinton selling weapons to isis, getting close to or above a million shares, likes, comments. that's an insane amount of engagement. it's more, for example, than when "the new york times" had a scoop about donald trump's tax returns.
4:12 am
how is it that a kid in macedonia can get an article that gets more engagement than a scoop from "the new york times" on facebook? >> jacoby: a headline during the campaign was "pope endorses trump," which was not true, but it went viral on facebook. was it known within facebook that that had gone viral? >> um, i'm sure it was. i didn't necessarily know how viral it had gotten, and i certainly didn't believe that anybody believed it. >> jacoby: but would it have been a red flag inside the company, that something that's patently false was being propagated to millions of people on the platform? >> i think if you asked the question that way, it would have been. but i think when you then asked the next question, which is the harder and the more important question, which is, "so what do you do about it?", you then very quickly get to issues of not only free speech, but to what degree is it anybody's responsibility, as a technology platform or as a distributor, to start to decide
4:13 am
when you've gone over the line between something that is clearly false from something that may or may not be perceived by everybody to be clearly false and potentially can do damage? >> jacoby: over the course of the 2016 election, there was a lot of news about misinformation. i mean, there was, famously, "the pope endorses trump." do you remember that? >> absolutely. i, i wasn't working on these issues at the time, but, but absolutely i, i do remember it. >> narrator: tessa lyons was chief of staff to facebook's number two, sheryl sandberg, and is now in charge of fighting misinformation. she is one of five current officials facebook put forward to answer questions. >> jacoby: was there any kind of sense of, like, "oh, my goodness, facebook is getting polluted with misinformation-- someone should do something about this"? >> there certainly was, and there were people who were thinking about it. what i don't think there was a real awareness of, internally or externally, was the scope of the problem and the, the right course of action.
4:14 am
>> jacoby: how could it be surprising that, if you're becoming the world's information source, that there may be a problem with misinformation? >> there was certainly awareness that there could be problems related to news or quality of news. and i think we all recognized afterwards that of all of the threats that we were considering, we focused a lot on threats that weren't misinformation and underinvested in this one. ♪ >> narrator: but there was another problem that was going unattended on facebook beyond misinformation. >> one of the big factors that emerged in the election was what started to be called hyper-partisan facebook pages. these were facebook pages that kind of lived and died by really ginning up that partisanship-- "we're right, they're wrong." but not even just that, it was also, "they're terrible people, and we're the best." and the facebook pages were getting tremendous engagement. >> "a million migrants are coming over the wall, and they're going to, like, rape your children," you know?
4:15 am
that stuff is doing well. >> and the stuff that was true would get far less shares. >> the development of these hyper-partisan sites i think turned the informational commons into this trash fire. and there's some kind of parable in that for the broader effects of facebook. that the very things that divide us most cause the most engagement. >> (barking, laughing) >> which means they go to the top of the news feed, which means the most people see them. >> narrator: this worried an early facebook investor who was once close to zuckerberg. >> i am an analyst by training and profession. and so, my job is to watch and interpret. at this point, i have a series of different examples that suggest to me that there is something wrong, systemically, with the facebook algorithms and business model. in effect, polarization was the key to the model. this idea of appealing to people's lower-level emotions,
4:16 am
things like fear and anger, to create greater engagement, and in the context of facebook, more time on site, more sharing, and, therefore, more advertising value. i found that incredibly disturbing. >> narrator: ten days before the election, mcnamee wrote zuckerberg and sandberg about his concerns. >> i mean, what i was really trying to do was to help mark and sheryl get this thing right. and their responses were more or less what i expected, which is to say that what i had seen were isolated problems, and that they had addressed each and every one of them. i thought facebook could stand up and say, "we're going to reassess our priorities. we're going to reassess the metrics on which we run the company to try to take into account the fact that our impact is so much greater now than it used to be. and that facebook, as a company with, you know, billions of users, we have influence on
4:17 am
how the whole social fabric works that no one's had before." (cheers and applause) >> i've just received a call from secretary clinton. >> clinton has called trump to concede the election. >> the clinton campaign is... really a somber mood here. >> the crowd here at trump campaign headquarters... >> narrator: trump's targeted ads on facebook paid off... >> did things like facebook help one of the nastiest elections ever? >> narrator: ...leading to complaints that facebook helped tilt the election... >> facebook elected donald trump, that's basically... >> narrator: ...which the trump campaign dismissed as anger over the results. >> there has been mounting criticism of facebook... >> no one ever complained about facebook for a single day until donald trump was president. the only reason anyone's upset about this is that donald trump is president and used a system that was all built by liberals. when i got on tv and told everybody after my interview of what we did at facebook, it exploded. the funny thing is, the obama
4:18 am
campaign did it, then went on tv and newspapers, and they put it on the front of magazines, and the left and the media called them geniuses for doing that. >> accusations that phony news stories helped donald trump win the presidency... >> narrator: trump's victory put facebook on the spot. >> facebook even promoted fake news into its trending... >> narrator: and two days after the election, at a tech conference in northern california, zuckerberg spoke publicly about it for the first time. >> well, you know, one of the things post-election, you've been getting a lot of pushback from people who feel that you didn't filter out enough fake stories, right? >> you know, i've seen some of the stories that you're talking about, around this election. there is a certain profound lack of empathy in asserting that the only reason why someone could have voted the way they did is because they saw some fake news. you know, personally, i think the, the idea that, you know, fake news on facebook, of which, you know, it's, it's a
4:19 am
very small amount of, of, of the content, influenced the election in any way, i think, is a pretty crazy idea, right? >> if i had been sitting there in an interview, i would have said, "you're lying." when he said, "we had no impact on the election," that... i remember reading that and being furious. i was, like, "are you kidding me?" like, "stop it." like, you cannot say that and not be lying. of course they had an impact, it's obvious. they were the most important distribution, news distribution. there are so many statistics about that. like, i don't know how you could possibly make that claim in public and with such a cavalier attitude. that infuriated me. >> jacoby: is he not recognizing the importance of his platform in our democracy at that point in time? >> yes, i think he didn't understand what he had built, or didn't care to understand or wasn't paying attention, and doesn't... they really do want to pretend, as they're getting on their private planes, as they're getting... going to their beautiful homes, as they're collecting billions of dollars, they never want to
4:20 am
acknowledge their power. they're powerful, and they have... they don't. >> thank you so much for being here. >> thank you, guys. >> i think it was very easy for all of us sitting in menlo park to not necessarily understand how valuable facebook had become. i don't think any of us, mark included, appreciated how much of an effect we might have had. and i don't even know now, two years later, or almost two years later, that we really understand how much of a true effect we had. but i think more importantly, we all didn't have the information to be saying things like that at the time. my guess is, is that mark now realizes that there was a lot more to the story than, than he or any of us could have imagined at that point. >> narrator: barely two months later, in washington, an even more serious situation was developing. investigators were investigating russian interference in the election, and whether social media had played a role.
4:21 am
>> classical propaganda, disinformation, fake news. >> does that continue? >> yes. in my view, we only scratched the surface. i say "we," those that assembled the intelligence community assessment that we published on the 6th of january 2017, meaning nsa, c.i.a., fbi, and my office. but i will tell you, frankly, that i didn't appreciate the full magnitude of it until well after. >> narrator: amid growing scrutiny... >> all right. >> narrator: ...zuckerberg set out on a cross-country trip he publicized by streaming on facebook. >> so i've been going around to different states for my personal challenge for the year to see how different communities are working across the country. >> narrator: but while he was on the road, the news was getting worse. >> the u.s. intelligence community officially is blaming russian president... >> ...russian president vladimir putin ordered an influence campaign aimed at the
4:22 am
presidential election. >> narrator: zuckerberg's chief of security, alex stamos, had been asked to see what he could find on facebook's servers. >> we looked into the fake news phenomenon, specifically what component of that might have a russian part in its origin. >> narrator: they traced disinformation to what appeared to be russian government-linked sources. >> jacoby: so, what was it like bringing that news to others in the company, and up to mark and sheryl, for instance? >> you know, we had a big responsibility in the security team to educate the right people about what had happened without being kind of overly dramatic. it's kind of hard as a security person to balance that, right? like, everything seems like an emergency to you. but in this case it really was, right? this really was a situation in which we saw the tip of this iceberg, and we knew there was some kind of iceberg beneath it. >> narrator: stamos expanded his investigation to look at how the russian operation may have
4:23 am
also used facebook's targeted advertising system. >> so what we did is, we then decided we're going to look at all advertising and see if we can find any strange patterns that might link them to russian activity. so we enlisted huge parts of the company. we kind of dragooned everybody into one big, unified team. so you have people in a war room working 70-, 80-hour weeks, billions of dollars of ads, hundreds of millions of pieces of content, and by kind of a painstaking process of going through thousands and thousands of false positives, eventually found this large cluster that we were able to link to the internet research agency of st. petersburg. >> narrator: it was one of the same groups that had been using facebook to spread disinformation in ukraine three years earlier. this time, using fake accounts, russian operatives had paid around $100,000 to run ads that
4:24 am
promoted political messages and enticed people to join fake facebook groups. >> what the internet research agency wants to do is, they want to create the appearance of legitimate social movements. so they would create, for example, a pro-immigration group and an anti-immigration group. and both of those groups would be almost caricatures of what those two sides think of each other. and their goal of running ads was to find populations of people who are open to those kinds of messages, to get them into those groups, and then to drive them apart on a regular basis. really what the russians are trying to do is to find these fault lines in u.s. society and amplify them, and to make americans not trust each other. >> narrator: in september 2017, nearly a year after the election, zuckerberg announced on facebook what the company had found. >> we are actively working with
4:25 am
the u.s. government on its ongoing investigations into russian interference. we've been investigating this for many months now, and for a while, we had found no evidence of fake accounts linked to russian... linked to russia running ads. when we recently uncovered this activity, we provided that information to the special counsel. we also briefed congress. and this morning, i directed our team to provide the ads we've found to congress, as well. >> we do know that facebook-related posts touched about 150 million americans, that were posts that originated either through russian fake accounts or through paid advertising from the russians. but the paid advertising was actually a relatively small piece of the overall problem. a much bigger problem was the ability for someone to say they were james in washington, dc, but it was actually boris in st. petersburg, creating a fake persona that would generate followers, and then they would seed it with the fake information and the false news
4:26 am
and the political content. one account was set up to try to rally the muslim community in, in texas. another was an attempt to kind of rally the right wing in texas. they created an event. >> white power! >> stop the hate! stop the fear! >> protest, with both sides protesting against each other. >> (yelling) >> at a mosque in houston, in 2016. >> this is america, we have the right to speak out. >> (yelling) >> but for the good work of the houston police, you could have had the kind of horrible activity take place then and there that i saw unfortunately take place in charlottesville in my state last year. so the real human consequences of some of these... of some of this abuse, we've been very lucky that it hasn't actually cost people's lives. >> narrator: facebook also found that the russians had used the site to orchestrate a
4:27 am
pro-trump rally outside of a cheesecake factory in florida and to promote an anti-trump protest in new york city just after the election. >> hey, hey, ho, ho, donald has got to go... >> we are under threat, and i need to defend the country that i love. >> we are right in the middle of the protest. >> narrator: the details of facebook's internal investigation set off alarm bells in washington. >> we're such a ripe target for that sort of thing, and the russians know that. so the russians exploited that divisiveness, that polarization, because they had, they had messages for everybody. you know, black lives matter, white supremacists, gun control advocates, gun control opponents, it didn't matter-- they had messages for everybody. >> jacoby: did you think that was a pretty sophisticated campaign? >> it was. and i believe the russians did get a lot of people out to vote that wouldn't have and helped the appeal for... of donald trump. >> jacoby: and the role that
4:28 am
social media played in that was what? >> it was huge. i mean, it's really quite both ingenious and evil to, to attack a democratic society in that manner. >> jacoby: but there were warning signs along the way in the trajectory of the company. >> the company's been dealing with the negative side effects of its product for years. when you have two billion people on a communication platform, there's an infinite number of potentially bad things that could happen. the tough part is trying to decide where you're going to put your focus. >> narrator: but by 2017, facebook was being accused of not focusing on other serious issues in developing, fragile democracies where it had expanded its business. countries like the philippines, where almost all internet users are on facebook, and problems had been mounting. >> in a year, i probably met with more than 50 different officials, high-ranking officials, including mark zuckerberg. i wanted them to know what we were seeing, i wanted them to
4:29 am
tell me what they thought about it, and i wanted them to fix it. >> narrator: maria ressa, who runs a prominent news website, says she had been warning facebook since 2016 that president rodrigo duterte was using a network of paid followers and fake accounts to spread lies about his policies and attack critics. >> the u.n.'s branded his war a crime under international law. >> narrator: especially critics of his brutal war on drugs, which has taken an estimated 12,000 lives. >> human rights watch has called it government-sanctioned butchery. >> president duterte was targeting anyone who questioned the drug war, who questioned the alleged extrajudicial killings. anyone on facebook who questioned that would get brutally bashed. we're protected by the constitution, but we've been stripped of those protections online. >> narrator: ressa herself would eventually come under attack.
4:30 am
>> there were attacks on the way i look, the way i sounded, that i should be raped, that i should be killed. we gave it a name: patriotic trolling. online, state-sponsored hate that is meant to silence, meant to intimidate. so this is an information ecosystem that just turns democracy upside-down. >> jacoby: and where lies are prevalent. >> where lies are truth. >> narrator: ressa traced the disinformation to a network of 26 fake accounts and reported it to facebook at a meeting in singapore in august of 2016. >> jacoby: what were you asking them to do? >> exactly what every news group does, which is, take control and be responsible for what you create. >> jacoby: were you given an explanation as to why they weren't acting? >> no. no. i think facebook walked into
4:31 am
the philippines, and they were focused on growth. what they didn't realize is that countries like the philippines... >> (chanting) >> ...countries where institutions are weak, where corruption is rampant, these countries don't have the safeguards. and what happens when you bring everyone onto a platform and do not exercise any kind of rules, right? you don't implement those rules beforehand, you're going to create chaos. >> jacoby: there's a problem in the philippines, we've heard about from people on the ground there, that facebook has been to some degree weaponized by the duterte regime there. what are you doing to, to stem this problem in the philippines? >> one thing we're trying to do, any time that we think there might be a connection between violence on the ground and online speech, the first thing for us to do is actually understand the landscape. >> narrator: monika bickert is facebook's head of global policy and worked for the justice department in southeast asia.
4:32 am
>> there's a fundamental question, which is, "what should our role be, and as we are identifying misinformation, should we be telling people what we're finding, should we be down-ranking that content?" and we now have a team that is focused on how to deal with exactly that sort of situation. >> narrator: in april 2018, facebook created a news verification program and hired ressa's organization as one of its fact-checkers, though she says the problems are ongoing. the company ultimately took down the accounts ressa identified-- and went on to remove dozens more. >> i think what's happening is that this company is way in over its head in terms of its responsibilities. it's way in over its head in terms of what power it holds. the idea isn't that it's just like you magically add facebook and horrible things happen, but you have facebook as this effective gasoline to simmering
4:33 am
fires. >> (shouting) >> narrator: elsewhere in the region... >> buddhists are inciting hatred and violence against muslims through social media... >> narrator: ...facebook was also being used to fan ethnic tensions with even more dire consequences. >> violence between buddhists and muslims is continuing. >> misinformation, disinformation, rumors, extremist propaganda, all kinds of bad content. >> narrator: for several years, david madden, a tech entrepreneur living in myanmar, as well as journalists and activists, had been warning facebook that the muslim minority there was being targeted with hate speech. >> (speaking local language) >> you would see the use of really degrading and dehumanizing language, targeting the muslim community. >> (speaking local language) >> narrator: the warning signs had been present as far back as 2014, when a fake news story
4:34 am
spread on facebook. >> reports, later proved to be false, that some muslim men had raped a buddhist woman, were shared on facebook. >> an angry mob of about 400 surrounded the sun teashop, shouting and throwing bricks and stones. >> narrator: two people died in the incident. >> one buddhist and one muslim were killed in riots today. >> i was really concerned that the seriousness of this was not understood. and so i made a presentation at facebook headquarters in may of 2015. i was pretty explicit about the state of the problem. i drew the analogy with what had happened in rwanda, where radio had played a real role in the execution of its genocide. and so i said, "facebook runs the risk of being in myanmar what radios were in rwanda." that this platform could be used to foment hate and to incite violence.
4:35 am
>> jacoby: what was the reaction to that at facebook? >> i got an email shortly after that meeting to say that what had been discussed at that meeting had been shared internally and apparently taken very seriously. >> narrator: the violence intensified. >> massive waves of violence that displaced over 150,000 people. >> narrator: and in early 2017, madden and other local activists had another meeting with facebook. >> the objective of this meeting was really to be crystal-clear about just how bad the problem was, and that the processes that they had in place to try to identify and pull down problematic content, they just weren't working. and we were deeply concerned that something even worse was going to happen imminently. it was a sobering meeting. i think... i think the main response from facebook was, "we'll need to go away and dig
4:36 am
into this and come back with something substantive." the thing was, it never came. >> jacoby: and how do you know that? >> we can look at the evidence on the ground. what we've seen here tells us a story of ethnic cleansing, of driving muslims out of myanmar. >> narrator: the united nations would call the violence in myanmar a genocide, and found social media, and facebook in particular, had played a significant role. >> the ultra-nationalist buddhists have their own facebook pages and are really inciting a lot of violence and hatred against ethnic minorities. facebook has now turned into a beast, and not what it was originally intended to be. >> jacoby: i'm curious what it's like when the u.n. comes out with a report that says that facebook played a significant role in a genocide. what is that like for you, as the person running content policy at
4:37 am
facebook? >> well, this would be important to me even if i didn't work at facebook, given my background. my background is as a federal prosecutor, and i worked specifically in asia and specifically on violent crimes against people in asia. so something like that really hits home to me. >> jacoby: facebook was warned as early as 2015 about the potential for a really dangerous situation in myanmar. what went wrong there? >> we met with civil society organizations in myanmar far before 2015. this is an area where we've been focused. i think what we've learned over time is, it's important for us to build the right technical tools that can help us find some of this content and also to work with organizations on the ground. we are in the process of building those relationships around the world on a much deeper level, so that we can stay ahead of any kind of situation like that. >> narrator: throughout it all,
4:38 am
facebook says it's taken down problematic accounts in myanmar, hired more language experts, and improved its policies. >> jacoby: should there be any accountability for a company like facebook when something so disastrous goes wrong on your platform? >> there's all sorts of accountability. but probably the group that holds us the most accountable are the people using the service. if it's not a safe place for them to come and communicate, they are not going to use it. >> we are working here in menlo park and palo alto, california. to the extent that some of these issues and problems manifest in other countries around the world, we didn't have sufficient information and a pulse on what was happening in southeast asia. >> narrator: naomi gleit is facebook's second-longest-serving employee. >> and so one change that we've made, along with hiring so many more people, is that a lot of these people are based internationally and can give us
4:39 am
that insight that we may not get from being here at headquarters. >> jacoby: i'm trying to understand, you know, the choices that were made. do you regret choices going backwards, decisions that were made about not taking into account risks or not measuring risks? >> yeah, i definitely think we regret not having 20,000 people working on safety and security back in the day, yes. so i regret that we were too slow, that it wasn't our priority. >> jacoby: but were those things even considered at the time? to kind of amp up safety and security, but there was some reason not to, or... >> not really. i mean, we had a safety and security team. i think we just thought it was sufficient. i just... it's not that we were, like, "well, we could do so much more here," and decided not to. i think we... we just didn't... again, we were a bit idealistic.
4:40 am
>> facebook has created this platform that in many countries, not just myanmar, has become the dominant information platform, and it has an outsized influence in lots of countries. that comes with a lot of responsibility. >> using social media, rumors of alleged muslim wrongdoing spread fast... >> many of those countries are wrestling with some pretty big challenges. we see tensions between groups within countries, and we have seen this explode in what mark zuckerberg would call real-world harm, what others would just call violence or death. we're seeing it right now in india. >> calloo became a victim of india's fake news. >> we've seen examples of this in places like sri lanka. >> to keep the violence from spreading, sri lanka also shut down facebook... >> the myanmar example should be
4:41 am
sounding an alarm at the highest level of the company, that this requires a comprehensive strategy. >> narrator: but it would be far from myanmar, and a very different kind of problem, that would cause an international uproar over facebook. >> cambridge analytica and the mining of data on millions of americans for political purposes... >> cambridge is alleged to have used all this data from tens of millions of facebook users... >> the cambridge analytica scandal, facebook... (reporters speaking different languages) >> narrator: it was a scandal over how facebook failed to protect users' data, exposed by a whistleblower named christopher wylie. >> christopher wylie, he was able to come forward and say, "i can prove this." >> narrator: he said that facebook knew that a political consulting firm he'd worked for, cambridge analytica, had been using the personal data of more than 50 million users to try to influence voters. >> at cambridge analytica, we are creating the future of political campaigning. >> this is a company that specializes and would advertise
4:42 am
itself as specializing in rumor campaigns. >> political campaigns have changed. >> seeding the internet with misinformation. >> putting the right message in front of the right person at the right moment. >> and that's the power of data. you can literally figure out who are the people who are most susceptible. >> ...data about personality, so you know exactly who to target... >> narrator: the firm gained access to the data from a third party, without facebook's permission. >> the overwhelming majority of people who had their data collected did not know. when data leaves facebook's servers, there is no way for facebook to track that data to know how that data is being used or to find out how many copies there are. >> narrator: facebook eventually changed its data sharing policies and ordered cambridge analytica to delete the data. >> we know that facebook had known about this... >> narrator: after wylie came forward, they banned the firm from their site, and announced they were ending another controversial practice:
4:43 am
working directly with companies known as data brokers. but the uproar was so intense that in april 2018, mark zuckerberg was finally called before congress, in what would become a reckoning over facebook's conduct, its business model, and its impact on democracy. >> today's hearing on facebook's social media privacy and the use and abuse of data. i now turn to you, so proceed, sir. >> we face a number of important issues around privacy, safety, and democracy. and you will rightfully have some hard questions for me to answer. facebook is an idealistic and optimistic company. and as facebook has grown, people everywhere have gotten a powerful new tool for making
4:44 am
their voices heard and for building communities and businesses. but it's clear now that we didn't do enough to prevent these tools from being used for harm, as well. and that goes for fake news, for foreign interference in elections, and hate speech, as well as developers and data privacy. we didn't take a broad enough view of our responsibility, and that was a big mistake. and it was my mistake. and i'm sorry. >> if, like me, you're following this stuff, you see years and years and years of people begging and pleading with the company, saying, "please pay attention to this," at every channel people could find. and basically being ignored. "we hear you, you're concerned, we apologize, of course we have a responsibility, we'll do better." and the public record here is that they are a combination of unable and unwilling to grasp and deal with this complexity.
4:45 am
>> you may decide, or facebook may decide, it needs to police a whole bunch of speech, that i think america might be better off not having policed by one company that has a really big and powerful platform. >> senator, i think that this is a really hard question. and i think it's one of the reasons why we struggle with it. >> these are very, very powerful corporations. they do not have any kind of traditional democratic accountability. and while i personally know a lot of people making these decisions, if we set the norms that these companies need to decide who does and does not have a voice online, eventually that is going to go to a very dark place. >> when companies become big and powerful, there is an instinct to either regulate or break up, right? >> i think we're finding that people feel like something should be done. there's a lot of questions about what should be done, but there's no question that something should be done. >> you don't think you have a
4:46 am
monopoly? >> it certainly doesn't feel like that to me. >> okay. >> you know, there's a lot of problems here, but all of these problems get worse when one company has too much power, too much information, over too many people. >> narrator: after years of unchecked growth, the talk now is increasingly about how to rein in facebook. already, in europe there's a new internet privacy law aimed at companies like facebook. inside the company, the people we spoke to insisted that facebook is still a force for good. >> jacoby: has there ever been a minute where you've questioned the mission? you know, internally? whether anyone has taken a second to step back and say, "all right, has this blinded us in some way?" have you had a moment like that? >> i still continue to firmly believe in the mission. in terms of stepping back, in terms of reflecting, absolutely. but that isn't on the mission. the reflection is really about, how can we do a better job of
4:47 am
minimizing bad experiences on facebook? the metric i mentioned earlier? that's the part in terms of, how do you minimize the harm? >> you know, it's possible that we could have done more sooner, and we haven't been as fast as we needed to be. >> narrator: that line was repeated by all the current officials facebook put forward to answer questions. >> we've been too slow to act... >> i think we were too slow... >> we didn't see it fast enough. >> we were too slow. >> mark has said this, that we have been slow. >> one of my greatest regrets in running the company is that we were slow in identifying the russian information operations in 2016. and we're going to take a number of measures, from building and deploying new a.i. tools that take down fake news, to growing our security team to more than 20,000 people... >> the goal here is to deep-dive on the market nuances there... >> narrator: the company says it's now investing resources and talent to tackle a range of problems, from the spread of hate speech to election interference. >> even if we can't do fact-checking, if we can do more work around the programmatic aspect of it...
4:48 am
>> narrator: this is part of the team tackling the spread of misinformation around the world, led by tessa lyons. >> the elections integrity team has a framework for how they're thinking about secondary languages in each country. and i feel like from the misinformation side, we've mostly prioritized primary languages. >> narrator: it's a problem the company admits it is a long way from solving. >> the next thing is about the arabic fact-checking project. i think the main blocker here is potentially getting a fact-checker that can cover an entire region. >> i've been asking myself, "how long is it going to take us to solve this?" this isn't a problem that you solve, it's a problem that you contain. >> awesome. next, segue into upcoming launches. >> narrator: in advance of the 2018 midterms, facebook mobilized an election team to monitor false news stories and delete fake accounts that may have been trying to influence voters. nathaniel gleicher runs the team.
4:49 am
>> there are going to be actors that are going to try to manipulate that public debate. how do we figure out what are the techniques they're using and how do we make it much harder? >> jacoby: will there be real-time monitoring on election day of what's going on on facebook, and how are you going to actually find things that may sow distrust in the election? >> absolutely, we're going to have a team on election day focused on that problem, and one thing that's useful here is, we've already done this in other elections. >> jacoby: and you're confident you can do that here? >> i think that... yes, i'm confident that we can do this here. >> narrator: gleicher says his team continues to find foreign actors using the platform to spread disinformation. >> iran was revealed to be a new player in worldwide disinformation campaigns, and on top of this... >> narrator: and in october of 2018, federal prosecutors announced they'd found evidence that russian operatives had been trying to interfere in the u.s. midterm election. >> jacoby: what is the standard
4:50 am
that the public should hold facebook to, in terms of solving some of these seemingly enormous problems? >> i think the standard, the responsibility, what i'm focused on, is amplifying good and minimizing the bad. and we need to be transparent about what we're doing on both sides, and, you know, i think this is an ongoing discussion. >> jacoby: what's an ongoing discussion? >> how we're doing on minimizing the bad. >> jacoby: but we're dealing with such consequential issues, right? we're talking about integrity of our elections, we're talking about... >> absolutely. >> jacoby: ...in some cases, playing a role in a genocide. an ongoing conversation means what, exactly, about that? about a standard for success here? >> i think, you know, this is the number-one priority for the company. mark has been out there, sheryl is out there, you're talking to me and a bunch of the other leads. that's what we mean by having an ongoing conversation.
4:51 am
this is something that we need to... as you said, this is serious, this is consequential. we take this extremely seriously... like, we understand this. it's not going away tomorrow. >> jacoby: do you think facebook has earned the trust to be able to say, "trust us, we've got this"? >> i'm not going to answer that, i'm sorry. that's just... people can make that decision for themselves. >> jacoby: but what... do you trust them? >> i trust the people who i worked with. i think there are some good people who are working on this. that doesn't mean i don't think we should pass laws to back that up. >> it has not been a good week for facebook... >> ...social media giant... >> narrator: for facebook, the problems have been multiplying. >> ...massive setback for facebook, the social media giant... >> ...a massive cyber attack affecting nearly 50 million facebook users... >> facebook continues to crack down on fake political ads and news... >> narrator: but mark zuckerberg's quest to connect and change the world continues. >> hey, everyone! hey.
4:52 am
welcome to f8! this has been an intense year. only four months in... >> after all these scandals, facebook's profits are still going up, right? so they don't really have a huge incentive to change the core problem, which is their business model. >> we are announcing a new set of features coming soon... >> they're not going to do it as long as they're doing so well financially and there's no regulatory oversight. and consumer backlash doesn't really work, because i can't leave facebook-- all my friends and family around the world are there. you might not like the company, you might not like its privacy policies, you might not like the way its algorithm works, you might not like its business model, but what are you going to do? >> now, there's no guarantee that we get this right. this is hard stuff. we will make mistakes, and they will have consequences, and we will need to fix them.
4:53 am
>> narrator: as he has since the beginning, he sees facebook, his invention, not as part of the problem, but the solution. >> so if you believe, like i do, that giving people a voice is important, that building relationships is important, that creating a sense of community is important, and that doing the hard work of trying to bring the world closer together is important, then i say this: we will keep building. (cheers and applause) ♪ ♪ ♪ hold on, hold on to me ♪ 'cause i'm a little unsteady ♪ >> what's the situation there? >> how do you explain that? >> are you ready for this world
4:54 am
that we are facing today? ♪ >> go to pbs.org/frontline for the latest developments in the facebook story. then check out the new frontline "transparency project," and see key quotes from the film in context. >> this really was a situation in which we saw the tip of this iceberg, and we knew there was some kind of iceberg beneath it. >> this isn't a problem that you solve, it's a problem that you contain. >> what i'm focused on is amplifying good, and minimizing the bad. >> connect to the frontline community on facebook, twitter, or pbs.org/frontline. captioned by media access group at wgbh access.wgbh.org >> for more on this and other frontline programs, visit our website at pbs.org/frontline.
4:56 am
♪ to order frontline's "the facebook dilemma" on dvd, visit shop pbs, or call 1-800-play-pbs. this program is also available on amazon prime video. ♪ >> you're watching pbs.