Frontline (PBS), August 18, 2020, 10:00pm-11:02pm PDT
>> narrator: tonight, our investigation into facebook continues.
>> there's no company that has had so much influence on the information that americans consume.
>> narrator: he's the man who connected the world. but at what cost?
>> polarization was the key to the model.
>> this is an information ecosystem that just turns democracy upside down.
>> narrator: the 2016 election...
>> ...facebook getting over a billion political campaign posts.
>> narrator: and the company's denials...
>> the idea that fake news on facebook influenced the election in any way i think is pretty crazy.
>> narrator: ...facebook ceo mark zuckerberg will testify...
>> ...and i'm responsible for what happens here.
>> narrator: can facebook be fixed?
>> ...reports that facebook may have covered up russian interference in the 2016 election...
>> the problem is too big...
>> narrator: tonight on frontline, part two of "the facebook dilemma."
>> frontline is made possible by contributions to your pbs station from viewers like you. thank you. and by the corporation for public broadcasting. major support is provided by the john d. and catherine t. macarthur foundation, committed to building a more just, verdant and peaceful world. more information is available at macfound.org. the ford foundation, working with visionaries on the front lines of social change worldwide. at fordfoundation.org. additional support is provided by the abrams foundation, committed to excellence in journalism. the park foundation, dedicated to heightening public awareness of critical issues. the john and helen glessner family trust, supporting trustworthy journalism that informs and inspires. and by the frontline journalism fund, with major support from jon and jo ann hagler. corporate support is provided by...
>> the zip code you're born into can determine your future, your school, your job, your dreams, your problems... at the y, our goal is to create opportunities no matter who you are or where you're from. the y, for a better us.
>> i accept your nomination for president of the united states!
>> i humbly accept your nomination for the presidency of the united states.
>> hey, everyone. we're here smoking some ribs and getting ready for the presidential debate tonight.
>> tonight's debate will be informed by conversations happening on facebook.
>> 39% get election decision-making material from facebook.
>> facebook getting over a billion political campaign posts.
>> i love the comments that are coming in. smoking these meats and, um, and just hanging out with people who are commenting with me in my backyard.
>> make no mistake, everything you care about, everything i care about and i've worked for, is at stake.
>> i will beat hillary clinton, crooked hillary, i will beat her so badly.
>> i hope that all of you get out and vote. this is going to be an important one.
♪ ♪
>> tonight's broadcast will also become a gathering place for political conversation. (cheers and applause)
♪ ♪
>> thank you. thank you.
>> facebook is really the new town hall.
>> better conversations happen on facebook.
>> poke for a vote.
>> poke for a vote.
>> u.s.a.! u.s.a.!
>> hillary! hillary!
>> facebook is the ultimate growth stock.
>> facebook is utterly dominating this new, mobile, digital economy.
>> ...political conversation on facebook, things like the most likes, interactions, shares.
>> ...has evaded justice.
>> i thank you for giving me the opportunity to, in my view, clarify.
>> 2016 is the social election.
>> facebook getting over a billion political campaign posts.
>> narrator: 2016 began as a banner year for mark zuckerberg.
facebook had become one of the most valuable companies in the world, but evidence was emerging that, as it was connecting billions, it was inflaming divisions.
>> people were really forming these tribal identities on facebook, where you would see people getting into big fights.
>> narrator: we've been investigating warning signs that existed as facebook grew, and interviewing those inside the company who were there at the time.
>> we saw the numbers growing like crazy, as did the rest of the media and the news world in particular. and so, as a product designer, when you see your products being used more, you're happy.
>> it's where we're seeing conversation happening about the election, the candidates, the issues.
>> narrator: no one used the platform more successfully than donald trump's digital media director, brad parscale.
>> i asked facebook, "i want to spend $100 million on your platform-- send me a manual." they say, "we don't have a manual." i say, "well, send me a human manual, then."
>> james jacoby: and what does the manual provide?
>> you have a manual for your car. if you don't have that manual, you wouldn't know how to use your car, right? i spent $100 million on a platform, the most in history,
so it made sense for them to be there to help us make sure how we spent it right and did it right.
>> with custom audiences, you can get your ads to people you already know who are on facebook.
>> narrator: what facebook's representatives showed them was how to harness its powerful advertising tools to target new and receptive audiences.
>> ...friends of people who like my page.
>> what i recognized was the simple process of marketing. i needed to find the right people and the right places to show them the right message. what micro-targeting allows you to do is say, "well, these are the people that are most likely to show up to vote, and these are the right audiences we need to show up." the numbers were showing in the consumer side that people were spending more and more hours of their day consuming facebook. if you wanted the best place to show your content, it would be there. it was a place where their eyes were. that's where they were reading their local newspaper and doing things. and so we could get our message injected inside that stream. and that was a stream which was controlling the eyeballs of most places that we needed to win.
♪ ♪
>> narrator: facebook wasn't just shaping the campaign-- it was also dominating the news business.
>> 62% of americans say they get their news from social media sites like facebook.
>> more than a dozen developers have been working with facebook with the goal of adding more news.
>> narrator: facebook's massive audience enticed media organizations to publish straight into its news feed-- making it one of the most important distributors of news in the world.
>> i'm personally really excited about this. i think that it has the potential to not only rethink the way people get their news, but to rethink a lot of the ways the industry works.
>> narrator: but unlike traditional media companies, facebook didn't see itself as responsible for ensuring the accuracy of the news and information on its site.
>> the responsibilities that they should have taken on are what used to be called editing. and editors had certain responsibilities for what was on the front page versus the last page, the relative importance of things that don't relate purely to popularity. so they took over the role of editing without ever taking on the responsibilities of editing.
>> narrator: instead, facebook's editor was its algorithm, designed to feed users whatever was most engaging to them. inside facebook, they didn't see that as a problem.
>> jacoby: was there a realization inside facebook as to what the responsibilities would be of being a distributor of news?
>> i don't think there was a lot of thinking about that, that idea. i don't think there was any, any thought that news content in particular had, had more value or had more need for protection than any of the other pieces of content on facebook.
>> narrator: andrew anker was in charge of facebook's news products team, and is one of eight former facebook insiders who agreed to talk on camera about their experiences.
>> i was surprised by a lot of things when i joined facebook. and as someone who grew up in the media world, i expected there to be more of a sense of how people interact with media and how important media can be to certain people's information diet.
(applause)
>> we have a video from davide, from napoli.
>> no, we are a technology company. we're not a media company.
>> the fact that so many big, well-known news brands really pushed into facebook pretty aggressively legitimized it as a place to get, kind of, information. and i think that also strangely created the opportunity for people who weren't legitimate, because if the legitimate players are there, and you're not legitimate, all you need to do is set up a website, and then your stuff on facebook is going to look similar enough that you've just gotten a huge leg up.
>> hillary clinton is the most corrupt person...
>> narrator: but as the 2016 campaign heated up...
>> and i'll tell you, some of what i heard coming from my opponent...
>> narrator: ...reporter craig silverman was sounding alarms that facebook's news feed was spreading misinformation-- what he called "fake news."
>> fake news just seemed like the right term, and i was trying to get people to pay attention. i was trying to get journalists to pay attention, and to also get facebook and other companies like twitter to pay attention to it.
>> narrator: silverman traced misinformation back to some unusual places.
>> we saw this small cluster of websites being run, the vast majority, from one town in macedonia.
>> jacoby: how popular is it?
>> maybe 200 people.
>> 200 people?
>> yes.
>> most of them didn't really care about who won the election. they weren't in this for politics. if you put ads on these websites and got traffic from facebook, that was a good way to make money.
>> there are some people who made, like, 200k or something like that.
>> 200,000 euros?
>> yeah, yeah, yeah.
>> he was 15: "americans want to read about trump, so i'm writing trump stuff." trump earned them money. we saw macedonian publishers pushing stories about hillary clinton being indicted, the pope endorsing trump, hillary clinton selling weapons to isis, getting close to or above a million shares, likes, and comments. that's an insane amount of engagement. it's more, for example, than when "the new york times" had a scoop about donald trump's tax returns. how is it that a kid in macedonia can get more engagement for made-up content than a scoop from "the new york times" gets on facebook?
>> jacoby: one of the most famous pieces of misinformation during the campaign was "pope endorses trump," which was not true, but it went viral on facebook. was it known within facebook that that had gone viral?
>> um, i'm sure it was. i didn't necessarily know how viral it had gotten, and i certainly didn't believe that anybody believed it.
>> jacoby: but would that have been a red flag inside the company, that something that's patently false was being propagated to millions of people on the platform?
>> i think if you asked the question that way, it would have been. but i think when you asked the next question, which is the harder and the more important question, which is, "so what do you do about it?", you then very quickly get into issues of not only free speech, but to what degree is it anybody's responsibility, as a technology platform or as a distributor, to start to decide when you've gone over the line between something that is clearly false and something that may or may not be perceived by everybody to be clearly false, and potentially can do damage?
>> jacoby: over the course of the campaign, there was a lot of news about misinformation. i mean, there was, famously, "the pope endorses trump." do you remember that?
>> absolutely. i was working on other issues at the time, but, but absolutely i, i do remember it.
>> narrator: tessa lyons was chief of staff to facebook's number two, sheryl sandberg, and is now in charge of fighting misinformation. she is one of five current officials facebook put forward to answer questions.
>> jacoby: was there any kind of sense of, like, "our platform is being polluted with misinformation-- someone should do something about this"?
>> there certainly was, and there were people who were thinking about it. what i don't think there was a real awareness of, internally or externally, was the scope of the problem and the, the right course of action.
>> jacoby: how could it be surprising that, as facebook was becoming the world's information source, there may be a problem?
>> there was certainly an awareness that there could be threats related to news or quality of news. and i think, of all of the threats we were considering, we focused a lot on threats that weren't misinformation and underinvested in this one.
♪ ♪
>> narrator: but there was another problem that was going unattended on facebook beyond misinformation.
>> one of the big factors that emerged in the election was what started to be called hyper-partisan facebook pages.
these were facebook pages that kind of lived off of ginning up that partisanship-- "we're right, they're wrong." but not even just that, it was, "they're terrible people, and we're the best." and those pages were getting tremendous engagement.
>> "a million migrants are coming over this wall, and they're going to, like, rape your children," you know? that stuff is doing well.
>> and the stuff that was true would get far less shares.
>> the development of these hyper-partisan pages turned the informational commons into this trash fire. and there's some kind of parable in that for the broader effects of facebook. that the very things that divide us most cause the most engagement.
>> (barking)
>> which means they go to the top of the news feed, which means the most people see them.
>> narrator: this worried an early facebook investor who was once close to zuckerberg.
>> i am an analyst by training and profession. my job is to watch, and at this point, i have a series of different examples that suggest to me that there is something wrong, systemically, with the facebook algorithms and business model.
in effect, polarization was the key to the model. this idea of appealing to people's lower-level emotions, things like fear and anger, to create greater engagement, and, in the context of facebook, more time on site, more sharing, and more advertising value. i found that incredibly disturbing.
>> narrator: ten days before the election, mcnamee wrote zuckerberg and sandberg about his concerns.
>> i mean, what i was really trying to do was to help mark and sheryl get this thing right. and their responses were more or less what i expected, which is to say that what i had seen were isolated problems, and they had addressed each and every one of them. i thought facebook could stand up and say, "we're going to reassess our priorities. we're going to reassess the metrics on which we run the company to try to take into account the fact that our impact is so much greater now than it used to be. and that as facebook, as a company with billions of users, we have influence on how the whole world works that no one's had before."
>> i've just received a call from secretary clinton.
>> clinton has conceded the election.
>> at the clinton campaign... really a somber mood here.
>> the crowd here at trump campaign headquarters...
>> narrator: trump's targeted ads on facebook paid off...
>> did facebook help win one of the nastiest elections ever?
>> narrator: ...leading to complaints that facebook helped tilt the election...
>> facebook elected donald trump, that's basically...
>> narrator: ...which the trump campaign dismissed as anger over the results.
>> there has been mounting criticism of...
>> no one ever complained about facebook for a single day until donald trump was president. the only reason anyone's upset about this is that donald trump is president and used a system that was all built by liberals. when i told everybody after my interview what we did at facebook, it exploded. the funny thing is, the obama campaign used it, then went on tv and newspapers, and they put it on the front of magazines, and the left and the media called them geniuses for doing that.
>> accusations that phony news stories helped donald trump win the presidency...
>> narrator: trump's victory put facebook on the spot.
>> facebook even promoted fake news into its trending...
>> narrator: and two days after the election, at a conference in california, zuckerberg spoke publicly about it for the first time.
>> so, post-election, you've been getting a lot of pushback from people who feel that you may have helped spread fake news stories, right?
>> you know, i've seen some of the stories that you're talking about, around this election. there is a certain profound lack of empathy in asserting that the only reason someone could have voted the way they did is because they saw some fake news. you know, personally, i think the, the idea that, you know, fake news on facebook, of which, you know, it's, it's a small amount of, of, of the content, influenced the election in any way, i think, is a pretty crazy idea, right?
>> if i had been sitting there in an interview, i would have said, "you're lying." when he said, "we had no impact on the election," that... i remember reading that and being furious. i was, like, "are you kidding me?"
like, "stop it." like, you cannot say that and not be lying. of course they had an impact. they were the most important channel of news distribution. there are so many statistics about that. like, i don't know how you could possibly make that claim. that attitude infuriated me. i pictured everybody there saying, "you're kidding me."
>> jacoby: is he not understanding the importance of his platform in our democracy at that point in time?
>> yes, i think he didn't understand what he had built, or wasn't paying attention. they really do want to pretend... as they're getting on their private planes, as they're getting into their beautiful homes, as they're collecting their billions of dollars, they never want to acknowledge the power they have... they don't.
>> ...for being here.
>> thank you, guys.
>> i think it was easy for all of us sitting in menlo park to not necessarily understand how valuable facebook had become. i don't think any of us, mark included, appreciated how much of an effect we might have had. it wasn't until years later, or almost two years later, that we really understood how much of a true effect we had. but i think more importantly, we all didn't have the information to be making claims like that at
the time. my guess is, is that mark now realizes that there was a lot more to the story than, than he or any of us could have imagined at that point.
>> narrator: barely two months later, in washington, an even more serious situation was developing. intelligence agencies were investigating russian interference in the election, and whether social media had played a role.
>> disinformation, fake news... does that continue?
>> yes. in my view, we only scratched the surface. i say "we," those that assembled the intelligence community assessment that we published on the 6th of january 2017, meaning nsa, c.i.a., fbi, and my office. but i will tell you, frankly, that i didn't appreciate the full magnitude of it until well after.
>> narrator: amid growing scrutiny... zuckerberg set out on a cross-country trip to promote facebook.
>> i've been going to different states for my personal challenge for the year to see how different communities are working across the country.
>> narrator: but while he was on the road, the news was getting worse.
>> the u.s. intelligence community officially is blaming russian president...
>> ...russian president vladimir putin ordered an influence campaign aimed at the presidential election.
>> narrator: zuckerberg's chief of security, alex stamos, had been asked to see what he could find on facebook's servers.
>> we kicked off a big look into the fake news phenomenon, specifically what component of that might have a russian part in its origin.
>> narrator: they traced disinformation to what appeared to be russian government-linked sources.
>> jacoby: so, what was it like bringing that news to others in the company, and up to mark and sheryl, for instance?
>> you know, we had a big responsibility in the security team to educate the right people about what had happened without being kind of overly dramatic. it's kind of hard as a security person to balance that, right? like, everything seems like an emergency to you. but in this case, it really was, right?
this really was a situation in which we saw the tip of this iceberg, and we knew there was some kind of iceberg beneath it.
>> narrator: stamos expanded his investigation to look at how the russian operation may have also used facebook's targeted advertising system.
>> so what we did is, we then decided we're going to look at all advertising and see if we can find any strange patterns that might link them to russian activity. so we enlisted huge parts of the company. we kind of dragooned everybody into one team. so you have people in a war room working 70-, 80-hour weeks, going through billions of dollars of ads, millions of pieces of content, and by kind of a painstaking process of going through thousands and thousands of false positives, eventually found this large cluster that we were able to link to the internet research agency of st. petersburg.
>> narrator: it was one of the same groups that had been using facebook to spread disinformation in ukraine three years earlier. this time, using fake accounts,
russian operatives had paid around $100,000 to run ads that promoted political messages and facebook groups.
>> what the internet research agency wants to do is, they want to create the appearance of legitimate social movements. so they would create, for example, a pro-immigration group and an anti-immigration group. and both of those groups would be almost caricatures of what those two sides think of each other. and their goal of running ads was to find populations of people who are open to those kinds of messages, to get them into those groups, and then to deliver content on a regular basis to drive them apart. really what the russians are trying to do is to find these fault lines in u.s. society and amplify them, and to make americans not trust each other.
>> narrator: in september 2017, nearly a year after the election, zuckerberg announced on facebook what the company had found.
>> we are actively working with the u.s. government on its
ongoing investigations into russian interference. we've been investigating this for many months now, and for a while, we had found no evidence of fake accounts linked to russian... linked to russia running ads. when we recently uncovered this activity, we provided that information to the special counsel. and this morning, i directed our team to provide the ads we've found to congress, as well.
>> we do know that facebook-related posts touched about 150 million americans-- posts that originated either through russian fake accounts or through paid advertising. but the paid advertising was a relatively small piece of the overall problem. a much bigger problem was the ability for someone to say they were james in washington, d.c., but it was actually boris in st. petersburg, creating a fake persona that would generate followers, and then they would seed it with the fake information and the false news and the political content. one account was set up to try to rally the muslim community in, in texas.
another was an attempt to kind of rally the right wing in texas. they created an event.
>> white power!
>> stop the hate! stop the fear!
>> narrator: a protest, with both sides protesting against each other, at a mosque in houston in 2016.
>> this is america, we have the right to speak out.
>> (yelling)
>> but for the good work of the houston police, we could have had the kind of horrible activity take place then and there that i saw unfortunately take place in charlottesville in my state last year. so the real human consequences of some of these... of some of this abuse-- we've been very lucky that it hasn't actually cost people's lives.
>> narrator: facebook also found that the russians had used the site to orchestrate a pro-trump rally outside a cheesecake factory in florida and to promote an anti-trump protest in new york city just after the election.
>> hey, hey, ho, ho, donald trump has got to go...
>> we are under threat, and i need to defend the country that i love.
>> we are right in the middle of the protest.
>> narrator: the details of facebook's internal investigation set off alarm bells in washington.
>> we're such a ripe target for that sort of thing, and the russians know that. so they fed that divisiveness, that polarization, because they had, they had messages for everybody. you know, black lives matter, white supremacists, gun control advocates, gun control opponents, it didn't matter-- they had messages for everybody.
>> jacoby: did you think that was a pretty sophisticated campaign?
>> it was. and i believe the russians did a lot to get people out to vote that wouldn't have, and helped the appeal for... of donald trump.
>> jacoby: and the role that social media played in that was what?
>> it was huge. i mean, it's really quite insidious and evil to, to attack a democratic society in that manner.
>> jacoby: but there were warning signs along the way in the trajectory of the company.
>> the company's been dealing with the negative effects of its product for years, right?
when you have two billion people on a communication platform, there's an infinite number of potentially bad things that could happen. the tough part is trying to decide where you're going to put your focus.
>> narrator: facebook was being accused of not focusing on other serious issues in developing, fragile democracies where the company had expanded its business. countries like the philippines, where almost all internet users are on facebook, and problems had been mounting.
>> in a year, i probably met with more than 50 different officials, high-ranking officials, including mark zuckerberg. i wanted them to know what we were seeing, i wanted them to tell me what they thought about it, and i wanted them to fix it.
>> narrator: maria ressa, who runs a prominent news website, says she had been warning facebook since 2016 that president rodrigo duterte was using a network of paid followers and fake accounts to spread lies about his policies and attack his critics.
>> the u.n.'s branded his war on drugs a crime under international law.
>> narrator: duterte especially targeted critics of his brutal war on drugs, which has taken an estimated 12,000 lives.
>> human rights watch has called it government-sanctioned butchery.
>> president duterte was targeting anyone who questioned the drug war, anyone who questioned the alleged extrajudicial killings. anyone who questioned that would get brutally bashed. we're protected by the constitution, but we've been stripped of those protections online.
>> narrator: ressa herself would eventually come under attack.
>> there were attacks on the way i looked, the way i sounded, that i should be raped, that i should be killed. we gave it a name: patriotic trolling. online hate that is meant to silence, meant to intimidate. so this is an information ecosystem that just turns democracy upside-down.
>> jacoby: and where lies are prevalent.
>> where lies are truth.
>> narrator: she traced the disinformation to 26 fake accounts and reported it to facebook at a meeting in singapore in august of 2016.
>> jacoby: what were you asking them to do?
>> exactly what every news group does, which is, take control and be responsible for what you publish.
>> jacoby: were you given an explanation as to why they weren't acting?
>> no. no. i think facebook walked into the philippines, and they were focused on growth. what they didn't realize is that countries like the philippines...
>> (chanting)
>> ...countries where institutions are weak, where corruption is rampant, these countries don't have the safeguards. and what happens when you bring everyone onto a platform and do not exercise any kind of rules, right? if you don't implement those rules before coming to a country...
>> jacoby: in the philippines, we've heard from people on the ground there that facebook has been to some degree weaponized by the duterte regime there. what are you doing about this problem in the philippines?
>> one thing we're trying to do, any time that we think there might be a connection between violence on the ground and online speech, the first thing for us to do is actually understand the landscape.
>> narrator: monika bickert is facebook's head of global policy, and worked for the justice department in southeast asia.
>> there's a fundamental question, which is, "what should our role be, and as we are identifying misinformation, should we be telling people what we're finding, should we be removing that content, should we be down-ranking that content?" and we now have a team that is focused on how to deal with exactly that sort of situation.
>> narrator: in april 2018, facebook created a news verification program and hired ressa's organization as one of its fact-checkers, though she says the problems are ongoing. the company ultimately took down the accounts ressa identified-- and went on to remove dozens more.
>> i think what is happening is that this company is way in over its head in terms of its responsibilities. it's way in over its head in terms of what it has taken on.
the idea isn't that it's just, like, you magically add facebook and horrible things happen. but you have facebook as this effective gasoline to simmering fires.
>> narrator: elsewhere in the region...
>> buddhists are inciting hatred and violence against muslims through social media...
>> narrator: ...facebook was also being used to fan ethnic tensions with even more dire consequences.
>> violence between buddhists and muslims is continuing.
>> misinformation, disinformation, rumors, extremist propaganda, all kinds of bad content.
>> narrator: for several years, david madden, a tech entrepreneur living in myanmar, as well as journalists and activists, had been warning facebook that the muslim minority there was being targeted with hate speech.
>> (speaking local language)
>> you would see the use of memes, of images, things that were degrading and dehumanizing, targeting the muslim community.
>> (speaking local language)
>> narrator: the warning signs were present as far back as 2014, when a fake news story spread on facebook.
>> reports, later proved to be false, that some muslim men had raped a buddhist woman, were shared on facebook.
>> an angry mob of about 400 surrounded the sun teashop, shouting and throwing bricks and stones.
>> narrator: two people died in the incident.
>> one buddhist and one muslim.
>> i was really concerned that the seriousness of this was not understood. and so i made a presentation at facebook headquarters in may of 2015. i was pretty explicit about the state of the problem. i drew the analogy with what had happened in rwanda, where radios had played a really key role in the genocide. and so i said, "facebook runs the risk of being in myanmar what radios were in rwanda"--
that this platform could be used to foment hate and to incite violence.
>> jacoby: what was the reaction to that at facebook?
>> i got an email shortly after that meeting to say that what had been presented at that meeting had been shared internally and apparently taken very seriously.
>> narrator: the violence intensified.
>> massive waves of violence that displaced over 150,000 people.
>> narrator: and in early 2017, madden and other local activists had another meeting with facebook.
>> the objective of this meeting was, was really to be crystal-clear about just how bad the problem was, and that the processes that they had in place to try to identify and pull down problematic content, they just weren't working. and we were deeply concerned that something even worse was going to happen imminently. it was a sobering meeting. i think... i think the takeaway from facebook was,
"we'll need to go away and dig into this and come back with something substantive." the thing was, it never came.
>> jacoby: and how do you know that?
>> we can look at the evidence on the ground.
>> what we've seen here tells us a story of ethnic cleansing, of...
>> narrator: the united nations would call the violence in myanmar a genocide, and found social media, and facebook in particular, had played a significant role.
>> the ultra-nationalist buddhists have their own facebook accounts and are really inciting a lot of violence and hatred against ethnic minorities. facebook has now turned into a beast, different than what it was originally intended to be.
>> jacoby: i'm curious what it's like when the u.n. comes out with a report that says facebook played a significant role in a genocide. what's that like for you, running content policy at facebook?
>> well, this would matter to me even if i didn't work at
facebook, given my background. my background is as a federal prosecutor, and i worked specifically in asia and specifically on violent crimes against people in asia. so something like that really hits home.
>> jacoby: facebook was warned as early as 2015 about the potential for a really dangerous situation in myanmar. what went wrong there? why was it so slow?
>> we met with civil society organizations in myanmar far before 2015. this is an area where we've been focused. i think what we've learned over time is how important it is for us to build the right technical tools that can help us find some of this content, and also work with organizations on the ground in a real-time fashion. we are in the process of building those relationships around the world on a much deeper level, so that we can stay ahead of any kind of situation like that.
>> narrator: throughout 2018, facebook says, it's taken down problematic accounts in myanmar, hired language experts, and improved its policies.
>> jacoby: should there be any liability or any legal accountability for a company like facebook when something so
disastrous goes wrong on your platform?
>> there's all sorts of accountability. but probably the group that holds us the most accountable are the people using the service. if it's not a safe place for them to come and communicate, they are not going to use it.
>> we are working here in menlo park, in palo alto, california. to the extent that some of these issues and problems manifest in other countries around the world, we didn't have sufficient information and a pulse on what was happening in southeast asia.
>> narrator: naomi gleit is facebook's second-longest-serving employee.
>> and so one change that we've made, along with hiring so many more people, is that these people are based internationally and can give us that insight that we may not get from being here at headquarters.
>> jacoby: i'm trying to understand, you know, the choices that were made. do you regret choices going backward, decisions that were made about not taking into account risks or not measuring risks?
>> yeah, i definitely think we
regret not having 20,000 people working on safety and security back in the day, yes. so i regret that we were too slow, that it wasn't our priority.
>> jacoby: was it considered at the time? to kind of amp up safety and security, but there was some reason not to?
>> not really. i mean, we had a safety and security team. i think we just thought it was sufficient. it's just... it's not that we were, like, "well, we could do so much more here," and decided not to. i think we... we just didn't... we were a bit idealistic.
>> facebook has created this platform that in many countries, not just myanmar, has become the dominant information platform, and it has an outsized influence in lots of countries. that comes with a lot of responsibility.
>> using social media, rumors of alleged muslim wrongdoing spread fast...
>> these countries are wrestling with some pretty big challenges--
tensions between ethnic and religious groups-- and we have seen this explode into what mark zuckerberg would call real-world harm, what others would just call violence, in many other markets. we're seeing it right now in india.
>> ...became a victim of...
>> we've seen examples of this in places like sri lanka.
>> to keep the violence from spreading, sri lanka also shut down facebook...
>> the myanmar example is sounding an alarm at the highest level of the company, that this requires a comprehensive strategy.
>> narrator: but it would be far from myanmar, and a very different kind of problem, that would cause an international uproar over facebook.
>> cambridge analytica and its mining of data on millions of americans for political purposes...
>> cambridge is alleged to have used all this data from tens of millions of facebook users...
>> escándalo cambridge analytica, facebook...
(reporters speaking different languages)
>> narrator: it was a scandal over how facebook failed to protect users' data, brought to light by a whistleblower
named christopher wylie.
>> when i met wylie, he was saying, "i can prove this."
>> narrator: he said that facebook knew that a political firm he'd worked for, cambridge analytica, had been using the personal data of more than 50 million users to try to influence voters.
>> at cambridge analytica, we are creating the future of political campaigning.
>> this is a company that specializes, and would advertise itself as specializing, in influence campaigns.
>> political campaigns have changed.
>> seeding the internet with misinformation.
>> putting the right message in front of the right person at the right moment.
>> and that's the power of data. you can literally figure out who are the people who are most susceptible.
>> ...data about personality, so you know exactly who to target.
>> narrator: the firm gained access to the data from a third party, without facebook's permission.
>> the overwhelming majority of people who had their data collected did not know. when data leaves facebook's servers, there is no way for facebook to track that data to know how that data is being used, or to find
out how many copies there are.
>> narrator: facebook eventually changed its data sharing policies and ordered cambridge analytica to delete the data.
>> we know that facebook had known about this...
>> narrator: after wylie came forward, they banned the firm from their site, and announced they were ending another controversial practice: working directly with companies known as data brokers. but the uproar was so intense that in april 2018, mark zuckerberg was finally called before congress, in what would become a reckoning over facebook's conduct, its business model, and its impact on democracy.
>> we welcome everyone to today's hearing on facebook's social media privacy and the use and abuse of data... i now turn to you, so proceed, sir.
>> we face a number of important issues around privacy, safety, and democracy.
and you will rightfully have some hard questions for me to answer. facebook is an idealistic and optimistic company. and as facebook has grown, people everywhere have gotten a powerful new tool for making their voices heard and for building communities and businesses. but it's clear now that we didn't do enough to prevent these tools from being used for harm, as well. and that goes for fake news, for foreign interference in elections, and hate speech, as well as developers and data privacy. we didn't take a broad enough view of our responsibility, and that was a big mistake. and it was my mistake. and i'm sorry.
>> if, like me, you're following this stuff, you see years and years and years of people begging and pleading with the company, saying, "please pay attention to this," at every channel people could find, and basically being ignored. "we hear you, you're concerned, we apologize, of course we have a responsibility, we'll do better." and the public record here is that they are a combination of
unable and unwilling to grasp and deal with this complexity.
>> you may decide, or facebook may decide, it needs to police a whole bunch of speech, that i think america might be better off not having policed by one company that has a really big and powerful platform.
>> senator, i think that this is a really hard question. and i think it's one of the reasons why we struggle with it.
>> these are very, very powerful corporations. they do not have any kind of traditional democratic accountability. and while i personally know a lot of people making these decisions, if we set the norms that these companies need to decide who does and does not have a voice online, eventually that is going to go to a very dark place.
>> when companies become big and powerful, there is an instinct to either regulate or break up, right?
>> i think we're finding ourselves now in a position
where people feel like something should be done. there's a lot of questions about what should be done, but there's no question that something should be done.
>> you don't think you have a monopoly?
>> it certainly doesn't feel like that to me.
>> okay.
>> you know, there's a lot of problems here, but all of these problems get worse when one company has too much power, too much information, over too many people.
>> narrator: after years of growth, the talk now is increasingly about how to rein in facebook. already, in europe, there's a new internet privacy law aimed at companies like facebook. inside the company, the people we spoke to insisted that facebook is still a force for good.
>> jacoby: has there ever been a minute where you've questioned the mission? you know, internally? whether anyone has taken a second to step back and say, "all right, has this blinded us in some way?" have you had a moment like that?
>> i still continue to firmly believe in the mission, but in terms of stepping back, in terms of reflecting, absolutely.
but that reflection isn't on the mission; it's on, how can we do a better job of minimizing bad experiences on facebook?
>> jacoby: why wasn't that part of the metrics earlier, in terms of, how do you minimize the harm?
>> you know, it's possible that we could have done more sooner, and we haven't been as fast as we needed to be.
>> narrator: that line was repeated by all the current officials facebook put forward to answer questions.
>> we've been too slow to act on...
>> i think we were too slow...
>> we didn't see it fast enough.
>> we were too slow.
>> mark has said this, that we have been slow.
>> one of my greatest regrets in running the company is that we were slow in identifying the russian information operations in 2016. and we're going to take a number of measures, from building and deploying new a.i. tools that take down fake news, to growing our security team to more than 20,000 people...
>> the goal here is to deep-dive on the market nuances there...
>> narrator: the company says it's now investing resources and talent to tackle a range of problems, from the spread of hate speech to election interference.
>> even if we can't do fact-checking, if we can do more
work around the programmatic aspect of it...
>> narrator: this is part of the team tackling the spread of misinformation around the world, led by tessa lyons.
>> ...a framework for how they're thinking about secondary languages by country. and i feel like from the misinformation side, we've mostly prioritized primary languages.
>> narrator: it's a problem the company admits it is a long way from solving.
>> the next thing is about the arabic fact-checking project. i think the main blocker here is potentially getting a fact-checker that can cover an entire region.
>> you know, i came into this job asking myself, "how long is it going to take us to solve this?" and the answer is, this isn't a problem that you solve. it's a problem that you contain.
>> awesome. next, segue into upcoming launches.
>> narrator: in advance of the 2018 midterms, facebook mobilized an election team to monitor false news stories and delete fake accounts that may have been trying to influence voters. nathaniel gleicher runs the team.
>> there are going to be actors that are going to try to manipulate that public debate. what
are the techniques they're using, and how do we make it much harder?
>> jacoby: is there going to be real-time monitoring on election day of what's going on on facebook, and how are you going to actually find things that may sow distrust in the election?
>> absolutely. we're going to have a team on election day focused on that problem, and one thing that's useful here is, we've already done this in other elections.
>> jacoby: and you're confident you can do that here?
>> i think that... yes, i'm confident that we can do this here.
>> narrator: gleicher says his team continues to find foreign actors using the platform to spread disinformation.
>> iran was revealed to be a new player in worldwide disinformation campaigns, and on top of this...
>> narrator: and in october of 2018, federal prosecutors announced they'd found evidence that russian operatives had been trying to interfere in the u.s. midterm election.
>> jacoby: what is the standard that the public should hold facebook to, in terms of solving some of these seemingly enormous
problems?
>> i think the standard, the responsibility, what i'm focused on, is amplifying the good and minimizing the bad. and we need to be transparent about what we're doing on both sides, and, you know, i think this is an ongoing discussion.
>> jacoby: what's an ongoing discussion?
>> how we're doing on minimizing the bad.
>> jacoby: but we're dealing with such consequential issues, right? we're talking about the integrity of our elections, we're talking about...
>> absolutely.
>> jacoby: ...in some cases, playing a role in a genocide. an ongoing conversation means what, exactly, about a standard for success here?
>> i think, you know, this is the number-one priority for the company. mark has been out there, sheryl is out there, you're talking to me and a bunch of the other leaders. that's what we mean by having an ongoing conversation. this is something that we need to, as you said... this is serious, this is consequential. we take this extremely...
like, we understand this responsibility, and it's not going away.
>> jacoby: do you think facebook has earned the trust to be able to say, "trust us, we've got this"?
>> i'm not going to answer that, i'm sorry. that's just... people can make that decision for themselves.
>> jacoby: but what... do you trust them?
>> i trust the people who i worked with. i think there are some good people who are working on this. that doesn't mean i don't think we should pass laws to back that up.
>> ...for facebook...
>> ...social media giant...
>> narrator: for facebook, the problems have been multiplying.
>> ...the cyber attack affecting nearly 50 million facebook users...
>> facebook continues to... political ads and news...
>> narrator: but mark zuckerberg's quest to connect and change the world continues.
>> hey, everyone! welcome to f8. this has been an intense year. i can't believe we're only four months in.
>> after all these scandals, facebook's profits are still going up, right? so they don't really have a huge incentive to change the core problem, which is their business model.
>> we are announcing a new set of features coming soon...
>> they're not going to do it as long as they're doing so well financially and there's no regulatory oversight. and consumer backlash doesn't really work, because i can't leave facebook-- my friends and family around the world are there. you might not like the company, you might not like its privacy policies, you might not like the way its algorithm works, you might not like its business model, but what are you going to do?
>> now, there's no guarantee that we get this right. this is hard stuff. we will make mistakes, and they will have consequences, and we will need to fix them.
>> narrator: as he has since the beginning, he sees facebook, his invention, not as part of the problem, but the solution.
>> so if you believe, like i do, that giving people a voice is important, that building
relationships is important, that creating a sense of community is important, and that doing the hard work of trying to bring the world closer together is important, then i say this: we will keep building. (cheers and applause)
♪ ♪
♪ hold, hold on, hold on to me ♪
♪ 'cause i'm a little unsteady ♪
>> what's the situation there?
>> how do you explain that?
>> are you ready for this world that we are facing today?
♪
>> go to pbs.org/frontline for
the latest developments in the facebook story. then check out the new frontline "transparency project" and see key quotes from the film in context.
>> this really was a situation in which we saw the tip of this iceberg, and we knew there was some kind of iceberg beneath it.
>> this isn't a problem that you solve, it's a problem you contain.
>> what i'm focused on is amplifying the good and minimizing the bad.
>> connect to the frontline community on facebook, twitter, or pbs.org/frontline.
>> frontline is made possible by contributions to your pbs station from viewers like you. thank you. and by the corporation for public broadcasting. major support is provided by the john d. and catherine t. macarthur foundation, committed to building a more just, verdant and peaceful world. more information is available at macfound.org. the ford foundation, working with visionaries on the front lines of social change worldwide. at fordfoundation.org. additional support is provided by the abrams foundation, committed to excellence in journalism. the park foundation, dedicated
to heightening public awareness of critical issues. the john and helen glessner family trust, supporting trustworthy journalism that informs and inspires. and by the frontline journalism fund, with major support from jon and jo ann hagler. and by the y, for a better us.
captioned by media access group at wgbh access.wgbh.org
>> for more on this and other frontline programs, visit our website at pbs.org/frontline.
♪ ♪
to order frontline's "the facebook dilemma" on dvd, visit shoppbs.org or call 1-800-play-pbs. this program is also available on amazon prime video.