
tv   Frontline  PBS  October 29, 2018 9:00pm-10:00pm PDT

>> narrator: tonight-- part one of a two-night special.
>> we face a number of important issues around privacy, safety, and democracy.
>> narrator: frontline investigates... facebook.
>> we didn't take a broad enough view of our responsibility. and it was my mistake. and i'm sorry.
>> narrator: told by company insiders...
>> it's possible that we haven't been as fast as we needed to be.
>> we've been too slow to act on-- we didn't see it fast enough--
>> i think we were too slow--
>> narrator: ...and former employees.
>> i mean, everybody was pretty upset that we hadn't caught it during the election.
>> narrator: how facebook was used to disrupt democracy around the globe.
>> i don't think any of us, mark included, appreciated how much of an effect we might have had.
>> narrator: correspondent james jacoby takes a hard look at the man who wanted to connect the world.
>> jacoby: is he not recognizing the importance of his platform?
>> he didn't understand what he had built.
>> narrator: but is he accountable for helping divide it?
>> there is something wrong systemically with the facebook algorithms. in effect, polarization was the key to the model.
>> narrator: tonight on frontline-- "the facebook dilemma."
>> frontline is made possible by contributions to your pbs station from viewers like you. thank you. and by the corporation for public broadcasting. major support is provided by the john d. and catherine t. macarthur foundation, committed to building a more just, verdant and peaceful world. more information is available at macfound.org. the ford foundation, working with visionaries on the front lines of social change worldwide. at fordfoundation.org. additional support is provided by the abrams foundation, committed to excellence in journalism. the park foundation, dedicated to heightening public awareness of critical issues. the john and helen glessner family trust. supporting trustworthy
journalism that informs and inspires. the wyncote foundation. and by the frontline journalism fund, with major support from jon and jo ann hagler. and additional support from chris and lisa kaneb.
(birds chirping)
♪
>> are we good?
>> should i put the beer down?
>> nah, no, actually, i'm gonna mention the beer. (laughing)
>> hard at work.
>> so i'm here in palo alto, california, chilling with mark zuckerberg of thefacebook.com, and we're drinking out of a keg of heineken because... what are we celebrating, mark?
>> we just got three million users.
>> 11, 12, 13...
>> whoo!
>> tell me, you know, simply what facebook is.
>> i think facebook is an online directory for colleges. i realized that because i didn't have people's information, i needed to make it interesting enough so that people would want to use the site and want to, like, put their information up. so we launched it at harvard, and within a couple of weeks, two-thirds of the school had signed up. so we're, like, "all right, this is pretty sweet, like, let's just go all out." i mean, it's just interesting seeing how it evolves. we have a sweet office.
>> yeah, well, show us... show us around the crib.
(talking in background)
>> we didn't want cubicles, so we got ikea kitchen tables instead. we thought that kind of went along with our whole vibe here.
>> uh-huh. what's in your fridge?
>> some stuff. there's some beer down there.
>> how many people work for you?
>> it's actually 20 right now.
>> did you get this shot, this one here, the lady riding a pit bull?
>> oh, nice.
>> all right, it's really all i've got.
>> that's cool.
>> where are you taking facebook at this point in your life?
>> um, i mean, there doesn't necessarily have to be more.
♪
>> from the early days, mark had this vision of connecting the whole world. so if google was about providing you access to all the information, facebook was about connecting all the people.
>> can you just say your name and pronounce it so nobody messes it up and they have it on tape?
>> sure, it's mark zuckerberg.
>> great.
>> it was not crazy. somebody was going to connect all those people, why not him?
>> we have our facebook fellow, we have mark zuckerberg.
>> i have the pleasure of introducing mark zuckerberg, founder of facebook.com.
(applause)
>> yo.
>> when mark zuckerberg was at harvard, he was fascinated by hacker culture, this notion that software programmers could do things that would shock the world.
>> and a lot of times, people are just, like, too careful. i think it's more useful to, like, make things happen and
then, like, apologize later, than it is to make sure that you dot all your i's now and then, like, just not get stuff done.
>> so it was a little bit of a renegade philosophy and a disrespect for authority that led to the facebook motto "move fast and break things."
>> never heard of facebook?
(laughing)
>> our school went crazy for the facebook.
>> it creates its own world that you get sucked into.
>> we started adding things like status updates and photos and groups and apps. when we first launched, we were hoping for, you know, maybe 400, 500 people.
(cheering)
>> toast to the first 100 million, and the next 100 million.
>> cool.
>> so you're motivated by what?
>> building things that, you know, change the world in a way that it needs to be changed.
>> why barack obama? the answer is right there on my facebook page.
>> mr. zuckerberg...
>> 'sup, zuck?
>> in those days, "move fast and break things" didn't seem to be sociopathic.
>> if you're building a product
people love, you can make a lot of mistakes.
>> it wasn't that they intended to do harm so much as they were unconcerned about the possibility that harm would result.
>> so just to be clear, you're not going to sell or share any of the information on facebook?
>> we're not gonna share people's information, except for with the people that they've asked for it to be shared.
>> technology optimism was so deeply ingrained in the value system and in the beliefs of people in silicon valley...
>> we're here for a hackathon, so let's get started.
>> ...that they'd come to believe it is akin to the law of gravity, that of course technology makes the world a better place. it always had, it always will. and that assumption essentially masked a set of changes that were going on in the culture that were very dangerous.
>> from kxjz in sacramento...
>> for monday, june 27...
>> narrator: mark zuckerberg's quest to connect the world would bring about historic change, and far-reaching consequences, in
politics, privacy, and technology. we've been investigating warning signs that existed long before problems burst into public view.
>> it was my mistake, and i'm sorry...
>> narrator: but for those inside facebook, the story began with an intoxicating vision that turned into a lucrative business plan.
>> well, the one thing that mark zuckerberg has been so good at is being incredibly clear and compelling about the mission that facebook has always had.
>> facebook's mission is to give people the power to share. give people the power to share. in order to make the world more open and connected... more open and connected... open and connected... more open and connected.
(applause)
>> james jacoby: how pervasive a mission was that inside of the company? give me a sense of that.
>> it was something that... you know, mark doesn't just say it when we do, you know, ordered calisthenics in the morning and we yell the mission to each other, right? we would actually say it to each other, you know, when mark wasn't around.
>> jacoby: and that was a mission that you really believed in? i mean...
>> how could you not? how exciting. what
if connecting the world actually delivered a promise that we've been looking for to genuinely make the world a better place?
>> jacoby: was there ever a point where there were questions internally about this mission being naïve optimism?
>> i think the short answer is completely yes, and i think that's why we loved it. especially in a moment like when we crossed a billion monthly active users for the first time. and mark's... the way i recall mark at the time, i remember thinking, "i don't think mark is going to stop until he gets to everybody."
>> i think some of us had an early understanding that we were creating in some ways a digital nation-state. this was the greatest experiment in free speech in human history.
>> there was a sense inside the company that we were building the future and there was a real focus on youth being a good
thing. it was not a particularly diverse workforce. it was very much the sort of harvard, stanford, ivy league group of people who were largely in their 20s.
>> i was a big believer in the company. like, i knew that it was going to be a paradigm-shifting thing. there was this, definitely this feeling of everything for the company, of this, you know, world-stirring vision. everyone more or less dressed with the same fleece and shirt with logo on it. posters on the wall that looked somewhat orwellian. but of course, you know, in an upbeat way, obviously. and, you know, some of the slogans are pretty well-known-- "move fast and break things," "fortune favors the bold," "what would you do if you weren't afraid?" you know, it was always this sort of rousing rhetoric that would push you to go further.
>> narrator: antonio garcia martinez, a former product manager on facebook's advertising team, is one of eight former facebook insiders who agreed to talk on camera about their experiences.
>> in silicon valley, there's a, you know, almost mafioso code of silence that
you're not supposed to talk about the business in any but the most flattering way, right? basically, you can't say anything, you know, measured or truthful about the business. and i think, as perhaps with facebook, it's kind of arrived at the point at which it's so important, it needs to be a little more transparent about how it works. like, let's stop the (bleep) parade about everyone in silicon valley, you know, creating, disrupting this and improving the world, right? it's, in many ways, a business like any other. it's just kind of more exciting and impactful.
(techno music playing)
>> narrator: by 2007, zuckerberg had made it clear that the goal of the business was worldwide expansion.
>> almost a year ago, when we were first discussing how to get everyone in the world onto facebook, i remember someone said to me, "mark, we already have nearly every college student in the u.s. on facebook. it's incredible that we were even able to do that. but no one gets a second trick like that." well, let's take a look at how we did.
(cheering and applause)
>> jacoby: what was the growth team about? what did you do at growth?
>> the story of growth has really been about making
facebook available to people that wanted it but couldn't have access to it.
>> narrator: naomi gleit, facebook's second-longest serving employee, is one of five officials the company put forward to talk to frontline. she was an original member of the growth team.
>> one of my first projects was expanding facebook to high school students. i worked on translating facebook into over a hundred languages. when i joined, there were one million users, and now there's over two billion people using facebook every month.
>> jacoby: some of the problems that have reared their head with facebook over the past couple of years seem to have been caused in some ways by this exponential growth.
>> so, i think mark-- mark has said this, that we have been slow to really understand the ways in which facebook might be used for bad things. we've been really focused on the good things.
>> so who are all of these new users?
>> the growth team had tons of engineers figuring out how you
could make the new user experience more engaging, how you could figure out how to get more people to sign up. everyone was focused on growth, growth, growth.
>> give people the power to share...
>> narrator: and the key to keeping all these new people engaged...
>> to make the world more open and connected.
>> narrator: ...was facebook's most important feature...
>> news feed.
>> narrator: news feed, the seemingly endless stream of stories, pictures, and updates shared by friends, advertisers, and others.
>> it analyzes all the information available to each user, and it actually computes what's going to be the most interesting piece of information, and then publishes a little story for them.
>> it's your personalized newspaper, it's your "the new york times" of you, channel you. it is, you know, your customized, optimized vision of the world.
>> narrator: but what appeared in users' news feed wasn't random. it was driven by a secret mathematical formula, an algorithm.
>> the stories are ranked in terms of what's going to be the most important, and we design a lot of algorithms so we can produce interesting content for you.
>> the goal of the news feed is to provide you, the user, with the content on facebook that you most want to see.
it is designed to make you want to keep scrolling, keep looking, keep liking.
>> that's the key. that's the secret sauce. that's how... that's why we're worth x billion dollars.
>> narrator: the addition of the new "like" button in 2009 allowed news feed to collect vast amounts of users' personal data that would prove invaluable to facebook.
>> at the time we were a little bit skeptical about the like button-- we were concerned. and as it turned out, our intuition was just dead wrong. and what we found was that the like button acted as a social lubricant. and, of course, it was also driving this flywheel of engagement, that people felt like they were heard on the platform whenever they shared something.
>> connect to it by liking it...
>> and it became a driving force for the product.
>> it was incredibly important because it allowed us to understand who are the people that you care more about, that cause you to react, and who are the businesses, the pages, the other interests on facebook that are important to you. and that gave us a degree of constantly increasing understanding about people.
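[editor's note: the ranking idea described above -- score every candidate story for a given user, then sort the feed by that score -- can be sketched in a few lines. Facebook's actual model is proprietary and learned from thousands of signals; the features and weights below (an "affinity" for the author, comments and shares counting more than likes) are invented purely for illustration.]

```python
from dataclasses import dataclass

@dataclass
class Story:
    author: str
    text: str
    likes: int
    comments: int
    shares: int
    affinity: float  # how often this user interacts with the author (0..1)

def score(story: Story) -> float:
    # Toy engagement score: weight deeper interactions (comments, shares)
    # more heavily than likes, then scale by affinity for the author.
    engagement = story.likes + 4 * story.comments + 8 * story.shares
    return story.affinity * engagement

def rank_feed(candidates: list[Story]) -> list[Story]:
    # Highest predicted engagement first -- the "personalized newspaper."
    return sorted(candidates, key=score, reverse=True)
```

[in a real system the score would be a trained prediction of how likely the user is to react, not a hand-written formula; the sort step is the "ranked in terms of what's going to be the most important" idea.]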
>> news feed got off to a bit of a rocky start, and now our users love news feed. they love it.
>> narrator: news feed's exponential growth was spurred on by the fact that existing laws didn't hold internet companies liable for all the content being posted on their sites.
>> so, section 230 of the communications decency act is the provision which allows the internet economy to grow and thrive. and facebook is one of the principal beneficiaries of this provision. it says don't hold this internet company responsible if some idiot says something violent on the site. don't hold the internet company responsible if somebody publishes something that creates conflict, that violates the law. it's the quintessential provision that allows them to say, "don't blame us."
>> narrator: so it was up to facebook to make the rules, and
inside the company, they made a fateful decision.
>> we took a very libertarian perspective here. we allowed people to speak and we said, "if you're going to incite violence, that's clearly out of bounds. we're going to kick you off immediately." but we're going to allow people to go right up to the edge and we're going to allow other people to respond. we set up some ground rules. basic decency, no nudity, and no violent or hateful speech. and after that, we felt some reluctance to interpose our value system on this worldwide community that was growing.
>> jacoby: was there not a concern, then, that it could become sort of a place of just utter confusion, that you have lies that are given the same weight as truths, and that it kind of just becomes a place where truth becomes completely obfuscated?
>> no. we relied on what we thought were the public's common sense
and common decency to police the site.
>> narrator: that approach would soon contribute to real-world consequences far from silicon valley, where mark zuckerberg's optimistic vision at first seemed to be playing out.
(crowd chanting)
the arab spring had come to egypt.
(crowd chanting)
it took hold with the help of a facebook page protesting abuses by the regime of hosni mubarak.
>> not that i was thinking that this facebook page was going to be effective. i just did not want to look back and say that happened and i just didn't do anything about it.
>> narrator: at the time, wael ghonim was working for google in the middle east.
>> in just three days, over 100,000 people joined the page. throughout the next few months, the page was growing until what happened in tunisia.
>> events in tunisia have captured the attention of viewers around the world, and a lot of it was happening online.
>> it took just 28 days until the fall of the regime.
>> and it just created for me a moment of, "maybe we can do this." and i just posted an event calling for a revolution in ten days, like we should all get to the street and we should bring down mubarak.
>> organized by a group of online activists...
>> they're calling it the facebook revolution...
(crowd chanting)
>> narrator: within days, ghonim's online cry had helped fill the streets of cairo with hundreds of thousands of protesters.
(crowd chanting)
18 days later...
>> (translated): president muhammad hosni mubarak has decided to step down.
(cheering)
>> they have truly achieved the unimaginable.
>> it's generally acknowledged
that ghonim's facebook page first sparked the protests.
>> jacoby: there was a moment when you were being interviewed on cnn.
>> yeah, i remember that.
>> first tunisia, now egypt, what's next?
>> ask facebook.
>> ask what?
>> facebook.
>> facebook.
>> the technology was, for me, the enabler. i would not have been able to engage with others, i would not have been able to propagate my ideas to others without social media, without facebook.
>> you're giving facebook a lot of credit for this?
>> yeah, for sure. i want to meet mark zuckerberg one day and thank him, actually.
>> did you ever think that this could have an impact on revolution?
>> you know, my own opinion is that it would be extremely arrogant for any specific technology company to claim any meaningful role in those. but i do think that the overall trend that's at play here, which is people being able to share
what they want with the people who they want, is an extremely powerful thing, right? and we're kind of fundamentally rewiring the world from the ground up. and it starts with people.
>> they were relatively restrained externally about taking credit for it, but internally they were, i would say, very happy to take credit for the idea that social media was being used to effect democratic change.
>> activists and civil society leaders would just come up to me and say, you know, "wow, we couldn't have done this without you guys." government officials, you know, would say, "does facebook really realize how much you guys are changing our societies?"
>> it felt like facebook had extraordinary power, and power for good.
>> narrator: but while facebook was enjoying its moment...
(man shouting, crowd chanting)
back in egypt, on the ground and on facebook, the situation was unraveling.
>> following the revolution,
things went into a much worse direction than what we have anticipated.
>> there's a complete split between the civil community and those who are calling for an islamic state.
>> what was happening in egypt was polarization.
>> deadly clashes between christians and military police.
>> (translated): the brotherhood cannot rule this country.
>> and all these voices started to clash, and the environment of social media breeded that kind of clash, like that polarization-- rewarded it.
>> when the arab spring happened, i know that a lot of people in silicon valley thought our technologies helped bring freedom to people, which was true. but there's a twist to this, which is facebook's news feed algorithm.
>> if you increase the tone of your posts against your opponents, you are gonna get more distribution. because we tend to be more tribal. so if i call my opponents names, my tribe is happy and celebrating, "yes, do it, like, comment, share, so more people
end up seeing it." because the algorithm is going to say, "oh, okay, that's engaging content, people like it, show it to more people."
>> there were also other groups of thugs, part of the pattern of sectarian violence.
>> the hardest part for me was seeing the tool that brought us together tearing us apart. these tools are just enablers for whomever, they don't separate between what's good and bad. they just look at engagement metrics.
>> narrator: ghonim himself became a victim of those metrics.
>> there was a page, it had, like, hundreds of thousands of followers-- all what it did was creating fake statements, and i was a victim of that page. they wrote statements about me insulting the army, which puts me at serious risk because that is not something i said. i was extremely naïve in a way i don't like, actually, now, thinking that these are liberating tools. it's the spread of misinformation, fake news, in egypt in 2011.
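[editor's note: the feedback loop ghonim describes -- reactions earn distribution, and distribution earns more reactions -- can be shown with a toy simulation. The allocation rule and the numbers here are assumptions for illustration, not facebook's actual distribution logic.]

```python
def simulate_feed(rounds: int, reaction_rates: dict[str, float]) -> dict[str, float]:
    """Toy model of engagement-based distribution.

    reaction_rates: post name -> fraction of viewers who like/comment/share.
    Each round, a fixed pool of reach is re-allocated in proportion to the
    engagement each post has accumulated, so a small difference in how
    provocative a post is compounds round after round.
    """
    reach = {name: 100.0 for name in reaction_rates}      # equal initial audience
    engagement = {name: 0.0 for name in reaction_rates}
    for _ in range(rounds):
        for name, rate in reaction_rates.items():
            engagement[name] += reach[name] * rate        # "people like it"
        total = sum(engagement.values())
        for name in engagement:                           # "show it to more people"
            reach[name] = 1000.0 * engagement[name] / total
    return reach

# A post that provokes reactions at twice the rate ends up with far more
# than twice the distribution of the measured one.
final = simulate_feed(10, {"measured": 0.05, "outrage": 0.10})
```

[under this toy rule the measured post's share of distribution keeps shrinking every round, which is the amplification dynamic the speakers are describing.]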
>> narrator: he says he later talked to people he knew at facebook and other companies about what was going on.
>> i tried to talk to people who are in silicon valley, but i feel like it was not, it was not being heard.
>> jacoby: what were you trying to express to people in silicon valley at the time?
>> it's very serious. whatever that we... that you are building has massive, serious unintended consequences on the lives of people on this planet. and you are not investing enough in trying to make sure that what you are building does not go in the wrong way. and it's very hard to be in their position. no matter how they try and move and change things, there will be always unintended consequences.
>> activists in my region were on the front lines of, you know, spotting corners of facebook that the rest of the world, the rest of the company, wasn't yet talking about, because in a
company that's built off numbers and metrics and measurements, anecdotes sometimes got lost along the way. and that was always a real challenge, and always bothered me.
>> narrator: elizabeth linder, facebook's representative in the region at the time, was also hearing warnings from government officials.
>> so many country representatives were expressing to me a huge concern about the ability of rumors to spread on facebook, and what do you do about that?
>> jacoby: how did you respond to that at the time?
>> we, we didn't have a solution for it, and so the best that i could do is report back to headquarters that this is something that i was hearing on the ground.
>> jacoby: and what sort of response would you get from headquarters?
>> you know, i... it's impossible to be specific about that, because it was always just kind of a, "this is what i'm hearing, this is what's going on." but i think in a... in a company
where the, the people that could have actually, you know, had an impact on making those decisions aren't necessarily seeing it firsthand.
>> i think everything that happened after the arab spring should have been a warning sign for facebook.
>> narrator: zeynep tufekci, a researcher and former computer programmer, had also been raising alarms to facebook and other social media companies.
>> these companies were terribly understaffed, in over their heads in terms of the important role they were playing. like, all of a sudden you're the public sphere in egypt. so i kept starting to talk to my friends at these companies, saying, "you have to staff up. you have to put in large amounts of people who speak the language, who understand the culture, who understand the complexities of wherever you happen to operate."
>> narrator: but facebook hadn't been set up to police the amount of content coming from all the new places it was expanding to.
>> i think no one at any of these companies in silicon
valley has the resources for this kind of scale. you had queues of work for people to go through, and you had hundreds of employees who spend all day every day clicking yes, no, keep, take down, take down, take down, keep up, keep up, making judgment calls, snap judgments, about, "does it violate our terms of service? does it violate our standards of decency? what are the consequences of this speech?" you have this fabulously talented group of mostly 20-somethings who are deciding what speech matters, and they're doing it in real time, all day, every day.
>> jacoby: isn't that scary?
>> it's terrifying. right? the responsibility was awesome. no one could ever have predicted how fast facebook would grow. the, the trajectory of growth of the user base and of the issues was like this. and all... all staffing throughout the company was like this. the company was trying to make
money, it was trying to keep costs down. it had to be a going concern. it had to be a revenue-generating thing, or it would cease to exist.
>> narrator: in fact, facebook was preparing to take its rapidly growing business to the next level by going public.
>> i'm david ebersman, facebook's cfo. thank you for taking the time to consider an investment in facebook.
>> the social media giant hopes to raise $5 billion.
>> the pressure heading into the i.p.o., of course, was to prove that facebook was a great business. otherwise, we'd have no shareholders.
>> facebook-- is it worth $100 billion? should it be valued at that?
>> narrator: zuckerberg's challenge was to show investors and advertisers the profit that could be made from facebook's most valuable asset-- the personal data it had on its users.
>> mark, great as he was at vision and product, he had very little experience in building a big advertising business.
>> narrator: that would be the job of zuckerberg's deputy,
sheryl sandberg, who'd done the same for google.
>> at facebook we have a broad mission: we want to make the world more open and connected.
>> the business model we see today was created by sheryl sandberg and the team she built at facebook, many of whom had been with her at google.
>> narrator: publicly, sandberg and zuckerberg had been downplaying the extent of the personal data facebook was collecting, and emphasizing user privacy.
>> we are focused on privacy. we care the most about privacy. our business model is by far the most privacy-friendly to consumers.
>> that's our mission, all right? i mean, we have to do that because if people feel like they don't have control over how they're sharing things, then we're failing them.
>> it really is the point that the only things facebook knows about you are things you've done and told us.
>> narrator: but internally, sandberg would soon lead facebook in a very different direction.
>> there was a meeting, i think it was in march of 2012, in which, you know, it was everyone
who built stuff inside ads, myself among them. and, you know, she basically recited the reality, which is, revenue was flattening. it wasn't slow, it wasn't declining, but it wasn't growing nearly as fast as investors would have guessed. and so she basically said, like, "we have to do something. you people have to do something." and so there was a big effort to basically pull out all the stops and start experimenting way more aggressively. the reality is that, yeah, facebook has a lot of personal data, your chat with your girlfriend or boyfriend, your drunk party photos from college, etc. the reality is that none of that is actually valuable to any marketer. they want commercially interesting data. you know, what products did you take off the shelf at best buy? what did you buy in your last grocery run? did it include diapers? do you have kids? are you head of household? right, it's things like that, things that exist in the outside world, that just do not exist inside facebook at all.
>> narrator: sandberg's team started developing new ways to collect personal data from users wherever they went on the internet and when they weren't on the internet at all.
>> and so, there's this extraordinary thing that happens that doesn't get much attention
at the time. about four or five months before the i.p.o., the company announces its first relationship with data broker companies, companies that most americans aren't at all aware of, that go out and buy up data about each and every one of us-- what we buy, where we shop, where we live, what our traffic patterns are, what our families are doing, what our likes are, what magazines we read-- data that the consumer doesn't even know that's being collected about them because it's being collected from the rest of their lives by companies they don't know, and it's now being shared with facebook, so that facebook can target ads back to the user.
>> what facebook does is profile you. if you're on facebook, it's collecting everything you do. if you are off facebook, it's using tracking pixels to collect what you are browsing. and for its micro-targeting to work, for its business model to work, it has to remain a surveillance machine.
>> they made a product that was
a better tool for advertisers than anything that had ever come before.
>> and of course the ad revenue spikes. that change, i think, is a sea change in the way the company felt about its future and the direction it was headed.
>> narrator: sparapani was so uncomfortable with the direction facebook was going, he left before the company's work with data brokers took effect. the extent of facebook's data collection was largely a secret until a law student in austria had a chance encounter with a company lawyer.
>> i kind of wanted a semester off, so i actually went to california, to santa clara university in the silicon valley. someone from facebook was a guest speaker explaining to us basically how they deal with european privacy law. and the general understanding was, you can do whatever you want to do in europe because they do have data protection laws, but they don't really
enforce them at all. so i sent an email to facebook saying i want to have a copy of all my data. so i got from facebook about 1,200 pages, and i read through it. in my personal file, i think the most sensitive information was in my messages. for example, a friend of mine was in the closed unit of the... of a psychological hospital in vienna. i deleted all the messages, but all of them came back up. and you had messages about, you know, love life and sexuality. and all of that was kept. facebook tries to give you the impression that you share this only with friends. the reality is, facebook is always looking. there is a data category called "last location," where they store where they think you've been the last time. if you tag people in pictures, there's gps location, so by that they know which person has been at what place at what time. back on the servers, there is, like, a treasure trove just,
like, ten times as big as anything we ever see on the screen.
>> narrator: as facebook was ramping up its data collection business ahead of the i.p.o., schrems filed 22 complaints with the data protection commission in ireland, where facebook has its international headquarters.
>> and they had 20 people at the time over a little supermarket in a small town, it's called portarlington. it's 5,000 people in the middle of nowhere. and they were meant to regulate google or facebook or linkedin and all of them.
>> narrator: schrems claimed facebook was violating european privacy law in the way it was collecting personal data and not telling users what they were doing with it.
>> and after we filed these complaints, that was when actually facebook reached out, basically saying, you know, "let's sit down and have a coffee and talk about all of this." so we actually had a kind of notable meeting that was in 2012 at the airport in vienna. but the interesting thing is that most of these points, they simply didn't have an answer. you totally saw that their pants were down. however, at a certain point, i just got a text message from the data protection authority
saying they're not available to speak to me anymore. that was how this procedure basically ended. facebook knew that the system plays in their favor, so even if you violate the law, the reality is it's very likely not gonna be enforced.
>> narrator: facebook disputed schrems's claims, and said it takes european privacy laws seriously. it agreed to make its policies clearer and stop storing some kinds of user data.
>> so without further ado, mark zuckerberg.
>> narrator: in silicon valley, those who covered the tech industry had also been confronting facebook about how it was handling users' personal data.
>> privacy was my number-one concern back then, so when we were thinking about talking to mark, the platform was an issue, there were a bunch of privacy violations, and that's what we wanted to talk to him about. is there a level of privacy that just has to apply to everyone? or do you think... i mean, you might have a view of, this is what privacy means to mark zuckerberg, so this is what it's going to mean at facebook.
>> yeah, i mean, people can control this, right, themselves.
simple control always has been one of the important parts of using facebook. >> narrator: kara swisher has covered zuckerberg since the beginning. she interviewed him after the company had changed its default privacy settings. >> do you feel like it's a backlash? do you feel like you are violating people's privacy? and when we started to ask questions, he became increasingly uncomfortable. >> you know, it's... >> i think the issue is, you became the head of the biggest social networking company on the planet. >> yeah, no, so... but i... the interesting thing is that, you know, so i started this when i was, you know, started working on this type of stuff when i was 18. >> so he started to sweat quite a lot, and then a lot a lot, and then a real lot. so the kind that... this kind of thing where, you know, like "broadcast news," where it was dripping down, like, or tom cruise in that "mission: impossible." it was just... it was going to his chin and dripping off. >> you know, a lot of stuff has come from, as we've been building this project in a dorm room... >> and it wasn't stopping and i was noticing that one of the people from facebook was, like, "oh, my god," and was... we were... i was trying to figure out
9:35 pm
what to do. >> yeah. i mean, a lot of stuff happened along the way. i think, you know, there were real learning points and turning points along the way in terms of... in terms of building things. >> he was in such distress, and i know it sounds awful, but i felt like his mother. like, "oh, my god, this poor guy is gonna faint." i thought he was gonna faint, i did. do you want to take off the hoodie? >> uh, no. (chuckles) whoa. >> well, different people think different things. he's told us he had the flu. i felt like... he had had a panic attack, is what happened. >> maybe i should take off the hoodie. >> take off the hoodie. >> go ahead. what the hell? >> it's a warm hoodie. >> yeah. no, it's a thick hoodie. we... it's, um, it's a company hoodie. we print our mission on the inside. >> what?! oh, my god, the inside of the hoodie, everybody. take a look. what is it? "making the..." >> "making the world more open and connected." >> oh, my god.
9:36 pm
it's like a secret cult. >> jacoby: from that interview and from others, i mean, how would you have characterized mark's view of privacy? >> well, you know, i don't know how he thought about that. it's kind of interesting because they're very... they're very loose on it. they have a viewpoint that this helps you as the user to get more information, and they will deliver up more... that's the whole ethos of silicon valley, by the way: if you only give us everything, we will give you free stuff. there is a trade being made between the user and facebook. the question is, are they protecting that data? >> thank you, mark. >> narrator: facebook had been free to set its own privacy standards, because in the u.s. there are no overarching privacy laws that apply to this kind of data collection. but in 2010, authorities at the federal trade commission became concerned. >> in most other parts of the world, privacy is a right. in the united states, not exactly. >> narrator: at the ftc, david vladeck was investigating whether facebook had been deceiving its users.
9:37 pm
what he found was that facebook had been sharing users' personal data with so-called "third-party developers"-- companies that built games and apps for the platform. >> and our view was that, you know, it's fine for facebook to collect this data, but sharing this data with third parties without consent was a no-no. >> but at facebook, of course, we believe that our users should have complete control of their information. >> the heart of our cases against companies like facebook was deceptive conduct. that is, they did not make it clear to consumers the extent to which their personal data would be shared with third parties. >> narrator: the ftc had another worry: they saw the potential for data to be misused because facebook wasn't keeping track of what the third parties were doing with it. >> they had, in my view, no real control over the third-party app developers that had access to the site. they could have been anyone. there was no due diligence. anyone, essentially, who could develop a third-party app could get access to the site.
9:38 pm
>> jacoby: it could have been somebody working for a foreign adversary. >> certainly. it could have been somebody working... yes, for, you know, for the russian government. >> narrator: facebook settled with the ftc without admitting guilt and, under a consent order, agreed to fix the problems. >> jacoby: was there an expectation at the time of the consent order that they would staff up to ensure that their users' data was not leaking out all over the place? >> yes. that was the point of this provision of the consent order that required them to identify risks to personal privacy and to plug those gaps quickly. >> narrator: inside facebook, however, with the i.p.o. on the horizon, they were also under pressure to keep monetizing all that personal information, not just fix the ftc's privacy issues. >> nine months into my first job in tech, i ended up in an interesting situation where, because i had been the main person who was working on privacy issues with respect to
9:39 pm
facebook platform-- which had many, many, many privacy issues, it was a real hornet's nest. and i ended up in a meeting with a bunch of the most senior executives at the company, and they went around the room, and they basically said, "well, who's in charge?" and the answer was me, because no one else really knew anything about it. you'd think that a company of the size and importance of facebook, you know, would have really focused and had teams of people and, you know, very senior people working on these issues, but it ended up being me. >> jacoby: what did you think about that at the time? >> i was horrified. i didn't think i was qualified. >> narrator: parakilas tried to examine all the ways that the data facebook was sharing with third-party developers could be misused. >> my concerns at that time were that i knew that there were all these malicious actors who would do a wide range of bad things, given the opportunity, given the
9:40 pm
ability to target people based on the information that facebook had. so i started thinking through, what are the worst-case scenarios of what people could do with this data? and i showed some of the kinds of bad actors that might try to attack, and i shared it out with a number of senior executives. and the response was muted, i would say. i got the sense that this just wasn't their priority. they weren't that concerned about the vulnerabilities that the company was creating. they were concerned about revenue growth and user growth. >> jacoby: and that was expressed to you, or that's something that you just gleaned from the interactions? >> from the lack of a response, i gathered that, yeah. >> jacoby: and how senior were the senior executives? >> very senior. like, among the top five executives in the company. >> narrator: facebook has said it took the ftc order seriously and, despite parakilas's
9:41 pm
account, had large teams of people working to improve users' privacy. but to parakilas and others inside facebook, it was clear the business model continued to drive the mission. in 2012, parakilas left the company, frustrated. >> i think there was a certain arrogance there that led to a lot of bad long-term decision-making. the long-term ramifications of those decisions was not well-thought-through at all. and it's got us to where we are right now. (cheers and applause) >> your visionary, your founder, your leader. mark, please come to the podium. (cheers and applause) >> narrator: in may of 2012, the company finally went public. >> the world's largest social network managed to raise more than $18 billion, making it the largest technology i.p.o. in u.s. history. >> people literally lined up in times square around the nasdaq board. >> we'll ring this bell and we'll get back to work.
9:42 pm
>> with founder mark zuckerberg ringing the nasdaq opening bell remotely from facebook headquarters in menlo park, california. >> narrator: mark zuckerberg was now worth an estimated $15 billion. facebook would go on to acquire instagram and whatsapp on its way to becoming one of the most valuable companies in the world. >> going public is an important milestone in our history. but here's the thing: our mission isn't to be a public company. our mission is to make the world more open and connected. (cheering) >> narrator: at facebook, the business model built on getting more and more of users' personal data was seen as a success. but across the country, researchers working for the department of defense were seeing something else. >> the concern was that social media could be used for really nefarious purposes. the opportunities for disinformation, for deception, for everything else, are enormous.
9:43 pm
bad guys or anybody could use this for any kind of purpose in a way that wasn't possible before. that's the concern. >> jacoby: and what did you see as a potential threat of people giving up their data? >> that they're opening themselves up to being targets for manipulation. i can manipulate you to buy something, i can manipulate you to vote for somebody. it's like putting a target... painting a big target on your front and on your chest and on your back, and saying, "here i am. come and manipulate me. you have every... i've given you everything you need. have at it." that's a threat. >> narrator: waltzman says facebook wouldn't provide data to help his research. but from 2012 to 2015, he and his colleagues published more than 200 academic papers and reports about the threats they were seeing from social media. >> what i saw over the years of the program was that the medium enables you to really take disinformation and turn it into a serious weapon. >> jacoby: was your research revealing a potential threat to
9:44 pm
national security? >> sure, when you looked at how it actually works. you see where the opportunities are for manipulation, mass manipulation. >> jacoby: and is there an assumption there that people are easily misled? >> yes, yes, people are easily misled, if you do it the right way. for example, when you see people forming into communities, okay, what's called filter bubbles. i'm gonna exploit that to craft my message so that it resonates most exactly with that community, and i'll do that for every single community. it would be pretty easy... it would be pretty easy to set up a fake account, and a large number of fake accounts, embed them in different communities, and use them to disseminate propaganda. >> jacoby: at an enormous scale? >> yes, well, that's why it's a serious weapon, because it's an enormous scale. it's the scale that makes it a weapon. >> narrator: in fact, waltzman's fears were already playing out at a secret propaganda factory in st. petersburg, russia, called the internet research
9:45 pm
agency. hundreds of russian operatives were using social media to fight the anti-russian government in neighboring ukraine. vitaly bespalov says he was one of them. >> jacoby: can you explain, what is the internet research agency? (speaking russian) >> (translated): it's a company that creates a fake perception of russia. they use things like illustrations, pictures-- anything that would influence people's minds. when i worked there, i didn't hear anyone say, "the government runs us" or "the kremlin runs us," but everyone there knew and everyone realized it. >> jacoby: was the main intention to make the ukrainian government look bad? >> (translated): yeah, yeah, that's what it was. this was the intention with ukraine. put president poroshenko in a bad light and the rest of the
9:46 pm
government, and the military, and so on. (speaking russian) you come to work and there's a pile of sim cards, many, many sim cards, and an old mobile phone. you need an account to register for various social media sites. you pick any photo of a random person, choose a random last name, and start posting links to news in different groups. >> narrator: the russian propaganda had its intended effect: helping to sow distrust and fear of the ukrainian government. (chanting) >> pro-russia demonstrators against ukraine's new interim government. >> "russia, russia," they chant. >> russian propaganda was massive on social media. it was massive. >> there were so many stories that started emerging on the facebook. >> "cruel, cruel ukrainian nationalists killing people or torturing them because they speak russian." >> they scared people. "you see, they're gonna attack, they're gonna burn your villages. you should worry."
9:47 pm
(speaking russian) >> and then the fake staged news. (speaking russian) >> "crucified child by ukrainian soldiers," which is totally nonsense. (speaking russian) >> it got proven that those people were actually hired actors. >> complete nonsense. >> but it spreads on facebook. >> so facebook was weaponized. >> narrator: just as in the arab spring, facebook was being used to inflame divisions. but now by groups working on behalf of a foreign power, using facebook's tools built to help advertisers boost their content. >> by that time in facebook, you could pay money to promote these stories. so your stories emerge on the top lines. and suddenly you start to believe in this, and you immediately get immediate response. you can test all kind of nonsenses and understand to which nonsense people do not believe...
9:48 pm
(man speaking ukrainian) and to which nonsenses people start believing. (chanting in russian) which will influence the behavior of a person receptive to propaganda, then provoking that person on certain action. ♪ >> they decided to undermine ukraine from the inside... (gunfire echoing, shouting) ...rather than from outside. >> i mean, basically, think about this-- russia hacked us. >> narrator: dmytro shymkiv, a top adviser to ukraine's president, met with facebook representatives and says he asked them to intervene. >> the response that facebook gave us is, "sorry, we are open platform, anybody can do anything without... within our policy, which is written on the website." and when i said, "but this is fake accounts." (laughs): "you could verify that." "well, we'll think about this but, you know, we, we have a
9:49 pm
freedom of speech and we are very pro-democracy platform. everybody can say anything." >> jacoby: in the meeting, do you think you made it explicitly clear that russia was using facebook to meddle in ukrainian politics? >> i was explicitly saying that there are trolls factory, that there are posts and news that are fake, that are lying, and they are promoted on your platform by, very often, fake accounts. have a look. at least sending somebody to investigate. >> jacoby: and no one... sorry. >> no. >> jacoby: no one was sent? >> no, no. for them, at that time, it was not an issue. >> narrator: facebook told frontline that shymkiv didn't raise the issue of misinformation in their meeting, and that their conversations had nothing to do with what would happen in the
9:50 pm
united states two years later. >> jacoby: it was known to facebook in 2014 there was potential for russian disinformation campaigns on facebook. >> yes. and there were disinformation campaigns from a number of different countries on facebook. you know, disinformation campaigns were a regular facet of facebook every day. and... i mean, yeah, technically that should have led to a learning experience. i just don't know. >> jacoby: there was plenty that was known about the potential downsides of social media and facebook-- you know, potential for disinformation, potential for bad actors and abuse. were these things that you just weren't paying attention to, were these things that were kind of conscious choices to kind of say, "all right, we're gonna
9:51 pm
kind of abdicate responsibility from those things and just keep growing?" >> i definitely think we've been paying attention to the things that we know. and one of the biggest challenges here is that this is really an evolving set of threats and risks. we had a big effort around scams. we had a big effort around bullying and harassment. we had a big effort around nudity and porn on facebook. it's always ongoing. and so some of these threats and problems are new, and i think we're grappling with that as a company with other companies in this space, with governments, with other organizations, and so i, i wouldn't say that everything is new, it's just different problems. >> facebook is the ultimate... >> narrator: at facebook headquarters in menlo park, they would stick to the mission and the business model, despite a gathering storm. >> ...get their election news and decision-making material from facebook. >> the most extraordinary election... >> narrator: by 2016, russia was continuing to use
9:52 pm
social media as a weapon. >> ...hillary clinton cannot seem to extinguish... >> narrator: division and polarization were running through the presidential campaign. >> just use it on lying, crooked hillary... >> the race for the white house was shaken up again on super tuesday... >> narrator: mark zuckerberg saw threats to his vision of an open and connected world. >> as i look around, i'm starting to see people and nations turning inward, against this idea of a connected world and a global community. i hear fearful voices calling for building walls and distancing people they label as others. for blocking free expression, for slowing immigration, reducing trade and in some cases around the world, even cutting access to the internet. >> narrator: but he continued to view his invention not as part of the problem, but as the solution. >> and that's why i think the work
9:53 pm
that we're all doing together is more important now than it's ever been before. (cheers and applause) >> narrator: tomorrow night- frontline's investigation continues. >> there is absolutely no company who has had so much influence on the information that americans consume. >> narrator: he's the man who connected the world. but at what cost? >> polarization was the key to the model. >> narrator: the global threat... >> this is an information ecosystem that just turns democracy upside down. >> narrator: the 2016 election... >> ...facebook got over a billion political campaign posts. >> narrator: and the company denials... >> the idea that fake news on facebook influenced the election in any way i think is a pretty crazy idea. >> ...facebook ceo mark zuckerberg will testify... >> ...and i'm responsible for what happens here. >> narrator: is facebook ready for the midterm elections? >> there are a lot of questions heading into this midterm...
9:54 pm
>> ...the midterm elections... >> i still have questions if we're going to make sure that in 2018 and 2020 this doesn't happen again. >> narrator: part two of "the facebook dilemma"- tomorrow night on frontline. >> go to pbs.org/frontline to read more about facebook from our partner, washington post reporter dana priest. >> for facebook the dilemma is, can they solve these serious problems without completely revamping their business model? >> then watch a video explainer about what facebook knows about you and how. >> ...even though you never signed up for it, facebook now has data about you and stores it as a shadow profile... >> connect to the frontline community at pbs.org/frontline. >> frontline is made possible by contributions to your pbs station from viewers like you. thank you. and by the corporation for public broadcasting. major support is provided by the john d. and catherine t. macarthur foundation, committed to building a more just, verdant and peaceful world.
9:55 pm
more information is available at macfound.org. the ford foundation, working with visionaries on the front lines of social change worldwide. at fordfoundation.org. additional support is provided by the abrams foundation, committed to excellence in journalism. the park foundation, dedicated to heightening public awareness of critical issues. the john and helen glessner family trust. supporting trustworthy journalism that informs and inspires. the wyncote foundation. and by the frontline journalism fund, with major support from jon and jo ann hagler. and additional support from chris and lisa kaneb. captioned by media access group at wgbh access.wgbh.org >> for more on this and other frontline programs, visit our website at pbs.org/frontline.
9:56 pm
♪ to order frontline's "the facebook dilemma" on dvd, visit shoppbs.org, or call 1-800-play-pbs. this program is also available on amazon prime video. ♪ >> you're watching pbs