tv Frontline PBS December 11, 2018 10:30pm-11:31pm PST
>> narrator: tonight- part one of a two part investigation. >> we face a number of important issues around privacy, safety, and democracy. >> narrator: frontline investigates... facebook. >> we didn't take a broad enough view of our responsibility. and it was my mistake. and i'm sorry. >> narrator: told by company insiders... >> it's possible that we haven't been as fast as we needed to be. >> we've been too slow to act on- >> we didn't see it fast enough- >> i think we were too slow- >> narrator: ...and former employees. >> i mean everybody was pretty upset that we hadn't caught it during the election. >> narrator: how facebook was used to disrupt democracy around the globe. >> i don't think any of us, mark included, appreciated how much of an effect we might have had. >> narrator: correspondent james jacoby takes a hard look at the man who wanted to connect the
world. >> jacoby: is he not recognizing the importance of his platform? >> he didn't understand what he built. >> narrator: but is he accountable for helping divide it? >> there is something wrong systemically with the facebook algorithms. in effect polarization was the key to the model. >> narrator: tonight frontline- "the facebook dilemma". >> frontline is made possible by contributions to your pbs station from viewers like you. thank you. and by the corporation for public broadcasting. major support is provided by the john d. and catherine t. macarthur foundation, committed to building a more just, verdant and peaceful world. more information is available at macfound.org. the ford foundation, working with visionaries on the front lines of social change worldwide. at ford foundation.org. additional support provided by the abrams foundation, committed to excellence in journalism. the park foundation, dedicated to heightening public awareness of critical issues. the john and helen glessner family trust,
supporting trustworthy journalism that informs and inspires. and by the frontline journalism fund, with major support from jon and jo ann hagler. corporate support is provided by... >> the zip code you're born into can determine your future, your school, your job, your dreams, your problems... at the y, our goal is to create opportunities no matter who you are or where you're from. the y, for a better us. (birds chirping) ♪
>> are we go? >> should i put the beer down? >> nah, no, we're actually gonna mention the beer. (laughing) >> hard at work. >> so we're here in palo alto, california, chilling with mark zuckerberg of the facebook.com, and we're drinking out of a keg of heineken because... what are we celebrating, mark? >> we just got three million users. >> 11, 12, 13... >> whoo! >> tell us, you know, simply what facebook is. >> i think facebook is an online directory for colleges. i realized that because i didn't have people's information, i needed to make it interesting enough so that people would want to use the site and want to, like, put their information up. so we launched it at harvard, and within a couple of weeks, two-thirds of the school had signed up. so we're, like, "all right, this is pretty sweet, like, let's just go all out." i mean, it's just interesting seeing how it evolves. we have a sweet office. >> yeah, well, show us... show
us around the crib. (talking in background) >> we didn't want cubicles, so we got ikea kitchen tables instead. i thought that kind of went along with our whole vibe here. >> uh-huh. what's in your fridge? >> some stuff. there's some beer down there. >> how many people work for you? >> it's actually 20 right now. >> did you get this shot, this one here, the lady riding a pit bull? >> oh, nice. >> all right, it's really all i've got. >> that's cool. >> where are you taking facebook at this point in your life? >> um, i mean... there doesn't necessarily have to be more. ♪ >> from the early days, mark had this vision of connecting the whole world. so if google was about providing you access to all the information, facebook was about connecting all the people. >> can you just say your name and pronounce it so nobody messes it up and they have it on tape?
>> sure, it's mark zuckerberg. >> great. >> it was not crazy. somebody was going to connect all those people, why not him? >> we have our facebook fellow, we have mark zuckerberg. >> i have the pleasure of introducing mark zuckerberg, founder of facebook.com. (applause) >> when mark zuckerberg was at harvard he was fascinated by hacker culture, this notion that software programmers could do things that would shock the world. >> and a lot of times, people are just, like, too careful. i think it's more useful to, like, make things happen and then, like, apologize later, than it is to make sure that you dot all your i's now and then, like, just not get stuff done. >> so it was a little bit of a renegade philosophy and disrespect for authority that led to the facebook motto "move fast and break things." >> never heard of facebook? (laughing) >> our school went crazy for the facebook. >> it creates its own world that you get sucked into. >> we started adding things like photos and status updates and groups and apps. when we first launched, we were
hoping for, you know, maybe 400, 500 people. (cheering) >> toast to the first 100 million, and the next 100 million. >> cool. >> so you're motivated by what? >> building things that, you know, change the world in a way that it needs to be changed. >> who's barack obama? the answer is right there on my facebook page. >> mr. zuckerberg... 'sup, zuck? >> in those days, "move fast and break things" didn't seem to be sociopathic. >> if you're building a product that people love, you can make a lot of mistakes. >> it wasn't that they intended to do harm so much as they were unconcerned about the possibility that harm would result. >> so just to be clear, you're not going to sell or share any of the information on facebook? >> we're not gonna share people's information except for with the people that they've asked for it to be shared. >> technology optimism was so deeply ingrained in the value system and in the beliefs of people in silicon valley... >> we're here for a hackathon, so let's get started.
>> ...that they'd come to believe it is akin to the law of gravity, that of course technology makes the world a better place. it always had, it always will. and that assumption essentially masked a set of changes that were going on in the culture that were very dangerous. >> from kxjz in sacramento... >> for monday, june 27... >> narrator: mark zuckerberg's quest to connect the world would bring about historic change, and far-reaching consequences, in politics, privacy, and technology. we've been investigating warning signs that existed long before problems burst into public view. >> it was my mistake, and i'm sorry... >> narrator: but for those inside facebook, the story began with an intoxicating vision that turned into a lucrative business plan. >> well, the one thing that mark zuckerberg has been so good at is being incredibly clear and compelling about the mission that facebook has always had. >> facebook's mission is to give people the power to share.
give people the power to share. in order to make the world more open and connected... more open and connected... open and connected... more open and connected. (applause) >> james jacoby: how pervasive a mission was that inside of the company? give me a sense of that. >> it was something that... you know, mark doesn't just say it when we do, you know, ordered calisthenics in the morning and we yell the mission to each other, right? we would actually say it to each other, you know, when mark wasn't around. >> jacoby: and that was a mission that you really believed in? i mean... >> how could you not? it was exciting. what if connecting the world actually delivered a promise that we've been looking for to genuinely make the world a better place? >> jacoby: was there ever a point where there was questions internally about this mission being naïve optimism? >> i think the short answer is completely yes, and i think that's why we loved it. especially in a moment like when we crossed a billion monthly active users for the first time. and mark's... the way i recall
mark at the time, i remember thinking, "i don't think mark is going to stop until he gets to everybody." >> i think some of us had an early understanding that we were creating in some ways a digital nation-state. this was the greatest experiment in free speech in human history. >> there was a sense inside the company that we are building the future and there was a real focus on youth being a good thing. it was not a particularly diverse workforce. it was very much the sort of harvard, stanford, ivy league group of people who were largely in their 20s. >> i was a big believer in the company. like, i knew that it was going to be a paradigm-shifting thing. there was this, definitely this feeling of everything for the company, of this, you know, world-stirring vision. everyone more or less dressed the same, with the same fleece and with a logo on it. posters on the wall that looked
somewhat orwellian. but in, of course, you know, an upbeat way, obviously. and, you know, some of the slogans are pretty well-known-- "move fast and break things," "fortune favors the bold," "what would you do if you weren't afraid?" you know, it was always this sort of rousing rhetoric that would push you to go further. >> narrator: antonio garcia martinez, a former product manager on facebook's advertising team, is one of eight former facebook insiders who agreed to talk on camera about their experiences. >> in silicon valley, there's a, you know, almost a mafioso code of silence that you're not supposed to talk about the business in any but the most flattering way, right? basically, you can't say anything, you know, measured or truthful about the business. and i think, as perhaps with facebook, it's kind of arrived at the point at which it's so important, it needs to be a little more transparent about how it works. like, let's stop the little (bleep) parade about everyone in silicon valley, you know, creating, disrupting this and improving the world, right? it's, in many ways, a business like any other. it's just kind of more exciting and impactful. (techno music playing) >> narrator: by 2007, zuckerberg had made it clear that the goal
of the business was worldwide expansion. >> almost a year ago, when we were first discussing how to let everyone in the world into facebook, i remember someone said to me, "mark, we already have nearly every college student in the u.s. on facebook. it's incredible that we were even able to do that. but no one gets a second trick like that." well, let's take a look at how we did. (cheering and applause) >> jacoby: what was the growth team about? what did you do at growth? >> the story of growth has really been about making facebook available to people that wanted it but couldn't have access to it. >> narrator: naomi gleit, facebook's second-longest serving employee, is one of five officials the company put forward to talk to frontline. she was an original member of the growth team. >> one of my first projects was expanding facebook to high school students. i worked on translating facebook into over a hundred languages. when i joined, there were one million users, and now there's over two billion people using facebook every month.
>> jacoby: some of the problems that have reared their head with facebook over the past couple of years or so have been caused in some ways by this exponential growth. >> so, i think, and mark has said this, that we have been slow to really understand the ways in which facebook might be used for bad things. we've been really focused on the good things. >> so who are all of the new users? >> the growth team had tons of engineers figuring out how you could make the new user experience more engaging, how you could figure out how to get more people to sign up. everyone was focused on growth, growth, growth. >> give people the power to share. >> narrator: and the key to keeping all these new people engaged... >> to make the world more open and connected. >> narrator: ...was facebook's most important feature... >> news feed. >> narrator: news feed, the seemingly endless stream of stories, pictures, and updates shared by friends, advertisers, and others. >> it analyzes all the information available to each user and it actually computes what's going to be the most
interesting piece of information, and then publishes a little story for them. >> it's your personalized newspaper, it's your "the new york times" of you, channel you. it is, you know, your customized, optimized vision of the world. >> narrator: but what appeared in users' news feed wasn't random. it was driven by a secret mathematical formula, an algorithm. >> the stories are ranked in terms of what's going to be the most important, and we design a lot of algorithms so we can produce interesting content for you. the goal of the news feed is to provide you, the user, with the content on facebook that you most want to see. it is designed to make you want to keep scrolling, keep looking, keep liking. >> that's the key. that's the secret sauce. that's how... that's why we're worth x billion dollars. >> narrator: the addition of the new "like" button in 2009 allowed news feed to collect vast amounts of users' personal data that would prove invaluable to facebook. >> at the time we were a little bit skeptical about the like button-- we were concerned. and as it turned out our intuition was just dead wrong. and what we found was the like button acted as a social
lubricant. and, of course, it was also driving this flywheel of engagement, that people felt like they were heard on the platform whenever they shared something. >> connect to it by liking it... >> and it became a driving force for the product. >> it was incredibly important because it allowed us to understand who are the people that you care more about, that cause you to react, and who are the businesses, the pages, the other interests on facebook that are important to you. and that gave us a degree of constantly increasing understanding about people. >> news feed got off to a bit of a rocky start, and now our users love news feed. they love it. >> narrator: news feed's exponential growth was spurred on by the fact that existing laws didn't hold internet companies liable for all the content being posted on their sites. >> so, section 230 of the communications decency act is the provision which allows the internet economy to grow and thrive.
and facebook is one of the principal beneficiaries of this provision. it says don't hold this internet company responsible if some idiot says something violent on the site. don't hold the internet company responsible if somebody publishes something that creates conflict, that violates the law. it's the quintessential provision that allows them to say, "don't blame us." >> narrator: so it was up to facebook to make the rules, and inside the company, they made a fateful decision. >> we took a very libertarian perspective here. we allowed people to speak and we said, "if you're going to incite violence, that's clearly out of bounds. we're going to kick you off immediately." but we're going to allow people to go right up to the edge and we're going to allow other people to respond. we had to set up some ground rules. basic decency, no nudity, and no violent hateful speech. and after that, we felt some
reluctance to interpose our value system on this worldwide community that was growing. >> jacoby: was there not a concern, then, that it could become sort of a place of just utter confusion, that you have lies that are given the same weight as truths, and that it kind of just becomes a place where truth becomes completely obfuscated? >> no. we relied on what we thought were the public's common sense and common decency to police the site. >> narrator: that approach would soon contribute to real-world consequences far from silicon valley, where mark zuckerberg's optimistic vision at first seemed to be playing out. (crowd chanting) the arab spring had come to egypt. (crowd chanting) it took hold with the help of a facebook page protesting abuses by the regime of hosni mubarak.
>> not that i was thinking that this facebook page was going to be effective. i just did not want to look back and say that that happened and i just didn't do anything about it. >> narrator: at the time, wael ghonim was working for google in the middle east. >> in just three days, over 100,000 people joined the page. throughout the next few months, the page was growing until what happened in tunisia. >> events in tunisia have captured the attention of viewers around the world, and a lot of it was happening online. >> it took just 28 days until the fall of the regime. and it just created for me a moment of, "maybe we can do this." and i just posted an event calling for a revolution in ten days, like we should all get to the street and we should bring down mubarak. >> ...organized by a group of online activists. >> they're calling it the facebook revolution... (crowd chanting)
>> narrator: within days, ghonim's online cry had helped fill the streets of cairo with hundreds of thousands of protesters. (crowd chanting) 18 days later... >> (translated): president muhammad hosni mubarak has decided to step down. (cheering) >> they have truly achieved the unimaginable. >> it's generally acknowledged that ghonim's facebook page first sparked the protests. >> jacoby: there was a moment when you were being interviewed on cnn. >> yeah, i remember that. >> first tunisia, now egypt, what's next? >> ask facebook. >> ask what? >> facebook. >> facebook. >> the technology was, for me, the enabler. i would have not have been able to engage with others, i would have not been able to propagate my ideas to others without social media, without facebook. >> you're giving facebook a lot
of credit for this? >> yeah, for sure. i want to meet mark zuckerberg one day and thank him, actually. >> did you ever think that this could have an impact on revolution? >> you know, my own opinion is that it would be extremely arrogant for any specific technology company to claim any meaningful role in, in those. but i do think that the overall trend that's at play here, which is people being able to share what they want with the people who they want, is an extremely powerful thing, right? and we're kind of fundamentally rewiring the world from the ground up. and it starts with people... >> they were relatively restrained externally about taking credit for it, but internally they were, i would say, very happy to take credit for the idea that social media was being used to effect democratic change. >> activists and civil society leaders would just come up to me and say, you know, "wow, we couldn't have done this without
you guys." government officials, you know, would say, "does facebook really realize how much you guys are changing our societies?" >> it felt like facebook had extraordinary power, and power for good. >> narrator: but while facebook was enjoying its moment... (man shouting, crowd chanting) back in egypt, on the ground and on facebook, the situation was unraveling. >> following the revolution, things went into a much worse direction than what we had anticipated. >> there's a complete split between the civil community and those who are calling for an islamic state. >> what was happening in egypt was polarization. >> deadly clashes between christians and military police. >> (translated): the brotherhood cannot rule this country. >> and all these voices started to clash, and the environment on social media bred that kind of clash, like that polarization-- rewarded it. >> when the arab spring happened, i know that a lot of
people in silicon valley thought our technologies helped bring freedom to people, which was true. but there's a twist to this, which is facebook's news feed algorithm. >> if you increase the tone of your posts against your opponents, you are gonna get more distribution. because we tend to be more tribal. so if i call my opponents names, my tribe is happy and celebrating, "yes, do it, like, comment, share, so more people end up seeing it." because the algorithm is going to say, "oh, okay, that's engaging content, people like it, show it to more people." >> there were also other groups of thugs, part of the pattern of sectarian violence. >> the hardest part for me was seeing the tool that brought us together tearing us apart. these tools are just enablers for whomever, they don't separate between what's good and bad. they just look at engagement metrics. >> narrator: ghonim himself became a victim of those
metrics. >> there was a page, it had, like, hundreds of thousands of followers-- all what it did was creating fake statements, and i was a victim of that page. they wrote statements about me insulting the army, which puts me at serious risk because that's not something i said. i was extremely naïve in a way i don't like, actually, now, thinking that these are liberating tools. it's the spread of misinformation, fake news, in egypt in 2011. >> narrator: he says he later talked to people he knew at facebook and other companies about what was going on. >> i tried to talk to people who are in silicon valley, but i feel like it was not, it was not being heard. >> jacoby: what were you trying to express to people in silicon valley at the time? >> it's very serious. whatever that we... that you are building has massive, serious unintended consequences on the lives of people on this planet. and you are not investing enough
in trying to make sure that what you are building does not go in the wrong way. and it's very hard to be in their position. no matter how they try and move and change things, there will be always unintended consequences. >> activists in my region were on the front lines of, you know, spotting corners of facebook that the rest of the world, the rest of the company, wasn't yet talking about, because in a company that's built off numbers and metrics and measurements, anecdotes sometimes got lost along the way. and that was always a real challenge, and always bothered me. >> narrator: elizabeth linder, facebook's representative in the region at the time, was also hearing warnings from government officials. >> so many country representatives were expressing to me a huge concern about the ability of rumors to spread on
facebook, and what do you do about that? >> jacoby: how did you respond to that at the time? >> we, we didn't have a solution for it, and so the best that i could do is report back to headquarters that this is something that i was hearing on the ground. >> jacoby: and what sort of response would you get from headquarters? >> you know, i... it's impossible to be specific about that, because it was always just kind of a, "this is what i'm hearing, this is what's going on." but i think in a... in a company where the, the people that could have actually, you know, had an impact on making those decisions are not necessarily seeing it firsthand. >> i think everything that happened after the arab spring should have been a warning sign to facebook. >> narrator: zeynep tufekci, a researcher and former computer programmer, had also been raising alarms to facebook and other social media companies. >> the companies were terribly understaffed, in over their heads in terms of the important role they were playing.
like, all of a sudden you're the public sphere in egypt. so i kept starting to talk to my friends at these companies and saying, "you have to staff up. you have to put in large amounts of people who speak the language, who understand the culture, who understand the complexities of wherever you happen to operate." >> narrator: but facebook hadn't been set up to police the amount of content coming from all the new places it was expanding to. >> i think no one at any of these companies in silicon valley has the resources for this kind of scale. you had queues of work for people to go through and hundreds of employees who would spend all day every day clicking yes, no, keep, take down, take down, take down, keep up, keep up, making judgment calls, snap judgments, about, "does it violate our terms of service? does it violate our standards of decency? what are the consequences of this speech?" so you have this fabulously talented group of mostly
20-somethings who are deciding what speech matters, and they're doing it in real time, all day, every day. >> jacoby: isn't that scary? >> it's terrifying. right? the responsibility was awesome. no one could ever have predicted how fast facebook would grow. the, the trajectory of growth of the user base and the issues was like this. and of all... all staffing throughout the company was like this. the company was trying to make money, it was trying to keep costs down. it had to be a going concern. it had to be a revenue-generating thing, or it would cease to exist. >> narrator: in fact, facebook was preparing to take its rapidly growing business to the next level by going public. >> i'm david ebersman, facebook's cfo. thank you for taking the time to consider an investment in facebook. >> the social media giant hopes to raise $5 billion. >> the pressure heading into the i.p.o., of course, was to prove
that facebook was a great business. otherwise, we'd have no shareholders. >> facebook-- is it worth $100 billion? should it be valued at that? >> narrator: zuckerberg's challenge was to show investors and advertisers the profit that could be made from facebook's most valuable asset-- the personal data it had on its users. >> mark, great as he was at vision and product, he had very little experience in building a big advertising business. >> narrator: that would be the job of zuckerberg's deputy, sheryl sandberg, who'd done the same for google. >> at facebook we have a broad mission: we want to make the world more open and connected. >> the business model we see today was created by sheryl sandberg and the team she built at facebook, many of whom had been with her at google. >> narrator: publicly, sandberg and zuckerberg had been downplaying the extent of the personal data facebook was collecting, and emphasizing users' privacy.
>> we are focused on privacy. we care the most about privacy. >> our business model is by far the most privacy-friendly to consumers. >> that's our mission, all right? i mean, we have to do that because if people feel like they don't have control over how they're sharing things, then we're failing them. >> it really is the point that the only things facebook knows about you are things you've done and told us. >> narrator: but internally, sandberg would soon lead facebook in a very different direction. >> there was a meeting, i think it was in march of 2012, in which, you know, it was everyone who built stuff inside facebook, myself among them. and, you know, she basically recited the reality, which was, revenue was flattening. it wasn't slow, it wasn't declining, but it wasn't growing nearly as fast as investors would have guessed. and so she basically said, like, "we have to do something. you people have to do something." and so there was a big effort to basically pull out all the stops and start experimenting way more aggressively. the reality is that, yeah, facebook has a lot of personal data, your chats with your girlfriend or boyfriend, your drunk party photos from
college, etc. the reality is that none of that data is actually valuable to any marketer. they want commercially interesting data. you know, what product did you take off the shelf at best buy? what did you buy in your last grocery run? did it include diapers? do you have kids? are you head of household? right, it's things like that, things that exist in the outside world, that just do not exist inside facebook at all. >> narrator: sandberg's team started developing new ways to collect personal data from users wherever they went on the internet and when they weren't on the internet at all. >> and so, there's this extraordinary thing that happens that doesn't get much attention at the time. about four or five months before the i.p.o., the company announces its first relationship with data broker companies, that most americans aren't at all aware of, that go out and buy up data about each and every one of us-- what we buy, where we shop, where we live, what our traffic patterns are, what our families are doing, what our likes are, what magazines we read-- data that the consumer doesn't even know
that's being collected about them because it's being collected from the rest of their lives by companies they don't know, and it's now being shared with facebook, so that facebook can target ads back to the user. >> what facebook does is profile you. if you're on facebook, it's collecting everything you do. if you are off facebook, it's using tracking pixels to collect what you are browsing. and for its micro-targeting to work, for its business model to work, it has to remain a surveillance machine. >> they made a product that was a better tool for advertisers than anything that had come before it. >> and of course the ad revenue spikes. that change alone, i think, is a sea change in the way the company felt about its future and the direction it was headed. >> narrator: sparapani was so uncomfortable with the direction facebook was going, he left before the company's work with data brokers took effect.
the extent of facebook's data collection was largely a secret until a law student in austria had a chance encounter with a company lawyer. >> i kind of wanted a semester off so i actually went to california, to santa clara university in the silicon valley. someone from facebook was a guest speaker explaining to us basically how they deal with european privacy law. and the general understanding was, you can do whatever you want to do in europe because they do have data protection laws, but they don't really enforce them at all. so i sent an email to facebook saying i want to have a copy of all my data. so i got from facebook about 1,200 pages, and i read through it. in my personal file, i think the most sensitive information was in my messages. for example, a friend of mine was in the closed ward of the... of a psychological hospital in vienna. i deleted all these messages, but all of them came back up.
and you have messages about, you know, love life and sexuality. and all of that is kept. facebook tries to give you the impression that you share this only with friends. the reality is, facebook is always looking. there is a data category called "last location," where they store where they think you've been the last time. if you tag people in pictures, there's gps location, so by that they know which person has been at what place at what time. back on the servers, there is, like, a treasure trove just, like, ten times as big as anything we ever see on the screen. >> narrator: as facebook was ramping up its data collection business ahead of the i.p.o., schrems filed 22 complaints with the data protection commission in ireland, where facebook has its international headquarters. >> and they had 20 people at the time over a little supermarket in a small town, it's called portarlington. it's 5,000 people in the middle of nowhere. and they were meant to regulate google or facebook or linkedin and all of them. >> narrator: schrems claimed facebook was violating european data privacy law in the way it was
collecting personal data and not telling users what they were doing with it. >> and after we filed these complaints, that was when actually facebook reached out, basically saying, you know, "let's sit down and have a coffee and talk about all of this." so we actually had a kind of notable meeting that was in 2012 at the airport in vienna. but the interesting thing is that most of these points, they simply didn't have an answer. you totally saw that their pants were down. however, at a certain point, i just got a text message from the data protection authority saying they're not available to speak to me anymore. that was how this procedure basically ended. facebook knew that the system plays in their favor, so even if you violate the law, the reality is it's very likely not gonna be enforced. >> narrator: facebook disputed schrems's claims, and said it takes european privacy laws seriously. it agreed to make policies clearer and stop storing some kinds of user data. >> so without further ado, mark zuckerberg.
>> narrator: in silicon valley, those who covered the tech industry had also been confronting facebook about how it was handling users' personal data.
>> privacy was my number-one concern back then. so when we were thinking about talking to mark, the platform was an issue, there were a bunch of privacy violations, and that's what we wanted to talk to him about. is there a level of privacy that just has to apply to everyone? or do you think... i mean, you might have a view of, this is what privacy means to mark zuckerberg, so this is what it's going to mean at facebook.
>> yeah, i mean, people can control this, right, themselves. simple control always has been one of the important parts of using facebook.
>> narrator: kara swisher has covered zuckerberg since the beginning. she interviewed him after the company had changed its default privacy settings.
>> do you feel like it's a backlash? do you feel like you are violating people's privacy? and when we started to ask questions, he became increasingly uncomfortable.
>> you know, it's...
>> i think the issue is, you became the head of the biggest social networking company on the planet.
>> yeah, no, so... but i... the
interesting thing is that, you know, so i started this when i was, you know, started working on this type of stuff when i was 18.
>> so he started to sweat quite a lot, and then a lot a lot, and then a real lot. so the kind that... this kind of thing where, you know, like "broadcast news," where it was dripping down, like, or tom cruise in that "mission: impossible." just... it was going to his chin and dripping off.
>> you know, a lot of stuff changed as we've gone from building this project in a dorm room...
>> and it wasn't stopping and i was noticing that one of the people from facebook was, like, "oh, my god," and was... we were... i was trying to figure out what to do.
>> yeah. i mean, a lot of stuff happened along the way. i think, you know, there were real learning points and turning points along the way in terms of... in terms of building things.
>> he was in such distress, and i know it sounds awful, but i felt like his mother. like, "oh, my god, this poor guy is gonna faint." i thought he was gonna faint, i did. do you want to take off the hoodie?
>> uh, no.
(chuckles) whoa.
>> well, different people think different things. he's told us he had the flu. i felt like... he had had a panic attack, is what happened.
>> maybe you should take off the hoodie.
>> take off the hoodie.
>> go ahead. what the hell?
>> that is a warm hoodie.
>> yeah. no, it's a thick hoodie. we... um, it's a company hoodie. we print our mission on the inside.
>> what?! oh, my god, the inside of the hoodie, everybody. take a look. what is it? "making the..."
>> "making the world more open and connected."
>> oh, my god. it's like a secret cult.
>> jacoby: from that interview and from others, i mean, how would you have characterized mark's view of privacy?
>> well, you know, i don't know if he thought about that. it's kind of interesting because they're very... they're very loose on it. they have a viewpoint that this helps you as the user to get more information, and they will deliver up more... that's the whole ethos of silicon valley, by the way. if you only give us everything, we will give you free stuff. but there is a trade being made between the user and facebook. the question is, are they protecting that data?
>> thank you, mark.
>> narrator: facebook had been free to set its own privacy standards, because in the u.s. there are no overarching privacy laws that apply to this kind of data collection. but in 2010, authorities at the federal trade commission became concerned.
>> in most other parts of the world, privacy is a right. in the united states, not exactly.
>> narrator: at the ftc, david vladeck was investigating whether facebook had been deceiving its users. what he found was that facebook had been sharing users' personal data with so-called "third-party developers"-- companies that built games and apps for the platform.
>> and our view was that, you know, it's fine for facebook to collect this data, but sharing this data with third parties without consent was a no-no.
>> but at facebook, of course, we believe that our users should have complete control of their information.
>> the heart of our cases against companies like facebook was deceptive conduct.
that is, they did not make it clear to consumers the extent to which their personal data would be shared with third parties.
>> narrator: the ftc had a worry: they saw the potential for data to be misused because facebook wasn't keeping track of what the third parties were doing with it.
>> they had, in my view, no real control over the third-party app developers that had access to the site. they could have been anyone. there was no due diligence. anyone, essentially, who could develop a third-party app could get access to the site.
>> jacoby: it could have been somebody working for a foreign adversary.
>> certainly. it could have been somebody working, yes, for, you know, for the russian government.
>> narrator: facebook settled with the ftc without admitting guilt and, under a consent order, agreed to fix the problems.
>> jacoby: was there an expectation at the time of the consent order that they would staff up to ensure that users' data was not leaking out all over the place?
>> yes.
that was the point of this provision of the consent order that required them to identify risk to personal privacy and to plug those gaps quickly.
>> narrator: inside facebook, however, with the i.p.o. on the horizon, they were also under pressure to keep monetizing all that personal information, not just fix the ftc's privacy issues.
>> nine months into my first job in tech, i ended up in an interesting situation where, because i had been the person who was working on privacy issues with respect to facebook platform-- which had many, many, many privacy issues, it was a real hornet's nest. and i ended up in a meeting with a bunch of the most senior executives at the company, and they went around the room, and they basically said, "well, who's in charge?" and the answer was me, because no one else really knew anything about it. you'd think that a company of the size and importance of facebook, you know, would have really focused and had a team of people and, you know, very
senior people working on these issues, but it ended up being me.
>> jacoby: what did you think about that at the time?
>> i was horrified. i didn't think i was qualified.
>> narrator: parakilas tried to examine all the ways that the data facebook was sharing with third-party developers could be misused.
>> my concerns at that time were that i knew that there were all these malicious actors who would do a wide range of bad things, given the opportunity, given the ability to target people based on this information that facebook had. so, i started thinking through, what are the worst-case scenarios of what people could do with this data? and i showed some of the kinds of bad actors that might try to attack, and i shared it out with a number of senior executives, and the response was muted, i would say. i got the sense that this just wasn't their priority.
they weren't that concerned about the vulnerabilities that the company was creating. they were concerned about revenue growth and user growth.
>> jacoby: and that was expressed to you, or that's something that you just gleaned from the interactions?
>> from the lack of a response, i gathered that, yeah.
>> jacoby: and how senior were the senior executives?
>> very senior. like, among the top five executives in the company.
>> narrator: facebook has said it took the ftc order seriously and, despite parakilas's account, had large teams of people working to improve users' privacy. but to parakilas and others inside facebook, it was clear the business model continued to drive the mission. in 2012, parakilas left the company, frustrated.
>> i think there was a certain arrogance there that led to a lot of bad long-term decision-making. the long-term ramifications of those decisions were not well thought-through at all.
and it's got us to where we are right now. (cheers and applause)
>> your visionary, your founder, your leader... mark, please come to the podium. (cheers and applause)
>> narrator: in may of 2012, the company finally went public.
>> the world's largest social network managed to raise more than $18 billion, making it the largest technology i.p.o. in u.s. history.
>> people literally lined up in times square around the nasdaq board.
>> we'll ring this bell and we'll get back to work.
>> with founder mark zuckerberg ringing the nasdaq opening bell remotely from facebook headquarters in menlo park, california.
>> narrator: mark zuckerberg was now worth an estimated $15 billion. facebook would go on to acquire instagram and whatsapp on its way to becoming one of the most valuable companies in the world.
>> going public is an important milestone in our history. but here's the thing: our mission isn't to be a public company.
our mission is to make the world more open and connected. (cheering)
>> narrator: at facebook, the business model built on getting more and more users' personal data was seen as a success. but across the country, researchers working for the department of defense were seeing something else.
>> the concern was that social media could be used for really nefarious purposes. the opportunities for disinformation, for deception, for everything else, are enormous. bad guys or anybody could use this for any kind of purpose in a way that wasn't possible before. that's the concern.
>> jacoby: and what did you see as a potential threat of people giving up their data?
>> that they're opening themselves up to being targets for manipulation. i can manipulate you to buy something, i can manipulate you to vote for somebody. it's like putting a target... painting a big target on your front and on your chest and on your back, and saying, "here i am. come and manipulate me. you have every... i've given you everything you need. have at it."
that's the threat.
>> narrator: waltzman says facebook wouldn't provide data to help his research. but from 2012 to 2015, he and his colleagues published more than 200 academic papers and reports about the threats they were seeing from social media.
>> what i saw over the years of the program was that the medium enables you to really take disinformation and turn it into a serious weapon.
>> jacoby: was your research revealing a potential threat to national security?
>> sure, when you looked at how it actually worked. you see where the opportunities are for manipulation, mass manipulation.
>> jacoby: and is there an assumption there that people are easily misled?
>> yes, people are easily misled, if you do it the right way. for example, when you see people forming into communities, okay, what are called filter bubbles. i'm gonna exploit that to craft my message so that it resonates most exactly with that community, and i'll do that for
every single community. it would be pretty easy... it would be pretty easy to set up a fake account, and a large number of fake accounts, embed them in different communities, and use them to disseminate propaganda.
>> jacoby: at an enormous scale?
>> yes, well, that's why it's a serious weapon, because it's an enormous scale. it's the scale that makes it a weapon.
>> narrator: in fact, waltzman's fears were already playing out at a secret propaganda factory in st. petersburg, russia, called the internet research agency. hundreds of russian operatives were using social media to fight the anti-russian government in neighboring ukraine. vitaly bespalov says he was one of them.
>> jacoby: can you explain, what is the internet research agency? (speaking russian)
>> (translated): it's a company that creates a fake perception of russia. they use things like illustrations, pictures--
anything that would influence people's minds. when i worked there, i didn't hear anyone say, "the government runs us" or "the kremlin runs us," but everyone there knew and everyone realized it.
>> jacoby: was the main intention to make the ukrainian government look bad?
>> (translated): yeah, yeah, that's what it was. this was the intention with ukraine. put president poroshenko in a bad light and the rest of the government, and the military and so on. (speaking russian) you come to work and there's a pile of sim cards, many, many sim cards, and an old mobile phone. you need an account to register for various social media sites. you pick any photo of a random person, a random last name, and start posting links to news in different groups.
>> narrator: the russian propaganda had its intended
effect: helping to sow distrust and fear of the ukrainian government. (chanting)
>> pro-russia demonstrators against ukraine's new interim government.
>> "russia, russia," they chant.
>> russian propaganda was massive on social media. it was massive.
>> there were so many pages that start emerging on the facebook.
>> "cruel ukrainian nationalists killing people or torturing them because they speak russian."
>> they scared people. "you see, they're gonna attack, they're gonna burn your villages. you should worry." (speaking russian)
>> and then the fake staged news. (speaking russian)
>> "crucified child by ukrainian soldiers," which is totally nonsense. (speaking russian)
>> it got proven that those hired were actual actors.
>> complete nonsense.
>> but it spreads on facebook.
>> so facebook was weaponized.
>> narrator: just as in the arab spring, facebook was being used to inflame divisions. but now by groups working
on behalf of a foreign power, using facebook's tools built to help advertisers boost their content.
>> by that time in facebook, you could pay money to promote these stories. so your stories emerge on the top lines. and suddenly you start to believe in this, and you immediately get immediate response. you can test any kind of nonsense and understand which nonsense people do not believe... (man speaking ukrainian) and which nonsense people start believing. (chanting in russian) which will influence the behavior of a person receptive to propaganda, and then provoking that person to certain action. ♪
>> they decided to undermine ukraine from the inside... (gunfire echoing, shouting) ...rather than from outside.
>> i mean, basically, think about this-- russia hacked us.
>> narrator: dmytro shymkiv, a top adviser to ukraine's president, met with facebook representatives and says he asked them to intervene.
>> the response that facebook gave us is, "sorry, we are open platform, anybody can do anything without... within our policy, which is written on the website." and when i said, "but this is fake accounts." (laughs): "you could verify that." "well, we'll think about this but, you know, we, we have a freedom of speech and we are very pro-democracy platform. everybody can say anything."
>> jacoby: in the meeting, do you think you made it explicitly clear that russia was using facebook to meddle in ukrainian politics?
>> i was explicitly saying that there are troll factories, that there are posts and news that are fake, that are lying, and they are promoted on your platform by, very often, fake
accounts. have a look. at least send somebody to investigate.
>> jacoby: and no one... sorry.
>> no.
>> jacoby: no one was sent?
>> no, no. for them, at that time, it was not an issue.
>> narrator: facebook told frontline that shymkiv didn't raise the issue of misinformation in their meeting, and that their conversations had nothing to do with what would happen in the united states two years later.
>> jacoby: it was known to facebook in 2014 that there was potential for russian disinformation campaigns on facebook.
>> yes. and there were disinformation campaigns from a number of different countries on facebook. you know, disinformation campaigns were a regular facet of facebook use abroad.
and... i mean, yeah, technically that should have led to a learning experience. i just don't know.
>> jacoby: there was plenty that was known about the potential downsides of social media and facebook-- you know, potential for disinformation, potential for bad actors and abuse. were these things that you just weren't paying attention to, or were these things that were kind of conscious choices to kind of say, "all right, we're gonna kind of abdicate responsibility from those things and just keep growing?"
>> i definitely think we've been paying attention to the things that we know. and one of the biggest challenges here is that this is really an evolving set of threats and risks. we had a big effort around scams. we had a big effort around bullying and harassment. we had a big effort around nudity and porn on facebook. it's always ongoing. and so some of these threats and problems are new, and i think we're grappling with that as a
company with other companies in this space, with governments, with other organizations, and so i, i wouldn't say that everything is new, it's just different problems.
>> facebook is the ultimate...
>> narrator: at facebook headquarters in menlo park, they would stick to the mission and the business model, despite a gathering storm.
>> ...get their election news and decision-making material from facebook.
>> the most extraordinary election...
>> narrator: by 2016, russia was continuing to use social media as a weapon.
>> ...hillary clinton cannot seem to extinguish...
>> narrator: and division and polarization were running through the presidential campaign.
>> just use it on lying, crooked hillary...
>> the race for the white house was shaken up again on super tuesday...
>> narrator: mark zuckerberg saw threats to his vision of an open and connected world.
>> as i look around, i'm starting to see people and nations turning inward, against this idea of a connected world
and a global community. i hear fearful voices calling for building walls and distancing people they label as others. for blocking free expression, for slowing immigration, reducing trade, and in some cases around the world, even cutting access to the internet.
>> narrator: but he continued to view his invention not as part of the problem, but as the solution.
>> and that's why i think the work that we're all doing together is more important now than it's ever been before. (cheers and applause)
>> we stand for connecting every person. for a global community.
>> facebook systematically went from interconnecting people, to essentially having a surveillance system of their whole lives.
>> facebook has come under fire for its role during the election...
>> i mean everybody was pretty upset that we hadn't
caught it during the election, and it was a very intense time.
>> mark zuckerberg will testify...
>> i still have questions... we're going to make sure that in 2018 and 2020 this doesn't happen again.
>> narrator: next time on frontline.
>> go to pbs.org/frontline for the latest in frontline's "transparency project", and see key quotes from the film in context.
>> there will be always unintended consequences.
>> this isn't a problem that you solve, it's a problem that you contain.
>> then read additional reporting and watch a video explainer about what facebook knows about you and how. even though you never signed up for it, facebook now has data about you and stores it as a shadow profile.
>> connect to the frontline community on facebook, twitter or pbs.org/frontline.
>> frontline is made possible by contributions to your pbs station from viewers like you. thank you. and by the corporation for public broadcasting. major support is provided by the john d. and catherine t. macarthur foundation, committed to building a more
just, verdant and peaceful world. more information is available at macfound.org. the ford foundation, working with visionaries on the front lines of social change worldwide. at ford foundation.org. additional support is provided by the abrams foundation, committed to excellence in journalism. the park foundation, dedicated to heightening public awareness of critical issues. the john and helen glessner family trust. supporting trustworthy journalism that informs and inspires. and by the frontline journalism fund, with major support from jon and jo ann hagler. and by the y, for a better us. captioned by media access group at wgbh access.wgbh.org
>> for more on this and other frontline programs, visit our website at pbs.org/frontline.