tv Frontline PBS December 12, 2018 4:30am-5:30am PST
>> narrator: tonight, part one of a two-part investigation. >> we face a number of important issues around privacy, safety, and democracy. >> narrator: frontline investigates... facebook. >> we didn't take a broad enough view of our responsibility. and it was my mistake. and i'm sorry. >> narrator: told by company insiders... >> it's possible that we haven't been as fast as we needed to be. >> we've been too slow to act on... >> we didn't see it fast enough... >> i think we were too slow... >> narrator: ...and former employees. >> i mean, everybody was pretty upset that we hadn't caught it during the election. >> narrator: how it was used to disrupt democracy around the globe. >> i don't think any of us, mark included, appreciated how much of an effect we might have had. >> narrator: correspondent james jacoby takes a hard look at the man who wanted to connect the world. >> jacoby: is he not recognizing the importance of his platform? >> he didn't understand what he had built. >> narrator: but is he accountable for helping divide it? >> there is something wrong systemically with the facebook algorithms. in effect, polarization was the key to the model. >> narrator: tonight on frontline: "the facebook dilemma." >> frontline is made possible by contributions to your pbs station from viewers like you. thank you. and by the corporation for public broadcasting. major support is provided by the john d. and catherine t. macarthur foundation, committed to building a more just, verdant and peaceful world. more information is available at macfound.org. the ford foundation, working with visionaries on the front lines of social change worldwide. at fordfoundation.org. additional support is provided by the abrams foundation, committed to excellence in journalism. the park foundation, dedicated to heightening public awareness of critical issues. the john and helen glessner family trust. supporting trustworthy
journalism that informs and inspires. and by the frontline journalism fund, with major support from jon and jo ann hagler. corporate support is provided by... >> the zip code you're born into can determine your future, your school, your job, your dreams, your problems... at the y, our goal is to create opportunities no matter who you are or where you're from. the y, for a better us. (birds chirping) ♪
>> are we good? should i put the beer down? >> nah, no, actually, i'm gonna mention the beer. (laughing) >> hard at work. >> so i'm here in palo alto, california, chilling with mark zuckerberg of the facebook.com, and we're drinking out of a keg of heineken because... what are we celebrating, mark? >> we just got three million users. >> 11, 12, 13... >> whoo! >> tell us, you know, simply what facebook is. >> i think facebook is an online directory for colleges. i realized that because i didn't have people's information, i needed to make it interesting enough so that people would want to use the site and want to, like, put their information up. so we launched it at harvard, and within a couple of weeks, two-thirds of the school had signed up. so we're, like, "all right, this is pretty sweet, like, let's just go all out." i mean, it's just interesting seeing how it evolves. we have a sweet office. >> yeah, well, show us... show us around the crib.
(talking in background) we didn't want cubicles, so we got ikea kitchen tables instead. i thought that kind of went along with our whole vibe here. >> uh-huh. what's in your fridge? >> some stuff. there's some beer down there. >> how many people work for you? >> it's actually 20 right now. >> did you get this shot, this one here, the riding a pit bull? >> oh, nice. >> all right, it's really all i've got. >> that's cool. >> where are you taking facebook at this point in your life? >> um, i mean... there doesn't necessarily have to be more. ♪ >> from the early days, mark had this vision of connecting the whole world. so if google was about providing access to all the information, facebook was about connecting all the people. >> can you just say your name and pronounce it so nobody messes up and they have it on tape? >> sure, it's mark zuckerberg. >> great.
>> it was not crazy. somebody was going to connect all those people, why not him? >> we have our facebook fellow, we have mark zuckerberg. i have the pleasure of introducing mark zuckerberg, founder of facebook.com. (applause) >> yo. >> when mark zuckerberg was at harvard, he was fascinated by hacker culture, this notion that software programmers could do things that would shock the world. >> and a lot of times, people are just, like, too careful. i think it's more useful to, like, make things happen and then, like, apologize later, than it is to make sure that you dot all your i's now and then, like, just not get stuff done. >> so it was a little bit of a renegade philosophy and a disrespect for authority that led to the facebook motto "move fast and break things." >> never heard of facebook? (laughing) >> our school went crazy for the facebook. >> it creates its own world that you get sucked into. >> we started adding things like status updates and photos and groups and apps. when we first launched, we were hoping for, you know, maybe 400,
500 people. (cheering) >> a toast to the first 100 million, and the next 100 million. >> cool. >> so you're motivated by what? >> building things that, you know, change the world in a way that it needs to be changed. >> who is barack obama? the answer is right there on my facebook page. >> mr. zuckerberg... >> 'sup, zuck? >> in those days, "move fast and break things" didn't seem to be sociopathic. >> if you're building a product that people love, you can make a lot of mistakes. >> it wasn't that they intended to do harm so much as they were unconcerned about the possibility that harm would result. >> so just to be clear, you're not going to sell or share any of the information on facebook? >> we're not gonna share people's information, except for with the people that they've asked for it to be shared. >> technology optimism was so deeply ingrained in the value system and in the beliefs of people in silicon valley... >> we're here for a hackathon, so let's get started. >> ...that they'd come to believe it is akin to the law of gravity, that of course technology makes the world a better place. it always had, it always will. and that assumption essentially masked a set of changes that were going on in the culture that were very dangerous. >> from kxjz in sacramento... >> for monday, june 27... >> narrator: mark zuckerberg's quest to connect the world would bring about historic change, and far-reaching consequences, in politics, privacy, and technology. we've been investigating warning signs that existed long before problems burst into public view. >> it was my mistake, and i'm sorry... >> narrator: but for those inside facebook, the story began with an intoxicating vision that turned into a lucrative business plan. >> well, the one thing that mark zuckerberg has been so good at is being incredibly clear and compelling about the mission that facebook has always had. >> facebook's mission is to give people the power to share. give people the power to share.
in order to make the world more open and connected... more open and connected... open and connected... more open and connected. (applause) >> james jacoby: how pervasive a mission was that inside of the company? give me a sense of that. >> it was something that... you know, mark doesn't just say it when we do, you know, ordered calisthenics in the morning and we yell the mission to each other, right? we would actually say it to each other, you know, when mark wasn't around. >> jacoby: and that was a mission that you really believed in? >> i mean... how could you not? how exciting. what if connecting the world actually delivered on a promise that we've been looking for, to genuinely make the world a better place? >> jacoby: was there ever a point where there was questions internally about this mission being naïve optimism? >> i think the short answer is completely yes, and i think that's why we loved it. especially in a moment like when we cross a billion monthly active users for the first time. and mark's... the way i recall mark at the time, i remember
thinking, "i don't think mark is going to stop until he gets to everybody." >> i think some of us had an early understanding that we were creating in some ways a digital nation-state. this was the greatest experiment in free speech in human history. >> there was a sense inside the company that we are building the future, and there was a real focus on growth being a good thing. it was not a particularly diverse workforce. it was very much the sort of harvard, stanford, ivy league group of people who were largely in their 20s. >> i was a big believer in the company. like, i knew that it was going to be a paradigm-shifting thing. there was this, definitely this feeling of everything for the company, of this, you know, world-stirring vision. everyone more or less dressed with the same fleece and swag with logo on it. posters on the wall that looked somewhat orwellian.
but, of course, you know, in an upbeat way, obviously. and, you know, some of the slogans are pretty well-known-- "move fast and break things," "fortune favors the bold," "what would you do if you weren't afraid?" you know, it was always this sort of rousing rhetoric that would push you to go further. >> narrator: antonio garcia martinez, a former product manager on facebook's advertising team, is one of eight former facebook insiders who agreed to talk on camera about their experiences. >> in silicon valley, there's a, you know, almost a mafioso code of silence that you're not supposed to talk about the business in any but the most flattering way, right? basically, you can't say anything, you know, measured or truthful about the business. and i think, as perhaps with facebook, it's kind of arrived at the point at which it's so important, it needs to be a little more transparent about how it works. like, let's stop the little (bleep) parade about everyone in silicon valley, you know, creating, disrupting this and improving the world, right? it's, in many ways, a business like any other. it's just kind of more exciting and impactful. (techno music playing) >> narrator: by 2007, zuckerberg had made it clear that the goal of the business was
worldwide expansion. >> almost a year ago, when we were first discussing how to let everyone in the world into facebook, i remember someone said to me, "mark, we already have nearly every college student in the u.s. on facebook. it's incredible that we were able to do that. but no one gets a second trick like that." well, let's take a look at how we did. (cheering and applause) >> jacoby: what was the growth team about? and what did you do at growth? >> the story of growth has really been about making facebook available to people that wanted it but couldn't have access to it. >> narrator: naomi gleit, facebook's second-longest serving employee, is one of five officials the company put forward to talk to frontline. she was an original member of the growth team. >> one of my first projects was expanding facebook to high school students. i worked on translating facebook into over a hundred languages. when i joined, there were one million users, and now there's over two billion people using facebook every month. >> jacoby: some of the problems
that have reared their head with facebook over the past couple of years seem to have been caused in some ways by this exponential growth. >> so, i think mark-- and mark has said this-- that we have been slow to really understand the ways in which facebook might be used for bad things. we've been really focused on the good things. >> so who are all these new users? >> the growth team had tons of engineers figuring out how you could make the new user experience more engaging, how you could figure out how to get more people to sign up. everyone was focused on growth, growth, growth. >> give people the power to share. >> narrator: and the key to keeping all these new people engaged... >> to make the world more open and connected. >> narrator: ...was facebook's most important feature... >> news feed. >> narrator: news feed, the seemingly endless stream of stories, pictures, and updates shared by friends, advertisers, and others. >> it analyzes all the information available to each user, and it actually computes what's going to be the most interesting piece of information, and then publishes
a little story for them. >> it's your personalized newspaper, it's your "new york times" of you, channel you. it is, you know, your customized, optimized vision of the world. >> narrator: but what appeared in users' news feeds wasn't random. it was driven by a secret mathematical formula, an algorithm. >> the stories are ranked in terms of what's going to be the most important, and we design a lot of algorithms so we can produce interesting content for you. >> the goal of the news feed is to provide you, the user, with the content on facebook that you most want to see. it is designed to make you want to keep scrolling, keep looking, keep liking. >> that's the key. that's the secret sauce. that's how... that's why we're worth x billion dollars. >> narrator: the addition of the new "like" button in 2009 allowed news feed to collect vast amounts of users' personal data that would prove invaluable to facebook. >> at the time we were a little bit skeptical about the like button; we were concerned. and as it turned out, our intuition was just dead wrong. and what we found was that the like button acted as a social lubricant.
and, of course, it was also driving this flywheel of engagement, that people felt like they were heard on the platform whenever they shared something. >> connect to it by liking it... >> and it became a driving force for the product. >> it was incredibly important because it allowed us to understand who are the people that you care more about, that cause you to react, and who are the businesses, the pages, the other interests on facebook that are important to you. and that gave us a degree of constantly increasing understanding about people. >> news feed got off to a bit of a rocky start, and now our users love news feed. they love it. >> narrator: news feed's exponential growth was spurred on by the fact that existing laws didn't hold internet companies liable for all the content being posted on their sites. >> so, section 230 of the communications decency act is the provision which allows the internet economy to grow and thrive. and facebook is one of the
principal beneficiaries of this provision. it says don't hold this internet company responsible if some idiot says something violent on the site. don't hold the internet company responsible if somebody publishes something that creates conflict, that violates the law. it's the quintessential provision that allows them to say, "don't blame us." >> narrator: so it was up to facebook to make the rules, and inside the company, they made a fateful decision. >> we took a very libertarian perspective here. we allowed people to speak, and we said, "if you're going to incite violence, that's clearly out of bounds. we're going to kick you off immediately." but we're going to allow people to go right up to the edge, and we're going to allow other people to respond. we had to set up some ground rules. basic decency, no nudity, and no violent or hateful speech. and after that, we felt some reluctance to interpose our
value system on this worldwide community that was growing. >> jacoby: was there not a concern, then, that it could become sort of a place of just utter confusion, that you have lies that are given the same weight as truths, and that it kind of just becomes a place where truth becomes completely obfuscated? >> no. we relied on what we thought was the public's common sense and common decency to police the site. >> narrator: that approach would soon contribute to real-world consequences far from silicon valley, where mark zuckerberg's optimistic vision at first seemed to be playing out. (crowd chanting) the arab spring had come to egypt. (crowd chanting) it took hold with the help of a facebook page protesting abuses by the regime of hosni mubarak. >> not that i was thinking that
this facebook page was going to be effective. i just did not want to look back and say that happened and i just didn't do anything about it. >> narrator: at the time, wael ghonim was working for google in the middle east. >> in just three days, over 100,000 people joined the page. throughout the next few months, the page was growing, until what happened in tunisia. >> events in tunisia have captured the attention of viewers around the world, and a lot of it was happening online. >> it took just 28 days until the fall of the regime. >> and it just created, for me, a moment of, "maybe we can do this." and i just posted an event calling for a revolution in ten days, like we should all get to the street and we should all bring down mubarak. >> organized by a group of online activists... >> they're calling it the facebook revolution... (crowd chanting) >> narrator: within days,
ghonim's online cry had helped fill the streets of cairo with hundreds of thousands of protesters. (crowd chanting) 18 days later... >> (translated): president muhammad hosni mubarak has decided to step down. (cheering) >> they have truly achieved the unimaginable. >> it's generally acknowledged that ghonim's facebook page first sparked the protests. >> jacoby: there was a moment that you were being interviewed on cnn. >> yeah, i remember that. >> first tunisia, now egypt, what's next? >> ask facebook. >> ask what? >> facebook. >> facebook. >> the technology was, for me, the enabler. i would have not have been able to engage with others, i would have not been able to propagate my ideas to others without social media, without facebook. >> you're giving facebook a lot of credit for this?
>> yeah, for sure. i want to meet mark zuckerberg one day and thank him, actually. >> did you ever think that this could have an impact on revolution? >> you know, my own opinion is that it would be extremely arrogant for any specific technology company to claim any meaningful role in, in those. but i do think that the overall trend that's at play here, which is people being able to share what they want with the people who they want, is an extremely powerful thing, right? and we're kind of fundamentally rewiring the world from the ground up. and it starts with people... >> they were relatively restrained externally about taking credit for it, but internally they were, i would say, very happy to take credit for the idea that social media was being used to effect democratic change. >> activists and civil society leaders would just come up to me and say, you know, "wow, we couldn't have done this without
you guys." government officials, you know, would say, "does facebook really realize how much you guys are changing our societies?" >> it felt like facebook had extraordinary power, and power for good. >> narrator: but while facebook was enjoying its moment... (man shouting, crowd chanting) back in egypt, on the ground and on facebook, the situation was unraveling. >> following the revolution, things went into a much worse direction than what we had anticipated. >> there's a complete split between the civil community and those who are calling for an islamic state. >> what was happening in egypt was polarization. >> deadly clashes between christians and military police. >> (translated): the brotherhood cannot rule this country. >> and these voices started to clash, and the environment on social media bred that kind of clash, like that polarization-- rewarded it. >> when the arab spring happened, i know that a lot of people in silicon valley thought
our technologies helped bring freedom to people, which was true. but there's another part to this, which is facebook's news feed algorithm. >> if you increase the tone of your posts against your opponents, you are gonna get more distribution. because we tend to be more tribal. so if i call my opponents names, my tribe is happy and celebrating, "yes, do it, like, comment, share, so more people end up seeing it." because the algorithm is going to say, "oh, okay, that's engaging content, people like it, show it to more people." >> there were also other groups of thugs, part of the pattern of sectarian violence. >> the hardest part for me was seeing the tool that brought us together tearing us apart. these tools are just enablers for whomever; they don't separate between what's good and bad. they just look at engagement metrics. >> narrator: ghonim himself became a victim of those metrics.
>> there was a page, it had, like, hundreds of thousands of followers-- all what it did was creating fake statements, and i was a victim of that page. it wrote statements about me insulting the army, which puts me at serious risk, because that is not something i said. i was extremely naïve in a way i don't like, actually, now, thinking that these are liberating tools. it's the spread of misinformation, fake news, in egypt in 2011. >> narrator: he says he later talked to people he knew at facebook and other companies about what was going on. >> i tried to talk to people who are in silicon valley, but i feel like it was not, it was not being heard. >> jacoby: what were you trying to express to people in silicon valley at the time? >> it's very serious. whatever that we... that you are building has massive unintended consequences on the lives of people on this planet. and you are not investing enough in trying to make sure that what
you are building does not go in the wrong way. and it's very hard to be in their position. no matter how they try to move and change things, there will be always unintended consequences. >> activists in my region were on the front lines of, you know, spotting corners of facebook that the rest of the world, the rest of the company, wasn't yet talking about, because in a company that's built off numbers and metrics and measurements, anecdotes sometimes got lost along the way. and that was always a real challenge, always bothered me. >> narrator: elizabeth linder, facebook's representative in the region at the time, was also hearing warnings from government officials. >> so many country representatives were expressing to me a huge concern about the ability of rumors to spread on
facebook, and what do you do about that? >> jacoby: how did you respond to that at the time? >> we, we didn't have a solution for it, and so the best that i could do is report back to headquarters that this is something that i was hearing on the ground. >> jacoby: and what sort of response would you get from headquarters? >> you know, i... it's impossible to be specific about that, because it was always just kind of a, "this is what i'm hearing, this is what's going on." but i think in a... in a company where the, the people that could have actually, you know, had an impact on making those decisions are not necessarily seeing it firsthand. >> i think everything that happened after the arab spring should have been a warning sign to facebook. >> narrator: zeynep tufekci, a researcher and former computer programmer, had also been raising alarms to facebook and other social media companies. >> these companies were terribly understaffed, in over their heads in terms of the important role they were playing. like, all of a sudden you're the public sphere in egypt. so i kept starting to talk to my friends at these companies and saying, "you have to staff up. you have to put in large amounts of people who speak the language, who understand the culture, who understand the complexities of wherever you happen to be." >> narrator: but facebook hadn't been set up to police the amount of content coming from all the new places it was expanding to. >> i think no one at any of these companies in silicon valley has the resources for this kind of scale. you had queues of work for people to go through and hundreds of employees who would spend all day every day clicking yes, no, keep, take down, take down, keep up, keep up, making judgment calls, snap judgment calls, about, "does it violate our terms of service? does it violate our standards of decency? what are the consequences of this speech?" so you have this fabulously talented group of mostly 20-somethings who are deciding
what speech matters, and they're doing it in real time, all day, every day. >> jacoby: isn't that scary? >> it's terrifying, right? the responsibility was awesome. no one could ever have predicted how fast facebook would grow. the, the trajectory of growth of the user base and of the issues was like this. and of... staffing throughout the company was like this. the company was trying to make money, it was trying to keep costs down. it had to be a going concern. it had to be a revenue-generating thing, or it would cease to exist. >> narrator: in fact, facebook was preparing to take its rapidly growing business to the next level by going public. >> i'm david ebersman, facebook's cfo. thank you for taking the time to consider an investment in facebook. >> the social media giant hopes to raise $5 billion. >> the pressure heading into the i.p.o., of course, was to prove that facebook was a great
business. otherwise, we'd have no shareholders. >> facebook-- is it worth $100 billion? should it be valued at that? >> narrator: zuckerberg's challenge was to show investors and advertisers the profit that could be made from facebook's most valuable asset-- the personal data it had on its users. >> mark, great as he was at vision and product, he had very little experience in building a big advertising business. >> narrator: that would be the job of zuckerberg's deputy, sheryl sandberg, who'd done the same for google. >> at facebook we have a broad mission: we want to make the world more open and connected. >> but the business model we see today was created by sheryl sandberg and the team she built at facebook, many of whom had been with her at google. >> narrator: publicly, sandberg and zuckerberg had been downplaying the extent of the personal data facebook was collecting, and emphasizing users' privacy. >> we are focused on privacy.
we care the most about privacy. our business model is by far the most privacy-friendly to consumers. >> that's our mission, all right? i mean, we have to do that, because if people feel like they don't have control over how they're sharing things, then we're failing them. that really is the point-- the only things facebook knows about you are things you've done and told us. >> narrator: but internally, sandberg would soon lead facebook in a very different direction. >> there was a meeting, i think it was march of 2012, in which, you know, it was everyone who built stuff inside ads, myself among them. and, you know, she basically recited the reality, which is, revenue was flattening. it wasn't slow, it wasn't declining, but it wasn't growing nearly as fast as investors would have guessed. and so she basically said, like, "we have to do something. you people have to do something." and so there was a big effort to basically pull out all the stops and start experimenting way more aggressively. the reality is that, yeah, facebook has a lot of personal data, your chat with your girlfriend or boyfriend, your drunk party photos from college, etc.
the reality is that none of that is actually valuable to any marketer. they want commercially interesting data. you know, what products did you take off the shelf at best buy? what did you buy in your last grocery run? did it include diapers? do you have kids? are you head of household? right, it's things like that, things that exist in the outside world, that just do not exist inside facebook at all. >> narrator: sandberg's team started developing new ways to collect personal data from users wherever they went on the internet, and when they weren't on the internet at all. >> and so, there's this extraordinary thing that happens that doesn't get much attention at the time. about four or five months before the i.p.o., the company announces its first relationship with data broker companies, companies that most americans aren't at all aware of, that go out and buy up data about each and every one of us-- what we buy, where we shop, where we live, what our traffic patterns are, what our families are doing, what our likes are, what magazines we read-- data that the consumer doesn't even know is being collected about them,
because it's being collected from the rest of their lives by companies they don't know, and it's now being combined with facebook's data, so that facebook can target ads back to the user. >> what facebook does is profile you. if you're on facebook, it's collecting everything you do. if you are off facebook, it's using tracking pixels to collect what you are browsing. and for its micro-targeting to work, for its business model to work, it has to remain a surveillance machine. >> they made a product that was a better tool for advertisers than anything that had ever come before it. and of course the ad revenue spikes. that change alone, i think, is a sea change in the way the company felt about its future and the direction it was headed. >> narrator: sparapani was so uncomfortable with the direction facebook was going, he left before the company's work with data brokers took effect.
the extent of facebook's data collection was largely a secret until a law student in austria had a chance encounter with a company lawyer. >> i kind of wanted a semester off, so i actually went to california, to santa clara university in the silicon valley. someone from facebook was a guest speaker explaining to us basically how they deal with european privacy law. and the general understanding was, you can do whatever you want to do in europe, because they do have data protection laws, but they don't really enforce them at all. so i sent an email to facebook saying i want to have a copy of my data. so i got from facebook about 1,200 pages, and i read through it. in my personal file, i think the most sensitive information was in my messages. for example, a friend of mine was in the closed unit of the... of a psychological hospital in vienna. i deleted all these messages, but all of them came back up. and you have messages about, you
know, love life and sexuality. all of that is kept. facebook tries to give you the impression that you share this only with friends. the reality is, facebook is always looking. there is a data category called "last location," where they store where they think you've been the last time. if you tag people in pictures, there's gps location, so by that they know which person has been at what place at what time. back on the servers, there is, like, a treasure trove just, like, ten times as big as anything we ever see on the screen. >> narrator: as facebook was ramping up its data collection business ahead of the i.p.o., schrems filed 22 complaints with the data protection commission in ireland, where facebook has its international headquarters. >> and they had 20 people at the time over a little supermarket in a small town, it's called portarlington. it's 5,000 people in the middle of nowhere. and they were meant to regulate google or facebook or linkedin, all of them. >> narrator: schrems claimed facebook was violating european privacy law in the way it was collecting personal data and not
telling users what it was doing with it. >> and after we filed these complaints, that was when actually facebook reached out, basically saying, you know, "let's sit down and have a coffee and talk about all of this." so we actually had a kind of notable meeting that was in 2012 at the airport in vienna. but the interesting thing is that most of these points, they simply didn't have an answer. you totally saw that their pants were down. however, at a certain point, i just got a text message from the data protection authority saying they're not available to speak to me anymore. that was how this procedure basically ended. facebook knew that the system plays in their favor, so even if you violate the law, the reality is it's very likely not gonna be enforced. >> narrator: facebook disputed schrems's claims, and said it takes european privacy laws seriously. it agreed to make its policies clearer and stop storing some kinds of user data. >> so without further ado, mark zuckerberg. >> narrator: in silicon valley, those who covered the
5:04 am
tech industry had also been confronting facebook about how it was handling users' personal data. >> privacy was my number one concern back then. so when we were thinking about talking to mark, the platform was an issue, there were a bunch of privacy violations, and that's what we wanted to talk to him about. is there a level of privacy that just has to apply to everyone? or do you think... i mean, you might have a view of, this is what privacy means to mark zuckerberg, so this is what it's going to mean at facebook. >> yeah, i mean, people can control this, right, themselves. simple control always has been one of the important parts of using facebook. >> narrator: kara swisher had covered zuckerberg since the beginning. she interviewed him after the company had changed its default privacy settings. >> do you feel like it's a backlash? do you feel like you are violating people's privacy? and when we started to ask questions, he became increasingly uncomfortable. >> you know, it's... >> i think the issue is, you became the head of the biggest social networking company on the planet. >> yeah, no, so... but i... the interesting thing is that, you
5:05 am
know, i started this when i was, you know, started working on this type of stuff when i was... >> so he started to sweat quite a lot, and then a lot, a lot, and then a real lot. so the kind that... this kind of thing where, you know, like "broadcast news," where it was dripping down, like, or tom cruise in that "mission: impossible." it was just... it was going to his chin and dripping off. >> you know, a lot of stuff changed as we've gone from building this project in a dorm room... >> and it wasn't stopping, and i was noticing that one of the people from facebook was, like, "oh, my god," and was... we were... i was trying to figure out what to do. >> yeah. i mean, a lot of stuff happened along the way. i think, you know, there were real learning points and turning points along the way in terms of... in terms of building things. >> he was in such distress, and i know it sounds awful, but i felt like his mother. like, "oh, my god, this poor guy is gonna faint." i thought he was gonna faint, i did. do you want to take off the hoodie? >> uh, no. (chuckles) whoa.
5:06 am
>> well, different people think different things. he's told us he had the flu. i felt like... he had had a panic attack, is what happened. >> maybe i should take off the hoodie. >> take off the hoodie. >> go ahead. what the hell? >> that is a warm hoodie. >> yeah. no, it's a thick hoodie. we... it's, um, it's a company hoodie. we print our mission on the inside. >> what?! oh, my god, the inside of the hoodie, everybody. let's take a look. what is it? "making the..." >> "making the world more open and connected." >> oh, my god. it's like a secret cult. >> jacoby: from that interview and from others, i mean, how would you have characterized mark's view of privacy? >> well, you know, i don't know if he thought about that. it's kind of interesting because they're very... they're very loose on it. they have a viewpoint that this helps you as the user to get more information, and they will deliver up more... that's the whole ethos of silicon valley, by the way. if you only give us everything, we will give you free stuff. there's a trade being made between the user and facebook. the question is, are they protecting that data?
5:07 am
>> thank you, mark. >> narrator: facebook had been free to set its own privacy standards, because in the u.s. there are no overarching privacy laws that apply to this kind of data collection. but in 2010, authorities at the federal trade commission became concerned. >> in most other parts of the world, privacy is a right. in the united states, not exactly. >> narrator: at the ftc, david vladek was investigating whether facebook had been deceiving its users. what he found was that facebook had been sharing users' personal data with so-called "third-party developers"-- companies that built games and apps for the platform. >> and our view was that, you know, it's fine for facebook to collect this data, but sharing this data with third parties without consent was a no-no. >> but at facebook, of course, we believe that our users should have complete control of their information. >> the heart of our cases against companies like facebook was deceptive conduct. that is, they did not make it
5:08 am
clear to consumers the extent to which their personal data would be shared with third parties. >> narrator: the ftc had another worry: they saw the potential for data to be misused because facebook wasn't keeping track of what the third parties were doing with it. >> they had, in my view, no real control over the third-party app developers that had access to the site. they could have been anyone. there was no due diligence. anyone, essentially, who could develop a third-party app could get access to the site. >> jacoby: it could have been somebody working for a foreign adversary. >> certainly. it could have been somebody working... yes, for, you know, for the russian government. >> narrator: facebook settled with the ftc without admitting guilt and, under a consent order, agreed to fix the problems. >> jacoby: was there an expectation at the time of the consent order that they would staff up to ensure that their users' data was not leaking out all over the place? >> yes. that was the point of this
5:09 am
provision of the consent order that required them to identify risks to personal privacy and to plug those gaps quickly. >> narrator: inside facebook, however, with the i.p.o. on the horizon, they were also under pressure to keep monetizing all that personal information, not just fix the ftc's privacy issues. >> nine months into my first job in tech, i ended up in an interesting situation where, because i had been the main person who was working on privacy issues with respect to facebook platform-- which had many, many, many privacy issues, it was a real hornet's nest. and i ended up in a meeting with a bunch of the most senior executives at the company, and they went around the room, and they basically said, "well, who's in charge?" and the answer was me, because no one else really knew anything about it. you'd think that a company of the size and importance of facebook, you know, would have really focused and had a team of people and, you know, very senior people working on these issues,
5:10 am
but it ended up being me. >> jacoby: what did you think about that at the time? >> i was horrified. i didn't think i was qualified. >> narrator: parakilas tried to examine all the ways that the data facebook was sharing with third-party developers could be misused. >> my concerns at that time were that i knew that there were all these malicious actors who would do a wide range of bad things, given the opportunity, given the ability to target people based on this information that facebook had. so i started thinking through, what are the worst-case scenarios of what people could do with this data? and i wrote out some of the kinds of bad actors that might try to attack, and i shared it out with a number of senior executives. and the response was muted, i would say. i got the sense that this just wasn't their priority. they weren't that concerned
5:11 am
about the vulnerabilities that the company was creating. they were concerned about revenue growth and user growth. >> jacoby: and that was expressed to you, or that's something that you just gleaned from the interactions? >> from the lack of a response, i gathered that, yeah. >> jacoby: and how senior were the senior executives? >> very senior. like, among the top five executives in the company. >> narrator: facebook has said it took the ftc order seriously and, despite parakilas's account, had large teams of people working to improve users' privacy. but to parakilas and others inside facebook, it was clear the business model continued to drive the mission. in 2012, parakilas left the company, frustrated. >> i think there was a certain arrogance there that led to a lot of bad long-term decision-making. the long-term ramifications of those decisions was not well-thought-through at all. and it's got us to where we are
5:12 am
right now. (cheers and applause) >> your visionary, your founder, your leader. mark, please come to the podium. (cheers and applause) >> narrator: in may of 2012, the company finally went public. >> the world's largest social network managed to raise more than $18 billion, making it the largest technology i.p.o. in u.s. history. and people literally lined up in times square around the nasdaq board. >> we'll ring this bell and we'll get back to work. >> with founder mark zuckerberg ringing the nasdaq opening bell remotely from facebook headquarters in menlo park, california. >> narrator: mark zuckerberg was now worth an estimated $15 billion. facebook would go on to acquire instagram and whatsapp on its way to becoming one of the most valuable companies in the world. >> going public is an important milestone in our history. but here's the thing: our mission isn't to be a public company. our mission is to make the world
5:13 am
more open and connected. (cheering) >> narrator: at facebook, the business model built on getting more and more of users' personal data was seen as a success. but across the country, researchers working for the department of defense were seeing something else. >> the concern was that social media could be used for really nefarious purposes. the opportunities for disinformation, for deception, for everything else, are enormous. bad guys or anybody could use this for any kind of purpose in a way that wasn't possible before. that's the concern. >> jacoby: and what did you see as a potential threat of people giving up their data? >> that they're opening themselves up to being targets for manipulation. i can manipulate you to buy something, i can manipulate you to vote for somebody. it's like putting a target... painting a big target on your front and on your chest and on your back, and saying, "here i am. come and manipulate me. you have every... i've given you everything you need. have at it." that's a threat.
5:14 am
>> narrator: waltzman says facebook wouldn't provide data to help his research. but from 2012 to 2015, he and his colleagues published more than 200 academic papers and reports about the threats they were seeing from social media. >> what i saw over the years of the program was that the medium enables you to really take disinformation and turn it into a serious weapon. >> jacoby: was your research revealing a potential threat to national security? >> sure, when you looked at how it actually worked. you see where the opportunities are for manipulation, mass manipulation. >> jacoby: and is there an assumption there that people are easily misled? >> yes, yes, people are easily misled, if you do it the right way. for example, when you see people forming into communities, okay, what are called filter bubbles. i'm gonna exploit that to craft my message so that it resonates most exactly with that community, and i'll do that for every single community.
5:15 am
it would be pretty easy... it would be pretty easy to set up a fake account, and a large number of fake accounts, embed them in different communities, and use them to disseminate propaganda. >> jacoby: at an enormous scale? >> yes, well, that's why it's a serious weapon, because it's an enormous scale. it's the scale that makes it a weapon. >> narrator: in fact, waltzman's fears were already playing out at a secret propaganda factory in st. petersburg, russia, called the internet research agency. hundreds of russian operatives were using social media to fight the anti-russian government in neighboring ukraine. vitaly bespalov says he was one of them. >> jacoby: can you explain, what is the internet research agency? (speaking russian) >> (translated): it's a company that creates a fake perception of russia. they use things like illustrations, pictures-- anything that would influence
5:16 am
people's minds. when i worked there, i didn't hear anyone say "the government runs us" or "the kremlin runs us," but everyone there knew and everyone realized it. >> jacoby: was the main intention to make the ukrainian government look bad? >> (translated): yeah, yeah, that's what it was. this was the intention with ukraine. put president poroshenko in a bad light and the rest of the government, and the military, and so on. (speaking russian) you come to work and there's a pile of sim cards, many, many sim cards, and an old mobile phone. you need them to register accounts for various social media. you pick any photo of a random person, choose a random last name, and start posting links to news in different groups. >> narrator: the russian propaganda had its intended effect: helping to sow distrust
5:17 am
and fear of the ukrainian government. (chanting) >> pro-russia demonstrators against ukraine's new interim government. >> "russia, russia," they chant. >> russian propaganda was massive on social media. it was massive. >> there was so many stories that start emerging on the facebook. >> "cruel, cruel ukrainian nationalists killing people or torturing them because they speak russian." >> they scared people. "you see, they're gonna attack, they're gonna burn your villages. you should worry." (speaking russian) >> and then the fake staged news. (speaking russian) >> "crucified child by ukrainian soldiers," which is totally nonsense. (speaking russian) >> it got proven that those people were actually hired actors. >> complete nonsense. >> but it spreads on facebook. >> so facebook was weaponized. >> narrator: just as in the arab spring, facebook was being used to inflame divisions. but now by groups working on behalf of a foreign power, using facebook's
5:18 am
tools built to help advertisers boost their content. >> by that time in facebook, you could pay money to promote these stories. so your stories emerge on the top lines. and suddenly you start to believe in this, and you immediately get immediate response. you can test all kind of nonsenses and understand to which nonsense people do not believe... (man speaking ukrainian) and to which nonsenses people start believing. (chanting in russian) which will influence the behavior of a person receptive to propaganda, and then provoking that person on certain action. ♪ >> they decided to undermine ukraine from the inside... (gunfire echoing, shouting) ...rather than from outside. >> i mean, basically, think
5:19 am
about this-- russia hacked us. >> narrator: dmytro shymkiv, a top adviser to ukraine's president, met with facebook representatives and says he asked them to intervene. >> the response that facebook gave us is, "sorry, we are open platform, anybody can do anything without... within our policy, which is written on the website." and when i said, "but these are fake accounts. you could verify that." (laughs) "well, we'll think about this, but, you know, we, we have a freedom of speech and we are very pro-democracy platform. everybody can say anything." >> jacoby: in the meeting, do you think you made it explicitly clear that russia was using facebook to meddle in ukrainian politics? >> i was explicitly saying that there are trolls factory, that there are posts and news that are fake, that are lying, and that they are promoted on your platform by, very often, fake
5:20 am
accounts. have a look. at least sending somebody to investigate. >> jacoby: and no one... sorry. >> no. >> jacoby: no one was sent? >> no, no. for them, at that time, it was not an issue. >> narrator: facebook told frontline that shymkiv didn't raise the issue of misinformation in their meeting and that their conversations had nothing to do with what would happen in the united states two years later. >> jacoby: it was known to facebook in 2014 that there was potential for russian disinformation campaigns on facebook. >> yes. and there were disinformation campaigns from a number of different countries on facebook. you know, disinformation campaigns were a regular facet of facebook use abroad. and... i mean, yeah, technically
5:21 am
that should have led to a learning experience. i just don't know. >> jacoby: there was plenty that was known about the potential downsides of social media and facebook-- you know, potential for disinformation, potential for bad actors and abuse. were these things that you just weren't paying attention to, or were these things that were kind of conscious choices to kind of say, "all right, we're gonna kind of abdicate responsibility from those things and just keep growing?" >> i definitely think we've been paying attention to the things that we know. and one of the biggest challenges here is that this is really an evolving set of threats and risks. we had a big effort around spam. we had a big effort around bullying and harassment. we had a big effort around nudity and porn on facebook. it's always ongoing. and so some of these threats and problems are new, and i think we're grappling with that as a company with other companies in
5:22 am
this space, with governments, with other organizations, and so i, i wouldn't say that everything is new, it's just different problems. >> facebook is the ultimate... >> narrator: at facebook headquarters in menlo park, they would stick to the mission and the business model, despite a gathering storm. >> ...get their election news and decision-making material from facebook. >> the most extraordinary election... >> narrator: by 2016, russia was continuing to use social media as a weapon. >> ...hillary clinton cannot seem to extinguish... >> narrator: and division and polarization were running through the presidential campaign. >> just use it on lying, crooked hillary... >> the race for the white house was shaken up again on super tuesday... >> narrator: mark zuckerberg saw threats to his vision of an open and connected world. >> as i look around, i'm starting to see people and nations turning inward, against the idea of a connected world
5:23 am
and a global community. i hear fearful voices calling for building walls and distancing people they label as others. for blocking free expression, for slowing immigration, reducing trade and in some cases around the world, even cutting access to the internet. >> narrator: but he continued to view his invention not as part of the problem, but as the solution. >> and that's why i think the work that we're all doing together is more important now than it's ever been before. (cheers and applause) >> we stand for connecting every person. for a global community. >> facebook systematically went from interconnecting people, to essentially having a surveillance system of our whole lives. >> facebook has come under fire for its role during the election... >> i mean everybody was pretty upset that we hadn't caught it during the election, and it was a very intense time.
5:24 am
>> mark zuckerberg will testify... >> i still have questions if we're going to make sure that in 2018 and 2020 this doesn't happen again. >> narrator: next time on frontline. >> go to pbs.org/frontline for the latest in frontline's "transparency project", and see key quotes from the film in context. >> there will always be unintended consequences. >> this isn't a problem that you solve, it's a problem that you contain. >> then read additional reporting and watch a video explainer about what facebook knows about you and how. >> even though you never signed up for it, facebook now has data about you and stores it as a shadow profile. >> connect to the frontline community on facebook, twitter or pbs.org/frontline. >> frontline is made possible by contributions to your pbs station from viewers like you. thank you. and by the corporation for public broadcasting. major support is provided by the john d. and catherine t. macarthur foundation, committed to building a more just, verdant and peaceful
5:25 am
world. more information is available at macfound.org. the ford foundation, working with visionaries on the front lines of social change worldwide. at fordfoundation.org. additional support is provided by the abrams foundation, committed to excellence in journalism. the park foundation, dedicated to heightening public awareness of critical issues. the john and helen glessner family trust. supporting trustworthy journalism that informs and inspires. and by the frontline journalism fund, with major support from jon and jo ann hagler. and by the y, for a better us. captioned by media access group at wgbh access.wgbh.org >> for more on this and other frontline programs, visit our website at pbs.org/frontline.