Future of the Internet  CSPAN  August 15, 2017 8:00pm-9:29pm EDT
8:00 pm
countering hate groups and sharing his experience as a former organizer for the white aryan resistance. that's at 7:00 a.m. join the discussion. >> tonight on c-span, the president answers questions about the violence in charlottesville, virginia. rex tillerson says isis is guilty of genocide. and a look at the impact of nafta. first we take you to the aspen ideas festival for a look at the future of the internet. this runs about one hour and a half. [applause] >> hi, everyone.
8:01 pm
thanks for coming. each part will tackle the question, can democracy survive the internet? we are going to talk about trolls, hackers, leakers, twitter, facebook. this question of whether a democracy, in a world in which media has changed so much, can survive or adjust to that new world. nate is a professor of law at stanford law school, and his scholarship focuses on political parties, redistricting, and election administration. he has served as a special master or expert on districting plans in many states. his work is on campaigns and election administration. broadly, that is what he will be
8:02 pm
talking about today. let's welcome nate. [applause] >> thank you. it's an honor to be here. an honor to be on this panel. a lot of people are following me. my role is to set the table. i'm going to talk about the issues and they are going to dive deeply into them. it is a big topic, the question of how the internet affects democracy. the first question: what are the unique challenges the internet poses for democracy? given those challenges, who should address those problems? and what would those reforms look like? i want to start with caveats. there are several academics in the audience.
8:03 pm
the first objection is that we have all seen this before. yes, some problems we have seen many times before -- fake news, hate speech. the question i want to focus on is, what are the unique features of the internet, such differences in degree, that we need to focus on the technology itself? the second academic objection is that what you think is so bad is actually pretty good. the internet is a technology like any other. there are going to be bad things because of the internet and good things. i will talk about how there has been a sea change in the way we have thought about this. the third argument is, i have published on that before. yes, many people have been writing about this before. there is a whole book on this question of
8:04 pm
filter bubbles and echo chambers. i recommend it to you all. let me talk about how i got into this, because i am a campaign and voting rights specialist dealing with redistricting and campaign finance. when the supreme court issued its decision in citizens united, maybe the most reviled decision of the roberts court in the last 20 years, most people looked at it as this great crime the supreme court committed on the nation, giving corporations personhood rights -- that corporations are people. i looked at that case and said, that is not what this is about. it is about the transforming nature of the internet for communication. that case was not about a corporate television ad that was washing over captive eyeballs.
8:05 pm
what it was about was an on-demand movie you could order, sort of like from hbo, produced by a nonprofit corporation called citizens united. it prefigured the future of communication, when we are not going to be receiving our communication through linear programming over the television but through a range of devices, the internet, and mobile phones. that was my entry point thinking about it. the question i wanted to ask was, what were the unique features that the internet posed for american democracy? before the 2016 election, the answer would have been about what everybody looked at as the digital
8:06 pm
campaign geniuses. maybe about the revolution in small-donor fundraising, another thing we saw. maybe it would be about the change in television as the main mode of political communication. but now when we talk about the effect of the internet on democracy, we are talking about fake news, twitter bots -- automated programs that send information to social media users -- and foreign hacking. the optimism about the effect of the internet on democracy has been replaced by pessimism since the election about how it is going to threaten democracy. here are the six features of the
8:07 pm
internet that make it different than the previous communication ecosystem in terms of how it affects democracy. the speed of communication and virality -- the way we get information online. anonymity: while we have had anonymous speech before, the capacity of an anonymous speaker to reach a wide audience is unprecedented. homophily, a fancy word for echo chambers and filter bubbles: we are self-selecting into our own information, preselected for us by the platforms and the like. the issue of sovereignty of course came up in this election; for the first time, it would seem, the united states election
8:08 pm
was penetrated by a foreign power. then finally, the new problems created by the extreme power that corporations like facebook and google have when it comes to political communication. let me start with veracity in velocity and virality. a tribute to -- attributed to mark twain. he died in 1910. one of many examples of fake news. but the point remains, especially in the internet age, there is a capacity of lies to reach around the world and back again before it could be corrected and mediated, assessed for its ferocity. why is it this happens?
8:09 pm
the big story in terms of the speed of communication is that we can do it individually. each one of us can be a broadcaster. previously you had the elite broadcasting companies and the newspapers, at the local or national level, who would act as mediators and would put in roadblocks for certain types of speech and information. that had positive and negative consequences. walter cronkite said, that's the way it is, and americans believed it -- a neutral source of information that people could count on. but it excluded certain voices from the left and right. now what we have is a media ecosystem in which republicans are trusting certain sources and democrats others. there is little trust in the media as an institution at all.
8:10 pm
that lack of trust is representative of a larger lack of trust americans have in almost all elite institutions. it is happening around the world. the loss of intermediaries that squelch speech or set the ground rules for what could be transferred from political actors to the mass public is what makes the problem of fake news possible. everybody who talks about fake news hates the moniker of fake news. you can count me in that group as well. it is not really news. it is describing a heterogeneous category of media -- the daily show or the onion, for example. it also refers to things at the top here, fake news for profit, something we did see at a scale that we had not seen before.
8:11 pm
there were the macedonian teenagers who put up these particular articles, such as "the pope endorses donald trump." it wasn't that they had some affinity for donald trump. because of the way google and other services provide ads, they could make money lifting up something that would get a lot of clicks. so the issue of fake news for profit came up. beyond that relatively small problem -- people who are putting up websites to make money off of the ads -- you have the most violent consequence of fake news, the pizzagate scandal, where someone, believing the conspiracy theories that were stirred in the cauldron of reddit and 4chan, these websites, then actually took
8:12 pm
action, going to a pizza shop and firing a gun. all of this is a product of the fact that you don't have the traditional intermediaries -- a more populist and popular form of communication. if you look at the data on fake news, you find -- buzzfeed has done some incredible journalism on this, and others have shown it -- a remarkable amount of sharing of fake news. the pew research center finds that a significant share of americans have shared a fake news story. if you look at the tweets or the engagement on facebook, you find significant sharing of fake news during the election. we do not know, however, what effect fake news had on the election. my colleague at stanford has found it did not play a dispositive
8:13 pm
role -- although it depends on how you define fake news. but what we can say is the media system has changed, certain incentives are there, and we have a lot of this content on the internet. the next feature is anonymity, the ability to engage in anonymous speech online. we have a long tradition of anonymous speech -- the federalist papers -- something protected under the constitution. but it also facilitates misinformation and the problem of foreign intervention, and it aids in the propagation of hate speech.
8:14 pm
then there is the issue of bots -- that computers can essentially imitate human beings. in talking about the issue of bots, just to be clear, what is a bot? it sounds fancy but it's not; it's basically just code in a computer, right, or an algorithm in a search engine, that produces a response or delivers communication to you. bots can be good, bots can be bad. you can have a bot that tells you what the weather is every day, right. it's just computer code that delivers information. when we talk about the kinds of bots that affect campaigns and affect elections, though, we are often talking about twitter in particular, although it's really interesting that there are now bots on youtube, and there are bots on facebook, though facebook has been pretty aggressive in trying to police them. but these bots are automated accounts that pose as individuals, in a sense, or seem like they might be individuals, who re-tweet content, produce content
8:15 pm
sometimes the bots are sort of sent in to comment on other people's posts, but it creates a false impression of popularity. so something like one third of the followers that donald trump had on his twitter account during the campaign were bots. but hillary clinton had them as well, right -- millions of bots were followers of hillary clinton. interestingly enough, donald trump retweeted bots -- there's some debate about this -- 100 to 150 times during the election. but that's because, to the average observer, you don't know who a bot is, right. for analysts, the reason we know these sorts of statistics on how many bots are out there is that if some account is tweeting 24 hours a day, 7 days a week, you think, well, maybe it's not human, right. and so there are ways of trying to figure out what's a bot and what's not. but it's a significant amount of
8:16 pm
the twitter accounts. twitter says it's 8%; some analysts say it's closer to between 10 and 15%. certainly during the debates -- right around the presidential debates -- about 15% of the election-related conversation was being done by these automated accounts. about 45% of the twitter accounts in russia are bots, okay. and so it's a significant issue going forward.
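the "tweeting around the clock" test described above can be sketched in a few lines of code. this is a minimal illustration of that kind of heuristic, not twitter's or any analyst's actual detection method; the account records, field names, and thresholds below are assumptions invented for the example.

```python
# minimal sketch of an activity-based bot heuristic (illustrative thresholds only)
accounts = {
    "casual_user":   {"tweets_last_week": 120,  "active_hours_per_day": 9},
    "always_on_999": {"tweets_last_week": 4800, "active_hours_per_day": 24},
}

def looks_automated(stats, max_weekly_tweets=1000, max_active_hours=20):
    """flag accounts whose posting volume or schedule seems implausible for a human."""
    return (stats["tweets_last_week"] > max_weekly_tweets
            or stats["active_hours_per_day"] > max_active_hours)

for name, stats in accounts.items():
    label = "possible bot" if looks_automated(stats) else "looks human"
    print(f"{name}: {label}")
```

in practice, analysts combine many such signals (posting rate, account age, retweet ratios); a single threshold like this is only the simplest version of the idea.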
8:17 pm
just as anonymity allows computers to impersonate individuals, it allows individuals to engage in unaccountable speech like hate speech, right. there is a big debate as to whether hate speech has actually been going up over the last year or so. the best analysis is done by josh tucker at nyu, and i present one of his slides here, where he finds that during the election -- in the run-up to the election, during the campaign -- there wasn't a spike in hate speech. there was a rise in white nationalist speech, it seems, following the election, okay, and a blip also in misogynist speech after the election. but this research is still ongoing, and there's no question that there's been a rise in hate speech directed toward journalists, many of whom are here right now at this conference, and hate speech on the basis of race and gender and religion and the like. and there are sort of cesspools of hate speech on the internet, in places like 4chan and reddit, that produce things like the pizzagate scandal. now just as there are these cauldrons of hate speech online, there are cauldrons of everything online, right -- there are locations where you can go to get reaffirming information that satisfies your preexisting biases.
8:18 pm
whether it's cat videos, right, or whether it's white nationalist rhetoric, whether it's supporting a political candidate or the like. when people talk about the problem of homophily, the problem of echo chambers, most of the time what they're talking about is how search engines and social media reinforce your political beliefs by feeding you information that is pre-selected for you, right. and there is evidence -- look, it is no surprise to you that it tends to be more republicans who go to websites like the drudge report and fox news, right, and that democrats are going to be more likely to go, as it shows here, to the huffington post, the new york times, and the like, right. interestingly -- and this
8:19 pm
is from my colleague shanto iyengar at stanford -- the two publications that fall right in the middle are real clear politics and usa today, at least during the election. that's because everyone goes to real clear politics for the polls, right, and so that's why it sort of fell right in the middle. and so the challenge in thinking about homophily and whether the internet is reinforcing our preexisting beliefs is that it's not enough to just say, well, look, conservatives gravitate to conservative sites, liberals gravitate to more liberal sites. the question is, as compared to what, right. because we are self-segregating in all areas of our lives, whether it's online or offline, right. and so you are actually more likely, or at least as likely, to run into someone on social media -- say, in your friendship network on facebook -- who voted for a different presidential candidate than you, than you are, say, if you wanted to find a republican on the streets of aspen right now; only about 20 to 24 percent of the population here voted for donald trump, right. and so much of our lives
8:20 pm
are self-segregated in neighborhoods, where we are reinforcing our political beliefs in the offline world, that the fact that we see the same thing in the online world should not be too surprising. nevertheless, when you start looking at hot topics or politics and the like, you do see twitter networks in which democrats are re-tweeting democrats, republicans are re-tweeting republicans, and people are of course gravitating to many of the sites that i showed you on the first slide, all right. now moving to the last two unique features of the internet: sovereignty, or the lack of sovereignty, and monopoly. i'm going to just briefly talk about this because we've got some real experts on the panel following me talking about this. in thinking about the issue of russian penetration in the last election -- which, you know, you just need to turn on the tv and you'll hear about it every day -- all of these other features of the internet that i've talked
8:21 pm
about up till now are ones that are facilitating the ability of outside state actors, right, to have an influence on our domestic elections. that is, i think, one of the unique features of the internet age, right. yes, it's true that if you look at what the united states did in latin america, there are examples where we have had an effect on other countries' domestic elections. and there are examples of attempts where the russians, even during the soviet era, may have tried to have an effect on a u.s. domestic election. but the sophistication of the russian disinformation campaign -- with the use of trolls, bots, and cyborgs; you know what bots are now; trolls are actual human beings, and if you've been watching homeland, they did a whole episode dealing with trolls; and cyborgs, which are sort of actual human beings who are in charge of a range of
8:22 pm
automated accounts -- and through hacking and disinformation and the dissemination of propaganda and the like, through both official and unofficial channels, right. so whether it's through the rt television network or through the rt and sputnik websites, we saw a lot of this effort in this last election. finally, let me talk about the issue of monopoly. so i had to get the atlantic in here, right, which had an article saying that facebook is eating the internet. not to be outdone, the columbia journalism review says facebook is eating the world. or, as another publication said, google is the new mind control, and you need to stop google and facebook from destroying journalism. and so there is no doubt that the power that these platforms have is different in character from the power that, say, the three networks had. because if you look at the effect that they're having on the revenue sources for journalists, it's a completely different world than one where you had this sort of oligopoly of the broadcast
8:23 pm
networks and the like. it's a different kind of monopoly though, right -- it's a different type of power -- and there's going to be a whole session here at aspen later today about whether google and facebook are too powerful, so i commend that to you. but we have this sort of bizarre situation, right, where you have this concentrated power among the platforms and you have this fragmented media environment at the same time. so you have more information at your disposal, information and disinformation, than we've ever had before, and yet the power of any individual platform is diminished except for just a select few like google and facebook. all right. so now let me conclude by just talking about what the sources of reform are. first, you can look to government regulation, and that's what's happening in germany, right. so there is a law that's been proposed -- it's called the fake news bill, but it's really not about fake news; it's about any illegal speech that is prohibited under german law, which includes things like
8:24 pm
holocaust denial but also a kind of capacious definition of slander and the like. that speech will then be illegal online, but more importantly, the platforms will be liable for up to 50 million euros for any example of illegal speech that occurs on their platform and isn't taken down within 24 or 48 hours. then there is platform self-regulation. so you've probably seen, if you have a facebook account, this new option post-election to report a story as fake. and so you, the user, are now empowered to send a message to facebook that a story should be referred to fact checkers; if two fact checkers then agree that it is false, it will be marked as false for every other user. that's what it looks like when you get a disputed fact check. and then finally there is a series of apps that have been developed to help all of us combat each one of these
8:25 pm
problems, whether it's echo chambers, or fake news, or automated accounts, to give you, the user, some options that the platforms might not give you. but in each one of these cases the reforms fall into one of several categories. either, as in the fake news situation, there's additional disclosure as to where this is coming from, or who's behind it, or whether it's been checked as fake news. in some cases it's demotion in a search engine and the like -- so, for example, google news' algorithm will tag certain stories or certain sources as less credible than others so that they are not put at the top of a news feed. there's another proposal for delaying fake news that krishna bharat, who founded google news, has proposed, which is to say, have some trip wires for when a story starts becoming viral and gets forwarded more than a thousand times and the like.
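a rough sketch of that "trip wire" idea -- holding a story for review once its share count crosses a threshold -- is shown below. this is only an illustration of the concept, not krishna bharat's or google's actual design; the threshold, function names, and review step are assumptions made for the example.

```python
from collections import defaultdict

SHARE_THRESHOLD = 1000  # illustrative trip wire: "forwarded more than a thousand times"

share_counts = defaultdict(int)
held_for_review = set()

def record_share(story_url):
    """count a share; once a story crosses the threshold, route it to fact-checkers."""
    share_counts[story_url] += 1
    if share_counts[story_url] >= SHARE_THRESHOLD and story_url not in held_for_review:
        held_for_review.add(story_url)
        print(f"trip wire: {story_url} is going viral; hold for fact-checking")

# usage sketch: a single story shared past the threshold
for _ in range(1001):
    record_share("http://example.com/some-viral-story")
```

the design choice here is delay rather than removal: nothing is deleted, but virality alone triggers a human review step before further amplification.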
8:26 pm
in europe and elsewhere, where you have robust public broadcasting corporations and a public-media presence, there's the idea of trying to dilute the effect of bad speech with better speech, right, so that you can combat fake news, hate speech, and the like. and then there are more authoritarian approaches to this, although we do this also in the united states, which is to literally censor the speech, right. if you look at the platforms and the terms of service that they have, it is clearly the case, right, that whether it's incitement or hate speech or obscenity or a range of other types of speech, if you violate those terms of service that speech will be taken off the platform, right. so i will end just by laying out the challenges that i think the internet poses for democracy. it does test our conviction, particularly in the united states, that the marketplace of ideas is the test for truth.
8:27 pm
it challenges our democracy to function in an environment where there is widespread disagreement on facts and distrust in institutions. and then finally, how do we manage a campaign information ecosystem when it is, sort of, invaded either by humans on the outside -- state actors -- or bots, trolls, and cyborgs on the inside. and so if the question is, can democracy survive the internet, i have a, you know, click-bait answer, which is: the answer may surprise you. so i will end on that. [applause] >> let me set this up. we are going to be talking about where nate left off, which is
8:28 pm
information-based attacks. and let me introduce the panelists who are going to be talking with us. one second. so john carlin is the chair of the aspen institute cyber security and technology program. you can come up. until the fall of 2016, he was assistant attorney general for national security at the justice department and its top national security attorney. he previously served as national coordinator of doj's computer hacking and intellectual property program, as an assistant u.s. attorney for the district of columbia, and as chief of staff to former fbi director robert mueller. mr. carlin: so, he was in the news again. mr. manjoo: yeah.
8:29 pm
mr. manjoo: ron brownstein is editorial director for strategic partnerships at atlantic media, where he contributes to the atlantic and the national journal, including a weekly column on both sites. he's a longtime political analyst for cnn, he was previously a columnist and political correspondent for the la times, and he's the author of six books, most recently "the second civil war: how extreme partisanship has paralyzed washington and polarized america." and julia angwin is a senior reporter at propublica, where in 2016 she led a team investigating algorithms that was a finalist for a pulitzer prize in explanatory reporting. from 2000 to 2013 she reported at the wall street journal, where her team of reporters won the 2003 pulitzer prize for explanatory reporting covering corporate corruption. and she's the author of, among other things, "dragnet nation: a quest for privacy, security and freedom in a world of relentless surveillance." so i wanted to talk about this question: one way to think of what happened in the 2016 election is to think
8:30 pm
that it was an attack on the media -- like the way the media works now, as nate defined it, made us uniquely vulnerable to outside attackers. do you guys think about it that way? is this sort of a media problem that the collective of journalists and platforms involved in media, like facebook and twitter and others, should work to solve? mr. carlin: i guess, removing it for a second from the question of our elections, this type of tactic, this type of strategy -- it's not the first time we saw it. a couple of points. one, the islamic state in the levant, a terrorist group that grew by crowdsourcing terrorism, adopted a strategy of turning western technology against us, just like al qaeda used aviation against us. we saw the islamic state in the levant try to turn human beings
8:31 pm
into weapons. and they did so by using social media, which they had for free. and they were incredibly effective at using propaganda and other means to target particularly young people and those with mental health issues. in the united states, when i was overseeing the terrorism prosecutions here, we brought more international terrorism cases than we'd ever brought before, and they were directly linked to that fact. if you looked at who the defendants were, it's not that there's one geographically isolated or ethnically isolated group inside the united states; there are cases across 35 different u.s. attorneys' offices. what they had in common was that in over half the cases the defendants were 25 or younger, and most troubling -- and why prosecution isn't solving the problem -- in one-third of the cases the defendants were 21 or younger. the reason why is, in almost every case, what we saw was the involvement of social media. and secondly, if you look at the use of so-called fake news, we saw the syrian electronic army spoof a terrorist attack on the white house by taking over a news organization's twitter account and saying that the white house was under attack. that caused the stock market to
8:32 pm
lose billions of dollars. that's fake news, and it was reported on by the mainstream media once they saw the fake twitter account. or sony: we war-gamed for years what it would look like if north korea attacked the united states through cyber means. we got it all wrong. we never guessed it'd be about a movie about a bunch of pot smokers. it's the only time in my career i've had to brief the president in the situation room and start by giving a plot synopsis of a movie, which, if you've seen it, is not easy to do, right? but that attack was not actually the first destructive attack in the u.s. there had been an iranian-affiliated attack on the sands casino. there had been iranian-affiliated actors that committed a denial-of-service attack on the financial sector using a botnet. the reason why everyone remembers sony and north korea is not because of the destruction -- and they did destroy their systems -- and not because they stole intellectual property, although they did steal intellectual property and
8:33 pm
post it. it's because they took salacious e-mails off the system, put them up through social media, and then watched as the mainstream media jumped all over it and magnified that message, doing the exact harm that north korea wanted to accomplish, which ironically was to prevent the exercise of the first amendment and political free speech. so, prior to the election, and in different contexts, i think we've seen all these tools used before. mr. brownstein: yeah. no, and i remember saying at the time and thinking at the time, when the north korea episode happened -- i live in los angeles now -- i mean, the initial reaction was, "wow, this is really cool. we know what all these people think about angelina jolie." and then it was only later that we kind of realized, you know, this is not so cool, we are being attacked by a foreign intelligence service. and that was exactly the life cycle of the dnc hack and the john podesta hack. and, you know, that's why i think thinking about this as an attack on the media is kind of insufficient. i think what the public is concerned about is bigger, and it's bigger in two senses. first, i think obviously we are worried, and
8:34 pm
with good reason, after this election, about what foreign actors can do to us and to our elections. whether or not, you know, russia was able to actually influence the vote totals or deregister people, which is kind of a nightmare scenario that's still out there, the sheer level of interfering and throwing dust up is significant and in many ways has succeeded, i think, beyond their wildest dreams in leaving us all kind of chasing our tail after the election. but i don't think we're only concerned about what others are doing to us. it's what we're doing to each other. and nate's slide -- when you look at the level of self-segregation on the internet, it's really not that different from our self-segregation in the society overall at this point; it's just making it easier, more intense, angrier, louder. just a quick question on this. how many of you live in a county that you think was close in the last presidential election? there you go.
8:35 pm
60% of us live in counties that were decided by 20 points or more. so we're taking that offline life, transferring it online, and adding a level of ferocity because of the anonymity and so forth. so, you know, i kind of look at this and i feel like, you know, the title was "can democracy survive the internet." no. what do we do for the next hour? what are we going to talk about? it's not that the internet is creating this -- i mean, the foreign intervention is new; it's giving them a new way to kind of strike at us. but in terms of what we're doing to each other, it's taking the things that are already happening and it is just accelerating them and making them more vicious, just more vicious, in a way that strains, i think, the bonds of the country. there's no way to look at what's happening in the last few years, especially since this election, without concluding that the bonds that are holding the
8:36 pm
country together are fraying, and that in many ways -- as someone who travels a lot and spends a lot of time in different cities -- this feels to me like barely one country anymore. and the internet, i think, is an important part of that. ms. angwin: well, i wanted to say that all that is true, but i do think you're right that the media itself has been attacked, just in a different way than what you guys describe. i see it through the lens of finances. we had this really wonderful period where having a robust media that was extremely adversarial to the government, and really proved to be a great watchdog, was also something that was for-profit and made money. and that is just no longer true. and i'll tell you why. it's because we've allowed ourselves to create this new internet economy, which i call the surveillance economy. previously, the wall street journal, where i worked, could sell their audience -- they're an elite audience, and you would pay a lot of money to put an ad on their page and in their newspaper, because those are the high-end people you want to reach. but with the internet, we have this new model, ad tracking, which is like, you don't need to find that guy in the wall street journal anymore. you just say, i want to find all dudes who look like john carlin and follow them around the
8:37 pm
internet with all the tracking technology. and actually they're more likely to look at my ad when they're not reading a really good story, so they're more likely to look at it on some crappy website. and so what we've done is made it so that no newspaper can financially support its journalism through its online audience. the two media outlets that are in the top 100 properties globally are the new york times and espn, and neither one of them is profitable online. and so if you can't be a top 100 online property and make money, like, i don't know who can. so then you have another thing, which is, nobody would have ever bought an ad in breitbart before, okay, or the daily stormer or any of these other places. but they all have ads from google ad networks. and so what we've done is we've taken away the funding for traditional media, which had a real sense of purpose and a watchdog mission, and we've funded sites that are basically just click-bait propaganda on both sides -- but there is probably more on one side, i don't know, it seems that way. and so we do have a financial problem, which is, do we want to
8:38 pm
-- how are we going to support journalism because journalism is the watchdog of democracy. right now, if there were no leaks in washington, all those people would be very happy to pretend like there's nothing going on, right? there's no health care bill. like, what do you need to know, right? and so this is what journalists are for. and so we do have to face the fact that if you don't fund journalism, it isn't going to be a watchdog for democracy and then we won't have democracy. mr. manjoo: so let's say we had a better funded mechanism for creating media. what would that -- like, what would a reformed media have done with something like the podesta email? so he gets -- he gets his phishing attack, he clicks on it. he -- all his emails get out there. how does a sort of responsible news environment that's competitive like, you know, address that? ms. angwin: that's a very difficult question because, right, you know you have -- the european standard is actually very pro-censorship. and it's something i don't think
8:39 pm
that we would take well to here, right -- the whole news blackout in france before the election. like, it actually worked well for them, right? the wikileaks thing just died. but at the same time, me as a journalist, that makes my skin crawl -- the idea of just not writing about something. and so i don't know that we as a nation are ready to go to that level. but, you know, it's also worth noting that they have kept their democracies intact, and maybe that's the price you have to pay. mr. brownstein: you know, it may be the answer. i mean, i felt from the beginning with the podesta emails that the bigger story was how they were getting out, how this had come to happen, more than anything in them. there's no way, to your point, that we're not going to cover what is put out. but i would hope, after the north korea episode and certainly after the 2016 episode, that the question of who did this and why is at least as prominent right from the beginning. because realistically, as you say, there are no gatekeepers anymore, there are no standards -- somebody is going to put out there what's in the e-mails. but the message, i think, should have been -- and at least i can say in my own reporting, from the beginning it was -- how is this happening, why is this
8:40 pm
happening, who is doing it -- who is doing this has to be given equal weight, i think. mr. carlin: i think that's an important issue, and as the non-media member here, there are other ways we should start thinking about this. this is going to happen again. we've had every leader of our intelligence community say the russians consider this a success. they did it before to other countries. they've done it here. it was a success. they will do it again in 2020, maybe as soon as 2018. so what can we do that does not rely on the better angels of the media to resist reporting on stolen material -- mr. brownstein: that is a thin reed. yes, that is a thin reed. mr. carlin: and france -- it wasn't just one thing that happened in france. so let's assume there's nothing you can do, if your system is connected to the internet, to keep a dedicated bad guy, particularly a nation-state, from getting into your system and looking around. one low-cost change of thinking that we've talked about in the corporate sector for a while now is, "hey, if you assume that's the case, don't put everything that could harm you the most in a folder called crown jewels inside your system," right. maybe put something in crown jewels that won't work if they steal it.
8:41 pm
well, the macron campaign did that. they essentially knew russia was going to break into their system, so they deliberately put fake e-mail into the system. when it was then stolen and they attempted to leak it, the campaign told the mainstream media and others, we're not going to tell you what's real or what's fake, and russia doesn't know what's real or what's fake. suddenly people didn't want to touch the story. another issue, though, on the truth is, who's the credible voice? which is very hard when you're leading up to an election. which political leader is going to get up and say, "hey, there's a foreign power that's interfering with our election," even though all of us in the law enforcement and intelligence community can see what's happening? i think there we should think about a structural reform of having a body -- a dead man's switch, if you will -- of career officials who can make the determination that this is happening, and they have no -- mr. manjoo: why couldn't it have been, like, the fbi director? mr. carlin: well, i think we've seen that an fbi director on their own, without pre-conversation, is not
8:42 pm
necessarily sufficient for people to believe the statements that are made. but i think it is an issue, you know, he was struggling with. you've seen from reporting now that the president and others were struggling with how to call out what we're seeing. i think if we want to rely on bipartisan political agreement, that's another thin reed. mr. brownstein: there's a little bit, though, of "we have met the enemy and he is us" here, in that every intelligence agency in october said this was happening. and there was a large segment of the electorate that did not want to believe it, either because they didn't trust the intelligence community of barack obama or simply because they viewed it as against their interest in the campaign. that's why there's going to be a problem. and i don't think the internet by any means has created the intense separation, sorting out, polarization that we're living through. but it does intensify it. and i do wonder, even if we create this bipartisan group, whether the side that is being advantaged by the leak in the run-up to the 2020 election will just say that group is the elite, is the
8:43 pm
establishment. ms. angwin: i'd just like to point out that, as far as i understand political campaigning as a tech journalist, it's always been information warfare. like, i don't know what we're talking about, to be honest. information warfare is what political campaigning is, and obviously both sides are going to play the game as best they can. the rules of the game have changed and the stakes have gotten crazy with, like, bulk leaking. but i'm not sure that's going to change. what disturbs me about this election, actually, is the non-auditability of our voting machines across the nation. we have no way to know what happened. and we also have no way to know what happened on our social networking platforms. so as nate was saying, basically the fact is that there's no record of whether fake news made a difference, because facebook is the only one who knows, right. and so i'm sort of more of the mindset that we need to know how to audit, right? because i think that's how our democracy works.
8:44 pm
we are not into censorship. we're not into proactively blocking information. but to function as a democracy we need to be able to audit it. and there are some pieces right now that are not auditable, and they need to be auditable. that's what i would say needs to change. mr. brownstein: you would think one thing that is different is that moving so much of the communication online has created more opportunities than in the past for these foreign actors to -- i mean, in the 1968 campaign, it's not as clear to me how you would have tried to disseminate information, say, across the nation. ms. angwin: i mean, the u.s. has dropped leaflets in seven countries. it's not like we've never done this before. mr. brownstein: yeah, sure, right? but no one's dropped leaflets here, i mean. ms. angwin: oh, okay, but like -- mr. manjoo: no, but i also think there's an argument that this was more effective than leaflets, right? like the idea of --
8:45 pm
mr. brownstein: because it leaves no fingerprints. ms. angwin: yeah. mr. manjoo: right. and the idea of sort of hacking internal, you know, campaign documents that you know would be amplified in the media seems different to me. like, sort of fundamentally different. ms. angwin: but it's not preventable. i mean i do think what john is saying is like it's not -- like, at this moment, you know, maybe you have some secret thing but i don't know an institution hasn't been hacked, right, the nsa or -- mr. manjoo: yeah. i mean is that -- so is the long term solution to this -- i mean, the institutions could get better security. i mean john podesta didn't have two-factor authentication on. like, he could have just done something simple. they asked the it guy, the it guy said it was a legitimate email and it turns out he, like, had a typo, he meant it was an illegitimate email and -- ms. angwin: or so he says now. mr. manjoo: yeah, right, exactly. ms. angwin: i mean, i think that the true thing is, and then you would probably disagree with me,
8:46 pm
but the u.s. has clearly spent a lot more time preparing for, like, the equivalent of cyber war at a high level, and this was guerrilla warfare, right? and so i think there's an adjustment in cyber techniques that is going to have to happen, right, because people need to focus a lot more on the low-hanging fruit. and maybe, you know, we don't necessarily need to do as many of the high-level things, or we need to do them too, i don't know. but i think there is an adjustment we have to have. mr. carlin: well, look, we've moved as a society over a 25-to-30-year period from putting everything we value in analog space -- paper and books -- to digital space. then we connected it using a protocol that was never designed for security in the first place, the internet. and now we're playing a frantic game of catch-up, both in the private sector and the government. and the fact is, right now, today, there is no safe internet-connected system. so i don't think you can blame the victim in each one of these instances. we need to balance that with trying to have them adopt better security practices, so that the low-hanging fruit -- the actors without sophisticated capability -- aren't running rampant through our private systems. but at the same time we have to be realistic. we have to start preparing our defenses with the assumption that somebody can get in and not just steal information but alter the integrity of data or destroy the ability of systems to work as they're designed. mr. brownstein: and that is the nightmare scenario, right -- that
8:47 pm
someone alters the actual vote totals. you have an election -- just think about how polarized we are, how divided we are, how apocalyptically each side views the stakes in the next election -- and imagine if we have an election where, the day after, we are not sure who really won. ms. angwin: we just had that. i mean, we just had that. mr. brownstein: well, i don't think -- i don't think we had. i don't know. ms. angwin: no, we had -- mr. brownstein: we did not have a significant number of people who said that the vote totals were manipulated. ms. angwin: no, but we don't know. we don't know. we literally know that they broke in and tried to get the software from a voting machine software company. and there's basically no way to audit whether that went any further, right? and we suspect it didn't. but because of the lack of an audit trail, like, we are in that situation now. mr. manjoo: if it did happen, it would look exactly like what it looks -- ms. angwin: yeah, right, it would look like this. this would be the successful operation. mr. brownstein: but all i'm saying is there was not a widespread sense in the country the day after the election that the results were illegitimate. ms. angwin: not the day after. it took a while. mr. brownstein: no, i still -- i
8:48 pm
don't think there is now. i mean, i don't think there is now a sense that the results were manipulated. maybe there should be, but there isn't. and i'm saying, if you get to the next election and half the country, or 60 percent of the country, thinks that the totals they're seeing on the screen are not what people voted, that's just a whole other level of social conflict and dysfunction that i'm sure vladimir putin would be very happy to see. mr. carlin: and it doesn't take a change in the vote count, right? i used the syrian electronic army example: there was essentially a day of confusion where they pretended that there was a terrorist attack on the white house, which cost the stock market billions of dollars. what if there was a similar style of misinformation campaign on election day? then no one would trust that it didn't affect the vote count, and you'd know that a foreign power did it. so it's these sideways attacks that we've continually seen to be effective now and that we need to think about and prepare for. the buzzword in the corporate sector and corporate governance is resilience -- not, can we absolutely prevent it from happening, but how do we make sure we can recover. but here it's not corporate governance, it's our governance, and we don't seem to have a similar focus. ms. angwin: well, i think that transparency is a key part of it, right? there's just too much secrecy in government right now, and there's overclassification. and so that has bred a lot of distrust in government, and people don't know what to believe. and so we have to build systems where, once again, part of the audit thing is transparency
8:49 pm
-- the vote totals have to be in some sort of lockbox, or there has to be, like, a citizen committee, a dead man's switch, whatever group you want to call it. but there do have to be trusted entities that have gone through some public vetting process and that people feel vested in, and that takes a while to build. that's institution building. mr. manjoo: i feel like we're all calling for this trusted entity. but, like, does everyone trust a single entity in this country? i mean, that's sort of the fundamental problem, right? like, we wouldn't be able to find that person who promised us -- ms. angwin: people trust documents, right? they do. they trust the wikileaks leaking, even if it might or might not be true. like, we're in a world of show, not tell, right? it used to be, back in the day, you know, we could write a story with three anecdotes and say, that's the news. now you have to show the documents, and you have to be scared that the person who leaked them to you is going to be found out, because you have to post the documents. but, like, we are in a show-your-work mode, and that's because we have to build trust again, because everyone lost trust. mr. brownstein: i agree with you
8:50 pm
though. i don't think there is any set of individuals who could kind of bless the election results and everybody would say, yeah, okay. i mean, even short of manipulating the vote totals, what if we get to 2020 and in six key states the number of people who go to the polls and find their registration is invalid is 2 percent higher than it was in 2016? i mean, like, how do we know what that meant, whether something happened or not? it's just adding this kind of layer -- at a time when, as i say, each side views this in increasingly apocalyptic terms, you layer on to that a question of the legitimacy of the result. and these are very corrosive forces. mr. manjoo: what is the role here for the tech platforms -- for facebook particularly, google, and also twitter, which has a huge role in not many people's lives but in journalists' lives? what is the role for them? i mean, specifically with regard to facebook, could facebook, like, somehow bless the election?
8:51 pm
one of the things that mark zuckerberg said after the election was that they had looked at all their data and thought the results matched up with the data facebook had about how people were talking about the elections. if there are entities in this country, google and facebook, that sort of know everything, could they, you know, do some kind of investigation of the election? ms. angwin: oh my god. i mean -- [laughter] mr. carlin: i do think there are things you can do. you can have paper ballots; you can have an auditable trail in terms of the actual vote count. not every state has implemented that in every region. look, one of the reasons we were safer against a foreign power's attempts to change our vote count is that we are incredibly inefficient -- we have 70,000 different voting systems. and so not all of them have put in a paper trail. but you can, and that would be auditable. it physically exists. you can go back and check it. registration rolls will be harder, but you could devise a solution. again, it's based on the idea
8:52 pm
that the digital data, if it's connected to the internet, is not going to be our solution. so we need something in addition. mr. brownstein: back to the future. it's back to the future. mr. carlin: and that's been true. you look at the russian attacks on the ukrainian electric grid. these were real attacks by russian actors that knocked out the power grid -- one of the nightmare scenarios we've talked about for a long time in terms of our infrastructure inside the united states. they did it there, but the ukrainians were able to get back up and running. why? the grid was 30 years out of date, and they still had people who knew how to operate the systems manually. we are rapidly moving, on purpose, towards an arena where nobody is going to know how to manually fix things anymore -- even your car -- because it's basically a computer. so maybe with some of these systems, knowing the risk, you build in security by design so that they can be operated manually or have an audit in place. ms. angwin: i mean, we do need a jobs program in this country also. so it seems like a nice mesh.
8:53 pm
mr. manjoo: we should end by talking about the role for law enforcement here. one of the things we've learned since the election is that, you know, various parts of the obama administration knew, to varying degrees, that the russians were attacking us in various ways. they did something, but not, according to a lot of people, enough. what could they have done? is there any sort of way they could have stopped this or warned people? what should we do next time this happens? mr. carlin: well, look, what was done was too little too late, because it didn't have the desired effect. the desired effect would be that the russians would feel their attempts to interfere in our democratic institutions were a failure and they wouldn't do it again. that's not what happened. so you can judge it: it didn't work. looking ahead, one thing is that we're moving towards being more transparent, which is relatively new in cyberspace. for years i knew in government that china was inside our systems stealing billions of dollars' worth of intellectual property, and that was classified.
8:54 pm
it wasn't until 2012, i think, that as a government official i could talk publicly about it, and it wasn't until 2014 that i oversaw the first case, where we brought charges against members of the people's liberation army and just laid out exactly what they were doing. it was very controversial at the time. so this approach is new. with russia, we did figure out who did it. we did make it public prior to the election, on october 7th. we did not do the deterrent action until december 29th. if that had happened pre-election -- you guys are more the experts on how it would have been covered -- but i think russia would have reacted more strongly; it would have been more of a substantial brushback than doing it after the results. mr. brownstein: the thing i wonder about is, certainly russia will look at 2016 and say this worked well enough, they're going to try it again. but i wonder, if it worked well enough for russia, why won't domestic actors try it too? and how will we react? will we accept that, you know, politics has always been information warfare, that this is okay, that this is acceptable for a group of
8:55 pm
democrats to try to hack the trump campaign and leak everything, or for, you know, trump to hack democrats and leak everything? will we accept that this is just a new front in an ongoing war, or will this somehow be disqualifying and delegitimizing? i'd like to think the latter, but i don't think so. i think that, you know, people view the stakes as so high that the kind of by-any-means-necessary view is taking over in big parts of the electorate. and, you know, i just think of how many comments i got after talking on cnn about the russia leak that were like, well, who cares who provided it. who cares who put this out. what matters is what's in the e-mails. and i do think that the issue is not only whether this inspires russia to do it again, but what this inspires here, in terms of what will happen in the next election. ms. angwin: well, we just don't have enough hacking prosecutions to begin with. i mean, they're really hard to do. and they're particularly hard to do with russians, right --
8:56 pm
it's hard to extradite them, et cetera. but the truth is, it is illegal for one of these campaigns to hack the other one, and they should not get away with it -- they should be prosecuted for it. mr. brownstein: it wouldn't even be the campaign. i think it would be something, you know, one degree of separation away. ms. angwin: well, yeah, they will find a third party. but that person is also committing a crime, right? so, like, we are still a nation of laws. i'm not saying information warfare at all costs. i just think that, you know, there will be crimes committed, and then they should be prosecuted. but separately, i really just lean heavily on transparency and auditing. they're super boring and not exciting buzzwords, but they have served us well for a long time. and i do believe if we could build that into these new systems, then we can have all the good stuff and, you know, have our cake and eat it too. mr. manjoo: and you mean specifically, like, in the voting machines, or where else do you mean? ms. angwin: well, i do think in political discourse. like, we have never had a system where you can't see political ads after they've run, right? i can pull up a tv ad at any station -- they have to log who bought it, how much they paid, et cetera, at the local new hampshire outlet. i can't see a facebook ad now, right? so i mean, there are gaps in our system that we haven't accounted for, right? there's not a good fec spending
8:57 pm
report for what was spent on facebook. we don't know about the dark posts on facebook. there are a lot of things that could be opened up. but, like, we have the means to do that. we have regulated those things before. mr. carlin: in case we don't solve these issues today, i'm just going to say the aspen institute is starting a new program on cyber security and technology. we are having a conference october 4th, and we're going to work on trying to not just talk about but develop some solutions that will go further, whether it's auditing or, i think, the solution of the press behaving differently -- ms. angwin: that's not going to -- mr. brownstein: good luck with that. good luck with that one. mr. manjoo: all right. thank you all. mr. brownstein: yeah. thank you. [applause] mr. manjoo: okay. we have one more part of this. so in the third part we're going to be talking about the other thing that was mentioned earlier, which is filter bubbles
8:58 pm
and echo chambers. in 2008, i wrote a book called "true enough: learning to live in a post-fact society," in which i predicted that, because of the fragmentation of the internet and because of the way we were sort of filtering our media diets, we would get to a point where most of what we learned was from partisan sources and the truth would come to matter less. so i think i was right. but to talk about this more, we have an all-star panel. lee rainie is the director of internet, science and technology research at the pew research center, where he's worked since 1999. he also directs the center's new initiative on the intersection of science and society. he's the co-author of "networked: the new social operating system" and of five books about the future of the internet. ory rinat is a deputy assistant secretary at the state department focusing on public diplomacy and global digital communications, and he's the interim white house chief digital officer. before joining government, he oversaw digital strategy at the heritage foundation and its news site, the daily signal.
8:59 pm
he was previously a director at atlantic media strategies, the digital consultancy of the atlantic. and rebecca mackinnon directs the ranking digital rights project at new america. she co-founded the citizen media network global voices and co-authored "consent of the networked: the worldwide struggle for internet freedom." she previously taught at the university of hong kong and the university of pennsylvania law school and has held fellowships at harvard's shorenstein and berkman centers, the open society foundations and the princeton center for information technology policy. she was cnn's bureau chief and correspondent in china and japan between 1998 and 2004, and she serves on the board of the committee to protect journalists. so i thought we'd start with lee because he has the data. let's talk about the state of partisanship and polarization in the united states. is it worse than ever as we
9:00 pm
constantly hear, and what role did the media and, sort of, our new media landscape play in that partisanship and polarization? mr. rainie: the simple answer is, it's the most intense that it's been in the time that it's been measured, but it hasn't been measured for very long. but there are a couple of special dimensions to it now that have not shown up in the data in years gone by. the first thing that's happened over the past 30 years is there's been a purification of our political parties: 92% of democrats are now more liberal than the median republican, and 94% of republicans are more conservative than the median democrat. that's new in our culture -- we used to have liberal republicans and conservative democrats. now that's hardly the case, and there has been a rise at the extreme ends of the spectrum, a
9:01 pm
doubling since 1994 of the people who are absolutely consistently liberal or absolutely consistently conservative. they are the most active, engaged actors in our system, and that's driving at least part of the politics. the second part of it is that it's become intensified -- people have different and harsher views about the parties themselves. democrats view the republican party less favorably than at any point in the history of the data, republicans feel the same way about democrats, and that's risen twofold in the past generation. the third dimension of it is that it becomes personalized. there are ways now that people treat each other as people based on their partisan labels. and so now we're picking up for the first time in our data that democrats and republicans say about each other that they are not just misguided or wrong in their ideas -- we are literally afraid of the other team. and the other team doesn't just have sort of subpar ideas; their ideas will actually do great harm to the republic.
9:02 pm
and the way that this cascades through the system probably coincides with the rise of the balkanized media culture, and it's hard to impute the arrow of causality in all of that, but all this stuff cascades through the culture in really important ways. so you get a formula for a culture that doesn't like itself. so now -- nate was kind enough to mention this -- 64% of americans think fake news is so bad that it's hard for americans to get good basic facts about current events. 81% of americans feel that the parties themselves and the supporters of the opposite candidate can't agree on the
9:03 pm
basic facts. 81% say they can't even agree on the foundation of what's going on, and 60% subscribe to the notion that there's such a fog of information out there that they find it hard to get to the truth. and just to make it really sort of personal, all of this stuff comes down to the very intimate level, because 26% of americans say they've had false information about them posted online. so this is not just a political and ideological phenomenon; it's a phenomenon that is now taking place at the personal level. mr. manjoo: one question about that -- and i will open it out to everyone now. when i talk about the way that the internet has sort of fragmented politics, people always say that's how it was in the earliest days of the united states -- that originally, before the mass media society, we were more polarized as a society. what's different now, in both the media and how we respond to the media? i understand you didn't do polls back then, but. mr. rainie: well, the simplest thing to say about it is that it's way more visible now than it ever has been before. so whatever it was then, people can see more of it now. they can see more stupidity.
9:04 pm
they can see more ignorance. they can see more fake news. they can see more disputes, and these disputes can go on for longer and longer because there's no incentive in the system to shut them down -- why not keep fighting over things? so they have a sense that the culture is constantly at war with itself and constantly divided among disputing parties that just cannot be reconciled. mr. manjoo: what do you guys think about that -- any thoughts about either of those? ms. mackinnon: well, i mean, just to pick up on many of the threads from the previous sub-panel, we do have a problem in which society's immune system is very weak or very susceptible, and there's this negative feedback loop that's taking place. so, you know, the question of how do we fix this problem -- you can't just fix the internet, and you can't really separate fixing the problem on the internet from fixing the problem offline. it is sort of like you need to address the root political, geopolitical,
9:05 pm
economic, social issues that are causing our society to be so distrustful, that are then feeding into the internet, that are then perhaps being exacerbated by various technology and design choices that companies have made, by the lack of transparency in a number of places, et cetera. but it all kind of mixes together in a negative feedback loop that then makes it harder to reform politics, harder to deal with gerrymandering, harder to deal with campaign finance, harder to deal with economic issues, because it's all feeding on one another, because people become more and more polarized and you can't get consensus around any policy. and the solution is not going to be easy, but it is going to require, you know, the people who have control over technologies -- whether they're regulators, whether they're companies -- to really think, okay,
9:06 pm
how do we want this to be designed and managed in a way that's going to contribute to the kind of society we want to live in. and we have another session coming after this that's going to deal with that question more in depth, but there are no simple fixes. in germany, as mentioned earlier, there have been laws proposed to deal with fake news. and the problem is that, technically, when companies are trying to stamp out speech that is unwanted, a lot of activists and journalists get censored accidentally as well. and the other issue is, any solution or any approach you take to addressing these problems, because of the unique nature of the internet, has a kind of global impact. it's not just going to affect
9:07 pm
americans. it's going to affect people in syria fighting assad and isis, right? so if you say, okay, we need to get rid of anonymity, we need to weaken encryption so that we can catch the bad guys here at home, so that we can better identify the bad actors in our systems -- well, there are some people over in syria who are just toast, right? and also investigative journalists in a lot of countries who are trying to bring the truth to light. they're under attack not only by extremists and groups that do not believe in global values, but also, in many cases, by their own governments. and all of this then feeds back into our own environment -- it makes putin stronger, it makes lots of situations harder, it makes young people in many cultures more susceptible to terrorist messages because there are no strong alternative views. so it's a very difficult problem
9:08 pm
to deal with, very intertwined. mr. manjoo: thanks for the high note. ms. mackinnon: but you know, i mean, this is why, in other places and in other times and in the long run, if you don't double down on protecting civil liberties, human rights and freedom of expression globally, it's going to get harder to -- mr. manjoo: but the internet does do that. ms. mackinnon: we're going to be even weaker in our ability to address the problems. mr. rinat: i think we're being way too negative about the internet, and i think it does a lot of those things. when you think about it, the underlying theme of this, the major problem that is spoken or not so spoken, is that we're talking about how certain platforms seem to provide people with information that reaffirms what they already think. but it's not like facebook said, hey, you're conservative, i think you're conservative, i'm going to keep showing you conservative content. they said, i'm going to show you things from the people you know.
9:09 pm
and i'm going to show you content from the pages you like. and when you start clicking on those things, i'm going to figure out whose content you seem to like and keep showing you more of that. and had facebook not done that, we wouldn't be having this conversation, because they wouldn't have grown to the scale they've grown to today. they did that by giving users the things they naturally want -- content from people they know, conversations with people they know -- and as nate explained earlier, that's not all that different from offline conversation. so we have facebook, but the internet is not just facebook, and facebook traffic to publishers is dropping, and that's why publishers aren't making money the way they used to a couple of years ago. and when you look at other things on the internet, there are factors that really do build on democracy. wikipedia -- no, it's not the best source in an academic paper, but wikipedia is effectively crowd-sourced, and it is a place where you do see people on both sides working on a particular article, coming to something like a middle ground, something fairly balanced. it's not all doom and gloom.
9:10 pm
mr. manjoo: i think you might be right that networked organizations like wikipedia may be sort of one way out of this. mr. rainie: yeah, the big story that probably underlies the problems we're talking about here is the collapse of trust in major industrial-age institutions, and it's across the board except for the u.s. military at the national scale -- it involves the church, it involves the media, it involves businesses, it involves public schools -- and what has come to take their place in many people's minds is their own personal networks. so when people encounter information that they don't really know about, they will consult their buddies to figure out whether it's true or false, or whether to give it a lot of weight or a little weight as they're trying to assess their lives. networked organizations and networked personal structures are the coin of the realm in this age. so wikipedia is a perfect example of it: open-source communities have worked out amazing protocols now where people can fact-check each other and give each other positive
9:11 pm
feedback loops and come together to solve collective problems and things like that. and so there are network solutions to a lot of this stuff. it doesn't write you a new healthcare law and it doesn't necessarily fix the tax code, but there are sort of supporting civil structures emerging from the networked world that are pretty promising. ms. mackinnon: if i could just jump in -- i'm really glad you raised wikipedia. that community has emerged kind of outside of the commercial need to make money from advertising and that kind of thing, and it really revolves around what is in the interest of the community, and the community developed its own governance structures that work quite well, even though lots of people don't agree at all. and i guess this is one of the problems -- i do wish that facebook would perhaps learn a bit more from wikipedia about -- mr. manjoo: they are, i mean, they really are.
9:12 pm
ms. mackinnon: -- how to manage content, which is good if they are, because i think we need to bring community interests into this a lot more, rather than just what's going to bring us the most clicks, what's going to bring us the most advertising dollars. and not to harp too much on facebook -- i agree with julia and others that the traditional news media has been somewhat to blame in this, in that they've been kind of using social media as a way to drive traffic and reacting to it in a way that doesn't always serve the public interest either, trying to maximize their own ability to go viral online rather than the stories the public really needs to know about. mr. manjoo: and facebook is learning from wikipedia, and when you look at their steps over the past, i don't know, year or so
9:13 pm
especially their war on clickbait and viralnova and all those sites with horrendous headlines -- they've attacked those through crowd-sourcing, and we saw an example earlier of letting users report particular kinds of content, then looking at when there's a critical mass of those reports, investigating those situations and using those signals from the audience to figure out how to make tweaks and how to push back against the sources of information that are getting reported in particular ways. but i guess, what's the alternative? do we want facebook saying, hey, you're conservative, all your friends are conservative, or you're liberal, all your friends are liberal -- here's 10% of your content from the other side? how would they do that? they'd have to start scoring content or scoring people on a left-to-right scale and making that subjective determination. i don't want facebook making those choices, and when you think about it, all of us are making those choices. the internet has been good for democracy and does allow people to find other viewpoints if they want that. if you want other viewpoints, you will find them -- you'll know people who have other viewpoints
9:14 pm
and you'll be able to see them, or you can find them on google or you can find them in other places. mr. rinat: as to your specific suggestion of facebook showing you other points of view -- i mean, they've tried that, and they found that people just don't click on them. maybe the question we should ask is, why is it facebook's place, why are we saying it's facebook's place, to give us a diverse source of media? facebook does not call itself a media company -- they've sort of deliberately avoided that term. they don't think of themselves the way the new york times thinks of itself; they think of themselves as a way to connect friends and family. so why should it be their place to? it shouldn't be. ms. mackinnon: i don't think anybody up here has been saying it should be, though. i think they need to be more transparent about what's happening, and they're starting to do that. they have been taking steps to involve their community more in how we go from here, but ultimately it has to do with the fact that users -- the community out there, human beings -- don't trust. the lack of trust within
9:15 pm
society, the lack of trust within politics -- you're not going to deal with that online alone, so we have to address it much more fundamentally through policy. mr. rinat: well, it's also that what drives social engagement, what drives internet engagement, is shares, and that's not a social media thing -- that goes back to forwarding chain emails. it's when people share that's the source of engagement, and what drives people to share is anger, it's sadness, it's inspiration. that's really where it happens. but it's rare that somebody says, "wow, i just read an objective, fascinating piece that represents both sides, let me share it on facebook." that's not what people share. and so what happens is we've incentivized, as a society, sensationalism in journalism. to give an example: during the transition, there was an article in a publication that should not be named, and it said something along the lines of
9:16 pm
"trump transition website lifts passages from nonprofit group." okay, doesn't sound that great. a couple of paragraphs in, they mention that the website actually sourced and cited the nonprofit. a couple of paragraphs later, they quote the ceo of the nonprofit saying it was okay. a couple of paragraphs after that, they quote a lawyer saying even if it wasn't okay, even if they didn't have permission and even if they didn't cite it, it was probably still legal. but that headline was so sensationalized, and people want to click on something that makes them angry. so everybody just needs to take a breath, and it's not the internet's fault. mr. manjoo: well, it's the internet ad model's fault, right? it's the fact that those sites -- facebook, every news site you can think of -- are getting paid based on clicks. so is the fundamental fix here some other business model for online news and everything else?
9:17 pm
mr. rinat: sure, i just can't think of one. mr. manjoo: right. ms. mackinnon: and it's not a business model. mr. manjoo: what do you mean? ms. mackinnon: this is -- this is the challenge. i mean, you know, we don't have a society in which people are going to allow their tax dollars to be substantially used to support media that is not commercially driven; that's not where the society is going. there are pros and cons to public media in other countries, to state media and so on. so it is a problem, but certainly ad-driven media is part of it. i don't think we know exactly what the solution is, other than that there need to be more experiments, there needs to be more community media, citizen media, you know -- mr. rinat: subscriber-driven media. ms. mackinnon: subscriber-driven media. mr. manjoo: i don't want to be a shill for my employer, but this is why we -- like, the new york
9:18 pm
times' sort-of-new business model is to get subscriptions. and people are subscribing -- does that give any of you some measure of optimism, not about the new york times but about people paying for stuff? mr. rinat: yeah, and that's not the only one. you look at quartz, an atlantic media publication -- when they launched, they were selling ads based on share of voice. they were getting advertisers to say, okay, i know that i don't know exactly how many views i'm getting for this, but i want to be aligned with your brand. and that's kind of the driver behind native advertising. and we were talking earlier in the last panel about the google ad networks and so on. i don't want to stress too much about display ads and how they're targeted, because nobody clicks on display ads, and it doesn't matter if they're on the new york times or served up on a google ad network -- it's the format that's broken. and so why is this stuff working, why are we seeing people chase this? it's because it's what engages users. facebook is successful because it delivers a good user experience. it delivers content that people want to read. we can do that as consumers, as media organizations, but it can't be by serving up these archaic display ads.
9:19 pm
mr. rainie: can i refine the notion of what an echo chamber is? because i think the chain of circumstances that people have in their heads when they're describing that phenomenon is that people literally lock themselves in an information bubble, refuse to engage alternative points of view and only build personal networks around like-minded folks. in fact, the more interesting thing that we see just across the board in our data is that the most highly engaged, the most intense partisans, the people who are driving what's going on in this country, actually know a lot about the points of view opposing theirs, and they are very engaged with others who have those views, in part because it's an argumentative culture. so it's not reinforcing your views -- you're building your repertoire of facts and information so that you can argue and reason well with the people you know you're going to disagree with. and there's a whole host of people -- by some measures it might be more than
9:20 pm
half of social media users -- who are living way apart from this phenomenon. they post no political content, and little or nothing that comes into their feed relates to political content. so the divider here, or the separation here, might not be between the echo chambers and the engaged folks who engage multiple points of view; it's between echo chambers and something approximating empty chambers. this is just not part of their life. they don't care for it. mr. manjoo: yeah, as to your first point, i mean, i rarely watch cable news, but i know everything that sean hannity said last night, and everyone on cnn and everyone else, because it's all on twitter -- and my twitter feed is journalists. so it's partisans on each side. so is that happening -- i mean, is that what you mean by people getting sort of the opposing side? they're looking at sean hannity videos -- liberals are looking at sean hannity videos to make fun of them?
9:21 pm
mr. rainie: that's a lot of their motivation. the other thing that's going on, that has gotten much less attention than it probably deserves, is a longstanding communications phenomenon called the spiral of silence. it was enshrined by research coming out of nazi germany, where researchers went to people who had not supported nazism but who clearly had not acted against it, and they said, "why did you do it, why did you let them get away with it?" and they were so afraid of being at odds with their social circles, of having a minority point of view, that they did not raise their voices for fear of what their neighbors would do to them. now, you might think the internet has broken that, because it enables lots of people in lots of ways to speak out, and to speak out anonymously, if they think they hold a minority point of view. but the fact of the matter is, the spiral of silence is perfectly replicated online. people, if they think their facebook or twitter followers will agree with them, are much more likely to talk about something than not, and in fact, if they think their facebook and twitter followers disagree with them, they won't even bring it up in a person-to-person context.
9:22 pm
they think the world is a little bit arrayed against them, and they're scared of saying something they worry might lose them a friend or start an argument. mr. rinat: when you're talking about echo chambers and bubbles, there's also the information bubble -- you're talking about how you're getting the full spectrum of cable news filtered to you on twitter and you're seeing everything. how many people here are from washington or new york? every time i go outside of dc and i'm in another city and i say i'm from dc, the first question i get is always, is it like "house of cards"? and that's funny, but at the same time there's that information gap. but on the other hand, you look at some of the things that have kind of cut through the clutter recently. "s-town," the podcast, was actually a great example of this -- it was a way to show people what a different part of america lives like, feels like, experiences. and it did tremendously well, and i doubt many people listening to the podcast have the same
9:23 pm
political views as the people it was about. mr. manjoo: another optimistic note: i think it's entirely possible that facebook and twitter won't be the predominant way that we get news in 10 years or 15 years or even 5 -- mr. rinat: or 2. mr. manjoo: you know, one thing i have been using a lot recently is snapchat. on snapchat there's no virality -- you can't really share stuff to your friends. there's news on it, but it's curated by snapchat and some of it's made by snapchat. and the way that they cover, like, some natural disaster, for example, is everyone who uses snapchat will send in videos of their experience of flooding or something, and you'll see many, many different people's perspectives in a short two- or three-minute video, which is so much more informative than sort of a cable news clip of that same thing.
9:24 pm
well, first of all, it's hard to find coverage of, like, non-politics on cable news anymore, but even if you did, it would be difficult to have this wide range of views. and it could be -- i mean, i don't know if snapchat will succeed or not, but facebook might copy all those features, and then you might find we have those kinds of things in facebook also. mr. rinat: okay, yes, i agree. mr. rainie: the other thing that's been so interesting to watch, which we haven't quite documented because it's hard to ask people about, is that i think it's broadened people's sense of what is news. so all of that stuff that's happening in a cluster of your friends network is news to you that wouldn't rise to the level of something even maybe a local news outlet would cover, but you're better off for knowing it, and you're more engaged with your buddies for knowing it. and this intimacy -- the core affordance of the internet is absolutely that kind of
9:25 pm
intimacy -- is maybe the wave of the future. mr. rinat: i'm excited by all the opportunities in the space. if you think about it, facebook found an unmet need and filled it, and then snapchat said, "what can we do that's different and better and serves a different need?" and somebody is going to disrupt snapchat -- and, you know, nobody in the 14-to-20-year-old range is using facebook right now; we've all seen the pew studies that show that. so what's next? what's going to disrupt snapchat? what's going to disrupt the ad models we're talking about, the content models we're talking about? it's not that it's cyclical; it is a constant evolution, and somebody is going to crack the code, and facebook's going to either buy them, learn from them or evolve. ms. mackinnon: i think you're right, and you know, i started out kind of on the negative side, but i work with a lot of communities that would be completely unheard if it weren't for the internet.
9:26 pm
but i do think that new technologies and innovations aren't automagically going to take us to a more democratic, more rights-respecting and more just world without some mindfulness on the part of the people building these things. and i do think there need to be principles around transparency, around accountability, around really being responsible. mr. manjoo: you mean for technologies? ms. mackinnon: for technologies and also for governments -- for everybody who has an impact on how these technologies are shaped -- to really be thinking about, okay, you know, can this be abused? and if abuse of power, of information power, if abuse of the ability to conduct information warfare or to enable it takes place, can we even know who's abusing their power? can we hold them accountable and responsible? and if companies abuse their power or governments abuse
9:27 pm
their power via networks, we need to be able to see it -- it needs to be auditable, it needs to be held accountable -- and you can design for that, or you can design for more opacity in ways that can empower incumbents. and so i agree with your optimism in the long run, but i do think people need to be mindful about how they're designing these things. mr. rinat: in principle and in theory, and not to end on a point of disagreement, that sounds great, but who's going to enforce that? what top-down mechanism is going to make that viable, and do the abuses or the issues or the challenges move up to that oversight level? and i guess, i don't know -- you guys know where i work, i only trust the market to get this right. ms. mackinnon: well, the enforcement doesn't have to be
9:28 pm
top-down, right? you can have consumers and investors. mr. manjoo: i mean, it could be like wikipedia, right? ms. mackinnon: yeah, or, you know, i mean, there are a lot of -- you know, i run a project where we've started to rank companies on their policies and practices affecting freedom of expression and privacy. and we're working with consumer reports, and the hope, the idea, is that once you have a more informed public, once you're making it clearer precisely what the different companies are doing -- informing us a bit more about exactly what they're doing with our data and how they're manipulating what we can and cannot see -- there will be more pressure to do well. so i think -- you know, i'm not saying it's a top-down sort of government solution necessarily. i think there are a lot of mechanisms around auditing and accountability, and companies have responded to pressure that didn't come from government to issue what are called transparency reports, which give us more information about how they're responding to government censorship and surveillance