Book TV: Noam Cohen, "The Know-It-Alls" -- CSPAN, January 15, 2018, 8:40am-10:31am EST
8:40 am
>> you can watch this and other programs online at booktv.org. >> hello. we're going to get started here tonight. my name is seth mnookin. i'm the director of the communications forum, and i have a couple of quick announcements before we start. first, communications forums are held three times a semester, six times a year. if you'd like to be informed of future events, there is a sign-up sheet over there. put down your name and e-mail, and we promise we'll only send you news about our six events a year. we have pretty good ones. we had sarah and john last
8:41 am
semester, and we have some great stuff planned for next semester already. also, tonight's forum is being filmed by c-span, so during the question part of the forum, please go up to one of the microphones and, hopefully, state your name and your question. another reason we ask you to use the microphones is that we do a write-up of all of the forums afterwards, which you will be able to read a couple of days after the event on our website, the comm forum site. the last announcement is that this event tonight is cosponsored by radius, which is another group at mit. i am thrilled to be able to introduce these three. it is a different story than we initially thought it would be,
8:42 am
because jeff called me up a little earlier because his daughter was puking, and as a father of two young kids myself, i said, please stay home. so, fortunately, christina couch, who writes a lot about technology, works with the forum, and is a bright journalist in her own right, has agreed to fill in as moderator. but let me introduce everyone. noam cohen, author of the new book the know-it-alls -- that will also be a hashtag tonight, i think, moving forward. >> got to get on that. >> we worked together a decade and a half ago, and i've known him ever since. he is a great guy and a brilliant journalist. he covered the influence of the internet on the larger culture for the "new york times," where he wrote the link by link column beginning in 2007.
8:43 am
his first book, the know-it-alls: the rise of silicon valley as a political powerhouse and social wrecking ball, is an intellectual history of silicon valley that critically examines how its destructive culture and ideology erode civility, empathy and even democracy. it was published in october 2017, and it is available for purchase right here. in addition to supporting open discussion, we also support both bookstores and authors, so please, by all means, buy the book. it's a great book; we have both read and loved it. to his left is sara watson, a technology critic who writes and speaks about emerging issues at the intersection of technology, culture and society. her work has appeared in the atlantic, wired, the "washington post," slate and motherboard. she is a fellow with the berkman klein center for internet and society at harvard
8:44 am
university, and she is the author of a report on technology criticism for the tow center for digital journalism. and then to my left is christina couch. she is a science journalist i've also had the pleasure of working with for several years now. she's also the coordinator for the communications forum. her own work covers the intersections of technology and psychology, and her bylines have appeared in mit technology review, fast company, co.exist, science friday and "wired" magazine. for your convenience, we've also put all of their twitter handles on the board. and without further ado, i will turn it over to chris. actually, with further ado -- sorry about that.
8:45 am
in addition to noam's book, the book that jeff cowrote with joi ito, the head of the mit media lab, called whiplash, is also a great book. both of those are available immediately afterwards, and noam will be signing. >> thank you. thank you guys all for coming; we're so excited about this panel. i feel like it addresses a lot of important issues, and i would encourage you to buy the book -- it's a great book. so first of all, i want to kick off this panel by talking about the central argument of the book, and correct me if i'm wrong: it is really that the disruption and individualism endemic to silicon valley have, in a lot of ways, eroded our humanity. is that fair to say? >> i was thinking about that question. the premise of this
8:46 am
get-together is, like, have they lost their humanity? the quick answer is that they have. the deeper answer is that every person has humanity. what we're talking about -- and i approach some of this through the computer science aspect of the book, which goes through a lot of the history of computer science -- is that seeing people as machines and machines as people is one of the crucial mistakes, or worldviews, that i think is scary. it means denying the humanity of your fellow people when you think of them so individualistically, as little data points. i think about, in the introduction, a well-known anecdote about google's first design director, who was asked to create a design for a gmail window. he suggested a color. instead of using the one he wanted, they went and tested 41 different shades of blue. the one people clicked most was the
8:47 am
one they used. he resigned over these issues, saying that to be design director at a place like google is like an oxymoron. it's the opposite of a human vision of what they're doing: they were always going to test it, and they don't apologize, because basically they say the most popular shade of blue led to $200 million in additional revenue. i think that's the breakdown -- seeing people as data points. they're not apologetic about it. >> can you speak to what those outcomes are? for people not familiar with the intricacies of silicon valley, can you tell me how this plays out? >> what i was trying to argue is that they're taking this ideology called libertarianism and making it seem very normal and mainstream. what do i mean by libertarianism? regulation: the idea that we shouldn't regulate taxis or
8:48 am
hotels -- all these different companies that we supposedly shouldn't regulate -- or what children see on video, on tv, or who can pay for political ads. should you have to live in america? should they have to declare what they're doing? that's one part of that ideology: this distaste for regulation and this distrust of the government, which is really the backbone of our society. that's one part. the extreme idea of free speech is another one. i know all these issues are complicated; we can talk about whether i'm coming on too strong. i wrote a piece on newyorker.com this week about an issue on a bulletin board at stanford back in the '80s, about the limits on free speech. there was a joke group that told racist and sexist jokes. stanford tried to limit it, and there was such severe pushback from the computer science department -- the kind of people i talk about, who are running silicon valley -- that it was reversed.
8:49 am
to me, having limits on free speech is vital to having a community that is livable. that's another dangerous aspect of that ideology. i think some of it is done in good faith, but i think it's having horrible consequences. i wrote this book before the 2016 election -- i was working on it, thinking about it -- but what happened in that election bears out these points. the fact that the big companies like google, facebook and twitter are so blasé about the idea that a foreign country could try to influence our election, or that there should be disclosure of who is paying for advertising, or about whether these powerful tools for targeting people should be usable by anybody to stir up anger and resentment -- to me it shows this disconnect: they are not seeing themselves as custodians of the power they have, and are instead exploiting it for profit or a
8:50 am
utopian vision that supersedes other concerns. those are the effects. i would add one thing. people ask me, when i talk about the book, what prompted me to write it, because clearly it wasn't the 2016 election. i think about what the turning point was -- and you can call me a hypocrite, because i use gmail. i remember learning that gmail would have a computer read your e-mail in order to place ads. my mom had actually passed away from cancer around that time, and i remember thinking how i would never mention the word cancer in an e-mail, because i just didn't want: hey, have you thought of radiation treatment? i didn't want that. that notion -- that someone who should be a custodian of your information, keeping it for you, would instead claim the right to commercialize it -- was, i think, the crystallizing moment for me when i think back on it. >> sara, a lot of your work is in technology criticism -- not just the technology but the culture surrounding it. i'm wondering, from your
8:51 am
perspective, how do you feel about the premise, i guess, of the culture of the technology world having an effect on humanity and empathy and civility? >> i absolutely agree with the overall premise. a lot of people have started to unpack the implications of the way that technology is built, but also the kinds of assumptions and ideologies that are acted out in the technology itself. obviously, if you look at the individuals who are leading these companies and coming up with these designs, their assumptions and ideologies really do matter. the biggest thing for me is that i like to think about this in terms of optimization. i think most silicon valley leaders and companies are designed around questions of optimization -- whether it's the design itself,
8:52 am
whether it's getting you access to information, connecting people as efficiently as possible, or connecting you to all of the world's material goods. those are questions of efficiency, and of optimizing for profit, right? those are kind of taken for granted as the right terms of optimization. trying to unpack those assumptions is a really productive starting point: okay, what if spending more time on facebook weren't the optimization model? what if it were a quality experience on facebook? what would that look like? how would that change the experience or the design of the platform, but also, what would that change about facebook's role in our lives? that's the crux of a lot of the questions i continue to ask about technology and society. i think the trick is that using the
8:53 am
terminology of the industry -- optimization, and that way of thinking about problems and problem-solving -- is a productive way of sharing language and getting at the fact that we haven't necessarily agreed to the terms of optimization. they're coming at it from a market perspective, and that's a natural way for things to evolve. but we as a society can question whether those are terms we agree to or not. >> one of the things i thought was really well done in the book: it addresses how many of the issues we associate with silicon valley now, and with the larger technology world -- issues of privacy, issues of commercialization, issues of users being assessed in ways they may not agree to. some of the companies that are major giants currently -- google being the one that sticks out --
8:54 am
really started with an ethos that was entirely against all of those things. can you speak to how we got here? >> i want to pick up on what sarah mentioned, which was spot on. i was reading the report she wrote in which she classifies critics. i can see myself in it, and i think what she's talking about is practical ways of trying to get to a better place. in this book i was looking at the history, and also kind of asking bigger-picture questions. think of the efficiency argument. it's not in the book, but there is this instruction in the bible that when you're harvesting, you should make just one pass. you shouldn't go back a second time and efficiently get every little fruit and kernel you missed, because that's part of the ecosystem for people who are traveling, or for poor people who live off it.
8:55 am
again, the efficient view would be: i have a farm, i need to get all the product, that's what i do, i'm a farmer. or you could say you're part of a society, and efficiency is to leave those scraps there for other people, because they use them. it's thinking about the bigger picture, the world we are trying to create. an editor sent me a tweet the other day pointing out that mark zuckerberg was saying how much he cared about russian meddling in the election -- that the company was going to spend all this money to hire people, and that's why he mentioned it in the investor call: they were prepared to lose money over it. the natural response is to flip that around: basically, you're making money from the current bad situation. that's the point. again, there are efficiencies in the way they're set up that are really troubling. is that what you're asking about, the history? i don't know these answers directly, but how did we get here -- that was a question i tried to ask.
8:56 am
you'll see in the book that the computer science culture accounts for some of the hacker mentality, and accounts for some of these extreme ideological ideas about free speech, and a kind of lack of diversity and hostility to outsiders. i credit, or blame, stanford for a lot of the profit-seeking. for me the stanford case, the google case, was really revealing. go back and read the original papers they wrote when they were developing the google search engine -- which was an incredible invention. everybody agrees they were standing on the shoulders of others, but they really created something that took this chaotic thing called the early web and made it coherent. it was incredible, and it's the reason it became so popular. they would also explain, as they described it, why it needed to be advertising-free and why it needed to stay in the actual academic world.
8:57 am
it needed to be a place that was transparent. there shouldn't be these black boxes -- and now, of course, we've come to accept the idea that the google algorithm and the facebook algorithm are these secret things, so you don't know what they're doing. they're constantly tinkering with them in a mysterious way. they originally argued for transparency for the sake of science -- it's better than trusting a system where there's no scrutiny. at least the way i see the story going is that basically they were serious academics; their parents were academics. but they ended up using so much bandwidth at stanford that they were told, you have to figure out how to pay for this. to ask the specific question: could stanford have said, this is a great invention, we'll pay for it? we paid for a nuclear reactor in a building; this is very important for our study of our society and science. but instead they were told, you have to find a way to do
8:58 am
this. they were connected through the stanford network with an investor almost immediately, before they were even incorporated. a story that is told in these books is how this investor, a stanford graduate, wrote a check to google inc. when there was no google inc. -- there would be. a month later there was a google inc., and they deposited that $100,000 check, and the rest is history. maybe it's a little corny to call it corruption; i guess you'd call it selling out. for them, and for facebook as well, there really was some idealism. they wanted to work on all parts of the computing world and were not necessarily trying to become billionaires. i think in the book you see there are other characters -- like peter thiel and jeff bezos, who were the bankers -- and what they were trying to do was figure out a way to make money. these idealistic hackers were kind of led astray; that's my view, anyway. >> sarah, you've written extensively about how coverage of the technology
8:59 am
world has changed over time. can you explain how the media has evolved as the tech world has evolved? >> sure. so in the research report i did for the tow center for digital journalism at columbia, one of the things i tried to do was look at coverage from that early, almost breathless excitement about the silicon valley moment -- the dot-com boom and all of that energy that went into covering it -- through the amazon era, and then later the era of google and facebook and the others. it starts from a very business-oriented coverage model, or from a tech-blogger model, and then that kind of breathless coverage moves into something a little more concerned, as technology starts to
9:00 am
intersect with a lot of things like politics and people and society -- shifting the narrative about what matters about technology, and why it is changing our lives and affecting our lives. i think that shift happens at a couple of different points. around 2007 the iphone comes out, and all of a sudden we have dramatically changed our day-to-day relationship with the computer -- one in our pocket, basically. but that was still in the gadget-excitement phase. then we have the 2013-ish moment, which is the snowden moment, and that's where everyone comes to terms with the fact that technology has both good and bad uses, right?
9:01 am
i wanted to touch on this, as this book came out alongside others. there's "world without mind," which is more about the companies controlling our access to knowledge and information; scott galloway on the monopolistic, market side of things; and tim wu's "the attention merchants," about the companies' monopoly over our information and our attention. so i think it's an interesting moment right now, in part because all of these books were written before the crisis hit, which is fascinating. the writing has been on the wall for a long time, and publishers seem to have
9:02 am
acknowledged that there's a market for these books. i like to think about this and ask: where is the audience, and who is this for? >> as a narrative, 2013 was sort of an important moment. i wrote a piece earlier than that about this german politician, a young guy who petitioned to get all of the data about the tracking that was done on him. but again -- you point out in the paper you wrote that breaking through is hard. it's weird: they're keeping that data, and these stories get attention for a lot of different reasons, and it probably takes a lot for people to see it. so i think that was probably -- >> we were talking about the problem of access, right? and i'd be curious to hear your take on this: for a journalist to have access to these companies, they have to stay on their good side to a degree, and that is especially true if
9:03 am
you're a business or tech journalist covering the story day to day. so, you know, as more kinds of journalists from different walks of life come to terms with the impact on society, the narrative starts to change, right? >> yeah, no, i think that's right -- what you're pointing out is maybe one of the benefits of leaving that gadget phase: access is less important. there was a sense in 2007 that access to the first gadget was hugely important, but we're kind of beyond the gadget phase now, and the ramifications become what's important in a well-told story. i didn't seek out a lot of access, and i knew that really wasn't what i was after. there's an incredible site called the zuckerberg files, where a professor at the university of wisconsin-milwaukee, out of concern over the
9:04 am
privacy issue -- he's done so much for us -- has systematically collected everything that mark zuckerberg has said. it has access to, usually, a videotape of him, or streaming of it, and also the text of it -- anything he's ever said since he was 19. i've read a lot of it, i think almost all of it. likewise, another of them has tweeted 100,000 times; there's more than enough about him. and peter thiel has a book, an early one, that i thought really laid out his worldview. so they all had a lot of documentation, and i felt it wasn't so vital to have an interview. when i did try to interview, it was often very stenographic and not that revealing. so i think there is an appreciation for the deeper journalism you're talking about, and that's great. i think that's what we need.
9:05 am
>> it has to be supported by the institutions, the publications, and they have to be willing to stick their necks out and put that forward. i think specifically of the amazon workplace-environment example -- the reaction from bezos to that piece was: what are you talking about? we're fine. >> just to fill everybody in, the new york times ran a pretty large piece on the inner workings of amazon, built on interviews with employees. it really spanned from very low-level workers all the way up to -- well, middle management; not all the way to the top. they detailed the breakneck work conditions of the place, and it ended up getting lots and lots of attention. >> and i felt the response bezos gave was really a classic libertarian response. bezos is obviously the owner of the washington post, and some call him a liberal,
9:06 am
and i think he embodies a lot of what i have in the book. his response is: it can't be true -- if they weren't treated right, they'd go work for another company. and there can't be gender and sex discrimination, because there would be arbitrage: some company would snap up the women programmers, and the market would correct for it. that's another theme of the book: this detachment from reality. when you ask about their intentions, there's something seductive in the idea that the internet erases all of the past -- racism and sexism. it's a new world. there's an interesting anecdote: a company was proud of the carpet it had, which simply
9:07 am
had "meritocracy" on it -- we're proud, we're a meritocracy -- and women were walking in and saying, this is offensive. what's offensive? it's the best system. by having that as their slogan, they were in essence saying: what we have now is fair, and if you aren't represented here, you didn't make it, you didn't cut it. i think they took the carpet away, but it was a re-education for them, because they believed the world had been remade -- because it's a digital world -- and that none of the legacy problems that obviously still exist matter. i also wanted to dwell on the bezos response. these workers are in seattle. for them to take another job they'd have to uproot their families
9:08 am
and jobs and everything else, and it's not as fungible as he makes it out to be. >> yeah, i mean, the question is whether it's an honest response or not. if he's a smart person and saying that, is he really not understanding it, or -- >> he got hung up on -- >> when we talk about biases in technology -- biases in the culture of technology, the underrepresentation of a number of groups of people, as well as biases in the products themselves -- whether it's computer vision systems that have a harder time detecting certain types of skin, or the article fairly recently on women having a hard time getting prosthetics that fit because they're largely designed by men. so many of the problems come from having a dominant group in power -- silicon valley is so very much dominated by white men --
9:09 am
and from your perspective, do you see these types of issues changing? >> changing over time, or right now, or -- >> well, we're talking about it now, right? there are books coming out, and there is a lot more media coverage of these types of things. is that landscape beginning to shift, or do we still have a long way to go? >> oh, i mean, i would say that for it to change, it would require -- the book is fundamentally saying that these companies are anti-democratic, that they are against democracy. how do we correct, in a society, for things like wheelchair access, and fill in every blank? ideally we have a democracy where -- it's not entirely true in practice, but ideally -- everyone gets to express their opinion, and that's how you represent people. i was struck by amy
9:10 am
klobuchar: they had facebook and google and twitter there, and she was explaining, we live in a representative democracy, and it's important that we control our elections -- you understand, that's how we do it. and that's a message they don't have. how do you ensure fairness? we have a democracy. and i remember hearing it said about the japanese internment: maybe if there had been a japanese-american representative, the internment of japanese-americans wouldn't have happened. you need to have some political way of correcting those things. i think that, fundamentally, as long as they're going to argue that they can self-regulate and that they are above the government, it won't change. i am, like, optimistic there could be some sort of wave election that would kind of present a new path, but i don't think it can be done by the companies themselves. the idea of self-regulation -- it won't work. >> self-regulation -- it's like 60%
9:11 am
of the people, just the people we're talking about -- well, this is another problem that extends from pure self-regulation: the numbers are astronomical, and it's hard for anyone to regulate themselves. if i were given free rein, i'm sure i wouldn't be quite as fair. i do fundamentally believe in democracy, and i think it's scary -- you get the sense with peter thiel that he's not an outlier. he's sometimes described as a fringe character, but he's expressing the main thought: that democracy is bad; that when you have democracy, you have not-smart people running the world. and the co-founder of paypal with peter thiel is sort of
9:12 am
saying he believes in regulation, but -- they think it's not efficient. i think it gets down to democracy, and that's important, and it's why i wrote the book. >> sara, you've written about how, specifically within technology criticism, that world is also in certain ways reflective of the tech world itself -- female voices and the voices of minority writers have been overlooked in a systemic way. can you talk a little about whether you see that end of it changing at all? >> i absolutely do see that changing, which is part of why i was looking at this larger ecosystem of people writing about these things. certainly in the last couple of years it's drastically changed, and all for the better. i think that has also put pressure on silicon valley to change. at the very least -- speaking of now versus the
9:13 am
future -- i think we've at least seen the kind of: oh yes, we will work on diversity in hiring; we'll work on thinking more about users' interests and needs. whether or not that's effective is another question. on the writing side of it, i was really interested in looking not just at the set of people who cover technology, but at the rest of the people who are also contributing to a larger discourse about the role of technology in society. some of that has to do with looking at a whole range of writers -- not just technology journalists whose beat is technology, but people who think of themselves as critics, people who are just writing an op-ed because their academic work speaks directly to, say, the current issue with russia, for example. so my interest was in trying to articulate this larger eco
9:14 am
system of people who are contributing. a lot of that has to do with women writing blog posts about terrible things happening at their workplace, or critiquing a technology that doesn't include them -- you know, fitness tracking on the iphone not having a menstruation tracker. those kinds of pieces are coming from a lot of different directions, from a lot of disciplinary backgrounds, and existing in a lot of different places, and obviously that's not limited to big publications. but i think what is frustrating is that the majority of the traditional ways and places you would look for technology coverage, for a long time -- still -- were dominated by, you know, your standard tech dude-bro. sorry. >> no, i hear you. and hearing that list of things you're mentioning -- how a woman wrote about a bad encounter with marc andreessen,
9:15 am
and look at the coverage of that -- a lot of women are writing about it. i imagine if there were a similar kind of push going on about the way this industry works, written by women journalists. >> yeah, certainly -- especially in this current moment, the last two months -- i'm hopeful. i think when you look back at what ellen pao went through with, i think it was kleiner perkins -- >> talk about that, explain -- >> so, ellen pao was at a venture capital firm and had a sexual harassment issue, and that kind of just got shoved under the rug, basically. but that, again, was probably two years ago at this
9:16 am
point. she now has a book out that's all about her follow-through on what to do about this kind of systemic sexism -- about not having support, not having people take her seriously, take her claims seriously. >> and i wonder -- i can't remember the guy's name, the google engineer. a lot of people say to me: look at google, they fired this guy. >> james damore. >> yes, james damore. you're saying they're libertarians, and they fired this libertarian guy for, really, his insidious ideas and what he put out there. i wonder how you see that, 'cause in some ways it was like a pr move. >> james damore, for those who don't know, is the author of the infamous 10-page google memo. it included a pretty large critique of google's internal
9:17 am
culture, and included some claims stating that women might be naturally inclined to be -- >> biologically less inclined toward coding than men. so it went over super well, and he ended up getting fired, and then he went on a fairly large pr push after that, saying that, you know, it was his being on the autism spectrum that led him to believe this. >> it was treated as a free speech issue -- why is any idea off limits? i remember seeing someone on twitter saying charles darwin couldn't work at google. i'm sitting here saying: the problem is that google is now a company that isn't just doing programming. that would be bad enough -- the computer lab being so non-diverse is bad -- but this is affecting our society. and even if charles darwin couldn't code, there are so many roles to be
9:18 am
played at google. it's just a weird way of defining what it means to be a tech worker. anyway -- >> and, i mean, the free speech side of it was fascinating to me, because so much of it was: i should be able to say what i think. and yes, that's true in a public forum, but this was within the company, and the company is basically saying: well, yes, you have free speech, but we can also decide to fire you. it doesn't preclude that -- you know, you don't fit in our culture anymore. which i think is fascinating: we have reached a point where google can say that that's not what their culture is, or not what they want their culture to be like. >> as it's perceived, anyway. >> sure, right. >> but, yeah, it's still indicative of this engineering mentality -- very transactional, very data-driven assumptions, or, you know, backing of an idea and saying, this is the way the world works, and all of these
9:19 am
kinds of more human interventions are not valuable in his mind. >> i think there is also -- as far as biases within tech culture and tech products -- a very valid argument that as technology advances, as we become more reliant on automation and on algorithms, these biases change shape. you might once have been able to hold someone responsible for hiring only men, or hiring whoever; now it's an algorithm that's doing it, and you're losing the person you could hold accountable. would you mind talking about the role explainability and transparency might play in some of these issues? >> sure, yeah. i mean, there's this great interest in: okay, we'll let the hr algorithm do the work, and then we get to say that it's not biased because --
9:20 am
it's not a human deciding, a-ha. of course, what are you optimizing for? it gets back to my main question, which is, okay, success-- if you're building this algorithm to say, like, what a successful person at google looks like, then you're already baking in a lot of assumptions about their backgrounds, their history, their schooling, all of these things that continue systemic, you know, injustices, in the sense that, you know, you don't have access, or you don't have the right background, or you are just not a white man who gets along in his, you know, coding cohort. so, i think there are a lot of people talking about this, especially in the kind of ai ethics space-- and shout out to the klein center and mit for working on this and algorithmic accountability-- saying that these are-- yes, they seem objective and
9:21 am
yes, they are outsourcing the decision-making process, but looking into what the terms are really does matter. and i think that's still a hard conversation. >> yeah, i mean, there was a footnote in a book that sherry wrote, and i went back and looked at one of the chapters about ai, about a guy named john mccarthy, a professor at mit who moved to stanford and came up with the term ai. he was an early computer science pioneer, but also an ai pioneer, and the whole quest was to create thinking machines-- the idea being machines thinking as people do, and the brain itself being an entity that could exist outside of a body. a lot of it was, to me, a very odd and revealing quest that the early pioneers had. a key scene in the book is a debate between john
9:22 am
mccarthy and this mit professor, joseph weizenbaum, about whether a computer could be a judge. they really argued a lot, all the time, almost always appearing against each other, and mccarthy says, of course a computer could be a judge, it can do everything as long as it's programmed correctly-- why couldn't it do anything? >> to weizenbaum, a refugee from germany, that was an obscene idea-- that something as human as a judge or a therapist, roles that require humanity, could be a computer was a disconnect with reality. and in a footnote on that, it was about how she interviewed minority students at mit, like, in the early '80s, and they were very encouraging of the idea of a judge being a program, a computerized judge, because they thought, you know, we know judges are biased and
9:23 am
horrible, and the computer will be fair, and at the time they believed that computers could actually be separate entities, like a thinking being. and then cut to ten years later, and, you know, the view entirely switches: you realize that basically it's being given garbage information about how judges rule in reality. what's it going to learn? how would it be any different, any better than what it's learning from? that's the machine learning argument. so, it's all a way of saying computers are neither good nor bad. it's obviously the people who use them and the information they're given. they're not any better than our society-- why would they be? again, it's a fiction that the computer world is different from the real world. it's a product of the world. if we have a racist, sexist society, it's not going to fix it-- how could it? and one of the quotes from weizenbaum, a carryover from the book, is that a lot of the problem is that computer programmers think they're so great at math that they should be solving all the other problems. math is easy, at least for him
9:24 am
it was, but solving injustice or writing a poem, that's hard. so, it's that fundamental disconnect: thinking that because you're good at programming or math, you should run society. we know how hard it is to fix society-- that is the fundamental problem. >> do you feel that technology has any role to play in terms of correcting these issues? >> you know, i contradict myself. i think there was a famous court case in the '80s which argued that the death penalty should be ruled, you know, invalid because the data showed systemic bias. so obviously, i think that data and showing systemic bias can be very enlightening to the public. so, you know, the court ended up not using that, and basically said you have to prove there's racism in each case-- you can't have some systemic argument that our criminal justice system is unfair and i've proven it with data so we should fix it. on the other hand, data could enlighten us into how unjust
9:25 am
society is and what we should work on fixing, but i don't think a program is going to do that. does that make sense? >> sure. >> yeah, i mean, this leads into kind of a bigger question i had following up on the book. you do an amazing job kind of articulating the ideology and history, where these ideas are coming from, the hacker mentality butting up against the entrepreneur model. what i was wanting less and more of was, okay, so what? and also, what do we do? and i kind of always go back to lawrence lessig's four points of what we do to change things-- he wrote this in "code," a long time ago. so, one of those things is, okay, we can decide what the technology is optimizing for. we can decide, you know, what terms you're designing the
9:26 am
algorithms for, or kind of what direction we're leading toward, and that still goes back to the question of, you know, is the optimization toward efficiency or toward justice, right? but i think there are still other parts that we could start to unpack, like, okay, if the libertarian approach is a problem for the mentality that, like, this has led to huge monopoly systems, right? we can start talking about, you know, markets. that seems to be one of the-- so, lessig has markets, code, norms, and laws as the four ways that one could-- or the levers that one could use to change society or change where things are going. i'm hesitant to say-- markets seem to not be a real possibility here. like, we have ended up in a monopoly situation, network
9:27 am
effects basically mean that the market lever is impossible, right? do we have an alternative to facebook? do we have an alternative to google? do we have an alternative to amazon, right? kind of, yes, but not really at the scale that these companies are operating at. which is why we kind of get back to the regulation problem-- that's the law piece. i'm scared because that's certainly not going to be a functional lever for the next four years, at least. you know, even looking at net neutrality, at the moment we're rescinding all of those limitations, and anti-trust is still not really set up to address the way that these tech companies are structured. it doesn't really apply to free services either, right? the standard ways that we look at anti-trust, which are, like,
9:28 am
competition, pricing, harm to the consumer, right? these kind of don't work, and so, i think we have to think about other ways to hold these companies accountable, but that has to evolve into some new model that's not, you know, based on our old levers. >> well, norms, i think, are something, right here. we're certainly seeing what happens with that lever kind of kicking in, in a way. and i guess i am leery of the idea of code being useful, because, again, it's not knowing what you don't know. and so, a well-intentioned programmer, you may not think of everything, and it isn't even-- it's just, yeah, it won't be a viable solution on its own, so, i have to say-- and certainly, you mention in your report, too, the idea of 'what do we do now' is always a big question, and, you know, i think there are characters in your
9:29 am
paper who talk about how, hey, i'm good at identifying the problems-- solutions are not my department, as they say. and so-- but i do think the arguments we're making are smaller, local things, the points that need to be fixed. i would invoke, though-- and maybe i'm mistaken, i mean, in the book i maybe try to talk more about this-- the power of narrative, to sort of embrace this individual argument. and i do believe in individualism, and in people, you know, self-actualizing and getting control of their lives. and the data-- maybe people should have a right to their data, and that's a way of framing it: that something has gone off the rails, that it was considered normal for these companies to collect everything about you, and even though they're giving you services in exchange, it's not done in a very transparent way, and it's fundamentally wrong. all i can fall back on are
9:30 am
analogies. imagine if you walked into a store and someone follows you around: oh, did you buy that? that's interesting, i'll make a dossier about you. it would be unthinkable. why? it would violate a norm. people point out that, you know, maybe the efficient thing would be to go to a store's take-a-penny, leave-a-penny tray and take 55 cents-- what can i buy with this? did i break a rule? i don't know, but it's not our norm. it's not how we behave. i think partly maybe i am hoping for a norming away from these beliefs, and a belief that we as a society, if it's still possible, can come together and make better rules. so, that's all i can fall back on. and have certain values, like your data is yours. and partly i think libertarianism won the day, even though you framed it with snowden-- google did have the data, and the others, and
9:31 am
they abused that and set us on a wrong path, and we have to get back from that path. wasn't there someone who said that facebook has a better dossier on people than we do? the government doesn't have everything you bought and liked-- and that wouldn't raise your antenna? so, i think we need to try to get back our-- control of society. pretty open-ended comment, but-- >> even if that's-- and i totally agree with that as a direction, i still wonder, like, how does that actualize? does that mean we stop using facebook? does that mean we don't let facebook do certain things? does that mean we demand certain features and protections of these companies? does that mean we have a, like, collective action movement about, like, you know, the defaults, or some kind of way of getting in to change it, right? like, 'cause it's one thing to
9:32 am
say, yes, i need to have access to my data, or be able to voice my interests and needs to determine the way that my news feed is filtered. we don't even get to do that. >> make a facebook group on this right now and i'm with you. >> does it take a movement or something? >> it takes a movement. i think, to your point, like snowden, i think the election is going to be a galvanizing moment. i was struck, watching the senate judiciary hearing, that the sharpest questioner was this senator, a republican from louisiana, john kennedy, who asked the best, most pointed questions about what's going on. can facebook give me, if i'm an advertiser, a list of people who are depressed, and could i sell them alcohol? could i, you know, know who is overweight and sell them diet pills? it should transcend petty
9:33 am
partisan politics-- though i don't think it should transcend greed versus not greed. middle america is the one hit hardest by all this. their wealth is being taken and sent to the coasts, you know, by and large. i think there's a chance for a movement, if there's a belief in the political system. i think things are galvanizing. >> even in that case, if we're going to define what our norms are, we at least have to have all of these examples and cases, saying, like, yes, this is a possible way that this advertising platform could be used, and oh, by the way, we don't believe that that's an appropriate use, right? i think it has taken us a long, long time to get enough of those examples and enough of those understandings of how those systems work and what they're capable of, and how advertisers are using them, for example, to even begin to establish a threshold of what we are
9:34 am
defining as appropriate or not appropriate. and for the most part, all of that is kind of hidden. >> but it's heartening-- like the senator, i hadn't thought of it, i didn't articulate it nearly as well as that, and they were hemming and hawing. >> very concrete, yes. >> it was really impressive. so, i mean, there is a sense, yeah, that maybe that is sort of more of a burden, and maybe you're saying the book, or the next thing i have to do, is really to be more concrete and more galvanizing that way. i wanted to just tell the history, like chris was saying. i just wanted to say, how did we get here? i was genuinely baffled, and i kind of came at this stuff with a lot of optimism. wikipedia is an obsession of mine, and i saw it as an incredible, anarchic sum of little parts. and optimism-- i was more trying to answer the question of how did we get here, more than how can we get it back.
9:35 am
but that's definitely the next question. >> we're going to have to open this up for questions in a minute, and i'm going to ask one last question. if anybody in the audience would like to ask a question, please go to one of the two microphones here-- we'd love to hear from you. because we're at mit, i am obligated to ask: what role does the university have for its students who are here, who are designing things, making their own startups? what ethical considerations should they be thinking of before moving into the real world? >> wow, i mean, we're kind of saying the deck is so stacked-- and not to say that to young people-- i guess, i mean, there is a-- well, some advocating for the local, you know, there is a certain 'tend your own garden,' and trying to create-- i would think there must be a real thrill in, like, trying to create a small project that can have better values, maybe not be
9:36 am
driven by the market, and seeing that come to fruition. and i do think there's got to be a way to not sort of view-- they're basically saying, because of the monopoly system we're living in, that new companies are only looking to be acquired. and who am i to say don't seek your billions or your millions? but that's the message: try to nurture something that's smaller. i think there is a real yearning for it. >> and even if somebody is seeking their billions-- which is weird, that that's accurate here, it strikes me as very strange-- what should they be thinking of when they're on that path? like, what questions do you feel mit students should be asking when they're building their groundbreaking technologies? >> i think the hardest question to answer is always: you have one intention for this technology, whatever it is that you're building-- how do you recognize the
9:37 am
myriad ways that the same technology could be used? and that's hard to answer until someone else has found a way to use it or monetize it or apply it to weapons or the military-industrial complex, right? so, i think that's probably-- like, learning how to see your own effort from multiple angles. >> like your own idea, yeah. >> and run it by so many different people. that's the thing, i think, that is missing from a lot of silicon valley. how much user testing do they do where the user gets to tell how they feel? they do so much a/b testing, but does that tell you my experience, my emotions, all of these things? >> and right now they're using the market as the only way to test intent-- it's pivoting, and they call it a pivot. >> there are so many assumptions in most of the ways
9:38 am
the technologies are operating, and that's kind of by design-- the data-driven approach: we can look at what the behavioral data is and determine what's going to get you to spend more time or what's going to get you to spend more money. and that doesn't get into anything about, like, what my actual intentions are. so, yeah, actually talk to people. >> and forcing real connections instead of virtual ones would be a real way to interrogate your idea. and even, you know, mark zuckerberg is coming around to the idea that there's more to-- there has to be some real basis to these relationships, otherwise-- >> i have one more thing, which is, i think so much of what we're talking about is, like, the leadership role of, you know, these individuals-- especially the men who are in charge, who, you know, have these ideologies-- but that filters into the culture of these companies, and it's actually, i think it's imperative to think about the more systemic, contextual way
9:39 am
that these ideas get played out, and, you know, how that is baked into what your engineering goals are, your efficiency, your, like-- how is your job, you know, assessed? what's your performance? and having some control over those questions seems to be another way that we could, like, cut into, you know, the overall culture. >> all right. we are going to open this up to audience questions now. if you would like to ask, please use one of these microphones. please be kind, and we're going to switch off from one to the other. we'll start at this one over here, and then the next one is coming over to you. >> my name is nina witten, here from the humanist chapter at harvard and the humanist hub in harvard
9:40 am
square. thank you to radius and the folks for bringing this important topic out in the open. i wanted to ask a question about the-- sara, you mentioned that technologies don't take people's feelings into account. from an ethical perspective, i believe the use of emotions-- likes and anger-- as a proxy for hooking into the dopamine circuits and creating an addictive experience, i think it's unethical. i wonder how to bring this, you know, to awareness. >> sure. this, you know, this is like whiskey and cigarettes and the rest of it. >> yeah, how do we regulate those kinds of things-- a great question. there's a couple of things. one is, there are already engineers and designers who are recognizing that, and that again
9:41 am
gets back to the optimization question: is it to get you to keep scrolling through an auto-refreshing, infinite feed, right? tristan harris is one of these engineers and designers, a former google person, who is working on the question of time well spent-- timewellspent.io. we're designers and engineers; we can design better ways. on the addiction side of things, i really love-- i think she used to be at mit and is now at nyu-- natasha's work, a book called "addiction by design." it's actually about slot machines, but, of course, it's very relevant to the design of any technological interface, and, yeah, she's clearly articulating that these systems are designed to keep you in flow, keep you going, not necessarily to keep you winning, and that's
9:42 am
kind of a really interesting look at such a clear use of addiction, but people have naturally taken that and applied it to these broad consumer interfaces. >> hello, i'm elliott. i was born and raised in the bay area, silicon valley, and come from all of this stuff, and there's a lot of complexity there-- plenty of good things and plenty of bad things. one thing that struck me was your argument about efficiency. if you have a company taking over a market, or whatever it is, you're squeezing out people who live on the crumbs. but i see also, like, a flip side. i can argue, you know, when you pay a medical bill, who is advocating for efficiency when you have trillions in waste in the health care industry? it goes both ways-- there's a
9:43 am
cost to the people you squeeze out there. if you had a tech company go through and try to completely redo health care and make it an efficient system, you might be able to save trillions of dollars, but what happens to the people who get squeezed out of that system? you can't paint it in black and white either way. there's this trade-off. >> yeah, i would think that having universal health care is a much more important goal than either one of those things, you know, in my opinion. so, i totally agree, and of course, there's a reason why these companies are successful and popular-- because there was a need for efficiency, for hailing cars and shopping-- and we're sort of saying it needs to have a higher goal. i think the way sara is framing it is really good: the idea that if your goal is just to maximize profit, that's not a good goal. so that could explain why you want people to be addicted-- it maximizes profit, but it's not good for society. and the cleaner analogy, the pollution-- the first question,
9:44 am
these are, you know-- she's mentioning alcohol companies and tobacco companies, which are giving people what they want and are very popular. it's, like, the way you address these things. so i think we need the enthusiasm-- creativity is vital to society-- but there has to be a larger goal, or we will have a problem. we are having a problem. >> and i would follow up with the idea: why hasn't silicon valley completely disrupted the health care industry yet? i think it comes down to the allergy to a highly regulated ecosystem, right? i mean, that's so far down the road of, like, we just aren't going to touch that. looking at the quantified self community and self-tracking devices, you know, none of those are medical devices for a reason. >> and those are-- >> sorry, yeah, trackers, you know, fitbits. a lot of these companies, like, wanted to start getting into-- what's the
9:45 am
tricorder, like, let me scan your face and see if you have, like, "star trek"-- >> like-- no, no, how sick you are, and have a checkup, like a checkup in five seconds, right? the reason that company is kind of-- you probably haven't heard of it-- is they've been set so far back with fda approvals and things like that. >> what do you think of the blood transfusions? that's part of that culture, and you talk about-- you think on that level. i was thinking about the level-- to me it was the idea of creating artificial intelligence and giving birth to yourself and not dying, being detached from reality and thinking of your brain as something that could be on the cloud and live forever. that's how i view it in that context, the ideology of computer science. how do you see the biohacking? >> that kind of stuff-- and is that medical, or is that data?
9:46 am
>> certainly, but then, you know, you've got a lot of hacker types who are willing to experiment on their bodies, and there's something about thinking differently about your corporeal place in the world, and whether or not that's a sacred thing. and that's getting deep in the weeds, but-- >> thank you so much. >> thank you for the engaging discussion. i'm a graduate student in mechanical engineering. i grew up in india, and the one thing that really struck me once i moved here was the commitment to free speech and the protection of the first amendment that the supreme court has provided in this country. i was taken aback when you mentioned free speech as a liberal value, and i wonder what you mean by abuse of free speech. of course, the internet is not the
9:47 am
best place for discussions, and we all know that, but how do you think we can really maintain the spirit of the first amendment going ahead? >> right, obviously, this is something that people argue about. you could argue limiting the amount people can donate to a campaign is a restriction on free speech, and we live currently with a supreme court where any rich person can give as much money as possible to an election, and that's their freedom. >> and corporations as well. >> and that could either be a very enlightening-- see, this is why i actually thought the russian meddling in our election puts the lie to the free speech argument. if you think free speech is absolute-- all of these russian or other meddlers were just putting words and pictures together. what's the problem with antagonizing people and getting
9:48 am
them to hate each other? it's just words and pictures. there's a consensus in our country that that was a bad thing-- maybe our president doesn't agree, but almost everyone agrees it's bad to have a foreign country sow antagonism among our people and try to pick the winner of our election. that may be a limit on free speech on one level, but to me a totally necessary one. i totally hear you-- i want people to express themselves, and it's hard to belittle that, and i understand, if you live in a society where you can't do that, how harmful that is. but there are real costs. in essence, places like twitter are allowing incredible-- inciting anger in certain groups, and you have to look at it in historical context. that's how i would argue it. it's almost like thinking of it as code: thinking that this thing called ultimate free speech is just going to-- that it's the right thing to do-- and not recognizing the actual, real effects. that's my view of it, i mean. >> yeah, i mean, i would add,
9:49 am
especially in the twitter context, i think they've done so much to protect free speech that it hampers them from addressing very serious things and behaviors like harassment, and, you know, how to put a check on that kind of behavior in the system. i think they've done a whole lot and tried to do a whole lot so far, but i think there are a lot of people who are really underwhelmed by how that gets manifested in the code, and how, as a user who is being harassed, like, what can you do aside from blocking all of these-- >> and the scaling idea, that it should be automated. if you were living in a society and picking on someone in a harsh way, we would say, don't do that. you'd say, it's my freedom to yell at one person and make them feel uncomfortable so they would run away? we'd say, don't do that. that's the fundamental--
9:50 am
when you're sort of abstract, it's 'my speech rights.' i think it's dangerous. i mean, i totally hear your point-- >> hi, i'm a humanities graduate student at mit and a six-year moderator, and i want to piggyback on that question. you talk a lot about governance and the belief that regulatory behaviors should be algorithmically controlled, but they rely on human moderation-- human or volunteer labor, such as in my case. what do you see as the current place and possibly future potential of humans in these spaces? >> i naturally fall back to wikipedia. they do some automated
9:51 am
policing, but fundamentally i think it has to be human. when we talk about big, it's not human scale. there's a book called "the boy kings," and it has a beautiful description of how scalability was the most important thing. growing fast and scalability meant you couldn't have human customer service, so you remove people from that, and that's fundamentally-- wikipedia managed to grow quite big, though not to billions, by using people who are very motivated to do the right thing. so, yeah, i think it's kind of vital that you have a human community that will respond. i mean, i think it's, like, vital. >> i think it gets back to the norms question as well, as to, you know, who is determining how norms are expressed for algorithms, or for, you know, automated platforms, right? and that's a pretty hard question. so, yes, that's where the
9:52 am
interplay between technology and humans comes in-- you know, filter through the algorithm, but then have a human look at it. and i kind of stand on the side of, you know, i think that we're going to be working alongside technologies and ais and whatever else for a long time, and it's not just going to be an either/or, it's going to be both. >> thank you. >> hi, so, i'm a postgraduate at mit, and i look at your book from the perspective of both a political scientist and a european, i guess, right? and we do have slightly different standards. in a way, i cannot help but sort of want to ask you, where do you see the political system, you know, in all of this? because ultimately i would argue that, you know, what we see in terms of-- 'cause you take sort of an individualistic approach, which is valid and very interesting, but i would
9:53 am
argue that ultimately this is about, you know, how much influence do we want to give government and how much influence do we want to have on our own, and that's why we have the european commission being very explicit about the use of cookies right now. if you could elaborate on that. >> i do think-- i definitely hold up europe as a model that way. the idea of the individual-- it's ironic, you know: you're preserving individuality because the government is protecting individuals. and in america, it's 'how dare the government protect me.' and that's the catch-22, isn't it: great, we've got the rules protecting individuals, but 'get the government off my back, off my medicaid.' people don't really appreciate what is being done on their behalf. i think that's exactly-- i think of, you know, the right to be forgotten. in europe they have this rule that basically says, i have the right to-- even if
9:54 am
you've done something wrong, you serve your time. if you did something 20 years ago, it shouldn't be the first thing you see on google. wikipedia-- i remember discussing this with the executive director of wikipedia. they don't like that; they're an encyclopedia. if you committed a bank robbery or declared bankruptcy 20 years ago, that's a fact, part of your biography, it should be in there-- how dare you say that it couldn't be put in. i think it's a reasonable regulation, recognizing how the internet is different from a newspaper or a court record. in the old days, there were no limits and things were published, but it was hard to go to the court and find who declared bankruptcy-- you had to make that effort. now, 20 years after you declare bankruptcy, it's immediately the first thing said about you. you have to sort of bend, adapt for that, and that's a different definition of free speech, you know, i think. >> so-- >> i would take it one step
9:55 am
further, which is, i think part of what you're getting at is that what's at stake here is the legitimacy of these companies and individuals to govern and rule us, right? like, as institutions, we've kind of opted in to living in their world, and those are very different worlds from, like, the traditional transnational, you know, view of the world, right? and that's particularly interesting, especially when we're talking about zuckerberg's aspirations-- specifically, you know, understanding that he has built the largest, or is beginning to build the largest, community. >> community, yeah. >> so i feel like that kind of political term, legitimacy, is operative here, to say, like, do we actually sign up for this? are these the leaders? are these the ideas that we believe in? and if not, what do we do about
9:56 am
that? and my last point here is that my hope rests on the eu and the gdpr, and so-- >> what is gdpr? >> it's the general data protection regulation, which is coming out-- or applies in, i think, may 2018. this is data protection, but the big thing here is that it covers any company that serves eu citizens-- potentially even an eu citizen visiting the united states. so it applies blanket to basically any company, especially all of the ones we're talking about, if they operate and serve eu citizens. that is, like, the best hope for any solid regulation, and i think it's a kind of common-denominator situation, where, you know, in the same way that cars are manufactured to meet the highest standards--
9:57 am
california-- i think we're going to see that kind of level of regulation. obviously, it's a little easier to change user experience based on geography, so that's kind of a trickier loophole element of this, but i'm hopeful that it's at least pushing the conversations in the right direction. >> again, you're looking to germany as the world leader in-- >> well-- >> everything. >> i have so many thoughts while you're talking, because this is my life, and all the articles you brought up and the people you named are-- >> give us all your thoughts. >> i am a software engineer, and i agree with sara, focusing on the responsibility of my field of tech and engineering, and also on tech in government: how to get policy people to understand
9:58 am
technology, and how to get technologists even a sliver of interest in what's going on over in d.c., not only to make our government better, but also to impact policies in ways that they may both respect and care about. i wrote notes because i knew i would just start rambling and lose my train of thought. on the first point, of harassment and diversity in tech: there are a lot of women who have spoken out. they don't get the same kind of coverage as maybe the people in the media, but kelly ellis, ellen pao, erica baker, susan, the people who spoke out against justin when he was-- countless really, really brave women. i don't know how to amplify them more and get them the same coverage. >> not enough for them-- >> maybe women journalists-- we were talking about women journalists, not the brave women in
9:59 am
the field, and the stories that women journalists have been writing about. so-- >> then in addition to that, something you said that stood out for me, and that sara talked about, was this idea of not only the market driving decisions, but also this idea of, like, utopia: you're going to connect all of these people and the world can be a beautiful place of rainbows and unicorns. and the second part for me -- i've been in cambridge for three months -- maybe if people stopped caring about money, maybe they'd care about users. and some of the engineering students in the room probably have a sense that we're here because we're changing the world, and google's "do things that matter" drives the engineers to go there. but if you ask many of the engineers what the annual revenue for the company is, they don't know. they even put, like, their ads and marketing people over in a different campus, and the engineers have, like, an unlimited pot of
10:00 am
money to go do whatever you want. so, in a world where people actually are driven only by money, maybe we can figure out what to do there. but what do we do when people believe that what they're doing is so good -- we're at google because we'll make the world a better place, we're at google because there's money and power to connect the world and teach someone in developing regions to farm, to pull them up -- deep missions, but then not thinking about-- that for me is a much harder thing to put a finger on. >> ... also, they're not as smart as we are. they should be over there. that's another-- part of it is the seduction. i was, i
10:01 am
was very struck by the argument that the computer is a closed world, so that you are in such control and you can make the rules, and it's so clean and it all makes sense to your mind. maybe it's, like, unfair, but i talk about zuckerberg -- his first program was a game based on julius caesar. he's doing world conquest, and he realizes it. who cares what he did on one computer? that's fine. it's that crossing from the screen to reality. were you drawn to programming because of that closed world where it all made sense? is that what we talked about, rainbows and unicorns, creating a world that way? >> it's so interesting you say that. i think perhaps i was drawn in by the utopia of: i really can help the whole world connect
10:02 am
with each other and all these things. and candidly, studying computer science -- this is back to what was said earlier -- it's, you know, algorithms, and how great, like, some of your theory classes are; they don't touch on ethics or users, all of that is separate. so that wasn't what drew me in, but i don't think it's really how we're trained. the way we are graded for our performance at companies gears us towards developing a certain way, even if we have other interests. i don't know how to fix that. it's what i think about all the time. >> is it a question of actual impact? if that's what you believe you are doing, how do you know you're doing it? like, how do you check that you're doing that? i don't think most companies have a way of following through on that. there's not room for that.
10:03 am
once you finish one product or feature or whatever, you're on to the next thing. and, like, how much, aside from using it yourself, do you really get to understand how it's impacting users, how users use it, what their experiences are? >> how would you get people to think about the impact of what we work on? >> it requires training, social science and other skills, which are not what people have. it's the arrogance of thinking: i'm good at programming, so i'm good at analyzing how my work will affect the city. why would you be good at that? >> one final point and then i will sit down. i would love to talk to you later about your covering this field for so long. there are definitely people who are seen as idols in the tech field -- mostly men, but people who people look to as, like, the people who paved the way, and they're in the hard computer science field, the leaders in ai or the original writers of the main algorithm for google search. when they say stuff, people
10:04 am
really listen. the things they say are not usually about ethics or users. they talk about efficient algorithms and things like that. perhaps there are ways to influence the leaders that many people listen to to really-- i don't know. >> the interesting thing doing the book: i would see each of them saying how smart the other one was, but being insecure about their own intelligence. is mark zuckerberg a great programmer? of course he is. john mccarthy, who is the essential figure in this book, was a mathematics professor at stanford, a professor passed over for tenure, who ended up at mit and became a computer scientist. he was brought back to stanford as a full professor. computer science is easy -- suddenly he is a professor. everyone is insecure in this world of constantly-- but i hear you. it's like maybe that would be
10:05 am
the way to get some change -- having some important person, like, endorse this stuff, yeah. >> thank you for your panel. i'm a graduate student, and i had a quick question around what you mentioned, sara -- you really think, like, a lot of these issues and externalities of scale came to be because of a lack of good business models. one question: there's a new book about free innovation. how do we think of monetizing value when you are not aiming for acquiring users -- like, how do we create and value what we create? and do you really think it's going to become, like, decentralized blockchain where everyone
10:06 am
owns the data, or we're going to go back to our tribes that are interconnected? like, what are the scenarios you are imagining? i am european as well. there was an entrepreneurship summit in estonia last week, and there was such a big debate around scale-ups, like the startups. people were revolting against that -- young people -- everyone was like, really, do we need to scale? is that always the model of success? so that was a lot in there, but i'm curious what you think. >> great questions. so i think that again gets to the optimization question -- like, is scale the optimization question? in your account of facebook, it's purely about -- we don't know what the business model is. they didn't know what the business model was for a very, very long time. but the question was scale and, like, network effects.
10:07 am
by the way, we'll bring in sheryl sandberg -- but that's another story. so yeah, i mean, i think it's really hard for people to, like, imagine other formats, and to be like, okay, well, you know, maybe diaspora -- like, how long have we talked about that? since it first came out, we talked about it as one possible way to disrupt facebook. it was distributed, locally hosted social network infrastructure, so as not to have any centralized kind of control over information and data. also, i think some of the guys were former facebook guys -- is that right? or from nyu? i don't know, i forget. and then similarly, but more for
10:08 am
like a twitter model. that's a rough description. the thing was, even thinking about different models like blockchain -- blockchain in itself, ideologically, is very libertarian, right? >> i hadn't thought about it. like, even the idea of tribes -- it probably makes more sense to say they are communities where you belong and it's okay to share, rather than: i'm an individual in the world and we're connecting like that. i would say the other thing about business models is that -- again, not to fall back on regulation, but the idea is you're supposed to create liability for things that are harmful. that's why you don't pollute; otherwise it would be a great business model to chop all the trees down and pollute the river and make the most money. i think it is on us to sort of add the cost to this so that it will work. >> to extract the most value.
10:09 am
>> from them or what they are doing? >> that's the alternative. >> right. >> i would also add, i think one other lever is, like, the demand side of things. even if it's hard to imagine a world without facebook and amazon and google and apple, what does it look like if we start demanding different versions of those things, or different features, or different ways of interacting with these companies? that's really hard for people to get their heads around, but it is probably one of the ways that these companies are going to change. >> the pessimistic view is also that these companies are so big and powerful that it's very hard -- they're going to stop this from happening. the power they have to lobby and such, you know -- i mean, the president -- jeff bezos owns the "washington post."
10:10 am
it certainly does some good things, but it's ominous. i think anyone watching ought to see it's an ominous thing. i think it's undeniable. so the fact that they're so entrenched is definitely a problem. >> thank you. i come from the sociological perspective of having delivered mail for 30 years in california, right smack in the middle of silicon valley. >> so you're using the technology. >> i've seen the houses go up, hundreds of thousands of dollars and added zeros to property values, and people who are 30 and work for google and facebook and whatever coming into these neighborhoods and buying the houses. and the people who get pushed out from these houses are the people who are now the contract workers that help support facebook and google. these are not just the advertising people that you spoke of, but the people who work in the cafeteria, the people
10:11 am
that drive the dry cleaning cars to bring clean clothes to these people so they can work 15, 20 hours a day. >> and none of them work for apple or facebook. >> even though they spend full time there. yet they have to, like, keep getting drawn in -- say, can you pick up another shift? but you can't go over 40 hours, because then we might have to pay benefits. so there's a disconnect -- like the zuckerberg foundation: i just read an article about how they're trying to work on this homeless problem. well, one of the people i know who is actually homeless is somebody who lost the benefits from a job and could only find part-time work. so the only value is the core center of workers -- they don't place any value on the people who truly support the company, all the outside workers. i think mit probably has adjunct professors, but a person working in the boiler room is an mit employee and
10:12 am
probably gets full benefits. the lack of benefits is destroying the middle class, and that's a total lack of humanity, i think. >> that's peculiar -- at least take care of your own. we talk about the societal effects, but this is their immediate backyard. it's really staggering. >> i think the most common argument they make is: our core business is not the cafeteria and cleaning. like, we as these companies are going to do only our core business, and like-- >> that's the business you were talking about earlier. my niece worked for a startup company. she had to travel to paris and stay in a room with five other guys, and was sexually assaulted by one of them. but this was before the company had a pr department or an h.r. department -- sorry, an h.r. department is where you would go with that. a lot of the startups are very rogue, and they don't understand how to value everyone around them.
10:13 am
>> think about the logic of: it's not part of our core business. it's like, how are you having food? how are the halls being cleaned? in the book i talk about a philosopher who argued that libertarianism is basically antifeminist by definition. it's not a coincidence that it's all tech bros. because where does the individual come from? it's a fiction that you can be a google employee and the food shows up magically. libertarianism is the argument that you arrive in life as an adult male and you don't owe anything to anyone who raised you, the community that raised you. i would at least agree that libertarianism would make sense if it were a fair playing field, where your ideas and hard work are what should be rewarded. it's not a fair playing field. by definition it's faulty logic. it's predicated on devaluing women and the family and
10:14 am
the women who get us here to adulthood. >> your niece -- is that what you said? >> right. >> what happened to her after? >> she stayed with the company, and when she was in a meeting with the h.r. department, that person did get fired, just by her conversation. but she was really worried about actually losing her job, and she didn't. so that's a good thing -- but probably because she was really good at her job and maybe he wasn't as much. it should not have happened, but there were not the things set in place. but this is one thing i think could possibly be a way to bring everybody around: like, if all of the amazon drivers that are hired right now for the holidays went on strike on december 23, didn't take a shift, people would be like, oh my gosh, what am i, you know. [inaudible] >> in germany on black friday nothing happened.
10:15 am
>> labor unions are the countervailing force. they are our hope in that sense, true. >> i'm max. i'm an undergraduate in mechanical engineering. as an engineer who is really interested in user-centered design and getting into that -- and in particular in how design conversations have to change for different users -- i've noticed a lot of the conversations tend to be stuck either in academia or in these design firms where a lot of these conversations exist. even in my capstone class, the huge emphasis they put on user experience and design is great, but then the shift between that and the jobs people end up taking -- the framework of thinking at those jobs -- is very stark to me. i was wondering: as somebody who is going into the engineering field wanting to be a professional in this, or for all of us here as consumers, how can we see that shift in thinking in these larger companies?
10:16 am
>> that's a great question. i think the best way to get in is to make that a business case: like, this is going to lead to better experiences and better user value, right? i think the question is how to speak the language and put a number on it, or integrate it into the kind of way that they want to value their process and their development. i don't know. it's a tough question. i think it's also just a question of finding those people in the institutions themselves, in the companies themselves, who are thinking like that and have influence in those ways. or just working outside of it, right? like, tristan harris leaves google to kind of advocate on a large scale, to influence a bunch
10:17 am
of designers to be asking these questions in their institutions from the bottom up, right? i think that is arguably the most effective way -- to build up the ethics from the bottom up. >> thank you. >> thank you guys for the work you do to bring these important issues to light. i am a resident fellow at harvard divinity school -- so, shout out to the humanist chaplain over here; there's not a lot of us who come into these rooms, but-- >> thank you for coming. >> i worked on tech policy in the obama administration, so i'm a weird hybrid person. i worked with kathy over there. >> all these hybrid people over there. that is what is missing. >> my question is related to that. one thing i think a lot about is: during the obama administration we did a lot of work to bring these techies into government, in part because it's like, if you want things to change, then come join us. i'm thinking about this question
10:18 am
how to interrogate our own products, and whether it's even realistic to talk about having teams within the leadership of some of these companies whose job it is precisely to do that -- to think about what the effects on humanity are, what the justice lens on this stuff is. there's an element of: if it's not your job to do it, it's like a frog-in-boiling-water problem. you forget how to look at those things, or you tell yourself -- it's easy to believe the narrative of, we're here and we're changing the world in these ways or those ways. even if the product, maybe it's connecting people to education who otherwise wouldn't have it, your job is not to look at a macro level at how the overall company is influencing the direction of things. and so for those of us who have talked about these issues, it strikes me that a lot of us should talk about joining the companies themselves, but i also wonder if --
10:19 am
this may sound idealistic, but if people could empower individuals -- what would it take for mark zuckerberg to say, yes, we want the team that does this? it would take having the humility to admit that right now they're not doing those things. >> i think they're certainly thinking about it right now, at least from a kind of oversight perspective, for a bunch of different reasons -- but maybe not quite to the level of having a chief human experience officer or some version of that. i think this is why it's really important to not just look at the individuals but also look at the systemic support systems around them. like, what is the institutional org chart for facebook, right? that matters for google too. it matters how it changes over time. it also really matters as these
10:20 am
companies scale, as they age, as they start touching more things. and i think that is -- aside from, like, trolling -- like, how do we as a public get to look into that? maybe the shareholders have a little bit of insight into that, but probably not much. >> i think the book didn't dwell on this, but a major absence is, like, the lack of a labor movement, and that is -- there need to be these checks. believe me, labor unions are inefficient, but when you see the whole of society, you see that even with that inefficiency they serve as this vital check on these companies. the companies will not do it voluntarily. there will not be self-appointed checks -- only reaction to the bad press and the valid concern
10:21 am
about the election. that's leading to some change. the lack of diversity -- that could lead to some change. but actual labor unions that would be able to strike and really affect the balance of power -- those are the only things that i think will make them change. it's not beyond the realm of possibility that that would happen, but if we expect them to change on their own, it won't happen, because they think they're doing their best. that's the message of the book: they think they're doing the best job, and they will say, look at how successful we are -- that is how i know i'm doing a great job. you're talking about ethics or whatever these things are -- go back to the google designer saying, i have a vision for google, my vision for how it should look. i'm a designer. i studied this. and they are like, we're just going to test it; people will like what we do. it's like they're not speaking the same language, i would say. it doesn't compute, what we're talking about. >> that makes sense. it strikes me that maybe now we are at a moment when they can no longer
10:22 am
say we are a neutral platform and we don't need people whose job it is to think about these things. but again, that may be naïve of me. it seems if there's a moment to push on it, this may be it-- >> absolutely. >> i would just add, like, any product goes through -- okay, the lawyers have to do the checkbox. the lawyers are the last step, and engineers hate it, because it's like, we built all this stuff and the lawyers are saying we can't do it -- because it's not integrated into the process. so imagine that, but not just lawyers integrated into the process -- like, ethicists. who are those people and how did they get hired? this is a huge case for human-- >> what were their calculus scores? >> a huge case for the humanists to find jobs in these companies. it's not just the engineers that we
10:23 am
need. >> and last question. you've been waiting so patiently. thank you. >> my name is heather and i'm a lawyer, but not one of the lawyers you were talking about. and the thing that struck me is that what we study in law school largely us how the rules got there. and why we have them. and the internet seems to be a giant eraser of history and why we do things. because i know lots of engineers, , and the general mindset of engineers is we can figure out anything. and so that means that expertise is not valued. because we can always learn it. we have all this information out here. but they are forgotten the part about that information is not helpful. and so when you have a platform
10:24 am
that has no way for you to tell what's good information and what's bad information, you make bad decisions. in my corporations class, the first thing we studied was taxicabs -- not actually the uber issue really, but what they used to do to avoid liability: one car, one corporation, ten cents in the bank. that's why we have the rule of piercing the corporate veil. if you don't capitalize your corporation, then guess what -- you don't get the benefits. we are losing all of this because we have a lot of these people who are so much smarter than everybody else that they decided they don't need it. and they are also really young
10:25 am
and they haven't lived, and they haven't seen that there are reasons why we do things a particular way. and yes, we should re-examine them periodically, but we should recognize that there is a reason. >> have you ever heard of this thing called-- the idea that you walk into a field and you see a gate and you're just like, well, it makes no sense in the middle of the field, obviously i will just tear it down. you should at least ask why somebody put it there. there was probably some reason. i don't know if it's the right reason, but have some respect for history and context. he came at it from a right-wing perspective, a conservative perspective -- you could see why it has conservative value. in reality, what libertarians are talking about is highly, highly radical, in a sense like the russian
10:26 am
revolution or something: the lack of respect for institutions and for history, and this belief that progress must be made to happen instantaneously. i think the stakes -- the stakes are very high, and it's easy to get confused by what is being pushed forward, but it's a scary, dangerous ideology, and that's what i was trying to say in the book, for sure. >> i think i would just add, to address the kind of allergy to regulations, that's a perfect example. there's a reason that taxis are regulated, or have rules around them. the uber model is, like, we're just going to make it more efficient -- that's all we're trying to do -- so regulations on taxis in local jurisdictions don't matter. those were quite literally the words that came out of travis's mouth, but we all know where he has ended up. but all of that is to say, like,
10:27 am
it's worth acknowledging that technology has politics, and, you know, the libertarian stance is such that it says: i'm just apolitical. this is efficient, this is market driven. there are no politics involved. i think what we can really push against is that -- like, yes, there are politics involved. to call it out and call a spade a spade. >> i was telling my editor there's a line from the band rush: if you choose not to decide, you still have made a choice. actually, he said it was pascal who said that. so maybe it's more weighty that way. but that's the myth -- that they simply do not make a decision -- when, in fact, they are making huge decisions. >> and additionally, the way you ask a question has an answer embedded in it. >> absolutely. >> i mean, that's what you learn in legal writing.
10:28 am
>> thank you all so much for coming out, and again, just this one last reminder before you go back to class: for our speakers -- noam's book is up here for sale. i cannot recommend it highly enough. and our mailing list, if you're interested in hearing about future forums, is right over here. please join me in-- [applause] [inaudible conversations] [inaudible conversations]
10:30 am
>> look for these titles in bookstores this coming week, and watch for many of the authors in the near future on booktv on c-span2. >> okay. hello and welcome to great minds, the harlem renaissance, at the 19th annual fall for the book library festival. i'm benedict carton from the department of history and art history and african studies. we appreciate your attendance. the festival runs through saturday, october 14. for all the most up-to-date information on this festival and all the other programs throughout the year, please