
Washington Journal, Jeff Kosseff. CSPAN, July 30, 2019, 2:30pm-3:01pm EDT

>> for a woman to do anything in her life, she needs a car. >> sunday night, a saudi arabian women's rights activist talks about her decision to challenge the saudi government, an act of civil disobedience. >> women are capable of driving. >> watch sunday night at 8:00 eastern.
>> we have to understand the history of why congress passed it. under the old rules, distributors were liable if they knew or had reason to know of illegal content. once we had early online services like compuserve, the courts said, well, if you don't moderate user content, you can receive that protection, if you are like a store -- like a bookstore. but if you moderated, you are more like a newspaper. in 1996, congress passed the communications decency act, section 230, which says that whether or not you moderate, even if you have reason to know, you will have this very broad immunity for the content that users post.
host: go back to 1996. what would the development of the internet look like without section 230? guest: it would be difficult to have some of the current platforms under their current business models. yelp is a great example. if you didn't have section 230, yelp could be liable for every negative review someone posts if they don't take it down immediately upon receiving a complaint. how useful would yelp be if they removed every one-star review? host: why are we talking about section 230 today? why is congress interested in this? guest: it has taken on new visibility because of two separate types of criticism.
one criticism says the platforms are not moderating nearly enough, so there's a lot of harmful content -- propaganda, hate speech that gets through, videos of shootings that aren't taken down immediately. that is one set of criticisms. what those critics say is that this is contrary to the intention of the section, which was passed to encourage moderation. then, there is another criticism that is basically along the lines of large platforms like google, youtube, and twitter censoring certain political viewpoints. conservatives often say that they don't get the best search results or their content is demonetized. they say that is political bias. host: go back to something you just said, that there is a thought that it was passed to encourage moderation. if you are not held liable for anything, how does that encourage moderation?
guest: the idea is that the platforms will develop the moderation policies their users demand. if a platform doesn't do any moderation, if they allowed every bit of content out there, that probably won't be a great place to be. the idea is that consumers will demand that. rather than have the government set moderation standards, which would run into a lot of first amendment issues, let's have these platforms accountable to their users. that is the theory behind section 230. host: do you think that happened in the way the original writers of section 230 thought it would happen? guest: i think it has in some respects. there are detailed moderation policies. i would say the platforms are very far from perfect in how they implement and communicate their policies.
i am at the naval academy and i interact with intelligence agencies as well, and sometimes, i think the intelligence agencies are more transparent than some of the large online platforms in terms of what their practices are. that is not what should be happening. they need to be communicating. they need to better involve users in these decisions. host: jeff kosseff is our guest, an assistant professor of cybersecurity law at the u.s. naval academy and the author of the book "the 26 words that created the internet." online content and liability is the topic. the phone lines are open to join the conversation. democrats, (202) 748-8000. republicans, (202) 748-8001. independents, (202) 748-8002. jeff kosseff will be with us until the bottom of the hour this morning. turning to the debate playing out on capitol hill and at the white house, can you set up the players in this debate and how they are lining up on either side of this?
guest: you have -- there were two members of congress who sponsored section 230. one is chris cox, who is now in private law practice, and the other was then-representative, now senator, ron wyden. host: is that the chris cox who is at the nra? guest: no. he was the sec chairman under president bush. they wrote section 230 and both advocated for it, and senator wyden has been a strong defender currently. senator cruz took the lead in saying these platforms should not be biased against certain political viewpoints. senator hawley introduced legislation that would condition section 230 protections for larger platforms on some form of political neutrality. on the other side, there are a lot of democrats saying there is not enough moderation.
speaker pelosi suggested eliminating section 230, and she has had -- she recently had a video of her that was altered and shared pretty widely. that was an example of under-moderation. host: where does the president come down on this, and his top advisers on these issues at the white house? guest: i'm not aware of specific statements about section 230 from the president. he recently held a social media forum where he praised senator hawley's efforts on legislation, which i would assume to be his bill about political bias. host: do you personally take a stand on any of this? guest: i will give the caveat that i'm not speaking on behalf of the dod or the naval academy. i think all sides are raising important issues. we have a lot of anecdotes and a lot of contradicting anecdotes. what i would stress most importantly is that section 230 is so fundamental to the infrastructure of how the internet ecosystem has been built. it is not something we can't change, it is something we really have to carefully consider how to change if we are going to.
host: is there a middle ground here between the status quo and creating a federal online content liability czar who gets to decide what is fair and what is not? guest: i think there are a lot of different options. one would be to require better transparency somehow -- first find out what they are doing. are the results people complain about the result of algorithms, search engine optimization, or something else? transparency would be one thing. there are also certain cases where content may have already been adjudicated to be defamatory or harmful to individuals, and seeing if that content could come down from the internet. sometimes, that content can devastate people's lives. there are a lot of options to consider. everything should be on the table. congress amended section 230 to deal with the issue of online sex trafficking.
host: let's bring in callers who have lined up. online content and liability is the topic. we start with benjamin in california, an independent. caller: good morning. host: good morning. go ahead, you are on with jeff kosseff. caller: my question is, how does cybersecurity correlate to poverty? because usually security threats are correlated with poverty or instability in countries. how does cybersecurity relate to that specific thing?
guest: i think there has always been a digital divide. this is separate from section 230, but cybersecurity is an area i research and practice in. the interesting thing is that cybersecurity practices across the board have been very weak. part of that is a lack of training and education, which i personally think should be starting from the very early ages, in elementary school and high school, to develop cyber hygiene practices so people might be less likely to click on a link or an attachment that could have devastating consequences for them and their employers. host: are there states that have created a program to do that in elementary school and high school? is there one you would point to as doing it the best? guest: it has been scattered.
when i ask how many people have had cybersecurity education, more and more hands go up, but i think that should be a top priority in education. host: are you optimistic on that front? guest: i think so. there is more awareness than ever before about cybersecurity. i think the cybersecurity threats cut across every portion of our society, whether it is political, economic, individual rights, privacy, and i think people are recognizing it. it takes a little while to translate into action. host: south carolina, pat is an independent. good morning. caller: yes. i think there should be some rules against somebody being able to go on and trash somebody -- say they have had a minor disagreement over something, whatever it might be, and they annihilate their reputations and destroy their businesses and everything.
why anybody would want that to be protected, i don't understand where they're coming from. we ought to be able to use the internet with some realization of civility, and not, just because we disagree with somebody, you know, be called un-american or whatever, socialist. host: pat, we will take your point. jeff kosseff, to pat's concern? guest: the internet has given us a mirror to society, and it is a mirror for the good things and the bad things.
i encountered a lot of these cases as i was writing my book. one important thing to note is that even with section 230, someone who has been defamed and harmed -- and these harms are real and devastating -- can still go after, legally, the person who created the content. but there are a few problems with that. the really bad actors post anonymously, so you can't identify them. they also might not have sufficient assets. the discussion with section 230 has been, well, if we can't necessarily hold the posters accountable, then do we hold the platforms accountable if they fail to take actions to protect us? host: we come back to the book, "the 26 words that created the internet." why was this so fundamental, in your mind, to the creation of the internet? why this specific provision? guest: the largest websites in the world, the largest platforms in the united states -- facebook, twitter, youtube -- think about how they could function if there were a chance of them being liable for the vast amount of user content out there.
they would have to have very different business models. the united states is the only country that has such a broad protection for platforms, and it also is home to many of the largest platforms. host: how does it work in other countries? guest: it varies. some have the default protections we have under the first amendment for distributors, which is that if you know or have reason to know, then you could be liable. that effectively creates a takedown system. in europe, there is a right to be forgotten, where there has to be removal of certain information that is determined to be a substantial privacy invasion that outweighs free speech. there are also hate speech laws that require the prevention or removal of hate speech.
there's a wide variety of approaches. then, other countries will even go further and prevent certain types of content from being posted that might be critical of the government. the united states is an outlier in terms of the broad protections that it provides to distributors. host: if you want to join the conversation, democrats, (202) 748-8000. republicans, (202) 748-8001. independents, (202) 748-8002. john is in statesville, north carolina, a republican. go ahead. caller: good morning. i would like for your guest to explain the two platforms i see used most when i wake up and turn my computer on, google and yahoo! right underneath it. explain the difference between the two platforms, if you will. and the reason i'm concerned is because yahoo! -- i think the lady that owns yahoo!, the ceo of yahoo!, i think she is a trump hater.
host: you want to know the difference between the search engines? caller: yeah. and why does yahoo! never ever -- never say anything good? every headline is a negative president trump headline. host: jeff kosseff. guest: yahoo! developed first, and yahoo!, in the late 90's, was the predominant search engine, and then google quickly took the vast majority of the search market share. what it sounds like you're talking about isn't just the search results, but also the news headlines. yahoo! actually has news staff, and they have a section that is still more like a traditional news service. that is where they probably have more editorial discretion.
google is much more focused on its algorithm, for both its search results and google news. so i think that might be what you are seeing -- that at least on some sites, they still have more discretion over the stories they are running. host: new hampshire, sabrina, a republican. good morning. caller: good morning. i would like to know -- i understood this is about liability and people that make negative comments about corporations and whatnot, so what is in place for the other way around? what if these people who are making these negative comments are correct in what they are observing? how does the sword swing the other way? do you see people and entities and investigative agencies that are trying to figure out what the truth is? are they even trying to find out what the truth is, or are we bolstering up corporate america once again to be able to escape accountability?
guest: the idea of section 230 allowing platforms to escape accountability is definitely a criticism that has been raised for probably two decades, from people who say the platforms are not transparent and they don't necessarily remove content that we know is false and harmful. so, i think the criticism you raise is something that has persisted over time, and it has probably gotten louder as platforms have become far more important. we have to remember that when section 230 was passed in 1996, we were talking about prodigy and compuserve and aol, which were important, but they didn't have the same reach as a youtube or twitter, or facebook or google. host: less than 10 minutes left in this conversation if you want to join. phone lines are open as usual for democrats, republicans, and independents.
go back to the discussion about amending section 230 and the history of why it has been amended in the past, and how often it has been amended. guest: there's only one substantial amendment. there have been technical changes, but the amendment happened from 2017 to 2018, over concerns about a particular site called backpage, which posted classified ads, but many of the ads involved sex trafficking, and some involved sex trafficking of minors. there were three 15-year-old girls who sued backpage for violations of various laws and went to a federal court, which dismissed the case under section 230 in 2016.
that got a lot of criticism. this is not just defamation, this is children being sold online. that caused congress to consider and pass an amendment, the first substantive amendment, which exempts certain sex trafficking claims -- civil suits and state criminal prosecutions -- from section 230's protections. host: was there anybody who tried to fight that on such a sensitive topic? guest: i would say the tech industry's response at the beginning, when this was proposed, was not great. they talked about frivolous lawsuits and plaintiffs' lawyers, rather than acknowledging that there are real harms happening. host: were they scared of the slippery slope? guest: yeah, they were. there are two interesting things to note. there has always been an exception to section 230 for federal criminal law.
a few days before the president signed this amendment into law, the fbi seized and was able to shut down backpage and indicted people involved in the site. that happened without the amendment. the second thing: you are starting to see platforms be very risk-averse. craigslist, for example, shut down its personals section, which you might not necessarily say is the end of the world, but it shows you the platforms are going to be risk-averse when you are amending section 230. host: new port richey, florida. a democrat. caller: i would like to ask jeff kosseff two questions. i remember when europe changed its privacy laws, about how people should be respected online. the other thing i wanted to interject into the conversation is that, on online content liability, we have a lot of breaches every day in the government. the fbi today was one of them.
what liability laws do regular people have to correct, in a timely manner, any information that is connected to those losses? actually, i would also like to ask you -- you, or the naval academy -- what do you think about all of this stuff with the dark web? really, most people don't even use the dark web. host: lots of questions there. guest: the first issue, about civility and the approaches in europe: europe is different from the united states in terms of its core legal values for speech. this goes beyond section 230. in europe, they have stronger privacy rights and view privacy as a fundamental human right.
in the united states, we have valued free expression over privacy. you have seen those values within the section 230 debate. in terms of the breaches, i'm not sure -- if you are talking about data breaches, i think that is outside of the scope of section 230, because companies can be held liable for data breaches. unfortunately, there's not a strong legal system in the united states for that right now. we have state laws and a fairly weak law that empowers the ftc to take actions, but there are a lot of calls to have much stronger laws at the state and federal level. you saw california do that. host: before we leave breaches, i wonder what you think will happen in the wake of the news making front pages around the country today, of capital one, the fifth biggest u.s. credit card issuer, saying monday that hackers had access to personal information of approximately 106 million card customers and applicants, one of the largest ever data breaches of a large bank.
guest: i think that will help to further strengthen the call for a strong federal data security law. we have some, but they are fairly weak. in that case, they said most of the social security numbers were not disclosed, but we have to have a broader conception of what constitutes a data breach, because there are a lot of breaches that are harmful that are not necessarily covered adequately by current laws. host: her last question was on the dark web. guest: this is the subject of my next book, which is about the u.s. approach to anonymity offline and online. this goes back to the issue of the united states valuing free expression. there are supreme court cases going back more than a half-century that have protected the rights of people to speak anonymously.
it started with cases that refused to require the naacp in the 50's to disclose its membership lists, and it goes to the modern day with anonymity protections for the internet. in the united states, we have this legal protection, and we also have technological challenges with services that protect anonymity, which much evidence would say is fundamental to the american ability to speak freely. host: to bring it back to section 230 before we run out of time, what is the timing on any potential new change to that law? guest: i'm not sure. politics is outside of my scope of expertise. i know the criticisms i'm hearing from democrats and republicans about section 230 are different. i'm not sure how you reconcile the criticisms that they are moderating too much and too little into a single bill.
host: the last question for you, from twitter, from john locke, goes to the larger question of trust in the internet. as somebody who studies cybersecurity and writes about it, how would you respond to john saying, "i have no doubt everything we ever say or type is saved. we will have the same system as china"? guest: i don't know if i would go that far, but i do know, especially in a lot of these recent breach cases, we have seen that companies do retain data for quite a long time. i think it is wise to be careful about what you put online. host: jeff kosseff is an assistant professor of cybersecurity law at the u.s. naval academy. the book he wrote is "the 26 words that created the internet."
>> i will note that the revelation about a facebook page, and the number of people participating in that, tends to give credence to the notion that there is a troubling culture at least among some of the officers. what i'm trying to get at is what you are doing to make sure that you are disrupting that culture and improving it. >> i would be more than happy to provide you an extensive brief on what we are doing on that matter as well. one of those posts that we all know about is horrendous. but i can assure you that overall, this is a very small group of border patrol agents. >> which is better investigated by cbp's o
