
tv   Digital Photo Theft  CSPAN  November 9, 2014 1:20am-2:32am EST

in the library, when somebody signs up to use the computer, it is to figure out how to put in their resume and apply for a job. this is essential to our economic lifeblood. the idea that we should take a blasé, hands-off, let-the-big-boys-handle-it approach is not one that we can afford. >> this was not billed as a debate, but it was a very good one. i want to thank you all for participating. [applause] next, a forum on internet
privacy, and after that, president obama announces his nominee for attorney general, and then a discussion on the election's impact on state and local government. next on "washington journal," craig hill looks at the attitudes that shaped the election results, thurber examines how divided government can work, and we will mark the 25th anniversary of the fall of the berlin wall with a georgetown professor and a look at the impact of a unified germany. as always, you can join the conversation on facebook and twitter. c-span's veterans day coverage begins tuesday morning during "washington journal" with an interview with verna jones, and then at 10:00, the annual uso gala. we are live at 11:00 from arlington national cemetery and the tomb of the unknowns. later, selections from this year's medal of honor ceremonies. actress jennifer lawrence recently stated that the exposure of a personal nude photo should be considered a sex crime. the congressional internet caucus advisory committee examined the legal ramifications of hacking private photos and so-called revenge porn. this is one hour and 10 minutes.
>> hello, i am the executive director of the congressional internet caucus advisory committee. the program is in front of you, and the twitter information is on there, as is the information on the speakers and their twitter accounts. we are hosting this event with the congressional internet caucus advisory committee. on the house side, the cochair is congressman bob goodlatte, and the senators are john thune and patrick leahy. we are in their debt for supporting this program. they do not agree on every issue, but we are thrilled that they agree that the internet should have a forum where we can debate these issues. i want to thank them and our moderator today. she is a cybersecurity reporter, she has covered this over the past couple of years, and she is well situated to moderate our panel today. her twitter account information is on the program as well. take it away. >> thank you, and thank you to the net caucus for having me here today. this is a very interesting topic, which we will be diving into headfirst. to introduce our panel: from here on down, we have mary anne franks. to her left is the director of the free expression project. then a columnist at yahoo! tech
and david post. all these fine people have a lot of expertise in this topic from a lot of different angles, which is where i wanted to start off today. one of the most interesting things about this hack of the celebrity photos is that it raised different issues for a lot of different people. as a cybersecurity reporter, i covered it in terms of the password security on the cloud and what the technical aspects of that hack might have been, but it raises a number of conversations, from misogyny on the internet to what actually is the nature of the crime that occurred, whether you look at it from the perspective of a sex crime, a hacking crime, or a first amendment issue, and we will touch on all those different takeaways today. how i would like to begin is if the panelists could go down and say what for them was the one or
two big takeaways from this hack and what is the most important thing to look at about this. >> i think as good a place to start as any is with jennifer lawrence calling what happened to her a sex crime. a lot of people were taken aback by that characterization, because as a matter of law that is not true. what it highlights is an invitation to rethink what we think a crime is, what we think a sex crime in particular is, and to think about ways we can recognize it as being such. i think it is interesting to hear from such a high-profile victim that her own sense of this violation of sexual autonomy, humiliation, and exposure was such that she would call it a sex crime. the perspective that i would take on this is to consider why we criminalize certain types of behavior, where we start drawing the line between bad behavior and behavior that we think needs a response from criminal law. and i would invite us to think about why we think the criminal law is important. it is not a narrow focus, but a social expression, a condemnation of certain types of harms that are so serious that one of the only ways we can express it as a community is to say this should be against the law. and think about the particular nature of what happened to lawrence and other victims in terms of the daily suffering and humiliation they have to experience, which they can never get back; there is no way to undo what has been done; the harm in these cases really is irreversible and ongoing. what i hope we can do to frame the conversation by looking to a
perspective of the victim is to think about why we might care about the fact that such sexual humiliation has become an industry, what our responsibility as a society is, and, if people are concerned about having a free and equal and open internet, what we should be doing in response to that. >> thank you. i thought it was interesting how professor franks was talking about expressions about this behavior, because that to me was one of the major differences i saw in the response around this most recent exposure of celebrity photos, compared to how this issue and the nonconsensual disclosure of these images have been treated over the years. five or several years ago, when many of us seated at this table first started talking about this issue, it was difficult to get people engaged on the question. there was not a public conversation about how this is
being used as a way to try to go after women, to harass them, to silence them. to see that shift in the public conversation, where there is much more willingness for major media outlets and for people engaging on social media to be talking about the other side of the story, to say, no, people should not be going and following these links, that the information might be out there on the internet but we do not have to see it, and to treat what has happened to the people whose photos have been exposed as a real harm that has happened to them. i think it is a good thing that we are having much more of that conversation happen in public, and that society is coming to appreciate the real harm that is happening to women when they are targeted in this way. of course, the concern that i see, coming from a first amendment and open internet background, is wanting to ensure, if there are proposals on how to take a stronger response to this, that whatever these proposals are, they are not so broadly crafted that they end up pulling in a lot of protected expression as well. it is very difficult to craft a law that makes a crime of disclosing information in a way that only gets at a bad or malicious disclosure and does not also sweep in a lot of real and vitally important speech. i hope one of the things we can focus on today is looking at the existing laws that identify the kinds of harms that happen here, whether it is a person trying to inflict distress on another person, a person launching a campaign of harassment against someone, or whether the federal computer fraud and abuse act can cover the hacking aspect of this. there are ways we have addressed the harms that can
come from this kind of behavior in existing law that do not entail focusing specifically on the speech aspect of it. >> my first reaction was that there was this "it is just a bunch of celebrities in trouble" response, which is unhelpful and stupid, because i am sure everyone in this room has pictures they would not want on the entire internet, and some of those are probably on facebook. that is a better way to look at it, because calling it celebgate was one of the stupider -gate words put around. if you want to keep your information safe, are there tools available, and do they help? in apple's case, they had a weak implementation of two-step verification. even if you had turned it on, it did not protect icloud backups. i am not clear on what is getting backed up and how to control it. it is an opaque system. you have this case where these people did not think they were putting their pictures on the internet, and it is not clear, in a lot of cloud services, where your data went. there was a story in "the washington post" earlier this week about a photographer from johns hopkins who thought this was only on his computer. we have laws against unauthorized access, and that covers part of this sort of thing. we also need to remember that not everyone is going to go through the bother of two-step verification, but it should be there, it should work, and you should know what it is protecting and what it is not.
>> i guess i have less to say about the specifics of the lawrence incident. one of the other things that is within the purview of this panel is a related, broader question about what professor franks called the epidemic of these sexual photo sites on the internet, revenge porn sites, those kinds of things, which is a serious social issue. my thoughts turn, with emma's, to the first amendment, first of all. as she said, crafting a prohibition that would survive first amendment scrutiny with respect to much of this material, even if we think it is harmful, would be quite difficult. probably not impossible, but difficult, and it would require some care to make sure it does not sweep in a good deal of protected material. i got involved in this when i had a student who was working on a project on copyright, on possible copyright remedies around these revenge porn sites: can you take down photographs based on copyright claims? i spent half an hour, 45 minutes poking around those sites about a year or so ago. there is a good deal of material on there that is clearly protected speech. there is some material that may not be. drawing that line would be challenging, to my thinking. that is one thought i had. in the discussion about these issues, and there has been a
good deal of discussion in the legal academy at least about what to do about this and what kinds of remedies we can provide, the conversation and debate have often moved quickly to the question of website operator liability for hosting these photographs. there are existing remedies, which we can get into more in the rest of this session, that may provide relief to people who have been harmed, against the individual uploaders of the private photographs that are being posted. but section 230 of the communications act has been construed to protect the website operator from being drawn into that tort liability, because it immunizes the operator against a broad range of tort liability, including this. so much of this discussion has come around to people arguing about whether 230 should be repealed completely or modified to allow actions against website operators. it is a very important internet law problem, because many issues have this feature, where the intermediaries, the website operators, typically are helping to spread this harmful information, and yet federal law immunizes them against liability. i hope we can get into those 230 issues. >> great. as we can all see, there is quite a bit at play here. perhaps we can start with -- and it is difficult in this case because we are sort of going from a very specific instance, where jennifer lawrence had
private photos gotten into, and her photos got on the internet. that is a lot different from the revenge porn situation, where the photo was initially given with consent and after that something went south. that is very different from someone hacking into a private computer when you may not have stored something on the web. it is different if someone gets the password because you used your dog's name as a password versus sophisticated phishing or malware. so there are a lot of different cases that raise these issues. generally speaking, what are the remedies for people who feel that their private images and private data in the digital world have been exposed to the internet? what can they do now under the law to get relief, although they may never be able to get it back? >> i think it is important to focus on the fact that jennifer lawrence's situation, and the other celebrities', is different from these other types of contexts, but it is also important not to make too much of the differences among these instances. if we look at it from a more general privacy perspective, it should not be a difficult intuition that when people disclose intimate information to one party, they do not expect it will be given to another party. whether that is disclosing it to your partner or to the cloud, the obvious way to look at it is: don't we have some kind of contextual sense of privacy? when you tell your doctor about your symptoms, you expect your doctor will not tell anybody else about your symptoms or share the pictures of your medical exam. you have plenty of situations where we can think about it not related to the charged issue of
women's naked bodies, but think about all the ways we expect our information to be kept confidential within a relationship even though we have given it voluntarily. when we consider it that way, it is helpful to think about what we do in other contexts. do we protect people's credit card information, social security information, home addresses? do we protect companies' trade secrets? we have criminal penalties when people step outside of these particular contexts, and it is useful to think about why we should or should not apply those remedies here. it is true we can come up with ways for victims to use copyright remedies, but copyright is not going to work for the vast majority of victims, who have no resources. it takes time to figure out the takedown process. you need to have a lawyer backing you up. even when you get it removed from one site, it will pop up on 20 others. copyright is really not an effective solution for the vast majority of victims. it is not just relationships going sour, but actual ongoing domestic violence situations in which these images are being used to trap people in relationships. we have had plenty of sexual assaults that are being recorded and broadcast. this is a big category of intimate material that is out there. the idea that there is any kind of lawsuit or copyright remedy for it -- i think it is abstract given what actual victims' experiences have been. i also think it is important, again, as we are trying to think about the adequacy of legal remedies,
to think, yes, completely about first amendment values and to think about the goals of section 230, and in that thinking consider how much of an effect, i would say a disciplinary effect, these types of harm are having on women's speech. how many women are afraid to commit themselves fully to their careers or to online discourse because they are afraid that this is what is going to happen to them, that this is the punishment that will be given to them? the laws will not help you unless you have tons of money and tons of time, which many of these victims will not have, because they have been fired from their jobs or kicked out of their schools. that is a sense in which we have to take seriously how much this is affecting not only individual women, but how the epidemic is using the threat of this type of behavior, and the actual use of this behavior, as a way to shut women up and drive them offline. as a speech matter, as a section 230 matter, in the interest of fostering equality, we should all care about that. >> if i may disrupt the order a bit -- david, if you could go more in depth on your opening statement. what are some of the ways the law has tried to grapple with some of these issues, and in what ways have people looked at adapting the laws from before the internet was what it is today? >> there is the hacking side of the problem. it is one that may well have been, in this specific instance, a violation.
accessing a protected computer without authorization gives rise to civil and criminal liability under the federal code. obviously i am not giving anybody legal advice or taking a position on whether or not it is one, but that is certainly one avenue. there are also, on the tort side, as several people mentioned, a number of state law tort remedies for some of this behavior, intentional infliction of emotional distress for outrageous conduct being one. i guess i would say a word in response to what mary anne said about the usefulness of copyright: i do not think it is negligible. let me say the copyright act is one place in the federal code where aggrieved parties can quickly arrange, without a lawsuit, to have material taken down from the internet. for all the reasons she mentioned, it probably covers a small subset of the problem, but i think it is not a trivial subset of the problem where people can in fact use it. at least there is a remedy that is useful in terms of removing material that, for one reason or another, they believe they have a claim on. you send the notice, and the sites more or less give you some procedure to follow; if they want
to claim the copyright immunity, they do so. these notices get sent millions of times a day, and this operates to actually remove the material from the site. and one other very quick comment i want to make about the notion, again as mary anne was saying: does the law have to wait until something bad happens before providing a remedy? i think in this context the answer may be yes most of the time, because a lot of what we are talking about falls into the category of protected speech. there is a very serious problem with a prior restraint, a rule that says you cannot put it up in the first place. that would avoid much of the harm, but it raises even more serious first amendment problems than ex post regulation of this, which raises its own problems. that has to be thought about more carefully. >> and, rob, you mentioned the computer fraud and abuse act. i am reminded of several years ago when sarah palin's email account was hacked, from what we understand, in a fairly similar fashion. >> with hers it was not guessing the password. it was the password recovery question. if you have a wikipedia page about you, do not have the answers to your security questions on that page, which hopefully we will all be in a position to have to worry about. >> the point being, we put a lot of information on the internet and do not always think about the levels of security. what are the levels of security for information on the
internet, and how does the law protect those right now? >> the text of it says if you use a computer in a way that was not specifically authorized by the people who own it or control it, then you can be charged under this, which means it can criminalize a lot of the security research that needs to be done to solve the problems we are talking about right now. if a web page is coughing up data because you entered the right input, that could be a crime, even though you have to do that to prove to the owner of the page: you have a problem here, fix it. i would say it is not that the existing laws do not protect against this, but they also sweep in a whole bunch of other stuff and criminalize activity that the people in white hats need to do to stop the people wearing the black hats. >> it can also be hard for prosecutors to figure out what charges to bring. there was a fellow who put a laptop in a closet to download academic research to make it publicly available. he ended up committing suicide. >> moving on, now that we have some sense of the lay of the land, what can be done to change the laws and address some of these issues? emma, you mentioned a little bit ago that when laws are being crafted, it is very important to understand what you are sweeping in unintentionally, and at some point i am sure you can weigh in on this as well, mary anne.
if you can give a rundown on what has been tried and where the pitfalls come up. >> sure. something that professor franks has been working hard to figure out is whether there is a way to craft legislation that will allow going after the identified criminal activity you want to target without sweeping in a whole host of other speech, and there are key categories you have to think about. first, what kind of content is covered? the content that we are talking about is generally content that is protected under the first amendment when it is created. it is a person taking a photo of themselves or of a partner of theirs. the nude image of a person is constitutionally protected speech, and there is no crime involved in the creation of the image at the outset. so there is the difficulty of defining the category: is it sexually explicit imagery, is it imagery that reveals nudity? there is a lot of back and forth about the exact nature of the content. it is difficult to define, because there is a fair range of the sorts of photos we could all think of of ourselves that, if exposed to others, we would see as a harassing sort of effort. you are trying to define the category of covered content so it is not so broad as to include things like a photo of a woman breastfeeding or some other kind of nudity that you might very well be able to capture taking photos in public places, and really trying to focus it on images that are in this kind of
sphere of intimate exchange that professor franks was talking about. the other big question is who is potentially liable under these bills. it seems clear that you want to be looking at the person who uploads the photo without the consent of the person depicted in the photo. there is also a question of how these are drafted: are they so broad that they start sweeping in the website where the photo was posted, or any person who looks at that photo, who may or may not know that the photo was uploaded without consent? i think there is a virginia state statute that went into effect this summer; the first prosecution under that law is under way. it is a relatively narrow law that includes a requirement that there be an intent to coerce, harass, or intimidate a person by displaying their image, and it tries to define exactly what the content of the sort of image would be. it is an attempt to draft a fairly narrow line. i do not know if it has been challenged yet under the first amendment. the state of arizona also passed a law, the so-called nude photo law, trying to restrict the ability of people to share photos of others without their consent. it would make it a felony. there are no exceptions for newsworthiness. there is no real acknowledgement that if somebody poses for a photo for an art exhibit, they have clearly given their consent to the person taking the
photograph to be included in the exhibit. if someone else then posts that exhibit online, they have not gotten consent directly from the model depicted. it is implied; it is part of the process of being a model in the art exhibit. but under the letter of the law in arizona, that website posting from the gallery show could be in violation of the law. it is done with the best intentions of wanting to get the consent of people depicted in photos before the photos are shared, but not really done with a view to just how much sharing of images happens in a way that does not violate that initial consent but also does not involve direct, explicit consent. these are the kinds of things we have to think through if we are asking whether it is possible to craft something that really is very narrowly tailored and anticipates all of these unintended consequences. >> mary anne, to pick up on that? >> this is a difficult task, because narrow drafting and clear drafting is always difficult. i am sure everyone in this room knows that. you can start out with the best of intentions and end up with something that is not that great. so that is certainly true, and the organization for which i serve as the vice president has actually published a guide for legislatures trying to make clear what the pitfalls are. we have been making these points for quite some time. there needs to be a very narrow definition of what is considered sexually explicit material. we need to be clear about who it is that is responsible for this type of criminal conduct. we need to have certain exceptions, which can include things like law
enforcement or newsworthiness. but there are a couple of things on which we might differ. that is to say, the arizona law has problems, and the aclu is now suing over it. there is no public interest exception in the arizona law. that was a mistake, and it is one they are probably going to fix. but it is not at all clear that there would be the problems that the aclu and others are trying to make it out to be. really, anything that we are talking about as a modeling shoot or photography exhibit, none of that will be a problem. the claim that you have to get actual consent from every single person every time is also not true of the arizona law, because the law says: when you knew or should have known that the image in question was disclosed without consent. that is a pretty good standard to consider, especially if we start thinking about revenge porn sites. if you are on one of those sites and it says she has no idea that you are looking at her picture, and you decide to share it, you have a good sense that this is a nonconsensual image. that is the type of behavior we are discussing. as to the question of who should be responsible, as many of you know, because of section 230, which allows for a lot of immunity, 230 will always trump; none of these state criminal laws pose any threat to it. no state criminal law can preempt 230. but as many of you know, section 230 is not absolute. it does not apply to copyright. it does not apply to the electronic communications privacy act. it also does not apply to violations of federal criminal law, which is why google, facebook, and twitter all have to care about child pornography laws: section 230 does not write them a blank
check for that. what i want to emphasize here is that while it is true we have to care about unintended consequences, and we have to be worried about that, that is true of every single law. there is no such thing as a law that does not sweep in something we are probably not going to like. the question has always been, not just in the first amendment context but also in criminal law generally: on balance, are we accomplishing more good with this law than bad? and the suggestion that any time you tell someone they might not be able to disclose what they want to disclose, that means a disaster for us as a democracy or for the internet, has not been proven true in many contexts. many people were convinced when it was passed that it was going to shut the internet down. it looks like the internet is doing ok. the same thing is true of child pornography laws and gambling laws. the same thing is true about trade secrets, about identity theft, about voyeurism. there are all kinds of situations in which we have accepted the fact that disclosures of lawful information can be criminalized. if we think about the identity theft context, none of us is criminalized for having a social security number or a credit card number, but if someone takes that information and uses it in an unauthorized way, it is criminal. the same thing happens in trade secrets. we are now dealing with a type of conduct that is primarily directed at women, and when we try to treat it the same as we would treat other types of sensitive information, perhaps we are resistant. that maybe should not be the way we approach this. we need to think about what we consider to be the social value
of saying you cannot actually disclose certain information, unless we want to live in a world where there are no identity theft protections, no medical records protections, no trade secret protections, no confidentiality protections at all. we are living in a world in which we restrict speech all the time. the question is when it is worth it to restrict that speech or not. some people will say that is not what the first amendment does, but it does do that. the supreme court has said we have to consider these types of harms and consequences. but many times people do not bring up first amendment questions. how many people think that spam is a first amendment issue? it is kind of a rare thing. how many think that disclosing social security numbers is a free speech issue? the question is whether the values we are able to protect and support are more important than the few things that may happen otherwise. i do not want to underestimate unintended consequences, but no law can ever accommodate every single unintended consequence. there is going to be some measure in which we are depriving some people of something, but we cannot simply say we will do nothing because we do not want to hurt other people. again, i think it is a question of how we traditionally treat privacy and intimacy, and why we are holding off on doing that here. >> david?
>> as a very quick response: obviously, this is a contentious difference of opinion that will not be resolved in a 50-minute program, but just to focus on what was said about looking for ways to craft a law that does more benefit than harm, i would think, and this is a view supported by lots of authority, that that is precisely what the first amendment does not ask. it does not say to weigh the benefits against the harm. it has a higher threshold in cases involving the suppression of speech. merely showing that the good outweighs the bad is not enough; the first amendment puts a thumb on the scales of that determination, and i think it does make it more difficult. it is not simply enough to say that this is preventing harm when the harm is speech related. we require more. we require more precision in the drafting of those statutes, to do everything possible to ensure that protected speech is not swept in. if it were an economic crime, we would not have to be that precise. mary anne is of course right that no law is perfectly precise and gets 100% of the bad guys and 0% of everybody else, but in the first amendment context, we require efforts to at least move in that direction, and that, i think, would be difficult in this context. not impossible, but very difficult. >> section 230 came up again. i do not know if you want to pick up that conversation again, as well as how that applies here. >> just to say that section 230 is one of congress' great legislative achievements.
i am prepared to say that. i think it is, in large measure. there is an active debate, of course, about whether this very broad protection of intermediaries is a good thing or a bad thing. one thing to consider is that tweaking that law a little bit, an exception for this, an exception for that, probably makes it go away in rapid order; then the immunity has disappeared. there are a lot of claimants that would like to see an exception to section 230. lots of people who have been defamed, whose privacy has been invaded, people who have been scammed, who would like to see an exception for their harm, as it were, carved out of 230. and they have a good argument. but then the floodgates will be open, and 230 will largely disappear.
2:05 am
>> our well-connected friends in the entertainment industry have suggested all kinds of tweaks to the dmca that would pose all kinds of liability issues for websites. yes, they have tried, and it has not worked. i'm a little more interested in how we can use laws already on the books, that prosecutors can already go to court with, to make life as painful and expensive as possible for the people who went after these icloud users and other like-minded creeps.
>> i just wanted to build out a little bit on the section 230 point, to give an example of why those of us who are such staunch defenders of that law see the role it really plays. imagine a standard that asks whether a website operator knows or should have known that a photo was shared without consent. say i run my own photo hosting website. i've created what i hope will be the next instagram. i've got something way better than filters for photos -- i don't know what it is, but i'm running my own site. a law on the books says i can be taken to court if someone claims that i knew or should have known that a photo uploaded to my site was an image of another person shared without consent. currently, i can immediately get out of any lawsuit that somebody tries to bring. thousands of photos are uploaded to my site a day, it's doing very well, and somebody says, "a photo of me is on your site, and you should have known that i did not consent to it" -- under 230, i do not even get dragged into the court case. it is very clear i cannot be held liable for this, and i can go back to my business of running a photo hosting website. if the law changed and there was this question of whether i should have known that this photo was shared without consent, then we are in a world where i, as the operator of the website, have to go to court. i have two or three employees, and now i've got to hire a lawyer. i'm operating on thin margins, and now i am paying legal fees because i have got to go and
defend myself, even though there's no way i could have known. even if we are talking about good-faith operators who really had no knowledge and could not reasonably be considered to have known that these kinds of photos were on their site, they will still have to go to court and defend that, and that is one of the real burdens this kind of liability framework would put on operators. not even thinking about the giant internet platforms that deal with millions of pieces of content a day, and what knowledge standard they would have about tens of millions of photos posted on their websites -- even just thinking about small companies, two or three-person operations, it will be vastly more complicated.
>> the flip side is that fighting lawsuits is not something that people without lawyers on retainer normally do. i guess it is a larger issue that we have made the law something that only people who can afford to hire lawyers can be good at.
>> to respond to that specifically: we do want to balance the harms. it may be true that the supreme court has gotten into the mode of saying we do not do balancing tests, but concerns about overly broad laws have to be real. there are harms out there that can be addressed by this law, and you cannot simply say there could be all these things that might happen. there could -- that's true -- but they have to be real harms, and they have to be weighed against the legitimacy of any statute. if we are looking at the case of a poor site owner, it's true, there's no reason to say that will not create issues. of course there will be, but we also know that there are actual, current harms -- thousands of people who are being affected by this, whose lives are literally being ruined. that's a real harm as well. to say that we are not sure what will happen to these particular site operators -- it is a concern, but it cannot be the only concern. as far as section 230 and people waving their hands saying that they want carveouts as well -- it already has carveouts. it has already been made clear that it does not apply, for instance, to copyright.
whose interest does that serve? that has always been a matter of interpretation -- always a question of who we are going to say gets protection and who does not, which interests are so valuable that even section 230 does not apply. the first amendment does not apply in certain circumstances either. i think it is an invitation for all of us to consider: why is that the case? why is it that we treat section 230 as natural? and as long as we're going to talk about section 230, just one final note: the goals of section 230, written into the statute itself, include ensuring vigorous enforcement of federal criminal laws to deter and punish trafficking in obscenity, stalking, and harassment by means of computer. that is in section 230 itself, and that's what a lot of people seem to forget. it's not just about letting intermediaries do whatever they want. there are certain values and goals embedded in it, and we would do well to ask if they are being served today.
>> i would love to keep asking questions, and i will, but i want to give the audience a chance, if anyone has any pressing thoughts they would like to address with the panel. right here in the front.
>> what is the flaw in the current law? there is defamation, intimidation -- whenever we have a high profile case, there's a desire from people who think they can solve the problem, or people who were injured, to want a new law specifically for that issue. i don't understand what the flaw is with the current state of the law.
>> so the question is, what is the flaw with the current state of the law? why do we need a new law in this case? mary anne, do you want to start?
>> i would be happy to. this civil rights work did not start with any high profile case.
the jennifer lawrence incident is not the beginning of this. we started two years ago, when average people were being affected by this. this has been happening to private citizens for years. this is not some high profile case we are responding to out of a sense that now that it has happened to someone famous, we care. we care about this because the experience of victims has been that none of these laws work. if the image is out there, it was not necessarily put there by someone trying to harass -- many of them are not. just to give you a concrete example, just a couple of weeks ago, police officers were found to be arresting women for drunk driving, taking their phones, and sharing pictures from them with other police officers. the women did not even find out about it. and ask victims how many times they get turned away by lawyers -- there's no money there, no reason to pursue this. for all these reasons, we are responding to an issue where thousands of victims have come forward and said that they cannot get any relief from the law. i am not going to second-guess the victims, because they are the ones experiencing this firsthand.
>> but if law enforcement is telling women, "it's your own fault this happened to you," they are wrong. i know you agree with that. that is not only morally wrong and a faulty understanding of what it means to take your own photo or share photos in an intimate setting, but it also probably indicates that they do not understand the laws that do exist.
>> they might not be too effective in enforcing any law.
>> right. by no means -- i hope no one gets the impression -- i do not know anyone who thinks criminal laws are the silver bullet to this problem. there is no silver bullet. we're asking tech companies to
rethink their internal policies, asking people to engage in educational programs to inform people about the practice, and we are engaging with law enforcement and others because we want them to understand the stakes. by no means is it a silver bullet, but much like in the 1960's and 1970's, when domestic violence was not considered a crime, when sexual assault was largely not considered a crime, especially if it was your husband who assaulted you, there is a social and legal importance to recognizing that this is a harm that should be addressed by the law, at least in theory.
>> i was going to point out that we have a range of laws on the books that might be useful in different cases. whether it is a privacy tort -- invasion of privacy, public disclosure of private facts -- going after it from a hacking angle, a copyright remedy, or intentional infliction of emotional distress, there's a mosaic of laws out there that are the way society has expressed that it is wrong to intentionally cause emotional distress to another person, and there are laws against that sort of thing. it's not going to mean that every single instance of this kind of exposure of a private photo is covered. there are going to be gaps, where a case does not fit into every single aspect of a current law. but if we try to craft a new crime that is defined expansively enough to cover every single instance of an exposed photo, we are absolutely going to sweep in other kinds of content, other kinds of expression, and that law is not going to survive first amendment scrutiny. that's the challenge we are facing. there's no silver bullet, and it's very difficult to figure out how to draft a law that can express disapproval of this conduct without running afoul of the first amendment.
>> you mentioned getting companies to enforce their own
policies. it has been interesting with gamergate: many have said that facebook has gotten better at doing this, but twitter has not. they did not have a form to report abuse until this year, which is kind of insane for a social network that has been around since 2007. i hope they are taking it more seriously, because they can do a lot. twitter is not the public internet. they have their own rules, and they are allowed to change them to make it easier for people who are being harassed, or people who see others being harassed, to call out the offenders, and they have not done enough.
>> going back to our case example for the day, a lot of these images were circulating for a long time on websites like 4chan, and they sort of blew up when they hit reddit, which had a thread that really made some of these images go viral before it was shut down. it raises a question i've heard put many times in different ways: the most wonderful thing about the internet is also its greatest flaw. it is open for users to use the way they want to, and it's difficult to say that a website based on the idea of people having open forums to share and discuss what they would like also needs to be responsible for making judgment calls about when something crosses the line. so what are some of the difficulties with that? perhaps you might want to jump in here.
>> i'm going to step back and say, the biggest problem with the internet is the people on it.
>> to that point, though, i'm glad you brought up twitter as an example, to dovetail with the point about intentional emotional distress. that's why i think we need to rethink the emphasis on emotional distress. when people are engaging in
these activities -- and this is not just some of the cases, but actually a pretty large number of them -- why are people doing this? because they think it is funny. because they think it is entertainment. not to cause emotional distress. why do we hold that that is the one thing we would penalize? why is this person any better? if he is doing it for profit, if he is getting ad revenue, why would we say it's totally fine to do it for that, just do not hurt her feelings? that seems like an odd emphasis to make, given the public nature of this humiliation.
>> don't you need to have, as part of a prohibition -- maybe i misunderstand, but don't you have to have some reference to an improper purpose? you're not saying, i don't think, like the arizona statute, which is clearly unconstitutional, that you cannot post a picture of someone without their clothes on --
>> that's not what the statute says.
>> and it is not what you are saying. the difference is, you do have to focus on the improper purpose.
>> if by purpose you mean causing distress, no, it does not have to be focused on that. the motive for why someone does something -- the motive for why someone spies on you in your bedroom -- why would that matter? whether they do it because they think you are funny looking or because they think you are arousing should not matter. we look at different categories, and we are back to the theme of jennifer lawrence calling this a sex crime. think about the way sex crimes tend to be worded. we do not ask whether a sexual assault distressed the victim; we think of it in terms of consent. and again, this is true of our identity information and of our other forms of privacy. do we only criminalize
disclosures of medical records when you intend to distress me with them? no, that is not even part of the statute at all. same thing when it comes to social security numbers. "i just thought it would be funny to put your social security number out there. i did not mean to hurt your feelings" -- no one cares. when it comes to certain types of intimate information, the motive for why someone is disclosing it should not be the point; it's the lack of consent to do so. i think that is something that is becoming clearer to us as a society -- that we have serious, deep problems with our understanding of sexual consent in this country. we can see this in how many sexual assaults are committed every year, but also in that we seem to take it as a given that a woman, especially, has consented to the use of her body for sexual entertainment or enjoyment. i think it's time we started to rethink that. any other questions?
>> i believe this was referred to as a hacking, but wasn't this really an instance of phishing rather than hacking?
>> social engineering is still hacking. it's the easier kind, often.
>> it involves folks being duped into giving up their passwords, not necessarily a hacking of the cloud.
>> it's unclear exactly what went down. apple has come out and said that their systems were not hacked, which is to say that apple writ large was not hacked. they did not rule out that individuals, through sophisticated techniques -- whether it was social engineering, whether it was phishing -- were able to get passwords for individual accounts.
>> two things about apple: they are not generous with specifics about their products in cases like this, and they have a history. their device security -- everyone i know who has looked at touch i.d. says it is great. their cloud, they have had a lot
of issues with it.
>> to that question, does that matter? is that a significant distinction? we talked about how there's a difference between this case, where you have perhaps a violation of the cfaa -- where does the question of how the image was obtained come into play?
>> thinking back to what mary anne was saying, if you think consent is the fulcrum, the line, then whether you have hacked into someone's account, or have a photograph that was sent to you, or have access to that account gotten in a perfectly reasonable way -- all of those would matter and would have to be evaluated. i mean, there is a much larger debate here about the role of consent generally, with respect to information on the internet. there is an active international debate over the so-called right to be forgotten in europe -- there are various european laws where, if you no longer consent to have information that has been published about you, you can sort of withdraw the consent and have the website delete information that may be floating around about you. again, not to beat a dead horse, but this is familiar landscape, in a sense, for the first amendment debate, which has sort of been in the background of this consent, privacy, and information debate for many years. what will it do to the free flow of information if you have to show that you have consent in some form for passing on a piece of information?
this is related to the idea, i think, that people should own the information about them -- that it should be a property interest, even, and therefore people have to come to them if they want to publicize various things about them. that raises serious, very difficult issues that we could call free speech or first amendment issues, because it is very difficult to evaluate whether consent has been given in many circumstances, and to work out how you demonstrate consent, and it would have a very serious impact on the sorts of things you can say. can you tell people you saw me at this thing in the rayburn office building? i use this as an example about owning information: if i own that information, you cannot. it is an extreme example -- nobody suggests we should have such a law -- but that is the issue, i think, with respect to balancing the free flow of information on the one hand against reasonable requests for a showing of consent with respect to some information on the other.
>> i think that is right, and it's one of the reasons why i am actually optimistic about this particular type of material, because there seems to be a fairly easy way to fix this. you want to disclose somebody's image? ask them to sign a form, and you can disclose away. make it easy. we have something like that when it comes to modeling releases. we have it when it comes to medical records. if you want to submit this information, and you think it is consensual -- because that is the only principled stance to take -- make sure you have documented evidence. we can fix this. this is not nearly as hard as things like the right to be forgotten or a general question about what people can say about you. it's very specific and can be resolved through paperwork. i saw one more hand right here.
>> given that there are already takedown regimes for child
pornography and other prohibited content, would it be that much more burdensome to require search engines, facebook, and other tech companies to also take down revenge pornography? do you think it would possibly impede the growth of small tech businesses?
>> there are a couple of things you really have to keep in mind when you are talking about some kind of takedown regime. first and foremost, what a notice-and-takedown regime does is give a person a mechanism to tell, say, a website host to take down someone else's content -- to take down something that was uploaded by another person. this is a mechanism that has been helpful in taking down infringing copies of movies and songs, but at its heart, it is giving a person the ability to say, "take down what that other person has uploaded." the potential for abuse of these systems is very high. when you look at something like the copyright takedown system, there are a number of safeguards built in, based on what you have to include in a notice. you have to identify yourself, including contact information. you have to attest that you are the legitimate owner of the copyright. the person who uploaded the content originally has the ability to push back and say, "no, this is actually my content," or "i have been making a fair use of this copyrighted work," or what have you. they can file a counter-notice, and then the website puts the information back up online and leaves them to fight it out in court. it's not as simple a mechanism as giving someone an easy form to fill out, and the information
comes down, and you are set. a lot needs to go into figuring out how to construct this takedown mechanism so that it is not so vulnerable to someone saying, "i don't like what that person said, so i'm going to file a takedown request and abuse the system." one of the real challenges we have to think about when we talk about questions around nude images is the sensitivity and the privacy interest that the person depicted in the image might have. if it is your photo that has been posted online without your consent, you will want to file a takedown request and get it taken down, and if you have to identify yourself in that request, that could cause privacy concerns. on the other hand, if you have uploaded your photo under a pseudonym -- you're happy with the photo being out there, but you do not want it connected with your real name -- and someone else is trying to abuse the system to get the content taken down, your ability to respond requires you to disclose who you are. so there are some complicated issues in thinking about the vast range of nude imagery that is available on the internet. some of it is this kind of nonconsensual posting, but a lot of it is uploaded pseudonymously. we really have to take that into account.
>> just to follow up a little bit, i think a notice-and-takedown regime is worth exploring at this point. the existing notice-and-takedown schemes, particularly the copyright one in section 512, put the burden on the aggrieved party. the law does not say google or facebook or twitter has to take stuff down just because it's copyright infringement; they have to respond to the copyright owner's identification of the infringing material, which i think is very important
and has been very contentious, so they are working that out in the courts. but they have more or less come to the resolution that it is the obligation of the aggrieved party -- and it's not a trivial obligation -- to find the material and send the notice in, at which point the process kicks in. there are all sorts of protections. you do have to be careful about allowing the system to be abused; if it is too easy to submit a takedown notice, people will use it for purposes that have nothing to do with the harms it is trying to protect against. but in all of those, the devil is in the details -- you all should know that. the copyright takedown regime, if one wanted to go in that direction for these sorts of problems, i think would be worth looking at carefully, to see how well it has worked and what hasn't worked about it. hundreds of millions of copyright-infringing files are taken down weekly under section 512. i know the copyright industry has gone crazy about it because they have to go find the material -- they don't like that -- but on the other hand, it has done the job pretty well. it has provided a process at scale, and scale is always important when you are talking about the internet. whatever we're talking about, we are talking about millions of it, and that scale has allowed the automation of takedown while still protecting the people who have uploaded, giving them an avenue to say, "i did not post it," or "it's not infringing," or whatever their argument might be. that will be worth looking at carefully, to see how it could be modeled to work on this problem. it might be a useful avenue of approach.
>> you mentioned search engines. we should be careful about going too far.
somebody like youtube can run content id, but they have a known universe of copyrighted material, supplied by the entertainment industry, that they can match uploads against. there's no such thing when it comes to people's private photos, so trying to do a general search and match will not work. webmail sites can do automated screening against child pornography because there is no such thing as consensual child pornography -- that's why those images are illegal -- and there is a hash database assembled by the national center for missing and exploited children. they can compare the hash of an image, a mathematical shortcut for it, to what is in that database. doing that for the broader universe of private photos -- that's not going to work.
>> just one clarification --
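[editor's note] the hash-database screening described above can be sketched in a few lines. this is a minimal illustration, not any provider's actual implementation: it uses a plain sha-256 digest and a made-up set of known hashes, whereas production systems such as photodna use perceptual hashes that survive resizing and re-encoding, which an exact digest does not.

```python
import hashlib

# Hypothetical database of hashes of known prohibited images.
# In practice such a list is maintained by a central organization;
# this entry is just the SHA-256 of the bytes b"test" for illustration.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def image_hash(data: bytes) -> str:
    """Return the 'mathematical shortcut' for an image: here, a SHA-256 hex digest."""
    return hashlib.sha256(data).hexdigest()

def is_known_image(data: bytes) -> bool:
    """Screen an upload by comparing its hash against the known database."""
    return image_hash(data) in KNOWN_HASHES
```

the limitation raised on the panel is visible here: matching only works against a known universe of hashes. a private photo that has never been catalogued hashes to nothing in the database, and even a one-pixel change to a catalogued image defeats an exact digest, which is why deployed systems use perceptual hashing instead.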