Key Capitol Hill Hearings CSPAN July 10, 2014 8:00am-10:01am EDT
8:00 am
surveillance purposes. as many of you know, and as we're discussing today, when an individual or organization builds a backdoor to assist with electronic surveillance into their products and services, they place the data security of every person and business at risk. it's simple. if a backdoor is created for law enforcement purposes, it's only a matter of time before a hacker exploits it. in fact, we've already seen it happen on more than one occasion.
8:01 am
>> the result was overwhelming. the house stood up for the american people and for the constitution. that is something we can all celebrate. we sent a strong signal: if the government wants to collect information on u.s. citizens, get a warrant. thank you for your hard work on this important issue, and i look forward to working together with each of you to keep pushing for a safer, more secure internet. >> all right. thank you, congresswoman lofgren. and next up, congressman alan grayson.
8:02 am
thank you for inviting me to share this panel on the nsa, and thank you for all the good work you do to protect privacy and security in america and throughout the world. listen to me. if the chinese government had proposed to put a back door into our computers and then paid $10 million to make that the standard, we'd be furious, we'd be angry. we'd do something about it. what about if it's our own government that does that? that's exactly what the nsa has become, the best hacker in the entire world. and when they put a weakness in the architecture of the software that everyone uses, what they're doing is making it a weakness not just for their benefit, but for the benefit of anyone who comes along and knows about it, and that's a crying shame. we are entitled to our privacy as human beings. many of our economic activities cannot be done unless it's with some degree of security and safety. and the protection that the nsa is purporting to provide to
8:03 am
americans is actually being undermined by the nsa itself. that has to end. that's why i'm happy that many of you joined me in passing two amendments recently which represent the first substantive limits on the nsa's ability to insinuate itself into our software for improper purposes. one was our science and technology committee amendment, which said that nist no longer has to be a short-order cook for whatever the nsa wants to do, and the other was a parallel amendment on the floor of the house, which passed unanimously among democrats and republicans for the same purpose. these are the first steps we're taking to take back our privacy, take back our own security, take back our freedom, and i welcome your help in doing that. it's one of the greatest endeavors in modern life to preserve our privacy against the encroachments of big brother. i'm congressman alan grayson, thanks again. >> all right.
8:04 am
well, thank you to both representatives for taking the time to tape those messages and also to start a much-delayed conversation about security and data that we're going to start today, and i'd like to invite the panelists to, please, come on up. if you were wondering about representative grayson's remark about $10 million being paid to somebody to undermine security, we're going to explain what he was talking about. all right. joining me on the stage in alphabetical order, i believe, are joe hall, who is the chief technologist who i was lucky enough to get to work with; danielle kehl, policy analyst at the open technology institute and the author of our upcoming paper; david lieber, who is the privacy policy counsel for google here in washington, d.c.; and bruce schneier, noted security
8:05 am
technologist and author. among his many books and articles, including one you can find outside, he has done some of the reporting based on the snowden documents about the nsa's impact on security while working with "the guardian." and finally we have amie stepanovich, who has been working on several of these issues. just to tell you where we're going with this, we're going to talk about four sets of things that the nsa has been up to, along the lines of our upcoming paper and along the lines of the handout that those in the room might have picked up in the front. first, we're going to talk about the undermining of crypto standards; second, the insertion of surveillance back doors into products and services; third, the nsa's stockpiling of vulnerabilities in software; and fourth and finally, the range of offensive tactics that the nsa is using, taking advantage of several of those tools we've already spoken about. after spending about an hour on
8:06 am
those issues, we'll spend a few minutes batting cleanup, talk about recommendations or policy issues we missed, and then we'll turn it over to you guys for some questions. so, starting with the undermining of encryption tools and standards. there's been some reporting about the nsa taking a variety of steps to weaken encryption tools that we use online to keep our communications secure. representative grayson made reference to that, as did the president's review group, talking about the importance of encryption to ensure the security of our communications online and the continued health of the internet economy. so i'm going to, i think, start with amie to explain what the heck happened. what did the nsa do? who or what is nist, and why does what they do matter? >> sure. so the nsa, many people don't know, actually has two very, very different missions.
8:07 am
the first is signals intelligence, the mission under which they conduct all of the surveillance operations that you've been hearing about pretty much ad nauseam for the last year. however, their second, lesser known mission is information assurance, and this is the mission under which the nsa is supposed to be promoting security protocols. it's under the information assurance mission that the nsa communicates with nist, the national institute of standards and technology. nist, for those of us in d.c. who love acronyms, deals with many, many things. they set standards across the board in so many different types of businesses and jobs, not only encryption, but one of the things they do is set encryption standards. under a law called the computer security act that was passed in the 1980s, they coordinate with the nsa and the nsa's
8:08 am
technologists and their information assurance mission on these encryption standards. the computer security act, which was actually very well drafted and made in kind of the formative days of the internet, was preempted by a law that was passed in 2002, that being a key date because it was post-9/11. in 2002 the federal information security management act came along, and it actually had language that was not as finely tuned as the computer security act and allowed the nsa -- or really allows the nsa, if you look at it closely -- to come in and undermine the encryption standards in a way they weren't able to, or probably weren't able to, under the previous language. so under this law, nist is required, absolutely required, to consult with the nsa on all encryption standards. the amendment that representative grayson alluded
8:09 am
to earlier that passed out of the house in the first act -- this is primarily an act that funds science and technology research. it has not made it to the senate yet, but in that bill an amendment was added out of committee that says that nist is no longer required to consult with the nsa on standards. they're still able to, and this is in recognition of the fact that nsa has a lot of funding, a lot of experts, a lot of really smart people who do this work, and they shouldn't be prevented from being able to help and to assist. but they are no longer required to consult with the nsa on encryption standards, which means there's going to be a lot more accountability if those encryption standards get undermined. later on, as part of the defense appropriations act, a second amendment -- again, alluded to by representative grayson -- actually is supposed to prevent any funding from being used by the nsa to undermine encryption standards. so not only if the first act
8:10 am
passes will nist no longer be required to consult with nsa, but when they do consult, the nsa cannot act to make us all a little bit less secure. >> well, perhaps bruce can explain to us why these lawmakers are seeking to reduce the nsa's influence over what nist is doing. can you talk about how the reporting indicates that the nsa, in fact, undermined the standards that nist was setting? >> it's actually surprisingly complicated. the nsa does a lot of undermining of fundamental technology in various ways: mathematics -- we learned about intercepting cisco equipment as it's being shipped from cisco to the customer and inserting back door chips. so undermining happens all through the process. what we're looking at here is what happens as products are being built -- standards, protocols, things that affect every instance of the product. so it's encryption standards, it
8:11 am
is implementations, it's software. and in all of these we have examples of the nsa going in and deliberately weakening the security of things that we use so they can eavesdrop on particular targets. we have one example of a mathematical random number generator. that was a nist standard that was modified by the nsa to put in a back door. there's a lot of standards where this didn't happen. that's actually a very risky place to do it, because that's likely to be discovered -- this actually was discovered in 2006. we didn't know who did it. we had some suspicions, and it wasn't until the snowden documents that we had more of the story. more likely you are going to see nsa back doors in places you can't actually see. so you might imagine an operating system in your computer and your phone that has an encryption program that you use
8:12 am
that is somehow modified so that it is not as good as we think it is. that'll be much harder to find, much harder to pin on whoever did it. in a lot of examples, we'll find these sorts of bugs, and they look like mistakes. they could be mistakes. they could be enemy action; they could be by the u.s. or somebody else. we don't know which programmer did what. so this very act of undermining not only undermines our security, it undermines our fundamental trust in the things we use to achieve security. and it's very toxic. >> so it would seem that undermining the standard not only undermines the standard itself, but undermines trust in the process whereby we achieve these standards. i'm curious if someone could speak to the issue -- i mean, so we were talking about how this random number generator is code that is a part of many products used widely across the internet by civilians like us.
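As a rough illustration of the concern described above (a hypothetical sketch using a seeded general-purpose PRNG, not the actual Dual_EC_DRBG construction), a random number generator whose internal state an attacker can reproduce lets the attacker derive the same "secret" key without ever seeing the victim's traffic:

```python
import random

def generate_key(rng):
    # Derive a 128-bit "secret" key from the generator's output.
    return rng.getrandbits(128)

# A flawed generator whose internal state is known to the attacker,
# modeled here by a fixed seed. (Dual_EC's flaw worked differently
# in detail, but had a similar practical effect.)
victim_rng = random.Random(1234)
victim_key = generate_key(victim_rng)

# An attacker who knows the generator's state simply runs the same
# generator and recovers the identical key.
attacker_rng = random.Random(1234)
attacker_key = generate_key(attacker_rng)

assert victim_key == attacker_key  # the attacker knows "the shape of the key"
```

This is what the "cut a copy of the key to your house" analogy amounts to: predictable randomness makes the key itself predictable.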
8:13 am
can someone speak to the issue of rsa and its role in this and the $10 million that representative grayson mentioned? >> so this gets a little complicated as well. [laughter] but bear with us. >> the subtitle of the panel is "it's complicated." >> yeah. so the flaw was in a random number generator. random number generators are extremely important in encryption. and in encryption, which is essentially complicated math to make things totally unreadable, you have to be able to generate very big numbers that no one else can generate. they have to be random. if you have a flaw in a random number generator, that means someone may be able to predict the key -- essentially being able to, without much work, decide, okay, here's the shape of the key to your house, and go and cut that key and break into your home. and the nsa did this specifically with one particular random number generator. it was very hard at first to
8:14 am
tell what the extent of this could be. we knew this had been used in a lot of popular products. not only that, but incorporated in a piece of software that other products en masse use. and one of the unfortunate things that we found out -- but, you know, a lot of this snowden stuff is -- i'm really glad we know this stuff. it's scary, but i'm better for having known it. we learned there was a contract signed between the company that makes this popular piece of software, rsa -- which, i believe, stopped being an acronym at some point. >> like kfc? >> yeah. and the nsa had paid them $10 million to make it the default choice. and you can be gracious and say the nsa was tired of configuring all these crazy numbers on millions of computers, but, no, it's the default set across the whole
8:15 am
product line. so anything that incorporates this thing would use this flawed random number generator. now, as far as i can tell -- i saw a report recently that very few products out there in the wild, at least the ones you can measure by testing web servers and things like that, use it; they use other random number generators. and so, you know, from the point at which we learned about this till now -- this is sort of one thing: if you don't know a cryptographer, you learn quickly they're some of the most paranoid people in the world -- computer security people being a little less paranoid -- and apparently, many of them have moved en masse to change the technologies they use, away from ones that have this unfortunate flaw in them to ones that we don't believe have flaws in them and have sort of stood the test of time against a lot of people banging away at them. >> thanks, joe. i wanted to turn to david from
8:16 am
google and talk about what you think this activity by the nsa means from a company or user perspective, and also what you think it means about what the government's perspective is on the use of encryption. >> thanks. thanks, kevin. yeah, i think what has been probably truly surprising has been the extent of the efforts to undermine the encryption. the fact that those efforts were undertaken maybe is a little less surprising given the nsa's mission. i do think it's important to sort of take a step back and, from a broader context, understand what the government's current view, particularly the intelligence community's view, is about users' use of encryption. there are minimization procedures under section 702, and what those minimization procedures say is that, notwithstanding a requirement to destroy wholly domestic communications, encrypted
8:17 am
communications -- whether they're used by u.s. persons or non-u.s. persons -- can be retained directly at the direction, in writing, of the nsa director. i think it just sends an unfortunate message that the use of encryption is inherently suspect, particularly in the aftermath of what we've seen with large scale data breaches. that's not a positive moment for users or for companies, and i think it has the potential to bleed over not just into encryption, but into security tools that we offer and that others offer. i don't know that users commonly distinguish necessarily between encryption and other security tools. so while end-to-end encryption tools might be difficult to use and hard for ordinary users, there are other things companies do -- for example, we do two-factor authentication that's relatively easy to use and implement. and if the perception is that all these security tools are going to be ultimately
8:18 am
undermined or exploitable, i think that creates disincentives for users to take advantage of and avail themselves of those tools. as a result, with future cybersecurity incidents, there's the potential to exact greater penalties. >> i'm curious, moving forward, what are the policy options, the prescriptions that we've seen so far, for how to deal with this issue of the nsa undermining standards? danielle? amie? >> i can start. one of the things is this key relationship between nist and the nsa. and so this is, you know, maintaining this statutory requirement that nist consult with the nsa, and the nsa being able to take advantage of that to undermine certain standards. that's very dangerous because the standards themselves are
8:19 am
used by developers and in lots of commercial products. so it's not just what we'll talk about later, where they, you know, pick a particular product and try to insert a back door into it; it's actually the standards that are used in sort of a variety of things. and it's also about our reputation as a standard setter, which is something that the united states has been a leader on for many years -- or, you know, actually, probably since the beginning of the internet. and so part of it is making sure that there's not a requirement in our law that allows the nsa to take advantage of nist. and there's also, i think, on the other side: nist is a body that needs to rebuild its credibility, and they've begun to do that to an extent. they've started reviewing their own policies and guidelines. they claim that they didn't know, you know, what was happening in 2006 when this compromised standard was issued, but they're now sort of looking through all these things because, you know, they're facing a trust deficit right now. they need to rebuild that so that the u.s. can continue to be a leader in standards and so
8:20 am
that developers and ordinary users are going to trust what they say. >> i saw bruce had something to say. >> the fundamental issue here, and we're going to see it again and again in the next couple of hours, is broad versus targeted. the issue is not that the nsa is spying on whoever the bad guy is they want to spy on. it's deliberately weakening the security of everybody else in the world in order to make that spying easier. so when we look at solutions, the solutions are always going to be on the order of: do the targeted and not the broad attack. the broad attack is what hurts everybody. as i think representative lofgren said, once you build a weakness into anything, you can't guarantee that you're the only person taking advantage of it. right? once you do any kind of broad attack, broad surveillance, you suddenly start losing control over what you're doing.
8:21 am
it's not the target, it's the fact that it happens broadly. >> so, bruce, you also mentioned -- you actually wrote about, in fact, i think we handed out at the front desk one of your pieces about a larger policy solution to this issue, where you said to break up the nsa. could you talk about what you meant by that? >> well, it's a little bit along the lines of what amie talked about. there are jammed into one agency the attack-them mission and the defend-us mission. and those were pretty complementary missions all through the cold war, because you had the same basic expertise to do both, but their stuff and our stuff were different. tapping a soviet undersea naval cable had no effect on u.s. communications. and you were able to keep those two missions under one roof because they were physically separate in what they did. what's changed with the internet is that everyone uses the same stuff, right? you can't hack the soviet radar
8:22 am
data without affecting all of us. so those missions now collide, and that's where the problem is. so what i view as the way to go forward: i think we need a much more formal breaking off of the security mission -- the information assurance mission, which protects the communications of the united states and of the world, protects standards, makes us all safer from all the attackers out there -- from the targeted espionage mission, the surveillance mission of going after the bad guys. more complication: that espionage mission is now too complicated, because it has two components as well now. during the cold war, it used to be very simple: we would spy on enemy government communications. right? we would eavesdrop on them. that changed after september 11th, and now the surveillance is against pretty much everybody. right?
8:23 am
everybody in a country. all right, we get all of the telephone calls going in and out of bermuda. not just government ones, every one. every single one. we get the phone call metadata of every american. and so these measures, these broad surveillance measures, government-on-population surveillance, i think, are much more a law enforcement-like mechanism. government-on-government espionage -- cold war, older -- that's a military mission. government-on-population surveillance is much more of a law enforcement mission. i think it belongs in a law enforcement agency, not in a military agency. and that's broadly the way i want to divide things up, to be more in line with what we imagine the rules and regulations governing these activities should be. >> it's worth noting the president's review group agrees with you on several of those points. moving on, as a transition to
8:24 am
our next discussion about back doors into various products and services, i was hoping maybe joe could take us on a brief history lesson. it seems like when it comes to back doors into crypto, we had this debate in the '90s. they called it the crypto wars. the government wanted security chips so they could have lawful access to the data that was encrypted, and, eventually, that didn't happen. could you talk a little bit about that? it seems like we won the crypto wars, but then the nsa kept fighting them. >> it's a wonderful thing. for the longest time, cryptography used to be sort of entirely the purview of the u.s. military, under the nsa. and so one of the crazy things that happened in our history is that people started to learn about it; there were independent discoveries of fundamental cryptographic methods that had been discovered a decade before by
8:25 am
other people working in the military. now you had academics and other people discovering these things and realizing, oh, gee, we might want to have some privacy, some confidentiality, some security associated with that. we need to have these kinds of methods outside of pure military control and in the hands of civilians. and so there's this tension going on, and what the administration at the time proposed was something called a clipper chip. this was essentially a chip that had an encryption key on it. the idea being the key would actually be split, cut into two pieces, and there would be two parts of the u.s. government that would each hold one. and if you were up to something bad, or they suspected you were doing something bad, if they had a warrant, they'd be able to get those pieces and be able to listen in on your encrypted communications, which would otherwise sound like complete
8:26 am
gibberish -- it would be white noise. they would get this key, and because they then had this escrowed key, they could then get access to this data. i believe bruce was on this amazing group of experts, one of whom is up here right now, that wrote this extremely compelling paper that basically said, look, here are the problems, technically, with keeping copies of keys around in places where you think only the government can get access to them. and, in fact, the eff, the electronic frontier foundation, designed and built a cracker for the chip -- i believe. i think i'm getting things mixed up. whatever. >> it's complicated. [laughter] >> in the end, we were able to argue this is not a good idea, and it's not going to work; there are other ways to get access to this stuff. and, in fact, there are other -- if you ever want to check out a
8:27 am
cool book, read "crypto." it talks about this back-and-forth war between advocates of very complicated math and those who thought that's only going to make the world a very, very horrible place, because bad guys will be able to hide stuff from the u.s. government, which has this duty to oversee the entire world -- at least back then, if you think about that. now, as it turns out, we won the crypto wars not only on the key escrow front; the u.s. government would not let you export very, very strong encryption technologies for many years. and then a bunch of wily coders and deep thinkers essentially put a bunch of very secure crypto code on newsgroups -- and if you don't know what a newsgroup is, look it up later -- put it on newsgroups
8:28 am
so people around the world could get access to it. and when that happened, there was no sort of, you know, vision that we could keep this within the u.s. borders. there was no assurance that that would happen anymore. so, essentially, that war stopped because one side stopped fighting, and we were happy to move on to other battles in the advocacy realm. what seems to be happening now is, we'll intercept routers on the way to their customers to put things in there -- so you're not even messing with the math, you're messing with a physically soldered hardware component. there are massive amounts of things that i think i would want to read, at least in the publicly-released documents. i know bruce has seen others, and who knows? [laughter] >> it seems many of the arguments, when it comes to arguments against the clipper chip and for
8:29 am
allowing export of strong crypto -- the idea that if we're going to be transitioning our lives to these networks, if we want them to be used, if we want to have confidence in our transactions and grow that information economy, we actually need it to be secure -- that is the same argument that many have been making in response to what we're learning about the nsa's insertion of back doors into a variety of software products and services. could you introduce us to what we've been learning in the past year about those back doors? >> sure. and i think joe just described the transition between this public attempt to insert a back door into all products, with the nsa holding the key, and this private effort. they turned to the companies and said, let's figure out a way to develop relationships, to leverage product design, to convince them to make it easier for the nsa to get access, and
8:30 am
the idea was only the nsa would have access, and i think everyone up here can explain why that theory is not necessarily sound for security. so what we've learned in the past year is that the nsa spends about $250 million a year on a program -- and this is one of multiple different programs that have been revealed -- where they look to leverage these relationships with companies to covertly influence product design. so it's to develop relationships, and i think the words are "to shape the global technology marketplace" to facilitate other types of collection. so this idea that they can convince companies to make it easier for them to get access to their products. so this is inserting back doors into commercial i.t. systems, into encryption, into, you know, end-user devices, 4g technology. so the goals of this project are really wide-ranging: to try to get
8:31 am
access in as many ways as possible. so this is kind of a very, you know, private and sensitive way to get the companies on their side, to let them insert back doors into their products. so we know that they're doing that, but we also know -- one of the other things we've learned is it's not always with the knowledge of the companies that they're inserting back doors. so joe mentioned intercepting foreign-bound network routers. we learned they're intercepting cisco routers to insert back doors into them. this is kind of, i think, the tip of the iceberg of this idea: the nsa wants back doors in the commercial products that they might need access to, to monitor targets. these are also the products we all use all day for our communications and various different activities online. it's this idea that they want to insert back doors so that they can insert malware if they want to, and they can kind of see whatever they want. so that's one of the -- [inaudible]
8:32 am
i appear to have -- there we go. i think that was a sign i should stop talking. >> i mean, so it seems this is also a debate we've had a version of before. there's a law called calea that required the phone companies back in the '90s to engineer their systems to make them wiretappable. and there's been discussion in the past few years of applying it to other online services and products. joe, can you talk to us about that debate and what some of the arguments were that you and others in civil society and in the security world had against that proposal? >> sure. so -- >> in other words, why are back doors bad for security? >> yeah, yeah. up to about june 5th of last year, which is when the first snowden leak was made public, the fbi had been pushing strongly internally to the obama administration for, essentially,
8:33 am
this argument they made that they're going dark. the fbi was going dark. and what that means was, back in the day, all they had to do was get a warrant and, you know, use telephonic wiretapping, you know? it used to be as easy as attaching a couple of alligator clips to a phone line and just listening in on the call. it got a lot more complicated over the years with circuit switching and packet switching, all sorts of crazy stuff. it got to the point where they passed a law -- we call it the communications assistance for law enforcement act -- which said any provider of telephone services must have a way to wiretap people. you must be able to respond to a law enforcement request to wiretap this stuff. the fbi has been saying people don't use phones as much anymore; they're using whatsapp. i play clash of clans and sit there and talk to people via that, you know? there's a variety of ways we
8:34 am
communicate these days. now, over about the past two years, the fbi had been arguing we need some sort of fix, we need some way to make these things a little more bright -- not going dark, but getting a little brighter -- so they could actually get access to this stuff. and what surfaced was, essentially, a proposal to wiretap all software. essentially -- no one ever actually saw the proposal except for a couple of reporters -- it basically said the fbi could come to you as a maker of a piece of software with something i believe they call a wiretap assistance order; you'd have to see the text. if you said, well, gee, the product's not designed to do that, it's going to take us a while, they'd say, okay, well, make sure in the future when we come to you, there's a knob you can turn on the wiretap capability for this stuff. so it's sort of a way of putting
8:35 am
you on notice that you needed to build a back door into your products. unfortunately for them, this got leaked to the press, and it got leaked in a very absurd way, where you saw proposals like, well, okay, you get served with this order, you're on notice, you need to wiretap your users, and if you don't do it, you'll get fined $10,000 a day, and it will double every day -- which, if you do some basic math, gets to, like, all the money in the world within three or four weeks. pretty ridiculous. [laughter] yeah, that's pretty strong, but it's ridiculous. cdt organized a group of experts who wrote this gorgeous paper called "calea ii: the risks of wiretapping endpoints," or something like that, and they made a really compelling argument. i'll shut up in a sec. the first was: this is a bad idea. putting back doors in products is, essentially, fundamentally undermining the structure of the universe, if you think about it in a physical reality sense. what do i mean by that?
8:36 am
well, everything you do online involves communication, and to the extent that you want some integrity -- to know it came from somebody, to know it hasn't changed and came from that person -- you're going to be using products that use encryption or other kinds of security features. that's not going to work anymore, because they will have these back doors that supposedly can only be used by good guys. although the random number generator may give it a run for its money for technical reasons. but the most compelling argument was: it's not going to work. all of the things you seem to want to wiretap these days -- think about the firefox browser, the chrome browser -- you can get the code, and it's very easy to cut that piece of code out, recompile it, and turn it into a piece of executable software without the back door. moreover, if you can't do that in the u.s., that means all the secure product development will go to other countries, and we lose out on all those
8:37 am
capabilities. these kinds of things will still be available. you can't enact a treaty that says everybody needs to be able to wiretap all software all the time. >> in the telecom context, in the mid-2000s, greece, like the u.s., had systems for lawful intercept of their phone system. they eventually discovered that some unknown adversary -- rumored to be the cia, rumored, i don't know -- [laughter] had actually compromised the lawful intercept systems there and had been using them for a long period of time to actually spy on the highest echelons of the greek government including its prime minister or president. so a good object lesson in how these back doors can actually backfire. any other thoughts about the security implications of back doors? >> i can talk a bit. >> funny thing about this is bruce has written an essay on almost every single thing we've talked about, and they're great.
8:38 am
[laughter] >> the fundamental issue is, again, should we compromise the security of everybody in order to access the data of the few? in order to believe that that's a good idea, you have to believe that, one, only you can use that compromised path. that in some way no one else can use it. the greek example is an example where -- representative lofgren mentioned another example. there are lots of examples where this global compromise is used by other people than you expect to weaken security. and you also have to believe that, ultimately, this is a good idea, that the value of this path to the few outweighs the security of the many. and you have to believe that. i mean, i think that security in our communications, in our data, in our information is vitally
8:39 am
important to all of us. there's a wide variety of threats out there; government threats, criminal threats, foreign threats, domestic threats. and security is one of the ways we protect ourselves. and what the fbi, the nsa are asking is our mission trumps that, right? that we want access to that person so badly that none of your security matters, it matters less. so when we talk about harms, how the nsa harms security, this is it. they harm security because they believe their need for access to the few trumps the need for security for everybody. that story from greece was a u.s. product, and greece didn't want the feature; the feature of lawful access wasn't wanted. it was just in the code.
8:40 am
so it happened to be there. it came with the product, wasn't turned on. someone snuck in, turned it on, used it. so here's the government having their government communications breached because of a back door they didn't even want. that's the kind of thing we have to worry about. you put a back door in, three years from now criminals are using it. now what? i don't think this is a difficult trade-off to make. the problem is, the nsa is not equipped to make it. this has to be made in public at higher levels. that's why i like seeing some of these bills being proposed that actually have congress making these decisions. at least we have a chance of them recognizing that security trumps surveillance. >> so we do have the president's review group and recommendation 29. i'm nerdy enough to have favorite recommendations in that report, and 29 and 30 are them, urging the u.s. government to
8:41 am
make clear that it won't, that nsa won't mandate that any product change, that a vendor or product doesn't have to change the product to undermine security and enable surveillance. the lofgren amendment, cosponsored by representative massie and, again, a pretty broad, bipartisan coalition of folks, went even further than that, and would have said that nsa cannot mandate or even request that a product, a vendor or service provider weaken their product to enable surveillance. that amendment was vocally supported by google amongst a variety of other companies, trade associations and civil society groups including my own, but i was curious, david, if you could talk more about why google supported that. >> yeah. so i think that particular amendment actually addressed two back doors. the first was with respect to requiring companies to essentially build those security vulnerabilities into their products, and the
8:42 am
second was with respect to the back door search loophole, and that was something -- i think it was an important but perhaps overlooked component of the original usa freedom act that was introduced by senator leahy and representative sensenbrenner. and just really quickly on the back door search, section 702 prohibits the intelligence community, i should say, from intentionally targeting the communications of u.s. persons or people that are in the u.s. what it really doesn't speak to is what happens when the communications of u.s. persons are incidentally collected, and we learned more about that from "the washington post" article that appeared yesterday, and i think it just reinforces the importance of the amendment. because under current law effectively the intelligence community can turn a blind eye to the fact that there is a large cache of communications -- u.s. communications -- that are being collected and searched
8:43 am
without the protections the fourth amendment would really afford. and this is something i think has been really core to google's advocacy in washington for some time, and that's something that the court hinted at in some of the dicta, i guess, in the riley opinion from a couple of weeks ago. >> that's the searching of cell phones. >> that's right. the cell phone, searching incident to arrests. you know, so we thought it was important, this is a welcome and unexpected opportunity to weigh in and support, you know, both the back door search loophole component, but also to prohibit the use of funds, albeit for just one year, to require companies to build in these sorts of back doors. maybe a year ago this sort of language might have seemed unnecessary, but now it's actually really important to sort of restore trust that these sorts of things are not being requested and/or required of companies. so i think it's a positive step,
8:44 am
but i think there's obviously, you know, more to be done. again, this is an appropriations bill. it was an amendment to a bill, and it's unclear whether it's ultimately going to survive the entire appropriations process. >> uh-huh. i commend to you, if you haven't read it yet, the story in "the washington post" yesterday -- on sunday. i think it's going to get them their next pulitzer on this topic. but any other comments or thoughts about the back door issue before we move on to the issue of stockpiling vulnerabilities? >> i want to say one more thing about trust. we were talking about trust and how this destroys the trust. i think it's worth talking about exactly what the trust was. it's not that we in the tech community trusted that these products were secure, that they were invulnerable, that they didn't have vulnerabilities that allowed hackers in. we know that security is hard, vulnerabilities are everywhere. what we did trust is that these security technologies, these
8:45 am
standards, these protocols would rise and fall on their own merits, that they would be what they were advertised. not that there was some government hand secretly sneaking in and twiddling with the knobs, right? that's the failure of trust. and it's a big one. and it's something that we in the united states have to deal with as we're trying to sell our products overseas and other countries are saying why should we buy this u.s. thing? the nsa has probably dinked with it. you've been forced to make changes, and you're not allowed to talk about it. we know this happened with microsoft; microsoft has made some unknown changes to skype to make it easier to eavesdrop. we don't know what they are, we know that they happened. now, how is that going to play in an international market? germany recently kicked verizon out of a large contract because
8:46 am
they didn't trust that verizon was behaving in their customers' interest, right? they didn't trust that the nsa didn't come in and force them to do something and then lie to germany, their customer, about it. that's the betrayal. and it's a big one because we as technologists like to believe that the technology rises and falls on its own merits. >> and this is, again, the drilling back from, i think bruce alluded to it earlier, the broad, broad targeting of everybody to the more targeted. so eliminating back doors and making it so that the nsa can't insert them in products and services isn't going to get rid of the targeted surveillance they're trying to conduct. we've talked about the many different ways the nsa has that it's been able to use to collect foreign intelligence information. this just eliminates their ability to spy on everybody at any given time, which is really what we're trying to continually do, to move it away from everybody is a target to let's
8:47 am
look at who the real targets are. >> so it perhaps makes them, as another commentator said, fish with a pole rather than with a net, as it were. spinning off bruce's comments about how we just can't expect our products to be perfectly secure, most software does have flaws in it; bugs, vulnerabilities, these things called zero days. what we learned in december in a great expose in "der spiegel," which some of us are starting to wonder whether it came from a source other than snowden, we learned of nsa's massive catalog of vulnerabilities in a wide variety of widely-used products, hardware and software. and, basically, they can pick and choose and go, oh, the target has that? here's a vulnerability for that. bruce, can you help us out with, like, where did those come from, and what the heck is a zero day
8:48 am
anyway, and where can i buy one? >> let's talk about software for a second. software is incredibly complicated, it's everywhere, and we as scientists, we as a community, we as technologists do not know how to write secure code. we do our best, but all software contains bugs, contains vulnerabilities, right? you know every month you get a dozen or so updates to your microsoft operating system. those are all closing, fixing bugs, closing vulnerabilities. those vulnerabilities can be used to attack systems. remember earlier i talked about the nsa's dual missions -- amie started with that -- protection and attack. vulnerabilities can be used for both, right? if you discover a vulnerability, you call up microsoft, microsoft fixes it, we are now all safer,
8:49 am
right? nobody else can use that vulnerability to attack systems. you discover that vulnerability, you call the criminal, you say, hey, look, look what i found; that's now used for attack. it's being used to break into the system, steal passwords. we recognize that the way to improve security is by continually researching, finding and fixing vulnerabilities. now, the nsa can play either end, right? they have two missions. they can play defense, use those vulnerabilities to make things more secure, or they could play offense, keep those vulnerabilities in their back pocket and use them to attack systems. but remember, you know, targeted versus broad, those vulnerabilities affect everybody. they're in an operating system, they're in the internet. so now we have this question; what should the nsa do? and there's been debate about this, right? should they hoard them to attack the bad guys? use them to fire cyber weapons? you can come up with all these
8:50 am
reasons why you might want to keep them. but by keeping them as vulnerabilities, we're now vulnerable to them. or should they fix them? right? if you fix them, you fix the computers of the good guys and the bad guys. if you hoard them, anyone can use them to attack the computers of the good guys and the bad guys. that's the fundamental debate. and, again, the question comes down to what's more important, security or surveillance? is it the surveillance of the few that beats the security of the many, or is it the other way around? >> well, so we've learned that the nsa has a very large catalog of these vulnerabilities that it is stockpiling and using for its own offensive or foreign intelligence purposes. the alternative, one of the alternatives, is simply disclosing them immediately, or something in between. danielle, you've done some research on this in preparation of our paper. what have you seen out there in terms of the discussion of how should the nsa handle this?
8:51 am
>> i think that this is something that comes up in, you know, the president's review group report, but it's come up many times before, and there's a really great paper about the idea of lawful hacking by steven bellovin and matt blaze and a couple of other folks, and what they talk about is, you know, this challenge of what's the best and most sort of ethical way to get access to communications for lawful purposes. and so one of the big challenges is, you know, you're always going to find zero days, you're always going to find some kind of vulnerability. and so there's this tension between the offensive and defensive capabilities, and the tendency is to say, well, we might need all of these. which ignores the fact that since you're going to keep finding security holes, you're going to continue to come up with an ever-growing and ever longer list of holes. what they talk about is what a
8:52 am
responsible practice looks like: when you find a vulnerability of some kind, you disclose it immediately unless you have a very compelling and immediate need to use it. so if you are looking for something specifically at that moment and there's this high national security reason, you might be able to use that vulnerability and then later, as soon as you've used it to get what you needed, disclose it to the company so that the company can patch it and all the ordinary users can have their software or their products patched. the other thing they point out, which is very true, is that software patching isn't immediate. even when you find a vulnerability, you can disclose it and continue to exploit it as a law enforcement agency for a short period of time until the window sort of closes, and then you go and look for another way to get in. it's a very complicated issue because, of course, there's something strange about the idea at all of exploiting vulnerabilities to get access to
8:53 am
information. but it's the idea that this is inevitable, this is going to happen, and we need to figure out a reasonable way to deal with the problem while still recognizing there may be legitimate law enforcement or national security needs. the president's review group says the same thing. they say the default should be disclosure of vulnerabilities, sort of immediately, and then only for a very compelling reason, following sort of a senior interagency review process, might the nsa be able to withhold vulnerabilities so that they can use them. what it says they should not be doing is holding on to them and sort of accumulating their own arsenal of vulnerabilities and not letting, you know, the companies know, because that means that sort of general cybersecurity is weakened just in case the nsa might need that vulnerability at some point for some target. and so, i mean, it's this all or nothing approach again where there's no recognition of the fact that it's actually bad for everyone's security that these holes are out there, that these flaws
8:54 am
aren't being disclosed. it's not telling the companies themselves so that they can patch them, and the companies are saying, no, no, we need this information. this came up in the debate about the heartbleed vulnerability. the question is did the nsa know about the vulnerability, and if they did, why didn't they disclose it? was it because they'd been looking to get access to things? i mean, that is a serious allegation, and it's a serious challenge. and, of course, they talked about a disclosure process, but they didn't really say very much about the details of it and about what constituted an extraordinary circumstance. >> yeah. so there was a story that nsa knew about heartbleed, a story that was denied, and it seems that it is not true. but in response the white house said, by the way though, we actually do have an interagency process to decide when to disclose vulnerabilities, and we've had it for years, and we
8:55 am
are now in the midst of reviving it or revitalizing it in response to the review group's recommendations. that would seem to imply that they weren't fully following it before or something. but i'm curious, what do we know, amie, about this so-called vulnerabilities equities process? >> so you've touched on a lot of it. we know that the nsa has a stockpile of vulnerabilities, and we actually know that the u.s. stockpiling vulnerabilities is one of the main drivers of the economy of vulnerabilities. it actually raises the price, because the u.s. is willing to pay quite a bit of money for vulnerabilities to things that it can exploit. so we have this process that they came up with. kind of like, oh, my god, heartbleed, people think we knew about it, what can we do? let's dust off this really old thing that we probably haven't been using for a really long time and say this is going to be
8:56 am
the process by which we figure out if we're going to reveal vulnerabilities so that they can be patched. it's a multilevel weighing process where they're looking at whether or not you're vulnerable versus their own security needs. now, again, we come back to the nsa's dual functions, and we see over and over and over again whenever they weigh information assurance against surveillance, this side wins. this side wins. so it's very unclear how this weighing process is going to play out, and actually one of the reasons it's unclear is that there's no transparency built into it. so i think one of the key things that we need to consistently talk about throughout this is the need for greater transparency in how things are being applied. and they haven't talked about whether any numbers are going to be made public about this process, who's going to be aware of how many vulnerabilities they turn over every year and how many they keep back, how many days on average they keep things back. so these are core questions that
8:57 am
need to be answered. things that can be made public, numbers that can be made public without great risk to national security, if any risk at all, and it's not built into this process that is just inherently kind of tilted in one direction from the very beginning because the nsa values its surveillance mission so highly. >> bruce? >> one of the things we haven't touched on is the very international nature of this. this is not the nsa or nobody. there are lots of countries looking for vulnerabilities. the government of china is doing the same thing. there are cyber weapons arms manufacturers. one is called hacking team, out of italy, that sells software to break into systems with vulnerabilities like these to governments like ethiopia, kazakhstan -- governments you actually don't want breaking into the communications of their citizens. so as we look at these vulnerabilities, find them, fix them, we're not just making
8:58 am
security better for us, we're making security better for a lot of people in the world who need security to stay alive, stay out of jail. and the international nature of this makes it very subtle. you'll hear a lot of arguments that we have to hoard vulnerabilities because if we don't, china will, and china will win. that's a very zero sum game, arms race argument. but it fails to recognize that every vulnerability we allow to remain is a potential chink in our armor. and as long as we are a highly-connected, highly-computerized, highly-internet-enabled society, we are fundamentally at greater risk than the government of china, the government of north korea. that defense is really much more important not just in general, but to us specifically because of this very international nature. >> i just wanted to add, i do
8:59 am
think it's encouraging that the administration is taking up this process. you were talking about one of your favorite recommendations; this is one of mine from the review group. >> number 30, people. >> number 30. if nothing else, we've learned about the nuances of language and trying to understand and divine the intent and the meaning of what the intelligence community is saying based on sort of the written word, and the review group's recommendation in this regard was to disclose unless there was an urgent or significant national security interest. and in the aftermath of the insinuations that they exploited heartbleed, they had said that there was a strong bias toward disclosure unless there was a clear national security or law enforcement interest. that's very different. those are two very different standards. so to amie's point, what would help to sort of inspire confidence that there is a strong bias toward disclosure is
9:00 am
to have more transparency. because this is easily quantifiable in terms of the circumstances under which a vulnerability is disclosed or even temporarily stockpiled or used. i think there's a lot to be done on this front, and i think it's encouraging that the administration is undertaking this process and seems to have done so before they were accused of exploiting, you know, the heartbleed vulnerability. but at the same time, i think there are a lot of questions that remain about what the standard actually means in practice. ..
9:01 am
>> you've kind of anticipated my closing question, which is how do you counter the argument that disclosing those is like unilaterally disarming? i think the answer is if you blow up a bomb, you can't use it again. you disclose a vulnerability, you get a patch, and no one can use it. moving on to our final category, a catchall category: now that they've weakened encryption and have these backdoors into a variety of products, and have a bunch of vulnerabilities in other products, what are they doing with all of that? it seems what they're doing is building a very large network across the planet of compromised computers and networks that they can then use to conduct surveillance. it seems a big part of this is something called quantum.
9:02 am
i didn't really understand the quantum stuff. this is something bruce has done an extensive amount of reporting on. i didn't really understand it until joe explained it to me. i was hoping joe could explain briefly what this quantum business is. >> i was just explaining this article to you, so that means i've done my job well. my job is to be able to explain things to people in a way they can understand. bruce, jump in at any point. >> my job is to write things so people can explain things so people understand. >> it's so important. quantum is this really scary thing. it's so scary and opaque that it's easy to be like, what? okay. quantum appears to be a system by which the u.s. government can respond quicker than any website you go to visit, for example. so your browser says, hey, i want to go to cnn.com.
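the race the panelists go on to describe -- whoever answers the browser first wins -- can be illustrated with a toy simulation. this is a hypothetical sketch of the concept only, not any real tool; the names and delays are invented: two "responders" race to answer a request, and the client simply accepts whichever answer lands first.

```python
import queue
import threading
import time

def respond(name: str, delay_s: float, inbox: queue.Queue) -> None:
    """simulate a responder whose answer arrives after delay_s seconds."""
    time.sleep(delay_s)
    inbox.put(name)

# the client's inbox: the first response in wins, later ones are ignored
inbox: queue.Queue = queue.Queue()

# hypothetical delays: the injector sits closer to the client than the
# real site does, so its forged answer arrives first (the race condition)
threading.Thread(target=respond, args=("real site", 0.20, inbox)).start()
threading.Thread(target=respond, args=("injector", 0.05, inbox)).start()

winner = inbox.get()   # the browser takes whatever arrives first
print(winner)          # -> injector
```

the point of the sketch is that nothing in the client distinguishes the two answers; position and speed alone decide the winner, which is why a tap near the user beats the genuine site.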
9:03 am
they have spots on the internet that can respond faster than the actual cnn.com can. that's what we call a race condition, which means the nsa is trying to beat the response from the actual thing you want to get access to with their stuff. so this is where surveillance gets really strange. surveillance, you tend to think of it as, i'm watching a bunch of stuff flow by, i'll jot down some notes about what this person is saying or whatever. this is offensive, active. what we mean is they are actually changing stuff. they are changing communications to get this done. one example is if you happen to be using the tor browser -- and if you don't know what that is, it's a very awesome anonymity tool you should all look into -- you go someplace, they have stuff watching, and it's hard to know what that stuff is. the documents don't describe that; it's too sensitive, probably. i don't know, but they're
9:04 am
using the tor browser is an indication you might be a bad man. you also may be looking up contraception in a place that doesn't allow that, or something. you are a bad guy, right? they can respond too fast and poke a hole in your browser, basically using one of the catalogs -- a weaponized catalog. not just a database of vulnerabilities that may be out there that they haven't fixed yet; they've operationalized it into tools that can poke holes in your stuff and establish a beachhead on your computer and do things later. so this is kind of scary, because if you think about it, if you just happen to type the wrong thing in or happen to have the wrong associations or something like that, you may, in fact, get a hole poked into your system via this vast set of infrastructure, using this set of vulnerabilities and a variety of very, very clever and what must be awesome network techniques, given how the internet is constituted. i guess earlier they talked about how complicated the internet is,
9:05 am
but to have this global reach into what people are doing -- it's not everything, but it seems to be a substantial chunk of what people are doing on the internet -- that is remarkable. it's the kind of thing where engineers, you know, think of, hey, i'm designing this thing to make your communications confidential between here and here. there may be a bad guy listening, but we're going to design it with a bad guy in mind so we thwart that bad guy. unfortunately the kind of bad guy we don't think about often is one that has infinite money and has global insight into everything that's happening. that's exactly why we have to sort of reevaluate the way we design systems. but i'll say more about that later. >> let me try and sum this up. the nsa has compromised a bunch of routers, isps, that sort of thing. >> a lot of vantage points around the global internet, and it's watching for targets, whether someone using the tor browser or someone searching for a
9:06 am
particular thing or using a particular ip address or a particular cookie -- >> that it takes as a trigger. >> it jumps out in front of that person's communication and uses that opportunity to inject malware into the computer. is that right? >> yes, that's basically how it can work. >> which is what joe described to me as crazy sidelong stuff. some of the sites that are being impersonated are major u.s. companies. linkedin is one that's been named, facebook. they have attempted to spoof google. david, how do you feel about that? >> again, it's one of these things that doesn't inspire confidence in the use of products and services. when people use a product like facebook or google or another service, they expect that it's going to be legitimate. these sorts of reports are baffling and they are
9:07 am
disconcerting. because, again, i think in the sequencing of revelations, when this one came, these sorts of things were no longer a surprise to people, which shows how far we've come in terms of our understanding about surveillance programs and how they work. i'm not aware of any of these sorts of incidents, at least ones that haven't been publicized, happening to the companies. there's certainly the possibility or even likelihood that it may be happening with our services. >> bruce, are we making sense of quantum? do you have anything to add? >> more or less. think of it again as broad versus targeted. the way we normally think of hacking, there's some hacker sitting somewhere who is accessing some network and trying to exploit it, sending e-mail, trying to break in and using that connection to get more access. this is something different. this comes from broad access. the nsa has agreements with
9:08 am
telcos to put equipment in the middle of the internet to watch everything go by, and when they see something that triggers -- and it could be anything -- they will use quantum to inject data into the stream. in this one example we're talking about, it injects data in such a way that allows the nsa to take over the computer. so it is a targeted attack made possible by this broad surveillance system. there are a good dozen different quantum programs that do different things, but it's all: we are monitoring everything, looking for specific things. this is something the nsa can do because they have an agreement with at&t to put this computer between the user and google. and it doesn't always win, but once in a while they can respond faster than google and fool the
9:09 am
user. the nsa can do this. we can't. but actually we can. this is not a new trick. this is a hacker tool. you can download it. it's called airpwn. and airpwn works on wireless networks. we don't have the nsa's privileged position, but it's the exact same thing. this is a way hackers have of taking over your computer when you're on the wireless of, i don't know, this institution, maybe. so we have a choice. we can build the internet to make this attack not work. we can do it. it's hard, but we have to do it. and it makes us safe from hackers, from criminals, from foreign governments, from everybody who might use this, including the nsa. or we can leave this massive vulnerability open and allow the nsa to use this broad surveillance to
9:10 am
attack presumably legitimate targets while at the same time leaving us more vulnerable. >> so it's this kind of behavior that has led a lot of u.s. internet companies to express concern and dismay, and to worry especially about the impact on the trust of foreign consumers. you have mark zuckerberg personally calling president obama to complain, and after meeting with the president complaining they are not doing enough to reform these practices. i forget apple's particularly strong words. microsoft at some point likened the nsa to an advanced persistent threat, a term that used to be reserved for chinese military hackers or the russian mafia. and then, of course, there was google, where a couple of lead engineers, after learning about how the nsa was attacking google specifically, said -- i don't think there's a delay on c-span, so i won't say the word itself -- but basically said, f
9:11 am
the nsa. they were pretty ticked off. what were they talking about, david? what did the nsa do to these guys? >> this was when the "washington post" reported the nsa was tapping the private links that connect our data centers. i think we expressed outrage about it; on a continuum of likelihood of this happening, i think we thought it was less likely. we weren't right about that. we've been working pretty vigorously now to ensure that the traffic between our data centers is encrypted, and i can say that we're pretty much all the way there -- you can never say you're 100% of the way there -- but we've been working pretty aggressively, and i think the post article noted even before the post reported that particular revelation that we were working to encrypt the traffic between our data centers. but that was a particularly
9:12 am
troubling and disconcerting revelation, because there are mechanisms, including those that congress authorized under the fisa amendments act of 2008, that enable the intelligence community to seek information through the front door, and to do so in ways that were envisioned under existing fisa arrangements. to see the extent of efforts to go and tap the links between our data centers to obtain traffic in ways that weren't targeted and swept up millions, hundreds of millions, of communications, i think it just sort of reinforced our responsibility to redouble our efforts and to do as much as we can, notwithstanding anything that congress might do to limit the ways the nsa conducts surveillance. >> so it seems beyond the policy responses, one of the other key responses is armoring up, trying
9:13 am
to improve the security of your services to counter these particular threats. amie, you've been working on a project in that regard. tell us more about what companies are doing and what we should expect companies to be doing at this point. >> when i came to access, we talked a lot about transparency reporting, and about how vitally important transparency reporting is. one of the reasons is that we have this window into the nsa's activities, provided in large part by edward snowden. it's time-limited. we only know what we know from the documents he was able to provide while he was there. we're not going to know what's happening next month, next year, five years from now. we need ways, into the future, to keep that window open, or at least as open as possible, so we can continue to have this conversation about the extent of nsa authority. but that's not enough, because transparency reporting really only provides you with the numbers based on when the government goes through official judicial processes to get information: how many times they ask the
9:14 am
court to provide them with information on their users or on their accounts. so what we are looking at is all of the different times when the government doesn't go through judicial process and actually taps into the fiber of the internet and tries to get communications that way -- what needs to happen to make sure that all of your communications, both you in the room and all of you on tv, are protected. so we put forth what we're calling the security action plan. it's been signed by a lot of forward thinkers, internet companies including twitter and duckduckgo, with another big announcement coming tomorrow. it's also been signed by leading civil society groups: oti, cdt, the electronic frontier foundation, the liberty coalition -- a broad range of groups saying that there are some things that companies can do, if they're going to collect information on people, on each of you, in order to make sure that
9:15 am
information is properly protected, so that unauthorized users, foreign governments, the nsa, bad actors, criminals cannot get hold of it. it includes things like encrypting data when it goes between data centers and when it's flowing over the internet; making sure that data at rest is protected; making sure that your passwords are strong and that you are moving toward a two-factor authentication system. really kind of core things -- seven pieces of really commonsense activity that we are finding companies across the board are not engaging in. we think these seven things can become a floor on internet security that you can then start moving forward from: here's the bare minimum we'll accept; now become inventive, protect people's information more robustly, and think of new ways to protect it. if you're interested, encryptallthethings.net is where we have those things
9:16 am
listed, and we're trying to promulgate that and get it moving. >> so it seems there's a lot of things, frankly, that you need to encrypt -- you really do need to encrypt all the things. you need to encrypt between you and the website. you need to encrypt between you and your e-mail server. you want e-mail servers to encrypt between each other -- google actually just released a transparency report showing that a lot of servers are not doing that, and i think shamed a few of them into turning the encryption on. there's also end-to-end encryption, and google put out a plug-in to help enable you to use encryption for your e-mail on the web. bruce or joe, can you talk more about -- putting aside what the companies can be doing -- what we as users can or should be doing to try and protect our own privacy against the nsa or anyone else? >> again, we should talk about bulk versus focused. if they walk into your
9:17 am
computer -- your computer, your personal computer -- they are probably going to get in. almost certainly they are going to get in. we as individual people cannot defend against a well-funded, well-targeted, sophisticated attack against a system. we are not able to do that, and that's really not what we're trying to defend against. what we're trying to defend against is bulk surveillance: can the nsa, the chinese, the criminal get into everybody? can they do it in bulk? can they do it on a broad scale? there, there is a lot we can do. we talked about encryption. that will protect your data as it's flowing from one place to another. there are going to be ways to get at it if the fbi gets a warrant and gets a lot more targeted, but in the normal case of bulk
9:18 am
surveillance, that doesn't happen. your data is going to be easy to grab if it's not encrypted -- there's a lot you can do, lots of different tools. the issue is going to be that a lot of the data that's being collected is not able to be protected in this manner. that's what's being called metadata. metadata is really data that the system needs in order to operate. you can encrypt your e-mail, but the from line and the time of day you cannot encrypt. you can have a secure voice conversation, but who's talking, how long they are talking, and when they're talking cannot be encrypted. your location -- your cell phone is a location tracking device. we could secure that, but then you couldn't receive phone calls; the system has to know where you are. so this data cannot be protected by actions you take, because the system needs it. so when i talk about what you can do to protect yourself, the
9:19 am
single most important thing you can do is advocate for political change. there are a lot of tech solutions -- we'll talk about them -- but they're fundamentally around the edges. this is a political issue, and the solutions will be political. so that is the most important thing you can do. and with that, we can talk about the technology. >> i can't emphasize that enough: laws move slowly. law and policy move slowly, but they're a critical component of fixing this in the longer term. then there are standards. the people who decide how your computers work and how things work on the internet move just a little bit faster than laws. so something we are doing is making sure that we are present in the conversations that the internet engineers are involved in and saying, look, it's not just a spook thing, it's not just an industry thing -- it's also something that regular people have an interest in and care
9:20 am
about. but getting to the tech specifically, i like to think of this in terms of hygiene. you can go about your life not caring about your hygiene, not caring about how you smell or what you look like or whatever, and you'll not have as good a time as someone who might be more sensitive to those kinds of social norms. it's a little different on the internet, and so i like to talk about digital hygiene. what can you do to keep your house in order in a digital sense? there's a variety of things, but i'll mention a few in passing. vpn -- three letters. essentially, if you have one of these pieces of software and turn it on, all the signals going out of your computer are encrypted. you go to a coffee shop or an airport, you often see free wi-fi. it won't have a little lock next to it, as your home network probably does, or should, in my opinion. it won't have a lock, which means that even though you had to click on some terms of service or pay a little bit
9:21 am
of money or whatever, all the communications you send from your computer are not encrypted. with the use of a vpn, at least, all the communications you send out are encrypted, and to the outside it looks like your traffic came from new york city even if you're in d.c., or something like that. that helps you -- protection, mainly, from the people that might be trying to surveil you in your local coffee shop or airport. some of these sound like maybe they are not nsa-level protections, but they all sort of add up to making you less smelly in your digital life, so to speak. another one is a password manager. i know three passwords. i really only need to know one, but i have 1,200. some of those i haven't used in many, many years, but they're all completely randomly generated. i never have to think about them. my password manager -- there are a bunch of different types of tools that manage all of them. the eff, the electronic frontier foundation, makes a really handy
9:22 am
plug-in. that means when you see the lock in your browser, the url line will go from http to https, which means it's encrypted. this plug-in from the electronic frontier foundation is dynamite technology. it makes sure that if there's an option to have an encrypted connection, you use the encrypted connection instead of the unencrypted connection. there's a variety of these. >> thanks, joe. we've talked about a variety of technical solutions and a variety of policy solutions. i have one more policy thought to throw out, and then i'll open it up to you guys for any closing thoughts or questions. one issue we didn't talk about was a policy response to this offensive hacking by the nsa. this is an issue we also see in the context of law enforcement, and we're starting to have an
9:23 am
above-board conversation, for the first time in many years, about what the rules of the road should be when the government wants to hack into a computer. right now the government has a pretty broad charter for law enforcement and national security, and we are only now starting to see a few court decisions about when it is okay for law enforcement to use a vulnerability to break into your computer remotely. and we're starting to see a debate, a discussion in the advisory committee of the u.s. courts, about what warrants should look like if you're going to use a warrant to break into a computer. but we haven't, in the context of the nsa discussion, had a debate about what the rules should be if the intelligence community wants to break into computers. falling short of actually making a policy recommendation, i would safely say that that is a discussion we need to have, and it hasn't yet begun except in the law enforcement context. the aclu, amongst others, has done some great work on that issue.
9:24 am
but on that, i'll leave it to you guys if you all have any other ideas, thoughts, policy recommendations or closing sentiments before we open it up to questions. >> thanks for coming. the fact that you came means you care a little bit about this. if you didn't understand it, come up after and we will explain it to you. >> it's a little complicated. >> questions? right there. front row. >> we know this guy. >> my name is chris and i work with the aclu. a lot of the surveillance you guys described relies on the assistance of companies -- the assistance we're so scared about, where companies are forced to subvert the security of their users. the quantum stuff that you described, for example, subverts our security, and probably relies on the voluntary assistance of some companies. it's tough to imagine a court
9:25 am
order forcing at&t to install these malware probes everywhere in the network, particularly given they wouldn't be installing them for a specific target computer -- they just put them in there and use them on an ongoing basis. but the subversion that troubles me the most is when they do it voluntarily. we've heard a lot about how companies have really beefed up security in the last year. i think google in particular has beefed things up, but in some places you are still providing voluntary assistance and weakening the security of users. the one example i want to highlight here: if the police get a warrant and seize a cell phone, they can go to google and google will unlock the phone for law enforcement. to google's credit, they insist on a warrant where other companies might do it with less, but there is no law requiring you to have the ability to unlock phones or circumvent the lock feature on the screen. i'm wondering, you know, a year after snowden, if you are now thinking about whether that's a
9:26 am
feature that should still exist or whether you should be taking it away. i think many of your users who enable that phone lock do so with the expectation that only they will be able to remove it, and the fact that the police can get a warrant and ask you to remove it may surprise and anger some users. >> my response is brief, which is: i don't serve a compliance role for google. i actually haven't heard about that before, but i'm happy to take that back to our law enforcement team and ask that question. >> i'd be happy to say that i really think that disk-level encryption is a key technology, and that enabling disk-level encryption on things like phones is the kind of thing that would make me very, very, very happy. it may mean -- well, it's weird with ios. some things are encrypted and some things aren't, right? and so i know there's practical
9:27 am
problems that make it take a long time to do certain things, but it would be nice if you had to sort of turn that off. i'm not a product guy. i'm just a nerd. >> first of all, a lot of really cool cloak-and-dagger stuff here, so thanks for that. i'm going to go and watch sneakers tonight when i get home. with respect to what's going on -- i mean, joe, i think you make a great point about the password manager, the two-factor authentication. i think a lot of people in the room probably don't already use that type of stuff. what type of activities and steps have the companies themselves taken, post-snowden revelations, to make our communications more secure? and i'd be remiss in not asking joe and kevin to also discuss perhaps ecpa reform, and how our electronic communication protections
9:28 am
effectively decrease as well. >> so i can just talk some, quickly. we've seen more encryption on the web. we've seen what's called -- there's got to be a better word for this thing; i don't want to use any of the nerd words or whatever. the idea being that often when you use encryption -- and a lot of websites have been moving to this, google among others -- you're using a model of encryption where if you come back tomorrow and start up a new web browser, the key that encrypted your stuff is not the same as the one yesterday. it requires a little bit more work on the side of the companies, as you may know, but it's worth it, and it's often not that much more expensive than other kinds of stuff. and i'll shut up. >> the place to look: the eff has a good scorecard of the
9:29 am
major internet companies on seven or eight different things that they should be doing to encrypt the web, and it shows users who's doing what. that is the place to look -- it's updated, so you can see who's doing what, and you can look at the history: who did it recently, who did it a long time ago. that's a good way to get a handle on which companies are doing what to protect security. >> i would be remiss if i didn't add that civil society -- well, a certain civil society group -- has offered companies a certain incentive if they move to encryption by default. >> it's a good incentive to do it. >> in response to ecpa reform -- i'm glad you raised it, because it is an important issue. the electronic communications privacy act, a law from 1986, was our first digital privacy law, but it's so broken at this point, because it was based on a lot of assumptions about how technology works, such that the e-mails
9:30 am
that are less than 180 days old require a warrant issued by a judge based on probable cause, but e-mails older than that require only a subpoena signed off by a prosecutor. in fact, under the doj's reading of the law, they don't even need a warrant for your e-mail that is less than 180 days old if you've opened it, or if it is in your drafts folder, or if it's in your sent folder. so the really incredible takeaway is that under current law, the most protected e-mail in your e-mail account is everything in your spam folder, because you haven't opened it yet. >> or stuff you haven't read. so don't read your e-mails. >> there's one practical tip: don't read your e-mails. i'm glad we were here for this. >> so, many of us are in a coalition effort led by cdt called digital due process -- a coalition of companies and organizations that have been pressing for many
9:31 am
years to try and reform ecpa, starting with a single, clear rule: if you want somebody's content -- e-mail content or whatever, stored content held by the provider -- you need a warrant. we think this follows basic principles; in the digital age, we think that what you store in dropbox or gmail or whatever should receive the same protection as the files you keep at home. right now we're in a frustrating place where we have a bill in the house, sponsored by mr. yoder, that actually has a majority of the house sponsoring it at this point -- i forget what the magic number is, 218-plus, whatever -- and it's still not moving. we are, from my perspective, having worked on these issues of intelligence and law enforcement for a long time, in a weird, bizarro world where it seems like nsa reform has more heat than what should be a really uncontroversial fix to a law-enforcement digital privacy law. but the momentum is still
9:32 am
building. at some point the house leadership and the committee leadership are going to have to move this bill, because the tide is unstoppable. >> i think ecpa reform truly is the lowest-hanging fruit on the surveillance tree. there's a reason why a majority of congress supports the bill: it enjoys broad bipartisan support, from both republicans and democrats. i'll point again to the riley decision from a couple of weeks ago, where there was a passage where the supreme court was addressing whether it matters if the data you store on your cell phone is stored locally or remotely. they said it doesn't make any difference for fourth amendment purposes -- a unanimous supreme court. that is arguably dicta, but it is not nothing. the supreme court is sending signals that, to the extent that type of case comes before it, it will hold that there should be an ironclad warrant-for-
9:33 am
content requirement. but what we're seeing, i think, in a different context, in the debates about the limits that might be imposed on the nsa, is the view that the warrant requirement isn't so ironclad -- that maybe there are circumstances where the nsa should be allowed to search communications it has already collected. if the data is lawfully collected, the argument goes, there shouldn't be restrictions on searching it; the focus is just on what happens to data after it's been collected. but it's a significant constitutional moment at the point it is collected. one thing kevin and i probably should mention, in case my overseers from mountain view are following this at all: the plug-in you referred to -- we released source code for end-to-end, which is going to be, hopefully, a browser extension for chrome that, if it works right, will enable end-to-end encryption using openpgp. we're not quite there yet.
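the end-to-end model the panelists describe -- only the sender and the recipient hold the key, so mail providers and backbone taps in the middle see nothing but ciphertext -- can be illustrated with a toy one-time pad. this is purely a sketch of the principle, not anything the panel presented and not real cryptography; tools like the end-to-end extension use openpgp public-key cryptography instead, precisely so the two parties don't need to share a secret key in advance.

```python
import secrets

def otp_xor(data: bytes, key: bytes) -> bytes:
    # XOR each byte with a key byte; applying it twice recovers the input.
    assert len(key) >= len(data)
    return bytes(b ^ k for b, k in zip(data, key))

# alice and bob share a random key out of band; the mail provider never sees it.
key = secrets.token_bytes(64)
message = b"meet at the usual place"

ciphertext = otp_xor(message, key)   # all an intermediary (provider, backbone tap) can observe
recovered = otp_xor(ciphertext, key) # only a key holder can undo the encryption

assert recovered == message
assert ciphertext != message
```

the point of the sketch is the trust boundary, not the cipher: whatever sits between the two endpoints handles only `ciphertext`, which is exactly the property that makes tapping inter-data-center links or mail relays unproductive.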
9:34 am
we're kicking the tires and encouraging other people to do the same. we have a vulnerability rewards program, and researchers that discover any vulnerabilities or problems with the source code, we are encouraging them to report those to us. >> the nsa pays cash, though. >> one last thing to add is that a lot of the things that we talked about today, security researchers and people have known or suspected for a long time, and so one of the good things in the past year is that this has come out for, like, meaningful public discourse, which creates a much greater opportunity for what bruce highlighted, which is political change, because it's very clear that a lot of these laws are outdated. it's clear these are things that affect real users, and as we keep getting more stories like the one sunday -- that there's a lot of collection happening that is indiscriminate -- that makes people uncomfortable and makes them want change, and they can talk about it now in a sort of
9:35 am
responsible and well-informed way. that's very positive in terms of moving the political process forward and seeing reform on a wide variety of issues. i think kevin has said this before: this year is sort of the beginning of what will be many years' worth of fights on a lot of these issues. it doesn't mean it will be easy. it means a lot of these conversations are happening, and they are long overdue. >> my personal view -- i come at this from the experience of previously working at the electronic frontier foundation and suing at&t and the nsa, based on whistleblower evidence from 2006 that the nsa was sitting on at&t's network and filtering out what it wanted, and being looked at like we were crazy conspiracy theorists. it is finally at this point admitted
9:36 am
that yes, the nsa is sitting on our domestic internet backbone. now we need to actually do something about it. >> first of all, thank you all so much for this; it's very interesting. i'm with digital liberty and americans for tax reform. i will say it is indeed the week of ecpa -- there are two other events this week, cato on wednesday and one on the hill on thursday. but i have a question for you that i've written down, because this is indeed complicated. what i wanted to ask is, how does the nsa target bad actors if any kind of weakening or strengthening of security affects the entire world? it's been said that the nsa has the ability to conduct targeted government-to-government espionage, but it was also said that often we don't know which program is the one undermining encryption. so how do we -- the nsa -- find foreign or criminal bad actors? does this also mean we don't
9:37 am
know who is poking holes in our different browsers? so really, how does the nsa target? and when i said how do we find out, who is "we"? >> so let me ask a clarifying question. do you mean, how do they do it now, technically, or do you mean how would they do it if we encrypted everything? >> it actually would be great to answer both of those, but it would probably take forever. i guess what i'm curious about is, you all said the nsa does have ways of getting everybody's information. >> you break into a network. the criminals want to get into a corporation -- they want to poke around in it. they break into the network, or they do it through a partner. they use standard hacking techniques, get the data, and leave. that is what the chinese
9:38 am
government did a couple of months ago. we indicted five chinese military officers in absentia for exactly that -- breaking into five u.s. corporations and stealing data for the chinese government. this is something we believe the nsa does too. you want to target north korea, you hack into their computers and you target them. so there are lots of targeted techniques for going after particular targets that everyone uses, and, you know, we can talk about the technology of those. but that is what's done, and that's very different from going after everybody. so you might ask, what does the nsa do? as near as we can tell, there is a series of filters. the nsa will put a computer on the internet backbone -- and this is not unique to the nsa; the chinese do it in their own country. it is not industry-specific. we just know a lot of nsa
9:39 am
details. don't think of it as magic or as some unique technology -- this is what many well-funded governments do. russia does the same thing. you do a broad collection of everything and then, very quickly -- based on names, based on networks, based on keywords, based on topics -- cull out the stuff they don't care about. you're watching happy days? we don't care; get rid of that. they try to focus on the things they're interested in. in that whittling process, you can joke about it, you will lose things you care about, but the hope is you do pretty well. last weekend there was a very interesting story from the "washington post" about the end result of that entire funnel: the reports given to nsa analysts -- here are communications that passed all of these filters; they are known not-american, on bad topics, from bad people, whatever. what we learned is that about 90% of that stuff is about innocents, including americans. the filters actually don't work
9:40 am
all that well, even with all of that filtering. i'm not sure if that answers the question, but that's basically the process. [inaudible] >> so that's how we actually find targets -- >> if you look at our successes, in law enforcement, in terrorism, they don't stem from looking around and saying, there's someone suspicious. they stem from following the leads, the kind of police and intelligence work you see in movies and on television. we are going to go after that guy: who's he talking to, what is he doing? the things you don't need bulk surveillance for -- normal investigative procedures that start with a target and figure out what's going on. that's where the successes come from, and we see this from the review groups that have looked at these broad surveillance programs: there actually isn't a lot of value in looking at everything, looking for someone saying the word bomb.
9:41 am
i just made this up, but it's probably true: i could look at everyone who is saying the word bomb -- you said the word, so i'm going to start watching you. that has extraordinarily low value, because random people say bomb all the time, and the people actually plotting things often don't say bomb at all. these bulk systems don't work, and they are incredibly costly. in this big discussion we didn't really talk about effectiveness; we talked about the cost -- the cost in security for the rest of us of enabling those broad surveillance programs. no one here is arguing that there isn't a valid intelligence mission, a valid espionage mission, or that targeted surveillance with a warrant by the fbi isn't a great idea. but what we want is transparency, oversight, accountability, presumption of innocence, and the ability for us to protect ourselves from all
9:42 am
threats. >> and i'll just add, i think in a way part of what we are debating, and what bruce keeps coming back to, is that we used to live in a world of retail surveillance: you would pick a target based on some sort of suspicion, and then you would surveil that target. now we've reversed that into wholesale surveillance, where you collect on everybody and then you decide who to target. and ultimately that change happened without us actually overtly having a discussion about whether that shift in the way we investigate people made sense in terms of the trade-off we were making. it's the discussion we are finally starting to have now, just far too late. >> i'd like to -- [inaudible] >> we want to get you on the webcast.
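the speakers' point about why a keyword trigger like "bomb" has extraordinarily low value is the base-rate problem, and it can be made concrete with back-of-the-envelope arithmetic. echoing the speaker's own caveat that he made the example up, every number below is an invented illustration, not a real figure:

```python
# toy base-rate arithmetic for a bulk keyword filter.
# all numbers are illustrative assumptions, not real figures.
population = 300_000_000      # people swept up in bulk collection
true_threats = 100            # actual bad actors among them
hit_rate = 0.5                # chance a real threat trips the keyword filter
false_positive_rate = 0.001   # chance an innocent person trips it

true_hits = true_threats * hit_rate
false_hits = (population - true_threats) * false_positive_rate

# fraction of flagged people who are actually threats (precision)
precision = true_hits / (true_hits + false_hits)
print(f"flagged: {true_hits + false_hits:,.0f}; fraction real threats: {precision:.4%}")
```

even with a generous one-in-a-thousand false-positive rate, the handful of true hits is buried under hundreds of thousands of flagged innocents -- the same shape as the post's finding that roughly 90% of intercepted material concerned people who were not targets.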
9:43 am
>> forgive my -- although we did abolish slavery quicker than you did, i'm not going to talk about, nor am i going to complain about, one of the members of your community -- [inaudible] we are in deep trouble and would like you to pay a tax towards us for all of the money you are taking out of the country. having said that -- >> that would be you, not me. >> a serious point. i chaired the defense committee. i was on that committee for 30 years; i chaired it for eight years. i was moving up the hierarchy a long, long time. and one thing i learned: morality in politics is important, but not too important. what you have to do is protect your society. and if you are being confronted
9:44 am
by those who are using every trick available to make life difficult for us -- extorting money, putting us in danger -- the idea of responding to that with an excess of morality seems to me, as we would say in the uk, bonkers, stupid beyond words. it's difficult to say that, but when i was on the defense committee, we knew who the enemy was. they were playing nasty, and if we did not play nasty, we would have been absolutely pilloried, and we didn't want that. so here is someone who has a perspective that's not a very nice perspective, but it is a realistic perspective. you had your big inquiry, which some of you think hasn't been good enough. you know that your intelligence services played dirty games. thank god they do, because if they did not play dirty, as the
9:45 am
other side did, then the bigger problem you would have would be exploitation and the possibility of political and economic disaster. so if i do appear a little bit off message, it's based on 30 years of experience. i've been on election observation missions for the osce on 25 occasions, many of these to russia -- evil countries, not evil people, evil countries. and i knew firsthand, through almost all of my 36 years in parliament, what it was to fight the dangers to our country and our lives. i'm glad to hear that we've had a strong degree of realism here, but there should be a greater degree of realism. i'm not defending every nasty thing that your government has
9:46 am
done. i'm certainly not defending your mr. snowden, who bunkered up in that great democracy of the world, russia -- although i still call it the soviet union. so i don't think we need any lessons from that, or from people like that. if we have to play dirty -- don't admit it, but we have to play dirty. i'm actually certain the consequences of playing it decently, as though it were a game of football -- again, not that the english would be good at that -- would be far worse. but, frankly, i have no doubt: if you have to play dirty, you've got to do it. for a question: how do you tolerate me? full stop. >> i did want to allow you to finish, because -- it's not an uncommon perspective, but also i wanted to say that i
9:47 am
could fully comprehend exactly why we fought a revolution. [laughter] >> i do, i do want to reflect on what you said about making arguments about morality. and, in fact, i think much of the discussion we've been having -- the discussion we had in the spring, and that is the focus of our paper -- is trying to step away from a civil liberties argument and look clear-eyed at the other side of this: at all the other costs of these programs that we're not talking about. the costs to our security, the costs to our economy, the costs to our foreign relations in other respects, the costs to our internet freedom agenda around the world. i think there are a whole raft of reasons to be concerned about these programs, completely separate from concerns about civil liberties or the morality of those who are engaged in them.
9:48 am
that's my answer to that. >> that argument is a fundamentally unfair argument. i can summarize it in one sentence: terrorists will kill your children. that is the argument. it's an argument that shuts down debate. it's an argument that wins over every other possible argument, because it's an argument that can't be argued with. the problem is that that argument short-circuits any discussion of: are the things you're doing actually effective? right? do they do any good? up here we're not making a moral argument. we're making an efficacy argument, a cost argument, right? there is a threat. the bad guys don't play by the rules -- that's fine. but what does that mean the defense should be? there are actually many threats in society. we have been talking about the
9:49 am
threat of government overreach -- actually a very serious threat. in the united states, you are more likely to be killed by a policeman than by a terrorist. terrorism is not the one thing to worry about; there are automobile accidents -- i could list dozens of threats. we are trying to balance them, and we balance them by looking at costs and benefits. up here we've talked about the costs. if the costs of broad surveillance are greater than the benefits, we don't do it, even if the bad guys are bad guys. the bad guys are not going to go away. the question is, what is the best way to deal with that? the argument we are making is that there are more effective ways to deal with them -- not that we're going to be moral and they are not and they are going to win. that's dumb. that makes no sense. the question is: what is the efficacy of the various tactics, what is the variety of threats, and what are the best ways that we as a society can deal with them? and in order to have those
9:50 am
arguments you actually have to dampen fear. because once someone says, terrorists will kill your children, all that discussion goes away. no congressman will vote against something when someone says, if you don't do this, terrorists will kill your children. we saw this in the last administration: there will be blood on your hands if you don't vote for this. that's never explained, never justified, but as soon as it's said, the fear sets in. one of my great worries right now about reform is that if we ask congress to oversee the nsa, we will get a more permissive nsa, because right now congress is scared. not just scared of the terrorists, scared of being blamed if something happens. getting beyond the fear is the single most important thing we can do to move society forward. and honestly, this might take a generation. you and i might have to die
9:51 am
before more sensible people take over government. >> we simply can't be terrorized. that's exactly what bruce has explained. we have to stand up, in some cases against very large political pressures, in the face of low-probability events, and argue very soberly that it's not worth it. >> this gentleman right here has been raising his hand very patiently. >> correspondent for europolitics newspaper. i was just wondering how this issue of encryption and internet security, more than the civil liberties aspect, has appeared on the radars of other countries around the world, like, for example, in europe, which is considering its whole data privacy regime at the moment? sort of parallel to that, it seems to me that the reason the nsa can do this so extensively is because all the companies involved are u.s.-based. does this create an incentive for more, say, european
9:52 am
companies to develop software that has encryption in it that cannot be hacked into by the nsa, because they are not subject to u.s. rule? >> danielle? >> so i think, first of all, some of the stories have talked about not just the u.s. intelligence agencies but other intelligence agencies, including the british, actually doing this. but certainly one thing we've learned, when you look at the economic cost to the united states, is that we've seen a huge rise in foreign companies in europe and elsewhere claiming a competitive advantage, claiming that they have more secure products or products that haven't been tampered with, and they're using this as a way to lure business, which is incredibly profitable. so the broader thing is, we talked a bit about the cost to the internet sector specifically, and how in an attempt to
9:53 am
protect security we may be weakening our security. we are also incurring a great economic cost; there's a cybercrime cost; and the amount of money we're spending on these programs in order to weaken our security is also hurting american companies. and that's a serious problem, because we are driving customers away from the united states. that doesn't always mean we're driving them to more secure alternatives, just driving them to what they believe are more secure alternatives, and that's a key point. just because it's not a u.s. product doesn't mean it's more secure. but if you believe the u.s. government is interfering with u.s. products, you will be more likely to try your luck elsewhere, which is one of the big challenges. >> i think we have time for one more question, right there. >> hi. matt stoller with congressman grayson's office.
9:54 am
so a couple of weeks ago bae systems, a representative from the company, went on cnbc and said there was a cyber attack on a hedge fund, and their stock dropped by roughly 2%, and the fbi formed a partnership with a think tank called the center for financial stability, and there was a lot of discussion about cyberattacks in the financial space. then, i think it was last week, bae systems said in fact they had made a mistake: there was no cyber attack on the hedge fund. it was a training exercise which they confused with an attack; essentially it was their own training exercise. >> complicated, we told you. >> i mean, that probably helped their business. i don't know what happened, but there's a lot of money in saying cybersecurity is a big problem, and people don't know
9:55 am
anything about technology, and i'm not, i don't know much about it either, you know. how much of this fear of cyberattacks, how much of that is just profitable for entities to push for their own security businesses? how much of it is legitimate? how well are we doing in terms of defending the country from these kinds of attacks? and how do you measure these risks against other risks, climate change, nuclear terrorism, and so on and so forth? i don't have a framework for how to think about this. so when i think about political action, policy questions, you can say, let's carve out, let's have warrants. that tends to be a good idea and has been ever since the magna carta. but how do you think about these new and relatively novel institutional threats? >> okay, 30 seconds, go. >> wow. complicated. you know, i mean, there's a lot
9:56 am
going on. yes, there's a lot of profit motive, a lot of profit-making, a lot of fear-mongering, but there's a lot of threat, too. we tend to, for example, over-exaggerate the terrorist threat and under-exaggerate the criminal threat. so you will find discontinuities on both ends. cybercrime is enormously profitable, a very big deal, very big business. companies are not doing enough to defend themselves. on the other hand, a lot of other threats are overhyped. there's an enormous security-industrial complex surrounding the military, being a lobbying force for some of these draconian laws. but at the same time there's real stuff that needs to be sold to real companies. the nsa is not doing a lot to defend the country, but that's not really their mission. their job is to defend military and government networks. they have not been tasked with defending the broader internet.
9:57 am
that's probably a good thing. so we really can't judge them on that. there's a lot going on here. how to compare this with climate change? your guess is as good as mine. climate change is probably the single most catastrophic threat our species is facing, but it's 100 years out. we as people cannot do threat analyses 100 years out. we can barely do it to the next harvest. we are not equipped as a people to do that. that's why this is so complicated. there's a lot going on, a lot of moving pieces, and profit-making versus real threats. is that 30 seconds? >> close enough. and bringing it full circle: it's complicated. thank you, everyone, for coming today. i really appreciated it. thank you, panel. [applause] >> c-span2, providing live coverage of the u.s. senate floor proceedings and key public policy events. and every weekend booktv, now
9:58 am
for 15 years the only television network devoted to nonfiction books and authors. c-span2, created by the cable television industry and brought to you as a public service by your local cable or satellite provider. watch us in hd, like us on facebook, and follow us on twitter. >> now you can keep in touch with current events from the nation's capital using any phone, anytime, with c-span radio on audio now. simply call (202) 626-8888 to hear congressional coverage, public affairs forums, and today's "washington journal" program. at the end of the weekday, hear a recap of the day's events at 5 p.m. eastern on washington today. you can also hear audio of the five networks' sunday public affairs programs beginning sundays at noon eastern. c-span radio on audio now. call (202) 626-8888. long distance or phone charges may apply. >> and a picture there of the capitol building as we are just a couple minutes away from the
9:59 am
start of business in the u.s. senate today here on c-span2. taking a look at what's going on: the house is scheduled to come in a couple of minutes from now. that has been delayed as they clean up a spill on the house side of the capitol, and we expect the house to be in a couple of hours from now. of course, the house is over on c-span. the house begins work today on their fiscal 2015 financial services spending bill, as well as a bill that would make the 50% bonus depreciation permanent. in the senate today, we expect that when they come in they will spend the morning on general speeches, and at noon begin debate on nominations: former hud secretary shaun donovan to be the white house budget director, and then the u.s. ambassadors for kuwait and qatar, with expected votes on those nominations about 2:00 eastern. we take you live to the floor of the u.s. senate as things are about to get underway. live coverage of the u.s. senate, always right here on c-span2.
10:00 am