
tv   Hearing on Digital Privacy  CSPAN  February 16, 2022 2:04pm-4:25pm EST

2:04 pm
republican line. this is mark. caller: i want to talk about the durham report. there has been some stuff that has not been talked about. none of the mainstream media has reported on this besides your show talking about it occasionally and a couple shows on fox news. what has not been reported is that the national security director is implicated in this also because he worked in the hillary clinton campaign. he is fingered as the mastermind behind the spying on the trump campaign. his wife works for the department of justice under merrick garland. the company used to do the surveillance of the trump campaign and white house also worked for the >> you can watch the rest of this program on c-span.org.
2:05 pm
next, digital privacy and protecting consumer data. >> testimony or when answering questions. the rules require that we keep our cameras on at all times. even if you need to step away, please turn your camera off. i would like to remind members that the regulations governing remote proceedings require members not to participate in more than one proceeding at a time. i ask unanimous consent that the chair be authorized. without objection, that is also ordered. we are here today to confront one of the central challenges of our digital age as nearly every
2:06 pm
facet of our lives has moved online. it spans everything from who we are, what we like, who we communicate with and what we do in minute detail. it has many applications, but ones we are only just beginning to understand. privacy, as it is conventionally understood, is the freedom to decide what to share, with whom and how. we are familiar with the privacy concerns, whether it is government surveillance, identity theft and other economic crimes, or social and economic harm that can befall us.
2:07 pm
all of these risks are very much in play today with the rise of the internet and the business models of many online firms. privacy has become a catchall term for a more profound set of risks, some of which are truly new or different in the digital age. what does it mean for our basic autonomy when the government and private companies can amass large profiles and use them to manipulate our behavior? what does it mean for our economy when it revolves around opaque data collection? what does it mean for our democracy when these firms shape the public discourse? what does it mean when we have little to no meaningful ability to do anything about it?
2:08 pm
the american public has been deeply concerned about these questions. in 2020, a research center found half of americans had limited specific online activities out of privacy fears. many are frustrated that they do not have more control over their personal information online. economic, racial -- for all these reasons, this is long overdue. there has been little to no action or clear progress. we have introduced the online privacy act. it was the most thorough set
2:09 pm
with the clearest restrictions, a wide variety of privacy rights, and the strongest mechanisms to enforce the law. the online privacy act also contained privacy obligations for the legislative branch agencies that this committee oversees, requiring the library of congress, the smithsonian and the chief administrative officer of the house to each identify concrete privacy risks. many of the risks and policy questions overlap, and we can take lessons from both in determining the best ways to address them. i hope this will not only advance the conversation, but
2:10 pm
reinvigorate attention on digital privacy for the ultimate enactment of reform. with that, i am pleased to recognize our ranking member for any comments, if you would like to make them. welcome. >> thank you, madam chair. this is a hearing on a rare topic for this committee, but no doubt an important one: online data privacy. a framework for enforcement needs to take shape in the committees of jurisdiction, and it is clear to me that this discussion should be led by those committees. we should let them work out the underlying policy details. 304 bills have been referred to this committee by the house, but we are devoting our time to this topic. it is rare that we are meeting
2:11 pm
this congress. we have had one other hearing on the policy underlying a bill and zero markups. this committee has not held standard oversight hearings with our entities in nearly three years. not even having a discussion about oversight hearings in an election year is shocking to me. eight months ago, i sent a letter to the chair asking for this committee to fulfill its duty with each of our legislative branch entities. eight months later, i still have not received a response from the chair. the committee continues to shirk its duties. regarding the topic at issue today, the call for a workable framework, increased transparency and overall accountability in the technology industry is all
2:12 pm
but universal at this point. data privacy is at the forefront because it affects everyone. as a parent of three young adults, i believe this not only requires congress' attention, it demands it. last week, filings by the special counsel alleged that in 2016, democrat operatives -- and multiple members of this committee are familiar with the attempts to challenge the legitimacy of elections -- sought to exploit access to internet data to create a narrative tying president trump to russia, and to infiltrate trump tower and white house servers to link trump to russia. talk about election subversion. a responsible solution for data
2:13 pm
privacy. it can curb abuses like these. it is necessary. i know our chair has a special interest in this topic. she is the reason we are here today. she has offered me a lot of good thoughts, so let's discuss. first, we share more personal information online now than ever before. as the internet has grown, so has the opportunity for our personal information to be used by people and businesses that we never intended. privacy does not end at state lines. it requires a congressional response. americans cannot rely on conflicting state laws to keep
2:14 pm
their information safe. it is a concern that affects all of us. there needs to be a uniform approach, and it needs to be workable for compliance. solutions that do not account for the burdens placed on small businesses and innovators are doomed to fail. finally, those who misuse information must be held accountable. the federal trade commission already has jurisdiction over these matters, as well as a long-standing relationship with the state attorneys general. duplicating that work would be bureaucratic waste at the highest level. the call for a comprehensive
2:15 pm
solution is loud and clear. they deserve an answer: transparency and accountability. with that, i yield back. >> the gentleman yields back. i would note for the record that the committee's jurisdiction is generally determined by house rules, specifically house rule 10, and this committee has been, along with other committees, assigned a jurisdiction. i want to thank the ranking member for his comments and i would also like to recognize our first witness, who is someone who has led the u.s. government publishing office with great distinction as of late. he is the 28th person to lead
2:16 pm
the gpo since it opened its doors in 1861, and he has served in this role since 2019. this month, the gpo was selected as one of the best employers in the country, ranking first on a list of midsize agencies in government service. it is a reflection on your leadership as well as your fine staff at the gpo. it is not your first time to be a witness. your statement will be made part of the record. we ask that you summarize your testimony in about five minutes. you are now recognized for five minutes. >> thank you so much and thank you for acknowledging the achievement of making that list. i will take some credit, if it
2:17 pm
is assigned, but it really belongs to the men and women at gpo. thank you so much for having me today, both the chair and the ranking member, and all the members of the committee. i appreciate the opportunity to testify. on behalf of gpo, know that we take our responsibility to protect pii seriously every day. gpo is a government agency complete with customers and products. we face many of the same kinds of problems facing private sector firms when it comes to the handling of pii. gpo is entrusted with pii by our teammates,
2:18 pm
along with customers and, by the nature of our business, the general public. protection of pii is critical to building trust with customers and stakeholders. without trust, we cannot achieve our vision. our program addresses pii itself, regardless of how it is stored, along with protection of related information systems. the program has several fundamental principles. first, access to pii is limited to only those teammates and contractors with a specific need. each business unit has someone responsible for the privacy
2:19 pm
function, who answers to that unit's leadership. any gpo teammate or contractor who suspects a breach of pii security is obligated to report the problem as soon as possible. violations will be addressed by appropriate corrective action. my written testimony goes into more detail as to how this program works, but today i would like to talk about three separate examples of how we deal with these issues. first, gpo produces materials that require the use of pii. for instance, gpo's security and intelligent documents unit manufactures smartcards, such as
2:20 pm
the global entry card, used for identification and for expedited processing. by the nature of these products, gpo must handle a vast amount of highly sensitive pii within our systems. gpo works closely with customers to maintain a series of firewalls to make sure that we only receive encrypted pii, which is decrypted only when we need to distribute cards. once the cards are distributed, the pii is scrubbed from the system. you cannot leak information that you do not have. second, there are vast amounts of pii in our regular publications. some good news, particularly
2:21 pm
with the federal register, is that much of it is contained in historic volumes. where it is, we have automated and manual systems to redact that information from our digital collection when we find it. the bigger challenge is with historic information, particularly as we work to digitize older collections. at one time, it was common for the department of defense to include social security numbers on lists later printed in the record. as we digitize volumes, we scan for that pii and redact it in preparation before posting.
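the automated scan-and-redact pass described here can be sketched in miniature. a minimal sketch, assuming a single social security number pattern and a bracketed redaction marker -- both illustrative stand-ins, not gpo's actual tooling:

```python
import re

# Illustrative pattern only: u.s. social security numbers written as
# 123-45-6789. A real scanner would cover many more pii formats
# (unformatted ssns, account numbers, dates of birth, and so on).
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def redact_pii(text: str, marker: str = "[REDACTED]") -> tuple[str, int]:
    """Replace every match of the pii pattern with the marker and
    return the redacted text plus a hit count, so flagged pages can
    also be queued for the manual review mentioned in the testimony."""
    redacted, hits = SSN_PATTERN.subn(marker, text)
    return redacted, hits

# A line like those once printed in the congressional record.
page = "promotion list: sgt. j. doe, 123-45-6789; cpl. a. smith, 987-65-4321."
clean, hits = redact_pii(page)
# hits == 2; both numbers now read [REDACTED] in the digital copy
```

the hit count matters as much as the redaction itself: a nonzero count is what lets an automated pass hand a page off for the manual check the testimony pairs with it.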
2:22 pm
while we alert our partners, we do not have the resources to visit every collection across the country and manually redact that information. the third and final category are instances where it is not quite pii, but it is information that people may not want widely accessible on the internet. this comes up often with court opinions on govinfo. it is understandable that parties in federal court cases may not want that information to be easily accessible. they often ask us to remove that opinion from our collection, but it is important to remember that gpo is only making that data accessible. it is not our data.
2:23 pm
our library services division refers those individuals to the courts. you can have the opinion sealed and removed from our system if the judge determines that public disclosure is problematic. i hope that this review was helpful to your exploration of this critical issue. i look forward to any questions that you have. thank you so much. >> thank you so much, director. it is great to hear that testimony. we will turn first to the ranking member, mr. davis, for five minutes of questions. >> i apologize for the delay.
2:24 pm
i remember the last time you were called to testify. it is nice to have you back. i wish it were to get a comprehensive update on operations and the initiatives you have launched, as well as the agency's health post-pandemic. gpo is working to print passports and other sensitive projects, and that work is undertaken following the requirements laid out in legislation protecting the confidentiality of that information. do you believe that you, the director of gpo, have all the authority you need to continue a high level of excellence in protecting pii?
2:25 pm
>> thank you for that question. at the moment, i think that we do. we have the authority that we need to work with our customers to design systems that protect the pii that they give us, and in our relationships with our own vendors -- whether it is with a contractor that we hire on behalf of the executive branch, or work that they are doing for the irs or other government agencies -- we are able to construct systems that protect that level of pii. one of the things that we have to remain really vigilant about is the pii of our own teammates. if there is one large body of pii that the agency maintains, that is it. that is our payroll information.
2:26 pm
it is our personnel files. we work with people in the building and with our vendors over at the department of agriculture, who handle our payroll and other systems, to make sure that the information is protected. we are doing our level best to protect the pii of our nearly 1,600 teammates, as well as the information of anybody working with our customers. >> how often is the program reassessed or updated to remain at this high level of effectiveness? >> it is constantly undergoing assessment. the directive that governs our privacy program is supposed to be reviewed every three years,
2:27 pm
but we are constantly going through our policies, our procedures and our actual practices to make sure that we are doing our level best. that is on the digital side, where we have an extremely competent team, but also with our safety and security team. i get an update three to four times a year as our team walks the building to make sure that everything is shipshape. i can tell you that they have flagged people who have inadvertently left pii sitting on the desk, not having it stored properly. it is a good reminder to all of us that we need to be proactive in protecting this information, whether digitally or tangibly in the real world. >> i cannot agree more. what about that breach that
2:28 pm
happened several years ago? >> i think it is similar to the lessons that a lot of government agencies are learning. we have got to secure our systems. we spend a lot of resources on that -- we have a dedicated appropriation for that purpose. it requires a lot of vigilance on our part. knock on wood, thank god, during my tenure here, we have not had anything that indicated that we were close to a breach of those systems. it is something that we dedicate a lot of resources to every single day. the one thing that the breach taught all of us is that you cannot rest on your laurels. security needs to be everybody's job, every single day. >> thank you. i yield back.
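the director's earlier principle, that you cannot leak information you do not have, amounts to a retention discipline: decrypt pii only at the point of use, then scrub it. a minimal sketch of that discipline -- decrypt(), make_card(), and the record layout are hypothetical stand-ins for the real systems, not gpo's code:

```python
from dataclasses import dataclass, field

def decrypt(blob: bytes) -> str:
    """Hypothetical stand-in for real decryption at the facility wall."""
    return blob.decode("utf-8")

def make_card(record: str) -> str:
    """Hypothetical stand-in for smartcard production."""
    return f"card:{record}"

@dataclass
class CardRun:
    produced: list = field(default_factory=list)

    def process(self, encrypted_batch: list) -> None:
        for blob in encrypted_batch:
            record = decrypt(blob)          # decrypt only at production time
            self.produced.append(make_card(record))
            del record                      # drop the cleartext immediately
        encrypted_batch.clear()             # scrub the batch once cards exist

batch = [b"doe, j.", b"smith, a."]
run = CardRun()
run.process(batch)
# run.produced holds the cards; batch is now empty -- nothing retained
```

the point of the sketch is the shape of the workflow, not the crypto: cleartext exists only inside the loop body, and the encrypted input is cleared as soon as the run completes, so there is nothing left to breach afterward.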
2:29 pm
>> i recognize mr. raskin for his questions. >> discrete bits of personal information can seem relatively harmless. when they are all combined together, though, they can have a more sweeping impact and create special dangers. what lesson does this contain for privacy regulation? what are the biggest risks that congress should be considering? >> miss fitzgerald is on the second panel. >> i'm sorry. i thought i saw her there. forgive me. i will redirect my question. what are the greatest privacy risks that gpo faces in its operations? what are the specific harms in
2:30 pm
the area that you are most focused on right now? >> the largest single trove that we maintain over a long period of time is our own teammates' information, and that is something we are focused very closely on protecting. in most of the other places where we have pii, and we need to focus on that as well, we have developed systems to make sure we are holding onto that pii for the minimum amount of time possible. for instance, in the example i used with the global entry card, we literally have a contractor on the other side of the wall from our smartcard manufacturing facility who will decrypt the encrypted data -- we will utilize that information to
2:31 pm
produce those cards, then scrub it from our systems and send it back to the customer once we are done with it, so that we are not holding onto any more data than we need. our biggest problem, in terms of other folks' data, is really historical data. as we fulfill the greater public good of working backwards and trying to digitize old records, we are going to find more pii, because frankly, 20 to 30 years ago, folks were a lot less conscious about what they were putting in a congressional hearing, the congressional record or the federal register. those were all tangible documents. it was harder for somebody to get that information and then to do something with it, because the internet just was not developed as it is now. we have got a program in place
2:32 pm
to scan for pii in those older documents. as we digitize them, we are working to redact that from our digital collections. >> thank you for that. you routinely publish documents from across the federal government, so you are not the original custodian of all the information published. how does this affect your privacy policies and practices? >> so, we are often the middle person in this transaction. a good example is the one i gave in my testimony, about u.s. court opinions. as you can well imagine, there are a lot of opinions that folks may not be happy about being out there in the world or easily searchable. that serves a greater public
2:33 pm
good, but there is that problem with, you know, maybe having an opinion where there is information in there that somebody is uncomfortable with having out in the public. what we have done is we have a number of different places -- if you go to gpo.gov/privacy, it talks about our entire privacy program, including who you need to talk to over at the administrative office of the u.s. courts to deal with a court document that may have issues. we revamped our askgpo system, now powered by a modern contact management system. and there, we have also got links, so that if folks find an opinion or something like that that raises an issue, they can ask about it, and we can route that
2:34 pm
to the proper person and make sure they get the information to talk to the administrative office of the u.s. courts to get that particular record sealed, if that is appropriate. >> i yield back, madam chair. >> thank you very much. i recognize the gentleman from georgia. >> thank you, madam chair. good to see you again, and i appreciate the work you are doing at gpo. you said something a few minutes ago that was music to my ears. i've been saying this ever since i've been in congress. the lesson i learned working in intelligence, on the i.t. side of intelligence, when i was in the air force -- one of the primary principles that was just engrained into every policy we had is, you don't have to secure what you don't have. if you don't need it, then discard it. we had procedures built around that, to analyze data, whether
2:35 pm
we needed it or not, you had to justify keeping it, and then it was destroyed. the problem we have -- and i have been saying this ever since i've been in congress, including on the financial services committee dealing with the same issues -- is that there are many in congress and the federal government who are not applying that principle. so much legislation is being passed that is getting the federal government to grab more data or forcing businesses to acquire more data on individuals, and somebody is keeping and storing that -- data that does not need to be kept. so i appreciate hearing you say that, because there are very few government officials and people in congress that you ever hear say that: if you don't need it, do not keep it. so, has gpo experienced any data breaches or identified attempted attacks in recent years? >> so, yes, on the attacks side,
2:36 pm
no on the breach side. my i.t. security folks are doing their job. to the best of our knowledge, we have not had a breach, certainly during my tenure, or even in the years leading up to it. shortly after i became director of gpo -- i still remember it -- i was at an event on a saturday night at my synagogue when i suddenly learned that somebody we believe was affiliated with the iranians had defaced the front of one of our sites. and the good news is, that is how far they got. we were able to restore that
2:37 pm
site. we are in the process of moving to a more secure infrastructure. and, you know, the good news is, my team has really good processes in place to make sure systems are secure and updates are tested thoroughly before we do those upgrades -- so we don't necessarily have the exposures you might have from unknown vulnerabilities. we really work hard to get that right. >> it sounds like you are doing a good job at that, because if you had answered my question with, we haven't had any attempted attacks, i would have said then you are not doing your job -- because i guarantee you that you are being attacked, and if you don't know it, there is something wrong. also, the updates are crucial. that is what caused the equifax data breach. it was simply a failure of the i.t. department to not do a
2:38 pm
firmware update that needed to be done. i'm really comfortable with the answers you've given. in the 30 years i spent in i.t. and data security, i can tell you, the question is not if you are going to be hacked -- it is when. it is a race to stay ahead of the bad guys. i use the analogy of when i used to hunt in alaska. we would go bear hunting and put on a pair of tennis shoes. if we see a grizzly, i want to have my tennis shoes on. they would say, you cannot outrun a grizzly, and i would say, i just have to outrun you. you have to stay a step ahead of someone else. it sounds like i've already heard the answer to this question -- but do you have all
2:39 pm
the tools you need to keep carrying out this level of privacy and security, or are there additional things we can do in congress to help? >> at the moment, i think we are in pretty good shape. we have the flexibility and tools that we need. obviously it is always a question of resources. but as we start automating processes at gpo, we are really focused on figuring out how we make our dollars go farther. that's been one of the hallmarks of gpo throughout its history. we are putting that emphasis on security, and trying to do it with the tools we've got, and where we don't have tools, we make the investments we need. to some extent, that's one of our advantages. only about 12% of our total
2:40 pm
revenue comes from appropriations. the rest is what we charge our customers. the state department expects us to keep their passports fully secure. we spend a lot of money doing that, but the state department reimburses us for those investments. we are comfortable with where we are, but we are always happy to talk about things and figure out how we can do better. >> this is an ever-changing environment. it is a job that's never done, because once you think you are secure, if you are not staying on top of it, the bad guys are going to beat us. thank you, madam chair. i yield back. >> thank you for the information. i appreciate your commitment to safeguarding that pii.
2:41 pm
what i was kind of interested in is, you talked about some of the digital versus the historical documents that you are charged to protect. your testimony speaks to that and says that 97% of federal documents now are digital. what distinctions does gpo make between digital and physical documents? how has gpo transitioned to the digital age, and what are the privacy risks associated with that on your operational side? >> thanks so much for that. the short answer is, for new documents that we produce, we have a pretty airtight system for scanning those documents as we are producing them, looking for pii. with the federal register, they
2:42 pm
are under the opm policies regarding pii, so there's not a lot that's coming in. we do need to check the congressional record, and we work very hard to redact any pii that might appear there. i think even congress is being more circumspect about that stuff going forward. our biggest problem is stuff that exists in tangible form -- print on paper -- that was printed historically, 20 to 30 years ago, even a lot longer. and what we are running into are two issues there. one is, as we are trying to digitize that information, we are taking committee hearings from the 70's or the 80's and trying to digitize
2:43 pm
those. there, our contractor and library team are looking at those documents and, where we see pii, redacting it from the digital copies that come online. we don't have the capacity to travel the world and find every copy of the congressional record from the late 1800s forward, check those volumes for pii, and redact them. so we are in some respects reliant on our partners, the federal depository libraries, and other libraries across the world, where a lot of this information goes. that said, as i mentioned before, i think because that information is comparatively
2:44 pm
hard to access -- you need to go to a physical place, find a physical volume and open it -- it is a lower threat than what's out in the digital domain. it's not a zero threat, and where we find pii, we make an effort to try and redact it. but we just don't have the resources, in terms of people, money, all of those kinds of things, to do a full scrub of every place these documents might exist in the world. >> understood. how do you kind of keep up -- what is the digital hygiene? how do you stay ahead of the evolving threats in the digital privacy landscape? is it through the opm guidance? is it through contractors?
2:45 pm
how do you stay ahead of the risks that are in front of you? >> so, one of the great things about being a legislative branch agency is that we can take best practices from all across the government. we are not necessarily confined by what opm provides. i know our i.t. security folks work with their peers in the legislative branch, as well as more broadly across the government. as we do work for our other federal customers, we are obviously talking to them, figuring out what their requirements are and how we can meet those. so there's a variety of avenues for us to be exposed to the changing landscape and what the requirements are. we are always going to make sure our systems are designed in such a way as to protect the security
2:46 pm
of those documents. whether it is producing smartcards or the u.s. passport, one of the reasons they come to gpo is because we can produce those documents securely, in government facilities, with the highest quality you are going to find any place in the world, and do it at a reasonable price and timeframe. we've got some advantages there. >> i yield back. >> thank you very much. the gentleman is now recognized. >> thank you, madam chair. we are looking forward to getting back in person before too long, but i appreciate you being here. i think in a future committee hearing, we will get an opportunity to talk about some of your legislative branch colleagues. i can't tell you how often i hear from folks across wisconsin about the importance of data
2:47 pm
privacy. they are thinking about being hacked. there's a timely politico article that came out a little bit ago talking about reporters being hacked through apps. it is nonstop that this is something we are talking about, and it is important we are talking about it here. regarding the proposed piece of legislation that would mandate gpo action in the data privacy space, the online privacy act of 2021 we are talking about, were you consulted in the drafting of the legislation, specifically section five? >> we were not consulted at the front end, although we did have some subsequent discussions with the committee. from my reading, it is a fairly flexible standard in the proposed legislation. it is something we think we can meet with the program we are running today. it will give us an opportunity to take a look and make sure that if there are any places
2:48 pm
where we need to tighten things up, we definitely can do that. >> i appreciate your commentary on that. that's helpful. under privacy laws, there's a balance to strike, and ways to minimize compliance costs for small businesses while still providing appropriate levels of protection to individuals. can you comment on how you strike that balance? >> so, for most of our interactions with small businesses, that is through the print procurement program. the requirements there are largely driven by our customers. that said, there are any number of small and medium-sized businesses that are able to participate in our print procurement program.
2:49 pm
i was just up in pennsylvania, i think it was a few months ago, and we visited with one of our contractors who does a lot of work with gpo. we use that contractor specifically because they have a great system for keeping track of information and keeping track of documents with sensitive information in them. they do a really fantastic job of handling those kinds of jobs for us, and doing it in a way where we are confident that that information is safe and secure. i think there are some other companies across the country that can do that work for us, but there are not a ton, to be 100% honest with you. we have run into problems where, say, one contractor has
2:50 pm
experienced labor shortages, which isn't something that is uncommon these days, and there are not a lot of other places to go. but some of the small and medium-sized businesses we work with have gotten really good at having internal systems that can protect that information, and frankly, sometimes they do a better job of it than some of our larger contractors. >> i appreciate that feedback. i yield back, madam chair. >> thank you, madam chair. what measures does the gpo have in place in the event that one of its publications contains nonpublic personal information about a private individual, especially information of a sensitive or otherwise potentially harmful nature? >> absolutely. as i mentioned, we have both automated and manual systems to
2:51 pm
redact information that is found in our digital collection. so for instance, let's say for the sake of argument that something slipped through in tonight's edition of the congressional record -- there was something that had some pii in it, and all of our checks failed. the good news is, when that is brought to our attention, our team can go in and change the record in govinfo, our trusted digital repository, and redact that information out of the digital record. we can't really fix the printed copies, but the volume of those copies for something, say, like the congressional record is much smaller than it was a decade and a half ago. to give you an idea, in the
2:52 pm
1990's, the daily circulation of the congressional record was north of 20,000 copies a day. today, we are only producing about 1,500 copies a day. so it is a much smaller universe in the printed world, and we have tools in place to fix any problems that might occur on the back end through our digital systems. >> when you were talking about the court systems, that caught my attention, because that's where i spent my pre-congressional career, and certainly judges will talk about a lot of facts as they are ruling on a case. would an individual have the right or ability to get personally identifiable information deleted, redacted, or otherwise remediated? >> the way it works right now, it is not our process, it is the court's process. so they can petition the administrative office of
2:53 pm
the u.s. courts. it is up to the judge to decide whether or not those particular records were properly filed as part of the record. depending on that decision, we will either pull the record out of govinfo or put in a redacted version, depending on what the constraints are of the judge's order. as i mentioned in my testimony, this is sort of that gray area. it doesn't necessarily fall into the category of the standard pii -- people's social security numbers, other key identifiers. but it may describe facts that somebody is uncomfortable with. in those cases, we are not going to be the best folks to make a decision as to whether that needs to stay in the public
2:54 pm
domain or it needs to be removed. the judge involved in that case or another judge really needs to be the person who makes that decision. and depending on what the decision is, we will act accordingly. >> what about something within your domain? is there a right of the individual to get pii redacted, like a social security number or a card number or something? is there a right to have it removed? >> i can get back to you on what the statutory background is on that. if we come across that information and have the ability to redact it, primarily from our digital collection, we are going to do so. and if somebody brings that to our attention, it may not be the person whose information
2:55 pm
that is, could be a third party, but if we become aware of that, and we agree that is pii, we will make the effort to redact that from the digital side. >> thank you. i yield back, madam chair. >> i recognize the gentlelady from new mexico. >> thank you so much, chairwoman. thank you for holding this hearing. i can tell you this issue comes up a lot. as i talk to my constituents, they are concerned. we are concerned in regards to the manner in which our data is protected. i think you would agree that the need for trust will grow stronger
2:56 pm
as more of our lives move into the digital space. i appreciate comments on how gpo and other federal agencies are discussing best practices with each other. can you elaborate a bit more on what we are doing here in the legislative branch that we might strengthen, based on the information you are hearing and sharing with the other federal agencies, so that we can make sure that we have those protections against cyber attacks, as well as protecting individuals' personal information as you described. >> i will do my best. it happens on a number of different axes. that is probably the best way to think of it. as i mentioned before, our computer information security folks have a lot of back-and-forth.
2:57 pm
there's a lot of discussion about what are those best practices? how do we secure our systems? particularly between gpo and the congress, there's a lot of interchange that occurs on a daily basis. you are sending us information. we are sending it back to you. and we are working with the library to make things available via congress.gov. so there is a lot of discussion about those particular flows. and frankly what our cso's are experiencing day-to-day. where are these attacks coming from? how do you secure against them? and what are the appropriate countermeasures? the other area we are working on is our obligation as an employer. i just got the update this
2:58 pm
morning. right now, we are at 1547 people working directly for gpo, plus a few hundred contractors. we have an obligation to secure that information. payroll information, health information, all sorts of things. the house has a similar obligation. our officials are talking all the time about how we maintain our systems, both from the cyber standpoint and from the more mundane operational standpoint. and as i mentioned before, my safety and security team literally walks our building. we have a million square feet under our roof. we walk every inch of that building. one of the things they look for is, is there pii not being stored properly? it's a little bit harder to do in the congressional space.
2:59 pm
i spent 30 years working in the house. it is a little bit different there. but some of those same practices, whether it's the secretary of the senate, the house, all of us are talking all the time to try to figure out, what is the best way forward? and there's a lot of information sharing that goes on. >> thank you very much. i take it that you have no control over the fact that every search is saved. as individuals might be searching within the congressional record for information, tell me, do you have any way of protecting any of those searches? or is that something left to the individual, with regards to his or her computer use and privacy? >> it is a great question.
3:00 pm
i will get you a more detailed answer as to the information we retain and don't retain -- the short answer is we try to retain the minimum amount of information we possibly can about searches in govinfo or our other systems. in the case of govinfo, we use the smallest, most mundane kind of cookies we can, we don't track people across the internet, and frankly unlike congress.gov, we don't have logins, so we don't persistently save those searches for folks. >> i think my time has expired. thank you and your organization for respecting the privacy of the users and the documents and the searches.
3:01 pm
i yield back, madam chair. >> thank you. i just want to commend mr. halpern and your staff for your proactive work protecting privacy. it is really admirable. and in your statement, you said if you don't save the material, it can't be abused. and that is actually the principle of the online privacy act, to really prevent the unnecessary retention of information, because if you don't have it, you can't data mine it, you can't abuse it. that is what you are doing in your agency. and something that i think we want to look at. because if you are using a search engine to search something in the congressional record, the engine retains it even though you don't. so you are doing the right thing, but people's searches or
3:02 pm
inquiries can still be abused and manipulated. two quick questions -- first, you are doing everything in your guidance and directives because you want to do the right thing. but your policies are not required by law at this time, are they? >> no. the short answer is, it is one of the blessings and curses of being in the legislative branch. i actually am a big article one guy. i actually like the flexibility that this gives us, to survey the landscape and pick the policies that meet our values. and our values are -- we are
3:03 pm
going to protect our customer, we are going to be transparent with folks. and we are going to do everything we do, whether it is with our folks in the building or our customers, in a respectful way. part of being respectful is protecting that data. the good news is, while there may not be statutory requirements that say you have to do this, one, we believe our oversight committee expects us to do that, and it is the right thing to expect. and two, our customers expect us to do it. those are driving imperatives for us here at gpo. and that helps us get in a position where it's easier for us to do the right thing that we were planning to do anyway. >> just one final question. in your written testimony,
3:04 pm
you mentioned that gpo distinguishes between high-impact and low-impact pii. i am interested in how or whether you reevaluate the level of risk for different types of pii that may change over time. and as you classify that, do you consider the potential impact when information is aggregated by third parties in combination with other personally identifiable information? for example, through the use of machine learning or other data processing techniques that are outside of your agency. >> we are always trying to learn. there are a lot of very clever folks all across the country, who are coming up with new ways to use that sort of ubiquitous data that is out there.
3:05 pm
but we do draw a distinction between certain kinds of pii. those unique identifiers -- your social security number, the card numbers or passport numbers the department of state or department of homeland security use -- those are really key pieces of information that have the potential to unlock so much else, including biometrics and a whole slew of other things. we are going to work much harder to protect that information. lower-impact data are things like addresses, photos, names. that stuff is really kind of ubiquitous. for instance, when you do a special order, and you are congratulating the coach of your high school football team, you are going to call that person by
3:06 pm
their name. that is pii, but we are not going to redact that from the congressional record because, one, it is not a huge security risk, and two, if we did, it would totally eviscerate the point of that special order. >> exactly. i want to thank you, director halpern, for your great testimony today. as you know, from your years here on the hill, we do keep the record open, if we have additional questions. so if that occurs, we will send it right on to you, and i know you will get back to us. i just want to thank you for being here with us today. it's been very informative. please let your team know how much we think of them and how proud we are of them. >> thank you very much. i really appreciate it. >> thank you. we will now go to our second panel of witnesses. let me briefly introduce them. first we have shoshana zuboff,
3:07 pm
the charles edward wilson professor. she is an internationally recognized expert and the author of several major works on digital privacy issues, such as "in the age of the smart machine: the future of work and power" and "the support economy: why corporations are failing individuals and the next episode of capitalism." her most recent book investigates the new surveillance economy, which is driven by consumer data. our next witness is ms. fitzgerald, the deputy director at the electronic privacy information center, or epic. she works to advance strong privacy, open government, and algorithmic fairness and accountability laws.
3:08 pm
the next witness is marshall erwin, the chief security officer of mozilla. he focuses on data security, privacy, and surveillance. he has previously worked as a counterterrorism and cybersecurity analyst, served as the counterterrorism and intelligence advisor to senator susan collins on the senate homeland security and government affairs committee, and was an intelligence specialist at the congressional research service. finally we have daniel castro, vice president of the information technology and innovation foundation, and the director of the center for data
3:09 pm
innovation, a research institute focusing on the intersection of data, technology, and public policy. your entire written statements, which by the way are excellent, will be included and made part of our public record. that record will remain open for at least five days for additional material to be submitted. we ask that you summarize your testimony in about five minutes, so the members of the committee will have time to pose questions to you. first, professor, we would love to hear from you. >> thank you to the committee. i am delighted to have -- oh, you muted me. shall i start again? members of the committee, thank you so much for this opportunity to discuss the challenges of
3:10 pm
privacy and privacy law in a world without privacy. i've spent the last 43 years of my life studying the rise of the digital as an economic force driving our transformation into an information civilization. over these last two decades, i've observed as the fledgling internet companies morphed into a sweeping surveillance-based economic order, founded on the premise that privacy must fall and powered by economic operations that i have called surveillance capitalism. surveillance capitalism maintains core elements of traditional capitalism -- private property, commodification, market exchange, growth and profit. but these cannot be realized without the technologies and social relations of surveillance.
3:11 pm
these are methods of secretly extracting human experience until recently considered private, translating it into behavioral data, and robbing people of the right to know and the right to combat it. that data is then available for aggregation, computation, prediction, targeting, modification, and sales. surveillance capitalism begins with this property claim over private human experience. it was invented at google during the financial emergency of the dot-com bust. it migrated to facebook and became the default model of the tech sector. it is now reordering diverse industries from insurance, retail, banking, and finance, to
3:12 pm
agriculture, automobiles, health care, and much more. as one executive recently described it to me, all software design assumes that all data should be collected, and most of this occurs without the user's knowledge. a condition that's only intensified during these two years of global plague. the abdication of these spaces to surveillance capitalism has become the meta-crisis of every republic, because it obstructs the solutions to all the other crises, which will require information integrity and the sanctity of communications.
3:13 pm
these spaces are governed by private commercial interests for profit maximization, while almost entirely unconstrained by public law. no democracy can survive these conditions. the facts i describe here reflect a larger pattern. the united states and the world's liberal democracies have thus far failed to construct a coherent political vision of a digital century that advances democratic values and principles in government. the chinese, for example, have in contrast focused on designing and deploying digital technologies in ways that advance their system of authoritarian rule.
3:14 pm
democracies have let their citizens march naked into this decade of surveillance capitalism, without the rights, laws and institutions necessary for a democratic digital future. instead, we have stumbled into an accidental dystopia, a future that we did not and would not choose. in my view, we must not and cannot allow this to be our legacy. survey evidence, some of which has been referred to here, now shows americans are moving ahead of their lawmakers in a massive rupture of faith with these companies. we see extraordinary majorities calling for action to curb the unaccountable social power of these firms. a realization that the emperor of surveillance capitalism not only has no clothes, but is
3:15 pm
dangerous. a clear sense that human rights, the durability of society, and democracy itself are on the line. what was undiscussable has become discussable. to end, we are still in the very early days of our information civilization. this third decade is a crucial opportunity to build the foundations for a democratic digital century. democracy may be under siege, but it is, paradoxically, the kind of siege that only democracy can end. thank you so much for your attention. >> let me now call on ms. fitzgerald for her testimony. >> thank you to the members of the
3:16 pm
committee for holding this important hearing today. epic is an independent nonprofit research organization in washington, d.c., established in 1994 to protect privacy and freedom of expression. for over 25 years, we have been a leader on privacy issues in the public and private sectors. the united states faces a data privacy crisis. companies invade our private lives, spy on our families, and gather the most intimate details about us for profit. these companies have more economic and political power than many countries and states. we are profiled and sorted into winners and losers,
3:17 pm
based on our personal information. private companies are not the only problem. government agencies have also dramatically increased their use of personal data, often purchased from private companies, while failing to address the significant risks to cybersecurity. the impact of this uncontrolled data collection and use by both private companies and the government is especially harmful for marginalized communities, fostering systemic inequities. these industries and systems have gone unregulated for more than two decades. and this is where it's left us. the system is broken. technology companies have too much power. and individuals have too little. to restore the balance, we need comprehensive baseline privacy protections for every person in the u.s., changes to the business models that have led to today's surveillance system, and limits
3:18 pm
on access to personal data. i'm going to detail the crisis we face and the elements of a strong privacy law, but i want to focus on the importance of enforcement. without strong enforcement, many businesses will simply ignore privacy laws and accept the small risk of enforcement action as the cost of doing business, as we have seen in europe and several states. enforcement must include a private right of action. this is not new. congress included this in the cable communications policy act, video privacy protection act, and the driver's privacy protection act. the statutory damages set are not large in individual cases, but they provide powerful incentives for companies to comply with privacy laws. we also urge congress to establish an independent data protection agency. the u.s. is one of the few democracies in the world that
3:19 pm
doesn't have a data protection agency. the u.s. government should be a leader in technology policy. when you think about the presence of technology in our lives and economy, it seems obvious that we should have a federal agency dedicated to overseeing and regulating it. such an agency can promote innovation and competition while protecting privacy, and level the playing field for smaller technology companies who currently struggle to compete with tech giants. it could also be a central authority within the federal government on privacy issues. we saw a glaring example of the failure to consider privacy risks in data collection just this month. the irs sparked outrage when it required taxpayers to submit to facial recognition in order to access tax records online. thankfully, due to pressure from advocates,
3:20 pm
the irs backtracked. a data protection agency could ensure that privacy is carefully considered when agencies enter into contracts like this. there is broad support for this idea. recent data show 78% of americans across the political spectrum support establishing a data protection agency. the good news is members of congress and this committee are already working to restore privacy online for americans. the online privacy act is a comprehensive framework with strict limits on the collection and use of personal data, protections online, and strong enforcement mechanisms via a private right of action and the creation of a u.s. data privacy agency. it is time for congress to act. thank you for the opportunity to testify today. >> thank you very much. we will turn now to mr. erwin
3:21 pm
for your testimony. >> thank you for the opportunity to testify today. my name is marshall erwin. i am the chief security officer of mozilla, a nonprofit organization and open source community. our products include the firefox browser, used by hundreds of millions of people around the world. we are constantly investing in security and privacy and building a healthier internet. privacy online is a mess today. consumers are constantly under attack. they are stuck in a vicious cycle in which their data is collected without their knowledge or understanding, then shared and used to build profiles about them. the data is then used to target them in ways that can be harmful. i will talk about what mozilla is doing to address this problem and what congress needs to do. there have been two focus areas for us --
3:22 pm
we drive major initiatives like the growth of encrypted web traffic, up from below 50% a few years ago. what this is doing is protecting your browsing activity and your information from attackers in the middle of the network who would otherwise collect that information to build profiles about you. the second big area of focus for us has been on eliminating what we call cross-site tracking. these are parties following you around from website to website, collecting information about your browsing activity, and building a profile about you. in 2019 we turned on what we call enhanced tracking protection in the browser. we turned it on by default because we don't think consumers should have to protect themselves from opaque risks they can't see or understand. that's a little bit of what we've been doing to try to address the problem.
3:23 pm
first, on federal privacy legislation, i will talk a little bit about transparency and online harms. in our view, technical privacy protections from companies and baseline regulations are complementary and necessary. neither alone is sufficient to create a less permissive environment overall. we can't solve every privacy problem with a technical fix. to give you one example, we know dark patterns are pervasive across the software and applications people use. unfortunately, there is little a browser can do. this is where laws and enforcement regimes have to step in.
3:24 pm
mozilla supports privacy and data protection laws around the world, including in the u.s. we must enact privacy protections to ensure public and private actors treat consumers fairly, with strong privacy rights for people interacting with those entities, and with agencies empowered to take enforcement action. solutions must provide accountability for online harms. what we see today is a direct result of the data collection happening. we know from disclosures that recommendation systems and targeting systems can have really pernicious [indiscernible] the systems are really powered by people's data. it is easier to discriminate against people and manipulate and deceive people if you know more about them. this is why privacy is
3:25 pm
inherently [indiscernible] this harm is mostly hidden from the public and regulators. that is why we have called for things like the establishment of an investigation into the activities of major online platforms. this will protect research in the public interest and help us better understand harms happening on these platforms. there's a real public benefit that can be had here. we have been leading the push for full ad disclosure in the european union. we are encouraged by recent proposals in congress that would require disclosure of ads for public benefit and understanding. this provides transparency into the opaque world of online advertising. it can be done without creating privacy risks. at mozilla, we advance privacy technology to ensure that privacy considerations are at the forefront when
3:26 pm
considering how to protect consumers. we look forward to the discussion. >> thank you, mr. erwin. now we turn to our last witness, mr. castro, for your testimony. >> thank you. chairperson, ranking member, members of the committee, i appreciate the invitation to speak with you today. u.s. data privacy is at a crossroads. many consumers are justifiably frustrated about the seeming lack of accountability for the misuse of their information. many businesses are overwhelmed by a tsunami of new data protection obligations and ongoing restrictions on how they can use personal information for legitimate business purposes, and are often confused by the multitude of ever-changing laws and regulations. three states -- california, virginia, and colorado -- have passed comprehensive privacy legislation. many other states are
3:27 pm
considering it. over the past three years, state legislators have introduced a total of 72 bills, with more coming. these new privacy laws can impose significant costs on businesses, decrease productivity, and undermine the ability to responsibly use data to innovate and deliver value to consumers. moreover, the laws create high costs not just for in-state businesses, but also for out-of-state businesses, which can find themselves subject to multiple laws. california recently enacted a privacy law and it will likely cost $8 billion annually. but california's economy is $46 billion. in the absence of federal data privacy legislation, state laws could impose out-of-state costs between $98 billion and $112 billion annually. over 10 years, the cost would
3:28 pm
exceed $1 trillion. the burden on small businesses would be substantial. the u.s. needs a new federal data privacy law. it should not back away from the approaches taken to regulate the digital economy. instead, it should focus on the following goals. first, data privacy legislation should establish basic consumer data rights. congress should give individuals the right to know how organizations collect and use their personal data, and to know when their information has been part of a data breach. congress should give consumers the right to access, delete, and rectify data in certain contexts. consumers should be able to obtain a copy of their financial data. lawmakers should establish national privacy rules that preempt state and local privacy laws, so consumers have the same protections regardless of where they live. companies should not be faced with 50 different sets of laws and regulations. a patchwork of state laws would
3:29 pm
vary in definitions and standards, creating a complex regulatory minefield for businesses to navigate, especially if potential violations put them at risk of costly litigation. there must be robust enforcement of federal privacy laws. congress should not create a private right of action, and should instead rely on federal and state regulators to hold organizations accountable. illinois, which has a private right of action in its biometrics law, has seen hundreds of lawsuits in the past few years, even when there has been no consumer harm. to avoid unnecessary litigation, businesses should have a reasonable period of time to address a violation without penalty in cases with no consumer harm. congress should set a goal of repealing and replacing potentially duplicative or contradictory federal privacy laws, from major sections on health and financial data to narrower ones, with different
3:30 pm
sets of rules and definitions to comply with. legislation should minimize the impact on innovation. congress should not include provisions that reduce the collection, sharing, and use of data, limiting innovation. congress has the benefit of hindsight to look at existing laws, particularly the gdpr, the eu's general data protection regulation. it has imposed massive compliance costs on businesses around the world. fortune 500 companies have spent $8 billion to comply. and it has not delivered on many of its goals. the eu's own survey shows that the law has had no impact on consumer trust in the internet. the majority of companies also report that the most expensive requirements serve no valuable business function. policymakers should be aware of
3:31 pm
the unintended consequences of poorly crafted legislation, such as not considering the application of the law to artificial intelligence and blockchain. congress should note that the gdpr's primary purpose was to harmonize data protection laws across the eu. if states pursue their own laws without preemption in the united states, we will get the fragmentation the eu sought to dissolve. it is essential for congress to act swiftly on privacy legislation that preempts state laws and establishes data rights without taxing innovation. thank you for the opportunity to be here today. i look forward to any questions. >> thank you, mr. castro, and to all of our witnesses. we will go to members of the committee who may have questions. i will turn first to our ranking member, mr. davis. >> thank you. thank you again to our witnesses for your testimony. mr. castro, let me start with you. there have been proposals introduced in congress that if
3:32 pm
implemented, would create a right of action for individuals and a collective right of action for nonprofits to sue on behalf of individuals. is this the best approach for accountability in a privacy framework? >> no. right now, we have an effective regulator in the ftc. they take up the slack. we have seen this in states that have implemented these rights. with the biometrics law in illinois, once a court ruled that lawsuits could be brought under the law without harm to consumers, we saw lawsuits. they are driven by class actions, lawsuits where the main result has really just been attorney fees, and this leads to significant settlements -- over $100 million against facebook, and suits against adp, walmart,
3:33 pm
tiktok. the list goes on. these are often mere technical violations. that is not an effective use of dollars that could instead be invested in technology and processes that would protect consumer privacy and keep their data secure and protected from harm. that is taking money away from actual protection of consumers. >> thank you. i'm sure you have read the reports about the special counsel's allegation that political attorneys were specifically infiltrating servers in the trump organization and the white house. would nationwide data privacy legislation have protected the president? >> it depends. certainly, privacy legislation would restrict what companies might be able to keep on hand,
3:34 pm
and what they might share. but at the end of the day, what you really want to see is better security, better transparency -- something so that consumers will know what they're getting. the trump campaign would know what kind of privacy controls they are getting when they purchase a service. they would know how the data is shared. more transparency is what we should be looking for in the future. >> i agree, transparency matters. but what about penalties for bad actors if allegations are borne out? do you know if there would be any enforcement penalties if these allegations were borne out? >> the ftc can impose penalties on companies for violations. particularly as most companies say in their terms of service that we respect your privacy and we will secure your data. if they do not do that, it is a misrepresentation, and the ftc can and has gone after companies for that.
3:35 pm
we certainly can consider giving the ftc more authority. if we are increasing their budget, we should say exactly what we want them to do with that additional money. i think that is the best way to go forward, but we need to be careful about giving the ftc latitude to start doing its own rulemaking that goes outside of congressional intent. it is very risky that the ftc might decide they want to decide how to design a website or a mobile app, all under the name of ensuring consumers are not being manipulated. on the other hand, the ftc is not making it up. they have engineers. >> very good points. this agency does exist to address concerns already. the online privacy act of 2021, referred secondarily to this committee, calls for the creation of a new federal agency called the digital privacy agency.
3:36 pm
not only is this a clear example of congress shifting its authority to the executive branch, but it also seems like an extreme waste of resources given the existing structure already established in the federal trade commission and with the state attorneys general. do you believe that a new federal agency is necessary to accomplish the goal of data privacy? >> the ftc has the authority and the reputation and experience to do this job. we don't need a new agency. i'd also mention that when we think about protecting consumer privacy, it overlaps with fraud and security, and that is something the ftc should be focused on. a new agency would split up the mission, to the detriment of consumers. >> thank you for joining us. to all of the witnesses -- i yield back the balance of my time. >> mr. raskin, you are now recognized for five minutes. >> thank you madam chair.
3:37 pm
good afternoon. i have a question about the aggregation of data. the aggregation of data in large quantities carries risks that might not be apparent when talking about a single piece of discrete information. i'm wondering what can be done about that. >> the lesson we can take is how important it is to have a comprehensive privacy law that covers personal data generally, rather than the sectoral laws we have now. the acronyms go on and on, but they leave huge gaps of protection for personal data not covered by those laws. and it also shows that notice and consent doesn't work. the system is just impossible for the vast majority of internet users to fully grasp. they may have consented to sharing
3:38 pm
data with a website, but that does not mean they've consented to that data being passed on to hundreds of brokers they've never heard of. we saw an example when a catholic priest was outed by a news outlet using location data from the dating app grindr. there was a response, including a letter to the ftc urging rulemaking to strengthen protections for location data. it could also be integrated into privacy legislation. >> thank you. >> i've read your book about surveillance capitalism. it was hard to synthesize an entire book. but who is big brother today? is it the state as we traditionally conceive it, or is it the corporate sector, or some combination? >> well, thank you so much, mr. raskin, and i am
3:39 pm
flattered and thrilled to hear that you have read my book. thank you. as you know, beginning with the book, i developed a concept i called the "big other." big brother was the notion of a totalitarian state which had a very specific ideology and wanted people to believe certain things, and talk certain ways and act certain ways. we are in surveillance capitalism, not a totalitarian state. surveillance capitalism does not care what you think. it doesn't care what you believe. it doesn't care how you act. what it does care about is that you think and believe and act in ways that allow it to capture the data. to do the aggregation that
3:40 pm
you're asking about. with aggregation, there's the possibility of computation: applying artificial intelligence, coming up with predictive algorithms and targeting methodologies. with targeting methodologies, various goals are achieved, including increasing engagement, which is a euphemism for increasing the footprint for greater data extraction. and also, as we increase engagement, using our knowledge from deep computation and ai, using our knowledge about things like subliminal cues, selective micro-targeting, and engineered social comparison dynamics, we really know a lot about you, and in that way we can use messaging and actually begin
3:41 pm
to modify your behavior, and to do that in a way that is consistent with the commercial objectives of the company and its business customers. at the end of the day, this big other is ubiquitous. it is pervasive. it is full-on control, full-on opacity, full-on power. >> we have not yet enacted comprehensive federal privacy statutes to address the rise of the big other. other countries have attempted to do so. what are the best efforts at
3:42 pm
comprehensive privacy regimes to protect people's possessory interest in and ownership of their own characteristics, their own behavior? >> the gold standard now, and i look to the agreements reached a few weeks ago, is the digital services act and the digital markets act in the eu. these are not the total solution. they have very specific things to accomplish. but they go far in setting a new standard, a new frontier, for the rights of users. when i use "user," i use it with air quotes. users are all of us. it is a synonym for humanity at this point. these pieces of legislation are ambitious and comprehensive.
3:43 pm
they have the buy-in of the parliament and the member states. there are a few details left to be worked out in the spring, but these set a new standard for the rights of users and the enforcement powers of the government. to really, for the first time, put democratic governance back in control of our information and communication spaces. to work with the private sector rather than having the private sector be what it has been for the past two decades: a completely autonomous force that is unimpeded by relevant law. all the social harms we have described this afternoon are products of that situation. >> thank you so much. the gentleman's time has expired. we turn to the gentleman from georgia, mr. loudermilk. >> thank you madam chair.
3:44 pm
very interesting topic that we are talking about. i have been dealing with this for quite some time. mr. castro, one of the issues i have seen from a financial services perspective, and i assume it is the same here, is the multitude of data privacy standards that we have across the country in the different states. you mentioned in your testimony that 34 state legislatures have introduced bills in this space since 2018. i want to get your thoughts on the importance of having a uniform federal standard for data privacy that will go across the board. >> i really appreciate that question. thank you so much. for so many companies, especially smaller companies, when we think about data privacy, we are often thinking about facebook and google.
3:45 pm
these laws impact everyone. they impact local florists and barbershops and grocery stores. complying with those laws can be very expensive and very difficult. even though, at the end of the day, most of these laws are based on the same thing, they each say it a little differently. you have to comply with those laws so that you are not sued, especially if there is a private right of action, for making some technical violation. if you have not, for example, put a certain policy on your website, you might be in violation. all of these obligations add a cost to businesses. these businesses, instead of hiring engineers or giving a raise to their workers, have to go out and hire a privacy lawyer, and privacy lawyers really like privacy laws because it is good business for them.
3:46 pm
the issue is not that we need more laws; it is that we need one good law. i'm glad to see we are having this conversation because i think we can get to that one good law. we should get there sooner rather than later, because the more we wait, the higher the cost gets. >> what effect does this have on american competitiveness in the international marketplace? >> it is very significant right now. when we think about a lot of u.s. companies, they are doing a good job of protecting privacy, better than companies in china. but, because we have no federal law, and we have a sectoral approach, other countries that have not taken a sectoral approach say the u.s. does not care about privacy. that is not true. i don't think it is true in terms of the ftc and the work they are doing. i think they are doing great work. i think the state laws can even
3:47 pm
be effective. i don't think it's fair to say that the u.s. is not doing good work on privacy. but we don't have a national privacy law, and so often that fact is used to say a u.s. company should not get a particular contract. we are seeing that right now, in part because european lawmakers are saying that the u.s. is not doing enough to protect privacy. one of the best ways we can promote u.s. competitiveness, not just in the tech sector but across the board, is to create a national law so we can show that we take this seriously. >> i think everyone on this committee is committed to, and i think most individuals recognize, the importance and practice of protecting personal data and privacy. definitely, i do as well. i know you do. but with that being said, you mentioned in your testimony a concern that i share as well: the compliance costs for small businesses due to state privacy laws.
3:48 pm
is there a state you can point to that has successfully executed a data privacy law that addresses those concerns? >> i think virginia is the closest we've gotten so far. they are trying to strike that balance right now, but the biggest problem is that when you see the state laws, the state legislature is not thinking about the costs that would be borne by businesses outside the state. again and again, as each new state goes down this path, businesses are forced to hire lawyers, reevaluate their privacy policies, and make technical changes to their systems to be compliant, without any discernible improvement in actual consumer privacy. the goal is consumer privacy, but let's address this top-down rather than have 50 different rules for how to do it and have so much consumer confusion about what is actually going on and who is protecting your data. >> i see my time has quickly run out, so i will thank you for
3:49 pm
that and i yield back. >> the gentleman yields back. >> thank you, madam chair. i want to talk a little bit about the lessons that mozilla has learned from building privacy protection tools, lessons that remain relevant to us in the public sector and for government agencies that we have oversight over. given your products and operations, what are your thoughts on what is applicable to the public space? >> there is a way we approach this question, and it is about our core privacy practices: how we are going to be transparent and provide users with control. typically, it is about data minimization. we are really only collecting
3:50 pm
what we need to run our business without compromising users. frankly, that falls short substantially across our industry. [indiscernible] once we have the data, we work to protect it pretty aggressively. data minimization is a strong feature that is controlled on the back end and needs to be applied in both the public and private sector. we also work really hard on what we talked about earlier: privacy features to prevent tracking in the browser. that is a little bit disruptive for the industry. right now, we have a really permissive internet and a large number of parties building their data practices and their businesses on that permissiveness, which is actually a problem. so it is not a disruption; it is helping. we don't always get a positive response
3:51 pm
when we roll out privacy features. i think that is actually a positive signal that we are moving in the right direction, and we try to keep rolling out privacy features. >> i understand that sometimes, being at the forefront, you take on a lot of folks with different viewpoints as well. ms. fitzgerald, what are some of the best ways that we can mitigate the risks and harms of the aggregation of personal information over time from all the different sources that we know exist? >> thank you. we talked about data minimization in the first panel. again, if data minimization were required of private companies, our data would be in much better shape. we should not have to protect it ourselves; the obligation should be on the companies to minimize the amount of data they are collecting to what they really need. that protects us from
3:52 pm
having information passed on, and it protects us in case of a data breach. >> i appreciate that. mr. erwin, again, what are some of the privacy risks and harms that may not have primarily technological solutions? i guess what i'm trying to ask is: what issues instead require changes to business practices or laws? we are trying to balance the regulatory side here, and what are your thoughts on how we balance those pieces? >> we bump up against this problem all of the time. we work in a browser, so take the example of a website with a third party on it that the user doesn't know about, collecting data. the browser can do something about that third-party relationship. when we see a network request to a
3:53 pm
third party, we can actually block it. that is typically the role of the browser or the operating system: it can jump in and say a third party is tracking your behavior. what we cannot do as much about is when the user has a direct first-party relationship with the website or business. that comes down to a question of business practices, not technical problems. there is little we can do to adjudicate the relationship between the user, who has decided to engage directly with the first party, and something happening that they may not know about. that is where i think regulation needs to step in, where technical signals and regulation truly go together. >> i appreciate that. thank you so much. i yield back. >> mr. salas is recognized. >> thank you, madam chairwoman. i want to pick up where my colleague left off a little bit. i want to shift over to you if i can, ms. fitzgerald, on the topic in particular of
3:54 pm
your testimony. you referenced biometric and genetic data. that is sensitive and deserves strict regulation, correct? >> absolutely. >> in particular, you raised some concerns about how health care data is used. is that correct? >> yes. health care data is particularly sensitive. >> a lot of people think hipaa is protecting our health care data, as it extends to health apps. that is not correct. would you agree? >> yes. hipaa only applies to the relationship with your doctor, your health insurance company and exchanges. it does not cover health apps. >> that is an interesting distinction that a lot of americans might find interesting. building on what was being discussed with mr. erwin, the third-party apps and third-party relationships that exist
3:55 pm
out there, what is the impact they could have? congress passed the 21st century cures act to protect patient data and to require that patient data be able to move from the doctor's office to health care apps of the patient's choosing. there are a lot of reasons that is a good thing, but there are some things we want to get right. you want to be able to make sure patients can control their data. we also want to ensure they don't lose control of their data by sending it to a third party they might not be aware of, one that might find interest in their health care data, their metadata, or otherwise. do you think individuals might be at greater risk if they are sending information to some types of apps under the rules and regulations of the 21st century cures act? >> the reason biometric data is so sensitive is because we cannot change it. we cannot change our face and fingerprints.
3:56 pm
>> let's shift for a second. great topic. let's shift and talk about medical data. does that give you unique concern? >> yes. it can be used against us. when you think about insurance companies, it is determined already. [indiscernible] hiring somebody with health conditions, or health insurance premiums. >> i appreciate that. it is an area we need to really keep an eye on, and i think there are a lot of really positive things there. i think innovations in the united states will allow patients to control their data, and at the same time we want to make sure that patients are protected, in particular where there might be a third-party player involved and a patient might not fully appreciate the risk they are taking in sharing their data. it is something we should really spend some time on. let me shift gears again. thank you. let me shift gears to you, mr.
3:57 pm
castro. there are a lot of laws that govern privacy for different types of institutions. the gramm-leach-bliley act is a federal law that requires financial companies to explain their information sharing practices for some of their data. there is a real opportunity for congress to look at whether current privacy laws are working and to ensure they are not duplicative of regulation. can you shed some light on the passage of the gdpr in the eu? what do we see in that sweeping legislation? has it stifled innovation or grown it? what should we take away from the eu's work? >> when you look at the cost of compliance, when you look at the impact it has had on startups and the tech economy, there are data
3:58 pm
points again and again showing that european businesses are suffering. >> only because we have limited time, can we shift gears slightly and jump over to california? california's consumer privacy act was meant to make things easier for californians. what would happen if we took it and ran it nationally? >> one of the biggest problems we've seen is related to enforcement. one of the biggest positives is the idea of a 30-day cure period, which i think could be extended nationally. >> that being said, thank you. that's my time. i appreciate everyone's time. we have a lot of laws already on the books. we have state laws, we can look at the eu, we can look at financial services, and at what they are doing with the 21st century cures act. i think it is critical that we look at all of these pieces of legislation and learn from them so we can
3:59 pm
protect data and also maintain innovation in the united states of america. we appreciate that. i will yield back. >> the gentleman yields back. >> we hear quite a bit about this on the company and agency side: how do they navigate all these different laws being promulgated, between states but also countries? as we are aware, the internet does not have national boundaries in the same way. ms. fitzgerald, could you talk about that? what are the most important privacy rights you have seen proposed that you think congress should work on adopting? >> the most important thing when thinking about privacy legislation is we cannot just be telling people what companies are collecting about them.
4:00 pm
that is not enough. we need to limit that collection, limit what they are doing with the data, and have them delete it when they are done using it. >> any suggestions about which are the best models for us to look at, whether state, or the eu has been mentioned? >> california has the strongest law in the united states now. the ccpa is the strongest model we have seen, but it only allows users to opt out of the sale of their personal data. it does not so much put obligations on companies to minimize the data they are collecting. the online privacy act is a great model that includes data
4:01 pm
minimization and prevention of discriminatory uses of data. that is important. you also want to make sure you are requiring algorithmic fairness and accountability. we don't want to make the same mistakes with ai that we have made with data collection. let's get ahead of it and encourage companies to innovate around privacy. >> one of the typical tools that people talk about with privacy laws is the notion of consent, and i gather that you are a little bit skeptical about the usefulness of consent as a focus of privacy protections. can you talk about that a little bit? >> there is no way for an individual to understand the web their data gets passed through.
4:02 pm
if you are just pushing a button to make a banner go away on a website, you don't realize what you are saying is: sure, take my personal data. >> we have certainly seen that proliferate recently; to get access to almost anything, you decide to click or you decide not to. so with respect to discriminatory data uses, how do we address that? what are your top-line recommendations there? >> several groups have proposals on what we need to do to make sure civil rights are protected online, making sure that public accommodations protections extend to online spaces. to make sure that the algorithms
4:03 pm
that companies are using are not being used in ways that deprive people of opportunities, you know, for housing, ads for jobs; that our personal data is not being used in discriminatory ways. >> thank you, madam chair. i yield back. >> the gentlewoman from new mexico is recognized. >> our personal data is extremely valuable. we type a question, look up directions, and the companies collect our information and sell it, often without our consent, to a third party, and use our information for things ranging from ads for clothing to
4:04 pm
some things that are more dangerous, and it is invisible to the true owners of the data: me, my neighbors, fellow americans. i want to focus on how data privacy, or the lack thereof, affects our elections and democracy. the elections subcommittee of this committee held a roundtable in miami on the creation and dissemination of disinformation that targets spanish-speaking communities. we have learned how it has been exported to other states, including new mexico, and amplified based on the data harvesting we have heard about today and read about in this testimony. ms. fitzgerald, actors can use our data to undermine our elections' integrity. what does this look like in practice? >> sure.
4:05 pm
thank you, representative. just as advertising companies use profiles about us to manipulate us into purchasing, they can manipulate our views by filtering the content we see. as for the way they can do that: there was a great investigation done after january 6 showing the two different facebooks republicans and democrats were seeing, and it is a stark view of, you know, the different information that both sides are seeing and how that shapes your views. even if someone does not click through to an article, just seeing that headline and having it in the back of their head matters. these companies are able to kind of manipulate our political views by determining what we see. >> professor, i appreciated your written remarks about how democracy is simultaneously the
4:06 pm
only legitimate authority capable of halting surveillance capitalism but also its prime target. i appreciate that and would welcome you expanding on it, but the question i want to get to first: mr. castro suggested that a private right of action is unnecessary and we should focus solely on transparency. do you agree, and if not, why? >> you need to unmute, professor. >> sorry. i don't agree with that. i think the private right of action is very important. you know, when we pass a law, whether it is the california law or any other law, that is the
4:07 pm
beginning. what has to happen is these laws are tested and they evolve and they develop, and, as a society, we learn what they can mean, what they can mean for us and how they are going to protect us in society. so what the private right of action does is create the opportunity not only for individuals but for groups of people, for collectives, to really bring issues into the judicial system, to have those issues explored, and to create precedents. this is what is called the life of the law: how the law evolves and how we can move forward into this century not just with statutes that are frozen in time but with laws that are evolving according to
4:08 pm
what justice brandeis once called the eternal youth of the law, because we have these kinds of interactional processes. >> i completely agree. our civil rights laws rely on a private right of action. i did want to ask you to elaborate, in the few seconds left, with regard to what you believe is most important to protect our democracy in an online privacy act. >> the first thing is that so much of our discussion has been about minimizing data, or how is the data used, and things like that. once we start the discussion talking about data, we have already lost the primary ground. the key thing is that we are all now targets for massive-scale secret data collection, much of which should not be collected in the first place. the decision should lie with the individual.
4:09 pm
do i want to share this information about the cancer that runs in my family, and do i want to share it with a research operation or a federation of researchers that are going to make progress on this disease and help the world? maybe so. but do i want google or facebook or any other company to be inferring information about my cancer from my searching and my browsing? absolutely not. so we need to reestablish decision rights, as justice douglas offered in 1967: the idea that every individual should have the freedom to select what about their experience is shared and what remains private. these are the causes of which privacy is the effect. we need to finally establish in law juridical rights about
4:10 pm
what can be shared and what remains private, and then we have a new ballgame, where organizations like mozilla, and the others waiting to come onstream, not just in search and browsing but in every sector, are waiting to get in the game. >> we have gone over our time limit. thank you. i yield back. >> the gentlelady yields back. a few final questions. thanks to all the witnesses for excellent, interesting and enlightening testimony. you know, i do wonder, mr. erwin: in your testimony, you refer to dark patterns as one of several privacy abuses that demand regulation and that technology alone cannot adequately address. can you explain in further
4:11 pm
detail for the committee and those watching this hearing what dark patterns are, what are some of the most troubling examples of dark patterns, and what is the best legal approach to reining them in? >> yes. it is an umbrella term used to refer to user experiences designed to deceive a user into opting into data collection without being explicit about what is happening under the hood. a dark pattern might be just text intended to deceive the user, or it might be that you have to click through five things to opt out of data collection. all of those are sometimes referred to as dark patterns. you see them across the web. that is the area where the browser cannot really do much, where the user is actively engaging with the first party. that is where we do need to see some regulatory engagement. obviously, we do not want a
4:12 pm
regulator designing a user experience in a browser. we definitely don't want a regulator doing that, but a regulator actually saying here is what looks like a deceptive practice versus a sound practice, that is where we need a regulatory push. >> standards instead of design. >> yeah. >> we have talked a lot about the rights of individuals to privacy, and about disinformation, digital addiction, and harmful impacts on children on certain internet platforms. i think, you know, there should be a private right of action, but the core issue really is, as mr. halpern said, if you don't have the data, you cannot use it to manipulate. it is my view that the victims
4:13 pm
of manipulation, whether it is for political, societal, cultural or commercial purposes, are not just the individuals being manipulated but the public at large. if you are manipulating for a commercial purpose and you are a big tech company and you've got all this data, you know, you have an unfair advantage over smaller companies that have not acquired all that data. if you are manipulating for a cultural or societal purpose, to move society in one direction or the other, the public is also a victim of that. individuals have been removed from their agency and are less free because a private corporation is manipulating them through their own data. so let me ask you, ms. fitzgerald, isn't the crux of this really to restrain the collection and retention of data?
4:14 pm
>> yes. absolutely. it is not enough for users to just know what companies are collecting about us. the collection has to be restrained. the obligation has to be on the collectors to limit what they are collecting. >> i also want to mention briefly, and maybe any of the witnesses can speak to this, or, ms. fitzgerald, you have studied this: i am a fan of the ftc, but i have been told by many who have served there that the sheer number of individuals who are really technologically savvy is fairly minimal compared to the actual army of digital engineers and software-savvy people in these large companies, and that, frankly, they are no match for the gigantic companies with the computer and digital
4:15 pm
expertise that those companies have, which is why we are looking at how we better arm a regulatory agency to actually be successful in facing off with these gigantic corporations. is that assessment of the capacity of the ftc off-base, ms. fitzgerald, in your view? >> i am encouraged by recent action by the ftc on privacy, but they have limited resources and an incredibly broad mandate that covers everything from antitrust to horseracing safety, tractor parts and laundry bags. the task of data protection is best done by a specialized, independent regulator when you think about the outsized presence of technology in our lives and our economy. 10 years from now, no one will question why we have a data protection agency, just as now no one questions why we have an faa or epa.
4:16 pm
>> let me ask a question of you, mr. erwin. in your written testimony, you say it is important that competition not be a pretext to prevent better privacy for everyone. can you explain that tension and how congress might strive to strike the right balance? >> this is something we see fairly often. the basic challenge we have is that the internet was designed in a permissive way, allowing a large diversity of parties to collect data, sometimes the large but also the small. closing down these privacy gaps means denying data to both the big and small parties. sometimes we hear the argument that we should not close the privacy gap because that has competition implications, the idea being we should leave the internet more permissive to protect some
4:17 pm
business models. what we want to see is an overall more protective platform with an even playing field that all companies can compete on. that is the approach we are in favor of. we will push back aggressively on any suggestion that we should hold back privacy protections in order to protect a certain business model. >> i will just close with this. i live in california. the california law has not really stopped the collection i think its authors intended to stop. i will say, however, that despite some regulation, the formation of businesses in the tech sector is at an all-time high. business is good. job creation in silicon valley is carrying the entire state in terms of job creation,
4:18 pm
so it has not had these horrible impacts on the tech sector. i am mindful that if we constrain the collection and retention of data by internet companies, especially in the internet space, it will require a change in their business models, and i think that is necessary, because right now the propensity and capacity to manipulate every person in america is unacceptably high. so i am wondering, professor: you talked, in your new york times essay recently, about how democracies are confronting the tragedy of the uncommons and how we need to get back to protecting our society, for lack of a better term.
4:19 pm
can you expand on what is the most important thing to accomplish that goal, in your judgment? >> well, you know, the digital century opened with great promise. it was supposed to be the democratization of knowledge. i am not willing to let that dream die. the problem is we have gone down this accidental path. in my written testimony, i give a lot of background as to how that happened: a certain market ideology, certain national security concerns, the accidents of financial emergencies in silicon valley, and how surveillance capitalism was invented in order to get tiny little google, in the year 2000, over the hurdle when its investors threatened to pull out in the dotcom bust.
4:20 pm
that is why i call what we are in now an accidental dystopia. we have an opportunity for the digital century to be a century where data is being used to solve our core, most important problems, and data collection is being used in ways that are aligned with what we consider in the digital century to be necessary, fundamental rights. this is work that we have not yet undertaken. we have got to figure it out. it would be like having lived through the 20th century without ever having tackled workers' rights or consumers' rights or built the institutions to oversee all that. we created all that in the 20th century. we are in a new century, with new material conditions and a new kind of capitalism. we have to do that kind of fundamental creative work all
4:21 pm
over again, and that, finally, brings us to a point where, you know, right now, there is a very in-depth census of how artificial intelligence is developing around the world, the ecosystem of artificial intelligence, and what it shows very plainly is that the big five tech companies own almost all the artificial intelligence scientists, the science, the data, the machinery, the computers, everything pertaining to artificial intelligence, concentrated in a small handful of companies, and all of that knowledge and capability and materiel is going to solve the commercial problems of these companies and their customers. surveillance capitalism. it is not being used for society, so not only is agency
4:22 pm
being subtracted from individual life, but the benefits of the digital are being sequestered from the life of our public, our societies, our democracies. we have the opportunity to get that back and get us back on track to a digital century that fulfills its promise of the democratization of knowledge and really solving the problems that face us. that is what it is all about. >> thank you, professor, and thanks to all of our witnesses. as i mentioned, the record of this committee will be held open for at least five days. the committee may have additional questions for you, and if so, we will send them to you and humbly request your responses. i think that we have advanced our knowledge of this situation substantially today, and it is due to the very helpful and thoughtful
4:23 pm
testimony. at this point, our hearing will be adjourned. tomorrow, our oversight hearing of the capitol police. with that, this hearing is adjourned with thanks. >> thank you. >> how can the u.s. protect its allies and interests against russian aggression in eastern europe? the house oversight and reform national security subcommittee heard testimony earlier today. you can watch it tonight at 9:00 p.m. eastern on c-span, c-span.org, or c-span now. >> c-spanshop.org is c-span's online store. browse through our products, apparel, books, home decor and accessories. there is something for every fan
4:24 pm
and every purchase supports our nonprofit operation. shop now or any time at c-spanshop.org. over the past year, the u.s. capitol police inspector general's office put out reports on the attack on the u.s. capitol. inspector general michael bolton will testify before the committee thursday. watch live on c-span, c-span.org, or watch full coverage on c-span now. c-span is your unfiltered view of government. we are funded by these television companies and more, including cox. >> cox is committed to providing eligible families access to affordable internet through the connect2compete program. bridging the digital divide one connected and engaged student at a time. bringing us closer.
4:25 pm
>> cox supports c-span as a public service, along with these other providers, giving you a front row seat to democracy. >> earlier today, the white house covid-19 response team gave an update on the administration's effort to combat the pandemic. cdc director walensky says the omicron wave continues to recede. also at the briefing was johns hopkins health security director dr. tom inglesby, who talked about efforts to increase testing capacity. this runs about an hour.
