tv Hearing on Digital Privacy CSPAN April 7, 2022 6:52pm-9:12pm EDT
6:52 pm
next, a look at digital privacy and protecting consumer data. the house administration committee heard testimony from the head of the government publishing office and other witnesses on steps congress can take to ensure digital privacy, such as establishing a federal agency for data protection. this virtual hearing is about two hours and 20 minutes. >> we're here today to confront one of the central challenges of our digital age. as nearly every facet of our society and economy has moved online, ever greater amounts of digital data are collected about each of us, spanning everything from
6:53 pm
who we are, to what we like, to who we communicate with, to what we do, often in minute detail. while this massive data collection has some productive, valuable applications, it comes with profound risks that we're only beginning to truly understand, let alone to regulate effectively. privacy as it's conventionally understood means control over one's own personal information and the freedom to decide what to share, with whom, and how. we're all familiar with the classic privacy harms, whether it's government surveillance, identity theft and other economic crimes, or the social and economic harm that can befall us when some intimate detail of our life becomes public against our will. all of these risks are still very much at play today, with the rise of the internet and data-centric business models of many online firms. however, in some ways, privacy
6:54 pm
has become a catchall term for a larger and even more profound set of potential risks, some of which are truly novel, or at least fundamentally different in the digital age. what does it mean for our basic autonomy as individuals when both the government and private companies can amass large profiles about each of us, and use them to predict and even potentially manipulate our behavior? what does it mean for our economy when some of the most popular online products and services revolve around opaque data collection and personalization? and what does it mean for our democracy when these data practices shape the public discourse and the flow of news and information? and what does it mean that as individuals and internet users, we have little or no meaningful visibility into how any of this really works, let alone genuine control over it? the american public has long been deeply concerned about these questions. for example,
6:55 pm
in 2020 the pew research center found that half of americans had limited specific online activities out of privacy fears. the same year, another poll found that 88% are frustrated that they don't have more control over their personal information online. and i'll note that these fears cut across many divisions in our society: economic, racial, geographic and partisan. for all these reasons, the time for a comprehensive federal policy framework is long overdue. there have been many proposals and lots of talk, but little or no action or clear progress. i'll note that representative anna eshoo and i have introduced the online privacy act in several congresses, including this one. this legislation sets forth the most thorough and aggressive set of privacy rights among the major proposals, with clear restrictions on a wide variety of abusive and troubling data practices and the strongest
6:56 pm
mechanisms to enforce the law and make digital privacy rights more than just an abstraction. the online privacy act also contains a set of privacy obligations on legislative branch agencies that this committee oversees, requiring the government publishing office, the library of congress, the smithsonian and the chief administrative officer of the house to each identify concrete privacy risks in their operations and to take appropriate remedial steps. many of the risks and policy questions around data cut across both the public and private sectors. we can take lessons from both in determining the best ways to address them. and i'm hopeful that today's hearing will not only address -- but also reinvigorate congressional attention on digital privacy toward the ultimate enactment of major reforms. with that, i'm now pleased to recognize our ranking member, congressman rodney davis, for
6:57 pm
any comments that he would like to make. welcome, rodney. >> well, thank you, madam chair, and thanks to our witnesses for joining us for today's hearing. it's a hearing on a rare topic for this committee but no doubt an important one: online data privacy. policy discussions on creating nationwide privacy standards and a framework for enforcement need to take place in the primary committees of jurisdiction, energy and commerce and judiciary. it's clear to me that this discussion should be led by those primary committees of jurisdiction, and we should let them work out the underlying policy details before we consider application to the legislative branch. 304 bills have been referred to this committee by the house, yet we are devoting our time today to this topic. and it's rare we are even meeting on potential legislation this congress. we have had exactly one other hearing on the policy underlying a bill and exactly zero markups. further, this committee has not
6:58 pm
held comprehensive standard oversight hearings with our entities in nearly three years. for us not even to have a discussion about fec or eac oversight hearings in an election year is shocking to me. eight months ago i sent a letter to the chair asking for this committee to fulfill its duty and conduct oversight hearings with each of our legislative branch entities. eight months later, i still have not received a response from the chair, and the committee continues to shirk its duties. i look forward to this committee having more oversight hearings soon. to address the concerns that fall more directly within this committee's jurisdiction regarding the topic at issue today: the call for a workable national framework to increase transparency and overall accountability in the technology industry is all but universal at this point. data privacy is at the forefront, and we all see the need to address it. i believe this issue not only deserves congress's
6:59 pm
attention, it demands it. all anyone has to do is watch the news to see how data privacy affects everyone. last week, a filing by special counsel john durham alleged that in 2016 democrat operatives, perhaps including marc elias, whom members of this congress are familiar with from his attempts to challenge the legitimacy of elections, sought to exploit their access to non-public and/or proprietary internet data to create a narrative tying president trump to russia. in the words of one outlet, to infiltrate trump tower and white house servers to link trump to russia. talk about election subversion. unbelievable. so a responsible and workable nationwide solution for data privacy that can protect americans' privacy and also curb bad political actors from attempting to subvert our political discourse is necessary.
7:00 pm
i know that our chair has a special interest in this topic, and i imagine that her desire to raise the profile of the legislation we heard about today is why we're here. so let's discuss it. i spoke with my good friend, energy and commerce ranking member cathy mcmorris rodgers, since her committee would be a primary committee of jurisdiction, and she offered me a lot of good thoughts, things i assume she would say if this hearing were before her committee, where it belongs first. first, we share more personal information online now than ever before. as the internet has grown, so has the opportunity for our personal information to be used by people or businesses that we never intended. second, privacy does not end at state lines. this is a nationwide commerce issue that requires a congressional response. americans cannot rely on conflicting state laws to keep their information safe. a patchwork solution is not viable for a concern that affects all of us. the only way to provide transparency in how americans' information is collected and shared is with a uniform
7:01 pm
approach. we must also ensure that any congressional response is forward-thinking and comprehensive while still being workable for compliance. solutions that do not account for the burdens placed on small businesses and innovators, such as the gdpr in the eu or the ccpa in california, are doomed to fail. finally, those who misuse personal information must be held accountable. there have been proposals that call for the creation of a new federal agency to enforce their provisions. the federal trade commission already has jurisdiction over these matters as well as a long-standing relationship with the state ags. creating a new agency to do the ftc's job is bureaucratic waste at the highest level. americans' call for a comprehensive solution to big data privacy issues is loud and clear, and they deserve an answer, one that is built with bipartisan buy-in, transparency and accountability. even though this committee isn't where it should be hosting this portion of this
7:02 pm
discussion, i'm actually looking forward to today's panels and i yield back. >> the gentleman yields back. i just would note for the record that the committee's jurisdiction is generally determined by house rules, specifically house rule ten, and this committee has been, along with other committees, assigned a jurisdiction. i want to thank the ranking member for his comments, and i would also like to recognize our first witness, who is someone who has led the u.s. government publishing office with great distinction. he is the 28th person to lead the gpo since it opened its doors in 1861, and has served in this role since 2019: hugh nathaniel halpern. just
7:03 pm
this month, the gpo was selected as one of the best employers in the country, ranking first on forbes' list of midsize agencies in government services, and we really do think that is a reflection on your leadership as well as your fine staff at the gpo. as you know, this is not your first time to be a witness. your written statement will be made part of the record, and we would ask that you summarize your testimony in about five minutes. so, mr. halpern, you're now recognized for five minutes. >> thank you so much, and thank you for acknowledging the achievement of making that forbes list. i'll take some credit if it's offered, but it really belongs to the men and women here at gpo. so thank you so much for having me today.
7:04 pm
both to the chair and the ranking member and all of the members of the committee. i appreciate the opportunity to testify about gpo's approach to handling personally identifiable information, commonly known as pii, on behalf of gpo's more than 1500 craftspeople and professionals. we take our responsibility to protect pii seriously every day. gpo operates as a hybrid: it's a government agency that operates as a business, complete with customers, products, and profit and loss. thus we face many of the same kinds of problems facing private-sector firms when it comes to the handling of pii. gpo is entrusted with pii belonging to our teammates, customers, and, by the nature of our business, the general public. robust protection of pii is
7:05 pm
critical to building trust with our customers and stakeholders. without that trust, we can never achieve our vision of an america informed. gpo has a mature privacy program overseen by our privacy officer in our information technology business unit, and this program establishes a framework for the protection of pii itself, regardless of how it is stored, as well as the protection of related information systems. the program has several fundamental principles. first, access to pii is limited to only those agency teammates and contractors with a specific need. second, each business unit has someone who is responsible for the privacy function and answers to that business unit's leadership. third, any gpo teammate or contractor who suspects a breach of pii security is
7:06 pm
obligated to report the problem as soon as possible. and, fourth, violations will be addressed by appropriate corrective action, including termination for our teammates, debarment for contractors, and criminal prosecution where appropriate. while my written testimony goes into more detail as to how this program works, today i'd like to talk about three separate examples of privacy issues at gpo and how we deal with them. first, gpo and its print procurement contractors produce materials that require the use of pii. for instance, gpo's security and intelligent documents unit manufactures and then personalizes trusted traveler smart cards such as the global entry card used at border crossings for identification and for expedited processing. by the very nature of these
7:07 pm
products, gpo must handle vast amounts of highly sensitive pii within our systems. gpo works closely with its customer to maintain a series of firewalls so that gpo only receives encrypted pii, which is decrypted only when it is needed to produce and distribute those cards. when gpo finishes production of the product, that pii is scrubbed from our systems. you can't leak information you don't have. second, there are vast amounts of pii included in some of our regular publications, such as the congressional record and the federal register. the good news, particularly with the federal register, is that most of that pii is contained in historic volumes and very little is being
7:08 pm
published currently. where it is, we have both automated and manual systems to redact that information from our digital collections when we find it. the bigger challenge is with historic information, particularly as we work to digitize older collections. for instance, at one time it was common for the department of defense to include military officers' social security numbers on promotion lists later printed in the congressional record. as we digitize volumes that contain that information, we scan for it and redact that pii before posting it on govinfo.gov, our iso-certified trusted digital repository. and while we alert our partners when we find pii in tangible collections that we don't possess ourselves, we do not have the resources to visit every collection across the country and manually redact
7:09 pm
that information. the third and final category involves instances where the information is not quite pii, but it's information that people don't want easily accessible on the internet. this comes up most often with our collection of u.s. court opinions on govinfo, and that is the repository's single largest collection of data. it's understandable that some parties to federal court cases may not want that information to be easily accessible, and they often come to gpo asking us to remove an opinion from our collection. however, it's important to remember that gpo is only making that data accessible; it is not our data. our library services division refers individuals to petition the owner of the data, the administrative office of the u.s. courts, to have the opinion sealed and removed from
7:10 pm
our systems if the judge determines that public disclosure is problematic. i hope that this overview of our operations was helpful in your exploration of this critical issue. i look forward to hearing any questions you have, and assisting the committee with its [inaudible] thank you so much. >> thank you so much, director halpern. it's great to hear this testimony. now is the time when members can ask questions, and we will turn first to the ranking member, mr. davis, for five minutes of questions. >> you'd think i'd know how to work the mute button by now. madam chair, i apologize for the delay. hugh, great to see you, man. >> you too. >> the last time you were called to testify before this committee was on march 3rd of 2020, nearly two full years ago. it's nice to finally have you back, but i wish it was to get a comprehensive update on gpo's
7:11 pm
operations, the initiatives you've launched and the successes you've had, as well as the agency's health post-pandemic. unfortunately, we're here to talk about a much narrower topic, the one the gpo clearly excels at, and that's protecting the pii, personally identifiable information, you spoke about in its work to print passports, create global entry cards and all sorts of projects. in your testimony, you share the impressive work gpo has undertaken to follow the recommendations laid out by the national institute of standards and technology in their guide to protecting the confidentiality of pii. do you currently believe that you as the director of gpo have all the necessary authority to continue a high level of excellence in protecting pii? >> thanks so much for the question. the short answer is, at the moment, i think we do. we've got the authority we need to work with our customers,
7:12 pm
to design systems that protect the pii that they give us. and, in our relationships with our own vendors, whether that is a contractor that we hire on behalf of the executive branch for work they're doing for the irs or other government agencies, we are able to construct systems that protect that level of pii. and one of the things that we have to remain really vigilant about is the pii of our own teammates. and frankly, if there is one large body of pii that the agency maintains, that's it. it's our payroll information, it's our time and attendance information, it's our personnel files. and we work both with folks here in the building and with
7:13 pm
our vendors over at the department of agriculture, who handle our payroll and other systems, to make sure that information is protected. and we're doing our level best to protect the nearly 1600 people who work here at gpo from running into any problems, as well as the information of anybody who is working with gpo or working with our customers. >> great. how often is gpo's privacy program reassessed or updated to remain at this high level of effectiveness? >> so, it's constantly undergoing assessment. the directive that governs our privacy program is supposed to be reviewed, i believe, every three years. but we are constantly going through our policies, our procedures and our actual practices to make sure that we're doing our level best. and that's both on the digital
7:14 pm
side, where we have an extremely competent information technology team, but also with our safety and security team. i get an update three, four times a year as our safety and security team walks the building to make sure that things are shipshape. and i can tell you, they have flagged folks who have inadvertently left pii sitting on their desks, not having it stored properly. so it's a good reminder to all of us that we need to be proactive in protecting this information, whether it's digital or tangible, here in the real world. >> i couldn't agree more. one last question. what has gpo learned from the opm breach that happened several years ago? >> i think it's very similar to the lessons that a lot of government agencies are learning. we've got to secure our systems, and we spend a lot of resources,
7:15 pm
we have a dedicated appropriation for that purpose. and it requires a lot of vigilance on our part. and, you know, knock on wood, certainly during my tenure here, we have not had anything that even indicated that folks were close to a breach of those kinds of systems. and it's something we dedicate a lot of resources to every single day. the one thing that the opm breach, i think, taught all of us is you can't rest on your laurels. security needs to be everybody's job, every day. >> well, thanks, hugh. thanks, madam chair, i yield back. >> thank you. i now recognize mr. raskin for his questions. >> thank you, madam chair, and thank you for calling this important hearing. ms. fitzgerald, discrete bits of personal information can
7:16 pm
seem relatively harmless or trivial, but when they're all combined together with other data, they can have a far more sweeping impact and create special dangers. what lessons does this point contain for privacy regulation? what are the biggest risks that congress should be considering in terms of the large-scale aggregation of data? >> i would note that ms. fitzgerald is on our second panel. >> oh, i'm sorry, i thought i saw her there. forgive me. then i will come to mr. halpern. so, what are the greatest privacy risks that gpo faces in its operations? what are the specific harms in this area that you are most focused on right now? >> well, as i was saying, i think the largest single trove that we have maintained over a longer period of
7:17 pm
time is our employee data. and that's something that we are focused very, very closely on protecting. in most of the other places where we have pii and need to focus on it, we have developed systems to make sure that we are holding on to that pii for the minimum amount of time possible. so, for instance, in the example i used with the global entry card, we literally have a contractor sitting on the other side of the wall from our smart card manufacturing facility, who will send the encrypted data that we need to produce those cards into the smart card room. we will utilize that information to produce those cards, and then scrub it from our systems and send it back to the customer once we're done with it, so that we're not holding on to
7:18 pm
any more data than we need. our biggest problem in terms of other folks' data is really historical data. and as we fulfill, i think, the greater public good of working backwards and trying to digitize old records, we are going to find more pii because, frankly, 20, 30 years ago, folks were a lot less conscious about what they were putting in a congressional hearing or the congressional record or the federal register. because those were all tangible documents, it was a lot harder for somebody to get that information and then do something with it, because the internet just wasn't developed as it is now. we've got a program in place to scan for pii in those older documents, and as we digitize them, we are working to redact
7:19 pm
that from our digital collections. >> thank you for that. i know that gpo routinely publishes documents and information that originate elsewhere in the federal government, so you are not the original custodian and source of all the information you publish. how does this affect your privacy policies and practices? >> so, we are often the middle person in this transaction. and a good example is the one i gave in my testimony about u.s. court opinions. as you can well imagine, there are a lot of opinions that folks may not be happy are out there in the world or easily searchable. and while, as a general matter, that serves a greater public good, there is that problem of maybe having an opinion where there is information in
7:20 pm
there that somebody is uncomfortable with having out in the public. so what we've done is we have a number of different resources. if you go to gpo.gov/privacy, it talks about our whole privacy program, including who you need to talk to over at the administrative office of the u.s. courts to deal with a court document that may have issues. similarly, just a few days ago we revamped our askgpo system, which is now powered by a modern contact management system. there we have also got it set up so that if folks find an opinion or something like that that raises an issue, they can ask about it, we can route that to the proper person and make sure they get the information to talk to the administrative office of the u.s. courts to get that particular
7:21 pm
record sealed, if that's appropriate. in terms of -- >> [inaudible] i yield back. >> thanks very much, and i'll recognize the gentleman from georgia now for his questions, mr. loudermilk. >> well, thank you, madam chair, and hugh, it's good to see you again. i appreciate all the work that you are doing over at gpo. and you said something a few minutes ago that was music to my ears, and i've been saying this ever since i've been in congress. it's a lesson i learned working in intelligence, on the i.t. side of intelligence, when i was in the air force. one of the primary principles that was ingrained into every policy we had is: you don't have to secure what you don't have. if you don't need it, then discard it. and we had procedures built around that, to analyze whether we needed data or not. you had to justify keeping it, or it was destroyed. the problem we have, i've been saying this ever since i've
7:22 pm
been in congress, and especially on the financial services committee, where we're dealing with the same issues, is there are many in congress and in the federal government who aren't applying that principle. so much legislation is being passed that is getting the federal government to grab more data, or forcing businesses to acquire more data on individuals, and somebody is keeping and storing data that does not need to be kept. so i appreciate hearing you say that, because there are very few government officials, or quite frankly people in congress, that you ever hear say that: if you don't need it, don't keep it. so, has gpo experienced any data breaches or identified attempted attacks in recent years? >> so, yes on the attack side, no on the breach side. the good news is my i.t.
7:23 pm
security folks are doing their job. and to the best of our knowledge, we have not had a breach, certainly during my tenure or even in the years leading up to it. shortly after i became director of gpo, i still remember it, i was actually at an event on a saturday night at my synagogue when i suddenly learned that somebody, we believe affiliated with the iranians, had defaced the front of one of our sites. and the good news is, that's about as far as they got, and we were able to restore that site. we are in the process of moving that over to a more secure infrastructure, and, you know, the good news is that my team
7:24 pm
has really good processes in place to make sure that our systems are secure. as updates come out, they are tested thoroughly before we do those upgrades, so we don't necessarily have the exposures you might have from unknown vulnerabilities. so, we really work hard to try and get that right. >> well, it sounds like you are doing a good job at that, because if you would have answered my question with, well, we haven't had any data breaches and no attempted attacks, i would have said, then you're not doing your job, because i guarantee you are being attacked, and if you don't know it, then there's something wrong. and also, the updates: they are crucial. what caused the equifax data breach was simply a failure of the i.t. department to do a firmware update that needed to be done. so, you know, i'm really comfortable with the answers you've given, and
7:25 pm
that's why we haven't seen any data breaches. and in the 30 years i spent in i.t. and data security, i can tell you, the question isn't if you are going to be hacked, it's when. it's going to happen at some point, and it's a race to stay just ahead of the bad guy. i use the analogy of when i used to hunt in alaska. we'd go bear hunting, and somebody would put on a pair of tennis shoes, and we'd say, what's that for? if you see a grizzly, do you think having your tennis shoes on means you can outrun the grizzly? they'd say, no, i just have to outrun you. so you have to stay ahead of the bad guy and stay one step above someone else, because if they can't get to you, they'll go on to someone else. so, it sounds like i've already heard the answer to this question, but do you have all the tools you need to keep carrying out this level of privacy and security, or are there additional things that we can do in congress to help? >> at the moment, i think we
7:26 pm
are in pretty good shape. we have the flexibility, we have the tools that we need. obviously, it's always a question of resources. but as we start automating processes here at gpo, we are really focused on figuring out how we make our dollars go farther. and that's really been one of the hallmarks of gpo all through its history. so we are putting that emphasis on security and trying to do it with the tools we've got, and, where we don't have them, making the investments we need. you know, to some extent that's one of our advantages: only about 12% of our total revenue comes from appropriations. the rest is what we charge our customers. so the state department expects
7:27 pm
us to keep their passports fully secure, and we spend a lot of money doing that. but the state department reimburses us for those investments, and the same with our other customers. so, we are pretty comfortable with where we are. but we are always happy to talk about things and figure out how we can do better. >> well, this is an ever-changing environment, and it's a job that is never done, because once you think that you're secure and you're not staying on top of it, the bad guys are going to get you. so thank you for all you're doing. madam chair, i yield back. >> the gentleman yields back. mr. aguilar is recognized. >> thank you, madam chair. and thanks, director halpern, for the update and information. a lot to review here, and i appreciate your commitment to safeguarding that pii. but what i was kind of interested in is, and you talked about some of the digital versus the historical documents that you
7:28 pm
are charged with protecting. your testimony kind of speaks to that and says that 97% of federal documents are now digital. so what distinctions does gpo make between digital versus paper documents in terms of privacy risks and how they can be addressed by your team and by the contractors? and how has the transition to the digital age been, and the privacy risks that are associated with that on your operational side? >> thanks so much, sir, for that. the short answer is, for new documents that we produce, we have a pretty airtight system for trying to scan those documents as we are producing them, looking for pii. as i mentioned, with the federal register, they are under the opm policies regarding pii, so there's not a lot that's coming in.
7:29 pm
we do need to check the congressional record, and we work very hard to redact any pii that might appear there. but i think even congress is being more circumspect about that stuff going forward. our biggest problem is stuff that exists in tangible form: printed paper that was printed historically, 20, 30 years ago, or even a lot longer. and we are running into two issues there. one is, as we are trying to digitize that information, so we're taking committee hearings from the 70s or the 80s and trying to digitize those and put them on govinfo, both our contractor and our library team are looking at those documents.
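the hearing does not describe how gpo's automated pii scanning actually works; as a purely illustrative sketch, the kind of pattern-based redaction director halpern alludes to, for instance catching the military promotion-list social security numbers he mentioned, could look something like the following. the regex and the redaction marker here are assumptions for illustration only; a production pipeline would use much stricter validation plus manual review.

```python
import re

# Naive pattern for U.S. Social Security numbers in NNN-NN-NNNN form.
# Real redaction tooling would validate far more carefully and flag
# matches for human review rather than rewriting text blindly.
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def redact_ssns(text: str) -> str:
    """Replace anything that looks like an SSN with a redaction marker."""
    return SSN_PATTERN.sub("[REDACTED]", text)

# Example: a line like those on historic promotion lists.
page = "COL John Doe, 123-45-6789, is nominated for promotion."
print(redact_ssns(page))  # -> COL John Doe, [REDACTED], is nominated for promotion.
```

a pipeline like this would run over each digitized page before posting, with any page containing a match routed to a librarian for confirmation, which matches the "both automated and manual systems" halpern describes.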
7:30 pm
where they see and where they catch pii, we are redacting that from the digital copies that go online. when it comes to those paper documents, we simply don't have the capacity to literally travel the world to find every copy of the congressional record from the late 1800s forward, check those volumes for pii, and redact them. so we are in some respects reliant on our partners, the federal depository libraries, and our other libraries across the world, where a lot of this information goes. but that said, as i mentioned before, because that information is comparatively hard to access, you need to go to a physical place and open a physical volume to find it, it's actually a
7:31 pm
lower threat than what's going on in the digital domain. it's not a zero threat, but where we find pii, we make an effort to try to redact it. we just don't have the resources, in terms of people, money and all of those kinds of things, to do a full scrub wherever one of these paper documents might exist in the world. >> understood. and how do you kind of keep -- what's the digital hygiene? how do you stay ahead of the evolving threats in the digital privacy landscape piece of it? is it through the opm guidance? is it through contractors? how do you stay ahead of the risks that are in front of you? >> so, one of the great things about being a legislative branch agency is that we can take best practices from all
7:32 pm
across the government. we are not necessarily confined by what opm provides. our i.t. security team works with the other csos here in the legislative branch as well as more broadly across the government. as we do work for our other federal customers, we are obviously talking to them, figuring out where the requirements are and how we can meet those. so there is a variety of avenues for us to be exposed to the sort of changing landscape and what folks' requirements are. we're always going to make sure that our systems are designed in such a way as to protect the security of those documents. whether it's producing smart cards or the u.s. passport, one of the reasons that they come to gpo is
7:33 pm
because we can produce those documents securely in government facilities with the highest quality you're going to find in any place in the world, and do it at a reasonable price and in a reasonable timeframe. so we think we've got some advantages there. >> thank you so much, appreciate it, yield back, madam chair. >> thank you, mr. steil is now recognized. >> thank you very much madam chairwoman. mr. halpern -- we're all looking forward to getting back in person before too long. but appreciate you being here and i think maybe in a future committee hearing we'll have an opportunity to talk to some of your legislative branch colleagues on what is just an absolutely critical issue. i can't tell you how often i hear from folks across wisconsin about the importance of data privacy. they're thinking about it, they're thinking about being hacked. there is actually a really timely politico article that came out just a little bit ago, talking about reporters being hacked through apps.
7:34 pm
it is nonstop that this is something that we're talking about, and i think it's important we're talking about it here. the proposed piece of legislation that would mandate gpo action in the data privacy space, the online privacy act that we were talking about. were you consulted in the drafting of this legislation, specifically section five? >> we were not consulted on the front end, although we did have subsequent discussions with the committee. from my reading it's a fairly flexible standard in the proposed legislation, and it's something that we think we can meet with the program that we're running today. and it'll give us an opportunity to take a look and make sure that, if there are any places where we need to tighten things up, we definitely can do that. >> understood. appreciate your commentary on that. that's helpful.
7:35 pm
in our national development of privacy laws, there's really kind of a balance of minimizing compliance costs for small businesses while also providing appropriate levels of protection to individuals. can you comment on how you try to strike that balance? >> so for most of our interactions with small businesses, that's through our print procurement program, and the requirements there are largely driven by our customers. now that said, there are any number of small and medium sized businesses that are able to participate in our print procurement program. to give you an idea, i was just in southern pennsylvania, i think it was a few months ago, and we visited with one of our contractors that does a lot of
7:36 pm
work with gpo, and we use that contractor specifically because they have a great system for keeping track of information, keeping track of documents with sensitive information in them. it gives us a great paper trail, and they do a really fantastic job of handling those kinds of jobs for us and doing it in a way where we're confident that that information is safe and secure. and i think there are some other companies across the country that can do that work for us. there aren't a ton, to be 100 percent honest with you. when we have run into problems where, say, one contractor has experienced labor shortages, which isn't something that's uncommon these days, there aren't a lot of other places to go. but some of the small and
7:37 pm
medium sized businesses that we work with have gotten really, really good at having internal systems that can protect that information. and frankly, sometimes they do a better job of it than some of our larger contractors. >> i appreciate that feedback, mr. halpern. and madam chair, i yield back. >> gentleman yields back. miss scanlon is recognized. >> thank you, madam chair. mr. halpern, what remedial measures does the gpo have in place in the event that one of its publications contains nonpublic personal information about a private individual, especially if it's of a sensitive or otherwise potentially harmful nature? >> absolutely. as i mentioned, we have both automated and manual systems to redact information that is found in our digital collection. so, for instance, let's just
7:38 pm
say for the sake of argument, something slipped through tonight's edition of the congressional record. there was something that had some pii in it and all of our checks failed. the good news is, when that's brought to our attention, our team can go in, change the record in govinfo, our trusted digital repository, and redact that information out of the digital record. we can't really fix the printed copies. but the volume of those copies, for something say like the congressional record, is much smaller than it was even a decade, decade and a half ago. to give you an idea, in the 1990s, the daily circulation of the congressional record was north of 20,000 copies a day. today, we're only producing
7:39 pm
about 1500 copies a day. so it's a much smaller universe in the printed world. and we have tools in place to fix any problems that might occur on the back end through our digital systems. >> i mean, when you talk about the court systems, that kind of caught my attention because that's where i spent my pre-congressional career. and certainly judges will talk about a lot of facts as they're ruling on a case. you know, does an affected individual have the right or ability to get personally identifiable information deleted, redacted or otherwise remediated? >> so the way that works right now, it is not our process, it is the courts' process. so they can petition the administrative office of the u.s. courts, and it's up to the judge to decide whether or not to seal those particular records or parts of a record.
7:40 pm
and depending on that decision, we will either pull the record out of govinfo or put in a redacted version, depending on what the constraints are of the judge's order. as i mentioned in my testimony, this is sort of that gray area. it doesn't necessarily fall into the category of the standard pii, social security numbers, or other key identifiers. but it may describe a fact that somebody is uncomfortable with. in those kinds of cases, we are not going to be the best folks to make a decision as to whether that needs to stay in the public domain or needs to be removed. the judge who was involved in that case, or another judge, really needs to be the person to make that decision.
7:41 pm
and depending on what that decision is, we will act accordingly. >> well, if it was something within your domain, is there currently a right for an individual to be able to have personally identifiable information redacted? if, for example, a social security number or, i don't know, a credit card number or something slipped through, is there a right to have it removed? >> i can get back to you on what the statutory background is on that. i can tell you, per gpo's policies and directives, if we come across that information and have the ability to redact it, primarily from our digital collection, we're going to do so. and if somebody brings that to our attention, it may not be the person whose information that is, it could be a third party. but if we become aware of that and we agree that that's pii,
7:42 pm
we'll make the effort to redact that from the digital side. >> okay, thank you. madam chair, i yield back. >> i now recognize the gentlelady from new mexico. >> thank you so very much, chairwoman lofgren, and thank you for holding this hearing, because i can tell you that this issue comes up a lot as i talk to my constituents. they are concerned, and we are concerned, with regard to the manner in which data is protected. director halpern, i truly appreciate your comments about how protection of our personal data held by the government is linked to people's trust in our government. and i think you'd agree that this link will only grow stronger as more of our lives move into the digital space. i also appreciate your comments about how gpo and other federal agencies are discussing best practices with each other.
7:43 pm
can you elaborate a bit more on what we are doing here at the legislative branch that we might strengthen, based on the information you're hearing and sharing with the other federal agencies, so that we can make sure that we have those protections against cyber attacks, as well as protecting individuals' personal information, as you described it? >> sure, i'll do my best. it happens on a number of different axes, is probably the best way to think about it. as i mentioned before, our computer information security folks have a lot of back and forth, and there's a lot of discussion about what are those best practices. how do we secure our systems? and frankly, particularly with gpo and the congress, there is
7:44 pm
a lot of interchange that occurs on a daily basis. you're sending us information, we're sending it back to you. and we're working with the library to make things available via congress.gov. so there is a lot of discussion about those particular flows, and frankly, what our csos are experiencing day-to-day. where are these attacks coming from? how do you secure against them? and what are the appropriate countermeasures? the other area that we're working on is our obligation as an employer. so, i just got the update this morning, right now we're at 1,547 people working directly for gpo, plus a few hundred
7:45 pm
contractors. we have an obligation to secure that information. payroll information, health information, all sorts of things. the chief administrative officer of the house has a similar obligation. so our officials are talking all the time about how we maintain our systems, both from the cyber standpoint and from the more mundane operational standpoint. and as i mentioned before, my safety and security team literally walks our building. we have 1 million square feet under a roof, and we walk every inch of that building, and one of the things that they look for is, is there pii not being stored properly? that's a little bit harder to do in the congressional space. i spent 30 years working in the house, so it's a little bit different there. but you know, some of those same practices, whether it's the
7:46 pm
chief administrative officer or the secretary of the senate or house sergeant at arms, all of us are talking all the time to figure out what's the best way forward. and there's a lot of information sharing that goes on. >> thank you so much. and i take it that you have no control over the fact that, and i think we'll hear testimony, that every search is saved and mined. so as individuals might be searching within the congressional record for information, tell me, do you have any way of protecting any of those searches, or is that something that's left to the individual with regards to his or her computer use and privacy? >> so, it's a great question, and i will get you a more detailed answer as to the information we retain and how long it's retained. the short answer is, we try to retain the minimum amount of information we possibly can
7:47 pm
about searches in govinfo or any of our other systems. so for instance, in the case of govinfo, we use the smallest, most mundane kinds of cookies we possibly can. we don't track people across the internet. and frankly, unlike congress.gov, we don't have logins. we don't personally save those searches for folks. that said -- >> i see my time has expired. i do thank you and your organization for respecting, as proper, the privacy of the users and the documents and the searches. i yield back, madam chair. >> thank you. just a couple of further questions. first, director halpern, i just want to commend you and your staff for your proactive work
7:48 pm
protecting privacy. it's really admirable. and you know, in your statement, you said, if you don't save the material, it can't be abused. and that's actually the principle of the online privacy act, to really prevent the unnecessary retention of information. because if you don't have it, you can't abuse it. and that's what you're doing in your agency. it's something that i think we want to look at, because if you're using a search engine to search for something in the congressional record, the search engine retains it even though you don't. and so, in government, you're doing the right thing, but people's searches or inquiries can still be abused and manipulated. two quick questions. first, you are doing everything
7:49 pm
consistent with the opm guidance in your directives. you're doing that because you want to do the right thing. but your policies are not required by law at this time, are they? >> no. the short answer is, it's one of the blessings and curses of being in the legislative branch. i'm a big article one guy, so i actually like the flexibility that this gives us to survey the landscape and pick the policies that meet our values. and our values are, we're going to protect our customers, we're going to be transparent with folks, and we're going to do everything we do, whether it's with our folks here in the building or our customers, in a
7:50 pm
respectful way. and part of being respectful is protecting that data. and, yeah, the good news is, while there may not be pure statutory requirements that say you have to do this, one, we believe that our oversight committee expects us to do this, and it's the right thing to expect. and two, our customers expect us to do it. both of those are driving imperatives for us here at gpo. and it helps us get into the position where it's easier for us to do the right thing that we were planning to do anyway. >> just one final question. in your written testimony, you mentioned that gpo distinguishes between high impact and low impact personally identifiable
7:51 pm
information. and i'm interested in how or whether you reevaluate the level of risk for different types of pii, which may change over time. and as you classify that, do you consider the potential impact when information is aggregated by third parties in combination with other personally identifiable information, for example, through the use of machine learning or other data processing techniques that are outside of your agency? >> we are always trying to learn. there are a lot of very clever folks all across the country who are coming up with new ways to use that sort of ubiquitous data that's out there. but we do draw a distinction between certain kinds of pii. so for instance, those unique
7:52 pm
identifiers, your social security number, the card numbers or passport numbers that the department of state or department of homeland security uses, those are really key pieces of information that have the potential to unlock so much else, including biometrics and a whole slew of other things. so we're going to work much harder to protect that information. lower impact data are things like addresses, photos, names, and that stuff is really kind of ubiquitous. for instance, when you do a special order and you're congratulating the coach of your high school football team, you're going to call that person by their name. and that is pii, but we're not going to redact that from the congressional record because, one, that isn't a huge security risk. and two, if we did, it totally
7:53 pm
evaporates the point of that special order. >> exactly. >> i want to thank you, director halpern, for your great testimony today. and as you know from your years here on the hill, we do keep the record open if we have additional questions. so if that occurs, we'll send them right on to you, and i know you'll get back to us. i just want to thank you for being here with us today. it's been very informative. please let your team know how much we think of them, and how proud we are of them. >> thank you very much, ma'am, i really appreciate it. >> thank you, we will now go to our second panel of witnesses, and let me briefly introduce them. first we have shoshana zuboff, who is the charles edward wilson professor emerita at harvard business school. professor zuboff is an internationally recognized
7:54 pm
expert and the author of several major works on digital privacy issues, such as in the age of the smart machine: the future of work and power, and the support economy: why corporations are failing individuals and the next episode of capitalism. her most recent book, the age of surveillance capitalism: the fight for a human future at the new frontier of power, investigates the new surveillance economy, which is driven by consumer data. our next witness is caitriona fitzgerald, who is the deputy director at the electronic privacy information center, or epic. she leads epic's policy work, working to advance strong privacy, open government, and algorithmic fairness and accountability laws. she recently authored grading on a curve: privacy legislation in the 116th congress, which sets out the key elements
7:55 pm
of a modern privacy law, including the creation of a u.s. data protection agency. the next witness is marshall erwin, who is the chief security officer of mozilla, where he focuses on data security, privacy, and surveillance. he previously worked as a counterterrorism and cybersecurity analyst, served as the counterterrorism and intelligence adviser to senator susan collins on the senate homeland security and governmental affairs committee, and was an intelligence specialist at the congressional research service. and finally, we have daniel castro, vice president of the information technology and innovation foundation, or itif, and the director of the center for data innovation, an itif-affiliated institute focused on data policy. i'll remind the witnesses that your entire written statements,
7:56 pm
which by the way are excellent, will be included and made part of our permanent record, and that record will remain open for at least five days for additional material to be submitted. and we ask that you summarize your testimony in about five minutes so that the members of the committee will have time to pose questions to you. so, first, professor zuboff, we'd love to hear from you. >> members of the committee, i'm delighted to, here is my, there we go. should i start again? >> sure, go ahead. >> chair lofgren, ranking member davis, and members of the committee, thank you so much for the opportunity to discuss the challenges of privacy law in a world without privacy. i have spent the last 43 years of my life studying the rise of
7:57 pm
the digital as an economic force that is driving our transformation into an information civilization. over these last two decades, i've observed as the fledgling internet companies morphed into a sweeping surveillance-based economic order, founded on the premise that privacy must fall, and powered by economic operations that i have called surveillance capitalism. surveillance capitalism maintains core elements of traditional capitalism, private property, commodification, market exchange, growth and profit, but these cannot be realized without the technologies and social relations of surveillance. hidden methods of observation secretly extract human experience, until recently considered private, and translate it into behavioral data. these methods operate outside
7:58 pm
of human awareness, engineered to do so, robbing actors of the right to know and, with it, the right to combat. ill-gotten human data are then immediately claimed as corporate property, private property, available for aggregation, computation, prediction, targeting, modification and sales. this era of surveillance capitalism challenges this property claim and redefines it as theft. surveillance capitalism was invented at google during the financial emergency of the dot com bust. it migrated to facebook, became the default model of the tech sector, and is now reordering diverse industries from insurance, retail, banking and finance to agriculture, automobiles, education, health care and much, much more. as one tech executive recently described it to me, all software designers assume that
7:59 pm
all data should be collected, and most of this occurs without the user's knowledge. all roads to economic and social participation now lead through surveillance capitalism's institutional terrain, a condition that has only intensified during these two years of global plague. the abdication of these spaces to surveillance capitalism has become the meta crisis of every republic, because it obstructs solutions to all the other crises, for which we require information integrity and the sanctity of communications. the world's liberal democracies now confront a tragedy of the un-commons, as mission critical information and communication spaces that most people have assumed to be public are now owned, operated and mediated by private commercial
8:00 pm
interests for profit maximization, while almost entirely unconstrained by public law. no democracy can survive these conditions. the deficit i describe here reflects a larger pattern. the united states and the world's liberal democracies have thus far failed to construct a coherent political vision of a digital century that advances democratic values, principles, and government. the chinese, for example, in contrast, have focused on designing and deploying digital technologies in ways that advance their system of authoritarian rule. this failure has left a void where democracy should be, leaving our citizens now to march naked into the third decade of surveillance capitalism without the rights, laws, and institutions necessary
8:01 pm
for a democratic digital future. instead, we've stumbled into an accidental dystopia, a future that we did not and would not choose. in my view, we must not and cannot allow this to be our legacy. survey evidence, some of which has already been referred to here, now shows that americans are moving ahead of their lawmakers in a massive rupture of faith with these companies. we see extraordinary majorities calling for action to curb the unaccountable social power of these firms, a realization that the emperor of surveillance capitalism not only has no clothes, but is dangerous, and a clear sense that human rights, the durability of society, and democracy itself, are on the
8:02 pm
line. what was undiscussible has become discussible. what was settled is now being questioned. to end, we are still in the very early days of our information civilization. this third decade is a crucial opportunity to build the foundations for a democratic digital century. democracy may be under siege, but it is, however paradoxically, the kind of siege that only democracy can end. thank you so much for your attention. >> thank you very much, professor. let me call now on ms. fitzgerald for her testimony. >> thank you, chair lofgren, ranking member davis, members of the committee. thank you for holding this important hearing and the opportunity to testify today. i'm caitriona fitzgerald, deputy director at e.p.i.c.
8:03 pm
e.p.i.c. is an independent nonprofit research organization in washington, dc, established in 1994 to protect privacy, freedom of expression, and democratic values in the information age. for over 25 years e.p.i.c. has been a leading advocate for privacy in both the public and private sectors. the united states faces a data privacy crisis. large and powerful technology companies invade our private lives, spy on our families, and gather the most intimate details about us for profit. these companies have more economic and political power than many countries and states. through a vast, opaque system of databases and algorithms, we are profiled and sorted into winners and losers based on data about our health, finances, location, race, and other personal information. and private companies are not the only problem. government agencies have also dramatically increased their collection and use of personal
8:04 pm
data, often purchased or obtained from those same private companies, while failing to address the significant risks to privacy and cybersecurity. the impact of this uncontrolled data collection and use by both private companies and the government is especially harmful for marginalized communities, fostering systemic inequities. these industries and systems have gone unregulated for more than two decades, and this is where it's left us. the system is broken. technology companies have too much power and individuals have too little. to restore the balance, we need comprehensive baseline privacy protections for every person in the united states, changes to the business models that have led to today's commercial surveillance systems, and limits on government access to personal data. most crucially, we need strong enforcement of privacy protections. in my written statement i go into detail about the crises we
8:05 pm
face and the elements of a strong privacy law, but i want to focus here on the importance of enforcement. without enforcement, many businesses will simply ignore privacy laws and accept the small risk of an enforcement action as a cost of doing business, as we've seen with privacy laws enacted in europe and in several states. strong enforcement must include a private right of action. this is not new. congress included private rights of action in the cable communications policy act, the video privacy protection act, the fair credit reporting act, and the driver's privacy protection act. the damages are not large in individual cases, but they really provide an incentive for companies to comply with privacy laws. strong enforcement also requires congress to establish an independent data protection agency, or a dpa. the united states is one of the few democracies in the world that does not have a data protection agency. u.s. companies are leaders in technology and the u.s. government should be a leader in technology policy. when you think about the pervasiveness of
8:06 pm
technology in our lives and economy, it seems obvious we should have a federal agency dedicated to overseeing and regulating it. a data protection agency could protect privacy by setting rules that help level the playing field for smaller technology companies, who currently struggle to compete with tech giants. a dpa could be a central authority within the federal government on privacy issues. we saw a glaring example of agency failure to properly consider privacy risks of data collection just this month. the irs sparked outrage after reports surfaced that it had contracted with a third party vendor to require taxpayers to submit to facial recognition in order to access tax records online. thankfully, pressure from advocates and members of congress caused the irs to backtrack, but the contract should never have been signed in the first place. a dpa could ensure privacy is carefully considered when
8:07 pm
agencies are weighing contracts like this. a recent data for progress poll showed 78% of americans across the political spectrum support establishing a data protection agency. the good news is that congress, and indeed this committee, has a strong bill before it that would restore online privacy for americans. filed by chair lofgren and representative eshoo, it would establish strong enforcement mechanisms via a private right of action and the creation of a u.s. data privacy agency. e.p.i.c. recommends swift action. it's time for congress to act. thank you for the opportunity to testify today. >> thank you very much, ms. fitzgerald. we'll turn now to you, mr. erwin, for your testimony. >> thank you, chairperson lofgren, ranking member davis
8:08 pm
and members of the committee. thank you for the opportunity to testify today. i'm the chief security officer at mozilla, a nonprofit organization and open source community. our products include the firefox browser, which is used by hundreds of millions of people around the world. mozilla is constantly investing in security and privacy and advancing movements to build a healthier internet. in our view, privacy online is a mess today. users are stuck in a cycle where their data is collected without their knowledge or understanding, and that data is used to manipulate them in ways that could be actively harmful. today i'll talk about what mozilla is doing to address this problem and what congress needs to do. we've been driving major initiatives such as the standardization of tls 1.3 and let's encrypt, a free nonprofit certificate authority.
8:09 pm
this has resulted in increased encryption of web traffic, to 85 to 90%. this is protecting your browsing activity and information from attackers in the middle of a network that would otherwise collect that information. the second big area of focus for us has been on eliminating what we call cross site tracking. these are parties following you around from website to website that you visit, collecting information about your browsing activity and building a profile about you. in 2019 we turned on what we call enhanced tracking protection in the browser. we turned that on by default because we believe the onus shouldn't be on consumers to protect themselves from opaque risks they can't see or understand. so that's just a little bit of what we've been doing to try to address this problem. i want to talk a little bit about what we think congress needs to do. first, i'll focus on baseline federal privacy legislation, then i'll talk a little bit about transparency and online harms. first, in our view, technical privacy protections from
8:10 pm
companies and baseline regulations are complementary and necessary; neither alone is sufficient. the internet was not designed with privacy and security in mind, which is why technical solutions like the ones that we offer are necessary to create a less permissive environment overall. at the same time, we can't solve every privacy problem with a technical fix. there is a central role for regulation here as well. to give you just one example, we know that dark patterns are pervasive across the sites and applications that people use. unfortunately, there is little that a browser can do when a person visits a website directly and is deceived into handing over their information without proper consent or understanding. this is where law, plus a robust enforcement regime, has to step in. the online privacy act would prohibit dark patterns. the u.s. must enact baseline privacy protections to make sure that public and private actors
8:11 pm
treat consumers fairly. we need clear rules of the road for entities using data, strong privacy rights for people interacting with those entities, and effective enforcement, agencies with the authority to take enforcement action. second, it's important to complement federal privacy legislation with solutions that provide direct transparency and accountability into online harms. many of the harms that we see today are a direct result of the pervasive data collection happening. we know from disclosures that things like recommendation systems and targeting systems can have really pernicious effects if abused. these systems are powered by people's data. it is easier to discriminate against people, manipulate people, or deceive people if you know more about them. this is why privacy and some of these online harms are inherently [ inaudible ]. the problem is this harm is mostly hidden from the public and regulators. that's why we have called for things like the establishment of a safe harbor to protect
8:12 pm
researchers doing investigations into the activities of major online platforms. this would protect research in the public interest and help us better understand the harms happening on these platforms. there's a real public benefit to be had from such a safe harbor. similarly, we've advocated for more robust regimes for governing online ads. we've been leading the push for ad disclosure in the european union and are encouraged by recent proposals in congress that would require disclosure of ads for public benefit and understanding. these approaches would provide transparency into the opaque world of online advertising and could be done without creating privacy risks. in conclusion, at mozilla we seek to advance privacy technology and to ensure that privacy considerations are at the forefront of policymakers' minds in considering how to protect consumers and grow the economy. we appreciate the committee's focus on these issues and look forward to the discussion. >> thank you, mr. erwin.
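[editor's note: the cross-site tracking protection mr. erwin describes, blocking a page's network requests to known third-party trackers, can be sketched roughly as follows. the domains and blocklist here are purely illustrative assumptions, not mozilla's actual implementation, and the domain parsing is deliberately naive.]

```python
# Hypothetical sketch of browser-style "enhanced tracking protection":
# block network requests whose registrable domain differs from the
# page's (third-party) and appears on a tracker blocklist.
# All domains below are illustrative, not a real tracker list.
from urllib.parse import urlparse

TRACKER_BLOCKLIST = {"tracker.example", "ads.example"}

def registrable_domain(host):
    # Naive: last two labels of the hostname. A real browser would
    # consult the public-suffix list instead.
    return ".".join(host.split(".")[-2:])

def should_block(page_url, request_url, blocklist=TRACKER_BLOCKLIST):
    page = registrable_domain(urlparse(page_url).hostname)
    req = registrable_domain(urlparse(request_url).hostname)
    third_party = req != page
    return third_party and req in blocklist

# First-party request: allowed.
assert not should_block("https://news.example/story",
                        "https://news.example/img.png")
# Third-party request to a listed tracker: blocked.
assert should_block("https://news.example/story",
                    "https://pixel.tracker.example/p.gif")
```

blocking happens at the network-request layer, which is why, as the testimony notes later, this approach works for hidden third parties but not for a first-party site the user chose to visit.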
8:13 pm
and now we turn to our last but certainly not our least witness, mr. castro, for your testimony. >> thank you, chairperson lofgren, ranking member davis, and members of the committee. i appreciate the invitation to speak with you today. u.s. data privacy is at a crossroads. many consumers are justifiably frustrated by the frequency with which they learn about new data breaches and the seeming lack of accountability for those who misuse their personal information. at the same time, many firms are overwhelmed by a tsunami of new data protection obligations and growing restrictions on how they can use personal information for legitimate business purposes. and both are often confused by the multitude of ever-changing laws and regulations. three states, california, virginia, and colorado, have passed comprehensive data privacy legislation, and many other states are considering it. over the past three years, 34 state legislatures have introduced a total of 72 bills, with more coming. these new state privacy laws can
8:14 pm
impose significant costs on businesses, both direct compliance costs and decreases in productivity, and undermine their ability to responsibly use data to innovate and deliver value to consumers. moreover, these laws create high costs not just for in-state businesses but for out-of-state businesses that can find themselves subject to duplicative laws. california's law will likely cost $78 billion annually. in the absence of federal data privacy legislation, the growing patchwork of state privacy laws could impose out-of-state costs between $98 billion and $112 billion annually. over a ten-year period the costs would exceed $1 trillion. the burden on small businesses would be substantial, with u.s. small businesses bearing $20 to $23 billion annually. the united states needs a new federal data privacy law.
8:15 pm
it should not back away from the light-touch approach it has historically taken to regulating the digital economy. instead, it should focus on the following goals. first, data privacy legislation should establish basic consumer data rights. for example, congress should give individuals the right to know how organizations collect and use their personal data or when their information has been part of a data breach. congress should also give consumers the right to access, delete, or rectify sensitive data in certain contexts. for example, consumers should have a right to obtain a copy of their health or financial data and move it to a competing service. second, lawmakers should establish uniform privacy rules for the entire nation by preempting state and local privacy laws. consumers should have the same protections regardless of where they live, and companies should not be faced with 50 different sets of laws and regulations. a patchwork of state laws with varying definitions and standards creates a complex regulatory minefield for businesses to navigate, especially if potential
8:16 pm
violations risk costly litigation. third, there should be robust enforcement of the federal privacy law. congress should rely on federal and state regulators to hold violators accountable. to avoid unnecessary litigation, businesses should have a reasonable period of time to address a violation without penalty in cases with no demonstrable consumer harm, such as a 60-day notice-to-cure period. fourth, congress should set a goal of repealing and replacing potentially duplicative and contradictory laws. the u.s. code is littered with privacy statutes, from major ones to narrow ones. each one has its own set of definitions and rules to comply with. finally, federal data privacy legislation should minimize the impact on innovation. to that end, congress should not include data minimization or
8:17 pm
privacy-by-design requirements, because these provisions can reduce access to data, limit data sharing, and constrain its use, thereby limiting beneficial innovation. congress has the benefit of hindsight to avoid some of the problems found in other data privacy laws, particularly europe's gdpr. the gdpr has imposed a massive compliance cost on businesses not only in the eu but around the world; for example, fortune 500 companies have spent nearly $8 billion to comply with the gdpr. the eu's own survey shows the law has had virtually no impact on consumer trust in the internet. some of the most expensive requirements, like appointing a data protection officer, serve no valuable business function. policymakers should also be aware of the unintended consequences of poorly crafted legislation, such as not considering the implications of the law on emerging technologies like artificial intelligence and blockchain. finally, congress should note that the primary purpose of the
8:18 pm
gdpr was to harmonize laws across eu member states. ironically, the states are creating the exact type of fragmentation in the u.s. that the eu created the gdpr to solve. therefore it's essential for congress to act swiftly to pass comprehensive privacy legislation that establishes basic consumer data rights. thank you for the opportunity to be here today. i look forward to any questions. >> thank you very much, mr. castro, and to all of our witnesses. we'll go now to members of the committee who have questions. i'll turn first to our ranking member, mr. davis. >> thank you, madam chair, thanks again to our witnesses. i enjoyed your testimony. mr. castro, let me start with you. there have been proposals introduced in congress that, if implemented, would create a private right of action for individuals and a collective right of action for certain nonprofits to sue on behalf of individuals. is this the best approach to ensure protection and
8:19 pm
accountability for a federal data privacy framework? >> no, i don't think it is. right now we have a very effective regulator, and we have state attorneys general that can take up that slack. what we've seen in states that have implemented these private rights of action, like illinois with its biometric information privacy act, is that once a court ruled you could bring lawsuits for cases where there is no actual harm to consumers, we saw a flood of lawsuits, and these lawsuits are being driven by class action attorneys who are just going after the attorneys' fees. this has led to very significant fines without payoff for consumers. in the past year we've seen over $100 million in settlements from companies like facebook, adp, walmart, six flags, wendy's, tiktok, and the list goes on. and these aren't instances, again, where we've seen any consumer harm. they were pure technical violations. that's not an effective use of
8:20 pm
dollars that could be going back into investing in technology and processes that would actually protect consumers' privacy, make it more secure, and protect them from harm. so a private right of action, to me, is just taking money out that could be used on actual protections for consumers. >> thank you, mr. castro. i'm sure you've read the media reports about special counsel john durham's allegations that democrat political attorneys, perhaps marc elias specifically, infiltrated servers at the trump white house. would federal privacy legislation have protected president trump from data hacking attempts? >> it depends. certainly privacy legislation would restrict what companies might keep on hand and what they might share. but at the end of the day, what you really want to see if you want better security is better transparency, so consumers know what they're getting; in this case the trump organization,
8:21 pm
trump campaign, would know what kind of security and privacy controls they're getting when they purchase a service, and they'll know how their data can be shared. getting more transparency is really what we should be looking for in the future. >> i agree, transparency matters, but so do penalties for bad actors if the allegations are borne out. do you know if the ftc would have the authority to impose penalties if the allegations are borne out? >> the ftc can impose penalties on companies for violations. particularly, the way this works right now, most companies have statements in their privacy policies or terms of service saying we respect your privacy and will secure your data. when they do not do that, that's misrepresentation, and the ftc can and has gone after companies for that. we can certainly consider giving the ftc more authority, and if we increase its budget, say exactly what to do with that additional
8:22 pm
money. i think that's the best way to go forward. what we don't want to do, though, is give the ftc too much latitude to do its own rulemaking that goes outside of congressional intent, because it becomes very risky that the ftc may decide they want to start deciding how companies design websites or mobile apps, all in the name of maybe ensuring consumers aren't manipulated, when on the other hand the ftc isn't really made up of engineers and user experience designers. >> very good points, but an agency to address these concerns does already exist. the online privacy act of 2021, a bill referred secondarily to this committee, calls for the creation of a new agency called the digital privacy agency, which is not only an example of congress shifting authority to the executive branch but also seems like an extreme waste of resources given the
8:23 pm
existing structure already established in the federal trade commission and with the state attorneys general. mr. castro, do you believe a new federal agency is necessary to accomplish the goal of data privacy? >> i think the ftc has the authority it needs and the reputation and experience to do this job. i don't think we need a new agency to solve that problem. as i also mentioned, when we think about protecting consumer privacy, it overlaps with fraud and security, and that is something the ftc should be focused on. i think if we split up that mission, we weaken protections for consumers. >> thank you for joining us, and to all the witnesses; i yield back the rest of my time, madam chair. >> thank you. mr. raskin, you are now recognized for five minutes. >> thank you, madam chair. ms. fitzgerald, i was eager to ask you a question about how the aggregation and processing of data in large quantities can pose risks that might not be
8:24 pm
apparent just talking about a single piece of discrete information, and what can be done about that. >> thank you, representative raskin. i think the lesson we can take from all of these data points being combined about us is, one, it shows how important it is to have a comprehensive privacy law that covers personal data generally rather than the sectoral laws, hipaa, fcra, et cetera; the acronyms go on. these combined personal data also show why individualized consent doesn't work; it's just not feasible for the vast majority of internet users to grasp. they may have consented to the website using their data, but are they consenting to it being passed on to thousands of brokers they've never heard
8:25 pm
of? we heard of this when a news outlet got location data from the dating app grindr. i know you and members of the committee signed a letter supporting rulemaking on data protection, and that could be integrated into privacy legislation. >> thank you so much. ms. zuboff, i've read your book about surveillance capitalism, and i know it's hard to synthesize an entire book, but who is big brother today? is big brother the state as we traditionally conceived it, or is big brother the corporate sector, or some combination thereof? >> well, thank you so much, senator raskin. i'm thrilled and flattered that you read my book, so thank you for that as well. as you know, because you read
8:26 pm
the book, i develop a concept that i call "big other." big brother was the notion of a totalitarian state which had a very specific ideology and wanted people to believe certain things and talk in certain ways and act in certain ways. what we have with surveillance capitalism is not a totalitarian state. and surveillance capitalism really doesn't care what you think. it doesn't care what you believe. it doesn't care how you act. what it does care about is that you think and believe and act in ways from which it can capture the data, to do the aggregation that you are asking about, because with that aggregation comes the possibility of computation, applying artificial
8:27 pm
intelligence, coming up with predictive algorithms, targeting methodologies. with those targeting methodologies, various functions are achieved, including increasing engagement, which is a euphemism for increasing the footprint for greater data extraction. and also, as we increase engagement, we use our knowledge from that deep computation and ai for things like subliminal cues, psychological micro-targeting, and engineered social comparison dynamics. so we really know a lot about you, and that way we can use messaging, not only the content of messaging but the forms of messaging, to actually begin to modify your behavior, and do that in a way that is consistent with
8:28 pm
the commercial objectives of the company and its business customers. >> so -- >> so at the end of the day, this big other is, sorry, i'll just wrap it up with the notion that this big other is ubiquitous, it's pervasive, it's not totalitarian, but it is full-on control and full-on opacity, full-on power. >> well, we haven't yet enacted a comprehensive federal privacy statute to address the rise of the big other, but a lot of states and some other countries have attempted to do so. i wonder if you can just tell us what you think have been the best lessons and the best efforts at a comprehensive privacy framework to protect us individually, but also to protect the people's ownership of their own interests and characteristics and their own behavior.
8:29 pm
>> well, the gold standard right now, and we couldn't have said this a few weeks ago, would be the digital services act and the digital markets act in the eu. these are, you know, not the total solution; we have very specific things to accomplish ahead of us. but as far as setting a new standard, defining a new frontier for the kinds of rights, i want to use the word "users," i try to do it with air quotes, because users is all of us, you know, user is kind of a synonym for humanity at this point. these pieces of legislation, they're ambitious, comprehensive, they have the buy-in of the european parliament now and the member states, with a few details left to be worked out in the spring, but these set a new standard for the
8:30 pm
rights of users and the enforcement powers of the governments to really, for the first time, put democratic governance back in control of our information and communication spaces, to work with the private sector rather than have the private sector be what it has been for the past two decades, a completely autonomous force, essentially unimpeded by relevant law. and all of the social harms we've been describing this afternoon are the result of that situation. >> thank you so much. the gentleman's time has expired, and we turn to the gentleman from georgia. >> thank you, madam chair. this very interesting topic we've been dealing with, i've been talking about for quite some
8:31 pm
time. mr. castro, one of the issues i've seen from my perspective is the patchwork of privacy laws we have across the country and the states. you mentioned in your testimony the 34 state legislatures that introduced bills in this space from 2018 to 2021, and i want to get your thoughts on the importance of having a uniform federal standard for data privacy that, you know, would go across the board. >> really appreciate that question, thank you so much. when you think about, especially, the smaller companies, people often think about, you know, facebook and google, but these laws impact everyone; they impact the local florist, barber shop, grocery stores, and complying with those laws can be very
8:32 pm
expensive and difficult, because even though at the end of the day most of these laws are basically saying the same thing, you know, they all say it a little differently, and keeping up with the laws so you're not in violation is hard. there can be some technical violation if you, for example, don't put a certain term in the privacy policy on your website, and you may be in violation of california's law. all of these obligations add costs for businesses, and these businesses, you know, instead of hiring another engineer or, you know, giving a raise to their workers, they have to go out and hire a privacy lawyer, and a lot of privacy lawyers really like privacy laws because it's good business for them. the issue here is we don't need more laws, we need one good law, and i'm glad we're having this
8:33 pm
conversation. i think we can get to that good law, hopefully sooner rather than later. >> so what kind of effect does this have on american competitiveness in the international marketplace? >> it's very significant right now. when we think about it, you know, a lot of u.s. companies are doing a very good job of protecting privacy, better than companies in countries like china, for example. but, you know, because the u.s. does not have a federal law and we've taken a sectoral approach, and other countries have not taken that sectoral approach, a lot of countries will say the u.s. doesn't care about privacy. i don't think that's true; i don't think that's true when we hear about the ftc defending the work they're doing, and i think some of the state laws are being effective. so i don't think it's fair to say the u.s. is not doing
8:34 pm
good work on privacy, but we don't have a national privacy law, so that is used against us, so that a u.s. company does not get a particular contract. we're seeing that used now, in part, because european lawmakers are saying the u.s. isn't doing enough to protect consumer privacy. so i think one of the best ways we can, you know, promote u.s. competitiveness, not just in the tech sector but across the board, is to create this federal law so we can show the u.s. is taking this seriously. >> i think everybody on this committee is committed to, and i think most individuals recognize, the importance of protecting personal data and an individual's privacy, and definitely i do as well, and i know you do. but with that being said, you mentioned in your testimony a concern that i share as well, which is the compliance cost to small business of state privacy laws. is there a state you can point to that successfully executed a data privacy law that balances those concerns? >> i think virginia is probably
8:35 pm
the closest we've gotten so far. you know, they're trying to strike the right balance. the problem, when you see these laws in the state legislatures, is you're not seeing the costs borne by states outside their borders, so as we see each new state go down this path, you're putting up laws that force companies to hire more lawyers, redo privacy policies, and make changes to their systems, all the same compliance costs without discernible improvement in privacy. so if the goal is consumer privacy, let's address this top-down rather than have 50 new rules on how to do it and have so much consumer confusion on what's going on and who is protecting your data. >> okay, i agree. i see my time is quickly running out; madam chair, i yield back. >> mr. aguilar is recognized. >> thank you, madam chair. mr. erwin, i want to talk with you a little
8:36 pm
bit about what mozilla has learned from its own experience building privacy-protective tools. there may be relevance for the public sector and the government agencies we have oversight of in how to handle data and minimize risk in their own operations. what are your thoughts on what you've learned and what is applicable to the public space? >> so i think there are two important ways to approach this question. the first is to think about our core privacy practices and the data we collect from consumers, and how we're going to be really transparent and provide them with control of that. we work really hard at what we refer to as data optimization practices, really only collecting what we need for our business and to create a good product for users. that isn't followed expansively
8:37 pm
across our industry, and that's an underlying problem we need to see fixed. once we have that data, we work to protect it pretty aggressively, and i think basically this idea of data minimization and security controls on the back end should really apply to the public and private sector alike. we also work really hard to build, as i mentioned earlier, these privacy features into the browser to protect users, like web encryption and tracker blocking in the browser. the interesting experience we have there is that this can be a little bit disruptive for the industry, in a way that, like, right now we have a really permissive internet, and a large number of parties depend on that permissiveness to build on data practices and fund their businesses. that's a problem, so some amount of disruption is healthy. we don't always get a positive response when we roll out privacy features, and i think that's a signal we're going in
8:38 pm
the right direction, but it certainly creates headwinds. >> i understand; sometimes being at the forefront, you take on a lot of folks with different viewpoints as well. ms. fitzgerald, what are some of the best ways we can limit the risks and harms of the aggregation of personal information over time from the multiple different sources that we know exist? >> thank you, representative. i think the point that's been made a few times, especially in the first panel, was that if private companies treated our data the way the gpo does, we would be in much better shape. the point was made, if you don't have it, you don't have to protect it. so the obligation should be on the companies to minimize the amount of data they're collecting and, you know, to delete it when they don't need it anymore. that protects users from having information passed on, and in case of a data breach. >> thanks, appreciate that.
8:39 pm
mr. erwin, back to you again. what are some of the privacy risks and harms we have that may not have, you know, a primarily technological solution? i guess what i'm trying to ask is, you know, what are the issues that instead require changes to business practices or laws? we're trying to balance the regulatory side here, and i wonder your thoughts on how to balance those pieces. >> yeah, so we bump into this problem all the time in our work on the browser. among the challenges we can solve: you visit a website, for example, and there's a hidden third party on the website the user doesn't know about that is collecting data on them. that's a third-party relationship, and we can do something in the browser, because there's a network request that calls out to that third party, so we can just block that. that's typically the role of the browser, or the operating system can jump in and say we'll
8:40 pm
prevent third parties from tracking your behavior. what we can't do as much about is when the user has a direct, what we refer to as a first-party, relationship with the website or business, and that comes down to a question of business practices, not technical problems. there's little we can do to adjudicate the relationship between a user who has decided to engage directly with a first-party business when the business isn't upstanding, and the user may not know about it. that's where law and regulation need to step in, where technical fixes don't get us what we need. >> appreciate that, i yield back. >> the gentleman yields back. mr. spouser is recognized. >> thank you. i want to shift to you if i can, ms. fitzgerald, on this topic. in particular, i was reading your testimony, and you referenced that biometric data is particularly sensitive and deserves stricter
8:41 pm
regulation, correct? >> yes, absolutely. >> and in particular, you raised concerns about how healthcare data is used, is that correct? i think a lot of people think hipaa is protecting healthcare data to the extent it's in health apps; that's not correct, would you agree? >> right. hipaa only applies to the relationship between doctors and health insurance companies and certain health exchanges; it doesn't cover health data in apps. >> yeah, i think that's a distinction a lot of americans might find interesting, and it builds on what was previously discussed with mr. erwin, these third-party apps and relationships that exist on certain websites and the impact that could have. so congress recently passed the 21st century cures act to protect patient data and require patient
8:42 pm
data to move from the doctor's office into healthcare apps of the patient's choosing. there are a lot of reasons it's a good thing, but there are things we want to make sure we get right, because you want to be able to make sure patients control their data, but also ensure they don't lose control of their data by sending it to a third party that they might not be aware of, one with an interest in healthcare data, biometric data, or other data. do you think the risk might be greater if individuals are sending information to some of these types of apps under the rules and regulations of the 21st century cures act? >> the reason biometric data is so sensitive is we can't change it. we can't change a fingerprint. >> i totally agree. great topic; let's shift and talk about
8:43 pm
medical data. does that create unique concerns? >> yes, because of the ways it can be used against us; think of insurance companies using it to determine our rates, or even an employer possibly not wanting to hire someone with a health condition if they're paying the health insurance premium. >> i appreciate the commentary. i think it's an area to really keep an eye on. i think there are positive things, you know, we want patients to control their data, but at the same time we want to make sure that patients are protected, in particular where there might be a third-party player involved where the patient may not fully appreciate the risk they're taking in sharing their data. something we should really spend some time and think about. let me shift gears if i can. thank you, ms. fitzgerald. mr. castro, there are a lot of laws on the books. for financial
8:44 pm
institutions, federal law requires companies to explain their information-sharing practices and to safeguard sensitive data. before passing new laws, i think there's a real opportunity for congress to look at how current privacy laws are working and how they interact with each other, so that we're not being duplicative with regulations. could you shine some light: since the passage of the gdpr in the eu, what have we seen from that sweeping legislation? has it stifled innovation or grown innovation? what big takeaways should we take from the eu's work? >> in many areas it has certainly stifled innovation. when you look at the cost of compliance, the impact it's had on the start-up economy, there are many areas where the data points show european businesses are suffering. >> so, only because we've got a little bit of time, if we shift
8:45 pm
gears slightly and say let's jump over to california, california's consumer privacy act, what lessons can we take away from the state of california? >> one of the biggest problems with california is that we've seen hundreds of lawsuits related to enforcement. that's been a problem. one of the biggest positives of it is the idea of a 30-day cure period, which i think could be extended nationally, where the goal is not to penalize for mistakes but to take corrective action. >> i think what's really important is we've got a lot of laws on the books, state laws, the eu's; look at what the financial services committee talks about with the services act. i think it's critical we're looking at all of these pieces of legislation, taking lessons from them, so we help consumers protect their data but also maintain innovation here in america. thank you, chairwoman, and i yield back. >> ms. scanlon is recognized.
8:46 pm
>> thank you, madam chair. i guess i'd like to pull on that thread a little more, since we certainly do hear quite a bit about this on the company side and the agency side: how do they navigate the different laws between states but also countries? as we're well aware, the internet does not have national boundaries in the same way. ms. fitzgerald, could you talk about that? what are the most important privacy rights you've seen that you think congress should work on adopting? >> yeah, the most important thing, i think, when thinking about privacy legislation is you can't just be telling people what companies are collecting about them. we need to put obligations on the companies collecting data to limit that collection, limit what they're doing with it after they collect it, and have them delete it when they're done using it.
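[editor's note: the obligations ms. fitzgerald lists, limit collection, limit use, and delete when done, correspond to what engineers implement as data-retention policies. a minimal sketch follows; the purposes and retention windows are hypothetical, not drawn from any law discussed in the hearing.]

```python
# Hypothetical sketch of a retention policy implementing the
# "delete it when you're done using it" obligation: each record
# carries the purpose it was collected for, and anything older than
# that purpose's retention window is purged. Windows are illustrative.
from datetime import date, timedelta

RETENTION = {  # purpose -> how long the data may be kept
    "order_fulfillment": timedelta(days=90),
    "analytics": timedelta(days=30),
}

def purge_expired(records, today):
    """Keep only records still within their purpose's retention window."""
    return [
        r for r in records
        if today - r["collected"] <= RETENTION[r["purpose"]]
    ]

records = [
    {"purpose": "analytics", "collected": date(2022, 1, 1)},
    {"purpose": "order_fulfillment", "collected": date(2022, 3, 1)},
]
kept = purge_expired(records, today=date(2022, 4, 7))
# The 96-day-old analytics record is purged; the 37-day-old order
# record is still within its 90-day window.
assert [r["purpose"] for r in kept] == ["order_fulfillment"]
```

tying every record to a declared purpose is also what makes the "limit what they're doing with it" obligation auditable: data with no valid purpose has no retention window and must be deleted.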
8:47 pm
>> in terms of that, do you have suggestions about which are the best models for us to look at? the eu has been mentioned. >> sure. so i think, you know, among state models, california has the strongest right now in the united states, the ccpa. but even the strongest model we've seen is still limited: it only allows users to opt out of the sale of their personal data. it doesn't put as many obligations on companies to minimize the data they're collecting. you know, the online privacy act by chair lofgren is a great model that includes data minimization. it also includes provisions for protecting against discriminatory uses of data, which is really important. and you also want to make sure you're requiring algorithmic fairness
8:48 pm
and accountability; we don't want to make the same mistakes with ai that we've made with data collection. this is the next, you know, wave of technology; let's get ahead of it, set the rules now, and encourage companies to innovate around privacy and autonomy. >> okay. i mean, one of the typical, i don't know, tools that people talk about with the privacy laws is the notion of consent, and i gather that you're a little bit skeptical about the usefulness of consent as a focus of privacy protections. can you talk about that a little bit? >> yeah, so there's just no way for an individual to understand, kind of, the web that their data gets passed through when they're consenting. you know, if you're just hitting that consent button to make the banner go away on a website, you don't realize what you're agreeing to, and sure enough, there are no rules.
8:49 pm
>> certainly, we've seen that; to get access, you either decide to click or decide not to. so with respect to discriminatory data uses, how do we address that? what are your top-line recommendations there? >> groups such as color of change, the lawyers' committee for civil rights, and the leadership conference have great proposals on what we need to do to make sure that civil rights are protected online, you know, making sure that public accommodations are protected and that that extends to online spaces. you need to make sure that the algorithms that companies are using are not being used in ways that deprive people of opportunities, like, you know, ads for housing and ads for
8:50 pm
jobs, making sure our personal data is not being used to show those in discriminatory ways. >> okay. thank you, madam chair, i yield back. >> the gentlelady from new mexico is now recognized. >> thank you so much, madam chair. as we heard today, our personal data is extremely valuable. we take a quiz on facebook, we look up directions; companies collect our information, sell our information without consent to a third party, and then use our information for a range of activities, from showing an ad for clothing to something more dangerous, like targeting disinformation to manipulate our behaviors or beliefs. and it's all invisible
8:51 pm
to the true owners of the data. i want to focus on how data privacy, or the lack thereof, impacts democracy and elections. recently, our committee held a round table on the creation and dissemination of disinformation on elections and the pandemic, and how this disinformation targets spanish-speaking communities. we learned how the disinformation is then exported to other states, including new mexico, and then amplified based on this data harvesting we heard about today and read in this testimony. ms. fitzgerald, you noted that actors can weaponize our data to undermine election integrity and democratic institutions. in practice, what does this look like? >> sure, thank you, representative. just as advertising companies use profiles to manipulate us into
8:52 pm
purchases, so too can they manipulate our views by filtering the content we see. the markup, after january 6th, showed the two different facebooks republicans and democrats were seeing. it's just a really stark view of, you know, kind of the different information that both sides are seeing and how that shapes your views. even if someone doesn't click through to an article, just seeing that headline, having it in the back of their head as they're voting, these companies are kind of able to manipulate our political views by determining what we see. >> ms. zuboff, i really appreciated your written remarks about how democracy is simultaneously the only legitimate authority capable of halting surveillance capitalism but also a prime target, so i
8:53 pm
appreciate that and welcome you expanding on that, but first, mr. castro said a private right of action is unnecessary and that we should focus solely on transparency. do you agree, and if not, why? >> you need to unmute, professor. >> sorry. i don't agree with that. i think the private right of action is very important. when we pass a law, whether it's the california law or any other law, that's only the beginning. what has to happen is that these laws are tested and developed, and as a
8:54 pm
society we learn what they can mean for us and how they're going to protect us in society. so what the private right of action does is it creates the opportunity not only for individuals, but for groups of people, for collectives, to really bring issues into the judicial system, to have those issues explored, and to create precedents. and this is, you know, this is what's called the life of the law, how the law evolves and how we can move forward into this century, not just with statutes that are frozen in time, but with laws that are evolving according to what justice brandeis once called, you know, the eternal youth of the law, because we have these kinds of iterative processes. so this --
8:55 pm
>> yes, and i wanted to -- i completely agree. our civil rights laws would have had no effect if we hadn't had the private right of action. and i did want to ask you, though, to elaborate in the few seconds left with regard to what you believe is most important to protect our democracy in an online privacy act. >> first thing is, we notice so much of our discussion has been how we minimize data or how the data is used and things like that. once we start a discussion talking about data, we have already lost primary ground. the key thing is that we are all now targets for massive-scale secret data collection, much of which should not be collected in the first place. the decision rights should lie with the individual. do i want to share this information about the cancer that runs in my family, and do i want to share that with a
8:56 pm
research operation or federation of researchers that are going to make progress on this disease and help the world? maybe so. but do i want google or facebook or any other company to just be inferring information about my cancer from my searching and my browsing? absolutely not. so we need to reestablish decision rights, as justice douglas offered in 1967: the idea that every individual should have the freedom to select what about their experience is shared and what remains private. these decision rights are the cause of which privacy is the effect. we need to establish now, finally, in law, juridical rights that give us the choice of what is shared and what remains private. then we've got a whole new ball game, where all kinds of innovations like mozilla, and all
8:57 pm
the mozillas that are waiting to come on stream, not obviously just in search and browsing but all kinds of businesses in every sector, are all waiting to get in the game. >> thank you, and i know we have gone over our time. thank you, madam chair, and i yield back. >> the gentlelady yields back. i just have a few final questions. first, thanks to all of the witnesses for really very interesting, excellent and enlightening testimony. you know, i do wonder, mr. irwin, in your testimony, you refer to dark patterns as one of several privacy abuses that demand regulation, that technology alone can't adequately address. can you explain in further detail, for the committee and for those watching the hearing, what dark patterns are, what are some of the most troubling examples of dark patterns, and what is the
8:58 pm
best legal approach to reining them in? >> yeah, so dark pattern is a sort of broad umbrella term used to refer to user experiences that can deceive a user into opting in to data collection, or consenting to data collection without being explicit about what is really happening under the hood. a dark pattern might be just text intended to deceive the user, or it might be that you have to click through five things to opt out of the data collection. all of those are typically, sometimes, referred to as dark patterns. we see those everywhere across the web, and as i mentioned, that's the kind of area where the browser can't do much when the user is actively engaging with the first party. so i think we really do need to see regulators actually acting to say here's what the standards should be and
8:59 pm
here's a deceptive practice and here's a sound practice. that's where i think we really do need to see regulatory measures and standards for design. >> yeah. you know, we talked a lot about the rights of individuals to privacy and, you know, some of the disinformation, digital addiction, and harmful impacts on children on certain internet platforms, and whether there should be a private right of action. but i think the core issue really is, as mr. halberd said, if you don't have the data, you can't use it to manipulate. and it's my view that the victims of manipulation, whether it's for political, societal, cultural or commercial purposes, are not just the individuals being manipulated but the public at large.
9:00 pm
if you are manipulating for a commercial purpose, you are a big tech company that's got all this data, and, you know, you're at an unfair advantage over smaller companies that have not acquired all of that data. if you're manipulating for a cultural or societal purpose, to move society in one direction or the other, the public is also a victim of that. individuals have been removed from their agency and have been made less free because a private corporation is manipulating them through their own data. so let me ask you, ms. fitzgerald. isn't the crux of this really to restrain the collection and retention of data? >> yes, absolutely. it's not enough for us as users just to know what companies are collecting about us, it has to
9:01 pm
be restrained. obligations have to be on the data collectors to limit what they're collecting. >> i also want to mention briefly, and maybe any of the witnesses can speak to this, or ms. fitzgerald may have studied this in your nonprofit: i'm a fan of the ftc, but i have been advised by many who have served there that the sheer number of individuals who are technically savvy is fairly minimal compared to the actual army of, you know, digital engineers and software-savvy people in these large companies, and that frankly, they're no match for the gigantic companies with the computer and digital expertise those companies have, which is why we're looking at how to better arm a regulatory agency to actually be successful in
9:02 pm
facing off with these gigantic corporations. is that assessment of the capacity of the ftc off-base, ms. fitzgerald, in your view? >> absolutely, i mean, look, we're encouraged by recent actions of the ftc on privacy, but the ftc has limited resources and an incredibly broad mandate, from antitrust to horse racing safety. you know, the task of data protection is best done by a specialized independent regulator. when you think about the outsized presence of technology in our lives and economy, i think this is something where 20 years down the line, no one will question why we have a data protection agency, just as now we don't question why we have an faa or epa. >> let me just ask a question of you, mr. irwin. in your written testimony you state, quote, it is important that competition concerns not be
9:03 pm
a pretext to prevent better privacy for everyone. can you explain that tension and how congress might strike the right balance on it? >> yeah, this is something we see fairly often. the basic challenge we have here is that the internet was designed in a very permissive way that allows a large number of parties to collect data, not only the larger parties but also the smaller ones, and closing some of these privacy gaps sometimes means denying data to the large parties and small ones alike. sometimes we hear the argument that we shouldn't close a privacy gap because that could have competition implications, like the idea that we should leave the internet more permissive in order to protect some set of business models. you know, what we want to see is an overall more protective platform with an even playing field that all companies, big and small, can get on,
9:04 pm
that's the approach i think we are in favor of, and we will push back aggressively on any suggestion which leads to privacy holes in the browser or operating systems in order to protect some business model competing against big tech. >> right. i'll close with this. you know, i live in california. the california law has not really stopped the kind of collection i think its authors intended to stop. i will say, however, that despite some objections to regulation, the formation of businesses in the tech sector is at an all-time high. business is good, actually; job creation in silicon valley is carrying the entire state. so regulation has not had these horrible impacts on the tech sector. i'm mindful that if we constrain
9:05 pm
the collection and retention of data by companies, especially in the internet space, it will require a change in their business models, and i think that's necessary, because right now, the propensity and capacity to manipulate every person in america is unacceptably high. so i wonder, professor zuboff, you talked in your recent new york times essay about how democracies are confronting the tragedy of the uncommons and how we need to get back to protecting our society. can you expand on what is most important to accomplish that goal, in your judgment?
9:06 pm
>> well, you know, the digital century opened with great promise. it was supposed to be the democratization of knowledge. i'm not ready to let that dream die. the problem is we have gone down this kind of accidental path. in my written testimony, i give a lot of background as to how that happened: a certain market ideology, certain national security concerns, the accident of a financial emergency in silicon valley, and how surveillance capitalism was invented in the early 2000s to get google over the hurdle when investors threatened to pull out during the dot-com bust. a bunch of accidents, and that's why i call where we are accidental. but what we really have is an opportunity for the digital century to be a
9:07 pm
century where data is being used to further the needs of society, to solve our core, most important problems, and where data collection is being used in ways that are aligned with what we consider in the digital century to be necessary fundamental rights. this is work that we have not yet undertaken; we've got to figure it out. it would be like living through the 20th century without having tackled workers' rights and the institutions to oversee all of that. we created all of that in the 20th century. we're in a new century, with new material conditions and a new kind of capitalism, and we've got to do that kind of fundamental work all over again. and primarily, right now, there is
9:08 pm
this in-depth census of how artificial intelligence is developing around the world, the ecosystem of artificial intelligence, and what it shows very plainly is that the five big tech companies own almost all of the artificial intelligence: the scientists, the science, the data, the machinery, the computers. everything pertaining to artificial intelligence is concentrated in a small handful of companies, and all that knowledge and capability and material is going to work to solve the commercial problems of these companies and their business customers. surveillance capitalism. it's not being used for society. so not only is agency being subtracted from individual life, but the benefits of the digital are being sequestered from the life of our publics, our societies,
9:09 pm
our democracies. we have the opportunity to get that back and get us back on track to a digital century that fulfills its promise of knowledge democratization and of really solving the problems that face us. that's what it's all about. >> thank you so much, professor, and thanks to all of our witnesses. as i mentioned, the record of this hearing will be held open at least five days. the committee may have additional questions for you, and if so we will send them to you and humbly request that you answer them. i think that we have advanced our knowledge of this situation substantially today, due to the very helpful and thoughtful testimony provided by these excellent witnesses. at this point, our hearing will stand adjourned.
9:11 pm
CSPAN3 Television Archive. Uploaded by TV Archive.