
Key Capitol Hill Hearings | CSPAN | July 28, 2022 2:39pm-5:32pm EDT

2:39 pm
today members of congress will participate in their annual baseball game in washington, d.c. live coverage from nationals park begins at 7 pm eastern on c-span. you can also watch on c-span now, our free mobile video app, or online at c-span.org.
2:40 pm
next, officials testify on the use of biometrics and its impact on privacy. they are asked about the use of facial recognition technology and privacy risks. the hearing, before the house subcommittee on investigations and oversight, is just under an hour and a half. >> today our focus will be on how technological solutions can secure our privacy while
2:41 pm
allowing us to enjoy the benefits of biometric tools. biometric privacy enhancing technologies can and should be implemented along with biometric technologies. these so-called b-pets could be implemented at the point of capture, improving the precision of collection tools to ensure they are not picking up features that are not necessary for use. they can insert, for example, obfuscations in the data collected, degrading the quality of the information, or introducing statistical noise so the biometric data is less usable for unintended uses. a technique called template protection can ensure that one system's biometric information is encrypted so that it cannot be read by another system; for example, someone's image obtained from the security system at their doctor's or psychiatrist's office cannot be linked to their workplace identity verification system.
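a minimal python sketch of the noise-injection idea just described, assuming the captured biometric has already been reduced to a numeric feature vector; the noise scale and match threshold are illustrative values only, not settings discussed at the hearing:

import numpy as np

def obfuscate(features, noise_scale=0.2, rng=None):
    """degrade a biometric feature vector with calibrated noise so it stays
    usable for its intended matcher but is less precise for unintended reuse."""
    rng = rng or np.random.default_rng()
    return features + rng.normal(scale=noise_scale, size=features.shape)

def same_person(a, b, threshold=0.85):
    """cosine-similarity match with a tolerance that absorbs the added noise."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))) >= threshold

rng = np.random.default_rng(0)
enrolled = rng.standard_normal(128)      # hypothetical face or fingerprint features
probe = obfuscate(enrolled, rng=rng)     # noisy capture from the same person
print(same_person(enrolled, probe))      # True: the intended use still works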
2:42 pm
federal agencies, including those represented at this hearing today, as well as dhs and its science and technology directorate, are already working to develop and improve privacy technology for biometric systems. the american competes act, which i am in conference on with the senate, contains a number of provisions that will future-proof the government's standards for biometric identification systems and invest in privacy enhancing technologies. i look forward to hearing from our panel how we can further improve these protections as biometric technologies become more and more prevalent in our daily lives. and the timing of our discussion today is notable. the supreme court has recently substantially weakened the constitutional right to privacy in their recent decision overturning roe v. wade. states attempting to criminalize access to medical care may try to use biometric data to prove where someone has been and what they did when they
2:43 pm
were there. third parties may also try to access biometric information to collect the bounties now being offered by some states to enforce new laws. this makes protecting americans' biometric data more important than ever. finally, i just want to observe that some of our witnesses' testimony came a little bit late for this hearing. i apologize to the other members of the subcommittee that we didn't have the usual amount of time we would normally like to have had to prepare. the chair will now recognize the ranking member of the subcommittee on investigations and oversight, mr. obernolte, for an opening statement. >> thank you very much, chairman foster. good morning everyone. i'm really excited about our hearing this morning on the benefits and risks of biometric technologies and exploring research opportunities in these technologies. i'm really hoping that this hearing turns into a productive discussion that helps us learn about ways to improve biometric technologies in the future, at the same time protecting
2:44 pm
people's privacy. i was reflecting this morning on the fact that biometric technologies have really changed the way we live our lives. this morning, i used facial recognition to open my phone and used the fingerprint reader on my computer to open my macbook. when i got in my car this morning to go to my district office, the car recognized my face to set the seat settings. as i was arriving, it used facial recognition to make sure i was paying attention to the road. and that's just in the first couple of hours of today. it's definitely changed our lives, and it's amazing to think that this was once the world of science fiction and now we just take it completely for granted. obviously, biometrics bring a lot of benefits to our daily lives. we want to make sure that we are able to continue to allow those benefits while protecting the privacy of the people that rely on biometrics. for that reason, i am
2:45 pm
particularly glad that dr. romine, from the national institute of standards and technology, is with us here today to talk about the work that nist is doing in this space. nist has been working in biometrics research for over 60 years. they have had an incredible role to play in developing standards for biometrics. i'm hoping that in the same way they helped the fbi establish standards for fingerprint technologies in the 1960s, they are going to be able to take a leadership role in establishing standards at the national and international level for biometrics today. and the standards are gonna be critical to enable the exchange of biometric data between agencies and their systems, as well as providing guidance for how those biometric systems are tested, how performance is measured, and how assurances are made that data is shared securely and that privacy is protected. that's important because, as we
2:46 pm
all know, biometrics are really no different than any other advanced technology in that they have beneficial uses, but their misuse can harm individuals and harm our society. in this case, by compromising the privacy of individuals or the security of their information. as policy makers, we need to be acutely aware of not only the benefits these biometrics have to our society, but also of the risks associated with the technology, especially in my opinion when it comes to covert collection and the issue of individual consent to have one's information stored and used. i think as policy makers we have to balance that awareness against the potential benefits biometrics bring to society. you could easily imagine us taking a draconian approach to regulating biometrics that effectively prevents the development and use of
2:47 pm
biometrics, and that would lose all the benefits that we enjoy from biometrics. i'm not just talking about unlocking our phones or setting the seats in our cars. biometric technologies really have extraordinarily helpful applications. to give you a couple of examples, in ukraine, the defense ministry is using clearview ai facial recognition technology to recognize russian assailants and identify combatants. the analytics tool traffic jam uses facial recognition ai to detect patterns to help law enforcement identify victims of sex trafficking. if we were to take an overly heavy-handed approach to regulating biometrics, we would lose out on those lifesaving applications as well. that is something i actually have firsthand experience with. before serving in congress, i was a member of the california state legislature. i served on the committee for privacy and consumer protection in the early days of facial recognition, before the risks and benefits of the technology
2:48 pm
were well understood. i can tell you, we saw a lot of bills that were misguided proposals. they would have effectively banned the use of facial recognition technology altogether. it's clear that it's a lot easier for us to push for legislation to outlaw a technology entirely than it is to conduct due diligence and try to intelligently balance the benefits against the risks of the technology. that's actually why the work of nist is so valuable. a better understanding of the technology and carefully developed safeguards and standards will help us develop biometrics in a way that provides safety for people's privacy without stifling the innovation that's gonna lead to future breakthroughs and benefit society. i'm looking forward to hearing about their work today and hearing from our witnesses. i'm very much looking forward to the discussion and i yield back. >> thank you. i have to say i am very envious of the car that you might be
2:49 pm
driving. with all those features, it must be -- i wager you are probably not driving a ten-year-old ford focus. >> actually, that technology is coming to less expensive cars as well. it's amazing. >> anyway, if there are other members who wish to submit additional opening statements, your statements will be added to the record at this point. at this time, i'd like to introduce our witnesses. our first witness is miss candice wright. miss wright is a director of science, technology assessment, and analytics at gao. she oversees work on federally funded research, intellectual property protection, and federal efforts to help commercialize innovative technologies and enhance u.s. economic competitiveness. since joining gao in 2004, miss wright has led work on a wide variety of policy issues involving federal contracting, risks to the defense supplier base, foreign military sales, and homeland security.
2:50 pm
after miss wright is doctor romine. doctor romine is the director of the information technology laboratory, itl. itl is one of six laboratories at the national institute of standards and technology. doctor romine oversees a research program that cultivates trust in information technology by developing standards and testing for the security, usability, and reliability of information systems. our final witness is dr. ross. doctor ross is a professor in computer science and engineering at michigan state university. he also serves as a site director of the center for identification technology research. his expertise is in biometrics, computer vision, and machine learning. doctor ross has advocated for
2:51 pm
the responsible use of biometrics in multiple forums, including the nato advanced research workshop on identity and security. as our witnesses should know, each of you will have five minutes for your spoken testimony. your written testimony will be included in the record of the hearing. when you have completed your spoken testimony, we will begin with questions. each member will have five minutes to question the panel and, if time permits, we may have two rounds of questions for our panel. we will start with miss wright. you will have to unmute, i am afraid. >> thank you. chairman foster, ranking member obernolte, and members of the subcommittee, thank you for the opportunity to discuss gao's work on federal agencies' use of biometric technologies, particularly facial recognition. the technology, which reads and measures facial
2:52 pm
characteristics, is used to compare facial images from a photo or video for identification and verification. as the technology has continued to rapidly advance, its use has expanded in both the commercial and government sectors. today, i will share highlights from our work on how agencies are using facial recognition and federal efforts to assess and mitigate privacy risks. last year, we reported on the results of our survey of the 24 largest agencies on their use of facial recognition technology. 18 agencies reported using the technology; the most common use was unlocking smartphones provided by agencies. other uses included domestic law enforcement, to generate leads for criminal investigations, as well as monitoring or controlling access to a building or facility to, for example, identify if someone on a watchlist is attempting to gain access. such use can greatly reduce the burden on security personnel to memorize faces. federal agencies may own their own systems or access systems owned by state and local
2:53 pm
governments or commercial providers to conduct searches of facial images. agencies noted that some systems can include hundreds of millions or even billions of photos. multiple agencies reported accessing systems owned by a commercial vendor. for example, they reported using it to try to identify victims and perpetrators in child exploitation cases. agencies are investing in research and development to further their understanding and application of the technology. some examples include dhs's science and technology directorate, which sponsors technology challenges for industry to develop systems. one recent challenge was to collect and match images of individuals wearing masks. in addition, they have awarded grants to research methods to prevent identifying an individual from facial images used in research. with the expanded use in the federal government, there are concerns about the accuracy of the technology, data security risks, transparency in its usage, and the protection of privacy and civil liberties.
2:54 pm
in our survey of law enforcement agencies, some agencies did not have complete information on what non-federal systems were being used by their employees. in fact, during the course of our work, multiple agencies had to poll their employees and discovered they were using non-federal systems even though the agency had initially told us otherwise. using facial recognition systems without first assessing the privacy implications and applicable privacy requirements can put agencies at risk of running afoul of privacy-related laws, regulations, and guidance. there are also risks that data sets with personal information can be compromised by unauthorized individuals. unlike a password, which can be changed if breached, a breach involving data derived from a face may have more serious consequences, as a facial image is more permanent. we recommend that agencies improve the process to track the facial recognition systems used by their employees and assess the risks of such systems. agencies are in varying stages
2:55 pm
of implementing the recommendations. in our work examining biometric privacy practices at tsa and cbp, we found that tsa had incorporated privacy protections for its pilot program to test the use of the technology for traveler identity verification at airport security checkpoints. however, cbp's privacy notices to inform the public that facial recognition was being used in its entry and exit programs were not always available or complete. cbp had not conducted audits of its commercial airline and airport partners to ensure compliance with the requirements regarding the retention and use of traveler photos. fully implementing our recommendations will be an important step to protect travelers' information. in closing, facial recognition technology is not going away, and demand for it will likely continue to grow. as agencies continue to find utility in the technology to meet their missions, balancing the benefits of the technology
2:56 pm
with data security requirements and privacy protections will continue to be important. chairman foster, ranking member obernolte, and members of the subcommittee, this completes my remarks. i will be happy to answer any questions you may have. >> thank you. and next is doctor romine. >> chairman foster, ranking member obernolte, and distinguished members of the subcommittee, i am charles romine, director of the information technology laboratory at the national institute of standards and technology, known as nist. thank you for the opportunity to testify today on behalf of nist on our efforts to evaluate the privacy implications of biometric technologies. nist is home to five nobel prize winners, with programs focused on national priorities such as artificial intelligence, advanced manufacturing, the digital economy, precision metrology, quantum information science, biosciences, and cybersecurity. the mission of nist is to promote u.s. innovation and
2:57 pm
industrial competitiveness by advancing measurement science, standards, and technology in a way that enhances economic security and improves our quality of life. in nist's information technology laboratory, we work to cultivate trust in information technology and metrology. trust in the digital economy is built upon key principles like cybersecurity, privacy, interoperability, equity, and avoiding bias in the deployment of technology. nist conducts fundamental and applied research, advances standards to understand and measure technology, and develops tools to evaluate such measures. technology standards, and the foundational research that enables their development and use, are critical to advancing trust in, and promoting interoperability between, digital products and services. critically, they can provide increased assurance, thus enabling more secure, private,
2:58 pm
and rights-preserving technologies. with robust collaboration with stakeholders across government, industry, international bodies, and academia, this work aims to cultivate trust and foster an environment that enables innovation on a global scale. since its establishment nearly a decade ago, the nist privacy engineering program's mission has been to support the development of trustworthy systems by applying measurement science and systems engineering principles to the creation of frameworks, risk models, guidance, tools, and standards that protect privacy and, by extension, civil liberties. the ability to conduct thorough privacy risk assessments is essential for organizations to select effective mitigation measures, including appropriate privacy enhancing technologies. modeled after nist's highly successful cybersecurity framework, the nist privacy framework is another voluntary tool developed in collaboration
2:59 pm
with stakeholders through a public and transparent process. it is intended to support organizations' decision-making in product and service design or deployment to optimize beneficial uses of data while minimizing adverse consequences for individuals and society as a whole. since the 1980s, nist has coordinated development of the standard data format for the interchange of fingerprint, facial, and other biometric information for law enforcement applications, extending the modalities from fingerprint to include face, voice, and dna. the standard is used globally by law enforcement, homeland security, defense, and intelligence agencies, and other identity management system developers, to ensure biometric information interchanges are interoperable and maintain system integrity and efficiency. since 2002, nist has also
3:00 pm
supported development of international standards for primarily civil applications, including i.d. cards and e-passports, authenticators that protect access to sensitive data, entry solutions, and fraud prevention. given that different uses give rise to different degrees of privacy risk, organizations need to have the means to distinguish between those degrees of privacy risk and implement appropriate mitigation measures. the nist privacy framework provides a structure for organizations to consider which privacy-protective outcomes are suitable to their uses. the research on privacy enhancing technologies that nist conducts, and the guidelines and standards that nist publishes, help organizations implement mitigations appropriately tailored to identified risks. privacy plays a critical role in safeguarding fundamental values such as human autonomy and dignity as well as civil
3:01 pm
rights and civil liberties. nist has prioritized measurement science research and the creation of frameworks, guidance, tools, and standards that protect privacy. in addition to maintaining the nist privacy framework, nist considers privacy in its cybersecurity guidelines as well. thank you for the opportunity to present on nist activities on privacy enhancing technology, and i look forward to your questions. >> thank you. after doctor romine, there is doctor ross. >> thank you, chairman foster, ranking member obernolte, and esteemed members of the subcommittee. i am grateful for the invitation to testify today. i consider this to be a great privilege and an honor to engage with the body that graciously serves our nation. biometrics is a valuable technology that has broad
3:02 pm
applications in a number of different domains. however, it is necessary to ensure that the privacy of individuals is not unduly compromised when their biometric data is used in a certain application. the purpose of my testimony is to communicate some of the ways in which the privacy of the biometric data of individuals can be enhanced, thereby facilitating the responsible use of this powerful technology. firstly, the benefits of biometrics: the need for determining the identity of a person is critical in a vast number of applications, ranging from personal smartphones to border security, from self-driving vehicles to e-voting, from checking child vaccinations and preventing human trafficking to personalization of customer service. biometrics is increasingly being used in several such applications. for instance, many smartphones
3:03 pm
employ automated face or fingerprint recognition for unlocking and payment authentication purposes. this increased use of biometric technology is being driven by significant improvement in the recognition accuracy of these systems over the past decade. indeed, the phenomenal paradigm of deep learning based on neural networks has fundamentally changed the landscape of face recognition and biometrics. this brings me to my second point, the privacy concerns associated with the technology. for example, face images of an individual can be linked across different applications using biometric technology, thereby creating a comprehensive profile of the individual or, in some cases, unintentionally divulging the person's identity where privacy was expected. as another example, rapid advances in the field of machine learning and ai have led to the development of classifiers that
3:04 pm
can automatically extract information pertaining to age, sex, and other attributes from face images. this can potentially breach the privacy of individuals. one more example: a number of face data sets have been curated for research purposes by scraping publicly available face images on the web. legitimate concerns have been expressed about using these images for research purposes without user consent. in principle, therefore, an anonymous face image can be linked to one or more face images in a curated data set, thereby potentially revealing the identity of the anonymous face. now to my final point. how can biometric technology be responsibly developed and deployed while keeping privacy in mind? firstly, by utilizing homomorphic encryption, which not only ensures that the original biometric data is never revealed but also that all computations take place in the
3:05 pm
encrypted domain. secondly, by engaging the paradigm of cancelable biometrics, where the biometric data of an individual is distorted using a mathematical function. the distorted data can still be consistently used for biometric recognition purposes, but only within a certain application. this prevents the possibility of linking the biometric data of an individual across applications. thirdly, by transforming a face image in such a way that its biometric utility is retained but the ability to extract additional attributes pertaining to age, sex, race, or health is obscured. fourthly, by making it more difficult for face images to be scraped from public websites and social media profiles. fifthly, by deploying privacy-preserving cameras where the acquired images are not viewable by a human but can only be used in a specific
3:06 pm
application. such cameras can ensure that the images are not usable for previously unspecified purposes. in addition, i must note that academic researchers in biometrics are becoming increasingly aware of the privacy and ethical implications of the technology they are developing. this means that recognition accuracy is not the only metric being used to evaluate the overall performance of a biometric system; rather, metrics related to security and privacy are also being increasingly considered. this shift in the research culture is remarkable and bodes well for the future of the technology. thank you, and i welcome any questions. >> thank you. at this point, we will begin our first round of questions. the chair will recognize himself for five minutes. first, on the prospects for secure and privacy-preserving digital i.d. we are all aware of some concerning aspects of biometric
3:07 pm
technologies, but it is important to recognize that there are valuable uses for these technologies that can improve our lives and security. privacy protections must be developed along with biometric capability so that we can reap the benefits safely. with our improving digital identity act of 2021, i and a bipartisan group of colleagues have called upon the federal agencies to modernize and harmonize our nation's digital identity infrastructure, in large part by leveraging the existing biometric databases that individual states already have in place as part of the program to support real i.d., but additionally using standards to make sure these identity tools are interoperable and can be used for presenting that identity both online and offline in a privacy-preserving way. first, doctor romine, how can biometric technologies increase our privacy by making our
3:08 pm
identities more secure against theft and fraud? >> thank you, mister chairman. i certainly appreciate the concern that you and the ranking member have on this issue. i am delighted to be here talking about this today. regarding the guidelines that we have put in place for privacy enhancing technologies, broadly speaking, we have investments in our privacy engineering program related to understanding how we can develop new technologies that can enhance privacy protections in many different aspects of technology. that, coupled with the guidance that we are updating today on identity management and appropriate protections for identity management technologies, i think there are certainly going to be
3:09 pm
opportunities to improve, as you point out, the protections of biometric information across the board through some of these updated guidelines. i look forward to discussing that with you and your staff further. >> thank you. obviously any broad implementation of biometric identification techniques would require implementation of privacy-protective measures. the so-called b-pet methods, how can they be strengthened, and are they really ready for primetime? i've been told that there is still a privacy budget that you have to enforce, and that you cannot just sit there and interrogate a database protected using homomorphic encryption. you cannot just do it repeatedly without, at some point, revealing the underlying database. there must be limits to these.
3:10 pm
have we pretty much understood and hit the limits of these, or is there a lot of work that has to be done to understand how effective it can be to exchange information between trusted entities without revealing everything? >> thank you, sir. a very short time ago, homomorphic encryption was a theoretical idea whose performance was so unbelievably slow that it was not tractable. since then, enormous strides have been made in improving the performance. i will say that these privacy enhancing technologies, particularly those using cryptography as a protection mechanism, have enormous potential, but there is still a lot more work to be done in enhancing them to make them significantly practical. as you point out, there are situations in which, even with a database obscured through
3:11 pm
encryption that is queryable, if you provide enough queries and have a machine learning back end to take a look at the responses, you can begin to infer some information. we are still in the process of understanding the specific capabilities that encryption technologies such as homomorphic encryption can provide in support of privacy.
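the "privacy budget" raised in the question above is usually formalized as differential privacy. a minimal python sketch of the idea, with a made-up dataset and illustrative epsilon values; a deployed system would combine this kind of budget with the encryption dr. romine describes:

import numpy as np

class BudgetedQueryInterface:
    """answers count queries over a sensitive dataset with laplace noise,
    and refuses further queries once the total privacy budget is spent."""
    def __init__(self, records, total_epsilon=1.0):
        self.records = records          # e.g., enrolled biometric records
        self.remaining = total_epsilon  # total privacy budget

    def noisy_count(self, predicate, epsilon=0.1):
        if epsilon > self.remaining:
            raise RuntimeError("privacy budget exhausted; no further queries allowed")
        self.remaining -= epsilon
        true_count = sum(1 for r in self.records if predicate(r))
        # noise scaled to sensitivity 1 / epsilon masks any single record's presence
        return true_count + np.random.laplace(scale=1.0 / epsilon)

# hypothetical use: a partner agency may ask a limited number of aggregate questions
db = BudgetedQueryInterface([{"age": 34}, {"age": 51}, {"age": 29}], total_epsilon=0.3)
print(db.noisy_count(lambda r: r["age"] > 30))  # noisy answer; remaining budget drops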
3:12 pm
>> dr. ross, do you have any comments on this? particularly the idea that you can cancel your fingerprints in some sense. how does that work, and does it really work yet? >> yes, thank you for your question, chairman foster. cancelable biometrics has been proposed as one way to both preserve the security and privacy of the biometric data and also provide the ability to cancel a biometric template. the way it works is as follows: say you have a fingerprint image. you subject it to some distortion using a mathematical function, and the distorted image is then used for matching purposes. in other words, if that particular template is compromised, then you would just change the mathematical function. therefore you cancel your original template, if you will, and now you generate a new fingerprint template based on this revised mathematical function. in principle, this allows us to not store the person's original fingerprint but only the distorted version of the fingerprint. that is where the cancelable property, which is really important, comes from: by changing the transformation function. now to your question about evaluation -- >> i am afraid i do not want to abuse my time, which has expired.
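a minimal sketch of the keyed distortion dr. ross describes, assuming the fingerprint has already been reduced to a fixed-length feature vector; the random-rotation transform and the 0.9 threshold are illustrative choices, not any deployed scheme:

import numpy as np

def make_transform(key, dim):
    # the revocable "mathematical function": a random rotation derived from a secret key
    rng = np.random.default_rng(key)
    q, _ = np.linalg.qr(rng.standard_normal((dim, dim)))
    return q

def protect(template, key):
    # store only the distorted template, never the original features
    return make_transform(key, template.size) @ template

def match(probe, stored, key, threshold=0.9):
    # matching happens entirely in the distorted domain
    p = protect(probe, key)
    score = np.dot(p, stored) / (np.linalg.norm(p) * np.linalg.norm(stored))
    return score >= threshold

rng = np.random.default_rng(0)
enrolled = rng.standard_normal(128)                  # hypothetical fingerprint features
probe = enrolled + 0.05 * rng.standard_normal(128)   # same finger, fresh capture

stored = protect(enrolled, key=1234)
print(match(probe, stored, key=1234))    # True: genuine user under the current key
reissued = protect(enrolled, key=9999)   # template compromised? cancel and re-enroll
print(match(probe, reissued, key=1234))  # False: the old key no longer matches

because the rotation preserves distances, accuracy inside the application is unchanged, while templates issued under different keys look unrelated.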
3:13 pm
we should be able to get to a second round. we may return. at this point, i will now recognize congress's other ai programmer, ranking member representative obernolte, for five minutes. >> thank you, chairman foster. thanks to our witnesses. it has been really interesting and i'm looking forward to the questions. i have been reflecting on the fact that when we talk about privacy, it really is a non-binary ethical problem, right? you cannot say that data is completely private or that it is not. we are dealing with a strange kind of continuum where we have to weigh the amount of privacy we are willing to give up against the potential benefit that we expect by giving up that privacy. it is a complicated thing. that is -- i'd like to kind of organize my questions around that, because i think solving that problem is going to be key to establishing a regulatory framework of what is expected when we ask companies to protect privacy. miss wright, i think i will start with you with a question.
3:14 pm
i am really, really happy to hear that gao is participating in this hearing. i think this is really -- it sends a powerful message to those we tend to regulate when we start with ourselves in government, because obviously we interact with a lot of data from a lot of different users, and we ought to be experimenting on ourselves in solving this problem before we expect others to solve it. i found your testimony really compelling. i have to admit, i was very alarmed when i read that 13 out of the 14 agencies you surveyed did not have complete information about their own use of facial recognition technology. i realize most of those were people using facial recognition technology to unlock their own smartphones, things like that. it made me think about the fact that maybe there's a difference between privacy when it comes to our own data, like when i am using my face to unlock my phone, and privacy when we are using other people's data, especially when we have a large
3:15 pm
amount of data. do you think we need to -- when we do these surveys in the future, do we need to make a distinction between those different kinds of uses? >> i certainly think that that is important. in the cases where we found agencies did not know what their own employees were using, it was actually the use of non-federal systems to conduct facial image searches, such as for law enforcement purposes. in those scenarios, what was happening is perhaps the headquarters did not have a good sense of what was happening in the regional and local offices. that is where we think it's really important for agencies to have a good understanding of what systems are being used and for what purposes, and then, with that inventory, to also make sure that they have the necessary tools to balance the potential privacy risks associated with using those systems. >> all of these things -- if you are using a commercial source for this kind of technology, it has to go
3:16 pm
through procurement, right? would procurement maybe be a fruitful avenue to look at in terms of informing this flow of information? >> certainly. there are a couple of different scenarios. one in which agencies might have been accessing state and local systems or commercial systems through a test or trial period, and then there also might be instances where they actually have an acquisition or procurement in place. we actually have some ongoing work right now that is looking at law enforcement use and the kind of mechanisms that they are using in acquiring systems from commercial vendors. i think that information is going to be really telling for us to understand what sorts of privacy requirements are being put in place when agencies are acquiring services from these commercial vendors. >> that makes sense. doctor romine, i found a really interesting thing in your written testimony when you
3:17 pm
were talking about the privacy framework and the fact that it's not a static thing, it's not binary, which is very much in keeping with the way i look at it as well. could you talk a little bit about how you evaluate something that dynamic? i think part of it has to be based on use. if you use facial recognition for verification, that is a different use case than identification. i think users' expectations of privacy are going to be different. how do you approach that kind of ethical problem? >> that is exactly right. i think you have hit it on the head, in the sense that the context of use is critical to understanding the level of risk associated with the privacy considerations. one of the things that our guidance is intended to do, and the privacy framework is intended to do, is give organizations the ability to establish privacy risk management principles as
3:18 pm
part of their overall risk management for the enterprise. you talk about reputational risk and financial risk and human capital risk; privacy risk has not typically been included in that. we are giving organizations the tools now to understand that data gathered for one purpose, when it's translated to a different purpose in the case of biometrics, can have a completely different risk profile associated with it. it is not inherent in the data; it's the context in which that data is being used. our tools allow for a deeper understanding on the part of the organization on that issue. >> i see i am out of time. if i get another round here, i'm going to ask about scope creep. thank you, mister chair. i yield back. >> we will now recognize representative carey for five minutes.
3:19 pm
>> you are passing? okay, we will recognize representative bice for five minutes. >> thank you, mister chairman and ranking member obernolte. i have a couple of questions that i just want to touch on. this is a topic of conversation that has come up in oklahoma a couple of times on the state side. miss wright, you testified that most agencies accessing non-federal facial recognition technology do not track use or assess related privacy risks. as far as you are aware, is there any federal law that requires agencies to track this information? >> there is a broad privacy framework, i guess i will say, where you have the privacy act, which does call for agencies to limit their collection as well as disclosure and use of personal information in a government system. a photo would be considered an
3:20 pm
example of personal information. you also have the e-government act as well, which does include a provision for agencies to conduct privacy impact assessments when they're using these systems. again, to be able to use those privacy impact assessments to analyze how the information is collected, how it is stored and shared and managed in the federal system. lastly, i will just note that when we spoke with them, they had noted that agencies must ensure that privacy requirements apply to any systems that are being operated by contractors on behalf of federal agencies. >> actually, we have not talked about the contractor piece, which is sort of an interesting angle. but i want to circle back around to your comments about these assessments. do you think these agencies are doing the assessments? if so, are those outcomes sort of published so that other agencies can understand maybe the risks or the breadth of what they are utilizing within the agencies?
3:21 pm
>> we have seen a mix of how agencies are approaching the privacy assessments. in my statement earlier, one of the things that i mentioned is when you have agencies that are using systems -- excuse me, employees who are using systems and their agencies are not even aware, then there is a likelihood that these privacy assessments have not been conducted. it's a really important thing for agencies to keep in mind as they are continuing to use facial recognition systems. >> do you think it would be helpful for congress to look at requiring these assessments to be done, maybe on a periodic basis, for agencies utilizing these types of biometrics? >> again, the e-government act calls for agencies to do that, but the extent to which they are doing that really varies. perhaps that is work that we can talk about, if there are oversight opportunities there, to look at the extent to which they are doing privacy assessments,
3:22 pm
especially in the realm of biometrics. >> perfect. what do you think some of the potential adverse consequences might be of agencies failing to track this information, either themselves or through third-party systems? >> a couple of things come to mind. are they using systems that have reliable data? that have quality images that will then affect the sort of matching results that will come back and the extent to which those can be trusted? you can see where there is a potential for mismatch, which then might mean, in a law enforcement example, that you might be chasing down a lead that is not going to be fruitful, or you might be missing an opportunity to chase down a lead. i think that is one thing -- one piece of it. the other piece, when we think about this from a privacy perspective, is how are the images being collected and how are they being used, and does the individual have any say?
3:23 pm
did they provide any consent, for example, to their photo being captured and used in this way? there are certainly a number of different risks associated. there is still the issue of data security: are these systems that are being used secure? we have had cybersecurity as a high-risk area on the gao high-risk list for many years within the federal government, and you can imagine this only opens up the door to the potential for even greater security breaches. >> i would say, sitting on the cyber subcommittee under armed services, i think you are exactly right. we talk about this from sort of a data privacy perspective, but we also need to recognize that there is certainly huge potential for cybersecurity challenges when you are collecting these types of biometrics and storing them either through a third party, which i think in some cases can be maybe more of an issue, but certainly if agencies are actually storing that information themselves. my time is almost expired.
3:24 pm
i yield back. >> thank you. i believe we will have time for a second set of questions here. i will now recognize myself for five minutes. ms. wright, i guess it's abundantly clear that the u.s. taxpayer has suffered greatly from identity fraud, in everything from irs refund fraud to unemployment benefit fraud during covid. you name it. has anyone, to your knowledge, either inside gao or elsewhere, netted out the total losses to the federal government from identity fraud that might be prevented by using sort of state-of-the-art identity proofing mechanisms? >> that is certainly not something that came up in the course of the recent work that we have done. i am not aware, but i am certainly happy to take that back and follow up with you on that. >> i think we will be asking for the record sort of what the scope of such a survey
3:25 pm
would be. there appears to be just little bits and pieces of documentation of the enormous losses that the taxpayer suffers from this fraud. trying to get that balance right could be an important outcome here. >> i am happy to do that. >> secondly, one of the tough things that we are going to face as a government is sharing data with other governments. if you talk about biometric databases, or the difficulty of regulating crypto, where you will ultimately need to have uniquely identified, biometrically de-duplicated, cryptographically verifiable licenses if you are going to prevent it from being gotten around somewhere, or all these sorts of things. this involves setting up very much a passport-like system, something where you have to identify that someone is operating multiple identities
3:26 pm
in multiple jurisdictions. so, doctor ross, are you familiar with the sort of state-of-the-art there, and what might be useful? are there investments that we can make towards more research that would allow you to ask very sensitive questions of big databases that are owned by other states or other governments? >> certainly. i think one thing that can be harnessed, which has to be further researched, is the notion of differential privacy, which would indicate that within a certain jurisdiction you are able to do certain identity assessments using biometrics, and you have specific use cases, specific purposes, in which identity can be matched, but in other cases the identity cannot be matched. by defining the policies, one could then use these principles that you alluded to earlier, including homomorphic encryption and differential privacy, in order
3:27 pm
to ensure that that kind of functionality can be performed. however, i must note that this research is still in its infancy in the context of biometrics, and certainly more investment is definitely needed in order to assess the suitability of this in operational environments, and further collaboration and investment is needed to implement these techniques in operational environments. >> thank you. doctor romine, when you are involved in international standards, which is part of nist's mission, do you get the feeling that the united states is leading the way, or are there peers around the world that are as sophisticated technologically in biometrics and in privacy-preserving methods? >> in the work that we are doing in the international standards arena surrounding identity management, we certainly believe we are
3:28 pm
leading in that space. there are certainly other like-minded countries that are partners with us, that value democratic ideals. we strive to work closely with them, and they do have very strong technical capabilities as well. >> i have been instructed that in at least some european nations, you have a right to know when any government official accesses your data, at least outside a criminal investigation. are these things that can be cryptographically guaranteed, or is that really an unsolvable problem -- if you understand my question? i dream of some technology that would allow you to know with cryptographic certainty that someone has touched your data. >> it's certainly theoretically possible to use cryptography to address the concern there. i would not call it foolproof
3:29 pm
necessarily. the history of advancing technologies is colored with many different sorts of advances and risks: risks are addressed by new technologies, which create additional risks. the goal for us is to ensure the trustworthiness of the underlying systems, and certainly cryptography can be important. >> doctor ross, do you have any thoughts on the feasibility of that? >> i think it's an excellent question, because one thing this entails is keeping a ledger of interactions between humans and the data that is being stored. for example, blockchain principles have been used to keep track of certain transactions that have occurred, and these records are immutable. i believe that some of these principles can be leveraged
3:30 pm
into the field of biometrics, but i must maintain that more research is needed and more investment is needed. the technology is certainly available, but it has to be incorporated into the context of biometrics.
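a minimal sketch of the ledger idea mentioned above: an append-only, hash-chained log of who accessed which biometric record, so later tampering is detectable. the actors and record names are hypothetical, and a real deployment would distribute or anchor the chain rather than keep it in one process:

import hashlib, json, time

class AccessLedger:
    """append-only log of data accesses; each entry commits to the one before it,
    so rewriting history changes every later hash and is detectable."""
    def __init__(self):
        self.entries = []

    def record_access(self, actor, record_id, purpose):
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = {"actor": actor, "record": record_id, "purpose": purpose,
                "time": time.time(), "prev": prev_hash}
        body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        self.entries.append(body)

    def verify(self):
        prev = "0" * 64
        for e in self.entries:
            expected = dict(e)
            expected.pop("hash")
            h = hashlib.sha256(json.dumps(expected, sort_keys=True).encode()).hexdigest()
            if h != e["hash"] or e["prev"] != prev:
                return False
            prev = e["hash"]
        return True

ledger = AccessLedger()
ledger.record_access("agency_x_analyst", "face_template_0042", "watchlist check")
print(ledger.verify())                              # True: chain is intact
ledger.entries[0]["purpose"] = "something else"     # tampering with the history...
print(ledger.verify())                              # ...is detected: False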
3:31 pm
>> thank you. my time for this round is expired. i now recognize representative obernolte for five minutes. >> thank you, dr. foster. doctor romine, we were having that discussion about, you know, the continuum of privacy and how that works ethically with our efforts to regulate it. in your written testimony, you talked about this idea that privacy can be violated when the scope in which biometric data is used differs from the expectation of the person who provided it. that's ethically complicated, right? sometimes there are societally beneficial uses we put that data to. a good example is what we've been talking about with using clearview ai to halt sex trafficking. if you asked the people that are saved from sex trafficking, they certainly didn't give permission for the use of their data in that context. but if you asked them if it's okay with them, they would say yes, please. right? how do you navigate that minefield? >> that's a terrific question. one of the things to keep in mind is that, you know, when you have acquired biometrics data for whatever purpose, any organization that has acquired such data, these are now assets in their control. sometimes the pressure to use those assets in ways that weren't originally intended is pretty enormous. the idea that, hey, we could do this slides into thinking we should do this with those data. that's one of the reasons we always have to stress the importance of context of use in these areas. you are absolutely right. in some cases, a new context of
3:32 pm
use may be enormously beneficial and perhaps not even controversial. in other cases, it could be extremely, potentially, damaging. by the way, this is the difference between cybersecurity and privacy, in the sense that a cybersecurity event does not have to take place for privacy harms to occur. simply using, in this case, biometrics data in ways that were not intended, and perhaps violate the expectations of those who provided the data, can create privacy events. >> sure. yeah. i completely agree. in fact, i want to ask a question about that to dr. ross. in your written testimony, you were talking about privacy violations that can occur when using facial recognition to infer racial, sexual, or health characteristics not intended by the person providing the data, which i thought was very interesting. how do you navigate that in an ethical sense?
3:33 pm
because when i post a picture of myself on facebook, and one of my friends looks at that and says, boy, he really looks unwell, right? i can't point my finger at them and say, hey, that's a privacy violation, i did not intend for you to infer anything about my health. they would just roll their eyes, because it's understood: my picture is out there. those inferences can be made by anyone who sees it. why do we make a distinction between that use when a human does it and when a machine does it? >> thank you. again, an excellent question. we are really distinguishing between human-based analytics versus machine-based analytics. when you employ a machine to do this, you can do this en masse. you can have billions of images, you can run the software over these billions of images and make some assessments in the aggregate, without user consent. and so, it is the ability to do this repeatedly over massive
3:34 pm
amounts of data and then use that aggregate in order to, say, perform additional activities that were not indicated to the user. that is where the problem lies. if the user were to give consent saying that, yes, these images can be used for further analytics, then i believe using the machine will be productive in some cases, but in other cases, as you point out, there might be a violation of privacy. i think it all boils down to user consent, and the fact that you can do this en masse. how do we do this in a manner where the person is aware of how their data is being used, and in a manner that does not unwittingly leak additional pieces of information that might violate their privacy? >> right. i somewhat agree. i think the distinction is not the amount of data that is processed, but the inferences that can be made that might be unintuitive to the person providing the data.
3:35 pm
quickly here, another question for you, doctor ross, before we run out of time. you talk a lot in your testimony about privacy by design, which i think is a really elegant concept. consider me a skeptic, because, for example, if you're using an algorithm that distorts images in a way that sex or ethnicity can't be read, we are gonna run into exactly the same problem, aren't we, that we did with cryptography, where crypto algorithms developed ten years ago don't work anymore because computers are so much more powerful. as recognition technology gets better, are those algorithms not gonna work anymore either? >> a great point. very insightful comment. i think this is where more mathematics is needed as we start developing biometric technology: developing theoretical guarantees, understanding what the privacy leakages are, the information
3:36 pm
leakage. and what is lacking here is privacy metrics, really. privacy metrics in some sense are a moving target, because if technology cannot deduce some attribute from a face image today, it might be able to do it tomorrow. and what is deemed to be private now, today, may no longer be deemed private tomorrow. that is where the concern is. this is why, as the technology evolves, the policies that are established must be revisited. it's not static in time, it is dynamic in time, because as technology advances, these policies must evolve and also the metrics used to evaluate them must evolve. in short, i completely agree with your statement. some of the problems in cryptography can potentially manifest themselves in these other techniques. but it's not unsolvable.
3:37 pm
>> right. >> i think with adequate technology development, especially employing mathematical transformations, i believe a solution can be found. >> fascinating discussion. thank you for that. i yield back, mister chair. >> thank you. i think there may be time for an additional round. we will see how things go. we will now recognize representative perlmutter for five minutes. >> i was hoping ms. bice might go first. i'll never be able to catch up to jay obernolte on this subject. stephanie, i can at least talk to her about it at softball. anyway, i want to thank the panel. there was a word you used, dr. ross, and then you got into this conversation with mr. obernolte about the fact that, you know, technology may make some of what we are trying to do today, in terms of privacy and cybersecurity, outdated tomorrow. it reminded me of a great
3:38 pm
oklahoman, will rogers -- it was about certainty, but i will use it with immutability: the only things that are immutable are death and taxes. my question is -- i'm just a science fiction person when it comes to this -- i'm thinking of minority report with tom cruise. you may have all kind of addressed that. every place he goes, they know him already. eventually, he has to have his eye taken out because of this. i went and bought an ipad holder from a company called weathertech the other day. we were in there for something else. i saw it, it looked good, so i bought the thing. all of a sudden, i'm getting ipad holder ads like crazy. i didn't even look for it online. i just bought the darn thing.
3:39 pm
i feel like i've got either big business looking over my shoulder, or big government looking over my shoulder. i'm making more of a statement than asking a question. but i guess, miss wright, i will start with you. doctor romine was talking about privacy versus cybersecurity. what can we do in the congress to ensure ourselves a little bit more privacy? >> i think a really key, important factor is how we hold -- i will start with the federal government -- how we hold agencies accountable for the information they are collecting: the purpose for which the information is being used, how it's being stored, shared, and secured. those are fundamental things to start with as we think about this issue of privacy. and to really think about what applications or use cases we think should be permitted or
3:40 pm
restricted, because i think that will give you a handle on where the concerns are with respect to privacy. again, at the end of the day, this is all about trade-offs. while there might be some convenience factors, there might be some security benefits as well. there's also the issue of privacy and being able to protect your personal information. i think that's where the tension lies. >> well, there is a tension as well between the kind of privacy we might want from state, federal, or local governments, versus the privacy we may want from private enterprise, you know? the thing i ran into, it was a spontaneous purchase of this ipad holder. all of a sudden, i'm getting ads about it. you have got two really sizeable entities out there looking over your shoulder. i think we, in the congress, need to think about both of
3:41 pm
those when we are thinking about privacy and cybersecurity in particular. gentlemen, does anybody have a comment on my sort of general proposition here? it's not science-based, it's more personal. >> i would be happy to share some comments. i think the issue you are describing is actually very important, namely, the exchange of biometric data. it is collected for one purpose and then it can be transmitted to another agency, which might use it for a different purpose. i think that is a legitimate concern. one way, in order to kind of prevent this even before it happens, is by ensuring that when we store the biometric data in one entity, it is
3:42 pm
suitably encrypted, suitably transformed, and when it is used in a different entity, or by a different entity, it is encrypted differently, or transformed differently. what happens then is that these two sets of data cannot be linked; they have been transformed differently. that becomes very important. on the flip side, it might actually prevent, say, one agency from communicating with another agency, because the biometric data cannot be accessed. this is where you have specific use cases -- there are certain situations when it is acceptable; in other situations, like the one you described, it is not acceptable. this is where technology developments must be augmented with legislative instruments to manage the data in a manner that is appropriate for different uses.
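a minimal sketch of the unlinkability dr. ross describes, reusing the keyed-rotation idea from the earlier cancelable-template sketch: two entities enroll the same person under different keys, so each can match its own users but neither can link its templates to the other's. the entity names and keys are hypothetical:

import numpy as np

def protect(template, key):
    # each entity distorts templates with its own secret key (a random rotation)
    rng = np.random.default_rng(key)
    q, _ = np.linalg.qr(rng.standard_normal((template.size, template.size)))
    return q @ template

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

person = np.random.default_rng(1).standard_normal(128)   # hypothetical face features
clinic_record = protect(person, key=111)       # stored by a medical office
workplace_record = protect(person, key=222)    # stored by an employer's access system

# within one entity, a fresh capture of the same person still matches:
probe = person + 0.05 * np.random.default_rng(2).standard_normal(128)
print(round(cosine(protect(probe, key=111), clinic_record), 2))   # close to 1.0
# across entities, the two stored records look unrelated and cannot be linked:
print(round(cosine(clinic_record, workplace_record), 2))          # close to 0.0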
3:43 pm
>> thank you. my time is expired. >> thank you. i will now recognize representative bice for five minutes. >> thank you. and for representative perlmutter, my friend and coach -- i think, as part of that, i recognize the connections there -- it is re-marketing. your email is likely tied to your credit card in some way, or you may have entered your email address when you checked out. your email is tied to social media, and when they realize that you purchased that, they start marketing to you all sorts of things. that's been going on for quite some time. for a lot of folks, it is concerning. we begin to wonder, how did they know, how did they get this information? that's big data at its finest. i want to talk about what we did here in oklahoma: this last session we passed house bill 2968, the computer data privacy act. the bill allows for the option for personal rights to be returned to the individual,
3:44 pm
along with the option for cancellation of the information in a private company's database. to me, this seems like it could be a solution for privately collected biometrics data. this is to any of the witnesses here: what do you think are the most concerning aspects of developing biometric technology? >> i would be happy to offer some comments, if you don't mind. >> sure. >> to both parts of your excellent question: i think one of the most obvious concerns about biometrics is the ability to link different data sets. i think that clearly constitutes a problem in some cases. in other cases, it is an advantage. once again, as the technology
3:45 pm
improves, as the recognition accuracy numbers improve, this kind of linking can be done with more certainty, because the error rates are decreasing. i think this is where policies for regulating the use of the technology become important. in some use cases, it is absolutely essential to have the functionality. in other cases, it may not be required. secondly, in response to your first comment, again an excellent comment: when a user in a private enterprise offers their face image, or fingerprint image, it would be nice if they can say for what purposes it can be used. for example, if it's a face image, they might say, well, you can use this for biometric recognition, but it should not be used for assessing, say, age or health cues.
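a minimal sketch of the kind of purpose-limited transformation being described, assuming a face embedding and a single learned "attribute direction" (for, say, age); the direction here is random stand-in data, and real attribute suppression is an active research problem rather than a solved one:

import numpy as np

def suppress_attribute(embedding, attribute_direction):
    """project the embedding onto the subspace orthogonal to a learned attribute
    direction, so a linear classifier along that direction reads nothing,
    while most of the identity information is retained."""
    d = attribute_direction / np.linalg.norm(attribute_direction)
    return embedding - np.dot(embedding, d) * d

rng = np.random.default_rng(7)
face_embedding = rng.standard_normal(256)   # hypothetical identity features
age_direction = rng.standard_normal(256)    # stand-in for a learned attribute axis

protected = suppress_attribute(face_embedding, age_direction)
# attribute signal along the suppressed direction is (numerically) zero:
print(np.dot(protected, age_direction / np.linalg.norm(age_direction)))
# identity similarity is largely preserved:
cos = np.dot(protected, face_embedding) / (np.linalg.norm(protected) * np.linalg.norm(face_embedding))
print(round(float(cos), 3))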
3:46 pm
the moment they specify that, their data should be transformed, prior to storing it in a database, in a manner that permits only that functionality. this gives some degree of control to the user, because the user is now able to specify what kind of information can be gleaned, and what kind of information should not be gleaned. technology is then needed to impart this kind of functionality. i think that is one important area where more investment is needed. many techniques have been proposed in the literature, but they have not been evaluated at scale. there is tremendous opportunity if we were to invest on this front. excellent question. thank you for hearing me. >> anyone else care to comment on that particular aspect? >> i'd be happy to weigh in. some of the challenges involve
3:47 pm
the ability, as my colleague doctor ross said, to glean certain kinds of information, and some of the potential societal harms or inequities that might occur. i will go back to the ranking member's question about his facebook image and having a friend of his see it and say, wow, you do not look very good. imagine instead of his friend, it was an insurance company deciding, wow, you do not look very good, and taking steps as a result of that assessment. those are the kinds of societal harms that i think we need to be wary of. >> perfect. i think this is a really great point. the use of those biometrics is incredibly important, and we need to develop systems and controls that allow individuals to have some sort of say in how their information is utilized. thank you for your time.
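[editorial aside: one hedged sketch of the purpose-limitation idea doctor ross described -- the record fields, purpose names, and exception type here are assumptions for illustration, not an existing api -- is to store the consented purposes next to the transformed template and check every request against them.]

```python
# hypothetical sketch of purpose binding: the stored record carries the purposes
# the user consented to, and every processing request is checked before the
# (already transformed) template is released; "age_estimation" is refused.
from dataclasses import dataclass, field

@dataclass
class BiometricRecord:
    template: bytes                                    # transformed template, never the raw image
    allowed_purposes: set = field(default_factory=lambda: {"identity_verification"})

def request_processing(record: BiometricRecord, purpose: str) -> bytes:
    if purpose not in record.allowed_purposes:
        raise PermissionError(f"consent does not cover purpose: {purpose}")
    return record.template

record = BiometricRecord(template=b"...protected template...")
request_processing(record, "identity_verification")    # allowed
try:
    request_processing(record, "age_estimation")       # refused: outside the consented purposes
except PermissionError as err:
    print(err)
```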
3:48 pm
mister chairman, i yield back. >> thank you. i think we will now embark on an actual final round of questions. i will recognize myself for five minutes here. doctor ross, you seem to be coming close to describing something like a license regime for collecting biometric data. let's say someone wanted to put a facial recognition camera in front of their nightclub to find people that have repeatedly shown up at the nightclub and caused violence. it sounds like a legitimate thing. but if they start transferring that information around, then there are a bunch of issues. are there standards? this might also be a question for doctor romine. are there standards being utilized for how you would license the collecting of the data and also license the transfer of data? so that you would actually --
3:49 pm
if you were holding biometric data on someone, you would also have to be able to demonstrate a chain of custody showing that you had obtained it only through a set of licensed distributors of data, with customer consent at each point. have people gone that far, or has any country gone in that direction? >> thank you for your question, chairman foster. i will address the first question. i am sure my colleague will answer the other. on the first part, there is research being conducted in which privacy is being moved closer to the sensor rather than to the data, because once the data is acquired, it is available. yes, you can encrypt it, you can transform it, but someone has access to the data. what if we move the privacy
3:50 pm
aspect to the camera itself, in such a way that the camera is designed so it can only extract or acquire specific aspects of the scene? that becomes very important because the full digital version will never be available, even prior to storage. that might be one way in which the scenario you described can be handled, because the data will no longer lend itself to being processed by a different organization or entity -- the data has already been transformed at the time it was acquired by the camera. that would be one technology solution, but as i mentioned earlier, these things have to be evaluated. much more research, much more investment, much more evaluation. these are needed in order to
3:51 pm
substantiate these principles. >> will this ultimately require, for some purposes, basically a government back door? for example, say you have cameras looking at elevators just to make sure you are opening and closing the elevators as fast as possible, where you only really have to detect the presence of a human. all of a sudden you find some massive crime has been committed. the government might want to go to a trusted court and say, okay, bypass the obfuscation, i want to see the face of the person in the elevator. are these necessary things, or are these policy options that we are going to have to face? >> i think it will be a good mix of technology innovation and policy, in which the same data can be stored in different formats, different transformations, so that it can be used for some purposes and not for other purposes.
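[editorial aside: a rough, hypothetical sketch of the "privacy at the sensor" idea in the elevator example -- the detector stub and message format are assumptions for illustration -- is a device loop that emits only an occupancy count and discards each frame immediately.]

```python
# hypothetical edge-device loop: only the occupancy count leaves the device;
# the frame is never stored or transmitted, so there is no face to retrieve later.
from typing import Iterable

def detect_people(frame: dict) -> list:
    # stand-in for an on-device person detector; a real deployment would run a
    # small vision model here and still never export the pixels
    return frame.get("people", [])

def controller_loop(frames: Iterable[dict]) -> None:
    for frame in frames:
        occupancy = len(detect_people(frame))
        print({"occupancy": occupancy})    # no pixels, no identities transmitted
        del frame                          # raw frame discarded immediately

controller_loop([{"people": ["p1", "p2"]}, {"people": []}])
```

whether a court-ordered "bypass" is even possible then becomes a design decision: if raw frames are never retained, there is nothing to hand over.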
3:52 pm
i think technology can be applied in order to transform the data into different formats, but then access to the individual formats should be guided by policy as to who can access them and who cannot. i think it would require a good coupling between the technology innovations and some very sound policies to make it happen. >> dr. romine, do you have any comments about this? when you engage with some of your foreign colleagues, do they face a very different set of attitudes than in the united states? >> certainly that is true. for example, as you know, the gdpr in europe envisions a very different way of approaching privacy protection than we currently have here in the united states. that said, one of the reasons that the privacy framework that
3:53 pm
we have developed is regulation agnostic and technology agnostic is that we wanted it to be adaptable, usable around the globe, and able to provide assurance that if you follow these guidelines, you have evidence to support your compliance with whatever regulatory regime you happen to be in at any given time. >> thank you. i will now recognize representative obernolte for five minutes. >> thank you, chairman foster. i would like to continue our discussion about the ethical philosophy around privacy. a couple of interesting things have come up in this last round of questioning, like how do we safeguard this privacy from a 30,000 foot level? i think there are some things that can work and some things that probably will not work. doctor ross, you mentioned disclosure,
3:54 pm
which i used to think was a great idea, until i started looking at end user license agreements for software. there are pages and pages, people scroll through, they click agree, no one ever really reads them. what good is it possibly going to do for us to add another paragraph that says, here is how we are going to use the facial data that you give us? there was an episode of south park a couple of years ago, a parody in which one of the characters inadvertently gave apple the right to do medical experimentation on him. his friends were like, what, you just clicked that and signed without reading it? who does that? the answer is, everybody does that, right? so i do not think disclosure is the answer. i think maybe control over who has access to the data. if i give my data to apple for a certain purpose, the fact that apple should not give that data to someone else to use for a
3:55 pm
different purpose, i think that is closer to the mark. i think ultimately we are not going to find a real regulatory solution to this problem without looking at the things we are trying to prevent. that's what attorneys call the parade of horribles. let me ask about that. doctor romine, i will ask you about this. we are entering an era when anonymity is a lot less than it used to be. that is going to be true regardless of what approach we, as a government, take towards privacy. can you walk us through, if we fail to act, the worst things that can happen? i think those are the ones that we have to be trying to prevent. >> fair enough. i will say figuring out what the worst things are might take some time. some of the things that i have already alluded to include this idea of organizations making decisions based on inferences from biometric data that
3:56 pm
disadvantage certain groups over others. >> let me stop you there, though, because we have had that problem with ethics around ai algorithms, and we are dealing with that issue. i mean, the solution to that is you focus on the fact that that behavior is already illegal. if i'm going to kill somebody, it's equally illegal for me to kill somebody with a knife or a gun. the tool does not matter, the act is what matters, so why is that different in the case of privacy? >> i do not think it's so much different as it is a consequence of the lack of privacy, or a privacy compromise in this case, that would lead to that activity. there are a variety of things that i can imagine. there are aggregate societal
3:57 pm
decisions that are made that may be predicated on aggregate data that violates privacy considerations, and policies may be instituted that are inimical to certain populations as a result of issues related to privacy or biometrics. in all of these cases, what we have discerned is that there is no purely technological solution to the privacy problem, and i think there is no purely policy solution that is going to solve that problem. it is an ongoing joint effort of providing appropriate technologies for improving privacy protections and matching those with appropriate policy decisions that can prevent some of these harms. >> i agree with you. i definitely think in crafting policy, we need to be looking
3:58 pm
at -- asking ourselves the question, what problems are we trying to solve? what are we trying to avoid? merely focusing on anonymity, i think, is a fool's errand, because we have a lot less anonymity now than we used to, and we will have even less. there is nothing we can do about that. i think there's a big difference, when we talk about the parade of horribles, between whether it is the government violating people's privacy or other entities. government has, of course, power that other entities do not. if you want an example, look at what china does with some of this personal data. that is at the top of my list. i really do not think we are going to get there from a policy framework standpoint without thinking about the problem we are trying to solve. it's a fascinating discussion. i'm sure we are going to continue to have it over the next couple of years. thank you very much, chairman foster, i really enjoyed it.
3:59 pm
i yield back. >> we will now recognize our lawyer in residence for five minutes, representative perlmutter. >> my thinking is that mister obernolte is really focusing on the question of the day. i remember when i was in the state senate about 20 plus years ago, and we were just trying to have an intranet within the colorado legislature, and something came up and we were talking about social security numbers and should we release them and all that stuff. for privacy purposes, i was sort of being cavalier and i said, there is no such thing as privacy. to your point, there is no such thing as anonymity. that has only grown over the last 30 years. the question is, i think from a policy perspective, technologically, we can address things, and as miss wright
4:00 pm
said, you give up things to get something. you can make it tougher for a cybercriminal or for somebody to use your data, but you are giving up some efficiency or some ease of use in the process. the supreme court, in several decisions, none of which i liked -- the one i liked the least is the reversal of roe v. wade -- basically said that under the united states constitution, there is no such thing as a right to privacy. i want to feel secure that when i go, i don't know -- i mean, i want to feel secure that when i buy something spontaneously, that doesn't alert everybody under the sun to something. or when i walk by, that doesn't send out all kinds of -- "sell to him," or "get him."
4:01 pm
i guess this is for everybody, including my two colleagues. i don't know. to jay's question, what is it we are trying to solve? what do we want here? do we want to create a right to privacy, now that the supreme court says there isn't such a thing? we certainly, legislatively you can say something like that. and then how far we want to take it. those are the questions. and then for the technology, help us put that into place, knowing that technology is going to evolve and change and things we thought were in place will be replaced. that's just the perlmutter thinking on jay obernolte's line of questioning. if anybody's got a thought, it's the responsibility of the technologies. and you, miss wright, the director of the agency that thinks about this stuff, to think, to say okay, from a
4:02 pm
technology standpoint, we can do some things if you guys give us a clear direction. i think bill is trying to do that with some of his digital legislation. i think jay has had some stuff too. i don't know. doctor foster, i'm gonna turn it back to you. you can do with my two minutes whatever you wish. >> all right. that's an interesting -- you know, i will ask you a question. so much of this is gonna have to do with our cell phones. doctor romine, is there good coordination with the manufacturers of the cell phones? there's incredible ai being built into the next generation of smartphones, but all of it is inside the secure enclave, where you have some assurance that it is trusted. are you having thoughtful interactions, or do you get the impression that they are just trying to set up a walled
4:03 pm
garden for people who want privacy information under their control? >> we work with a very large and broad cross section of technology firms, including cell phone manufacturers and providers. an interesting point, on further reflection on ranking member obernolte's question about significant harms: one that i can imagine is, either through cell phone tracking or face recognition -- you know, cameras, street cameras, and so on -- someone trying to access safe and reliable medical services, whether it is psychiatric services or something else, and suddenly that becomes a matter of public record. someone has now sort of been outed because of biometric information, privacy information, trying to obtain services. this is another one of these
4:04 pm
very serious potential issues. but, yes, we are in discussion with cell phone manufacturers and other advanced technology firms all the time. >> right. okay. thank you. we could go on all afternoon on this. i just really -- i suppose i have to close the hearing out. before we bring the hearing to a close, i want to thank our witnesses for testifying before the committee. it is really valuable for us in congress, as we struggle with all of the policy issues here on biometrics and privacy, that we have access to real quality experts, so we can understand the technological feasibility of things and don't generate legislation based on wishful thinking instead of technical reality. the record here will remain open for two weeks for additional statements from the members, and for additional questions the committee may ask the witnesses. the witnesses are now excused, and the hearing is now adjourned.
4:05 pm
4:06 pm
next, a review of the most watched and controversial decisions of the u.s. supreme
4:07 pm
court's recent term, from the new york guns case to the overturning of roe v. wade, which eliminated abortion as a constitutional right. lawyers and law professors examine those decisions and other cases decided this term, in this discussion hosted by the american constitution society. >> good afternoon everyone. welcome to the american constitution society's annual supreme court review for the 2021-2022 term. my name is christopher -- the country's foremost progressive legal network, with more than 200 student and lawyer chapters across the country -- the acs mission is to shape debate and nurture -- and make sure the law is -- if you aren't already, i encourage you to become a member of acs. go to acslaw.org, where you can join and find more information about events like this one.
4:08 pm
before we get started, a few housekeeping notes. one and a half hours of california cle credit is available for today's discussion. additional information and cle materials can be found on the acs website and will be shared with all participants in a follow-up email. there will be time at the end of today's discussion for audience questions. to submit a question, please use the q&a box at the bottom of the zoom screen. do not use the chat to ask your question, as we will not be monitoring it and your question will not be put in the queue. if you are a member of the press asking a question, please identify the outlet for which you are reporting at the top of your question. now let's turn to today's discussion. each year, as a supreme court term draws to a close and summer begins, acs hosts a discussion with academics, practitioners, and advocates about some of the most consequential and talked-about cases of the past term. in that sense, this year is no different. in just a minute, you will hear from this year's distinguished panel, who will explain and
4:09 pm
provide context for some of the most watched and anticipated cases of the term. in other ways, this year is very different. it was the first full term that justice amy coney barrett served on the court, and therefore the first full term for the court's conservative supermajority. it was a term whose second half was conducted across the street from where the confirmation took place for now-justice ketanji brown jackson, the first black woman nominated and confirmed to the u.s. supreme court. it was a term conducted amidst growing calls for court reform that ranged from more stringent ethics rules, to term limits for justices, to adding seats to the court. it was a term in which the full draft opinion of one of the most consequential decisions of the past half century was leaked to the public, months before the decision was released, giving us advance warning about how this court would handle the latest chapter in the fight over reproductive rights, and perhaps offering a glimpse into how this conservative
4:10 pm
supermajority might handle other cases looking to abandon long-standing precedent. it was a term in which many court watchers paid especially close attention to how the court heard cases, to get an idea of what these decisions say about the court today and where it might be headed tomorrow. to delve more into these cases and issues, it's my pleasure to introduce our moderator for today's discussion, tom goldstein. he is best known as one of the nation's most experienced supreme court practitioners. he has served as counsel to a party in roughly 130 merits cases at the court, and recently argued his 45th. in addition to practicing law, tom has taught supreme court litigation at harvard law school since 2004. he is also the cofounder and publisher of scotusblog, a website devoted to comprehensive coverage of the court, which is the only -- to receive the peabody award. he has received recognition of his practice before the supreme court and for his appellate
4:11 pm
advocacy generally. in 2010, the national law journal named him one of the 40 most influential lawyers of the decade. we are delighted and lucky to have him again as the moderator for this annual event. welcome, tom. >> thanks so much, christopher. wow. a lot of people's worst fears, in terms of what would happen in a supreme court term, came to pass. at the same time, this is a day of celebration for many because of justice jackson taking her seat on the court today. we have an incredibly accomplished panel of academics and practitioners who are specialists in the fields of the court's major decisions from the term. they are able to talk to us about what the court has held and what it means for later cases, later controversies, that are sure to come up as follow-up decisions and new issues come to the court
4:12 pm
with this, as christopher says, conservative supermajority. we are going to walk through the major decisions of the term, including, of course, with respect to abortion and with respect to gun rights, immigration, citizenship, and environmental law, including major decisions that were just issued a couple hours ago. and we will try and put those in the context of what this court is doing and where it is going, the final rulings it is making, their implications for any attempt at progressive legal development, and related controversies, including, of course, the question of the leak of the opinion, why it happened, and what it will mean for the court going forward. we are going to start with reproductive rights, and we have an incredible specialist in the field. caroline corbin is at miami law school -- an attorney with the aclu's
4:13 pm
reproductive rights project. maybe you can start us off with some of those decisions. >> okay. so the first thing i want to say about dobbs is that it is a devastating decision that will ruin lives. even though we will be conducting a more cold-blooded constitutional analysis, i'm well aware that losing the right against forced birth has truly dreadful, real-world consequences. here's what i will do in my ten minutes. first, i will summarize the majority opinion, next i will consider possible consequences for other fundamental rights, and finally, i will offer some critiques. we all know that mississippi banned abortion at 15 weeks, which is way before viability. previous supreme court decisions, including roe v. wade, not only established the right to abortion but held that bans
4:14 pm
before viability violated that right. dobbs holds that the constitution does not protect the right to abortion. according to the court, the constitution only protects fundamental rights that are either explicitly listed in the constitution or deeply rooted in our nation's history. abortion is not mentioned in the text of the constitution. and according to the court, the right to abortion is not deeply rooted in our nation's history or tradition. nothing in state constitutions, state statutes, judicial opinions, or treatises in the history of this country declared that women have a right to end a pregnancy. on the contrary, the court writes, most states had banned abortion at the time the 14th amendment was adopted. it is the 14th amendment that provides protection for fundamental rights. the right didn't exist when the
4:15 pm
14th amendment was ratified. the right does not exist today. what about stare decisis, the principle that courts should follow precedent? the supreme court gave five reasons why overruling roe was warranted -- and these reasons do not match the usual factors for whether or not to follow precedent. in other words, in overruling precedent on abortion, the supreme court ignored precedent on precedent. what are these reasons? most significantly for the court, roe was egregiously wrong the day it was decided. just as brown versus board of education was right to overrule the indefensible plessy, dobbs is right to overrule roe v. wade.
4:16 pm
one reason roe is shameful, the court says, is because it was an exercise of raw judicial power in making up a new right. okay. i know i'm in the summary stage, but i just want to pause there, because i have to underscore the chutzpah of complaining about the exercise of raw judicial power in roe, given that dobbs is the poster child for the exercise of raw judicial power. and to compare roe to plessy? plessy is the case that constitutionalized segregation, a case that ensured black americans were second-class citizens. roe didn't do that. just the opposite -- it helped women move away from second-class status. back to the reasons why it was right for them to overrule it. another reason was that existing abortion doctrine was
4:17 pm
unworkable. this is one of the factors that is often considered -- whether courts can actually apply the precedent in a way that makes sense -- but what the court says about it is just rubbish, like so much of the opinion. the rule that all pre-viability bans are unconstitutional, the rule that could easily have decided this case, is about as simple a rule as you will ever find in constitutional law. all right. to summarize the majority opinion, which is the controlling opinion because it has five votes: the right to abortion is not deeply rooted in our nation's history and tradition, and to conclude otherwise was so egregious that we have to overrule the cases that recognized the right to abortion. justice kavanaugh's concurrence adds nothing but a claim of neutrality. he argues the court is simply being neutral with regard to abortion. true.
4:18 pm
the majority holds the constitution doesn't protect it, but it didn't hold the constitution prohibits it either. we should be thankful for this neutrality. chief justice roberts, who would prefer to eliminate women's rights incrementally, laments the fact that the court did not simply uphold the mississippi ban without completely eradicating the right to abortion. justice thomas would burn it all down. as far as justice thomas is concerned, substantive due process is an oxymoron. he would eliminate all fundamental rights that depend on it, including the right to contraception, the right to sexual intimacy, and the right to same-sex marriage. he does not mention the right to interracial marriage. notably, the court doesn't need to adopt thomas's approach to eliminate other fundamental rights. the majority's new test is enough to put them at risk. remember, if the right is not deeply rooted in this supreme
4:19 pm
court's version of history, then it is not protected by the constitution. i think we can all be fairly confident the supreme court can readily construct a history in which neither contraception, nor sexual intimacy, nor same-sex marriage is deeply rooted. there is obviously a lot worth criticizing. let me highlight three things. first, it is ludicrous to make 19th century history the touchstone for our fundamental rights today. as the dissent pointed out, this approach means women in the 21st century do not have a right to abortion because the same white men in the 19th century who did not let women vote also did not declare that a woman had a right to abortion. that is no way to do fundamental rights. we should not be relying on a history filled with racism and sexism and homophobia to
4:20 pm
determine our fundamental rights today. second, even if history were the right touchstone, the court's conclusion that abortion was not deeply rooted is contestable. and this is setting aside all the issues with how the supreme court framed the right in the most narrow way possible. for example, when the supreme court was deciding that marriage equality was protected by the constitution, it didn't ask whether same-sex marriage was deeply rooted in our nation's history and tradition. it asked whether marriage was deeply rooted. here, the court did not have to ask whether abortion specifically was deeply rooted. it could ask, is the right to bodily autonomy deeply rooted, or the right to control your own medical treatment, or the right to make decisions about procreation, or your family, or,
4:21 pm
as the dissent put it, the right to own your own body? the dobbs majority intentionally did not. in other words, the fix was in from the start. third, one of the majority's premises is that abortion is different from any other right because abortion is murder. okay, it didn't use such blunt language, but that is the gist. it is a major reason why roe is supposedly so much more egregious than other substantive due process decisions. of course, this idea that a fertilized egg is a person from the moment of conception, who deserves the same consideration as an actual woman, is ultimately a religious point of view, and not a universal one. in the u.s., it's the view of a powerful subset. other religions don't share
4:22 pm
that view. in fact, one synagogue is actually challenging an abortion ban on the grounds that abortion is, in some circumstances, required by its faith. what's my final point? my final point is that the supreme court is imposing one religious viewpoint on all of us. or, to put it in a way for you law geeks, dobbs writes into the constitution one particular religious view of pregnancy, the way an earlier court wrote in one particular view of economics. there is tons more to be said about equality and callousness, but i'm on a strict time limit. i will stop there. >> thanks so much, caroline.
4:23 pm
very impressed that you were able to stick to the time limit on such a momentous decision. let me pause and see if anybody else from the panel wants to raise an issue or ask a question. i'm particularly interested in what the next generation of controversies is going to be, both with respect to reproductive rights and more generally. where do they go next? if there is functionally a six-justice majority, and it's just the chief justice holding back on the question of timing, where does a supermajority that really is convinced that prior doctrine was grossly wrongly decided, and that the constitution has nothing protective to say about any of this, go -- what comes next?
4:24 pm
caroline or anybody? >> as i said, what comes next is the elimination of any other right that they feel hostile toward. because again, the new test for determining whether there is a substantive due process right is whether it's deeply rooted in our nation's history and tradition. i think that leaves many of the rights we now enjoy vulnerable. certainly, i think the right to contraception could very well be in its sights next. because, again, they can look to history, cherry-pick what they want to see, and conclude that this was not a right that was recognized by the men who made the laws, wrote the treatises, and decided court cases. and therefore, it is not protected by the constitution.
4:25 pm
>> what about with respect to reproductive rights, specifically? what are we looking at? are we looking at, you know, different forms of contraception that conservatives view as abortifacients? what's next? >> i think, setting aside what else is gonna get restricted, here is how i see it unfolding. first, they have already eliminated abortions for unwanted pregnancies. the next to go will be abortions even when they are medically necessary, because doctors will err on the side of caution. they are worried about losing their licenses. they don't know if some judge is going to second-guess their decision about why an abortion is necessary. the next thing that will be attacked will be contraception. it will be deemed, completely inaccurately, an abortifacient --
4:26 pm
so plan b will no longer be available. next will be medicines that might cause miscarriages, because, again, doctors are worried. they don't want to be accused of inducing an abortion, and so they will err on the side of caution and stop prescribing them. also, if life truly begins at conception, medical research might be in trouble. ivf might be in trouble. the states will be coming after the women as well, especially if they can't target the providers anymore, as might be the case if medication abortion becomes more widespread. they are going to start imposing criminal penalties on women as well as abortion providers, because maybe the women are the only ones they can get their hands on. but then, states will try and prevent women from going to other states to get
4:27 pm
abortions. we already see states drafting laws to try and do that. next? miscarriages will be investigated as abortions. miscarriages are insanely common. everyone knows someone who has had a miscarriage. just ask the women in your lives. but you can't necessarily tell the difference between a miscarriage and an abortion. the medical term for miscarriage is spontaneous abortion. anytime a woman seeks medical care for her miscarriage, she may be investigated for possibly violating a law by inducing an abortion. again, women's behavior during pregnancy will be monitored. after all, we've already established the precedent that the fetus has more rights than the woman. lest you think i'm just being hysterical, all these things have already happened -- it's a dystopian nightmare come true. i haven't even gotten to the
4:28 pm
idea that embryos are people, and who knows what might flow from that. you asked me what might unfold next in terms of women's access to control their bodies. i think those are some of the things that come next. >> and what do you think can be done if the constitution doesn't provide any of this protection? do you think there is a realistic prospect of federal legislation, with the president saying he would support ending the filibuster with respect to making roe v. wade a statutory right, or of state constitutional rights? or are there just gonna be battles everywhere? is this gonna be a ground war that goes on for decades? >> yes, it will. there are so many questions that have been raised by this. everything from, how do you stop someone from traveling to another state? to the free speech implications:
4:29 pm
can you talk about abortion in a state that doesn't allow it? can you advertise for another state? will there be endless litigation? yes, there will be endless litigation. what can be done? people keep asking me this. possibly, we will have a federal law. maybe the federal government will establish clinics on federal property. perhaps it will make medication abortion, the abortion pill, easier to obtain. i don't know, i really don't know what the odds are of that happening. the next focus is what happens in the states. it's all about what the states do or do not allow. we already know half of them are poised to eliminate the right to abortion.
4:30 pm
there are some states that have affirmatively gone out of their way to protect abortion. there are some states, like florida, for example, that have constitutional protection for abortion in their own state constitutions. and so there will be litigation in florida about what happens there when the republican-controlled legislature tries to take away a right that a more liberal judiciary had held was protected by the florida constitution's explicit right to privacy. >> darrell, do you want to raise this question of state constitutional protections and litigation? >> yeah. i think the point is -- it was an excellent presentation about the problems of the protection not being nationwide and what the legislative fix is.
4:31 pm
one thing that's apparent to me is that one thing we might see is a whole new round of abortion rights politics playing out in state judicial races. why? because state judicial races, in most states, are statewide races. you can't gerrymander, you know, an outcome with respect to the elected judiciary in some of the states in a way that the state legislature might be gerrymandered to insulate it from unpopular abortion restricting kind of legislation, as we have seen already. i'm kind of wondering what you think the future is when we talk about state constitutional rights to abortion, which, as people on the panel know, would not be subject to federal judicial review, right? the state supreme court is the last word on what the state
4:32 pm
constitution guarantees. >> caroline? >> again, i think the battle will be fought at the state level over what is allowed or not allowed in a state. i don't know how that will shake out. we will get a better sense as time goes on. >> darrell, why don't we stick with you? -- among his many specialties is the second amendment. obviously, this term we have an absolutely momentous second amendment decision which takes us beyond just the threshold question of whether there is an individual right untethered
4:33 pm
from a militia in the second amendment. the follow-up question -- now the rubber hits the road. we have a majority willing to take up gun rights cases and put meat on the bones of the question of what it means. so, darrell, can you take us through that? >> sure, absolutely. thank you very much for the opportunity to speak to you today. the case i'm going to talk about is called new york state rifle and pistol association versus bruen, or nysrpa versus bruen. it was billed as the second amendment case of the last decade, and there is no doubt it delivered. the gun rights forces put all their chips on this case, and it came up a trifecta. what did they want? they wanted the supreme court to say that the second amendment right to bear arms
4:34 pm
extends beyond the home. it did. they wanted the supreme court to say that licensing rules that require some showing of proper cause -- sometimes known as may-issue laws, which require proper cause, or good cause, to obtain a license to carry a gun in public -- were unconstitutional. the court said that as well. and then the big issue, one that i think most people were only casually watching but which i was deeply interested in, and in fact filed an amicus on, on behalf of neither party, was this methodological issue. the gun rights people wanted the court to say that the method of figuring out whether some regulation violates the second amendment or not is via a text, history, and tradition-only approach. and they got that as well. they really did hit the jackpot
4:35 pm
with this supermajority of this court on these issues that have been close to the heart of gun rights advocates for the past 20 years. just briefly, before i recap the case, what does this mean? well, in a narrow way, it means that some of these states that have these may-issue rules will have to go back and revise them. the court was fairly clear that licensing based on more objective metrics is constitutional. the states that have may-issue licensing, like new york, california, new jersey, and so forth, will go back and revise their laws. the big issue is the methodology -- the reason it is such a big issue is because it affects not just this narrow issue about licensing, it affects how the
4:36 pm
courts are supposed to approach every regulation, whether we're talking about regulations on guns in the hands of 18 year olds, regulations on magazines or ar-15s, or the so-called sensitive places doctrine, where schools and government buildings can presumably bar guns from the property. all these things are now up for grabs now that the court has endorsed this tradition-only approach. there's no other way to say it: we are in a brand new world with respect to the second amendment and its effect on our lives. let me recap the case very briefly. nysrpa versus bruen involved a challenge by two plaintiffs to the new york licensing law. this licensing law had been in place for over 100 years, known as the sullivan law.
4:37 pm
this required, to get an unrestricted license to carry a gun in the state of new york, that you submit an application to the local law enforcement officials or a judge and show proper cause. what was proper cause? according to new york, proper cause was some kind of need different from just the general need of having a gun for self-defense shared by everybody: you carry a lot of money, or you are a government official, like a judge, in peril because of the nature of your job. that was the kind of proper cause this licensing law required. this was challenged, as i said, as a violation of the second amendment. a six-justice conservative supermajority delivered for the plaintiffs and for gun rights advocates.
4:38 pm
justice clarence thomas wrote for the majority. the first thing he said is that bear just means to carry -- to be able to carry that gun outside the home. he said that history is the metric by which you decide whether you can carry guns or not. he says the history showed a fairly robust tradition of allowing people to carry guns outside their homes for purposes of self-defense, and that there was no unequivocal tradition that limited that to a good-cause showing. he also rejected what had been the predominant form of review of second amendment cases ever since the supreme court's big decision essentially holding there is an individual right to keep and bear arms for purposes like self-defense
4:39 pm
in heller -- that's 2008. he rejected the kind of approach the lower courts had developed for over a decade. that approach was a two-step framework. the first step of this framework was a categorical or historical approach, where the court would look at categories of history and try to make a determination about whether the kind of issue the party had brought up even raised a second amendment claim at all. so a person saying i want to be able to keep this shoulder-fired missile, for example -- that doesn't present a second amendment question at all. it is categorically outside the second amendment. why? because there are long-standing regulations on such weapons; a shoulder-fired missile is obviously a dangerous and unusual weapon. the second prong of this two-part framework in the lower courts was what i think was a conventional
4:40 pm
tailoring or scrutiny analysis that has allowed governments to proffer some kind of information or evidence about why they are regulating, what legitimate purpose they are trying to achieve through the regulation, and how much it actually impacts otherwise constitutionally protected activity. this tended to resemble the tiers of scrutiny so familiar in many areas of constitutional law. justice thomas, writing for the majority, says this is one step too many. only the text, history, and tradition approach is the way to figure out whether something is a second amendment violation or not. he was quick to add in the opinion that, clearly, we live in a different era than in 1791, when the second amendment was ratified. and what he assumed is that
4:41 pm
lower courts will take modern regulations from the 21st century, or the 20th century, and ask whether they are in some sense analogous to a historical regulation. he says you don't need an exact twin, but you have to have some kind of analogue, and that determines whether or not a regulation is constitutional. justice alito wrote a concurrence, mostly taking on justice breyer's dissent. he stipulated that the only issue being decided in the case was the licensing rules -- that nothing about who can possess guns, who can buy guns, or the kinds of weapons available was being decided that day. justice kavanaugh, joined by chief justice roberts, also reiterated that the holding was limited to this question about licensing, this question of may versus shall, discretion versus lack of
4:42 pm
discretion in issuing these licenses. he reiterated that the right is not absolute, and even cited the prior supreme court decisions in heller and mcdonald versus city of chicago, which is the incorporation case, which, properly interpreted, leave a variety of gun regulations available, and said nothing in the opinion should cast doubt on longstanding prohibitions on guns in the hands of felons or the mentally ill, or guns in places like schools and government buildings -- even again quoting a prior decision that says these are just examples, there might be others. justice barrett wrote a concurrence questioning what the relevant period for determining the history is. and there was justice breyer's dissent, which essentially said, look at the
4:43 pm
real cost of gun violence in america. you are shackling legislators and the people's representatives from being able to address it with this hidebound, history-only approach that, one, is a rejection of what was settled precedent in the lower courts, and two, is a type of approach that judges are not well equipped to handle. they may not know what the historical materials they are looking at actually say or mean. it's incredibly judge-empowering to do this all by analogy, which just depends on levels of generality. so it's a new world. what do i think about the future? i'll sum it up very quickly. certainly, i think that the future is about figuring out, you know, what kind of regulations can survive this kind of review.
4:44 pm
i think a lot of what we had assumed about data on gun violence, or criminological data, is gonna have to be repackaged and explained to courts in terms of how it relates to this analogical work the courts are supposed to employ -- dangerousness, for example, and how it maps onto some historical regulation. i certainly think issues like training, and this issue about where guns can be prohibited because the places are sensitive, will be the new front of both litigation and legislation in the future. >> thank you so much, darrell. does anybody on the panel want to raise something before -- >> i would be curious to talk about, you mentioned, darrell, justice kavanaugh's concurrence, which i thought was really interesting. he and the chief justice, with that concurrence -- they both joined the majority
4:45 pm
opinion. it's really hard to reconcile what the concurrence is saying with the majority opinion. the concurrence seems to limit the rationale for striking down this new york law to the standardless discretion in the new york regime. as you mentioned, it has this long block quote from the heller opinion that lists the kinds of regulations that were considered presumptively lawful, whatever that means. some of those regulations don't seem to have nearly the kind of historical pedigree that this new york statute had. and so if the new york statute could not cut it, how do things like, you know, prohibitions on the mentally ill, or sensitive places, the kinds of laws that are described there, like schools and government buildings -- how do those get upheld? so -- just wondering how you see the litigation playing out on this.
4:46 pm
i suppose these two justices did join that majority opinion, but they also seem to be sending a strong signal to the lower courts that a lot of these kinds of regulations should be upheld. >> right, it's an excellent question, and i can imagine a couple of directions. one might be that regulators who are trying to defend laws essentially say that to understand what the majority for the court really is, you have to incorporate these caveats from alito, from kavanaugh, and from the chief justice. now, the most aggressive version of that would be to say something like, the methodology was only employed as to the specific issue about licensing, and that we should take heart in
4:47 pm
the fact that all these other regulations may be subject to more conventional review, where governments can present evidence about their effectiveness, and say that is what we are doing. i mean, that's an aggressive way of reading it, but it's a plausible way of reading it. a more likely outcome is that, i imagine, justice alito and certainly justice kavanaugh and the chief might actually be thinking about levels of generality. they might be imagining a world in which there is plenty of room for regulation because the levels of [inaudible] are so high. there are tons of regulations in history, some with actually noxious pedigrees, that are about disarming or keeping guns out of the hands of, quote unquote, dangerous people, and if it's at that level of generality -- that is, these are historical regulations keeping
4:48 pm
guns out of the hands of dangerous people, then a regulation keeping guns out of the hands of people with mental health crises, or the mentally ill, is just part of disarming dangerous people. but this is one of the things that my colleagues who also work on the second amendment and i are concerned about: the justification for these regulations becomes completely obscure. you follow this kind of weird analogical dance as opposed to saying, this is the reason, right? the reason why we are doing it is because it has an effect on people carrying guns -- which was much more transparent when the two-part framework was still in place. >> thank you so much. my one confident prediction is the supreme court will uphold restrictions on bringing guns into the supreme court, just as
4:49 pm
it upheld mask requirements to get in -- which didn't fly in a lot of other contexts. >> what would be very interesting is, again, in a world in which sensitive places become important -- what claims of sensitive places will float? right? it's one thing to have people on the doorstep of the supreme court building with an ar-15 protesting; it's quite another having them outside your home, in front of your house, where they are protesting. so this idea of sensitive places is an interesting dynamic. >> thanks so much. let's turn to ediberto roman, who's a professor at fiu law and an expert on citizenship law,
4:50 pm
immigration, and territorial-related issues, and we had not just a major immigration decision today but also a major ruling with respect to the rights of indigenous peoples and people in the territories. so, ediberto roman. >> thanks, tom, i appreciate it. it's wonderful to be here. i have to say how much i appreciate the previous two presentations. unlike those, i don't have one case to review, and in those instances, those two cases, we are looking at scenarios in which we're likely gonna be discussing these cases for months on end, if not years, if not decades. i am examining five cases that, for the most part, will not be capturing the media attention, although today we did see a lot of discussion concerning the biden versus texas case
4:51 pm
and the policy at issue -- which i will talk about briefly. but the point i want to make is these cases are somewhat of a harbinger for things to come in light of this court. and, tom, i think you raised the question of what we're likely to see in the future with respect to the dobbs decision, and i think the presentation related to that was fantastic. i would just add, not only did justice thomas provide us, or some of us, an agenda of future issues under attack, but in the next term, i think under the court's framework in dobbs, we will see the end of affirmative action as the first casualty, so to speak, of the new philosophy, with its textualist viewpoint, or the notion of a right being grounded historically, at the time of the framers.
4:52 pm
as you might suspect, that is a position that causes me great pause given that somewhat sacrosanct view of the constitution that the court seems to have, notwithstanding the excellent points that we caroline raised with respect to the nature of the constitution. and it's limiting view of so many of us within the society. but getting into the issue at hand with respect to these five cases, there is a host of ways to examine them. we can look at them from a standpoint of separation of powers, we can look at it from a standpoint of administrative procedure act, we can -- at least four of them, we can look at them in terms of administrative law. in terms of those standpoints, it would be hard to reconcile these cases. the way we can reconcile the immigration cases is to look at it in the general philosophical
4:53 pm
standpoint in terms of a judicial philosophy of deference -- of deference to the executive. i think that is the best way to examine these cases, and it's consistent with a somewhat conservative view of the role of the court versus the role of the executive. but i will take a slightly broader look at these cases, to incorporate the territorial case into these issues, and i'm going to examine them in light of how one famous historian examined these sorts of issues: deciding who are the "we" in "we the people" in our constitution. that's how i like to examine immigration cases and territorial cases -- to look at them as membership cases and ask who we decide belongs. let's examine these cases briefly. i won't go into great
4:54 pm
detail, but i will examine them very briefly and try to reconcile them, and perhaps raise a couple of questions. i notice, for example, in the q&a, the very first question raised is how we view these decisions as affecting future administrative law determinations, and i'll be happy to address that in the q&a afterwards. it's going to be challenging, because i thought i had an answer to that up until this morning's decision. but we will talk about it shortly. so the first case i want to mention, ever so briefly, is garland v. gonzalez. it basically examines the question of whether the government can detain immigrants for months or even years during immigration proceedings without providing them the due process of a bond hearing. this case does away with any view that pro-immigration advocates have raised in the recent past about the possibility of
4:55 pm
procedural or substantive due process coming into play or being a vehicle to provide rights for immigrants. in the garland v. gonzalez decision, the court held that the ina, the immigration and nationality act, bars federal courts from entertaining immigrants' requests for class-wide injunctive relief. and the court went to great pains to come to that conclusion because, frankly, the language of the ina could reasonably be read to allow such challenges -- so much so that the court, in a somewhat ironic twist, said that these class-wide claims aren't available, yet individual claims are available. i think, again, it is a point of deference to the executive and
4:56 pm
the agency's stance with respect to that -- and that holds as a matter of statutory interpretation, arguably even under a plain-meaning interpretation of the ina and its very specific language in section 1252(f)(1). but we won't go into that level of detail given that i have so many cases; i'm happy to discuss it a little further in the q&a if the question arises. then we go to the next case, johnson versus arteaga-martinez. basically, in this case, after being detained for several months, the petitioner filed for a writ of habeas corpus in the district court, challenging on both statutory and constitutional grounds his continued detention, once again, without a bond hearing. implicit in this is the question of due process rights -- does the constitution provide some basic rights? and here, the court, once again, came to the conclusion, again,
4:57 pm
with respect to deference to the agency, that the ina does not require the government to provide non-citizens detained for six months or even more with a bond hearing, notwithstanding language in the statute that suggests, frankly, that these matters have to be addressed within that timeframe. the court basically provided a rule inconsistent with a strict reading of the statute. now, we move on to yet another case dealing with deference, again, to the executive in the immigration context. this one is patel versus garland, and we are basically looking at a question of the court's jurisdiction -- like the two other cases, it examines the courts' jurisdiction to review a claim. in this context, this is a scenario
4:58 pm
where an individual basically made a misrepresentation on a driver's license application and by virtue of that became disqualified from subsequently applying for citizenship. the petitioner challenged that administrative determination, arguing that it was a discretionary determination and the court should look at the entire record of facts with respect to that argument. as you might suspect, the court here, much like in the prior cases, concluded that federal courts lack jurisdiction to review facts that are part of discretionary relief proceedings under the ina. so the three cases are entirely consistent with respect to deference -- arguably, almost overreaching to support the administrative decision and to limit review by the courts. now, we get to today's decision,
4:59 pm
which many of us just heard about in the news a couple of hours ago. it's a decision i'll have to study quite a bit -- i've written about it, but i haven't completed my analysis of it, in part because it seems quite inconsistent. the only way i can find consistency with this case is, once again, deference to the executive. in biden versus texas, the supreme court addressed the biden administration's attempt to rescind the migrant protection protocols, better known as the remain in mexico policy. the supreme court this morning ruled that the biden administration, on what was viewed by most to be a controversial immigration policy, had the authority to reverse the trump-era policy that requires asylum seekers to remain in mexico while their cases are reviewed in the courts. now this case, while consistent
5:00 pm
with respect to deference to the executive, causes me some pause, and i have to examine it closely, because this reversal of a rule set forth by a prior administration seems to run in the face of the administrative procedure act, which normally would require a notice and comment opportunity as well as the full administrative procedure associated with creating a new rule and revoking the prior rule. again, it's a matter i will be happy to address further in the q&a. but other than looking at the issue of deference, this decision, which was hailed by immigrant advocates as a windfall in terms of the opportunity to change a harsh policy of the past, is at the very least questionable with
5:01 pm
respect to the administrative procedure act. the only way i would be able to reconcile it is to look at the prior decision, the protocols to remain in mexico, as not a rule, but as a policy. and then it would withstand the apa -- i will switch gears a little bit, still being in the framework of who are we, the people. and look at a decision that, while not capturing the headlines, affects individuals like myself, being puerto rican, as well as close to 4 million others in the territory, as well as the over 2 million in the states, directly. that is the u.s. versus vaello madero case, in which the court essentially, as a way to sum it up, i basically have one minute to go, said that congress has broad oversight over the u.s. territories, under the territorial clause of the constitution. and by virtue of that, protections or rights such as
5:02 pm
the due process clause in the fifth amendment do not apply to the citizens of puerto rico, and in essence, they can be discriminated against in the supplemental security income program, notwithstanding the fact that they are u.s. citizens. and ironically, here we have a rift between kavanaugh and gorsuch. to put it bluntly, kavanaugh used the dated or long-lasting language of, frankly, the racist decisions of the insular cases from around the 1900s and [inaudible] and gorsuch basically attacked that and called for the overturning of the insular cases because of the unequal treatment of u.s. citizens in the territory. and i will sum it up with justice gorsuch's apt observation that the flaws of the insular cases are as
5:03 pm
fundamental as they are shameful. so we have, in essence, a set of decisions that gave us a very narrow view of who are the we in the constitution, with great deference to the executive. >> thanks so much. let's turn to the environment and potentially a couple other very significant decisions, depending on time. -- is the director of strategic legal advocacy for earthjustice. today we got a major ruling with respect to climate change. >> i think this presentation will echo two themes of the prior presenters. this is a case where the method of how the court decided the case is quite important, along with the bottom line impacts it will have on climate. this decision came out four hours ago.
5:04 pm
we are still digesting it -- it will be a while before anyone fully wraps their arms around it, but i will do my best. i think with this case, it has a complicated procedural history, but knowing at least a little bit of it is helpful to understand the case. just very briefly, the case is west virginia versus epa. the case is about epa's authority under one specific provision of the clean air act, which is section 111(d). that's a provision that applies when two other major programs of the clean air act aren't being used to regulate a specific kind of source. in 2015, the epa decided to regulate greenhouse gas emissions from power plants. the way the statute works is once epa made that decision to
5:05 pm
regulate new power plants, this provision, 111(d), was triggered, and epa was required to regulate existing power plants as well. the way epa does that is the statute tells epa to identify the best system of emissions reduction out in the universe, to consider things like cost, energy consequences, all of that, and then to set a top-line number to reduce emissions. epa hands the baton over to states, which develop plans to implement that standard and meet that standard. what the obama administration did in 2015 is it issued its regulation and said, the best system of emissions reduction, looking out over what the industry is actually doing, involves both improvements that plants could make -- tacking something on to the plant, tweaking a plant's efficiency -- but also the way plants were producing their emissions, by shifting away from coal, for example, the
5:06 pm
most greenhouse gas intensive form, to renewable sources of energy. epa issued that regulation, which basically, like i mentioned, told states to come back with plans by 2018 to meet those standards. the supreme court stepped in in 2016 and stayed the regulation, before any court had ruled on the merits of that regulation. the clean power plan never took effect. the trump administration came into office and repealed the clean power plan, saying the epa had interpreted the statute too broadly under the prior administration, put in place its own rule, which was incredibly narrow and basically only took into account the kind of minor efficiency improvements you could make, and then issued its regulation, which was immediately challenged. and states like west virginia, the coal companies that are in the name of the case, came in to
5:07 pm
defend it -- to say either the epa was going too far, or to defend the rule. the d.c. circuit struck down the trump administration's rule. at that point, what that decision meant was that the clean power plan had been brought back into existence. but because that was 2021, the states could not meet the deadlines of submitting plans by 2018, and all the guidelines, the standards epa was asking states to figure out how to meet, had been met by the industry on its own, without any regulation at all. at the time of that decision, there was no regulatory action that had any impact on anyone in place. and yet, the supreme court decided to agree to west virginia's request, with the coal companies on their side, to take the case and to hear it. the reason why the court was being asked to decide the case was to apply something called the major questions doctrine. and what that is is --
5:08 pm
when any court is interpreting a statute, it usually starts with the text of the statute and tries to figure out what the words mean. under the major questions doctrine, the court starts with a different question. it starts by asking whether the agency's action is major, which is defined in sort of a hazy way, but perhaps the agency is trying to do something new, trying to address a big problem in a way it hasn't done before. basically anything that raises the hairs on the back of a judge's neck, asking if the agency is going too far, will trigger the major questions doctrine. once that doctrine is triggered, the result is there will be skepticism of anything the agency does. it will demand an incredible level of clarity from the statute. instead of asking does the statute authorize the agency to do this, it's going to ask, is there any reason for me to dislodge my belief that congress didn't mean to
5:09 pm
give the agency this kind of power? basically, it's a thumb on the scale against the agency, and it can be very hard to overcome. you can see how groups that are skeptical of agency regulation or don't like it, or are actively hostile towards federal agency regulation, might like this doctrine. it acts as a -- where it only applies if the agency tries to do something. it doesn't help anybody who might want the agency to try to do something to protect the environment or protect the public. like i said, that's why people wanted the court to take the case, the real motivation. that's what they got. the decision the supreme court issued this morning ruled against epa. it said the clean power plan, which again, never went into effect, has no effect now. it said what the epa tried to do in that regulation went too
5:10 pm
far. it triggered the major questions doctrine. the court wasn't convinced by any of the reasons why the statute was in fact clear. as the dissent points out, and i think what is so notable about this ruling, the majority opinion spent just a couple pages talking about the text of the statute. it really is not the same kind of discussion of statutory text that is in all the other statutory opinions the court issued this term. it's not what the word system means. it's the word system is broad, and therefore it cannot possibly be clear enough to authorize what the agency did, even if it might literally encompass what the agency did -- but because it is so expansive, it doesn't authorize it with the clarity we would require. that is just not the kind of
5:11 pm
textual analysis the court has proclaimed as the right way to approach statutes. it's not the kind of textualism taught in law schools. the dissent has pointed out, too, that this is a way for the court, this court or any court, to look at a regulation that it does not like, where it thinks things go too far, to make a policy adjustment, and to say we are gonna make it very hard for the agency to win when its authority is challenged. in this particular case, if you take the words of the court, if you take the court at its word, all it did was say a specific approach in the clean power plan went too far. it is sort of ironic, or an odd result, that the fact that it doesn't have any need to engage with the text of the statute means it otherwise doesn't limit what the epa can do, because it doesn't have any reason to get into the weeds of what the word system means.
5:12 pm
the bottom line climate impact of the decision is that the court has taken what is the most effective way to reduce emissions from the largest source of energy-related greenhouse gas emissions off the table. it has narrowed the scope of what the epa can do, and it has given the people that will bring the inevitable challenge to whatever the biden administration does a pretty heavy tool to use when they bring that inevitable challenge. we will see what epa does and how it grapples with this decision. i don't mean to downplay the impacts of this decision, both as a matter of climate policy and as a matter of how to interpret statutes that protect the public and try to do something big to address big problems. this is a bad decision. it's also a decision that doesn't go as far as people were concerned it might. for now, that is at least some comfort. i'm also going to cover very
5:13 pm
briefly two cases dealing with religious rights. very briefly, the first case is called carson. this is a case about how maine funds its public schools. because of its geography, maine's localities couldn't have a public school next to every student. it just didn't have the resources to accomplish that. and the way the maine statute works is that a locality can either create a public school of its own, or it can pay for students to attend nearby public or private schools, or it can pay for students to attend a school of the parents' choice, so long as that school is nonreligious. parents challenged that as a violation of their free exercise rights. the court agreed with that claim. what that means is that public funds will go directly to religious schools that teach children religion.
5:14 pm
as justice breyer points out in the dissent, this is a sea change from the world in which states could choose to fund these kinds of school choice programs but were not required by the constitution to do so. now it seems like they might be. the second decision on religious rights here is called kennedy. this is the case about the football coach you might have heard of. on the football coach's telling, he wanted to engage in quiet, private prayer on school property during school hours, in a way that wouldn't interfere with his job performance. on his telling, he was fired, or at least not rehired, because he refused to stop doing that. on that version of the facts, the majority says there is a free exercise violation and also a violation of his free speech rights. if you read the majority and the
5:15 pm
dissent, i think one thing that's incredibly striking about that decision is that they have incompatible versions of the record. the dissent points out that the football coach had a multi-year practice of kneeling at the 50-yard line, eventually surrounded by his students, reporters, politicians, and leading that group in prayer. that is what he was asked to stop doing, not to stop praying in a more private way -- and on that basis, the school had done nothing wrong. here too there is another methodology point, where the court also prefers to invoke history in these cases as well. with that i will stop. >> thanks so much. arbitration is an area obviously in which the court has been active in recent
5:16 pm
years. kirti datla, thanks. deepak gupta is a partner in his own firm, and is probably the most successful consumer advocate at the supreme court, at least of his generation. deepak gupta, do you want to talk about -- >> thanks, tom, and thanks to the other panelists, i'm glad to be here with you. i wish we were in person. so yeah, the cases i will talk about, these are not the big blockbusters of the term, these were probably not the cases that are going to make the headlines like abortion, guns, religion, and the administrative state, but i do think -- i'm glad tom took some time out in the panel for us to talk about these, because these cases do concern the ability of ordinary consumers, workers, civil rights plaintiffs and -- to get in the courthouse door. these are cases, as tom mentioned, under the federal arbitration act where the court is deciding whether companies can use clauses in the fine
5:17 pm
print of their contracts with consumers and workers to block access to the civil justice system. and the court has had a steady diet of these cases almost every term going back about a decade. and those cases have tended to break down along 5 to 4 lines, they've tended to be quite controversial. there is a case i argued over a decade ago called at&t versus concepcion, where the court decided 5 to 4 to allow companies to use the fine print of their contracts to prevent consumers and employees from banding together to bring class actions and to enforce arbitration clauses instead. so, it used to be that these cases were principally 5 to 4, and the rule was kind of that the plaintiffs would lose every single case. and i think there is a shift at the court --
5:18 pm
that is underway at the court, where that is no longer the case. the three cases that i'm going to talk about briefly, saxon versus southwest, morgan versus sundance, and viking river versus moriana, none of these were 5 to 4 cases. there was far more agreement across the whole court. two of them were completely unanimous, and one of them garnered broad agreement. and the other thing that is notable is, in two of these three cases, the plaintiffs won, and they won in ways that i think will matter quite a bit going forward. all three of these cases involve plaintiffs who were workers, who were bringing wage and hour claims, sort of garden-variety claims. one of them was a sales rep for a cruise line, one of them a cargo loader for an airline, and one of them involved a taco bell worker.
5:19 pm
and the one i will mention is this case called southwest airlines versus saxon, and full disclosure, our firm represented the plaintiff in this case. my colleague, jennifer bennett, argued the case. and our client in the case was a ramp supervisor for southwest airlines, someone who loads and unloads cargo onto the airplane. and the question is whether the federal arbitration act applies at all to our client, ms. saxon. if you look at the text of the federal arbitration act, which was enacted in 1925, it seems to exempt workers; congress broadly exempted seamen, railroad workers, and any other class of workers. if you look at that language, you might think that it exempts all workers, but an earlier case, circuit city, had interpreted that language much more
5:20 pm
narrowly, to apply only to transportation workers. so the question here was whether an airline worker would be a transportation worker, whether they were sufficiently like seamen and railroad workers. so it might seem like a narrow question, it's about transportation workers, but i think, as was the case with some of the cases discussed earlier, the methodological questions are at least interesting here. the court, in how it decided the case, was instead very interested in -- and this was our strategy in the case -- what the words meant in 1925. and it was abundantly clear from the historical evidence that people who loaded and unloaded cargo would be considered workers engaged in interstate commerce. there was an abundant amount of case law from that period.
5:21 pm
so this is an example, i think, of a kind of progressive, original meaning approach that advocates in these cases have been trying, and it has been working. and this has implications for workers in the gig economy, uber drivers, people who are employed through apps, where this will decide whether they can bring class actions, whether wage and hour claims will be brought, and it's the subject of a lot of litigation. the second case i will briefly mention is a case called morgan versus sundance. and again, the question here might seem narrow, but the methodology is quite important and could have big implications. the narrow question presented in the case was basically, what happens when somebody seems to have waived the right to enforce arbitration. do you require that the
5:22 pm
plaintiffs show prejudice? and lots of lower courts have required that. the supreme court said, unanimously, in a decision led by justice kagan, we are not going to adopt that kind of rule that requires prejudice, that special arbitration rule. but the methodology that justice kagan managed to get all of the justices to sign on to was much more significant. it rejects the idea that this special policy favoring arbitration, that animated so many of the court's cases in this area, that that policy is any kind of basis to allow what you might call special rules favoring arbitration. instead, arbitration agreements are supposed to be treated just like any other contract, and are enforceable to the same extent, but not more so. this decision is going to be, i think, deployed in lots of litigation, because it represents quite a shift to
5:23 pm
say that the court, although it had adopted so many special rules favoring arbitration, is no longer going to tolerate that kind of policy based justification. and then the final case i will just briefly mention was viking river versus moriana, where the court seemed to stick to the more typical pattern of past cases. justice alito wrote the opinion, concluding that california's private attorneys general act was preempted to the extent that it had special joinder rules that were inconsistent with the way the court understands bilateral arbitration. but the case was quite limited. it didn't go as far as employers might have wanted, and justice barrett wrote separately to say that the only reason this was preempted is because the private attorneys general act procedure that
5:24 pm
california had was similar to other aggregation devices that can't be imposed on parties through an agreement. so the court is sticking to the idea that arbitration can be used as a way of getting around class procedures -- that it can't be forced to allow basically what looks like a class action. but i think the court is cleaning up its jurisprudence and is not going to allow broad policy based appeals to arbitration going forward. surprisingly, the little guy notched some wins in the court this term in ways that may matter going forward. >> thanks so much, deepak gupta. let's use the remaining time to cover some of the questions that have come in from the audience, and i think some of the most interesting ones
5:25 pm
involved stepping back from individual decisions and thinking about the court institutionally and the public's perception of the justices and the like. one of them is, what do we think about confirmation hearings now? i think the general impression of them was pretty terrible going in. there has been enormous criticism of some of the members of the court about what they said about the settled nature of roe, in light of the dobbs decision. does this change anything about confirmation? what we think? how we go about it? whether we pay any attention to it, whether we even bother with it? did anything go wrong here? anything unexpected here? what lessons have we learned, if any?
5:26 pm
>> justices or proposed justices are savvy. they have a good set of advisers. they will continue to take similar stances and congressional leaders will continue to try to pry. but i am not an optimist. i am a critical race scholar. you wouldn't expect anything else from me. >> -- take that with a grain of salt. i think we're gonna be taking them with a barrel. >> i think if you go back and look at -- probably everyone's gonna hate to hear this -- but if you go back at what the justices said in their confirmation hearings, those were very carefully worded statements. they were about the value
5:27 pm
of precedent. and there was no commitment not to take a certain action. of course, they were designed to provide some comfort to people. i don't think it's possible to say any of the statements that were made were inaccurate. i think that kind of thing will continue to happen. it's very difficult, with somebody that well prepared in a hearing, to get any kind of commitment out of them. in fact, it would be inappropriate for people to make commitments about what they may or may not decide. >> the hearings are gonna go forward because there is a chance for the senators, who love the camera. that's not a thing of the past, where you just put the name on a piece of paper and get the person confirmed. i think in line with this is just the whole concept of -- if the court is willing to jettison a 50-year settled
5:28 pm
precedent this salient, that is this popular, then what about all the other decisions that we haven't heard of, that lawyers really care about, but no one else is paying attention to? if they are willing to torch that -- then i don't know what kind of horizontal stare decisis, what kind of control it has anymore at all. maybe we are in the world that justice thomas suggested in gamble, where he essentially said everything is up for reevaluation under originalist principles -- that is the new order. >> can we talk about the leak for a second? what do you all make of it? what do you think it means for the institution? are we going to see more of
5:29 pm
that? how does the court address it? the status of the justices, the justices' families. what does it do to the perception of the court? do you think it was justified? this is a wild development in the history of the court that is antithetical to at least its own very, very settled practices and procedures. >> there have been leaks before. the brethren is a whole book with lots of sourcing from clerks. >> let's be clear about -- we can stop using the word leak generically. >> this is a draft opinion overruling roe v. wade. if you want to compare it to something that has happened before, give me an example. >> there is nothing that compares, this is a monster leak. all i mean to say is there have
5:30 pm
been leaks before. so the idea that the confidentiality of the court was ironclad is not true. but it -- fundamentally changes what happens in the building. there's some evidence trickling out, and it breeds a lot of distrust. and if it's true that nobody knows who did this, people have to suspect everyone, and i think that makes it difficult for the justices to work with each other and their clerks. so there is no question that it kind of casts a pall over not just this term but the court's operations going forward. and i think there is evidence just in the way the opinions weren't coming out quite as quickly. there was a strange lull towards the end of the term that seems like maybe there were some practical effects. i'm curious, kirti datla, as someone who was under those
5:31 pm
obligations in the building not long ago, can you imagine what that would've been like if you had been there? >> i -- >> sorry, go ahead. >> i was going to say, when you work at the court, you get a speech from the chief that's really as stern as you can ever imagine him to be about confidentiality, and it is an incredibly strong norm at the court. it's fraught enough without having to worry about these kinds of things. and the thing i will say is, i guess, two things. one, i clerked when most people worked from offices and weren't necessarily working from home, so there is a greater possibility that this leak was maybe not intentional, or at
