
Senate Hearing on Digital Replicas and AI Concerns | C-SPAN | May 21, 2024, 6:43pm-8:44pm EDT

6:43 pm
>> musician and actor fka
6:44 pm
twigs along with other industry leaders testified before a subcommittee about intellectual property concerns with digital replicas in generative artificial intelligence. they also addressed first amendment protections, the use of deepfakes, and the need to hold those committing fraud with a.i. technology accountable. it is two hours.
6:45 pm
here for live coverage of the hearing on artificial intelligence. >> this hearing will come to order. i would like to thank all of our witnesses for participating today and my colleagues for joining me. i would specifically like to thank the staff for working on a consensus basis to put this hearing together, and to thank senator blackburn and her staff for partnering with us on this hearing. about 10 months ago we held a subcommittee hearing on artificial intelligence and copyright law, and i opened that hearing with the debut of a new a.i.-generated song, with lyrics created by chatgpt and voice cloning technology used to mimic the artist's voice. the song was fun to create, with permission from the rights holders of course, but it also highlighted some pressing legal questions. was my song protected speech? if i hadn't gotten permission, would the song have violated his
6:46 pm
rights to his voice or his style? since that hearing, a.i.-generated replicas have only grown more pervasive, from deepfake videos of celebrities hawking products, to songs made by voice cloning tools posing as legitimate hits, to scam calls mimicking a grandchild's panicked voice used to defraud consumers. these are just a few of the musical artists who have seen their voices mimicked by a.i. clones. a global leader in online protection found one in four american adults have experienced an a.i. voice scam, with three quarters having lost money. scammers using a.i.-generated replicas of a grandchild's voice to trick a grandparent out of money have
6:47 pm
become so sophisticated that both the fcc and ftc have issued warnings. these examples all relate to commercial speech, but a.i. deepfakes do not stop there. you have seen other examples: explicit deepfake images and videos, addressed in the defiance act that senators have introduced; election interference, addressed in the protect elections from deceptive a.i. act that senators have introduced. these issues are not theoretical. deepfake images of taylor swift proliferated on x before they were taken down. a voice clone of president biden encouraged voters to stay home during the new hampshire primary. in slovakia a deep fake likely had an impact on the outcome of a national election. in sum, these tools have become increasingly sophisticated, and it has become easy to distribute fake images of someone, fakes of their voice or
6:48 pm
likeness. we cannot let this challenge go unanswered, and inaction should not be an option. as president biden cautioned during the state of the union, we must regulate a.i. voice impersonation but do so thoughtfully, striking the right balance between defending rights and fostering a.i. innovation and creativity. both congress and the administration have been working to strike that balance. senator schumer convened nine a.i. forums last year and encouraged committees to work on a.i. legislation on a bipartisan basis, as we are doing today. that is why i was excited to release this discussion draft last october to protect people from having their image, voice, or likeness used to create digital replicas that say or do things they never agreed to or would never say. it accomplishes this broad goal
6:49 pm
in two ways: by holding individuals and companies liable if they produce an unauthorized replica of an individual's voice, image, or likeness, and by holding platforms liable if they distribute an unauthorized digital replica when the platform knows the person depicted did not authorize it. unlike the current laws that many states have enacted, which often focus on celebrities who monetize their likeness and leave ordinary people without a remedy, the no fakes act would apply to all individuals regardless of whether they commercialize their voices or likenesses. we tried to be careful to balance these protections against free-speech rights. the first amendment will of course apply, but we made clear that long-recognized parody and satire remain available to creators to continue to foster artistic and innovative potential.
6:50 pm
over the past six months we have had literally dozens of meetings and received extensive feedback: hundreds if not thousands of proposed revisions, tweaks, edits, and wholesale changes on the discussion draft. people either loved it, hated it, or fell in between. that was exactly the point, and i appreciate the many suggestions we have received. that is also the point of having the hearing today with folks who support the bill or oppose the bill, and of having a real dialogue. the feedback has centered around a few core technical areas: whether we should include a notice-and-takedown structure similar to the dmca; whether we struck the right balance with first amendment exclusions; whether the 70-year postmortem term should be adjusted or narrowed; whether the bill should have preemptive impact over similar state laws; and whether it should create some process by which individuals with limited resources and minimal damages can enforce their rights under the law.
6:51 pm
so i look forward to continuing this work with my colleagues immediately following this hearing, and to working to promptly formalize the act for introduction next month. we have assembled a wonderful panel today with diverse perspectives, and i encourage you to tell us what you like about the draft, what you dislike, and specifically what changes you would like us to consider and why. i will introduce the witness panel in a moment, but let me next invite the ranking member for his opening remarks. >> as we work through this act, we actually have a bunch of ip nerds and other interested parties showing up.
6:52 pm
i really think it is unique among the other bills that we have carried forward in terms of intellectual property because it touches everybody. normally it is about patent holders or creators. this touches everybody. that is interesting, but it is also one of the reasons why we have to get it right. we have to make sure we come up with concrete solutions. we do not want to overreach, but there is a need for legislation. anyone who is in the "it ain't broke, don't fix it" category, i respectfully disagree, but i would be fascinated to hear your testimony. we also do not want to miss the opportunity or stifle innovation. that is why it's important to get it right. we do not even know what this technology is going to look like 10 years from now. it will be even more sophisticated over a short period of time. we have all seen replicas and
6:53 pm
deep fakes: photos and videos and audio. we will show you an example here shortly. the number is just growing, so we have to work on it and do the right thing. people have been subject to fake media for much of the last 100 years. now it is getting serious, and it is being produced and multiplied at a rate that requires congressional action. i want to move forward a little bit in my comments because i think the senator did a good job of describing some of the things we are doing with our bill, but i would like to show this video to give you a recent example. this was interesting. i interact with generative
6:54 pm
a.i. almost every morning, and have for about two years since openai first released the beta version of chatgpt. it was a week or so ago that i saw the estate of tupac questioning a recent production, and it said more to come. we thought it would be interesting for folks who have followed the issue, like us, to show the video. we have staff ready to cue that up. ♪ we need ya the west coast savior if you deal with this viciously you seem a little nervous about all the publicity [ bleep ] canadian we need to know the west coast victory
6:55 pm
heard it on the budden podcast it's got to >> that entire musical rendition is a product of a.i. interestingly, one image is the property of tupac's estate, and the other is an a.i.-generated image that was obviously done, to the extent it was used for commercial purposes, in violation of copyright. it just gives you an example. this is not a hypothetical; it happened shortly after drake released that song. so we have got work to do. legislation addressing issues of digital replicas will have multibillion-dollar implications. we have got to get it under control. there are a lot of
6:56 pm
questions that have to be asked. my office in particular is guilty of putting drafts out there knowing that they are drafts. sometimes we even do it sooner, without the cooperation or involvement of other members, because we put scary stuff out there. in this case we did not do that; we tried to put out a discussion draft that makes sense, but we have a lot of things we have to work out. among the questions we need answered: is it wise that individuals have no right to license out their digital likeness unless they retain counsel? should we create an exception for harmless noncommercial uses? should there be a notice and takedown provision? there is a litany. i will submit the rest of my written statement for the record, but we have to act. hopefully in
6:57 pm
this congress we can act, which means we have to move very quickly, or at a minimum lay out a baseline that we can pick up when we come back with a new congress and get it right. so i look forward to everybody's active collaboration. i will always give you the same warning i give everyone: the only thing that makes me mad is when i see somebody trying, through guerrilla warfare, to undermine the good-faith efforts of this committee and colleagues. if you are at the table, you can have an influence. if you aren't, you are going to be on the table. i want everyone to recognize our office is open to constructive criticism and to use cases where the policy doesn't make sense, but if you are in the "it ain't broke" category, you are not with modern times. i look forward to a good and productive hearing today, and thank you in advance for your productive collaboration as we move forward.
6:58 pm
>> thank you for another positive and engaging hearing. it has been a great experience serving on this committee with you. today we welcome six witnesses to testify about the act. first, a singer, songwriter, producer, and dancer who has used ai to help her create, and who has also had personal experience with unauthorized deepfakes. it's great to have a voice present from the creative community. next, we have the national executive director of sag-aftra, the screen actors guild-american federation of television and radio artists, a labor union representing 160,000 members who work in film, television, music, and more.
6:59 pm
also a voice of the creative community. then we have ben sheffner. thank you, ben. we welcome graham davies, representing principally audio streaming platforms like spotify and youtube. mr. davies also has a history in music. our final witness teaches and writes on intellectual property law. after i swear in the witnesses, each will have five minutes to provide a summary of their opening statement. we have your written statements. then we will proceed to questioning. each senator gets five minutes for the first round. we will likely have two or three rounds of questioning, time and attendance permitting. will you please stand and raise your right hand to be sworn in?
7:00 pm
so help you god? thank you all. let the record reflect the witnesses have been sworn. mr. kyncl, you can proceed with your opening statement. >> chairman coons, ranking member tillis, and members of the subcommittee, i'm robert kyncl, chief executive officer of the warner music group. being here today is something i could not have imagined as a young boy growing up behind the iron curtain in communist czechoslovakia. when i came to new york, i met an amazing woman from the dominican republic who eventually became my wife. now we have two amazing american daughters. i am a proud u.s. citizen, and i have deep appreciation for the freedoms at the heart of this
7:01 pm
great country, having grown up without them. for the past 25 years i have been a tech and media executive. i joined warner music last year after 12 years at youtube and eight years at netflix. warner music is home to incredible artists moving culture across the globe. one of those artists, fka twigs, is here with me today. she's an extraordinarily gifted singer, songwriter, actor, and performer. i would also like to thank duncan crabtree-ireland, who led the recent collective bargaining negotiations that addressed concerns surrounding ai in defense of artist rights. music has so often been the canary in the coal mine for broader trends in our society. more than any other form of communication or entertainment, music drives culture and
7:02 pm
innovation. that is happening again with generative ai. today music companies are helping artists and tech companies figure out this new world, which is both exciting and daunting. it's our job not only to help amplify creativity but to protect artists' rights, their livelihoods, and their identities. across the industry, legends from roberta flack to the beatles have embraced ai as a tool to enhance their creativity. at the same time, generative ai is appropriating artists' identities, using deep fakes to depict people doing, saying, or singing things that never happened. my accent is a vestige of my eastern european upbringing. you can hear my identity in my voice. through ai it is very easy for somebody to impersonate me and cause all manner of havoc.
7:03 pm
they could speak to an artist in a way that could destroy our relationship. they could say untrue things about our publicly traded company to the media that would damage our business. unfettered, this technology has the potential to impact everyone, even all of you. your identities could be appropriated to mislead your constituents. the truth is everyone is vulnerable: families defrauded by voice clones pretending to be relatives, people placed in pornography without their consent, schoolchildren having their faces inserted into humiliating content. some people have spoken of responsible ai regulation as a threat to freedom of speech. it is precisely the opposite. ai can put words in your mouth and make you say things you did not say or don't believe. that is not freedom of speech. we appreciate the efforts of this committee to address this
7:04 pm
problem, including the no fakes act discussion draft authored by the chairman, ranking member tillis, senator blackburn, and senator klobuchar. we strongly support the bipartisan no ai fraud act introduced earlier this year and the recently enacted elvis act in tennessee. as the committee moves toward the introduction of a senate bill, there are three elements the bill should contain to be effective. one, an enforceable intellectual property right for likeness and voice. each person should be allowed to license or deny that right on free-market terms and seek redress for unauthorized uses. two, respect for important first amendment principles without going further and providing loopholes that create
7:05 pm
more victims. three, effective deterrence. to incentivize a commercial marketplace, we need to maintain consequences for ai model builders and digital platforms that knowingly violate a person's property right. i applaud the committee for its leadership in addressing these challenging and rapidly developing issues with urgency. congress should pass legislation this year, before the genie is out of the bottle and while we still have a chance to get this right. i look forward to answering your questions. thank you. >> thank you, mr. kyncl. twigs. >> as artists we dedicate a lifetime of hard work and sacrifice in the pursuit of excellence, not only in the expectation of achieving commercial success and critical acclaim, but also in the hope of creating a body
7:06 pm
of work and a reputation that is our legacy. so why am i here today? i'm here because my music, my dancing, my acting, the way my body moves in front of the camera and the way my voice resonates through a microphone are not by chance. they are reflections of who i am. my art is a canvas on which i paint my identity, and the sustaining foundation of my livelihood. it is the very essence of my being. yet this is under threat. ai cannot replicate the depth of my life journey, yet those who control it hold the power to have the likeness of my art replicated and to falsely claim my identity. this threatens to rewrite and unravel the fabric of my very existence. we must enact regulation now to safeguard our authenticity and protect against this appropriation of our name and rights.
7:07 pm
three decades ago we did not realize that the internet would embed itself so deeply into the core of our everyday lives. policies and controls to keep pace with the emergence of that technology were not put in place to protect artists, young people, and those who are vulnerable. ai is the biggest leap in technological advancement since the internet. you know the saying: fool me once, shame on you. fool me twice, shame on me. if we make the same mistake with the emergence of ai, it will be shame on us. let me be clear: i am not against ai. as a future-facing artist, new technologies are an exciting tool that can be used to express emotions, create fantasy worlds, and touch the hearts of many people. in the past year i've developed my own deepfake version of myself that is not only trained on
7:08 pm
my personality but can use my exact tone of voice to speak many languages. these and similar technologies are highly valuable tools. this, however, is all under my control, and i can grant or refuse consent in a way that is meaningful. what is not acceptable is when my art and my identity can simply be taken by a third party and exploited falsely for their own gain, without my consent, due to the absence of appropriate legislative control and restriction. history has shown us time and again that in moments of great technological advancement, those in the arts have been the first to have their work exploited and in many instances commoditized. soon after, it follows that the general and more vulnerable public suffer the same exploitation. by protecting artists with legislation at such a momentous time in history, we are protecting a five-year-old
7:09 pm
child in the future from having their voice, likeness, and identity taken and used as a commodity without prior consent. i stand before you today because you have it in your power to protect artists and their work from the dangers of exploitation and the theft inherent in this technology if it remains unchecked. i'm here on behalf of all creators whose careers depend deeply on the ability to maintain tight control over their own art, image, voice, and identity. our careers and livelihoods are in jeopardy. so, potentially, are the rights of others in society. you have the power to change this and safeguard our future. as artists, and more importantly as human beings, our art is a facet of our developed identity.
7:10 pm
our creativity is a product of this lived experience, overlaid with years of dedication, training, hard work, and, dare i say it, significant financial investment and sacrifice. the very essence of our being at its most human level can be violated by the unscrupulous use of ai to create a digital facsimile that purports to be us and our work, which is inherently wrong. it is therefore vital that as an industry and as legislators we work together to ensure we do all we can to protect and create an intellectual property rights system, as well as protect the basis of who we are. we must get this right. you must get this right before it is too late. thank you. >> thank you. mr. crabtree-ireland. >> thank you, chairman coons, ranking
7:11 pm
member tillis, and members of the subcommittee. my name is duncan crabtree-ireland. i am the national executive director of sag-aftra. i am here today to testify in support of the no fakes act. our members believe that ai technology left unregulated poses an existential threat to their ability to, one, require consent for the creative use of their digital representation; two, receive fair payment for use of their voice and likeness; and three, protect against having to compete against themselves, their own digital self, in the marketplace. i am the chief negotiator for our contracts, including last year's historic deal with the studios, which was only finalized after the longest entertainment industry strike in over 40 years, one that lasted nearly four months. the strikes and the public response to them highlighted that the entertainment industry and the broader public
7:12 pm
understand that ai poses real threats to them. they fully support protections against those threats. for an artist, their image and likeness are the foundations of their performance, brand, and identity, developed over time through investment and hard work. sag-aftra has long fought for image and voice protections. the exponential proliferation of artificial intelligence technologies that allow for rapid and realistic fakes of voices and likenesses in audiovisual works and sound recordings makes this urgent for our members. enshrining this protection will ensure our members, creative artists, and all of us are protected, and that service providers provide the same protections to individuals' images, likenesses, and voices that they provide now for other intellectual property rights. these rights should be transferable just like any other intellectual property or any kind of property somebody owns,
7:13 pm
with durational limitations on transfers during one's lifetime to ensure we do not enter into an era of digital indentured servitude, just as olivia de havilland fought for the seven-year rule in the studio system. some will argue there should be broad first amendment-based exemptions to any legislation for these rights. there are no stronger advocates for the first amendment than our members. they rely on first amendment rights to tell the stories that artists in other countries are often too endangered to tell. however, the supreme court made clear over half a century ago that the first amendment does not require that the press or any other media be privileged over protection of the individual being depicted. to the contrary, the courts weigh the competing interests to determine which right will prevail. these balancing tests are critical, and they are incorporated into the discussion
7:14 pm
draft of the no fakes act. they ensure that depicted individuals are protected and rewarded for the time and effort put into cultivating their persona, while not unduly burdening the right of public interest or entertainment media to tell stories. at the same time, it helps ensure that the depicted individual is not compelled to speak for the benefit of third parties who would misappropriate the value associated with the persona they have carefully crafted. with new ai technologies that can realistically depict an individual's voice or likeness with just a few seconds of audio or even a single photograph, and with constantly evolving capabilities, it is even more important that broad categorical exemptions be avoided and that the courts be employed to balance competing interests. it's also essential that action be taken to address these harms now. our members, the public, and society are being impacted right now by the use of deep fake technology, and we must
7:15 pm
take timely action. just as one of many examples: during the ratification process for our contract after the strike last year, an unknown party on the internet created an unauthorized deep fake video of me saying false things about the contract and urging members to vote against it. there was no federal right to protect me. tens of thousands of people were misled about something that really mattered to so many of us. it is neither necessary nor appropriate to wait for broader artificial intelligence regulation to be adopted. this narrow and technology-neutral approach can and should proceed expeditiously. the companies behind many of these technologies need to better understand appropriate boundaries on their conduct. this will provide guidance while helping to ensure individuals are protected from exploitation that puts their livelihoods and reputations at
7:16 pm
risk. thank you again for this opportunity to speak, and i look forward to the conversation. >> chairman coons, ranking member tillis, members of the committee, thank you for the opportunity to testify on behalf of the motion picture studios about regulation of digital replicas. for over a century our members have employed innovative new technologies to tell compelling stories to audiences worldwide. from the introduction of recorded sound in the 1920s and color in the 1930s to the dazzling special effects in movies like this year's dune: part two, they have long used technology to allow filmmakers to bring their vision to the screen in the most compelling way possible. artificial intelligence is the latest such innovation impacting our industry. we see great promise in it as a way to enhance the filmmaking process and provide an even more compelling experience for
7:17 pm
audiences. we also share the concerns of actors and recording artists about how ai can facilitate the unauthorized replication of their likenesses or voices to supplant performances by them, which could potentially undermine their ability to earn a living practicing their craft. the no fakes act is a thoughtful contribution to the debate about how to establish guardrails against abuses of such technology. however, legislating in this area necessarily involves doing something that the first amendment sharply limits: regulating the content of speech. it will take very careful drafting to accomplish these goals without inadvertently prohibiting legitimate, constitutionally protected uses of technology to enhance storytelling. i want to emphasize this is technology that has entirely legitimate uses, uses that are fully protected by the first amendment and do not require the consent of those being depicted.
7:18 pm
take the classic 1994 film forrest gump, which depicted the fictional character played by tom hanks navigating american life from the 1950s through the 1980s, including by interacting with real people from that era. famously, the filmmakers, using digital replica technology available at the time, had him interact and converse with presidents -- or should i say former senators -- kennedy, johnson, and nixon. to be clear, those depictions did not require the consent of their heirs, and requiring such consent would effectively grant heirs or corporate successors the ability to censor depictions they don't like, which would violate the first amendment. in my written testimony i detail specific suggestions for improving the draft so it addresses real harms without encroaching on first amendment rights. i will highlight four points. first, getting the statutory
7:19 pm
exemptions right is crucial, and i want to thank the drafters for getting much of the way there. clear exemptions will let producers move forward on a movie or tv series. if these exemptions are not adequate, some producers will not proceed with their projects, a classic chilling effect that the first amendment does not allow. this would exacerbate the problem. third, the scope of the right should focus on living performers. going beyond that, we risk sweeping in wide
7:20 pm
swaths of first amendment speech, which would make the statute vulnerable to being struck down. fourth, this must be focused on highly realistic depictions of individuals. it should not encompass cartoon versions of people. lastly, before legislating, i urge the subcommittee to ask whether harms are already covered by existing law, such as defamation, fraud, or state right of publicity law. often the answer will be yes, indicating that a new law is not necessary. if there is indeed a gap in the law, for example regarding pornographic or election-related deep fakes, the best solution is narrow and specific legislation targeting that problem. thank you again for the opportunity to testify today, and i welcome your questions.
7:21 pm
>> thank you. >> good afternoon, chairman coons, ranking member tillis. thank you for giving me the opportunity to speak to you today on this important issue. my name is graham davies, president and ceo of the digital media association. we support the committee's efforts to move forward at the federal level to keep pace with new technology. we join you in the objective of ensuring there are appropriate protections for individual likenesses. it's an important issue for us all, and we want to develop a clear and balanced way forward. our members benefit from clarity in the law and from providing fans with great experiences. indeed, all our members have a strong track record of licensing music to fans. they work closely with record labels and publishers, with whom they have long relationships
7:22 pm
and robust contracts. this is our common objective. any new or increased rights should be appropriate and targeted. they should not come at the expense of freedom of speech or creative expression, nor should they invite needless litigation in the name of protecting personhood. the no fakes act proposes to sweep a broad range of replicas and downstream activities within its scope. the current draft punishes good and bad actors alike. new rights should not undermine the global content supply chains on which the streaming industry depends. we are still in the early stages of the application of this technology by the artistic community. we see the existing practices for taking down illegal or deceptive content as sufficient in
7:23 pm
this context. streaming services sit at the last point in the supply chain. only the originator of the content and those that deliver it to the services have the information necessary to establish whether it is legitimate or not. streaming services do not have any way to know the complex chain of rights behind the content they receive from labels and distributors. to address when the technology is misused, we believe the committee's objectives would be best achieved if new legislation were developed from existing right of privacy laws. this would have a number of advantages. first, privacy rights can be balanced with the individual's interests in publicity. second, liability sits squarely with the bad actors: those who create the deceptive content and place it into the public sphere. thirdly, the focus is on
7:24 pm
commercial use with actual damages, which we believe have proven to be sufficient. establishing a federal law that preempts the existing patchwork of state laws is both beneficial and necessary; music streaming is a global industry. we believe the rights pertaining to the person should remain inextricably tied to the individual for the duration of their life. this ensures that each person is able to retain control over how their voice is used. the draft released by the chair, ranking member tillis, and senators klobuchar and blackburn has been critical in thinking through these complex issues. i have included more in my written testimony, which is intended to continue the discussion. i look forward to continued work with the committee. >> thank you. >> chair, ranking member, and
7:25 pm
other members of the subcommittee, thank you for the opportunity to testify about implications of the proposed no fakes act. i am a professor of law at the university of san diego school of law, where i teach intellectual property classes and focus on trademark law and the right to freedom of expression. the first amendment of the constitution commands that congress shall make no law that abridges the freedom of speech. congress generally lacks the power to restrict expression because of its message, ideas, subject matter, or content. this rule is subject to a few limited exceptions for historically unprotected speech, such as fraudulent speech and obscenity. content-based regulations of speech are generally presumed invalid unless the government can prove the law is constitutional. the no fakes act proposes
7:26 pm
restrictions on the content of speech and targets the harm caused by the unauthorized creation and dissemination of digital replicas, or deep fakes, of individuals in recordings that are nearly indistinguishable from the person's voice, image, or visual likeness. when it applies to the use of digital replicas to impersonate people in fraudulent speech or misleading commercial speech, it is consistent with the first amendment. there is also no conflict with the first amendment when the act restricts the use of digital replicas in sexually explicit deep fakes without consent if they constitute obscene speech or child pornography. the problem is that the current version also regulates speech that is protected by the first amendment. congress must therefore prove that the act satisfies constitutional scrutiny. the law must be narrowly tailored to directly and materially further its goals
7:27 pm
and not harm speech that is protected by the first amendment more than necessary. strict scrutiny may be required when the government is regulating the unauthorized use of digital replicas in political messages, news reporting, entertainment, and other types of noncommercial speech that are fully protected by the first amendment. as it is currently drafted, i believe the act is not consistent with the first amendment because it is overbroad and vague. i think a revised version could satisfy intermediate and strict constitutional scrutiny. there are three ways congress can better protect first amendment interests in the law. first, it is critical that the law not suppress protected speech more than necessary. the no fakes act does a better job than the no ai fraud act in setting forth specific exemptions. the law can still
7:28 pm
be improved in certain ways i have discussed in my written testimony. it is also important that congress not enact a strict liability rule for online service providers that host expression covered by the no fakes act. specific and actual knowledge of the direct infringer's use of an unauthorized digital replica should be required for liability. online service providers should implement a notice and takedown system to make it easier to remove unauthorized deep fakes that violate the law. accused infringers must also be able to challenge takedown requests by filing a counter notification with the platform. my second recommendation is for congress to create separate causes of action that target the different harms caused by unauthorized uses of digital replicas. this includes the use of deep fakes to impersonate individuals in a deceptive manner, use of sexually explicit deep
7:29 pm
fakes, and uses that substitute for a performance the person typically would have created in real life, such as a performance in a song or movie. these causes of action should have different requirements and distinct protections. my third recommendation is that congress ensure each provision of the law adequately protects free expression interests. congress can better protect expressive values by allowing the new federal statute to preempt inconsistent state laws that protect the right of publicity and digital replica rights, or laws that restrict the unauthorized use of digital replicas. if licensing is allowed by the act, individuals should be able to consent to each different use of the digital replica. allowing others to control a person's identity rights through a broad licensing
7:30 pm
agreement will work at cross purposes with many of the stated goals of this proposed legislation. it could potentially lead to greater ai generated deception of the public. it can also stifle the right of people to make a living through their performances and result in the use of their image or voice in sexually explicit material that was authorized by the broad terms of a licensing agreement. i encourage congress to continue to protect the interests of public figures and ordinary people. i also urge you to continue consulting stakeholders, academics, and attorneys with expertise in the field of law. i look forward to answering your questions as you continue to improve the act. >> thank you to all of our witnesses. i will start with some questions about exploring how ai replicas are impacting individuals and then use a subsequent round to get into your perspectives on specific revisions. mr. crabtree-ireland, thank
7:31 pm
you for sharing your personal experience of a deep fake. given your experience, should a digital replica right apply to all individuals regardless of whether they are commercializing their image, voice or likeness? you represent people who make a living commercializing their likeness. why should we have this available to everyone? >> it is a great question. we support a right that is available to everyone. obviously we have explained the impact this can have on people who make a living and whose career is based on their image, likeness, or voice. but the impacts are so obvious and so real for so many americans outside of the scope of commercialized use. the example that i gave is not a contrived example. this could apply to anyone and the impact is so serious.
7:32 pm
as we do support this right on a broader basis, it should be applicable to everyone. >> could you help us understand how you are using ai as a creative tool on the one hand, and briefly tell us a little bit about your experience with ai deep fakes and what you think the future looks like if we do not heed the call to act. >> over the past year i have created an ai version of myself that can use my tone of voice exactly to speak in multiple languages. i have done this to be able to reach more of my fans and to be able to speak to them in the nuance of their language. i have currently explored french, korean, and japanese, which is really exciting for me. it means that even with my upcoming album i can explain in depth what it is about creatively.
7:33 pm
it also allows me to spend more time making art. often being a music artist, or any artist in this day and age, requires a lot of press and promo. if it is something simple that does not require my heart, i can do a one-liner and give it to people to promote a piece of work, and it is harmless, but ultimately i can spend more time making something that is really meaningful for my fans. the next question you asked -- >> your own experience with deep fakes. >> there are songs online, collaborations with myself and other artists, that i did not make. it makes me feel vulnerable because, first of all, as an artist i think the thing that i love about what i do is i am very precise. i take my time with things. it is really -- i am very proud
7:34 pm
of my work and very proud of the fact that i think my fans really trust me because they know that i put so much deep meaning, my north star, into what i do. the fact that someone can take my voice, change lyrics, change messaging, maybe work with an artist that i did not want to work with, or maybe an artist i wanted to work with and now the surprise is ruined. it leaves me very raw and vulnerable. i think if legislation is not put in place to protect artists, not only will we let down artists that really care about what we do and have spent a long time developing themselves and the way that we work, it also would mean the fans would not be able to trust people they have spent so many years investing in.
7:35 pm
it would affect us spiritually and financially. it makes -- honestly, i am surprised we are even having this conversation because it feels so painfully obvious to me that it is hard to even find the language, if i am honest with you. >> your surprise is not unusual. >> ultimately what it boils down to is my spirit, my art and my brand is my brand. i have spent years developing it. it is mine and does not belong to anybody else to be used in a commercial sense or cultural sense or even just for a laugh. i am a human being and we have to protect that. >> thank you. mr. kyncl.
7:36 pm
we have seen a steady increase in the quality of deep fakes, with songs on streaming platforms that are indistinguishable from talented artists. so what are the challenges that they are creating long-term for the music business and the fans as well as performers? >> i think twigs addressed one of those, and she addressed it better than i could. when you have these deep fakes out there, there is a fixed amount of revenue on each of these platforms. if that is eating into the revenue pool, there is less left for her songs. that is the economic impact of it long-term. the volume of content that will flow into the
7:37 pm
digital service providers will increase exponentially, which will make it harder for the artists to be heard and actually reach a lot of fans. creativity will be stifled. >> so there is both a relationship impact, a spiritual impact, a financial impact and a broader ecosystem of creativity. senator tillis, i turn to you. >> thank you. mr. -- ms. ramsey, i will start with you and then go to others that may have an opinion. you mentioned in your comments that this is a strict liability bill in its current form. some of us think maybe we have to wade into that. you also talked a bit about, i guess, the individual who has been informed of a takedown having some recourse. can you talk more about that.
7:38 pm
>> sure. you might have a situation where somebody challenges your own personal use of your identity online and they are the one that is the bad actor, but they file a complaint with the service provider, and the service provider, who wants to avoid liability, takes it down. that is one possibility. another would be that the person who is disseminating the content at issue actually has a defense, an exception that applies to this particular use. it could be news reporting or parody. it is critical for the online service provider to be able to put that expression back up if it does not violate the law. under the copyright law, my understanding is that once the information is put back up it stays up unless the owner files a lawsuit. what is great about this takedown procedure is that it allows ordinary people to get these unauthorized uses off the internet. that is one real benefit of having a notice and takedown procedure and encouraging
7:39 pm
companies to adopt one. there are some challenges. folks like eric goldman and others have talked about this. it is great that you are talking to interested parties as you figure out these issues. >> does anybody have an opinion counter to that? mr. kyncl, can you walk me through what rights are typically granted to record labels under exclusive sound recording agreements, and likenesses -- are they included in that? >> it is a pretty wide range of rights, anywhere from full copyright to distribution only, where the copyright remains with the artists. increasingly, they include likeness as well because, as you can imagine, as we work on the platforms with
7:40 pm
lots of generated content, we are the ones who have a staff of people working to issue notices, claim the content, and take down the content, and increasingly we need the name, image, likeness, and voice rights to act on the artists' behalf with the platforms. >> i think you believe that the new digital replica rights need to be fully transferable. >> yes. >> why isn't a license enough? >> i think it should be at the choice of the artist. the artist should have a choice to transfer. >> mr. sheffner. state right of publicity laws restricting commercial speech have existed for many decades. they have developed their own case law and are well understood. the new digital
7:41 pm
replica right proposed by the no fakes act would affect noncommercial speech beyond what most state laws currently cover. can you explain how novel this proposed right would be in the context of existing publicity laws, and how could we consider preempting similar state level digital replica laws, especially when it is such new territory? >> thank you for that question. you are absolutely right. these laws have existed for more than a century and are really limited to commercial uses in advertisements and on merchandise. what congress is considering doing here is really novel. we think it is fundamentally different in that it would apply in expressive works like movies, tv shows, and songs, which are fully protected by the first amendment. there has developed a robust body of case law in the
7:42 pm
traditional commercial right of publicity context which says yes, it applies if you put somebody's face on a billboard or use it in an advertisement, but it does not apply, for example, if you are making a biopic or documentary about somebody. you cannot use it to censor those portrayals. again, this is a novel form of right which will be subject to heightened constitutional scrutiny, like professor ramsey described. because it applies in expressive works, it is really important up front to provide clarity to film producers so that when they are about to embark on a project they know what is allowed and what is not. if it is too vague or uncertain, they will shy away from using this technology to engage in these sorts of portrayals. the first amendment case law says the statute is vulnerable to being struck down if it chills
7:43 pm
constitutionally protected speech. >> i think that there is a general consensus and we have made progress, but the chances of all this being struck down are significant. we have to do the legwork. thank you. >> thank you. senator hirono. >> thank you, and ranking member tillis, for bringing this bill before us. as you say, the bill has gone through a lot of input from a lot of different groups. if i listened to your testimony accurately, it does not sound as though any of you think that we should not do something that will protect people. i like the framing of protecting personhood. do any of you think that we don't need to do anything in this area? okay. looking at the statute, why don't we go down the list
7:44 pm
very quickly. what do you like most about the current bill? we will just start with mr. kyncl. and what is the most important thing you would want to change, if anything? if you could just keep your answer really short. >> i will start with what i believe it needs to contain. it needs to contain consent for the use of people's name, likeness, and voice. second, it needs to contain monetization, which is a fair market license that that person can exercise. in order for that to happen, and in order for that to be operationalized, we need two things.
7:45 pm
one, provenance of the content that the models are trained on and outputting needs to be retained, which means they should keep sufficiently detailed records on what they trained on so that later on that can be embedded into watermarks which are recognized by the platforms. >> the point is, it sounds like consent is the critical part, consent of the creator. >> and the provenance. we are good at tracing provenance on luxury clothing, cheese, wine. we should be able to do it with intellectual property as well. >> going down the line. talking about this particular bill, is there something that you think is the most critical aspect of the bill that you support? is there anything you would change in the bill? >> i think the most important thing is to put the power in the hands of the artist. i want
7:46 pm
to be in control of my likeness, my brand, my legacy. i have sacrificed so many years to be good at dancing and singing, so much financial input, so much time. i do it in the name of my legacy. i do it so that one day i can look back at my body of work and say that was me. that is what i want protected in the bill. >> thank you. >> i think what i like most about this bill is the fact that it is broader than limiting to commercial use. the commercial use limitation may have worked 100 years ago, but a commercial use limitation does not solve the problems we face today, especially because of generative ai. we need that to be reflected in this legislation. i think if there was one thing that i would change, i would adopt a limitation on transfers
7:47 pm
or even licenses. i think it may not be as necessary after death, but during lifetime i think it is essential to make sure somebody does not grant a transfer of rights early in their lifetime that turns out to be unfair to them. i think that there are standards we can look at. >> 70 years is a bit long? >> 70 years is the duration in the bill after death. i'm talking about the duration of a transfer even during life. if you had a 21-year-old artist granting a transfer of rights in their image, likeness, or voice, there should not be a possibility of licensing it for 50 or 60 years without any ability to renegotiate the transfer. i think that there should be a shorter, perhaps seven year, limitation. >> one thing that we do like about the bill is the first amendment exemptions. we think that they are most of
7:48 pm
the way there to give members the clarity and certainty they need. i think they can be improved a little bit and we have some specific changes that we recommend. one thing we recommend changing is there is essentially a no preemption provision. we think that should essentially be the opposite. for the reasons i was just discussing, this is a novel law with some uncertainties around the first amendment. we think it is important not to preempt all law but to preempt state regulation of digital replicas in expressive works like movies, tv shows, and songs that are protected by the first amendment. >> thank you. >> do you mind if we continue with the responses? >> building on some of the things said, i think protecting personhood is very much encouraged with the draft.
7:49 pm
i think in terms of the key areas that we want to focus on, it is where liability sits. we would encourage it to be focused on the creator first releasing the content. we would prefer that it was based around publicity laws rather than ip, with actual damages and with preemption. >> professor. >> thank you for your question. what i like most about the bill is the specific exclusions from liability, even though there may be some revisions that should be made, and the fact that it protects personhood. current law does sometimes reach noncommercial uses: in one case the right of publicity was applied to the broadcast of an entire act in a
7:50 pm
news report. that is not a commercial use. in the comedy iii case in california, the law was applied to an identical rendition of the three stooges in a lithograph, which is also not commercial speech. there are some circumstances where current laws do apply to noncommercial uses of identity, and also false endorsement laws, which apply to uses in connection with goods and services. i think we should have separate causes of action, as i said before. for example, a disclaimer might make sense if you are targeting deceptive impersonation of someone; it dispels any confusion. a disclaimer does not make any sense for a sexually explicit deep fake. you might have different requirements if it is just a general broad federal right of publicity. if you are talking
7:51 pm
about sexually explicit deep fakes, the law should apply to both commercial and noncommercial speech. my concern is that individuals without significant bargaining power in the early stages of their career might sign a contract that has a digital rights provision in it and sign away the rights to their identity for a lengthy time and for use in any context. i would like to see some way for those folks who are negotiating those agreements to perhaps give a specific use authorization for a certain movie, as opposed to use of your digital identity in any context. instead of 50 or 60 years, i would say maybe 1 to 5 years. i am not an expert in the area of what is a good term, but i
7:52 pm
think it is critical to make it shorter rather than longer. they are just not going to have the kind of bargaining power that music companies will have. >> i think what the professor is suggesting, different causes of action, is very intriguing, but complicated. we will think on it. >> thank you, and thank you for your good work on the bill. we have spent months working on a discussion draft and moving this forward. i am so pleased that today we are at the hearing stage on this. i represent tennessee. it does not matter if you are on beale street or music row, or maybe you are working with one of the symphony distributors.
7:53 pm
we distribute more symphonic music out of nashville than anybody else in the world. we have gospel, contemporary christian, we have church music, we have bluegrass. we have the museum of african-american music. it is all right there. we are really so protective of our creators. we have the good, bad, and ugly relationship. all of our people in manufacturing and logistics and healthcare, they are innovating and going to town. i am deeply concerned about what is happening to the creative community. we are working on no fakes and making certain that there is a way for artists to protect their
7:54 pm
name, image, and likeness, their voice. there is a way for them to exercise their constitutional right to protect their intellectual property and benefit from that property. that is going to be so important. mr. kyncl, i want to come to you. i appreciate the comment you made when we were visiting, preparing for the hearing. you said we got data privacy wrong. we still have not done data privacy. we cannot afford to get ai wrong. that is going to require that we take action. tennessee stepped up last month and passed the elvis act. this is a great piece of legislation. mr. chairman, what they did was
7:55 pm
take much of what we put in the discussion draft and put it in place to protect our innovators and give them a state right of action. not all states are following suit on this. i think what we have done is kind of establish the baseline for federal action. i would like to hear from you, if you will, about the need for a federal standard, federal preemption. >> thank you for your comments on the elvis act. it is truly groundbreaking. we are in a unique moment of time where we can still act and we can get it right before it gets out of hand. the genie is not yet out of the bottle, but it will be soon.
7:56 pm
as you mentioned, senator blackburn, we got data privacy wrong; obviously we waited too long. we cannot get it wrong on identity. it is simply far too important. the speed at which this will happen will be accelerated by the open sourcing of foundational ai models. once that happens, everything accelerates exponentially. therefore it is imperative that congress acts this year. >> we have heard some commentators say that you have existing law when it comes to privacy or personal property and intellectual property protections, so you can rest on that existing law and it is sufficient to go in and get a
7:57 pm
takedown order on some of these ai fakes. talk to me about why that is not sufficient. >> today, if you think about privacy, how many spam emails do you get every single day in your inbox? quite a lot. your personal information is leaking everywhere, whether it is being sold, or it is being taken, or it is just not safeguarded properly. when that happens with your face and your voice, it is a whole new game. and this will happen at a volume that is impossible for every single person to try to personally manage, which means it has to be solved with technology. that is why it is important to work with technology platforms to solve this, and we have to have a working bill and working law that can be operationalized by
7:58 pm
all of us. the existing framework is simply lacking. >> let me ask you this. mr. chairman, if i can get one more question. do you think that the platforms should be held responsible for unauthorized ai fakes that they are continuing to allow to be distributed? >> i think we need to develop a set of conditions that they should meet. if they don't, then yes. there has to be an opportunity for them to cooperate and work together with all of us to make it so. that, i think, is the detail work that needs to happen. when we achieve that, there will be good actors, and many of them are. i think it is through collaboration
7:59 pm
that we wrestle with this now. >> thank you for your cooperation on moving this forward. i have a whole series of questions to ask. i will try to move relatively quickly, if i might. mr. sheffner, you testified that we have to include first amendment exemptions for works that have public interest or newsworthy value. some say that any work involving a celebrity is newsworthy or in the public interest. that raises the challenge of how we define first amendment exceptions to ensure they do not just swallow the rule and permit the kinds of uses that the bill is trying to stop. i would be interested in your views on how we handle that, and, professor ramsey, how would you craft the exemptions to make sure they do not swallow up the whole bill, particularly with regard to what is newsworthy? >> sure. we have talked to your staff,
8:00 pm
with whom we have a great relationship. we have talked to other stakeholders and listened to the concerns being raised that some of the exemptions are overbroad and could swallow the rule themselves. we have listened and suggested tweaks to make sure those types of exceptions do not apply if the use of the digital replica is deceptive. we do not support fraud. fraud is not protected by the first amendment. it should not be allowed. one other thing i would say is that these types of statutory exemptions have been routinely included in state publicity laws over the last 25 years or so, since the late 90s. one thing we have not seen is this type of abuse of the exceptions. they have worked very well in separating out the uses where you should get permission, like somebody's face on a billboard
8:01 pm
or a lunchbox, versus the exempted ones. >> professor, briefly. >> christine farley and i recently wrote a paper about how we can balance trademark and free speech rights when somebody uses a trademark in an informational or expressive way. i think the proposal in that context might also work here. as he mentioned, some of these kinds of uses can be bad, impersonation, et cetera. one approach, in addition to listing out potential exemptions for informational or expressive uses, is to say that a false statement or false representation, saying this is a certain celebrity when it is not, or a certain teenage girl when it is not, would still be actionable, even though there is some argument it is expressive. or, if the use is likely to mislead a reasonable person about the source of the message or the speaker's identity. that way you would at least have courts consider whether it is an informational or
8:02 pm
expressive use, with a safety valve so that if it is really causing harm because it is deceptive, you could still regulate it. >> understood. mr. davies, today you raised a concern that the bill lacks a mechanism for showing that your members have knowledge. should we incorporate a notice and takedown structure? if so, should it be the dmca notice and takedown provisions, or is there another mechanism you would urge us to consider for actual or constructive knowledge? >> thank you for the question. in terms of the current situation, our members include the leading streaming services, handling the majority of music streaming consumption, and the processes are working very well. i think with the example you have used, and the other drake example, which is a common one, there has been no challenge in taking down the content expeditiously. we don't see our members needing additional burdens or incentives here, but
8:03 pm
we do understand the committee is keen to look at whether there should be secondary liability, and we would very much seek that there be a safe harbor for effective takedown. we don't see the dmca takedown process as being a good process here; it was designed for copyright, and our position is that this involves a different set of rights. that said, our members can absolutely work with the committee on what we think would be an effective notice and takedown process, building on the points the professor made. it is essential that we get specific information on how to identify the offending content so that it can be removed efficiently. we need information on the notifier, in terms of why the content is offending and on what basis, and also notice so that, if there is an objection, it
8:04 pm
can take place. >> i want to talk about preemption briefly. professor, several witnesses described existing state right of publicity laws as a difficult to navigate patchwork. should the bill broadly preempt state laws, or limit preemption to unauthorized digital replicas? >> i teach publicity law and a property survey, and i think we need a federal right of publicity law. state laws are so different. if you go to jennifer rothman, she has a great blog that talks about the different laws; even within a state, statutory provisions have different rules than the common law provisions. >> your answer is yes. >> so yes, i'm just building up to it. we need preemption. the challenge is that, obviously, congress is doing a great job trying to get this right. if you get it right, then you preempt state laws, and it simplifies everything for litigants and judges. instead of having to figure out which law is going to apply in
8:05 pm
a particular case, there is right now forum shopping going on; people will file suit in whatever state is best for their interest. so yes, we need it. >> the draft has a 70 year postmortem provision modeled after the copyright act. postmortem rights are important, but we understand 70 years is a long time, especially for individuals who do not commercialize their image, voice, or likeness. i would be interested, jump ball, several of you, perhaps mr. sheffner first and then others: should postmortem terms be longer for individuals who commercialize image, voice, and likeness? should they be limited, or reviewed and re-extended every decade or so? how would you handle postmortem rights? the draft has 70 years of rights postmortem; some of you enthusiastically supported that as part of your creative legacy, others have raised concerns. mr. sheffner, you kick this off and we will do this quickly.
8:06 pm
>> sure. we view this through the lens that it is a content-based regulation of speech. as professor ramsey said in her opening statement, a content-based regulation of speech needs to be justified by a compelling government interest and narrowly tailored to serve that interest. what we have said is that, for living professional performers, use of a digital replica without their consent impacts their ability to earn a living; you have a compelling government interest in regulating there, and it would be appropriate for congress to regulate. postmortem, that job preservation justification goes away. i have not yet heard a compelling government interest in protecting digital replicas once somebody is deceased. i think there will be serious first amendment problems with extending a right that applies in expressive works postmortem. >> do any other witnesses think
8:07 pm
preserving the legacy and property rights of an individual is worthy of some protection? professor ramsey and then mr. crabtree-ireland. >> this will not shock you, but i will say it depends on the goal of the law. if we are talking about a law regulating deceptive uses of someone's identity, or a law that is governing sexually explicit deep fakes, it seems to me it is fine to have those rights be long-term, possibly life plus 70. if we are talking about protection of a broad federal right of publicity, maybe not so much. i have not written in this area, but i would recommend looking at the work of people like mark bartholomew; jennifer rothman is working on a paper. >> mr. crabtree-ireland. >> to me it is shocking to suggest this right does not deserve to be preserved and protected after death, after all the reasons twigs stated about how personal it is. it is an economic right, it is a personal right, and something that has real value. why that should dissipate upon death and make itself available to big
8:08 pm
corporate interest like the ones represented by folks here, it does not make any sense. i would argue there should not be 70 year limitation at all. the right should be perpetual and the reason is that every one of us is unique, there is no other twigs and there never will be. there is no other you or any of us, this is not the same thing as copyright. it is not the same thing we will use this to create more creativity on top of it later. this is about a person's legacy, it is about a person's right to give this to their family and let their family take advantage of the economic benefits they worked their whole life to achieve. from my perspective, this is intellectual property right that their blurbs protection, it should absolutely be protected after death. i'm waiting to hear a good reason why it shouldn't be, to be honest. >> in perpetuity, not at all, mr. twigs, make this brief. >> i agree with mr. duncan
crabtree-ireland 100%.
>> thank you. twigs, would you like to make a comment on that? forgive me.
>> i worked so hard throughout the whole of my career. when i die, i would like anything i created to go to my family and my estate, which would have clear instructions on the way i want to preserve my history and all of the art that i created.
>> thank you.
>> senator blumenthal.
>> thank you, mr. chairman. i got off a plane 20 minutes ago coming from connecticut, so i do apologize for missing the bulk of the hearing. as you may have heard, we had no votes yesterday, so today was a partial day off, and i had plans in connecticut, so i'm grateful to all of you for being here. we are very hopeful you are in good health and continue creating, and i'm a big fan of your work, so thank you for being here in particular. thank you, mr. chairman, for having this hearing, which focuses on a bill that you are going to introduce; i would like to be added at the appropriate time as a cosponsor. i'm a strong supporter, and i believe there ought to be a federal right for people whose image and voice are used without their consent, whether it is an actor or a songwriter or a singer or an athlete. what is shared here is a right in one's own likeness and creation as a person, an individual right. i think there ought to be a right to take legal action under that right; a right without a remedy is unavailing, as we know from our first year
in law school, which for me was quite a few years ago, but i have seen it repeated again and again in real life as a prosecutor, as an advocate, as a litigator. i would also like to focus on a complementary remedy, which could be watermarking or identification, attribution, giving credit. not just the deepfake and the right to recover as a result of the use of it without attribution or credit, so to speak, without watermarking, but also that kind of identification, public crediting of a work. i'm asking not only in the abstract; i chair a different subcommittee, on privacy, technology, and the law. the ranking member of that subcommittee and i, senator josh hawley of missouri, set forth a framework; it is the most comprehensive bipartisan framework right now. we should do more, adopting the kinds of measures senators and others have proposed. it would provide a requirement for watermarking as well as an entity to oversee it.
>> who will take it?
>> thank you for all of your work on this important issue. i think, you know, without attribution, without use of watermarking, we won't be able to operationalize what we're talking about here today. so you're focusing on absolutely the right issue. and i think the important part in this is to determine the provenance of content that's being displayed, the degrees of similarity to the original, and then it is up to the rights holders, whether it's artists, music companies, movie studios, et cetera, to then negotiate the commercial relationships with the platforms, separate and aside from the laws and how it all works, using all of those mechanisms. we've actually done this when i was at youtube. this is precisely what we have done with user generated content; we've just done it in the copyright scheme, where it was the exact content referenced, and so we built a whole framework around that. this is merely that on steroids, adapted for the ai age, with, you know, many more shades of gray and much more speed. but it's really just upgrading that; the framework exists. it has been developed by companies like youtube, which is best in class on that. and therefore, i'm hopeful that we can take it further and apply that to ai -- degrees of similarity, using watermarks to label content and --.
>> thank you.
>> another question.
i mean, i can only really talk from personal experience. in the last six months, i had 85 of my songs leak online, which is basically the whole of my experimentation for my next album. it was really scary, because it felt like having the whole of my notepad, i guess, of all my ideas being put out to the whole world before it was ready. but on the flip side of that, i felt very secure, because i was able to call up my label and say, hey, this has happened, and immediately they could go and take it down, and it just disappeared, and now you can't find it. so i think that watermarking would protect artists, because then we'll have a point of -- to go to to say, this has happened, and immediately, wherever it's, you know, been leaked online or put online, it can be taken down. but one thing i will say is that the thing that's really scary is once something is out in the world, we can't take it back. so if someone uses my likeness and says something offensive or says something harmful, people might think that that is me. and we've all seen, in the news, when someone does something wrong, the big story is, like, on the front page, the only thing on it; then it turns out they actually didn't do something wrong, it was a mistake, and the rewrite of it is so small. and i think that's the thing that i'm scared about: even if something gets out in the world that's not me, it's the reputational damage that it will do, and the financial and cultural harm that won't be able to be amended after the fact.
>> very good point. if the chairman would give me a little more time, i'd be interested in the others' answers. thank you.
>> i agree with mr. kyncl on the value of watermarking and other tools as well, c2pa,
the coalition that is working on that. but i also just want to caution, especially on deepfakes: the idea of disclaimers solving problems there was mentioned earlier, or the idea of watermarking solving problems there. we also have to make sure that the tools we use to protect against abuses of these technologies are realistic. expecting viewers of content online to read deeply into captions to find disclaimers or things like that doesn't really solve these problems. so i hope, as the committee considers what to do, it's not enticed into thinking that that type of solution actually solves the problem. it needs to be more front facing, so that the message that's delivered is received by all those who view it.
>> thank you for the question, senator blumenthal. as mr. kyncl was saying, in the copyright context watermarking has proved useful in certain contexts; youtube's content id system, for example, has been a great help in reducing the presence of pirated material on that platform. i would just say again, though, from our experience in copyright law, it's not a silver bullet. it sometimes can help identify the original source of material, but just because it has a watermark on it doesn't stop it from being further disseminated, et cetera. so there's really no silver bullet in this context.
>> thank you.
>> my answer is going to build on things that have already been said. i think robert talked about the relationships between the services and the rights holders. these are absolutely essential; this is where the content comes from for the services, so we're very reliant on them, on the data, on the metadata that exists. i think it would be true to say that data in the music industry already has significant challenges, and these are challenges we work on together.
>> thank you. professor?
>> so i'll incorporate by reference everything that's been said before, but then also say that i think someone using
a digital replica to impersonate someone, or basically put out a sexually explicit deepfake, is not going to use this kind of technology. so it's not going to help in certain circumstances.
>> and i meant, i think i used the word, complementary. if not, i meant to say, complementary. it's not a substitute; i didn't mean a substitute. so i think all these comments are very helpful and valid. on behalf of the chair, senator klobuchar.
>> thank you very much. that was an ai attempt, i know it was, and it sailed kind of close, not quite. okay. professor ramsey, since you ended there, i'll pick up where you were, about what you and some of the other witnesses mentioned about these deepfakes, whether it's sexually explicit images or the political robocalls or videos or ads. and i wasn't going to start this way, but it makes sense here, because of what you just said. to me, some of this we just have to get off there. people are not going to be able to listen to a major candidate for president for three minutes, think it's them, and then look and see a label. and i think that in other countries, that's what they've done. that's why senator hawley and senator collins and a number of other senators have come together; we're marking up this bill, along with a labeling bill, in the rules committee on elections. could you talk about why that kind of targeted approach to some of these, like, hair-on-fire things is very important, given the timing of all of this?
>> as you can expect, i love the fact that you're working on these targeted laws. but again, one of the things we need to do is protect ordinary people from impersonation. over thanksgiving, someone called my dad when i was standing right next to him. it sounded just like my brother, and he said he was in jail and he needed money to get out of jail. my dad was not duped by this, but, you know, some people have been, as the
senators have noted. so i think it's a great idea, but i think that, you know, we still need the broader act to deal with these kinds of issues for folks that are not politicians, et cetera.
>> exactly. and my state director's son is in the marines, and her husband got a call where it was an impersonation; they scraped his voice. they didn't know where he was stationed, so we're going to see all of this deployed against military families as well, really all these kinds of scams. so i see this, you know, as having some of the great uses of ai, especially in healthcare, but then there's the hell part, and it should be our job to try to put the guardrails in place, which is why i'm so honored to be working with senators tillis and blackburn on this bill. so one of the things that interested me during the testimony, and you, mr. sheffner, and mr. crabtree-ireland, kind of got to this: both the no fakes act and this elections bill include exemptions, exceptions, for the use of digital replicas, to ensure the bills do not chill speech protected by the first amendment. can you talk a little bit more, as we look at how we can write these in a way, as i have tried with exceptions for satire in the elections bill with senator hawley, how we can do this to ensure that commonsense safeguards do not chill protected speech, and that this is upheld in court?
>> right. so, senator klobuchar, i just want to say, agreeing with professor ramsey, that i think your approach of having specific legislation on pornographic deepfakes, and other legislation on election-related deepfakes, is really the right way to go. when you have a broad bill that essentially says you need permission to use digital replicas, and then lets courts kind of sort it all out, that's where you get into trouble, and you have an overbroad bill that is going to necessarily end up encompassing protected speech, which makes it vulnerable to being struck down on overbreadth
grounds. so these kinds of exceptions, i think, are specific to the type of legislation. in the world of movies, the studios that we represent at the mpa make a lot of movies that are based on or inspired by real people and events. i went through, this morning, all of the best picture nominees over the last five years; approximately half are based on or inspired by real people and events. our studios want to make sure that legislation like this doesn't interfere with their ability to do work like that. when you're talking about, say, nonconsensual pornographic deepfakes, you don't need those exceptions for biopics and satire and parody. that stuff is bad in almost every circumstance you can think of. and i think this narrow, targeted approach is really the right way to go.
>> okay. so, mr. duncan, you've got the best long name in the world.
>> thank you.
>> could you talk about balancing that right of creators with the right of those whose voice or likeness may be at risk, sitting next to one of them right there, with twigs? and how do you believe we should balance that?
>> absolutely. you know, i think we all agree that, obviously, the first amendment has to be protected, and that expressive speech is important. i think, you know, the exceptions that are written into this discussion draft now are not that far off, but i think it's important that they not be expanded upon, nor be broader than necessary. because the fact is, we can't anticipate what this technology is going to do tomorrow. we cannot anticipate every iteration of this. and while there are certain specific uses, or concerns, that are being addressed by legislation like the legislation you referenced, there is a broader need for protection. the example i gave in my opening statement is one. twigs has given examples as they apply to her. and so we do need to have that proper balance. and i am concerned that we are only looking at one side of the first amendment consideration here. the other side of the first amendment consideration is the right that each of us has to
our own freedom of speech, to be able to communicate our ideas, to associate ourselves with ideas that we want to associate with, and not be associated with ideas we disagree with. and that is being really trampled on right now by this unfettered ability of people, without a federal right, to do things like the deepfake ai experience that she experienced, et cetera. and so i do feel like the committee is going to have to work on, you know, defining these exceptions, making sure they are no broader than necessary to keep the legislation viable, but also making sure the exceptions don't swallow up the rule, like the chairman said. if we make them so broad that they swallow up the rule, then all of this work will have been for naught. and the reality is, today is not like 10 years ago. it's not like 30 years ago. this technology is fundamentally different, and what it can do with all of our faces and voices calls out, it screams out, for a remedy that's actually effective.
>> and do you see, maybe anyone, twigs, any of you, mr. kyncl, want to get at this need for a national standard? just because senator blackburn's worked with us on this bill, and is going to be a cosponsor, and they just did the -- of course, in minnesota, we have the dylan act and the prince act. no, i just made that up. but we do have people, as you know, who are fiercely, fiercely independent and protective of their incredible music in our state, and we have a common law in minnesota that's helpful. but it's like, this state, this state. talk about, a few of you, if you want to, just this need to have a national standard and why it's so important.
>> i just want to comment on some of the things from before, which is, as someone who grew up without the first amendment, i value it probably more than those who have, because i do not take it for granted at all. and it seems alive and well in america, because half of the movies that were nominated for oscars, you know, were based on, you know,
existing folks. so saying that any, you know, ai regulation that is respectful of the existing first amendment is not reducing it; it's keeping it as it is, and it's alive and well. so i do think that we need to stay within the limits of the first amendment and not go beyond. as to national regulation, we work with global platforms. we're talking about global platforms, not even national. with global platforms, doing anything state-by-state is a very cumbersome process. twigs's content getting on a platform unauthorized, if we had to fight that on a state-by-state basis, it's untenable. it just doesn't work.
>> very good. mr. davies, that'll be my last one, and then we'll go ahead.
>> thank you. i just need to reinforce what rob has just said, you know? absolutely right. you know, music streaming is global. the success of this is having access to twigs's music from the uk or from tennessee or wherever. so it's high-volume. anything that adds complexity on a state-by-state level is a nightmare for this industry. so we're very strongly in favor of preemption.
>> very good. just the last thing, kind of along those lines, is, don't laugh, just, it'll be very fast; you can put it in writing if you'd like, mr. davies. in january, we heard testimony that generative ai has been used to create unauthorized digital replicas of news anchors making comments, and we have a number of things going on in the journalism area. i have a vested interest; my dad was a journalist for the minneapolis star tribune. but also, with senator kennedy, i have the bill to push for negotiation over content, and to get them reimbursed, mainly from google and facebook, for
the use of this content, something that's going on in australia and canada, and i will not go on. but what steps can streaming services take to ensure that unauthorized digital replicas of journalists are not posted on the streaming platform?
>> senator, if i could follow up with you after i've been briefed on that.
>> okay, excellent. thank you.
>> thank you, senator klobuchar. back to senator tillis for his second round -- twigs, if you'd like to comment.
>> oh, thank you. i'd actually like to go back to mr. sheffner's point about the desire to make very big and financially successful films about artists without consent. i think the problem is, if you're able to use an artist's voice and likeness without consent to tell their life story, you're giving the impression that it's, i guess, the equivalent of an autobiography rather than a biography, you know? and that's the confusion. if you're able to use my voice and my exact face, you're saying, this is what happened from my point of view, and it's not. it's what happened from a team of writers in hollywood that want to overdramatize things and maybe make it more tragic or, you know, more fantastical. and i think that's what makes me really nervous and feel uncomfortable and very vulnerable. i don't think it's fair that even after an artist is deceased, somebody would be able to make a film about their life using them, you know? we can watch a film about a person, a star from the past, and if it's an actor, we know to take it with a pinch of salt. if it is the person themselves, then it just feels too unclear, and not fair, and actually not in, what am i trying to say? not in the best interest of the artist's legacy.
>> thank you.
>> thank you.
senator tillis.
>> thank you, mr. chair. i'm going to be brief. i did have a question for you, mr. crabtree-ireland. under the current draft legislation, individuals only have a right to license out their digital likenesses if they hire an attorney or they're a member of a labor organization. we've gotten some feedback that, to your organization in particular, this is a giveaway, or that it's really directing everybody either into legal counsel or into a union. can you give me examples, other areas in law where this is the case, where you have to engage an attorney or a labor interest to move forward?
>> sure. and i guess i would just say, i don't mean to say just our union; it would be, you know, any collective bargaining representative. but there are a number of examples in our current labor law, labor and employment law, where there are defined worker protections that can then be deviated from through a collective bargaining arrangement, but not through individual contracts. in this case, the proposal, i think, is a little broader, a little more open, because of the option of securing individual representation by an attorney as an alternative, which is not normally present in those kinds of statutes. but i'm sure i can provide, i can't give you a longer list right now, but --
>> for the record, we're going to be submitting questions for the record for all of you, to provide an opportunity for additional information. mr. chair, i'd just like to say it's remarkable: if you take a look at the attendance in the audience, and the engagement from the members here, you're hard-pressed, i mean, on certain subjects, but on technical subjects like this, to have members come twice, or, a lot of times, it demonstrates the interest. twigs, i'm going to end my questions with you. i do believe that congress needs to act, but you need to understand that it's tough to get virtually anything
done, even what appears to be common sense, for the reasons we've talked about. we're going to have constitutional questions we have to address. we have to get to a number of matters. and hopefully we do get it done this year. but in your opening statement, you were emotional, or appeared to be emotional, on one or two points, and i'm just trying, i think, excuse me, i think one of the reasons maybe you got emotional is because this is an existential threat to creators. and i'm trying to figure out how we educate people on the difference between an original creation from a human being and something that was either created or augmented by a machine. and this is more of a societal thing that we have to sort out. at what point is society just prepared to say, boy, that sounds as good, i know it comes from a machine, it's not -- you mentioned something about the investment of your fans, that they've made in you. how do you invest in a relationship with a machine? i mean, we're at an interesting point in time in history where we could have billions of people think the inauthentic creation of a machine is somehow as good as the hard work of a human being. i wonder, and this is a philosophical question, when we lose all the creators, at what point can those machines never possibly match the creative genius of an individual?
>> thank you.
>> it's okay -- which makes no sense to me. i think there's two things here. i feel incredibly lucky to have spent the whole of my teenage years without a smartphone, so i straddle a generation where i
memorized all my friends' numbers; i would walk to my friend's house. if we said we were going to meet at 1:00, i just would have to be there, you know? there was no texting and saying that i was going to be late. i loved my brain back then. i loved how simple it was. i loved what truth was back then. i loved that i was able to think for myself. even where we're at with the internet now, it's so confusing, you know? even if you just want to find a simple news story, we can't. even if you want to find the truth about, you know, whether a food, even, is good for you or bad, we can't, you know? it's just a stream of nonsense. i look at a lot of my friends that have children and teenagers, and their mental health is really struggling. we're looking at young people that have anxiety, that have depression, because they're overwhelmed with information, and lack of truth, and lack of stability. the thing that scares me is that my fans look to me for a north star, a message, a sense of being. my work is something that they can find themselves in. and if you change the narrative of my work, we're just messing with their brains. you know, the solid essence of my work that i spent 10 years developing, if someone can just take it and make up something completely different, i feel so bad, because i'm harming people then, and there would be nothing that i can do about it. i think the way that we can prevent this from happening is putting the power in the hands of the artists, and also putting the power in the hands of people that are there to protect the artists, whether that's third parties like record labels or agents, or lawyers, you know? that's up to the artist to understand and to, you know, sign a contract, if we want to, you know? but i think the way that
i've been experimenting with deepfakes is going to help my fans; it's going to help them understand the nuance of my language across all parts of the world. like, the way that i want to use it is not harmful. i think, inherently, artists just want to express their emotions, and say things that you can't say for yourself. so if you're putting words in our mouths, it's going to be devastating.
>> i also agree. i'm very glad there were no cell phones back when i was a young person, but maybe for other reasons. and polaroids fade, but, but no. i do think that, you know, i'm glad that we're taking up this bill. i do feel strongly that we should do everything we can to try and move it in this congress. if not, then we just have to lean into it and get it done in the near future. but when we have these discussions, it points to all the other societal challenges, challenges for creators, that we need to get right. this technology, i love it. i interact with generative ai for about an hour every day as part of my own study of it, a study that began for me personally back in the 1980s with artificial intelligence. but we've got a lot of work to do. congress has a role to play. but we've got to be very, very careful not to overstep, not to trample the rights of others, and we're going to need your help and your continued engagement to get it right. so thank you all for being here today.
>> senator tillis, thank you. thank you for, again, being a great partner. i have even more questions, but we have come to the end of our time, and you and senator blackburn have been terrific to work with. i am grateful to all of our witnesses, for the way that you brought your skills, your value, your background, your creativity, your voice to this hearing today. we've engaged in a lot of different challenging questions about how we could refine this, how we could narrow it. there have been a lot of
members who participated. for those who did not participate, or those who still have other questions, the record will be open for questions for the record for the witnesses. they are due one week from today, by 5:00 p.m. on may 7th, although, twigs, in your case, two weeks, before we wrap this up in cellophane and move forward. if i could, today's hearing was important to show that when we regulate the use of ai, we have to balance individual privacy rights and first amendment rights in a way that doesn't stifle creativity and innovation with these rapidly developing ai tools. it reinforces what we've heard today, why we need a clear policy to protect the image, voice, and likeness of all individuals from unauthorized ai replicas. and the feedback we heard, and that our staff has received over the last six months, is critical. i look forward to working with my colleagues and cosponsors and the witnesses and the others who attended today to refine this in the next week or two and get to the point where we can introduce it next month, so we move from discussion draft to reality. i think we need to seize the moment and move forward. thank you for your partnership, thank you for your testimony. with that, this hearing is adjourned.