
Hearing on Children's Online Safety | C-SPAN | February 19, 2025, 10:16am-12:27pm EST

10:16 am
...and the second part of my question would be, do you think, with the united states pulling away from aid and possibly shutting down usaid, can china and russia, do they have the capacity to fill in the gap that the united states might leave behind in the developing world? >> yeah, great question. i think the second one is a little bit easier to address. so, not so much aid domestically, but when it comes to the world health organization, for example... >> you can continue to watch this at our website, c-span.org. we will leave this now for live coverage of a hearing on how to strengthen online safety protections for children, hosted by the senate judiciary committee. live coverage on c-span3.
10:17 am
10:18 am
we are live on capitol hill, expecting the senate judiciary committee to gavel in shortly. this hearing is titled big tech and the online child sexual exploitation crisis. among the witnesses: an attorney whose firm specializes in representing victims of sexual abuse and child exploitation. chair grassley: good morning, everybody. in today's digital era, our young people face risks that previous generations could not even have imagined. even though technology brings
10:19 am
amazing opportunities for education and growth, it also opens doors to new dangers that we must confront. this isn't the first hearing we have had on this issue, and unfortunately, it probably won't be the last. we held a hearing on this same subject roughly a year ago, when we brought ceo's from some of the largest social media companies to discuss safety issues on their platforms, and we held a similar hearing a year before that. on the one hand, this is alarming because the problem is getting worse. in 2023, as an instance, the cyber tip line received 36.2
10:20 am
million reports of suspected online child sexual exploitation, a 12% increase over 2022. and even though the numbers have not been published for 2024, they are expected to go up. additionally alarming are the new technologies that are being used by bad actors to exploit children online. predators could use generative ai, for instance, to take normal images of children and manipulate them to create novel forms of csam. in 2024 alone, ncmec reported
10:21 am
over 60,000, almost 61,000, instances of generative artificial intelligence csam. despite this, so far congress has enacted no significant legislation to address these dangers against children. and tech platforms have been unhelpful in our legislative efforts. big tech promises to collaborate, but they are noticeably silent in supporting legislation that would effect meaningful change. in fact, big tech's lobbyists swarm this hill, armed with red herrings and scare tactics, suggesting that we will somehow break the internet if we implement even these very modest reforms.
10:22 am
meanwhile, these tech platforms generate revenues that dwarf the economies of most nations. how do they make so much money? they do it by compromising our data and privacy, and keeping our children's eyes glued to the screens through addictive algorithms. indeed, in one recent study, 46% of teens reported that they are online "almost constantly." this has had severe mental health consequences for adolescents. it has led to a rise in sexual exploitation, as some algorithms have actually connected victims to their abusers.
10:23 am
should such platforms be allowed to profit at the expense of our children's privacy, our children's safety, and our children's health? should they be allowed to contribute to a toxic digital ecosystem without being held accountable? i believe the answer, to everybody, is very clear. when these platforms fail to implement adequate safety measures, they are complicit in the harms that follow, and they should be held accountable. that said, there are some signs of encouragement. just as new technologies are being developed that exacerbate harm to children online, so too are technologies being developed to
10:24 am
combat exploitation. as one example, with ai rapidly evolving, open-source safety tools are being developed to recognize and report csam. some of the witnesses here today will speak to the effectiveness of these tools. additionally, on this committee, a committee with some of the most diverse viewpoints in the united states senate, we have actually advanced bipartisan legislation that addresses legal gaps in our framework, especially those related to the blanket immunity that section 230 provides. last congress, for example, we reported several bills -- online safety bills -- out of
10:25 am
committee, with overwhelming bipartisan support. and there are a number of bills that are being considered and refined this congress, which we will give attention to in due course. that being said, we can't come up with a wise and effective legislative solution without first understanding the nature and scope of the problem, and so that is why we are having this hearing today. our witnesses come from various backgrounds and represent very diverse perspectives, all of which point to the need for our committee to approve legislation and continue our work to keep kids safe. with that, i will open things up to ranking member durbin to give opening remarks. after that, we will hear from senators blackburn and klobuchar. then, i will introduce the
10:26 am
witnesses and swear them in. sen. durbin: i want to personally thank you, senator grassley. this is an unusual change in leadership in this committee, and yet an issue which we took up very seriously in the last few years, on a bipartisan basis, has survived the change, and in fact this hearing is evidence of the determination of the chairman. i would like to join him in the assurance that we are taking this issue very seriously. it was almost exactly two years ago that this committee held a similar hearing. we heard from six witnesses about the harm social media does to our kids and grandkids -- a mom whose son took his own life after he was bullied online. a young woman whose mental and physical health suffered as she chased the unattainable lifestyle depicted on instagram and other apps. experts who told us how big tech designs their platforms to be
10:27 am
addictive, keeping users online for longer and longer times so they can be fed more targeted ads. individuals combating the tidal wave of child sexual abuse material, csam, flooding across the internet. at the end of the hearing, i told the witnesses and the parents of young people in the audience that i was going to roll up my sleeves, get to work, and pass legislation to protect kids from these online harms. that spring, the committee reported five bills that help protect kids online. that included my stop csam act, and i thank senator hawley for joining me in that effort, which we hope to renew soon, along with bipartisan bills from senators graham, blumenthal, klobuchar, cornyn, blackburn, and ossoff. those bills were reported out of this committee unanimously. for anyone who is a newcomer to capitol hill or to this committee, you have the american political spectrum, from one end to the other, on this committee. for us to do anything
10:28 am
unanimously is nothing short of a political miracle. we did it. the senate judiciary committee contains members across the spectrum, from the most conservative republican to the most progressive democrat. it is almost unheard of to pass a bill unanimously, yet we did it five times. one of these bills was later signed into law by president biden. it strengthened the cyber tip line run by the national center for missing and exploited children. as for the rest, different story. big tech opened up its $61.5 million lobbying warchest to make sure these bills never became law. let's be clear -- none of these bills is the silver bullet that would make the internet completely safe for our kids. but they would be significant steps toward finally holding tech companies accountable for the harm they cause, the damages they cause, the deaths they cause.
10:29 am
that's why the tech companies opposed them as strongly as they did. they did not do it publicly. publicly, they said, "it's such a great idea." privately, they beat the hell out of us. hearings under oath produced some results. several companies implemented child safety improvements just days before their ceo's came to testify. meta ceo mark zuckerberg, under questioning from senator hawley, gave a long apology to the parents his platforms had hurt. apologies and too-late reforms are simply not enough. the dozens of parents and survivors in that room, and the thousands impacted across the country, demand more. i plan to follow through. in the coming weeks, senator hawley and i will reintroduce the stop csam act, which will finally open the courthouse door to families whose children have
10:30 am
been victimized due to big tech's failure to safeguard their online platforms. i hope senator grassley will help me schedule a timely markup on that bill. this week, i will join senators graham, whitehouse, hawley, klobuchar, and blackburn to introduce a bill to sunset section 230. this is long overdue. section 230 and the legal immunity it provides to big tech have been on the books since 1996. to the extent this protection was ever needed, its usefulness for this so-called fledgling industry has long since passed. i have no doubt it will finally make the tech industry legally accountable for the damage they're causing. they ought to face the same liability as
10:31 am
every other industry in america. just last congress, a bill was killed: the kids online safety act, a bill introduced by senators blumenthal and blackburn. it passed the senate 91-3, but big tech killed it in the house. they couldn't even get it up for a vote. the national center for missing and exploited children receives 100,000 reports to its cybertip line every single day. it's not just a statistic. each of these reports involves a victim. it could be anything from images of a toddler being raped to a teenage girl being encouraged to commit suicide. 100,000 reports every single day.
10:32 am
we cannot wait, we have to move. i hope the public demands congress finally do something. sen. blackburn: thank you, mr. chairman, and i want to say thank you to our witnesses for being here today. mr. guffey, we appreciate that you are here and sharing your story. mr. chairman, you mentioned that it was over a year ago that we had tech execs in front of us and that nothing much has changed. that is the tragic part of this situation, that nothing much has changed. there has been window dressing, ads run saying look at us, look at what we're doing, but unfortunately there is no enforcement to this. that is why it is still
10:33 am
dangerous for kids to be online. they're still facing online threats, exposure, sexual exploitation, drug trafficking, promotion of suicide, eating disorders, and the thing that is so interesting is, in the physical world, there are laws against this. it is only in the virtual space that it remains the wild west, and our children can be attacked every single day, non-stop, 24/7, 365. it is long overdue, and the kids online safety act that senator blumenthal and i have worked on for years now has been mentioned already this morning, and there is such a broad bipartisan coalition, whether it's parents,
10:34 am
principals, teachers, pediatricians, child psychologists, even teens themselves have come to us and have said something needs to be done about this. we have had companies like microsoft, x, and snap who supported this bill. unfortunately, kids are still being harmed online. i talked to a mom recently whose child died. they met somebody online who sold them what was supposed to be a xanax. they met them on snap. they took what they thought was a xanax and they died. it was fentanyl. so these are the dangers that are there, and while there is broad bipartisan support, senator grassley mentioned the lobbying efforts of some of the
10:35 am
big tech firms and how they went with distortions and lies to the house, and this bill did not get through. so it is time to stop this and get it passed. now, senator durbin mentioned the bills we sent out of committee here last year. there was one that got signed into law, and it was the bill that senator ossoff and i did, the report act. it deals with ncmec's cybertip line and increases the time that evidence submitted to ncmec has to be preserved, and it gives law enforcement more time to investigate, to get these criminals into court and get them locked up. we still have so much work to do. now, senator klobuchar and i are going to lead the privacy, technology and the law subcommittee, and these
10:36 am
issues will be coming before us. we have plenty of work to do. we're looking forward. mr. chairman, i look forward to convening this committee, working to make certain that we are pushing this legislation, that we are going to protect our children in the virtual space. thank you, mr. chairman. sen. klobuchar: thank you so much, mr. chairman, and i am truly looking forward to working with senator blackburn on this important subcommittee. senator lee and i have served on this committee for a long time, but i think right now the possibility of moving on these bills is going to be a very positive development. as senator blackburn just pointed out, despite the strong support we've had from senators durbin and grassley, and senator graham when he chaired this committee or was
10:37 am
the ranking member on this committee, we've continued to run into roadblocks to passing these laws, and it's getting absolutely absurd. senator grassley is well aware of the antitrust tech bill that he and i lead. hundreds of millions were spent against it in tv ads, and despite other countries agreeing to some of these consumer protections, that did not happen in america. and i think this piece of it, whether it's instagram's promotion of content that encourages eating disorders, a.i.-generated pornographic deepfakes, or the tragic stories of kids losing their lives to fentanyl-laced pills, will likely lead the way as we continue to push our antitrust and privacy bills. this morning we heard from
10:38 am
bridgette of hastings, minnesota. her son was dealing with pain and took what he thought was a percocet, but it was a pill laced with fentanyl. he died at age 19. for too long, companies have turned a blind eye when young children see their content and criminals get on, increasing the risk of mental illness, addiction, exploitation and even suicide among kids. i'll never forget the testimony of the f.b.i. director telling us that in just one year, i believe it was 2023, over 20 kids had committed suicide just because of the pornography and pictures that had been put out there when they were innocently
10:39 am
sending a picture to what they thought was a girlfriend or boyfriend. i'm hopeful this hearing will be the beginning of actually passing these bills into law. representative guffey, you and i met through senator cruz and the bill we have. and we have an additional bill with senator cornyn that's really important, the shield act. and as you know, the threat of dissemination alone can be tragic, especially for kids. we need to enact the kids online safety act, which passed the senate on a 91-3 vote. some of these are stalled out in the house. we need to get the federal rules in place for safeguarding our data. according to a recent study, social media platforms generated
10:40 am
$11 billion from ads directed at kids. i am supportive, as mentioned by senator durbin, of the legislation that he and senators graham and hawley and many others have introduced to open the courtroom doors to those harmed by social media by making those reforms to section 230. that legislation was enacted long before any of this was going on, and somehow with respect to other industries we've been able to make smart decisions to put more safety rules in place. ask those passengers who were on that plane in toronto that went upside down about the safety rules in place. by doing nothing, instead of reaching some reasonable accommodations or settlements or things we can do on legislation,
10:41 am
we let them run wild at the expense of our kids' lives. thank you. >> when you consider five bills got out of this committee last congress, and over the last few years congress has only been in session about two and a half days a week -- this is supposed to be a new regime. i'm not sure that it is, and i would hope that some of you folks on the democrat side would push the republicans to make sure we keep the senate in session more than two and a half days a week so we can get some of this done, because we had hardly any important legislation the last two years. we were basically just a confirming body. take that -- i hope you enjoyed doing that, like i enjoy complaining, because we were only
10:42 am
meeting two and a half days a week when the democrats controlled the senate. i want to introduce our guests today. the first witness, mr. brandon guffey. you serve now in the south carolina house of representatives. following the tragic loss of your son gavin, you became an advocate for mental health awareness and combating online crimes, and as senator blackburn said, we're sorry for the loss of your son, mr. guffey. it's probably hard for you to be here to talk about it, but thank you for being here. next we have ms. carrie goldberg, a plaintiff's attorney and founder of the law firm c.a. goldberg, pllc.
10:43 am
she specializes in child abuse, online harassment and other forms of digital abuse. professor mary graw leary is a law professor at the catholic university of america. professor leary directs the law school's modern prosecution program, and she focuses on exploitation of women and children. professor leary has an upcoming article that dives deeply into section 230 and its role in the spread of child sexual abuse material. mr. john pizzuro is c.e.o. of raven. he is a former law
10:44 am
enforcement officer. mr. pizzuro is a former commander of the new jersey internet crimes against children task force program. mr. stephen balkam is c.e.o. and founder of the family online safety institute. this international nonprofit is dedicated to making the internet safer for kids. before founding the institute in 2007, mr. balkam spent 30 years in the nonprofit sector championing online safety. i would like to ask you to stand
10:45 am
and be sworn. raise your right hand. do you swear or affirm that the testimony you're about to give before this committee will be the truth, the whole truth and nothing but the truth, so help you god? they have all answered affirmatively. mr. guffey, we'll start with you and go from my left to my right. >> thank you, mr. chairman, distinguished senators. thank you for the opportunity to testify today. my name is representative brandon guffey, and i'm here to share why holding big tech companies responsible is now my life's mission. sometimes god sends you on a path you never thought you'd be on. in july of 2022, i lost my oldest son, gavin guffey, to
10:46 am
suicide. in an online post a week prior to his death, he said, looking up to where my head should have been years ago, jesus and his word has given me a high none other can compare to, his love. he sent it out on a black screen to his friends and his younger brother, cohen guffey, who is with me here today. at 1:40 a.m., gavin took his life. we quickly learned that gavin was contacted on instagram around midnight. he told his friends he would jump off the game to chat with her. in just an hour, my son was gone. the predator was released two weeks ago from nigeria. the predator not only attacked my son gavin, who was 17, but
10:47 am
also began to extort my 16-year-old son, my 15-year-old cousin, and then myself. one of the posts read, did your son beg for his life? meta removed the profile that attacked gavin but left up the ones that the predator used, even after meta was fully aware of this predator. i vowed from that moment that i would make it my life's mission to protect children online and would not stop. i was shortly thereafter elected to the south carolina house, and within four months successfully passed what is now known as gavin's law, so that every kid at least has to have some awareness. i've worked with many states on similar legislation. i started a nonprofit speaking to teens about mental health and the dangers from big tech. i sold my businesses and went to work for a tech company that
10:48 am
provides tools to protect children. i've also become an advocate on the hill, urging members to see this for what it is: the greatest threat to the next generation. i've seen big tech's lobby fight us every inch of the way and congress cave instead of listening to we, the people. i've watched a bill go to the house where it was refused a hearing. senators blackburn and coons have the no fakes act. these are great bills. senators cruz and klobuchar led the take it down act, and it has already passed the senate. i'd like nothing more than to be proven wrong about the inertia of congress, and to have the house pass these acts soon. please don't let us down again. i've witnessed over 40 teens
10:49 am
take their lives since gavin due to this extortion. will it take one of your own children or grandchildren for you to finally get fed up enough to move? big tech is the big tobacco of this generation. we see groups give statistics over and over, parents of survivors knock on your doors yearly, and we watch companies spend millions lobbying, fighting us in court and continuously absolving themselves of responsibility. in this chamber last january, i stood holding a picture of gavin while mark zuckerberg delivered a forced apology. big tech will never change without enforcement. meta pulled down 63,000 accounts
10:50 am
in one day, from one country, just from lagos, nigeria, and just off of instagram. did they pull those down to actually help our children, and if so, why haven't they done more since? or was it a p.r. stunt? i beg to say it was nothing more than a p.r. stunt so they could get a pat on the back, and they have done nothing since. i got a bit off track, but i want to focus on my main message to big tech. as lawmakers, i think we have to say either get online or get offline, and right now we have too many politicians making decisions based on their next election and not enough leaders making decisions based on the next generation. are you talkers or leaders? if we can't protect the next generation, what are we even fighting for?
10:51 am
tomorrow needs you, and our children need you now. >> chair grassley, ranking member durbin, and members of the committee, my name is carrie goldberg. i'm the originating attorney in a case against snap where our clients' children were matched with drug dealers and sold counterfeit fentanyl pills that killed them. the case has 90 families from all over the country, including families you heard from last week. i'm also joined by my client amy neville, the mother of 14-year-old alexander neville. in another case against snap, criminals obtain csam and blackmail and extort
10:52 am
kids with it. yesterday, the ninth circuit dismissed one of my cases, representing a severely autistic boy who at age 15 was recommended to four different pedophiles who raped him over four consecutive days. in court, grindr's lawyers said they had no duty to restrict children's access. in another case of mine, a 13-year-old, l.s., was lured on the site bandlab, another site with no age restrictions. she thought she was meeting a 17-year-old boy, but it turned out to be 40-year-old noah madrano from portland, oregon. he posted openly on his music sharing platform songs about her, one called pedophile in a minor. on june 24th, 2022,
10:53 am
he drove 15 hours to her home, abducted her on the way to school, stuffed her in the trunk of the car, and raped and abused her for eight days. bandlab refused to provide law enforcement with key information that could have led to her faster rescue. they wanted to respect madrano's privacy, they said. finally, i represent the family of 16-year-old aidan from colorado, who in 2020 discovered a website that glorifies suicide and learned about a product he could get on amazon, have delivered to him, and use to end his life. two months later, his grieving mother exchanged 15 messages with amazon telling them about it. and yet amazon continued to promote, sell and deliver it for 26 more months.
10:54 am
i now represent 27 other families who bought it after amazon sold it to aidan and heard from his mother. tech has two main defenses: section 230 and that they didn't know. i was here a year ago with my clients, including amy, when this committee powerfully told the tech leaders that you were done with discussions and you wanted solutions. families want legislation like section 230 reform, the defiance act, and the shield act. they want laws that increase accountability, that create protection boards, that create procedures to contest a platform's failure to remove csam, and families want
10:55 am
remedies against platforms when they've increased the risk of harm. take my case representing a.m. at age 11, a.m. was living a normal life in a normal town in michigan when she went to a sleepover and discovered a website that matches strangers for private live streaming. it matched her with a man who made her his online sex slave for three years, extorting her, keeping her at the beck and call of him and his friends to perform for them, sometimes interrupting her at the dinner table or at school, even forcing her to recruit more kids. his home, which he shared with his wife's day care, was raided. in that case, i could not claim they knew who the offender was,
10:56 am
but instead pled the knowledge they did have. i pointed to criminal cases, articles, exposes, academic journals. as a result of how we pled the case, we advanced into discovery and acquired 60,000 documents exposing the exploitation of children. we are at a consensus today. we are all here to not repeat history. section 230 was supposed to incentivize responsible content moderation. instead, it did the opposite, and as we look into the future, on behalf of the victims i represent, we are here to present what we know about the harms. thank you. >> thank you, chair grassley, ranking member durbin, and all the members of this committee.
10:57 am
as has been mentioned, i'm really grateful for all the work this committee has done on this issue. the experience our children are having in the digital space is one fraught with danger for them, and one might want to ask, why do you have to work so hard? why do you have to keep passing these laws? why are the laws that congress has had on the books regarding exploitive crimes not working? there are lots of answers, but the common thread is section 230 of the communications decency act, which has been transformed into what i label a de facto near absolute immunity regime. and what i mean by that is exactly what ms. goldberg just said. this was a law that was designed to incentivize platforms toward protection and instead has incentivized them to harm.
10:58 am
i want to make five points. the first two are what i call framing principles. when one reviews the text, history, and structure of section 230 of the communications decency act, it is clear that this is not a stand-alone law protecting freedom of the internet, as tech and its surrogates will try to argue. it is a law born of the landscape of child protection. there is no question the senate with the decency act and the house with the internet freedom and family empowerment act were wrestling with the same question: how, as you look at the telecommunications act, how could you as congress have a safer internet and other media for youth? not whether but how. first point. second point: section 230 of the communications decency act must be regarded as an
10:59 am
experiment. because when you look at the promises tech made back in 1996, and when you look at the supporters of the bill in the house, what you see is they represented to you and to america that this would be a way in which we could protect our children. that was their claim, their promise. point number three: the experiment has failed, for all the reasons that have been said already. and why has it failed? i would say to you, in addition to the lobbying that happens here on capitol hill, the transformation of section 230 into a law that incentivizes harm was not by accident. it wasn't something that just emerged from the internet. it was a systematic effort by tech and its surrogates to litigate this throughout the country, and they went across the country over 30 years arguing not for the narrow, limited
11:00 am
protection for good samaritans that the statute provides, but rather broad immunity. interestingly, immunity is nowhere in the text of section 230 of the decency act, as a side note. and that result has had consequences. to highlight one that's been said, 99,000 or 100,000 reports will happen today to the cyber tip line. it also has important effects in the courtroom that ms. goldberg alluded to. i want to highlight a couple of them. this being an immunity and not a defense is essential for two important reasons. first, as an immunity, these cases are thrown out at a motion to dismiss. there is no access to discovery. when we say victims, survivors, and state attorneys general are shut out of the courtroom, we don't mean it is hard for these
11:01 am
cases. we mean they are shut out of the courtroom. they do not have their day in court, notwithstanding the harm they experienced. i label this reality the dual danger. first, that shield has allowed platforms to engage in a list of criminal activities having nothing to do with publishing. that has allowed this industry to grow to a massive scale, where one individual or one company can cause massive harm, as we have heard. the other part of that dual danger is that because it is an immunity, there is no access to discovery. there is no way to look under the hood of this incredibly dangerous industry. there are no guardrails, and that means, as senator klobuchar pointed out, and what it tells us as i wrap up, is there are no guardrails against the harm that these folks will experience.
11:02 am
so, i offer some suggestions for reform in my papers. the key thing is to keep the good samaritan protection that section 230 has, but to get rid of the c1 protection that has been so distorted to cause harm. i would encourage the senate to listen to the words of justice thomas, where he stated, make no mistake about it, there is danger in delay. that danger we can quantify. if we accept the 99,000 reports that will be made today, that means 12,375 reports will come in during this hearing. in the last five minutes i have spoken, there have been 344 reports. if that's not reason enough to act, i don't know what is. thank you, chair. >> chairman grassley, ranking member durbin, and members of
11:03 am
the committee. as the ceo of raven, an organization transforming the nation's response to child exploitation, i am here to urge decisive legislative action. despite multiple testimonies before congress, progress has been slow, hindered by special interest groups and financial incentives that favor the status quo. we must prioritize our children's safety, and support those who protect them, above all else. new threats continue to emerge while old ones remain unaddressed. artificial intelligence now enables offenders to turn regular images of children into explicit content, to create images of children who do not exist, and to groom children at scale. offenders exploit children for financial gain in addition to
11:04 am
sexual gratification. yet legislative inaction allows this crisis to persist. platforms' voluntary cooperation with law enforcement is minimal, allowing offenders to continue to exploit children with impunity. in 2023, for example, there were 36 million cyber tips, yet apple, holding a 57% market share in the u.s., only reported 275. according to investigators in the field, discord has notified users of subpoenas, enabling offenders to erase evidence before law enforcement can act and allowing offenders to continue to target our children. electronic service providers permit offenders to rejoin platforms under new aliases with the same ip address, while failing to block foreign
11:05 am
ip addresses used for extortion. this leaves our children unprotected. poor moderation, the lack of parental controls, weak age identification and inadequate safety measures further expose children to these dangers. social media algorithms push harmful content. ai-powered grooming will allow offenders to manipulate children at scale. troublingly, even chatgpt-like tools can provide information like grooming tactics when queries are framed in innocent ways. these dangers extend beyond child exploitation to drug access, to fentanyl and illicit substances. law enforcement is overwhelmed and under-resourced. operations have been highly successful, but the increasing volume of cyber
11:06 am
tips has made proactive investigations nearly impossible. in the u.s., there are 229,000 ip addresses right now trading peer-to-peer images of known child sexual abuse material, and only 923 are being worked. the mental toll on those who investigate these crimes is severe. prosecutors and law enforcement officers are exposed daily to horrific content, leading to burnout and ptsd. we must provide them with wellness resources to ensure they can continue their critical work. as a retired new jersey state police commander, i have seen firsthand what can happen. despite its critical role, the
11:07 am
internet crimes against children task force program has been underfunded despite being responsible for most of the child exploitation investigations in the u.s. while authorized for $60 million in 2008, only $31.8 million has been appropriated. that's $522,000 per task force per year to investigate child exploitation. that's why i urge everyone here to cosponsor the protect our children authorization act of 2025. children are our most valuable resource, and their victimization has lasting consequences on society. raven stands ready to collaborate with the members of the senate, the trump administration and the ceos of big tech to develop a solution. the phrase "talk is cheap" is 100% accurate. action is the only remedy. how many of our children and
11:08 am
those who protect them will be impacted as a result of our inaction and debate? make no mistake, offenders are winning, children are suffering, and those fighting to protect them are left to struggle without the support they need to rescue victims, hold offenders accountable, and bolster their own mental health in the process. legislative action is overdue. the solutions are within your power. our children are counting on you, and i am counting on you. thank you very much. >> good morning, chairman grassley, ranking member durbin, and distinguished members of the committee. thank you very much for the opportunity to speak today. my name is stephen balkam. i am the founder and ceo of the family online safety institute. we work to create a safer world for
11:09 am
children and families. i am here as a father and a newly-minted grandfather. chairman, this is my third time testifying before this committee, having appeared in july of 1995 at the committee hearing called cyberporn and children. while much has changed, our mission remains the same. we believe in a three-pronged approach to online safety: enlightened public policy, industry best practices and good digital parenting. our goal is to create protections for kids as well as empower young people to navigate digital spaces safely and responsibly. we want to protect kids on the internet, not from it. parents of younger children should have the strongest protections possible, including easy to find and easy to use parental controls. as kids grow, our role as parents shifts from being
11:10 am
helicopter parents to co-pilots, guiding them as they build digital resilience. research shows that teens value online safety tools like blocking, muting, reporting, and privacy settings. teaching them to use these effectively fosters independence and self-regulation. we have found that empowerment is often the best form of protection. we must prepare young people to engage safely and thoughtfully with the digital world, equipping them with digital literacy and an understanding of their rights and responsibilities. now, recently, there have been calls to ban young people from social media and other online spaces. blanket bans deprive children of the positive experiences they may have, are difficult to enforce, and open up too many possible unintended consequences. after all, children have rights,
11:11 am
including the right to safely access the web for information, free expression, and connection with others. instead of blanket bans, we need thoughtful restrictions that include input from young people and that account for children's evolving maturity. while technical solutions such as age assurance are improving, there is no universally approved system as yet. it is challenging to get the balance between safety, privacy and effectiveness right. as i said recently at our annual conference in front of 350 industry leaders, quote, "you can and must do better to create easy to find and easy to use controls for parents and online safety tools for teens and young people. you can and must do better to publicize and promote those
11:12 am
controls. you can and must do better to collaborate with each other to harmonize your tools across the system so parents and teens are not overwhelmed across countless apps and games and websites." congress has taken some important steps in this space, passing the camra act three years ago, which funds essential research on children's development and well-being, but there is still more work to be done. federal action is critical because states are beginning to fill the gaps with their own online safety laws. unfortunately, even the most well-intentioned laws often face legal challenges and create a fragmented regulatory landscape. a strong federal framework would provide clarity while allowing states to build upon it. so, congress has the opportunity
11:13 am
to lead with balanced and thoughtful policies, including passing a comprehensive data privacy law, funding ongoing research to inform evidence-based policymaking, prioritizing specific, targeted bills like the take it down act and the kids off social media act, encouraging industry cooperation to simplify parental controls, rejecting blanket bans in favor of thoughtful restrictions that include young people's input, and, critically, supporting digital literacy programs to build resilience in young users. so, to conclude, let me challenge us to reimagine what online safety can look like, not just as a range of restrictions but as a foundation for resilience, confidence, and opportunity. thank you, and i look forward to your questions.
11:14 am
>> chairman grassley: thank you for your testimony. we'll have five-minute rounds of questions. ai has opened up new possibilities for bad actors to generate novel forms of csam. in fact, one recent report found over 3,500 ai-generated csam images were posted in a single dark web forum over a nine-month period. could you explain the challenges that ai-generated content poses for law enforcement? >> mr. pizzuro: well, there are a couple of things. what is ai and what is not? i can take my phone and take a picture of the senator and i can age progress them. you can be a 40-year-old male and i can make you a 21-year-old female, and i can make you a 10-year-old girl. with ai within these apps,
11:15 am
i can actually nudify those images. i don't need to groom someone right now. i can get an image off the clear web in order to do that. those are the complexities, and the challenge will be how do we determine who is a real victim and who is not in a lot of instances. >> chairman grassley: thank you. professor leary, i heard your points, and i am not asking you to repeat any of them. how would you advise reforming section 230 in light of the current online ecosystem? >> professor leary: i would keep the protections that
11:16 am
incentivize platforms to be able to remove harmful materials from their platforms without being sued. the c1 part of the statute should be removed. as it has been pointed out by so many of you in your opening statements, it serves no purpose, if it ever did. now, a myth has been created that it somehow created the internet and that somehow the internet will break without it. that's simply not true. if it were ever true, this is no longer a business that needs that kind of support. instead, it needs to be treated like every other business. another thing i would encourage the senate to do with section 230 is listen to the state attorneys general, who have repeatedly written and asked congress to include the ability for them to enforce their state laws, which courts have ruled is something they cannot do when tech has argued for an expansive
11:17 am
interpretation of section 230. it is a states' rights issue, and the entire architecture of combating exploitation of our children involves prosecution, protection, and prevention. within that, it involves multiple pressure points: litigation, state prosecution and federal prosecution. that, i think, would be an important amendment. >> chairman grassley: it may take a long time for section 230 to be reformed, and there are other things that can slow the process up. beyond that reform and liability for big tech, what steps could companies take to protect children online today? >> mr. pizzuro: one of the
11:18 am
things: they notify users when they get legal process. those are internal policies they could change. they know ip addresses. i get banned, i create a new user name, but there are ip addresses, and ip addresses beyond the scope of the u.s. from which children are targeted here. these are things that the companies actually know and can do something about. >> chairman grassley: professor leary, as we continue workshopping bills in this committee, do you believe we should pursue a recklessness standard or a knowing standard, and what are the pros and cons? >> professor leary: senator
11:19 am
durbin referred to the red herring that some low standard will expose these businesses to an onslaught of litigation. a couple of comments. most businesses function having to act responsibly. anybody who says that recklessness is an easy standard to meet, i invite to come to my criminal law class and meet my criminal law students, who will be able to tell you the definition of recklessness and will tell you that it is challenging. specifically, it is a conscious disregard of not just a risk, but a substantial and unjustifiable risk. that's the definition of recklessness in the criminal context, and it can be used in other contexts as well. that requires not just an objective measure but a level of subjectivity; it is referred to as risk creation. that kind of standard is hardly
11:20 am
a day in the park for litigants. it is still quite challenging, and that's why it is a far better standard than knowingly, in my opinion. >> chairman grassley: okay, senator durbin. >> senator durbin: thank you for coming back. i am sorry for the circumstances, and i know your family and friends have joined you coming here today. i recall the first time we met, after a hearing a year or so ago before this committee. thank you very much. >> thank you. >> senator durbin: a week or two ago we had a hearing on fentanyl, and we had another parent of a victim whose son took a pill that turned out to be laced with fentanyl and it took his life. this is a life or death situation that we are dealing with here, and we are living it still. we have to keep it in that context.
11:21 am
professor leary, i am struck by one of the statements that you have given to the committee, that this notion of treating section 230 as an immunity as opposed to a defense precludes evidence being gathered and discovery taking place. you say in your remarks that it diminishes our knowledge of the actual goings-on of the companies. i recall what was being said, that this is bigger than tobacco. i know that issue. i introduced a little bill to ban smoking on airplanes. it passed. it triggered conversations and a
11:22 am
discovery process, and we gathered information and did something significant. i would like you to expand, as you will, on how this standard precludes our knowledge of what's going on in big tech and their response to this challenge. i think the gathering of that information from the tobacco companies, the demonstration of them lying to the public, for example, led to a downfall. i think the same can be true here. >> professor leary: thank you, senator. i think you are 100% correct on that. ordinarily, there is a period of discovery beforehand. >> senator durbin: like ordinary negligence. >> professor leary: exactly. they can get information to build their case, or defendants can provide
11:23 am
information which may exculpate them. but prior to discovery, these platforms are coming into court and saying, judge, we don't have to defend ourselves or litigate this case; you should dismiss this right now, prior to discovery. the only way, i would say, the public has learned a lot of the information about big tech, information that i believe led to some of the other duty of care efforts, has been through what? congressional investigations. i am reminded of the backpage congressional investigation, which was a two-year investigation, or whistleblowers and hearings. that's how we are learning this information. only by getting this information can we make informed legislative choices about big tech. justice thomas commented on this and underscored it when he
11:24 am
said, "look, let's keep in mind if we fix section 230 - "that's not what he said. "plaintiffs must prove the merit of their case and some claims will undoubtedly fail but the state and government will be able to update their law." >> senator durbin: if i can make one point here going back to my analogy and dealing with tobacco issue in a much larger complex. the initial bill that i introduced and passed in the house banned smoking on planes. what are you talking about? if it is dangerous regardless of the duration of the flight. the reason was i had a minnesota congressman who was a smoker holding up my bill and i went to him.
11:25 am
i went to him and i said, "how long can you go without a cigarette?" he said, "two hours." there are things we are dealing with in these bills, compromises, in trying to move the issue forward. don't assume any language is final. it is always in flux and subject to negotiation. thank you for joining us. >> thank you, mr. chairman. i would like to thank all the witnesses for being here and testifying on these important issues. these are not easy issues to talk about, and not easy in particular because of the tragic circumstances that brought you here. representative guffey, i want to express my sympathy to you for the loss of your son. no parent should have to go through
11:26 am
that. i want to commend the courage and the strength you have shown as you continue to fight to protect all children. for what you have gone through, my heart goes out to you and anyone else experiencing the things you are describing. over the past several years, i have strongly advocated on this issue, due to increasing concerns about how social media platforms are operating and how they are utilizing section 230. platforms have enabled child sexual exploitation, promoted harmful challenges to children, and facilitated drug trafficking, in many cases to minors. now, i introduced the protect act on this point, which mandates stricter safeguards on websites posting pornographic content. victims of online exploitation can
11:27 am
face an uphill battle for years, struggling to get online platforms to remove images that were non-consensually obtained. the bill would require platforms to verify the age of and obtain verified consent forms from individuals uploading or appearing in content. the bill would require tech companies to take stronger measures to prevent exploitation occurring on their platforms and to remove child sexual exploitation material upon receiving notice that the content was uploaded without the required consent. second, i introduced another bill as a complement to the protect act, called the screen act. it would require all pornographic websites to adopt age verification, ensuring children can't access the sites' pornographic content. in the 20 years since the
11:28 am
supreme court last examined this issue, technological advances have demonstrated that prior methods of restricting minors' access to pornography online were ineffective. children ages 12 to 17 have been exposed to pornography. finally, i introduced a third bill, called the app store accountability act, which would prevent underage users from downloading apps with pornography and allow suits against the gatekeepers of the content in question. technology has advanced significantly over the last two decades. modern age verification technology is now the least
11:29 am
restrictive and most effective means through which congress can act to protect our children from exposure to online pornography. ms. goldberg, if it is okay, i would like to start with you. should app stores like the google play store or the apple app store be held accountable for minors' access to such content? >> ms. goldberg: as we have said in our cases against amazon, there are standards of seller negligence. if you know you are selling an unreasonably dangerous product, then there is liability. >> you are saying there would be liability if it sold a tangible object to minors with
11:30 am
knowledge of or reckless disregard for their age. professor leary, do you believe requiring age qualifications for websites, while imposing serious consequences for uploading sexual imagery, are things that would help children? >> professor leary: the words matter, but the idea is that anything that will create friction between children and their exposure to pornography is an important thing. the danger here, and i do not mean to suggest this, the danger i see is tech often directing things away from themselves, right, as though they are the solution, when we have to have a multi-tiered and multi-leveled approach.
11:31 am
all together, that really provides much more protection than one or two approaches. >> chairman grassley: senator klobuchar. >> senator klobuchar: thank you, chairman grassley. it is wonderful to be here and hear your incredible testimony, three of you, i think. i first want to turn to you, representative guffey. i have been watching your family and friends behind you, and i know how difficult this can be. i don't know how anyone could not listen to you. we want to get something done here. can you quickly elaborate on why even the threat of disseminating non-consensual images can be dangerous?
11:32 am
>> mr. guffey: you are taking your most private moment and sending it out to a complete stranger. it is complete vulnerability. i believe in this country we have lost grace, and we too often kick people for the mistakes they make, and we tell our kids everything you do online will stay with you forever. imagine if you took your darkest moment and it was posted online. >> senator klobuchar: why should this be federal? >> mr. guffey: i have supported things such as the protect act and the app store accountability act. section 230 is causing states to go at this in 20 different directions. if 230 is not going to be fixed, the states are fed up with how
11:33 am
congress has been. it would be a whole lot nicer to have a uniform code across the country instead of just protecting children in one state. >> senator klobuchar: thank you. and with the airplane safety rules that i just brought up, we don't have those state by state. it would be difficult to have any results. mr. pizzuro, senator cornyn and i have the shield act, which is important and ahead of its time, and the take it down act requires the platforms to take down non-consensual images immediately, but also makes sure there is criminal liability for those that are posting them. could you talk about why that helps federal law enforcement? >> mr. pizzuro: sure. there are a lot of gaps.
11:34 am
as an investigator, there are areas where i can't prosecute or facilitate things. if you go to rural areas, where there is not a state statute, where there are not really good laws, you are going to need that federal law in that respect. what the shield act does is fill that legislative gap in order for us to do our jobs. >> senator klobuchar: good point. another question, about fentanyl and drug trafficking. the dea found one-third of the fentanyl cases it investigated had ties to social media, and others found teen and young adult deaths can be traced back to social media. those are statistics, and that's lives lost. how can an online platform
11:35 am
contribute to this? >> mr. pizzuro: when there were phone pagers and we were doing cartel cases, that was the advent of technology. now, with these tech companies and the ai algorithms pushing content, that's what kids will see. it does not matter what it is. one of the things that i have asked meta and a lot of these companies: can you explain your algorithm? no one can, and no one will, because again it is about business and pushing that content, and that's what children are seeing. that's why they are at risk. >> senator klobuchar: my last question is for you, ms. goldberg. you have represented thousands of victims that otherwise never would have been represented. can you discuss the challenges you face in getting justice for your clients and why passing a federal law like the shield act would make a
11:36 am
difference? >> ms. goldberg: sure. when i started, there were three states that had laws. it was not until we were testifying about it that people actually realized that the liability needs to be in the hands of offenders, and that there is a responsibility that comes with being the recipient of it. the bigger problem, though, was that the platforms were the ones distributing the content at scale. so, you know, back in the old days, revenge porn could be photocopied and put on a windshield. now, with snapchat and google and meta, one picture can be seen by millions of people. we need the uniformity, like mr. pizzuro was saying. >> senator klobuchar: thank you.
11:37 am
>> chairman grassley: senator klobuchar, thank you. >> senator hawley: thank you, chairman grassley. mr. pizzuro, i will start with you. you know the trends of what we are facing online probably as well as anybody. is that fair to say? >> i would say so. >> senator hawley: would you say there is more of it or less of it? >> mr. pizzuro: hundreds or thousands of percent more. >> senator hawley: in 2023, there were 104 million images and videos of suspected child sexual abuse material uploaded onto the internet, compared to 450,000 in 2004.
11:38 am
450,000 to 104 million in the years for which we have data. here is another statistic. the number of reports of child exploitation material went from 1 million in 2014 to 36.2 million in 2023. so, in other words, it is an enormous explosion. it is absolutely everywhere. let me ask you about some of the remedies for this. if you are a parent, and i am a parent of three young little kids, if you are a parent of a victim of child sexual abuse material, your child's images have been used and they have been exploited, and you have companies hosting that content, recklessly or intentionally or negligently, can you sue them and get it taken down? >> mr. pizzuro: not right now.
11:39 am
you can sue them, but i don't know how successful you will be. >> senator hawley: we may know the abuser, but we have these companies hosting the content and making money distributing it. i will go to the company and i will say, this sexual abuse material, this is my kid, this is online, and i am reporting it to you. i want you to take it down. they don't take it down, and you are telling me i can't go to court? >> mr. pizzuro: you are going to end up losing, and that's part of the problem. >> senator hawley: it is a huge problem. you are exactly correct. the state of the law is i can't go to court and hold these companies accountable. we had testimony a few weeks ago from somebody sitting where you are sitting, a parent whose child was sold drugs, in this case over one of these platforms, over snapchat. this parent reported it, and they did
11:40 am
nothing. the parent said, i am going to sue. snapchat's executives laughed in their face: federal law prohibits you from suing us. let me ask you this. in 2019, facebook was fined $5 billion by the ftc. what they are absolutely terrified of is a parent coming into court and getting in front of a jury
11:41 am
and holding them accountable. that's why it is high time and past time that this congress gave parents the ability to do that. i will say for the three millionth time, until congress gives parents the ability to sue, nothing will change. these companies don't care about fines and regulations. the companies sit here and offer to write regulations -- we promise to comply, we'll write them and we'll comply. they won't comply, and they buy off the regulators. what they fear are juries. that is what senator durbin has done with his bill, which we worked on together, to give parents the right to get into court -- having their day in court is vital. i am proud to be working with him on this. it passed unanimously in this committee last year when he was the chairman. we'll make it stronger this year.
11:42 am
i will say again, there is nothing more important this congress can do to stop this than to give parents and victims the right to get into court to hold these companies accountable. thank you, mr. chairman. >> senator hirono: thank you, mr. chairman, and thank you all for testifying. representative guffey, our hearts go out to you. we have been here many times already, and yes, i agree that we have to do something about section 230. one of the things that professor leary mentioned -- and before i get to that, the enforcement piece is really important, and i just want to note that last week i was questioning mr. blanche, who is president trump's nominee for deputy attorney general. i know that protecting children online is an issue that unifies members of this
11:43 am
committee, as you can see. i am disappointed. i explained that if we want to protect children, the last thing we should do is fire prosecutors who fight child exploitation and impose a hiring freeze that stops them from filling these vacancies. that's what's happening. i think we should note the environment in which we are having this hearing. moreover, there was briefly a funding freeze that cut off funding to the internet crimes against children task forces that fight child exploitation in every state. so, child exploitation is a multifaceted issue. i want to get back to professor leary, who says the states ought
11:44 am
to have the right to go after child exploitation in court and are not able to do so because of section 230. does that cover both criminal as well as civil prosecutions by states? >> professor leary: it has been interpreted that way. the language of section 230 in the communications decency act says that no state may enforce a law inconsistent with it, and courts have interpreted that to mean states can't enforce their criminal laws, which has been attempted in past cases. i would assume the same happens in civil cases under the c1 provision of the statute. >> senator hirono: so you would allow the states to enforce their
11:45 am
own child protection laws? >> professor leary: 100%. my written testimony has a quote from a letter from the national association of attorneys general laying this out again -- for the third time, i believe -- and i don't know how often you can get over 50 attorneys general, including the territories i believe, all in agreement on a point. >> senator hirono: mr. guffey, you believe we need to do something to support the states here? >> mr. guffey: yes, states need to be able to have the tools. >> senator hirono: ms. goldberg, could you provide some background on the case and how section 230 was involved and what you think it
11:46 am
demonstrates about the law around section 230? >> ms. goldberg: yes. it accuses the dating app of advertising to children, using instagram and tiktok with child models in school settings, and luring them onto the dating app. as i said in my complaint, statistics show that 50% of gay kids who were active had their first sexual experience with an adult they met on grindr. grindr has no age verification and just absolutely turns a blind eye to the fact that there are so many kids using the product and being recommended to adults. now, i claimed that grindr was a defective product, because they knew about the problem, as i stated in the lawsuit, and were refusing to institute any sort of age
11:47 am
verification, and they were condoning trafficking. the case got thrown out by the district court, and that was affirmed yesterday by the ninth circuit. >> senator hirono: because of section 230 immunity. >> ms. goldberg: because of the high standard of actual knowledge that they were imposing, which they didn't have to impose. they imposed it on the trafficking claim. >> senator hirono: anyone who gets injured by someone else's action ought to be able to pursue legal remedies, and therefore i agree that we need to remove section 230 immunity somehow. we still have to pay attention to unintended consequences that may flow from that kind of a change, but the status quo is not where we ought to be, because this is a growing problem. thank you, mr. chairman.
11:48 am
>> senator kennedy: representative, i am sorry. your boys are proud of you. you are doing good work. my late father used to tell me that you will never know love until you know the love of a child. i didn't believe him then, but i know now. if something happened to my boy -- i am just so sorry. >> mr. guffey: thank you so much. >> senator kennedy: mr. pizzuro. >> mr. pizzuro: sir. >> senator kennedy: social media is a big part of childhood, isn't it? >> mr. pizzuro: yes.
11:49 am
>> senator kennedy: can we agree that social media has lowered the cost of being an a-hole? >> mr. pizzuro: yes. >> senator kennedy: can we agree that big parts of social media have become a cesspool of sexual exploitation? >> mr. pizzuro: for sure. >> senator kennedy: it has lowered the cost of being a pedophile, hasn't it? >> mr. pizzuro: absolutely. it has made for easy access. >> senator kennedy: you are familiar with the cyber tip line? >> mr. pizzuro: yes. >> senator kennedy: are they
11:50 am
required to report instances of child sexual exploitation to the national center? >> mr. pizzuro: only what they see. >> senator kennedy: social media companies have to report these, but first they have to look, don't they? do they make any money when they look? >> mr. pizzuro: no. look at the apple statistics -- 275 reports came from apple. >> senator kennedy: they are not paid to look. >> mr. pizzuro: no. their interest is more users and more eyeballs. >> senator kennedy: so they can sell more advertisements. is that consistent with their economic interest? once they look and they find it, they have to report it to the national center, right?
11:51 am
>> mr. pizzuro: that's correct. >> senator kennedy: are they paid to report to the national center? >> mr. pizzuro: absolutely not. >> senator kennedy: how many instances of sexual exploitation of children are occurring that are not being looked for or reported by the social media companies? >> mr. pizzuro: i don't have the statistics. there are esps that don't even report, so some over-report and some don't report at all. that's part of the challenge. second, it is voluntary, right? and whatever they give, there is no uniformity in the data. >> senator kennedy: what happens if they don't look or they don't report -- are they punished? >> mr. pizzuro: no. >> senator kennedy: are you familiar with the safer program?
11:52 am
>> mr. pizzuro: a little bit. >> senator kennedy: it uses ai to scan for patterns that may be patterns of sexual exploitation. do the social media platforms all use it? >> mr. pizzuro: i don't know how many do. i don't know if they use that technology, but they should. >> senator kennedy: are they required to use it? >> mr. pizzuro: no. >> senator kennedy: we have got to do something. my last question: do you find it ironic that all these people in big tech, who dreamed about and talked about creating a utopia, have managed to generate more hate and more harm than anyone could ever have possibly
11:53 am
imagined -- all to make money. >> mr. pizzuro: and lots of money they made. >> senator kennedy: do you find it ironic? >> mr. pizzuro: i really do. >> chairman grassley: senator blumenthal. >> senator blumenthal: that may be the only reason -- i am the only one left. >> chairman grassley: i am looking forward to your question. i am going to turn the gavel to senator blackburn. >> senator blumenthal: representative guffey, thank you for being here today. our hearts go out to you. your courage and strength make
11:54 am
an enormous difference. i know how strongly you supported the kids online safety act, and i am deeply grateful for your support and activism, going to louisiana seeking to talk to representative scalise and speaker johnson on behalf of that bill. you wrote an article that i would like to have entered into the record if there is no objection. there seems to be none. when you went to see representatives scalise and johnson, were you given an opportunity to talk to them? >> mr. guffey: no, sir. we did meet with the staff, but coming up here to the hill we were
11:55 am
unable to meet with either of the representatives. >> senator blumenthal: would you like to meet with them? >> mr. guffey: i would love to. i 1000% agree. >> senator blumenthal: why don't you tell us, as a parent but also as an advocate and the author of the article, why you think some of the arguments made against the bill -- based on the supposed free speech thesis -- are incorrect. >> mr. guffey: i believe it is follow the money. if you look at the big tech lobby and the representatives that fight against it, you follow the money and see where it ends up. i also believe there is fear -- as an elected official, i see it myself. you are often worried about what
11:56 am
this will look like, and that's one of the reasons, whenever i was presenting, i used the phrase that we have too many politicians worried about the next election rather than leaders worried about the next generation. it is a false narrative that's put out there. people agree with you, and then they'll turn right around and share a false narrative. >> senator blumenthal: i think the united states senate recognized that false narrative with a strong bipartisan vote here, and we'll have the same kind of support again. i thank senator blackburn, who has been a steadfast partner. i would like to turn to professor leary -- i misattributed the article to representative guffey.
11:57 am
>> professor leary: thank you. first, about free speech -- i believe the article you are referring to is a piece that myself and other scholars wrote, and it really dispelled these arguments about kosa. and in fact, if you look back in history, it is interesting to look at what some in tech have said over the years. i can go back to 2014, when they were making this argument, and we can go back before that to 1996, when they told us the communications decency act was going to ruin free speech. they said it about the save act, and they said it about other bills -- lo and behold, we still have plenty of free speech. the thing to keep in mind about free speech and the first amendment is that the first amendment is designed to help inform us on how to handle these sticky issues. it is not a reason not to legislate, yet that's how it is being used. there is a distinction between
11:58 am
speech and conduct. kosa addresses conduct and not content. the speech argument was particularly misplaced with regard to that piece of legislation. >> senator blumenthal: thank you. because kosa addresses the conduct involved in product design, there is no more limitation on free speech than there is when the federal government regulates the safety of a product's design.
11:59 am
>> senator blackburn: professor leary, i am going to stay right with you for my question. i appreciate so much that you put together the op-ed and the difference you are making there, showing that this was not a free speech infringement. this is, as senator blumenthal said, about product design and, as you said, about conduct. but we know the reason that meta and google and these groups spent millions of dollars lobbying against this. they assigned a dollar value to each and every kid, and i think the dollar value is $270. so our kids are the product, and it is so unseemly. it is absolutely disgusting that
12:00 pm
they devalued the lives of young people in this manner. mr. pizzuro, i want to come to you. senator klobuchar and i have a bill that would establish a database for victims of these crimes and incentivize states to collect and enter and share their data. what we are trying to do is get a full picture of what is happening in each of the 50 states when it comes to human trafficking. we have really had a tough time doing this and finding the people who are behind human trafficking rings. i know your organization, raven,
12:01 pm
has been supportive of the bill. mr. pizzuro: yes. sen. blackburn: i would like you to talk for just a minute about why having a national database is so vitally important to breaking this modern-day slavery apart. mr. pizzuro: the more data we have, the more we are able to understand and see and react, and that is part of the challenge -- as states, we are so fragmented that we are only getting data from certain areas. the more data we are able to actually collect, the more likely we are to be able to put together a comprehensive plan and understand how to go after certain trafficking. sen. blackburn: and i want you to touch for just a moment on the use of ai-generated csam, because what we hear from law enforcement is that they are having to sift through so many images to figure out what is ai-generated and what is actual. mr. pizzuro: that's a challenge. right now a detective
12:02 pm
investigating something can't tell the difference between what is a real image and what is not a real image. the technology exists; i can make those images whenever i want. i can make a child from an adult. the challenge becomes that i can take your images off the clearnet, off social media, off open profiles, and turn that person into a child, or worse, into sexually explicit images. the challenge is going to be that we cannot detect it unless we have the software capabilities to do that, which we don't have. sen. blackburn: i appreciate that. senators coons, klobuchar, tillis, and i introduced the no fakes act to deal with ai-generated voice and visual likeness of individuals. we think that this will play an important role in the remedy for ai-generated csam.
12:03 pm
representative guffey, i would love to get your thoughts on that. mr. guffey: on the no fakes act, i personally love it. i resubmitted a very similar bill in my state. i love the idea of using name, image, and likeness. i think that is a very easy thing to hit. as we talk about ai-generated pornography, one of the problems we have is the argument that this is not a real person, so is it really a crime. bills based on name, image, and likeness protect citizens, as opposed to focusing solely on what the image is. sen. blackburn: let me ask you this -- and congratulations on getting gavin's law passed. mr. guffey: thank you. sen. blackburn: is there a way you can amend provisions of no fakes onto gavin's law and begin
12:04 pm
to expand the protections at the state level? mr. guffey: in south carolina, unfortunately, no -- sen. blackburn: so it is going to be two separate bills -- so you have a group of bills that do this. mr. guffey: yes, ma'am. sen. blackburn: i am over time. i'm going to recognize senator schiff and then turn the gavel to senator moody. sen. schiff: i want to thank you for your advocacy and express condolences for the loss of your son. i can't imagine what you've been through. thank you for taking that trauma and using it to help other families. i have not had the chance as a senator to study the multiple approaches of the various bills, although some i supported in the house. but i wanted to ask you, professor: we established
12:05 pm
section 230 for the reasons you implied -- it was a nascent industry, and they urged us to do so so we would not stifle innovation. they also made the argument that without 230 they would not moderate content, because they would be sued if they did, and that 230 would encourage them to moderate content. there may have been a time when they moderated content, but those days seem to be over. it certainly wasn't enacted because it was believed necessary for the first amendment. the first amendment stands on its own two feet. in the absence of 230, companies could still plead a first amendment defense. what is your preferred approach -- that is, is it a repeal of 230, is it changing it from an immunity to some sort of defense, is it to cabin 230 in some way by
12:06 pm
narrowing the scope? what are the merits of the various approaches? ms. leary: thank you, senator. i would say a couple of things. there was discussion in the deep background about this free internet and nascent industry, and when we look at the policies and the findings at the beginning of section 230 of the communications decency act, there is language to that effect. but i would repeat, the overwhelming background and discussion was about the child protection piece. and therefore i think that, better than repealing the entire thing, is to keep the c2 language, which gives protection to a platform if they remove anything they consider to be obscene, lewd, lascivious,
12:07 pm
excessively violent, or otherwise objectionable. that will protect them. that is all they need. they do not need c1, which has been turned into this de facto near-absolute immunity. adding the states' ability to proceed is an important thing. outside of 230, holding them liable when they host this material -- that has been discussed at length as well. those, and some of the other things i listed -- i don't want to use up too much of your time -- all work together to respond to this complex crime. sen. schiff: counsel, what do you believe would be most helpful in terms of making sure you can get the discovery you need and that we have established the right protections and the right burdens in terms of the platforms?
12:08 pm
ms. goldberg: thank you. i agree we are long overdue to abolish section 230, but what is important is that clients get to discovery so they actually know the extent of the problem. the only way we can do that is if the standard is reasonable for parents to plead. if we have to show that the company knew about that picture, that exact victim, that exact perpetrator, there is no way a client is going to be able to overcome a motion to dismiss and get to discovery. we need to have standards like negligence, which the law already affords in other causes of action. sen. schiff: let me ask this question. i don't think there's any doubt that if the companies devoted the technological capability to trying to solve this problem, they could make enormous gains. they wouldn't be able to eliminate
12:09 pm
the problem altogether, but nonetheless they could prove very effective. what would you propose the new standard be, then? that is, if it is not going to be possible to completely do away with this, the standard can't be perfection, so how would you define the standard of care you would expect the industry to follow, given that it hasn't had to follow any standard under the protections of 230? >> and if you could quickly answer -- ms. goldberg: these are products, so strict liability should apply. if they created a defective product, all users should be able to sue them, without having to prove a duty, if the product injured them. sen. schiff: thank you. sen. moody: thank you, senator. i've been so impressed with
12:10 pm
this committee. i've been a senator all of four weeks, so get ready. i bring experience not just as attorney general, but as a mother of a teenager right now. i'm so impressed with the topics we have focused on, specifically this one. it was shocking that the senate was able to act unanimously on protections for children and then ran right into a house that did not go along with some of those things. we are hoping we can change that. as attorney general, i fought in court against many of the platforms -- i investigated platforms for harms to children. i'm a mother dealing with this. i tell people all the time it is really hard to be one of the first generations of parents trying to parent children through this, and we don't understand what we are doing because we don't understand the technology to the degree they do. when i am going through some of the controls, i often have to ask my kid what that means, which seems to defeat the purpose. but here we are. and while i can break down what
12:11 pm
we are addressing today into privacy concerns and harms to children -- whether that is mental effects, addiction, or materials that they never would have been exposed to in the past but now have ready access to -- one of the things i want to talk about quickly is the access to our children by predators and traffickers. i think this is the third lane that we read about repeatedly in the paper every single day. in my state, from doctors to predators to you name it, they are getting access to children, where in the past we could lock our doors and they were safe at night. in my child's school, a woman was arrested for posing online and molesting children using snapchat and tiktok. when you engage with the platforms, they often deny this is happening. but it is happening.
12:12 pm
and the best people to represent to other parents what is happening to their children at home, when they thought they were safe, are parents themselves. i commend you as a fellow parent, mr. guffey, for taking your pain and channeling that into frustration and anger, because that is what is going to get the attention of lawmakers. we have tried to get the attention of the platforms. we have talked a lot about what needs to be done to force some restrictions rather than waiting on them to act on their own. but we need to talk about what needs to be done through laws. i will open this up to whoever wants to answer this question. what can we do as lawmakers right now to stop predators from getting access to our children? mr. guffey: i want to use this as an example, because the comment was made that we essentially have
12:13 pm
different laws for the outside world than we do for the online world. if i had a storage facility and i stored guns in it, and i said, ok, the majority of my customers are law-abiding citizens, but we also have terrorists and we will store guns for criminals -- and then i told you, as attorney general, that you had to have a digital id to get into the locker -- you would think that is ludicrous. but that is exactly how we treat these companies. if you are housing it, you should be held responsible. nothing is going to change until we open up civil liability -- these are the world's richest companies since the inception of man, and yet they are immune. ms. goldberg: and i would say that if you have designed a product where you are exposing children to predators and you cannot stop that from happening, it is a defective product. all a parent of a victim should have
12:14 pm
to do to sue you is show that you knew about the problem and the extent of it. sen. moody: in your experience -- and i understand that this has happened -- when parents have shown the platforms the harmful material online and demanded they take it down, so they know about it, there have been refusals to take it down. ms. goldberg: absolutely, and those cases get thrown out of court because the online platform says "i didn't know about that specific incident." of course they are not going to know about that specific incident, or they are going to say "i didn't intend to harm that exact child." sen. moody: mr. pizzuro, i know you have law enforcement experience and i'm grateful for that. my husband has a career in law enforcement. what is the number one thing you would recommend that we can do as congress? mr. pizzuro: device-based age verification. if there was a framework for a parent to shut the spigot off and make it easy, rather than going
12:15 pm
through 25 different apps -- the companies have this. we can stop it at the device level. that is where we prevent children from getting some of these images and prevent the offenders from getting access to those children. sen. moody: thank you. since i am the acting chair, i don't want to exceed the boundaries of time. i will turn it over to senator whitehouse. sen. whitehouse: thank you, and i understand i've been given permission to close out the questioning, so don't hesitate to go where you need to be. first of all, ms. goldberg, you said repeal of 230 was long overdue. i'm hoping that day is coming fairly soon, and that a bipartisan bill to do just that will be filed by a group of members from this committee before very long. as you also pointed out, there are standards by which to evaluate the conduct or misconduct of these big platforms that the law already
12:16 pm
affords, and everybody else has to abide by those same standards -- if you are a radio station, if you are a newspaper, if you are a manufacturer, if you are an individual. some of them go back to the english common law that came over with the first settlers. and the idea that what representative guffey described quite well as institutions that are the richest since the inception of man shouldn't be bound by the law -- i adore ron wyden, i think he is a wonderful senator; he put 230 in when these platforms were in people's garages. they have gone from that to being the richest companies since the inception of man. i am not going to forget that phrase of yours, representative guffey, i like it. with no change in congress's response to the original
12:17 pm
rationale for having that section 230 protection, and also a repeated, grotesque failure by these entities to police themselves. it is not as if we are dealing with an array of platforms that have a demonstrated record of meeting the public interest and the safety of their product -- not at all. lawyer to lawyer, ms. goldberg, talk a little bit about when the section 230 defense first kicks in, and what that means in terms of you and your clients actually being able to get discovery, to take a deposition, to find out the truth of what transpired. ms. goldberg: what happens is that i file a lawsuit with all my facts, with everything i can know, even though there is so much asymmetry of knowledge. i don't know the extent to which
12:18 pm
the platform knows about the exact problem or the overall problem. i can just base it on what has happened to my client if they are alive. otherwise i have to go through their parents. immediately within 30 days -- sen. whitehouse: you file your complaint. ms. goldberg: they file a motion to dismiss saying we are just a publishing platform, we are not a product, this is just speech. and then they attempt to get it dismissed. oftentimes judges will do it without oral argument. and then we never get into discovery. so we never get the opportunity to even show or know the extent to which the platform has been tolerating and making money off of this exact harm. we don't have any information about other similar incidents, nothing. sen. whitehouse: it is a vehicle not only for evading responsibility for bad acts, but a vehicle for covering up what actually took place. it would be slightly different if the section 230 dismissal
12:19 pm
motion was something made at trial, for instance. but it is not even that. ms. goldberg: i also believe that even more terrifying to tech than facing a jury eye to eye is discovery. it was discovery that made omegle shut down -- documents showing all of these other incidents of child sexual abuse. they shuttered the platform because they had no defense. sen. whitehouse: discovery is a beautiful thing. senator padilla. sen. padilla: thank you, mr. chair. mr. guffey, i want to begin with you, and i want you to know that my heart goes out to you for your and your family's experience, and i appreciate your willingness to be here to share your testimony. mr. guffey: thank you. sen. padilla: i want to alert my
12:20 pm
colleagues to a relatively new product: character-based chatbot apps. many of these have been flooded with age-inappropriate chatbots, which may cause young users to be exposed to sexual or suggestive ai-generated imagery or conversations. as a father of three school-age children, this is personal. further, conversations with these chatbots can end tragically. since 2023, two individuals have died by suicide following extensive conversations with ai chatbots. the threat, colleagues -- the risk -- is real. mr. guffey, how would you recommend this committee begin to think through the risk posed by this emerging consumer product category? mr. guffey: when it comes to ai, i would have to lean more on
12:21 pm
some of the other panelists up here and their expertise when addressing chatbots. chatbots are something new that i just started really looking into. but on the legal side, i'm not an attorney. i'm an angry parent who tries to throw things against the wall, whereas the attorneys are the ones who have to say this is what will hold up in court and this is what will not. sen. padilla: well, we will have to figure out the legalese, but you have the most important voice given your experience. the chatbot piece is the next generation. mr. guffey: yes, sir. sen. padilla: what children have to contend with today -- we can only imagine what is coming. mr. guffey: that is the exact problem. it is the problem of today, but as tech is evolving, our laws don't move fast enough to keep up. i believe in having that liability and holding these companies responsible for what they are presenting. if we take online
12:22 pm
services, instead of treating them as a service, and simply treat them as a product, we can hold them to consumer protection laws. >> i think we should think about the international context in which this is playing out. the ai summit in paris was called the ai security summit rather than the ai safety summit, which had taken place in the u.k. and, i believe, in korea. there has been a shift away from the prevailing thought that we must make these products safe, and instead, particularly in this administration, there is an urging toward the vast and quick expansion of these tools. i think you have a role, and your colleagues as well, to bring that focus back, and i hope you do. sen. padilla: ms. goldberg, you seem -- ms. goldberg: i do. one of my colleagues is litigating a case against character ai where the bot
12:23 pm
encouraged addictive behavior and ultimately led the child to die by suicide. i think what we will find is that there is a possibility courts will perceive this speech as the corporation's own speech -- and in that case, character ai is owned by google -- and that it won't overcome a section 230 challenge. sen. padilla: very good point, actually. what i will do, in the interest of time, is invite all of you to respond to the same question after the hearing as part of our questions for the record, because i want to get to at least one more topic. i understand senator graham is on his way back as well. last congress we had a hearing similar to this, but instead of you five sitting in front of us testifying, it was the ceos of the five largest social media companies testifying to the committee. and i had the opportunity then
12:24 pm
to ask them each about the parental tools they offer -- all of them offer some sort of parental tool to help minors safely navigate the use of their respective services. i asked them to describe what those tools were, and more specifically what the adoption and use rates of those tools were. you can have tools and protections out there, and you can debate whether they are effective or not. sadly, they either didn't share how widely used these tools were -- didn't provide the data, and they are big into data -- or what data they did share demonstrated to us that the usage rates were actually very low. so the conclusion, unavoidable,
12:25 pm
undeniable, is that the industry isn't doing enough to let parents know what resources are available and isn't investing enough into understanding why these so-called protections aren't being adopted at greater rates. mr. balkam, in your testimony you observed that these controls would better serve minors and their guardians if they were standardized, interoperable, and unified between apps, devices, and brands. how do you think we can make that a reality? mr. balkam: i often use the example of the automobile industry. back in the 1950s and 1960s, if you got out of one car into another, you would not necessarily know where the blinkers or the light switches were. even the logos were different. laws came into place in the 1960s, and now in a rental car you know where the controls are, you know exactly where the lights are, and the symbols
12:26 pm
are all the same. i would like to see the industry come together, ideally voluntarily, but if not then with some coercion, to standardize the parental controls and online safety tools -- the ones that teenagers and young people use to stay private, to report, and to block, oftentimes without their parents' knowledge. let's have a standardized way that teens can keep themselves safe that is not as confusing as what we've got at the moment. sen. padilla: it tends to happen one of two ways. it either gets imposed by some level of government and the industry comes along kicking and screaming, or they can live up to their responsibility and come together as an industry and put forward a model that is transparent and that either works, or at least we can measure it and hold them accountable when
12:27 pm
and where it doesn't. it's been a long morning for all of you. i very much appreciate your participation in today's hearing, the work you do, and the perspectives you offered. i'm told senator graham is not coming after all, so it falls upon me to not just thank all of our witnesses but remind folks that the hearing record will remain open for one week for statements to be submitted into the record. questions for the record may be submitted by senators by 5 p.m. on wednesday, february 26. unless there is anything further from the panel, this hearing is adjourned.
