
Ana Cabrera Reports  MSNBC  January 31, 2024 7:00am-8:00am PST

7:00 am
mess with her right there. as if don't even play, that's a part of me right there, in other words, okay, okay, that's the start of me right there. as if come that day, that's the end of me right there, as if push come to shove, i would fend for her right there. as if come what may, i would die for her right there. >> uh-huh. >> yeah, so i just feel like that's -- >> everybody can relate to that, right? >> love. >> this is amazing. this is the honey, all right? an anthology of contemporary black poets, editor and "new york times" best selling author, kwame alexander, thank you so much for coming on this morning. >> come back. come back. we need more poetry. >> i know, i feel really uplifted. >> and thank you all for your inspiration, especially you. >> thank you. >> appreciate you. >> of course, the rev. >> do you like rev better in the track suit or in the tailored suit? >> hey, it's all evolving. it's evolution. it's always good.
7:01 am
>> exactly. >> all right. thank you all very much. ana cabrera picks up the coverage right now. right now on "ana cabrera reports," five big tech ceos about to face a grilling. they're set to appear on capitol hill in moments. are they prepared to make real change as cases of exploitation online skyrocket? house republicans move closer to something that hasn't been done in more than a century, impeaching a sitting cabinet member. do they have any real chance of ousting dhs secretary alejandro mayorkas with such a slim majority? president biden's plan to retaliate after a drone strike killed three american soldiers in jordan. new details on the response that could last weeks.
7:02 am
thanks so much for being here. it is 10:00 eastern. i'm ana cabrera reporting from new york. we want to take you right to capitol hill where five of the top tech ceos in the world are about to testify before the senate judiciary committee. the topic is one that really matters to so many of us: child safety on social media, and one notable name you might recognize there is mark zuckerberg of meta. this will be his eighth time testifying on the hill. he'll be joined by execs from x, formerly twitter, along with snap, discord and tiktok. child safety online is becoming a bigger and bigger concern. listen to this, reports of child sexual abuse material on online platforms grew from 32 million in 2022 to a record high of more than 36 million reports last year. nbc's ryan nobles and kate snow are joining us from capitol hill, and we're going to talk about the politics of this in just a moment. first let's really talk about the issue they're focused on
7:03 am
today, child safety. and kate, you have done a lot of reporting on how families are impacted. what are the biggest problems right now when it comes to sexual exploitation online? >> reporter: yeah, ana, it is a growing problem on all five of these platforms. let me show you where we are first and i'll talk more about that. this is the line to get into the room right now. i'll take you over here, it is a packed hearing room. it is packed outside. we just had the five ceos walk through the doors. we tried to ask them questions, none of them would answer anything at this point. let me tell you about what we do expect today. there are five ceos here, ana. three of them had to be subpoenaed to be here. they didn't want to come, those being discord and x and snapchat. also here meta and tiktok. we expect a lot of questions about child exploitation.
7:04 am
every parent knows when their kids go online, they worry about their safety. i spoke with someone on monday, a young woman who thought she was talking on snapchat to another 14-year-old. in fact she was talking to an older man who was trying to solicit sexual images from her. i think we have a little bit of sound we can play from ellen and i'll tell you more about what questions we may hear today. >> did he ever send you photos? >> yes. >> inappropriate photos? >> mm-hmm. >> naked photos? >> yes. >> more than once? >> correct. >> so ana, senator dick durbin, the chairman here told us yesterday this is long overdue, that these ceos need to be held to account, that they need to answer some really tough questions. again, the subject of the hearing is child sexual exploitation. we have a lot of people in the room, families who have other issues with social media. for example, those who have lost children after they bought pills containing fentanyl on social
7:05 am
media. so that's another issue that may come up here. a lot of heartbroken parents in that room. i've seen them. they're wearing black, and they're holding up photos of their children, ana. >> okay, we'll keep an eye out for those parents. it's such a really important issue and a potentially fatal issue for some families. ryan, this is one of these rare instances too where there seems to be bipartisanship. we will dip in in just a moment. we know there was just a video played. help us set the scene in terms of what you're hearing from lawmakers in terms of the types of answers they're going to be seeking and what is their goal today? >> reporter: well, you know, it's interesting, ana. this is an issue that has bipartisan support, republicans and democrats do believe that something needs to be done to create laws that can protect children when it comes to these social media platforms, but all the legislation that has been introduced has really gone nowhere. in fact, there's really only been one bill in the past ten
7:06 am
years that has become law that was very narrowly focused on child sex trafficking as it relates to social media. there have been a flurry of bills introduced that have been designed to try and help stem this problem. each one of them has languished in committee. lawmakers will be asking very specific questions about what the law can do to protect children in these spaces. as we often deal with when it comes to tech legislation, we're dealing with a situation where the laws are so far behind the technology, and by the time the laws are implemented, the technology has already moved to the next stage. so what these lawmakers are going to try and learn from these tech ceos is what can be done right now to address this problem. but even if they get those answers, there's still a great deal of skepticism that they'll be able to pass any sort of substantive law in the near future, ana. >> ryan nobles and kate snow.
7:07 am
please stay close and thank you for that reporting. opening statements have begun. let's go ahead and listen in. this is the chairman, dick durbin. >> for predators to access, engage, and groom children for abuse, and the prevalence of csam on x has grown as the company has gutted its trust and safety workforce. today we'll hear from the ceos of those companies. they are not only the tech companies that have contributed to this crisis, they are responsible for many of the dangers our children face online. their design choices, their failures to adequately invest in trust and safety, their constant pursuit of engagement and profit over basic safety have all put our kids and grandkids at risk. coincidentally, coincidentally, several of these companies implemented common sense child safety improvements within the last week, days before their ceos would have to justify their
7:08 am
lack of action before this committee. but the tech industry alone is not to blame for the situation we're in. those of us in congress need to look in the mirror. in 1996, the same year the motorola startac was flying off shelves and years before social media went mainstream, we passed section 230 of the communications decency act. this law immunized the then-fledgling internet platforms from liability for user-generated content. interestingly, only one other industry in america has immunity from civil liability; we'll leave that for another day. for the past 30 years, section 230 has remained largely unchanged, allowing big tech to grow into the most profitable industry in the history of capitalism. over the past year this committee has unanimously
7:09 am
reported five bills that would finally hold tech companies accountable for child sexual exploitation on their platforms. unanimous. take a look at the composition of the membership of the senate judiciary committee and imagine, if you will, if there's anything we could agree on unanimously. these five bills were the object of agreement. one of these bills is critical: it would let victims sue online providers that promote or aid and abet online child sexual exploitation. this stand against online child sexual exploitation is bipartisan and absolutely necessary. let this hearing be a call to action that we need to get kids online safety legislation to the president's desk. i now turn to the ranking member, senator graham. >> thank you, mr. chairman. the republicans will answer the call. all of us, every one of us is ready to work with you and our democratic colleagues on this
7:10 am
committee to prove to the american people that, while washington is certainly broken, there is a ray of hope, and it is here. it lies with your children. after years of working on this issue with you and others, i've come to conclude the following. social media companies, as they are currently designed and operated, are dangerous products. they're destroying lives, threatening democracy itself. these companies must be reined in or the worst is yet to come. brandon guffey is a republican representative from south carolina in the rock hill area. to all the victims who came and showed us photos of your loved ones, don't quit. it's working. you're making a difference. through you we will get to where we need to go so other people won't have to show a photo of
7:11 am
their family. the damage to your family has been done. hopefully we can take your pain and turn it into something positive so nobody else has to hold up a sign. mr. guffey's son got on instagram and was tricked by a group in nigeria that put up a young lady posing as his girlfriend and, as things go at that stage in life, he gave her some photos, compromising sexual photos, and it turned out that she was part of an extortion group in nigeria. they threatened the young man: if you don't give us money, we're going to expose these photos. he gave them money, but it wasn't enough. they kept threatening and he killed himself. they threatened mr. guffey and his son. these are bastards by any known
7:12 am
definition. mr. zuckerberg, you and the companies before us -- i know you don't mean it to be so, but you have blood on your hands. [ cheers and applause ] >> you have a product that's killing people. when we had cigarettes killing people, we did something about it, maybe not enough. you're going to talk about guns, we have the atf. nothing here, there's not a damn thing anybody can do about it. you can't be sued. now, senators blumenthal and blackburn, who have been like the dynamic duo here, have found emails from your company where they warned you about this stuff and you decided not to hire 45 people that could do a better job of policing this. so the bottom line is you can't be sued. you should be, and these emails would be great for punitive damages, but the courtroom's closed to every american abused by all the companies in front of
7:13 am
me. of all the people in america we could give blanket liability protection to, this would be the last group i would pick. [ applause ] it is now time to repeal section 230. this committee is made up of ideologically the most different people you could find. we've come together through your leadership, mr. chairman, to pass five bills to deal with the problem of exploitation of children. i'll talk about them in depth in a little bit. the bottom line is all these bills have met the same fate. they go nowhere. they leave the committee and they die. now, there's another approach: what do you do with dangerous products? you either allow lawsuits, you have statutory protections to protect consumers, or you have a commission of sorts to regulate the industry in question and take your license away if you have a license. none of that exists here.
7:14 am
we live in an america in 2024 where there is no regulatory body dealing with the biggest, most profitable companies in the history of the world. they can't be sued and there's not one law on the books that meaningfully protects the american consumer. other than that we're in a good spot. so here's what i think is going to happen. i think after this hearing today we're going to put a lot of pressure on our colleagues, the republican and democratic leadership of the senate, to let these bills get to the floor for a vote. and i'm going to go down, starting in a couple of weeks, and make unanimous consent requests to do stop csam, do the earn it act, do your bill, do all of the bills, and you can be famous, come and object. i'm going to give you a chance to be famous. elizabeth warren and lindsey graham have almost nothing in common. i promised her i would say that publicly. [ laughter ] the only thing worse than me doing a bill with elizabeth warren is her doing a bill with
7:15 am
me. we have sort of parked that because elizabeth and i see an abuse here that needs to be dealt with. senator durbin and i have different political philosophies, but i appreciate what you've done on this committee. you've been a great partner. to all my democratic partners, thank you very, very much. to my republican colleagues, thank you very, very much. save the applause for when we get a result. this is all talk right now, but there will come a day, if we keep pressing, when we get the right answer for the american people. what is that answer? accountability. now these products have an upside. you've enriched our lives in many ways. mr. zuckerberg, you created a product i use. the idea, i think, when you first came up with this, was to be able to talk to your friends and family and pass on your life. you can talk to your friends and family about good things going on in life and i use it.
7:16 am
we all use it. there's an upside to everything here, but the dark side hasn't been dealt with. it's now time to deal with the dark side because people have taken your idea and they have turned it into a nightmare for the american people. they've turned it into a nightmare for the world at large. tiktok, we had a great discussion about how maybe larry ellison at oracle can protect american data from chinese communist influence, but tiktok, your representative in israel quit the company because tiktok is being used in a way to basically destroy the jewish state. this is not just about individuals. i worry that in 2024 our democracy will be attacked again through these platforms by foreign actors. we're exposed, and ai is just starting. so to my colleagues, we're here for a reason. this committee has a history of
7:17 am
being tough but also doing things that need to be done. this committee has risen to the occasion. there's more that we can do, but to the members of this committee, let's insist that our colleagues rise to the occasion also. let's make sure that in the 118th congress we have votes that would fix this problem. all you can do is cast your vote at the end of the day, but you can urge the system to require others to cast their vote. mr. chairman, i will continue to work with you and everybody on this committee to have a day of reckoning on the floor of the united states senate. thank you. >> thank you, senator graham. today we welcome five witnesses whom i'll introduce now: jason citron, the ceo of discord incorporated; mark zuckerberg, the founder and ceo of meta; evan spiegel, the cofounder and ceo of snap incorporated; shou chew, the ceo of tiktok,
7:18 am
and linda yaccarino, the ceo of x. i am disappointed that our other witnesses did not offer that same degree of cooperation. mr. citron, mr. spiegel, and ms. yaccarino are here pursuant to subpoenas. mr. citron only accepted service of his subpoena after u.s. marshals were sent to discord. i hope this is not a sign of your commitment or lack of commitment to addressing the serious issue before us. after i swear in the witnesses, each witness will have five minutes to make an opening statement, and senators will ask questions in an opening round, each of seven minutes. i expect to take a short break at some point during questioning to allow the witnesses to stretch their legs. if anyone is in need of a break at any point, please let my staff know. before i turn to the witnesses,
7:19 am
i'd also like to take a moment to acknowledge that this hearing has gathered a lot of attention, as we expected. we have a large audience today, the largest i've seen in this room. i want to make clear that, as with other judiciary committee hearings, we ask people to behave appropriately. i know there is high emotion in this room for justifiable reasons. i ask you to please follow the traditions of the committee: no standing, shouting, chanting, or applauding witnesses. disruptions will not be tolerated. anyone who does disrupt the hearing will be asked to leave. the witnesses are here today to address a serious topic. we want to hear what they have to say. i thank you for your cooperation. could all of the witnesses please stand to be sworn in. do you affirm the testimony you are about to give before the committee will be the truth, the whole truth, and nothing but the truth so help you god? let the record reflect that all the witnesses have answered in the affirmative.
7:20 am
mr. citron, please proceed with your opening statement. >> good morning. >> good morning. >> my name is jason citron, and i am the cofounder and ceo of discord. we are an american company with about 800 employees living and working in 33 states. today discord has grown to more than 150 million monthly active users. discord is a communications platform where friends hang out and talk online about shared interests from fantasy sports to writing music to video games. i've been playing video games since i was 5 years old, and as a kid it's how i had fun and found friendship. many of my fondest memories are of playing video games with friends. we built discord so that anyone could build friendships playing video games from minecraft to
7:21 am
wordle and everything in between. games have always brought us together and discord makes that happen today. discord is one of the many services that have revolutionized how we communicate with each other in the different moments of our lives. imessage, zoom, gmail and on and on. they enrich our lives, create communities, accelerate commerce, health care, and education. just like with all technology and tools, there are people who exploit and abuse our platforms for immoral and illegal purposes. all of us here on the panel today and throughout the tech industry have a solemn and urgent responsibility to ensure that everyone who uses our platforms is protected from these criminals both online and off. discord has a special responsibility to do that because a lot of our users are young people.
7:22 am
more than 60% of our active users are between the ages of 13 and 24. it's why safety is built into everything we do. it's essential to our mission and our business. and most of all, this is deeply personal. i'm a dad with two kids. i want discord to be a product that they use and love, and i want them to be safe on discord. i want them to be proud of me for helping to bring this product to the world. that's why i am pleased to be here today to discuss the important topic of the online safety of minors. my written testimony provides a comprehensive overview of our safety programs. here are a few examples of how we protect and empower young people. first, we've put our money into safety. the tech sector has a reputation of larger companies buying smaller ones to increase user numbers and boost financial
7:23 am
results. but the largest acquisition we've ever made at discord was a company called centro pea. it didn't help us expand our market share or improve our bottom line. in fact, because it uses ai to help us identify, ban, and report criminals and bad behavior, it has actually lowered our user count by getting rid of bad actors. second, you've heard of end to end encryption that blocks anyone, including the platform itself from seeing users' communications. it's a feature on dozens of platforms, but not on discord. that's a choice we've made. we don't believe we can fulfill our safety obligations if the text messages of teens are fully encrypted because encryption would block our ability to investigate a serious situation and when appropriate report to law enforcement. third, we have a zero tolerance
7:24 am
policy on child sexual abuse material or csam. we scan images uploaded to discord to detect and block the sharing of this abhorrent material. we've also built an innovative tool, teen safety assist, that blocks explicit images and helps young people easily report unwelcome conversations. we've also developed a new semantic hashing technology for detecting novel forms of csam called clip. and we're sharing this knowledge with other platforms through the tech coalition. we recognize that improving online safety requires all of us to work together, so we partner with nonprofits, law enforcement, and our tech colleagues to stay ahead of the curve in protecting young people online. we want to be the platform that empowers our users to have better online experiences, to build true connections, genuine friendships and to have fun. senators, i sincerely hope today is the beginning of an ongoing dialogue that results in real
7:25 am
improvements in online safety. i look forward to your questions and to helping the committee learn more about discord. >> thank you, mr. citron. mr. zuckerberg. >> chairman durbin, ranking member graham and members of the committee, every day teens and young people do amazing things on our services. they use our apps to create new things, express themselves, explore the world around them, and feel more connected to the people they care about. overall, teens tell us this is a positive part of their lives, but some face challenges online so we work hard to provide parents and teens support and controls to reduce potential harms. being a parent is one of the hardest jobs in the world. technology gives us new ways to communicate with our kids and feel connected to their lives, but it can also make parenting more complicated. it's important to me that our services are positive for everyone who uses them. we are on the side of parents everywhere working hard to raise their kids. over the last eight years, we've
7:26 am
built 30 different tools, resources and features so that parents can set time limits for their teens using our apps, see who they're following, or know if they report someone for bullying. for teens we've added nudges to remind them when they've been using instagram for a while or if it's getting late and they should go to sleep, as well as ways to hide words and people without those people finding out. we put special restrictions on teen accounts on instagram: by default, accounts for under 16 are set to private, have the most restrictive content settings and can't be messaged by adults they don't follow or people they aren't connected to. with so much of our lives spent on mobile devices and social media, it's important to look into the effects on teen mental health and well-being. i take this very seriously. mental health is a complex issue and the existing body of scientific work has not shown a causal link between using social media and young people having worse mental health outcomes. a recent national academies of science report evaluated over 300 studies and found that
7:27 am
research, quote, did not support the conclusion that social media causes changes in adolescent mental health at the population level, end quote. it also suggested that social media can provide significant positive benefits when young people use it to express themselves, explore and connect with others. still, we're going to continue to monitor the research and use it to inform our road map. keeping young people safe online has been a challenge since the internet began, and as criminals evolve their tactics we have to evolve our defenses too. we work closely with law enforcement to find bad actors and help bring them to justice. the difficult reality is no matter how much we invest or how effective our tools are, there is always more to learn and more improvements to make. we remain ready to work with members of this committee, industry, and parents to make the internet safer for everyone. i'm proud of the work that our teams do to improve online child safety on our services and across the entire internet. we have around 40,000 people
7:28 am
overall working on safety and security and we've invested more than $20 billion in this since 2016, including around $5 billion in the last year alone. we have many teams dedicated to child safety and teen well-being and we lead the industry in a lot of areas we're discussing today. we built technology to tackle the worst online risks and share it to help our whole industry get better. we're founding members of take it down, a platform which helps young people prevent their nude images from being spread online. we also go beyond legal requirements and use sophisticated technology to proactively discover abusive material, and as a result, we find and report more inappropriate content than anyone else in the industry. as the national center for missing and exploited children put it this week, meta goes, quote, above and beyond to make sure that there are no portions of their network where this type of activity occurs, end quote. i hope we can have a substantive
7:29 am
discussion today that drives improvements across the industry, including legislation that delivers what parents say they want: a clear system for age verification and control over what apps their kids are using. three out of four parents want app store age verification. and four out of five want parental approval whenever teens download apps. we support this. parents should have the final say on what apps are appropriate for their children and shouldn't have to upload their i.d. every time. that's what app stores are for. we also support setting industry standards on age-appropriate content and limiting signals for advertising to teens, to age and location and not behavior. at the end of the day, we want everyone who uses our services to have safe and positive experiences. before i wrap up, i want to recognize the families who are here today who have lost a loved one or lived through some terrible things that no family should have to endure.
7:30 am
these issues are important for every parent and every platform. i'm committed to continuing to work in these areas and i hope we can make progress today. >> thank you, mr. zuckerberg. mr. spiegel. >> chairman durbin, ranking member graham, and members of the committee, thank you for convening this hearing and for moving forward important legislation to protect children online. i'm evan spiegel, the cofounder and ceo of snap. we created snapchat, an online service that is used by more than 800 million people worldwide to communicate with their friends and family. i know that many of you have been working to protect children online since before snapchat was created, and we are grateful for your long-term dedication to this cause and your willingness to work together to keep our community safe. i want to acknowledge the survivors of online harms and the families who are here today who have suffered the loss of a
7:31 am
loved one. words cannot begin to express the profound sorrow i feel that a service we designed to bring people happiness and joy has been abused to cause harm. i want to be clear that we understand our responsibility to keep our community safe. i also want to recognize the many families who have worked to raise awareness on these issues, pushed for change, and collaborated with lawmakers on important legislation like the cooper davis act, which can help save lives. i started building snapchat with my cofounder bobby murphy when i was 20 years old. we designed snapchat to solve some of the problems we experienced online when we were teenagers. we didn't have an alternative to social media. that meant pictures shared online were permanent, public, and subject to popularity metrics. it didn't feel very good. we built snapchat differently because we wanted a new way to communicate with our friends that was fast, fun, and private. a picture is worth a thousand words so people communicate with images and videos on snapchat. we don't have public likes or comments when you share your
7:32 am
story with friends. snapchat is private by default, meaning that people need to opt in to add friends and choose who can contact them. when we built snapchat, we chose to have the images and videos sent through our service delete by default. like prior generations who have enjoyed the privacy afforded by phone calls, which aren't recorded, our generation has benefitted from the ability to share moments through snapchat that may not be picture perfect but instead convey emotion without permanence. even though snapchat messages are deleted by default, we let everyone know that images and videos can be saved by the recipient. when we take action on illegal or potentially harmful content, we retain the evidence for an extended period, which allows us to support law enforcement and hold criminals accountable. to help prevent the spread of harmful content on snapchat, we approve the content recommended on our service using a combination of automated processes and human review. we apply our content rules consistently and fairly across all accounts. we run samples through quality
7:33 am
assurance to verify we are getting it right. we also proactively scan for known child sexual abuse material, drug-related content and other types of harmful content, remove that content, deactivate and device-block offending accounts, preserve the evidence for law enforcement and report certain content to the relevant authorities for further action. last year we made 690,000 reports to the national center for missing and exploited children. we removed 2.2 million pieces of drug-related content. despite our content moderation efforts, proactive detection, and law enforcement collaboration, bad things can still happen when people use online services. that's why we believe people under the age of 13 are not ready to communicate on snapchat. we encourage parents to use the device-level parental controls on iphone and android. my wife approves every app our 13-year-old downloads.
7:34 am
for parents who want more visibility and control, we built family center where you can view who your teen is talking to, review privacy settings and set content limits. we have worked for years with members of the committee on legislation like the kids online safety act and the cooper davis act, which we are proud to support. i want to encourage broader industry support for legislation protecting children online. no legislation is perfect, but some rules of the road are better than none. much of the work that we do to protect people that use our service would not be possible without the support of our partners across the industry, government, nonprofit organizations, ngos and in particular law enforcement, and the first responders who have committed their lives to keeping people safe. i am profoundly grateful for their work to prevent criminals from using online services to perpetrate their crimes. i feel an overwhelming sense of gratitude for the opportunities this country has afforded me and my family. i feel a deep obligation to give back and make a positive
7:35 am
difference and i'm grateful to be here today as part of this vitally important democratic process. members of the committee, i give you my commitment that we'll be part of the solution for online safety. we will be honest about our shortcomings and will work continuously to improve. >> thank you, mr. spiegel. mr. chew. >> chair durbin, ranking member graham, and members of the committee, i appreciate the opportunity to appear before you today. my name is shou chew, and i am the ceo of tiktok, an online community of more than 1 billion people worldwide including well over 170 million americans who use our app every month to create, to share, and to discover. now, although the average age on tiktok in the u.s. is over 30, we recognize that special safeguards are required to protect minors, especially when it comes to combatting all forms of csam. as a father of three young
7:36 am
children myself, i know that the issues we're discussing today are horrific and the nightmare of every parent. i am proud of our efforts to address the threats to young people online from a commitment to protecting them to our industry leading policies, use of innovative technology, and significant ongoing investments in trust and safety to achieve this goal. tiktok is vigilant about enforcing its 13 and up age policy, and offers an experience for teens that is much more restrictive than you and i would have as adults. we make careful product design choices to help make our app inhospitable to those seeking to harm teens. let me give you a few examples of long-standing policies that are unique to tiktok. we didn't do them last week. first, direct messaging is not available to any users under the age of 16. second, accounts for people under 16 are automatically set
7:37 am
to private along with their content. furthermore, the content cannot be downloaded and will not be recommended to people they do not know. third, every teen under 18 has a screen time limit automatically set to 60 minutes. and fourth, only people 18 and above are allowed to use our live stream feature. i'm proud to say that tiktok was among the first to empower parents to supervise their teens on our app with our family pairing tools. this includes setting screen time limits, filtering out content from the teens' feeds amongst others. we made these choices after consulting with doctors and safety experts who understand the unique stages of teenage development to ensure that we have the appropriate safeguards to prevent harm and minimize risk. now, safety is one of the core
7:38 am
priorities that defines tiktok under my leadership. we currently have more than 40,000 professionals working to protect our community globally, and we expect to invest more than $2 billion in trust and safety efforts this year alone, with a significant part of that in our u.s. operations. our robust community guidelines strictly prohibit content or behavior that puts teenagers at risk of exploitation or other harm, and we vigorously enforce them. our technology moderates all content uploaded to our app to help quickly identify potential csam and other material that breaks our rules. it automatically removes the content or elevates it to our safety professionals for further review. we also moderate direct messages for csam and related material and use third-party tools like
7:39 am
photodna and take it down to combat csam and prevent such content from being uploaded to our platform. we continually meet with parents, teachers, and teens. i sat down with a group a few days ago. we used their insight to strengthen the protections on our platform and we work with leading groups like the technology coalition. the steps that we're taking to protect teens are a critical part of our larger trust and safety work as we continue our voluntary and unprecedented efforts to build a safe and secure data environment for u.s. users, ensuring that our platform remains free from outside manipulation and implementing safeguards on our content recommendation and moderation rules. keeping teens safe online requires a collaborative effort as well as collective action. we share the committee's concern and commitment to protect young people online and we welcome the opportunity to work with you on legislation to achieve this
7:40 am
goal. our commitment is ongoing and unwavering because there is no finish line when it comes to protecting teens. thank you for your time and consideration today. i'm happy to answer your questions. >> thanks, mr. chew. ms. yaccarino. >> chairman durbin, ranking member graham, and esteemed members of the committee, thank you for the opportunity to discuss x's work in protecting -- >> ms. yaccarino, could you check if your microphone is on? >> my talk button is on. how is that? >> better, thank you very much. maybe i'll adjust my chair, apologies. i'll start over. chairman durbin, ranking member graham, and esteemed members of the committee. thank you for the opportunity to discuss x's work to protect the safety of minors online. today's hearing is titled a
7:41 am
crisis, which calls for immediate action. as a mother, this is personal, and i share the sense of urgency. x is an entirely new company, an indispensable platform for the world and for democracy. you have my personal commitment that x will be active and a part of this solution. while i joined x only in june of 2023, i bring a history of working together with government, advocates, and ngos to harness the power of media to protect people. before i joined, i was struck by the leadership steps this new company was taking to protect children. x is not the platform of choice
7:42 am
for children and teens. we do not have a line of business dedicated to children. children under the age of 13 are not allowed to open an account. less than 1% of the u.s. users on x are between the ages of 13 and 17. and those users are automatically set to a private default setting and cannot accept a message from anyone they do not approve. in the last 14 months, x has made material changes to protect minors. our policy is clear: x has zero tolerance towards any material that features or promotes child sexual exploitation. my written testimony details x's
7:43 am
extensive policies on content or actions that are prohibited and include grooming, blackmail, and identifying alleged victims of cse. we've also strengthened our enforcement with more tools and technology to prevent those bad actors from distributing, searching for, and engaging with cse content. if cse content is posted on x, we remove it. and now we also remove any account that engages with cse content whether it is real or computer-generated. last year x suspended 12.4 million accounts for violating our cse policies. this is up from 2.3 million accounts that were removed by
7:44 am
twitter in 2022. in 2023, 850,000 reports were sent to ncmec, including our first auto-generated report. this is eight times more than was reported by twitter in 2022. we've changed our priorities. we've restructured our trust and safety teams to remain strong and agile. we are building a trust and safety center of excellence in austin, texas, to bring more agents in-house to accelerate our impact. we're applying to the technology coalition's project lantern to make further industry-wide progress and impact. we've also opened up our algorithms for increased
7:45 am
transparency. we want america to lead in this solution. x commends the senate for passing the report act, and we support the shield act. it is time for a federal standard to criminalize the sharing of non-consensual intimate material. we need to raise the standards across the entire internet ecosystem, especially for those tech companies that are not here today and not stepping up. x supports the stop csam act. the kids online safety act should continue to progress, and we will continue to engage with it and ensure the protection of freedom of speech. there are two additional areas
7:46 am
that require everyone's attention. first, as the daughter of a police officer, law enforcement must have resources to bring these offenders to justice. second, with artificial intelligence, offenders' tactics will continue to sophisticate and evolve. industry collaboration is imperative here. x believes that the freedom of speech and platform safety can and must coexist. we agree that now is the time to act with urgency. thank you. i look forward to answering your questions. >> thank you very much, ms. yaccarino. now we'll go into rounds of questions, seven minutes each for the members. >> as we're transitioning here, i want to hop out for just a quick second and bring in kate and anish. these five top tech
7:47 am
ceos have now delivered their opening statements as we're talking about child safety on their platforms. i'll start with you, anish. >> the notion of industry collaboration for self-governance, section 230 called for the private sector to work together to solve some of these issues. their failure to act is what you saw in senator graham's frustrated opening remarks. if they're serious about working together, they can make a lot of progress while these bills make their way through congress. a new bill is not needed for them to do more to collaborate to solve these problems. >> your thoughts, kate? >> i was struck by what some of the ceos didn't say as much as what they said. the discord ceo was the only person who brought up encryption and that was because discord does not encrypt. several of the bills that are in
7:48 am
discussion right now have provisions in them that some civil liberties activists are worried would weaken encryption. i'm interested to see how mark zuckerberg responds to questions about encryption because meta has just made a major push to encrypt additional platforms the way it has long encrypted whatsapp. it just started encrypting instagram and facebook, so i think he's going to get some pushback from the senators who see encryption as potentially hazardous for children. >> i have so many more questions for both of you, and we know the senators have questions for these tech ceos. that questioning has begun. we started with the discord ceo, senator dick durbin questioning him. let's listen back in, stay with us, guys. >> i listened closely to your testimony here and it's never been a secret that snapchat --
7:49 am
snapchat is used to send sexually explicit images. in 2013, early in your company's history, you admitted this in an interview. do you remember that interview? >> senator, i don't recall the specific interview. >> you said that when you were first trying to get people on the app, you would, quote, go up to the people and be like, hey, you should try this application, you can send disappearing photos and they would say, oh, for sexting? do you recall that interview? >> when we first created the application, it was called picaboo, and the idea was around disappearing images. the feedback we received was they were using it to communicate. we found people were using it to talk visually. >> as early as 2017, law enforcement identified snapchat as the pedophile's go-to sexual exploitation tool. the case of a 12-year-old girl identified in court as l.w. shows the danger. over two and a half years a
7:50 am
predator sexually groomed her, sending her sexually explicit images and videos over snapchat. the man admitted that he only used snapchat with l.w. and not any other platforms because he, quote, knew the chats would go away. did you and everyone else at snap really fail to see that the platform was the perfect tool for sexual predators? >> senator, that behavior is disgusting and reprehensible. we provide in-app reporting tools so that people who are being harassed or have been sent inappropriate sexual content can report it. >> under section 230 of the communications decency act, do you have any doubt that had snap faced the prospect of civil liability for facilitating sexual exploitation, the company would have implemented even better safeguards? >> senator, we already work extensively to proactively detect this type of behavior. we make it very difficult for predators to find teens on snapchat.
7:51 am
there are no public friends lists, no public profile photos. when we recommend friends for teens, we make sure they have several mutual friends in common before making that recommendation. we believe those safeguards are important to preventing predators from misusing our platform. >> mr. citron, according to discord's website, it takes a proactive and automated approach to safety, only on servers with more than 200 members. smaller servers rely on server owners and community moderators to define and enforce behavior. so how do you defend an approach to safety that relies on groups of fewer than 200 sexual predators to report themselves for things like grooming or sextortion? >> chair, our goal is to get all of that content off of our platform and ideally prevent it from showing up in the first place or from people engaging in these kinds of horrific activities. we deploy a wide array of
7:52 am
techniques that work across every surface on our -- on discord. we recently mentioned something called teen safety assist, which works everywhere. it acts like a buddy that lets them know that they may be in a situation that may not be appropriate so they can report that to us and block that user. >> if that were working, we wouldn't be here today. >> chair, this is an ongoing challenge for all of us. that is why we're here today. but we do have 15% of our company focused on trust and safety, of which this is one of our top issues, more than we have working on marketing and promoting the company. we take these issues seriously but we know it is an ongoing challenge and i look forward to working with you and our tech peers to improve this approach. >> i hope so. mr. chew, your organization, your business, is one of the more
7:53 am
popular ones among children. can you explain to us what you're doing particularly and whether you've seen any evidence of csam in your business? >> yes, senator. we have a strong commitment to invest in trust and safety and, as i said in my opening statement, i intend to invest more than $2 billion in trust and safety this year alone. we have 40,000 safety professionals working on this topic. we have built a specialized team to help us identify horrific issues like the material you have mentioned. we proactively do detection, and if we identify any on our platform, we will remove it and report it to the authorities. >> why is tiktok allowing children to be exploited into performing commercialized sex acts? >> senator, i respectfully disagree with that
7:54 am
characterization. our live streaming product is not for anyone under the age of 18. we identify and remove them from using that service. >> i'll turn to my ranking member, senator graham. >> thank you. mr. citron, you said we need to start a discussion. being honest with you, we have been having this discussion for a very long time. we need to get it resolved, not a discussion. do you agree with that? >> ranking member, i agree this is an issue that we also have been very focused on since we started our company, but this is the first time -- >> are you familiar with the earn it act? >> a little bit, yes. >> do you support that? >> we -- >> yes or no? >> we're not prepared to support it today, but we believe -- >> the csam act? >> the stop csam act, we are not
7:55 am
prepared to support today. >> do you support the shield act? >> we believe that the cyber tip line -- >> do you support it, yes or no? >> we believe the cyber tip line and -- >> i'll take that to be no. the project safe childhood act, do you support it? >> we believe that -- >> i'll take that to be no. the report act, do you support it? >> ranking member graham, we very much look forward to having conversations with you and your team. >> -- a bill that will solve the problem. do you support removing section 230 liability protections for social media companies? >> i believe that section 230 is -- needs to be updated. it is a very old law. >> do you support repealing it, so people can sue if they plead they're harmed? >> i think that section 230 as written, while it has many downsides -- >> thank you very much. so, here you are, if you wait on these guys to solve the problem, we're going to die waiting.
7:56 am
mr. zuckerberg -- i'll try to be respectful here. the representative from south carolina, mr. guffey's son got caught up in a sextortion ring in nigeria using instagram and he was shaken down, paid money, that wasn't enough and he killed himself using instagram. what would you like to say to him? >> that's terrible. i mean, no one should have to go through something like that. >> do you think he should be allowed to sue you? >> i think that they can sue us. >> i think he should and he can't. so, bottom line here, folks, is that this committee is done with
7:57 am
talking. we passed five bills unanimously that, in their different ways, address this problem. look at who did this: klobuchar, cornyn, blackburn, ossoff. we have found common ground here that is astonishing, and we had hearing after hearing, mr. chairman, and the bottom line is i've come to conclude, gentlemen, that you're not going to support any of this. how do you say your last name? >> yaccarino. >> do you support the earn it act? >> we strongly support the collaboration to raise industry -- >> no, no, no. do you support the earn it act? did you support the earn it -- in english, do you support the earn it act, yes or no? we don't need doublespeak here. >> we look forward to supporting
7:58 am
and continuing our conversation. >> but you have taken -- the reason the earn it act is important, you can lose your liability protections when children are exploited and you didn't use best business practices. see, the earn it act means you have to earn liability protection. you're given it no matter what you do. so, to the members of this committee, it is now time to make sure that the people who are holding up the signs can sue on behalf of their loved ones. nothing will change until the courtroom door is open to victims of social media. $2 billion, mr. chew. how much -- what percentage of that is what you made last year? >> senator, it is a significant and increasing investment -- >> you paid taxes. i mean, $2 billion is what percent of your revenue? >> senator, we're not ready to share our financials in public -- >> i just think $2 billion sounds like a lot, unless you make $100 billion.
7:59 am
the point is, you know, when you tell us you're going to spend $2 billion, great, but how much do you make? you know, it is all about eyeballs. our goal is to get eyeballs on you. the damage being done, do you realize, mr. chew, that your tiktok representative in israel resigned yesterday? >> yes, i am aware. >> and he said i resign from tiktok, we're living in a time in which our existence as jews in israel and israel is under attack and in danger, multiple screenshots taken from tiktok's internal employee chat platform known as lark show how tiktok's trust and safety officers celebrate the barbaric acts of hamas and other iranian-backed terror groups including the houthis in yemen. >> senator, i need to make it very clear that pro-hamas content and hate speech are not allowed on our platform.
8:00 am
>> why did he resign? why did he quit? >> senator, we also do not allow -- >> why did he quit? >> we do not allow this -- >> he quit. he gave up a good job because he thinks your platform is being used to help people who want to destroy the jewish state. i'm not saying you want that. mr. zuckerberg, i'm not saying you want as an individual any of the harms. i am saying that the product you have created, with all the upsides, has a dark side. mr. citron, i'm tired of talking, i'm tired of having discussions. we know the answer here. here's the ultimate answer. stand behind your product. go to the american courtroom and defend your practices. open up the courthouse door. until you do that, nothing will change. until these people can be sued for the damage they're doing, it is all talk. i'm a republican who believes in free enterprise. but i also believe that every american who has been w
