tv [untitled] October 5, 2021 5:30pm-6:00pm AST
5:30 pm
get taken off the internet. i don't know why it went down, but i know that for more than 5 hours facebook wasn't used to deepen divides, destabilize democracies, and make young girls and women feel bad about their bodies. it also means that millions of small businesses weren't able to reach potential customers, and countless photos of new babies weren't joyously celebrated by family and friends around the world. i believe in the potential of facebook. we can have social media we enjoy, that connects us, without tearing apart our democracy, putting our children in danger, and sowing ethnic violence around the world. we can do better. i have worked as a product manager at large tech companies since 2006, including google, pinterest, yelp and facebook. my job has largely focused on algorithmic products like google plus search and recommendation systems like the one that powers the facebook news feed. having worked on 4 different types of social networks,
5:31 pm
i understand how complex and nuanced these problems are. however, the choices being made inside of facebook are disastrous for our children, for our public safety, for our privacy and for our democracy, and that is why we must demand facebook make changes. during my time at facebook, first working as the lead product manager for civic misinformation and later on counter-espionage, i saw facebook repeatedly encounter conflicts between its own profits and our safety. facebook consistently resolved these conflicts in favor of its own profits. the result has been more division, more harm, more lies, more threats, and more combat. in some cases, this dangerous online talk has led to actual violence that harms and even kills people. this is not simply a matter of certain social media users being angry or unstable,
5:32 pm
or about one side being radicalized against the other. it is about facebook choosing to grow at all costs, becoming an almost trillion-dollar company by buying its profits with our safety. during my time at facebook, i came to realize a devastating truth: almost no one outside of facebook knows what happens inside of facebook. the company intentionally hides vital information from the public, from the us government and from governments around the world. the documents i have provided to congress prove that facebook has repeatedly misled the public about what its own research reveals about the safety of children, the efficacy of its artificial intelligence systems, and its role in spreading divisive and extreme messages. i came forward because i believe that every human being deserves the dignity of the truth. the severity of this crisis demands that we break out of our previous regulatory frames. facebook wants to trick you into thinking that privacy protections or changes to section 230
5:33 pm
alone will be sufficient. while important, these will not get to the core of the issue, which is that no one truly understands the destructive choices made by facebook, except facebook. we can afford nothing less than full transparency. as long as facebook is operating in the shadows, hiding its research from public scrutiny, it is unaccountable. until the incentives change, facebook will not change. left alone, facebook will continue to make choices that go against the common good, our common good. when we realized big tobacco was hiding the harms it caused, the government took action. when we figured out cars were safer with seat belts, the government took action. and when the government learned that opioids were taking lives, the government took action. i implore you to do the same here today. facebook shapes our perception of the world by choosing the information we see. even those
5:34 pm
who don't use facebook are impacted by the majority who do. a company with such frightening influence over so many people, over their deepest thoughts, feelings and behavior, needs real oversight. but facebook's closed design means it has no real oversight. only facebook knows how it personalizes your feed for you. at other large tech companies like google, any independent researcher can download from the internet the company's search results and write papers about what they find, and they do. but facebook hides behind walls that keep researchers and regulators from understanding the true dynamics of their system. facebook will tell you privacy means they can't give you data. this is not true. when tobacco companies claimed that filtered cigarettes were safer for consumers, scientists could independently invalidate these marketing messages and confirm
5:35 pm
that, in fact, they posed a greater threat to human health. the public cannot do the same with facebook. we are given no other option than to take their marketing messages on blind faith. not only does the company hide most of its own data; my disclosures have proved that when facebook is directly asked questions as important as "how do you impact the health and safety of our children?" they choose to mislead and misdirect. facebook has not earned our blind faith. this inability to see into facebook's actual systems and confirm that they work as communicated is like the department of transportation regulating cars by only watching them drive down the highway. today, no regulator has a menu of solutions for how to fix facebook, because facebook didn't want them to know enough about what's causing the problems. otherwise,
5:36 pm
there wouldn't have been a need for a whistleblower. how is the public supposed to assess if facebook is resolving conflicts of interest in a way that is aligned with the public good, if the public has no visibility into how facebook operates? this must change. facebook wants you to believe the problems we're talking about are unsolvable. they want you to believe in false choices. they want you to believe you must choose between a facebook full of divisive and extreme content, or losing one of the most important values our country was founded upon: free speech. that you must choose between public oversight of facebook's choices and your personal privacy. that to be able to share fun photos of your kids with old friends, you must also be inundated with anger-driven virality. they want you to believe that this is just part of the deal. i am here today to tell you that's not true.
5:37 pm
these problems are solvable. a safer, free-speech-respecting, more enjoyable social media is possible. but there is one thing that i hope everyone takes away from these disclosures: facebook can change, but is clearly not going to do so on its own. my fear is that without action, the divisive and extremist behaviors we see today are only the beginning. what we saw in myanmar and are now seeing in ethiopia are only the opening chapters of a story so terrifying, no one wants to read the end of it. congress can change the rules that facebook plays by and stop the many harms it is now causing. we now know the truth about facebook's destructive impact. i really appreciate the seriousness with which the members of congress and the securities and exchange commission are approaching these issues. i came forward at great personal risk because i believe we still have time to act. we
5:38 pm
must act now. i'm asking you, our elected representatives, to act. thank you. thank you, miss haugen. thank you for taking that personal risk, and we will do anything and everything to protect and stop any retaliation against you and any legal action that the company may bring to bear, or anyone else, and i've made that, i think, very clear in the course of these proceedings. i want to ask you about this idea of disclosure. you've talked about looking, in effect, at a car going down the road. and we're going to have 5-minute rounds of questions, maybe a second round if you're willing to do it. we're here today to look under the hood, and that's what we need to do more of. in august, senator blackburn and i wrote to mark zuckerberg and we asked him pretty straightforward questions about
5:39 pm
how the company works and safeguards children and teens on instagram. facebook dodged, sidetracked, in effect misled us. so i'm going to ask you a few questions to break down some of what you said, and if you can answer them yes or no, that would be great. has facebook's research, its own research, ever found that its platforms can have a negative effect on children and teens' mental health or well-being? many of facebook's internal research reports indicate that facebook has a serious negative harm on a significant portion of teenagers and younger children. and has facebook ever offered features that it knew had a negative effect on children and teens' mental health? facebook knows that its
5:40 pm
amplification algorithms, things like engagement-based ranking on instagram, can lead children from very innocuous topics like healthy recipes, i think all of us could eat a little more healthy, all the way from just something innocent like healthy recipes to anorexia-promoting content over a very short period of time. and has facebook ever found, again in its research, that kids show signs of addiction on instagram? facebook has studied a pattern that they call problematic use, what we might more commonly call addiction. it has a very high bar for what it believes it is: you must self-identify that you don't have control over your usage and that it is materially harming your health, your schoolwork, or your physical health. 5 to 6 percent of 14-year-olds had the self-awareness to admit both those questions.
5:41 pm
it is likely that far more than 5 to 6 percent of 14-year-olds are addicted to instagram. last thursday, my colleagues and i asked ms. davis, who was representing facebook, about how the decision would be made whether to pause permanently instagram for kids, and she said, quote, "there's no one person who makes a decision like that. we think about that collaboratively." it's as though she couldn't mention mark zuckerberg's name. isn't he the one who will be making this decision, from your experience in the company? mark holds a very unique role in the tech industry, in that he holds over 55 percent of all the voting shares for facebook. there are no similarly powerful companies that are as unilaterally controlled, and in the end, the buck stops with mark. there is no one currently holding mark
5:42 pm
accountable but himself. and mark zuckerberg, in effect, is the algorithm designer in chief, correct? i received an mba from harvard, and they emphasized to us that we are responsible for the organizations that we build. mark has built an organization that is very metrics-driven. it is intended to be flat. there is no unilateral responsibility. the metrics make the decision. unfortunately, that itself is a decision. and in the end, he is the ceo and the chairman of facebook; he is responsible for those decisions. the buck stops with him? the buck stops with him. and speaking of the buck stopping, you have said that facebook should declare moral bankruptcy. i agree. i think its actions and its failure to acknowledge its responsibility indicate moral bankruptcy. there is
5:43 pm
a cycle occurring inside the company where facebook has struggled for a long time to recruit and retain the number of employees it needs to tackle the large scope of projects it has chosen to take on. facebook is stuck in a cycle where it struggles to hire, which causes it to understaff projects, which causes scandals, which then makes it harder to hire. part of why facebook needs to come out and say "we did something wrong, we made some choices that we regret" is that the only way we can move forward and heal facebook is by first admitting the truth. the way we'll have reconciliation and can move forward is by first being honest and declaring moral bankruptcy. being honest, and acknowledging that facebook has caused and aggravated a lot of pain, and has profited off spreading disinformation and misinformation and sowing hate. facebook's answers to facebook's destructive impact always seem to be more
5:44 pm
facebook. we need more facebook, which means more pain and more money for facebook. would you agree? i don't think at any point facebook set out to make a destructive platform. i think it is a challenge that facebook has set up an organization where the parts of the organization responsible for growing and expanding the organization are separate, and not regularly cross-pollinated, with the parts of the company that focus on the harms the company is causing. and as a result, regularly, integrity actions, projects that were hard-fought by the teams trying to keep us safe, are undone by new growth projects that counteract those same remedies. i do think these are organizational problems that need oversight, and facebook needs help in order to move forward to a more healthy place. and whether it's teens bullied into suicidal thoughts, or the genocide of ethnic minorities
5:45 pm
in myanmar, or fanning the flames of division within our own country or in europe, they are ultimately responsible for the immorality of the pain it's caused. facebook needs to take responsibility for the consequences of its choices and needs to be willing to accept small trade-offs on profit. and i think just that act of being able to admit that it's a mixed bag is important. and i think that what we saw from antigone davis last week is an example of the kind of behavior we need to support facebook in growing out of, which is, instead of just focusing on all the good they do, admit they have responsibilities to also remedy the harm. but mark zuckerberg's new policy is no apologies, no admissions, no acknowledgement, nothing to see here, we're going to deflect it and go sailing. i turn to the
5:46 pm
ranking member. thank you, mister chairman. thank you for your testimony. i want to stay with ms. davis and some of her comments, because i had asked her last week about the underage users, and she had made the comment, i'm going to quote from her testimony: "if we find an account of someone who's under 13, we remove them. in the last 3 months we removed 600,000 accounts of under-13-year-olds," end quote. and i have to tell you, it seems to me that there is a problem if you have 600,000 accounts from children who ought not to be there in the first place. so what did mark zuckerberg know about facebook's plans to bring kids on as new users and advertise to them?
5:47 pm
there are reports within facebook that show cohort analyses, where they examine at what ages people join facebook and instagram. and based on those cohort analyses, so facebook likes to say children lie about their ages to get onto the platform. the reality is, enough kids tell the truth that you can work backwards to figure out approximately the real ages of anyone on the platform. when facebook does cohort analyses and looks back retrospectively, it discovers things like, you know, up to 10 to 15 percent of even 10-year-olds in a given cohort may be on facebook or instagram. okay, so this is why adam mosseri, who's the ceo of instagram, would have replied to jojo siwa, when she said to him, "oh, i've been on instagram since i was 8," that he didn't want to know that. it would be for this reason, correct?
5:48 pm
a pattern of behavior that i saw at facebook was that often problems were so understaffed that there was an implicit discouragement from having better detection systems. for example, my last team at facebook was the counter-espionage team within the threat intelligence org, and at any given time our team could only handle a third of the cases that we knew about. we knew that if we built even a basic detector, we would likely have many more cases. okay, let me ask you this. so you look at the way that they have the data, but they are choosing to keep that data and advertise from it, right, sell it to third parties. so what does facebook do? you've got these 600,000 accounts that ought not to be on there anymore, right? then you delete those accounts, but what happens to that data?
5:49 pm
does facebook keep that data? do they keep it until those children go to age 13? since you're saying they can work backward and figure out the true age of a user, what do they do with it? do they delete it? do they store it? do they keep it? how does that process work? my understanding of facebook's data retention policies, and i want to be really clear, i didn't work directly on that, is that when they delete an account, they delete all the data, i believe in compliance with gdpr. with regard to underage children on the platform, facebook could do substantially more to detect those children, and they should have to publish for congress those processes, because there are lots of subtleties, and things could be much more effective than probably what they're doing today. staying with this underage children issue, since this hearing is all about kids
5:50 pm
and teen online privacy, i want you to tell me how facebook is able to do market research on these children that are under age 13, because ms. davis, well, she didn't deny this last week. so how are they doing this? do they bring kids into focus groups with their parents? how do they get that permission? she said they got permission from parents. is there a permission slip or a form that gets signed? and then how do they know which kids to target? there's a bunch to unpack there. i'll start with maybe how did they recruit children for focus groups, or did they recruit teenagers? most tech companies have systems where they can
5:51 pm
analyze the data that is on their servers. so most of the focus groups i read about, or that i saw analysis of, were around messenger kids, which has children on it. and those focus groups appear to be children interacting in person. often, large tech companies use either sourcing agencies that will go and identify people who meet certain demographic criteria, or they will reach out directly based on data on the platform. so for example, in the case of messenger kids, maybe you would want to study a child that was an active user and one that was a less active user, and you might reach out to some that came from each population. so these are children that are under age 13, and they know it? um, for some of the studies, and i assume they get permission, but i don't work on them. okay, well, we're still waiting to get a copy of that parental consent form that would involve children. my time is expired.
5:52 pm
mister chairman, i'll save my other questions for our second round, if we're able to get those. thank you. thank you. senator klobuchar. thank you very much, mister chairman. thank you so much, miss haugen, for shedding a light on how facebook time and time again has put profit over people. when their own research found that more than 13 percent of teen girls say that instagram made their thoughts of suicide worse, what did they do? they proposed instagram for kids, which has now been put on pause because of public pressure. when they found out that their algorithms are fostering polarization, misinformation and hate, they allowed 99 percent of their violent content to remain unchecked on their platform, including in the lead-up to the january 6 insurrection. what did they do? now, as we know, mark zuckerberg is going sailing and saying no apologies. i think the time has come for action, and i think you are the catalyst for that action. you have said privacy legislation
5:53 pm
is not enough. i completely agree with you. but, you know, we have not done anything to update our privacy laws in this country, our federal privacy laws, nothing, zilch, in any major way. why? because there are lobbyists around every single corner of this building that have been hired by the tech industry. we have done nothing when it comes to making the algorithms more transparent, allowing for the university research that you referred to. why? because facebook and the other tech companies are throwing a bunch of money around this town, and people are listening to them. we have done nothing significant, although we are on a bipartisan basis working, in the antitrust subcommittee, to get something done on consolidation, which, you understand, allows the dominant platforms to control all this, like the bullies in the neighborhood, buying out the companies that maybe could have competed with them and added the bells and whistles. so the time for action is now. so i'll start,
5:54 pm
i'll start with the same thing that i asked facebook's head of safety when she testified before us last week. i asked her how they estimate the lifetime value of a user for kids who start using their product before they turn 13. she evaded the question and said, "that's not the way we think about it." is that right, or is it your experience that facebook estimates and puts a value on how much money they get from users? we'll get to kids in a second. is that a motivating force for them? based on what i saw in terms of allocation of integrity spending, so one of the things disclosed in the wall street journal was that, i believe it's like 87 percent of all the misinformation spending is spent on english, but only about 9 percent of the users are english speakers. it seems that facebook invests more in users who make them more money, even though the danger may not be evenly distributed based on profitability. does it make sense that having a younger person get hooked on social media
5:55 pm
at a young age makes them more profitable over the long term, as they have a life ahead of them? facebook's internal documents talk about the importance of getting younger users, for example tweens, onto instagram, like instagram kids, because, like, they noted children bring their parents online, and things like that. and so they understand the value of younger users for the long-term success of facebook. facebook reported advertising revenue to be $51.58 per user in the last quarter in the us and canada. when i asked ms. davis how much of that came from instagram users under 18, she wouldn't say. do you think that teens are profitable for their company? i would assume so, based on advertising for things like television, which gets substantially higher advertising rates for customers who don't yet have preferences or habits. so i'm sure they are some of the more profitable users on facebook, but i do not work directly on them. another major issue that's come out of this is
5:56 pm
eating disorders. studies have found that eating disorders actually have the highest mortality rate of any mental illness for women, and i led a bill on this with senators capito and baldwin that we passed into law. and i'm concerned that these algorithms that they have push outrageous content, promoting anorexia and the like. i know it's personal to you. do you think that their algorithms push some of this content to young girls? facebook knows that engagement-based ranking, the way that they pick the content in instagram for young users, for all users, amplifies preferences. and they have done something called a proactive incident response, where they take things they've heard, for example, like "can you be led by the algorithms to anorexia content?", and they have literally recreated that experiment themselves and confirmed, yes, this happens to people. so facebook knows that they are,
5:57 pm
that they are leading young users to anorexia content. and do you think they are deliberately designing their product to be addictive, beyond even that content? facebook has a long history of having a successful and very effective growth division, where they take little tiny tweaks and constantly, constantly, constantly are trying to optimize it to grow. those kinds of stickiness could be construed as things that facilitate addiction, right? lastly, we've seen this same kind of content in the political world. you brought up other countries and what's been happening there. on 60 minutes, you said that facebook implemented safeguards to reduce misinformation ahead of the 2020 election, but turned off those safeguards right after the election. and you know that the insurrection occurred january 6th. do you think that facebook turned off the safeguards because they were costing the company money, because it was reducing profits? facebook has been emphasizing a false choice. they've said the safeguards that were in place before the election
5:58 pm
implicated free speech. the choices that were happening on the platform were really about how reactive and twitchy the platform was, right, like how viral the platform was. and facebook changed those safety defaults in the run-up to the election because they knew they were dangerous, and because they wanted that growth back, they wanted the acceleration of the platform back, after the election they returned to their original defaults. and the fact that they had to break the glass on january 6th and turn them back on, i think that's deeply problematic. i agree. thank you very much for your bravery in coming forward. senator thune. thank you, mister chair and ranking member blackburn. i've been arguing for some time that it is time for congress to act, and i think the question is always what is the correct way to do it, the right way to do it, consistent with our first amendment right to free speech. now, this committee
5:59 pm
doesn't have jurisdiction over the antitrust issue; that's the judiciary committee. i'm not averse to looking at the monopolistic nature of facebook. honestly, i think that's a real issue that needs to be examined and perhaps addressed as well. but at least under this committee's jurisdiction, there are a couple of things i think we can do. i have proposed legislation, and senators blackburn and blumenthal are both co-sponsors, called the filter bubble transparency act. essentially what it would do is give users the option to engage with social media platforms without being manipulated by the secret formulas that essentially dictate the content that you see when you open up an app or log on to a website. we also, i think, need to hold big tech accountable by reforming section 230, and one of the best opportunities, i think, to do that, at least in a bipartisan way, is the platform accountability and consumer transparency act, or the pact act. that's legislation i've co-sponsored with senator schatz,
6:00 pm
which, in addition to stripping section 230 protections for content that a court determines to be illegal, would also increase transparency and due process for users around the content moderation process. and importantly, in the context we're talking about today, with this hearing with a major big tech whistleblower, the pact act would explore the viability of a federal program for big tech employees to blow the whistle. you're watching the newshour on al jazeera with me, folly bah thibault. it's 15 gmt, 11 am in washington dc, where a facebook whistleblower has been testifying at an internet safety hearing on capitol hill. frances haugen is a former product manager on facebook's civic integrity team, and she leaked documents that she says show facebook repeatedly prioritized growth.