tv [untitled] October 5, 2021 6:00pm-6:31pm AST
6:00 pm
And one of the best opportunities, I think, to do that, at least in a bipartisan way, is the Platform Accountability and Consumer Transparency Act, or the PACT Act. That's legislation that I've co-sponsored with Senator Schatz which, in addition to stripping Section 230 protections for content that a court determines to be illegal, would also increase transparency and due process for users around the content moderation process. And importantly, in the context we're talking about today, with this hearing with a major Big Tech whistleblower, the PACT Act would explore the viability of a federal program for Big Tech employees to blow the whistle. You're watching the NewsHour on Al Jazeera with me, Folly Bah Thibault. It is 15:00 GMT, 11 a.m. in Washington, D.C., where a Facebook whistleblower has been testifying at an internet safety hearing on Capitol Hill. Frances Haugen is a former product manager on Facebook's civic integrity team, and she leaked documents that she says prove Facebook repeatedly prioritized growth over the safety of its users.
6:01 pm
Now, this hearing today comes a day after a Facebook outage that affected 3.5 billion users worldwide. Haugen has been specifically testifying about the company's research into Instagram's effect on the mental health of young users. In her opening statement, Frances Haugen said she joined Facebook because of its potential to bring out the best in people, and she concluded by calling on Congress to implement regulations that change the rules for Facebook. She's now being questioned by lawmakers. Let's continue to listen. ... That's just not true. Facebook likes to present things as false choices, like, you have to choose between having lots of spam. Let's imagine we ordered our feeds by time, like iMessage or other forms of social media that are chronologically based. They're going to say, you're really going to get spammed, you're not going to enjoy your feed. The reality is that those experiences have
6:02 pm
a lot of permutations. There are ways that we can make those experiences where computers don't regulate what we see; we, together, socially regulate what we see. And they don't want us to have that conversation, because Facebook knows that when they pick out the content we focus on using computers, we spend more time on their platform and they make more money. The dangers of engagement-based ranking are that Facebook knows that content that elicits an extreme reaction from you is more likely to get a click, a comment or a reshare. And it's interesting, because those clicks and comments and reshares aren't even necessarily for your benefit; it's because they know that other people will produce more content if they get the likes and comments and reshares. They prioritize content in your feed so you will give little hits of dopamine to your friends, so they will create more content. And they have run experiments on people, producer-side experiments, where they have confirmed this.
6:03 pm
In the information you provided to the Wall Street Journal, it's been found that Facebook altered its algorithm in an attempt to boost these "meaningful social interactions," or MSI. But rather than strengthening bonds between family and friends on the platform, the algorithm instead rewarded more outrage and sensationalism. And I think Facebook would say that its algorithms are used to connect individuals with friends and family in ways that are largely positive. Do you believe that Facebook's algorithms make its platform a better place for most users, and should consumers have the option to use Facebook and Instagram without being manipulated by algorithms designed to keep them engaged on the platform? I strongly believe... I've spent most of my career working on systems like engagement-based ranking. When I come to you and say these things, I'm basically damning 10 years of my own work, right? Engagement-based ranking: Facebook says, we can do it safely because we have AI. You know, the artificial intelligence will find the bad content that
6:04 pm
we know our engagement-based ranking is promoting. They've written blog posts on how they know engagement-based ranking is dangerous, but the AI will save us. Facebook's own research says they cannot adequately identify dangerous content, and as a result those dangerous algorithms, which they admit are picking up the extreme sentiments, the division... they can't protect us from the harms that they know exist in their own system. And so I don't think it's just a question of saying, should people have the option of choosing to not be manipulated by their algorithms. I think if we had appropriate oversight, or if we reformed 230 to make Facebook responsible for the consequences of their intentional ranking decisions, I think they would get rid of engagement-based ranking. Because it is causing teenagers to be exposed to more anorexia content; it is pulling families apart; and in places like Ethiopia it is literally fanning ethnic violence. I encourage reform of these platforms,
6:05 pm
not picking and choosing individual ideas, but instead making the platforms themselves safer: less twitchy, less reactive, less viral. Because that's how we scalably solve these problems and keep people safe. I would simply say, let's get to work. There are things we can do here. Thanks. Great, thank you. Senator Schatz is up. Thank you, Mr. Chairman and Ranking Member, and thank you for your courage in coming forward. Was there a particular moment when you came to the conclusion that reform from the inside was impossible and you decided to be a whistleblower? There was a long series of moments where I became aware that Facebook, when faced with conflicts of interest between its own profits and the common good, public safety, consistently chose to prioritize its profits. I think the moment at which I realized we needed to get help from the outside, that the only way these problems would be
6:06 pm
solved is by solving them together, not solving them alone, was when Civic Integrity was dissolved following the 2020 election. It really felt like a betrayal of the promises that Facebook had made to people who had sacrificed a great deal to keep the election safe, by basically dissolving our community and integrating it into just other parts of the company. And I know their response is that they've sort of distributed the duties. That's an excuse, right? I cannot see into the hearts of other men, and I don't know what they are. Let me say it this way: it doesn't work, right? And I can tell you that when I left the company, the people who I worked with were disproportionately, maybe 75 percent of my pod of seven people, product managers and program managers, most of them come from Civic Integrity. All of us left the inauthentic behavior pod, either for other parts of the company or the company entirely, over the same six-month period of time. So six months after the
6:07 pm
reorganization, we had clearly lost faith that those changes were coming. You said in your opening statement that they know how to make Facebook and Instagram safer. So, thought experiment: you are now the chief executive officer and chairman of the company. What changes would you immediately institute? I would immediately establish a policy of how to share information and research from inside the company with appropriate oversight bodies like Congress. I would give proposed legislation to Congress saying, here's what an effective oversight agency would look like. I would actively engage with academics, to make sure that the people who are confirming, is Facebook's marketing message true, have the information they need to confirm these things. And I would immediately implement the, quote, "soft interventions" that were identified to protect
6:08 pm
the 2020 election. That's things like requiring someone to click on a link before resharing it, because other companies like Twitter have found that that significantly reduces misinformation. No one is censored by being forced to click on a link before resharing it. Thank you. I want to get back to Instagram's targeting of kids. We all know that they announced a pause, but that reminds me of what they announced when they were going to issue a digital currency: they got beat up by the U.S. Senate Banking Committee and they said, never mind. And now they're coming back around, hoping that nobody notices that they are going to try to issue a currency. Now, let's set aside for the moment the business model, which appears to be gobble up everything, do everything; that's the growth strategy. Do you believe that they're actually going to discontinue Instagram Kids, or are they just waiting for the dust to settle?
6:09 pm
I would be sincerely surprised if they do not continue working on Instagram Kids, and I would be amazed if a year from now we don't have this conversation again. Facebook understands that if they want to continue to grow, they have to find new users. They have to make sure that the next generation is just as engaged with Instagram as the current one, and the way they'll do that is by making sure that children establish habits before they have good self-regulation. By hooking kids? By hooking kids. I would like to emphasize: one of the documents that we sent in, on problematic use, examined the rates of problematic use by age, and that peaked with 14-year-olds. It's just like cigarettes; teenagers don't have good self-regulation. They say explicitly, "I feel bad when I use Instagram, and yet I can't stop." We need to protect the kids. Just my final question: I have a long list of misstatements, misdirections and outright lies from the company.
6:10 pm
I don't have time to read them, but you're as intimate with all of these deceptions as I am. So I will just jump to the end. If you were a member of this panel, would you believe what Facebook is saying? I would not believe it. Facebook has not earned our right to just have blind trust in them. Last week, one of the most beautiful things I heard in the committee was: trust is earned, and Facebook has not earned our trust. Thank you. Thanks, Senator Schatz. Senator Moran, and then we've been joined by the chair, Senator Cantwell; she'll be next. Then a break at about 11:30, if that's okay, because we have a vote, and then we'll reconvene. Mr. Chairman, thank you. The conversation so far reminds me that you and I are to resolve our
6:11 pm
differences and introduce legislation. So, as Senator Thune said, let's go to work. Our differences are very minor, or they seem very minor, in the face of the revelations that we've now seen, so I'm hoping we can move forward. I share that view, Mr. Chairman. Thank you very much for your testimony. What examples do you know of... we've talked particularly about children, teenage girls specifically, but what other examples do you know about where Facebook or Instagram knew its decisions would be harmful to its users but still proceeded with the plan and executed that harmful behavior? Facebook's internal research is aware of a variety of problems facing children on Instagram. They know that severe harm is happening to children. For example,
6:12 pm
in the case of bullying, Facebook knows that Instagram dramatically changes the experience of high school. When I was in high school, most kids had positive home lives. It doesn't matter how bad it is at school; kids can go home and reset for 16 hours. Kids who are bullied on Instagram, the bullying follows them home. It follows them into their bedrooms. The last thing they see before they go to bed at night is someone being cruel to them, or the first thing they see in the morning is someone being cruel to them. Kids are learning that their own friends, people who they care about, are cruel to them. Think about how that can impact their domestic relationships when they become 20-somethings or 30-somethings, to believe that people who care about you are mean to you. Facebook knows that parents today, because they didn't experience these
6:13 pm
things, because they never experienced this addictive experience with a piece of technology, give their children bad advice. They say things like, why don't you just stop using it? And so Facebook's own research is aware that children express feelings of loneliness and struggle with these things, because they can't even get support from their own parents. I don't understand how Facebook can know all these things and not escalate it to someone like Congress for help and support in navigating these problems. Let me ask the question in a broader way: besides teenagers, teenage girls, besides youth, are there other practices at Facebook or Instagram that are known to be harmful but are still pursued? Facebook is aware that choices it made in establishing "meaningful social interactions", so, engagement-based ranking that didn't care if you bullied someone or committed hate speech in the comments, that was "meaningful". They know that that change directly
6:14 pm
changed publishers' behavior; companies like BuzzFeed wrote in and said, the content that is most successful on our platform is some of the content we're most ashamed of; you have a problem with your ranking. And they did nothing. They know that politicians are being forced to take positions they know their own constituents don't like or approve of, because those are the ones that get distributed on Facebook. That's a huge, huge negative impact. They have also admitted in public that engagement-based ranking is dangerous without integrity and security systems, but have not rolled out those integrity and security systems for most of the languages in the world. And that's what is causing things like ethnic violence in Ethiopia. Thank you for your answer. What is the magnitude of Facebook's revenues or profits that come from the sale of user data? I'm sorry, I've never worked on that; I'm not aware. Thank you. What regulations or legal actions by Congress, or by administrative action, do you
6:15 pm
think would have the most consequence, or be feared most, by Facebook, Instagram or allied companies? I strongly encourage reforming Section 230 to exempt decisions about algorithms, right? Modifying 230 around content, I think, is very complicated, because user-generated content is something that companies have less control over. They have 100 percent control over their algorithms. And Facebook should not get a free pass on choices it makes to prioritize growth and virality and reactiveness over public safety. They shouldn't get a free pass on that, because they're paying for their profits right now with our safety. So I strongly encourage reform of 230 in that way. I also believe there needs to be a dedicated oversight body, because right now the only people in the world who are trained to analyze these experiments, to understand what's happening inside of Facebook, are people who,
6:16 pm
you know, grew up inside of Facebook or Pinterest or another social media company. There needs to be a regulatory home where someone like me could do a tour of duty after working at a place like this, and have a place to work on things like regulation, to bring that information out to the oversight bodies that have the right to do oversight. A regulatory agency within the federal government? Yes. Thank you very much. The chair, Senator Cantwell. Thank you, Mr. Chairman, and thank you for holding this hearing. I think my colleagues have brought up a lot of important issues, and so I just want to continue in that vein. First of all, the privacy act that I introduced along with several of my colleagues actually does have FTC oversight of algorithm transparency in some instances. I hope you take a look at that and tell us what other areas you think we should add to that level of transparency. But clearly that's the issue at hand here, I think, in your coming forward, so thank you again for your willingness to do that
6:17 pm
. The documentation that you say now exists is the level of transparency about what's going on that people haven't been able to see. And so your information, which you say has gone up to the highest levels at Facebook, is that they purposely knew that their algorithms were continuing to spread misinformation and hate information, and that when presented with information about this terminology, you know, downstream MSI, meaningful social interaction, knowing that it was this choice: you could continue this wrong-headed information, hate information about the Rohingya, or you could continue to get higher click-through rates. And I know you said you don't know about profits, but I'm pretty sure you know that on a page, if you click through to the next page, there's a lot more ad revenue than if you didn't click through. So you're saying the documents exist that, at the highest level at Facebook,
6:18 pm
you had information discussing these two choices, and that people chose, even though they knew that it was misinformation and hurtful and maybe even costing people's lives, they continued to choose profit. We have submitted documents to Congress outlining how Mark Zuckerberg was directly presented with a list of, quote, soft interventions. So a hard intervention is like taking a piece of content off Facebook, or taking a user off Facebook. Soft interventions are about making slightly different choices to make the platform less viral, less twitchy. Mark was presented with these options and chose to not remove downstream MSI in April of 2020, even just isolated in at-risk countries, countries at risk of violence, if it had any impact on the overall MSI metric. So he chose... Which, in translation, means less money. Yeah. Right. Was there another reason given why
6:19 pm
they would do it, other than that they thought it would affect their numbers? I don't know for certain. Jeff Horwitz, the Wall Street Journal reporter, and I struggled with this. We sat there and read these minutes and asked, how is this possible? We've just read a hundred pages on how downstream MSI expands hate speech, misinformation, violence-inciting content, graphic violent content. Why wouldn't you get rid of this? The best theory that we've come up with, and I want to emphasize that this is just our interpretation, is that people's bonuses are tied to MSI, right? People stay or leave the company based on what they get paid, and if you hurt MSI, a bunch of people weren't going to get their bonuses. So you're saying that this practice even still continues today; we're still in this environment. I'm personally very frustrated by this, because we presented information to Facebook from one of my own constituents in 2018, talking about this issue with the Rohingya, pleading with the company. And we pleaded
6:20 pm
with the company, and they continued to not address this issue. Now you're pointing out that these same algorithms are being used, and they know darn well in Ethiopia that it's inciting violence, and again they are still, today, choosing profit over taking this information down. Is that correct? When rioting happened in the United States in the summer of last year, they turned off downstream MSI, but only for when they detected that content was health content, which is probably COVID, and for civic content. But Facebook's own algorithms are bad at finding this content; it's still in the raw form for 80, 90 percent of even that sensitive content, in countries where they don't have integrity systems in the local language. And in the case of Ethiopia, there are 100 million people in Ethiopia and six languages. Facebook only supports two of those languages for integrity systems. This strategy of focusing on language-specific, content-specific systems, AI to save us,
6:21 pm
is doomed to fail. I need to get to one other thing. Personally, I'm sending a letter to Facebook today: they better not delete any information as it relates to the Rohingya, or to investigations about how they proceeded on this, particularly in light of your information on the documents. But are we also now talking about advertising fraud? Are you selling something to advertisers that's not really what they're getting? We know about this because of the newspaper industry. Journalism basically has to meet a different standard, a public-interest standard, basically proving itself every day, or they can be sued. These guys are a social media platform that doesn't have to live with that standard, or the consequences. And they're telling their advertisers that this was safe. We see it: people are coming back to local journalism, because they're saying, we want to be part of a trusted brand; we don't want to be on, you know, your website. So I think your filing with the SEC is an interesting one, but I think that we also have to look at what the other issues are here. And one of them is: did they commit fraud?
6:22 pm
Did they defraud advertisers, in telling them this was the content that they were going to be advertising against, when in reality it was something different, based on a different model? We have multiple examples of question-and-answers for the advertising staff, the sales staff, where advertisers ask, after the riots last summer, should we come back to Facebook, or, after the insurrection, should we come back to Facebook? And Facebook said, in the talking points that they gave to advertisers, we're doing everything in our power to make this safer, or, we take down all the hate speech when we find it. And that was not true; they get 3 to 5 percent of hate speech. Thank you. Thank you, Mr. Chairman. Thank you, Senator Cantwell; and if you want to make your letter available to other members of the committee, I'd be glad to join you myself. Thank you for suggesting it. Senator Lee. Thank you, Mr. Chairman, and thank you, Ms. Haugen, for joining us this week. It's very, very helpful, and we're grateful that you're willing to make yourself available. Last
6:23 pm
week, we had another witness from Facebook, Ms. Davis. She came in, she testified before this committee, and she focused on, among other things, the extent to which Facebook targets ads to children, including ads that are either sexually suggestive or geared toward adult-themed products or themes in general. Now, while I appreciate her willingness to be here, I didn't get the clearest answers in response to some of those questions, and so I'm hoping that you can help shed some light on some of those issues related to Facebook's advertising processes here today. As we get into this, I want to first read you a quote I got from Ms. Davis last week. Here's what she said during her questioning: "When we do ads to young people, there are only three things that an advertiser can target around: age, gender, location. We also prohibit certain ads to young people, including weight-loss
6:24 pm
ads. We don't allow tobacco ads at all. We don't allow them to young people. We don't allow them to children. We don't allow them to minors." Close quote. Now, since that exchange happened last week, a number of individuals and groups, including a group called the Technology Transparency Project, or TTP, have indicated that that part of her testimony was inaccurate, that it was false. TTP noted that it had conducted an experiment just last month, and their goal was to run a series of ads that would be targeted to children ages 13 to 17, to users in the United States. Now, I want to emphasize that TTP didn't end up running these ads; they stopped them from being distributed to
6:25 pm
the users, but Facebook did in fact approve them, and as I understand it, Facebook approved them for an audience of up to 9.1 million users, all of whom were teens. I brought a few of these to show you today. This is the first one I wanted to showcase. This first one is a colorful graphic encouraging kids to, quote, "throw a Skittles party like no other", which, as the graphic indicates and as the slang jargon also independently suggests, involves kids getting together randomly to abuse prescription drugs. The second graphic displays an "ana tip", that is, a tip specifically designed to encourage and promote anorexia. The language of the ana tip itself independently promotes that; the ad also promotes it insofar as it was suggesting an image
6:26 pm
that you ought to look at when you need motivation to be more anorexic, I guess you could say. Now, the third one invites children to find their partner online and to make a love connection: "You look lonely. Find your partner now to make a love connection." It would be an entirely different kettle of fish if this were targeted to an adult audience, but it is not; it's targeted to 13-to-17-year-olds. Now, obviously, I don't support, and TTP does not support, these messages, particularly when targeted to impressionable children. And again, just to be clear, TTP did not end up pushing the ads out after receiving Facebook's approval, but it did in fact receive Facebook's approval. So I think this says something. One could argue that it proves that Facebook is allowing, and perhaps facilitating, the targeting of harmful, adult-themed ads to our nation's children. So could you
6:27 pm
please explain to me, Ms. Haugen, how these ads, with a target audience of 13-to-17-year-old children, how would they possibly be approved by Facebook? And is AI involved in that? I did not work directly on the ad approval system. What was resonant for me about your testimony is that Facebook has a deep focus on scale. Scale is, can we do things very cheaply for a huge number of people, which is part of why they rely on AI so much. It is very possible that none of those ads were seen by a human. And the reality is, what we've seen in repeated documents within my disclosures is that Facebook's AI systems only catch a very tiny minority of offending content, in the best-case scenario. In the case of something like hate speech, at most they will ever get 10 to 20 percent. In the case
6:28 pm
of children, and that means drug paraphernalia ads like that, it's likely that if they rely on computers and not humans, they will also never get more than 10 to 20 percent of those ads. Understood. Mr. Chairman, I've got one minor follow-up question, which should be easy to answer. So Facebook may claim that it only targets ads based on age, gender and location, even though these things seem to counteract that. But let's set that aside for a moment and assume they're not basing ads on specific interest categories. Does Facebook still collect interest-category data on teenagers, even if they aren't at that moment targeting ads at teens based on those interest categories? I think it's very important to differentiate between what targeting advertisers are allowed to
6:29 pm
specify and what targeting Facebook may learn for an ad. Let's imagine you had some text in an ad. It would likely extract out features that it thought were relevant for that ad; for example, in the case of something about partying, it would learn that partying is a concept. I am very suspicious that personalized ads are still being delivered to teenagers on Instagram, because the algorithms learn correlations, they learn interactions, where your party ad may still go to kids interested in partying, because Facebook almost certainly has a ranking model in the background that says, this person wants more party-related content. Interesting. Thank you, that's very helpful. And what that suggests to me is that while they're saying they're not targeting teens with those ads, the algorithm might do some of that work for them, which might explain why they collect the data even while claiming that they're not targeting those ads. And I can't say whether or not that's the intention,
6:30 pm
but the reality is, it's very, very difficult to understand these algorithms today, and over and over and over again we saw these biases that the algorithms unintentionally learn. And so, yeah, it's very hard to disentangle these factors as long as you have engagement-based ranking. Thank you, Ms. Haugen. Thank you very much, Senator Lee. Senator Markey is up. Good morning. Thank you, Mr. Chairman, very much. Thank you, Ms. Haugen. You are a 21st-century American hero, warning our country of the danger for young people and for our democracy, and our nation owes you just a huge debt of gratitude for the courage you're showing here today. So thank you. Ms. Haugen, would you agree that Facebook actively seeks to attract children and teens onto its platforms? Facebook actively...