
Washington Journal: Rebecca Kern. C-SPAN, March 26, 2021, 10:52am-11:27am EDT

>> C-SPAN is your unfiltered view of government, created by America's cable television companies in 1979. We provide C-SPAN to viewers as a public service.
Host: Rebecca Kern reports on technology policy. She joins us to talk about the hearing that took place on Capitol Hill featuring the heads of major technology companies. Good morning to you.
Guest: Good morning.
Host: Who appeared before Congress yesterday?
Guest: Yesterday we had the CEO of Facebook, Mark Zuckerberg, the CEO of Twitter, Jack Dorsey, and the CEO of Google and its parent Alphabet, Sundar Pichai. They were brought before two joint subcommittees of the House Energy and Commerce Committee to discuss the growing amount of misinformation and disinformation being spread on their platforms and contributing to the violent attack on the Capitol on January 6th.
Host: What were members of Congress hoping to learn from these heads?
Guest: They wanted to get them to admit some culpability in the spread of misinformation about the November 2020 election, which a lot of individuals, pro-Trump supporters, believed, and they were urged to go to the Capitol to stop the final confirmation of the election results. They asked a lot of yes or no questions about whether Zuckerberg, Pichai, and Dorsey agreed that they played a role. Each CEO did not want to answer just yes or no. The one who went the furthest in taking some accountability for the "stop the steal" movement, which claimed the election was stolen and spread a lot of propaganda, was Jack Dorsey, who acknowledged that it was spread on his platform and took some responsibility for it being there, although he did say it was a broader issue and that they were labeling it as misinformation on their platform. I do not know if the lawmakers got exactly the "yes, we did play a direct role" answer they were seeking, I think.
Host: Let's watch a bit of the exchange, with Chairman Mike Doyle of Pennsylvania asking the heads of the companies about the January 6th attack and the role social media played. Here is some of that exchange.
>> I want to ask all three of you if your platforms bear some responsibility for disseminating disinformation related to the election and the stop the steal movement that led to the Capitol attack. Just a yes or no answer. Mr. Zuckerberg?
>> Our responsibility is to build systems --
>> I want a yes or no answer. Yes or no, do you bear some responsibility for what happened?
>> Our responsibility is to make sure that we build effective systems --
>> Mr. Pichai, yes or no?
>> I think we worked hard. This election effort was one of our most substantive efforts.
>> Is that a yes or no?
>> It is a complex question.
>> OK, we will move on. Mr. Dorsey?
>> Yes, but you have to take into consideration a broader ecosystem. It is not the technology platforms only.
>> Thank you. Thank you. And I agree with that.
Host: Rebecca Kern, on all three fronts they wanted to go beyond the yes or no and talked about bigger issues at play. Elaborate from there.
Guest: There are a lot of issues at play here. What we did see, as I was mentioning, is that they were labeling these tweets going into the election as misinformation and providing links to election centers and news websites to redirect people to accurate, factual information; Facebook and Twitter actively did that. But they are dealing with billions of posts a day on their platforms, and content gets through and gets viewed widely before they even label it. And then we had former President Trump, who was spreading the misinformation from his account directly. They labeled it, but they did not remove a good deal of his tweets; they were still up there. So his followers took that as truth, and we really did not see the platforms act to take Trump down, to remove his accounts on Facebook, Twitter, and YouTube, until after the January 6th riot, when they claimed they had the authority to do so because of fear of further incitement of violence. In these lawmakers' eyes, it came too late; the platforms did not take responsibility until after that horrific event. But he is still suspended from all three platforms.
Host: Our guest is here to talk about this topic of social media and disinformation in light of the hearing, which you can see on C-SPAN. If you want to ask questions about it, call (202) 748-8000 for Democrats, (202) 748-8001 for Republicans, and (202) 748-8002 for independents. If you want to text your questions, (202) 748-8003 is how you do that. You can post questions on Twitter and Facebook as well. One of the things that has been part of this discussion, Rebecca Kern, is something called Section 230 of the Communications Decency Act. To show our viewers at home a little bit of what it says: "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." That is the technical law. Talk about it in the context of what they were talking about yesterday.
Guest: It is a 25-year-old law, and a lot of people refer to it as the tech companies' immunity shield, a liability shield. They are not held accountable and cannot be sued for the vast majority of third-party content on their platforms. This is where lawmakers have bipartisan agreement: the law is outdated, it has not been updated in decades, and it needs to change. So there is agreement there. Where there is not agreement is how to change it, and how to do so without hurting small startup companies that do not have a large legal department; many have no legal department at all when they are starting out, and liability could run them out of business. Or, if you take away the shield, platforms will not want to remove content, so the internet will become a pretty terrible place. The shield also comes with a sword; the phrase is that the law gives them both a shield and a sword. The sword allows them to remove content if they find it violates their terms and conditions. So they do remove hate speech, violent speech, and other extremist content, though it does not all get caught and not all of it gets removed. Lawmakers want to peel back that law. We have seen a few proposals, mainly from Democrats, basically holding the platforms accountable for violating, say, civil rights laws or international terrorism laws; if content on the platform violates those laws, they have to remove it or be held liable for that content. From Republicans we have not seen any detailed proposals. They mainly complain that when content is removed, it goes after conservatives, so they claim there is conservative bias, conservative censorship of speech. We saw the platforms ban former President Trump, Rudy Giuliani, and other individuals from the previous administration. We have not seen a lot of detailed proposals on how Republicans want to change the law yet.
Host: Given the concern about new regulation coming to the tech companies, have they made an effort to police themselves? Have they expanded on that?
Guest: Of the three CEOs yesterday, the one who came out at the start saying he is open to changing Section 230 was Mark Zuckerberg. If you go on almost any website you will see their advertisements; they are pushing that narrative right now.
The one recommendation he offered is that he thinks tech companies should be providing transparency reports. Facebook already does that quarterly: they report the content they take down, categorize it, and say how much of certain types of speech they removed for violating their terms and conditions. He wants to see that become an industry standard, with all of their competitors doing it. Beyond that, we did not see him say much more about how he wants the law to be changed. Ultimately, if inappropriate content gets by, they do not want to be held liable. He claimed that they cannot always remove everything; they are not perfect. That was the common refrain. We did not hear Jack Dorsey or Sundar Pichai agree to specific changes either.
Host: Ellie in Ohio, Democrats line. You are on first with our guest. Go ahead.
Caller: Good morning. I was calling to say that in my opinion, all three of them are dirty as hell. Their bottom line is money, not the content of the hate speech or anything else they let fly on the internet. All three of their platforms are there to make money, not to worry about what is being said on the platforms. It is just about money, the almighty dollar. The fact that they do not take this crap down before stuff hits the fan shows that this is the bottom line.
Host: That is Ellie in Ohio.
Guest: That was a common refrain among Democrats and Republicans yesterday, that the companies only care about making money while a lot of this incendiary speech and content gets shared widely. A lot of people use the refrain that a lie travels faster than the truth, and that is true on these platforms. Like I said, even if misinformation gets caught by content moderators at the platforms, it has already been shared thousands or millions of times. So it is difficult to control the speech. But I agree with what the caller was saying; there is a lot of contempt for these platforms, that they do not take more responsibility, that they just care about their bottom line, and more incendiary content drives more viewers and subscribers. Even though the services are free, these platforms take a lot of your data and use it, so nothing is free per se. I think that is a real concern, and I think it will be a quandary how these lawmakers look to change the law and how the companies react to the feedback yesterday, and there was a lot of yelling. But I do not know what more will happen after it.
Host: Carla, Wayne City, Illinois, Republican line.
Caller: Yes, I have two questions, really. The first one is, I agree that the social media platforms censor a lot, but I notice they do not censor nearly as much on BLM, and they censor more people that are speaking their mind about how they feel.
And they are not focusing on other things besides Donald Trump or things like that. But then, my second question is, what is the difference between social media and the regular news media not giving the full truth and not being responsible for their actions?
Guest: Those are good questions. I think it is interesting; I do not know if you watched yesterday, but the complaints in previous hearings, which were on Capitol Hill before the election, were about removing conservative-leaning speech and about censorship. The platforms claim not to be biased and not to remove more conservative-leaning speech. I do not have the stats myself as to the breakdown there, but that is certainly a perception shared by Republicans on Capitol Hill. I do think it is going to be interesting to see where the platforms go from here. Remind me, what was the second question?
Host: I apologize, I did not jot it down. Let's hear from Robert, New Jersey, Democrats line.
Caller: Hi, how are you doing?
Host: Fine, go ahead.
Caller: This is coming from a Democrat. I hate the former president. But I am not sure I am comfortable living in a society where a source of information for millions of people is being censored the way it is. I guess it is a private company and they have their own terms and conditions, but you are still taking away information that millions of people want to hear. So I would like to hear your thoughts on that.
Guest: It is a good question, and it is the position they are put in, where they are controlling speech and what information people receive. I do not know if you saw Bernie Sanders raising the same concerns in an interview this week: if a big company can take down a former president, who is to say they cannot take down other world leaders, or progressive leaders, or Democrats? They are very powerful in the moves they make, and they are private companies, so they are not subject to First Amendment requirements; that is limited to the government. They can do what they want on their platforms, but they are still widely read around the world. So one action we did see here when it comes to world leaders on the platforms, as Jack Dorsey from Twitter said, is that they are doing a survey in hundreds of languages around the world on their platform to determine whether to change their world leader policy. The supreme leader of Iran and other world leaders do share pretty incendiary content on their Twitter feeds, and sometimes those posts remain. So they are looking at how to approach this issue, how to evenly, or as fairly as you can do it, moderate speech from world leaders. It is a question they are having to address, and I really think it is what the platforms will have to do, because as we have seen, Congress cannot act fast enough to really punish or regulate these companies until they come to some agreement, so we really have to rely on self-regulation from the companies for the time being.
Host: Here is a bit of the exchange in which the CEO of Twitter, Jack Dorsey, was going back and forth with House Minority Whip Steve Scalise about bias when it comes to censoring speech.
>> This is the New York Post. This newspaper goes back to 1801, founded by Alexander Hamilton, and for weeks this sourced article about Hunter Biden, before an election, was banned by Twitter. When you contrast that, you have this Washington Post article that was designed to mis-portray a conversation between President Trump and Georgia's secretary of state; parts of it have since been debunked, and this article can still be tweeted out. I want to ask Mr. Dorsey: do you recognize that there is a real concern that there is an anti-conservative bias on Twitter's behalf, and do you recognize that this has to stop if Twitter is to be viewed by both sides as a place where everyone will get fair treatment?
>> We made a mistake with the New York Post; we corrected that within 24 hours. We do not write policy according to any particular political leaning. If we find any of it, we root it out. We will make mistakes, and our goal is to correct them as quickly as possible, and in that case we did.
Host: To the larger issue of when these tech companies involve themselves in looking at content and deciding what to let through and what not to: is this done by people with eyeballs, is it done by algorithms, is it a combination of both? How does it work?
Guest: They are not very transparent about all of that. They do have content moderators, for sure.
I think Facebook says that there are 80 around the world that actually look at this content. But it definitely goes through algorithms, computer models, first, and I am sure they tag keywords that are not allowed on their platform to stop these posts before they are widely spread, and then they have the human element. They did share some stats: between October and December of last year, Facebook removed 1.3 billion fake accounts. So they also have to deal with fraud on the platforms as well. It is a constantly evolving problem, coming at them from all directions; I think every CEO uses the phrase whack-a-mole. They are just trying to keep up. It is hard. Honestly, if there were an easy solution on content moderation, I think we would have a change in the law and we would not see this as a problem anymore. It is a really multifaceted, difficult problem.
Host: Robert in New Jersey, Democrats line, you are on with our guest, Rebecca Kern. Go ahead.
Caller: I just asked a question, I think. I do not have any more questions.
Host: Sorry about that. Thank you for letting us know. Florida, independents line, hi.
Caller: Is the guest familiar with the Election Integrity Partnership report on how social media was used to influence the narrative of a stolen election? And do you think this report, which was quite detailed, will be brought up during the hearings?
Guest: I am not as familiar with that; I do not know who put that report out. But you are saying they spread the information that it was not stolen?
Host: If I may interrupt, to the caller: who put out the report?
Caller: It was the Election Integrity Partnership of Stanford University and the University of Washington, plus many partners. Some of the takeaways were that misleading and false claims coalesced into a meta-narrative of a stolen election, which led to the January 6th insurrection.
Host: Respond as you wish.
Guest: That is certainly an issue that was brought up. That is why they were brought to Capitol Hill, to interrogate them on their role in the misinformation on their platforms. As I mentioned at the top, Dorsey went the furthest in saying that he had some responsibility for the widely spread "stop the steal" messaging on the platform, although I would say Facebook had a lot of that on its platform too, and a lot of groups were formed related to the stop the steal movement. He put a lot of the blame on individuals and on Donald Trump, and we saw them suspend Donald Trump as a result. But you have to ask the question: what platform were they using to spread that information? It was their platform. I would say there is more there; I do not know what could happen next, whether it is a lawsuit against the companies, but that comes back to the issue we talked about earlier, Section 230, wherein they cannot get sued for third-party content on their platforms. So it is a real problem, and it is really a problem Congress created through that law as well.
Host: The website for the report that the caller brought up is stanford.edu, if you want to reference that report taking a look at January 6th. Is there a shared definition of disinformation and misinformation? How does this play into the topic?
Guest: I wish they had done that at the top of the hearing. Misinformation is spreading false information without the intent of malice; maybe you did not realize it was not true and you retweeted it anyway. Disinformation is spreading false information with malicious intent.
I would go as far as saying that after the November election, when we had an accurate count of the outcome and now-President Biden was the winner, Trump was spreading disinformation by saying he had won the election. That would be an example of disinformation: knowing it is not true, even if you do not want to agree with that, it was false information spread as if it were truth. I do have to give the companies credit that they labeled these tweets, but they did not take them down, because of their world leader policy, which, as I mentioned, Jack Dorsey is reevaluating. They keep up tweets from world leaders because they are newsworthy, because they are world leaders. In the end they did take tweets down, because we had the final election count and the claims were just wrong.
Host: Can I ask, how prepared do you think the legislators themselves were to ask these types of questions when it comes to technology policy?
Guest: Really good question. We have seen a real range. I had a lot of frustration that some of them could not pronounce Sundar Pichai's name; that lack of education was disappointing. He has been on Capitol Hill multiple times, and he is the CEO of a major company. It was an annoyance and a frustration. But mainly they were pretty well educated, and obviously the younger members understood the platforms better; it happens all the time. I do think we had new narratives coming from the new Republican leader of the House Energy and Commerce Committee, the first woman in that role. She is a mother of three, and she really took up the narrative. All the Republicans, you saw it all down the line, were talking about the impact that social media has on the mental health and well-being of children. The arguments she was making were really compelling, about massive increases in suicide rates in her state of Washington alone; she was citing real increases due to a lot of the loneliness, self-judgment, and negativity that teens and young people face when they look at these platforms continuously and compare themselves to others. She also talked about the growth of child predators on these platforms, and it is a really serious problem. Facebook and Twitter and Google are well aware of it; they do report a lot of this to the National Center for Missing and Exploited Children, and they track these individuals down, but as I said, it is hard to catch everything. The CEOs did not have any answers on taking accountability for the mental health impact that these platforms have on children. I do not think they were prepared for those kinds of questions. She came on strong, and I think the Republicans were really unified in that messaging. If we see legislation to address these issues, to hold these companies accountable for these mental health issues, maybe that will be the outcome of this hearing. We will have to wait and see.
Host: Cleveland, Tennessee, Democrats line.
Caller: Yeah. I was just wondering, he promised that we would get $1300 more per month. Is that a load of bull, or is he going to do something about it, because --
Host: I will stop you there, because this goes back to the previous segment, which is not our guest's topic. David, Grand Rapids, Michigan, last call.
Caller: So, I cannot stand when Scalise comes up there and throws out the lies. I want the internet to tell the truth. When I watch MSNBC, NBC, CNN, even Fox to a certain extent, the thing about Fox is they do not tell the whole truth, they do not tell the whole story. At least with MSNBC and CNN, they are willing to say that things Fox says are wrong, whereas Fox flat out says we are right and you are wrong. My thing is, the mainstream media is not allowed to lie to you, I do not care what anyone says, they are not allowed to lie to you. We need to make the internet the same way by passing the correct laws.
Host: Let's leave it there and let our guest respond to that.
Guest: I think that is fair. Mass media, like a lot of the publications and TV networks you are talking about, are held to different standards; they are held liable and can face defamation lawsuits. That is the crux of the issue they were talking about yesterday. In fact, this morning Fox did get sued by Dominion Voting Systems because of the lies spread about their systems being faulty and not counting up all the votes. So they are liable and can get sued. I sense frustration from the listener and from lawmakers, but here is the thing: lawmakers have the power to change the law. Maybe this growing public anger from their constituents will really push them to act. But the thing is, while we are not in a divided Congress and Democrats control both chambers, you need 60 votes to pass any bill in the Senate unless you change the filibuster. You will need Democrats and Republicans to work together to pass any change. You could technically get a bill through the House without any Republicans, but you will need more Republicans in the Senate.
Host: If it goes there, you hear from many different sectors, when it comes to the internet, that people are concerned about a chilling effect. That is a major pushback as far as passing more laws.
Guest: I do not know what will happen. I do think this law is outdated; 25 years ago the internet was a very different place. I do think you can put forward some narrowly focused reforms that can tackle some of the real, serious, terrible content on these platforms. We did see it happen in 2018, when Congress passed a law that basically made it illegal for sex work and related content to be shared on those platforms, because there was a lot of sex trafficking happening. They made an exemption in Section 230 saying that you are held liable if any content related to sex trafficking occurs on your platform, so the companies can get sued for it. And that was bipartisan. I do have some hope that if the issue gets bad enough, you can get both Republicans and Democrats to agree on a narrow provision to take some accountability and have the companies be liable for certain content on the platforms, as long as they make it narrowly focused. So we will see. Just to add on, there is one bipartisan bill to date in the Senate, the PACT Act, from Senator Brian Schatz of Hawaii, a Democrat, and Senator John Thune of South Dakota. They are trying to update Section 230 and narrowly reform it. It would require the transparency reports that Facebook is calling for, reporting how much of this speech is taken offline, and the platforms would also have to remove content within four days of a court order if a court finds that content illegal. It would put more responsibility on the tech platforms.
Host: Rebecca Kern, you can see her work at bgov.com. Thank you for talking to us today.
Guest: Thank you for having me.
Host: We will finish off the program the way we started: we will get your thoughts on President Biden's first news conference since his inauguration, the scenes that came out of it, and your perception of what took place yesterday. Here is how you can let us know. Democrats, (202) 748-8000. Republicans, (202) 748-8001. Independents, (202) 748-8002. If you want to text us, do that at (202) 748-8003. Facebook is available at facebook.com/c-span, and Twitter if you want to post there.
