
tv   Washington Journal Bret Jacobson  CSPAN  March 28, 2021 10:17pm-11:00pm EDT

10:17 pm
the history of our nations, and still, we've not emerged from the crisis, even if we see real reason for hope. but our cooperation with allies and partners provides us with more than a small bit of optimism and faith. it shows us the way forward -- together, rooted in our shared values, and committed not only to rebuilding our alliances and partnerships, but to building them back better. if we do this, there are no challenges we cannot and will not overcome. thank you very much. >> go to c-span.org/coronavirus for the federal response to the coronavirus pandemic. find the latest briefings and the biden administration's response. use the interactive gallery of maps and follow the cases in the u.s. and worldwide. go to c-span.org/coronavirus.
10:18 pm
joins us for a discussion on efforts to regulate big tech. he is cofounder of red edge. bret jacobson, explain what you do and how you do it. guest: we try to help our clients explain complex issues about how we govern ourselves using a lot of the digital platforms and emerging technologies. our goal is to have the population as educated and active as possible. we are focused on keeping those platforms and conversations free and vibrant. host: does your work have a political leaning to it? guest: we tend to work with the freedom side, lower taxes and less regulation. we want people to be as free as possible and as unencumbered by the government as possible. host: largely on the issue of
10:19 pm
moderating social media platforms, regulating big tech. do you think big tech should have more regulation? guest: it is perfectly fair to keep asking a lot of questions. i do not think their behavior requires greater regulation. you have one side, the democrats, saying they are not doing enough to moderate content, and you have the other side saying they are doing too much regulating of content. they are in that uncomfortable but effective and efficient middle ground. host: section 230 comes up when we have this conversation. guest: it is a rule that keeps trial attorneys from suing websites for every comment that somebody does not like. it is a protection that was put
10:20 pm
in place to allow internet websites to grow. it has done that pretty well. it has kept the internet relatively free and vibrant. there are questions about whether that should be curtailed or used as a cudgel to get what politicians want, which is a slightly more favorable outcome for their side in tumultuous times. host: what does section 230 mean for red edge? guest: we can get our grassroots individuals to comment, talk to congress, talk to regulators, talk to each other, and share content and ideas. for us, it is important that 230 is kept intact, or that the spirit of it is kept intact. the great thing about the internet has been the ability to share massive amounts of information with more people. we are better informed than ever.
10:21 pm
after having watched the hearing, we are concerned that both sides are trying to figure out what color is the new golden goose. host: mark zuckerberg stated he is open to changes in section 230 after a couple of decades. here is what he had to say. >> i would support two specific changes, especially for large platforms. i want to call out that for smaller platforms, we need to be careful about any changes we make that remove their immunity, because that could hurt competition. first, platforms should have to issue transparency reports that state the prevalence of harmful content across all the different categories, everything from child exploitation, incitement of violence, terrorism,
10:22 pm
intellectual property violations, pornography -- >> where would those transparency reports be reported to? >> as a model, facebook has been doing something to this effect every quarter. we report on the prevalence of each category of harmful content and on how effective our systems are at removing that content. companies should be held accountable for having these systems. i would propose creating accountability for the large platforms to have effective systems in place to moderate and remove clearly illegal content, things like sex trafficking or child exploitation or terrorist content. it would be reasonable to
10:23 pm
condition immunity for the larger platforms on having a generally effective system in place to moderate clearly illegal types of content. host: on those recommendations from mark zuckerberg. guest: it makes a lot of sense to prioritize reporting. one of the more interesting lines of questioning from democratic house members was about accessing more data to see how the platforms are performing. more information is probably better. it could probably be done in a voluntary fashion. if that is where it needs to go so people start to feel safe, that is great. it is important that he called out that very small companies need to be exempted from that. that regulatory burden would kill them before they had a chance to compete.
10:24 pm
it is good that facebook is offering some potential solutions. hopefully, congress will start to look at that as a conversation so we are improving our conversation and not just regulating them to death. host: bret jacobson is cofounder of the digital advocacy company red edge, joining us for this conversation about regulating big tech. would you support more regulation for big tech? (202) 748-8000. if you would not, (202) 748-8001. you said companies should be able to regulate themselves. what happens if they do not do it voluntarily, or are not doing it well enough? and who decides what is not well enough? guest: those are good questions
10:25 pm
for hearings. i do not know if there is a regulatory framework established. you have recourse to the courts for civil remedies if necessary. i do not think there is a great framework to start with, so that would be the more interesting question: what would a successful framework look like? right now, i do not think government is equipped to move at the speed of digital conversations. what we don't want is a bureaucracy that moves in months and years trying to make decisions regarding platforms and online dialogue that happens in seconds and milliseconds. host: when would you remove something on a platform that you have worked on? guest: if it is anything inciting violence, a threat to an individual, or inappropriate for children. i am a parent of three girls and
10:26 pm
i am not looking for the internet to be this wild thing. clearly illegal content can go quickly. it starts to get a little more difficult to decide once you are trying to gauge opinions. the conversation framed in the hearing about disinformation and misinformation is where it starts to get really sticky, because in america's discourse, the right and left are having a hard time agreeing on what a fact is. host: steve scalise brought up this idea of censoring based on political bias on the social media platforms, directing his question to jack dorsey.
10:27 pm
>> this is the new york post, a newspaper that goes back to 1801, founded by alexander hamilton. this article, right before an election, about hunter biden was banned by twitter. when you contrast that, you have this washington post article that was designed to mis-portray a conversation between president trump and the georgia secretary of state. since then, parts of this have been debunked. i want to ask mr. dorsey, do you recognize there is this real concern that there is an anti-conservative bias on twitter's behalf? do you recognize that this must stop if twitter is going to be viewed by both sides as a place where everybody will get fair treatment? >> we made a terrible mistake with the new york post.
10:28 pm
we corrected that within 24 hours. we do not write policy according to any particular political leaning. if we find any of it, we root it out. we will make mistakes, and our goal is to correct them as quickly as possible. host: bret jacobson, i know you watched that hearing as well. guest: i thought it was a perfectly fair question. i thought throughout the day, there were some questions that were off-the-wall. i thought that one was spot on. it was a pretty honest answer by somebody running a company that is pioneering new technology and new ways of connecting people. there will be a lot of mistakes, and we have to distinguish between mistakes of intent, which is what republicans and conservatives are concerned about, and mistakes of practice and habits, which is what the
10:29 pm
platforms are saying is going on. it is not a new question by representative steve scalise. this has been happening for years. it is something platforms need to do a little better job of reviewing internally and discussing with people who have those concerns. host: big tech regulation is what we are talking about. if you support more of it, (202) 748-8000. if you oppose more of it, (202) 748-8001. good morning. caller: what i was going to say is, if people don't like what is on the platform, they don't need regulation. it is a consumer item. you go to another platform. when you sign up for a platform, like facebook or twitter, before
10:30 pm
you even get onto the platform, you agree to the rules. it is not your platform to say, i disagree with it. leave it. don't try to regulate something that belongs to somebody. it is not a public platform. it is something you went to, and you agreed to follow the rules on the platform. host: bret jacobson? guest: i think that is right. one thing i hear frequently is this concern from conservatives that if they do not want to be on a particular platform, they have nowhere else to go. with news that the trump empire will roll out its own social platform, that undercuts one of
10:31 pm
those key arguments that there is not a marketplace for people to go and share their ideas. there are other upstarts. there are several out there where people can go. i think that, by and large, we want people to vote with their digital feet and their wallets and go wherever they feel the most comfortable, minimizing the amount of government interference. host: silver spring, maryland, supports more regulation. are you with us? i will let you keep working on that. this is barry out of new york. caller: good morning. thank you for c-span. i just wanted to point out that we would be opening a big pandora's box if we start to regulate the ability of private companies to remove people from
10:32 pm
their platforms that they deem to be offensive. this idea would proliferate throughout more than just the online arena. i could imagine fox news -- there has been a lot of controversy about videos and comments that don't align with the fox news narrative getting removed from fox news. all these conservative outlets have videos they post and comments people post in response. host: what about this idea of misinformation, or specific disinformation from a foreign country, perhaps, ahead of the election? caller: that could be taken care
10:33 pm
of if these companies were really diligent about making sure they remove their bots. that is where some legislation could be productive -- some assistance from a government agency helping little companies that may want to compete against twitter or facebook, helping them to determine, using algorithms and ai or however they do it. host: thank you for the call. guest: i think the caller is correct that if we are getting into state-sponsored misinformation, we probably also need cooperation with our own government in terms of tracking where that is coming from. there has been some interesting research that shows misinformation spiking during the working hours in moscow, which
10:34 pm
is something we can track working with the government. i think these platforms are very well positioned to be able to hunt down this 99-1 rule that you see online. you see that in conspiracy theories -- a lot of misinformation. hopefully, the platforms are able to figure out where the sources of the most common, most harmful, most deleterious misinformation are coming from. we want those to be the targets and, ideally, to be wasting their time.
10:35 pm
host: do you think the hearings are a waste of their time? they always get a lot of attention, but are they worth it? guest: i do not think they are worth it until both parties on the hill can agree on what the problem is. last week, you had two different hearings going on. the accusations were wildly apart. i do not think it is productive until people can agree on what the problem is. you also have some productivity lost by these random accusations that were being utilized by members using their five minutes to go on and on. you saw some pretty crazy stuff in the hearing and after the hearing -- that these companies have blood on their hands.
10:36 pm
once you are at the point of that wild hyperbole, you are not getting closer to a solution. you are getting further away from it. host: one of the moments you tweeted about is when jack dorsey was asked about twitter being an arbiter of truth. >> it is not often i find myself agreeing with bernie sanders. he said, if you are asking me if i feel comfortable that the president of the u.s. should not express his views on twitter, i do not feel comfortable about that. yesterday, it was donald trump who was banned. tomorrow, it could be somebody else. do you think the law should allow you to be arbiters of truth? mr. zuckerberg? >> congressman, i think it is good to have a law that allows
10:37 pm
platforms to moderate content. but, as i said, i think we benefit from more transparency and accountability. >> mr. dorsey? >> i do not think we should be arbiters of truth, and i do not think the government should be either. host: your reaction to that response? guest: libertarians have a new hero in jack dorsey for putting it that specifically. i do not think the history of free speech and free expression would bear out that we would want government deciding what we see in our newsfeeds. we probably do not want the platforms deciding what is true and what is not true. i think it is difficult to figure out what is appropriate in terms of optimizing content for users in terms of what they enjoy versus any harms we might be seeing if misinformation is
10:38 pm
being prioritized or if we feel like people are being radicalized. we need to separate questions -- is something good? if not, is the government the answer? rarely, when it comes to speech, is government the answer. if we are concerned about the content that people are seeing and the unhealthy diet of it, what are solutions that could make us better consumers? how can we pressure the platforms to do better? one of the self-interested answers we saw in the hearing was from the ceo of google, who said advertisers, who are their primary constituency, do not want to be around a lot of conspiracy content. there is pressure that could be put through the market and advertisers for platforms to keep their platforms clean and healthy and places for fair and free speech.
10:39 pm
host: on the issue of advertising, is social media being taken over by ads? you can call in. the line for those who support more regulation is (202) 748-8000. west palm beach, florida. caller: i am calling about google. as far as i can tell, we did not respond to their demands for advertising dollars. when you call the 800 number for their customer service department, you get a sales department. you are directed to go to a business website, which directs you to a bunch of pages and says, for $100, you can do this. if you still have a problem,
10:40 pm
click this button, and in ten days, we will send you a postcard. host: what business are you in? caller: it is a small moving company. our business is 90% referrals. i have people calling me in a panic because they booked jobs and they are being told we are closed. host: thank you for sharing your experience in florida. guest: i run a small business, and i would not appreciate it if google was telling the world we were closed. we are trying to stay busy, trying to do good work. i would sympathize with the caller.
10:41 pm
it sounds like the postcard you mentioned is probably part of the identification and authentication system, which is part and parcel of the overall conversation about how we are weeding out misinformation. unfortunately, i am not very successful as a help desk person, but i share her frustrations. host: cleveland, ohio, opposes more regulation. caller: i oppose more regulation because it falls in line with voting. every time there is voting, there is a foreign country interfering with our senators, voting polls, voting regulations. i think the communication has to
10:42 pm
start somewhere to find out who is responsible for letting all of these companies get in and letting these foreign companies interfere with our business. host: bret jacobson? guest: voting information and misinformation around voting should be taken extremely seriously, and to one of the earlier points, that is when you want the state department and your intelligence community to monitor whether it is foreign state actors or foreign nonstate aggressive actors. if we try to put responsibility at the company level, at the platform level, you are reducing the number of resources they can direct at correcting the problem or preventing problems. you need the same sort of government resources that are hopefully protecting our banks and financial institutions.
10:43 pm
so much information, so much of our public perception is shaped in those daily conversations. host: mike doyle asked the ceos how much responsibility their platforms bore for the january 6 attack at the u.s. capitol. >> does your platform bear some responsibility for disseminating information that led to the attack on the u.s. capitol? yes or no answer. >> chairman, our responsibility is to build systems -- >> yes or no answer. do you bear some responsibility for what happened? >> our responsibility is to make sure we build effective systems.
10:44 pm
>> yes or no? >> we worked hard. this election effort was one of our more substantial efforts. >> yes or no? >> it is a complex question. >> mr. dorsey? >> you have to take into consideration a broader ecosystem. >> thank you. i agree with that. guest: i think you can start to hear the frustration in jack dorsey's voice. i am not sure the question was extremely intellectually fair. these people are responsible for companies that might have legal liability if they answer the wrong way. you want private companies to feel a responsibility to try to
10:45 pm
make sure the community they operate in is safe and healthy. i think we can hold them accountable to that general premise -- specifically, operating within the appropriate laws at the city, county, and federal level. it is fair to hold companies to a standard. the question is, are they responsible for the actions of seemingly deranged individuals? the answer is, i don't know. i think the tough question for us to think about is, do we want crazy people to operate where we can see them or where we cannot see them? ideally, we are at least able to monitor their activity and get ahead of it. host: good morning. caller: good morning. how are you?
10:46 pm
i have some questions. whatever happened to slander and libel? what made big tech the court system to decide who has the right to say what and not say something? i mean, big tech is way out of line and does not have the right to tell people what they can or cannot say. that is what they went into business for, and that is what they set up to make money off of. that is what you get. host: what would you want to see as somebody who supports more regulation? caller: number one, i recently bought an amazon fire. the place did not tell me this. i am not into the social media
10:47 pm
stuff. they want my bank account information to even use it. i will not give them that. what is it that makes you think you have the right to all of our personal information? i do not think that should be allowed. host: privacy issues? guest: we have clients that work in the privacy area that are pushing for a federal privacy standard. one of the benefits of that is to make it easier for people to understand the trade-off -- what they get in return for giving up their personal information. we certainly are for more transparency and more clarity. products require information to be able to process payments, so that is a case where the user can decide whether the value is there for them or whether they need to return a product or service. with so much opportunity for
10:48 pm
entertainment, education, and communication, making things more clear and transparent are good goals for product sellers. host: from red edge, the efforts you have to try to educate voters about polls and the positioning of polls -- the privacy issue that comes into doing that? guest: the user would give a service or app information about where they live, which is somewhat personal information. it used to be published in the white pages all the time, but once it is online, people feel a little different about having their address public. you try to make that clear at
10:49 pm
the start of the transaction. i think people are still within several years of figuring out what information should really be private and what information they are comfortable trading away to allow systems and services to offer them easier lives. host: rededge.com. bret jacobson is with us for the next 30 minutes. caller: referencing what the earlier caller was talking about, larry ellison of oracle said that, as far as privacy goes, we gave that away a long time ago. signing off on these agreements at the beginning of various
10:50 pm
different services -- when you are logging on, we gave up a lot of our privacy information. the bigger question is, how do you police the truth? how do you get the truth out there without false information? if you allow people to make the correction, people have to be somewhat informed and engaged. the problem is -- i want to give you an example. i was posting on facebook. i was reading various science, what was coming out of china. i was posting on facebook that we should be wearing masks given what was out there, and i was complaining about the who not yet calling it a pandemic. things started popping up, saying this is going up against common knowledge.
10:51 pm
take a look at the who website. i think the organizations themselves are doing enough, or possibly too much, trying to counter what is going on. host: bret jacobson, take that up. guest: the caller is on to an interesting dynamic that is facilitated by digital technology, which is this move over the last couple of decades to open source technology. it allows lots and lots of different users to contribute solutions to a problem. one of the most important ways that manifests is in cybersecurity. as bad actors try to exploit a loophole, you have a bunch of people acting as self-appointed white blood cells to get rid of the problem.
10:52 pm
that is a better model than having small, relatively isolated governmental solutions try to attack misinformation one-to-one. the sheer scale of people willing to fight for the truth -- their version of the truth -- is a better long-term solution, something we should be betting on rather than putting our eggs solely in the basket of government. host: this issue always comes up when we have these conversations: "restore the fairness doctrine," referring to the requirements that were in place in the 1980s. can a fairness doctrine happen
10:53 pm
in a social media environment? guest: no, and i do not think we want it to. it would not be reasonable to expect to be able to boil down a complex world and try to force equal time. there are 15,000 different perspectives. trying to slow down our media, trying to adjust the media to be more simplistic and two-sided only, does not make us smarter consumers or better informed or wealthier or healthier. host: bill out of alabama opposes more regulation. caller: i wanted to address a couple of issues. disinformation: back when america was viewed as a free country and eastern europe was a repressive regime supported by the soviet union, we had something called radio free europe. we beamed our messages into
10:54 pm
eastern europe. that was a way for the people to hear the truth, because the government shut down all the radio stations that had any differing views. a lot of our media blocks information, and if there is a foreign government that would give us some information, that is a plus. when hillary clinton's emails were being shut down, why couldn't cuba get us the information, or china? host: bret jacobson? guest: a lot of this question over misinformation and disinformation comes down to what sources you will ultimately trust. i do not think there is a great argument that the u.s. federal government has a monopoly on trustworthy communications, and we want as many watchdogs as possible trying to label where
10:55 pm
information is coming from so people can take a piece of information with a grain of salt or an ocean's worth of salt. it is important to start thinking about the difference between misinformation, which is a misunderstanding, and disinformation, which is intentional distortion of our conversation. usually, that comes in the form of meddling with our elections. host: to the lone star state, this is troy in amarillo. good morning. caller: [inaudible] host: you are going in and out. want to try it one more time? caller: [indiscernible] host: we do not have you, but give us a call back. bret jacobson, you mentioned a
10:56 pm
lot of issues came up in this hearing. another one of those issues: how much access, how much open access, minors should have to the social media platforms, asking mark zuckerberg about an instagram for under-13s and whether it should be done or not. your thoughts? guest: i think it is better to be done by large companies that can keep things locked down and keep content being shared to a very small number of trusted content creators -- pbs, c-span, things that are very trustworthy. kids are smart enough that they will find their parents' phone. we want them in places where things are triple verified. that comes with a lot of liability for any company and
10:57 pm
requires a high level of trust. to do it right can be a huge service to parents and to kids. to do it wrong would be a world of legal and social pain for everybody who gets it wrong. host: running short on time. jim, go ahead. caller: thank you for taking my call. i oppose any regulation that imposes on the freedom of speech. i have never heard of red edge before. i do not even know how to text. technology is great, but you do not want technology to run your life. you need to run your life and use technology to your benefit. if i want to hear the news or the weather, i walk outside my door and i will check out the weather. i do not need text. government does not need to be involved in the freedom of speech of this company.
10:58 pm
i do not know how to explain it any other way. host: jim, i think you explained it simply. guest: sounds like we need to improve branding for red edge so people know the name more. there are a lot of interesting questions, and hopefully, people in d.c. and on capitol hill will start to hone in on the most important questions: how are we ensuring that people have the broadest ability to speak freely? how do we police that, and how do we keep government out of it? how do private platforms make sure our trust in them is cared for? host: bret jacobson is cofounder of red edge. >> c-span's washington journal,
10:59 pm
every day, we are taking your calls, live, on the air, on the news of the day, and we discuss policy issues that impact you. monday, linda feldman, and a discussion about the future of the u.s. postal service with the american enterprise institute. then, alexis madrigal. washington journal, live at 7:00 eastern monday morning. be sure to join the discussion with your phone calls, text messages, and tweets. >> c-span is your unfiltered view of government. including mediacom. >> we never slowed
