
Washington Journal: Anthony Salvanto, C-SPAN, August 24, 2018, 2:06pm-2:55pm EDT

tested gene therapy. these are infants not expected to survive more than 15 months. over the next few months, something truly dramatic happened. 100% of the kids who got the highest dose of gene therapy were alive. a girl of 3.5 can not only talk and walk, she can even do push-ups. >> listen with the free c-span radio app. the democratic national committee holds its summer meeting in chicago this weekend. committee members will consider potential changes in the nominating process.
live coverage begins at 10 a.m. eastern tomorrow morning. "washington journal" continues. host: anthony salvanto is at our desk this morning. he is the elections and surveys director at cbs news and the author of the new book, "where did you get this number?" thanks for being here. guest: thanks for having me. host: before we talk about the last time around, the president last night was tweeting about the next election. i want to share with our viewers what he had to say on twitter. he tweeted this out last night around midnight, referencing stories and re-tweeting a 2020 poll from dan scavino, jr. i want to get
your reaction. guest: i did not see the tweet; i'm hearing you tell me about it. one of the things i always caution people about is looking at 2020 polls. it is a long way away. i'm old enough to remember when jeb bush was leading in the polls for the republican presidential nomination, and that was in 2015. when people see those kinds of polls, they don't often say, those polls were wrong or off, because we know that things change and the dynamics change, and we know that polls like that, especially now looking ahead to 2020, are often name id polls. no one is going to pick a name they don't know; they're going to pick the most known name. i'm just trying to get through 2018. host: too early? guest: it's too early. host: when would you start? guest: that depends on when the
campaign dynamics really start. like any poll number, you want to measure something that is based on what people are actually seeing and how they might actually make their voting choice. i've got my hands full with 2018. host: you are focused on 2018. a headline from your book: why the blue wave is unlikely. guest: we saw the house moving toward the democrats over the summer, but what i want to say, regardless of the headline, is that we have got the house right now at 222 seats for the democrats. that is our best estimate when we look at the competitive races. but there is a margin of error around that. that margin of error is 11 seats to either side. that is 5% of the house.
it encompasses the possibility that republicans hold the house. we have seen it edge toward democrats over the summer. the house is in play. it is in flux and we are going to watch it. we have two months to go. one of the things i caution, and i say this in the book, is don't treat any poll like it is a prediction of the future. treat it like it is a read on the dynamics of what is happening now, and look at what the moving parts are, what the things are that could change, because things almost always do. host: what about the senate? guest: the senate map has the democrats defending seats in a lot of states that voted for president trump. their map is comparably harder in the senate than it is in the house. you go state to state and you look at north dakota, you look at missouri. we've got competitive races for democrats in those places that they have
to hang onto. we see a competitive race in florida, as well. that map gets a little trickier for them. there are a couple of possible pickup opportunities for democrats. most folks agree arizona might be one, which will certainly be in play. overall, what has happened is partisan straight-line voting has just increased and increased, and, these days, more than nine out of 10 republicans vote for republicans and democrats vote for democrats. when people like me look at these races, one of the first things we look at is how did they vote in the presidential election? we know that all of those folks are probably going to stay with their party and that creates the baseline. that is why i say it is a red state or a blue state. that is where that idea comes from, and it starts with how people voted. host: what could make a blue wave happen if those people
stick to how they voted in 2016? guest: one of the first things is turnout. when you look at congressional voting, i think most folks realize incumbents get reelected no matter what people think of congress. many districts are lopsided to one side or the other, more than 60% voting democrat or republican. so, we look at those competitive house districts, and what i see are a couple of patterns. they have more people with college degrees on average and they are a little more affluent than average. those are the kinds of voters the democrats have been trying to put in play. so far, we see them tilting a little bit toward the democrats. but the other thing about them is that they are from coast to coast, and they are in the suburbs primarily. we see them from new jersey to california, we've got a few in the suburbs of texas, dallas,
houston, etc. they are in political battlegrounds that we don't often see in a presidential election. so, it is sort of a new set of competitive territories in that regard. finally, turnout. we always say it all comes down to turnout, but the fact is that in the polling, anyway, the democrats are dependent to some degree on bringing out people who don't typically vote in midterm elections. in midterms, you might see something around 40% turnout, less than half of eligible voters. that will have to go up. that will have to be people who might vote in presidential years, but don't vote in midterms, to kind of change that midterm dynamic, because the more habitual voters are often a little older and more conservative. host: what about republicans? what do they have to do to hold off losing control of the house and the senate? what issues for both parties poll well to energize their
base and get new people who would not necessarily vote in midterms to come out? guest: those things are intertwined. the economy: people say the economy is good. so historically, you might say, well, that would favor the party in power. in these districts, people do say the economy and their local areas are good. the question becomes, are they voting on it? oftentimes what we are seeing now is that there is a large gender gap, where women are preferring democrats and men republicans. those women -- moderate women, and a small number of republican women who are more undecided -- appear to be voting in part on the president. to what degree he becomes a factor, as opposed to voting on the economy, is sort of a central thing to watch going forward. host: so, that works for both
democrats and republicans, either voting against or for the president? guest: in midterms, we often see a president be a factor. they are "on the ballot." we definitely see it in this one. for those who feel the president is a factor and are voting against him, the people voting against him are saying they don't like how he handles himself personally. people voting for him say, regardless of that, they like how he is managing the government. again, it is what kind of line of thinking rises to the top of mind; that is the key thing to watch. host: why should people trust you and other pollsters after the 2016 election? how are you doing things differently this time around? guest: it starts by explaining very transparently and in plain english what it is that we do. that is one of the things i try to do by writing the book.
oftentimes polls get lumped in with predictions, with the forecasters. i do a different thing. i don't presume to tell you what you are going to think. you should judge me on whether or not i can explain the way people think right now. if i do a good job of that, it is because folks think that i've asked the right questions, that i've framed the issues. people can pick an answer and say, yes, that is how i feel, or no, that is not how i feel. or if they hear me describe somebody that doesn't agree with them, they say, i get it, i understand why somebody thinks differently from me. and the fact that things can change, the science of it, is part of the thing that we have got to describe. i always get the question, how do you talk to 1,000 people and know the country? i think that underpins a lot of
the skepticism, and very fairly. we all make decisions based on the things we see around us, but there is a whole world of people that we don't get to talk to. how do we do that? what i've described for folks is we create the nation in microcosm. when we take a sample, as george gallup and other pollsters pioneered, we create the nation in miniature. if we don't talk to you, then we have talked to somebody like you and they have represented you in the poll. folks say, there is no one like me. everyone is special, they've got their own experiences. that's true. but on the broad brush strokes that we use in doing the polls -- do you think the economy is good or bad? are you voting for the democrat or the republican? -- we can find someone who would answer the questions the same way that you would. if you are a republican, there are tens of millions of people like you in that regard. democrats, same thing.
there are tens of millions of other such folks. they represent you in the polls. that is the science behind how we do it. what i want to convey is that if we do call you, then you are very special to us and you will represent all of the people who are like you. host: so don't hang up? [laughter] guest: i hope not. it has gotten harder to find people, no question about it. we have to dial thousands more numbers now than pollsters did 10 or 20 years ago. people are busier, and we have moved a lot online. we have moved interviewing online because it is private and faster, and that is one of the things i want to walk people through. we now have to work a little harder to find you, but it is still our mission to do that.
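a minimal sketch of the "nation in miniature" idea described above: weight a raw sample so its demographic mix matches known population shares. the age groups, targets, and counts here are hypothetical, chosen only for illustration, and are not cbs news's actual weighting scheme.

# Hypothetical post-stratification weighting: adjust each respondent's weight so the
# weighted sample matches assumed population shares. Groups and targets are made up.
from collections import Counter

# assumed census-style targets for one demographic variable (illustrative only)
population_share = {"18-34": 0.30, "35-54": 0.33, "55+": 0.37}

# a raw sample that under-represents younger, harder-to-reach respondents
respondents = (["18-34"] * 150) + (["35-54"] * 330) + (["55+"] * 520)

sample_share = {g: n / len(respondents) for g, n in Counter(respondents).items()}

# each respondent in group g gets weight = population share / sample share,
# so an under-sampled group "speaks for" more people like them
weights = {g: population_share[g] / sample_share[g] for g in population_share}

for g, w in weights.items():
    print(f"{g}: sample {sample_share[g]:.2f}, target {population_share[g]:.2f}, weight {w:.2f}")

under these made-up numbers, the 18-34 group ends up with a weight of about 2.0, which is the mechanical version of "somebody like you represents you in the poll."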
host: the phone lines are lighting up. the book is "where did you get this number?: a pollster's guide to making sense of the world." here are the phone lines: republicans, (202) 748-8001. democrats, (202) 748-8000. independents, (202) 748-8002. let's go to kurt. caller: i've been reduced to trying to read facial expressions. i would say, greta, you are awesome. what are the primary motives of pollsters? can you tell me what political affiliation you represent in your voting booth? that, i truly believe, is the undermining factor that minions and deplorables like myself are being trained and coerced to read. that is a rough question. thank you so much. i don't
know what political affiliation you may be. host: i will take it first to say that i'm not going to tell you, because it doesn't matter. the host of the show, the people who sit in this chair -- our job is to moderate the conversation in a neutral way and allow both sides of the conversation to be heard. that is the role that we play, and it is an important one. guest: my job is to listen to you, the american people. when i ask poll questions, and i do it well, there is a place where everyone can go. if you take the poll and you see that question, you are going to say, that's my answer, i get that one. that is the science of polling. it is the science of measuring how people feel, and when we do that in questions, we try to give -- we do give -- everybody a home and a place to go. when we represent people in the polls, we should be describing
people of a range of viewpoints. we should be describing everybody as they are. i think sometimes the misunderstanding comes because folks see a poll and they say the majority maybe disagrees with them, or the majority agrees with them. oftentimes, we all associate and hang out with people who might agree with us more than disagree with us. that might just be how we choose our friends or social circle. oftentimes, that leads to confusion, because people say to me, nobody i know feels that way. that may be the case, but then again, you might not necessarily be talking to the folks that we are talking to, because we have been calling or surveying online people from all walks of life and all over the country. and thank you for your question. host: thomas, a democrat in texas. caller: good morning, america, again.
you wrote a book on this? when obama became president, you knew there was going to be a republican next. listen to the a.m. radio. listen to rush limbaugh, hannity, and the rest of them. i said, trump is going to be president. host: are you saying that because we had a democrat, because we had president obama, the pendulum was naturally going to swing? caller: of course. if you listen to rush limbaugh. [indiscernible] caller: it started way long before that. guest: the book is on how we do polling, how we try to
understand what people are thinking. it is an attempt to tell you where your numbers come from, why it is that you see something at 95% or 2%, and the political dynamics. the back-and-forth -- you can read where that is going if you follow the polls or read the polls like the pros do, and hopefully that is helpful. host: how can somebody tell if a poll is a good one? guest: it has to be that microcosm of the country. it has got to have the balance of republicans and democrats. a lot of people think that we don't call cell phones. good phone polls do call cell phones.
we call mostly folks on cell phones. the way we put the sample together is often indicative of whether or not a poll is done well, at least on the phone. online, it is the same thing: the microcosm of the country. a good pollster should tell you how they put the sample together. oftentimes, it is up to us in the news, when we present the poll to you, to have vetted it. we have looked at it and said the balance is right, the demographics are correct, it's got that right mix, that right regional mix, so that when we present it to you, i have done some of that legwork for you. host: what does the margin of error tell you? is it indicative of a good or bad poll? should someone look at the margin of error? guest: absolutely. suppose 66% of people
say the economy is good. nobody ever says to me, no, you are way wrong, it is 64%. there is a margin of error on the estimate. people want to know if they think the economy is good, but i can put that range on it. on a 51% to 49% race, there is uncertainty there, and people say, you can't tell me who is ahead or behind. but that is the limit of what we can do, quite candidly. we are going to have an estimate that has that range around it, such that, as i described in the book, if we were to repeat the poll again and again, the estimate would keep falling in that range. the good news about the margin of error is to think of it as a byproduct of the fact that when we do it the right way, we have ruled out extreme numbers, the idea of being way off. the right number, the true population number, might be within that range, and that is as well as we can do.
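for readers who want the arithmetic behind that range, the textbook 95% margin of error for a simple random sample depends only on the sample size and the proportion; this is the standard formula, not necessarily the exact one any given pollster reports.

import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion from a simple random sample."""
    return z * math.sqrt(p * (1 - p) / n)

# a poll of 1,000 people carries roughly a +/- 3 point margin of error, so
# "66% say the economy is good" really means somewhere around 63% to 69%
print(f"n=1000: +/- {margin_of_error(1000):.1%}")
print(f"n=400:  +/- {margin_of_error(400):.1%}")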
host: where was the margin of error in the 2016 general election heading into election day? the days, the weeks before? guest: one of the things that led to confusion is that the national polls turned out to be very good, the national polls reflecting the national popular vote. sometimes, a number can be accurate but not tell us the right story. in the 2016 case, the real story turned out to be these late swings in the upper midwest, specifically in wisconsin and michigan, and to some extent pennsylvania. what i draw away from that as a lesson is, now, make sure we do our polling focused in on the places that could swing, and try to tell you that story. i wish i had done more polling in michigan and wisconsin. i did not in october. that is one place where, even now, for 2018, when i talk about my house estimates in the house races, i'm doing that polling very specifically in the competitive house districts and the ones that could move. one lesson to carry forward for folks is to say, focus on the places that could swing. i already see out there now that there are still national polls for the house.
people probably watch this show and follow this; they see the national generic ballot. that is not necessarily reflective of what is going to happen. it might be accurate in that it is measuring what people are saying nationwide, but in that national number, you are talking to people from outside the districts that are going to swing. translating that into what would happen, and who wins, is something that people debate, and there is not a straightforward way to do it. use caution when you see the national generic ballot, and focus in on the places that could move. host: let's go to jerry in wisconsin, a republican. caller: good morning. just listening to your comments about the more intensive polling in the upper midwest -- i'm calling from wisconsin. i wanted to ask whether
you felt that people who are using blocking apps for telemarketers might be interfering with your polling numbers. i always participated in polls each year when they would call. now we get flooded with telemarketing calls. i've set up an app through my cable-tv provider that will block telemarketers, but it is blocking everything that is coming in. i have not received one single poll call that made it through the blocking app this year. for those people who set those apps up, do they lean one way or the other? guest: thank you, a terrific question. for that particular app, i don't know. i can tell you this. oftentimes, the people who are harder to find are younger and they are more mobile. the reason for that is that they are not necessarily at home as
much, and they are not necessarily setting up a telephone or landline. they have phones, but they are not necessarily answering them. we look at the composition of the folks that we missed in our polling, the people that said no or the kinds of numbers that did not respond, and pollsters often find it is the younger and more mobile population that is harder to reach. on a broader point, no matter what the technology is, we oftentimes have to dial many more numbers than we used to in order to find people. so, if we can't reach you, i've got to work harder, and the phone room has to work even harder, to go find somebody who would answer the survey in wisconsin like you would have. that is a challenge for us. one of the ways we are addressing it is to do this
online. as people join and participate in surveys online, we set up those panels along with our partners, and we actually go and create a representative sample, and then those folks can take it online. you can take it on your smartphone. that lets us get to people who might need to do it quicker or might feel like they need to do it privately. in a broader sense, we have to follow where people go. as people moved from landlines to cell phones, we went to cell phones. as phones got harder and harder and people went online, we have gone online. one of the stories that i want to tell is we need to keep going where people go. that is what better pollsters do. host: we will go to massachusetts. robert is watching, a democrat. caller: good morning, america.
the polls don't mean nothing. the polls don't mean nothing. we are in a constitutional crisis. you have manafort, cohen indicted. the polls don't mean nothing. host: what do you mean, the polls don't mean nothing? caller: freedom of the press. if you take the freedom of the press, polls don't mean nothing. polls don't mean nothing. when you have robert mueller -- robert mueller is a great man. mueller has his daughter's taxes, he has ivanka's taxes. he knows everything, he knows
about the dark money, he knows about the dossier. host: but what is your point about what you are saying and how it is related to the polls? caller: that is the problem. it is the polls. it is not the real deal. host: ok. guest: well, you know, pollsters have long argued that there is a place in a democracy for people to have a place to express their voice, to say what they are thinking and feeling. regardless of what happens in politics and what folks are doing in congress and what have you, there is a place where we can go and measure what the general public is thinking and feeling.
in a democracy, everybody hopes or thinks elected officials are at least partially responsive to that. pollsters have long argued, before me, that that ought to be the case. we have a situation where people tell us that conversations are getting harder to have. there is increasing polarization, animosity between democrats and republicans. there is a mistrust component to that. in some ways, we are looking at the poll as maybe helping us to understand what the other side is thinking and the reason that people are thinking as they are. some of the things outlined in the book are the amount of mistrust that each side has that is not necessarily warranted.
when you listen to people, there are a lot more things in common than you might realize. maybe in some sense giving people that voice helps to have a conversation. host: a couple of headlines to share with you, jumping off of that call. one headline: democrats seize on corruption as a midterm issue. this morning in the "washington times": democrats feel heat as liberals fan flames of trump impeachment. you have tom steyer, the democratic billionaire, running "need to impeach" campaign ads. then you have republicans, with this headline in the "washington post": skipping immigration policy ahead of the midterms. are these wedge issues? how do they poll? guest: that is one of those things that i'm sort of -- i don't know yet. we don't know yet.
this news has just come out this week. one of the things pollsters always caution is give people a chance to digest it, then go to their sources and put it in context or perspective. one of the things that i've seen so far -- i was just talking about partisan lines -- there is no better example than the russia investigation. right from the get-go, we have seen republicans say they think it is politically motivated. i've even tested the term witch hunt, and they are echoing it; they are responding to the president's messaging on this. they say it is a witch hunt. it is interesting that when they hear the president criticized, they feel they want to defend him. he certainly has a strong base of supporters that sees this, that he is under more
pressure than previous presidents, in their view. democrats say this is a critical matter of national security. those numbers, where we get 8 in 10 on each side, that is where we have not moved much. host: where are the independents? guest: there are a lot of independents, and they are mixed, but we often find that underneath that declaration, they often vote much the same way all the time. there are republicans who call themselves independent and democrats who call themselves independent. independents are more mixed.
there is a sliver of folks who say they want to wait and see what the facts say and what unfolds. that has been about a third on that question. in there, i think there is some room, there might be some room to move, but i think we have to wait and see. host: michelle calls herself an independent in west virginia. you are next. caller: hello. thank you for c-span. thank you for taking my call. i'm a little skeptical about polls, like a lot of people are. a lot of it is how you phrase the question. we once had to create a poll, and the teacher would say, the way you phrased the question, the word you used, would steer a person to answer it one way or the other. is there a way to get the actual questions for a poll -- say a research
center or think tank puts out a poll -- is there any way to get the actual questions and how they were phrased, if you want to look at them yourself? and what do you think about the type of research from, like, cambridge analytica, basically using people's likes and things they shared on social media as a sort of poll to take a temperature of how people felt about issues? they were often using that information to manipulate people, so it was kind of like a poll in a way. do you think others should be more transparent about polls, and do you think they affect people when they go to vote? if they know their side is ahead on something or behind on something, does it inspire them to go out? some countries do have laws that
prohibit certain types of advertising or polls being done close to elections. thank you for answering the questions. guest: thank you for a lot of great questions. ok, let me take the last one first. there are some studies and some folks who say that if people see that their candidate is trailing in the polls, it will dissuade turnout. other people say it makes them want to rally and go show up and get their candidate to catch up. it is not clear. on the other side, if somebody sees the candidate is winning, does that inspire them? there are studies in all directions on this. maybe it all comes out in the wash. on the first point, your question about questions is a really good one. when we put out a poll, you can
get the full poll released. you can see the questions as they were asked. but your question about the way we phrase things is really good. take, for example, if i ask how concerned you are about a volcano eruption or an earthquake. that presumes a degree -- should i be? it is like asking what kind of a job the congressman is doing; that is a matter of degree. how good? but if i ask that a little bit differently -- are you concerned about a volcano eruption? -- now you get a yes or no. now you are going to get perhaps a different measurement. the people who are not concerned see that choice right there. for the people who say yes, i can then ask, to what degree are you concerned?
it is a small, but perhaps meaningful, example of the way that a question is structured. we try to do our best to do it so that everybody has a place to go and we get an accurate measurement. think about the order of questions. if you ask about how well the economy is doing and then you ask about how well somebody is doing financially, the follow-up question might be influenced by their thoughts on the economy. we have to be wary of that; we have to be careful of that.
so putting together a questionnaire is a science unto itself, and we try to do it, as i said earlier, in a way that best reflects how people might be thinking, in a way that is neutral and gives everybody a place to go. and a lot of it starts, oftentimes, by listening. it starts by listening to the way in which people are discussing a matter or a conversation, and trying to pick up on some of that language, so that when you get the question, you read it or hear it, you go, oh yeah, i get it, that answer is mine. host: cambridge analytica? guest: notwithstanding them specifically, i think there is going to be an increasingly interesting area of research, which is: how does what people are seeing and doing on social media both influence them, and to what extent can we measure it or read it in a quantifiable way? at first blush, for as much activity as there is on twitter, not everybody is on twitter. one thing that we wrestle with is it is not just whether you are seeing a given tweet, but whether you are posting, whether
you are writing back and forth to people on social media. we have asked people about that, and we don't get very high numbers of people who are active on it. many of them are reading it; not everyone is engaged. measuring the degree to which people are influenced is tricky. people can see something and they might be influenced by it. at the same time, they might have already been in a social network with people who agree with them in the first place. some of the studies have shown that on twitter, if you are on the right, you are connected to a lot of people on the right. if you are on the left, you are connected to people on the left. there is a self-selection into that network, and maybe the
things that you see don't influence you, but they reflect your pre-existing set of beliefs. host: we will go to erie, colorado. cindy, republican. is that right? caller: yes, it is. good morning. i liked what michelle said about the statistics and how the questions are worded, that it depends on what kind of answer you get. i wonder if you have at all honed in on how the news is reflected, how that affects polling? we know that obama got 80% positive coverage and trump gets 90% negative coverage. how does that play into polling? host: ok. guest: again, we might have a situation in which people going to whatever source for news they want might be determined by or
influenced by their views already. which way does the causal arrow run? do people feel a certain way and then seek out or pay attention to sources that reinforce what they already think? one of the things that we have seen is that when partisans talk about what they find accurate, what they believe is true is oftentimes going with folks, political officials, or sources that already reinforce what they think. that is part of the partisan divide, quite frankly. host: william, houston, texas, democrat. caller: good morning, c-span,
thank you for all you do. i also believe that poll questionnaires -- [indiscernible] what i want you to know is that in the swing states, michigan, wisconsin, and pennsylvania, i believe that the lack of participation by minority and disgruntled minority voters is what elected our president today. has there been a poll directed toward those communities, the african-american community, as to the lack of participation, the number of voters who decided to stay at home? i think that is what pushed it over the top. thanks again. host: got the question. guest: thank you for the question. there are studies on that. there were academic studies,
census studies as well. we did see a lot of, frankly, what i saw on election night: as the returns were coming in from some of the high democratic counties in places like michigan, wisconsin, pennsylvania, one of the things we noticed was that turnout was not quite at what we had expected. turnout there was in fact down relative to both the expectations and what it had been in past years, and there is no question that in the pre-election polling, we had anticipated more of those folks turning out. going forward to 2018, one of the things we need to do better is make sure that the turnout models reflect who might really show up. one of the things we saw in 2016
is a relative lack of enthusiasm for hillary clinton. that may have had an impact in that turnout being a little bit lower. turnout is a thing that we wrestle with, because what a pollster is trying to do is gauge not just how you feel, but what somebody is going to do about it, what action somebody is going to take. there are a lot of variables. you need to have time, you need to have interest, you may not get a chance on election day. we wrestle with that.
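pollsters often turn "who might really show up" into a likely-voter score built from vote history and stated interest; the inputs, weights, and cutoff below are made up for illustration and are not the cbs turnout model.

def likely_voter_score(voted_last_midterm: bool, interest_1_to_10: int,
                       knows_polling_place: bool) -> float:
    """Toy likely-voter score combining past behavior and stated interest.
    The weights are illustrative placeholders, not a real model."""
    score = 0.5 if voted_last_midterm else 0.0
    score += 0.3 * (interest_1_to_10 / 10)
    score += 0.2 if knows_polling_place else 0.0
    return score

# respondents above some chosen cutoff would be counted as "likely voters"
print(likely_voter_score(True, 9, True))    # habitual, engaged voter
print(likely_voter_score(False, 6, False))  # newer, less certain voter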
host: tell our viewers where you work on election night and what role you play when they are watching cbs news and they see these anchors calling the races? guest: my decision desk makes the projections; we call the races. we get vote streams in from all over the country. we are looking at that in a set of computer models. it is a puzzle that is sort of appearing to you piece by piece, and you are trying to make sense of the whole from looking at a part of it. a few counties are in. is that a representative group of counties? no, it is not, so we wait. as more come in, we start to see a pattern. we are seeing county after county where turnout is strong for trump, where there has been some change from past years toward donald trump, until we have enough data in here that we are confident we can make a projection, because the pattern is so strong. host: what is the source of that data? guest: the source of that data
is county and precinct votes. host: not exit polls? guest: exit polls too. interviewers are deployed to a representative sample -- i keep saying representative -- a representative sample of precincts around a given state. they interview voters, they take the responses to that, and that gets relayed back to us. part of our role at the decision desk is looking at that and saying, is this a clear enough pattern? is there a big enough gap between the two candidates that, even withstanding some error, which there will invariably be in that poll, we are still confident that one candidate has won? when we make the projection, we
are not telling you, technically, what is going to happen in the future; we are telling you what has happened. the votes have been cast. they are revealing themselves county by county, or person by person in the case of an exit poll. we are trying to make sense of that pattern. we bring you the projection when we are confident that one candidate has won. there were some states that were pretty straightforward to call. they were pretty lopsided. there were a few -- pennsylvania was called pretty late into the night, and we were waiting on wisconsin for a while. one of the reasons we were doing that is we were looking intensely at some of the high democratic areas and saying, wait, is turnout really down? is turnout really down from expectations? we started to see that pattern in county after county. we started to see that there was a pattern.
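the decision-desk logic described here amounts to projecting a winner only when the observed lead is larger than any plausible error in the incoming data; a simplified sketch with an arbitrary error cushion, not a real network rule.

def safe_to_call(leader_share: float, runner_up_share: float,
                 estimated_error: float = 0.04) -> bool:
    """Project a winner only if the lead clearly exceeds a conservative error cushion.
    The 4-point cushion is an arbitrary placeholder, not an actual decision-desk rule."""
    return (leader_share - runner_up_share) > 2 * estimated_error

print(safe_to_call(0.56, 0.42))  # True: the lead is far outside the cushion, call it
print(safe_to_call(0.51, 0.49))  # False: too close, wait for more counties to report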
host: talk about the button that you push? guest: [laughter] there is a click. in our projection system, what we do is i go and push that button and it puts the w flag on a candidate. once i click that button, we have called the race. that gets relayed up to graphics, and i tell people in the control room. people ask me, do i get nervous when i do that? it is a lot of pressure, without question, but my test is always: am i going to be nervous or not? if i call this right now, am i going to worry about it when more votes come in in five or 10 minutes? that is kind of my test. if i think i'm going to be nervous, then i don't call the race. host: you are the only one allowed to push the button? guest: yes, i push the button. [laughter]
host: viewers can read more about anthony salvanto, the role he plays at cbs, and about polls in "where did you get this number?" later today, president trump is visiting ohio for a political fundraiser in downtown columbus. president trump's speech is scheduled to begin at 6 p.m. eastern, live here on c-span. >> he is one of the most qualified nominees ever picked, and he has contributed a great deal to his community and the legal profession, besides being an outstanding judge on the d.c. circuit court of appeals.
>> kavanaugh failed spectacularly. >> after conducting a thorough and objective review of his record, i'm confident judge kavanaugh will be an excellent addition to our nation's highest court. >> watch the senate confirmation hearing for brett kavanaugh live tuesday, september 4, on c-span3. watch anytime on c-span.org or listen on the free c-span radio app. >> president trump held a roundtable discussion with administration officials and republican members of congress.
