tv [untitled] April 6, 2012 5:00pm-5:30pm EDT
5:00 pm
engagement by private sector, i think we should be cautious of that. >> well, thank you, mr. graves, given the fact most of the time voluntary turns into mandated at some point down the road. it does make me wary at the same time. one question, i'd like to -- it's not a question. one thing i'd like to ask you all to do, we all have questions for the record, and we'd like to have those answered sooner rather than later. if you could do it in the next 30 days, that would be great. and on the fee structure, because i am -- i am concerned about what commissioner rosch has said, i know you have budgeted for fees, just for the last five years, if you could give me a list of how much in fees you have, in fact -- >> i can give you that right now, actually. >> okay. >> for the last ten years. >> you are just going to hand me a piece of paper? >> i will. >> perfect. that's great.
5:01 pm
and with that, thank y'all so much for being here. fun having you two. i like having opposites because -- opposite party commissioners because i actually think that it's important to show that while perhaps you don't agree on everything, you do work together. i don't know that that's true across the board in all agencies but i'm thrilled that it is with you all and thank you so much, both of you, for being here and the great job you are doing and for the help that i know you're going to give us in trying to save money because we all need to do that. >> thank you. >> thank you.
5:02 pm
c-span's congressional directory is a complete guide to the 112th congress. inside, you'll find each member of the house and senate, including contact information, district maps and committee assignments. with more about cabinet members, supreme court justices and the nation's governors. you can pick up a copy for $12.95 plus shipping and handling and order online at c-span.org/shop. all this week in primetime on c-span 3, we've been bringing you american history events. and tonight, beginning at 8:00 eastern, from our american artifacts series, we travel to orange county, virginia, and montpelier. at 8:30 p.m., we'll visit a living history museum, depicting early new england life.
5:03 pm
and then at about 9:00, a collection of gowns and white house china and learn more about first ladies in presidential administrations. this week on c-span's "washington classroom," we take a look inside presidential polling. how they're conducted. and how to analyze the results. our guest is frank newport, the editor in chief of the gallup poll. "washington classroom" is a partnership with george mason university and the washington center. this semester's class is focusing on presidential politics. issues shaping the country, with a look back at past elections. >> to the students joining us from the washington classroom here in washington, d.c., representing universities and colleges around the country, we want to focus today on polls and focus groups and the latest from the gallup organization this week, showing that barack obama
5:04 pm
is ahead, 49% to 45%. the editor in chief of the gallup poll is joining us here in our studios, frank newport. thank you for being with us. >> my pleasure. >> as we look at the poll, you tell your audience all the time, it's one snapshot, a snapshot in time. doesn't tell you where the country is going, but tells you where the country is at this moment. >> well, a couple of things -- it can tell you where it is going if you look at the trend. we have economic confidence, consumer confidence, we track daily. you can look at it today. when you look at it going back over the last four years, it was as high in march as we've seen it since we started tracking it in january of '08. it is a snapshot in time, but the trend on that measure tells us that things are getting better. context is important. you are absolutely right, particularly on a presidential ballot. we call that a trial heat. that does not tell us where things are going. and we do know that when we look back in history at elections, where the candidates stand now, when you pit them against each
5:05 pm
other and say who would you vote for, it can have very low correlation to next september and october. we think it's important. we want to monitor the ups and downs but we never hold this out as a prediction of what will be happening next october. >> i want to talk about election years and a couple of ways that you go about polling. let's take the broader issue, how you go about conducting a poll. you try to get a cross-section of the electorate, and that's probably more challenging today because so many people use cell phones versus land lines. how do you do it? >> people always ask me that question, you know, about cell phones. well, remember the objective. when dr. george gallup founded our company in 1935, he had exactly the same objective we do. he became famous in the '36 presidential election doing exactly what we're doing now, asking who are you going to vote for, roosevelt or landon, the republican candidate. we want to generalize from a
5:06 pm
small group to a large group. that's where the science comes in. what can we learn from science to help us do the best job and tell us how accurate we are? and we do it many different ways. gallup used to do it in person. we would randomly sample and interviewers were out there with clipboards, stopping at the third house on the left and interviewing people, and they would literally send the data back in overnight trains. people would grab the numbers off the train and within a day or two have the results. we shifted to telephone in the late '80s at gallup for our primary polling and that's primarily how we do it now. we still use telephones. now, of course, everybody knows the percentage of people who are cpos, not chief petty officers, but cell phone only households, is increasing rapidly. so, we now call cell phones. at least 40% of all of our samples are made with random samples of cell phones. between the two, between calling land line and cell phone,
5:07 pm
randomly selecting both, we think we still get a good representative sample of americans. >> how do you do that? i've never been polled, that's the other question. >> that's the most frequently asked question i get. basically, we start with a procedure, random digit dialing. that means that we can get from various companies the area code and first three numbers in an exchange for every exchange in america that has cell phones or land lines in it. our corporate office is 609-924 and then it's 9600. that's the 609 area code. that's a bank of telephone numbers. what we do is, our computers get all of the banks of exchanges, area codes and exchanges in america, cell phone and land line, and randomly pin digits to the back of them. everybody's number has a chance of falling in there. even britney spears and brad
5:08 pm
pitt's number have a chance of being in there. they are unlisted, but brad pitt has a known exchange, let's say a beverly hills area code, and randomly, we could call him. it's called random digit dialing and it's a procedure by which we think we have a good sampling frame of every possible phone in america. >> so, you call somebody who is a truly undecided voter and they say, look, i don't have time, i don't want to talk to you, i can't deal with this right now. how does any news organization, any polling firm, deal with that? >> we offer them $1 million? no. that's called refusal rates. clearly people that we do contact hang up, say, i don't want to do the interview. and that's not the primary concern, steve. the primary concern is, are the people who hang up different than the people who don't. we monitor that, looking for response bias. to date, we think, even though people hang up and won't do the interview, the
5:09 pm
ones that do, after we weight the data based on census parameters, end up being good representative samples of the population. the fact that some people refuse, we don't think, distorts or creates bias in the data, at least not to this point. >> harry truman has become the patron saint of anybody who is behind in the polls. that photo of "dewey defeats truman," 1948, is one of the classic examples of what not to do with polling. what happened? >> well, that was a famous case, and gallup and other polling organizations at the time polled and said truman would be defeated in his bid for re-election in '48. after it was over and truman won, there were a lot of panels and investigations. george gallup himself said, this is serious, we have to look at it, and tried to. and a couple of different things came to light. one thing, they stopped polling too early. a lot of polling organizations stopped a week or two before the
5:10 pm
election. a lot of farmers and others were surging to truman at the last minute. by stopping early, they missed a last-minute change. and we think the sampling mechanisms needed to be improved. they weren't the same random sampling that we use now. we actually tweaked the methods we used after that was over. to give you an example, in 1980, reagan, who won, only surged in the last day or two of the tracking of the polling, so, had we stopped a week or two before the 1980 election, we might have said jimmy carter was going to win. most of the polls showed that reagan had come back and was going to win the election. that's why today we poll right through the weekend of the election. >> it is a lesson in human nature and conventional wisdom. many newspapers were publishing the headline before the results came in, "the chicago tribune" publishing that headline. >> newspapers have
5:11 pm
learned their lesson. in 2000, another great example of networks calling it ahead of time, once the polls closed in 2000, some declared that gore and/or bush or both had won at one time or the other, and of course it wasn't known in the long run. so, even after 2000, everybody learned their lesson to be even more cautious. so, now, if you watch, i think this november, unless it's a real blowout election, people who monitor elections, the news media, are going to be extremely cautious about calling the election until they are absolutely sure one person is going to win. >> our partners at george mason university are listening to this class online, and we have student questions from the washington center, beginning with anna hall. anna, you are first. >> hi, mr. newport. thank you for being here. my question is, with polling results coming out faster and polling data becoming much more accessible to the american public, in your opinion, has the influence of polling on public opinion and awareness increased
5:12 pm
or do you believe that it's just becoming more background noise in the political arena? >> anna, let me ask you, personally, do polls influence how you feel about a candidate or do they sway your view on an issue? >> personally they don't, but they do make me, i feel like, more aware of how the general public feels. i do like to know those kinds of things. >> okay. >> well, it's a good question, anna. we're often asked that, sometimes in a critical fashion. people say polls are bad because they in turn affect public opinion, or some candidates say there's a bandwagon effect. if i'm shown behind, it's hard for me to win because of that. in answer to your specific question, we don't have a lot of precise data that shows the impact of polls on people's thinking from election year to election year, so we can't quantify how much difference it has made. every time i've been involved in an election, going back to '92, people say there are more polls
5:13 pm
than ever. they say that, we're getting more polls, and they'll say it again this year, we're getting more polls. so, the frequency of polling does look like it is increasing. that's good news for the polling industry, because that means that editors and producers feel that polling is interesting information and that's why they put it out there. but you know, my position on this is, i think the american public's pretty smart. if they want to take into account a poll result in making up their opinion, that's great information. in other words, if i say, hmm, it looks like right now more voters are supporting mitt romney than rick santorum nationally, which is the case, by the way, when you ask republicans who they are going to support, maybe i'm going to vote for romney because others are. i think that's fine information. after all, that's opinions put together from a lot of other people. that's what a lot of our crowd sourcing, comments on websites, are based on today. a lot of people are making opinions based on what other people think. when i reserve a hotel, i go to
5:14 pm
tripadvisor and read through the comments people have made before i make a decision. so, i don't think it's illegitimate if a citizen in this country says, i'm going to at least in part make up my mind based on what other people are thinking and feeling. >> let me go back to the poll that shows barack obama at 49%, mitt romney at 45%. a couple of questions for you. first of all, it's among registered voters, not likely voters, which is a difference. it also is based on democrats and republicans, so it's not just a republican primary poll. and then there's the issue of the margin of error. explain those three points. >> this is registered voters. there are three groups. we can poll national adults, anybody who is 18 and over, whether they are registered or not. and obama does better if you look at the broader group. we ask, are you registered to vote? over 80% will tell you they are. so, that whittles it down a little. and we do look at likely voters. those are the people we think
5:15 pm
are most likely to vote. even in a presidential election, you know, we'll be lucky if we have 60% of eligible voters, the voting-age population, turn out and vote. typically we don't look at likely voters now. our decision isn't until later in the year. a lot of people haven't thought about it. we really don't know. in september and october, people are likely to say, yes, i am paying attention to the election, i do think i'm going to vote, things that we think are more predictive. but generally speaking, the important point is, among likely voters, the republican candidate usually does somewhat better. as you just pointed out, we have about a four-point lead for barack obama over mitt romney, if the election were held today, that's the question we ask, nationally among registered voters. if we looked at everybody who turned out, romney may do better. the truth of the matter is, republicans are more likely to turn out and vote. even in '08,
5:16 pm
when obama had a lot of motivation, republicans were still more likely to turn out and vote in general. >> and typically the margin of error, it can be 2.5% to 3%, as high as 4% -- >> it can be 20%. >> a reliable poll, 3% to 5%, maybe 3 to 4%. what is that and why is it 2.5% versus 4.5%? >> as i said earlier, a poll is simply a sample. what we know from statistics is that if you randomly sample any element from a population, there are statistical laws which govern how accurately it will represent the entire distribution in the population. say you have a big vat of 1,000 marbles and 50% are black, 50% are white. you can write all the numbers down and come up with a distribution of how accurate it will be.
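The marble-vat idea above can be sketched numerically. A minimal illustration in Python, assuming simple random sampling (the ±4% and 2% figures quoted in the interview are Gallup's rounded working numbers, which include design effects from weighting; the textbook formula below gives slightly smaller values):

```python
import math
import random

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """95% margin of error for a proportion estimated from a
    simple random sample of size n (no design effects)."""
    return z * math.sqrt(p * (1 - p) / n)

def sample_share(population: list, n: int) -> float:
    """Draw n marbles at random and return the share that are black."""
    drawn = random.sample(population, n)
    return sum(1 for marble in drawn if marble == "black") / n

# the vat: 1,000 marbles, half black and half white
vat = ["black"] * 500 + ["white"] * 500

# repeated draws of 100 marbles cluster around the true 50%,
# and the spread shrinks as the sample size grows
estimates = [sample_share(vat, 100) for _ in range(200)]

print(round(margin_of_error(0.5, 1000), 3))   # ~0.031, i.e. about +/- 3%
print(round(margin_of_error(0.5, 10000), 3))  # ~0.01, i.e. about +/- 1%
```

This is why a sample of 10,000 has roughly a third the margin of error of a sample of 1,000: the error shrinks with the square root of the sample size, not the size of the population being measured.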
5:17 pm
some samples will show a little bit more, a little bit less. that's where the statistics come in. when we do a sample of 1,000, we then look up the tables and do a variety of other things, and we say, hmm, had we been able to interview everybody, our result is probably within plus or minus 4% of what we would have found if we interviewed everybody. and the size of the margin of error depends on a number of factors, but the biggest one is the sample size. with 10,000 people, the margin of error would be 2%. with 100,000, even lower than that. but if we just sampled 100 people, it would be quite large. the margin of error is based, to a large extent, not totally, on the size of the sample. when i make a speech, i just tell people, all the margin of error does is remind us this is a sample estimate. barack obama's approval today as we speak is 47%. that's our gallup estimate today. but if we interviewed all 18-plus americans, all of the
5:18 pm
hundreds of millions, it might be 48% and it might be 46%. probably not precisely 47%. the margin of error tells us this is an estimate, not a precise value from the overall population. >> allison, we're going to get to your question in a moment from the washington center. chris clark from george mason has a question about how much polls influence public opinion. >> well, we don't know how much they influence public opinion, but they may, and i think i'm fine with that. i think that's fine. if people want to look at the numbers and say, hmm, right now, we find, for example, that americans are not very much in favor of the individual mandate of the affordable care act. i guess if a lot of americans don't like it, i'm suspicious of it, too. that's fine. we can't -- it's hard to quantify. political scientists and others find it hard to quantify how much polling influences thinking. but it's an element of the news coverage. it's an element of information that is out there and clearly
5:19 pm
some people take it into account, though based on all the letters we get, people usually don't believe it, right? if you have a strong conservative and a poll shows people like obama, for example, they'll call us up and say, you can't be right. everybody i talk to doesn't like obama, and vice versa. i think a lot of people ignore polls. >> but if a news organization gets a poll from candidate x, that shows he's surging ahead in the polls, and they get another poll from candidate y who says he's doing well, how much skepticism should people have? >> 100%. people who work for candidates are very smart pollsters. they are the smartest in the business. but it is spin. anybody who is speaking on behalf of a campaign's primary goal is to spin. so, any time a pollster or a campaign operative or campaign
5:20 pm
adviser starts to talk about the status of the race, you should be very skeptical, because he or she is spinning. and that goes for pollsters. pollsters are smart. when they talk to the cameras about how their candidate is doing, they are in a spin position, and the viewer out there should be very skeptical. >> allison, you are next with a question. >> hi. my question was centered similarly around that concept of partisan pollsters and i was wondering, with partisan pollsters such as public policy polling, do they slant or do they have a tendency to slant the results with questions and sampling to show their client either leading or trailing based on what is advantageous at that time? >> well, excellent and complex question. keep in mind that partisan pollsters, their primary job in life is to advise candidates and groups on how to win elections
5:21 pm
and where they stand. and that's all private. as i mentioned a moment ago, a lot of these people are very smart and they do a good job. that's what we don't see. what we're talking about really is, if a partisan pollster releases something publicly, to what degree do we need to have skepticism about it? some pollsters primarily work for democratic or republican candidates. you do have to step back and look at it carefully before you agree with it. if it is a policy issue, my advice is, look at the question wording and the sampling yourself. that's hard to do for the average consumer. that's like saying, if somebody says vitamin c cures the common cold, look at the medical evidence yourself. it's hard if you are not a doctor. but you need to at least look at the question wording and see if you can make an assessment yourself. occasionally they will do a legitimate job, but you do have to have a grain of skepticism when you look at policy polls released by partisan pollsters. in the instance of ppp,
5:22 pm
releasing polling, it's hard to slant the question on who you are going to vote for. then look and see if they overestimate candidates in various races. i would say, as a starting point, if a firm is entrenched and generally works for one party or the other, your initial position should be one of skepticism unless it is proved otherwise. >> you talked about cell phones, and the web. jen griffith from george mason is wondering, how over the next ten years do you think polling will change? >> well, you know, we are doing a lot of work now. i was president of the american association for public opinion research, our nation's biggest pollster group, last year, and i am now the immediate past president, but in my presidential address, i'm sure you read it carefully -- >> i think we covered it. >> i'm not sure if you would have or not, but i talked about the fact that we as pollsters have to be open to whatever source of public opinion is out there. and the biggest phenomenon we're
5:23 pm
finding out is, whereas, as i mentioned, quite a few people refuse to talk to us when we call them, we now have millions of people who want to give their opinion. what are they doing? they are tweeting it, right? they say, i want people to know what i think about something. and facebook posts, people are trying to give their opinion as well, blogs and other forums. so, we have this phenomenon today where millions of americans are desperate to get their opinion out there. so, researchers have started to look at the data. you can get access to tweets and to facebook posts. we're talking about terabytes of data. and you can look at it and see what it tells you. so, even at gallup, we are looking at the data. we have some teams that are looking at it, and our question is, can we learn something that benefits us by analyzing the sentiment and the type of tweets that we see about a topic from day to day and month to month? right now, the jury is out. we don't necessarily see it now
5:24 pm
but the question was ten years from now, so, well, let's talk in ten years. change is the only thing that's constant. >> let me turn to one of my students from the washington center, casey foster. you are next with a question for frank newport. >> my question is a little bit more general. in presidential election years, there is more focus on swing states. do you guys at gallup maybe put more emphasis or extra detail into predicting swing states? do you find it more difficult? i'm an ohio voter and i know, you know, we're definitely a swing state. do you find it more difficult to assess the way we're going to vote? >> in a presidential election year? >> yeah, another good question. yeah, in the electoral college, we know polling utah doesn't add a lot of value. they're going to vote republican. in massachusetts, they're going to vote democratic, no matter what happens, unless it is a
5:25 pm
dramatic year, like in 1972. but we like to continue to poll nationally, because we think the national poll gives us a good barometer, an efficient barometer. we poll nightly. it's hard to poll in 50 states nightly, so, we poll nationally and we think the trends are representative of a swing state. so, if we are seeing, let's say, talking in september, obama moving ahead or the gop nominee moving way ahead nationally, there's no doubt he's probably moving ahead in the swing states as well. technically, is it more difficult to poll in swing states? why? because by definition, they are split in the middle. it is easy for me to poll in utah, because the republican candidate will be 20, 30 points ahead of the democratic candidate. it's not a big deal. but in ohio, or in florida, the vote may be like 50/50, and the little nuances of polling can make a difference. if it is 60/40 in utah,
5:26 pm
a four-point swing doesn't matter much at all. it is a challenge, for those who do, to poll in swing states. we poll in groups of states -- just today, we released with "usa today" an aggregation of polling in 12 swing states, which we think is useful. we don't put resources into polling individually in these states and tracking it all year long. i think it's more efficient to poll nationally and monitor what is happening at that level. >> the ftc has the do not call list, and quite often, if you are at home and you see an 800 number, you may not pick that up. i know you have to deal with that. or if you pick up your cell phone, you may ignore the call. let me go back to that earlier point about people who just, you know, won't participate. maybe not because they don't want to, but because they just don't pick up the call. >> yeah, under the do not call law, pollsters can still call. it's actually good for us. in market research, telemarketers are now squeezed out. you can put yourself on the
5:27 pm
registry. but legitimate public opinion polls still have the ability to call and it's not against the law. we put gallup so it shows up in caller i.d., which gives us an advantage, i think. our name i.d. is higher than if it is john jones research or acme research, or just a number from new jersey. when we call, it says gallup, and people are more likely to pick up the phone. and by the way, when it comes to cell phones, by law, you can't have robo calling, that is, automated polling. you have to have a live interviewer who initiates the call for cell phones. that's what we do at gallup. >> chris, one of the students from george mason university, asks, why is there such a discrepancy in the results from different national polls? >> well, usually there's not. we hear that question, but if you look at it in the big picture, there's really not. like today, look at where obama's job approval rating is. go to a site that aggregates that.
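The aggregation that answer points to can be sketched in a few lines. A hypothetical illustration (the poll names and numbers are made up, clustered the way the interview describes, not real readings):

```python
def aggregate_approval(polls: dict) -> float:
    """Combine approval readings from several polls into one estimate
    by simple averaging, as a basic aggregation site might do."""
    return sum(polls.values()) / len(polls)

# hypothetical recent national approval readings, clustered
# around 47-48% as in the interview
recent_polls = {
    "poll_a": 47.0,
    "poll_b": 48.0,
    "poll_c": 47.5,
}

print(aggregate_approval(recent_polls))  # 47.5
```

Real aggregation sites weight by recency, sample size, and house effects rather than taking a plain average, but the core idea is the same: combining several independent estimates damps out any single outlier poll.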
5:28 pm
you'll find today there is remarkable agreement that obama's approval is around 47%, 48%. if you look at the numbers, all the recent polls are remarkably close. in wisconsin, where the republicans will be voting, not gallup, but a number of polls have been done, and all of them are showing romney ahead of santorum, by ranges of, you know, six or eight points, some a little bit more, some a little bit less. but those are remarkably consistent. no polls out there right now are saying that rick santorum is ahead by five or ten points. they are all showing roughly the same thing, that romney is ahead. that could change before the vote tomorrow, but there is consistency. much more often than not, in answer to your question, there's a remarkable consistency among polls. from time to time, there is a discrepancy. sometimes we don't know what causes it. there was a poll by "the new york times" and cbs a few weeks ago that, unlike a lot of other polls, showed obama's job
5:29 pm
approval rating dropping to 41%. a lot of attention was paid to it. and our poll didn't show the drop and showed the rating was still higher, and that seemed to be the case. why was "the new york times" and cbs poll lower? sometimes we don't know. a sample is only an estimate. it happens to us. occasionally, you get a poll which is an outlier. and that sometimes creates the discrepancy. but i would keep coming back to the point, if you really, as i do, follow polls for a living, usually the estimates of what the population is doing are remarkably consistent, rather than out of sync with each other. >> i want to talk about a very inexact science, focus groups. i want to share with you one of peter hart's so-called lightning rounds. this is what took place in december in fairfax, virginia. virginia is a key battleground state, it was in 2008, barack obama won, a state republicans hope to pick up