Why Do Election Polls Vary So Much?

MICHEL MARTIN, HOST:

This is TELL ME MORE from NPR News. I'm Michel Martin. Coming up, Shirley Sherrod lost her job at the U.S. Department of Agriculture after she was accused of making racist statements in a speech, an accusation that turned out to be both false and a smear. Now she's telling her own story in her own way. She has a new book out and she'll tell us more about it in a few minutes.

But first, Shirley Sherrod's old boss, President Obama, is a week away from knowing if he will get four more years in the White House. And with Election Day so close you are probably hearing, along with all the political ads, a lot of new polls.

In fact, the latest NPR poll is out today and it shows Governor Mitt Romney with a one-point edge nationwide but trailing the president by four points overall in a dozen key battleground states. But of course, ours is just one poll, and polls often differ.

(SOUNDBITE OF NEWSCASTS)

UNIDENTIFIED MAN: And this morning the latest polls are painting a confusing and somewhat ominous picture.

All this according to our brand new CNN/ORC poll.

UNIDENTIFIED WOMAN: The last poll had him trailing the president.

UNIDENTIFIED MAN: The Rasmussen poll, which has it at a dead heat, and the Fox News poll, which has the president out by six points.

MARTIN: We wanted to know why there seems to be so much variance in the polls and what goes into making them, so we've called Simon Jackman. He is a professor of political science and statistics at Stanford University. He also co-directs the Stanford Center for American Democracy. Welcome. Thanks so much for joining us.

SIMON JACKMAN: Thanks for having me.

MARTIN: So Professor Jackman, a lot of people will hear - well, right now let's just say that the polls all seem to be showing a very close race. But people often will hear these polls and they will hear very different things. Why is there so much variation in what people hear?

JACKMAN: Well, one explanation, of course, is that there's just ordinary sampling variation, and that is if I did a sample of 500 people and you did a poll based on a sample of 500 people, we could expect our results to diverge for that simple reason alone. There's a relatively small sample size there. And that's what pollsters refer to when they talk about their margin of error, as it were, with any given poll.

But over and above that, there's this variation that comes from the different way the sausage gets made, if you will, the different sampling techniques. Is it a phone survey? Is it an Internet survey? How long was the field period for the poll? There's a whole bunch of ingredients like that that produce this extra variation, over and above what we'd expect from the usual thing that we might have learned about back in Stats 101, perhaps, due to random sampling.
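To make that Stats 101 point concrete, here is a minimal sketch of the margin-of-error arithmetic. The formula is the standard one for a simple random sample; the sample sizes are illustrative, with n = 500 mirroring the example Jackman gives.

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """95% margin of error for a simple random sample of size n.

    Uses the worst case p = 0.5, which is what most published polls
    report as their headline margin of error.
    """
    return z * math.sqrt(p * (1 - p) / n)

# Two polls of 500 people each can sit roughly 4 points apart on the
# same question through random sampling alone, before any of the
# "sausage-making" differences come into play.
print(f"n=500:  +/-{margin_of_error(500):.1%}")   # ~ +/-4.4%
print(f"n=1000: +/-{margin_of_error(1000):.1%}")  # ~ +/-3.1%
```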

MARTIN: In all the years that you've been doing this, are there particular challenges for pollsters right now? I mean, one immediate one that comes to mind is the fact that a lot of people these days don't have land lines anymore; they only use their cell phones. Or they might have a land line and a cell phone. Is that a particular challenge? And how do pollsters deal with that?

JACKMAN: That is a big, big deal in the industry, and it is perhaps the single biggest challenge facing phone polling in the United States right now: the way that people are abandoning land lines. It is illegal to auto-dial a cell phone, and so that means that if you're going to try and get penetration into that segment of the population that is now cell phone only, it's become a more expensive proposition.

You've got to have human dialers actually dialing those numbers. You've got to get the lists of cell phone numbers in the first instance. And so it's producing a real challenge - over and above the ordinary challenges of declining response rates. I think there's a lot of people out there getting jaded by all the polls we get asked to respond to.

I don't know about your kids, but mine just don't answer the land line anymore - even though we are not a cell phone only household. I suspect, like many people, we simply ignore the land line when it rings because it's always marketing or someone like that.

And so that is producing a huge problem for the pollsters and a particular problem with political polling where the cell phone only population tends to skew younger, more mobile, and hence, more Democratic. So it's potentially a real source of bias in political polling, your ability to get into that cell phone only population.

MARTIN: Are you suggesting, then, that in general, polls may be under-counting the younger, more Democratic-leaning vote because those voters are less likely to be reached? Is that what you're saying?

JACKMAN: That is a real risk to the industry at the moment and I think the better pollsters are extremely mindful of that and are doing their best to deal with it. Even the robo-pollers, as we call them, those using automated dialing techniques or IVR techniques, as we call them, to do land line only, they're doing their best to deal with that too.

They know that they're not getting to that population. They're using statistical adjustments to try and compensate for that. But, again, that's another source of error and another source of why the numbers might dance around a little more over and above what we'd expect from ordinary sampling variation.
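The adjustment Jackman is describing is typically some form of weighting: respondents from an under-covered group count for more, so that the weighted sample matches a population benchmark. Here is a minimal sketch of that idea; the group shares and support rates are made up for illustration and do not come from any real poll.

```python
# Minimal post-stratification sketch with hypothetical numbers:
# a landline-heavy poll under-samples young voters, so each young
# respondent is weighted up to match the population share.

sample = {          # share of respondents in each age group (assumed)
    "18-34": 0.15,  # under-covered: harder to reach by land line
    "35+":   0.85,
}
population = {      # census / turnout benchmark (assumed)
    "18-34": 0.30,
    "35+":   0.70,
}
support = {         # candidate support by group (assumed)
    "18-34": 0.60,
    "35+":   0.45,
}

raw = sum(sample[g] * support[g] for g in sample)
weighted = sum(population[g] * support[g] for g in sample)

print(f"raw estimate:      {raw:.1%}")       # ~47.2%
print(f"weighted estimate: {weighted:.1%}")  # ~49.5%
```

The two-point gap between the raw and weighted estimates is exactly the kind of adjustment-driven movement that makes the numbers "dance around" from poll to poll.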

MARTIN: And do you have any ways that you compensate for this yourself? I mean, what do you do? Do you have some low-tech polling method other than just, you know, calling your relatives? I'm sure that doesn't really have a high confidence rate.

(LAUGHTER)

MARTIN: It might be satisfying but it doesn't lend a lot of confidence in the accuracy, I bet.

JACKMAN: There's a range of techniques out there, of course. And, you know, I'm currently directing something called the American National Election Studies with colleagues here at Stanford and the University of Michigan, and we're using good old-fashioned face-to-face interviewing for that, which is extremely expensive. And we've got a seven-week field period.

We started polling just after the conventions and we'll go right up until the election itself. That's at one extreme. At the other extreme, you've got Internet panels, where people have been recruited to take surveys over the Internet. And then in between, I think, you've got a mix of phone methods. But everybody is wrestling with the fact that you can do a random sample, but for various reasons either you're not getting to the people you want or the people that you're getting to aren't participating in the survey.

Everybody's engaging in statistical adjustments of some kind, and that is a big driver of the fact that one poll may say one thing and another poll might say something completely different.

MARTIN: People have also gotten very accustomed to instant polls, or insta-polling, after the debates. For example, after significant news events we often see these kinds of snap polls. How confident should people be in those polls given what you've just told us?

JACKMAN: They're extremely difficult to do and to do well. The mountain you're trying to climb with an insta-poll is enormous. You've got to get a representative sample taking the survey in an extremely compressed time window. And all the usual problems to do with survey participation manifest themselves in a huge way in that compressed timeframe.

People who are turned on by the event, very enthusiastic about what they just saw, are the ones who are more likely to participate in those polls. And as a result, you're leaning even harder on the statistical adjustments to correct for that. And so I take them with a huge grain of salt, at this stage of the game, at least.

MARTIN: Speaking of which, how should people - I mean, obviously if you're in a campaign and it's your job to get this information, you've got your own level of anxiety about it and your own techniques for figuring out how you're going to use that information; but what about a person who's simply interested in politics and wants to keep up with things? How should they figure out in whom to have confidence and whom not to have confidence?

I just want to mention that at NPR, for example, we use a bipartisan approach: a Republican pollster and a Democratic pollster, whom we ask to work together to manage the poll, so that you can at least have the sense that there's a bipartisan perspective. What do you recommend that listeners look for? As briefly as you can.

JACKMAN: That's a good start. A bipartisan polling firm, or two firms, doing the work is a great start. Are they doing live interviews or are they auto-dialing land lines only? How long was the field period? The other thing you can do is, I think, look at some of the work I'm doing over at Huffington Post, or the work Nate Silver is doing at 538 over at The New York Times - which is essentially averaging the polls.

And the good news is your NPR numbers, which you talked about at the top of this segment, are right down the line: a small lead for Romney nationally, perhaps, but Obama still ahead in the swing states. That is right on the money, as it were. That's sort of...

MARTIN: OK.

JACKMAN: ...you know, right where the consensus or the average of the polls would line up.
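The averaging Jackman recommends can be as simple as pooling recent polls and weighting larger samples more heavily; the actual poll-of-polls models at Huffington Post and 538 are considerably more elaborate. A toy version, with invented poll numbers:

```python
# Toy poll-of-polls: a sample-size-weighted average of recent polls.
# The poll shares and sample sizes below are invented for illustration.

polls = [  # (candidate_share, sample_size)
    (0.49, 1000),
    (0.51, 650),
    (0.48, 800),
    (0.50, 1200),
]

total_n = sum(n for _, n in polls)
average = sum(share * n for share, n in polls) / total_n
print(f"poll average: {average:.1%}")  # ~49.5%
```

Because the idiosyncratic "house effects" of individual pollsters tend to partially cancel, the average is usually steadier than any single poll.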

MARTIN: All right. Professor Simon Jackman teaches political science and statistics at Stanford University and he joined us from Stanford. Professor Jackman, thank you for joining us.

JACKMAN: Thank you.

(SOUNDBITE OF MUSIC)

Transcript provided by NPR, Copyright NPR.
