Of what use are opinion polls?

Combined with other data, polls can be useful — but they are not in themselves an infallible source of truth about why people vote the way they do. Here's how polling works.
Making a point to reporters after a Jan. 5 New Hampshire debate, Hillary Clinton's pollster Mark Penn is one of the 2008 campaign's key players. Win McNamee / Getty Images file

After the failure of opinion polls in New Hampshire to anticipate Sen. Hillary Clinton’s triumph, people are taking a second look at why we pay so much attention to polls.

After Clinton’s New Hampshire victory, Huffington Post editor-in-chief Arianna Huffington, who supports Sen. Barack Obama, urged her readers to rebuff pollsters’ inquiries.

“If enough of us refuse to answer, the polling data will become so unrepresentative and unreliable even the media would have to admit it was useless,” she wrote last week.

For those who want to keep using polling data, perhaps more judiciously after the New Hampshire breakdown, here are some basic cautions.

A poll is a small sample drawn from a larger population, used to estimate something about that population: for instance, what percentage of voters will cast their ballots for a particular candidate in an election.

There’s no way to know in advance who will show up to vote on Election Day. “The (voting) population is unknown,” said Nancy Mathiowetz, president of the American Association for Public Opinion Research. Pollsters are “trying to infer who the population will be on Election Day.”
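
To make the idea concrete, here is a minimal Python sketch of repeated polling. The 42 percent “true” support figure and the 600-person sample are invented for illustration; in real life the true figure is exactly what the pollster cannot observe.

```python
import random

TRUE_SUPPORT = 0.42   # the electorate's actual preference; unknowable in advance
SAMPLE_SIZE = 600     # a typical statewide poll

def run_poll():
    """Draw one random sample and return the estimated support."""
    votes_for = sum(1 for _ in range(SAMPLE_SIZE) if random.random() < TRUE_SUPPORT)
    return votes_for / SAMPLE_SIZE

# Each simulated poll lands near 42 percent but rarely exactly on it.
# That scatter is sampling error, before any other problem enters.
for i in range(5):
    print(f"Poll {i + 1}: {run_poll():.1%}")
```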

Pollsters often gather their sample by telephone. Random-digit dialing of the larger population is common, since drawing numbers at random helps ensure that the sample is not skewed toward any one group.
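
Real random-digit-dialing designs are considerably more elaborate, but a bare-bones sketch conveys the idea. New Hampshire’s 603 area code is real; the exchanges below are made up:

```python
import random

AREA_CODE = "603"                  # New Hampshire
EXCHANGES = ["224", "228", "271"]  # hypothetical working exchanges

def random_phone_number():
    """Pick a known exchange, then randomize the last four digits."""
    exchange = random.choice(EXCHANGES)
    line = f"{random.randint(0, 9999):04d}"
    return f"({AREA_CODE}) {exchange}-{line}"

for _ in range(3):
    print(random_phone_number())
```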

Determining who is going to vote
Pollsters sift for likely voters by asking questions such as, “How likely are you to vote in next Tuesday’s election?” This is called the “likely voter screen.”

Sometimes pollsters will try to “tighten” their screen by asking, “Where is your voting location? What precinct are you in? What time will you vote?”

If a respondent says he doesn’t know his polling place or says that he’ll vote at 8:30 p.m., when the polls close at 8 p.m., then he’ll likely be tossed out of the sample.
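
In effect, the likely-voter screen is a filter over respondents. Here is a minimal sketch of the rules just described; the field names, the 1-to-5 likelihood scale and the 8 p.m. closing time are all assumptions for illustration:

```python
POLLS_CLOSE_HOUR = 20  # polls close at 8 p.m. in this example

def passes_screen(r):
    """Apply the screening rules described above to one respondent."""
    if r["self_reported_likelihood"] < 4:       # on an assumed 1-to-5 scale
        return False
    if not r["knows_polling_place"]:            # can't name a polling location
        return False
    planned = r.get("planned_vote_hour")
    if planned is not None and planned >= POLLS_CLOSE_HOUR:
        return False                            # plans to vote after the polls close
    return True

respondents = [
    {"self_reported_likelihood": 5, "knows_polling_place": True,  "planned_vote_hour": 18},
    {"self_reported_likelihood": 5, "knows_polling_place": True,  "planned_vote_hour": 20.5},
    {"self_reported_likelihood": 2, "knows_polling_place": False, "planned_vote_hour": None},
]
likely = [r for r in respondents if passes_screen(r)]
print(f"{len(likely)} of {len(respondents)} respondents pass the screen")
```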

In some cases, pollsters narrow their survey to those whose names are in a voter file, a list of those who have voted, say, in three out of the last four elections.

But David Paleologos, director of the Political Research Center at Suffolk University in Boston, explained that this technique can be used only in states where voters are required to register by a certain deadline before an election.

In states such as New Hampshire where one can register on the day of the election, screening out Election Day registrants — people who haven’t voted before — would skew the sample.
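
In code, the voter-file screen is another filter, this one keyed to recorded vote history rather than self-reports. The file format below is hypothetical; note that anyone absent from the file entirely, such as a same-day registrant, falls out:

```python
# Vote history over the last four elections, most recent first (hypothetical format).
voter_file = {
    "Smith, Jane": [True, True, False, True],   # voted in 3 of the last 4: kept
    "Doe, John":   [True, False, False, False], # voted in 1 of the last 4: screened out
}

def frequent_voter(name, min_votes=3):
    history = voter_file.get(name)
    if history is None:
        # A same-day registrant has no history at all, so this rule
        # would wrongly exclude them, which is the skew described above.
        return False
    return sum(history) >= min_votes

print([name for name in voter_file if frequent_voter(name)])
```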

Suffolk University’s poll came closer than most to the actual result of the New Hampshire primary: on the day before the vote it showed a statistical tie between Obama, at 39 percent, and Clinton, at 34 percent.

Paleologos said his poll came closer to the actual outcome partly because it used a wider screen of likely voters and did not overestimate how large a share of the actual electorate younger voters would make up.
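
Why can a five-point gap still be a “statistical tie”? Because the margin of error on the gap between two candidates in the same poll is wider than the familiar margin quoted for a single candidate’s share. Here is a rough check, assuming a hypothetical sample of 500 (the published poll would state the real figure):

```python
import math

n = 500                  # hypothetical sample size
obama, clinton = 0.39, 0.34

# Standard error of the difference between two shares from one sample
# (multinomial): Var(p1 - p2) = (p1*(1-p1) + p2*(1-p2) + 2*p1*p2) / n
se = math.sqrt((obama * (1 - obama) + clinton * (1 - clinton)
                + 2 * obama * clinton) / n)
margin = 1.96 * se       # 95 percent confidence

gap = obama - clinton
print(f"gap = {gap:.1%}, margin on the gap = ±{margin:.1%}")
print("statistical tie" if gap < margin else "measurable lead")
```

At that sample size, a five-point gap sits inside a roughly seven-and-a-half-point margin, so the poll cannot say who is ahead.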

One of this generation's master politicians, President Bill Clinton, used polling data to help him outmaneuver a Republican-controlled Congress.

Clinton "consults polls as if they were giant wind socks that tell him which way the wind is blowing. And then he asks the pollster to help him determine which current he should try to harness to move him closer to his destination," said Dick Morris, who was a Clinton strategist in his first term as president.

According to Morris, Clinton commissioned polling to determine how voters would react to policy proposals such as the idea of a tax deduction for college tuition.

Clinton also studied polling data on whether he should veto the Republicans' 1996 welfare reform bill.

"Mark Penn had designed a polling model that indicated that welfare veto by itself would transform a fifteen-point win (in the 1996 election) into a three-point loss," Morris said in his book "Behind the Oval Office."

Penn is now Sen. Hillary Clinton's pollster.

Who won the blue-collar voters?
Once the balloting is over, you’ll hear authoritative-sounding statements such as “Hillary Clinton won 40 percent of the blue-collar vote.”

These statements — often phrased as if there were no uncertainty about them — are derived from exit polls, interviews with a sample of voters as they leave polling places.

The virtue of exit poll interviewing is that we hear from people who have just voted, as opposed to those reached in pre-election telephone polling, who may not end up voting and may not even be registered to vote, even if they tell pollsters they are.

In last week’s New Hampshire Democratic primary, a total of 285,541 people cast ballots. The exit poll done by the TV networks interviewed 1,955 of them — about seven-tenths of one percent of the total.
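
That sounds vanishingly small, but the precision of an estimate depends on the size of the sample, not on the fraction of the population it represents. Setting aside the design effects of clustering interviews by precinct, which widen the error in practice, the standard formula gives:

```python
import math

n = 1955  # exit poll interviews
# Worst-case (p = 0.5) margin of error at 95 percent confidence.
margin = 1.96 * math.sqrt(0.5 * 0.5 / n)
print(f"±{margin:.1%}")  # about ±2.2 points
```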

The flaws in exit polls
Exit poll interviews can’t tell us with scientific precision what percentage of blue-collar voters or Catholics or any other demographic group cast their vote for Clinton or John Edwards or Obama.

After all, voters vote by secret ballot. And what a voter tells an exit pollster may not be the whole truth, about his or her church attendance or income for instance.

And a significant percentage of voters refuse to take part in exit polls. In the 2004 presidential election, the refusal rate reached an average of 35 percent nationwide, and peaked at 46 percent in New Hampshire.

But voters cast their ballots in congressional districts and in particular precincts, and for those districts and precincts we have Census data on race, income, age and other characteristics.

Actual vote counts in a precinct can confirm or dispute exit poll data.

Look at how two different places in New Hampshire voted in last week’s Democratic primary.

How they voted in Ward 10
On the west side of Manchester, N.H., is Ward 10, a place where many Democrats are socially conservative, non-Ivy League, blue-collar voters. According to the Census, only six percent of the people in this ZIP code have graduate or professional degrees.

In the 2004 Democratic primary, Sen. John Kerry won Ward 10 with 45 percent. The runner-up was Howard Dean, with 18 percent.

Last Tuesday, Clinton won Ward 10 with 48 percent, to Obama’s 28 percent, while Edwards won 17 percent.

In contrast, the more affluent town of Peterborough, where the Census says that nearly 20 percent of the residents have graduate or professional degrees, went for Obama. The vote was Obama 47 percent, Clinton 26 percent, and Edwards 17 percent.

In the 2004 primary Peterborough went for Howard Dean, who got 38 percent, while Kerry won 33 percent.

This contrast between Peterborough and Manchester’s Ward 10 lends support to the New Hampshire exit poll data showing that Obama got more support than Clinton among voters who had done post-graduate study. The exit poll said he won 43 percent of them, while Clinton got 31 percent and Edwards 16 percent.

News media organizations mount a massive effort to gather exit poll data. In 2004, some 70,000 voters were interviewed at nearly 1,500 polling locations, and another 5,818 telephone interviews were conducted with absentee and early voters.

While an exit poll is a more accurate reading of the electorate than a pre-election poll, it, too, can be inaccurate. Case in point: In the 2004 presidential election, exit polling overstated Kerry’s share of the vote.

Kerry voters more willing to be interviewed
Edison-Mitofsky, the firm that conducted the exit poll, said after investigating its own methodology that the error in the estimate was primarily due to “Kerry voters participating in the exit polls at a higher rate than Bush voters.”

Edison-Mitofsky said “it is difficult to pinpoint precisely the reasons” that Kerry voters took part in the exit polls at a higher rate than Bush voters, but pointed to one factor: “in this election voters were less likely to complete questionnaires from younger interviewers.”

If the voters avoiding the younger exit poll interviewers were Bush supporters, the exit poll sample would be skewed.
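
The arithmetic of that skew is simple. In the toy model below, the vote shares are close to the actual 2004 national result, but the response rates are invented to show the mechanism; they are not Edison-Mitofsky’s measured values:

```python
# Approximate true 2004 national shares; hypothetical response rates.
true_share    = {"Bush": 0.507, "Kerry": 0.483, "Other": 0.010}
response_rate = {"Bush": 0.50,  "Kerry": 0.56,  "Other": 0.50}

# Expected exit-poll share: weight each group by its response rate,
# then renormalize so the shares sum to 100 percent.
weighted = {c: true_share[c] * response_rate[c] for c in true_share}
total = sum(weighted.values())
for candidate in true_share:
    print(f"{candidate}: actual {true_share[candidate]:.1%} -> "
          f"exit poll {weighted[candidate] / total:.1%}")
```

In this toy model, a six-point gap in willingness to respond is enough to turn a roughly two-point Bush win into a three-point Kerry lead in the raw exit poll numbers.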

As Darrell Huff explained in his classic 1954 book, "How to Lie with Statistics," “The kind of people who make up the interviewing staff can shade the result in an interesting fashion.”

Interviewers do make a difference 
Huff cited a National Opinion Research Center survey conducted during World War II.

Two sets of interviewers surveyed black residents of a Southern city. One set of interviewers was black, the other white.

One survey question: “Would Negroes be treated better or worse here if the Japanese conquered the United States?”

Black interviewers reported that nine percent of those they interviewed said the Japanese would treat black people better. But white interviewers found that only two percent of those they interviewed thought the Japanese would treat them better.

“The operation of a poll comes down in the end to a running battle against the sources of bias,” Huff said, such as interviewer effects and faulty samples. “The battle is never won.”

This is not to imply that polls are worthless. Combined with other data, polls can be useful — but they are not in themselves an infallible source of truth about why people vote the way they do.