From Seattlepi.com, Steve Kircher writes:
We're less than two weeks away from Labor Day, the traditional start of the fall presidential campaign. So it's a good time to take a breath and understand what the polls can tell us as Election Day nears and where they may mislead or even be wrong.
As a pollster for 30 years, I generally trust election polls, but I know they're not infallible. At first blush, there seem to have been a lot of errors this year. After all, weren't the polls wrong in primary after primary?
Well, here is the first fact check: Of 26 Democratic primary contests this year, the polls were correct in 23 states and wrong in three (New Hampshire, California and Missouri).
You remember New Hampshire. Hillary Clinton defeated Barack Obama when every poll said Obama would win comfortably. It was a high-profile primary, coming shortly after Obama's win in the Iowa caucuses, so when the polls missed, political commentators pounced, citing the New Hampshire Democratic primary as evidence that polls cannot be trusted.
On the contrary, I think it's remarkable how accurate polls are, given all of the ways they could differ from the actual election results. And let's be aware of this important caveat: I'm talking only about scientifically designed polls that use random sampling, not Internet surveys or other amateur efforts that do not purport to represent an entire electorate.
That said, here is a user's guide to what to watch for in the polls to come and what can cause a poll to go wrong.
Low response rates: Most interviews are conducted over the telephone, and a big challenge for pollsters is reaching someone to be interviewed. One major national poll, ABC News/Washington Post, reported a 32 percent response rate in one survey. That means they were able to interview about one of every three people they called. The good news is that studies have shown that lower response rates do not lead to major differences in survey results. The biggest challenge is getting someone to answer the phone. Many of the numbers dialed reach only an answering machine, which means either no one is home or someone is home and screening calls. Older adults and women are more likely to be interviewed, so most polls "weight," or adjust, their data to correct this bias.
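To make that weighting step concrete, here is a minimal sketch, in Python, of the kind of demographic adjustment described above. The population shares, sample shares and support numbers are all hypothetical, and real pollsters weight on several variables at once; the point is only that each respondent is scaled by the ratio of a group's share of the population to its share of the sample.

    # Minimal sketch of demographic weighting (hypothetical numbers).
    population_share = {"men": 0.48, "women": 0.52}  # known electorate mix
    sample_share = {"men": 0.40, "women": 0.60}      # mix actually interviewed

    # Overrepresented groups (here, women) get weights below 1.0.
    weights = {g: population_share[g] / sample_share[g] for g in population_share}

    # Hypothetical candidate support within each group.
    support = {"men": 0.45, "women": 0.55}

    # The unweighted estimate reflects the lopsided sample...
    unweighted = sum(sample_share[g] * support[g] for g in support)
    # ...while the weighted estimate restores the population mix.
    weighted = sum(sample_share[g] * weights[g] * support[g] for g in support)

    print(f"unweighted: {unweighted:.3f}, weighted: {weighted:.3f}")

With these made-up numbers, the unweighted figure leans toward the group that answered the phone more often, while the weighted figure recovers what a properly balanced sample would have shown.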
Cell phones: In 2007, one in every seven homes had only a cell phone and no landline. Many polls do not include cell phones because it's more complicated and costly to conduct cell phone interviews. Fortunately, cell-phone-only households are similar to landline households in political attitudes. But this may not hold true as the number of households with only cell phones grows. Some pollsters, such as CBS News/New York Times, are interviewing people with cell phones.
Online and recorded interviewing: While most survey interviews are done over the phone with a live interviewer asking questions, some pollsters conduct interviews over the Web or use recorded voices to ask the questions. Pollsters who use these approaches claim their methods are better than the traditional live-phone interview. I'm skeptical. People who take online surveys volunteer to participate, and they tend to be more motivated and less representative of the full cross section of voters.
Lying: As best we can determine, people are truthful in public opinion surveys, but there are times when the truth is stretched a bit. A recent Wall Street Journal story reported that in a 2005 Harris Poll, 58 percent of interviewees told a live telephone interviewer that they exercise regularly, but in an online survey only 35 percent reported exercising regularly.
People are known to exaggerate when it comes to sensitive matters such as education, household income and sex, and that normally doesn't compromise the accuracy of political surveys. But this year may be different because of Barack Obama's racial background. Some people are reluctant to say they're voting against a candidate because of race, and as a result polls tend to overstate the level of support for a black candidate by five to 10 points. That gives an advantage to more anonymous polls, such as the Rasmussen poll, which uses a recorded voice rather than a live interviewer to ask the questions. In early August, Rasmussen had Obama ahead by one point, while an average of other national polls had him ahead by four points.
Likely voters: Nearly every national poll interviews only registered voters, but some try to identify the people who are most likely to vote. Of eight national polls in early August, the five polls of registered voters showed Obama ahead by five points; the three polls of likely voters showed him ahead by less than a point. That's been the pattern this season, with John McCain doing better in surveys of likely voters.
The likely-voter surveys are considered more predictive of the actual election results because only about six of every 10 registered voters will go to the polls in November, so a registered-voter sample includes many people who won't actually vote. On the other hand, "likely voter" models tend to discount the candidate preferences of younger voters, and since younger voters favor Obama over McCain, the likely-voter polls may be understating Obama's support if younger voter turnout is higher than in prior presidential elections.
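To see how a likely-voter screen can shift a poll's margin, here is a small, purely hypothetical Python sketch. The respondents, the scoring rule and the cutoff are all invented for illustration; actual likely-voter models are proprietary and far more elaborate.

    # Hypothetical likely-voter screen built from past voting and stated intent.
    respondents = [
        # (age, voted_last_time, self_rated_likelihood_0_to_10, supports)
        (22, False, 9, "Obama"),
        (34, True, 8, "Obama"),
        (51, True, 10, "McCain"),
        (67, True, 10, "McCain"),
        (19, False, 7, "Obama"),
    ]

    def is_likely_voter(voted_last_time: bool, likelihood: int) -> bool:
        # Score past behavior plus stated intent; the cutoff is an assumption.
        return (3 if voted_last_time else 0) + likelihood >= 10

    likely = [r for r in respondents if is_likely_voter(r[1], r[2])]

    def obama_margin(sample):
        obama = sum(r[3] == "Obama" for r in sample)
        return (obama - (len(sample) - obama)) / len(sample)

    print(f"registered-voter margin: {obama_margin(respondents):+.2f}")
    print(f"likely-voter margin:     {obama_margin(likely):+.2f}")

With these invented respondents, the two youngest, neither of whom voted last time, are the ones screened out, and the margin swings from Obama to McCain. That is exactly the mechanism by which a likely-voter model can discount younger voters' preferences.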
Convention "bounce": John Kerry gained about three points coming out of the 2004 Democratic convention and kept a lead over George W. Bush until just before the Republican convention. Bush gained three points after his own convention and kept that lead until he won the November election. The Democrats will hold their convention first this year, starting Aug. 25. Any bounce for Obama is likely to be short-lived because the Republicans meet just a week later. The size of any McCain postconvention bounce may be a good clue to how strong a candidate McCain will be.
Polls have also tended to trend in favor of Republican presidential candidates after Labor Day, but that might not hold true this year, given President Bush's unusually high unpopularity and Republican losses earlier this year in several congressional races in areas that usually favor the party.
Timing: Timing looks like it was a major factor in why polls were wrong in three of the 26 Democratic races last spring. In both California and Missouri, three polls were wrong and one was right. In each state, the poll that correctly predicted the outcome included interviews up to the day before the election, enabling it to capture last-minute shifts in voter opinion. The polls that missed conducted their last interviews two to five days before the election. If a few days can make such a difference in poll results, how accurate can polls be three months before an election?
Wall Street Journal columnist John Fund recently observed that summertime polls often differ from the actual election results. In 1976, Gerald Ford was far behind Jimmy Carter in the summer, but lost by only two points. In 1988, Michael Dukakis led George H.W. Bush during the summer, but lost by eight points. In July-August 2004, Kerry led George W. Bush by two to three points, but he lost by 2.5 points.
Why conduct surveys so far in advance of the election? Cynics will say it helps fill the cable news channels and political columns of newspapers. While that's a likely factor, more important is that polls generally are accurate, and even this far out they are helping to track the dynamics of the presidential race, even if the dynamics today are quite a bit different from what they will be in November.