This regression is based on a calculation of the standard error of the mean -- more specifically, we regress to the mean error expected based on the pollster's sample size. Note, however, that we do not regress to the mean for two agencies: Zogby Interactive and Columbus Dispatch. This is because these two pollsters use unconventional methodologies -- Internet-based polls and mail polls, respectively, which evidently have resulted in very poor outcomes. There is no reason to give a pollster credit for regression toward the mean when it uses an untested methodology that should intrinsically be associated with larger methodological errors.
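To make the arithmetic concrete, here is a minimal sketch of the expected sampling error for a given sample size and a simple shrinkage toward it. The normal approximation, p = 0.5, and the 50/50 shrinkage weight are all illustrative assumptions, not the actual formula behind these ratings.

```python
import math

def sampling_moe(n, p=0.5, z=1.96):
    """95% margin of error for a proportion, in percentage points."""
    return 100 * z * math.sqrt(p * (1 - p) / n)

def regressed_error(observed_error, n, weight=0.5):
    """Pull a pollster's observed average error toward the error
    expected from sampling alone (regression toward the mean)."""
    return weight * observed_error + (1 - weight) * sampling_moe(n)

print(round(sampling_moe(800), 1))          # ~3.5 points at n=800
print(round(regressed_error(6.0, 800), 1))  # 6.0 shrunk toward 3.5 -> ~4.7
```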
There is no evidence that mailing increases the margin of error. In fact, marketers use direct mail polls, and they're very effective.
3) Response Bias and Likely Voters. One of the counter-intuitive aspects of the Columbus Dispatch survey is that it seems to do better at getting a representative sample of likely voters despite having had a lower response rate than comparable telephone studies conducted since 1980. Visser et al. (1996) theorize that telephone surveys do worse at identifying likely voters because "the social desirability of being an active participant in the democratic process often leads to an overreporting" of likelihood to vote, past voting and interest in politics.
In contrast, although mail survey response rates are typically very low, the people who respond tend to be highly interested in their topics (Ferriss 1951; Jobber 1984). And people highly interested in an election are most likely to vote in it. As a result, the self-selected samples of the Dispatch mail surveys may have been especially likely to turn out. The very nature of the mail survey response procedure may have effectively eliminated non-voters from the obtained samples (p. 198).
The authors found evidence to support this hypothesis. Dispatch survey respondents were more representative of the voting electorate than the "likely voters" identified by telephone surveys.
We have some incomplete clues that this advantage did not exist in the final 2005 survey. Self-identified Democrats outnumbered Republicans by 10 percentage points, even though according to Darrel Rowland's email, "the returns (typically) lean a little Republican, which reflects Ohio's recent history of tilting a bit toward the GOP." In the post-election survey by Republican pollster Newhouse, Democrats outnumbered Republicans among those who reported casting a ballot, but the advantage was only two percentage points (36.9% to 34.7%).
The Dispatch's Darrel Rowland also suggests that the geographic distribution of voters may have been off (presumably a bit heavier in Democratic areas):
However, even when you do weight using our most common method (geographical distribution of the mail poll ballots) the outcome is essentially the same.
The Democratic-leaning sample probably contributed to the error, but I doubt that weighting by party or region would have reduced the discrepancy significantly. These differences are clues to what may have been a "response bias" related more to vote preference than to political party.
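A small simulation can show why weighting by party would not help much if the nonresponse was tied to vote preference itself. Every number below (party shares, support rates, response rates) is invented for illustration; nothing here comes from the Dispatch data.

```python
import random
from collections import Counter

random.seed(2)

# Hypothetical electorate -- party shares and "yes" support rates are
# illustrative assumptions, not Dispatch figures.
def draw_voter():
    party = random.choices(["D", "R", "I"], weights=[40, 40, 20])[0]
    p_yes = {"D": 0.55, "R": 0.30, "I": 0.40}[party]
    return party, ("yes" if random.random() < p_yes else "no")

electorate = [draw_voter() for _ in range(100_000)]
true_yes = sum(v == "yes" for _, v in electorate) / len(electorate)

# Nonresponse tied to vote preference, not to party: "yes" voters
# return the survey twice as often as "no" voters.
sample = [(p, v) for p, v in electorate
          if random.random() < (0.20 if v == "yes" else 0.10)]
raw_yes = sum(v == "yes" for _, v in sample) / len(sample)

# Post-stratify so the sample's party shares match the electorate.
pop = Counter(p for p, _ in electorate)
smp = Counter(p for p, _ in sample)
w = {p: (pop[p] / len(electorate)) / (smp[p] / len(sample)) for p in pop}
wtd_yes = (sum(w[p] for p, v in sample if v == "yes")
           / sum(w[p] for p, _ in sample))

print(f"true {true_yes:.2f}  raw {raw_yes:.2f}  weighted {wtd_yes:.2f}")
# The weighted estimate barely moves: the bias lives inside each
# party cell, so fixing the party mix cannot remove it.
```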
The overall response rate provides another big clue. Visser et al. (1996) tell us that "between 1980 and 1994, the Dispatch response rates ranged from 21% to 28%, with an average of 25%" (p. 185). That rate had not fallen significantly in recent years: 19% in 2000, 22% in 2002 and 25% in 2004. Note that the response rate was only three points lower in 2002, when turnout was 47.8% of registered voters, than last year, when turnout was 71.8% of registered voters.
This year, however, the Dispatch Poll response rate fell off significantly. It was only 11% for the poll conducted in late September and 12% on the final survey. Turnout alone does not explain the difference: turnout this year was 43.8% of registered voters, only a few points lower than in 2002 (47.8%), when the Dispatch achieved nearly double the response rate (22%). (Note: the Dispatch poll is the only media poll I know of that routinely publishes its response rate alongside its survey results. They deserve huge credit for that.)
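A quick back-of-the-envelope comparison, pairing each year's final-poll response rate with that year's turnout as quoted above, makes the point that turnout alone cannot account for the drop. The ratio is a crude illustrative measure, nothing more.

```python
# Figures quoted above: year -> (response rate %, turnout % of
# registered voters). Response per point of turnout fell well below
# the earlier years in 2005.
data = {
    2002: (22, 47.8),
    2004: (25, 71.8),
    2005: (12, 43.8),
}
for year, (resp, turnout) in sorted(data.items()):
    print(f"{year}: response {resp:>2}%  turnout {turnout:.1f}%  "
          f"ratio {resp / turnout:.2f}")
# 2002 -> 0.46, 2004 -> 0.35, 2005 -> 0.27
```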
So what caused the decline?
As with any look at non-response, "proof" is elusive. We know little about those who do not return their surveys precisely because they did not return their surveys. However, consider two theories.
a) A Mail-in Vote Survey about Voting by Mail. Note the reference above by Visser et al. to the idea that people who respond to a survey tend to be interested in its topic. A study published just last year (Groves, Presser and Dipko 2004) found stronger evidence for this idea: response rates were higher among teachers for a survey about education and schools, higher among new parents for a survey about children and parents, higher among seniors for a survey about Medicare and higher among political contributors for a survey about voting and elections.
Now remember that Issue 2 was a proposal to make it easier to vote early or by mail in Ohio. So we have a survey that was, at least in part, about voting by mail. Wouldn't we expect a higher response rate among those who want to vote by mail on an election survey that attempts to replicate voting...by mail?
b) Uncertainty and Confusion = Non-response. Remember my comments about how voting on initiatives and referenda can be different: voters who are confused or uncertain appear to default to a safer "no" vote. That is what happens in the voting booth. But what happens when voters are similarly confused or uncertain, yet are confronted with a mail-in survey whose completion evokes a considerably lower sense of civic duty? And what if, as was the case this year but never before in the history of the Dispatch Mail-in Poll, there was no candidate race at the top of the ticket, only five issues that presented respondents with a far more significant "cognitive burden" to complete?
My hypothesis is that many perennial voters who were confused or uncertain decided to simply pass on completing the survey. Meanwhile, the voters who were familiar with the reform issues and ready to support them were more apt to send theirs in. This would explain why the response rate was lower than usual, and why the final sample was more Democratic than usual and more Democratic than indicated in the post-election survey.
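Here is a minimal simulation of that hypothesis. Every parameter (the share of confused voters, their support rates, the response rates) is an invented assumption chosen only to illustrate the mechanism, not to reproduce the actual Issue 2 numbers.

```python
import random

random.seed(7)

N = 100_000
population = []
for _ in range(N):
    kind = "committed" if random.random() < 0.4 else "confused"
    # Committed voters mostly favor the reform; confused voters
    # default to a safer "no" in the booth.
    vote = "yes" if kind == "committed" and random.random() < 0.70 else "no"
    population.append((kind, vote))

def returns_survey(kind):
    # Confused voters still show up at the polls, but rarely bother
    # to fill out and mail back a five-issue questionnaire.
    return random.random() < (0.25 if kind == "committed" else 0.06)

actual = [v for _, v in population]
mailed = [v for k, v in population if returns_survey(k)]

def pct_yes(votes):
    return 100 * sum(v == "yes" for v in votes) / len(votes)

print(f"election result: {pct_yes(actual):.1f}% yes")  # ~28%: fails badly
print(f"mail-in poll:    {pct_yes(mailed):.1f}% yes")  # ~51%: looks ahead
```

Under these made-up inputs the mail poll shows the issue narrowly ahead even though it fails badly at the polls, and the simulated response rate works out to roughly 14%, in the same neighborhood as the decline described above.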
It almost seems the bias everyone is pointing to is that the poll tends to lean Republican, which would make this very good news for Obama.