Shooting the Pollster
Every time a poll comes out with bad news for one side or the other, partisans leap to denounce it. A few months ago Bush's pollster rushed to denounce an LA Times poll that showed a big Kerry lead. A few days ago, a Time magazine poll came out that showed Bush with a 9-point lead among registered voters. Many were quick to condemn it, including Atrios and numerous posters on DailyKos.com, who charged that the poll was "massaged." Atrios claimed that the poll was "biased in favor of getting a male respondent," because the poll first asked to speak with the youngest male in the household, and only asked to speak with the oldest female if no male was available.
Then the next day, a Newsweek poll came out that showed Bush with an even bigger lead, 11 points, leading to some embarrassment for the critics.
Now, I do think that the critics were right on the big point. The Time poll was taken during the Republican Convention, and the Newsweek poll during the final night of the convention and the following day. It seems quite possible that Republicans were more likely to be home those days, and more interested in talking to pollsters. Newsweek released enough information about the poll's respondents to suggest that they did in fact skew Republican. Ruy Teixeira at Donkey Rising has a good discussion.
But criticizing the polls' methodology is just silly. Both polls actually seem better than average to me, because they release more information than usual about the composition of the sample: that's the only reason the critics can criticize. As Teixeira says,
What I do favor is release and prominent display of sample compositions by party ID, as well as basic demographics, whenever a poll comes out. Consumers of poll data should not have to ferret out this information from obscure places--it should be given out-front by the polling organizations or sponsors themselves. Then people can use this information to make judgements about whether and to what extent they find the results of the poll plausible.
It's this bottom line that matters: whether or not the pollsters managed to draw a random sample.
Oh, and Atrios' complaint about a male-skewed sample? Although the pollster's methodology did sound odd to me too, it turns out to be perfectly standard. The Pew Research Center--who I think are particularly reliable, because they're non-partisan, quasi-academic, and release their data--use this same technique of asking first for a male. Women are more likely to be home, so pollsters try harder to get males, in order to produce a gender-balanced sample.
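The selection rule Atrios objected to is simple enough to write down. Here is a minimal sketch in Python, a hypothetical illustration of the rule as described above (youngest male at home first, oldest female as the fallback), not the pollster's actual screening script:

```python
def select_respondent(adults_at_home):
    """Within-household selection: ask to speak with the youngest male
    at home; if no male is available, ask for the oldest female.

    Each adult is a (sex, age) tuple, with sex "M" or "F".
    Returns the selected adult, or None if no one is home.
    (Hypothetical illustration of the rule, not a real polling script.)
    """
    males = [a for a in adults_at_home if a[0] == "M"]
    if males:
        return min(males, key=lambda a: a[1])   # youngest male wins
    females = [a for a in adults_at_home if a[0] == "F"]
    if females:
        return max(females, key=lambda a: a[1])  # else oldest female
    return None

# Example: a household with two men and a woman at home.
print(select_respondent([("F", 40), ("M", 45), ("M", 22)]))  # ("M", 22)
# Only women home: the oldest is selected.
print(select_respondent([("F", 30), ("F", 62)]))             # ("F", 62)
```

The point of the male-first preference is exactly the one Pew makes: since women pick up the phone more often, a sex-blind rule would over-sample them, and favoring males at the selection stage pushes the overall sample back toward gender balance.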