On Wednesday, the Census Bureau released its biennial study of voting patterns in federal elections, which included a remarkable finding: African-American voter turnout surpassed that of white, non-Hispanic voters in 2012 for the first time in recent memory, perhaps ever.
USA Today ran this news on the front page, and the report received write-ups in every other major national newspaper. There's only one problem: That landmark may have been passed four years ago. Or maybe not at all.
The uncertainty stems from the fact that the data the census used to create this report has what several experts consider a major hole in it: Data on whether people voted is collected every other November in a supplement to the Current Population Survey, a regular government survey of about 60,000 households. If respondents decline to say whether or not they voted, or if the interviewer does not ask, it is assumed that they did not vote.
According to detailed tables released yesterday, 61.8 percent of those surveyed said they voted, 25.4 percent said they did not, and 12.8 percent did not respond. The census figures combine the latter two categories.
As a result, the data appears, at first glance, to generally agree with other methods of measuring voter turnout. The Federal Election Commission reports that 129,067,662 people voted for president in the last election, while the census estimates that 132.9 million people voted—the sort of modest 3 percent difference that one might expect from a survey. (There are, of course, reasons to suspect that the FEC figure is also not perfectly accurate.)
It is only by assuming that all people who did not answer the survey did not vote, however, that the census is able to produce estimates in line with ballot totals. Were it to omit nonresponses, as most surveys do, it would end up with figures that were drastically higher than what the FEC reports.
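The effect of that assumption is easy to see in miniature. A minimal sketch (not the Census Bureau's actual methodology, just the arithmetic implied by the percentages in the detailed tables) of how the two treatments of nonresponse diverge:

```python
# Percentages from the census detailed tables cited above.
said_voted = 61.8    # percent of those surveyed who said they voted
said_did_not = 25.4  # percent who said they did not vote
no_response = 12.8   # percent who did not answer the question

# Census approach: nonrespondents are folded in with nonvoters,
# so the turnout rate is simply the share who said they voted.
census_turnout = said_voted

# Conventional survey approach: drop nonresponses and compute
# turnout only among those who answered the question.
survey_turnout = 100 * said_voted / (said_voted + said_did_not)

print(f"Counting nonresponse as not voting: {census_turnout:.1f}%")
print(f"Omitting nonresponse:               {survey_turnout:.1f}%")
```

The second figure, roughly 70.9 percent, is the "drastically higher" estimate the passage above describes: nine points above the census's published rate, and well above what the ballot totals support.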
"They are literally cheating to make it look more accurate," says Jon Krosnick, a polling expert at Stanford who has worked with the Census Bureau. "They have been doing it for a long time."
It's not surprising that, absent this assumption, the census data would indicate inflated voter participation rates. Surveys of voting behavior, in which people are directly asked whether or not they went to the polls, consistently report significantly higher turnout rates than the actual number of ballots cast would suggest.
There are competing explanations for why this occurs. A study that Krosnick and others conducted for American National Election Studies, an academic survey of civic involvement, suggests that there is a bias in who chooses to participate in these studies. As the researchers put it: "[P]eople who vote in elections (and thereby choose to express their political preferences) also appear to be unusually likely to participate in political surveys (and thereby choose to express their political preferences)."
Another, more insidious explanation is that people who are directly asked if they voted are tempted to lie and say they did. Academics know this as the "social desirability bias"—even when responding anonymously, people who did not vote may be tempted to say they did since voting is generally considered a good thing.
"This is a very common problem that post-election surveys have a large amount of over-report bias," says Michael McDonald, a professor at George Mason University who closely studies voting data.
While McDonald is not as bluntly critical of the CPS survey as Krosnick is, he too is skeptical of how it conflates those who said they did not vote with those who did not respond. When he recalculated the recent census figures for white and black turnout while simply omitting those who did not respond, he found that black voters surpassed white voters in turnout four years ago, by a rate of 78.9 percent to 75.5 percent. As McDonald wrote yesterday in the Huffington Post:
These adjusted numbers may help resolve another incongruity in the CPS survey data. The Census Bureau reports that the overall number of voters increased from 131.1 million in 2008 to 132.9 million. This can't possibly be correct since my tabulations from official election results show the overall turnout declined from 132.7 million to 130.7 million.
McDonald's corrected figures may produce results more proportional to actual voting records, but the raw numbers are absurdly high.
The Census Bureau is well aware of these complicating factors. At the end of yesterday's report, the author, Thom File, acknowledges that, "In previous years, the disparity in the estimates in presidential elections has varied between 3 percent and 12 percent of the total number of votes shown as cast in the official tallies." The report also notes that "the respondent's willingness and ability to provide correct and accurate answers" is a potential source of error.
"Our strategy is to be as transparent as we can," File said in an interview today. In the bureau's defense, several of the spreadsheets of data that accompanied yesterday's report included raw data on the number of people who did not complete the survey, allowing for the sort of analysis that McDonald conducts. File defended the observations in his report in spite of these issues.
Information on who turns out to vote in an election is critically important to both parties as they project current party identification forward to predict future elections. The past election very well may have been the first one in which African-American voters turned out in higher proportions than white voters, but yesterday's figures are largely incapable of answering that question.