The Best or Worst Pollsters in the 2012 Election – How did Nate Silver Do It?

November 13, 2012

Nate Silver has gotten a lot of press for his near-perfect election night predictions. In 2008, Silver gained popularity and influence after he called 49 out of 50 states. Many pundits were predicting that Silver would crash and burn in 2012. Instead, he outperformed his 2008 predictions, and now that Florida has been called for President Obama, Silver can tout a perfect record: 50 out of 50 states called correctly.

How did Silver do it?

Only Likely Voters Matter: This is one obvious reason Silver’s method outshines everyone else’s. Anything other than a likely-voter model is useless information. This is key. Early in the race, many polling outfits poll registered voters. Silver ignores them. Likely-voter models are astronomically more accurate, but they are expensive. With registered voters vs. likely voters, you get what you pay for.
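
For what it’s worth, here is a toy sketch of that filter in Python. It is not Silver’s code; the poll records and field names are made-up assumptions, just to show the idea of dropping registered-voter samples before averaging.

```python
# Toy example (assumed data): keep only likely-voter ("LV") polls,
# drop registered-voter ("RV") samples, then average the margins.
polls = [
    {"pollster": "Pollster A", "population": "LV", "margin": 1.5},
    {"pollster": "Pollster B", "population": "RV", "margin": 4.0},
    {"pollster": "Pollster C", "population": "LV", "margin": -0.5},
]

likely_voter_polls = [p for p in polls if p["population"] == "LV"]

# Simple unweighted average of the remaining likely-voter margins
average_margin = sum(p["margin"] for p in likely_voter_polls) / len(likely_voter_polls)
print(f"Likely-voter average margin: {average_margin:+.1f}")
```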

Weighting the Data Correctly: You have to be able to handicap the pollsters. For polling data, sample size is everything. The higher the sample size, the better the data – unless the data is bad. Silver consistently fares well because he knows that not all data is created equal. Folks like Nate Silver and his FiveThirtyEight blog are polling aggregators. They put together many different polling samples from many different polling outfits and use them to get a bigger sample size – which is fine. But the key is in how you weight the data in your model. Silver knows how to do this better than anyone. He grinds it out, interviews the pollsters, combs through their methods and doesn’t take shortcuts.
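
As a rough sketch of what an aggregator does with that weighting idea (an assumed toy formula, not FiveThirtyEight’s actual model), you could weight each poll’s margin by its sample size and a pollster-quality factor:

```python
# Assumed toy data: (pollster, sample size, reported margin, quality rating 0-1)
polls = [
    ("Pollster A", 1200, 2.0, 0.9),
    ("Pollster B", 600, 5.0, 0.4),
    ("Pollster C", 2000, 1.0, 0.8),
]

def weighted_average(polls):
    """Weight each margin by sample size * quality, then average."""
    total_weight = sum(n * q for _, n, _, q in polls)
    return sum(n * q * m for _, n, m, q in polls) / total_weight

print(f"Aggregate margin: {weighted_average(polls):+.2f}")
```

The point of the sketch is simply that a big sample from a low-quality pollster shouldn’t drown out a smaller sample from a good one.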

Correcting for Bias: Silver’s stock in trade is his ability to look at a polling outfit’s methodology and track record, then correctly assess how its data is biased and correct for it. So, for instance, say CNN is reporting that a candidate has a 3-point lead. Silver looks at all of the factors and assigns the poll a bias number, let’s say a 1.7-point bias toward that candidate. Silver would then change the CNN result from a 3-point lead to a 1.3-point lead. He changes other people’s polling data to correct for their bias! Then he assigns the poll a weight (the higher the bias, the lower the weight, it seems to me; Silver doesn’t give out that information) and plugs it into his model.
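
Here is a back-of-the-envelope version of that adjustment, using the hypothetical CNN numbers above. The down-weighting rule is my own guess, since Silver doesn’t publish one:

```python
def adjust_for_bias(reported_margin, house_bias):
    """Subtract the pollster's estimated lean from its reported margin."""
    return reported_margin - house_bias

def weight_from_bias(house_bias, scale=1.0):
    """Assumed rule of thumb: the bigger the estimated bias, the smaller the weight."""
    return 1.0 / (1.0 + scale * abs(house_bias))

reported = 3.0   # CNN's reported 3-point lead (hypothetical, from the example above)
bias = 1.7       # estimated house effect toward that candidate (also hypothetical)

adjusted = adjust_for_bias(reported, bias)   # 3.0 - 1.7 = 1.3-point lead
weight = weight_from_bias(bias)              # larger bias -> smaller weight

print(f"Adjusted margin: {adjusted:+.1f} points, weight: {weight:.2f}")
```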

Math Trumps Ideology: FiveThirtyEight’s magic formula isn’t public, but all along Silver gave hints as to what he was doing. In the end, this is why I put so much stock in Silver’s predictions. He had been telling us all what the polling would do for months. He not only predicted the elections, he predicted the behavior of the polls for the last 2-3 months of the election cycle. This is because Silver applied his final bias correction to his own outfit as well. He kept this analysis in the realm of numbers, not ideology. He’s not an ideologue (self-described in an NPR interview as more of a libertarian); he’s a math nerd. It has paid off three elections in a row.

Post-Mortems on 2012 Polling
Now that all of the polling information is out there, Silver and his team are sorting through it. As he breaks down the data, three things really surprised me:

  • First, Gallup wasn’t even close. Silver never liked the numbers Gallup was getting, and he was clearly not weighting them heavily in his model. As it turns out, Gallup finished dead last among similar polling outfits, showing a 7.2 pt. bias toward Romney. Silver’s reasoning was simple: “Gallup has now had three poor elections in a row. In 2008, their polls overestimated Mr. Obama’s performance, while in 2010, they overestimated how well Republicans would do in the race for the United States House.”
  • Second, NPR had a higher bias for Romney than did Fox News (3.6 pt. bias for NPR / 2.6 pt. bias for FN).
  • Third, effective polling methodology has changed. According to Silver, the most accurate polling samples came from pollsters who used internet polling. The second best method was the live pollster making a live phone call. The worst method was the robo-call. It’s a shift to be sure. Silver wrote, “Perhaps it won’t be long before Google, not Gallup, is the most trusted name in polling.”

The Top Five Pollsters in 2012

  1. Investor’s Business Daily takes the prize for the most accurate polling data of 2012
  2. Google came in a close second
  3. The Mellman Group, led by Mark Mellman, shined again this year in third place
  4. RAND Corp., a non-profit research group, came in fourth
  5. CNN finished out the top 5

Other notables

6. Reuters – just out of the top five
11. Quinnipiac – middle of the pack for one of the high-profile pollsters
12. Marist
19. Zogby

Who Were the Worst Pollsters in 2012?

  • Rasmussen
  • American Research
  • Mason-Dixon
  • Gallup was dead last

