Of the nearly two dozen pollsters that met the parameters for our biennial study of pollster accuracy, it might seem odd at first blush that the crown for highest score came down to a tie between a partisan pollster (GOP firm Public Opinion Strategies) and the often-maligned (around these parts, at least) University of New Hampshire polling center.
However, given what we know now, it is hard to say that this distinction carries much honor with it.
For one thing, the results of this study, which covered pollsters for the 2014 cycle, unearthed (or, at a minimum, provided reams of evidence for) a series of weaknesses with the criteria that I discussed at length a week ago.
Despite unearthing those debatable components of the criteria, I elected to keep that set of criteria for the sake of continuity with the 2012 study (and the ability to compare results).
Another flaw becomes clear when one looks at the winners and losers of the 2014 polling derby: because the GOP clearly overperformed its polling numbers, pollsters that leaned GOP looked more accurate. It is no accident, I believe, that some of the pollsters that scored highest on accuracy in 2014 were among those that scored lowest in 2012.
Head below the fold for the results of our 2014 study, as well as a look at how we can actually glean some reasonable conclusions from what became an odd and problematic year for pollsters in American elections.