So Many Polls... Why Such Different Results?
By: Amanda Barna and Michelle Henry
With less than one week left before a close presidential election, several new polls are released every day. A question that is often asked is, "Why are the results so different from poll to poll?" What leads to these differing results is not the questions being asked, but rather who is being asked and the method by which they are contacted.
DIFFERENT DEFINITIONS OF LIKELY VOTERS - Most political polls published this time of year report results from registered voters who are likely to vote in the upcoming election, essentially trying to predict what voter turnout will look like. The way pollsters define likely voters varies greatly by organization, with models ranging from self-reported likelihood to vote to a seven-question index. Differing models can cause political parties and demographic groups to be overrepresented or underrepresented in the predicted turnout, which can produce large swings in poll outcomes.
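To see why the choice of screen matters, here is a small sketch in Python. The respondents and both screens are invented for illustration only, not any pollster's actual data or model; it simply shows how two reasonable likely-voter definitions applied to the same interviews can flip the topline result.

```python
# Hypothetical example: same interviews, two different "likely voter" screens.
respondents = [
    # each dict is one invented registered voter
    {"candidate": "A", "self_reported_likely": True,  "voted_last_time": True,  "follows_news": True},
    {"candidate": "B", "self_reported_likely": True,  "voted_last_time": True,  "follows_news": True},
    {"candidate": "A", "self_reported_likely": True,  "voted_last_time": False, "follows_news": False},
    {"candidate": "A", "self_reported_likely": True,  "voted_last_time": False, "follows_news": True},
    {"candidate": "B", "self_reported_likely": False, "voted_last_time": True,  "follows_news": True},
    {"candidate": "B", "self_reported_likely": True,  "voted_last_time": True,  "follows_news": False},
]

def screen_self_report(r):
    # Screen 1: anyone who says they are likely to vote counts as a likely voter.
    return r["self_reported_likely"]

def screen_index(r):
    # Screen 2: require at least two of three engagement markers.
    return sum([r["self_reported_likely"], r["voted_last_time"], r["follows_news"]]) >= 2

def tally(screen):
    likely = [r for r in respondents if screen(r)]
    a = sum(r["candidate"] == "A" for r in likely)
    b = sum(r["candidate"] == "B" for r in likely)
    return a, b, len(likely)

print("Self-report screen (A, B, n):", tally(screen_self_report))  # A leads 3-2
print("Index screen       (A, B, n):", tally(screen_index))        # B leads 3-2
```

Same six interviews, two different definitions of "likely voter," two different leaders.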
CELL PHONE INCLUSION - Cell phone ONLY households, those that no longer have landlines, total 34% in the US and 33% in Ohio. Adding households that use cell phones MOST OF THE TIME and keep landlines for purposes other than conversation pushes the figure to 49% (51% in Ohio). In half of US states, 50% or more of adults ages 18 and over are wireless-only or wireless-mostly, meaning fewer than half of the households there can be reached using traditional telephone survey methods based on residential landlines. Excluding cell phones underrepresents the attitudes and opinions of certain populations, such as younger adults and minorities. Even with cell phone prevalence this high, many pollsters are still not including cell phones in their samples.
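A rough back-of-the-envelope calculation, using the US percentages above and a round hypothetical household count, shows how much of the population a landline-only frame can actually reach:

```python
# Sketch using the US figures cited above; the household count is hypothetical.
households = 1000          # illustrative survey population
cell_only = 0.34           # no landline at all (34% in the US)
wireless_mostly = 0.15     # keep a landline but rarely use it (49% total minus 34%)

has_landline = households * (1 - cell_only)
realistically_reachable = households * (1 - cell_only - wireless_mostly)

print(f"Households with any landline:               {has_landline:.0f}")            # 660
print(f"Households realistically reached on one:    {realistically_reachable:.0f}") # 510
```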
Pollsters who include cell phones in their sampling frame: Gallup, NBC News/Marist College, New York Times/CBS News/Quinnipiac, and the Pew Research Center.
Pollsters who do NOT include cell phones in their sampling frame: Rasmussen Reports (which uses an online survey tool to reach those who can't be reached by landline) and Public Policy Polling.
Which pollster's approach is best? We will have the answer soon enough: on Election Day.
| Poll | Definition of Likely Voter |
|---|---|
| Gallup/CNN/USA Today | 7-point index based on answers to seven screening questions. |
| New York Times/CBS News/Quinnipiac | Likely voter model weights respondents by their probability of voting. The model includes registration, intent to vote, history of past voting, and when they moved to their current address. Actual probability data are not released. |
| Rasmussen | Model is based on past voting history, interest in the current campaign, and likely voting intentions. |
| Ipsos/Reuters | Model includes self-reported registered voters who indicate a great deal or quite a bit of interest in following news about the campaign and who meet at least one of several additional criteria. |
| Washington Post/ABC News | Although the poll is conducted jointly, each organization applies its own likely voter model. |
| NBC News/Marist College | Registered to vote, has a likely chance of voting (specific race is identified), and is interested in the election (specific race is identified). |
| Pew Research Center | Respondents who are NOT registered to vote and those who say they are not planning to vote are never included. A point is scored for each of the following: has given "a lot" of thought to the election; voted in a past election; "always" votes; plans to vote and rates certainty of voting this November as "9" or "10" on a 1-10 scale; is "absolutely certain" to be registered to vote; has previously voted in their precinct or election district; and follows government and public affairs "most" or "some" of the time, regardless of whether there is an election going on. |
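To make the points-based approach concrete, here is a minimal sketch of a Pew-style screen built from the items in the table above. The field names, the example respondent, and the cutoff score are assumptions for illustration; the article does not give Pew's actual threshold or weighting.

```python
# Minimal sketch of a Pew-style points screen; item names and cutoff are assumptions.
PEW_STYLE_ITEMS = [
    "thought_a_lot_about_election",
    "voted_in_past_election",
    "always_votes",
    "plans_to_vote_and_certain_9_or_10",
    "absolutely_certain_registered",
    "voted_before_in_precinct",
    "follows_public_affairs_most_or_some",
]

def likely_voter_score(respondent):
    """Return None if screened out entirely, else the number of items scored."""
    # Per the table: unregistered respondents and those not planning to vote
    # are never counted as likely voters.
    if not respondent.get("registered") or not respondent.get("plans_to_vote"):
        return None
    return sum(1 for item in PEW_STYLE_ITEMS if respondent.get(item))

# Hypothetical respondent for illustration:
example = {
    "registered": True,
    "plans_to_vote": True,
    "voted_in_past_election": True,
    "always_votes": True,
    "follows_public_affairs_most_or_some": True,
}
score = likely_voter_score(example)
print(f"Score: {score} of {len(PEW_STYLE_ITEMS)}")  # Score: 3 of 7
# Assumed cutoff, not from the article: treat only high scorers as likely voters.
print("Counted as likely voter?", score is not None and score >= 5)
```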