Tuesday, March 25, 2014

Poll questions can be skewed; here's some guidance, and an offer of help in dealing with them

By Al Cross
Director, Institute for Rural Journalism and Community Issues

It's an election year, and polls are flying to and fro. Some are worth writing about, some are not, and some are contradictory. Polls can be unreliable, or can conflict, for many reasons, and those are explored in studies published in Public Opinion Quarterly and excerpted in the latest edition of Journalist's Resource, a service of the Shorenstein Center on Media, Politics and Public Policy in the Kennedy School of Government at Harvard University.

The latest study is “Public Misunderstanding of Political Facts: How Question Wording Affected Estimates of Partisan Differences in Birtherism,” which examines the belief that President Obama was not born in the U.S. "Polling groups frequently found that there was a large partisan gap in terms of belief on the issue — that Republicans were much more doubtful of President Obama’s birthplace, sometimes exceeding Democrats by as much as 48 percentage points," Journalist's Resource notes.

The researchers, Jon A. Krosnick and Neil Malhotra of Stanford University, said they found such differences were attributable to "leading introductory sentences," such as in this question: “According to the Constitution, American presidents must be ‘natural born citizens.’ Some people say Barack Obama was not born in the United States, but was born in another country. Do you think Barack Obama was born in the United States, or do you think he was born in another country?” That poll produced the 48-point difference between Democrats and Republicans. Open-ended questions, where respondents are not forced to choose but rather must provide their own answer, "yielded the highest rates of apparent correct understanding and the smallest partisan gaps." (Read more)

This is a good example of why, in my nearly 16 years as chief political writer for The Courier-Journal of Louisville, I always asked to see a poll's complete questionnaire, at least up to and including the last question for which a result was provided. That also lets you see the "voter screen," the question or questions that try to divine the real "likely voters" in an election; if the percentage of respondents making it through the screen is nearly double the expected turnout, the screen is too loose. The pollster also had to certify the poll's methodology in writing (live interviewers or computerized recording? sample drawn from a file of frequent voters or a database of phone numbers? if the latter, how many callbacks to reach the selected respondents?) and be available for follow-up questions. There are other questions that can be asked about polls; if you need to consult about one, or need help analyzing one, you can call me at 859-257-3744 or send an email to al.cross@uky.edu.
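The screen test described above is simple arithmetic, and it can be sketched as a quick check. This is a minimal illustration, not any pollster's published method; the figures are hypothetical, and the "nearly double" rule of thumb is encoded here as an assumed cutoff factor of 1.8:

```python
def likely_voter_screen_check(pct_passing_screen, expected_turnout, loose_factor=1.8):
    """Compare the share of respondents passing the likely-voter screen
    to the expected turnout, both given as percentages.

    Returns the pass-to-turnout ratio and whether the screen looks too
    loose. The loose_factor cutoff of 1.8 ("nearly double") is a
    hypothetical encoding of the rule of thumb, not a published standard.
    """
    ratio = pct_passing_screen / expected_turnout
    return ratio, ratio >= loose_factor

# Hypothetical example: 55% of respondents survive the screen, but turnout
# in comparable elections runs about 30% -- a ratio near 1.83, so this
# screen would be flagged as probably too loose.
ratio, too_loose = likely_voter_screen_check(55, 30)
print(f"pass/turnout ratio: {ratio:.2f}, too loose: {too_loose}")
```

The same comparison can of course be done in one's head; the point is simply that the screen's pass rate is meaningless without an expected-turnout figure to measure it against.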


A 2014 study published in Public Opinion Quarterly, “Public Misunderstanding of Political Facts: How Question Wording Affected Estimates of Partisan Differences in Birtherism,” analyzes the structure of certain key polls in 2010 and 2011 that kept alive the misleading issue of President Obama’s origins. The study also provides results from a survey the researchers conducted in May 2011, after President Obama had released his long-form birth certificate to help settle the matter. Polling groups frequently found that there was a large partisan gap in terms of belief on the issue — that Republicans were much more doubtful of President Obama’s birthplace, sometimes exceeding Democrats by as much as 48 percentage points.
The paper’s lead authors, Jon A. Krosnick and Neil Malhotra of Stanford University, provide insight into the nature of the survey industry that should serve as a loud warning for media members: “We put a spotlight on one instantiation of a story that has played out on the national stage over and over again in recent decades: different survey organizations each craft questions ostensibly measuring the same opinion, and yet the organizations employ different question structures and different choices of words. It is almost as if each organization strives not to ask others’ questions and instead seeks to take a unique approach to measurement.”
The study’s findings include:
  • The evidence suggests that “apparently very different results were partly attributable to the use of leading introductory sentences in one closed-ended question. Removal of those sentences, which were not present in other questions, caused obtained results to be more similar to those produced by the other closed-ended questions.”
  • For example, in April 2011 a CBS/New York Times poll asked a closed-ended question in such a way that it produced a high degree of skepticism about President Obama’s origins, particularly among Republicans. The survey question read as follows: “According to the Constitution, American presidents must be ‘natural born citizens.’ Some people say Barack Obama was not born in the United States, but was born in another country. Do you think Barack Obama was born in the United States, or do you think he was born in another country?” The partisan gap revealed by this question (a 48-percentage-point difference between Democrats, at 19%, and Republicans, at 67%) was larger than that produced by the four other surveys conducted during spring 2011 by other groups.
  • In an experiment in May 2011, the study’s researchers replicated the CBS/New York Times survey wording and then compared it to responses to a simpler question: “Is your best guess that Barack Obama was born in the United States or that he was born in another country?” Among Republicans, this simpler question reduced skepticism about President Obama’s origins by 17.6 percentage points. This experiment “illustrates the impact of leading introductory sentences and raises caution about employing them in the future.”
  • One explanation for the variation is that, with the introductory sentences in place, “some respondents may have answered differently after hearing the introductory sentences because those words constituted instructions about which answer was anti-Obama. Viewed in this way, the CBS/New York Times question may have accurately tapped some people’s anti-Obama sentiment but did not necessarily accurately tap their understanding of Mr. Obama’s birthplace.”
  • Overall, open-ended questions, where respondents are not forced to choose but rather must provide their own answer, “yielded the highest rates of apparent correct understanding and the smallest partisan gaps.”
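The partisan gaps quoted in the excerpt are simple percentage-point differences. A minimal sketch, using the figures reported above for the CBS/New York Times question:

```python
def partisan_gap(republican_pct, democrat_pct):
    """Partisan gap in percentage points: the difference between the share
    of Republicans and the share of Democrats giving a particular answer."""
    return republican_pct - democrat_pct

# CBS/New York Times question, spring 2011: 67% of Republicans vs. 19% of
# Democrats expressed doubt about the president's birthplace.
print(partisan_gap(67, 19))  # 48
```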
- See more at: http://journalistsresource.org/studies/politics/ads-public-opinion/misunderstanding-birtherism-flawed-survey-wording-research-analysis
