Polls Can Create an Illusion of Public Opinion

George Bishop

George Bishop is an independent survey research consultant, a retired professor of political science at the University of Cincinnati and the author of The Illusion of Public Opinion.

Updated November 30, 2015, 3:21 AM

George Gallup thought he could measure the Will of the People. Like every pollster since, he believed that, with his new statistical-survey tool, he could not only predict the outcome of elections accurately, within a margin of sampling error, but also measure public opinion on social, economic and political issues with the same precision. It was just a matter of applying the same methodology, plus careful attention to how the questions were asked.

By asking vaguely worded questions about topics that the average citizen poorly understands, pollsters create a subjective reality.

With some notable exceptions, polls taken just prior to presidential elections have proved to be quite accurate. Fortunately, we can validate the accuracy of those predictions on Election Day.

But that’s not the case when pollsters ask survey respondents about policy issues such as how President Obama or Congress is dealing with “the economy,” “the federal budget deficit” or “foreign affairs.” Here pollsters often create an illusion of public opinion by asking vaguely worded questions about topics on which the average citizen is poorly informed. Even worse, respondents are often asked to answer questions they’re psychologically incapable of answering, such as the “reasons” why they prefer a given candidate or why they favor this or that policy. Cognitive neuroscientists tell us that we simply do not have introspective access to the unconscious processes that drive our opinions and preferences. We’re clueless.

Take, for example, the Gallup question on presidential approval: “Do you approve or disapprove of the way Barack Obama is handling his job as president?” What does “handling his job as president” mean to respondents? The same goes for the “Affordable Care Act.” Do many respondents know much about it other than that it’s “Obamacare,” and that they’re for or against Obama? Do the meanings of questions vary across respondents and over time? If so, that violates a cardinal assumption of survey practice: that a survey question should mean the same thing to all respondents. Otherwise we’re comparing apples with oranges.

If we ask respondents “why” they approve or disapprove of how President Obama is handling this or that problem, they really don’t know why. All they can do is come up with plausible justifications, or “reasons,” for their opinions after the fact. Respondents will willingly answer our questions, however vague those questions might be and however uninformed the respondents might be, unless we give them a chance to admit they don’t know much about the issue. If you build the questions, they will answer them. And that’s how pollsters, perhaps unwittingly, manufacture the “will of the people.”

