When a public policy issue, say climate change or health care reform, becomes politicized, people with strong partisan leanings sometimes have a hard time dealing with facts.

Douglas Blanks Hindman, an associate professor in the Edward R. Murrow College of Communication at Washington State University, researches this effect, which he labels the “belief gap” between knowable and testable claims and partisan perception of those claims.

Communication researchers have long had a theory about a "knowledge gap": because the mass media do not distribute information about science and public affairs equally, the difference between what highly educated and less educated people actually know grows considerably over time.


Hindman started questioning a few years ago whether something else—political leanings—was also affecting the acceptance of facts. “I’d been hearing my children coming home from school saying, ‘My friend’s parents think there’s no such thing as global warming.’ That was a shock to me. I knew these were highly educated people and I didn’t understand that.

“I wondered what is going on here. How can highly educated people hold perceptions that are at odds with objectively defined knowledge claims made by nonpartisan sources?”

Hindman theorized that perhaps we have become so politically polarized as a society that, instead of a knowledge gap, we now have a belief gap based on political affiliation rather than educational level.

He tested his ideas using national survey data from the Kaiser Family Foundation on the actual contents of the contentious Affordable Care Act. The surveys, fielded before and after the president signed the bill into law, asked people about four components of the bill: whether it requires all Americans to have health insurance or pay a fine; whether it closes the Medicare prescription drug “doughnut hole” coverage gap; whether it imposes a tax on insurers who offer the most expensive “Cadillac” plans; and whether it creates health insurance exchanges, or marketplaces.

“I don’t care whether you agree with it or not. Is there a difference in how Republicans and Democrats perceive the facts, or does education predict knowledge?” says Hindman. “It turns out the difference was predicted entirely by political affiliation, not by educational level.”

He has been working with communication researchers in other states on the belief gap hypothesis. One colleague found that the way questions are framed will sometimes activate partisan sentiment. “For example, when asked if it would ‘require Americans to have health insurance or pay a fine,’ Republicans would say yes, that’s in the bill, because it makes Democrats look bad. Democrats might say, oh no, it doesn’t do that,” says Hindman.

The danger of a belief gap is that it could hinder problem-solving on some of the most pressing issues facing the country, he says. “It’s a case where short-term political gain is interfering with distribution of knowledge in ways that can help people solve problems. It’s better to disagree on solutions; at least then you can find something in between. But if you disagree on whether the problem exists at all, you have nowhere to go.”

Another example comes from global warming. Scientists reached a broad consensus on the science, and Hindman believes an early public opinion survey would have found a knowledge gap, with understanding tracking education.

Once issues like this become politicized, however, Hindman says a belief gap can form, and those claims often get challenged by an industry that depends on the public not knowing, or being confused about, particular facts. He quotes Upton Sinclair to illustrate the point: “It is difficult to get a man to understand something, when his salary depends upon his not understanding it.”

Perceptions of the economy show how quickly the belief gap can change between partisans as well. In early fall 2008, studies showed most Republicans felt the national economy was “sound,” just as Senator John McCain claimed, while Democrats said the economy was in the tank.

“Guess what happened after Obama was elected?” asks Hindman. “Pew data showed that despite jobs numbers and other factors, Democrats thought the economy was doing much better than Republicans did.”

Hindman makes it clear that he does not think facts alone should make public policy; rather, facts should be tempered by values.

“It’s not that we should acquiesce to science or to some authority without holding science to ethical standards or values,” he says. “My issue is when politicians adopt these issues for political gain and the partisan followers fall in line without giving it thought.”

He also doesn’t blame the media entirely for the problem, though he says they could reject false equivalence, the practice of giving equal weight to two sides of an issue even when the sides clearly don’t have the same factual underpinnings. “For example, some people say smoking is bad for you, some people don’t, as if they are equally weighted,” says Hindman.

One way the media could help would be to improve public understanding of nonpartisan sources of information, such as the Congressional Budget Office or the federal Inspectors General. “The Bureau of Labor Statistics doesn’t care who benefits from lower unemployment figures. They just do their work,” says Hindman.

In the future, he would like to test his theories against perceptions of the nutritional value of organic foods, misperceptions about the health effects of marijuana, and beliefs about inoculation.