The Science of Why We Deny Science: Motivated Reasoning

By Chris Mooney | April 18, 2011 9:21 am

Over at Mother Jones, I have a major feature story that just went up about the psychology of science denial–and, indeed, denial in general. In it, I unpack a theory called “motivated reasoning,” which political psychologists have used to explain all manner of divides over factual, resolvable issues. Motivated reasoning is, in many ways, the updated, neuroscience-infused version of “cognitive dissonance”:

The theory of motivated reasoning builds on a key insight of modern neuroscience (PDF): Reasoning is actually suffused with emotion (or what researchers often call “affect”). Not only are the two inseparable, but our positive or negative feelings about people, things, and ideas arise much more rapidly than our conscious thoughts, in a matter of milliseconds—fast enough to detect with an EEG device, but long before we’re aware of it. That shouldn’t be surprising: Evolution required us to react very quickly to stimuli in our environment. It’s a “basic human survival skill,” explains political scientist Arthur Lupia of the University of Michigan. We push threatening information away; we pull friendly information close. We apply fight-or-flight reflexes not only to predators, but to data itself.

We’re not driven only by emotions, of course—we also reason, deliberate. But reasoning comes later, works slower—and even then, it doesn’t take place in an emotional vacuum. Rather, our quick-fire emotions can set us on a course of thinking that’s highly biased, especially on topics we care a great deal about.

Consider a person who has heard about a scientific discovery that deeply challenges her belief in divine creation—a new hominid, say, that confirms our evolutionary origins. What happens next, explains political scientist Charles Taber of Stony Brook University, is a subconscious negative response to the new information—and that response, in turn, guides the type of memories and associations formed in the conscious mind. “They retrieve thoughts that are consistent with their previous beliefs,” says Taber, “and that will lead them to build an argument and challenge what they’re hearing.”

In other words, when we think we’re reasoning, we may instead be rationalizing. Or to use an analogy offered by University of Virginia psychologist Jonathan Haidt: We may think we’re being scientists, but we’re actually being lawyers (PDF). Our “reasoning” is a means to a predetermined end—winning our “case”—and is shot through with biases. They include “confirmation bias,” in which we give greater heed to evidence and arguments that bolster our beliefs, and “disconfirmation bias,” in which we expend disproportionate energy trying to debunk or refute views and arguments that we find uncongenial.

I then apply the theory to climate denial, vaccine denial, creationism, and much else–including the persistence of political misinformation, such as the belief that Iraq had weapons of mass destruction. Again, you can read the Mother Jones piece here. I’ll have much more to say about it soon. I also unpack the implications a bit further over at DeSmogBlog.

Comments (9)

  1. Bobito

    Our “reasoning” is a means to a predetermined end—winning our “case”—and is shot through with biases.

    This is the problem with our highly politicized environment these days. Reasoning has gone away; arguing “your side” of the argument is more important than understanding the other’s point of view and where they may be more, or equally, correct.

    I think this is a product of the information age. Years ago, more people would make unbiased decisions based on how they feel about the information available to them. But these days, people tend to make decisions based on what their political affiliation tells them it should be. As if agreeing with the “other side” on any single point discredits the whole of their political ideology.

    If one finds oneself agreeing with all the views of one’s political party 100%, there is little doubt that one is not using rational thought, but confirmation bias.

  2. Jon

    Identity and group pressures seem very important, that and the ability to deliver messages that will be easily digested by key groups/constituencies…

    As Michael Baroody, president of AEI, said to a reporter back in the ’80s: “…our sophisticated ability to relate ideology to constituencies is what counts…”

    http://tinyurl.com/4sm4zzu

  3. Nullius in Verba

    How would you be able to tell if your own thinking was biased?

  4. Richard Thoman

    I like how the article itself proves the point that it’s talking about by the author’s trying to use reasoning to support his biases. Irony.

  5. Torbjörn Larsson, OM

    @ #5: I noticed the irony too, as evidenced by the first reference here being an unreferenced essay and the Haidt reference discussing moral judgment and its theories which inflated unfounded into meaning science theories. But science theories have factual, not moral behavior, basis.

    It is so close, but evidently Mooney doesn’t see the emotional bind he is in–call it cognitive dissonance or motivated reasoning. For science that supports an emotion-provoking, consistent _and_ factual approach to issues, one has only to read Rosenhouse’s post on that. It is months old; and it has actual references.

  6. I read your article and thought it was very interesting. Strangely, I had just finished David Brooks’ new book and in it he makes a similar point. In his book, he also makes the point that we need the subconscious valuations in order to be able to actually make a decision. Without the subconscious valuations, we would get stuck on weighing the pros and cons of a decision. It would be interesting to know what happens to the brain when one changes a deeply held belief. It seems that people who convert to another religion or ideology are often much more passionate about that new belief than the believers who have held that belief for a longer time.

  7. Carolinus

    Looking forward to Mr. Mooney’s application of this insight to other commonly and stubbornly held views, such as “socialism didn’t fail; it was never tried”; “the Cold War ended in spite of, not because of, Reagan”; “discovery of a single hominid fossil would confirm our evolutionary origins”; that sort of thing.

  8. Dad

    Hmmm… you lumped creationists in with vaccine denial. Let me encourage you to take the log out of your own eye before working on my speck of dust.

    How about original cause denial? How about 2nd law of thermodynamics denial? Your science is flawed precisely because you choose to ignore the foundational truths of science–there must be an original cause, He absolutely must be eternal, or you must throw out the 2nd law. I think your theories are a house of cards.


About Chris Mooney

Chris is a science and political journalist and commentator and the author of three books, including the New York Times bestselling The Republican War on Science--dubbed "a landmark in contemporary political reporting" by Salon.com and a "well-researched, closely argued and amply referenced indictment of the right wing's assault on science and scientists" by Scientific American--Storm World, and Unscientific America: How Scientific Illiteracy Threatens Our Future, co-authored by Sheril Kirshenbaum. They also write "The Intersection" blog together for Discover blogs. For a longer bio and contact information, see here.
