What is Motivated Reasoning? How Does It Work? Dan Kahan Answers

By Chris Mooney | May 5, 2011 8:32 am

I recently came across this post at Science & Religion Today, authored by Dan Kahan, who is the Elizabeth K. Dollard Professor at Yale Law School. It clarifies so many important issues about motivated reasoning–what it is, what it isn’t–that I asked Kahan if I could repost it here, as I think it deserves very wide circulation. He said okay. So here goes:

Recently, scholars and commentators have drawn attention to the contribution of “motivated cognition” to diverse political conflicts, including climate change and the birthplace of President Obama. I will offer a few points to help people assess such claims.

1. To begin, motivated cognition refers to the unconscious tendency of individuals to fit their processing of information to conclusions that suit some end or goal. Consider a classic example. In the 1950s, psychologists asked experimental subjects, students from two Ivy League colleges, to watch a film that featured a set of controversial officiating calls made during a football game between teams from their respective schools. The students from each school were more likely to see the referees’ calls as correct when they favored their school than when they favored their rival. The researchers concluded that the emotional stake the students had in affirming their loyalty to their respective institutions shaped what they saw on the tape.

The end or goal motivates the cognition in the sense that it directs mental operations—in this case, sensory perceptions; in others, assessments of the weight and credibility of empirical evidence, or performance of mathematical or logical computation—that we expect to function independently of that goal or end. But the normal connotation of “motive” as a conscious goal or reason for acting is actually out of place here and can be a source of confusion. The students wanted to experience solidarity with their institutions, but they didn’t treat that as a conscious reason for seeing what they saw. They had no idea (or so we are to believe; one needs a good experimental design to be sure this is so) that their perceptions were being bent in this way.

2. Motivated cognition is best understood as a description or characterization of a process and not an explanation in and of itself. For a genuine explanation, we need to know, at a minimum, what the need or goal was that did the motivating (or directing) of the agent’s mental processing and the precise cognitive mechanism or mechanisms through which it operated to generate the goal-supporting perceptions or beliefs.

Examples of the goals or needs that can motivate cognition are diverse. They include fairly straightforward things, like a person’s financial or related interests. But they reach more intangible stakes, too, such as the need to sustain a positive self-image or protect connections to others with whom someone is intimately connected and on whom someone might well depend for support, emotional or material.

The mechanisms are also diverse. They include dynamics such as biased information search, which involves seeking out (or disproportionally attending to) evidence that is congruent rather than incongruent with the motivating goal; biased assimilation, which refers to the tendency to credit and discredit evidence selectively in patterns that promote rather than frustrate the goal; and identity-protective cognition, which reflects the tendency of people to react dismissively to information when accepting it would cause them to experience dissonance or anxiety. Identifying these more concrete and empirically established mechanisms and giving a plausible and textured account of how they are at work is critical; otherwise, assertions of “motivated cognition” become circular—“x believes that because it was useful; the evidence is that it was useful for x to believe that.”

3. To be sure, motivated cognition can make us stupid, but it is not a consequence of stupidity. Social psychologists and behavioral economists distinguish between two forms of reasoning: “System 1,” which is rapid, intuitive, emotional, and prone to bias, and “System 2,” which is more deliberate, more reflective, more dispassionate, and (it is said) more accurate. A long line of research in social psychology, however, shows that “motivated cognition” spans the divide—that is, that needs and goals can unconsciously steer not only rapid “gut” reactions, but also even more systematic forms of analysis that are thought to be examples of “System 2.” Indeed, some researchers have shown that expert scientists are at least sometimes prone to motivated reasoning—that they conform the performance of their reflective and deliberate evaluations of evidence to the desire they have to see exciting conclusions vindicated and disfavored ones rejected. Scary stuff. And humbling (unless as a result of motivated reasoning we see evidence of its operation only in those who disagree with us—in which case, motivated reasoning makes us anything but humble).

4. Work on motivated cognition and political conflict tends to focus more on the need for maintaining a valued identity, particularly as a member of a group. It is certainly plausible that an individual would employ one or another of the mechanisms for motivated cognition to advance her economic interests. But the seeming inability of economic interests to explain who believes what on issues such as climate change, the HPV vaccine, one or another economic policy involving tax cuts or social welfare spending, and the like is in fact the motivation—as it were—for examining the contribution that identity-protective forms of motivated cognition are making.

Dan Kahan is the Elizabeth K. Dollard Professor of Law and a member of the Cultural Cognition Project at Yale Law School.

CATEGORIZED UNDER: Guest Posts, Motivated Reasoning

Comments (17)

  1. Mike

    Duh. People tend to interpret evidence in a way that is consistent with their prior beliefs. I hope the taxpayers didn’t get stuck with the bill for this ‘research’.

    The pernicious aspect of this banality is that it can be used to build a case that people who disagree with elitist dogma– the earth is burning up, overpopulation is an imminent threat to mankind, DDT is toxic to humans– can be labeled as psychologically defective. Victims of ‘Motivated Reasoning Syndrome’ or some such.

    How much do you wanna bet that a diagnosis of MRS will never be applied to Warmists, Paul Ehrlich types, or enviro-fascists?

  2. Chris Mooney

    @1 No no no. There is nothing “psychologically defective” here. This is human nature we’re talking about. Everyone is susceptible. This is *normal*. The point is to be aware of it and try to counteract it….

  3. Mike:

    “elitist dogma”? “Warmists”? “enviro-fascists”?

    Thanks for providing such a clear example of motivated reasoning in action.

  4. Thanks Chris for the post, and to Mike for, well, being Mike. As you say Chris, we all do this, all the time, a million times a day. It is in our nature, but at least for me, a new aspect of our nature that I had not previously appreciated.

    And Mike, I think you are absolutely on to something when you fear that only one side of an argument will be painted with the brush of motivated reasoning, as if it were some defective form of reasoning…as Chris points out, that is not the case…we are all doing it on all sides of issues…the point is indeed to be conscious…the interesting question is, and then what?

  5. Mike wants to bet that motivated reasoning analysis would never be applied to scientists. Considering that the article said “some researchers have shown that expert scientists are at least sometimes prone to motivated reasoning,” I don’t think it’s a very good bet for him.

    I am encouraged though that Mike read the first paragraph of the post, enough to leave a comment related to the topic.

  6. Dark Tent

    some researchers have shown that expert scientists are at least sometimes prone to motivated reasoning—that they conform the performance of their reflective and deliberate evaluations of evidence to the desire they have to see exciting conclusions vindicated and disfavored ones rejected. Scary stuff.

    I think we can all agree that’s a bad thing.

    But I would just like to point out that not all “motivated reasoning” in science is bad.

    In fact, quite inexplicably (mysteriously?), some has led to very successful theories.

    Einstein’s theory of general relativity was “motivated” by his “conviction” that nature must be “simple” and that the equations that describe nature should be “beautiful” (in a mathematical sense).

    Einstein surely understood that whatever theory (and equations) he came up with had to be in keeping with experiment, but that was not his primary motivator. In fact, in the final push for the field equations of general relativity, he focused almost exclusively on the mathematics (making the equations generally covariant).

    In the end, his pursuit of a mathematical rather than a physical approach was what allowed him to be successful. See “Untying the Knot: How Einstein Found His Way Back to Field Equations Discarded.”

    There’s actually a funny story (perhaps apocryphal) related to the Einstein case:

    A student once asked Einstein what he would have done if Sir Arthur Eddington’s famous 1919 gravitational lensing experiment, which confirmed relativity, had instead disproved it.

    [Einstein said] “Then I would have felt sorry for the dear Lord. The theory is correct.”

    As quoted in Reality and Scientific Truth: Discussions with Einstein, von Laue, and Planck (1980) by Ilse Rosenthal-Schneider, p. 74

    — from wikiquote

    Einstein is not a lone example of this particular (perhaps peculiar) type of motivated reasoning in science.

    In fact, one could make a pretty good argument that the “pursuit of beautiful equations” (for lack of a better moniker) has been a very significant (and successful) driver of physics.

    There’s really no rational explanation* for why it should work, but it does!

    *unless you consider “God is a mathematician” rational.

  7. This has been approached from a variety of angles in a variety of places, and it is, as Chris hints, a normal part of human development.


  8. kirk

    A more basic theory of the phenomena might include the tendency of Bayesian inference to just get things completely wrong. The prior probability of an untested hypothesis might start at 50%, coin-toss odds. Then the first new evidence arrives, let’s say the ‘paper’ vs. ‘plastic’ bag choice at the grocery store. Two different agents, say a soccer mom in a big hurry to get food on the table and a green hippie tree hugger, will use each factoid differently. As a hippie, I believe (my first hypothesis) that plastic in any form will harm Gaia. A soccer mom likes plastic because it fits all of the bathroom trash bins and saves on buying MORE plastic in the form of Hefty trash sacks. The agents may both agree on the PROBABILITY of the evidence, that plastic bags are worse than paper bags, but for two orthogonal reasons, so the Bayesian method leads to possibly matching or possibly mismatching desired outcomes. The soccer mom may decide not to line the trash can and change her INTERPRETATION of the Bayesian result: paper sacks win the contest when the kids go away to college.

    But motivated reasoning outside of a controlled study can never be pinned down. Think of it this way. I know that of the 100 employees I hire, only 20 of them will deliver 80% of the goodness I hired them for. The obvious solution is to continuously cull at least the bottom 20%, because these individuals actually detract from good outcomes. This is a just-so story. The real trick, correctly FINDING the lower 20%, is now the big problem to solve. But what are the factoids, and what probability do I assign to the evidence that Joe is a water cooler jockey who detracts from the strategic and tactical success of the organization? Again, the HR manager thinks Joe is okay because he turns in his performance objectives on time and always meets the idiotic HR guidelines while guarding the water cooler.

    Which facts win the day? Which alternative facts lose the day?

    Of course this is just my cognitive bias, I could be wrong.
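The two-agent story in the comment above can be sketched as a literal Bayesian update. Here is a minimal, hypothetical Python simulation (the priors and likelihoods are made-up numbers for illustration, not from the comment): both agents agree on how likely the evidence is under each hypothesis, yet their different priors leave them on opposite sides of the question.

```python
# Illustrative sketch (made-up numbers): two agents observe the same
# evidence E bearing on the hypothesis H = "plastic bags are worse than
# paper." They agree on the likelihoods P(E|H) and P(E|not H), but start
# from different priors, so Bayes' rule gives them different posteriors.

def bayes_update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Return the posterior P(H|E) via Bayes' rule."""
    numerator = p_e_given_h * prior
    evidence = numerator + p_e_given_not_h * (1.0 - prior)
    return numerator / evidence

# Shared likelihoods: the evidence is twice as likely if H is true.
P_E_GIVEN_H, P_E_GIVEN_NOT_H = 0.8, 0.4

hippie_prior = 0.9      # already convinced plastic harms Gaia
soccer_mom_prior = 0.3  # plastic bags are too useful to be that bad

hippie_post = bayes_update(hippie_prior, P_E_GIVEN_H, P_E_GIVEN_NOT_H)
soccer_mom_post = bayes_update(soccer_mom_prior, P_E_GIVEN_H, P_E_GIVEN_NOT_H)

# Same evidence, same likelihoods: the hippie ends up near 0.95, the
# soccer mom near 0.46 -- on opposite sides of the 50% line.
print(f"hippie:     {hippie_prior:.2f} -> {hippie_post:.2f}")
print(f"soccer mom: {soccer_mom_prior:.2f} -> {soccer_mom_post:.2f}")
```

Note that both updates here are perfectly rational; on this picture, motivated cognition enters when an agent unconsciously distorts the likelihoods themselves, for instance discounting evidence that threatens a valued identity.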

  9. Mike: as Chris says, No. The work my own research group does is very much aimed at demonstrating the symmetry of one form of motivated cognition — what we call cultural cognition, to draw attention to a particular account of what the source of the motivation is — across persons of diverse outlooks. One study we did, e.g., shows that perceptions of “scientific consensus” are motivated by cultural values (through the mechanism of the “availability effect”), generating skewed patterns of perceptions of what most scientists believe on a variety of issues. Measured against a baseline like the National Academy of Sciences’ “expert consensus” reports, nearly all groups are “right” on some issues, “wrong” on others — which is to say that anyone who thinks that only “those guys” are suffering from motivated cognition is, as it turns out, likely displaying a form of motivated reasoning (one that causes them to see its impact only in others…). Do you not think a science that would help people overcome the mistakes they are making about the sources of conflict about science in our Republic — mistakes that in fact reinforce the conflict by provoking distrust and recrimination — is a good investment for our society? I hope it turns out to be.


  10. Dark Tent

    More examples of “motivated reasoning” at the very highest levels of science:

    “Even when the evidence was going against them, Nobel prize-winners Murray Gell-Mann and Richard Feynman clung on to cherished theories just because they thought they were “beautiful”…
    Paul Dirac famously asserted that: “It is more important to have beauty in one’s equations than to have them fit experiment.” Richard Feynman, too, insisted on believing in one of his theories even when it seemed to contradict experimental data. “There was a moment when I knew how nature worked,” he wrote in 1957. “[The theory] had elegance and beauty.”…

    Despite the high reputation of the physicists responsible for the actual experiments, Feynman and Gell-Mann’s response was that there was something wrong with the experiments. They were right. Thus although experiments are essential for scientific theories, certain theories are just too important – too beautiful, one could say – to be discarded when the experiments don’t go your way.

    A Thing Of Beauty (Arthur I Miller)

  11. Jeff

    I’d like to propose an alternate way of looking at this. To me, alleviating biased cognition in all its forms is actually a function of practice. Before you continue reading, note that I said alleviate, as in lessen or diminish, NOT eliminate. 🙂 While I do believe the effects of biased cognition can be dealt with, and eventually evolved out of the human condition, I do not feel our generations of individuals can completely eliminate it. It’s too new in our collective realization, and in fact our culture reinforces biased cognition for a variety of reasons already stated in other posts.

    I speak only from my own experiences, of course, based on learning about this peculiarity of our minds from my grandmother at age six. Every time I ran crying to her when something “went wrong,” she would always stop me, calm me down, then ask me if I was telling her what my mind thought about what happened or if I was telling her what happened. By practicing that distinction between what happened versus what my mind interpreted, I find that I am able to see this in action within my own mind, and of course in the minds of others. Some folks think me quite weird, in fact, when I suddenly point out potentially flawed assumptions or interpretive reasoning behind my own opinions.

    I’m going to stop you again, since I’m relatively certain some of you are thinking that I just said I don’t suffer from biased cognition. Quite the contrary! I am merely saying that I can habitually see biased cognition in operation, even within myself, due to practicing this mental distinction exercise repeatedly for 36 years. On the flip side, I do sometimes wonder if I invent a perception of biased cognition because that’s what I expect to see, but that’s a rabbit hole I don’t want to follow just now. 🙂

    Suffice it to say, I don’t feel humanity is permanently saddled with this mental habit of believing what best supports our existing opinions, even when reality is quite different. We evolved this way in the first place, so I fully believe we will evolve differently over time as humanity grows and develops.

    What I currently don’t understand, however, is how anyone could deny that we humans do this, when the preponderance of evidence is absolutely overwhelming! I fully expect, at some future date, that the humans of the future will look back at us and say “We used to be so insane!”

  12. Matt B.

    I’m not sure if this counts as biased assimilation or identity-protective cognition. I started a discussion on a blog asking people how they would rewrite the 2nd Amendment to say more clearly what they think it says. After a hundred comments, no one had offered a rewrite, but several had claimed that the 2nd Amendment already clearly showed their interpretation (despite the raging arguments going on about it). This came from both sides of the issue.

  13. James in NJ

    Question: Granting that I could be entirely missing the point, don’t we already have a word for the phenomenon of “motivated cognition”, namely “prejudice”? If not, how is motivated cognition different?

  14. Doug

    Motivated reasoning was on display grandly during Chris Mooney’s POI interview. I’m surprised he hasn’t linked to it from The Intersection yet.



About Chris Mooney

Chris is a science and political journalist and commentator and the author of three books, including the New York Times bestselling The Republican War on Science--dubbed "a landmark in contemporary political reporting" by Salon.com and a "well-researched, closely argued and amply referenced indictment of the right wing's assault on science and scientists" by Scientific American--Storm World, and Unscientific America: How Scientific Illiteracy Threatens Our Future, co-authored by Sheril Kirshenbaum. The two also write "The Intersection" blog together for Discover blogs. For a longer bio and contact information, see here.

