Category: Motivated Reasoning

Did S&P Engage in Motivated Reasoning to Justify Its U.S. Downgrade?

By Chris Mooney | August 7, 2011 4:03 pm

Like many, many others, I’m pretty worried right now about the economic future of the United States. I’m not concerned with our fundamentals–I’m concerned with our political insanity and our apparent capacity for self-destruction…the Tea Party’s absolute intransigence, and the President’s amazing weakness in the face of the former.

That said, the basis for S&P’s ratings downgrade is called into serious question by this statement from the Treasury Department, which suggests the ratings agency was starting with the answer, and then looking for a rationale–a classic motivated reasoning behavior:

In a document provided to Treasury on Friday afternoon, Standard and Poor’s (S&P) presented a judgment about the credit rating of the U.S. that was based on a $2 trillion mistake. After Treasury pointed out this error – a basic math error of significant consequence – S&P still chose to proceed with their flawed judgment by simply changing their principal rationale for their credit rating decision from an economic one to a political one….

…Independent of this error, there is no justifiable rationale for downgrading the debt of the United States. There are millions of investors around the globe that trade Treasury securities. They assess our creditworthiness every minute of every day, and their collective judgment is that the U.S. has the means and political will to make good on its obligations. The magnitude of this mistake – and the haste with which S&P changed its principal rationale for action when presented with this error – raise fundamental questions about the credibility and integrity of S&P’s ratings action.

What do you think? Could motivated reasoning upend world markets? I myself have little doubt that this is possible. After all, if you were informed of a $2 trillion error in your calculations, wouldn’t you want to step back, wait, and collect your thoughts, rather than stampeding ahead with a previously agreed-upon action, especially with the stakes so high?

CATEGORIZED UNDER: Economy, Motivated Reasoning

Is It All Just Bias, All The Way Down?

By Chris Mooney | August 3, 2011 11:38 am

My last post at DeSmogBlog, about conservative white men and climate change, has brought up the perennial question: Is everybody as biased as these CWMs are, just on their own pet issues?

So I’ve done another post to at least partially tackle this issue, again at DeSmog. Here’s a brief excerpt:

…let me discuss one important stab at comparing left and right wing biases, found in several studies by Linda Skitka of the University of Illinois-Chicago and her colleagues.

In a well-known 2002 paper (see the 5th study), Skitka showed that liberals, unlike conservatives, update their initial views about whether a person who has contracted AIDS while knowing the risks and engaging in unprotected sex deserves government-subsidized health care services. Conservatives and liberals have the same negative first impression of such a person—they feel personal disapproval or even revulsion. But liberals then change their minds, go against their first impulses, and decide that the person deserves to be treated equally anyway. Conservatives don’t.

But Skitka showed in a more recent study that there are contexts in which conservatives, too, go against what you might expect. For instance, and as the last study implied, conservatives usually tend to think that there are no “extenuating circumstances”—that you’re personally responsible for what you do and how things turn out, whether you’re a criminal or someone on welfare or someone who knowingly contracts AIDS. However, in the newer study, Skitka showed that conservatives do consider extenuating circumstances (or what she calls “situational” factors rather than “dispositional” factors) when members of a group that they support, like the military or the police, are accused of wrongdoing.

However, I will note that Skitka and her colleagues did not detect conservatives actually changing their views when confronted with new or contradictory evidence—e.g., seemingly definitive proof that soldiers or police had actually done something wrong. She just caught them going against their general tendency to make “dispositional” rather than “situational” attributions. Honestly, you could argue just as easily that she captured flip-flopping (or special pleading on behalf of the military and the police) as that she captured open-mindedness and flexibility.

In any case, while I agree that everybody has biases, I’m not sure that means I must also agree that everybody is equally biased. To butcher George Orwell: why couldn’t it be the case that all humans are biased, but some humans are more biased than others?

You can read further here.

The Tablo Story: A Disturbing Case of Motivated Reasoning and the Internet

By Chris Mooney | July 27, 2011 2:10 pm

You have got to read this entire feature story in Stanford Magazine about Korean hip hop star Daniel Lee (aka Tablo), who came under attack from websites asserting that he hadn’t really graduated from Stanford. He had, but like the hard-core birthers, Tablo’s detractors refused to accept any evidence he could provide to document his academic background (like, say, an official university transcript). Meanwhile, the Korean media covered the story by telling “both sides.” Excerpt:

Black [the Stanford registrar] repeatedly confirmed that Daniel Lee the English major was a graduate in good standing but that only seemed to create more agitation. Some emailed to question Black’s integrity, suggesting that he was colluding with Lee. Black got angry. “These people don’t want the truth,” he says. “They dismiss everything that doesn’t align with what they already believe.”

Lee continued to fight back. On August 5, 2010, he released his Canadian citizenship certificate to the press. To his astonishment, he was promptly sued by four anonymous Koreans who charged him with forgery.

“I was doing everything they asked and it was never good enough,” Lee says. “That’s when I realized that they weren’t looking for answers, they just wanted to destroy me.”

Korean media widely reported the suit, which only served to further sow doubt about Lee’s identity among the general population. Gossip-oriented celebrity sites pored over every detail of the charges; the mainstream press even covered the case. The fact that Stanford had officially confirmed Lee’s diploma did not seem to check the flow of articles. By midsummer, Lee’s travails had become one of the biggest news stories in the country.

When is humanity going to wake up and realize that it is part of our nature to generate claims that comport with what we want to believe, and then to refuse to admit any contrary evidence, often becoming even more sure of ourselves as the factual refutations pile up?

This part of human nature isn’t just irrational–it can be destructive, and the Internet compounds it. We need to see it for what it is, and broadly acknowledge that it is part of us.

Read the rest of the story here.

CATEGORIZED UNDER: Culture, Motivated Reasoning

Why The Scientifically Literate Can Believe Silly Things

By Chris Mooney | July 26, 2011 10:07 am

If you understand motivated reasoning, then you understand that high levels of knowledge, education, and sophistication are no defense against wrongheaded views like climate change denial and anti-evolutionism.  What I’ll call “sophistication” may even make these phenomena worse, at least among those with deeply ideological or religious views.

The reason is that when we “reason” in areas where we have strong beliefs, our emotions come first and then we rationalize our pre-existing views. And those better at generating self-affirming arguments will be better rationalizers, will fall in love with their own seemingly brilliant arguments, and their minds will become harder to change (but they’ll love to argue).

According to a recent report in Science, those designing the National Science Foundation’s next Science and Engineering Indicators report–and particularly the much cited Chapter 7, which discusses the public’s views and knowledge about science–are now grappling with this problem. The issue involves measuring scientific literacy, and how to treat survey questions over evolution and the Big Bang–questions where religious conservatives who may be otherwise perfectly scientifically literate are going to say they don’t accept what science tells us.

Here’s Science:

Can a person be scientifically literate without accepting the concepts of evolution and the big bang? To many scientists and educators, the answer to that question is an unqualified “no.”

CATEGORIZED UNDER: Motivated Reasoning

"A Little Knowledge": Climate Skepticism and Sophistication

By Chris Mooney | June 27, 2011 9:13 am

My latest DeSmogBlog piece is a bit bibliographic: It elaborates on the post about the latest Kahan study by cobbling together four other studies showing something similar and obviously related:

Higher Education and Climate Skepticism. A 2008 Pew survey showed not only that Democrats and Republicans are polarized over whether they accept global warming, but also that for Republicans, having a college degree didn’t make one any more open to what scientists have to say. On the contrary, better-educated Republicans were more skeptical of modern climate science than their less educated brethren. Only 19 percent of college-educated Republicans agreed that the planet is warming due to human actions, versus 31 percent of non-college-educated Republicans.

Increased Knowledge and Climate Concern. In a 2009 paper in the journal Risk Analysis, a team of social scientists found that “Among people who trust scientists to provide reliable information about the environment and among Democrats and Independents, increased knowledge has been associated with increased concern. But among people who are skeptical about scientists and among Republicans more knowledge was generally not associated with greater concern.”

Interaction Between Education, Politics, and Views on Climate Change. A 2009 paper in Climatic Change by Lawrence Hamilton of the University of New Hampshire found that in two surveys—of New Hampshire and Michigan residents—more education and more self-professed knowledge of the issue were associated with greater climate skepticism among Republicans/conservatives. The author opined: “Narrowcast media, including many Web sites devoted to discrediting climate-change concerns, provide ideal conduits for channeling politically inspired but scientific-sounding arguments to an audience predisposed to retain and repeat them.”

Self-Professed Knowledge and Climate Polarization. A series of 2011 surveys by Hamilton similarly found that Republicans and Democrats who profess to know less about the climate issue are closer to one another in their views about whether global warming is really happening. By contrast, Democrats and Republicans who think they know a lot about the issue are completely polarized, with Republicans quite confident the science is wrong.

Not like this finding is robust or anything….For more elaboration, read here.

Why Does More Scientific Literacy Seem to Make Liberals More Accepting of Nuclear Power?

By Chris Mooney | June 24, 2011 1:47 pm

In my last post about the Kahan et al paper, I gave you the headline finding–scientific literacy and numeracy, if anything, seem to worsen climate denial, especially among those already opposed to climate action (hierarchical-individualists/conservatives).

But there’s another intriguing finding in the study. In fact, I would go so far as to call it an anomaly in need of explanation.

You see, it turns out that the pattern on nuclear power in the study is different from the pattern on climate change (see Figure 4). On nuclear power, the egalitarian-communitarians (liberals) generally start out thinking it’s riskier, and the hierarchical-individualists (conservatives) generally start out thinking it’s safer–when you ask them the question posed in the study, anyway (“How much risk do you believe nuclear power poses to human health, safety, or prosperity?”).

The starting positions are just what you would expect: egalitarian-communitarians (liberals) are suspicious of unregulated industry and worried about harm to, basically, everybody, especially the weakest in society. So when they hear about corporations doing risky things (like, say, nuclear power) they get their buttons pushed. The hierarchical-individualists (conservatives) are the opposite–individualists in particular celebrate private industry and the free market, so you would expect them to support nuclear power.

However, unlike in the case of conservatives and climate change, with increasing scientific literacy and numeracy, egalitarian-communitarians (liberals) *do not* move further in the direction where you would presume their initial biases would take them–i.e., towards perceiving more risk. Instead, with more education and numeracy, both groups grow less convinced that nuclear power is risky.

Do Scientific Literacy and Numeracy Worsen Climate Denial?

By Chris Mooney | June 24, 2011 10:04 am

Once again, Dan Kahan and his colleagues at Yale are out with a paper that dramatically challenges–using scientific data–much of what we would like to believe about the relationship between knowing more about science, and accepting science on contested issues. The paper is entitled “The Tragedy of the Risk-Perception Commons: Cultural Conflict, Rationality Conflict, and Climate Change.”

The brilliant maneuver in this study is to do a survey that not only measures whether people accept climate science, but correlates that with their scores on standard scientific literacy questions and tests of numeracy–the ability to think mathematically. Here’s the abstract:

The conventional explanation for controversy over climate change emphasizes impediments to public understanding: limited popular knowledge of science, the inability of ordinary citizens to assess technical information, and the resulting widespread use of unreliable cognitive heuristics to assess risk. A large survey of U.S. adults (N = 1540) found little support for this account. On the whole, the most scientifically literate and numerate subjects were slightly less likely, not more, to see climate change as a serious threat than the least scientifically literate and numerate ones. More importantly, greater scientific literacy and numeracy were associated with greater cultural polarization: respondents predisposed by their values to dismiss climate change evidence became more dismissive, and those predisposed by their values to credit such evidence more concerned, as science literacy and numeracy increased. We suggest that this evidence reflects a conflict between two levels of rationality: the individual level, which is characterized by citizens’ effective use of their knowledge and reasoning capacities to form risk perceptions that express their cultural commitments; and the collective level, which is characterized by citizens’ failure to converge on the best available scientific evidence on how to promote their common welfare. Dispelling this “tragedy of the risk-perception commons,” we argue, should be understood as the central aim of the science of science communication.
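The key statistical claim in the abstract is an interaction effect: the gap between the two cultural groups widens, rather than shrinks, as literacy and numeracy increase. Here is a minimal hypothetical sketch of that shape using simulated respondents; every effect size and variable coding below is invented for illustration, and only the sample size (N = 1540) comes from the abstract.

```python
# Hypothetical simulation of the polarization pattern the abstract describes.
# All coefficients and codings are invented; only n = 1540 matches the survey.
import numpy as np

rng = np.random.default_rng(0)
n = 1540                                 # sample size reported in the abstract
literacy = rng.uniform(0.0, 1.0, n)      # simulated literacy/numeracy score
values = rng.choice([-1, 1], n)          # -1 = hierarchical-individualist,
                                         # +1 = egalitarian-communitarian
# Concern tracks the values-by-literacy interaction, not literacy alone:
concern = 0.5 + 0.3 * values * literacy + rng.normal(0.0, 0.05, n)

def polarization(mask):
    """Mean gap in concern between the two value groups, within a subsample."""
    return concern[mask & (values == 1)].mean() - concern[mask & (values == -1)].mean()

low_gap = polarization(literacy < 0.2)   # least literate: groups close together
high_gap = polarization(literacy > 0.8)  # most literate: groups far apart
print(low_gap, high_gap)
```

With these made-up numbers, the concern gap between the value groups is small among the least literate respondents and much larger among the most literate, which is the shape of the result the abstract reports: more literacy, more cultural polarization.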

I plan to blog about several aspects of this paper, as its findings are so central to everything I’m trying to get across these days. For now, I’m just flagging it. I think it is an absolute must read.

Liberalism and Enlightenment History

By Chris Mooney | June 24, 2011 7:41 am

In preparing for my recent Point of Inquiry podcast with Rick Perlstein, I knew my guest would debunk right wing historical narratives of the sort that we’ve recently heard so much of, and do so with gusto. I screen guests at least that well.

But I didn’t know he was going to offer a thesis so in line with the one that I’ve been pushing myself lately–that when it comes to history, liberals are wedded to an Enlightenment tradition that creates its own biases and myopias. Here’s Perlstein:

Liberalism is rooted in this notion of the Enlightenment, the idea that we can use our reason, and we can use empiricism, and we can sort out facts, and using something like the scientific method—although history is not like nuclear physics—to arrive at consensus views of the truth that have a much more solid standing, epistemologically, than what the right wing view of the truth is: which is much more mythic, which is much more based on tribal identification, which is much more based on intuition and tradition. And there’s always been history writing in that mode too. But within the academy, and within the canons of expertise, and within the canons of professionalism, that kind of history has been superseded by a much more empirical, Enlightenment-based history.

As I’m no historian, I’m not exactly sure what the key turning points were–I mean, you could argue that mythic and triumphalist history goes all the way back to Homer. I’m sure much has been written on this, and I bet there’s a canonical work of historiography on this very topic.

In any case, as Perlstein goes on to argue, Enlightenment history has the virtue of being rigorous and accurate–like science does–but all the rigor, and all the details, can get in the way of telling an inspiring and motivating story. Therefore, you sort of have to grudgingly admire the effectiveness of conservative history–at least conservatives know that part of history is about telling a good story, mythic or otherwise.

For more on Perlstein’s thoughts, listen here.

Al Gore and the Enlightenment Ethic

By Chris Mooney | June 23, 2011 1:18 pm

Everybody is talking, and rightly so, about the big Al Gore piece in Rolling Stone on science, reason, and the climate crisis. And it is, indeed, quite a tour de force. Gore is not only a charismatic leader (now that he’s not running for president), he’s a great writer.

Nevertheless, I’m afraid to say that Gore is operating, big time, in liberal Enlightenment mode–precisely what I critiqued in The American Prospect. Let’s give some examples of Gore’s Enlightenment rhetoric:

Admittedly, the contest over global warming is a challenge for the referee because it’s a tag-team match, a real free-for-all. In one corner of the ring are Science and Reason. In the other corner: Poisonous Polluters and Right-wing Ideologues.


We haven’t gone nuts — but the “conversation of democracy” has become so deeply dysfunctional that our ability to make intelligent collective decisions has been seriously impaired. Throughout American history, we relied on the vibrancy of our public square — and the quality of our democratic discourse — to make better decisions than most nations in the history of the world. But we are now routinely making really bad decisions that completely ignore the best available evidence of what is true and what is false. When the distinction between truth and falsehood is systematically attacked without shame or consequence — when a great nation makes crucially important decisions on the basis of completely false information that is no longer adequately filtered through the fact-checking function of a healthy and honest public discussion — the public interest is severely damaged.

I agree with one part of Gore’s message wholeheartedly. We really have lost our grip on reality, and this really is endangering our politics and our civilization. Without facts, we’re screwed. We’re dysfunctional.

But I don’t agree with Gore’s account of why this happened. He blames the “powerful.” He blames the “Polluters.” He blames the media. But most of all, for him it’s special interests–money in politics, money in the fossil fuel industry, is blocking our progress and sowing misinformation.

Gore seems to assume that if these pernicious effects were vanquished–or controlled by better policy–then the “public interest” would triumph again and we would all rally around it–just as we would all embrace the same facts again. But that just isn’t true.

The truth is that we are psychologically programmed not to accept the facts; and moreover, we don’t all want the same things–liberals and conservatives, in particular, have different value systems and psychological needs. And liberals, in particular, need to think that society can be rational, and that science can fix our problems–and that if it isn’t working out that way, it must be due to some kind of wrongdoing or nefariousness.

But alas, while our state of dysfunction is very real, the cause is not some evil Machiavellian group of special interests (an argument that works less and less well, by the way, as more and more fossil fuel companies become supporters of climate action). No: the cause lies within ourselves, and our brains.

On the Subject of Fox News, Jon Stewart Shouldn't Have Backed Down

By Chris Mooney | June 22, 2011 8:49 am

Here’s a great segment from Jon Stewart last night–although it has one flaw. But watch first:

The problem is, Stewart wasn’t actually wrong, not even in the teeny way that he confessed to. Politifact cited the wrong studies to refute him, while ignoring numerous studies that I have found, all of which support Stewart. For a full explanation, see my latest DeSmogBlog item. Brief excerpt:

It is of course around contested political facts, and contested scientific facts, where we find active, politically impelled, and emotionally laden misinformation campaigns—and it is in the latter realm that Fox News viewers are clearly more misinformed. Once again, I’ve cited 5 studies to this effect—concerning the Iraq war, the 2010 election, global warming, health care reform, and the Ground Zero Mosque. By contrast, Politifact only cites two of these studies, and attempts to critique one of them (the 2010 election study)—misguidedly to my mind, but who really even cares. It is obvious where the weight of the evidence lies at this point, unless further, relevant studies are brought to bear.

As a result of all of this, Politifact should either produce relevant research to rebut Stewart, or run a far more forthcoming retraction than has been issued so far. Note, however, that the issue grew a tad more complicated last night when Stewart did an excellent segment on all of this, where he both dramatized how much Fox misinformed viewers and yet also kind of conceded Politifact’s point, when he didn’t actually have to. He wasn’t wrong. They were wrong.

When the fact checkers fail—and in this case, they not only failed, they generated a falsehood of their own–they have a special responsibility to self-correct.

Again, full post here.

