Wednesday, January 3, 2007



Do hawks have a psychological edge?

In the January/February issue of Foreign Policy, Daniel Kahneman and Jonathan Renshon make a very provocative argument -- as a species, humans are too damn hawkish:

National leaders get all sorts of advice in times of tension and conflict. But often the competing counsel can be broken down into two basic categories. On one side are the hawks: They tend to favor coercive action, are more willing to use military force, and are more likely to doubt the value of offering concessions. When they look at adversaries overseas, they often see unremittingly hostile regimes who only understand the language of force. On the other side are the doves, skeptical about the usefulness of force and more inclined to contemplate political solutions. Where hawks see little in their adversaries but hostility, doves often point to subtle openings for dialogue.

As the hawks and doves thrust and parry, one hopes that the decision makers will hear their arguments on the merits and weigh them judiciously before choosing a course of action. Don’t count on it. Modern psychology suggests that policymakers come to the debate predisposed to believe their hawkish advisors more than the doves. There are numerous reasons for the burden of persuasion that doves carry, and some of them have nothing to do with politics or strategy. In fact, a bias in favor of hawkish beliefs and preferences is built into the fabric of the human mind.

Social and cognitive psychologists have identified a number of predictable errors (psychologists call them biases) in the ways that humans judge situations and evaluate risks. Biases have been documented both in the laboratory and in the real world, mostly in situations that have no connection to international politics. For example, people are prone to exaggerating their strengths: About 80 percent of us believe that our driving skills are better than average. In situations of potential conflict, the same optimistic bias makes politicians and generals receptive to advisors who offer highly favorable estimates of the outcomes of war. Such a predisposition, often shared by leaders on both sides of a conflict, is likely to produce a disaster. And this is not an isolated example.

In fact, when we constructed a list of the biases uncovered in 40 years of psychological research, we were startled by what we found: All the biases in our list favor hawks. These psychological impulses—only a few of which we discuss here—incline national leaders to exaggerate the evil intentions of adversaries, to misjudge how adversaries perceive them, to be overly sanguine when hostilities start, and overly reluctant to make necessary concessions in negotiations. In short, these biases have the effect of making wars more likely to begin and more difficult to end.

Foreign Policy also invited Matthew Continetti and Matthew Yglesias to comment on the piece. Yglesias is enthusiastic about the finding, and goes even further:
Kahneman and Renshon actually end up being unduly generous to the hawkish point of view. Sometimes, of course, war is necessary. But since there are two sides to every conflict, hawks won’t always be right. Even in a case where an American president is rightly listening to his hawkish advisors (George H.W. Bush in the first Gulf War, say, or Bill Clinton over Kosovo), a foreign leader (Saddam Hussein, Slobodan Milosevic) is making a serious miscalculation in listening to his hawkish advisors.

In short, most decisions to go to war have been mistakes. Sometimes, as in World War I, both sides are making a mistake, and other times, as in World War II, only one side is, but the upshot is that the impulse to launch wars is more widespread than it ought to be. Indeed, hawks themselves recognize this fact. Pro-war arguments almost always contend that the enemy is irrationally aggressive, while overestimating one’s own military capabilities. Where the hawks go wrong is in their belief that irrational exuberance about violence is the exclusive province of real or potential adversaries, rather than a problem from which they themselves may suffer.

Continetti is less sanguine:
[W]hy do only the fundamental attribution errors of hawks lead to “pernicious” effects? Doves share the same bias; it just works in different ways. If hawks treat hostile behavior at face value when they shouldn’t, so too do doves treat docility. Those who championed the 1973 accords ending the Vietnam War saw them as a chance for the United States to leave Vietnam while preserving the sovereignty of the south. But to North Vietnamese eyes, the cease-fire was merely an opportunity to consolidate their forces for the final seizure of the south, which happened a mere two years later.

The second hawk bias Kahneman and Renshon identify is “excessive optimism,” which the authors speculate “led American policymakers astray as they laid the groundwork for the current war in Iraq.” Yet prior to the war in Iraq, some hawks worried that Saddam Hussein might set oil fields ablaze, as he had done in 1991. They worried that he might launch missiles against American allies in the region, that his removal might be long and bloody, and that post-Saddam Iraq would face humanitarian crises of great magnitude. Doves optimistically argued that Saddam could be “contained” even as the sanctions against him were unraveling and as America’s military presence in Saudi Arabia became increasingly untenable.

Why Kahneman and Renshon limit the biases they identify to hawks is something of a mystery. Take “reactive devaluation,” or “what was said matters less than who said it.” They cite likely American skepticism over any forthcoming Iranian nuclear concessions as an example, albeit conceding that doubt may be warranted in this case. They could have cited a domestic case instead: Just as many Republicans opposed President Clinton’s interventions in Haiti, Bosnia, and Kosovo, and at one point even accused him of resorting to force in order to distract from the Monica Lewinsky scandal, many Democrats now oppose Bush administration policies sight unseen because they don’t like the messenger. Doves are just as susceptible to reactive devaluation as hawks.

I love this article -- in fact, it's going in my Statecraft course for this semester!!

However, I love it in part because it's simultaneously clear, provocative, and way overblown as a hypothesis. That is to say, even if one acknowledges the individual-level cognitive biases discussed in the piece, it's a stretch to then conclude that foreign policies are more belligerent than they should be because of hawk bias.

If I have more time today, I'll try to fill out these cryptic points, but for now, here are my issues with the argument:


1) Definitional squabbles: I don't like the "hawk" and "dove" labels. Individuals can be hawkish in some situations but dovish in others. Indeed, there might be ideologies or operational codes that countermand the crude hawk/dove dichotomy.

2) There might be worse cognitive biases. Click here, here, here and here for a prior discussion about how, regardless of one's hawk and dove proclivities, even political experts get an awful lot wrong for reasons other than hawkishness. In fact, Kahneman and Renshon have posited a hedgehog theory of war, and that makes me think they've sipped from the very elixir they fear.

3) There are rationalist arguments for war. There aren't a lot of them, but they do exist. It would be interesting, however, if one could marry game-theoretic problems of imperfect information and credible commitment problems to fundamental attribution error and other hawk biases (and yes, please e-mail me or post a citation if someone has done this already).

4) This theory massively overpredicts war as an outcome. If one accepts this argument, then one would also have to explain why war has been such a historically rare event -- and it's been getting rarer. There are a lot of countervailing factors that the authors don't mention, including but not limited to bureaucratic politics, domestic politics, regime type, and the balance of power.

5) Organizations act as a particularly powerful constraint on cognitive limitations. This is one point of the original Carnegie school of organizational behavior.

6) I'm not sure Democrats want to be too enthusiastic about this finding. Let's have some fun and apply these cognitive biases to the domestic policy of liberals*. Hmmm..... so liberals will be likely to demonize their political opponents and misread their intentions.... they'll be excessively optimistic and prone to an "illusion of control" in their domestic policy ambitions.... and they'll double down on ambitious social programs that look like they're not working terribly well (cough, health care, cough). Run, run for your lives!!!

*Yes, this applies with almost equal force to Republicans, but Yglesias is defending the thesis here, so I'm using his side as an example.


posted by Dan on 01.03.07 at 11:20 AM




Comments:

Well, the first thought that came to mind as I read the above was that similar research exists on political advertising; i.e. negative ads are more effective than positive ads because people are hardwired to monitor the environment for threats and dangers. It's a survival instinct.

So, the findings don't surprise me and I tend to agree with them.

posted by: Nick Kaufman on 01.03.07 at 11:20 AM



It does indeed raise the question of why the human species isn't constantly at war. I think there is truth to it, but on the other hand inertia is also a massive component of human psychology, and it grows in power as the size of groups grows. Starting wars is relatively hard, but then so is stopping them sometimes.

And what of dove psychology? How does one explain appeasementism, which also appears to be a universal trait? Submission to the alpha male?

posted by: Mark Buehner on 01.03.07 at 11:20 AM



In short, most decisions to go to war have been mistakes.


Well, yeah.

...in the wars that were fought in the last 100 or so years, it seems that the party that initiated hostilities was usually the party that was ultimately defeated.

Insofar as the party that initiated hostilities did so in the belief that it would be the ultimate victor, yes, the decision to go to war was clearly a mistake.

posted by: rosignol on 01.03.07 at 11:20 AM



Drawing on a perceived dove/hawk dichotomy is useful for making a point but artificial all the same.

That wars are often if not always the result of 'cognitive' mistakes is hardly news and really does not change anything.

Yglesias' points are idiotic. How can you talk about the decision to go to war, especially in the cases of WWI and WWII, as being 'wrong' when so much history contributed to the possible inevitability of those events? Hitler didn't just invent the Third Reich out of thin air by making some 'bad' decisions.

That people are attracted to ostensibly strong, clear, forceful [simple] opinions as opposed to seemingly weak, complex, reserved [confusing] ones is also not new or particularly helpful, since life doesn't usually give you the chance to experiment with each to see which one will work. That only happens in the labs of cognitive scientists.

The upshot of the article seems to be that when it comes to making decisions fraught with danger and peril, we should be cautious and skeptical of motives -- again, hardly an earth-shattering revelation there.

And yet people keep ignoring advice - and that's one fact there's no getting around.

posted by: cull tech on 01.03.07 at 11:20 AM



Another problem with this thesis is that it completely ignores the dynamic nature of the game being played out in the run up to war. The policy makers on one side will have to assume that their opponent is a hawk; that is, belligerent, overly optimistic and likely to misjudge any peaceful overture as weakness. In that situation, even doves are likely to conclude that they need a preliminary show of force to overcome the other side's foolish hawkishness.

posted by: David Cohen on 01.03.07 at 11:20 AM





