We are willing to suffer more to keep what we have now

Thursday, October 4th, 2018

T. Greer reviews two books by Kenneth Payne on the psychology of strategy — The Psychology of Strategy: Exploring Rationality in the Vietnam War and Strategy, Evolution, and War: From Apes to Artificial Intelligence:

Payne isolates several aspects of human psychology that are especially relevant to strategic decision making, dividing them into three broader themes: unconscious biases that affect strategic decision making, the interaction between emotions and strategic action, and the critical role that social esteem plays in the psychology of strategy.


For example, there is strong evidence that humans become more certain in their beliefs and decisions when angry; the strategic calculations of an angry decision maker will differ fundamentally from those of a sorrowful one. One of the more intriguing ideas in Payne’s catalog of biases is his interpretation of Clausewitz’s dictum that the defense is stronger than the offense. This is true, Payne argues, because humans are loss averse. We are willing to suffer more to keep what we have now than we are to earn something new. If humans think about territory or international prestige this way, then commanders will be bolder when trying to recover lost ground, and their soldiers will be more determined in defense than on the attack.

The most interesting part of this discussion is Payne’s analysis of honor and esteem. Humans are social animals. The need for the esteem of other humans seems deeply ingrained in human psychology, and statesmen and strategists are not immune to it. In The Psychology of Strategy Payne provides scores of examples of strategic decisions by Lyndon Johnson, Richard Nixon, and other officials that served more to bolster the social esteem of the decision maker than to defeat the enemy. For Johnson and his officials, esteem and reputation were often explicitly described as the most important objective in the war: one administration official estimated in a memo that 70% of the reason the U.S. was escalating in Vietnam was to avoid humiliation; the other 30% was divided between the need to keep Vietnam out of Chinese hands and to help the people of South Vietnam live a freer life.


Esteem, emotion, and cognitive biases are human phenomena. If wars were fought by non-humans they would be fought differently. This is exactly what Payne imagines for the future of war. Artificial intelligence will be a revolution in warfare, Payne claims, because for the first time in man’s evolutionary history, strategy will be freed from the limits of human psychology: “Rather than creating a danger from AI acting strategically against humans, the main effects of AI are likely to be felt from AI acting in our interests.”


  1. Alistair says:

    I call absolutely wrong on the AI “biases”.

    “Esteem” isn’t an irrational psychological artifact that humans introduced to strategy. It is an integral part of strategy in a game of iterated trials of strength between known parties. “Credibility,” “esteem,” “commitment,” “reliability,” or whatever you call it: these traits evolved for good reasons! No one will ally with anyone who will fold at the first push. It is entirely rational to burn resources beyond the size of the local contest to signal your alliance-quality in other contests.

    There is absolutely no reason to suppose a competent AI would not evolve similar needs for credibility and “esteem” amongst its peers.

  2. Bob Sykes says:

    Trump is notoriously “confident.” His commitment to sanctions against all opponents, especially Russia and Iran, will not let him back down and negotiate our differences, and if the Russians and Iranians push back, we will likely get a war. The fake-American-neocons would be delighted.

  3. Lu An Li says:

    Defense is the stronger form of combat in that it is easier to do: you can accomplish more with less, and do so in a more planned manner, less subject to dynamic change.

    Since war is politics by other means, and political considerations also limit war-making decisions, AI will not only fight war in a different manner; ultimately a human override must also be included in the machines. Such is the hope.
