I’ve been meaning to read The Logic of Failure — subtitled Recognizing and Avoiding Error in Complex Situations — for some time, but after reading this review I realize it’s even more up my alley than I imagined:
In any bookstore, you will find dozens or even hundreds of books devoted to “success.” In this book, Dietrich Doerner works the other side of this street. He studies failure. Doerner, a professor of psychology at the University of Bamberg (Germany), uses empirical methods to study human decision-making processes, with an emphasis on understanding the ways in which these processes can go wrong. His work should be read by anyone with responsibility for making decisions, particularly complex and important decisions.
Doerner’s basic tool for study is the simulation model. Many of his models bear a resemblance to Sim City and similar games, but are purpose-designed to shed light on particular questions. The nature of many of these models means that they require human umpires as well as computer processing. (Doerner uses the simulation results of other researchers, as well as his own experimental work, in developing the ideas in this book.)
Probably the best way to give a feel for the book is to describe some of the simulations and to discuss some of the conclusions that Doerner draws from them.
In the fire simulation, the subject plays the part of a fire chief who is dealing with forest fires. He has 12 brigades at his command, and can deploy them at will. The brigades can also be given limited autonomy to make their own decisions.
The subjects who fail at this game, Doerner finds, are those who apply rigid, context-insensitive rules…such as “always keep the units widely deployed” or “always keep the units concentrated”…rather than making these decisions flexibly. He identifies “methodism,” which he defines as “the unthinking application of a sequence of actions we have once learned,” as a key threat to effective decision-making. (The term is borrowed from the great military writer Clausewitz.) Similar results are obtained in another simulation, in which the subject is put in charge of making production decisions in a clothing factory. In this case, the subjects are asked to think out loud as they develop their strategies. The unsuccessful ones tend to use unqualified expressions: constantly, every time, without exception, absolutely…while the successful “factory managers” tend toward qualified expressions: now and then, in general, specifically, perhaps…
The Moro simulation puts the subject in charge of a third-world country. His decision-making must include issues such as land use, water supply, medical care, etc. Time delays and multiple interactions make this simulation hard to handle effectively…a high proportion of subjects wound up making things worse rather than better for their “citizens.” Human beings, Doerner argues, have much more difficulty understanding patterns that extend over time than patterns that are spatial in nature.
Many subjects in this simulation showed obsessive behavior — they would focus on one aspect, such as building irrigation canals, and ignore everything else, without even really trying to understand the interactions.
Doerner wanted to know what kinds of previous experience would help most in this game, so he ran it once with a set of college students for subjects, and again with a set of experienced business executives. The students had probably been more exposed to concepts of “ecological thinking” — but the executives did significantly better. This argues that there are forms of “tacit knowledge” which are gained as a result of decision-making experience, and which are transferable to at least some degree across subject matter domains.
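Doerner doesn’t publish his models as code, but the time-delay trap he describes is easy to reproduce in a toy simulation. The sketch below is my own construction, not the Moro model itself, and every name and constant in it is invented for illustration: wells raise food production immediately, but each well starts draining the water table only after a delay, and a policy that reacts solely to the current harvest keeps drilling until the aquifer collapses.

```python
# Toy illustration (my construction, not Doerner's Moro model) of how
# time-delayed feedback defeats a policy that reacts only to the present.

WELL_DELAY = 5        # steps before a new well starts draining the aquifer
DRAIN_PER_WELL = 1.5  # water-table loss per mature well, per step
TARGET_FOOD = 100.0

water_table = 100.0
drilled = []          # time step at which each well was drilled

for step in range(40):
    mature = sum(1 for s in drilled if step - s >= WELL_DELAY)
    water_table = max(0.0, water_table - mature * DRAIN_PER_WELL)
    # Harvest grows with the number of wells but fails as the table drops.
    food = min(len(drilled) * 25.0, water_table * 2.0)
    if food < TARGET_FOOD:    # present-focused policy: still hungry, drill!
        drilled.append(step)
    if step % 5 == 0:
        print(f"step {step:2d}: wells={len(drilled):2d} "
              f"water={water_table:6.1f} food={food:6.1f}")
```

For the first dozen steps everything looks fine: food climbs and the water table barely moves, because the drawdown is still “in the pipe.” The collapse arrives later, and the policy responds to it by drilling faster. That is the Moro pattern in miniature: the feedback the decision-maker needs is separated in time from the action that caused it.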
One simple but surprisingly interesting experiment was the temperature control simulation. Subjects were put in the position of a supermarket manager and told that the thermostat for the freezers had broken down. They had to manually control the refrigeration system to maintain a temperature of 4 degrees C; higher and lower temperatures were both undesirable. They had available to them a regulator and a thermometer; the specific control mechanism was not described to the subjects. The results were often just bizarre. Many participants failed to understand that delays were occurring in the system (a setting does not take effect immediately, just as an air conditioner cannot cool a house immediately) and that these delays needed to be considered when trying to control it. Instead, they developed beliefs about regulator settings that could best be described as superstitious or magical: “twenty-eight is a good number” or, even more strangely, “odd numbers are good.”
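Doerner doesn’t give the freezer’s actual dynamics, so the sketch below substitutes a generic stand-in: a first-order cooling process whose response to the regulator arrives only after a fixed lag (all constants here are invented). It reproduces the qualitative result: a controller that expects an immediate response oscillates indefinitely, while one that makes small corrections and waits settles near the 4-degree target.

```python
# Hypothetical stand-in for the broken-thermostat task: cooling acts on
# the regulator setting only after LAG steps, mimicking Doerner's delays.
from collections import deque

LAG = 4          # steps before a regulator setting takes effect
AMBIENT = 20.0   # heat constantly leaks in toward this temperature
TARGET = 4.0

def run(policy, steps=60):
    temp = 12.0
    pipeline = deque([0.0] * LAG)   # settings still "in transit"
    history = []
    for _ in range(steps):
        pipeline.append(policy(temp))   # react to the thermometer now...
        effective = pipeline.popleft()  # ...but hardware acts on the past
        temp += 0.1 * (AMBIENT - temp) - 0.4 * effective
        history.append(temp)
    return history

def impatient(temp):
    # Ignores the lag: slam the regulator whenever the reading is wrong.
    return 10.0 if temp > TARGET else 0.0

def patient(temp):
    # Respects the lag: hold a baseline setting, nudge it gently.
    return 4.0 + 0.5 * (temp - TARGET)

for name, policy in [("impatient", impatient), ("patient", patient)]:
    tail = ", ".join(f"{t:5.1f}" for t in run(policy)[-6:])
    print(f"{name:9s} final temperatures: {tail}")
```

The impatient controller never sees a stable relationship between what it does and what the thermometer reads, which is roughly the position Doerner’s subjects were in when they concluded that “twenty-eight is a good number.”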
One very interesting angle explored by Doerner is the danger, in decision-making tasks, of knowing too much: of becoming lost in detail and of always needing one more piece of information before coming to a decision. He posits that this problem “probably explains why organizations tend to institutionalize the separation of their information-gathering and decision-making branches,” as in the development of staff organizations in the military. (It may also, it seems to me, have much to do with the hypercritical attitude that many intellectuals have toward decision-makers in business and government; that is, they fail to understand that the effective decision-maker must reduce a problem to its essences and cannot be forever exploring the “shades of gray.”)