The Logic of Failure

Monday, September 26th, 2011

A few years ago I read and enjoyed David Foster’s review of Dietrich Doerner’s The Logic of Failure, which explores how people make decisions — particularly how they make bad decisions when faced with complex systems. Now Foster points to another review, by The Social Pathologist, who cites this passage from the book:

We divided the Greenvale participants into three groups: a control group, a strategy group, and a tactics group. The strategy and tactics groups received instruction in some fairly complicated procedures for dealing with complex systems. The strategy group was introduced to concepts like “system”, “positive feedback,” “negative feedback,” and “critical variable” and to the benefits of formulating goals, determining and, if necessary, changing priorities, and so forth. The tactics group was taught a particular procedure for decision making, namely, “Zangemeister efficiency analysis.”

After the experiment, conducted over several weeks, the participants were asked to evaluate the training they had received; figure 39 shows the results. The members of the strategy and tactics groups all agreed that the training had been “moderately” helpful to them. The members of the control group, who had received training in some nebulous, ill-defined “creative thinking,” felt that their training had been of very little use to them. The differences in the evaluations are statistically significant. If we look at the participants’ actual performance as well as at their evaluations of the help they thought they got from their training, however, we find that the three groups did not differ at all in their performance.

Why did the participants who had been “treated” with certain procedures think this essentially useless training had been somewhat helpful? The training gave them what I would call “verbal intelligence” in the field of solving complex problems. Equipped with lots of shiny new concepts, they were able to talk about their thinking, their actions, and the problems they were facing. This gain in eloquence left no mark at all on their performance, however. Other investigators report a similar gap between verbal intelligence and performance intelligence and distinguish between “explicit” and “implicit” knowledge. The ability to talk about something does not necessarily reflect an ability to deal with it in reality.
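The pattern Doerner describes, a statistically significant difference in self-ratings alongside no difference in performance, is easy to see in a toy simulation. The numbers below are hypothetical illustrations, not the study's data; the Welch t statistic is computed by hand so the sketch needs only the standard library.

```python
import random
from statistics import mean, stdev

# Hypothetical data (NOT Doerner's): trained participants rate their
# training higher, but everyone's performance comes from the same
# underlying distribution.
random.seed(0)
n = 30
strategy_ratings = [random.gauss(3.5, 0.6) for _ in range(n)]  # 1-5 scale
control_ratings  = [random.gauss(2.0, 0.6) for _ in range(n)]
strategy_perf = [random.gauss(100, 15) for _ in range(n)]  # same distribution
control_perf  = [random.gauss(100, 15) for _ in range(n)]  # for both groups

def t_stat(a, b):
    """Welch's t statistic for two independent samples."""
    va, vb = stdev(a) ** 2, stdev(b) ** 2
    return (mean(a) - mean(b)) / ((va / len(a) + vb / len(b)) ** 0.5)

print(f"ratings t     = {t_stat(strategy_ratings, control_ratings):.2f}")
print(f"performance t = {t_stat(strategy_perf, control_perf):.2f}")
```

The ratings comparison yields a large t (a "significant" difference in how helpful the training felt), while the performance comparison yields a t near zero: subjective evaluations and actual outcomes can diverge completely.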

The Social Pathologist draws some conclusions:

If we think about this last experiment for a minute, its findings are profoundly disturbing. It would appear that theoretical problem-solving knowledge does not necessarily translate to practical problem-solving knowledge. Business school graduates do not necessarily make good businessmen. Perhaps one of the reasons why Western economies are faltering at the moment is that there are thousands of graduates from business schools occupying positions in senior management who can “talk the talk” but cannot “walk the walk”.

Doerner’s book also has implications for political theory. Take, for example, democracy. It would appear that the average man is suited to understanding simple and immediate problems, and as such would vote intelligently on such issues, but what about complex issues with long-term consequences? Democratic government, given human cognitive limitations, is surely bound to fail over the long term, since the bulk of men are not able to grasp the long-term consequences of even moderately simple decisions. Democracy (even tyranny) is ultimately corrupted by its own stupidity.

David builds on this:

The ability to work with abstractions fluently and effectively is important — part of this ability should be understanding the limits of abstractions. In business, for example, there are many companies paying too much attention to computerized “business intelligence systems” mining vast databases of customer behavior, and far too little attention to taking advantage of the tacit knowledge possessed by their front-line employees.

Higher education should result in an increased ability to deal well with abstractions — too often, it leads instead to the reification of abstractions, to treating them as more real than reality. I’ve met executives for whom the assumed position of a business on the BCG growth-share matrix (cows, dogs, stars, question marks) was much more real than the actual characteristics of the actual business.

Comments

  1. This is consistent with recent suggestions that reasoning didn’t evolve for discovering truth but for winning arguments. The MBA is vomited from business school ready for winning arguments but not for something like, say, business.

  2. Alrenous says:

    I’ve called [reification of abstractions] abstraction intoxication disease. I suspect the causation is that the working memory is saturated with abstractions, leaving no room for the concrete manifestations of the abstractions.

    It’s a natural mistake when thinking about systems much more complex than working memory. Chunking helps. Also, studying physics.

I’m beginning to suspect that the feeling of having learned something isn’t an error. No, wait — having written that out, I’m certain. However, subjects think it indicates domain knowledge, when it’s actually something else, like the ability to “win” arguments.
