The Western Way of War

Saturday, March 23rd, 2013

Victor Davis Hanson has argued that the western way of war — based on engaging the enemy in a decisive battle, rather than a series of skirmishes — led to western world dominance.

Peter Turchin could not disagree more:

Winning a battle is a very small part of winning a war. In fact, many wars were won despite losing all the battles. The Romans prevailed over Hannibal in the Second Punic War even though Hannibal smashed them in battle after battle. After the disaster of Cannae perhaps a third of Romans of military age were wiped out, and an even greater proportion of the ruling senatorial class was destroyed (yes, at that time the Roman senators fought and died in the front ranks). Nevertheless, the Romans ultimately prevailed. Similarly, in 1812 Napoleon won all the battles against the Russians, but it was his Grande Armée that was ultimately destroyed, and two years later Napoleon himself was deposed by the Allied Powers.

There is a great story about an exchange between two colonels, one American and the other Vietnamese in Hanoi, after the end of the US-Vietnam war (I am indebted to Ian Morris for bringing this quote to my attention). During their encounter, Colonel Summers told Colonel Tu, “You know, you never beat us on the battlefield.” Colonel Tu thought about it for a few moments, and then said, “That may be so, but it is also irrelevant.”

Now I don’t want to push this idea to the logical extreme. Winning battles is useful, but it is only one component, and not the most important, of winning wars. As career military officers like to repeat, “amateurs study tactics; professionals study logistics.”

Nuclear Iraq

Friday, March 22nd, 2013

Ten years ago Gregory Cochran sent an email to a concerned friend explaining that Iraq was getting steadily farther away from having nuclear weapons:

Look, back in 1990, they surprised people with their calutrons. No normal country would have made such an effort, because calutrons — mass spectrometers — are an incredibly inefficient way of making a nuclear weapon. We know just how inefficient they are, because E. O. Lawrence conned the government into blowing about a quarter of the Manhattan Project budget on a similar effort. Concentrating enough U-235 for one small fission bomb cost hundreds of millions of 1944 dollars. Probably the Japanese could have constructed new cities for less money than this approach took to blow them up. By far the cheaper way is to enrich the uranium just enough to run a reactor and then breed plutonium. The Iraqis wanted U-235, probably because it is much easier to make a device with U-235 than with plutonium. You don’t have to use implosion and you don’t even have to test a gun-type bomb — we didn’t test the Hiroshima bomb. I would guess that they realized their limitations — they’re not exactly overflowing with good physicists and engineers — and chose an approach that they could actually have made work. Implosion is not so easy to make work. India only got their implosion bomb to work on the seventh try, back in 1974, and they have a hell of a lot more technical talent than Iraq.

Anyhow, Iraq doesn’t have the money to do it anymore. The total money going into his government is what, a fifth of what it used to be? (Jeez, quite a bit less than that, when you look carefully.) Big non-private organizations tend to gradually slide towards zero output when the money merely stays the same: cut the budget and they fire the worker bees and keep a few PowerPoint specialists. There is no reason to think that Arabs are immune to that kind of logic of bureaucracy. On the contrary. Not only are they not making any nuclear progress, they’re probably making regress.

At best, if we hadn’t interrupted them back in the Gulf War, they would have eventually had a couple. I doubt that they would even have been an effective deterrent. It’s hard to make classic deterrence work when you have one or two bombs and the other guy has thousands, when he can hit you and you can’t hit him.

Why do we wear pants?

Friday, March 22nd, 2013

Why do we wear pants? Peter Turchin looks at their cultural evolution:

The basic garment worn by the Greeks was the chiton (basically, same as the Roman tunic). And wearing ‘sacks’ around their legs was something that only barbarians did. The Romans of the Classical Age felt the same way. Citizens were required to wear togas for any official functions, and at other times (e.g., for war) they wore tunics.

So if you go back to Italy of the Classical Age, nobody (apart from barbarians) is wearing trousers. Fast forward a thousand years to medieval Italy and all men are wearing a kind of trousers (hose).

Why did the Italians switch from tunics to pants? The answer is the horse. Not only are horses responsible for our living in complex, large-scale societies (or, at least, for how such large-scale societies first evolved), they are also the reason males have to swelter in pants in summer instead of wearing the cool kilt.

We wear pants because of the rise of cavalry:

While classical Greece and Rome produced excellent heavy infantry (hoplites), their cavalry was really pathetic. Yes, some of them (usually, the wealthy) rode horses. Among the Romans the upper class was even called ‘knights’ — equites, from equus, the Latin word for horse, but these ‘knights’ served mostly as officers and perhaps messengers. They never played a decisive role in battle.

On the other hand, the greatest enemy of the Romans, Hannibal, knew how to use cavalry. As long as the Numidian horse riders fought on the side of the Carthaginians, they trounced the Romans, again and again. The Romans won only one major battle in that war, the Battle of Zama, which ended the war. Interestingly enough, the Numidians switched sides just prior to the battle…

The Romans eventually realized that they had to acquire reasonably efficient cavalry. At first, cavalry was an auxiliary force, manned by non-Roman citizens. During the Empire (from the first century AD on), the Romans began to employ cavalry more effectively. But riding a horse while wearing a tunic is not very comfortable. So Roman cavalrymen started wearing pants, or braccae as they called them (borrowing a Celtic term; this word eventually became ‘breeches’).

After the collapse of the Roman Empire, Europe fell under the rule of warriors who fought from horseback — the knights (this transition actually occurred during the Carolingian times, roughly eighth century AD). So wearing pants became associated with high-status men, and gradually spread to other males. By the way, I am talking here about the Mediterranean cultures. In northern Europe, of course, pants were worn by both Celtic and Germanic people at least from the Iron Age on.

The same pattern holds elsewhere:

In Japan, for example, the traditional dress is kimono, but the warrior class (samurai) wore baggy pants (sometimes characterized as a divided skirt), hakama.

Before the introduction of horses by Europeans (actually, re-introduction — horses were native to North America, but were hunted to extinction when humans first arrived there), civilized Amerindians wore kilts. But when the Plains Indians started riding horses they also adopted pants. Another correlation is that typically only men wear pants (or men are first to switch to wearing pants).

One striking exception to this rule is the Amazons, who are, of course, famous for their horse-riding and archery skills.

The Power of Swarms

Thursday, March 21st, 2013

Groups — swarms, flocks, herds, mobs — produce complex behaviors from simple rules:

Golden Shiners

Behavior: Seek darkness

Presumably for protection, shiners search out dark waters. But they can’t actually perceive changes in light levels that might guide their way. Instead, they follow one simple directive: When light disappears, slow down. As a result, the fish in a school pile up in dark pools and stay put.


Ants

Behavior: Work in rhythm

When ants of a certain species get crowded enough to bump into each other, coordinated waves of activity pulse through the colony every 20 minutes.


Humans

Behavior: Be a follower

Absent normal communication, humans can be as impressionable as a flock of sheep. If one member of a walking group is instructed to move toward a target, though other members may not know the target—or even that there is a target—the whole group will eventually be shepherded in its direction.


Locusts

Behavior: Cannibalism

When enough locusts squeeze together, bites from behind send individuals fleeing to safety. Eventually they organize into conga-line-like clusters to avoid being eaten. They also emit pheromones to attract even more locusts, resulting in a swarm.


Starlings

Behavior: Do what the neighbors do

These birds coordinate their speed and direction with just a half dozen of their closest murmuration-mates, regardless of how packed the flock gets. Those interactions are enough to steer the entire group in the same direction.


Honeybees

Behavior: Head-butting

When honeybees return from searching for a new nest, they waggle in a dance that identifies the location. But if multiple sites exist, a bee can advocate for its choice by ramming its head into other waggling bees. A bee that gets butted enough times stops dancing, ultimately leaving the hive with one option.
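Rules this simple are easy to put in code. Here is a minimal toy simulation (my own sketch, not from the article — all numbers are invented) of the golden-shiner rule above: each fish random-walks and merely slows down where it is dark, yet the school ends up concentrated in the dark pool.

```python
import random

# Toy model of the golden-shiner rule: each fish random-walks along a
# 0..100 line, and its step size equals the local light level -- so fish
# move slowly in darkness. No fish senses light *gradients*, yet the
# school piles up in the dark pool.

def light(x):
    """Bright everywhere except a dark pool between 40 and 60."""
    return 0.1 if 40 <= x <= 60 else 1.0

def simulate(n_fish=200, steps=5000, seed=1):
    rng = random.Random(seed)
    fish = [rng.uniform(0, 100) for _ in range(n_fish)]
    for _ in range(steps):
        for i, x in enumerate(fish):
            step = rng.choice((-1.0, 1.0)) * light(x)  # slow down when dark
            fish[i] = min(100.0, max(0.0, x + step))
    # fraction of the school that ended up in the dark pool (20% of the line)
    return sum(1 for x in fish if 40 <= x <= 60) / n_fish

print(f"fraction of fish in the dark pool: {simulate():.2f}")
```

The fish never compare light levels; position-dependent speed alone produces the aggregation, which is the point of the "simple rules, complex behavior" framing.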

Ask Nassim Taleb Anything

Thursday, March 21st, 2013

I haven’t followed reddit in years, but Nassim Taleb is doing an ask me anything thread over there right now:

  • Rule: any company that would cause a national emergency requiring a bailout should it fail should be classified BAILABLE-OUT and employees should not be allowed to earn more than civil servants. That would force companies to 1) be small, 2) not leech off the taxpayer.
  • I share many things with Ayn Rand. But not selfishness. Rather to me honor to take risks and account for your action is the rule.
  • [Dawkins] doesn’t understand what belief means, and talks religion confusing pisteic (credere) and epistemic. Belief in religion is epiphenomenal. Religion is about practice. The real reason is that he doesn’t of course understand probability.
  • Antifragility is simply a local response. Complexity Science is about systems. My approach is less theoretical (more robust), but if I were to ascribe to a theory, I would subscribe to Complexity theory.
  • I came to realize that FU money was a state of mind. Many rich people never have it. A train conductor/intellectual I know had it.
  • I will be honest. I often discover books because people tell me that I am similar to the writer, and later start imagining that they were an influence. It looks like a backward process.
  • The general problem is that we are not made to control our environment, and we are designed for a degree of variability: in energy, temperature, food composition, sleep duration, exercise (by Jensen’s inequality). Depriving anyone of variations is silly. So we need to force periods of starvation or fasts, sleep deprivation, protein deprivation, etc. Religions force shabbats, fasts, etc. but we are no longer under the sway of religions. The solution is rules.

A commenter shared this systems theory translation by John Michael Greer of a passage from the Tao Te Ching:

A process as described is not the process as it exists;
The terms used to describe it are not the things they describe.
That which evades description is the wholeness of the system;
The act of description is merely a listing of its parts.
Without intentionality, you can experience the whole system;
With intentionality, you can comprehend its effects.
These two approach the same reality in different ways,
And the result appears confusing;
But accepting the apparent confusion
Gives access to the whole system.

Drivers vs. Pedestrians

Thursday, March 21st, 2013

While visiting Frankfurt and Moscow, Peter Turchin found himself thinking about social norms governing interactions between drivers and pedestrians:

These norms vary dramatically between countries, and even regions within countries. In New York City, for example, pedestrians pay no attention to traffic lights — you check the traffic and cross the street. In Seattle, on the other hand, you are not supposed to do that, and cops will actually write you a ticket for jaywalking (at least, they did in the 1980s, when I did my post-doc there).

In Germany pedestrians are very disciplined and will wait to cross the street until they get the green light — even if there is no traffic. For somebody raised in New York (and many other places outside of Germanic countries), this feels really weird, and even unnatural. I noticed that many tourists crossed illegally, with natives looking upon such ‘antisocial behavior’ disapprovingly. Frankfurt is not a big tourist destination, but I wonder whether the norm prohibiting jaywalking is sustainable in cities where the majority of pedestrians are foreigners. Theoretically, if enough people disregard a norm, it should collapse.

Then there are norms regulating when drivers should yield to pedestrians. In many countries, including Russia, pedestrians have the right of way on zebra-crossings. But, as we all know well, just having a law on the books doesn’t mean that it is actually followed. When I again started visiting Russia regularly in the 1990s, I noticed with dismay that drivers paid no attention to pedestrians trying to cross a street. Using zebra crossings became a deadly game of Russian roulette (sorry about the cliché).

When asked, my friends offered several explanations. One obvious possibility was that the behavior of drivers simply reflected the general unraveling of cooperative norms that accompanied the civilizational and societal collapse of the Soviet Union. Another explanation was that the 1990s were the first decade when the Russians began using automobiles massively, and the norms of civilized behavior simply had not had a chance to spread through the population of new drivers. The third one pointed to the influx of drivers from the North- and Trans-Caucasian republics (then, as now, most taxi drivers in Moscow came from that region), who brought a different set of norms with them.

A more general (ultimate, rather than proximate) explanation is suggested by recent theoretical research on the evolution of cooperation. Cooperative equilibria tend to be fragile, and can collapse in no time at all. A more interesting and difficult question is how we can go from a noncooperative equilibrium to a cooperative equilibrium. This is where the story gets interesting.

During the early 2000s, drivers gradually started treating pedestrians more considerately. This trend became very noticeable last time I was in Moscow, a week ago. Now when you come to a zebra crossing drivers routinely stop for you (a major exception, however, is zebra crossings across very busy roads with four or more lanes). This seems to be a true equilibrium, because all players expect drivers to stop for pedestrians. This includes other drivers, which is important because previously a major worry was that if you stop at a zebra, you could be hit from behind by another car that did not expect you to do it. Pedestrians now start crossing fairly confidently, whereas during the 1990s they behaved like deer during the hunting season. And even cops started enforcing the law, which is probably the most amazing development, given how notoriously corrupt the road police are in Russia.

It’s interesting to speculate how this positive change came about. A part of the explanation is that there were several well-publicized cases of drivers killing pedestrians on zebra crossings. Two years ago a law was passed that required drivers to stop when a pedestrian approached a zebra crossing (previously they were required to stop only when someone was already crossing). But while this is undoubtedly part of the story, I feel that laws by themselves are insufficient; there must also be a cultural change that enables laws to become effective.

I queried my local informants and I heard a similar story from three independent sources. Basically, the claim is that this is a case of cultural diffusion of social norms from European countries, carried by Russians who visit them as tourists and businessmen. One of my friends related to me the story of how he was driving in Germany several years ago, and habitually did not stop for pedestrians at a zebra crossing. He particularly noted how those people looked at him as he was whizzing by.

Humans are very good at conveying the information that a norm is being violated, and very sensitive to receiving such signals. Maintaining cooperative norms is much easier if signals are sent to norm violators by third parties. In Moscow now pedestrians expect cars to stop for them, and they will look pointedly at those who don’t do so. This bodes well for the stability of the new cooperative equilibrium. Additionally, while cops should fine violators, my guess is that it is more important that society at large clearly expresses its disapproval of norm violators. We have a legal speed limit of 65 mph on highways in the United States, yet despite millions of tickets handed out, the majority in the state where I live drives at around 80 mph. There is simply no social stigma associated with driving above the speed limit.
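The two-equilibria story can be sketched with a toy replicator-dynamics model (my own illustration; the payoff numbers are invented purely to show the mechanism): drivers imitate better-off drivers, stopping costs time, and not stopping draws social sanctions that scale with how many others already uphold the norm.

```python
# Toy replicator dynamics for the zebra-crossing norm. x is the fraction
# of drivers who stop; stopping costs time c, while not stopping draws
# social sanctions s * x that grow with how many others uphold the norm.

def run(x0, c=0.2, s=1.0, dt=0.1, steps=2000):
    x = x0
    for _ in range(steps):
        advantage = s * x - c              # payoff(stop) - payoff(don't stop)
        x += dt * x * (1 - x) * advantage  # imitate better-off drivers
        x = min(1.0, max(0.0, x))
    return x

print(run(0.1))  # starts below the tipping point c/s = 0.2: norm collapses
print(run(0.3))  # starts above it: stopping spreads to (nearly) everyone
```

Below the tipping point the norm unravels on its own; above it, it becomes self-enforcing. That is one way to see how both the lawless 1990s and today's considerate driving can each be a stable equilibrium.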

The Babbage Difference Engine in High Resolution

Wednesday, March 20th, 2013

Babbage’s difference engine is mesmerizing:
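Part of what makes it mesmerizing is that the machine tabulates polynomials using nothing but repeated addition of finite differences. A minimal sketch of the method (using x² + x + 41, the polynomial Babbage reportedly chose for his 1822 demonstration):

```python
# Sketch of the method of finite differences: to tabulate a degree-n
# polynomial, seed the machine with f(0) and its first n differences,
# then generate every further value by addition alone -- no multiplication.

def difference_engine(diffs, n):
    """diffs = [f(0), Δf(0), Δ²f(0), ...]; returns [f(0), f(1), ..., f(n-1)]."""
    d = list(diffs)
    out = []
    for _ in range(n):
        out.append(d[0])
        for i in range(len(d) - 1):  # fold each difference into the one above
            d[i] += d[i + 1]
    return out

# f(x) = x**2 + x + 41: f(0) = 41, Δf(0) = f(1) - f(0) = 2, Δ²f = 2 (constant)
print(difference_engine([41, 2, 2], 5))  # [41, 43, 47, 53, 61]
```

Because a degree-n polynomial has a constant nth difference, the whole table falls out of n additions per row, which is exactly what the engine's columns of wheels mechanize.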

Why Do Economists Urge College, But Not Marriage?

Wednesday, March 20th, 2013

Why Do Economists Urge College, But Not Marriage?

College improves your earning prospects. So does marriage. Education makes you more likely to live longer. So does marriage. Yet while many economists vocally support initiatives to move more people into college, very few of them vocally favor initiatives to get more people married.

Megan McArdle offers this parsimonious explanation:

All economists are, definitionally, very good at college. Not all economists are good at marriage.

First Sunstone Found

Wednesday, March 20th, 2013

Archaeologists have finally found a fabled Viking sunstone in a later, British shipwreck:

The crystal was found amongst the wreckage of the Alderney, an Elizabethan warship that sank near the Channel Islands in 1592. The stone was discovered less than 3 feet (1 meter) from a pair of navigation dividers, suggesting it may have been kept with the ship’s other navigational tools, according to the research team headed by scientists at the University of Rennes in France.

A chemical analysis confirmed that the stone was Icelandic spar, or calcite crystal, believed to be the Vikings’ mineral of choice for their fabled sunstones, mentioned in the 13th-century Viking saga of Saint Olaf.

Today, the Alderney crystal would be useless for navigation, because it has been abraded by sand and clouded by magnesium salts. But in better days, such a stone would have bent light in a helpful way for seafarers.

Because of the rhombohedral shape of calcite crystals, “they refract or polarize light in such a way to create a double image,” Mike Harrison, coordinator of the Alderney Maritime Trust, told LiveScience. This means that if you were to look at someone’s face through a clear chunk of Icelandic spar, you would see two faces. But if the crystal is held in just the right position, the double image becomes a single image and you know the crystal is pointing east-west, Harrison said.

These refractive powers remain even in low light when it’s foggy or cloudy or when twilight has come. In a previous study, the researchers proved they could use Icelandic spar to orient themselves within a few degrees of the sun, even after the sun had dipped below the horizon.


Wednesday, March 20th, 2013

Peter Turchin explains the forces behind imperiogenesis, the formation of empires:

Take the case of the Sinic (Chinese) civilization. Over the last three thousand years the cradle of the Chinese civilization, the Yellow River Basin, has been unified by one empire after another. There is no other region on Earth that could rival the Yellow River Basin in the intensity of ‘imperiogenesis’ (proportion of time that it found itself within a large empire). In a series of publications (for example, this one) I have argued that the explanation of this remarkable pattern has to do with the very intensive warfare between nomadic pastoralists (Hunnu, Turks, Mongols, etc) and the agrarian Chinese. This is why China was typically unified from the North (and most frequently from the Northwest) – it was the military pressure from the Great Eurasian Steppe that selected for unusually cohesive North Chinese societies, which then would go on to build huge empires by conquering the rest of East Asia.

Steppe frontiers are crucibles of empires; you add a major river and you are practically guaranteed to have an imperiogenesis hotspot. Examples are numerous: the Nile, the Tigris-Euphrates, and the Indus are the usual suspects. But the first empires in sub-Saharan Africa (Ghana, Mali, and Songhai) arose on the Niger River where it flows through the Sahel. This correlation has been long noted. Karl Wittfogel attempted to explain this observation with his theory of ‘hydraulic empires’, based on the control of irrigation by state bureaucracy, but this theory has been empirically disproved. For example, the major river of eastern Europe, the Volga, was the cradle of a number of empires (Bulghar, the Kazan Khanate, and, most notably, Muscovy-Russia), none of which relied on intensive irrigation. Nor did the Chinese along the Yellow River. In other civilizations irrigation was typically a local, rather than an imperial concern. Most likely, the river effect is due to a combination of good environment for intensive agriculture on alluvial soils and the ease of communications (because transporting goods on water was an order of magnitude cheaper than carting them on land).

High Kings and Galactic Emperors

Tuesday, March 19th, 2013

Science fiction curiously includes a large number of High Kings and Galactic Emperors:

“Curiously” in the sense that (at any rate to ‘Murricans) it is a form of government associated with the past, and certainly not with rocket ships, monorails, food pills, cyborgs, or the rest of the retro-future paraphernalia that sci-fi still loosely connotes in the popular culture.


For my purpose, the virtues or defects of monarchism as a political position are fairly beside the point. Kingship has certainly been widespread, suggesting that it was a workable default position, at any rate in the agrarian age. For an intellectual defense you probably still can’t do better than Hobbes’ Leviathan. Not to mention that as a critique of anarchism and its cousins, it is hard to improve on “solitary, poor, nasty, brutish, and short.”

But I would argue — in fact, I will argue — that the roots of monarchism in SF have less to do with political philosophy than with basic story considerations.

Bourgeois representative democracy, classical Athenian-style democracy, classical Roman-style republicanism, medieval oligarchical republicanism a la Venice, military juntas, fascistic fuehrerprinzip, Leninist dictatorship of the proletariat, nominally Communist party-committee oligarchy, pure bureaucratic functionary-ism, and both Iranian and al-Qaeda style theocracy, all have at least one thing in common: The likelihood of a teenage girl becoming head of state under any of these systems is pretty much nil.


Or, to put it another way, hereditary monarchy is singularly well-suited to Romance. By fully entangling the personal and the political it provides great story fuel. And story trumps futurism, or even political philosophy, every time.

One of the commenters mentions the Dune Encyclopedia, which was written as if it existed in the fictional universe of the books:

It filtered all that was known about the present through a “Monarchist” filter. So World War II became a “minor trade dispute between House Tokyo and House Washington in the British Empire”.

Actually, here’s the original passage, featuring the Houses Washington, Nippon, and Windsor:

The practice of maintaining stockpiles of atomic weapons as an integral part of a House’s defenses began when primitive nuclear weapons were invented on Old Terra on the eve of the Little Diaspora, by the “Raw Mental,” Einstein, who was working for House Washington. When Einstein succeeded in his attempts to construct these weapons, two of the first were used to settle a trade dispute with House Nippon. These weapons were of such a primitive nature that fewer than a million casualties were caused by the explosions — but one must remember that the entire empire at this time had only three billion subjects, all on one planet. The demonstration, though unremarkable by later standards, served two purposes: the destruction of two small cities and the threat of the destruction of others forced House Nippon to concede the lucrative Pacific trade routes to House Washington; and possession of the Empire’s only atomic weapons gave House Washington the prestige and power it needed to displace House Windsor.

Different Traits at Different Ages

Tuesday, March 19th, 2013

We think of people as having traits, Peter Turchin says, when we need to realize that people have different traits at different ages:

Because abilities to do something at the age of 10, 30, 50, etc. are separate (even if correlated) traits, they evolve relatively independently of each other. When grains became a large part of the diet, the ability of children to digest them (and detoxify the chemical compounds plants put into seeds to protect them against predators such as us) became critical. If you don’t have genes to help you deal with this new diet, you don’t survive to adulthood and don’t leave descendants. In other words, evolution worked very hard to adapt the young to the new diet. On the other hand, the intensity of selection on the old (e.g., 55 years old) was much less – in large part because most people did not live to the age of 55 until very recently. Additionally, once an animal gets past its reproductive age, evolution largely ceases to have an effect (in humans, the presence of older individuals was somewhat important for the survival of their genes in their children and grandchildren, so evolution did not entirely cease, but it was greatly slowed down).

What this means is that evolution caused rapid proliferation of genes that enabled children and young adults to easily digest novel foods and detoxify whatever harmful substances were in them. Genes and gene combinations that did the same for older people also increased, but at a much, much slower rate. This may sound puzzling – if we have the detoxifying genes that work for young adults, why shouldn’t they work for older adults? The reason is that one gene-one action model is wrong; it’s not how our bodies work. Most functions are regulated not by a single gene, but by whole networks of them. As we age, some genes come on, and others go off, and the network changes, often in very subtle and nonlinear ways. That’s why we need the ‘trick’ with which I started, to consider functions at different ages as separate traits. During the last 10,000 years evolution worked very hard to optimize the gene network operating during earlier ages to deal with novel foods. But the gene network during later ages was under much less selection to become optimized in this way.

The striking conclusion from this argument is that older people, even those coming from populations that have practiced agriculture for millennia, may suffer adverse health effects from the agricultural diet, despite having no problems when they were younger.

Beguiled by Europe

Monday, March 18th, 2013

We are building in Europe not a United States, but a Yugoslavia, Theodore Dalrymple says:

We shall be lucky to escape violence when it breaks apart.

I passed over the fact that Europe is, so far, the consequence of peace, and not its cause; that multilateral agreements between countries have always been possible without the erection of giant and corrupt bureaucratic apparatuses that weigh like a peine forte et dure on most Western European economies; that the maintenance of peace does not require or depend upon regulating the size of bananas sold in the marketplace; and that the notion that were it not for the European Union, there would be war, is inherently Germanophobic — because no one believes, for instance, that Estonia would otherwise attack Slovenia, or Portugal Slovakia.

It always seems strange to me that in Belgium, of all countries, people should be unable to see the European Union’s dangers. After all, the country is composed of only two main national communities — the French-speaking Walloons and the Dutch-speaking Flemish — and the division between the two is now sharper than at any previous time, to such an extent that the country recently had no government for more than 500 days. (Honesty compels me to admit that Belgium seems to have come to no great harm during that period.) No one in Belgium explains, or even asks, why what has not proved possible for 189 years — full national integration of just two groups sharing so much historical experience and a tiny fragment of territory — should be achievable on a vastly larger scale with innumerable national groups, many of which have deeply ingrained and derogatory stereotypes of one another.

I also pointed out that “Europe” lacks almost all political legitimacy, which will make it impossible to resolve real and growing differences. The results of the subsequent Italian general election — wherein two anti-European demagogues collected between them more than half of the votes — would seem to confirm my prognostication. Anti-German feeling runs high in Italy, and not only there. Matters weren’t much improved by the insensitive remarks of the German minister of labor in a recent edition of Der Spiegel, to the effect that the ongoing economic crisis is lucky for Germany because, with high youth unemployment elsewhere on the continent — 50 percent in Spain, for example — young people, especially the best-qualified, will increasingly seek jobs in Germany. “And that,” she said, “will rejuvenate the country, making it more creative and international.” In other words, the continent’s high unemployment is the solution to Germany’s demographic decline.

America’s New Mandarins

Monday, March 18th, 2013

Megan McArdle discusses America’s new Mandarins:

The Chinese imperial bureaucracy was immensely powerful. Entrance was theoretically open to anyone, from any walk of society — as long as they could pass a very tough examination. The number of passes was tightly restricted to keep the bureaucracy at optimal size.

Passing the tests and becoming a “scholar official” was a ticket to a very good, very secure life. And there is something to like about a system like this … especially if you happen to be good at exams. Of course, once you gave the imperial bureaucracy a lot of power, and made entrance into said bureaucracy conditional on passing a tough exam, what you have is … a country run by people who think that being good at exams is the most important thing on earth. Sound familiar?

The people who pass these sorts of admissions tests are very clever. But they’re also, as time goes on, increasingly narrow. The way to pass a series of highly competitive exams is to focus every fiber of your being on learning what the authorities want, and giving it to them. To the extent that the “Tiger Mom” phenomenon is actually real, it’s arguably the cultural legacy of the Mandarin system.


Almost none of the kids I meet in Washington these days even had boring menial high-school jobs working in a drugstore or waiting tables; they were doing “enriching” internships or academic programs. And thus the separation of the mandarin class grows ever more complete.


And like all elites, they believe that they not only rule because they can, but because they should. Even many quite left-wing folks do not fundamentally question the idea that the world should be run by highly verbal people who test well and turn their work in on time. They may think that machine operators should have more power and money in the workplace, and salesmen and accountants should have less. But if they think there’s anything wrong with the balance of power in the system we all live under, it is that clever mandarins do not have enough power to bend that system to their will. For the good of everyone else, of course. Not that they spend much time with everyone else, but they have excellent imaginations.

The Z-Curve of Human Egalitarianism

Monday, March 18th, 2013

Peter Turchin sees human egalitarianism following a complex Z-curve, zigging to greater inequality during the pre-axial period, and then zagging back toward equality in the last few thousand years:

The starting point for approaching this question is what is sometimes called the ‘U-shaped curve of despotism’ in human evolution. We know that our closest relatives, the chimps and gorillas, live in fairly ‘despotic’, or inegalitarian, societies. The chimps, for example, establish linear dominance hierarchies, in which alpha males get better food and greater access to females. We don’t know for sure whether human ancestors also lived in similarly inegalitarian societies, but it seems likely.

In contrast, as was argued by Christopher Boehm in Hierarchy in the Forest: The Evolution of Egalitarian Behavior, human hunter-gatherers, who lived in small-scale societies before agriculture, were fiercely egalitarian. A high degree of equality does not simply happen because hunter-gatherers are poor and cannot accumulate much wealth (chimps also cannot accumulate wealth). No, equality requires active maintenance. People living in small-scale societies possess numerous norms and institutions designed to control ‘upstarts’ — those who attempt to set themselves up as alpha males so that they can gain control of an unfair share of resources (including females). The sanctions deployed against upstarts range from gossip and ridicule to ostracism and, ultimately, assassination.

Thus, until c.10,000 years ago, before agriculture was invented, the human evolutionary trend was that of increasing egalitarianism. The adoption of agriculture, however, enabled the rise of large-scale societies organized as states and empires with highly unequal distributions of power, wealth, and social status. In other words, the trend to greater equality reversed itself. What accounts for this U-turn? Why did humans allow inequality to develop?

The answer apparently is that the U-turn was a side effect of the transition from small-scale to large-scale societies. Small-scale societies of hunter-gatherers were integrated by face-to-face sociality. Such a diffuse, non-centralized social organization was well-suited to maintaining an egalitarian ethos. However, once the size of a cooperating group increases beyond 100–200 people, even gigantic human brains are overwhelmed by the demands of face-to-face sociality (this is the argument made by Robin Dunbar). Shifting from a diffuse, uncentralized social organization to a hierarchical one (a chain of command) allowed evolution to break through the upper limit on society size imposed by face-to-face sociality. A member of a hierarchically organized group needs to have face-to-face interactions with only a few individuals: a superior and several subordinates. Such links can connect everybody in a group of arbitrarily large size. The group size grows by adding additional hierarchical levels.
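The arithmetic behind that last point is easy to sketch. A minimal illustration, assuming a fixed “span of control” (the number of direct subordinates per superior — a parameter chosen here for the example, not a figure from the text):

```python
# Sketch: how hierarchy escapes the Dunbar limit on group size.
# Each member keeps face-to-face ties with only one superior and a
# fixed number of direct subordinates, yet the whole tree can grow
# far beyond the ~150-person limit on purely face-to-face groups.

def hierarchy_size(span, levels):
    """Total members of a command tree with the given span of control
    and number of hierarchical levels (level 0 = the single leader)."""
    return sum(span ** k for k in range(levels + 1))

# With a span of 5, each added level multiplies reach roughly fivefold:
for levels in range(1, 6):
    print(levels, hierarchy_size(5, levels))
```

With five subordinates per superior, three levels below the leader already yield 156 people — past the Dunbar range — and five levels yield nearly 4,000, even though no individual maintains more than six face-to-face relationships.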

So far so good, but the great downside of hierarchical organization is that it inevitably leads to inequality. Once you allow a leader to order everybody around, he will use that power to feather his nest. This is sometimes known as the iron law of oligarchy.

I have argued elsewhere that conditions of endemic warfare between human groups create enormous selection pressures for larger group size (“God is on the side of big battalions”) and for effective (which means centralized) military organizations. Under such conditions, emergence of centralized military hierarchies becomes virtually inevitable. The result is the rise of increasingly complex centralized societies — chiefdoms, complex chiefdoms, and archaic states.

As Bellah notes, archaic states were characterized by an enormous fusion of power in the person of the ruler. Almost invariably the rulers of such states were ‘divinized’, that is, considered to be gods as well as kings. They had literally the power of life and death over their subjects. One frequent characteristic of early centralized societies was the practice of massive human sacrifice. This naked pursuit of power and voracious appetite for consuming resources is reflected in such characterizations of rulers as a land shark who ‘eats’ the island (in Hawaii), or a big rat that gobbles the people’s millet (in archaic China).

Thus, although highly effective on the battlefield, a centralized military hierarchy has several drawbacks as a general way of organizing societies. A society cannot really be held together by force alone. Worse, great inequities resulting from rapacious military chiefs and their retinues alienate large segments of the population. As a result, early despotic chiefdoms and archaic states were very fragile and frequently did not outlast their founders.

The tension between the human preference for equitable outcomes and the need for centralized hierarchy brought about the “legitimation crisis of the early state” (this idea was borrowed by Bellah from Jürgen Habermas). The tension became particularly acute during the Axial Age (c.800–200 BCE), for reasons discussed in my review of Bellah’s book and other publications. One central argument in Bellah’s book is that the new world religions and philosophies that arose during the Axial Age began the long job of building more equitable societies. A large part of this evolution was imposing limits on the power of rulers and replacing power based on naked force with legitimate authority.