Growth has to come from within

Thursday, August 3rd, 2023

One of the interesting lines of evidence about the importance of human capital in wealth disparities, Emil O. W. Kirkegaard notes, comes from examples of how groups can recover from setbacks:

The most obvious examples are the losers of World War 2, that is, Japan, Germany, and Italy. Japan and Germany suffered extreme damage to their infrastructure in the later stages of the war but were quickly able to recover.

[…]

Some of this research has been done using slaveholders in the former Confederacy, who lost the US civil war (1861-65). These were wealthy people who owned slaves, but lost this wealth. If they were wealthy for human-capital reasons, we would expect them to regain some fraction of it, and their children to markedly catch up towards their long-term trend.

[…]

For people who had a lot of wealth from slave owning in 1860, their wealth was still reduced in 1870. However, their sons had actually caught back up by 1900. Likewise with the grandsons.

[…]

In his book The Son Also Rises economic historian Greg Clark tracked elite surnames to gauge social mobility across many countries. One of these was China, where the situation was dire.

[…]

However, modern data show that these elite surnames are still over-represented among the current elite, despite the efforts of the communists to eradicate their advantages.

[…]

Here some will object that the recovery of the former Axis powers was due to US subsidies (the Marshall Plan in Europe). The problem with this idea is that wealthy countries have also tried the same kind of growth program in other regions of the world, and it has never worked very well.

[…]

Growth has to come from within.

His most disastrous error was to go into the Soviet Union as a conqueror instead of a liberator

Wednesday, August 2nd, 2023

The advice to attack an enemy’s weak points goes back at least to Sun Tzu in the fifth century B.C., Bevin Alexander explains (in How Hitler Could Have Won World War II), but it is extraordinarily difficult for human beings to follow:

Attacking Russia head-on was wrong to begin with, because it guaranteed the greatest resistance, not the least. A direct attack also forces an enemy back on his reserves and supplies, while it constantly lengthens the supply and reinforcement lines of the attacker. The better strategy is to separate the enemy from his supplies and reserves. That is why an attack on the flank is more likely to be successful.

Nevertheless Hitler could still have won if he had struck at the Soviet Union’s weakness, instead of its strength.

His most disastrous error was to go into the Soviet Union as a conqueror instead of a liberator. The Soviet people had suffered enormously at the hands of the Communist autocracy for two decades. Millions died when the Reds forced people off their land to create collective farms. Millions more were obliged to move great distances and work long hours under terrible conditions in factories and construction projects. The secret police punished any resistance with death or transportation to horrible prison gulags in Siberia. In the gruesome purges of the 1930s, Joseph Stalin had systematically killed all leaders and all military officers who, in his paranoid mind, posed the slightest threat to his dictatorship. Life for the ordinary Russian was drab, full of exhausting work, and dangerous. At the same time, the Soviet Union was an empire ruling over a collection of subjugated peoples who were violently opposed to rule from the Kremlin.

Vast numbers of these people would have risen in rebellion if Hitler’s legions had entered with the promise of freedom and elimination of Soviet oppression. Had Hitler done this, the Soviet Union would have collapsed.

Such a policy would not have given Hitler his Lebensraum immediately. But once the Soviet Union had been shattered, he could have put into effect anything he wanted to with the pieces that remained.

Hitler followed precisely the opposite course of action.

Nearly half-a-million people were living within a 150-mile radius of the explosion

Tuesday, August 1st, 2023

The first atomic bomb test site — chosen in 1944 from a shortlist of eight candidates in California, Texas, New Mexico, and Colorado — had been selected, in part, for its supposed isolation:

Yet in reality, nearly half-a-million people were living within a 150-mile radius of the explosion, with some as close as 12 miles away. Many, if not most, of these civilians were still asleep when the bomb detonated just before dawn.

Several civilians nearby — stunned by the blast — later reported that they thought they were experiencing the end of the world. A local press report stated that the flash had been so bright that a blind girl in Socorro, New Mexico — about 100 miles from the bombing range — was able to see it, and asked: “What’s that?” In Ruidoso, New Mexico, a group of teenage campers were jolted out of their bunk beds onto their cabin floor. They ran outside, worried that a water heater had exploded. Barbara Kent, one of the campers, recently recalled in an interview with National Geographic that “[A]ll of a sudden, there was a big cloud overhead, and lights in the sky. It hurt our eyes. It was as if the sun came out tremendous. The whole sky turned strange.”

A few hours later, white flakes began to fall from the sky. The campers began to play in the flurry.

“We were grabbing the white flakes, and putting it all over ourselves, pressing it on our faces,” Kent said. “But the strange thing, instead of being cold like snow, it was hot. And we all thought, ‘Well, the reason it’s hot is because it’s summer.’ We were only thirteen; we didn’t know any better.”

One family in Oscuro, about 45 miles away from the site, hung wet bed sheets in their windows to keep the flakes from floating into the house. The strange substance continued to fall from the sky for days, coating everything: orchards, gardens, herds of livestock, cisterns, ponds, and rivers.

The biggest challenge of the coming decades might simply be maintaining the systems we have today

Tuesday, August 1st, 2023

Complex systems won’t survive the competence crisis, Harold Robertson argues:

In a span of fewer than six months in 2017, three U.S. Naval warships experienced three separate collisions resulting in 17 deaths. A year later, powerlines owned by PG&E started a wildfire that killed 85 people. The pipeline carrying almost half of the East Coast’s gasoline shut down due to a ransomware attack. Almost half a million intermodal containers sat on cargo ships unable to dock at Los Angeles ports. A train carrying thousands of tons of hazardous and flammable chemicals derailed near East Palestine, Ohio. Air Traffic Control cleared a FedEx plane to land on a runway occupied by a Southwest plane preparing to take off. Eye drops contaminated with antibiotic-resistant bacteria killed four and blinded fourteen.

[…]

The core issue is that changing political mores have established the systematic promotion of the unqualified and sidelining of the competent. This has continually weakened our society’s ability to manage modern systems. At its inception, it represented a break from the trend of the 1920s to the 1960s, when the direct meritocratic evaluation of competence became the norm across vast swaths of American society.

In the first decades of the twentieth century, the idea that individuals should be systematically evaluated and selected based on their ability rather than wealth, class, or political connections led to significant changes in selection techniques at all levels of American society. The Scholastic Aptitude Test (SAT) revolutionized college admissions by allowing elite universities to find and recruit talented students from beyond the boarding schools of New England. Following the adoption of the SAT, aptitude tests such as the Wonderlic (1936), Graduate Record Examination (1936), Army General Classification Test (1941), and Law School Admission Test (1948) swept the United States. Spurred on by the demands of two world wars, this system of institutional management electrified the Tennessee Valley, created the first atom bomb, invented the transistor, and put a man on the moon.

By the 1960s, the systematic selection for competence came into direct conflict with the political imperatives of the civil rights movement. During the period from 1961 to 1972, a series of Supreme Court rulings, executive orders, and laws—most critically, the Civil Rights Act of 1964—put meritocracy and the new political imperative of protected-group diversity on a collision course. Administrative law judges have accepted statistically observable disparities in outcomes between groups as prima facie evidence of illegal discrimination. The result has been clear: any time meritocracy and diversity come into direct conflict, diversity must take priority.

The resulting norms have steadily eroded institutional competency, causing America’s complex systems to fail with increasing regularity. In the language of a systems theorist, by decreasing the competency of the actors within the system, formerly stable systems have begun to experience normal accidents at a rate that is faster than the system can adapt. The prognosis is harsh but clear: either selection for competence will return or America will experience devolution to more primitive forms of civilization and loss of geopolitical power.

[…]

After the early 1970s, employers responded by shifting from directly testing for ability to using the next best thing: a degree from a highly-selective university. By pushing the selection challenge to the college admissions offices, selective employers did two things: they reduced their risk of lawsuits and they turned the U.S. college application process into a high-stakes war of all against all.

In 1984, Yale sociologist Charles Perrow’s Normal Accidents: Living With High-Risk Technologies explained that when systems are both complex and tightly coupled, catastrophic failures are unavoidable and cannot simply be designed around:

The biggest shortcoming of the theory is that it takes competency as a given. The idea that competent organizations can devolve to a level where the risk of normal accidents becomes unacceptably high is barely addressed. In other words, rather than being taken as absolutes, complexity and tightness should be understood to be relative to the functionality of the people and systems that are managing them. The U.S. has embraced a novel question: what happens when the men who built the complex systems our society relies on cease contributing and are replaced by people who were chosen for reasons other than competency?

The answer is clear: catastrophic normal accidents will happen with increasing regularity. While each failure is officially treated as a separate issue to be fixed with small patches, the reality is that the whole system is seeing failures at an accelerating rate, which will lead in turn to the failure of other systems. In the case of the Camp Fire that killed 85 people, PG&E fired its CEO, filed Chapter 11, and restructured. The system’s response has been to turn off the electricity and raise wildfire insurance premiums. Very little reflection has followed.

[…]

Americans living today are the inheritors of systems that created the highest standard of living in human history. Rather than protecting the competency that made those systems possible, the modern preference for diversity has attenuated meritocratic evaluation at all levels of American society. Given the damage already done to competence and morale combined with the natural exodus of baby boomers with decades worth of tacit knowledge, the biggest challenge of the coming decades might simply be maintaining the systems we have today.