North America inherited British government and British democracy

Wednesday, September 2nd, 2020

As a geographer, Jared Diamond has some thoughts on North America and Latin America:

In my undergraduate geography course, I have one session on North America and then a session on South America in which I discuss why North America is more successful economically. There are several factors involved.

One factor is that temperate zones, in general, are economically more successful than the tropics because of the higher productivity and soil fertility of temperate agriculture and because of the lighter public health burden outside the tropics. All of North America is a temperate zone. South America only has a small temperate zone. It’s in the far south, in Chile, Argentina and Uruguay. Those are the richest countries in Latin America. The richest part of Brazil also lies in the temperate zone.

The second factor is a historical one related to the sailing distance from Europe to the Americas. The sailing distance was shorter from Britain to North America. It was longer from Spain to Argentina and still longer from Spain around Cape Horn to Peru. A shorter sailing distance meant that the ideas and technology of the Industrial Revolution spread much more quickly from Britain, where it originated, to North America than from Spain to Latin America.

Still another factor is the legacy of Spanish government versus the legacy of British government. One could argue about why democracy developed in Britain rather than in Spain, but the fact is that it did, and so North America inherited British government and British democracy while Latin America inherited Spanish centralist government and absolutist politics.

Then still another factor is that independence for the U.S. was a more radical break than it was in South America. After the Revolutionary War, all the royalists in the U.S. either fled or were killed. So there was a relatively clean break from Britain. Canada did not have that break, and the break in Latin America was much less abrupt and came later.

Can the demise of democracy and free markets be far behind?

Saturday, August 15th, 2020

Arnold Kling foresees the Twilight of the Bourgeoisie and The Coming of Neo-Feudalism:

Overall, Kotkin’s thesis that the bourgeoisie is in decline is persuasive and disturbing. It is persuasive because the importance of education in social status is everywhere evident. In the 1950s, there were many corporate leaders who had only a high school education, and there were few with graduate degrees. Today, that is reversed.

The consolidation of economic power in Facebook, Apple, Amazon, Microsoft, and Google has been sudden and striking. It has confounded those of us who looked at the Internet revolution as a phenomenon that would empower smaller enterprises by decreasing the importance of physical capital.

The first wave of the Internet boom, in the late 1990s, was characterized by feverish entry and vigorous competition in the realms of Internet search, on-line shopping, and the hardware and software that consumers would use to access the World Wide Web. In contrast, the current tech boom seems to have entrenched the leaders in their respective positions.

Deirdre McCloskey has argued persuasively that bourgeois virtues raised the status of innovation and commerce, paving the way for our modern economic and political systems. If, as Kotkin argues, the status of the bourgeoisie is in the process of decline, can the demise of democracy and free markets be far behind?

Outside, the fresh air was worse

Sunday, July 5th, 2020

After VJ-Day, American soldiers wanted to go home, and Americans wanted them to come home. This left Colonel Jones in Korea in an awkward situation, as T. R. Fehrenbach explains in This Kind of War:

Colonel Jones received replacements, of course. He got officers from the Quartermaster Corps and the Infantry, and plenty of basic riflemen from the eighteen-year-olds just drafted, who didn’t have Skill One, even for basic riflemen. Engineers he didn’t get. Engineers, like most professional men, serve in the military only when the draft moves them.

With a Group HQ that didn’t know a crowbar from a wrecking iron, and who thought a balk was part of baseball, Colonel Jones, as part of “Blacklist Forty” (code name for Korea), reported to General Hodge in Korea.

[...]

These were days and weeks to break a career officer’s heart. The United States Army, which had been the most powerful in the world, did not melt away in an orderly fashion. It disintegrated into a disorganized mob, clamoring to go home.

[...]

Fortunately for Jones, the Jap soldiers in Korea waiting to be sent home were willing workers.

[...]

The Japs, now that the Greater East Asia Co-Prosperity Sphere was gone, were affable, smiling, professional, and entirely helpful. Jones put them to work.

[...]

Eventually, though, all the Japs had to be repatriated. They took with them, when they left, every military officer, every professional man, every engineer, bank teller, and executive in the Pusan area. They left behind a hell of a mess.

Like most Americans, Colonel Jones was not prepared to take Chosun. The appalling poverty, the dust, dirt, filth, and eternal clamor of Pusan repelled any man accustomed to the West. Orphan children, with running sores, lay in the streets. Society, with the iron Japanese hand gone, was in dissolution. Money was worthless, since the Japanese had printed billions of yen prior to the surrender and passed it out to all who wanted it. Almost all responsible Koreans, particularly the educated, were — rightly — tarred with the collaborationist brush.

[...]

He never got used to the stink. Inside the city, the odors were of decaying fish, woodsmoke, garbage, and unwashed humanity. Outside, the fresh air was worse. Koreans, like most Orientals, use human fertilizer. Their fields and paddies, their whole country smells somewhat like the bathroom of a fraternity house on Sunday morning.

[...]

Clothing washed in their rivers turns a sickly brown.

[...]

In Korea, there were no trained administrators for either government or business, regardless of their politics.

[...]

As an engineer, he became responsible for fire fighting in Pusan, and he noticed a great number of fires were breaking out. He asked a Korean fireman about this.

“Oh, it is the different factions, setting each other’s houses afire,” the Korean answered cheerfully.

He soon learned to use Korean guards for U.S. military stores. The Koreans were desperately poor, and would steal anything, even if nailed down — nails had commercial value — but American sentries would not willingly shoot down women and boys carrying off gas cans and water buckets. Not after they had killed two or three, anyway — they lost all heart for it. But Korean guards would shoot or beat hell out of the thieves, if they caught them.

[...]

The summers were hot and dusty, or hot and rainy, with hundred-degree temperatures. The winters were Siberian. The country literally stank, except for the few months during which the ground stayed frozen.

A quiet reservoir of economic strength is forming

Wednesday, June 17th, 2020

A quiet reservoir of economic strength is forming among households flush with cash, the Wall Street Journal reports, and it is reviving consumer spending:

The crisis caused by the coronavirus has pushed millions into unemployment and left them straining to get by. But many consumers in the U.S. and Europe who have held on to their jobs or are getting government benefits have seen their bank accounts swell during lockdowns, according to government data, because of restrictions on shopping and big-spending activities such as tourism.

Consumers with means are driving surprising strength in a number of sectors. People are flocking to home-improvement stores and car dealerships. They want to install pools in their backyards and Jacuzzis in their bathrooms. Spending on furniture has jumped. So have sales of fitness and sports equipment.

And with vacations and summer camps canceled and pool memberships on hold, families are looking for other ways to entertain themselves this summer.

Once you grasp its lessons, you can never again be a normal citizen

Tuesday, June 2nd, 2020

Labor economics stands against the world, Bryan Caplan says:

Once you grasp its lessons, you can never again be a normal citizen.

What are these “central tenets of our secular religion” and what’s wrong with them?

Tenet #1: The main reason today’s workers have a decent standard of living is that government passed a bunch of laws protecting them.

Critique: High worker productivity plus competition between employers is the real reason today’s workers have a decent standard of living. In fact, “pro-worker” laws have dire negative side effects for workers, especially unemployment.

Tenet #2: Strict regulation of immigration, especially low-skilled immigration, prevents poverty and inequality.

Critique: Immigration restrictions massively increase the poverty and inequality of the world — and make the average American poorer in the process. Specialization and trade are fountains of wealth, and immigration is just specialization and trade in labor.

Tenet #3: In the modern economy, nothing is more important than education.

Critique: After making obvious corrections for pre-existing ability, completion probability, and such, the return to education is pretty good for strong students, but mediocre or worse for weak students.

Tenet #4: The modern welfare state strikes a wise balance between compassion and efficiency.

Critique: The welfare state primarily helps the old, not the poor — and 19th-century open immigration did far more for the absolutely poor than the welfare state ever has.

Tenet #5: Increasing education levels is good for society.

Critique: Education is mostly signaling; increasing education is a recipe for credential inflation, not prosperity.

Tenet #6: Racial and gender discrimination remains a serious problem, and without government regulation, would still be rampant.

Critique: Unless government requires discrimination, market forces make it a marginal issue at most. Large group differences persist because groups differ largely in productivity.

Tenet #7: Men have treated women poorly throughout history, and it’s only thanks to feminism that anything’s improved.

Critique: While women in the pre-modern era lived hard lives, so did men. The mating market led to poor outcomes for women because men had very little to offer. Economic growth plus competition in labor and mating markets, not feminism, is the main reason women’s lives improved.

Tenet #8: Overpopulation is a terrible social problem.

Critique: The positive externalities of population — especially idea externalities — far outweigh the negative. Reducing population to help the environment is using a sword to kill a mosquito.

Yes, I’m well-aware that most labor economics classes either neglect these points, or strive for “balance.” But as far as I’m concerned, most labor economists just aren’t doing their job. Their lingering faith in our society’s secular religion clouds their judgment — and prevents them from enlightening their students and laying the groundwork for a better future.

Similar to the dark months after Pearl Harbor

Thursday, May 14th, 2020

World War II offers valuable lessons for the current moment, but when many people picture the World War II economy, they’re thinking about how it operated by 1944 and 1945, when early problems had been solved and war production was at its peak:

By then, industries large and small had joined the war effort: Washing machine manufacturers made artillery shells. Vacuum cleaner companies made bomb fuses. Tanks, airplanes, and anti-aircraft guns rolled off assembly lines that had once produced automobiles. American industry produced more than 96,000 planes in 1944 alone — a 26-fold increase over the 3,611 airplanes produced in 1940. An official military history credits American war production in its heyday with “virtually determining the outcome of the war.”

The current state of the coronavirus pandemic, though, is far more similar to the dark months after Pearl Harbor, when US leaders faced the daunting task of transforming the US economy virtually overnight, than it is to those triumphant final years.

Then as now, every day mattered. In the first months of 1942, top US officials feared that due to lack of equipment, America might lose the war before it got a chance to start fighting it. Their primary goal was transforming the economy as fast as possible.

[...]

Their experience still has lessons for policymakers today. Here are five of them.

1) Centralize and coordinate the government’s purchases of medical equipment, including personal protective gear

Without effective coordination, states and the federal government have entered bidding wars for desperately needed medical equipment. Shipments to states have been confiscated, prompting elaborate schemes like Massachusetts Gov. Charlie Baker’s efforts to get 1 million N95 masks delivered to Massachusetts. Chaos reigns as hospitals try to sort through the confusion of disrupted supply chains. President Trump insists that the federal government is “not a shipping clerk,” but in fact such coordination is precisely the federal government’s role.

The US faced a similar problem during World War I, when purchasing was decentralized. Different branches of the military, including numerous departments within the Army, competed with each other in bidding for contracts. This led to production delays and increased prices for critical supplies.

In World War II, Franklin Delano Roosevelt created the War Production Board. Decisions about what equipment was needed were made by the military, but the board oversaw and coordinated all war production. Its initial role was to get production going in sufficient, previously unthinkable, quantities and to arrange new supply chains to ensure materials ended up in the right hands.

For relatively simple production orders, the board publicized production requirements for the goods it needed and facilitated matching products with interested firms. The more complex and difficult orders were sent to the large, established firms with the greatest expertise in relevant production processes.

But the board’s role did not diminish once production got going. Rather, its focus changed to ensuring that scarce resources were being allocated optimally. Since it takes time for suppliers to expand production to meet demand, ramping up war production so quickly led to short-run scarcity.

The War Production Board was subject to both extensive public scrutiny and congressional oversight from the Truman Committee. Its appeals board heard complaints from business and labor leaders, members of Congress, and state and local politicians. Because requirements were determined by the military, procurement decisions were largely apolitical. Researchers Paul Rhode, James Snyder Jr., and Koleman Strumpf found no evidence that World War II contract placement was systematically biased by political factors.

[...]

2) Repurpose existing institutions and take advantage of existing expertise

After Pearl Harbor, policymakers faced the need to transform the economy at a rapid pace. American policymakers feared the war could be lost before it had fully begun, so speed was paramount. One key element of the transition to a wartime economy was policymakers’ decision to transform existing institutions rather than create entirely new ones.

War Production Board Chair Donald Nelson left purchasing and procurement decisions in the hands of the armed forces, using the board to manage and coordinate. This was one of his most controversial decisions, but it was the right choice — at least for the initial phase of the war — for two reasons.

First, in 1942 as in 2020, every day mattered. Keeping purchasing and procurement in the hands of the agencies that had previously made these decisions saved precious time and allowed production to ramp up faster.

Second, only trained military officers had the expertise needed to evaluate whether specialized products such as airplanes, tanks, and radar met quality standards and fulfilled military needs. Nelson recognized that a civilian agency could not match the military’s expertise in determining such technical details.

Depression-era unemployment offices were also repurposed for the war. As unemployment fell sharply in the early 1940s, the US Employment Service pivoted from coordinating services for the unemployed to matching workers to war production jobs, helping employers find replacements for workers entering the military.

[...]

3) Availability of materials is a key constraint

During World War II, strategic materials, not labor or manufacturing capacity, proved to be the binding constraint on US wartime production.

That is likely to be just as true today. Constraints on manufacturing capacity are orders of magnitude less severe now than in WWII. More than $100 billion of military contracts were placed in the first six months of 1942, compared to $20 billion in defense contracts over all of 1941 and a 1941 GDP of $129 billion. Production capacity initially fell far short of what was needed for the war effort, even with extensive conversion of civilian manufacturing capacity. Today the US needs vast increases in the production of medical equipment, particularly ventilators, personal protective equipment, and test kits, but the total volume of equipment needed is significantly less than a full year’s GDP.

[...]

4) The crisis itself creates strong incentives for manufacturing firms to produce critical equipment

The US did not nationalize major industries to achieve its World War II production miracle. US war production relied primarily on manufacturing by private firms, as the war aligned manufacturing firms’ incentives with those of the nation.

The Defense Production Act is a good mechanism for mobilizing industry — indeed, it was written when the experience of WWII was recent memory — and should be used aggressively as needed.

But with clear and effective federal leadership, its necessary application may be narrow. There are other ways to push industry to produce needed supplies. A government guarantee to buy all medical equipment meeting stated specifications and produced by specified dates at a set price, combined with the incentives provided by the crisis itself, would provide enough incentive for most firms. Voluntary agreements authorized under the DPA would allow firms to cooperate effectively and scale production faster, mimicking the inter-firm cooperation that defined the home front during World War II.

A number of private firms are already converting their production lines to key equipment, from small distilleries making hand sanitizer to Ford Motor Company’s production of ventilators, even in the absence of clear leadership and communication from the federal government.

In WWII, most US firms faced a choice between sitting idle — a home appliance producer cannot produce appliances if it cannot acquire the metal needed to make its products — and participating in war work. The government’s control of raw materials created the incentives for firms to convert voluntarily: Firms that volunteered for war production were able to acquire inputs, while other firms were not.

There was also an overarching incentive for war production: The sooner firms produced the needed materials, the faster the war could be won, and the sooner everyone could get back to real life. That same overarching incentive exists today, and it is powerful.

[...]

5) The evidence supports a strategy of relief now and stimulus after the pandemic

[...]

My research found that the fiscal multiplier in WWII was much smaller than the typical multiplier because the savings rate was so high during the war. Many products, particularly durable goods, were not available for purchase during WWII because they were not produced at all. Consumer spending rebounded strongly after the war ended, particularly on goods, such as cars and appliances, that were not available during the war.

The experience of WWII suggests that when consumption options are significantly restricted, people may spend a smaller share of income than in other times. Specifically, the closest substitute for buying a particular good now is buying that good in the future, when it is available again, rather than buying another good. The extreme uncertainty of the current situation may also depress the multiplier, since people will delay making decisions and larger purchases.
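
To see the mechanism concretely, consider the textbook Keynesian spending multiplier (a back-of-the-envelope illustration only, not the estimation approach used in the research quoted above):

\[
k = \frac{1}{1 - \mathrm{MPC}} = \frac{1}{s}
\]

where MPC is the marginal propensity to consume and s = 1 - MPC is the marginal saving rate. With purely hypothetical numbers, a saving rate of 0.10 implies a multiplier of 10, while a wartime or lockdown saving rate of 0.25 cuts it to 4, which is the direction of the effect described above.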

Today, significant sectors of the US economy have ground to a halt, particularly the travel, arts, and restaurant industries. As in WWII, the ordinary lives of millions of Americans have been abruptly transformed. Significant portions of people’s regular consumption baskets are unavailable, even though no formal rationing has been enacted. So, as in WWII, the multiplier on relief spending may be lower than in a “normal” recession.

The evidence from World War II strongly backs up the paradigm that policy should focus on relief now and stimulus later. Targeting relief funds may help increase the multiplier to the extent that most relief funds are used to buy basic necessities. People who lose all or most of their income in this pandemic recession will be more likely to spend on necessities rather than saving, which would increase the fiscal multiplier. However, perfect targeting may be difficult to achieve quickly.

Further evidence from late in the Great Depression suggests that fiscal stimulus may be particularly effective after a long period of downturn, as it can support pent-up demand. This suggests that policymakers should focus on relief for as long as the pandemic continues, including with further rounds of such relief as needed, but then be sure to follow relief with broad-based stimulus to help the economy rebound.

White manual workers once expected that the American Dream would come true for them

Tuesday, May 12th, 2020

In 2015 life expectancy began falling for the first time since the height of the AIDS crisis in 1993:

The causes — mainly suicides, alcohol-related deaths, and drug overdoses — claim roughly 190,000 lives each year.

The casualties are concentrated in the rusted-out factory towns and depressed rural areas left behind by globalization, automation, and downsizing, but as the economists Anne Case and Angus Deaton demonstrate in their new book, Deaths of Despair and the Future of Capitalism, they are also rampant in large cities. Those most vulnerable are distinguished not by where they live but by their race and level of education. Virtually the entire increase in mortality has been among white adults without bachelor’s degrees — some 70 percent of all whites. Blacks, Hispanics, college-educated whites, and Europeans also succumb to suicide, drug overdoses, and alcohol-related deaths, but at much lower rates that have risen little, if at all, over time.

The disparity is most stark in middle age. Since the early 1990s, the death rate for forty-five-to-fifty-four-year-old white Americans with a BA has fallen by 40 percent, but has risen by 25 percent for those without a BA. Although middle-aged blacks are still more likely to die than middle-aged whites, their mortality has also fallen by more than 30 percent since the early 1990s. Similar declines occurred among middle-aged French, Swedish, and British people over the same period.

[...]

White manual workers once expected that the American Dream would come true for them. In Katherine Newman’s remarkably prescient study of downsizing, Falling from Grace (1988), older people recalled that Elizabeth, New Jersey — where 18 percent of residents now live in poverty — was once a “place of grandeur, where ladies and gentlemen in fine dress promenaded down the main avenue on Sunday.” The Singer Sewing Machine company employed over 10,000 workers, roughly a tenth of the city’s population. The company awarded scholarships to children, sponsored baseball games, and hosted dances and bar mitzvahs in its recreation hall. Each sewing machine had a label, and if returned with a defect, the man who’d made it would fix it himself.

The last American Singer plant closed decades ago, along with thousands of other factories. There were 19.5 million decently paying US manufacturing jobs in 1979, compared to around 12 million today, when the population is almost 50 percent larger.

Apply several millennia of compound interest to see what happens next

Tuesday, April 21st, 2020

The far future might be Post-Malthusian or Neo-Malthusian:

On a long timeframe, there are three coherent views of where history is going: we might escape Malthus forever, and our wealth and happiness compounds ever faster above subsistence; we might be locked around new Malthusian barriers, with higher low-hanging fruit that’s all been picked nonetheless; or history might end. You can write a story about the end of the world, but you can’t make it a franchise: either the world ends or it doesn’t, so eventually you have to stop writing.

The two fictional universes that best exemplify the two visions of the future are Warhammer 40,000 and the Culture series. Like all far-future science fiction, they both start in the present, pick a few technological and social trends, and apply several millennia of compound interest to see what happens next.

In the Culture novels, improvements in physical and software technology reach the point that all essential work can be done by robots, whether they’re hyperadvanced Roombas, tiny Predator Drones, or superhumanly smart ship-based Minds. There’s no need for laws or conflict; when everything is free, there’s nothing to fight over. The Culture has conflicts with other societies, but given their immense productive capacity, victory is inevitable. In The Player of Games, for example, The Culture wants to absorb the empire of Azad in order to treat the Azadians more gently than the emperor does. They could overwhelm it militarily, but they think that’s less elegant than subverting the empire from within.

The world of Warhammer 40,000 is… not that. It’s grim. It’s dark. It’s so much of both that the term grimdark was coined to describe it. W40K’s universe, like that of the Culture, is superabundant, but only in suffering and terror. The moral center of the Culture is the Minds, which give humans diverting and amusing tasks. The moral center of W40K is the God-Emperor, who is slowly dying over millennia, kept alive by life support and human sacrifice. Technology exists, but science has been forgotten; their engineers are just an elaborate cargo cult. Fermi estimates of the size of the empire range from trillions to quadrillions of people, but their enemies are tangible manifestations of abstract forces like War, Disease, and Excess. The plot of every Warhammer story is a bleak, bloody retreat ahead of an inevitable loss.

The body counts in these universes vary wildly. In one Culture story, the main character has been away from his homeworld for years and years. He asks for recent news, and learns that the most shocking event of the last few years was an accident in which two people died. In W40K stories, million-casualty terrorist attacks are background noise, and the heroes tend to commit murder about as frequently and casually as the average person checks Instagram.

[...]

To some extent, you can explain the different traits of these universes by the intents of their authors: Iain Banks wanted to imagine what a socialist utopia would be like; Games Workshop wants to produce novels that encourage people to buy pricey game figurines.

But you can also run them through a theoretical lens: in the very, very long-term, do we live in a post-Malthusian world, or a neo-Malthusian one? It’s a topic I’ve explored in the past. Literal Malthusian math no longer applies; we’re not constrained by arable farmland any more. But meta-Malthusianisms are everywhere. As it turns out, when people don’t spend every waking hour eking out an existence in grinding poverty, they find other things to do.

[...]

At the Malthusian limit, the value of a human life rounds down to zero. If you’re either starving, worried about starving, or fighting a war that’s ultimately driven by resource limitations, your ratio of QALYs to lifespan takes a dive. In the other direction, if you’re a hedonic utilitarian — and Iain Banks was an atheist utopian socialist, which tends to eliminate everything but pleasure from the telos menu — then prosperity ratchets up the value of a human life to unfathomable proportions. Banks is making a reasonable extrapolation here; as countries get richer, more of their incremental wealth gets spent on healthcare despite severe diminishing marginal returns.

This variance in values is reflected throughout both books. In the Culture novels, characters are constantly changing their appearance, job, and gender. In Warhammer, too, characters change their appearance — a conceit of the stories is that extended contact with evil causes physical mutations. Intra-Culture conflict is rare and polite (characters argue, even with Minds, but those arguments all have the tone of a loving parent telling a 19-year-old to choose a less practical, more fulfilling college major). In Warhammer, the conflict is constant; the nominal good guys are antiheroes at best, who profess a code of authoritarian values (duty, sacrifice, religious fanaticism, xenophobia) but also hypocritically fail to live up to it. Interestingly, both series tend to have protagonists who are fundamentally loyal to their society, but for opposite reasons: characters are loyal to the Culture because it gives them anything they could possibly want, which makes it the best place it could possibly be. It’s entirely conditional loyalty. In the universe of W40K, loyalty is expected, and unconditional; the reward for intense loyalty is dying in a more interesting way.

Oddly enough, even though Warhammer 40,000 reads as simplistic and the Culture as sophisticated, W40K is the more introspective of the two. Iain Banks was a nice left-wing guy who liked the idea of technology making the world a better place. I don’t know how every Warhammer writer feels, but the whole thing was originally meant as a parody of dark and gritty science fiction. It just turned out that if you took the most extreme parts of that genre, and 10xed them, you got something people absolutely loved. So Banks has a love-love relationship with his creations; he only wants his universe to have conflict so his characters will have something to do. The W40K writers may absolutely loathe their protagonists, and take immense satisfaction in their gory deaths and moral corruption. The way this plays out is that Banks will give his villains halfhearted justifications for going to war against the Culture, whom Banks thinks of as a bunch of fundamentally very nice people who just want to invite every sapient being to their interplanetary orgy. Meanwhile W40K villains make some pretty good points about how crummy it would be to live in a galaxy-spanning police state with widespread misery, zero respect for human rights, and demons.

Both universes are fictional. Moreover, they’re genre fiction, and most of W40K’s literary output qualifies as pulp. They’re good intuition pumps, though. If you extrapolate from the present and don’t get to the apocalypse, one of them is directionally true. Either technology and society improve in a self-reinforcing way, until we reach a future state that rounds up to utopia, or the post-Malthusian period that started around 1800 will end some day. We’ll lose something — social technology, natural resources, all electrical devices — and find that we’re so far beyond our newly-lowered carrying capacity that we simply can’t recover. History doesn’t end, except in apocalypse. Civilization is either a divergent function or a convergent one; it’s either compound interest or a Martingale bet. If you’re building the future, it’s good to pause, think of which part of the function has the biggest exponent, and see what another few millennia of compounding will do.
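
To put the closing metaphor in symbols (my own minimal sketch, not the essayist’s math): a post-Malthusian civilization behaves like compound interest, while a neo-Malthusian one behaves like a process pinned to a carrying capacity.

\[
x_{t+1} = (1 + r)\,x_t \quad\text{versus}\quad x_{t+1} = x_t + r\,x_t\left(1 - \frac{x_t}{K}\right)
\]

The first rule compounds without bound for any positive growth rate r; the second, a logistic (Malthusian) rule, converges to the carrying capacity K for small r and any starting point below it. Which of the two better describes the function we are living inside is exactly the question the essay leaves open.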

Is the 1918 influenza pandemic over?

Monday, April 20th, 2020

The sudden nature of the “Spanish” flu pandemic meant that children born just months apart experienced very different conditions in utero:

In particular, children born in 1919 were much more exposed to influenza in utero than children born in 1918 or 1920. This sharp differential in exposure to the 1918 flu lets Douglas Almond test for long-term effects in Is the 1918 Influenza Pandemic Over?

Almond finds large effects many decades after exposure.

Fetal health is found to affect nearly every socioeconomic outcome recorded in the 1960, 1970, and 1980 Censuses. Men and women show large and discontinuous reductions in educational attainment if they had been in utero during the pandemic. The children of infected mothers were up to 15 percent less likely to graduate from high school. Wages of men were 5–9 percent lower because of infection. Socioeconomic status…was substantially reduced, and the likelihood of being poor rose as much as 15 percent compared with other cohorts. Public entitlement spending was also increased.

The coronavirus economy lives in suspense, not free fall

Wednesday, April 1st, 2020

The economy today lives in suspense, not free-fall, Vernon L. Smith suggests:

Not all markets, however, are born equal. Laboratory experiments for goods that cannot be re-traded converge easily to their predicted supply-and-demand equilibrium under conditions of strictly private dispersed information. Their counterparts in the economy, markets for nondurable consumer goods, are a rock of stability. Moreover, these markets are very large, constituting 75% of private product (gross domestic product less government expenditures).

In sharp contrast, laboratory studies of asset markets persistently yield price bubbles in environments with perfect information on fundamental value. Moreover, experiments prove that this propensity to bubble is precisely and only because the items are re-tradable. These studies helped us to understand why all market economic instability arises from durable goods markets, especially housing-mortgage markets, as we have seen in the Great Recession and in the Depression when house prices fell against mortgage debt, plunging households into negative equity. Homeowners, living in houses worth less than what they owe the bank, do not feel buoyantly prosperous. The experiments also helped us to understand why security markets are so volatile, but are not a fundamental source of instability, like housing, because securities market loans are short-term callable loans, investor balance sheets are marked steadily to market as prices decline, and there is no build-up of negative equity to dampen long-term expectations.

I believe the economy today lives in suspense, not free-fall. The pandemic will pass; public health institutions have been a model of forthright dissemination of information on the spread of this disease and sanitary procedures to minimize its impact. It’s the citizenry that has been unruly for a time. Supply chains will refill and stabilize quickly as the pandemic passes; securities markets will recover, and growth will continue to reduce poverty everywhere. Homes are more valuable than ever as a haven of safe and secure living. Provided that we continue to buy them with some of our own money, homes will be part of a secure future.

“With dread” is the only sensible answer

Tuesday, March 17th, 2020

If you’re a socialist, you have to be concerned that so many socialists before you defended totalitarian regimes as they committed atrocities, but you might say that the best socialists spoke out:

A reasonable position. I don’t want my views judged by the quality of the typical person who shares my label, either.

Still, this raises a weighty question: How should the best socialists react when they discover that a new socialist experiment is about to start? “With dread” is the only sensible answer. After all, the best socialists don’t merely know the horrifying history of the Soviet Union and Maoist China. The best socialists also know the psychotic sociology of the typical socialist, who savors the revolutionary “honeymoon” until the horror becomes too blatant to deny.

If dread is the sensible reaction to the latest socialist experiment, then how should the best socialists react to any earnest proposal for a new socialist experiment? It’s complicated. The proposal stage is the perfect time to avoid the errors of the past – to finally do socialism right. Yet this hope must still be heavily laced with dread. After all, socialists have repeatedly tried to learn from the disasters of earlier socialist regimes. When they gained power, disaster still followed.

That’s Bryan Caplan, by the way, and he continues:

At this point, it’s tempting to shift blame to the non-socialist world. Without American-led ostracism, perhaps Cuba would be a fine country today. Or consider Chomsky’s view that the U.S. really won the Vietnam War:

The United States went to war in Vietnam for a very good reason. They were afraid Vietnam would be a successful model of independent development and that would have a virus effect – infect others who might try to follow the same course. There was a very simple war aim – destroy Vietnam. And they did it.

If Chomsky is right about U.S. foreign policy, however, the best socialists should feel even less hope and even more dread. Even if the next generation of socialists finally manages to durably build socialism with a human face, the U.S. will probably strangle it.

Personally, I’m the furthest thing from a socialist. If I were a socialist, though, I would be the world’s most cautious socialist. Socialist experiments don’t merely have a bad track record; socialist self-criticism has a bad track record.

Plunder the bookshelves

Friday, February 28th, 2020

Laura Spinney reviews a number of books about societal collapse for Nature:

The newest is Before the Collapse. In it, energy specialist Ugo Bardi urges us not to resist collapse, which is how the Universe tries “to get rid of the old to make space for the new”.

Similarly, Diamond’s 2019 book Upheaval suggested that a collapse is an opportunity for self-appraisal, after which a society can use its ingenuity to find solutions.

[...]

Questioning Collapse, a 2009 collection of essays edited by archaeologists Patricia McAnany and Norman Yoffee, took Diamond to task for cherry-picking to spin a good yarn, for example in blaming such iconic societal failures as the population crash of Easter Island on its people’s destruction of their own environment.

[...]

In his influential 1988 book The Collapse of Complex Societies, archaeologist Joseph Tainter argues that collapse — in the sense of the complete obliteration of a political system and its associated culture — is rare. Even the worst cases are usually better described as rapid loss of complexity, with remnants of the old society living on in what rises from the ashes. After the ‘fall’ of Rome in the fifth century, for example, successor states took more than 1,000 years to achieve comparable economic and technological sophistication, but were always recognizably the empire’s offspring.

[...]

In his thoughtful Understanding Collapse (2017), archaeologist Guy Middleton surveys more than 40 theories of collapse — including Diamond’s — and concludes that the cause is almost always identified as external to the society. Perennial favourites include climate change and barbarian invasions — or, in the Hollywood version, alien lizards. The theories say more about the theorists and their times, Middleton argues, than about the true causes of collapse.

The pressing question, Tainter told a workshop on collapse at Princeton University in New Jersey last April, is why a society can withstand repeated external blows — until one day it cannot. For him, a society fails when it is no longer able to adapt to diminishing returns on innovation: when it can’t afford the bureaucracy required to run it, say. In Why the West Rules — For Now (2010), historian Ian Morris proposes a twist on this, namely that the key to a society’s success lies in its ability to capture energy — by extracting it from the ground, for example, or from nuclear fission once fossil fuels have run out. By contrast, Peter Turchin, author of the 2006 War and Peace and War, suggests that collapse is what happens when a society stops being able to deal with the strains caused by population growth, leading to inequality and strife.

[...]

Jack Goldstone rigorously dissected upheaval in the sixteenth to the nineteenth centuries in his 1991 book Revolution and Rebellion in the Early Modern World. This convinced him that revolution is an inappropriate response to societal tensions, usually leading to tyranny. Solutions have come instead from deep, meaningful reform. Yet the idea that revolution removes obstacles to progress has “deluded literally billions of people”, he argues.

Adjusting for IQ wipes out the ethnic income differential

Wednesday, February 26th, 2020

In the third part of Human Diversity: The Biology of Gender, Race, and Class, Charles Murray proposes that racism and sexism are no longer decisively important in who rises to the top, in part because differences in educational attainment and income nearly disappear for people at similar IQ levels:

Even without adjusting for anything, there’s no female disadvantage to worry about when it comes to educational attainment. Women now have higher mean years of education and a higher percentage of college degrees than men and have enjoyed that advantage for many years. These advantages persist over all IQ levels.

[...]

In terms of the raw numbers, Asians have higher educational attainment than any other ethnic group. Blacks and Latinos have substantially lower educational attainment than whites, but these discrepancies are more than eliminated after adjusting for IQ.

[...]

Asians retain their advantage over whites after adjusting for IQ.

[...]

A substantial female disadvantage in earned income exists, but it is almost entirely explained by marriage or children in the household. Using Current Population Survey data for 2018, earnings for women who were not married, had no children living at home, and worked full-time were 93 percent of the earnings of comparable men.

[...]

Married women with children in the house have considerably lower earned income even after adjusting for IQ, but the main source of the income discrepancy is not that married women in the labor force earn less than unmarried women, but that married men earn more than unmarried men.

[...]

Using raw 2018 data from the CPS, Asians have higher mean earned income than whites, while Blacks and Latinos have substantially lower mean earned income than whites.

[...]

In the earlier survey, adjusting for IQ wipes out the ethnic income differential among whites, blacks, and Latinos (Asians were not included in this survey). In the later survey, whites and Latinos have effectively the same earned income while the fitted mean for blacks is 84 percent of the fitted mean for whites.

[...]

The fitted mean for Asians is 57 percent higher than the fitted mean for whites.

Inherited wealth is a tangential contributor

Tuesday, February 25th, 2020

Charles Murray introduces the third part of Human Diversity: The Biology of Gender, Race, and Class by mentioning another book about class that he (co-)wrote:

The book’s main title was The Bell Curve. In many ways, it documents the ways in which a segment of American society is indeed morphing into a castelike upper class. But inherited wealth is a tangential contributor. The bare bones of its argument are that the last half of the twentieth century saw two developments of epochal importance: First, technology, the economy, and the legal system became ever more complex, making the value of the intellectual ability to deal with that complexity soar. Second, the latter half of the twentieth century saw America’s system of higher education become accessible to everyone with enough cognitive talent. The most prestigious schools, formerly training grounds for children of the socioeconomic elite, began to be populated by the students in the top few percentiles of IQ no matter what their family background might be — an emerging cognitive elite. By 2012, what had been predictions about the emerging cognitive elite as we were writing in the early 1990s had become established social facts that I described in another book, Coming Apart.

Vocational doors really did open

Saturday, February 22nd, 2020

A look back at what has happened to educational and job choices over the last 50 years suggests that vocational doors really did open for women during the 1970s, Charles Murray says (in Human Diversity: The Biology of Gender, Race, and Class):

In 1971, 38 percent of women’s bachelor’s degrees were in education. That proportion had fallen by half by the early 1980s. Meanwhile, degrees in business grew from 3 percent in 1971 to 20 percent by 1982.

[...]

Consider first the most Things-oriented STEM careers — physics, chemistry, earth sciences, computer science, mathematics, and engineering. The percentage of women’s degrees obtained in those majors more than doubled from 1971 to 1986 — but “more than doubled” meant going from 4 percent to 10 percent.

And 1986 was the high point. By 1992, that number had dropped to 6 percent, where it has remained, give or take a percentage point, ever since.

[...]

Women’s degrees in People-oriented STEM — biology and health majors — doubled in just the eight years from 1971 (9 percent) to 1979 (18 percent), remained at roughly that level through the turn of the century, then surged again, standing at 27 percent of degrees in 2017.

[...]

It looks as if women were indeed artificially constrained from moving into a variety of Things occupations as of 1970, that those constraints were largely removed, and that equilibrium was reached around 30 years ago.

[...]

The effect of the feminist revolution on the vocations of college-educated women was real but quickly reached a new equilibrium. For women with no more than a high school education, it is as if the feminist revolution never happened.

[...]

The subtext of this chapter has been that it’s not plausible to explain the entire difference in educational and vocational interests as artifacts of gender roles and socialization. If that were the case, the world shouldn’t look the way it does. In contrast, a mixed model — it’s partly culture, partly innate preferences — works just fine. In this narrative, females really were artificially deterred from STEM educations and occupations through the 1950s and into the 1960s. One of the effects of the feminist revolution was that new opportunities opened up for women and women took advantage of them.