Every era’s monetary institutions are virtually unimaginable until they are created

Tuesday, July 23rd, 2019

On Bretton Woods’ 75th Anniversary, Tyler Cowen reminds us that every era’s monetary institutions are virtually unimaginable until they are created:

Every era’s monetary institutions are virtually unimaginable until they are created. Looking forward, don’t assume the status quo will hold forever, but rather prepare to be shocked.

Consider the broader history of monetary and financial institutions. The gold standard (and sometimes bi-metallic) regime that marked the Western world from 1815-1914 was without precedent. In medieval times, gold, silver, copper and bills of exchange — from multiple issuers — all circulated as means of payment, and often there was no single dominant form of money. As the gold standard evolved, however, claims to gold became a global means of settling claims and easing foreign trade and investment. While the system was based on some central bank intervention, most notably from the Bank of England, it was self-regulating to a remarkable degree, and it formed the backbone of one of the West’s most successful eras of economic growth. It was not obvious that the West would arrive at such a felicitous arrangement.

Now fast forward to the current day. Currencies are fiat, the ties to gold are gone, and most exchange rates for the major currencies are freely floating, with periodic central bank intervention to manipulate exchange rates. For all the criticism it receives, this arrangement has also proved to be a viable global monetary order, and it has been accompanied by an excellent overall record for global growth.

Yet this fiat monetary order might also have seemed, to previous generations of economists, unlikely to succeed. Fiat currencies were associated with the assignat hyperinflations of the French Revolution, the floating exchange rates and competitive devaluations of the 1920s were not a success, and it was hardly obvious that most of the world’s major central banks would pursue inflation targets of below 2%. Until recent times, the record of floating fiat currencies was mostly disastrous.

It is considered bizarre that former Fed Chairman Alan Greenspan advocated a gold standard in 1967, but in fact it was a pretty reasonable view at the time, even if it turned out to be incorrect. And it wasn’t just Greenspan who didn’t see where the world was heading; Benn Steil, in his well-known book on Bretton Woods, wrote: “Keynes thought of freely floating rates as a sort of blind groping … and certainly not as a viable alternative model for underpinning trade relations among nations.” In reality, we are just emerging from arguably the world’s most rapid age of globalization, from about 1990 to 2007.

The Bretton Woods arrangements also seemed highly unlikely until they were in place. They involved a complicated system of exchange rate pegs, capital controls and a “gold pool” (and other methods) to control gold prices and redemption ratios. What’s more, the whole thing was dependent on America’s role as global hegemon, both politically and economically. The dollar still was tied to gold, and the other major currencies tied to the dollar, but as the system evolved it required that no one was too keen to redeem dollars for gold (the French unwillingness to abide by this stricture was one proximate cause of the collapse of Bretton Woods).

I don’t think a monetary economist from, say, 1890 could have imagined that such an arrangement would prove possible, much less successful. Yet the Bretton Woods arrangements had a wonderful track record, as the 1950s and 1960s generated strong economic growth for both the U.S. and Western Europe.

At the same time, once Bretton Woods ended in the early 1970s, few people thought it was possible to turn back the clock. The system required the U.S. to be a creditor nation, to hold much of the world’s gold stock, and for countries such as France to defer to American wishes on gold convertibility. Once again, the line between an “imaginable” and “unimaginable” monetary arrangement proved to be a thin one.

Another surprising monetary innovation would be the euro. Both Milton Friedman and Paul Krugman warned that the euro was unlikely to succeed and persist. Yet it has proven more durable than many people expected, and there does not seem to be an end in sight. This kind of a common fiat currency, spread across so many nations, is without precedent in world history.

So as you consider the legacy of Bretton Woods this week, remember that core lesson: There will be major changes in monetary and institutional arrangements that no one can even imagine right now. Assume the permanency of the status quo at your peril.

America’s urban rebirth is missing a key element

Monday, July 22nd, 2019

America’s urban rebirth is missing a key element: births:

Since 2011, the number of babies born in New York has declined 9 percent in the five boroughs and 15 percent in Manhattan. (At this rate, Manhattan’s infant population will halve in 30 years.) In that same period, the net number of New York residents leaving the city has more than doubled. There are many reasons New York might be shrinking, but most of them come down to the same unavoidable fact: Raising a family in the city is just too hard. And the same could be said of pretty much every other dense and expensive urban area in the country.

In high-density cities like San Francisco, Seattle, and Washington, D.C., no group is growing faster than rich college-educated whites without children, according to Census analysis by the economist Jed Kolko. By contrast, families with children older than 6 are in outright decline in these places.
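The parenthetical halving claim above is easy to sanity-check. A minimal back-of-the-envelope sketch, assuming the 15 percent Manhattan decline spans the eight years from 2011 to the post's date and continues at a constant exponential rate:

```python
import math

# A 15% decline in Manhattan births over the 8 years from 2011 to 2019,
# extrapolated as a constant exponential rate of decline.
years_elapsed = 2019 - 2011
annual_factor = 0.85 ** (1 / years_elapsed)   # ~0.98, i.e. about a 2% drop per year

# Years for the birth cohort (and hence, eventually, the infant
# population) to fall to half its current size.
halving_time = math.log(0.5) / math.log(annual_factor)
print(round(halving_time))  # → 34
```

The compounded extrapolation lands near 34 years, roughly consistent with the quoted 30-year figure (a straight linear extrapolation of 15 percent per eight years would give about 27).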

Where is the Texas of Spain?

Sunday, July 21st, 2019

Bryan Caplan just returned from Spain:

The quickest way to explain Spain to an American: Spain is the California of Europe.  I grew up in Los Angeles, and often found myself looking around and thinking, “This could easily be California.”  The parallel is most obvious for geography — the deserts, the mountains, the coasts.  But it’s also true architecturally; the typical building in Madrid looks like it was built in California in 1975.  And at least in summer, the climates of Spain and California match closely.  Spain’s left-wing politics would also resonate with Californians, but Spain doesn’t seem very leftist by European standards.  Indeed, Spaniards often told me that their parents remain staunch Franco supporters.

He shares some reflections:

1. Overall, Spain was richer and more functional than I expected. The grocery stores are very well-stocked; the worst grocery store I saw in Spain offered higher quality, more variety, and lower prices than the best grocery store I saw in Denmark, Sweden, or Norway. Restaurants are cheap, even in the tourist areas. Almost all workers I encountered did their jobs with a friendly and professional attitude. There is near-zero violent crime, though many locals warned us about pickpockets.

2. The biggest surprise was the low level of English knowledge of the population. Even in tourist areas, most people spoke virtually no English. Without my sons, I would have been reduced to pantomiming (or Google Translate) many times a day. Movie theaters were nevertheless full of undubbed Hollywood movies, and signs in (broken) English were omnipresent.

3. I wasn’t surprised by the high level of immigration, but I was shocked by its distribution. While there are many migrants from Spanish America, no single country has sent more than 15% of Spain’s migrants! The biggest source country, to my surprise, is Romania; my wife chatted with fellow Romanians on a near-daily basis. I was puzzled until a Romanian Uber driver told me that a Romanian can attain near-fluent Spanish in 3-4 months. Morocco comes in at #2, but Muslims are less visible in Madrid than in any other European capital I’ve visited.

4. 75% of our Uber drivers were immigrants, so we heard many tales of the immigrant experience. Romanians aside, we had drivers from Venezuela, Peru, Ecuador, Colombia, and Pakistan. Even the Pakistanis seemed highly assimilated and almost overjoyed to reside in Spain. By the way, Uber in Spain works even better than in the U.S. The median wait time was 3 minutes, and the prices were about one-third less than in the U.S.

5. Refugees from Chavismo were prominent and vocal. One Venezuelan Uber driver was vocally pro-Trump. You might credit Trump’s opposition to Maduro, but the driver said she liked him because “He doesn’t talk like a regular politician.” I wanted to ask, “Couldn’t you say the same about Chavez and Maduro?!” but I was in listening mode.

6. I’ve long been dumbfounded by Spain’s high unemployment rate, which peaked at around 27% during the Great Recession and currently stands at about 15%. Could labor market regulation really be so much worse in Spain than in France or Italy? My chats with local economists — and observation of the labor market — confirmed my skepticism. According to these sources, a lot of officially “unemployed” workers are lying to collect unemployment insurance while they work in the black market.

[...]

7. If I didn’t know the history of the Spanish Civil War, I never would have guessed that Spain ever had a militant labor movement. Tipping was even rarer than in France, but sincere devotion to customer service seems higher than in the U.S. Perhaps my sons charmed them with their high-brow Spanish, but I doubt that explains more than a small share of what I saw. A rental car worker apologized for charging me for returning my car with a 95% full tank, adding, “Sorry, but my boss will yell at me if I don’t.”

8. Catalan independence is a weighty issue for both Barcelona and Madrid libertarians. Madrid libertarians say that an independent Catalonia would be very socialist; Barcelona libertarians say the opposite. I found the madrileños slightly more compelling here, but thought both groups were wasting time on this distraction. Libertarians around the world should downplay identity and focus on the policy trinity of deregulating immigration, employment, and housing. (Plus austerity, of course).

9. UFM Madrid Director Gonzalo Melián was originally an architect. We discussed Spanish housing regulation at length, and I walked away thinking that Spain is strangling construction about as severely as the U.S. does.

10. Spanish housing regulation is especially crazy, however, because the country is unbelievably empty. You can see vast unused lands even ten miles from Madrid. The train trip to Barcelona passes through hundreds of miles of desert. Yes, the U.S. has even lower population density, but Spain is empty even in regions where many millions of people would plausibly like to live.

Naturally, Caplan thinks that Spain needs more immigrants:

12. My biggest epiphany: Spain has more to gain from immigration than virtually any other country on Earth. There are almost 500 million native Spanish speakers on Earth — and only 47 million people in Spain. (Never mind all those non-Spanish speakers who can acquire fluency in less than a year!) Nearly all of these Spanish speakers live in countries that are markedly poorer and more dangerous than Spain, so vast numbers would love to migrate. And due to the low linguistic and cultural barriers, the migrants are ready to hit the ground running. You can already see migration-fueled growth all over Spain, but that’s only a small fraction of Spain’s potential.

[...]

14. How can immigration to Spain be such a free lunch? Simple: Expanding a well-functioning economy is far easier than fixing a poorly-functioning economy. The Romanian economy, for example, has low productivity. Romanian people, however, produce far more in Spain than at home. Give them four months to learn the language, and they’re ready to roll.

15. According to my sources, Spain’s immigration laws willfully defy this economic logic. When illegal migrants register with the government, they immediately become eligible for many government benefits. Before migrants can legally work, however, they must wait three years. Unsurprisingly, then, you see many people who look like illegal immigrants working informally on the streets, peddling bottled water, sunglasses, purses, and the like. I met one family that was sponsoring Venezuelan refugees. Without their sponsorship, the refugees would basically be held as prisoners in a government camp — or even get deported to Venezuela. Why not flip these policies, so migrants can work immediately, but wait three years to become eligible for government benefits? Who really thinks that people have a right to the labor of others, but no right to labor themselves?

[...]

Where is the Texas of Spain? I don’t know, but that’s where the future is.

In the great majority of wrecks, all souls were lost

Thursday, July 18th, 2019

During the great Age of Exploration — from the 16th century through the advent of modern navigation and communications — there were more than 9,000 shipwrecks:

In the great majority of wrecks, all souls were lost to a watery grave. Occasionally, survivors endured at sea in small vessels; for example, the Essex went down in 1820, and its crew drifted in narrow whaleboats for weeks, eventually resorting to cannibalism. (Their story inspired Herman Melville to write Moby Dick.) But for our present purposes, we need cases in which survivors made landfall and set up camp, and those are rare.

Nicholas A. Christakis studied shipwrecks for data about the micro-societies that form and then succeed or fail:

We must acknowledge that, even in these twenty examples that fit our criteria, the survivors are not strictly representative of humanity. The people who traveled on ships were not randomly drawn from the human population; they were often serving in the navy or the marines or were enslaved persons, convicts or traders. Shipboard life involved exacting status divisions and command structures to which these people were accustomed. Survivor groups were therefore made up of people who not only frequently came from a single distinctive cultural background (Dutch, Portuguese, English and so on), but who were also part of the various subcultures associated with long ocean voyages during the epoch of exploration. These shipwreck societies were, consequently, mostly male. Furthermore, the majority of our research subjects had narrowly escaped death and were psychologically traumatized, arriving at their islands nearly drowned and sometimes naked and wounded.

We have already discussed some shipwrecks that went badly, devolving into murder and cannibalism. But what factors were shared by shipwreck societies that were most successful? In our sample, the groups that typically fared best were those that had good leadership in the form of mild hierarchy (without any brutality), friendships among the survivors, and evidence of co-operation and altruism.

Shipwrecks make good stories:

One shipwreck in which altruism involving resource sharing and risky volunteerism was particularly evident was the case of the Julia Ann. The ship wrecked in the Isles of Scilly, a reef in the Pacific, on September 7, 1855, stranding fifty-one people for two months. The misadventure was brought to a close when the captain and a crew of nine volunteered to row three days into the horizon to reach Bora Bora, 217 miles to the east, in order to get help. Five lives were lost when the Julia Ann struck a reef, but all of the fifty-one survivors were eventually rescued. A newspaper later reported:

Capt. Pond’s chief desire throughout the whole sad affair seemed to be to save the lives of the passengers and crew, as the following noble act illustrates: While the crew were engaged in getting the passengers ashore [using a lifeline from the wreck offshore], Mr. Owens, the second mate, was going to carry a bag containing eight thousand dollars belonging to the Captain, ashore. The captain ordered him to leave the money and carry a girl ashore… The child was saved, but the money lost.

This visible act of altruism at the outset powerfully established an example for the group to cooperate and work together. Half the Julia Ann castaways were of the Mormon faith, and this may have helped the group cohere. The captain noted that they were “so easy to be governed” and “always ready to hear and obey my counsel.”

This detail from the Blenden Hall account caught my eye:

The eighteen-year-old son of the captain, who himself showed great leadership during the ordeal, kept a diary in penguin blood, written in the margins of salvaged newspapers.

Christakis directs the Human Nature Lab at Yale.

Kiwis are keeping their guns

Monday, July 15th, 2019

New Zealand has an estimated 1.5 million firearms. It’s not clear how many of those are semi-automatic, but it’s probably far, far more than the 700 that have been turned in under the new gun control scheme:

That gun owners would, in large numbers, defy restrictions should have been anticipated by anybody who knows the history of government attempts to disarm their subjects — or who just glanced across the Tasman Sea to Australia.

“In Australia it is estimated that only about 20% of all banned self-loading rifles have been given up to the authorities,” wrote Franz Csaszar, professor of criminology at the University of Vienna, after Australia’s 1996 compensated confiscation of firearms following a mass murder in Port Arthur, Tasmania. Csaszar put the number of illegally retained arms in Australia at between two and five million.

“Many members of the community still possess grey-market firearms because they did not surrender these during the 1996–97 gun buyback,” the Australian Criminal Intelligence Commission conceded in a 2016 report. “The Australian Criminal Intelligence Commission continues to conservatively estimate that there are more than 260,000 firearms in the illicit firearms market.”

You won’t be at the table

Monday, July 15th, 2019

Salesforce.com has announced a ban on its customers selling “military-style rifles,” and this leads Eric S. Raymond to discuss the dangerous folly of “Software as a Service”:

It’s 2019 and I feel like I shouldn’t have to restate the obvious, but if you want to keep control of your business, the software you rely on needs to be open-source. All of it. All of it. And you can’t afford it to be tethered to a service provider even if the software itself is nominally open source.

Otherwise, how do you know some political fanatic isn’t going to decide your product is unclean and chop you off at the knees? It’s rifles today, it’ll be anything that can be tagged “hateful” tomorrow — and you won’t be at the table when the victim-studies majors are defining “hate”. Even if you think you’re their ally, you can’t count on escaping the next turn of the purity spiral.

And that’s disregarding all the more mundane risks that come from the fact that your vendor’s business objectives aren’t the same as yours.

Like any editor, Stalin could be ambivalent

Sunday, July 14th, 2019

The Soviet Union, Aaron Lake Smith reminds us, was a regime founded by freelance writers and editors:

In other words, a nightmare. Pamphleteers, autodidactic theoreticians, critics, publishers of small journals, hot-take artists, takedown artists, and failed poets who’d reinvented themselves as labor organizers — fractious and at constant war with one another, literary people through and through.

If we imagine the early Soviet Union as a hierarchical publishing company, a magazine or new media outfit like The New Republic or BuzzFeed, Lenin was the founder and publisher, Trotsky was the deputy editor, and Stalin was the seemingly humble managing editor. As anyone who has worked in publishing knows, the managing editor is the hardest worker. They make sure the deadlines are met and the trains run on time. They are, above all, reliable. This particular managing editor takes no vacations, never leaves town. He lives for the work, strives to appear to be the mere executor of the will of the publisher and the company.

When the publisher becomes very sick, it is the managing editor who visits him at home to cheer him up with jokes and receive his instructions. By bringing the boss’s instructions back to the office from on high, he leverages this personal relationship and increases his authority within the organization. It’s not hard to see how Stalin’s ascent within the Bolshevik hierarchy happened. We’ve all seen this person before. When the publisher dies, no one suspects the managing editor of harboring ambitions to take over. But really, who better understands the day-to-day functioning of the organization, who better to be in charge?

Stalin was a consummate editor. He seemed to understand that the role was to sublimate ego in order to shape the world quietly in the background. Good editors know how to render themselves invisible. Stalin’s blue pencil, unlike that of other editors, glided across not just poetry chapbooks and literary journals but life itself. “Fool,” “bastard,” “scoundrel,” he wrote in the margins of Andrei Platonov’s 1931 novella, Profit, destroying Platonov’s career. “Radek, you ginger bastard, if you hadn’t pissed into the wind, if you hadn’t been so bad, you’d still be alive,” he scrawled on a male nude drawing that reminded him of Karl Radek, an editor and strategist of the October Revolution whose death he had ordered years earlier. “You need to work, not masturbate,” he wrote on another. The combination of editorial influence with the power of life and death itself resulted in absurd, nearly unbelievable situations — such as when Stalin’s old friend and comrade Nikolai Bukharin wrote him from the prison cell Stalin had put him in, begging his inquisitor for a preface to what would be his last book. “I fervently beg you not to let this work disappear… this is completely apart from my personal fate… Have pity! Not on me, on the work!”

Like any editor, Stalin could be ambivalent. “Stalin has a very particular attitude toward me,” the great Soviet writer Vasily Grossman told his daughter. “He does not send me to the camps, but he never awards me prizes.” Several times Grossman was expected to win the prestigious Stalin Prize for his celebrated novels — in one instance he had even planned the victory party, à la Hillary at the Javits Center — but each time, at the last minute, he found his name mysteriously removed from the list.

Today Grossman is best known as the author of Life and Fate, a novel often called the War and Peace of the twentieth century. The kaleidoscopic thousand-page book, which follows the middle-class Shaposhnikov family through the Second World War, is an indictment of ideological zealotry and a stark account of the horrors of Stalinism. The narrative ranges from the Great Terror to the gulag, the German camps, and Stalin’s late anti-Semitic campaigns of the 1950s, slowly building the sense that, in their lack of humanity, the Soviet and Nazi regimes became mirror images of each other. “Does human nature undergo a true change in the cauldron of totalitarian violence? Does man lose his innate yearning for freedom?” Grossman asks at a pivotal moment. “The fate of both man and the totalitarian State depend on the answer to this question.” The book was considered so dangerous that all known copies of the text were “arrested” and suppressed by the KGB in 1961, an experience that broke Grossman physically and spiritually. “They strangled me in a dark corner,” he said. After his death, a copy he had hidden with an old friend was smuggled out of Russia on microfilm and published in the West in 1980, only appearing in Russia during glasnost.

All the hand-wringing about getting into good colleges is probably a waste of time

Wednesday, July 10th, 2019

Scott Alexander looks at increasingly competitive college admissions and ends with this summary:

  1. There is strong evidence for more competition for places at top colleges now than 10, 50, or 100 years ago. There is medium evidence that this is also true for upper-to-medium-tier colleges. It is still easy to get into medium-to-lower-tier colleges.
  2. Until 1900, there was no competition for top colleges, medical schools, or law schools. A secular trend towards increasing admissions (increasing wealth + demand for skills?) plus two shocks from the GI Bill and the Vietnam draft led to a glut of applicants that overwhelmed schools and forced them to begin selecting applicants.
  3. Changes up until ten years ago were because of a growing applicant pool, after which the applicant pool (both domestic and international) stopped growing and started shrinking. Increased competition since ten years ago does not involve applicant pool size.
  4. Changes after ten years ago are less clear, but the most important factor is probably the ease of applying to more colleges. This causes an increase in applications-per-admission which is mostly illusory. However, part of it may be real if it means students are stratifying themselves by ability more effectively. There might also be increased competition just because students got themselves stuck in a high-competition equilibrium (ie an arms race), but in the absence of data this is just speculation.
  5. Medical schools are getting harder to get into, but law schools are getting easier to get into. There is no good data for graduate schools.
  6. All the hand-wringing about getting into good colleges is probably a waste of time, unless you are from a disadvantaged background. For most people, admission to a more selective college does not translate into a more lucrative career or a higher chance of admission to postgraduate education. There may be isolated exceptions at the very top, like for Supreme Court justices.

The trees are ready to cut

Tuesday, July 9th, 2019

A new federal program in the 1980s offered farmers money to reforest depleted land:

Pine trees appealed to Mr. George. He bought loblolly seedlings and pulled his pickup into a parking lot where hands-for-hire congregated.

“We figured we’d plant trees and come back and harvest it in 30 years and in the meantime go into town to make a living doing something else,” he said.

Three decades later the trees are ready to cut, and Mr. George is learning how many other Southerners had the same idea.

A glut of timber has piled up in the Southeast. There are far more ready-to-cut trees than the region’s mills can saw or pulp. The surfeit has crushed timber prices in Mississippi, Alabama and several other states.

The volume of Southern yellow pine, used in housing and to make paper, has surged in recent decades as farmers replaced cropland with trees and as clear-cut forests were replanted. By 2020, the amount of wood growing per acre of timberland in many counties will have more than quadrupled since 1980, U.S. forestry officials estimate.

It has been a big loser for some financial investors, among them the country’s largest pension fund. The California Public Employees’ Retirement System spent more than $2 billion on Southern timberland, and harvested trees at depressed prices to pay interest on money borrowed to buy. Calpers sold much of its land this summer at a loss. A spokeswoman for the pension fund declined to comment.

It has also been tough for the individuals and families who own much of the South’s forestland, and who had banked on its operating as a college fund or retirement account. The region has more than six million owners of at least 10 wooded acres, say academics and forestry consultants. Many of the owners were counting on forests as a long-term investment that could be replenished and passed on to heirs.

The marvel of advancing through life’s stations

Tuesday, July 9th, 2019

Much of our pop culture is made by and for folks who rate high on openness, the sort attracted to novelty — world travels, new drugs, and so forth — but not country music:

Emotional highlights of the low-openness life are going to be the type celebrated in “One Boy, One Girl”: the moment of falling in love with “the one,” the wedding day, the birth of one’s children (though I guess the song is about a surprising ultrasound). More generally, country music comes again and again to the marvel of advancing through life’s stations, and finds delight in experiencing traditional familial and social relationships from both sides. Once I was a girl with a mother, now I’m a mother with a girl. My parents took care of me, and now I take care of them. I was once a teenage boy threatened by a girl’s gun-loving father, now I’m a gun-loving father threatening my girl’s teenage boy. Etc. And country is full of assurances that the pleasures of simple, rooted, small-town lives of faith are deeper and more abiding than the alternatives.

(Hat tip to T. Greer.)

They suddenly find themselves in a society that is disgustingly self-centered

Monday, July 8th, 2019

T. Greer’s life’s short course has brought him to many places, bound him to sundry peoples, and urged him to varied trades:

Yet out of the lands I’ve lived in and the roles I’ve donned, none blaze in my memory like the two years I spent as a missionary for the Church of Jesus Christ. It is a shame that few who review my resume ask about that time; more interesting experiences were packed into those few mission years than in the rest of the lot combined.

To be a missionary is to confront the uncanny. You cannot serve without sounding out the weird bottoms of the human heart. But if missionary life forces you to come full contact with mankind at its most desperate and unsettled, so too it asks you to witness mankind at its most awesome and ethereal. Guilt’s blackest pit, fear’s sharpest grip, rage at its bluntest, hope at its highest, love at its longest and fullest — to serve as a missionary is to be thrust in the midst of the full human panorama, with all of its foulness and all of its glory. I doubt I shall ever experience anything like it again. I cannot value its worth. I learned more of humanity’s crooked timbers in the two years I lived as missionary than in all the years before and all the years since.

Attempting to communicate what missionary life is like to those who have not experienced it themselves is difficult. You’ll notice my opening paragraph restricted itself to broad generalities; it is hard to move past that without cheapening or trivializing the experience.

Yet there is one segment of society that seems to get it. In the years since my service, I have been surprised to find that the one group of people who consistently understands my experience are soldiers. In many ways a Mormon missionary is asked to live something like a soldier: like a soldier, missionaries go through an intense ‘boot camp’ experience meant to reshape their sense of self and duty; are asked to dress and act in a manner that erodes individuality; are ‘deployed’ in far-flung places that leave them isolated from their old friends, family members, and community; are pushed into contact with the full gamut of human personality in their new locales; live within a rigid hierarchy, follow an amazing number of arcane rules and regulations, and hold themselves to insane standards of diligence, discipline, and obedience; and spend years doing a job which is not so much a job as it is an all-encompassing way of life.

The last point is the one most salient to this essay. It is part of the reason both many ex-missionaries (known as “RMs” or “Returned Missionaries” in Mormon lingo) and many veterans have such trouble adapting to life when they return to their homes. This comparison occurred to me first several years ago, when I read a Facebook comment left by a man who had served as a Marine mechanic in Afghanistan. He was commenting on an interview Sebastian Junger had done to promote his book, Tribe: On Homecoming and Belonging.

I really enjoyed the audiobook of Tribe, by the way, but audiobooks don’t lend themselves to excerpts.

Many RMs report a sense of loss and aimlessness upon returning to “the real world.” They suddenly find themselves in a society that is disgustingly self-centered, a world where there is nothing to sacrifice or plan for except one’s own advancement. For the past two years there was a purpose behind everything they did, a purpose whose scope far transcended their individual concerns. They had given everything — “heart, might, mind and strength” — to this work, and now they are expected to go back to racking up rewards points on their credit card? How could they?

The soldier understands this question. He understands how strange and wonderful life can be when every decision is imbued with terrible meaning. Things which have no particular valence in the civilian sphere are a matter of life or death for the soldier. Mundane aspects of mundane jobs (say, those of the former vehicle mechanic) take on special meaning. A direct line can be drawn between everything he does — laying out a sandbag, turning off a light, operating a radio — and the ability of his team to accomplish their mission. Choices of food, training, and exercise made before combat can mean the difference between life and death for a soldier’s comrades once the fighting begins. For good or for ill, it is through small decisions like these that great things come to pass.

In this sense the life of the soldier is not really his own. His decisions ripple. His mistakes multiply. The mission demands strict attention to things that are of no consequence in normal life. So much depends on him, yet so little is for him.

This sounds like a burden. In some ways it is. But in other ways it is a gift. Now, and for as long as he is part of the force, even his smallest actions have a significance he could never otherwise hope for. He does not live a normal life. He lives with power and purpose — that rare power and purpose given only to those whose lives are not their own.

[...]

This sort of life is not restricted to soldiers and missionaries. Terrorists obviously experience a similar sort of commitment. So do dissidents, revolutionaries, reformers, abolitionists, and so forth. What matters here is conviction and cause. If the cause is great enough, and the need for service so pressing, then many of the other things — obedience, discipline, exhaustion, consecration, hierarchy, and separation from ordinary life — soon follow. It is no accident that great transformations in history are sprung from groups of people living in just this way. Humanity is both at its most heroic and its most horrifying when questing for transcendence.

These contests will be byzantine

Saturday, July 6th, 2019

Suez Deconstructed aims to be a historically rooted how-to manual for statecraft:

The book seeks to convey the experience of “masterminding solutions to giant international crises,” Zelikow writes, by providing “a sort of simulator that can help condition readers just a little more” before confronting their own crises. It sets up that simulation by scrambling the storytelling. First, Suez Deconstructed divides the crisis into three phases: September 1955 through July 1956, July 1956 through October 1956, and October through November of that year. In doing so, the authors hope to show that “most large problems of statecraft are not one-act plays” but instead begin as one problem and then mutate into new ones. This was the case with Suez, which began with Egypt purchasing Soviet arms and which became a multipronged battle over an international waterway. Second, the book proceeds through these phases not chronologically but by recounting the perspectives of each of the six participants: the United States, the Soviet Union, the United Kingdom, France, Israel, and Egypt. The goal — and the effect — is to deprive the reader of omniscience, creating a “lifelike” compartmentalization of knowledge and perspective.

Zelikow encourages readers to assess Suez by examining three kinds of judgments made by the statesmen during the crisis: value judgments (“What do we care about?”), reality judgments (“What is really going on?”), and action judgments (“What can we do about it?”). Asking these questions, Zelikow argues, is the best means of evaluating the protagonists. Through this structure, Suez Deconstructed hopes to provide “a personal sense, even a checklist, of matters to consider” when confronting questions of statecraft.

The book begins this task by describing the world of 1956. The Cold War’s impermeable borders had not yet solidified, and the superpowers sought the favor of the so-called Third World. Among non-aligned nations, Cold War ideology mattered less than anti-colonialism. In the Middle East, its champion was Egyptian President Gamal Abdel Nasser, who wielded influence by exploiting several festering regional disputes. He rhetorically — and, the French suspected, materially — supported the Algerian revolt against French rule. He competed with Iraq, Egypt’s pro-British and anti-communist rival. He threatened to destroy the State of Israel. And through Egypt ran the Suez Canal, which Europe depended on for oil.

Egypt’s conflict with Israel precipitated the Suez crisis. In September 1955, Nasser struck a stunning and mammoth arms deal with the Soviet Union. The infusion of weaponry threatened Israel’s strategic superiority, undermined Iraq, and vaulted the Soviet Union into the Middle East. From that point forward, Zelikow argues, the question for all the countries in the crisis (aside from Egypt, of course) became “What to do next about Nasser?”

Israel responded with dread, while Britain, France, and the United States alternated between confrontation and conciliation. Eventually, the United States abandoned Nasser, but he doubled down by nationalizing the Suez Canal. This was too much for France. Hoping to unseat Nasser to halt Egyptian aid to Algeria, it concocted a plan with Israel and, eventually, Britain for Israel to invade Egypt and for British and French troops to seize the Canal Zone on the pretense of separating Israeli and Egyptian forces. The attack began just before the upcoming U.S. presidential election and alongside a revolution in Hungary that triggered a Soviet invasion. The book highlights the Eisenhower administration’s anger at the tripartite plot. Despite having turned on Nasser, Eisenhower seethed at not having been told about the assault, bitterly opposed it, and threatened to ruin the British and French economies by withholding oil shipments.

[...]

Even so, it is possible to extract several key lessons about statecraft. Chief among them is the extent to which policymakers are informed as much by honor and will as by interest. Britain and France, for example, ultimately joined forces to invade Egypt, but they did so for different reasons and with different degrees of resolve. As Zelikow notes, in the mid-1950s, France, recently beaten in Indochina, seemed beleaguered, while Britain “still seemed big,” boasting a “far-flung network of bases and influence.” But appearances could deceive. France was led by men who “had been heroes of the resistance” during World War II and were determined to restore their country’s honor. Outwardly strong, meanwhile, Britain suffered from a gnawing sense of exhaustion.

This imbalance of morale would shape each nation’s actions during the crisis and contribute to Suez’s strange outcome. France’s Socialist-led coalition, Zelikow writes, was “driven by ideas and historical experience.” It possessed a vision of restoring French pride and a dedication to defeating what it saw as “antimodern throwbacks” in Algeria backed by a Mussolini-like figure in Cairo. It was thus undeterred when complications arose and “more creative in [its] policy designs.” But because Washington, Moscow, and Cairo all judged France by its seeming lack of material power and its recent defeats alone, they underestimated its will.

British leaders, equally eager to topple Nasser and more capable of acting independently than the French, nevertheless struggled to overcome their nation’s fatigue. Initially behind the government’s desire to punish Nasser, the British public, as the book details, “[lost] its appetite for military adventure” as diplomacy commenced. British Prime Minister Anthony Eden had long argued for the need to reconcile with anti-colonialism and with Nasser, its chief Middle Eastern apostle. The British public, tired of war, could not long support Eden’s reversal. London ultimately joined French-Israeli strikes not so much out of conviction but to save face — avoiding the embarrassment of abandoning the demands it made of Nasser.

The second lesson that emerges is the centrality of relationships between statesmen, which drove events just as much as, if not more than, money, power, and ideas. One of the central drivers of the war, in fact, was the bond between French and Israeli statesmen. France’s Socialist leaders had all fought in the French Resistance during World War II. They sympathized with Israel, feeling morally obligated to prevent another massacre of the Jewish people and, as one author in the book describes, viewing Israel’s struggle “as a sort of sequel” to the fight against fascism. The Israelis, many of whom were former guerilla fighters themselves, easily related to the French and appreciated their support. Paris and Jerusalem grew closer for practical reasons as well: France sought Israel’s aid in addressing the Algerian revolt. But the relationship extended beyond material interest. As one chapter relates, during French-Israeli negotiations regarding the attack on Egypt, “there was an emotional connection between [the French and Israeli leaders] that documents do not easily capture.” The affection between French and Israeli officials repeatedly propelled the war planning forward.

If intimate ties catalyzed the invasion of Egypt, so, too, did combustible ones — none more so than the rancor between Eden and Dulles. Eden detested Dulles as moralistic, legalistic, and tedious (as related in Suez Deconstructed, he once described Dulles with the quip, “Dull, Duller, Dulles”). Their mutual disregard plagued U.S.-British cooperation. At key moments, Eden believed, Dulles would intervene with a maladroit statement that would harm planning or undermine British leverage. In early October 1956, for example, Dulles stated that there were “no teeth” to the diplomatic plan that the powers had been devising and that when it came to issues of “so-called colonialism,” the United States would “play a somewhat independent role.” For Eden, feeling isolated, this statement “was in some ways the final blow,” spurring him to join the French-Israeli initiative.

The statesmen of the Suez Crisis were haunted by history as much as they were guided by pride and personality — another striking theme that surfaces in Suez Deconstructed. Zelikow begins his overview of the world in 1956 by stating that “[t]hey were a wartime generation,” nations that had “lived through conclusive, cataclysmic wars, some more than one.” Those experiences permeated their approaches to the crisis. French and British leaders could not help but see Nasser as a 1930s potentate.

[...]

It is a rare quality in world leaders to be able to make historical analogies without fully embracing them, thereby becoming trapped.

[...]

The wars of the coming decades, however, are likely to look more like Suez than Berlin or Iraq. They will likely be multi-state conflicts, in which states of every size and strength play major roles. These contests will be byzantine. Like Suez, they will be local skirmishes and global crises simultaneously. They will feature webs of overlapping rivalries and alliances (and rivalries within alliances), strategic and ideological considerations at multiple levels, and high-stakes signaling amid confusion and disinformation.

Happy Secession Day!

Thursday, July 4th, 2019

I almost forgot to wish everyone a happy Secession Day:

Brexit 1776

Conspiracies are normal and common

Thursday, July 4th, 2019

Moldbug’s Cathedral is not a conspiracy, Anomaly UK explains:

It makes more sense to say that the Cathedral is the opposite of a conspiracy. It is what you get when there are no conspiracies.

The word “conspiracy” is basically clickbait, but I’m going to stick with it anyway. Be aware, though, that I don’t mean anything really weird by it. The management of any company is a conspiracy, in that the members discuss plans in private and only publicise them if it is advantageous for them to do so. [Smug Misha] pointed out on twitter that HBO were able to keep the secret of the ending of Game of Thrones for months, despite hundreds of people needing to know it to make the episode.

In this sense, conspiracies are normal and common, though not quite as common as they used to be. That was my argument in the earlier piece: that as recently as a decade or so ago, a political party (or at least a faction within it) could agree an agenda in private and make confidential plans to pursue that agenda. That capability seems, since then, to have been lost. The key debates between leading politicians of the same party over what goals should be pursued and what means should be employed to pursue them are carried out in public.

I stand by that point. But on reflection I think it’s a much bigger deal. This is a recent development in a much longer trend. As I wrote yesterday in a comment, the Cathedral is defined by its lack of secrecy. The distinctive role of the universities and the press is to inform the public, and to do so with authoritative status. It is not defined by its ideology. However, its ideological direction is a predictable consequence of its transparency. A public competition for admiration causes a movement to the extreme: the most attractive position is the one just slightly more extreme than the others. This is the “holiness spiral”.

The breakdown of conspiracy, then, is not just a phenomenon of the last decade that has given us Trump and so on. It is the root cause of the political direction of the last few centuries.

What is the cause of the breakdown of conspiracy? If I had to guess and point at one thing it would be protestantism. That, after all, was largely a move to remove the secrecy from religion. Once democracy got going, that removed much more secrecy. But it’s still an ongoing process: democracy until recently was mediated by non-public formal and informal institutions. The opening of the guilds can be seen as part of the same trend. Many of the things I have written about in the past may be related — the decline in personal loyalty, for example.

That produces a feedback loop — a belief in equality and openness brings more decision-making into the public sphere, which leads to holiness spirals, which leads to ever increasing belief in equality and openness. But it seems to me that the openness comes first, and the ideology results from it. The Cathedral is a sociological construct, not an ideological one.

[...]

However, the actual powers of the state were immediately in the hands of the civil service and political parties, who were not transparent, and exerted a moderating influence. There were self-perpetuating groups of powerful people — conspiracies — who could limit the choices open to the electorate and therefore slow the long-term political trends driven by the Cathedral. Today, as a result of internal democracy in political parties (particularly in the UK, a very recent development), and of unmediated channels of communication, those conspiracies have been broken open. A politician today is fundamentally in the same business as a journalist or a professor — he is competing for status by means of public statements. The internal debates of political parties are now public debates. In the past, in order to become a politician, other politicians had to accept you. Now you can be a TV star or a newspaper columnist one day and a politician the next. The incumbents can’t quietly agree to stop you, any more than they could quietly agree to have pizza for lunch.

A tough-on-crime WASP using torture, intimidation, and surveillance to bring down a media-savvy terrorist

Friday, June 28th, 2019

What might be called “Nolan’s enigma” began in earnest with The Dark Knight — which involved a tough-on-crime WASP using torture, intimidation, and surveillance to bring down a media-savvy terrorist:

The Dark Knight Rises took things one step further with Bane, a menacing mix of Robespierre and Ruthenberg, whose pseudo-Marxist coup unleashes all manner of mayhem upon Gotham: banishments and public hangings, street brawls and show trials, and — in a scene lifted straight out of the French revolution — the storming of Blackgate (Bastille) prison.

Not to be outdone, Marvel soon embraced its own brand of post-9/11 conservatism. In every Avengers film, Joshua Tait notes, “it really is 1938…. The threats are real and the Avengers’ unilateral actions are necessary” to protect life, liberty, and democracy. Each hero thus functions as a kind of Cold Warrior, standing athwart would-be despots and authoritarians, while their enemies function as bland, unidimensional cannon fodder, a convenient narrative pretext for blowing things up. (To be fair, the bad guys usually do possess weapons of mass destruction; this is fantasy, after all.)

By 2018, however, Marvel had ditched the neocon agitprop and gone full paleo. Black Panther — which Slate described as “the most feminist superhero movie yet” — is about the hereditary monarch of a monoracial ethno-state that keeps immigrants at bay with a high-tech border wall and faces no economic slowdown because of it. In fact, Wakanda becomes the richest country in the world without any international trade whatsoever, all while maintaining traditional religious customs and above-replacement fertility rates — a kind of black Israel. (It does eventually reconcile itself to foreign aid under T’Challa, but not to immigration.) Trouble only begins when Killmonger (a foreigner) challenges Black Panther’s claim to the throne — not because he thinks the current occupant is illegitimate, but because he wants to use Wakandan technology to launch a global, race-based revolution, with no regard for national boundaries.

Then in Avengers: Infinity War, Wakanda opens its border wall and promptly gets invaded by aliens.

So perhaps it is fitting that Avengers: Endgame, the Marvel movie to end all Marvel movies, is even more Burkean — and badass — than its predecessors, a sustained cinematic rejoinder to everything Hollywood believes. If you haven’t seen Endgame yet — or if you take comfort in the delusion that Marvel is “woke” — stop reading now.