Within the Magic Circle

Sunday, December 28th, 2014

Why were liberals so feckless in power? Walter Russell Mead asks:

Why did they blow the historic opportunity that the Bush implosion gave them?

What liberals are struggling to come to grips with today is the enormous gap between the dominant ideas and discourse in the liberal worlds of journalism, the foundations, and the academy on the one hand, and the wider realities of American life on the other. Within the magic circle, liberal ideas have never been more firmly entrenched and less contested. Increasingly, liberals live in a world in which certain ideas are becoming ever more axiomatic and unquestioned even if, outside the walls, those same ideas often seem outlandish.

Modern American liberalism does its best to suppress dissent and critique (except from the left) at the institutions and milieus that it controls. Dissent is not only misguided; it is morally wrong. Bad thoughts create bad actions, and so the heretics must be silenced or expelled. “Hurtful” speech is not allowed, and so the eccentricities of conventional liberal piety pile up into ever more improbable, ever more unsustainable forms.

[...]

Meanwhile, many liberals are in a tough emotional spot. They live in liberal cocoons, read cocooning news sources, and work in professions and milieus where liberal ideas are as prevalent and as uncontroversial as oxygen. They are certain that these ideas are necessary, important and just — and they can’t imagine that people have solid reasons for disagreeing with them. Yet these ideas are much less well accepted outside the bubble — and the bubbles seem to be shrinking.

The Checkered Game of Life

Friday, December 26th, 2014

The Checkered Game of Life, released in 1860 by Milton Bradley, evolved into the very different modern-day game of Life:

Instead of becoming hair stylists or police officers and poking plastic peg-shaped children into candy-colored SUVs, players of the original game landed on spaces marked with “virtues” and “vices.”

Spaces like “honesty” and “truth” sprung you forward; spaces like “gambling” and “disgrace” slowed your progress.

“I think religion really affected how games were made,” Keren-Detar says. “In Europe, there were more excuses to play, whereas in the US, we didn’t have an established aristocracy. So the idea of playing was very negative, or thought of as lazy and idle and sinful. These games had morphed themselves into being entertaining, but also educational, so you wouldn’t get in trouble for playing a game on a Sunday if it’s based on how to become a better Christian.”

Checkered Game of Life

The pre-Civil War game Mansion of Happiness was even more righteous, Keren-Detar says. Some of its illustrated squares showed characters suffering for their sins with consequences like whipping posts or pillories.

Neither Mansion of Happiness nor The Checkered Game of Life featured dice, probably because dice were still strongly associated with gambling and sin. Checkered Game of Life used a spinning number wheel instead, a feature that survives to this day.

The shift in the narrative of Life over the centuries, Keren-Detar says, suggests a shift in American values.

“The narrative wasn’t dying and going to heaven—it was trying to go to college and be productive and get money,” Keren-Detar says of the version of Life we’re most familiar with. “A lot of games that came out around that time changed from being religious to being industrious.”

Seasons Greetings!

Thursday, December 25th, 2014

Last year, around this time, friends and acquaintances offered Peter Frost all sorts of religiously neutral salutations:

Seasons Greetings! Happy Holidays! Joyeuses fêtes! Meilleurs vœux! Only two people wished me Merry Christmas.

One was Muslim, the other was Jewish.

They meant well. After all, isn’t that the culturally correct greeting? In theory, yes. In practice, most Christians feel uncomfortable affirming their identity. And this self-abnegation gets worse the closer you are to the cultural core of Anglo-America. Immigrants of Christian background enjoy being wished Merry Christmas. Black people likewise. Catholics seem to split half and half, depending on how traditional or nominal they are.

But the WASPs. Oh, the WASPs! With them, those two words are a faux pas.

[...]

What about other cultural groups? Why single out just one? But I’ve heard the answer already. WASPs and their culture dominate North America. The path to power, or simply a better life, runs through their institutions. Minorities can affirm their own identities without restricting the life choices of others, but the same does not hold true for WASPs. Their identity affects everyone and must belong to everyone.

I’m still not convinced. Yes, WASPs did create the institutions of Anglo-America, but their influence in them is now nominal at best. The U.S. Supreme Court used to be a very WASPy place. Now, there’s not a single White Protestant on it. That’s a huge underrepresentation for a group that is still close to 40% of the population. We see the same thing at the Ivy League universities, which originally trained Protestant clergy for the English colonists. Today, how many of their students have any kind of Christian European background? The proportions are estimated to be 20% at Harvard, 22% at Yale, and 15% at Columbia.

Sometimes reality is not what is commonly believed. WASPs are not at all privileged. In fact, they have been largely pushed aside in a country that was once theirs.

[...]

WASPs believe in getting ahead through rugged individualism. Most of the other groups believe in using family and ethnic connections. Guess who wins.

Wholesome and Christian

Wednesday, December 24th, 2014

Modern Americans are often bemused to find that Christmas was illegal in Puritan Massachusetts. Why would the Puritans hate such a wholesome, Christian holiday?

They had their reasons. First, it wasn’t wholesome:

In early modern Europe, roughly the years between 1500 and 1800, the Christmas season was a time to let off steam — and to gorge. It is difficult today to understand what this seasonal feasting was like. For most of the readers of this book, good food is available in sufficient quantity year-round. But early modern Europe was above all a world of scarcity. Few people ate much good food at all, and for everyone the availability of fresh food was seasonally determined. Late summer and early fall would have been the time of fresh vegetables, but December was the season — the only season — for fresh meat. Animals could not be slaughtered until the weather was cold enough to ensure that the meat would not go bad; and any meat saved for the rest of the year would have to be preserved (and rendered less palatable) by salting. December was also the month when the year’s supply of beer or wine was ready to drink. And for farmers, too, this period marked the start of a season of leisure. Little wonder, then, that this was a time of celebratory excess.

[…]

Reveling could easily become rowdiness; lubricated by alcohol, making merry could edge into making trouble. Christmas was a season of “misrule,” a time when ordinary behavioral restraints could be violated with impunity. It was part of what one historian has called “the world of carnival.” (The term carnival is rooted in the Latin words carne and vale — “farewell to flesh.” And “flesh” refers here not only to meat but also to sex — carnal as well as carnivorous.) Christmas “misrule” meant that not only hunger but also anger and lust could be expressed in public.

Second, it wasn’t Christian:

It was only in the fourth century that the Church officially decided to observe Christmas on December 25. And this date was chosen not for religious reasons but simply because it happened to mark the approximate arrival of the winter solstice, an event that was celebrated long before the advent of Christianity. The Puritans were correct when they pointed out — and they pointed it out often — that Christmas was nothing but a pagan festival covered with a Christian veneer.

Neo-Reactionnaire

Monday, December 22nd, 2014

The French neo-reactionnaire has his own style:

The term “neo-reactionnaire” is an exonym. In other words it is a description applied to the group by outsiders. Insiders say they come from both camps — right and left.

“The big division today is over the nation state,” says Mihaely. “Is the state’s historic role finished, or is it still a major actor in the political, anthropological and cultural arenas?

“The question is not if you are left or right but if you believe in the nation.

“Our position is that the nation is still the only framework in which politics has any meaning. It is the only arena in which things can get done, where people can vote for change and change happens.”

None of the neo-reactionnaires — not even Camus — claims allegiance to the FN. Many of them are Jewish.

Nonetheless they stand accused, by expressing such strong views on Islam, identity and the nation, of promoting the cause of the far right.

Zemmour says he is fed up with being asked about the FN.

“Can’t they understand that the FN is not a cause, it is a consequence. It is a consequence of the disintegration of France.

“People vote for the FN to say to their elites, ‘Stop doing what you are doing!’ But they never do.

“It was Stalin who first realised how effective it was to turn the enemy into a fascist. That is what they are doing to us today.”

Activist vs. Passivist

Sunday, December 21st, 2014

Being a Social Justice Warrior is intentionally uncomfortable, because you’re never outraged enough to solve all of humanity’s problems:

He thinks he’s talking about progressivism versus conservatism, but he isn’t. A conservative happy with his little cabin and occasional hunting excursions, and a progressive happy with her little SoHo flat and occasional poetry slams are psychologically pretty similar. So are a liberal who abandons a cushy life to work as a community organizer in the inner city and fight poverty, and a conservative who abandons a cushy life to serve as an infantryman in Afghanistan to fight terrorism. The distinction Cliff is trying to get at here isn’t left-right. It’s activist versus passivist.

As part of a movement recently deemed postpolitical, I have to admit I fall more on the passivist side of the spectrum — at least this particular conception of it. I talk about politics when they interest me or when I enjoy doing so, and I feel an obligation not to actively make things worse. But I don’t feel like I need to talk nonstop about whatever the designated Issue is until it distresses me and my readers both.

Possibly I just wasn’t designed for politics. I’m actively repulsed by most protests, regardless of cause or alignment, simply because the idea of thousands of enraged people joining together to scream at something — without even considering whether the other side has a point — terrifies and disgusts me. Even hashtag campaigns and other social media protest-substitutes evoke the same feeling of panic.

T. Greer called my attention to this passage:

Five million people participated in the #BlackLivesMatter Twitter campaign. Suppose that solely as a result of this campaign, no currently-serving police officer ever harms an unarmed black person ever again. That’s 100 lives saved per year times let’s say twenty years left in the average officer’s career, for a total of 2000 lives saved, or 1/2500th of a life saved per campaign participant. By coincidence, 1/2500th of a life saved happens to be what you get when you donate $1 to the Against Malaria Foundation. The round-trip bus fare people used to make it to their #BlackLivesMatter protests could have saved ten times as many black lives as the protests themselves, even given completely ridiculous overestimates of the protests’ efficacy.

The moral of the story is that if you feel an obligation to give back to the world, participating in activist politics is one of the worst possible ways to do it.
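The quoted passage’s back-of-envelope arithmetic can be checked directly. This is a minimal sketch using only the quote’s own stipulated figures (five million participants, 100 lives saved per year over a 20-year career, and $1 to the Against Malaria Foundation equating to 1/2500th of a life); the variable names are mine, not the author’s.

```python
# Back-of-envelope check of the quoted #BlackLivesMatter arithmetic.
# All figures are the quote's own (admittedly generous) stipulations.
participants = 5_000_000        # Twitter campaign participants
lives_saved_per_year = 100      # stipulated effect of the campaign
years_remaining = 20            # assumed remaining career of the average officer

total_lives_saved = lives_saved_per_year * years_remaining
lives_per_participant = total_lives_saved / participants

print(total_lives_saved)        # 2000
print(lives_per_participant)    # 0.0004, i.e. exactly 1/2500
```

The numbers check out as stated: 2,000 lives spread over five million participants is 1/2500th of a life each, matching the quote’s $1-to-AMF comparison.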

Moralizing Religions

Sunday, December 21st, 2014

Today’s most popular religions all focus on morality:

Religion wasn’t always based on morality, explains Nicolas Baumard, a psychologist at the École Normale Supérieure in Paris. For the first several thousand years of human recorded history, he notes, religions were based on rituals and short-term rewards. If you wanted rain or a good harvest, for example, you made the necessary sacrifices to the right gods. But between approximately 500 B.C.E. and 300 B.C.E., a radical change appeared all over Eurasia as new religions sprung up from Greece to India to China. All of these religions shared a focus on morality, self-discipline, and asceticism, Baumard says. Eventually these new religions, such as Stoicism, Jainism, and Buddhism, and their immediate successors, including Christianity and Islam, spread around the globe and became the world religions of today. Back in 1947, German philosopher Karl Jaspers dubbed the pivotal time when these new religions arose “the Axial Age.”

So what changed? Baumard and his colleagues propose one simple reason: People got rich. Psychologists have shown that when people have fewer resources at their disposal, prioritizing rewards in the here and now is the best strategy. Saving for the future—much less the afterlife—isn’t the best use of your time when you are trying to find enough to eat today. But when you become more affluent, thinking about the future starts to make sense, and people begin to forgo immediate rewards in order to prioritize long-term goals.

Not coincidentally, the values fostered by affluence, such as self-discipline and short-term sacrifice, are exactly the ones promoted by moralizing religions, which emphasize selflessness and compassion, Baumard says. Once people’s worldly needs were met, religion could afford to shift its focus away from material rewards in the present and toward spiritual rewards in the afterlife. Perhaps once enough people in a given society had made the psychological shift to long-term planning, moralizing religions arose to reflect those new values. “Affluence changed people’s psychology and, in turn, it changed their religion,” Baumard says.

To test that hypothesis, Baumard and his colleagues gathered historical and archaeological data on many different societies across Eurasia in the Axial Age and tracked when and where various moralizing religions emerged. Then they used that data to build a model that predicted how likely it was that a moralizing religion would appear in all sorts of different societies—big or small, rich or poor, primitive or politically complex.

It turned out that one of the best predictors of the emergence of a moralizing religion was a measure of affluence known as “energy capture,” or the amount of calories available as food, fuel, and resources per day to each person in a given society. In cultures where people had access to fewer than 20,000 kilocalories a day, moralizing religions almost never emerged. But when societies crossed that 20,000 kilocalorie threshold, moralizing religions became much more likely, the team reports online today in Current Biology. “You need to have more in order to be able to want to have less,” Baumard says.
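The reported finding reduces to a simple threshold effect, which can be sketched as follows. The 20,000 kcal/person/day cutoff comes from the article; the function name and the example societies and figures are hypothetical, purely for illustration.

```python
# Toy illustration of the reported threshold: per the study, moralizing
# religions almost never emerged below ~20,000 kcal/person/day of
# "energy capture," and became much more likely above it.
THRESHOLD_KCAL = 20_000

def moralizing_religion_likely(energy_capture_kcal: float) -> bool:
    """Crude yes/no stand-in for the study's reported threshold effect."""
    return energy_capture_kcal >= THRESHOLD_KCAL

# Hypothetical energy-capture figures (kcal/person/day), for illustration.
societies = {"Society A": 8_000, "Society B": 21_500, "Society C": 35_000}
for name, kcal in societies.items():
    print(name, moralizing_religion_likely(kcal))
```

The real model was presumably probabilistic rather than a hard cutoff; this sketch only captures the headline claim that crossing the threshold sharply raises the odds.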

The Peripheral

Saturday, December 20th, 2014

William Gibson’s new novel, The Peripheral, explores two futures:

The second future takes place in a 22nd-century post-singularity London, where a recently disgraced publicist navigates a surveillance state ruled by a kleptocracy. Today, the singularity is a theoretical point at which artificial intelligence becomes smarter than us and lies outside our control. According to singularity devotees, we cannot predict what happens at this juncture, but some ideas include mankind uploading our consciousness into computers or causing our own end by runaway nanotechnology. Gibson’s vision of the singularity is a “nerd rapture,” and it’s different and more human than any other singularity depiction I’ve encountered.

“I’ve been making fun of the singularity since I first encountered the idea,” he says. “What you get in The Peripheral is a really fucked-up singularity. It’s like a half-assed singularity coupled with that kind of neoreactionary, dark enlightenment shit. It’s the singularity as experienced by Joseph Heller. We’re people, and we fuck up. We do a singularity, we’re going to fuck it up.”

Indeed, in the novel, we do. An apocalypse Gibson refers to only as “the Jackpot” devastates Earth’s population, and Gibson’s “half-assed singularity” comes along in time to save only the moneyed elite. Gibson’s vision is a multicausal apocalypse, one that refutes the idea of the single-trigger apocalypses (an epidemic, a nuclear holocaust, an asteroid) that have preoccupied man since before the Bible. I asked him why the people with money survived. His response: “Why wouldn’t they?”

In The Peripheral, while those with money survive “the Jackpot,” they have no more control over that technology than the poor do. They merely have more access to it.

I’m more than a little curious about his use of neoreactionary and dark enlightenment.

Smart People Read Biographies

Friday, December 19th, 2014

Smart people read biographies, Ryan Holiday says, because they’re some of the most actionable and educational reading you can do, so he recommends his favorites:

  1. Plutarch’s Lives, Plutarch – Aside from being the basis of much of Shakespeare, he was one of Montaigne’s favorite writers.
  2. The Power Broker, Robert Caro – Like Huey Long and Willie Stark, Robert Moses was a man who got power, loved power and was transformed by power.
  3. Socrates: A Man for Our Times, Napoleon: A Life, Churchill, Paul Johnson – Paul Johnson is the kind of author whose sweeping judgements you can trust, so you leave these books with what feels like a very solid understanding of who his subjects are as people.

He recommends many more.

Bungling the Conclusions to Wars

Friday, December 19th, 2014

Insurgencies aren’t going away, so we should work toward doing counterinsurgencies better, Max Boot argues:

The first lesson may sound like a no-brainer, but it has been routinely ignored: plan for what comes after the overthrow of a regime. In Afghanistan and Iraq, the George W. Bush administration failed to adequately prepare for what the military calls “Phase IV,” the period after immediate victory — an oversight that allowed law and order to break down in both countries and insurgencies to metastasize. Yet Obama, despite his criticism of Bush’s conduct of the Iraq war, repeated the same mistake in Libya. In 2011, U.S. and NATO forces helped rebels topple Muammar al-Qaddafi but then did very little to help the nascent Libyan government establish control of its own territory. As a result, Libya remains riven by militias, which have plunged the country into chaos. Just this past July — almost two years after U.S. Ambassador Christopher Stevens was killed in Benghazi — the State Department had to evacuate its entire embassy staff from Tripoli after fighting there reached the airport.

This is not a problem confined to Bush or Obama. The United States has a long tradition of bungling the conclusions to wars, focusing on narrow military objectives while ignoring the political end state that troops are supposed to be fighting for. This inattention made possible the persecution of freed slaves and their white champions in the South after the American Civil War, the eruption of the Philippine insurrection after the Spanish-American War, the rise of the Nazis in Germany and the Communists in Russia after World War I, the invasions of South Korea and South Vietnam after World War II, and the impetus for the Iraq war after the Gulf War. Too often, U.S. officials have assumed that all the United States has to do is get rid of the bad guys and the postwar peace will take care of itself. But it simply isn’t so. Generating order out of chaos is one of the hardest tasks any country can attempt, and it requires considerable preparation of the kind that the U.S. military undertook for the occupation of Germany and Japan after 1945 — but seldom did before and has seldom done since.

Ineffective Government

Wednesday, December 17th, 2014

Nearly all the well-informed and honest citizens of the United States agree, Scott Adams (How to Fail at Almost Everything and Still Win Big) suggests, that the Federal Government should not enforce marijuana prohibition in states that allow medical marijuana:

That’s an easy law to change, right? I mean, if something like 80% of voters agree on an issue, it’s a no-brainer.

But our ineffective government couldn’t pass a law that had overwhelming support because, I suppose, it is bad for reelection if someone labels you pro-drug. So instead, Congress quietly just removed funding for the FBI’s weed-chasing efforts. No budget means no action in the future. In effect, the federal war on weed is over.

While I appreciate that the government is moving in the direction the citizens prefer, how much does it tell you about the effectiveness of our system that lawmakers couldn’t change a law that nearly 100% of well-informed and honest (meaning not taking money from private prison lobbyists for example) folks prefer?

My point is not about weed. That fight is essentially over. We’re just waiting for the referee to count to ten, although that might play out over several years. Full legalization for adults (in effect) is inevitable because the data will be so clear after a few states do their test runs.

My point is that if your government can’t pass a law that has nearly universal approval, do you really have a functioning government?

19th-Century Terrorism

Tuesday, December 16th, 2014

To understand the terrorists of today, we can look at their forgotten forebears from the 19th century:

I discovered the secret through reading about 19th-century history, particularly the years from the 1848 revolutions to the outbreak of the First World War in 1914. The key was Bismarck, the Prussian minister-president who unified Germany. If you want to learn about Bismarck, you will probably pick up a book by some historian of international relations, such as A.J.P. Taylor. That’s the right place to start. But it means you can read a lot about Bismarck before finding out about the time in May 1866 when a guy shot him.

Ferdinand Cohen-Blind, a Badenese student of pan-German sentiments, waylaid Bismarck with a pistol on the Unter den Linden. He fired five rounds. None missed. Three merely grazed his midsection, and two ricocheted off his ribs. He went home and ate a big lunch before letting himself be examined by a doctor.

But even the books that condescend to mention this triviality may not tell you about the other time a guy shot Bismarck: A young Catholic tried to kill him in July 1874, during the anti-Catholic Kulturkampf Bismarck had engineered, but only managed to score his right hand with a bullet.

The point is not that Bismarck was particularly hated, although he was. The point is that this period of European (and American) history was crawling with young, often solitary male terrorists, most of whom showed signs of mental disorder when caught and tried, and most of whom were attached to some prevailing utopian cause. They tended to be anarchists, nationalists or socialists, but the distinctions are not always clear, and were not thought particularly important. The 19th-century mind identified these young men as congenital conspirators. It emphasized what they had in common: social maladjustment, mania, an overwhelming sense of mission and, usually, a prior record of minor crimes.

It has become a pastime of mine to pick major royal or ministerial figures from 19th-century continental Europe and look up the little-known assassination attempts against them. Even in peaceful, isolated England, there were no fewer than seven attempts to shoot Queen Victoria. Russian czars, French presidents and Bulgarian prime ministers make particularly fertile ground.

Just try, for example, either Napoleon. A bomb designed to kill the first on his way to the opera injured or killed roughly 30 people around Christmas 1800; the conspirators were pro-Bourbon legitimists. Exactly the same thing happened to the third in 1858: A bomb planted by Italian revolutionaries killed eight and injured 142, while barely stopping the emperor’s carriage.

Biographies will often omit these events totally, much less note the astonishing Napoleonic parallel. Yet all this bombing and gunfire must have had a profound psychological effect on the leaders who were targeted, along with peers elsewhere. The prevalence of assassination obviously influenced the gory histories of the emerging Balkan states and, once you unlock the secret, you can see the imprint of terror on the history of Germany, with its countless princelings and kinglets — all of them frightened all the time, and thus predisposed to political overreaction.

No one sees the murders of three U.S. presidents between 1865 and 1900 as part of the same phenomenon, but it was. And the bad news is that the First World War, which began with a famous assassination, was in some ways a culmination of this tendency to desperate, violent action.

Use of Force

Monday, December 15th, 2014

Back when Todd G. was in law school, he had a wonderful opportunity to teach his classmates about use of force:

For a project in one of my criminal law classes I was invited by the DEA tactical training cadre to bring half my class (and professor) down to the FBI/DEA “Hogan’s Alley” force on force training village in Quantico, Virginia. This was during the time that Waco & Ruby Ridge were being investigated by DOJ and federal law enforcement UOF rules were under severe scrutiny.

Our group was put through a number of exercises ranging from the classic Tueller drill (attacker 21 feet away charges at you with a knife) to team room-clearing.

A few days later I had to present my paper to the entire class. The half that attended the force on force (FOF) exercises sat on the left side of the room and the other students sat on the right.

Just a few minutes into my presentation I brought up the danger of a knife wielding attacker. The right side of the room grew indignant immediately and argued that someone twenty-one feet away — the length of an entire room — simply couldn’t be a deadly threat to someone with a gun. Before I could even reply, the left side of the room erupted in angry shouts: “You’ve never been there!”

Next we discussed opening a closet door to find a stranger holding a pistol that was pointed down toward the ground. Again the students on the right side of the room insisted he couldn’t be a threat because he wasn’t pointing the gun at anyone. And again the left side of the room lost its collective mind: “Do you have any idea how fast someone can point a gun at you from that position? It’s faster than you can see it and respond before you get shot!”

It was the easiest presentation I’ve ever given.

A New & Different Kind Of Civil War

Monday, December 15th, 2014

The climax of the civil rights campaign produced a black separatist movement that has endured for half a century:

It emerged at exactly the moment that the two signal civil rights acts passed Congress in 1964-65 (the public accommodations act and the voting rights act).

I think the reason for the sudden rise of black separatism was anxiety among black Americans about the prospect of being formally invited to participate in what was then American common culture. By the late 1960s even colleges were chartering new, separate student unions (at the demand of black students). The sad irony of this has been lost to history. But in effect, by that time a large segment of the black population had opted out either actively or mentally from trying to join the then-dominant culture. The gulf between the two cultures has only grown wider since then, egged on by a foolish white-sponsored “diversity” campaign which has imposed the ridiculous idea that a common culture in one nation is unnecessary.

The result is a permanently oppositional black culture with an elaborate ideology of endless grievance and a guilt-tripped white political culture held hostage by it and pandering endlessly to it — and sandwiched in between those two dispositions is a whole lot of really bad behavior. The least you can say about the four incidents involving Trayvon Martin, Michael Brown, Eric Garner, and Tamir Rice is that they involved some degree of ambiguity about what was actually going on, and in probably all those cases, at least, death was not caused by sheer malice. The same is not true about the case of Zemir Begic, or of the many people victimized during last year’s “knockout” game fad, or indeed the astounding number of people being gunned down regularly on the streets of Chicago.

I don’t think we’re capable of making these distinctions anymore, and surely not of doing anything constructive about them. Instead, we just appear to be careening toward a new and different kind of civil war.

South African Burglaries

Saturday, December 13th, 2014

American expat Patrick McGroarty was covering the Oscar Pistorius trial in Pretoria when his home in Johannesburg got burgled. They came back for more the next month. Around the same time, robbers killed the South African national team’s goalie — leading the nation to wonder “why South Africans take from each other, and why these desperate assailants are so quick to kill.”

In this discussion of “South Africans,” McGroarty brings up the touchy subject of race exactly once:

Though the murder rate has fallen by more than half since the end of white minority rule in 1994, the number of people killed in South Africa each year still ranks among the highest in the world.

McGroarty’s family moved into an apartment complex with 24-hour security.