World’s first cloned arctic wolf is now 100 days old

Friday, September 30th, 2022

Chinese researchers have created the world’s first cloned arctic wolf, and it is now 100 days old:

Scottish scientists proved back in 1996 that it was possible to clone a mammal using a cell from an adult animal. Possible — but not easy. Dolly the sheep was the only successful clone in their 277 attempts.

Maya is the world’s first cloned arctic wolf

Cloning is still a challenging process — fewer than 25 animal species have been cloned to date, so the first successful cloning of a species is still newsworthy 25+ years after Dolly’s birth.

The journey to creating the first cloned Arctic wolf began in 2020, when researchers at Sinogene Biotechnology, a Beijing-based biotech, teamed up with the polar theme park Harbin Polarland.

Using skin cells donated by Maya, an arctic wolf housed at Harbin Polarland, Sinogene created 137 embryos using female dogs’ eggs. They then transferred 85 of the embryos into 7 beagle surrogates.

In July 2022, one of those beagles gave birth to a healthy cloned Arctic wolf, also named Maya.

The misogyny and cruelty behind many of the gags are as striking as the black comedy

Thursday, September 29th, 2022

When I saw the TV show M*A*S*H as a kid, I don’t think it even occurred to me that it might be about Korea; it was obviously about Vietnam. Apparently the original movie had the same issue:

Because of the context of the film being made — during the height of America’s involvement in the Vietnam War — 20th Century Fox was concerned that audiences would not understand that it was ostensibly taking place during the Korean War. At the request of the studio, a caption that mentions the Korean setting was added to the beginning of the film, and PA announcements throughout the film served the same purpose. Only a few loudspeaker announcements were used in the original cut. […] The Korean War is explicitly referenced in announcements on the camp public address system and during a radio announcement that plays while Hawkeye and Trapper are putting in Col. Merrill’s office, which also cites the film as taking place in 1951.

I didn’t really watch the show, but when I finally watched the movie, I realized the theme music had been burned into my memory — or, rather, its melody had. The TV show doesn’t include the lyrics to “Suicide Is Painless”:

Director Robert Altman had two stipulations about the song for composer Johnny Mandel: it had to be called “Suicide Is Painless” and it had to be the “stupidest song ever written”. Altman attempted to write the lyric himself, but, upon finding it too difficult for his “45-year-old brain” to write something “stupid” enough, he gave the task to his 15-year-old son Michael, who reportedly wrote the lyrics in five minutes.

Altman later decided that the song worked so well he would use it as the film’s main theme. This more choral version was sung by uncredited session singers John Bahler, Tom Bahler, Ron Hicklin, and Ian Freebairn-Smith, and was released as a single attributed to “The Mash”. Altman said that, while he only made $70,000 for directing the movie, his son had earned more than $1 million for co-writing the song.

Several instrumental versions of the song were used as the theme for the TV series, but the lyrics were never used in the show. It became a number-one hit in the UK Singles Chart in May 1980. The song was ranked No. 66 on AFI’s 100 Years…100 Songs.

Its opening lyrics:

Through early morning fog I see
Visions of the things to be
The pains that are withheld for me
I realize and I can see

That suicide is painless
It brings on many changes
And I can take or leave it
If I please

The movie struck a nerve:

The film won the Grand Prix du Festival International du Film, later named the Palme d’Or, at the 1970 Cannes Film Festival. The film went on to receive five Academy Award nominations, including Best Picture, and won for Best Adapted Screenplay. In 1996, M*A*S*H was included in the annual selection of 25 motion pictures added to the National Film Registry of the Library of Congress, being deemed “culturally, historically, or aesthetically significant” and recommended for preservation. The Academy Film Archive preserved M*A*S*H in 2000. The film inspired the television series M*A*S*H, which ran from 1972 to 1983. Gary Burghoff, who played Radar O’Reilly, was the only actor playing a major character who was retained for the series.

From my perspective, it wasn’t a black comedy so much as a depressing, meandering drama full of unsympathetic characters — with the exception of Radar O’Reilly, I suppose — combined with a low-brow sports comedy. The “ringer” they bring in to beat the other unit’s football team is a black NFL player known as “Spearchucker” Jones. Yeah.

Roger Ebert, in the Chicago Sun-Times, gave the film four (out of four) stars, in a review I can respect and understand, even if I don’t share his assessment:

There is something about war that inspires practical jokes and the heroes…are inspired and utterly heartless…. We laugh, not because “M*A*S*H” is Sgt. Bilko for adults, but because it is so true to the unadmitted sadist in all of us. There is perhaps nothing so exquisite as achieving…sweet mental revenge against someone we hate with particular dedication. And it is the flat-out, poker-faced hatred in “M*A*S*H” that makes it work. Most comedies want us to laugh at things that aren’t really funny; in this one we laugh precisely because they’re not funny. We laugh, that we may not cry…. We can take the unusually high gore-level in “M*A*S*H” because it is originally part of the movie’s logic. If the surgeons didn’t have to face the daily list of maimed and mutilated bodies, none of the rest of their lives would make any sense…. But none of this philosophy comes close to the insane logic of “M*A*S*H,” which is achieved through a peculiar marriage of cinematography, acting, directing, and writing. The movie depends upon timing and tone to be funny…. One of the reasons “M*A*S*H” is so funny is that it’s so desperate.

In a retrospective review for the Chicago Reader, Jonathan Rosenbaum characterized the film as “a somewhat adolescent if stylish antiauthoritarian romp…. But the misogyny and cruelty behind many of the gags are as striking as the black comedy and the original use of overlapping dialogue. This is still watchable for the verve of the ensemble acting and dovetailing direction, but some of the crassness leaves a sour aftertaste.”

Overlapping dialogue wasn’t its only innovation:

In his director’s commentary, Altman says that M*A*S*H was the first major studio film to use the word “fuck” in its dialogue. The word is spoken during the football game near the end of the film by Walt “Painless Pole” Waldowski when he says to an opposing football player, “All right, Bud, your fucking head is coming right off!” The actor, John Schuck, said in an interview that Andy Sidaris, who was handling the football sequences, encouraged Schuck to “say something that’ll annoy him.” Schuck did so, and that particular statement made it into the film without a second thought. Previously confined to cult and “underground” films, its use in a film as conventionally screened and professionally distributed as M*A*S*H marked the dawn of a new era of social acceptability for profanity on the big screen, which had until a short time before this film’s release been forbidden outright for any major studio picture in the United States under the Hays Code.

There are very few instances in history where the removal of a monarchy has led to better national outcomes

Wednesday, September 28th, 2022

If you were to go back over the past couple of centuries, Ed West reminds us, the vast majority of the most appalling regimes would be republics:

Mussolini’s Italy might merit a place, and perhaps Tsarist Russia, which killed a fair few political opponents and jailed many more — but compared to their Soviet successors those were rookie numbers.

Such are the obvious advantages of monarchy that there was a point in 2015 when every single Arab republic had Foreign Office advice warning about travel, while every Arab monarchy was considered safe in its entirety.

Comparing countries inevitably suffers from the apples and oranges problem, but it’s still worth contrasting the fate of neighbours: the Syrian Arab Republic has been ruled with unrelenting cruelty by various military upstarts since independence from France, and this competitive brutality has reduced a sophisticated, ancient society to ruins.

Neighbouring Jordan suffered huge disadvantages from the start, having no natural resources, little access to the sea, few cities with established trading networks and a population that was majority refugee, West Bank Palestinians who had fled Israeli victory in 1948. Yet today it is a successful, well-functioning nation-state, having enjoyed decades of rule under the Hashemites, GDP increasing five-fold in 30 years. Is there any scenario where a republic would have been preferable for Jordan, or a monarchy worse for Syria?

Similarly, Morocco has had the benign rule of monarchs while neighbouring Algeria has endured decades of intermittent misery. The two countries have different histories, in particular regarding France, but next-door Libya did previously have a monarchy and has gone through hell since its downfall. Little wonder that many Libyans would have it restored.

There is much that can be said for monarchies but without doubt they insulate against extremism, in particular extremism of the Right. They provide an ersatz version of the militaristic splendour and rigid hierarchy that some crave, and ersatz tradition often works very well. One friend has a theory that royalty presents a healthy outlet for people who might otherwise be patriotic to the point of being dangerously unhinged, channelling their obsessions into mere crankdom.

Meanwhile, compared to the men who rise to power, monarchs tend to be more tolerant as individuals; the King of Morocco’s grandfather heroically saved the country’s Jews during the Second World War, and he was certainly not the only royal to have shown their moral worth during that conflict.

There are very few instances in history where the removal of a monarchy has led to better national outcomes. To take the most famous example, everyone knows the line by the Chinese official that it was ‘too soon to tell’ whether the French Revolution was a good thing (although it’s perhaps a myth, as he was talking about the ‘revolution’ of 1968). In reality the downfall of the Bourbons led to a million deaths in political violence and wars; thousands died in the terror and tens of thousands in the Vendée genocide. France was never really a leading power again, and it has left the country’s politics permanently divided, even to this day; this was in part because its conservative movement emerged out of the bloodshed far more uncompromising than its British equivalent. They got de Maistre; we got Burke.

The fall of the Habsburgs was an unrelenting tragedy and disaster, leading to dangerous instability and eventual mass murder; the Hohenzollerns were unlovely but German history shows that there is always worse around the corner. Romanov Russia was the prison of peoples — yet the Bolsheviks were far more violent and oppressive, and because of Russia’s size and structure there was little chance that a moderate, democratic form of government would survive when the monarchy fell. Today, in the Arab world, monarchy is far more effective because there are otherwise not enough neutral institutions in societies with very powerful clans, and therefore low levels of wider trust. In the absence of a strong civil society, religious extremists sweep all before them — unless a monarch can stop them.

It is true that at a certain level of political development monarchy becomes less important to the functioning of states; the majority of the most developed (and egalitarian) countries are constitutional monarchies, but no more so than neighbours: Britain is not better off than the Irish Republic, nor is the Netherlands compared to Germany, or Sweden with Finland. It is just that republics tend to have had more troubled histories, either conquered by neighbours or subject to revolution or totalitarianism.

Yet even among rich democracies there is benefit to having a king or queen. A few years ago financial journalist Mike Bird collated many of the academic papers looking at the empirical evidence for the effect of monarchy. Among the findings was that social capital is higher in monarchies, that the existence of monarchs boosts economic growth where a country has weak executive restraints, and that governments ruled by kings or queens tend to otherwise behave with more restraint, and act with greater accountability towards voters.

Even in western countries like Belgium the monarchy plays a major role in national unity, and in a Britain which has become significantly more divided in recent years, between the composite nations, over ideology, race and religion, and lifestyle. This greater division may explain why British republicanism, once something of a force in the 1980s, ran out of steam at the turn of the millennium. It was not just that republicans could never answer the question of alternatives; there was also the recognition that the unifying power of the Queen might help broadly center-Left aims, especially in a society with far more religious and ethnic diversity than before. Constitutional monarchies, like established churches, tend to be theoretically conservative but progressive in practice.

It was not an education system

Tuesday, September 27th, 2022

Growing up on the Swedish seaside, Henrik Karlsson had a five-minute walk to four open learning facilities, not counting the library and the youth center:

One of the premises was an abandoned church that my friends and I used as a recording studio; we’d renovated it ourselves with funding from a study association. In another, I learned French from an émigré of Montpellier. We arranged public lectures — once, to our great surprise, we even managed to book then-Secretary-General of the United Nations Ban Ki-moon for a lecture in Uppsala. I analyzed Soviet cinema with a group of whom an unsettling number sang Sång för Stalin before the screenings.

Since leaving Sweden, I have realized that not everyone grows up like this. And I miss it. In fact, if the whole of Sweden was about to burn down and I could only save one thing, I might grab just folkbildningsrörelsen.

Folkbildningsrörelsen: that is the name we have for this movement of self-organized study groups, resource centers, maker spaces, public lectures, and free retreats for personal development.

These types of things exist in other countries too — but not at the same scale. Or even close.


In the 19th century, when these houses and the financing that enables them began to be built out, the main impetus came from the German Bildung tradition.

Bildung etymologically refers to shaping yourself in the image (das Bild) of God. God in this context should be imagined as a highly self-possessed spectral being — in control of its emotions, with mind and heart in harmony, and willing to take individual moral responsibility. Think Bertrand Russell but less atheist, and sitting on a cloud.


In the 19th century, the popular education movement started to grow into a significant societal force. This began with the creation of so-called folk high schools (folkhögskolor). These first emerged in Denmark, in Ryslinge, where Christen Kold in 1851 started a school based on N.F.S. Grundtvig’s idea of an ungraded, discussion-focused institution for higher education, aimed at the lower classes.

Folk high schools were located in scenic areas — not so much to be romantic retreats for city dwellers but to be close to the farmers who were their main clientele. In The Nordic Secret, Andersen and Björkman argue that folk high schools were retreats for ego development along lines similar to Robert Kegan’s. It was about creating the conditions for people who had lived in simple small-scale communities to develop the knowledge and psychological complexity required to navigate modern society. Much emphasis was placed on discussions, practical skills and simulations.


They arranged role-playing events where workers and farmers played out committee meetings and other arcane parts of the political process. This meant that once they got the vote and started sweeping into office, the worker representatives out-maneuvered the representatives from the upper classes, to the great surprise of many who had argued against democracy on the grounds that it would lead to a flood of unwashed plebeians. The secretaries in the government office, who were in the habit of grading political representatives for their professionalism, left good marks for the early workers’ representatives.

At their peak, 10 percent of young adults in rural areas chose to attend folk high schools. Andersen and Björkman’s thesis is that this created a critical mass, well distributed in the population, that had the intellectual and emotional tools needed to effectively navigate a complex society. This, in turn, would explain the rapid transition that the Nordic countries made, from being the poorest in Europe in the 1850s to being the happiest, most equal, and nearly richest societies in the world eighty years later. I think that is overplaying the importance of the folk high school — but it does gesture at the transformative impact that popular education had on large swaths of the population.

And it was only just beginning.


Olsson had returned from a trip to the United States where he had observed the success of the Chautauqua movement, an educational spectacle with speakers, showmen, and preachers, which Theodore Roosevelt, quite aptly, called “the most American thing in America”. Now Olsson was trying to figure out how to bring these ideas to his Good Templar lodge in Lund, to help his fellow Good Templars spread temperance.

What he came up with was a Scandinavian, minimalist version of Chautauqua, which he called a study circle. The study circle, Olsson envisaged, would be made up of equals and elect a leader from among its members. It would take literature as its starting point, and help its members acquire knowledge in the course of free conversation. It was, as all good memes are, a very simple idea. And it was cheap. The members (numbering between 5 and 20) could, if necessary, meet at home and would choose their own study material. That created an economically viable form of education for the working class.

And it was made even more viable three years later, when the Riksdag, Sweden’s parliament, voted to give grants for the purchase of books, on the condition that the books were made available to the general public.

Another factor behind the success of study circles was their focus on communal self-improvement. Study circles were a child of the temperance movement — a movement that neither sought collective power, like the unions, nor self-improvement for the individual, but rather encouraged people to improve to serve their fellow human beings. This focus on communal self-improvement seems to have provided momentum to the movement. It also helped foster social capital formation, creating dense high-trust networks.


It was not an education system. Rather, it was an attempt to unleash what I have called the learning system. Instead of interventions aimed at controlling what people learn — which is how we can think about traditional education — folkbildningsrörelsen provided people with the resources they needed to learn on their own. The movements created the conditions for an ecosystem to emerge.

It’s our innate evolved form of government

Monday, September 26th, 2022

Erik Hoel describes the gossip trap:

Given that humans have been around for 200,000 years, why did civilization take so long to get started? Why were we stuck in prehistory for so long?


If we imagine being transported back to 50,000 BC, what would we expect to find? In the end, we have to give a metaphor to current life of how things were organized: a follower of Rousseau would expect Burning Man, a follower of Hobbes might expect to find a bunch of warring gangs, the Davids might expect to find the deliberation of a town council full of Kandiaronks.

But perhaps small groups of humans below the Dunbar number were organized by none of these, since they didn’t need to be—instead, they could be organized via raw social power. That is, you don’t need a formal chief, nor an official council, nor laws or judges. You just need popular people and unpopular people.

After all, who sits with who is something that comes incredibly naturally to humans — it is our point of greatest anxiety and subject to our constant management. This is extremely similar to the grooming hierarchies of primates, and, presumably, our hominid ancestors. So 50,000 BC might be a little more like a high school than anything else.

I know the high school metaphor sounds crazy, but given that any metaphor we’re going to give will fail, I think this one possibly fails less than the others. After all, the central message of The Dawn of Everything is that prehistorical people were just people, with all the weirdness, politicking, cultural hilarity and differentness this implies. But, unlike what the Davids seem to want, most people aren’t Kandiaronk — he was exceptional. Most people are not exceptional. They are…well, like the people you remember from high school. So if we take the heart of the message of The Dawn of Everything seriously, perhaps entering a new tribe in Africa at 50,000 BC would not involve a bunch of mysterious rituals in the jungle enacted by solemn actors with dirt smeared across their faces. Maybe it was a bit more like the infamous lunch table scene from the movie Mean Girls (I encourage you to watch), with some minor surface alterations, like clothes (picture beads and furs instead).


What’s interesting is that anthropologists, from what I’ve read, seem to assume that raw social power is mostly a good thing (one wonders if they’ve ever seen social pressure applied). Mostly they focus on gossip, and if we look at the work of Robin Dunbar, and his 1996 book Grooming, Gossip, and the Evolution of Language, he speculates that the need to gossip was why language was invented in the first place. And gossip has (as far as I can tell) an almost universally positive valence throughout anthropology. In the literature it is portrayed as something that maintains social relationships and rids groups of free-riders and cheats, i.e., gossip is a “leveling mechanism” that prevents individuals from accruing too much power.


But it never seems to strike Dunbar or others that living under a dominion of raw social power, with few to little formal powers anywhere, would be hellish to a citizen of the 21st century (which is why I say the closest analog is high school).

My mother used to quote Eleanor Roosevelt all the time:

Great minds discuss ideas. Average minds discuss events. Small minds discuss people.

A “gossip trap” is when your whole world doesn’t exceed Dunbar’s number and to organize your society you are forced to discuss mostly people. It is Mean Girls (and mean boys), but forever. And yes, gossip can act as a leveling mechanism and social power has a bunch of positives — it’s the stuff of life, really. But it’s a terrible way to organize society. So perhaps we leveled ourselves into the ground for 90,000 years. Being in the gossip trap means reputational management imposes such a steep slope you can’t climb out of it, and essentially prevents the development of anything interesting, like art or culture or new ideas or new developments or anything at all. Everyone just lives like crabs in a bucket, pulling each other down. All cognitive resources go to reputation management in the group, to being popular, leaving nothing left in the tank for invention or creativity or art or engineering. Again, much like high school.

And this explains why violating the Dunbar number forces you to invent civilization — at a certain size (possibly a lot larger than the actual Dunbar number) you simply can’t organize society using the non-ordinal natural social hierarchy of humans. Eventually, you need to create formal structures, which at first are seasonal and changeable and theatrical, and take all sorts of diverse forms, since the initial condition is just who’s popular. But then these formal systems slowly become real.


Of course we gravitate to cancel culture — it’s our innate evolved form of government.

Arnold Kling keeps saying that the smart phone and social media smash together the intimate world and the remote world:

In the intimate world, gossip is the strong social force. In the remote world, institutions with their formal roles are supposed to be the strong social force. But modern technology has weakened formal roles, and we are falling back on gossip.

A whopping 19 out of 20 principals were replaced in the Houston experiment

Sunday, September 25th, 2022

Connie Morgan believes that Dr. Roland Fryer may have cracked the code on how to eliminate the academic gap between races:

White students score about 30 points higher on math tests than black students. Fryer implemented a strategy at a failing Houston school district that closed the gap. He did it by applying to elementary and secondary schools in the Houston district the five tenets of school success that he discovered in researching the habits of highly successful charter schools. Theories on how to close the academic achievement gap vary from “fix the home” to “fix the school” to “fix the community.” Fryer’s results make a compelling case for “fix the school.”

The five tenets are clear-cut:

  1. Increased Time in School
  2. Good Human Capital Management
  3. High Dosage Tutoring
  4. Data Driven Instruction
  5. Culture of High Expectations

Increasing the time children spend in school may be unpalatable for parents concerned about indoctrination, but this concern is addressed by the human capital management tenet (more on this in the following paragraph). Others may balk at longer schooldays, citing conflicting research on foreign schools pointing to shorter days as a tenet of student success. However, research on small homogenous countries like Finland is unlikely to reveal practices easily transferable to the United States. Fryer’s experiment confirms, in contrast, that when time in school is spent well, it’s good to spend more of it, particularly when the alternative might be a home environment not conducive to children’s learning. Fryer had treatment schools in the Houston district increase time on task in various ways, including eliminating breaks between classes, expanding the school day by one hour, offering weekend classes, and adding days to the school year.

Human capital management is probably the most obvious tenet of improving education. In other words, get rid of teachers who won’t embrace the mission and hire ones who will. This tenet is likely the most difficult to execute. Politics and scarcity of resources are the challenges. A whopping 19 out of 20 principals were replaced in the Houston experiment. It took over 300 interviews to find 19 principals to replace them. Of the teachers, 46% were replaced. The district spent more than $5 million buying out teacher contracts. Additionally, feedback to teachers was constant. In the treatment schools, teachers received ten times more observations and feedback than those in the control group. Principals regularly led staff development and training sessions.

Few schools tutor the number of students for the length of time that Fryer recommends. Remember that extra hour added to the school day? This is where it’s put to work: daily, focused small-group tutoring. In the Houston experiment, low-performing fourth graders and all sixth and ninth graders were intensively tutored.

Like their teachers, students in the experiment were constantly being evaluated. Many schools collect data, but few are good at adjusting instruction in light of data. In Fryer’s experiment, treated schools held assessments every three weeks as well as benchmark exams three times in a school year. These results informed tutoring and allowed teachers to set highly specific performance goals with students.

A culture of high expectations is the trickiest tenet to measure. The tenet goes beyond posters that say “Nobody Cares, Work Harder.” Indicators that a real attitude shift has occurred may include things like professional dress codes for teachers, posted achievement goals and/or contracts between parents and schools agreeing to honor expectations.

Math achievement rose significantly in the schools that implemented Fryer’s tenets. Assessment scores increased by 0.15 to 0.18 standard deviations in a year. In layman’s terms, under this program, there is potential to close the math achievement gap between black and white students in less than three years. Even more important than comparison between groups and closed gaps is the absolute good of increased math proficiency among students who have been too long neglected.
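The “less than three years” figure can be checked with simple division. A minimal sketch (the 0.5 standard-deviation gap here is my assumption, a commonly cited ballpark for the black-white math gap; the exact value depends on the test, and only the 0.15–0.18 effect sizes come from the source):

```python
# Back-of-the-envelope: years needed to close an achievement gap
# at a given per-year effect size.
def years_to_close(gap_sd: float, effect_per_year: float) -> float:
    return gap_sd / effect_per_year

gap = 0.5  # assumed gap, in standard deviations (not from Fryer's paper)
print(years_to_close(gap, 0.18))  # high end of the reported range: ~2.8 years
print(years_to_close(gap, 0.15))  # low end of the reported range: ~3.3 years
```

At the high end of the reported effect sizes the assumed gap closes in under three years; at the low end it takes a bit longer, so the “potential to close” hedging is apt.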

The original paper notes that “injecting best practices from charter schools into traditional Houston public schools significantly increases student math achievement in treated elementary and secondary schools — by 0.15 to 0.18 standard deviations a year — and has little effect on reading achievement.”

(Hat tip to Arnold Kling, who expects any gains to fade out.)

Napoleon did not recommend the study of battles, but of campaigns

Saturday, September 24th, 2022

Napoleon did not recommend the study of battles, but of campaigns:

Campaigns before the twentieth century are in many ways the opposite of case studies. The latter thrusts us into a specific scenario, at a certain time and place, with a certain number of troops and resources, and with an immediate objective in mind. In set-piece battles of yore, as in modern operations, most of a commander’s mental efforts took place before things even began: gathering an intelligence estimate of the enemy, selecting the best scheme of maneuver, matching troops to task, arranging logistics, etc. Once the action commenced, there were only a limited number of decision points where he could influence the course of events—to grossly simplify, the results came down to a series of rolls of the dice. This naturally focuses our attention on the mechanism of victory: what predetermined course of action gives the best probability of success in a scenario, accounting for a limited range of possible enemy responses.

Campaigns, on the other hand, were fundamentally more open-ended. When armies marched out of their winter cantonments for the season, they did not always know in advance which fortress to attack or where to fight a decisive action. Even when they did have a definite objective in mind, the entire challenge lay in precipitating conditions which would allow them to accomplish it; just as often, however, enemy action or the accidents of fate forced them to reconsider their intermediate steps or even the final objective itself.

This open-endedness created a very different decision-making structure. Plans had to account for a far greater degree of uncertainty, and the ability merely to remain in the field as a coherent force was at least as important as the pursuit of an objective. Sheer accident could moot one’s present course—Napoleon’s own staff was famous for issuing a steady stream of countermanding orders as circumstances evolved—and it was quite common for armies to stumble into a decisive engagement without realizing it. In short, decision-making was a far more continuous process.

This remained true even when both sides were clearly heading toward a decisive clash. Most of a commander’s focus had to remain outside the object itself: heeding his own vulnerabilities, considering the enemy’s intervening actions, ensuring that he was not detracting from his own efforts elsewhere, and getting enough men and supplies to the right place. And looming above all that was the risk of failure: what would the immediate consequences be? Was there a safe line of retreat? How would failure change his overall position? Naturally enough, the decision whether to engage in battle was just as important as the battleplan itself.

These questions shine through in narratives and memoirs from past campaigns. Many sources record war council debates over objectives, contingencies, marching routes, and supply considerations. It is by engaging with these debates and working through the decision-making process that a student can develop an intuition for the principles of war—not as a list of rules, but as a way of conducting operations in the face of uncertainty and risk. Even when the sources don’t reveal a commander’s thoughts, it can be just as instructive to try to figure out what he might have been thinking, or whether a failed action might have been somewhat justified by some non-obvious factor.

At its best military history is like a problem set with a partial answer key.

Calling tyranny “stable” may seem paradoxical

Friday, September 23rd, 2022

People in developing nations are not surprised when their government turns over, Daniel Klein & Michael C. Munger note, but those of advanced democracies have grown complacent, even though we know that democracies that appear stable can capsize:

Between 1850 and 1930, Austria-Hungary, Germany, Italy, Japan, and the Ottoman Empire turned into tyrannies. Since the year 2000, there has been a massive increase in the number of people living under tyranny, with fully 80 percent of the world’s population living in countries that Freedom House classifies as not having “free” government systems. In fact, as of 2021, 58 countries, with 38 percent of the world’s population, are now classified as full-on “not free” systems, having collapsed into tyranny.

It is tempting to think “it can’t happen here.” But Americans are more concerned about that than they have been in decades. In July, a CNN poll indicated that 48 percent of respondents think it is “likely” or “somewhat likely” that state actors will successfully overturn the results of a US election because their party did not win.

We, the present authors, are worried that putatively upright countries today are in danger of descending into tyranny. A tyranny — once capacities for control and despotism are constructed, in some cases including expansive government employment, dependency, and largesse — can be nearly impossible to reform. The key to the descent into tyranny, and the stability of tyranny once it is achieved, is this: Tyrants use tyranny to fortify their keep and to protect themselves against the sanctions due them for their crimes.

Calling tyranny “stable” may seem paradoxical. Tyrannies suffer from chaotic upheavals and violent paroxysms. But the state of tyranny itself is stable, like a capsized canoe. Ordered liberty is better for everyone — aside, perhaps, from the despotic faction and their affiliates. It is difficult to restore the rule of law once it is debased. Rectification would call for changes in personnel, operations, and attitudes. The relative power and privilege of the despots would disappear with rectifications. Tyrants use the tools of tyranny to protect themselves against the sanctions due them.

(Hat tip to Arnold Kling.)

It’s amazing to live in a society that often pretends these differences are not real

Thursday, September 22nd, 2022

What if we just looked at what men and women actually talk about in private?, Emil Kirkegaard asks:

We see that the male topics include politics, war/sports/gaming/weapons/death/killing, swearing, music (especially metal/rock), work/science, metals. Women’s topics are much more mundane. There’s a lot of expression of emotions, especially positive. There’s a lot of family talk shown by all the terms of human relationships (sister, daughter, nephew, brother, boyfriend etc.). As for interests, the main thing we see is food (cooking) and some shopping. In fact, it is surprisingly devoid of any abstract interests; I am surprised there are not more words related to clothing and child-rearing.

Overall we see that results are consistent across studies that men and women are interested in and talk about quite different things. It’s amazing to live in a society that often pretends these differences are not real.

(Hat tip to Arnold Kling.)

The Industrial Revolution kicked off the fertility transition

Wednesday, September 21st, 2022

It is ironic that our species, which is defined by our big brains, is evolving to become stupid, George Francis says:

Countless articles of scientific research have found the less intelligent to have more surviving offspring and to breed faster than the intelligent. The problem was noticed by Darwin and his contemporaries, yet it has mostly been ignored throughout the 20th century. Recently it has had a small yet significant revival with the cult classic book At our Wits’ End, and large genetic databases showing intelligence decline. It has even been featured in the Telegraph.

The famous studies of dysgenics in the UK biobank show there is selection against intelligence but don’t attempt to quantify its size. The famous Icelandic study shows dysgenics but only attempts to measure its effect on years in education, not intelligence itself.

The Industrial Revolution kicked off the fertility transition. Countries that became rich were able to prevent starvation and improve health so the number of surviving children skyrocketed. Then with the advent of contraception, abortion and enjoyable alternatives to raising children, fertility plummeted. Many of the poorer, less intelligent countries started this process later meaning that as the natives of smart countries shrink in their population, the low IQ countries are still rapidly expanding.

My favourite estimate of the rate of dysgenics comes from Woodley’s calculation in At Our Wits’ End. He takes the Icelandic estimate for the rate of dysgenics on the educational attainment (years in education) polygenic score, adjusts it for the higher heritability of intelligence (about 80% heritable) and estimates a 0.8 IQ point decline per decade. That’s a lot. My guess is that this is an overestimate. Educational attainment correlates negatively with fertility because it captures both the effect of intelligence and the effect of education, creating an overestimate. On the other hand, years in education is less heritable than IQ, causing the unaltered Icelandic estimate of about 0.3 IQ points to be an underestimate.

This is tricky, so let’s be cautious, split the difference and round up. IQ is falling by 0.6 points a decade.

In 1950 the average genotypic IQ was around 93.6. If you are not a high school graduate you need an IQ of 93 to join the US military. In 1950 something like half of the world’s population was too dumb to be in a professional army. Currently, the global average IQ is around 85 and by the end of the century, it will be 74. Then only an elite fraction would be capable of even being part of a professional army.

Average global IQ of 74 — dysgenics is a big problem! Let’s try and narrow it down a bit more. In Francis and Kirkegaard (forthcoming) we estimate that each national IQ point is associated with a 7.8% increase in GDP per capita. We also estimate the economic effects of dysgenics in that paper slightly differently, but with similar results. Let’s imagine the world is one country with an average IQ of 74 in 2100 and an average IQ of 85 as of 2020. The maths works out as a difference of logs at [exp((74–85)*7.8%) –1] = –58%. The effect of this dysgenic decline will be to cut GDP in half! And of course, that doesn’t even begin to consider the intangible factors GDP doesn’t necessarily include: low crime, social trust, science, culture and the arts.

Climate change is currently predicted to cost us a whopping 4% of world GDP by 2050. My numbers imply dysgenics will cost us 30% of GDP by 2050.
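As a sanity check on the arithmetic in the quoted passage, the GDP figure can be reproduced in a few lines of Python. The 7.8% per-IQ-point figure and the IQ values of 85 and 74 are taken from the text above; the variable names are mine:

```python
import math

# Figures quoted in the passage above
beta = 0.078     # each national IQ point ~ 7.8% increase in GDP per capita
iq_2020 = 85     # estimated global average IQ, 2020
iq_2100 = 74     # projected global average IQ, 2100

# Because the 7.8% effect is multiplicative (a "difference of logs"),
# an 11-point decline compounds rather than adds:
gdp_change = math.exp((iq_2100 - iq_2020) * beta) - 1
print(f"Implied change in GDP per capita: {gdp_change:.0%}")  # -58%
```

The compounding matters: naively adding 11 × 7.8% would suggest an 86% decline, whereas the multiplicative form gives the quoted 58%.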

(Hat tip to Arnold Kling, who says, “Have a nice day.”)

Thirty-one percent of the gun owners said they had used a firearm to defend themselves or their property

Tuesday, September 20th, 2022

The largest and most comprehensive survey of American gun owners ever conducted, based on a representative sample of about 54,000 adults, 16,708 of whom were gun owners, suggests that Americans use firearms in self-defense about 1.7 million times a year:

The overall adult gun ownership rate estimated by the survey, 32 percent, is consistent with recent research by Gallup and the Pew Research Center. So is the finding that the rate varies across racial and ethnic groups: It was about 25 percent among African Americans, 28 percent among Hispanics, 19 percent among Asians, and 34 percent among whites. Men accounted for about 58 percent of gun owners.

Because of the unusually large sample, the survey was able to produce state-specific estimates that are apt to be more reliable than previous estimates. Gun ownership rates ranged from about 16 percent in Massachusetts and Hawaii to more than 50 percent in Idaho and West Virginia.

The survey results indicate that Americans own some 415 million firearms, including 171 million handguns, 146 million rifles, and 98 million shotguns. About 30 percent of respondents reported that they had ever owned AR-15s or similar rifles, which are classified as “assault weapons” under several state laws and a proposed federal ban. Such legislation also commonly imposes a limit on magazine capacity, typically 10 rounds. Nearly half of the respondents (48 percent) said they had ever owned magazines that can hold more than 10 rounds.

Those results underline the practical challenges that legislators face when they try to eliminate “assault weapons” or “large capacity” magazines. The survey suggests that up to 44 million AR-15-style rifles and up to 542 million magazines with capacities exceeding 10 rounds are already in circulation.

Those are upper-bound estimates, since people who reported that they ever owned such rifles or magazines may have subsequently sold them. But even allowing for some double counting, these numbers suggest how unrealistic it is to suppose that bans will have a significant impact on criminal use of the targeted products. At the same time, widespread ownership of those products by law-abiding Americans makes the bans vulnerable to constitutional challenges.

Two-thirds of the respondents who reported owning AR-15-style rifles said they used them for recreational target shooting, while half mentioned hunting and a third mentioned competitive shooting. Sixty-two percent said they used such rifles for home defense, and 35 percent cited defense outside the home. Yet politicians who want to ban these rifles insist they are good for nothing but mass murder.

Thirty-one percent of the gun owners said they had used a firearm to defend themselves or their property, often on multiple occasions. As in previous research, the vast majority of such incidents (82 percent) did not involve firing a gun, let alone injuring or killing an attacker; in those cases, respondents reported that brandishing or mentioning a firearm was enough to eliminate the threat.

That reality helps explain the wide divergence in estimates of defensive gun uses.

About half of the defensive gun uses identified by the survey involved more than one assailant. Four-fifths occurred inside the gun owner’s home or on his property, while 9 percent happened in a public place and 3 percent happened at work. The most commonly used firearms were handguns (66 percent), followed by shotguns (21 percent) and rifles (13 percent).

There is nothing revolutionary about Robin Hood

Monday, September 19th, 2022

How long has it been since you’ve thought about Robin Hood?, Alexander Palacio asks:

He’s not around as much as he used to be; an odd absence for him and the venerable set of characters and stories that orbit him. Robin and his Merry Men seem underrepresented in modern media. The few big Robin Hood films made recently have flopped. And where is he on television, in video games, in the cultural consciousness? The great outlaw has vanished into the depths of Sherwood, while Nottingham’s forces are at their strongest.

The disappearance of Robin Hood can be stated simply. In the last few decades, writers keep making one or two mistakes when writing Robin Hood. First, they take a grim, gritty, realistic approach to the tone of the story and characters. Second, they interpret Robin’s outlaw status to make him transgressive in a way that is opposed to the medieval social order itself. These approaches are not compatible with Robin Hood as he exists in his archetypal form. They violate the valid expectations people have for a Robin Hood story.

In fact, they directly contradict two fundamental elements of Robin Hood. First, Robin Hood is a lighthearted hero whose personal reward for his actions is having fun. Second, Robin Hood is a defender of the traditional medieval social order against a transgressive nobility. The first point should be obvious. Robin Hood leads the Merry Men.

The second point needs a bit more explanation. It’s not the social order itself that Robin Hood opposes, but the burden imposed by men who abuse their high station. Thus Robin’s alliances with Friar Tuck, the good man of the Church, and with whichever good king the story uses (often Richard the Lionheart). In the symbolic, associative world of writing, Robin’s ties to Church and Crown simply do not bear interpretation as a revolution against the social order itself. It is the abuse or absence of the social order he fights, not its use or presence.

It’s important to note that in these tales, it’s the common people who support the medieval social order and the nobility and their lackeys who distort it.

These seemingly trivial new approaches to Robin Hood are critical writing errors. They contradict some of the most foundational elements of a Robin Hood story. When you make your Robin Hood story dour and grim, you obviate the role he has in combating the sorrow that comes from the failure of the nobility to meet its obligations to the people. That’s why Robin always engaged in fun, in contests, in jokes at the expense of the overly earnest. The humour is essential to depict and understand the setting and social dynamics of the story.

When you oppose Robin Hood to the social order itself, you turn him into a mere revolutionary, instead of a defender. Which makes little sense, given his association with the twin bastions of the old order, the Church and the Crown. There is nothing revolutionary about Robin Hood — he is among the most reactionary characters going. But because Chesterton’s point about nobility and novelty is little understood, ideologues perform sleight of hand to reinterpret him as a Marxist class hero. You are left with a story that not only doesn’t make internal sense, but also doesn’t meet expectations for a story about Robin Hood. Nothing about it sings, so the movie flops and nobody reads the book.

Asian-Americans have one-quarter the traffic death rate of Americans as a whole

Sunday, September 18th, 2022

The Japanese risk of dying in a traffic accident is one-sixth the American risk, David Zipper pointed out before explaining all the ways Japan makes its roadways safer — but I pointed out a more parsimonious explanation: Asian-Americans have one-quarter the traffic death rate of Americans as a whole.

The results confirmed the integrity of the self-described ancestry of these individuals

Saturday, September 17th, 2022

Numerous human population genetic studies have come to the identical conclusion, that genetic differentiation is greatest when defined on a continental basis:

The results are the same irrespective of the type of genetic markers employed, be they classical systems [5], restriction fragment length polymorphisms (RFLPs) [6], microsatellites [7,8,9,10,11], or single nucleotide polymorphisms (SNPs) [12]. For example, studying 14 indigenous populations from 5 continents with 30 microsatellite loci, Bowcock et al. [7] observed that the 14 populations clustered into the five continental groups, as depicted in Figure 1.

The African branch included three sub-Saharan populations, CAR pygmies, Zaire pygmies, and the Lisongo; the Caucasian branch included Northern Europeans and Northern Italians; the Pacific Islander branch included Melanesians, New Guineans and Australians; the East Asian branch included Chinese, Japanese and Cambodians; and the Native American branch included Mayans from Mexico and the Surui and Karitiana from the Amazon basin. The identical diagram has since been derived by others, using a similar or greater number of microsatellite markers and individuals [8,9]. More recently, a survey of 3,899 SNPs in 313 genes based on US populations (Caucasians, African-Americans, Asians and Hispanics) once again provided distinct and non-overlapping clustering of the Caucasian, African-American and Asian samples [12]: “The results confirmed the integrity of the self-described ancestry of these individuals”. Hispanics, who represent a recently admixed group between Native American, Caucasian and African, did not form a distinct subgroup, but clustered variously with the other groups. A previous cluster analysis based on a much smaller number of SNPs led to a similar conclusion: “A tree relating 144 individuals from 12 human groups of Africa, Asia, Europe and Oceania, inferred from an average of 75 DNA polymorphisms/individual, is remarkable in that most individuals cluster with other members of their regional group” [13]. Effectively, these population genetic studies have recapitulated the classical definition of races based on continental ancestry – namely African, Caucasian (Europe and Middle East), Asian, Pacific Islander (for example, Australian, New Guinean and Melanesian), and Native American.

Notable among the names of heroes of the British race is that of Beowulf

Friday, September 16th, 2022

I was recently shocked to realize that I didn’t own a single copy of Beowulf, except for a recent graphic novel adaptation and the short summary provided in Bulfinch’s Mythology. Bulfinch’s introduction is from another era (1867):

Notable among the names of heroes of the British race is that of Beowulf, which appeals to all English-speaking people in a very special way, since he is the one hero in whose story we may see the ideals of our English forefathers before they left their Continental home to cross to the islands of Britain.

It was perfectly natural for an American who lived through the Civil War to refer to the British race.