A Self-Licking Ice Cream Cone

December 22nd, 2014

Don’t let logistics drive strategy, Max Boot warns:

The U.S. military created a massive logistical footprint in Afghanistan and Iraq, erecting a series of heavily fortified Little Americas that offered troops everything from ice cream to large-screen TVs. These compounds proved staggeringly expensive to resupply. In the summer of 2006, when both conflicts were going strong, U.S. Central Command had more than 3,000 trucks delivering supplies and another 2,400 delivering fuel to its bases, and these convoys had to be protected by either troops or contractors. The military thus became what soldiers sometimes called a “self-licking ice cream cone” — an organization that fought to sustain itself rather than to achieve a mission.

It is doubtful that senior commanders ever gave much thought to these logistical requirements; they more or less operated on autopilot. Each base commander would bring in a few more amenities to make life better for the troops, a commendable impulse. But in the process, commanders not only created supply-line vulnerabilities but also cut off troops from the populace, neglecting an essential part of any successful counterinsurgency campaign. In the future, the Pentagon should resist the temptation to build up huge bases unless doing so accomplishes the objectives of the war.

I’m not sure “don’t let logistics drive strategy” is how I’d phrase that.

Posture, Submit, Flight, or Fight

December 22nd, 2014

When two animals of the same species fight, it’s rarely to the death, David Grossman (On Killing) notes:

Rattlesnakes use their poisonous fangs on other creatures, but they wrestle each other; piranha fish bite anything that moves, but fight each other with flicks of their tails; and animals with antlers and horns attempt to puncture and gore other species with these natural weapons, but meet their own species in relatively harmless head-to-head clashes. Against one’s own species the options of choice in nature are to “posture” before and during mock battle, to “submit” by making oneself harmless or exposing oneself to a killing blow, or to take “flight” from the aggressor. The “fight” option is almost never used, thus ensuring the survival of the species.

Most soldiers don’t fight so much as they posture:

The anthropologist Irenaus Eibl-Eibesfeldt tells us that “One threatens [postures] by making oneself bigger — whether by raising one’s hackles, wearing combs in one’s hair or putting on a bearskin….” Such plumage saw its height in modern history during the Napoleonic era, when soldiers wore high, uncomfortable shako hats that served no purpose other than to make the wearer look and feel like a taller, more dangerous creature. In the same manner, the roars of two posturing beasts are exhibited by men in battle. For centuries the war cries of soldiers have made their opponents’ blood run cold. Whether it be the battle cry of a Greek phalanx, the “Hurrah!” of the Russian infantry, the wail of Scottish bagpipes, or the rebel yell of our own Civil War, soldiers have always instinctively sought to daunt the enemy through nonviolent means prior to physical conflict, while encouraging one another and impressing themselves with their own ferocity, and simultaneously providing a very effective means of drowning the disagreeable yell of the enemy.

With the advent of gunpowder, the soldier has been provided with one of the finest possible means of posturing. Paddy Griffith (Battle Tactics of the Civil War) points out that soldiers in battle have a desperate urge to fire their weapons:

Time and again we read of regiments blazing away uncontrollably, once started, and continuing until all ammunition was gone or all enthusiasm spent. Firing was such a positive act, and gave the men such a physical release for their emotions, that instincts easily took over from training and from the exhortations of officers.

Ardant du Picq became one of the first to document the common tendency of soldiers to fire harmlessly into the air simply for the sake of firing. Du Picq made one of the first thorough investigations into the nature of combat with a questionnaire distributed to French officers in the 1860s. One officer’s response to du Picq stated quite frankly that “a good many soldiers fired into the air at long distances,” while another observed that “a certain number of our soldiers fired almost in the air, without aiming, seeming to want to stun themselves, to become drunk on rifle fire during this gripping crisis.”

I’ve mentioned Grossman’s thoughts on posturing before.

Activist vs. Passivist

December 21st, 2014

Being a Social Justice Warrior is intentionally uncomfortable, because you’re never outraged enough to solve all of humanity’s problems:

He thinks he’s talking about progressivism versus conservativism, but he isn’t. A conservative happy with his little cabin and occasional hunting excursions, and a progressive happy with her little SoHo flat and occasional poetry slams are psychologically pretty similar. So are a liberal who abandons a cushy life to work as a community organizer in the inner city and fight poverty, and a conservative who abandons a cushy life to serve as an infantryman in Afghanistan to fight terrorism. The distinction Cliff is trying to get at here isn’t left-right. It’s activist versus passivist.

As part of a movement recently deemed postpolitical, I have to admit I fall more on the passivist side of the spectrum — at least this particular conception of it. I talk about politics when they interest me or when I enjoy doing so, and I feel an obligation not to actively make things worse. But I don’t feel like I need to talk nonstop about whatever the designated Issue is until it distresses me and my readers both.

Possibly I just wasn’t designed for politics. I’m actively repulsed by most protests, regardless of cause or alignment, simply because the idea of thousands of enraged people joining together to scream at something — without even considering whether the other side has a point — terrifies and disgusts me. Even hashtag campaigns and other social media protest-substitutes evoke the same feeling of panic.

T. Greer called my attention to this passage:

Five million people participated in the #BlackLivesMatter Twitter campaign. Suppose that solely as a result of this campaign, no currently-serving police officer ever harms an unarmed black person ever again. That’s 100 lives saved per year times let’s say twenty years left in the average officer’s career, for a total of 2000 lives saved, or 1/2500th of a life saved per campaign participant. By coincidence, 1/2500th of a life saved happens to be what you get when you donate $1 to the Against Malaria Foundation. The round-trip bus fare people used to make it to their #BlackLivesMatter protests could have saved ten times as many black lives as the protests themselves, even given completely ridiculous overestimates of the protests’ efficacy.

The moral of the story is that if you feel an obligation to give back to the world, participating in activist politics is one of the worst possible ways to do it.
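As a sanity check, the arithmetic in that passage is easy to reproduce. Here is a minimal sketch in Python; every number in it comes from the quoted passage itself, including the $1-buys-1/2500th-of-a-life figure attributed to the Against Malaria Foundation:

    # Back-of-envelope check of the quoted estimate. All inputs are the
    # passage's own (deliberately generous) assumptions, not real data.
    participants = 5_000_000   # #BlackLivesMatter campaign participants
    lives_per_year = 100       # assumed lives saved per year
    career_years = 20          # assumed years left in an officer's career

    total_lives = lives_per_year * career_years         # 2,000 lives
    lives_per_participant = total_lives / participants  # 1/2,500 of a life

    # The passage pegs $1 to the Against Malaria Foundation at
    # 1/2,500th of a life, i.e. about $2,500 per life saved.
    amf_dollars_per_life = 2_500
    equivalent_donation = lives_per_participant * amf_dollars_per_life

    print(f"Lives saved per participant: 1/{int(participants / total_lives)}")
    print(f"Equivalent AMF donation per participant: ${equivalent_donation:.2f}")

Run it and you get 1/2500 of a life and a $1.00 equivalent donation per participant, which is exactly the comparison the passage is making against bus fare.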

Moralizing Religions

December 21st, 2014

Today’s most popular religions all focus on morality:

Religion wasn’t always based on morality, explains Nicolas Baumard, a psychologist at the École Normale Supérieure in Paris. For the first several thousand years of human recorded history, he notes, religions were based on rituals and short-term rewards. If you wanted rain or a good harvest, for example, you made the necessary sacrifices to the right gods. But between approximately 500 B.C.E. and 300 B.C.E., a radical change appeared all over Eurasia as new religions sprung up from Greece to India to China. All of these religions shared a focus on morality, self-discipline, and asceticism, Baumard says. Eventually these new religions, such as Stoicism, Jainism, and Buddhism, and their immediate successors, including Christianity and Islam, spread around the globe and became the world religions of today. Back in 1947, German philosopher Karl Jaspers dubbed the pivotal time when these new religions arose “the Axial Age.”

So what changed? Baumard and his colleagues propose one simple reason: People got rich. Psychologists have shown that when people have fewer resources at their disposal, prioritizing rewards in the here and now is the best strategy. Saving for the future—much less the afterlife—isn’t the best use of your time when you are trying to find enough to eat today. But when you become more affluent, thinking about the future starts to make sense, and people begin to forgo immediate rewards in order to prioritize long-term goals.

Not coincidentally, the values fostered by affluence, such as self-discipline and short-term sacrifice, are exactly the ones promoted by moralizing religions, which emphasize selflessness and compassion, Baumard says. Once people’s worldly needs were met, religion could afford to shift its focus away from material rewards in the present and toward spiritual rewards in the afterlife. Perhaps once enough people in a given society had made the psychological shift to long-term planning, moralizing religions arose to reflect those new values. “Affluence changed people’s psychology and, in turn, it changed their religion,” Baumard says.

To test that hypothesis, Baumard and his colleagues gathered historical and archaeological data on many different societies across Eurasia in the Axial Age and tracked when and where various moralizing religions emerged. Then they used that data to build a model that predicted how likely it was that a moralizing religion would appear in all sorts of different societies—big or small, rich or poor, primitive or politically complex.

It turned out that one of the best predictors of the emergence of a moralizing religion was a measure of affluence known as “energy capture,” or the amount of calories available as food, fuel, and resources per day to each person in a given society. In cultures where people had access to fewer than 20,000 kilocalories a day, moralizing religions almost never emerged. But when societies crossed that 20,000 kilocalorie threshold, moralizing religions became much more likely, the team reports online today in Current Biology. “You need to have more in order to be able to want to have less,” Baumard says.
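The article describes the model only loosely, but the headline result, energy capture acting as a near-threshold predictor, is easy to caricature in a few lines. A minimal sketch follows, with made-up example societies and a made-up function name; the team's actual model in Current Biology is a proper statistical analysis over historical data, not this toy:

    # Toy caricature of the threshold result described above. The 20,000
    # kcal/person/day figure is from the article; the societies are invented.
    ENERGY_CAPTURE_THRESHOLD = 20_000  # kcal per person per day

    def moralizing_religion_likely(energy_capture_kcal: float) -> bool:
        # Below the threshold, moralizing religions "almost never" emerged;
        # above it, they became "much more likely."
        return energy_capture_kcal >= ENERGY_CAPTURE_THRESHOLD

    societies = {
        "small forager society (hypothetical)": 8_000,
        "early agrarian state (hypothetical)": 15_000,
        "Axial Age city-state (hypothetical)": 25_000,
    }
    for name, kcal in societies.items():
        print(f"{name}: moralizing religion likely? "
              f"{moralizing_religion_likely(kcal)}")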

The Role of Hate

December 21st, 2014

Terror bombing — pardon, strategic bombing — failed to terrorize its victims into submission. David Grossman (On Killing) suggests that this is because a fear of danger doesn’t cause psychiatric casualties without a fear of hate:

Airpower advocates persist in their support of strategic bombing campaigns (which are rooted in an attrition warfare mentality), even in the face of evidence such as the post-World War II Strategic Bombing Survey, which, in the words of Paul Fussell, ascertained that: “German military and industrial production seemed to increase — just like civilian determination not to surrender — the more bombs were dropped.” Historically, aerial and artillery bombardments are psychologically effective, but only in the front lines when they are combined with the Wind of Hate as manifested in the threat of the physical attack that usually follows such bombardments.

This is why there were mass psychiatric casualties resulting from World War II artillery bombardments, but World War II’s massed bombing of cities was surprisingly counterproductive in breaking the enemy’s will. Such bombardments without an accompanying close-range assault, or at least the threat of such an assault, are ineffective and may even serve no other purpose than to stiffen the resolve of the enemy!

This is why putting friendly troop units in the enemy’s rear is infinitely more important and effective than even the most comprehensive bombardments in his rear, or attrition along his front. This argues strongly for a doctrine similar to the World War II German principle of the Kesselschlacht (i.e., a constant striving for decisive action in the enemy rear) as an essential element in obtaining decisive victory. In this doctrine the Aufrollen (i.e., rolling up the flanks after making a penetration) becomes a secondary operation which is conducted solely to support the Schwerpunkt or the main thrust, which is flexibly directed into the enemy’s center of gravity by the commander’s intent.

In the Korean War the U.S. Army experienced the psychological effectiveness of an enemy who directed penetrations and surprise attacks behind our own lines. During the early years of that war, the rate of psychiatric casualties was almost seven times higher than the average rate for World War II. Only after the war settled down, lines stabilized, and the threat of having enemy forces in the rear areas decreased did the average incidence of psychiatric casualties go down to slightly less than that of World War II. Later, when U.N. forces were able to penetrate and threaten the enemy’s rear area during the Inchon landing, these same processes began to work in their favor.

Even in the ideal bombing grounds of the barren deserts of the 1991 Gulf War, where for over a month the full weight of American, British, French, Canadian, and Italian airpower was brought to bear on the conscript soldiers of a Third World despot, enemy units did not and would not surrender in large numbers until faced with maneuver units on the ground and in their rear.

The Peripheral

December 20th, 2014

William Gibson’s new novel, The Peripheral, explores two futures:

The second future takes place in a 22nd-century post-singularity London, where a recently disgraced publicist navigates a surveillance state ruled by a kleptocracy. Today, the singularity is a theoretical point at which artificial intelligence becomes smarter than us and lies outside our control. According to singularity devotees, we cannot predict what happens at this juncture, but some ideas include mankind uploading our consciousness into computers or causing our own end by runaway nanotechnology. Gibson’s vision of the singularity is a “nerd rapture,” and it’s different and more human than any other singularity depiction I’ve encountered.

“I’ve been making fun of the singularity since I first encountered the idea,” he says. “What you get in The Peripheral is a really fucked-up singularity. It’s like a half-assed singularity coupled with that kind of neoreactionary, dark enlightenment shit. It’s the singularity as experienced by Joseph Heller. We’re people, and we fuck up. We do a singularity, we’re going to fuck it up.”

Indeed, in the novel, we do. An apocalypse Gibson refers to only as “the Jackpot” devastates Earth’s population, and Gibson’s “half-assed singularity” comes along in time to save only the moneyed elite. Gibson’s vision is a multicausal apocalypse, one that refutes the idea of the single-trigger apocalypses (an epidemic, a nuclear holocaust, an asteroid) that have preoccupied man since before the Bible. I asked him why the people with money survived. His response: “Why wouldn’t they?”

In The Peripheral, while those with money survive “the Jackpot,” they have no more control over that technology than the poor do. They merely have more access to it.

I’m more than a little curious about his use of neoreactionary and dark enlightenment.

The Telegraph’s Best War and History Books

December 20th, 2014

The Telegraph selects the best war and history books ever written, starting with some obvious classics:

The next selection, 1066 & All That — subtitled “A Memorable History of England, comprising all the parts you can remember, including 103 Good Things, 5 Bad Kings and 2 Genuine Dates” — seems exceedingly English, in a good way.

Whenever I see All Quiet on the Western Front, by Erich Maria Remarque, on a list like this, I immediately think of how Storm of Steel, by Ernst Jünger, rarely makes such lists. The story I’d always heard was that All Quiet was acceptably anti-war, while Storm was too pro-war — but now that I’ve read Storm, I’m not sure I’d call it pro-war at all. It simply feels like a sincere war memoir — although I’ve also heard that different editions had very different tones.

Legion of the Damned, on the other hand, sounds like a very different book from the others on the list:

Written in highly suspicious circumstances by a highly suspicious author (or perhaps his wife, or editor), this is the first in a series of novels that became cartoonish, yet for all that it packs immense power, describing the misadventures of a group of German soldiers on the Eastern Front.

I got a chuckle out of how the Telegraph included the cover for the Warhammer 40k novel of the same name.

Anyway, enjoy the whole list.

Parasympathetic Backlash

December 20th, 2014

Under the stress of combat, the body shifts resources to the sympathetic nervous system and away from the parasympathetic — for a while, as David Grossman (On Killing) explains:

The sympathetic nervous system mobilizes and directs the body’s energy resources for action. It is the physiological equivalent of the frontline soldiers who actually do the fighting in a military unit.

The parasympathetic system is responsible for the body’s digestive and recuperative processes. It is the physiological equivalent of the cooks, mechanics, and clerks that sustain a military unit over an extended period of time.

Usually these two systems sustain a general balance between their demands upon the body’s resources, but during extremely stressful circumstances the “fight or flight” response kicks in and the sympathetic nervous system mobilizes all available energy for survival. This is the physiological equivalent of throwing the cooks, bakers, mechanics, and clerks into the battle. In combat this very often results in nonessential activities such as digestion, bladder control, and sphincter control being completely shut down. This process is so intense that soldiers very often suffer stress diarrhea, and it is not at all uncommon for them to urinate and defecate in their pants as the body literally “blows its ballast” in an attempt to provide all the energy resources required to ensure its survival.

It doesn’t take a rocket scientist to guess that a soldier must pay a heavy physiological price for an enervating process this intense. The price that the body pays is an equally powerful backlash when the neglected demands of the parasympathetic system become ascendant. This parasympathetic backlash occurs as soon as the danger and the excitement are over, and it takes the form of an incredibly powerful weariness and sleepiness on the part of the soldier.

This brings us to the criticality of the reserve:

Napoleon stated that the moment of greatest danger was the instant immediately after victory, and in saying so he demonstrated a remarkable understanding of the way in which soldiers become physiologically and psychologically incapacitated by the parasympathetic backlash that occurs as soon as the momentum of the attack has halted and the soldier briefly believes himself to be safe. During this period of vulnerability, a counterattack by fresh troops can have an effect completely out of proportion to the number of troops attacking.

It is basically for this reason that the maintenance of an “unblown” reserve has historically been essential in combat, with battles often revolving around which side can hold out and deploy their reserves last. The reserve has always played a vital role in combat, but du Picq was one of the earliest advocates not only of “holding out a reserve as long as possible for independent action when the enemy has used his own,” but he also insisted on the revolutionary concept that this process “ought to be applied downward” to the lowest levels. He also perceived the technological process of increasing lethality on the battlefield which continues today. “There is more need than ever to-day, for protecting… the reserves. The power of destruction increases, the morale [of human beings] stays the same.” Clausewitz further understood and put great emphasis on the danger of reserve forces becoming prematurely enervated and exhausted when he cautioned that the reserves should always be maintained out of sight of the battle.

These same basic psycho-physiological principles explain why successful military leaders have historically maintained the momentum of a successful attack. Pursuing and maintaining contact with a defeated enemy is vital in order to completely destroy the enemy (the vast majority of the killing in historical battles occurred during the pursuit, when the enemy turned his back), but it is also valuable to maintain contact with the enemy as long as possible in order to delay that inevitable pause in the battle which will result in the “culmination point.” The culmination point is usually caused as much by logistical processes as anything else, but once the momentum of the pursuit stops (for whatever reasons) there are severe physiological and psychological costs to be paid, and the commander must realize that his forces will begin to immediately slip into a powerful parasympathetic backlash and become vulnerable to any enemy counterattack. An unblown reserve force ready to complete the pursuit is a vital aspect of maneuver warfare and can be of great value in ensuring that this most destructive phase of the battle is effectively executed.

Smart People Read Biographies

December 19th, 2014

Smart people read biographies, Ryan Holiday says, because they’re some of the most actionable and educational reading you can do, so he recommends his favorites:

  1. Plutarch’s Lives, Plutarch – Aside from being the basis of much of Shakespeare, he was one of Montaigne’s favorite writers.
  2. The Power Broker, Robert Caro – Like Huey Long and Willie Stark, Robert Moses was a man who got power, loved power and was transformed by power.
  3. Socrates: A Man for Our Times, Napoleon: A Life, Churchill, Paul Johnson – Paul Johnson is the kind of author whose sweeping judgements you can trust, so you leave these books with what feels like a very solid understanding of who his subjects are as people.

He recommends many more.

Real Athletes Throw Knives

December 19th, 2014

When I saw Christopher McDougall (Born to Run, Natural Born Heroes) claim that real athletes throw knives, I almost dismissed him, because throwing knives are cool, but they’re not at all practical — but he was way ahead of me:

Brewster came to my house one afternoon to teach me no-spin knife throwing. He mounted a slice of log on an easel, pulled out three knives, and — as he whipped them in from all kinds of angles and distances — demonstrated why no-spin might be the answer to one of the great riddles of modern anthropology. It goes like this:

Hitting a target is an amazing act of calculation, because often you’re not aiming where something is; you’re aiming where it isn’t. You have to factor angles, directions, and muscle force, all of it in a blink.

We’re the only animal that can pull it off, and once we did, it changed everything. Learning to throw transformed us from prey into predators. Better hunting gave us more food; more food grew us bigger brains. We also upgraded our software:

Throwing taught us the kind of sequential thought that would become the human imagination and spur the creation of language, technology, medicine, and art.

So explain this: If humans are such natural marksmen, why are the majority of us like Shaq at the free-throw line?

“Yeah, that was me,” Brewster says. “I had all the cards stacked against me. Never played baseball, no real sports background at all. First time I threw a knife, I failed miserably.” He’d seen videos of expert throwers, the kind who send knives flipping end-over-end toward showgirls, but when he tried to copy them, he clanged all over the place. Then one day while working construction, Brewster began monkeying around with a screwdriver. If he held his finger straight up along a screwdriver’s spine, he could fling it perfectly into the ground. Every time. A quick Internet search later, Brewster found himself in the midst of an entire tribe experimenting with the same throwback throw. There was Roy Hutchinson, aka “The Great Throwzini,” and Xolette, a high-school science teacher in Florida who likes to no-spin butter knives across her kitchen.

Brewster explains that the spin technique — the kind of throwing you see at every circus and Vegas show — is inherently flawed. It’s not natural. Spin is terrific for long tosses, and it can be supremely accurate, but only under artificial conditions. For a spin to work, both you and the target have to be stationary, and you can only be a precise number of steps away. Shift even a little and you shank.

But with no-spin, you cash in on the fact that your index finger is neurologically wired to your eyeballs. In fact, you can learn no-spin with startling ease. You’ll need a target, naturally. Any solid chunk of wood will do. I just sawed a round slice off the end of a log and bolted it to an old picnic table turned on its side and braced with a two-by-four. (So easy, it almost took me longer to write it than do it.)

Next up: your blades. One of the beauties of no-spin is that just about anything will do. Steak knives, butter knives, screwdrivers, metal chopsticks, nails — if it’s got a point, you can fling it. For ease and safety, though, Brewster recommends a tempered-steel knife that won’t shatter or feel weird in your hand. He makes his own by hand (and sells them at FlyingSteel.com) and brought me a set of three of the simple black shanks he calls North Wind.

The best place to start is so close to the target you could almost reach out and touch it. “The nearer you are, the less you’ll try to overpower the throw,” Brewster explains. “You’ll let the knife sail on its own.” For your first throws, face the target slightly in profile with your left foot forward (opposite for lefties). Then remember these four steps:

  1. GRIP the knife lightly, with your index finger straight up.
  2. EXTEND your arm back and high over your head.
  3. Push your ELBOW forward, not your hand.
  4. RELEASE when the knife passes your ear and the point is still aimed at the sky.

As soon as you get the feel (and don’t be astonished if it only takes two or three throws), you can begin stepping back, adding distance each time and experimenting with angles. With a little practice, you’ll soon be letting fly the way your ancestors did: fast, on the move, from any direction.

Bungling the Conclusions to Wars

December 19th, 2014

Insurgencies aren’t going away, so we should work toward doing counterinsurgencies better, Max Boot argues:

The first lesson may sound like a no-brainer, but it has been routinely ignored: plan for what comes after the overthrow of a regime. In Afghanistan and Iraq, the George W. Bush administration failed to adequately prepare for what the military calls “Phase IV,” the period after immediate victory — an oversight that allowed law and order to break down in both countries and insurgencies to metastasize. Yet Obama, despite his criticism of Bush’s conduct of the Iraq war, repeated the same mistake in Libya. In 2011, U.S. and NATO forces helped rebels topple Muammar al-Qaddafi but then did very little to help the nascent Libyan government establish control of its own territory. As a result, Libya remains riven by militias, which have plunged the country into chaos. Just this past July — almost two years after U.S. Ambassador Christopher Stevens was killed in Benghazi — the State Department had to evacuate its entire embassy staff from Tripoli after fighting there reached the airport.

This is not a problem confined to Bush or Obama. The United States has a long tradition of bungling the conclusions to wars, focusing on narrow military objectives while ignoring the political end state that troops are supposed to be fighting for. This inattention made possible the persecution of freed slaves and their white champions in the South after the American Civil War, the eruption of the Philippine insurrection after the Spanish-American War, the rise of the Nazis in Germany and the Communists in Russia after World War I, the invasions of South Korea and South Vietnam after World War II, and the impetus for the Iraq war after the Gulf War. Too often, U.S. officials have assumed that all the United States has to do is get rid of the bad guys and the postwar peace will take care of itself. But it simply isn’t so. Generating order out of chaos is one of the hardest tasks any country can attempt, and it requires considerable preparation of the kind that the U.S. military undertook for the occupation of Germany and Japan after 1945 — but seldom did before and has seldom done since.

Welcome to the War

December 19th, 2014

David Grossman (On Killing) introduces the role of physiological arousal and fear with this anecdote from Six War Years 1939–1945:

And then a shell lands behind us, and another over to the side, and by this time we’re scurrying and the Sarge and I and another guy wind up behind a wall. The sergeant said it was an 88 and then he said, “Shit and shit some more.”

I asked him if he was hit and he sort of smiled and said no, he had just pissed his pants. He always pissed them, he said, just when things started and then he was okay. He wasn’t making any apologies either, and then I realized something wasn’t quite right with me, either. There was something warm down there and it seemed to be running down my leg. I felt, and it wasn’t blood. It was piss.

I told the Sarge, I said, “Sarge, I’ve pissed too,” or something like that and he grinned and said, “Welcome to the war.”

How You Know

December 18th, 2014

Paul Graham (Hackers & Painters) remembers little of what he’s read:

I’ve read Villehardouin’s chronicle of the Fourth Crusade at least two times, maybe three. And yet if I had to write down everything I remember from it, I doubt it would amount to much more than a page. Multiply this times several hundred, and I get an uneasy feeling when I look at my bookshelves. What use is it to read all these books if I remember so little from them?

[...]

Reading and experience train your model of the world. And even if you forget the experience or what you read, its effect on your model of the world persists. Your mind is like a compiled program you’ve lost the source of. It works, but you don’t know why.

The place to look for what I learned from Villehardouin’s chronicle is not what I remember from it, but my mental models of the crusades, Venice, medieval culture, siege warfare, and so on. Which doesn’t mean I couldn’t have read more attentively, but at least the harvest of reading is not so miserably small as it might seem.

This is one of those things that seem obvious in retrospect. But it was a surprise to me and presumably would be to anyone else who felt uneasy about (apparently) forgetting so much they’d read.

Realizing it does more than make you feel a little better about forgetting, though. There are specific implications.

For example, reading and experience are usually “compiled” at the time they happen, using the state of your brain at that time. The same book would get compiled differently at different points in your life. Which means it is very much worth reading important books multiple times. I always used to feel some misgivings about rereading books. I unconsciously lumped reading together with work like carpentry, where having to do something again is a sign you did it wrong the first time. Whereas now the phrase “already read” seems almost ill-formed.

Intriguingly, this implication isn’t limited to books.

104 Yards, Strong Hand Only

December 18th, 2014

A few weeks ago an active shooter shot up downtown Austin, but that’s not the interesting part, Chris Hernandez explains:

Sergeant Johnson shot him from 104 yards away, with one shot from a pistol, firing one-handed, while holding the reins of two horses.

A few comments I’ve read online suggested the 104-yard pistol shot was an Austin PD conspiracy, because such a shot is impossible. I’ve also heard people say Johnson must be lying or exaggerating. You just can’t shoot someone with one shot, one-handed, with a pistol, from over a hundred yards away.

My own experience and training leads me to a different conclusion. That shot would be amazingly difficult, but not impossible.

Most police officers never train to shoot past twenty-five yards. I’ve worked for three departments, plus served as a United Nations police officer in Kosovo, and I can’t recall ever shooting a pistol at long range during police training. But I’ve taken a few pistol courses from private training companies. One of them was at Tiger Valley, near Waco, Texas.

The owner/instructor, TJ Pilling, lined us up on the pistol range one day and said we were going to have a competition. He told us to fire one shot at our targets, which were half-size steel silhouettes. We were at twenty-five yards, and we all hit. He backed us up to thirty-five yards and told us to fire again. We all hit. Forty-five yards. A few missed. Fifty-five yards. Only I and one other officer hit. Sixty-five. I was firing a .40 Glock 22, and aimed just over the top of the target’s head. I missed. The other officer hit.

TJ asked me if I aimed high. I told him I did. He said, “Aim center mass.” I did, and shocked the hell out of myself by hitting the target.

TJ walked us to a bay with a full-size silhouette target at 110 yards, and said, “If you have a 9mm, aim center mass. If it’s a .40, aim at the neck.”

The guys with 9mms started pinging the crap out of the target. I fired several shots standing and couldn’t get a hit, so I went prone and tried again. Eventually, after a spotter helped me walk the rounds in like a mortar, I made repeated hits.

I was, to put it mildly, surprised. I’d been a cop for twelve years at that point, and all my training had focused on shooting twenty-five yards and closer. I’d been in the military seventeen years but received almost no pistol training from either the Marines or Army. Conventional wisdom taught me pistols were last-ditch, close-in weapons, and shooting at someone even twenty-five yards away was stretching it. I had struggled to make accurate hits at twenty-five, had missed a target at that range more than once, and had seen cops and soldiers miss numerous shots even closer than that.
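The neck-versus-center-mass holdovers make sense once you work out the drop. Here is a minimal sketch of the trajectory arithmetic, assuming a vacuum trajectory (no air resistance or sight-height correction) and typical muzzle velocities I am supplying myself, roughly 350 m/s for 9mm and 300 m/s for .40 S&W; none of these figures come from Hernandez’s post:

    # Rough bullet-drop estimate; velocities are assumed typical values,
    # not figures from the post. A vacuum trajectory understates real
    # drop (air resistance slows the bullet), but it shows the scale.
    def drop_inches(distance_m: float, muzzle_velocity_ms: float) -> float:
        g = 9.81                             # gravity, m/s^2
        t = distance_m / muzzle_velocity_ms  # time of flight, s
        drop_m = 0.5 * g * t ** 2            # free fall during flight
        return drop_m / 0.0254               # meters to inches

    range_m = 110 * 0.9144  # the 110-yard target, in meters
    print(f"9mm (~350 m/s): {drop_inches(range_m, 350):.0f} inches of drop")
    print(f".40 (~300 m/s): {drop_inches(range_m, 300):.0f} inches of drop")

Under those assumptions the 9mm drops roughly 16 inches at 110 yards and the .40 roughly 22; that extra half-foot of drop is about the distance from a silhouette’s center mass up to its neck, which squares with TJ’s advice to hold higher with the slower .40.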

Why Do Human Children Stay Small For So Long?

December 18th, 2014

Why does it take so long for human children to grow up?

A male chimp and male human, for example, both end up with the same body weight, but they grow very differently: at year one the human weighs twice as much as the chimp, but at eight the chimp weighs twice as much as the human. The chimp then reaches its adult weight by 12 — six years before the human. A male gorilla is also a faster-growing primate — a 330-pound male gorilla weighs 110 pounds by its fifth birthday and 265 pounds by its tenth.

Clues to the answer can be found in the young human brain’s need for energy. Radioactive tracers allow scientists to measure the glucose used in different areas of the brain, but this procedure is used only rarely, when it is justified by the investigation of neurological problems. However, the few cases we do have reveal how radically different the childhood brain is from the adult or infant brain.

From about the age of four to puberty, the young brain guzzles glucose — the cerebral cortex, its largest part, uses nearly (or more than) double the glucose used earlier or later in life. This creates a problem. A child’s body is a third the size of an adult’s, but its brain is nearly adult-sized. Calculated as a share, a child’s brain takes up half of all the energy the child uses.

Map child growth against what is known about brain energy consumption and the two curves shadow each other inversely: as one goes up, the other goes down. The period in which the brain’s need for glucose peaks falls just when body growth slows most. Why? In a recent study in the Proceedings of the National Academy of Sciences, I proposed that this prevents a potential conflict over blood glucose that might otherwise arise between brawn and brain.

A young child has at any moment a limited amount of glucose in its blood circulation (3.4 g — about the weight of three Smartie candies). Fortunately, a child’s liver can quickly generate glucose, provided other organs do not compete with the brain for it. But as the French child exercise physiologist Paul Delamarche noted:

Even at rest, it would appear to be difficult for children to maintain blood glucose concentration at a steady level; an immaturity of their gluco-regulatory system would seem to be likely, therefore causing a delay in an adequate response to any stimulus to hypoglycemia like prolonged exercise.

Organs elsewhere in the body fuel themselves with energy sources that do not compete with the brain, such as fatty acids. But skeletal muscle can compete when exertion is intense and sustained.

In adults, the liver quickly ramps up its generation of glucose so even active brawn does not usually compete against the brain. But conflict can arise even in adults, and it could pose a real threat to children. Luckily they do not let it happen: they stop exertion if it gets intense and sustained. Not that this makes children inactive — they do even more low and moderate exercise than adolescents and adults.

So putting a brake on growth in childhood helps limit skeletal muscle as a potential glucose competitor to the brain. And not only are children’s bodies smaller, but they contain, as a percentage of body weight, less skeletal muscle than adults do. And even that skeletal muscle, some research suggests, is of a type that uses less glucose than that of active adults.

So the human growth rate inversely shadows the increased energy use of the child’s brain.