Harvey’s Casino

Saturday, April 23rd, 2016

In the early morning hours of August 26, 1980, a team of men wearing white jumpsuits rolled an IBM copy machine into Harvey’s Resort Hotel and Casino in Stateline, Nevada, near Lake Tahoe — only it wasn’t a copy machine:

So began one of the most unusual cases in [FBI] history.

A note left with the bomb—titled STERN WARNING TO THE MANAGEMENT AND BOMB SQUAD—began ominously: “Do not move or tilt this bomb, because the mechanism controlling the detonators in it will set it off at a movement of less than .01 of the open end Ricter scale.”

“Do not try to take it apart,” the note went on. “The flathead screws are also attached to triggers and as much as ¼ to ¾ of a turn will cause an explosion. …This bomb is so sensitive that the slightest movement either inside or outside will cause it to explode. This bomb can never be dismantled or disarmed without causing an explosion. Not even by the creator.”

An investigator examines the Harvey’s bomb, which contained nearly 1,000 pounds of dynamite and a variety of triggering mechanisms that made it virtually undefeatable.

The “creator,” we later discovered, was 59-year-old John Birges, Sr.—who wanted $3 million in cash in return for supplying directions to disconnect two of the bomb’s three automatic timers so it could be moved to a remote area before exploding.

The device—two steel boxes stacked one atop the other—contained nearly 1,000 pounds of dynamite. Inside the resort, Birges made sure the bomb was exactly level, then armed it using at least eight triggering mechanisms.


“We had never seen anything quite like it,” said retired Special Agent Chris Ronay, an explosives examiner who was called to the scene along with other experts.

After being discovered, the bomb was photographed, dusted for fingerprints, X-rayed, and studied. Finally, more than 30 hours later, a plan was agreed upon: if the two boxes could be severed using a shaped charge of C4 explosive, it might disconnect the detonator wiring from the dynamite.

Harvey’s and other nearby casinos in Lake Tahoe were evacuated, and on the afternoon of August 27, the shaped charge was remotely detonated.

The plan was the best one available at the time, but it didn’t work. The bomb exploded, creating a five-story crater in the hotel. “Looking up from ground level,” Ronay said, “you could see TV sets swinging on electric cords and toilets hanging on by pipes. Debris was everywhere.” Fortunately, because of the evacuation, no one was killed or injured.


John Birges, Sr. (1922–1996), was a Hungarian immigrant who settled in Clovis, California. He flew for the German Luftwaffe during World War II, was captured, and was sentenced to 25 years of hard labor in a Russian gulag. Eight years into his sentence, he escaped by blowing it up. He immigrated to the U.S. and built a successful landscaping business, but a gambling addiction cost him a large amount of money and prompted the bomb plot. His gambling debt and his experience with explosives were primary pieces of evidence linking him to the Lake Tahoe bombing.

Birges was eventually arrested based on a tip. One of his sons had revealed to his then-girlfriend that his father had placed a bomb in Harvey’s. After the two broke up, she was on a date with another man when they heard about a reward for information, and she informed her new boyfriend about Birges. This man then called the FBI.

(Hat tip to Mangan, who cited Wikipedia.)

Intention is salient

Saturday, April 23rd, 2016

Bernie Sanders has proclaimed that democratic socialism isn’t un-American, Harrison Searles notes:

He says that it is in us all. And in that he is correct: The desire for an egalitarian and unified society is an element of human nature. Such desires are part of our genetic inheritance. A yearning for democratic socialism is a legacy of the band societies that our distant ancestors lived in. Those social instincts were an essential part of the success of the hominid line.

With those instincts, our ancestors were able to traverse the evolutionary minefield of selfishness and competing interests through speech and participatory, consensus-based decisionmaking. Egalitarian decisionmaking complemented the other traits defining the hominid line, including linguistic talent and hypercognition, to make Homo sapiens a master of social interaction. Many animals exploit cooperation for survival, but Homo sapiens are astounding in their ability to do so creatively. In their shared struggle for existence, our ancestors worked socially together in their egalitarian bands. Until the dawn of civilization, just 10,000 years ago, those simple societies were the context of all human interaction. Our genes have not changed very much since the start of civilization. At root, we are still band-man.

With the emergence of vastly complex commercial societies, cooperation began to take on a new form and meaning. Abstract rules, rather than instinctual impulses, came to guide how cooperation worked. The principles of property and voluntary agreement were extended to a widening array of things and activities. With the invention of such abstract rules came the invention of the autonomous individual.

Today, in the context of family and friends, people still rely on primordial social instincts. But those natural proclivities have taken an ever more secondary role in governing cooperation within the whole of society. In “The Fatal Conceit” and other writings, Friedrich Hayek argued that the desire for policies like those favored by Sanders is an atavistic reassertion of people’s primal social instincts. Hayek suggested that there was a conflict between the instincts biologically evolved in bands and the abstract rules culturally evolved in civilization. Yet the yearnings for solidarity and centricity remain a lingering part of human nature. When they are not treated with caution, those yearnings can be turned into misguided policies.

What Daniel Klein calls “the people’s romance” remains a hazard of modern politics. It is the yearning for solidarity; it is the yearning for sentiment, action, and experience that encompasses “the people.” It makes us uneasy with the abstract rules that individuate and detach our social experiences. In an experimental setting, Klein and collaborators have shown our demand for encompassing experience and sentiment.

Over millions of years our minds evolved to read emotional cues and to form narratives of intention, but not to see unintended consequences. Our minds are adapted for emotional intelligence, not cost-benefit analysis. The people’s romance is a systemic bias that we have to be mindful of. As Paleolithic animals in modernity, we have to be wary of temptations to appraise a policy by the emotional impact of what we see rather than by the consequences of what we don’t see. What we see, above all, is what we imagine to be intended by the central players. Intention is salient.

A Thought Collective

Friday, April 22nd, 2016

Scientific inquiry is prone to the eternal rules of human social life:

In a 2015 paper titled Does Science Advance One Funeral at a Time?, a team of scholars at the National Bureau of Economic Research sought an empirical basis for a remark made by the physicist Max Planck: “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.”

The researchers identified more than 12,000 “elite” scientists from different fields. The criteria for elite status included funding, number of publications, and whether they were members of the National Academies of Science or the Institute of Medicine. Searching obituaries, the team found 452 who had died before retirement. They then looked to see what happened to the fields from which these celebrated scientists had unexpectedly departed, by analysing publishing patterns.

What they found confirmed the truth of Planck’s maxim. Junior researchers who had worked closely with the elite scientists, authoring papers with them, published less. At the same time, there was a marked increase in papers by newcomers to the field, who were less likely to cite the work of the deceased eminence. The articles by these newcomers were substantive and influential, attracting a high number of citations. They moved the whole field along.

A scientist is part of what the Polish philosopher of science Ludwik Fleck called a “thought collective”: a group of people exchanging ideas in a mutually comprehensible idiom. The group, suggested Fleck, inevitably develops a mind of its own, as the individuals in it converge on a way of communicating, thinking and feeling.

This makes scientific inquiry prone to the eternal rules of human social life: deference to the charismatic, herding towards majority opinion, punishment for deviance, and intense discomfort with admitting to error. Of course, such tendencies are precisely what the scientific method was invented to correct for, and over the long run, it does a good job of it. In the long run, however, we’re all dead, quite possibly sooner than we would be if we hadn’t been following a diet based on poor advice.

Left Cop, Right Cop

Friday, April 22nd, 2016

Somewhere in the Clinton years, Z Man began to sour on official conservatism:

Anyway, I slowly came to the conclusion that the whole Right-Left dynamic was just a myth. One of the things about working in Washington, even briefly, is you learn quickly that politics is nothing like you see on TV. Two people on a show ripping one another apart will be at the bar after the show yukking it up like old pals. That’s because they are old pals. The Right-Left narrative has simply become a convenient framework for the reality show called politics. This has been true since the ’80s.

Once you free your mind, if you will, of that framework through which you are expected to see your world, you have to make sense of what you see. If the Right-Left construct is just a version of good cop/bad cop where the people in the media hustle the rest of us so they can live above their utility, then what’s really going on in the world? How do things really work?

The Sugar Conspiracy

Thursday, April 21st, 2016

The sugar conspiracy seems so brazen in retrospect:

Robert Lustig is a paediatric endocrinologist at the University of California who specialises in the treatment of childhood obesity. A 90-minute talk he gave in 2009, titled Sugar: The Bitter Truth, has now been viewed more than six million times on YouTube. In it, Lustig argues forcefully that fructose, a form of sugar ubiquitous in modern diets, is a “poison” culpable for America’s obesity epidemic.

A year or so before the video was posted, Lustig gave a similar talk to a conference of biochemists in Adelaide, Australia. Afterwards, a scientist in the audience approached him. Surely, the man said, you’ve read Yudkin. Lustig shook his head. John Yudkin, said the scientist, was a British professor of nutrition who had sounded the alarm on sugar back in 1972, in a book called Pure, White and Deadly.

“If only a small fraction of what we know about the effects of sugar were to be revealed in relation to any other material used as a food additive,” wrote Yudkin, “that material would promptly be banned.” The book did well, but Yudkin paid a high price for it. Prominent nutritionists combined with the food industry to destroy his reputation, and his career never recovered. He died, in 1995, a disappointed, largely forgotten man.

[...]

When Yudkin was conducting his research into the effects of sugar, in the 1960s, a new nutritional orthodoxy was in the process of asserting itself. Its central tenet was that a healthy diet is a low-fat diet. Yudkin led a diminishing band of dissenters who believed that sugar, not fat, was the more likely cause of maladies such as obesity, heart disease and diabetes. But by the time he wrote his book, the commanding heights of the field had been seized by proponents of the fat hypothesis. Yudkin found himself fighting a rearguard action, and he was defeated.

Not just defeated, in fact, but buried. When Lustig returned to California, he searched for Pure, White and Deadly in bookstores and online, to no avail. Eventually, he tracked down a copy after submitting a request to his university library. On reading Yudkin’s introduction, he felt a shock of recognition.

“Holy crap,” Lustig thought. “This guy got there 35 years before me.”

[...]

Look at a graph of postwar obesity rates and it becomes clear that something changed after 1980. In the US, the line rises very gradually until, in the early 1980s, it takes off like an aeroplane. Just 12% of Americans were obese in 1950, 15% in 1980, 35% by 2000. In the UK, the line is flat for decades until the mid-1980s, at which point it also turns towards the sky. Only 6% of Britons were obese in 1980. In the next 20 years that figure more than trebled. Today, two thirds of Britons are either obese or overweight, making this the fattest country in the EU. Type 2 diabetes, closely related to obesity, has risen in tandem in both countries.

At best, we can conclude that the official guidelines did not achieve their objective; at worst, they led to a decades-long health catastrophe.

[...]

We tend to think of heretics as contrarians, individuals with a compulsion to flout conventional wisdom. But sometimes a heretic is simply a mainstream thinker who stays facing the same way while everyone around him turns 180 degrees. When, in 1957, John Yudkin first floated his hypothesis that sugar was a hazard to public health, it was taken seriously, as was its proponent. By the time Yudkin retired, 14 years later, both theory and author had been marginalised and derided. Only now is Yudkin’s work being returned, posthumously, to the scientific mainstream.

Read the whole thing.

Take the First Fall

Thursday, April 21st, 2016

When was the last time your feedback improved someone else’s life?

The problem is that we forget we’re giving feedback to a fellow human being, not an advice-taking robot. Even when we’re well-intentioned, the message gets lost in the transmission. It’s like the old saying “What counts is not what’s said, but what’s heard.” We respond emotionally to criticism, even if it’s just implied criticism. (Are you sure you still fit into that dress?) This makes it difficult to help others improve. In other words, we fail to understand and appreciate human nature.

According to Roger Fisher and Alan Sharp’s Getting It Done: How to Lead When You’re Not in Charge, there are three different kinds of feedback:

  • Appreciation
  • Advice
  • Evaluation

Their advice is not to mix the three:

  1. Express Appreciation to Motivate
  2. Offer Advice to Improve Performance
  3. Evaluate Only When Needed
  4. Take the First Fall

Identity

Wednesday, April 20th, 2016

If you stand for nothing, you’ll fall for anything, this video from the Family Policy Institute of Washington argues:

Occupy Le Corbusier

Wednesday, April 20th, 2016

The natural environment has its champions in American politics, but the built environment, where most of us live and work, does not:

Traditional architecture — derived ultimately from the columns, pediments, arches, and other features of ancient Greece and Rome — evolved by trial and error, teaching best practices to builders and architects generation by generation. The centuries forged a classical language that fostered architecture sensitive to the public’s desire for “congenial facades.” But in the mid-20th century, new ideas took over, and the public has ever since been subjected to endless experimentation and vanity projects.

In most cities and towns, the way new buildings look is not influenced by public taste, which is generally traditional. Instead, it is the purview of municipal and institutional facilities committees, design-review panels, the developers who hire architects who cater to the tastes of officialdom, and the local circle of professionals, academics, and journalists who may be relied upon to cluck at any deviation from the elite fashion in the design of new buildings.

Typed Notes

Tuesday, April 19th, 2016

Students type their lecture notes nowadays, but transcribing a lecture isn’t the best way to learn the material:

Generally, people who take class notes on a laptop do take more notes and can more easily keep up with the pace of a lecture than people scribbling with a pen or pencil, researchers have found. College students typically type lecture notes at a rate of about 33 words a minute. People trying to write it down manage about 22 words a minute.

In the short run, it pays off. Researchers at Washington University in St. Louis in 2012 found that laptop note-takers tested immediately after a class could recall more of a lecture and performed slightly better than their pen-pushing classmates when tested on facts presented in class. They reported their experiments with 80 students in the Journal of Educational Psychology.

Any advantage, though, is temporary. After just 24 hours, the computer note-takers typically forgot material they’d transcribed, several studies said. Nor were their copious notes much help in refreshing their memory because they were so superficial.

In contrast, those who took notes by hand could remember the lecture material longer and had a better grip on concepts presented in class, even a week later. The process of taking them down encoded the information more deeply in memory, experts said. Longhand notes also were better for review because they’re more organized.

[...]

Those who wrote out their notes longhand took down fewer words, but appeared to think more intensely about the material as they wrote, and digested what they heard more thoroughly, the researchers reported in Psychological Science. “All of that effort helps you learn,” said Dr. Oppenheimer.

Laptop users instead took notes by rote, taking down what they heard almost word for word.

Unrestricted Submarine Warfare

Tuesday, April 19th, 2016

The U.S. Navy went into World War II with a three-phase plan for handling the Japanese, War Plan Orange:

  1. Pull US Navy ships back to their home ports, and sacrifice outposts near Japan — the Philippines and Guam.
  2. With superior force, advance toward Japan, seizing Japanese-occupied islands to establish supply routes and overseas bases. The US, with its superior production power, should be able to reclaim the Philippines within two or three years.
  3. Choke Japanese trade and bombard the Japanese home islands without invading them.

Submarines were seen as auxiliaries or picket ships that would scout ahead of the fleet and extend its range of observation, but that role ended up being filled by aircraft.

War Plan Orange wargames rarely dealt with submarines substantially; the focus was on battleships versus carriers.

But the U.S. Navy wasn’t able to follow through on War Plan Orange and instead started by ordering unrestricted submarine warfare against Japan’s sea lines of communication:

American Navy planners had not totally overlooked unrestricted submarine warfare in 1940 and 1941, but had given little thought to exactly HOW these operations would be carried out. The Navy had not thought out the necessary components for such a campaign, because it went against Mahanian principles which stressed decisive surface battles. The post-war assessment from inside the submarine community was telling: “Neither by training nor indoctrination was the U.S. Submarine Force readied for unrestricted warfare.” Campaign pressures and operational realities would force the Navy to adapt its plans and way of fighting.

Clay Blair observed that because of its lack of doctrine and working weapons, the U.S. submarine offensive did not truly begin until 1944. Up until then it “had been a learning period, a time of testing, of weeding out, of fixing defects in weapons, strategy, and tactics, of waiting for sufficient numbers of submarines and workable torpedoes.” More boats, more aggressive commanders, reliable torpedoes, and better radar/sonar all made their contribution. By the end of autumn 1944, the period of learning and adaptation was over. The American sea wolves were numerous, trained, and well-armed.

As the war drew to a close, the role of the submarine as an offensive weapon was evident. The boats had served as the principal source of attrition for the Japanese economy by targeting Japan’s commerce, especially its oil tanker fleet. Once the Americans had taken positions in the Philippines, Guam, Midway, Saipan and Okinawa, U.S. forces had cut off the Empire’s energy supply. Strategically, the war was essentially over. Japanese economic productivity was grinding to a halt. Japanese tankers were delivering only one tenth of the oil needed for 1944-45.

After the war, there was agreement among Navy leaders that submarines had played a major role in countering Japan. Nimitz, after some distance and reflection in retirement, said: “During the dark, early months of World War II, it was only the tiny American submarine force that held off the Japanese Empire and enabled our fleet to replace their losses and repair their wounds.” More objectively, speaking well after the war, Admiral “Bull” Halsey observed, “If I had to give credit to the instruments and machines that won us the war in the Pacific, I would rank them in this order: submarines first, radar second, planes third, and bulldozers fourth.”

The judgment of Navy leaders was validated by post-war government assessments. As noted in the US Strategic Bombing Survey, the impact of the submarine attrition warfare was strategic in effect:

Instead of the 28,500,000 barrels of oil its leaders expected to import from the Southern Zone in 1944, it imported only 4,975,000 barrels. In 1945 its imports were confined to the few thousand barrels brought in during January and February by single tankers that succeeded in running the blockade…. After the battles of early 1945, when Japan lost the Philippines and Okinawa, United States forces sat astride its vital oil life line. Strategically the war was won.

This history makes one wonder what the U.S. Navy might have achieved if it had invested the same intellectual capital in developing fleet submarines and working torpedoes as it had in the carrier. Could the United States have choked off Japan’s trade by the summer of 1943?

Pitbulls account for half of dog fatalities

Monday, April 18th, 2016

Only the worst kind of racist — a dog-racist — would suggest that pitbulls account for half of dog fatalities:

According to a report by Merritt Clifton (via Rosalind Arden), pitbulls accounted for 295 of 593 human fatalities due to dogs between 1982 and 2014, despite making up only 6.7% of dogs. Even so, that still makes them the second most popular breed, behind only labrador mixes. My observation from walking down the sidewalk is that pitbulls are much more prevalent today in Los Angeles than a half century ago, when they were only vaguely heard of.

In contrast, labradors and lab mixes account for 11.5% of dogs but only 4 human deaths.

German shepherds, an aggressive/protective breed, are in-between with 15 fatalities and 3.7% of dogs.

Pitbulls, which aren’t particularly big, aren’t the most dangerous dog per capita. The perro de presa canario, a 100+ pound beast, killed 18 people despite being only 0.02% of dogs for sale or adoption. Both are in the molosser class.

Also, wolfish dogs, such as akitas, huskies, and wolf hybrids, are pretty scary, as are chows, a wolfish-molosser cross.

Rottweilers are about as dangerous per capita as pit bulls. Dobermans, however, which were notorious when I was a child as WWII guard dogs, have gotten less dangerous: my recollection is that Doberman owners have been breeding for safety while rottweiler owners have been breeding their dogs to be scary.
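
For a rough sense of the per-capita comparisons above, here is a minimal Python sketch (using only the figures quoted from the Clifton report) that divides each breed’s fatality count by its share of the dog population. The result is just deaths per percentage point of population share over 1982–2014, a crude index rather than a formal risk estimate.

    # Deaths per percentage point of dog-population share, using the
    # figures quoted above from the Clifton report (1982-2014).
    breeds = {
        # breed: (human fatalities, share of dogs for sale or adoption, in %)
        "pitbull": (295, 6.7),
        "labrador & lab mixes": (4, 11.5),
        "german shepherd": (15, 3.7),
        "perro de presa canario": (18, 0.02),
    }

    for breed, (deaths, share_pct) in breeds.items():
        index = deaths / share_pct  # crude per-capita danger index
        print(f"{breed:24s} {deaths:3d} deaths, {share_pct:5.2f}% of dogs, index {index:6.1f}")

By that crude index, the presa canario comes out roughly twenty times worse than the pitbull, which in turn is an order of magnitude worse than the German shepherd and more than a hundred times worse than the labs.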

The Importance of the Battle of Midway

Monday, April 18th, 2016

The importance of the Battle of Midway goes beyond shifting the balance of power and the initiative from the Imperial Japanese Navy to the U.S. Navy. The victory at Midway aided Allied strategy worldwide:

That last point needs some explaining. To understand it, begin by putting yourself in the shoes of President Franklin Roosevelt and Prime Minister Winston Churchill at the beginning of May 1942. The military outlook across the world appears very bad for the Allies. The German army is smashing a Soviet offensive to regain Kharkov, and soon will begin a drive to grab the Soviet Union’s oil supplies in the Caucasus. A German and Italian force in North Africa is threatening the Suez Canal. The Japanese have seriously crippled the Pacific Fleet, driven Britain’s Royal Navy out of the Indian Ocean, and threaten to link up with the Germans in the Middle East.

If the Japanese and the Germans do link up, they will cut the British and American supply line through Iran to the Soviet Union, and they may pull the British and French colonies in the Middle East into the Axis orbit. If that happens, Britain may lose control of the Eastern Mediterranean and the Soviet Union may negotiate an armistice with Germany. Even worse, the Chinese, cut off from aid from the United States, may also negotiate a cease-fire with the Japanese. For Churchill, there is the added and dreaded prospect that the Japanese may spark a revolt that will take India from Britain. Something has to be done to stop the Japanese and force them to focus their naval and air forces in the Pacific—away from the Indian Ocean and (possibly) the Arabian Sea.

Midway saves the decision by the Americans and British to focus their major effort against Germany, and the American and British military staffs are free to plan their invasion of North Africa. The U.S. Navy and Marines also begin planning for an operation on Guadalcanal against the Japanese. As Rear Admiral Raymond Spruance—one of the Navy’s carrier task force commanders at Midway—put it after the battle, “We had not been defeated by these superior Japanese forces. Midway to us at the time meant that here is where we start from, here is where we really jump off in a hard, bitter war against the Japanese.” Note his words: “… here is where we start from…” Midway, then, was a turning point, but by no means were the leaders of Japan and Germany ready to throw in the towel.

Income and Household Demographics

Sunday, April 17th, 2016

American households in different income quintiles differ in predictable ways:

Mean number of earners per household. On average, there are significantly more income earners per household in the top income quintile (1.98) than in the lowest income quintile (0.41), and the average number of earners increases for each higher income quintile, demonstrating that one of the main factors explaining differences in income among U.S. households is the number of earners per household. Also, the unadjusted ratio of average income for the highest to lowest quintile of 15.9 times ($185,206 to $11,651) falls to a ratio of only 3.3 times when comparing “income per earner” of the two quintiles: $93,538 for the top fifth to $28,417 for the bottom fifth.

Share of households with no earners. Sixty-three percent of U.S. households in the bottom fifth of Americans by income had no earners for the entire year in 2013. In contrast, only 3.1% of the households in the top fifth had no earners in 2013, providing more evidence of the strong relationship between household income and income earners per household.

Marital status of householders. Married-couple households represent a much greater share of the top income quintile (76.8%) than of the bottom income quintile (16%), while single-parent or single households represent a much greater share of the bottom one-fifth of households (84.0%) than of the top 20% (23.2%). As with the average number of earners per household, the share of married-couple households also increases for each higher income quintile, from 16% (lowest quintile) to 35% to 50% (middle quintile) to 64% to 77% (highest quintile).

Age of householders. More than 7 out of every 10 households (71.9%) in the top income quintile included individuals in their prime earning years between the ages of 35-64, compared to fewer than half (43.9%) of household members in the bottom fifth who were in that prime earning age group last year. The share of householders in the prime earning age group of 35-64 year olds increases with each higher income quintile.

Compared to members of the top income quintile of households by income, household members in the bottom income quintile were 1.4 times more likely (21.8% vs. 15.8%) to be in the youngest age group (under 35 years), and almost three times more likely (34.2% vs. 12.3%) to be in the oldest age group (65 years and over).

By average age, the highest income group is the youngest (48.8 years) and the lowest income group is the oldest (54.4 years).

Work status of householders. Almost five times as many top quintile households included at least one adult who was working full-time in 2013 (78.8%) compared to the bottom income quintile (only 16.1%), and more than five times as many households in the bottom quintile included adults who did not work at all (69.4%) compared to top quintile households whose family members did not work (12.4%). The share of householders working full-time increases at each higher income quintile (16.1% to 43.9% to 60.4% to 70.7% to 78.8%).

Education of householders. Family members of households in the top fifth by income were almost five times more likely to have a college degree (64.6%) than members of households in the bottom income quintile (only 13.5%). In contrast, householders in the lowest income quintile were 15 times more likely than those in the top income quintile to have less than a high school degree in 2013 (24.2% vs. 1.6%). As expected, the Census data show that there is a significantly positive relationship between education and income.

Selected Characteristics of US Households by Income Quintile 2013
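
The drop from a 15.9-to-1 gap to a 3.3-to-1 gap is just per-earner arithmetic; a quick Python sketch with the Census figures quoted above reproduces both ratios:

    # Reproducing the two income ratios quoted above from the 2013 Census
    # figures: average household income and average earners per household
    # for the top and bottom quintiles.
    top_income, top_earners = 185_206, 1.98
    bottom_income, bottom_earners = 11_651, 0.41

    household_ratio = top_income / bottom_income           # ~15.9x
    per_earner_top = top_income / top_earners              # ~$93,538
    per_earner_bottom = bottom_income / bottom_earners     # ~$28,417
    per_earner_ratio = per_earner_top / per_earner_bottom  # ~3.3x

    print(f"per household: {household_ratio:.1f}x")
    print(f"per earner: ${per_earner_top:,.0f} vs ${per_earner_bottom:,.0f} ({per_earner_ratio:.1f}x)")

In other words, most of the headline 15.9x gap reflects differences in the number of earners per household rather than in income per earner.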

Revitalizing Wargaming

Sunday, April 17th, 2016

We must revitalize wargaming to prepare for future wars, Deputy Secretary of Defense Bob Work and Gen. Paul Selva argue:

Few historical periods match the dynamic technological disruption of the inter-war years of the 1920s and 1930s. During these decades, militaries the world over struggled to adapt to new inventions such as radar and sonar, as well as rapid improvements in wireless communications, mechanization, aviation, aircraft carriers, submarines, and a host of other militarily relevant technologies. Military planners and theorists intuitively understood that all these new technologies, systems, and advances would drive new ways of fighting, but they were forced to envision what future battlefields would look like with few clues to go by.

To help navigate through this period of disruptive change, the United States military made extensive use of analytical wargaming. Wargames were an inexpensive tool during a period of suppressed defense spending to help planners cope with the high degree of contemporary technological and operational uncertainty. They were used to explore a range of possible warfighting futures, generate innovative ideas, and consider how to integrate new technologies into doctrine, operations, and force structure. For example, faculty and students at the Naval War College integrated wargaming into their entire course of study, analyzing the then-novel concept of carrier task force operations, the role of submarines in scouting and raiding, and how to provide logistics support to fleet operations spread over the vast Pacific Ocean. Wargames in classrooms at Quantico helped the Marine Corps develop new concepts for amphibious warfare and conceive of new techniques for capturing advanced naval bases. Wargamers at the Army War College explored how to employ tanks and artillery on infantry-dominated battlefields and examined the logistical challenges of fighting a war far from American shores.

As valuable as they were, wargames were not in and of themselves sufficient to prompt organizational and operational change. As such, all of the services worked hard to test wargame results in fleet and field exercises. Exercises were used to verify game insights using systems at hand or with surrogates that represented desired advanced capabilities identified during game play. The observations and lessons learned in exercises were in turn fed back into new wargames, thus creating a cycle of creative ideas and innovation that generated requirements for new systems, suggested new operation concepts, and influenced force design.

Once the Second World War began, those warfighting communities that had pursued wargaming and exercises with vigor proved far better prepared for modern combat than those that did not. For example, of the three major warfighting communities in the Navy — naval aviation, surface warfare, and submarine — the naval aviation community carried out the most innovative pre-war experimentation and exercises. Although aircraft carriers were originally envisioned as operating in support of the fleet battle line, carrier aviators explored a wide range of futures, including independent carrier operations. As a result, the U.S. carrier force was ready on day one of the Pacific war, and within six months had inflicted a major, lasting defeat on the superior Japanese carrier force at the Battle of Midway. By contrast, pre-war wargames and exercises in the submarine community had emphasized rote doctrine using the submarine fleet as a scouting force for the main battle line, and policy strictures dampened any exploration of independent submarine operations. Unsurprisingly, then, the submarine community proved unprepared for the tactics, techniques, and procedures needed to execute unrestricted warfare on Japanese merchant shipping. Similarly, surface warfare wargames failed to anticipate long-range torpedoes or account for the Japanese emphasis on night surface action. As a consequence, American surface forces suffered badly in early clashes against a highly trained Japanese cruiser and destroyer force that excelled at night fighting and was armed with the deadly, long-range Long Lance torpedo.

Today, we are living in a time of rapid technological change and constrained defense spending, not unlike that of the inter-war years.

The story of how the Navy learned to learn to fight is fascinating, by the way.

Kermit and Cookie Monster and the Mystery Box

Saturday, April 16th, 2016

Kermit and Cookie Monster and the Mystery Box goes back to Sesame Street episode 3546, from the 20th season (1988–1989):

The 21st was Henson’s last.