Why not make a teen Rambo and turn the project over to John Milius?

Tuesday, July 13th, 2021

I recently rewatched Red Dawn for the first time in decades, and it wasn’t nearly as cheesy as I expected. The Wikipedia entry explains how it got made:

Originally called Ten Soldiers, it was written by Kevin Reynolds. It was set in the near future as a combined force of Russians and Cubans launched an invasion of the Southwestern U.S. Ten children take to the hills when their small town is captured, turning into a skilled and lethal guerrilla band.

Producer Barry Beckerman read the script, and, in the words of Peter Bart, “thought it had the potential to become a tough, taut, ‘art’ picture made on a modest budget that could possibly break out to find a wider audience.” He got his father Sidney Beckerman to help him pay a $5,000 option. Reynolds wanted to direct but the Beckermans wanted someone more established. Walter Hill briefly considered the script before turning it down, as did several other directors.

The Beckermans pitched the project to David Begelman when he was at MGM and were turned down. They tried again at that studio when it was being run by Frank Yablans. Senior vice-president for production Peter Bart, who remembers it as a “sharply written anti-war movie…a sort of Lord of the Flies”, took the project to Yablans.

The script’s chances of being filmed improved when Steven Spielberg began mentoring Kevin Reynolds and helped him make Fandango. MGM bought the script.

Bart recalls that things changed when “the chieftains at MGM got a better idea. Instead of making a poignant little antiwar movie, why not make a teen Rambo and turn the project over to John Milius, a genial and rotund filmmaker who loved war movies and also loved war? The idea was especially popular with a member of the MGM board of directors, General Alexander Haig, the former Nixon chief of staff, who yearned to supervise the film personally and develop a movie career.”

Bart says most of MGM’s executives, except for Yablans, were opposed to Milius directing. Bart claims he made a last-minute attempt to get Reynolds to direct the film and went to see Spielberg. However, by this stage Fandango was in rough cut, and Bart sensed that Spielberg was disappointed in the film and would not speak up for Reynolds.

Milius was signed to direct at a fee of $1.25 million, plus a gun of his choice.

Milius set about rewriting the script. He and Haig devised a backstory establishing the circumstances in which the invasion would take place; this was reportedly based on Hitler’s proposed plans to invade the U.S. during World War II. Haig took Milius under his wing, bringing him to the Hudson Institute, the conservative think tank founded by Herman Kahn, to develop a plausible scenario. Milius saw the story as a Third World liberation struggle in reverse; Haig introduced Nicaragua and suggested that, with the collapse of NATO, a left-wing Mexican regime would participate in the Soviet invasion, effectively splitting the U.S. in half. Bart says, “Even Milius was taken aback by Haig’s approach to the project. ‘This is going to end up as a jingoistic, flag-waving movie,’ Milius fretted. As a result, the budget of this once $6 million movie almost tripled.”

Other changes included a shift in focus from conflict within the group to conflict between the teens and their oppressors, and raising the ages of some of the characters from their early teens to high-school age and beyond. There was also the addition of a sequence in which some children visit a camp and find their parents have been brainwashed.

Milius later said, “I see this as an anti-war movie in the sense that if both sides could see this, maybe it wouldn’t have to happen. I think it would be good for Americans to see what a war would be like. The film isn’t even that violent — the war shows none of the horrors that could happen in World War III. In fact, everything that happened in the movie happened in World War II.”

Bart says Yablans pushed filming through faster than Milius wanted because MGM needed a movie over the summer. Milius wanted more time to plan, including time to devise futuristic weaponry, and he wanted to avoid shooting over the winter, but he had to accede.

The Pentagon withdrew its cooperation from the film.

Kenyan runners outperformed all other nations by 1,700-fold

Monday, July 12th, 2021

David Epstein explores (in The Sports Gene) Kenya’s dominance in long-distance running:

In the 1964 Olympics, just the third ever in which Kenya competed, a Kipsigis runner named Wilson Kiprugut won bronze in the 800-meters. Four years later, at the altitude of Mexico City, Kenya was the dominant distance running power, winning seven medals in middle- and long-distance events.

[..]

“The conventional wisdom was that blacks could sprint, but that anything that required tactical sophistication, or discipline, or training,” he says, “this was the white man’s province.”

[...]

The 4.9 million Kalenjin people represent about 12 percent of Kenya’s population, but more than three quarters of the country’s top runners.

[...]

Manners wrote that a part of traditional life for Kalenjin warriors was the practice of cattle raiding. Essentially, it entailed stealthily running and walking into the land of neighboring tribes, rounding up cattle, and escorting them back to Kalenjin land as quickly as possible. Cattle raiding was not considered theft so long as the raiders weren’t filching the cattle from the same subtribe within the Kalenjin. “The raids were conducted largely at night,” Manners wrote, “and sometimes ranged over distances as great as 100 miles! Most raiding parties were group ventures but each muren [or warrior] was expected to at least do his share.”

[...]

A muren who brought back a large number of cattle from a raid was hailed as a courageous and athletic warrior and could use his cattle and prestige to acquire wives.

[...]

Korir was thrust into the 3,000-meter steeplechase — a race just shy of two miles that includes hurdles — and in his third-ever attempt at the event won the national junior college championship. Four years later, Korir was the third-ranked steeplechase runner in the world.

[...]

Or the one about Julius Randich, who arrived at Lubbock Christian University in Texas a heavy smoker with no competitive running background. By the end of his first year, 1991–92, Randich was the national small-colleges (NAIA) champion in the 10K. The following year, Randich set NAIA records in the 5K and 10K and was named the outstanding athlete in any sport in the NAIA.

Kalenjin runners became all the rage among NAIA coaches, and several others would win the 10K national championships after Randich, including his younger brother Aron Rono, who won it four straight times.

Rotich, the son of a prosperous Kalenjin farmer, arrived at South Plains Junior College in Texas in 1988, having lived a “comfortably sedentary” life, as Manners describes it. Rotich, a stout 5’8″ and 190 pounds, quickly burned through most of the $10,000 his father had given him for two years of living expenses and tuition. “But rather than return home in disgrace,” Manners wrote, “Paul . . . decided to train in hopes of earning a track scholarship.” Rotich trained at night to avoid the embarrassment of being seen. That concern would be short-lived, as he made the national junior college cross-country championships in his first season. He went on to become a ten-time All-American in cross-country and indoor and outdoor track. As Manners reported, when Rotich returned to Kenya and detailed his running exploits to a cousin, the cousin replied: “So, it is true. If you can run, any Kalenjin can run.”

[...]

Consider this: seventeen American men in history have run a marathon faster than 2:10 (or a 4:58 per mile pace); thirty-two Kalenjin men did it just in October 2011.

[...]

For example: five American high-schoolers have run under four minutes in the mile in history; St. Patrick’s High School, in the Kalenjin training town of Iten, once had four sub-four milers in school at the same time.

[...]

Wilson Kipketer, a former St. Patrick’s student who became a Danish citizen and held the 800-meter world record from 1997 to 2010, does not hold his own high school’s record.

[...]

The kids in his time trials generally come from elite, highly selective, government-funded boarding schools, and essentially none of them have any racing experience.

[...]

Each year, about half of the boys in the time trial will run faster than 5 minutes and 20 seconds in the 1,500-meter time trial, on a shoddy dirt track, above seven thousand feet. (The 1,500 is about 100 meters shy of a mile, and 5:20 translates to a mile time just over 5:40.)
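If you want to check that conversion yourself, the arithmetic is simple. Here is a quick Python sketch, assuming only that the runner holds the same pace over the extra hundred-odd meters of a mile:

```python
# Convert a 5:20 1,500-meter time into an approximate mile equivalent,
# assuming the runner could hold the same pace over the extra ~109 meters.
METERS_PER_MILE = 1609.344

time_1500_s = 5 * 60 + 20               # 5:20 is 320 seconds
pace_s_per_meter = time_1500_s / 1500
mile_time_s = pace_s_per_meter * METERS_PER_MILE

minutes, seconds = divmod(mile_time_s, 60)
print(f"Mile equivalent: {int(minutes)}:{seconds:04.1f}")  # roughly 5:43
```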

[...]

In the tryout in 2005, a boy named Peter Kosgei ran 4:15 with no real training. Kosgei was accepted to Hamilton College in Clinton, New York, and quickly became the best athlete in the college’s history. In his freshman year, Kosgei won the Division III 3,000-meter steeplechase national title. By the end of his junior year, he had compiled eight more national titles in cross-country and track.

[...]

Evans Kosgei — no relation to Peter — held down a 3.8 GPA in computer science and engineering at Lehigh University and, after adjusting to life in America for a year, decided to go out for cross-country in his sophomore year. He struggled even to finish his five-mile tryout. But, in short order, Kosgei was running at the Division I national championships in both cross-country and track. In 2012, he was named Lehigh’s Graduating Scholar-Athlete of the Year.

[...]

As expected from their latitudes of ancestry, though, the Kalenjin and Danish boys did display body type differences. A greater portion of the body length of the Kalenjin boys was composed of legs. The Kalenjin boys were, on average, two inches shorter than the Danish boys, but had legs that were about three quarters of an inch longer.

The scientists’ most striking finding, though, was not the length of the legs, but their girth. The volume and average thickness of the lower legs of the Kalenjin boys were 15 to 17 percent less than in the Danish boys. The finding is substantial because the leg is akin to a pendulum, and the greater the weight at the end of the pendulum, the more energy is required to swing it.
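The pendulum point is easy to make concrete. Here is a crude point-mass sketch, not the Danish team's actual calculation; the masses, lengths, and swing times are illustrative guesses, but the proportional logic is the same:

```python
# Toy model: treat the lower leg as a point mass swinging below the knee and
# estimate the kinetic energy of one leg swing, 0.5 * I * w^2 with I = m * r^2.
# All numbers are illustrative guesses, not measurements from the study.
def swing_energy_joules(lower_leg_mass_kg, radius_m=0.35,
                        swing_angle_rad=1.0, swing_time_s=0.35):
    moment_of_inertia = lower_leg_mass_kg * radius_m ** 2
    avg_angular_velocity = swing_angle_rad / swing_time_s
    return 0.5 * moment_of_inertia * avg_angular_velocity ** 2

baseline = swing_energy_joules(4.6)         # hypothetical heavier lower leg
lighter = swing_energy_joules(4.6 - 0.45)   # roughly a pound less at the shank

savings = (1 - lighter / baseline) * 100
print(f"Energy per swing drops by about {savings:.0f}%")
# Because I = m * r^2, mass removed at the end of the limb reduces the cost
# of every single stride in direct proportion.
```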

[...]

Compared with the Danish runners, the Kalenjin runners tested by the Danish scientists had nearly a pound less weight in their lower legs. The scientists calculated the energy savings at 8 percent per kilometer.

[...]

Some anthropologists actually refer to the extreme of a slender body build as the Nilotic type — “Nilotic” refers to a set of related ethnic groups residing in the Nile Valley — and, it so happens, the Kalenjin are a Nilotic people.

The Nilotic body type evolved in low latitude environments that are both hot and dry, because the long, thin proportions are better for cooling.

(Conversely, the extreme of the short, stocky build was historically known as the Eskimo type, though the term “Eskimo” has been replaced in some countries, where it is considered derogatory.)

[...]

Anthropologist Vincent Sarich used world cross-country championship results to calculate that Kenyan runners outperformed all other nations by 1,700-fold. Sarich made a statistical projection that about 80 out of every 1 million Kenyan men have world-class running talent, compared with about 1 out of every 20 million men in the rest of the world.

A 1992 Runner’s World article noted that, based purely on population percentages, the statistical chance of Kenyan men winning the medals they did at the 1988 Olympics was 1 in 1,600,000,000.

Will China invade Taiwan?

Sunday, July 11th, 2021

Will China invade Taiwan?

Taiwan itself is very badly defended and the defence it does have is ill-suited to the task (its military has “pursued a suicidal procurement strategy of expensive boutique US kit that will be no use in the crisis, like fighter jets that will be killed on the ground by the opening Chinese missile barrage”). The US military is aimed at fighting the War on Terror, not defending overseas territories against invasion. US public opinion might not support shedding blood over defending Taiwan.

On the other hand: a war would be a huge risk for China; “every Chinese leader has an incentive to leave such a risky endeavour to his successor,” another forecaster writes. In the short term, the balance of power is still with the Americans, and China can afford to be patient and wait until the middle of the century. The forecasters use facts like these to adjust their initial base-rate estimate.

The six forecasters estimate the likelihood of a significant China-Taiwan conflict at between 8% and 23% in the next five years, with a median estimate of 14%. That doesn’t sound all that bad, but it’s worth adding something.

If we’ve learnt anything from Covid, it should be that preparing for the most likely outcome is not enough. The odds of a global pandemic in any given year are probably only about 1%. But if one happens, it turns out, it’s really bad, and it would have been worth investing a significant amount of resources to avoid or mitigate it. One of the superforecasters told me that “a 14% chance of a proper conflict by 2026 is quite a big deal. If someone says there’s a 10% chance of a really bad outcome, the expected value [the impact multiplied by the probability] is still really bad.” So you might not think a particular bad outcome is very likely, but if it’s bad enough, then you ought to prepare for it anyway.
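That expected-value logic is just multiplication, but a sketch makes it concrete. The probability below is the forecasters' figure; the dollar impact is a placeholder I made up purely for illustration:

```python
# Expected value = probability x impact. The 14% is the forecasters' median
# estimate quoted above; the dollar cost is a made-up placeholder, included
# only to show why a low-probability, high-impact branch dominates.
scenarios = {
    "no major conflict":     (0.86, 0),
    "China-Taiwan conflict": (0.14, 2_000),   # hypothetical cost in $ billions
}

expected_cost_bn = sum(p * cost for p, cost in scenarios.values())
print(f"Expected cost: ${expected_cost_bn:,.0f} billion")   # 0.14 * 2,000 = 280
# The "likely" branch contributes nothing, yet the expectation is still large,
# which is the case for preparing even for outcomes you think are improbable.
```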

[...]

The median estimate for how likely the US is to come to Taiwan’s aid if there were an invasion is 83%. So we are talking about a very high probability that a Chinese attack on Taiwan would lead to armed conflict between the world’s two superpowers. They also think it’s about 75% likely that the US would try to sink Chinese invasion ships, and say it’s reasonably likely that China would preemptively attack the US forces in the region if they did attack.

What might the knock-on effects be, if the world’s largest economies end up in a shooting war? Well: the US imports about $470 billion worth of goods from China a year. The superforecasters’ median estimate is that that would drop by 20%, or, roughly speaking, $100 billion.

[...]

And what’s more, it’s very far from obvious that the US would win. If a war were to break out over Taiwan before 2026, the median estimate is that there’s a 57% chance of Chinese victory; if the war were to break out between 2031 and 2035, when China has had another decade to build up its military relative to the US, the estimate is 66%.

Could the Germans have taken Moscow?

Saturday, July 10th, 2021

When more than 3 million German and German-allied troops surged across the border on June 22, 1941, many expected Operation Barbarossa to be a walkover:

“Bolshevism will collapse as a house of cards,” predicted Nazi Propaganda Minister Joseph Goebbels. An embarrassingly large number of U.S. and British experts agreed, mindful of then-Soviet leader Joseph Stalin’s officer corps purges and the Red Army’s incompetent performance against tiny Finland in the 1939 to 1940 Winter War. An unending series of Soviet military disasters in the summer and fall of 1941 — including the killing or capture of 650,000 Soviet soldiers at Kyiv — only reinforced that opinion.

But the red banner flying over the Reichstag in May 1945 proved the experts wrong. And a new computer wargame helps explain why.

[…]

The game is a number cruncher’s dream, tracking everything from the number of operable tanks and trucks, to the combat and administrative competence of individual generals, to whether sufficient raw materials are reaching arms factories.

[…]

Battlefield success in the game depends on factors like morale, combat experience, troop fatigue, and the skill of their commanders. Because the Germans have better troops and commanders in 1941, they can chew up the Soviet armies, forcing the Soviets to hastily commit unprepared reserves, which in turn get destroyed in a vicious cycle.

[…]

Compared to the lavishly equipped U.S. Army of World War II, the German and Soviet armies faced a logistical nightmare. Although the United States and Britain had an abundance of Detroit-made trucks to haul supplies, the Germans and Soviets were always short of vehicles, and the ones they had were quickly devoured by Russia’s primitive roads. While armored units were fully motorized, Germany and Russia’s poor infantry relied on horses to haul artillery and supplies. For them, World War II was more like World War I (what historian Omer Bartov has called the “de-modernization” of the German army in the East) and only a short step away from French leader Napoléon Bonaparte’s ill-fated invasion of Russia in 1812.

Hence, both sides on the Eastern Front relied on railroads to move troops and supplies. Armies tended to move along routes where there were railroads to supply them, but even then, logistics were difficult. Compared to Western Europe and North America, rail lines in Russia were sparse and built to a wider gauge than European track, which meant the Germans had to re-lay them as well as repair Russian scorched-earth damage to rail yards.

[…]

The game features a detailed logistical model that tracks supplies by the ton. (Yes, the ton, although the computer does most of the bean counting.) Fuel, ammunition, and food are transported along rail lines to depots, where they are distributed by truck and horse-drawn wagon (plus a limited capacity for aerial resupply). But railroads have a limited capacity; the rail lines actually change color on the map as their capacity is quickly overloaded. That leaves trucks, but there aren’t enough of them. And the more trucks that travel through Russia’s forests and swamps, the more trucks that break down. (Yes, the game tracks broken-down and repaired vehicles.)
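To get a feel for why that matters, here is a toy sketch of that kind of bookkeeping. It is emphatically not the game's actual model; every number is invented, and the point is only to show how a rail-capacity ceiling plus steady truck attrition starves a spearhead:

```python
# Toy supply model: a rail net delivers a capped tonnage to a forward depot each
# turn, trucks haul it the last leg, and a share of the trucks breaks down every
# turn. Every number is invented for illustration; none comes from the game.
RAIL_CAPACITY_TONS = 900    # tons the rail line can push to the depot per turn
TONS_PER_TRUCK = 3
BREAKDOWN_RATE = 0.06       # fraction of running trucks lost to wear each turn

trucks = 400
demand_tons = 1200          # what the panzer spearhead would like to receive

for turn in range(1, 9):
    truck_lift = trucks * TONS_PER_TRUCK
    delivered = min(RAIL_CAPACITY_TONS, truck_lift, demand_tons)
    shortfall = demand_tons - delivered
    trucks = int(trucks * (1 - BREAKDOWN_RATE))
    print(f"Turn {turn}: delivered {delivered:4d} t, "
          f"shortfall {shortfall:4d} t, trucks left {trucks}")
```

After a handful of turns the truck pool, not the rail line, becomes the binding constraint, and the shortfall grows every turn.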

This is devastating for all mechanized units, for which gasoline is life, but especially so for the Germans in 1941, who relied on their fast-moving panzers to encircle and pin the Russian armies until the foot-slow infantry moved in for the kill. Without gas, the tanks can’t perform their bold maneuvers.

This isn’t a problem at the start of the game as the Germans begin their offensive from well-stocked bases in East Prussia, Poland, and Romania.

[…]

The biggest question: Could the Germans have taken Moscow if they concentrated all of their forces on a single knife-like thrust to the Soviet capital? War in the East 2 suggests this strategy would have been a disaster: There simply wasn’t the rail and truck capacity to mass forces for a Moscow-only offensive.

The game is Gary Grigsby’s War in the East 2, from Matrix Games.

They both fly low and move fast

Friday, July 9th, 2021

Sea-skimming anti-ship missiles — such as the Exocet of Falklands War fame — have worried navies since the 1970s:

What’s changed is the speed of anti-ship missiles. Older weapons such as the Soviet Styx and America’s Harpoon were subsonic, which meant they were slow enough to be jammed or shot down by shipboard anti-missile systems such as the U.S. Navy’s Phalanx multi-barreled cannon. Newer weapons, such as Russia’s P-270 Moskit and Kh-31, could achieve supersonic speeds of Mach 3 or 4 that taxed anti-missile defenses.

But a new generation of Russian and Chinese hypersonic anti-ship missiles — like Russia’s Zircon, with an estimated speed of Mach 6 to 9 — is a different matter. They both fly low and move fast.

[…]

“As opposed to ballistic missile trajectories where Navy guided missile destroyers and cruisers have on the order of several minutes to detect, track, lock onto, and then launch interceptors against a hypersonic reentry vehicle, low flying missiles provide as little as 10 seconds of flight time above the ship’s radar horizon before missile impact,” the Navy explains.
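The arithmetic behind that ten-second figure is easy to sketch. Using the standard 4/3-earth refraction approximation for the radar horizon, and with mast and missile heights that are my assumptions rather than the Navy's, you get numbers in the same ballpark:

```python
import math

# Rough warning time for a sea-skimming hypersonic missile. Radar horizon under
# the standard 4/3-earth refraction rule: d_km ~ 4.12 * (sqrt(h1) + sqrt(h2)),
# with antenna and missile heights in meters. The heights and Mach number here
# are my assumptions for illustration, not official figures.
def warning_time_seconds(radar_height_m, missile_height_m, mach,
                         speed_of_sound_ms=340.0):
    horizon_km = 4.12 * (math.sqrt(radar_height_m) + math.sqrt(missile_height_m))
    missile_speed_ms = mach * speed_of_sound_ms
    return horizon_km * 1000 / missile_speed_ms

# Mast-mounted radar about 20 m up, missile skimming at about 10 m, Mach 8:
print(f"{warning_time_seconds(20, 10, 8):.0f} seconds of warning")  # ~12 seconds
```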

[…]

Drones are a prime candidate for hosting an airborne missile detection radar. “The most obvious candidate aircraft to host the radar system would be on high altitude long-endurance (HALE) and medium-altitude long-endurance (MALE) unmanned aircraft,” according to the Navy.

But even with better radar detection, the physics of hypersonic weapons will still vex the defenders. The high speeds of hypersonic missiles flying through the atmosphere generate plasma clouds that absorb radar waves. “Even when a threat vector is identified so as to constrain the radar surveillance volume, the detection and tracking timeline for single or multiple inbound missiles whose radar return may be buried within a plasma envelope is extremely challenging,” the Navy notes.

Sickle-cell trait and low hemoglobin are evolutionary adaptations to malaria

Thursday, July 8th, 2021

Allen’s rule of body proportions dictates that people from low latitudes and warm climates have long limbs, and Bergmann’s rule dictates that they have narrower builds with slimmer pelvic bones, David Epstein explains (in The Sports Gene), but there’s another, less anatomical reason for western African sprinting dominance:

In 2006, Morrison, with Patrick Cooper, proposed in the West Indian Medical Journal that rampant malaria along the west coast of Africa, from where slaves were taken, led to specific genetic and metabolic alterations beneficial for sprint and power sports. The hypothesis: that malaria in western Africa forced the proliferation of genes that protect against it, and that those genes, which reduce an individual’s ability to make energy aerobically, led to a shift to more fast-twitch muscle fibers, which are less dependent upon oxygen for energy production. Morrison helped with the biology details, but the fundamental idea originally came from Cooper, a writer and childhood friend of Morrison’s.

Cooper was a polymath who had professional success in jobs ranging from music recording to writing speeches for Norman Manley, an architect of Jamaica’s independence, and then for his son, Prime Minister Michael Manley. Early in his career, Cooper had been a reporter for The Gleaner, Jamaica’s largest newspaper. Working at The Gleaner’s sports desk, he first surmised that white athletes had historically dominated sprint and power sports only by systematically excluding or dodging black athletes, like boxing champion Jack Johnson. In later writing, Cooper meticulously documented the fact that athletes with western African heritage become highly overrepresented in sprint and power sports almost immediately once they are allowed a fraction of their white counterparts’ access to sports.

At every Olympics after the U.S. boycott of 1980, every single finalist in the men’s Olympic 100-meters, despite homelands that span from Canada to the Netherlands, Portugal, and Nigeria, has his recent ancestry in sub-Saharan West Africa.

(The same has been true for women at the last two Olympics, and all but one female winner since the U.S.-boycotted 1980 Games has been of recent western African descent.)

And there has not been a white NFL player at cornerback, football’s speediest position, in more than a decade.

[...]

Cooper found the famous body types study of 1968 Olympians, and he latched on to a curious side note recorded by the scientists. The researchers had been surprised to find that “a sizeable number of Negroid Olympic athletes manifested the sickle-cell trait.”

[...]

In 1975, the year after the Mexico City Olympics data was published, another study appeared that Cooper would dissect two decades later, this one showing naturally low hemoglobin levels in African Americans.

[...]

Using data from nearly 30,000 people in ten different states, with ages ranging from the first year to the ninth decade, it reported that African Americans have lower hemoglobin levels at every stage of life than white Americans, even when socioeconomic status and diet are matched.

[...]

Like sickle-cell trait, genetically low hemoglobin — all else being equal — is a genetic disadvantage for endurance sports. Runners of recent western African descent are very much underrepresented at high levels of distance running. (The Jamaican record in the 10K would not even have qualified for the 2012 Olympics.)

[...]

And then Cooper found just the potential “compensatory mechanism” he was looking for, in a 1986 study from Laval University in Quebec published in the Journal of Applied Physiology and coauthored by Claude Bouchard, who would go on to become the most influential figure in the field of exercise genetics, and the leader of the HERITAGE Family Study that documented aerobic trainability differences among families.

Bouchard and colleagues took muscle samples from the thighs of two dozen sedentary Laval students, primarily from countries in western Africa, as well as from two dozen sedentary white students, who were identical to the African students in age, height, and weight. The researchers reported that a higher proportion of muscle in the African students was composed of fast-twitch muscle fibers, and a lower proportion was slow-twitch muscle fibers compared with the white students. The African students also had significantly higher activity in the metabolic pathways that rely less on oxygen to create energy and that are engaged during an all-out sprint.

[...]

In his 2003 book, Black Superman: A Cultural and Biological History of the People Who Became the World’s Greatest Athletes, and then in his 2006 paper with Morrison, Cooper first made the argument that West Africans evolved characteristics like a high prevalence of the sickle-cell gene mutation and other gene mutations that cause low hemoglobin for protection from malaria, and that an increase in fast-twitch muscle fibers followed from that, providing more energy production from a pathway that does not rely primarily on oxygen, for people who have reduced capacity to produce energy with oxygen.

The former part of Cooper’s hypothesis — that sickle-cell trait and low hemoglobin are evolutionary adaptations to malaria — now seems undeniable.

In 1954, the same year Sir Roger Bannister broke the four-minute mile, British physician and biochemist Anthony C. Allison, who had been raised on a farm in Kenya, showed that sub-Saharan Africans with sickle-cell trait have far fewer malaria parasites in their blood than inhabitants of the same region who do not have sickle-cell trait.

[...]

Cooper and Morrison’s suggestion that low hemoglobin in African Americans and Afro-Caribbeans is a second adaptation to malaria has been proven true as well, in a deadly manner.

Even as evidence mounted that low hemoglobin levels in Africans native to malarial zones is at least partly genetic, aid workers in Africa looked upon low hemoglobin as a sign purely of a diet with too little iron. In 2001, the United Nations General Assembly charged the world with reducing iron deficiency among children in developing nations. And so, in a well-intended effort to improve nutrition, health-care providers descended on Africa with iron supplements, which raise the hemoglobin levels of those who consume them.

[...]

The problem was that doctors who studied malarial regions saw increased cases of severe malaria wherever iron supplements were dispensed. Since the 1980s, scientists working in Africa and Asia had documented lower rates of malaria death in people with low hemoglobin levels. In 2006, following a large, randomized, placebo-controlled study in Zanzibar that reported a stark increase in malaria illness and death among children given iron supplements, the World Health Organization issued a statement backtracking from the earlier UN position and cautioning health workers about giving iron supplements in areas with high malaria risk. Low hemoglobin, like sickle-cell trait, is apparently protective against malaria.

[...]

About 12 percent of Ivorian citizens are sickle-cell carriers, and in the early 1980s Le Gallais noticed that the top three female Ivorian high jumpers (one of whom won the African championship) became abnormally exhausted during workouts. Le Gallais tested the athletes and found — “surprisingly,” he wrote in an e-mail — “these three athletes were sickle cell trait carriers, despite originating from different ethnic groups in the country.”

[...]

In 1998, he reported that nearly 30 percent of 122 Ivorian national champions in explosive jumping and throwing events were sickle-cell trait carriers, and that they collectively accounted for thirty-seven national records. The top male and female in the group were both sickle-cell carriers.

Progressive activists have found a cause even more unpopular than “Defund the police”

Wednesday, July 7th, 2021

Progressive activists have found a cause even more unpopular than “Defund the police,” David Frum notes, and are pushing it with even greater vigor:

Eighty-three percent of American adults believe that testing is appropriate to determine whether students may enroll in special or honors programs, according to one of the country’s longest-running continuous polls of attitudes toward education.

Yet across the U.S., blue-state educational authorities have turned hostile to academic testing in almost all of its forms. In recent months, honors programs have been eliminated in Montgomery County, Maryland, and Seattle. On Long Island, New York, and in Pennsylvania and Virginia, curricula are being rethought to eliminate tracking that separates more- and less-adept student populations. New York City’s specialist public high schools are under fierce pressure to revise or eliminate academic standards for admission. Boston’s exam schools will apply different admissions standards in different zip codes. San Francisco’s famous Lowell High School has switched from academically selective admission to a lottery system. At least a thousand colleges and universities have halted use of the SAT, either permanently or as an experiment. But the experiments are rapidly hardening into permanent changes, notably at the University of California, but also in Washington State and Colorado. SAT subject tests have been junked altogether.

Special programs don’t poll as well when the questions stipulate that many Black and Hispanic students would not qualify for admittance. But the programs’ numbers rebound if respondents are assured that students will have equal access to test prep. The New York Post reported earlier this year on an education-reform organization’s findings that almost 80 percent of New Yorkers would want to preserve selective testing at the city’s elite high schools if it were combined with free access to test-preparation coaching for disadvantaged groups.

[…]

The supervisors who led the effort to end academically selective admissions at Lowell now face not only a recall campaign, but also a lawsuit from groups including the Asian American Legal Foundation. Accusations of bigotry have flowed both ways. In March, supporters of the old admissions system surfaced tweets by one of the school’s pro-lottery supervisors that accused Asian Americans of anti-Blackness. Black students at Lowell complain of racist incidents; an Asian American Lowell alum told of being bullied at another, less selective high school.

Some information sticks around when it shouldn’t, while other information vanishes when it should remain

Tuesday, July 6th, 2021

The Internet is rotting, Jonathan Zittrain notes:

The first study, with Kendra Albert and Larry Lessig, focused on documents meant to endure indefinitely: links within scholarly papers, as found in the Harvard Law Review, and judicial opinions of the Supreme Court. We found that 50 percent of the links embedded in Court opinions since 1996, when the first hyperlink was used, no longer worked. And 75 percent of the links in the Harvard Law Review no longer worked.

People tend to overlook the decay of the modern web, when in fact these numbers are extraordinary — they represent a comprehensive breakdown in the chain of custody for facts. Libraries exist, and they still have books in them, but they aren’t stewarding a huge percentage of the information that people are linking to, including within formal, legal documents. No one is. The flexibility of the web — the very feature that makes it work, that had it eclipse CompuServe and other centrally organized networks — diffuses responsibility for this core societal function.

The problem isn’t just for academic articles and judicial opinions. With John Bowers and Clare Stanton, and the kind cooperation of The New York Times, I was able to analyze approximately 2 million externally facing links found in articles at nytimes.com since its inception in 1996. We found that 25 percent of deep links have rotted. (Deep links are links to specific content — think theatlantic.com/article, as opposed to just theatlantic.com.) The older the article, the less likely it is that the links work. If you go back to 1998, 72 percent of the links are dead. Overall, more than half of all articles in The New York Times that contain deep links have at least one rotted link.
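Checking for link rot is simple enough to sketch. This isn't the methodology of the studies Zittrain describes, just a minimal illustration, using the third-party requests library and placeholder URLs, of telling deep links from homepage links and flagging the ones that no longer resolve:

```python
from urllib.parse import urlparse
import requests  # third-party library: pip install requests

def is_deep_link(url: str) -> bool:
    """A deep link points at specific content rather than a bare homepage."""
    return urlparse(url).path not in ("", "/")

def is_rotted(url: str, timeout: float = 10.0) -> bool:
    """Crude check: treat network errors and 4xx/5xx responses as rot."""
    try:
        response = requests.head(url, allow_redirects=True, timeout=timeout)
        if response.status_code == 405:  # some servers reject HEAD requests
            response = requests.get(url, allow_redirects=True, timeout=timeout)
        return response.status_code >= 400
    except requests.RequestException:
        return True

# Placeholder URLs purely for illustration:
for link in ["https://www.theatlantic.com/some-old-article", "https://www.theatlantic.com/"]:
    kind = "deep" if is_deep_link(link) else "homepage"
    state = "rotted" if is_rotted(link) else "alive"
    print(f"{link}: {kind} link, {state}")
```

A check like this only catches links that fail outright; a URL that still resolves but now shows different content would pass.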

[…]

Of course, there’s a keenly related problem of permanency for much of what’s online. People communicate in ways that feel ephemeral and let their guard down commensurately, only to find that a Facebook comment can stick around forever. The upshot is the worst of both worlds: Some information sticks around when it shouldn’t, while other information vanishes when it should remain.

The team is defeated by bureaucracy, indecision, complacency and malaise

Monday, July 5th, 2021

As you might expect from Michael Lewis, his Premonition is terribly well done, Alex Tabarrok says, if formulaic and over-the-top:

But Lewis has a bigger problem than over-the-top writing.

The heroes were defeated. Lewis likes to tell stories of brilliant mavericks like Billy Beane and Michael Burry who go against the grain but eventually, against all odds, emerge victorious. But six hundred thousand people are dead in the United States and whatever victory we have won was ugly and slow. Indeed, Lewis assembles his mighty team but then The Premonition trails off as the team is defeated by bureaucracy, indecision, complacency and malaise before they even have a chance to enter the real battle against the virus.

[…]

If there is one central villain in The Premonition, it’s the CDC. Lewis acknowledges that his perspective has changed. In The Fifth Risk, the system (the “deep state” used non-pejoratively if you will) is full of wisdom and power but it’s under threat from Trump. In The Premonition, Trump is an after-thought, at best a trigger or aggravating factor.

[…]

Lewis’s most sustained analysis comes in a few pages near the end of The Premonition where he argues that the CDC became politicized after it lost credibility due to the 1976 Swine Flu episode. In 1976 a novel influenza strain looked like it might be a repeat of 1918. Encouraged by CDC head David Sencer, President Ford launched a mass vaccination campaign that vaccinated 45 million people. The swine flu, however, petered out and the campaign was widely considered a “debacle” and a “fiasco” that illustrated the danger of ceding control to unelected experts instead of the democratic process. The CDC lost authority and under Reagan the director became a political appointee rather than a career civil servant. Thus, rather than being unprecedented, Trump’s politicization of the CDC had deep roots.

Today the 1976 vaccination campaign looks like a competent response to a real risk that failed to materialize, rather than a failure. So what lessons should we take from this? Lewis doesn’t say but my colleague Garett Jones argues for more independent agencies in his excellent book 10% Less Democracy. The problem with the CDC was that after 1976 it was too responsive to political pressures, i.e. too democratic. What are the alternatives?

The Federal Reserve is governed by a seven-member board, each member of which is appointed to a single 14-year term, making it rare for a President to be able to appoint a majority of the board. Moreover, since members cannot be reappointed, there is less incentive to curry political favor. The Chairperson is appointed by the President to a four-year term and must also be approved by the Senate. These checks and balances make the Federal Reserve a relatively independent agency with the power to reject democratic pressures for inflationary stimulus. Although independent central banks can be a thorn in the side of politicians who want their aid in juicing the economy as elections approach, the evidence is that independent central banks reduce inflation without reducing economic growth. A multi-member governing board with long and overlapping appointments could also make the CDC more independent from democratic politics, which is what you want when a once-in-100-year pandemic hits and the organization needs to make unpopular decisions before most people see the danger.

Happy Secession Day!

Sunday, July 4th, 2021

Once again, happy Secession Day:

Legs got longer faster than torsos

Sunday, July 4th, 2021

Repeatedly, studies of families and twins find the heritability of height to be about 80 percent, David Epstein explains (in The Sports Gene):

For much of the twentieth century, denizens of industrialized societies were growing taller at a rate of about one centimeter per decade. In the seventeenth century, the average Frenchman was 5’4″, which is now the average for an American woman. The first generation of Japanese born to immigrant parents in America, known as the Nisei, famously towered over their parents.

In the 1960s, growth expert J. M. Tanner examined a set of identical twins that suggested the range of height variability caused by the environment. The identical boys were separated at birth, one brother raised in a nurturing household, and the other reared by a sadistic relative who kept him locked in a darkened room and made him plead for sips of water. In adulthood, the brother from the nurturing household was three inches taller than his identical twin, but many of their body proportions were similar. “The genetic control of shape is more rigorous than that of size,” Tanner wrote in Fetus into Man. The smaller brother was an abuse-shrunken version of the bigger brother.

[...]

Similarly, female gymnasts delay their growth spurt with furious training, but that does not diminish their ultimate adult height.

[...]

In World Wars I and II, European children were exposed to brief periods of famine during which their growth ground almost to a halt. When food again became plentiful, their bodies put the growth pedal to the metal such that adult height was not curtailed.

[...]

Consider that children grow more quickly in spring and summer than in fall and winter, and that this is apparently due to sunlight signals that enter through the eyeballs, since the growth of totally blind children consists of similar fluctuations but is not synchronized with the seasons.

The height that inhabitants of urban societies gained over the twentieth century came principally from increased leg length. Legs got longer faster than torsos. In developing countries that have gaping nutritional and infection-prevention disparities between the middle class and poor, the difference in height between the comfortable and the afflicted is all in the legs.

Japan displayed a startling growth trend during its “economic miracle” period following World War II. From 1957 to 1977, the average height of a Japanese man increased by 1.7 inches, and of a woman by an inch. By 1980, the height of Japanese people in Japan had caught up with the height of Japanese people in America. Amazingly, the entire height increase was accounted for by increased leg length. Modern Japanese people are still short compared with Europeans, but not as short as they once were. And they now have more similar proportions.

[...]

Every study that has examined race differences in body types has documented a disparity between black and white people that remains whether they reside in Africa, Europe, or the Americas. For any given sitting height — that is, the height of one’s head when one is sitting in a chair — Africans or African Americans have longer legs than Europeans. For a sitting height of two feet, an African American boy will tend to have legs that are 2.4 inches longer than a European boy’s. Legs make up a greater proportion of the body in an individual of recent African origin.

[...]

In their summary of the measurements of 1,265 Olympians from the 1968 Olympics in Mexico City, the scientists state that the successful body types within a sport are much more similar than body types between sports, regardless of ethnicity, but that “the most persistent of these differences” within sports are the narrow hip breadths and longer arms and legs of athletes with recent African ancestry.

[...]

In NBA predraft measurements for active players, the average white American NBA player was 6’7½″ with a wingspan of 6’10″. The average African American NBA player was 6’5½″ with a 6’11″ wingspan; shorter but longer.

[...]

“So maybe it’s not so much that white men can’t jump. White men just can’t reach high.”

[...]

In 1877, American zoologist Joel Asaph Allen published a seminal paper in which he noted that the extremities of animals get longer and thinner as one travels closer to the equator.

[...]

A 1998 analysis of hundreds of studies of native populations from around the world found that the higher the average annual temperature of a geographic region, the proportionally longer the legs of the people whose ancestors had historically resided there.

[...]

Africans with ancestry in southern regions of the continent, farther from the equator, do not necessarily have especially long limbs.

[...]

Nonetheless, the researchers reported that, compared with white adults of a given height, black adults have a center of mass — approximately the belly button — that is about 3 percent higher.

They used engineering models of bodies moving through fluids — air or water — to determine that the 3 percent difference translates into a 1.5 percent running speed advantage for athletes with the higher belly buttons (i.e., black athletes) and a 1.5 percent swimming speed advantage for athletes with a lower belly button (i.e., white athletes).
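The two figures fit together if the engineering model has speed scaling with the square root of the relevant body-segment length, which is my inference from the numbers quoted rather than something stated in the excerpt:

```python
# If speed scales with the square root of the relevant length (height of the
# center of mass for running, torso length above the water for swimming), then
# a 3% difference in that length gives roughly a 1.5% difference in speed.
# The square-root scaling is inferred from the figures quoted, not stated there.
length_ratio = 1.03
speed_ratio = length_ratio ** 0.5
print(f"Speed advantage: {(speed_ratio - 1) * 100:.1f}%")   # about 1.5%
```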

People are naturally curious, but they are not naturally good thinkers

Saturday, July 3rd, 2021

A second edition of Why Don’t Students Like School has just been published, and it stands the test of time, Robert Pondiscio says:

My 2009 copy of Why Don’t Students Like School by Dan Willingham is among the most dog-eared and annotated books I own. Along with E.D. Hirsch’s The Knowledge Deficit (2006) and Doug Lemov’s Teach Like a Champion (2010), I’m hard-pressed to think of another book in the last twenty years that had a greater impact on my teaching, thinking, or writing about education.

[…]

Willingham set out to put between two covers a set of enduring principles from cognitive science (“People are naturally curious, but they are not naturally good thinkers”; “factual knowledge precedes skill”; “proficiency requires practice,” et al.) that can reliably inform and shape classroom practice — a rich vein of ore that Willingham began to mine in his “Ask the Cognitive Scientist” columns for The American Educator starting nearly twenty years ago.

I mentioned the book 11 years ago, when Bryan Caplan was annoyed that it didn’t answer the question in its title.

Here are its nine principles:

  1. People are naturally curious, but they are not naturally good thinkers.
  2. Factual knowledge precedes skill.
  3. Memory is the residue of thought.
  4. We understand new things in the context of things we already know.
  5. Proficiency requires practice.
  6. Cognition is fundamentally different early and late in training.
  7. Children are more alike than different in terms of learning.
  8. Intelligence can be changed through sustained hard work.
  9. Teaching, like any complex cognitive skill, must be practiced to be improved.

I enjoyed some of the comments on Caplan’s post.

Boonton notes that people pretend that the students are the customers, or the parents are, but it’s really the taxpayers, who are paying to lock up troublesome kids. I don’t dispute this, but I must say that it’s an inefficient way to address the problem:

If schools aim to imprison students for the good of their true customers, the taxpayers, may I note that one attracts more flies with honey. New York spends $17,000 per student. An annual pass at Walt Disney World costs around $600.

Dahl himself would be exasperated over the 1971 film’s endurance

Friday, July 2nd, 2021

Willy Wonka and the Chocolate Factory came out 50 years ago:

Dahl himself would be exasperated over the 1971 film’s endurance. Though he was nominally billed as its screenwriter, his original adaptation was scarcely detectable beneath all manner of uncredited rewrites, and he was vocal in his disdain for the result, Wilder and all. His list of grievances was long: Dahl had wanted the arch British peculiarity of Spike Milligan or Peter Sellers for Wonka, he was unhappy with the film’s foregrounding of Wonka over Charlie, he resented plot alterations and additions that muddied the cautionary neatness of his original tale, and he wasn’t a fan of Leslie Bricusse and Anthony Newley’s perky song score.

[…]

Stuart, a workmanlike film-maker hitherto best-known for documentaries and sitcom-like farces, directed it with a halting, gear-grinding rhythm and an erratic sense of pace: it’s a stately 45 minutes before Wonka even makes his first appearance, whereupon the film rushes through its fantastical factory setpieces with businesslike indifference.

It does take shockingly long for Wonka and his factory to make their appearance.

I didn’t realize the film introduced “The Candy Man”, which became Sammy Davis Jr.’s hit.

According to Wikipedia, Charlie and the Chocolate Factory was originally going to feature a little black boy, and the Oompa-Loompas were described (and illustrated) as African pygmies, but the film’s announcement prompted a reaction from the NAACP.

Addendum: I also forgot that the film was the source of the oft-quoted, “I said, ‘Good day,’ sir!”


Once Upon a Time in Hollywood has just come out in paperback

Thursday, July 1st, 2021

Quentin Tarantino’s Once Upon a Time in Hollywood has just come out:

No, not the film. That came out in 2019. But now HarperCollins is publishing a novelization, written by Tarantino himself, and based on the earlier film. This particular type of fiction — the bastard offspring of the film treatment and the legitimate novel — is probably pop fiction’s least reputable genre, which no doubt is why it appeals to Tarantino.

When HarperCollins announced the project, Tarantino issued a statement:

To this day I have a tremendous amount of affection for the genre. So as a movie-novelization aficionado, I’m proud to announce Once Upon a Time in Hollywood as my contribution to this often marginalized, yet beloved sub-genre in literature. I’m also thrilled to further explore my characters and their world in a literary endeavor that can (hopefully) sit alongside its cinematic counterpart.

The genre is often looked down on:

Tarantino’s affection can probably be at least partially attributed to the year of his birth — 1963. Those of us born into the so-called Baby Boom generation grew up before videocassette players were widely available (and before DVD players and streaming services had even been conceived). Back in those benighted days, if you enjoyed a film based on an original screenplay and you wanted to experience it again after it had left the theater, your options were limited. You could wait for it to appear on television (where it would almost certainly be shortened, censored, cropped from its original aspect-ratio via pan-and-scan technology, and chock-full of commercial breaks), you could hope for it to enjoy a theatrical revival (highly unlikely), or you could seek out a novelization, which, though it would lack the colorful visuals and the musical score and the performances, would at least allow you to be thrilled once again by the plot and the dialogue, or some semblance thereof. Furthermore, although theaters wouldn’t allow people under 16 to see an R-rated film without parental accompaniment, bookstores had no such restrictions. A kid could buy the novelization of an R-rated movie without the book clerk asking to see his ID.