How did war become a game?

Sunday, June 30th, 2019

The anti-college is on the rise

Sunday, June 30th, 2019

The anti-college is on the rise, Molly Worthen (Apostles of Reason) notes, in the New York Times:

A small band of students will travel to Sitka, Alaska, this month to help reinvent higher education. They won’t be taking online courses, or abandoning the humanities in favor of classes in business or STEM, or paying high tuition to fund the salaries of more Assistant Vice Provosts for Student Life. They represent a growing movement of students, teachers and reformers who are trying to compensate for mainstream higher education’s failure to help young people find a calling: to figure out what life is really for.

These students will read works by authors ranging from Plato and Herbert Marcuse to Tlingit writers. The point is to “develop and flex a more rigorous political imagination,” according to one course syllabus. They will take on 15 to 20 hours a week of manual labor in Sitka, and set their group’s rules on everything from curfews to cellphones. Last summer’s cohort discouraged the use of phones during class and service hours and ordered everyone to turn off the internet at 10 p.m.

This is Outer Coast, one of an expanding number of educational experiments born out of a deepening sense that mainstream American colleges are too expensive, too bureaucratic, too careerist and too intellectually fragmented to help students figure out their place in the universe and their moral obligations to fellow humans.

There are alternative colleges that replace traditional courses with personalized study; gap-year programs that combine quasi-monastic retreats with world travel; summer seminars devoted to clearing trails and reading philosophy. They aim to prove that it is possible to cultivate moral and existential self-confidence, without the Christian foundation that grounded Western universities until the mid-20th century. They seek to push back against the materialism and individualism that have saturated the secular left and right, all at an affordable price. It’s a tall order.


Outer Coast, founded in 2015, offers one partial solution. Bryden Sweeney-Taylor, who is 38, helped found the program in order to give young people a taste of the education he received at an older countercultural experiment, Deep Springs College, which was founded in 1917. Deep Springs — a tuition-free, highly selective two-year liberal arts college and working ranch near Death Valley in California — combines intense study, manual labor and intimate community to give students “a sense of the purpose of education not just being for oneself but for something larger than one’s self,” Mr. Sweeney-Taylor told me.

Outer Coast — which was co-founded by Jonathan Kreiss-Tomkins, an Alaskan who dropped out of Yale to serve in the Alaska Legislature — plans to eventually expand into a two-year undergraduate program. The aim is to recruit Alaska Natives and other students from underrepresented backgrounds, but also to develop a program that reformers elsewhere might copy — which means “being as financially lean and economically efficient as possible, creating a replicable model for these micro-institutions,” Mr. Sweeney-Taylor said.


After graduating from Yale in 2010, Laura Marcus founded a Deep Springs-style program for women: the Arete Project, in the Blue Ridge Mountains in North Carolina. She has recently opened a second, coed program in rural Alaska.

The Arete Project — the Greek word means “excellence” in the broadest sense — calls itself “education for citizenship, stewardship and leadership.” It operates on a “pay what you can” model and bans alcohol, tobacco, marijuana and recreational drugs. The project began as a summer seminar with plans to grow into a yearlong undergraduate program (participants can earn college credit). Like Outer Coast, it immerses a small group of students in the demands of self-governance, study and manual labor. Students read a mix of “classical texts and contemporary texts,” Ms. Marcus said.

The curriculum has been a source of contention. “Some students feel strongly that Plato has a great deal to offer their intellectual situations, and some can’t believe they’re being asked to read a dead white man,” she told me. Moreover, romanticizing the pursuit of the pure “life of the mind” risks alienating working-class students. “For organizations that want to be accessible to students in all walks of life, as we do, that means having some kind of value proposition that translates into other parts of their life.”


A second set of new programs — the humanist individualists — owes more to the experiments of the counterculture era: schools like Evergreen State College, founded in Olympia, Wash., in 1967; Kresge College at the University of California, Santa Cruz, founded soon after; or the Jack Kerouac School of Disembodied Poetics, founded in Boulder, Colo., in 1974 by the poets Allen Ginsberg and Anne Waldman. These schools combined an interest in Eastern spirituality with the principles of humanistic psychology and the human potential movement, which emphasized the goodness in all humans and their gift for self-actualization.

In the 1960s, psychologists like Abraham Maslow and Carl Rogers challenged the dark, animalistic portrait of human beings advanced by behavioral psychologists and disciples of Freud — not to mention the grim vision of original sin in traditional Christian theology. Their ideas caught on among educators at a time when students were already in revolt against traditional curriculums. The result was the further erosion of already-weakening general education requirements in favor of radical student choice, which may have felt empowering in the short term, but added more confusion and doubt to universities’ sense of mission.


The trends that inspired Kresge and Evergreen State have only accelerated. Michelle Jones taught organizational behavior for 15 years in mainstream academic institutions, and in that time “higher ed became more of a business — all the administrators, all the K.P.I.s” (key performance indicators), “all detached from student experience,” she told me. After she got tenure, she said, “I paused to reflect, and I said, ‘I don’t think this is the job I thought it was going to be.’ ”

In 2015, she founded Wayfinding Academy in Portland, Ore., which offers a two-year associate degree in “self and society.” The curriculum is less conventionally academic than those of Outer Coast or the Arete Project, and more personalized.

Students shape their coursework with tutor-counselors called guides. They receive narrative evaluations instead of grades and design independent projects that help them learn “what it takes to do something epic” and how to “find their way back to their purpose when they feel lost,” according to a syllabus for a course on “Making Good Choices.” In that class, assigned readings and videos ranged from interviews with Noam Chomsky to a handout on “Anti-Oppressive Facilitation for Democratic Process” by a group called the Aorta cooperative.

Type A blood converted to universal-donor blood

Saturday, June 29th, 2019

To up the supply of universal-donor blood, scientists have tried transforming the second most common blood type, A, by removing its “A-defining” antigens:

But they’ve met with limited success, as the known enzymes that can strip the red blood cell of the offending sugars aren’t efficient enough to do the job economically.

After 4 years of trying to improve on those enzymes, a team led by Stephen Withers, a chemical biologist at the University of British Columbia (UBC) in Vancouver, Canada, decided to look for a better one among human gut bacteria. Some of these microbes latch onto the gut wall, where they “eat” the sugar-protein combos called mucins that line it. Mucins’ sugars are similar to the type-defining ones on red blood cells.

So UBC postdoc Peter Rahfeld collected a human stool sample and isolated its DNA, which in theory would include genes that encode the bacterial enzymes that digest mucins. Chopping this DNA up and loading different pieces into copies of the commonly used lab bacterium Escherichia coli, the researchers monitored whether any of the microbes subsequently produced proteins with the ability to remove A-defining sugars.

At first, they didn’t see anything promising. But when they tested two of the resulting enzymes at once — adding them to substances that would glow if the sugars were removed — the sugars came right off. The enzymes also worked their magic in human blood.

The enzymes originally come from a gut bacterium called Flavonifractor plautii, Rahfeld, Withers, and their colleagues report today in Nature Microbiology. Tiny amounts added to a unit of type A blood could get rid of the offending sugars, they found. “The findings are very promising in terms of their practical utility,” says Mohandas Narla, a red blood cell physiologist at the New York Blood Center. In the United States, type A blood makes up just under one-third of the supply, meaning the availability of “universal” donor blood could almost double.

A tough-on-crime WASP using torture, intimidation, and surveillance to bring down a media-savvy terrorist

Friday, June 28th, 2019

What might be called “Nolan’s enigma” began in earnest with The Dark Knight — which involved a tough-on-crime WASP using torture, intimidation, and surveillance to bring down a media-savvy terrorist:

The Dark Knight Rises took things one step further with Bane, a menacing mix of Robespierre and Ruthenberg, whose pseudo-Marxist coup unleashes all manner of mayhem upon Gotham: banishments and public hangings, street brawls and show trials, and — in a scene lifted straight out of the French revolution — the storming of Blackgate (Bastille) prison.

Not to be outdone, Marvel soon embraced its own brand of post-9/11 conservatism. In every Avengers film, Joshua Tait notes, “it really is 1938….The threats are real and the Avengers’ unilateral actions are necessary” to protect life, liberty, and democracy. Each hero thus functions as a kind of Cold Warrior, standing athwart would-be despots and authoritarians, while their enemies function as bland, unidimensional cannon-fodder, a convenient narrative pretext for blowing things up. (To be fair, the bad guys usually do possess weapons of mass destruction; this is fantasy, after all.)

By 2018, however, Marvel had ditched the neocon agitprop and gone full paleo. Black Panther — which Slate described as “the most feminist superhero movie yet” — is about the hereditary monarch of a monoracial ethno-state that keeps immigrants at bay with a high-tech border wall and faces no economic slowdown because of it. In fact, Wakanda becomes the richest country in the world without any international trade whatsoever, all while maintaining traditional religious customs and above-replacement fertility rates — a kind of black Israel. (It does eventually reconcile itself to foreign aid under T’Challa, but not to immigration.) Trouble only begins when Killmonger (a foreigner) challenges Black Panther’s claim to the throne — not because he thinks the current occupant is illegitimate, but because he wants to use Wakandan technology to launch a global, race-based revolution, with no regard for national boundaries.

Then in Avengers: Infinity War, Wakanda opens its border wall and promptly gets invaded by aliens.

So perhaps it is fitting that Avengers: Endgame, the Marvel movie to end all Marvel movies, is even more Burkean — and badass — than its predecessors, a sustained cinematic rejoinder to everything Hollywood believes. If you haven’t seen Endgame yet — or if you take comfort in the delusion that Marvel is “woke” — stop reading now.

The only way to get good content out of the Internet is by having humans in the loop

Thursday, June 27th, 2019

Neal Stephenson’s latest novel, Fall; or, Dodge in Hell, starts in the near future, where most of the Net has become a Miasma:

I saw someone recently describe social media in its current state as a doomsday machine, and I think that’s not far off. We’ve turned over our perception of what’s real to algorithmically driven systems that are designed not to have humans in the loop, because if humans are in the loop they’re not scalable and if they’re not scalable they can’t make tons and tons of money.

The result is the situation we see today where no one agrees on what factual reality is and everyone is driven in the direction of content that is “more engaging,” which almost always means that it’s more emotional, it’s less factually based, it’s less rational, and kind of destructive from a basic civics standpoint.


I think the only way to get good content out of the internet is by having humans in the loop. The reason that social media systems are architected the way they are, as I mentioned before, is because humans are expensive and you can’t scale that kind of system to serve billions and billions of people. What that kind of implies is that if you did want a curated, edited stream, that you would have to pay for it.

So that means that access to that kind of higher-quality view of the world becomes a class-based situation where people who’ve got the money to pay for or partially pay for human editors and curators are getting higher-quality info, which I think is just a slight kind of magnification or intensification of the way things are now anyway.

(I’ve mentioned before that I have mixed feelings about most of Stephenson’s work.)

Velocity is strangling baseball

Thursday, June 27th, 2019

Velocity is strangling baseball:

Baseball’s timeless appeal is predicated upon an equilibrium between pitching and hitting, and in the past, when that equilibrium has been thrown off, the game has always managed, either organically or through small tweaks, to return to an acceptable balance.

But there is growing evidence that essential equilibrium has been distorted by the increasing number of pitchers able to throw the ball harder and faster.


The 2018 season was the first in history in which strikeouts outpaced hits, a trend that has accelerated so far in 2019. The ball is in play less than ever, with a record 35.4 percent of plate appearances in 2019 resulting in a strikeout, walk or home run. Teams are using an average of 3.3 relievers per game in 2019, just below last year’s all-time record of 3.4. The leaguewide batting average of .245 in 2019 is the lowest since 1972 and a drop of 26 points from 1999, at the height of the steroids era. The leaguewide strikeout rate of 8.78 per nine innings, also a record, is higher than the career rate of Roger Clemens.


Most, if not all, of this change can be traced back to the rising velocity of the fastball — the fundamental unit of pitching — from a leaguewide average of 89 mph in 2002, when FanGraphs first recorded data, to 92.9 mph so far this season. At the upper end of the spectrum, the shift is even more striking: In 2008, there were 196 pitches thrown at 100 mph or higher, according to Statcast data. In 2018, there were 1,320, a nearly sevenfold increase. In 2008, only 11 pitchers averaged 95 mph or higher; in 2018, 74 did. Aroldis Chapman of the New York Yankees and Jordan Hicks of the St. Louis Cardinals have both been clocked at 105 mph.


Here, via Statcast, are the slash-lines (batting average/on-base percentage/slugging percentage) of MLB hitters in 2018 against four different pitch-speeds:

• Vs. 92 mph: .283/.364/.475
• Vs. 95 mph: .259/.342/.421
• Vs. 98 mph: .223/.310/.329
• Vs. 101 mph: .198/.257/.214
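The slash-line decline tracks with simple kinematics: each extra tick of velocity shaves milliseconds off the hitter’s reaction window. A minimal sketch — the 55-foot release distance is an assumed (but typical) figure for a pitcher’s extension, not from the article:

```python
# Reaction time available to a hitter at various pitch speeds.
# Assumes the ball is released ~55 feet from the plate (the rubber
# itself sits 60.5 feet away; extension is an assumption here).
RELEASE_DISTANCE_FT = 55.0

def reaction_time(mph: float) -> float:
    """Seconds from release to the plate, ignoring air drag."""
    ft_per_sec = mph * 5280 / 3600
    return RELEASE_DISTANCE_FT / ft_per_sec

for mph in (92, 95, 98, 101):
    print(f"{mph} mph: {reaction_time(mph) * 1000:.0f} ms")
```

Under those assumptions, going from 92 mph to 101 mph costs the hitter roughly 36 milliseconds — a large fraction of the time a swing decision takes.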


One seeming contradiction is that fastball usage, as a percentage of overall pitches, has been steadily decreasing, from 64.4 percent of all pitches in 2002 to just 52.8 percent so far this year. But that doesn’t mean pure velocity is any less effective — it merely indicates teams have learned to dole out fastballs in more effective patterns. The simple threat of a 99-mph fastball makes the 92-mph slider or the 90-mph change-up that much more effective.


In a 2018 study headed by former Red Sox trainer Mike Reinold, pitchers who went through a six-week velocity training program featuring weighted balls increased their velocity by an average of more than two mph but were “substantially” more likely to suffer arm injuries than those in the control group.


In 1893, when the mound was moved back 10 feet to its current distance, the change resulted in a 35-point jump in batting average and a 34 percent drop in strikeouts. By comparison, lowering the mound from 15 inches to 10 inches in 1969 resulted in more modest changes: an 11-point rise in batting average and a 2 percent drop in strikeouts.

The older woman recognized the warning signs

Wednesday, June 26th, 2019

We can learn from history, Jared Diamond argues (Upheaval) — both personal history and history history:

I write these lines just after spending an evening with two women friends, one of them a psychologically naïve optimist in her 20’s, the other a perceptive person in her 70’s. The younger woman was devastated by the recent break-up of her relationship with a fascinating man who had seemed so caring, but who suddenly, after several years, cruelly and without warning abandoned the woman. But as the younger woman related her story, even before reaching the devastating denouement, the older woman (without having met the man) recognized the warning signs that the man was a charming but destructive narcissist, of whom she had come to understand quite a few.


Thucydides described how the citizens of the small Greek island of Melos responded to pressure from the powerful Athenian Empire. In a passage now known as the Melian Dialogue, Thucydides reconstructed the gut-wrenching negotiations between the Melians and the Athenians: the Melians bargaining for their freedom and their lives, attempting to convince the Athenians not to use force; and the Athenians warning the Melians to be realistic. Thucydides then briefly related the outcome: the Melians refused Athenian demands, just as the Finns two millennia later initially refused Soviet demands; the Athenians besieged Melos; the Melians resisted successfully for some time; but they eventually had to surrender; and—the Athenians killed all the Melian men and enslaved all the women and children.

Free throws should be easy

Wednesday, June 26th, 2019

Free throws should be easy, right?

For decades, elite players in the NBA, WNBA, and NCAA have averaged between 70 and 75 percent from the foul line. Most of basketball’s sharpest shooters top out in the high eighties, with Nash being one of only two NBA players to retire with a career average above 90 percent. His consistency at the line raises some questions: For starters, why isn’t everyone else better? But also: If Nash can show up unpracticed, four years after retirement, and drain 98 percent of his free throws in an impromptu shootout against a ham-handed journalist, what kept him from shooting that reliably during his career?

On paper, the free throw could not be more straightforward. It’s a direct, unguarded shot at a hoop 18 inches across, 10 feet off the ground, and 15 feet away. Like a carefully controlled experiment, the conditions are exactly the same every single time. Larry Silverberg, a dynamicist at North Carolina State University, has used this fact to study the free throw in remarkable detail. “It’s the same for every single player, so you can actually look at the shot very scientifically,” he says.

An expert in the modeling of physical phenomena, Silverberg has examined the physics of the free throw for 20 years, using computers to simulate the trajectories of millions of shots. His findings show that a successful free throw boils down to four parameters: the speed at which you release the ball, how straight you shoot it, the angle at which it leaves your hand, and the amount of backspin that you place on it.


The ideal rate of spin is three backward rotations per second; since the ball’s flight from a player’s hand to the hoop lasts about a second, that works out to roughly three full rotations en route. (That spin buys you some wiggle room, in the event you over- or under-shoot.) The best angle of trajectory is between 46 and 54 degrees from the horizon, depending on your height. The most advantageous release angle for a given shooter also corresponds to their lowest launch speed—a relationship that helps explain why shots that go in often feel like they require less effort than shots that don’t. As Nash describes it: “There’s no strain, there’s no forcing, there’s no flicking at the rim, there’s just a really smooth stroke.”
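Silverberg’s parameters can be plugged into ordinary projectile motion. A rough sketch — the 52-degree launch angle and seven-foot release height are assumed figures within the ranges above, not Silverberg’s own numbers — solving for the release speed that carries the ball through the rim’s center:

```python
import math

G = 9.81            # gravity, m/s^2
RIM_HEIGHT = 3.05   # 10 ft rim, in meters
RELEASE_H = 2.13    # ~7 ft release height (assumed)
DIST = 4.19         # ~13.75 ft from the line to the rim's center

def release_speed(angle_deg: float) -> float:
    """Speed (m/s) needed to pass through the rim center, no drag.

    Solves y = x*tan(theta) - g*x^2 / (2*v^2*cos^2(theta)) for v.
    """
    theta = math.radians(angle_deg)
    dy = RIM_HEIGHT - RELEASE_H
    return DIST * math.sqrt(
        G / (2 * math.cos(theta) ** 2 * (DIST * math.tan(theta) - dy))
    )

print(f"{release_speed(52):.2f} m/s")
```

Under these assumptions the required speed comes out near 7.15 m/s, and varying the angle across Silverberg’s 46-to-54-degree window moves it only slightly — consistent with his point that the best angle is the one demanding the lowest launch speed.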

The best free throw shooter on earth isn’t a pro basketball player, but Bob Fisher, a 62-year-old soil-conservation technician from Centralia, Kansas:

“I played high school basketball, and I played recreationally till I was 44.” A few years later, in his early 50s, he started practicing free throws every day at his local gym. That was September 2009. Within a couple of months he was consistently sinking more than 100 shots in a row. In January 2010 he set his first world record. Since then, his speed and accuracy from the foul line have garnered him an additional 24 Guinness titles.

Fisher happily shares the secrets to his success. He attributes his accuracy and precision to something he calls the centerline technique (it involves aligning the lower palm and middle finger with the rim of the basket), the details of which he has recounted in a book and instructional video. His consistency he attributes to preparation. For years, Fisher has spent hours a day refining his shot. “All it takes to become good is three things: knowledge, practice, and time,” he says.

There’s a reason we shoot better in practice than in a game:

“I think we’ve all had the experience where we can hit that shot when no one’s watching, but when all eyes are on us we fumble,” says cognitive scientist Sian Beilock, president of Barnard College and author of Choke: What the Secrets of the Brain Reveal About Getting It Right When You Have To. Beilock attributes those mistakes to something she calls paralysis by analysis: When a player overthinks a task, it interrupts the working memory they’ve established through hours of practice. Remember the hyper-coordinated movements required for releasing a free throw shot at a precise speed? They’re exactly the kind of thing that overanalysis tends to screw up. Closing the gap between training and competition, Beilock says, is a matter of practicing under conditions that simulate high-pressure scenarios: Training under a watchful eye, or competing against the clock.

Focus just on birds and airplanes

Tuesday, June 25th, 2019

Jared Diamond shares a story (Upheaval) about how two nations, which don’t get along, were nonetheless able to solve a problem:

Israel has invaded and partially occupied Lebanon. Lebanon has served as a base for launching rocket attacks into Israel. Nevertheless, bird-watchers of those two countries succeeded in reaching a milestone agreement. Eagles and other large birds migrating seasonally between Europe and Africa fly south from Lebanon through Israel every autumn, then north again from Israel through Lebanon every spring. When aircraft collide with those large birds, the result is often mutual destruction. (I write this sentence a year after my family and I survived the collision of our small chartered plane with an eagle, which dented but didn’t bring down our plane; the eagle died.) Such collisions had been a leading cause of fatal plane accidents in Lebanon and Israel. That stimulated bird-watchers of those two countries to establish a mutual warning system. In the autumn Lebanese bird-watchers warn their Israeli counterparts and Israeli air traffic controllers when they see a flock of large birds over Lebanon heading south towards Israel, and in the spring Israeli bird-watchers warn of birds heading north. While it’s obvious that this agreement is mutually advantageous, it required years of discussions to overcome prevailing hatreds, and to focus just on birds and airplanes.

Resignation was the defining condition of Soviet life

Tuesday, June 25th, 2019

Masha Gessen explains what HBO’s Chernobyl got right and wrong:

Before I get to what the series got so terribly wrong, I should acknowledge what it got right. In “Chernobyl,” which was created and written by Craig Mazin and directed by Johan Renck, the material culture of the Soviet Union is reproduced with an accuracy that has never before been seen in Western television or film—or, for that matter, in Russian television or film. Clothes, objects, and light itself seem to come straight out of nineteen-eighties Ukraine, Belarus, and Moscow. (There are tiny errors, like a holiday uniform worn by schoolchildren on a non-holiday, or teen-agers carrying little kids’ school bags, but this is truly splitting hairs.) Soviet-born Americans—and, indeed, Soviet-born Russians—have been tweeting and blogging in awe at the uncanny precision with which the physical surroundings of Soviet people have been reproduced. The one noticeable mistake in this respect concerns the series makers’ apparent ignorance of the vast divisions between different socioeconomic classes in the Soviet Union: in the series, Valery Legasov (Jared Harris), a member of the Academy of Sciences, lives in nearly the same kind of squalor as a fireman in the Ukrainian town of Pripyat. In fact, Legasov would have lived in an entirely different kind of squalor than the fireman did.

Herein lies one of the series’ biggest flaws: its failure to accurately portray Soviet relationships of power. There are exceptions, flashes of brilliance that shed light on the bizarre workings of Soviet hierarchies. In the first episode, for example, during an emergency meeting of the Pripyat ispolkom, the town’s governing council, an elder statesman, Zharkov (Donald Sumpter), delivers a chilling, and chillingly accurate, speech, urging his compatriots to “have faith.” “We seal off the city,” Zharkov says. “No one leaves. And cut the phone lines. Contain the spread of misinformation. That is how we keep the people from undermining the fruits of their own labor.” This statement has everything: the bureaucratic indirectness of Soviet speech, the privileging of “fruits of labor” over the people who created them, and, of course, the utter disregard for human life.

The final episode of “Chernobyl” also contains a scene that encapsulates the Soviet system perfectly. During the trial of three men who have been deemed responsible for the disaster, a member of the Central Committee overrules the judge, who then looks to the prosecutor for direction—and the prosecutor gives that direction with a nod. This is exactly how Soviet courts worked: they did the bidding of the Central Committee, and the prosecutor wielded more power than the judge.

Unfortunately, apart from these striking moments, the series often veers between caricature and folly. In Episode 2, for example, the Central Committee member Boris Shcherbina (Stellan Skarsgård) threatens to have Legasov shot if he doesn’t tell him how a nuclear reactor works. There are a lot of people throughout the series who appear to act out of fear of being shot. This is inaccurate: summary executions, or even delayed executions on orders of a single apparatchik, were not a feature of Soviet life after the nineteen-thirties. By and large, Soviet people did what they were told without being threatened with guns or any punishment.

Similarly repetitive and ridiculous are the many scenes of heroic scientists confronting intransigent bureaucrats by explicitly criticizing the Soviet system of decision-making. In Episode 3, for example, Legasov asks, rhetorically, “Forgive me—maybe I’ve just spent too much time in my lab, or maybe I’m just stupid. Is this really the way it all works? An uninformed, arbitrary decision that will cost who knows how many lives that is made by some apparatchik, some career Party man?” Yes, of course this is the way it works, and, no, he hasn’t been in his lab so long that he didn’t realize that this is how it works. The fact of the matter is, if he didn’t know how it worked, he would never have had a lab.

Resignation was the defining condition of Soviet life. But resignation is a depressing and untelegenic spectacle. So the creators of “Chernobyl” imagine confrontation where confrontation was unthinkable—and, in doing so, they cross the line from conjuring a fiction to creating a lie. The Belarusian scientist Ulyana Khomyuk (Emily Watson) is even more confrontational than Legasov. “I am a nuclear physicist,” she tells an apparatchik, in Episode 2. “Before you were Deputy Secretary, you worked in a shoe factory.” First, she’d never say this. Second, the apparatchik might have worked at a shoe factory, but, if he was an apparatchik, he was no cobbler; he had come up the Party ladder, which might indeed have begun at the factory—but in an office, not on the factory floor. The apparatchik—or, more accurately, the caricature of the apparatchik—pours himself a glass of vodka from a carafe that sits on his desk and responds, “Yes, I worked in a shoe factory. And now I’m in charge.” He toasts, in what appears to be the middle of the day: “To the workers of the world.” No. No carafe, no vodka in the workplace in front of a hostile stranger, and no boasting “I’m in charge.”

The biggest fiction in this scene, though, is Khomyuk herself. Unlike other characters, she is made up—according to the closing titles, she represents dozens of scientists who helped Legasov investigate the cause of the disaster. Khomyuk appears to embody every possible Hollywood fantasy. She is a truth-knower: the first time we see her, she is already figuring out that something has gone terribly wrong, and she is grasping it terribly fast, unlike the dense men at the actual scene of the disaster, who seem to need hours to take it in. She is also a truth-seeker: she interviews dozens of people (some of them as they are dying of radiation exposure), digs up a scientific paper that has been censored, and figures out exactly what happened, minute by minute. She also gets herself arrested and then immediately seated at a meeting on the disaster, led by Gorbachev. None of this is possible, and all of it is hackneyed. The problem is not just that Khomyuk is a fiction; it’s that the kind of expert knowledge she represents is a fiction. The Soviet system of propaganda and censorship existed not so much for the purpose of spreading a particular message as for the purpose of making learning impossible, replacing facts with mush, and handing the faceless state a monopoly on defining an ever-shifting reality.

In the absence of a Chernobyl narrative, the makers of the series have used the outlines of a disaster movie. There are a few terrible men who bring the disaster about, and a few brave and all-knowing ones, who ultimately save Europe from becoming uninhabitable and who tell the world the truth. It is true that Europe survived; it is not true that anyone got to the truth, or told it.

Average per-capita consumption rates of resources are about 32 times higher in the First World than in the developing world

Monday, June 24th, 2019

Naturally Jared Diamond (Upheaval) is concerned about the environment — and about the rest of the world trying to live like us:

The most discussed primary effect of CO2 release is to act as a so-called greenhouse gas in the atmosphere. That’s because atmospheric CO2 is transparent to the sun’s shortwave radiation, allowing incoming sunlight to pass through the atmosphere and warm the Earth’s surface. The Earth re-radiates that energy back towards space, but at longer thermal infrared wavelengths to which CO2 is opaque. Hence the CO2 absorbs that re-radiated energy and re-emits it in all directions, including back down to the Earth’s surface.
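The re-radiation balance Diamond describes can be made quantitative with the standard zero-dimensional energy budget — a textbook sketch, not from Upheaval: absorbed sunlight must equal emitted infrared, and solving for temperature with no greenhouse absorption at all gives roughly 255 K, about 33 K below the observed surface mean of ~288 K. That gap is the greenhouse effect.

```python
# Zero-dimensional energy balance: absorbed shortwave equals
# emitted longwave, (1 - albedo) * S / 4 = sigma * T^4.
SOLAR_CONSTANT = 1361.0   # W/m^2 at the top of the atmosphere
ALBEDO = 0.30             # fraction of sunlight reflected
SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W/m^2/K^4

absorbed = (1 - ALBEDO) * SOLAR_CONSTANT / 4
T_effective = (absorbed / SIGMA) ** 0.25

print(f"{T_effective:.0f} K")  # ~255 K without greenhouse absorption
```

The factor of 4 spreads the intercepted sunlight (a disk) over the whole rotating sphere.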


But there are two other primary effects of CO2 release. One is that the CO2 that we produce also gets stored in the oceans as carbonic acid. But the ocean’s acidity is already higher than at any time in the last 15 million years. That dissolves the skeletons of coral, killing coral reefs, which are a major breeding nursery of the ocean’s fish, and which protect tropical and subtropical sea-coasts against storm waves and tsunamis. At present, the world’s coral reefs are contracting by 1% or 2% per year, so they will mostly be gone within this century, and that means big declines in tropical coastal safety and protein availability from seafood.
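Diamond's projection follows from simple compounding (a sketch; the 1-2% annual rates are his, while the 80-year horizon is an assumption standing in for "within this century"):

```python
# How much reef survives N years of steady contraction?
def remaining(rate_per_year: float, years: int) -> float:
    """Fraction of reef left after compounding annual losses."""
    return (1 - rate_per_year) ** years

years = 80  # roughly "within this century" (assumed horizon)
print(f"1% per year: {remaining(0.01, years):.0%} left")  # ~45%
print(f"2% per year: {remaining(0.02, years):.0%} left")  # ~20%
```

Even the lower rate more than halves the reefs; at 2% per year roughly four-fifths are gone, consistent with Diamond's "mostly gone within this century."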


For instance, when non-poisonous chlorofluorocarbon gases (CFCs) replaced the poisonous gases previously used in refrigerators until the 1940s, it seemed like a wonderful and safe engineering solution to the refrigerator gas problem, especially because laboratory testing had revealed no downside to CFCs. Unfortunately, lab tests couldn’t reveal how CFCs, once they got into the atmosphere, would begin to destroy the ozone layer that protects us from ultraviolet radiation.


France has generated most of its national electricity requirements from nuclear reactors for many decades without an accident.


Europeans are discouraged from buying expensive big cars with high fuel consumption and low gas mileage, because the purchase tax on cars in some European countries is set at 100%, doubling the cost of the car.


Also, European government taxes on gasoline drive gas prices to more than $9 per gallon, another disincentive to buying a fuel-inefficient car.


These various resources differ in four respects important for understanding their potential for creating problems for us: their renewability, and the resulting management problems; their potential for limiting human societies; their international dimensions; and the international competition that they provoke, including wars.


There have already been some attempts to exploit all three: after World War One the German chemist Fritz Haber worked on a process to extract gold from ocean water; at least one attempt has been made to tow an iceberg from Antarctica to a water-poor Middle Eastern nation; and efforts are far advanced to mine some minerals from the ocean floor.


Fresh water is also mobile: many rivers flow between two or more countries, and many lakes are bordered by two or more countries, hence one country can draw down or pollute fresh water that another country wants to use.


Average per-capita consumption rates of resources like oil and metals, and average per-capita production rates of wastes like plastics and greenhouse gases, are about 32 times higher in the First World than in the developing world.


The First World consists of about 1 billion people who live mostly in North America, Europe, Japan, and Australia, and who have relative average per-capita consumption rates of 32.
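The arithmetic behind Diamond's worry is easy to make explicit (a sketch; the 32:1 ratio and the roughly 1 billion First World population are his, while the 6.5 billion figure for the rest of the world is an illustrative assumption):

```python
# Relative consumption units: First World rate = 32, rest of world = 1.
first_world_pop = 1.0   # billions (Diamond's figure)
rest_pop = 6.5          # billions (illustrative assumption)

current_total = first_world_pop * 32 + rest_pop * 1
equalized_total = (first_world_pop + rest_pop) * 32  # everyone at rate 32

print(f"world consumption multiplier if all lived like the First World: "
      f"{equalized_total / current_total:.1f}x")  # ~6.2x
```

So even with zero population growth, raising everyone to First World consumption rates would multiply total world consumption roughly sixfold.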


Many decades ago, American diplomats used to play a game of debating which of the world’s countries were most irrelevant to U.S. national interests. Popular answers were “Afghanistan” and “Somalia”: those two countries were so poor, and so remote, that it seemed that they could never do anything to create problems for us.


Among the ways in which globalization has made differences in living standards around the world untenable, three stand out. One is the spread of emerging diseases from poor remote countries to rich countries.


Many people in poor countries get frustrated and angry when they become aware of the comfortable lifestyles available elsewhere in the world. Some of them become terrorists, and many others who aren’t terrorists themselves tolerate or support terrorists.


Only in poor countries, where much of the population does feel desperate and angry, is there toleration or support for terrorists.


They have two ways of achieving it. First, governments of developing countries consider an increase in living standards, including consumption rates, as a prime goal of national policy. Second, tens of millions of people in the developing world are unwilling to wait to see whether their government can deliver high living standards within their lifetime. Instead, they seek the First World lifestyle now, by emigrating to the First World, with or without permission: especially by emigrating to Western Europe and the U.S., and also to Australia; and especially from Africa and parts of Asia, and also from Central and South America. It’s proving impossible to keep out the immigrants.

A Muscle Beach bodybuilder & his hapa surfer buddy battle a cult that exploits rich hippies

Monday, June 24th, 2019

Gwern reviews Conan the Barbarian:

(Got around to watching after reading an amusing tweet summary: “An underappreciated thing about the Conan the Barbarian movie is how low-key informed it is by 1970s California beach culture. It’s basically about a Muscle Beach bodybuilder & his hapa surfer buddy doing drugs, having casual sex & battling a cult that exploits rich hippies.” Having already watched Pumping Iron, which shows Arnold Schwarzenegger and his milieu not long before, while he was still trying to transition from bodybuilding to film, I was intrigued by the comparison. And Stentz’s summary is… dead on. It’s so easy to see them as Californian bodybuilders bumbling around, having a good time, distracted by a hippie Californian Asian/human-potential cult — complete with longhaired acolytes twirling flowers and meditating, and hilariously homoerotic dialogue, which, as “The Power and the Gory” takes pains to remind us, was a big part of the bodybuilding scene: even straight bodybuilders would whore themselves out to gay men for money or access to controlled steroids/drugs. I was further surprised by how slow-moving and mild it is — it repeatedly pulls punches and takes more peaceful ways out than its bloody reputation would suggest (even the Seven Samurai-homage set-piece features possibly less bloodshed than the original), right up to the climax. Of course Thulsa Doom is going to transform into his giant serpent form and fight Conan, right? Nope! And then all the cultists just quietly disperse.)

I happen to be listening to Schwarzenegger’s memoir, Total Recall, and this all rings true.

We know of at least three false alarms given by the American detection system

Sunday, June 23rd, 2019

The most obvious crisis that the US has faced — and continues to face — Jared Diamond argues (in Upheaval) is nuclear armageddon:

For example, on the first day of the week-long Cuban Missile Crisis, Kennedy announced publicly that any launch of a Soviet missile from Cuba would require “a full retaliatory response [of the U.S.] upon the Soviet Union.” But Soviet submarine captains had the authority to launch a nuclear torpedo without first having to confer with Soviet leadership in Moscow. One such Soviet submarine captain did consider firing a nuclear torpedo at an American destroyer threatening the submarine; only the intervention of other officers on his ship dissuaded him from doing so. Had the Soviet captain carried out his intent, Kennedy might have faced irresistible pressure to retaliate, leading to irresistible pressure on Khrushchev to retaliate further…


Once missiles have been launched, are underway, and have been detected, the American or Russian president has about 10 minutes to decide whether to launch a retaliatory attack before the incoming missiles destroy the land-based missiles of his country. Launched missiles can’t be recalled.


We know of at least three false alarms given by the American detection system. For example, on November 9, 1979, the U.S. Army general serving as watch officer for the U.S. system phoned then-Under-Secretary of Defense William Perry in the middle of the night to say, “My warning computer is showing 200 ICBMs in flight from the Soviet Union to the United States.” But the general concluded that the signal was probably a false alarm, Perry did not awaken President Carter, and Carter did not push the button and needlessly kill a hundred million Soviets. It eventually turned out that the signal was indeed a false alarm due to human error: a computer operator had by mistake inserted into the U.S. warning system computer a training tape simulating the launch of 200 Soviet ICBMs.


We also know of at least one false alarm given by the Russian detection system: a single non-military rocket launched in 1995 from an island off Norway towards the North Pole was misidentified by the automatic tracking algorithm of Russian radar as a missile launched from an American submarine.


U.S. policy towards Russia today ignores the lesson that Finland’s leaders drew from the Soviet threat after 1945: that the only way of securing Finland’s safety was to engage in constant frank discussions with the Soviet Union, and to convince the Soviets that Finland could be trusted and posed no threat (Chapter 2).

We’ve displaced our fears of nuclear weapons onto nuclear power plants

Sunday, June 23rd, 2019

Michael Shellenberger explains (in Forbes) what HBO’s Chernobyl got wrong:

In interviews around the release of HBO’s “Chernobyl,” screenwriter and show creator Mazin insisted that his mini-series would stick to the facts. “I defer to the less dramatic version of things,” Mazin said, adding, “you don’t want to cross a line into the sensational.”

In truth, “Chernobyl” runs across the line into sensational in the first episode and never looks back.

In one episode, three characters dramatically volunteer to sacrifice their lives to drain radioactive water, but no such event occurred.

“The three men were members of the plant staff with responsibility for that part of the power station and on shift at the time the operation began,” notes Adam Higginbotham, author of Midnight in Chernobyl, a well-researched new history. “They simply received orders by telephone from the reactor shop manager to open the valves.”

Nor did radiation from the melted reactor crash a helicopter that flew too close, as is suggested in “Chernobyl.” There was a helicopter crash but it took place six months later and had nothing to do with radiation. One of the helicopter’s blades hit a chain dangling from a construction crane.

The most egregious instance of “Chernobyl” sensationalism is its depiction of radiation as contagious, like a virus. The scientist-hero played by Emily Watson physically drags away the pregnant wife of a Chernobyl firefighter dying from Acute Radiation Syndrome (ARS).

“Get out! Get out of here!” Watson screams, as though every second the woman is with her husband she is poisoning her baby.

But radiation is not contagious. Once someone has removed their clothes and been washed, as the firefighters were in real life, and in “Chernobyl,” the radioactivity is internalized and not contagious.

Why, then, do hospitals isolate radiation victims behind plastic screens? Because their immune systems have been weakened and they are at risk of being exposed to something they can’t handle. In other words, the contamination threat is the opposite of that depicted in “Chernobyl.”

The baby dies. Watson says, “The radiation would have killed the mother, but the baby absorbed it instead.” Mazin and HBO apparently believe such an event actually occurred.

HBO tries to clean up some of the sensationalism with captions at the very end of the series. None note that claiming a baby died by “absorbing” radiation from its father is total and utter pseudoscience.

There is no good evidence that Chernobyl radiation killed a baby nor that it caused any increase in birth defects.

“We’ve now had a chance to observe all the children that have been born close to Chernobyl,” reported UCLA physician Robert Gale in 1987, and “none of them, at birth, at least, has had any detectable abnormalities.”

Indeed, the only public health impact beyond the deaths of the first responders was 20,000 documented cases of thyroid cancer in those aged under 18 at the time of the accident.

The United Nations in 2017 concluded that only 25% of those cases (5,000) can be attributed to Chernobyl radiation (paragraphs A-C). In earlier studies, the UN estimated there could be up to 16,000 cases attributable to Chernobyl radiation.

Since thyroid cancer has a mortality rate of just one percent, that means the expected deaths from thyroid cancers caused by Chernobyl will be 50 to 160 over an 80-year lifespan.
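Shellenberger's 50-to-160 range is just the case counts times the mortality rate (a sketch of the arithmetic stated above):

```python
mortality = 0.01  # thyroid cancer mortality rate of about one percent

low_cases, high_cases = 5_000, 16_000  # UN attribution figures cited above
expected_deaths = (low_cases * mortality, high_cases * mortality)

print(f"expected deaths over an 80-year lifespan: "
      f"{expected_deaths[0]:.0f} to {expected_deaths[1]:.0f}")  # 50 to 160
```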

At the end of the show, HBO claims there was “a dramatic spike in cancer rates across Ukraine and Belarus,” but this too is wrong.

Residents of those two countries were “exposed to doses slightly above natural background radiation levels,” according to the World Health Organization. If there are additional cancer deaths they will be “about 0.6% of the cancer deaths expected in this population due to other causes.”

Radiation is not the superpotent toxin “Chernobyl” depicts. In episode one, high doses of radiation make workers bleed, and in episode two, a nurse who merely touches a firefighter sees her hand turn bright red, as though burned. Neither thing occurred or is possible.

“Chernobyl” ominously depicts people gathered on a bridge watching the Chernobyl fire. At the end of the series, HBO claims, “it has been reported that none survived. It is now known as the ‘Bridge of Death.’”

But the “Bridge of Death” is a sensational urban legend and there is no good evidence to support it.

“Chernobyl” is just as misleading for what it leaves out. It gives the impression that all Chernobyl first responders who suffered Acute Radiation Syndrome (ARS) died. In reality, 80 percent of those with ARS survived.

It’s clear that even highly educated and informed viewers, including journalists, mistook much of “Chernobyl” fiction for fact.

The New Yorker repeated the claim that a woman’s baby “absorbed radiation” and died. The New Republic described radiation as “supernaturally persistent” and contagious (a “zombie logic, by which anyone who is poisoned becomes poisonous themselves”). The Economist, People, and others repeated the “bridge of death” urban legend.

There is a human cost to these misrepresentations. The notion that people exposed to radiation are contagious was used to terrify, stigmatize, and isolate people in Hiroshima and Nagasaki, Japan, Chernobyl, and again in Fukushima.

Women in the areas that received low levels of radiation from Chernobyl terminated 100,000 to 200,000 pregnancies in a panic, and those who were exposed to Chernobyl radiation were four times more likely to report anxiety, depression, and post-traumatic stress disorder.


In the end, HBO’s “Chernobyl” gets nuclear wrong for the same reason humankind as a whole has been getting it wrong for over 60 years, which is that we’ve displaced our fears of nuclear weapons onto nuclear power plants.

In reality, Chernobyl proves why nuclear is the safest way to make electricity. In the worst nuclear power accidents, relatively small amounts of particulate matter escape, harming only a handful of people.

Americans spend three to four times more time watching TV together than talking with one another

Saturday, June 22nd, 2019

Jared Diamond argues (in Upheaval) that the US is facing a political and cultural crisis:

No one, in the 5,400-year history of centralized government on all of the continents, has figured out how to ensure that the policies implemented with enviable speed by dictatorships consist predominantly of good policies.


I also acknowledge that democracy isn’t necessarily the best option for all countries; it’s difficult for it to prevail in countries lacking the prerequisites of a literate electorate and a widely accepted national identity.


To understand the fundamental benefits of an immigrant population, imagine that you could divide the population of any country into two groups: one consisting on the average of the youngest, healthiest, boldest, most risk-tolerant, most hard-working, ambitious, and innovative people; the other consisting of everybody else. Transplant the first group to another country, and leave the second group in their country of origin. That selective transplanting approximates the decision to emigrate and its successful accomplishment.


One friend of mine, nominated to a second-level position in the National Oceanic and Atmospheric Administration, withdrew his candidacy when he still hadn’t been confirmed after a year of waiting.


Why has this breakdown of political compromise accelerated within the last two decades? In addition to the other harm that it causes, it’s self-reinforcing, because it makes people other than uncompromising ideologues reluctant to seek government service as an elected representative. Two friends of mine who had been widely respected long-serving U.S. senators, and who seemed likely to succeed once again if they ran for re-election, decided instead to retire because they were so frustrated with the political atmosphere in Congress.


One suggested explanation is the astronomical rise in costs of election campaigns, which has made donors more important than in the past.


As one disillusioned friend wrote me after retiring from a long career in politics, “Of all the issues that we face, I think that the skew of money in our political system and our personal lives has been by far the most damaging. Politicians and political outcomes have been purchased on a grander scale than ever before… the scramble for political money saps time and money and enthusiasm… political schedules bend to money, political discourse worsens, and politicians do not know each other as they fly back and forth to their districts.”


Formerly, our representatives served in Congress in Washington during the week; then they had to remain in Washington for the weekend because they couldn’t return to their home state and back within the span of a weekend. Their families lived in Washington, and their children went to school in Washington. On weekends the representatives and their spouses and children socialized with one another, the representatives got to know one another’s spouses and children, and the representatives spent time with one another as friends and not just as political adversaries or allies. Today, though, the high cost of election campaigns puts pressure on representatives to visit their home state often for the purpose of fund-raising, and the growth of domestic air travel makes that feasible.


Those of you American readers over the age of 40, please reflect on changes that you’ve seen yourself in American elevator behavior (people waiting to enter an elevator now less likely to wait for those exiting the elevator); declining courtesy in traffic (not deferring to other drivers); declining friendliness on hiking trails and streets (Americans under 40 less likely to say hello to strangers than Americans over 40); and above all, in many circles, increasingly abusive “speech” of all sorts, especially in electronic communication.


American academic debates have become more vicious today than they were 60 years ago.


Already at the beginning of my academic career, I found myself involved in scholarly controversies, just as I am now. But I formerly thought of the scientists with whom I disagreed on scientific matters as personal friends, not as personal enemies. For example, I recall spending a vacation in Britain after a physiological conference, touring ruined Cistercian monasteries with a nice and gentle American physiologist with whom I had strongly disagreed about the mechanism of epithelial water transport at the conference. That would be impossible today. Instead, I’ve now repeatedly been sued, threatened with lawsuits, and verbally abused by scholars disagreeing with me. My lecture hosts have been forced to hire bodyguards to shield me from angry critics. One scholar concluded a published review of one of my books with the words “Shut up!”


All of these arenas of American life are facets of the same widely discussed phenomenon: the decline of what is termed “social capital.”


“… social capital refers to connections among individuals—social networks and the norms of reciprocity and trustworthiness that arise from them. In that sense social capital is closely related to what some have called ‘civic virtue.’”


But Americans have been decreasingly involved in such face-to-face groups, while becoming increasingly involved in on-line groups in which you never meet, see, or hear the other person.


The telephone appeared in 1890 but didn’t saturate the U.S. market until around 1957. Radio rose to saturation from 1923 to 1937, and TV from 1948 to 1955. The biggest change has been the more recent rise of the internet, cell phones, and text messaging.


Americans spend three to four times more time watching TV together than talking with one another, and at least one-third of all TV viewing time is spent alone (often on the internet rather than in front of a TV set).


A Canadian valley contained three otherwise similar towns, one of which happened to be out of reach of the TV transmitter serving the area. When that town did gain reception, participation in clubs and other meetings declined from its pre-TV levels, down to levels comparable to participation in the other two towns already served by TV.


In the remote areas of New Guinea where I do fieldwork, and where new communication technologies haven’t yet arrived, all communication is still face-to-face and full-attention—as it used to be in the U.S. Traditional New Guineans spend most of their waking hours talking to one another. In contrast to the distracted and sparse conversations of Americans, traditional New Guinea conversations have no interruptions to look at the cell phone in one’s lap, nor to tap out e-mails or text messages during a conversation with a person physically present but receiving only a fraction of one’s attention.


One American missionary’s son who grew up as a child in a New Guinea village and moved to the U.S. only in his high school years described his shock on discovering the contrast between children’s playing styles in New Guinea and in the U.S. In New Guinea, children in a village wandered in and out of one another’s huts throughout the day. In the U.S., as my friend discovered, “Kids go into their own houses, close the door, and watch TV by themselves.”


South Korean applicants for training as primary schoolteachers have to score in the top 5% on national college entrance exams, and there are 12 teachers applying for every secondary school teaching job in South Korea. In contrast, American teachers have the lowest relative salaries (i.e., relative to average national salaries for all jobs) among major democracies.


All schoolteachers in South Korea, Singapore, and Finland come from the top third of their school classes, but nearly half of American teachers come from the bottom third of their classes.


In all my 53 years of teaching at the University of California (Los Angeles), a university that attracts good students, I have had only one student who told me that he wanted to become a schoolteacher.


For instance, Canada’s criteria for admitting immigrants are more detailed and rational than the U.S.’s. As a result, 80% of Canadians consider immigrants good for the Canadian economy—a far cry from the lacerating divisions in American society over immigration.