Dune’s Half-Century

Tuesday, August 11th, 2015

In 1956, Doubleday published The Dragon in the Sea, the first novel by a California newspaperman named Frank Herbert.

Even now, the book seems a little hard to pin down. It was, for the most part, a Cold War thriller about the race to harvest offshore oil — except crammed inside the thriller was a near-future science-fiction tale of fantastic technology. And crammed inside the science fiction was a psychological study of naval officers crammed inside submarines.

The Dragon in the Sea received some nice reviews. Anthony Boucher praised it in Fantasy & Science Fiction, and the New York Times compared it to sea-going works by C.S. Forester and Herman Wouk. But readers found the novel confusing, and it didn’t sell particularly well, leaving the 36-year-old Herbert uncertain where to turn next. So he accepted a commission to write something called “They Stopped the Moving Sands.”

However much that sounds like a 1950s sci-fi title, the commission was actually for a non-fiction magazine article about Oregon’s sand dunes and the Department of Agriculture’s attempt to halt their drift by planting them with poverty grasses. The dunes were amazing, Herbert explained in a 1957 letter to his agent: In their undulations, they could “swallow whole cities, lakes, rivers, highways.” He was piling up notes for the article at a furious pace. So many notes, in fact, that he never finished “They Stopped the Moving Sands.”

That turned into Dune, of course:

With Dune, Frank Herbert (1920-1986) made the breakthrough in science fiction that J.R.R. Tolkien had achieved in fantasy — both of them showing all subsequent writers in their fields how to build what we might call Massively Coherent Universes: with clashes of culture, technology, history, language, politics, and religion all worked out in the story’s background.

At the same time, Dune is an occasionally sloppy book and oddly paced. It sprawls when it might be compact and shrinks when it might be discursive. How could an author extend his plot maneuvering through hundreds of pages — and then be satisfied with an ending so rushed that even the death of the hero’s infant child in the final apocalyptic battle is only a side note?

Meanwhile, the prose is sometimes weak, striving for the memorable epigrams it can’t always form. The psychology of the minor characters is ignored at some points and deeply observed at others, which makes those characters flicker in a peculiar way between the two-dimensional walk-ons of myth and the three-dimensional figures of novelistic realism. And the third-person narrator keeps his distance from them by printing what they’re thinking in italics, just so we understand that this is, like, you know, mental speech.

In fact, the book contains so much italicized text — with the many poems, song lyrics, and extended quotations from fictional sources printed the same way — that the reader wants to bang it against the nightstand once or twice a chapter. Add up all the problems, and you can see why so many publishers rejected Herbert’s manuscript. It had a thousand chances to fail and only one chance of succeeding — which it grasped by being so relentlessly, impossibly, irresistibly interesting.

I had my own concerns.

A Plea Regarding “Liberal”

Monday, August 10th, 2015

Dan Klein issues this plea:

Please do not describe leftists, progressives, social democrats, or Democrats as “liberal.”

The word liberal began to take on a political meaning around 1770, he notes:

By virtue of textual digitization, we now can pinpoint the inception with remarkable precision and certainty. In figure 4 we see the introduction of liberal in a political sense, in the expressions liberal policy, liberal system, liberal plan, liberal government, and liberal principles.

Liberal Figure 4

The inception of liberal as a political term should be credited to the Scottish historian William Robertson, who published a book in 1769 that uses the term repeatedly to mean principles of liberty and commercial freedom.(3) Adam Smith embraced and made important use of the semantic innovation in The Wealth of Nations, published in 1776. Smith used the term repeatedly in a signal way to refer to the sort of policy he advocated, a system that gives a strong presumption to individual liberty, and hence commercial and market freedom.

If all nations, Smith says, were to follow “the liberal system of free exportation and free importation,” then they would be like one great cosmopolitan empire, and famines would be prevented. Then he repeats the phrase: “But very few countries have entirely adopted this liberal system.”(4) Smith’s “liberal system” was not concerned solely with international trade. He used “liberal” to describe the application of the same principles to domestic policy issues. Smith was a great opponent of restrictions in the labor market, favoring freedom of contract, and wished to see labor markets “resting on such liberal principles.”(5)

Klein has considered himself a classical liberal for decades:

Today conservatives and libertarians often use the term liberal to refer to leftists, progressives, social democrats, and Democrats. Here I beg you to stop doing so. But if you are not to say “liberal,” what are you to say? One option is to put “liberal” in quotation marks or to say “so-called liberal.” But even better is to use the words that have always signified the mentality of governmentalization: the terms left, progressive, and social democrat.

Prior to the twentieth century, in English-language discourse there was very little talk of “left” and “right,” as shown in figure 8. As the political term left emerged in the twentieth century, it has always signified political and cultural state centralization, through the governmentalization of social affairs. The extreme left is communism. A supposedly more liberal collectivism is socialism. The meaning of the left has changed somewhat, but, despite its verbiage and false consciousness, it still basically remains centered on the governmentalization of social affairs (although we must recognize that on a few issues, the left does lean toward liberalization). The left pretends to favor diversity, but that slogan is in reality just an agenda for people of diverse backgrounds to come together in a broadly uniform set of leftist beliefs.

Liberal Figure 8

As for progressive, the essence was aptly described in 1926 by H.L. Mencken: “The Progressive is one who is in favor of more taxes instead of less, more bureaus and jobholders, more paternalism and meddling, more regulation of private affairs and less liberty.”(13) That is, the progressive is one who favors greater governmentalization of social affairs. The description has been largely accurate since the word progressive emerged as a political term. As Jonah Goldberg has shown in his regrettably titled book Liberal Fascism, early American progressivism contained rich veins of racism, eugenics, and all-around statism. In figure 9 we see that the political term progressives emerged around 1910.

Liberal Figure 9

Sometimes conservatives and libertarians balk at calling the left “progressive,” not wanting to concede the idea of progress. But I say, let them have it. Not only has progressive always signified statism, but the idea of progress is not suited to true liberalism. The idea of progress is goal-oriented: “Are you making progress on your term paper?” It suggests a goal or destination. But in politics the notion of a social goal or destination is baneful. That collectivists should join together for what they imagine to be progress is perfectly fitting. For them the term progressive is suitable. By contrast, conservatives and libertarians look to, not progress, but improvement.

Another fitting term for leftism is social democracy, which is standard in Europe. Social democracy is a compromise between democratic socialism and a tepid liberalism. The socialistic penchant is foremost, but a vacillating liberalism gnaws at the social democrat’s conscience. In figure 10 we see that the term social democracy emerged around 1900.

Liberal Figure 10

With the onset of the social-democratic age came a confusion of tongues, a Tower of Babel. Over the course of the twentieth century, as the left came to dominate most cultural institutions, its partisans set the semantic rules, and one either played by their rules or found oneself marginalized or excluded. Besides arrogating “liberal” to themselves, they created categories along the lines of “you’re either with us or against us.” There was the left, and then everything else — classical liberals, defenders of tradition, status-quo interest groups, or whatever else — was “the right.” For good measure they would throw in the Nazis, even though they stood for national socialism. With their absurd construction (“the right”), the left, by demonizing any of the groups placed therein — from religious extremists to skinheads to business-interest cronies — would damage and discredit every group within the set of groups denominated as “the right,” most important the true and perennial threat to the leftists’ worldview and selfhood, the classical liberals. How many times have Friedrich Hayek and Milton Friedman, both of whom described themselves as “liberal,” been called “fascist” and “right-wing”?

Cascadia

Monday, August 10th, 2015

When the next full-margin rupture happens, the Pacific Northwest will suffer the worst natural disaster in the history of North America:

Roughly three thousand people died in San Francisco’s 1906 earthquake. Almost two thousand died in Hurricane Katrina. Almost three hundred died in Hurricane Sandy. FEMA projects that nearly thirteen thousand people will die in the Cascadia earthquake and tsunami. Another twenty-seven thousand will be injured, and the agency expects that it will need to provide shelter for a million displaced people, and food and water for another two and a half million.

[...]

Thirty years ago, no one knew that the Cascadia subduction zone had ever produced a major earthquake. Forty-five years ago, no one even knew it existed.

[...]

The Pacific Northwest sits squarely within the Ring of Fire. Off its coast, an oceanic plate is slipping beneath a continental one. Inland, the Cascade volcanoes mark the line where, far below, the Juan de Fuca plate is heating up and melting everything above it. In other words, the Cascadia subduction zone has, as Goldfinger put it, “all the right anatomical parts.” Yet not once in recorded history has it caused a major earthquake — or, for that matter, any quake to speak of. By contrast, other subduction zones produce major earthquakes occasionally and minor ones all the time: magnitude 5.0, magnitude 4.0, magnitude why are the neighbors moving their sofa at midnight. You can scarcely spend a week in Japan without feeling this sort of earthquake. You can spend a lifetime in many parts of the Northwest — several, in fact, if you had them to spend — and not feel so much as a quiver. The question facing geologists in the nineteen-seventies was whether the Cascadia subduction zone had ever broken its eerie silence.

In the late nineteen-eighties, Brian Atwater, a geologist with the United States Geological Survey, and a graduate student named David Yamaguchi found the answer, and another major clue in the Cascadia puzzle. Their discovery is best illustrated in a place called the ghost forest, a grove of western red cedars on the banks of the Copalis River, near the Washington coast. When I paddled out to it last summer, with Atwater and Yamaguchi, it was easy to see how it got its name. The cedars are spread out across a low salt marsh on a wide northern bend in the river, long dead but still standing. Leafless, branchless, barkless, they are reduced to their trunks and worn to a smooth silver-gray, as if they had always carried their own tombstones inside them.

What killed the trees in the ghost forest was saltwater. It had long been assumed that they died slowly, as the sea level around them gradually rose and submerged their roots. But, by 1987, Atwater, who had found in soil layers evidence of sudden land subsidence along the Washington coast, suspected that that was backward — that the trees had died quickly when the ground beneath them plummeted. To find out, he teamed up with Yamaguchi, a specialist in dendrochronology, the study of growth-ring patterns in trees. Yamaguchi took samples of the cedars and found that they had died simultaneously: in tree after tree, the final rings dated to the summer of 1699. Since trees do not grow in the winter, he and Atwater concluded that sometime between August of 1699 and May of 1700 an earthquake had caused the land to drop and killed the cedars. That time frame predated by more than a hundred years the written history of the Pacific Northwest — and so, by rights, the detective story should have ended there.

But it did not. If you travel five thousand miles due west from the ghost forest, you reach the northeast coast of Japan. As the events of 2011 made clear, that coast is vulnerable to tsunamis, and the Japanese have kept track of them since at least 599 A.D. In that fourteen-hundred-year history, one incident has long stood out for its strangeness. On the eighth day of the twelfth month of the twelfth year of the Genroku era, a six-hundred-mile-long wave struck the coast, levelling homes, breaching a castle moat, and causing an accident at sea. The Japanese understood that tsunamis were the result of earthquakes, yet no one felt the ground shake before the Genroku event. The wave had no discernible origin. When scientists began studying it, they called it an orphan tsunami.

Finally, in a 1996 article in Nature, a seismologist named Kenji Satake and three colleagues, drawing on the work of Atwater and Yamaguchi, matched that orphan to its parent — and thereby filled in the blanks in the Cascadia story with uncanny specificity. At approximately nine o’clock at night on January 26, 1700, a magnitude-9.0 earthquake struck the Pacific Northwest, causing sudden land subsidence, drowning coastal forests, and, out in the ocean, lifting up a wave half the length of a continent. It took roughly fifteen minutes for the eastern half of that wave to strike the Northwest coast. It took ten hours for the other half to cross the ocean. It reached Japan on January 27, 1700: by the local calendar, the eighth day of the twelfth month of the twelfth year of Genroku.
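A quick plausibility check on those travel times (my own sketch, not from the article): in deep water a tsunami moves at roughly the shallow-water wave speed, the square root of g times ocean depth, so a wave crossing about five thousand miles of open Pacific should need on the order of ten hours to reach Japan. The assumed average depth of four kilometers and the rounded distance are illustrative figures of mine, not numbers from the piece.

    # Rough check on the 1700 tsunami's ten-hour Pacific crossing.
    # Deep-water tsunami speed is approximately sqrt(g * depth);
    # the depth and distance below are assumed, illustrative values.
    import math

    G = 9.81                # gravitational acceleration, m/s^2
    OCEAN_DEPTH_M = 4000.0  # assumed average depth of the open Pacific
    DISTANCE_KM = 8000.0    # roughly five thousand miles, coast to Japan

    speed = math.sqrt(G * OCEAN_DEPTH_M)       # ~200 m/s (~440 mph)
    hours = DISTANCE_KM * 1000 / speed / 3600  # crossing time in hours

    print(f"wave speed ~{speed:.0f} m/s, crossing time ~{hours:.1f} hours")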

Once scientists had reconstructed the 1700 earthquake, certain previously overlooked accounts also came to seem like clues. In 1964, Chief Louis Nookmis, of the Huu-ay-aht First Nation, in British Columbia, told a story, passed down through seven generations, about the eradication of Vancouver Island’s Pachena Bay people. “I think it was at nighttime that the land shook,” Nookmis recalled. According to another tribal history, “They sank at once, were all drowned; not one survived.” A hundred years earlier, Billy Balch, a leader of the Makah tribe, recounted a similar story. Before his own time, he said, all the water had receded from Washington State’s Neah Bay, then suddenly poured back in, inundating the entire region. Those who survived later found canoes hanging from the trees. In a 2005 study, Ruth Ludwin, then a seismologist at the University of Washington, together with nine colleagues, collected and analyzed Native American reports of earthquakes and saltwater floods. Some of those reports contained enough information to estimate a date range for the events they described. On average, the midpoint of that range was 1701.

The Destruction of Civil Society

Sunday, August 9th, 2015

Bruce Charlton examines the destruction of civil society by the Left:

Going back to the early 1990s, there emerged a considerable literature and a political movement concerned with Civil Society — which was the layer of organized social life between the government and the family: churches, professions and guilds, charities and clubs and the like.

This movement came in the wake of the fall of the Soviet Union and most of its satellites and colonies in 1989, and the breakup of that empire into ‘democratic’ nations. The idea was that totalitarianism had been characterized by the destruction of civil society (either annihilation, as with many Christian churches, or takeover by the state).

By contrast, civil society was seen as a vital characteristic of a healthy and free society — the idea that Men should have forms of organization that were substantially autonomous was seen as both efficient and morally necessary.

The idea was that civil societies should be encouraged in the emerging nations of central and Eastern Europe — and indeed elsewhere — so that they might become Free as the West was Free.

What we have seen instead has been the near complete destruction of civil society in the West — and the process has been all but unremarked and unnoted as a general phenomenon. Almost all forms of human association have been brought under control of the state, most are irrelevant, participation in civil society is very low and feeble, and many churches, professions, and social and hobby groups have been severely weakened or become extinct.

By the criteria of 25 years ago, objectively this means The West is not free, and is instead totalitarian.

It happened by a different mechanism than under Soviet Communism — which used direct suppression, making institutions illegal, confiscating their assets, imprisoning their leaders, etc. In the West the imposition of totalitarianism was a mixture of subsidy-control and strangulation by regulation.

Transparent Luminescent Solar Concentrators

Sunday, August 9th, 2015

Transparent solar panels face an obvious challenge, but transparent luminescent solar concentrators provide a possible solution:

The solar harvesting system uses small organic molecules developed by Lunt and his team to absorb specific nonvisible wavelengths of sunlight.

“We can tune these materials to pick up just the ultraviolet and the near infrared wavelengths that then ‘glow’ at another wavelength in the infrared,” he said.

The “glowing” infrared light is guided to the edge of the plastic where it is converted to electricity by thin strips of photovoltaic solar cells.

Transparent Luminescent Solar Concentrator

“It opens a lot of area to deploy solar energy in a non-intrusive way,” Lunt said. “It can be used on tall buildings with lots of windows or any kind of mobile device that demands high aesthetic quality like a phone or e-reader. Ultimately we want to make solar harvesting surfaces that you do not even know are there.”

Lunt said more work is needed to improve the system’s energy-producing efficiency. Currently it achieves a solar conversion efficiency close to 1 percent, but he noted that the team aims to reach efficiencies beyond 5 percent when fully optimized. The best colored LSC has an efficiency of around 7 percent.
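Those percentages are easier to feel with a quick back-of-the-envelope estimate. The efficiency figures below come from the passage above; the window area and the full-sun irradiance of roughly 1,000 watts per square meter are illustrative assumptions of mine, not numbers from Lunt’s team.

    # Back-of-the-envelope electrical output of a transparent luminescent
    # solar concentrator (TLSC) window: incident solar power times efficiency.
    # Window area and irradiance are assumed, illustrative values.

    FULL_SUN_W_PER_M2 = 1000.0  # assumed peak irradiance on the glazing

    def tlsc_output_watts(area_m2, efficiency, irradiance=FULL_SUN_W_PER_M2):
        """Electrical power delivered by the window, in watts."""
        return area_m2 * irradiance * efficiency

    window_area = 2.0  # m^2, a typical office window (assumption)
    for label, eff in [("current TLSC", 0.01),
                       ("optimized TLSC target", 0.05),
                       ("best colored LSC", 0.07)]:
        print(f"{label}: ~{tlsc_output_watts(window_area, eff):.0f} W in full sun")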

Decline in Marine Officer Intelligence

Saturday, August 8th, 2015

Brookings’ Michael Klein and Tufts University’s Matthew Cancian — a former Marine officer who served in Afghanistan — take a closer look at the steady and troubling decline in the average intelligence of Marine Corps officers:

After analyzing test scores of 46,000 officers who took the Marine Corps’ required General Classification Test (GCT), Klein and Cancian find that the quality of officers in the Marines, as measured by those test scores, has steadily and significantly declined over the last 34 years.

So what’s causing this steady decline in GCT scores? According to Klein and Cancian, the decline in officer quality might actually have to do with the fact that more people are receiving college degrees than ever before: The authors note that the decrease of GCT scores over time correlates to an increase in the college participation rate during that same period.
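The mechanism they describe is a selection effect, and a tiny simulation makes it concrete: if officer candidates are drawn from college graduates, and college enrolls roughly the top slice of a fixed ability distribution, then widening that slice pulls the graduate average down even though the underlying population has not changed. The normal-distribution model and the participation rates below are illustrative assumptions of mine, not data from the Klein and Cancian study.

    # Minimal sketch of the selection effect: as college participation expands,
    # the average ability of degree-holders (the officer recruiting pool) falls.
    # Standard-normal "ability" scores and the participation rates are assumptions.
    import random
    import statistics

    def graduate_mean(participation_rate, n=200_000, seed=0):
        """Mean ability of the top `participation_rate` share of the population."""
        rng = random.Random(seed)
        scores = sorted((rng.gauss(0, 1) for _ in range(n)), reverse=True)
        top = scores[: max(1, int(n * participation_rate))]
        return statistics.fmean(top)

    for rate in (0.15, 0.30, 0.45):  # stylized participation rates over the decades
        print(f"participation {rate:.0%}: graduate mean ability {graduate_mean(rate):+.2f} SD")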

Average General Classification Test Scores of Incoming Marine Officers

Eric Crampton suggests some other plausible candidate explanations:

  • Higher opportunity costs for high IQ people since the 80s.
  • Greater cultural disparaging of the military in elite circles, so it is not aspired to by those of higher ability.
  • Decreasing trust that the military is a force for good or really much needed (see the decline as the Soviets turned friendlier under Gorbachev, the levelling off and rise around Gulf War I, and the resumed decline after that).
  • Fewer smart but lower income kids needing to use ROTC to afford college with expansions in student aid.

Ekso Works

Saturday, August 8th, 2015

Rafe Needleman tried out Ekso Bionics’s new unpowered industrial exoskeleton:

The Ekso Works is a framework you strap yourself into that mechanically transmits the load of equipment attached to mounts at the hip, directly to the ground. You can walk, you can bend, and the gear is supported by the Ekso frame. The Ekso Works provides skeletal strength. You provide the balance and motivation.

To set up for my demo, Harding first handed me a 15-pound industrial angle grinder. It’s heavy, awkward, and a pretty uncomfortable piece of gear to control when it’s at arm’s length or overhead. Using one repetitively can cause stress injuries. Or, if you’re a lightweight like me, you’re likely to drop it on your toes after trying to control it for more than a few minutes.

But when the grinder was attached to a Steadicam-like articulated Equipois mount on the Ekso’s hip attachment point, it became weightless. It still had inertia, of course, but it didn’t weigh a thing, and I could wrench it around like it was a six-ounce pair of pliers. I could hold it over my head easily, and control it with precision. If I wanted to use it at arm’s length, I could do that, too; counterweights attached to the Ekso’s plate on my back kept my center of gravity over my legs.
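The counterweights are doing ordinary statics: the tool hanging off the front-mounted arm and the plates on the back produce opposing moments about the hips, and the wearer’s center of gravity stays over the legs when those moments roughly cancel. The sketch below is my own illustration of that balance, not an Ekso specification; the lever arms are made-up example values, and only the 15-pound grinder comes from the demo described above.

    # Moment balance about the hip: counterweight mass needed so that
    # m_tool * d_tool is roughly equal to m_cw * d_cw.
    # Lever arms below are assumed, illustrative values.

    def counterweight_mass(tool_mass_kg, tool_arm_m, cw_arm_m):
        """Counterweight mass that cancels the tool's moment about the hip."""
        return tool_mass_kg * tool_arm_m / cw_arm_m

    grinder_kg = 15 * 0.4536  # the 15-pound grinder from the demo, in kg
    needed = counterweight_mass(grinder_kg, tool_arm_m=0.5, cw_arm_m=0.25)
    print(f"~{needed:.1f} kg of counterweight on the back plate")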

And I could walk. It was awkward because it was a new physical experience, but it wasn’t difficult.

One key to all the current Ekso products: The knees. They lock when you’re standing up, just like our real knees. That means that when you’re just standing there, no power input is required. The task of balancing, which does take some energy, is up to the person wearing the device. But as I discovered, even when wearing an elaborate cage of struts and joints, with a spring arm and industrial device strapped to your hip and pounds of steel counterweights on your back, you still know how to balance and walk.

It’s just asking for an M56 Smart Gun.

Don’t Criticize the GOP for Performing Its Function

Friday, August 7th, 2015

Don’t criticize the GOP for performing its function, Henry Dampier says:

The purpose of the GOP is to run interference for the left to make it easier for said institutional left to administer the state, and through its administration of the state, the rest of the society. Criticizing the GOP for performing its function is confused — there’s no other role for it to play, considering its non-representation in important institutions like government bureaucracies and, more importantly, universities.

The GOP functions to contain discontent among the population, to channel it into useless issues, and to tell people what they’re allowed to believe and express without provoking social opprobrium and legal consequences.

Wanting to replace the GOP with a new party would just mean taking over its function — which is to perform as a catcher, political policeman, and misdirection apparatus for the state — really not terrible work if you can get it, but nothing like actually controlling the state or changing the methods by which it operates.

Grinds

Friday, August 7th, 2015

When it comes to college admissions, why should an ability to play the bassoon or chuck a lacrosse ball be given any weight in the selection process? To avoid grinds — supposedly:

Jerome Karabel has unearthed a damning paper trail showing that in the first half of the twentieth century, holistic admissions were explicitly engineered to cap the number of Jewish students. Ron Unz, in an exposé even more scathing than Deresiewicz’s, has assembled impressive circumstantial evidence that the same thing is happening today with Asians.

Just as troublingly, why are elite universities, of all institutions, perpetuating the destructive stereotype that smart people are one-dimensional dweebs? It would be an occasion for hilarity if anyone suggested that Harvard pick its graduate students, faculty, or president for their prowess in athletics or music, yet these people are certainly no shallower than our undergraduates. In any case, the stereotype is provably false. Camilla Benbow and David Lubinski have tracked a large sample of precocious teenagers identified solely by high performance on the SAT, and found that when they grew up, they not only excelled in academia, technology, medicine, and business, but won outsize recognition for their novels, plays, poems, paintings, sculptures, and productions in dance, music, and theater. A comparison to a Harvard freshman class would be like a match between the Harlem Globetrotters and the Washington Generals.

What about the rationalization that charitable extracurricular activities teach kids important lessons of moral engagement? There are reasons to be skeptical. A skilled professional I know had to turn down an important freelance assignment because of a recurring commitment to chauffeur her son to a resumé-building “social action” assignment required by his high school. This involved driving the boy for 45 minutes to a community center, cooling her heels while he sorted used clothing for charity, and driving him back — forgoing income which, judiciously donated, could have fed, clothed, and inoculated an African village. The dubious “lessons” of this forced labor as an overqualified ragpicker are that children are entitled to treat their mothers’ time as worth nothing, that you can make the world a better place by destroying economic value, and that the moral worth of an action should be measured by the conspicuousness of the sacrifice rather than the gain to the beneficiary.

Prescriptive Poverty

Thursday, August 6th, 2015

John Derbyshire has a suggestion to sociologists writing books for the general-interest public:

Drop the last chapter. You know, the chapter where, after 200 pages of describing some social problem or other, you offer solutions to the problem.

That thought occurred to him after finishing Robert Putnam’s Our Kids.

He offers some advice to readers, too:

There is now a good nucleus of human-science bloggers providing day-to-day commentary on the human sciences. Some of them, like Bruce Charlton and Greg Cochran, are accredited academics; others, like JayMan and HBD Chick, are thoughtful nonspecialists.

[...]

If the study of human nature interests you, spend a couple of hours reading JayMan’s recent rumination on empathy and universalism, or James Thompson’s on African intelligence. Chase down the links and watch the comment-thread jousting.

Deracinated Nice-Americans

Thursday, August 6th, 2015

As a Brooklyn native, Colin Quinn (The Coloring Book) is, as Steve Sailer puts it, a little weirded out by all the deracinated Nice-Americans newly arrived from suburbs across the land:

But looking at them, you see why this country was built successfully—WASPs are chore-oriented. When they came to New York, there were ten times as many hardware stores…. That’s the one weapon that not even the toughest community in New York City was prepared to combat: affability. Guns, knives, bats, but the one thing they never thought you’d get hit with is a can-do attitude. How can you argue with the energy that built the West?…

These new white people hate the white man, too. They’ve been to liberal arts colleges where they learned about oppression and condemn it. And so there they are, these people who look like they stepped out of a Norman Rockwell painting, on the sidewalks of Bed-Stuy, singing along with Notorious B.I.G.’s “Ten Crack Commandments,” sanding chairs.

[...]

Supposedly nonjudgmental judgmentalism used to be what Southern California was for, but now it’s encroaching even here in New York—where people are supposed to come to judge things.

A Freer Market for Force

Thursday, August 6th, 2015

For now the market for private military contractors is a monopsony, Sean McFate (The Modern Mercenary) suggests, but we should prepare for a freer market for force:

The result, McFate predicts, will be a return to the Middle Ages, when private warriors determined the outcomes of conflicts and states stood at the sidelines of international politics. This “neo-medieval” world will be characterized by “a non-state-centric, multipolar international system of overlapping authorities and allegiances within the same territory.” Yet it need not be chaotic, he reassures readers, since “the global system will persist in a durable disorder that contains, rather than solves, problems.”

How can the world avoid replicating the problems generated by hired guns in the medieval era? The answer, according to McFate, is to rely less on mere mercenaries and instead foster “military enterprisers.” The former sell their skills to the highest bidder; the latter “raise armies rather than command them” and thus contribute to stability. During the Thirty Years’ War, military enterprisers included such figures as Ernst von Mansfeld, who raised an army for the elector palatine, and Albrecht von Wallenstein, who offered his services to Ferdinand II, the Holy Roman emperor.

But in sketching out a strategy for dealing with a world of privatized power, McFate is too quick to jettison the state-centric principles that have served the world so well since the end of the Thirty Years’ War. The biggest challenges to U.S. security in the years ahead, from climate change to terrorism to cybersecurity, will require more state-to-state collaboration, not less. And U.S. support, tacit or otherwise, for a free market for force will only serve to exacerbate these problems.

McFate offers two in-depth case studies of modern contractors: in Liberia, where they played the role of military enterprisers, and in Somalia, where they acted as mercenaries.

Our Flawed New Religion

Wednesday, August 5th, 2015

Americans have developed a new religion, John McWhorter explains — Antiracism:

Of course, most consider antiracism a position, or evidence of morality. However, in 2015, among educated Americans especially, Antiracism — it seriously merits capitalization at this point — is now what any naïve, unbiased anthropologist would describe as a new and increasingly dominant religion. It is what we worship, as sincerely and fervently as many worship God and Jesus and, among most Blue State Americans, more so.

Ta-Nehisi Coates, for example, is a priest of our new religion:

Coates is “revered,” as New York magazine aptly puts it, as someone gifted at phrasing, repeating, and crafting artful variations upon points that are considered crucial — that is, scripture. Specifically, Coates is celebrated as the writer who most aptly expresses the scripture that America’s past was built on racism and that racism still permeates the national fabric.

[...]

People were receiving “The Case for Reparations” as, quite simply, a sermon. Its audience sought not counsel, but proclamation. Coates does not write with this formal intention, but for his readers, he is a preacher. A.O. Scott perfectly demonstrates Coates’s now clerical role in our discourse in saying that his new book is “essential, like water or air” — this is the kind of thing one formerly said of the Greatest Story Ever Told.

I suppose this passage may upset sincere believers:

One hearkens to one’s preacher to keep telling the truth — and also to make sure we hear it often, since many of its tenets are easy to drift away from, which leads us to the next evidence that Antiracism is now a religion. It is inherent to a religion that one is to accept certain suspensions of disbelief. Certain questions are not to be asked, or if asked, only politely — and the answer one gets, despite being somewhat half-cocked, is to be accepted as doing the job.

“Why is the Bible so self-contradictory?” Well, God works in mysterious ways — what’s key is that you believe. “Why does God allow such terrible things to happen?” Well, because we have free will … and it’s complicated but really, just have faith.

It stops there: beyond this first round, one is to classify the issues as uniquely “complicated.” They are “deep,” one says, looking off into the air for a sec in a reflective mode, implying that thinking about this stuff just always leads to more questions, in an infinitely questing Talmudic exploration one cannot expect to yield an actual conclusion.

Antiracism requires much of the same standpoint. For example, one is not to ask “Why are black people so upset about one white cop killing a black man when black men are at much more danger of being killed by one another?” Or, one might ask this, very politely — upon which the answers are flabby but further questions are unwelcome. A common answer is that black communities do protest black-on-black violence — but anyone knows that the outrage against white cops is much, much vaster.

Why? Is the answer “deep,” perhaps? Charles Blow, at least deigning to take the issue by the horns, answers that the black men are killing one another within a racist “structure.” That doesn’t explain why black activists consider the white cop a more appalling threat to a black man than various black men in his own neighborhood. But to push the point means you just don’t “get” it (you haven’t opened your heart to Jesus, perhaps?). Jamelle Bouie answers that there’s a difference between being killed by a fellow citizen and being killed by a figure of authority, but does that mean “It’s not as bad if we do it to ourselves”? Of course not! … but, but (roll of the eyes) “racist,” “doesn’t get it.”

One is not to question, and people can be quite explicit about that. For example, in the “Conversation” about race that we are so often told we need to have, the tacit idea is that black people will express their grievances and whites will agree — again, no questions, or at least not real ones. Here and there lip service is paid to the idea that the Conversation would not be such a one-way affair, but just as typical is the praise that a piece like Reni Eddo-Lodge’s elicits, openly saying that white people who object to any black claims about racism are intolerably mistaken and barely worth engagement (Eddo-Lodge now has a contract to expand the blog post into a book). Usefully representative is a letter that The New York Times chose to print, which was elicited by David Brooks’s piece on Coates’s book, in which a white person chides Brooks for deigning to even ask whether he is allowed to object to some of Coates’s claims.

Note: To say one is not to question is not to claim that no questions are ever asked. The Right quite readily questions Antiracism’s tenets. Key, however, is that among Antiracism adherents, those questions are tartly dismissed as inappropriate and often, predictably, as racist themselves. The questions are received with indignation that one would even ask them, with a running implication that their having been asked is a symptom of, yes, racism’s persistence.

Don’t Cry for Lions

Wednesday, August 5th, 2015

Goodwell Nzou, a doctoral student in molecular and cellular biosciences at Wake Forest University, explains that in Zimbabwe they don’t cry for lions:

When I turned on the news and discovered that the [Facebook] messages [about Cecil] were about a lion killed by an American dentist, the village boy inside me instinctively cheered: One lion fewer to menace families like mine.

My excitement was doused when I realized that the lion killer was being painted as the villain. I faced the starkest cultural contradiction I’d experienced during my five years studying in the United States.

Did all those Americans signing petitions understand that lions actually kill people? That all the talk about Cecil being “beloved” or a “local favorite” was media hype? Did Jimmy Kimmel choke up because Cecil was murdered or because he confused him with Simba from “The Lion King”?

In my village in Zimbabwe, surrounded by wildlife conservation areas, no lion has ever been beloved, or granted an affectionate nickname. They are objects of terror.

When I was 9 years old, a solitary lion prowled villages near my home. After it killed a few chickens, some goats and finally a cow, we were warned to walk to school in groups and stop playing outside. My sisters no longer went alone to the river to collect water or wash dishes; my mother waited for my father and older brothers, armed with machetes, axes and spears, to escort her into the bush to collect firewood.

A week later, my mother gathered me with nine of my siblings to explain that her uncle had been attacked but escaped with nothing more than an injured leg. The lion sucked the life out of the village: No one socialized by fires at night; no one dared stroll over to a neighbor’s homestead.

When the lion was finally killed, no one cared whether its murderer was a local person or a white trophy hunter, whether it was poached or killed legally. We danced and sang about the vanquishing of the fearsome beast and our escape from serious harm.

Recently, a 14-year-old boy in a village not far from mine wasn’t so lucky. Sleeping in his family’s fields, as villagers do to protect crops from the hippos, buffalo and elephants that trample them, he was mauled by a lion and died.

The killing of Cecil hasn’t garnered much more sympathy from urban Zimbabweans, although they live with no such danger. Few have ever seen a lion, since game drives are a luxury residents of a country with an average monthly income below $150 cannot afford.

Don’t misunderstand me: For Zimbabweans, wild animals have near-mystical significance. We belong to clans, and each clan claims an animal totem as its mythological ancestor. Mine is Nzou, elephant, and by tradition, I can’t eat elephant meat; it would be akin to eating a relative’s flesh. But our respect for these animals has never kept us from hunting them or allowing them to be hunted. (I’m familiar with dangerous animals; I lost my right leg to a snakebite when I was 11.)

The American tendency to romanticize animals that have been given actual names and to jump onto a hashtag train has turned an ordinary situation — there were 800 lions legally killed over a decade by well-heeled foreigners who shelled out serious money to prove their prowess — into what seems to my Zimbabwean eyes an absurdist circus.

PETA is calling for the hunter to be hanged. Zimbabwean politicians are accusing the United States of staging Cecil’s killing as a “ploy” to make our country look bad. And Americans who can’t find Zimbabwe on a map are applauding the nation’s demand for the extradition of the dentist, unaware that a baby elephant was reportedly slaughtered for our president’s most recent birthday banquet.

A Cultural History of Capes

Wednesday, August 5th, 2015

How did the cape become so dramatic?

That’s a story that starts with the very etymology of the term “cape.”

The Latin word for cape, cappa, forms the basis for the word “escape,” which comes from ex cappa. “To escape,” wrote Walter William Skeat in An Etymological Dictionary of the English Language, “is to ex-cape oneself, to slip out of one’s cape and get away.”

From the early days of the cape, when Latin was still spoken on the streets, capes spoke of battle, status, and statuses in battle. Military commanders of the Roman Empire donned paludamentum — a long, flowing cape fastened at one shoulder — as part of their ceremonial battle preparations. Centurions fighting under their command got to wear capes, too, but had to settle for the sagum, a less majestic, less flowy version that fastened with a clasp across the shoulders.

Vercingétorix and Caesar by Royer

Over the centuries, the cape and the sword came to be regarded as a package deal. In 1594, Italian fencing master Giacomo di Grasse penned a True Arte of Defence, in which he included several tips on vanquishing an enemy when armed with a sword-and-cloak combo.

[...]

This “flinging of the cloak” is an early appearance of the cape as a mantle fit for bouts of flouncing. To throw back the sides of a cloak, or toss one side of a cape over one’s shoulder, is a pleasingly dramatic way of revealing a weapon, showing one’s true identity, or punctuating a satisfying riposte, whether physical or verbal. These seeds of “cape as garment of flamboyance,” thus planted, would be harvested centuries later by cape aficionado Oscar Wilde, then augmented with glitter by performers like Liberace.

The practical approach of wearing a cape over one shoulder in order to keep one’s sword arm free became a fashion trend during the late 16th century, when gentlemen donned the “mandilion,” a hip-length cloak with open side seams.

Mandilion worn by Robert Devereaux

The cape as the preferred outerwear of adventurers gained ground with the dashing swashbuckler archetype, first established in literature of the 16th century but most popular during the mid-19th- to early 20th centuries. Many of the protagonists belonging to the genre were known to throw on a cape, grab a sword, and head for the forests in search of mischief. Among characters who couldn’t spell “caper” without a cape were The Three Musketeers, Cyrano de Bergerac, The Scarlet Pimpernel, and Zorro.

Amid all the suave rapier-waving and damsel-saving going on in the swashbuckler genre, a work of literature emerged that dragged the cape into the world of the macabre and the supernatural: Bram Stoker’s Dracula. Written in 1897, Stoker’s version of the eponymous Count did not, however, feature the high-collared, black and red cape to which pop culture is now accustomed.

Bela Lugosi as Dracula in Cape

The iconic Dracula cape now inextricably linked with the character was not established until the 1920s, when adaptations of Dracula hit the stage. And the cloak revamp had more to do with budgetary concerns and theatrical trickery than aesthetics, according to Jonathan Bignell in “A Taste of the Gothic: Film and Television Versions of Dracula”:

Stage versions of the novel needed to have Dracula on stage in drawing-room settings, rather than appearing rarely and in a wide range of outside locations as in the novel. The need to turn Dracula into a melodramatic tale of mystery taking place indoors was the reason for the costuming of Dracula in evening dress and opera cloak, making him look like the sinister hypnotists, seducers and evil aristocrats of the Victorian popular theatre.

The high-collared cape which we now recognize as a hallmark of the Dracula character was first used in the stage versions. Its function was to hide the back of the actor’s head as he escaped through concealed panels in the set to disappear from the stage, while the other actors were left holding his suddenly empty cloak.

From there, the cape became the defining feature of comic-book superheroes, like Superman and Batman.

Action Comics #1