Admirably Adamantine

August 3rd, 2015

After reading Ta-Nehisi Coates’ recent memoir, Steve Sailer quipped, “Despite all the violence Coates has suffered at the hands of other blacks, his racial loyalty remains admirably adamantine,” and commenter ABD replied:

I get the sense that there is a lot of psychic stress experienced by the Talented Tenth, i.e. bourgeois middle-class blacks like Ta-Nehisi Coates. They basically live among white people according to white norms, but this creates in them a need to “keep it real” by defending and excusing the dysfunction of underclass blacks. It’s like how the most anti-colonial element of society in the former European empires was the Westernized native elite rather than traditionalist tribal elders.

In a saner world, the Talented Tenth would have an aristocratic attitude of protective paternalism toward black proles, in the spirit of Booker Washington. In actuality, black ethnocentrism, despite being a fundamentally conservative trait, gets filtered through the ideology of the white liberals who run post-white America.

So instead of straightforward tribal noblesse oblige, gentry blacks like TNC (and Obama) live by the fundamentally dishonest and passive-aggressive code of leftism: all animals are equal, but some animals are more equal than others. They lie that D’Shawntavious is their social equal, and the implausibility of that claim means they have to lie again by claiming that Whitey is a wrecker.

I actually kind of pity middle-class blacks who wish to be champions of their people even as they operate within an ideological framework created by leftists who despise their own people.

Noisome Odors

August 3rd, 2015

Smells can bring back memories, including traumatic memories — so the military may want to inoculate the troops:

“Our goal was to test whether you could pre-expose or inoculate people with these odors, and subsequently prevent the negative memory from taking,” she recounted. “That’s exactly what we showed. You could take an odor that was initially unfamiliar, expose an individual to it in a neutral context, and then when you paired that same odor with a negative experience, it no longer had a strong associative power.”

Prior to Dalton’s findings, the military trained soldiers in mock villages that were accurate recreations of what soldiers might encounter in Afghanistan or Iraq, but only to a point: these mockups failed to mimic the olfactory environment of the Middle East. “They covered everything with visual cues, and sometimes there was smoke,” said Dalton. “But bodies rotting in the sun for days at a time? Food-smells of a very foreign culture? Those were the things that were likely to be present at the time soldiers were experiencing these extreme stressors, and those were the things that were becoming tightly bound to the negative emotional state and persisting well beyond the original experience.”

Adding the mixed stenches of sewage, burning garbage, and local spices might not seem like the most crucial component of building a mock Iraqi village, but the science behind pre-exposure prevention of PTSD was strong. The military and VA took the hint. “I know they are [now] actually doing training with realistic olfactory environments,” Dalton said.

Familiarizing the armed forces with the smells of war not only helps mitigate soldiers’ future memories of traumatic events but also prepares newly deployed soldiers for the smells of a novel environment that might otherwise distract them from their duties. In 2006, the Army and the Marines began training some of their troops with virtual-reality devices that included high-tech collars designed to emit noisome odors like melting plastic or rotting flesh prior to deployment.

The First Rule of White Club

August 2nd, 2015

Steve Sailer provides his own sacrilegious take on Coates’ Between the World and Me:

America’s foremost public intellectual, Ta-Nehisi Coates, has published a new best-selling minibook, Between the World and Me, that’s interesting for what it reveals about a forbidden subject: the psychological damage done by pervasive black violence to soft, sensitive, bookish souls such as Coates. The Atlantic writer’s black radical parents forced the frightened child to grow up in Baltimore’s black community, where he lived in constant terror of the other boys. Any white person who wrote as intensely about how blacks scared him would be career-crucified out of his job, so it’s striking to read Coates recounting at length how horrible it is to live around poor blacks if you are a timid, retiring sort.

[...]

Despite all the violence Coates has suffered at the hands of other blacks, his racial loyalty remains admirably adamantine. Thus, his ploy, as psychologically transparent as it is popular with liberal whites, is to blame his lifelong petrified unhappiness on the white suburbanites he envied for being able to live far from black thugs.

Unfortunately for Coates’ persuasiveness, white people, unlike blacks, have never actually done anything terribly bad to him. The worst memory he can dredge up is the time an Upper West Side white woman pushed his 4-year-old son to get the dawdling kid to stop clogging an escalator exit. She even had the racist nerve to say, “Come on!”

Coates reacted as unreasonably as a guest star on Seinfeld would. Ever since this Escalator Incident, he’s been dwelling on how, while it might have looked like yet another example of blacks behaving badly, it was, when you stop to think about slavery and Jim Crow (not to mention redlining), really all the fault of whites.

The central event in Between the World and Me is the fatal shooting in 2000 of an acquaintance from Howard U. by an undercover deputy from Prince George’s County, the country’s most affluent black-majority county. Coates refers to this tragedy repeatedly as proof of America’s demonic drive to destroy black bodies.

[...]

Since I’m a horrible person, my immediate response to Coates’ tale was…okay…black-run county, affirmative-action hiring, and poor police decision-making…you know, I bet the shooter cop was black.

And sure enough, the Carlton Jones who shot Prince Jones turned out to be black. Coates eventually gets around to briefly admitting that awkward fact, but only after seven pages of purple prose about people who believe they are white destroying black bodies.

Don’t be a victim of muscle loss

August 2nd, 2015

The average muscle loss in men between the ages of 50 and 70 is 30 percent, Mangan notes, and from ages 70 to 80, another 20 to 30 percent of muscle is lost:

Add those figures and you’ve got the basis for the fact that most 80-year-old men will have lost 50% of their muscle mass.
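One way to check that arithmetic (a quick sketch, assuming the second range’s 20 to 30 percent applies to the muscle remaining at age 70, rather than to the age-50 baseline):

\[
(1 - 0.30)\times(1 - 0.20) = 0.56, \qquad (1 - 0.30)\times(1 - 0.30) = 0.49
\]

Compounding the two losses leaves 49 to 56 percent of the original muscle (a total loss of 44 to 51 percent), while simply adding the quoted figures gives 50 to 60 percent; either reading supports the claim that most 80-year-old men will have lost about half their muscle mass.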

It doesn’t have to be that way, if you keep active:

[Image: Muscle-Loss Cross-Sections]

The Jesuits of the British Empire

August 1st, 2015

Steve Sailer explores Carroll Quigley’s conspiracy theory about the Milner Group:

Academic historians dislike the concept that history is often made by groups of individuals plotting together in confidence, even though one obvious way to get big things done is to make plans with your friends and allies while keeping your rivals in the dark as long as possible.

[...]

[Quigley’s book, The Anglo-American Establishment] actually focused on one group of British establishmentarians, the progressive imperialists who set up the British equivalent of the CFR, the Royal Institute of International Affairs (a.k.a. Chatham House), edited The Times of London for most of the first four decades of the 20th century, and largely controlled the peculiarly influential All Souls College at Oxford.

Quigley calls them the Milner Group after Alfred Milner (1854-1925), an eminence grise who more or less started the Boer War of 1899-1902, then mentored “Milner’s Kindergarten” of bright young men in South Africa, and finally popped up again in Lloyd George’s war cabinet in 1917, but mostly served behind the scenes.

Quigley traces the Milner Group back to the far more colorful Cecil Rhodes’ desire to start a “Secret Society” to promote Anglo-American unity and global domination.

Seriously, Rhodes’ first will explicitly arranged to leave his fortune to a secret society:

To and for the establishment, promotion and development of a Secret Society, the true aim and object whereof shall be for the extension of British rule throughout the world, the perfecting of a system of emigration from the United Kingdom, and of colonisation by British subjects of all lands where the means of livelihood are attainable by energy, labour and enterprise, and especially the occupation by British settlers of the entire Continent of Africa, the Holy Land, the Valley of the Euphrates, the Islands of Cyprus and Candia, the whole of South America, the Islands of the Pacific not heretofore possessed by Great Britain, the whole of the Malay Archipelago, the seaboard of China and Japan, the ultimate recovery of the United States of America as an integral part of the British Empire, the inauguration of a system of Colonial representation in the Imperial Parliament which may tend to weld together the disjointed members of the Empire and, finally, the foundation of so great a Power as to render wars impossible and promote the best interests of humanity.

The era’s culture of male self-admiration tended to elicit high male achievement, Sailer notes.

The Milner group relied on a certain kind of quasi-secrecy, the famous Chatham House Rule:

If you are invited to a meeting at Chatham House where, say, John Kerry explains the Iran deal, you are allowed to discuss what you learned but not mention the name of whoever you heard it from. That’s a clever way to split the Gordian Knot of wanting to propagandize without being seen to propagandize.

Quigley’s explanation of how the Milner Group coordinated Establishment opinion is relevant in the U.S. today:

The Times was to be a paper for the people who are influential, and not for the masses.…

By the interaction of these various branches on one another, under the pretense that each branch was an autonomous power, the influence of each branch was increased through a process of mutual reinforcement. The unanimity among the various branches was believed by the outside world to be the result of the influence of a single Truth, while really it was the result of a single group.

Thus, a statesman (a member of the Group) announces a policy. About the same time, the Royal Institute of International Affairs publishes a study on the subject, and an Oxford don, a Fellow of All Souls (and a member of the Group) also publishes a volume on the subject (probably through a publishing house, like G. Bell and Sons or Faber and Faber, allied to the Group). The statesman’s policy is subjected to critical analysis and final approval in a “leader” in The Times, while the two books are reviewed (in a single review) in The Times Literary Supplement. Both the “leader” and the review are anonymous but are written by members of the Group. And finally, at about the same time, an anonymous article in The Round Table strongly advocates the same policy.

The cumulative effect of such tactics as this, even if each tactical move influences only a small number of important people, is bound to be great. If necessary, the strategy can be carried further, by arranging for the secretary to the Rhodes Trustees to go to America for a series of “informal discussions” with former Rhodes Scholars, while a prominent retired statesman (possibly a former Viceroy of India) is persuaded to say a few words at the unveiling of a plaque in All Souls or New College in honor of some deceased Warden. By a curious coincidence, both the “informal discussions” in America and the unveiling speech at Oxford touch on the same topical subject.

Blowing up a Balloon

August 1st, 2015

Nutrition gets little attention or respect in medicine, but Dr. Malcolm Kendrick became sidetracked by the very powerful and consistent association between heart disease and diabetes:

In short, if you added together what was clear about diabetes and insulin resistance, you got a model of type II diabetes which looked pretty much like this:

  • You eat too much food.
  • You put on weight.
  • As you put on weight you become more and more insulin resistant.
  • At first you will develop insulin resistance syndrome.
  • If you keep putting on weight you will become so insulin resistant that you will develop frank type II diabetes.

I call this the ‘blowing up a balloon’ theory of diabetes. As a balloon expands you have to blow harder and harder to overcome the resistance. As you get fatter and fatter you need more and more insulin to force fats into fat cells. As with many things in medicine this is a nice simple story. It is also very easy to understand, and it is tantalisingly close to being correct.

[...]

I began with the most obese group of people on planet Earth, namely Sumo wrestlers. I wanted to know how many of them have diabetes, and it did not take long to discover that, whilst in training, none of them do.

I then searched for the opposite end of the spectrum. Were there people with no adipose tissue, and how many of them had diabetes? Surprisingly, there is one such group, the least obese people on earth. They are those with Berardinelli-Seip lipodystrophy. This is a genetic abnormality which means that these poor unfortunates have almost no fat cells. How many of them have type II diabetes? Well, all of them, actually.

I then looked for the population with the highest rate of diabetes in the world. This happens to be the Pima Indians of Northern Mexico/the Southern US. I have seen figures reporting that over 80% of adult male Pima Indians have type II diabetes. It may even be more. And yes, they are very obese.

However, there are two other very interesting facts about the Pima Indians. First, they have a very low rate of heart disease. Or they did last time I looked. Perhaps most importantly, in their youth, when they are not obese, they produce far more insulin in response to food than ‘normal’ populations. Or, to put this another way, they are hyper-insulinaemic before they are obese, and long before they become diabetic. So their excess insulin production is not a result of becoming fatter. The causal chain is the other way around.

I have found that if you speak to most doctors about these facts, a look of complete incomprehension passes over their faces. ‘That cannot be right.’ Of course, if you believe in the ‘blowing up a balloon’ model of diabetes, then the Pima Indians, Sumo Wrestlers and those with Berardinelli-Seip lipodystrophy do not make any sense. However, in science, when observations do not fit your hypothesis, it is the hypothesis that needs to change, not the facts.

Just to summarize these ‘paradoxical’ facts:

  • You do not need any fat cells to develop diabetes; if you have no fat cells, there is a 100% probability that you will be diabetic.
  • You can be very, very obese and not have diabetes.
  • You can have increased insulin production long before you become obese (and/or insulin resistant). You become obese later.

[...]

However, luckily, there is another model that fits all the facts. One that I prepared earlier:

  • You produce too much insulin.
  • This forces your body to store fat.
  • You become obese.
  • At a certain point insulin resistance develops to block further weight gain.
  • This resistance becomes more and more severe until…
  • You become diabetic.

This model explains the Pima Indians. Can Sumo wrestlers be fitted into this model? Yes, with a couple of addendums. Sumo Wrestlers eat to become fat, because added mass provides a competitive advantage if you are trying to shove someone else out of a small ring, before they do it to you.

To achieve super-obesity, they wake up, train for two hours, then eat as much as they can of a high-carbohydrate, low-fat broth. They then lie about for a few hours, allowing the high insulin levels created by the high-carbohydrate diet to convert excess sugars to fat, storing this in adipose tissue. Later on they train very hard again, then eat, then sleep. Repeat.

The reason why they do not become diabetic on this regime is simply because they exercise very, very hard. They burn up all the sugar/glycogen stores in the liver and muscle whilst exercising, which means that when they eat, the sugar(s) can – at least at first – be easily stored in muscle and liver (so there is no insulin resistance to overcome). However, once these guys stop training, things do not look so good. Diabetes lurks.

Those with Berardinelli-Seip lipodystrophy have the reverse problem to Sumo Wrestlers. Because they have no fat cells, there is nowhere for excess energy to go. If they eat carbohydrate/sugar, the first 1,500 calories can be stored as glycogen – after that there is nowhere left. If the liver converts sugar to fat, there is nowhere for that to go either. So, you get ‘back-pressure’ through the system. It doesn’t matter how high the insulin level gets: if you have nowhere to store energy, you have nowhere to store energy. End of.

Whilst those with lipodystrophy cannot tell us much about diabetes and obesity in ‘normal’ people, this condition does make it very clear that diabetes – insulin resistance, high insulin and high sugar levels – is primarily an issue with energy storage, how the body goes about this storage, and the role that insulin plays. If there is somewhere for excess energy to go easily, insulin levels will not go up, and nor will blood sugar levels.

But what of ‘normal’ people? Can normal people be fitted into the updated model of type II diabetes? Well, of course they can. But you need another step in the new model, a first step: ‘You eat too much carbohydrate.’ Adding in this step gives us the new causal chain:

  • You eat too much carbohydrate/sugar.
  • You produce too much insulin.
  • This forces your body to store fat.
  • You become obese.
  • At a certain point insulin resistance develops to block further weight gain.
  • This resistance becomes more and more severe until…
  • You become diabetic.
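Kendrick’s reasoning here amounts to checking two causal orderings against the observations. Here is a minimal sketch of that check in Python (my own toy encoding, not anything from Kendrick; the step names are invented, and I have made the balloon model’s compensatory hyperinsulinaemia explicit as a step):

    # Toy consistency check: encode each model as an ordered causal chain,
    # then ask whether it can place high insulin before obesity, as observed
    # in young, not-yet-obese Pima Indians.
    models = {
        # "Blowing up a balloon": eating drives obesity, which drives insulin
        # resistance; high insulin arrives only later, as compensation.
        "balloon": ["overeat", "obesity", "insulin_resistance",
                    "high_insulin", "diabetes"],
        # Kendrick's model: excess insulin comes first and drives fat storage.
        "insulin_first": ["excess_carbohydrate", "high_insulin", "fat_storage",
                          "obesity", "insulin_resistance", "diabetes"],
    }

    def precedes(model_name, earlier, later):
        """True if `earlier` comes before `later` in the model's causal chain."""
        chain = models[model_name]
        return chain.index(earlier) < chain.index(later)

    # Observation: hyperinsulinaemia precedes obesity.
    print(precedes("balloon", "high_insulin", "obesity"))        # False -- fails
    print(precedes("insulin_first", "high_insulin", "obesity"))  # True  -- fits

The same kind of check against the Sumo and lipodystrophy facts is what forces the storage-capacity idea into the model: what matters is whether excess energy has somewhere easy to go, not body fat as such.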

(Hat tip to Mangan.)

Killing the Queen: Ronda Rousey

July 31st, 2015

Jack Slack gives his advice for killing the queen, Ronda Rousey:

Really it all comes down to avoiding the clinch for as long as possible by circling off as Rousey comes in on a straight line. And using long, non-committal strikes to punish Rousey’s bull rushes, or intercepting elbows to hurt and deter Rousey. This can already be seen frequently in men’s MMA: it’s sound, proven strategy against a rushing opponent, whether he wants the infight, a brawl, a shot, or the clinch. Realistically though, anyone who doesn’t either 1) bumrush straight into the clinch with Rousey or 2) concede the clinch while swinging desperately for a knockout in the first minute is already way ahead of the game. Then when the clinch does inevitably come, making sure it isn’t the be-all and end-all of the fight. But let’s face it, even avoiding one or two of Rousey’s charges would be considered a good performance.

The Cimmerian Hypothesis

July 31st, 2015

Beyond the Black River ends with these words:

‘Barbarism is the natural state of mankind,’ the borderer said, still staring somberly at the Cimmerian. ‘Civilization is unnatural. It is a whim of circumstance. And barbarism must always ultimately triumph.’

This is more than a bit of bluster meant to add color to an adventure story, John Michael Greer argues:

Science fiction has made much of its claim to be a “literature of ideas,” but a strong case can be made that the weird tale as developed by Lovecraft, Smith, Howard, and their peers has at least as much claim to the same label, and the ideas that feature in a classic weird tale are often a good deal more challenging than those that are the stock in trade of most science fiction: “gee, what happens if I extrapolate this technological trend a little further?” and the like. The authors who published with Weird Tales back in the day, in particular, liked to pose edgy questions about the way that the posturings of our species and its contemporary cultures appeared in the cold light of a cosmos that’s wholly uninterested in our overblown opinion of ourselves.

Thus I think it’s worth giving Conan and his fellow barbarians their due, and treating what we may as well call the Cimmerian hypothesis as a serious proposal about the underlying structure of human history.

What sets barbarian societies apart from civilized ones, he suggests, is that a much smaller fraction of the environment that barbarians encounter results from human action:

When you go outdoors in Cimmeria — if you’re not outdoors to start with, which you probably are — nearly everything you encounter has been put there by nature. There are no towns of any size, just scattered clusters of dwellings in the midst of a mostly unaltered environment. Where your Aquilonian town dweller who steps outside may have to look hard to see anything that was put there by nature, your Cimmerian who shoulders his battle-ax and goes for a stroll may have to look hard to see anything that was put there by human beings.

What’s more, there’s a difference in what we might usefully call the transparency of human constructions. In Cimmeria, if you do manage to get in out of the weather, the stones and timbers of the hovel where you’ve taken shelter are recognizable lumps of rock and pieces of tree; your hosts smell like the pheromone-laden social primates they are; and when their barbarian generosity inspires them to serve you a feast, they send someone out to shoot a deer, hack it into gobbets, and cook the result in some relatively simple manner that leaves no doubt in anyone’s mind that you’re all chewing on parts of a dead animal. Follow Conan’s route down into the cities of Aquilonia, and you’re in a different world, where paint and plaster, soap and perfume, and fancy cookery, among many other things, obscure nature’s contributions to the human world.

Here’s where his argument takes an unexpected turn:

“Primitive” cultures — that is to say, human societies that rely on relatively simple technological suites — differ from one another just as dramatically as they differ from modern Western industrial societies; nor do simpler technological suites correlate with simpler cultural forms.

[...]

Thus traditional tribal societies are no more natural than civilizations are, in one important sense of the word “natural;” that is, tribal societies are as complex, abstract, unique, and historically contingent as civilizations are. There is, however, one kind of human society that doesn’t share these characteristics — a kind of society that tends to be intellectually and culturally as well as technologically simpler than most, and that recurs in astonishingly similar forms around the world and across time. We’ve talked about it at quite some length in this blog; it’s the distinctive dark age society that emerges in the ruins of every fallen civilization after the barbarian war leaders settle down to become petty kings, the survivors of the civilization’s once-vast population get to work eking out a bare subsistence from the depleted topsoil, and most of the heritage of the wrecked past goes into history’s dumpster.

If there’s such a thing as a natural human society, the basic dark age society is probably it, since it emerges when the complex, abstract, unique, and historically contingent cultures of the former civilization and its hostile neighbors have both imploded, and the survivors of the collapse have to put something together in a hurry with nothing but raw human relationships and the constraints of the natural world to guide them. Of course once things settle down the new society begins moving off in its own complex, abstract, unique, and historically contingent direction; the dark age societies of post-Mycenean Greece, post-Roman Britain, post-Heian Japan, and their many equivalents have massive similarities, but the new societies that emerged from those cauldrons of cultural rebirth had much less in common with one another than their forbears did.

Human societies that don’t have urban centers tend to last much longer than those that do, he notes:

As we’ve seen, a core difference between civilizations and other human societies is that people in civilizations tend to cut themselves off from the immediate experience of nature to a much greater extent than the uncivilized do. Does this help explain why civilizations crash and burn so reliably, leaving the barbarians to play drinking games with mead while sitting unsteadily on the smoldering ruins?

As it happens, I think it does.

As we’ve discussed at length in the last three weekly posts here, human intelligence is not the sort of protean, world-transforming superpower with limitless potential it’s been labeled by the more overenthusiastic partisans of human exceptionalism. Rather, it’s an interesting capacity possessed by one species of social primates, and quite possibly shared by some other animal species as well. Like every other biological capacity, it evolved through a process of adaptation to the environment—not, please note, to some abstract concept of the environment, but to the specific stimuli and responses that a social primate gets from the African savanna and its inhabitants, including but not limited to other social primates of the same species. It’s indicative that when our species originally spread out of Africa, it seems to have settled first in those parts of the Old World that had roughly savanna-like ecosystems, and only later worked out the bugs of living in such radically different environments as boreal forests, tropical jungles, and the like.

The interplay between the human brain and the natural environment is considerably more significant than has often been realized. For the last forty years or so, a scholarly discipline called ecopsychology has explored some of the ways that interactions with nature shape the human mind. More recently, in response to the frantic attempts of American parents to isolate their children from a galaxy of largely imaginary risks, psychologists have begun to talk about “nature deficit disorder,” the set of emotional and intellectual dysfunctions that show up reliably in children who have been deprived of the normal human experience of growing up in intimate contact with the natural world.

All of this should have been obvious from first principles. Studies of human and animal behavior alike have shown repeatedly that psychological health depends on receiving certain highly specific stimuli at certain stages in the maturation process. The experiments by Harry Harlow, who showed that monkeys raised with a mother-substitute wrapped in terrycloth grew up more or less normal, while those raised with a bare metal mother-substitute turned out psychotic even when all their other needs were met, are among the more famous of these, but there have been many more, and many of them can be shown to affect human capacities in direct and demonstrable ways.

Universities Are an Illusion

July 30th, 2015

The University of North Carolina at Chapel Hill’s department of African and Afro-American studies offered a now-notorious selection of non-classes for athletes — and others — who needed a little… help:

After the fraud was exposed and both the university chancellor and Mr. Davis lost their jobs, outside investigators discovered that U.N.C. had essentially no system for upholding the academic integrity of courses. “So long as a department was offering a course,” one distinguished professor told the investigators, “it was a legitimate course.”

Mr. Davis came to understand this all too well. As the investigators wrote in their final report, Mr. Davis “found Chapel Hill’s attitude toward student-athlete academics to be like an ‘Easter egg,’ beautiful and impressive to the outside world, but without much life inside.”

Most colleges, presumably, aren’t harboring in-house credit mills. Yet in its underlying design, organizational values and daily operations, North Carolina is no different from most other colleges and universities. These organizations are not coherent academic enterprises with consistent standards of classroom excellence. When it comes to exerting influence over teaching and learning, they’re Easter eggs. They barely exist.

This goes a long way toward explaining why colleges spend so much time and effort creating a sense of tribal solidarity among students and alumni. Think of the chant that Joe Paterno and students cried out together at the height of their university’s pedophilia scandal: “We are! Penn State!” The costumes, rituals and gladiatorial contests with rival colleges are all designed to portray the university as united and indivisible. Newer colleges that lack such deeply rooted identities spend millions of dollars on branding consultants in order to create them.

They do this to paper over uncomfortable truths revealed by their own researchers.

The Bible of academic research on how colleges affect students is a book titled, plainly enough, “How College Affects Students.” It’s an 848-page synthesis of many thousands of independent research studies over the decades. The latest edition was published in 2005 by Ernest Pascarella and Patrick Terenzini, professors at the University of Iowa and Penn State.

The sections devoted to how colleges differ from one another are notable for how little they find. As Mr. Pascarella and Mr. Terenzini carefully document, studies have found that some colleges are indeed better than others in certain ways. Students tend to learn more in colleges where they have closer relationships with faculty and peers, for example, and earn a little more after graduating from more selective institutions.

But these findings are overwhelmed in both size and degree by the many instances in which researchers trying to detect differences between colleges found nothing.

“The great majority of postsecondary institutions appear to have surprisingly similar net impacts on student growth,” the authors write. “If there is one thing that characterizes the research on between-college effects on the acquisition of subject matter knowledge and academic skills, it is that in the most internally valid studies, even the statistically significant effects tend to be quite small and often trivial in magnitude.”

The fact that universities hardly exist as unified teaching organizations should not be confused with the question of whether going to college is “worth it.” The typical student who graduates from a college somewhere fares far better in the job market than the typical student who doesn’t.

[...]

People can learn a lot in college, and many do. But which college matters much less than everyone assumes. As Mr. Pascarella and Mr. Terenzini explain, the real differences exist at the departmental level, or within the classrooms of individual professors, who teach with a great deal of autonomy under the principles of academic freedom. The illusory university pretends that all professors are guided by a shared sense of educational excellence specific to their institution. In truth, as the former University of California president Clark Kerr observed long ago, professors are “a series of individual faculty entrepreneurs held together by a common grievance over parking.”

The problem for students is that it is all but impossible to know ahead of time which part of the disunified university is which. Consumers of higher education have been taught that their main choice lies between whole institutions that are qualitatively different from one another. Because this is wrong, the higher education market often fails, which is probably one reason that a third of students who enroll in four-year colleges transfer or drop out within three years.

The whole apparatus of selective college admissions is designed to deliberately confuse things that exist with things that don’t. Many of the most prestigious colleges are an order of magnitude wealthier and more selective than the typical university. These are the primary factors driving their annual rankings at or near the top of the U.S. News list of “best” colleges. The implication is that the differences in the quality of education they provide are of a similar size. There is no evidence to suggest that this is remotely true.

When college leaders talk about academic standards, they often mean admissions standards, not standards for what happens in classrooms themselves. Or they vaguely appeal to traditions and shared values without any hard evidence of their meaning. This is understandable, because the alternative is admitting that many selective institutions are not intrinsically excellent; they were just lucky enough to get into the business of selecting the best and brightest before everyone else.

Jim Baen’s Top 10 Science Fiction Books

July 30th, 2015

Jim Baen called David Drake on Thursday, June 8, 2006, saying that Amazon had asked him at BookExpo for a list of the ten science fiction books that everybody should read. David Drake goes on:

He wanted me to join him in coming up with the list. Jim and I did this sort of thing — him calling me as a resource — very frequently. The only thing unusual is the fact that he’d had a mini-stroke during the night. When he finally went to the hospital on Monday, June 12, he had the fatal stroke. This was literally some of the last thinking on SF that Jim did.

I was rather shocked that I had in fact read all ten:

  1. Foundation by Isaac Asimov
  2. Stranger in a Strange Land by Robert A. Heinlein
  3. A Canticle for Leibowitz by Walter M. Miller
  4. 20,000 Leagues Under the Sea by Jules Verne
  5. Dune by Frank Herbert
  6. Lest Darkness Fall by L. Sprague de Camp
  7. Against the Fall of Night by Arthur C. Clarke
  8. Citizen of the Galaxy by Robert A. Heinlein
  9. The Time Machine by H.G. Wells
  10. A Connecticut Yankee in King Arthur’s Court by Mark Twain

I went back to see when I mentioned any of these books earlier, and the last time I mentioned Jim Baen’s list, I lamented that I’d missed a few. So, I have made some progress.

Anyway, I’ve mentioned Foundation repeatedly, but not to discuss its literary merits.

I’d consider myself a Heinlein fan, but I barely made it through Stranger in a Strange Land as a teen, and I didn’t make it through a few years ago. I much preferred the short novel he wrote while taking a break from Stranger.

I didn’t find Citizen of the Galaxy memorable.

When I tried to read A Canticle for Leibowitz in college, the pre-Vatican II Catholicism didn’t work for me, but when I re-read it a few years ago I found it powerful and insightful:

I did not know it at the time, but Walter Miller, the author, had served in a bomber crew that helped destroy the monastery at Monte Cassino during World War II, and he converted to Catholicism after the war. Seen through his sympathetic eyes, the Church is a source of great practical wisdom, with established methods for steering flawed human beings toward productive behaviors — not unlike the Overcoming Bias and Less Wrong crowds, but more experienced, if also tied to a peculiar cosmology.

20,000 Leagues Under the Sea never did much for me. Verne is supposed to be the father of science fiction, so he may simply need a better translator.

Dune I mention here regularly, as a powerful novel that didn’t quite work for me — but obviously stuck with me.

I definitely enjoyed Lest Darkness Fall, which has its modern-day protagonist bring telegraphy without electricity to ancient Rome. That scenario raises the interesting question of ideas behind their time.

The Time Machine is an absolute classic. Lawrence Auster would certainly recommend it. Wells wrote many novels worth reading.

By contrast, Mark Twain might qualify as a wit of the highest order, yet I found A Connecticut Yankee in King Arthur’s Court hard to take. It’s mainly ham-fisted Whig history, which played to his audience, I’m sure.

How to Hire Better Cops

July 29th, 2015

It’s seldom recognized that the argument for racial quotas makes more sense for cops than for firemen, Steve Sailer remarks:

A couple of weeks ago my column “The Density Divide” brought up a fundamental divide between two kinds of human activity: objective striving against the natural world versus subjective contention for social dominance. Firemen battle chemical reactions, so it makes sense to hire them meritocratically, while policemen try to impose their will upon other people, so political considerations matter relatively more.

Asimov, Heinlein and Virgil

July 29th, 2015

Science fiction is speculative fiction, John C. Wright argues:

Many a fan of Science Fiction would like any classical work containing an unearthly or supernatural element to count as Science Fiction, including the Odyssey, Aeneid, Fourth Eclogue, Divine Comedy, Tempest and Faust, not to mention the Ring Cycle of Wagner.

Science fiction is the fiction of the scientific revolution. It is the unique product of the revolution in thought that ushered in the modern age. That revolution changed both the theory and the practice of life, the paradigm and the technology, both what men thought about the cosmos and how they lived their daily lives.

Having lived through one paradigm shift and its attendant technological advancements, an audience was ready for fictional speculation about the next paradigm shift, the next technological advancement.

Speculative fiction, properly so called, is fiction taking place in a cosmos that differs from what the audience understands to be the real world, either (in science fiction) after the next paradigm shift or (in fantasy) before the previous one. Both challenge the imagination by rejecting the paradigm, or the technology, current to the time and place in which the author and his readers generally agree they live.

Even a single unearthly or extraterrestrial element in an otherwise mundane setting—a Mindreader in Brooklyn—can make the story science fiction; this is because discovering a Mindreader in Brooklyn would overthrow the current paradigm. We don’t believe in telepaths, and James Randi disbelieves even less than we do. Therefore a tale where the reader is asked to take that possibility seriously, to think through the implications, challenges the current paradigm.

The genre is called “speculative” because of the emphasis on implications. The Invisible Man of H.G. Wells has to run around naked because his clothing was not also transparent, and his footprints dinted the snow. The invisible ring in Orlando Furioso had no such logical limitations: it was magic. When Bradamante puts it in her mouth, she vanishes.

All this is in marked contrast to the epics and poems mentioned here. They were written by authors whose purpose was to confirm the paradigm of the time and place in which they wrote.

Dante was not attempting to lead his Christian readers into speculations about what the pre-Christian world looked like to pre-Christians, or to imagine what the world would have been like had that long-lost world-view been true. Dante did not write a fantasy. He wrote the opposite. Pagan elements are introduced (Ulysses, etc.) for the express purpose of being retrofitted into a Christian philosophical framework. This would be the same as if some author (for example Mary Renault) took a character from the previous prescientific world view (for example, Theseus) and retold his story explaining all the supernatural elements in terms of scientifically and anthropologically modern ideas (for example THE KING MUST DIE).

The speculative element is exactly what is missing in Dante: and I say this with the greatest respect for Dante’s scientific learning. His astronomy and his optics are spot-on perfect. But when the shades in Purgatory see the shadow of Dante on the ground, and the departed spirits cast no shadow, it is not explained how the ghostly eyeballs can see Dante’s shadow, if the photons are passing through them–and if the photons are not passing through them, then how is it that the departed spirits cast no shadows? Common folk wisdom of Dante’s time said shades were shadowless, and he had craft and art enough to work this cleverly into his poem. But he did not speculate about scientific implications. Dante’s take on ghosts was meant to confirm the paradigm of his age.

In contrast, Robert E. Howard wrote fantasy. Conan does not live in our universe as we understand it: he cannot be fitted into the modern scientific world-view. Conan is a speculation (if we may dignify it with that term) about what the world would have been like had the men of the previous paradigm been correct in their view of the universe: a realm of capricious gods, monsters, bold barbarians, beautiful slavegirls, pirates, kings, where magic worked and sorcery hung thick as incense on the air.

Do not be deceived by the presence of wondrous and fantastic elements in the great poets. All tales are really about wonder. All readers suspend their skepticism at least in part for the sake of the tale being told. I truly doubt every man in the audience of Homer believed in Amazons or Centaurs. Certainly Plato scoffs at Homer’s portrayal of Gods and demigods. And there were skeptics even in Shakespeare’s day who did not believe in ghosts: but ghosts were an accepted part of the revenge story, and so a ghost in HAMLET was not something alien to their paradigm of the universe. There are many modern skeptics who do not believe in love at first sight, but who will accept it as possible for the sake of watching a love story.

So, with all due respect, while we have the liberty to define SF broadly enough to include anything and everything we want (indeed, a liberty I take here), we run the risk of sounding puffed and presumptuous. I have never been at an SF Con where a fan said his three favorite science fiction authors were Asimov, Heinlein and Virgil. I have never found a copy of Shakespeare’s TEMPEST in the Dungeons & Dragons aisle at the bookstore, even though Prospero is clearly a Twelfth Level mage, able to cast a seventh level control weather spell with an area-effect modifier.

How Dare You Say That! The Evolution of Profanity

July 28th, 2015

John H. McWhorter (The Language Hoax) explores the evolution of profanity:

In medieval English, at a time when wars were fought in disputes over religious doctrine and authority, the chief category of profanity was, at first, invoking—that is, swearing to—the name of God, Jesus or other religious figures in heated moments, along the lines of “By God!” Even now, we describe profanity as “swearing” or as muttering “oaths.”

It might seem like a kind of obsessive piety to us now, but the culture of that day was largely oral, and swearing—making a sincere oral testament—was a key gesture of commitment. To swear by or to God lightly was considered sinful, which is the origin of the expression to take the Lord’s name in vain (translated from Biblical Hebrew for “emptily”).

The need to avoid such transgressions produced various euphemisms, many of them familiar today, such as “by Jove,” “by George,” “gosh,” “golly” and “Odsbodikins,” which started as “God’s body.” “Zounds!” was a twee shortening of “By his wounds,” as in those of Jesus. A time traveler to the 17th century would encounter variations on that theme such as “Zlids!” and “Znails!”, referring to “his” eyelids and nails.

In the 19th century, “Drat!” was a way to say “God rot.” Around the same time, darn started when people avoided saying “Eternal damnation!” by saying “Tarnation!”, which, because of the D-word hovering around, was easy to recast as “Darnation!”, from which “darn!” was a short step.

By the late 18th century, sex, excretion and the parts associated with same had come to be treated as no less profane than “swearing” in the religious sense. Such matters had always been considered bawdy topics, of course, but the space for ordinary words referring to them had been shrinking for centuries already.

Chaucer had available to him a thoroughly inoffensive word referring to the sex act, swive.

I think that qualifies as the word of the day!

We are hardly beyond taboos, McWhorter notes; we just observe different ones:

Today, what we regard as truly profane isn’t religion or sex but the slandering of groups, especially groups that have historically suffered discrimination or worse. Our profanity consists of the N-word, that C-word once suitable for an anatomy book discussion of women’s bodies, and a word beginning with f referring to gay men (and some would include a word referring to women beginning with b).

It might seem strained to compare our feelings about the N-word with a bygone era’s appalled shuddering over the utterance of “By God!” But do note that I have to euphemize the N-word here in print just as someone would once have felt compelled to say, “By Jove!”

[...]

But we are just as capable as previous eras of policing our taboos with unquestioning excess. An administrator in Washington, D.C.’s Office of the Public Advocate had to resign in 1999 for using the word niggardly in a staff meeting. At the University of Virginia, there was a campus protest in 2003 after a medical school staffer said that a sports team called the Redskins “was as derogatory to Indians as having a team called n— would be to blacks.” Julian Bond, who was then the head of the NAACP, said that only his respect for free speech kept him from recommending that she be fired. In 2014, the lawyer and writer Wendy Kaminer elicited aggrieved comments for saying, during a panel discussion at Smith College, that when we use euphemisms for the N-word we all “hear the word n— in our head.”

[...]

Some might object that we should not check that impulse, and that extremism is necessary to create lasting social change. But it’s useful to recall that, when it comes to profanity, there were once people who considered themselves every bit as enlightened as we see ourselves today, with the same ardent and appalled sense of moral urgency. They were people who said “Odsbodikins” and did everything they could to avoid talking about their pants.

The Point of Disbelief

July 28th, 2015

Definitions of SF are a subject not likely to be addressed to everyone’s satisfaction, but John C. Wright makes an effort:

The simplest definition is to say that, where normal stories are about rescuing princesses from pirates, science fiction stories are about rescuing space-princesses from space-pirates.

Behind this facetious definition there is a thought worth examining:

All stories are falsehoods used to reveal some truth. The falsehood is one the storyteller and the audience tacitly agree shall be treated as true for the purposes of telling the tale. In this respect, the storyteller is a magician who enchants his audience; they are willing to believe the unbelievable, to suspend their disbelief. But if he makes too great a demand on their willing suspension of disbelief, the spell is broken, and his illusion stands open to their contempt as a cheap trick.

Different audiences will place this ‘point of disbelief’ at different heights.

For example, in a comedy, the audience is willing to accept the most unlikely and unrealistic coincidences in plot or stunts in action, merely because it is funny. The tolerance is high. In a gritty action thriller, however, any unrealistic detail, such as shooting seven bullets from a six-shooter, will break the spell for a serious audience.

Every reader will recognize when it has happened once or twice that his point of disbelief has been notched upward. Let me use a war picture as an example. When the hero runs unscathed through a hail of machinegun bullets fired by Nazis (or, in Science Fiction, when he runs through a lightning-storm of blaster fire from Imperial Stormtroopers) something clicks in our brains, and we smile, and settle back in the theater seat, and we don’t take the movie as seriously as we did the moment before. We might still like it: but now it is a ‘popcorn’ movie, light entertainment. Our tolerance for unreality for light movies is more generous than for gravid ones. Compare that, on the other hand, with the opening sequence in SAVING PRIVATE RYAN, where the whistling storm of machinegun-fire was realistic and horrifying. No one was running around protected by an invisible aura of ‘main character glow’. The point of disbelief was low.

When we have put our tolerance at the high point, either because it is a genre we like or an author we like, we react grumpily to any evidence that the scenes are unrealistic. Complaints seem like nit-picking, small-mindedness. The complainer cannot get in the spirit of things. He is trying to break the spell.

Two factors make the calculation of where to put the point of disbelief complex:

First, unbelievable things actually do happen in real life: there are moments of high heroism and deep horror, eerie coincidences and true love. There really are men like Napoleon and George Washington, who change history. Stories are supposed to be about the unusual: anyone who works on a newspaper can tell you that.

Second, the craft of the artist consists of certain tricks and devices he uses to make the unbelievable seem real. This is called verisimilitude. Verisimilitude is the illusion of reality: a thing that is not real, but which seems realistic.

Stephen King writes with masterful craft by using settings and people as one might find in any small town in America; only after the reader is habituated into trusting these descriptions, do odd, and then unearthly elements begin to intrude on the picture. He is correctly regarded as a fine horror writer, perhaps the finest, because of his mastery of this device of verisimilitude.

There is a famous scene in Homer, when Andromache brings her baby out to say farewell to Hector before that warrior prince issues forth to battle. Astyanax is startled by the plumes on the helm of his father and begins to cry. This is the type of realistic detail that suddenly makes the unearthly elements in the epic seem more realistic. When Hector batters down the gate of the Achaian palisade, he hoists a rock so large that “two men, such as men are now, could not have lifted it.” The fact that the baby was startled by his gleaming armor makes Hector seem like a real person; even when he does feats no one nowadays can do, the feeling of reality is maintained. Instead of shaking their heads, and saying no one could lift up so large a rock, the listeners nod and listen.

Now, along the spectrum of realistic to unrealistic fiction, Speculative Fiction (by which I mean Science Fiction and Fantasy together) occupies the more unrealistic side. Indeed, Speculative readers not only tolerate but demand that a high demand be placed on their imaginations: they want to see life on Mars, or Barsoom, or Middle Earth, or in the Year 2000 or in the Hyperborean Age. We place the point of disbelief very high.

The separation of fantasy from science fiction is merely the difference in the craft of verisimilitude used. Fantasy impersonates the tone and style, the tropes and details of medieval and ancient songs, epics and folktales. Unearthly and unbelievable things can happen in Middle-Earth, provided they seem to happen in the same mood and atmosphere as ORLANDO FURIOSO or LE MORTE D’ARTHUR. If the mood is not broken, the audience will accept the illusion as real.

Science Fiction impersonates science. The science does not need to be real, but it needs to produce a realistic illusion. Time Travel, or Faster-Than-Light drive, are both as fantastical as Santa’s Elves: but, in the communal imagination of SF, they are assumed to be the product of scientific investigation, built in a workshop or lab, produced by the same ingenuity as Robert Fulton or the Wright Brothers.

This point is worth dwelling on. In order to create verisimilitude in THE TIME MACHINE, the author H.G. Wells provides a frame in the first chapter. The scene opens with an unnamed first-person narrator describing a conversation at a dinner party: the idea that time is a dimension that can be crossed like length, breadth, and height is introduced, and a machine for crossing time, similar to a flying machine, comes on stage as a prop. Now the reader is ready to accept the idea of a man who crosses time in a time machine the way a sea-traveler crosses the sea in a steamship. The Ghost of Christmas Yet to Be might bring Scrooge into the future to view a prophecy, but this is a supernatural visitation. The Time Traveler’s vehicle is natural, a product of his workshop, no more supernatural than a steam engine. But without the frame of the dinner party, where we meet the Time Traveler, without the initial theoretical discussion, the stress on the reader’s willing suspension of disbelief would be greater.

This is the unique property of Science Fiction. The readers of Science Fiction are expected to know something about modern science, and they expect that whatever fantastic adventure is about to be told them will be framed in terms of some explanation that is plausibly scientific. Whether the science fiction is hard or soft depends on how implausible the scientific explanation is, and how central to the story it is.

Science fiction readers expect to be convinced by having a discussion or lecture take place in the text, which has enough real science to make the fake science seem real. These lectures are unknown in other genres.

Tales where the props and settings from science fiction are merely thrown in for flavor, or to produce a background of wonder, are rightly called Space Opera: adventure stories that take place in space, no different, really, than similar tales taking place in remote jungles, pirate-infested seas, golden palaces, or the mountains of Tibet. STAR WARS, for example, is space opera, since the science is there merely for flavor. The same tale could have taken place, almost unchanged, in the fairytale Japan of legend.

There is, by the way, a similar division in fantasy between hard and soft, or high and low. Fantasy that accurately follows the ancient models of the world, now lost, which our ancestors knew, is realistic fantasy (if we can use that term). The language is elevated, the action is mannered. Sword and Sorcery stories follow the themes of ancient epics and folktales. Oriental fantasy follows the model of the Arabian Nights’ tales, with their strange vistas, Jinn-haunted palaces, and cruel bejeweled splendors. The ‘Dying Earth’ tales of Jack Vance are a superb example of this opulent oriental flavor, even though they take place in the Far Future rather than the Far East.

Fantasy where the characters talk and act like middle-class gamers from Southern California, except that they swing swords and shoot lightning from their fingertips, is a tale where the fantasy settings and props are merely thrown in for flavor. We should call such unrealistic fantasy Elf Opera.

But the point, the main point, of speculative fiction, both fantasy and science fiction, is that they are both ultramundane. Fantasy is unearthly, and science fiction is extraterrestrial. They deal with things that do not happen in the here-and-now. Either the setting is in another world Beyond the Fields We Know, or something from the Other World or Outer Space has intruded into our comfortable little reality. When something from Beyond intrudes into our little world, the reaction is either terror or awe. All the old SF magazines had titles reflecting this: Thrilling, Wonder, Amazing, and so on. Any definition of Science Fiction or Fantasy that does not point to this central characteristic of unearthliness is defective.

Scrabble Francophone

July 27th, 2015

The French-language Scrabble world championship just went to a New Zealander — who doesn’t speak French:

The BBC reported that Nigel Richards, originally from Christchurch, defeated a rival from French-speaking Gabon in the final in Louvain, Belgium, on Monday.

He had only started studying the French dictionary about eight weeks ago, said a close friend of Mr Richards, Liz Fagerlund.

“He doesn’t speak French at all, he just learnt the words. He won’t know what they mean, wouldn’t be able to carry out a conversation in French I wouldn’t think.”

Mr Richards, now in his late forties, is a previous English Scrabble champion. He is based in Malaysia.

He has won five US National titles and the World Scrabble Championship three times.