Are Jews Smarter? What Genetic Science Tells Us

Friday, October 21st, 2005

This idea is getting a lot of play. From Are Jews Smarter? What Genetic Science Tells Us:

This story begins, as it inevitably must, in the Old Country.

At some point during the tenth century, a group of Jews abandoned the lush hills of Lucca, Italy, and — at the invitation of Charlemagne — headed for the severer climes of the Rhineland and Northern France. These Jews didn’t have a name for themselves, at first. They were tied together mostly by kinship. But ultimately, they became known as Ashkenazim, a variation on the Hebrew word for one of Noah’s grandsons.

In some ways, life was good for the Jews in this strange new place. They’d been lured there on favorable terms, with promises of physical protection, peaceful travel, and the ability to adjudicate their own quarrels. (The charter of Henry IV, dated 1090, includes this assurance: “If anyone shall wound a Jew, but not mortally, he shall pay one pound of gold . . . If he is unable to pay the prescribed amount . . . his eyes will be put out and his right hand cut off.”) But in other ways, life was difficult. The Ashkenazim couldn’t own land. They were banned from the guilds. They were heavily taxed.

Yet the Ashkenazim did very well, in spite of these constraints, because they found an ingenious way to adapt to their new environment that didn’t rely on physical labor. What they noticed, as they set up their towns, located mainly at the crossroads of trade routes, was that there was no one around to lend money.

So there it was: a demand and a new supplier. Because of the Christian prohibition against usury, Jews found themselves a financially indispensable place in their new home, extending loans to peasants, tradesmen, knights, courtiers, even the occasional monastery. The records from these days are scarce. But where they exist, they are often startling. In 1270, for example, 80 percent of the 228 adult Jewish males in Perpignan, France, made their living lending money to their Gentile neighbors, according to Marcus Arkin’s Aspects of Jewish Economic History. One of the most prolific was a rabbi. Two others were identified, in the notarial records, as “poets.”

Success at money-lending required a different set of skills than farming or any of the traditional trades. Some, surely, were social: cultivating connections, winning over trust (or maybe bullying your way there, Shylock’s awful pound of flesh). It probably required some aggression, because the field was competitive, with Jews suffering so few professional options. But it also required cognitive skills, or something my generation would call numeracy — a fluency in mathematics, a dexterity with numbers — and my grandmother’s generation would call “a head for figures.” If you were Jewish in Perpignan in 1270, and you didn’t have a head for figures, you didn’t stand much of a chance.

Numeracy, literacy, critical reasoning: For millennia, these have been the currency of Jewish culture, the stuff of Talmudic study, immigrant success, and Borscht Belt punch lines. Two Jews, three opinions . . . Keep practicing, you’ll thank me later . . . Q: When does a Jewish fetus become a human? A: When it graduates from medical school.

Of course, there’s another side to this shining coin. Jewish cleverness has also been an enduring feature of anti-Semitic paranoia. In the sixteenth century, Martin Luther said Jewish doctors were so smart they could develop a poison that could kill Christians in a single day — or any other time period of their choosing (and four centuries later, Pravda suggested Jewish doctors were spies sent to kill Stalin). After the calamities of September 11, one of the creepier conspiracy theories to whip through the Muslim world was the idea that only Jews were cunning enough to have pulled off the hijackings.

Last summer, Henry Harpending, an evolutionary anthropologist at the University of Utah, and Gregory Cochran, an independent scholar with a flair for controversy, skipped cheerfully into the center of this minefield. The two shopped around a paper that tried to establish a genetic argument for the fabled intelligence of Jews. It contended that the diseases most commonly found in Ashkenazim — particularly the lysosomal storage diseases, like Tay-Sachs — were likely connected to and, indeed, in some sense responsible for outsize intellectual achievement in Ashkenazi Jews. The paper contained references, but no footnotes. It was not written in the genteel, dispassionate voice common to scientific inquiries but as a polemic. Its science was mainly conjecture. Most American academics expected the thing to drop like a stone.

It didn’t.

Update: As Robert McHenry points out in The Education of Gesture, that intro can’t be true:

If you haven’t caught it yet, here’s the problem: Charlemagne died in 814 CE. No one is expected to know that particular fact, but many generally educated persons might recall that he was crowned Holy Roman Emperor at Christmas in 800. This would make his survival into the tenth century highly unlikely on the face of it.

On the DVR

Thursday, October 20th, 2005

I couldn’t agree more with Michael Blowhard’s On the DVR:

I wonder if people who haven’t yet sprung for a digital video recorder — a Tivo, or maybe a box that your cable company will rent you — understand how dramatically using one can change your experience of television.

[...]

But for me what’s been most wonderful is the way the DVR — essentially some software and a hard drive — becomes the TV equivalent of your book or CD library. When The Wife and I settle in to do a little tube-watching, we don’t see ‘what’s on television.’ Instead, we check out what’s waiting for us on the hard drive.

You can accumulate an amazing collection of shows with only a minimal amount of programming effort. It used to be common to say that TV was the enemy of true culture. These days … Well, if you use your DVR wisely, watching TV can become a rewarding part of a classy cultural life.

The list of what’s on his DVR seems eerily familiar.

Why do we believe in God?

Thursday, October 20th, 2005

Robert Winston asks, Why do we believe in God? and looks at the long-held belief that extreme religiosity and insanity are linked:

Many years ago, a team of researchers at the department of anthropology at the University of Minnesota decided to put this association to the test. They studied certain fringe religious groups, such as fundamentalist Baptists, Pentecostalists and the snake-handlers of West Virginia, to see if they showed the particular type of psychopathology associated with mental illness. Members of mainstream Protestant churches from a similar social and financial background provided a good control group for comparison. Some of the wilder fundamentalists prayed with what can only be described as great and transcendental ecstasy, but there was no obvious sign of any particular psychopathology among most of the people studied. After further analysis, however, there appeared a tendency to what can only be described as mental instability in one particular group. The study was blinded, so that most of the research team involved with questionnaires did not have access to the final data. When they were asked which group they thought would show the most disturbed psychopathology, the whole team identified the snake-handlers. But when the data were revealed, the reverse was true: there was more mental illness among the conventional Protestant churchgoers — the ‘extrinsically’ religious — than among the fervently committed.

Vegas, Baby… Vegas!

Thursday, October 20th, 2005

In Vegas, Baby… Vegas!, James K. Glassman discusses Learning from Las Vegas:

In 1972, architects Robert Venturi and Denise Scott Brown wrote a book called Learning From Las Vegas, which celebrated the gambling capital’s architecture. Designers and builders, the authors insisted, should respond to the tastes and desires of “common” folks, as the architects of Las Vegas had.

Learning from Las Vegas created a scandal. In a typical commentary from a cultural journal, the Ohio Review described the book as “dangerous,” and warned that it “inverts the ideas that many have based their professional lives upon. It threatens those things that we use to distinguish the difference between us, the cultured, and them, the vulgar.”

Flash forward 33 years. America’s professional classes — especially economists, journalists, and politicians — have even more to learn from Las Vegas. I go there three or four times a year, and I suggest that the mandarins of Washington and New York should take similar pilgrimages to learn how the world really works.

Judging by the number of people rushing to live in it, Las Vegas is one of the most successful cities in the world. By far America’s fastest-growing metropolitan area, its population rose from 273,000 in 1970 to 1,700,000 today. The city also attracted 37 million visitors last year — about the same as New York City.

Las Vegas has become the most exciting and gorgeous urban artifact of the past few decades. It has the best restaurants (practically every great chef now has an outpost, most recently Daniel Boulud), most dramatic hotels, most creative nightclubs, and grand shopping. Of course, it also has gambling.

But it’s not just the gambling. Dave Kirvin, one of Vegas’s top PR executives, points out that the big casino-hotels now collect the majority of their revenues from “non-gaming activities” — rooms, dinners, drinks, shows. Many other places have now adopted gambling, but none has approached the success of Las Vegas.

The Age of Radical Enhancement

Thursday, October 20th, 2005

In The Age of Radical Enhancement, Arnold Kling again looks at ideas raised by Kurzweil’s singularity:

Perhaps the last unenhanced human to make a significant contribution in the field of mathematics has already been born. In twenty years, the tenure track at top university mathematics departments may consist entirely of people who depend on drugs, direct neural-computer connections, genetic modification, or a combination of all three in order to achieve high-level performance.

Some people would argue that the leading edge of this phenomenon is athletes’ use of steroids. I would caution, however, that athletics is atypical in that it is a zero-sum game, and we should not automatically adopt zero-sum bioethics.

When Kling asked his college-age daughters if they knew many students taking Adderall, an amphetamine cocktail prescribed for ADD, they each responded “Of course.” Here’s what Slate writer Joshua Foer had to say about The Adderall Me:

Depressives have Prozac, worrywarts have Valium, gym rats have steroids, and overachievers have Adderall. Usually prescribed to treat Attention Deficit Hyperactivity Disorder, the drug is a cocktail of amphetamines that increases alertness, concentration, and mental-processing speed and decreases fatigue. It’s often called a cognitive steroid because it can make people better at whatever it is they’re doing. When scientists administered amphetamines to college shot-putters, they were able to throw more than 4 percent farther. According to one recent study, as many as one in five college students have taken Adderall or its chemical cousin Ritalin as study buddies.

The drug also has a distinguished literary pedigree. During his most productive two decades, W.H. Auden began every morning with a fix of Benzedrine, an over-the-counter amphetamine similar to Adderall that was used to treat nasal congestion. James Agee, Graham Greene, and Philip K. Dick all took the drug to increase their output. Before the FDA made Benzedrine prescription-only in 1959, Jack Kerouac got hopped up on it and wrote On the Road in a three-week “kick-writing” session. “Amphetamines gave me a quickness of thought and writing that was at least three times my normal rhythm,” another devotee, Jean-Paul Sartre, once remarked.

If stimulants worked for those writers, why not for me? [...] As an experiment, I decided to take Adderall for a week. The results were miraculous. On a recent Tuesday, after whipping my brother in two out of three games of pingpong — a triumph that has occurred exactly once before in the history of our rivalry — I proceeded to best my previous high score by almost 10 percent in the online anagrams game that has been my recent procrastination tool of choice. Then I sat down and read 175 pages of Stephen Jay Gould’s impenetrably dense book The Structure of Evolutionary Theory. It was like I’d been bitten by a radioactive spider.

An anecdote involving mathematician Paul Erdös:

There’s also the risk that Adderall can work too well. The mathematician Paul Erdös, who famously opined that “a mathematician is a device for turning coffee into theorems,” began taking Benzedrine in his late 50s and credited the drug with extending his productivity long past the expiration date of his colleagues. But he eventually became psychologically dependent. In 1979, a friend offered Erdös $500 if he could kick his Benzedrine habit for just a month. Erdös met the challenge, but his productivity plummeted so drastically that he decided to go back on the drug. After a 1987 Atlantic Monthly profile discussed his love affair with psychostimulants, the mathematician wrote the author a rueful note. “You shouldn’t have mentioned the stuff about Benzedrine,” he said. “It’s not that you got it wrong. It’s just that I don’t want kids who are thinking about going into mathematics to think that they have to take drugs to succeed.”

Epic Pooh and Into the Woods

Wednesday, October 19th, 2005

Years ago, fantasy writer Michael Moorcock attacked the popular works of Tolkien (Lord of the Rings), C.S. Lewis (Chronicles of Narnia), and Richard Adams (Watership Down) as Epic Pooh — comforting, unchallenging, and conservative:

I sometimes think that as Britain declines, dreaming of a sweeter past, entertaining few hopes for a finer future, her middle-classes turn increasingly to the fantasy of rural life and talking animals, the safety of the woods that are the pattern of the paper on the nursery room wall. Old hippies, housewives, civil servants, share in this wistful trance; eating nothing as dangerous or exotic as the lotus, but chewing instead on a form of mildly anaesthetic British cabbage. If the bulk of American sf could be said to be written by robots, about robots, for robots, then the bulk of English fantasy seems to be written by rabbits, about rabbits and for rabbits.

In Into the Woods, James Parker takes a different point of view:

But writing off “Watership Down” as a manifesto of middle-class conservatism misses the point; the book’s unique effect resides not solely in the comforting, cabbage-muffled discourse of the well-behaved rabbits, but in the irruption — into their quiet, grey world — of violence and domination.

For this was the other sphere of Adams’s experience: Prior to his career in government, he’d had an action-packed war, serving in the Middle East and then participating, as an officer in the 1st Airborne Division, in Operation Market Garden, the calamitous and bloody Allied attempt to clear the main bridges in German-occupied Holland. The Second World War shaped him as irresistibly as the first had shaped those other primary English fantasists, Tolkien and Lewis: “I must confess,” wrote Adams, “that it was the high point of my life, and the rest has been little more than an aftermath.”

Active duty is no guarantee against whimsy — A.A. Milne, creator of Winnie the Pooh, was a signals officer at the Battle of the Somme — but the marks of Adams’s war on “Watership Down” are plain. The lines of power in the book are drawn with brutal clarity, from the Owslafa — the Gestapo-like enforcers of General Woundwort’s warren — to the more improvised and benign, but no less efficient, command structure used by Hazel and his band of runaway bucks. And the novel’s violence, ever-threatening, occurs with a terrible, scuffling abruptness, leaving half-severed ears, torn haunches, nostrils filled with blood. The frozen state of “tharn” — defined in the book’s “Lapine Glossary” as “stupefied, distraught, hypnotized with fear” — is a facet of rabbit-hood, certainly, but its human version is shell shock: locked terror, the draining away of courage.

Kurzweilomics

Wednesday, October 19th, 2005

Arnold Kling opens his Kurzweilomics piece — about the economic consequences of Kurzweil’s singularity — with an anecdote from Simon Kuznets, cited by Robert Fogel:

He used to give a one-year course in growth economics, both at Johns Hopkins and Harvard. One of the points he made was that if you wanted to find accurate forecasts of what happened in the past, don’t look at what the economists said. The economists in 1850 wrote that the progress of the last decade had been so great that it could not possibly continue. And economists at the end of the nineteenth century wrote that the progress of the last half century had been so great that it could not possibly continue during the twentieth century. Kuznets said you would come closest to an accurate forecast if you read the writers of science fiction. But even the writers of science fiction were too pessimistic.

If returns are in fact accelerating, many of our problems go away:

If output per person in 2025 is more than 5 times what it is today, then the economy will have won the race. That means that all of the concerns that economists raise about the middle of this century, such as the external debt of the U.S. economy (the cumulative trade deficit), the fiscal implications of Social Security and Medicare, or gloomy scenarios for global warming, will be trivialized by the sheer heights that economic wealth will have scaled by that time. If Kurzweil is correct, then the mountain of debt that we fear we are accumulating now will seem like a molehill by 2040. We will pay off this debt the way someone who wins a million-dollar lottery pays off a car loan.
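A quick back-of-the-envelope check, my own arithmetic rather than Kling's or Kurzweil's, shows what "more than 5 times today's output per person by 2025" implies if you smooth it into a steady compound growth rate. The base year, the steady-growth assumption, and the debt ratio below are all hypothetical, chosen only to illustrate the scale of the claim:

```python
# Rough sketch of what "5x output per person by 2025" implies.
# Assumptions (mine, for illustration): base year 2005 and a steady compound
# growth rate; Kurzweil's curve is accelerating, so a constant rate is only
# a rough stand-in. The debt-to-output ratio is purely hypothetical.

base_year, target_year, multiple = 2005, 2025, 5.0

years = target_year - base_year
annual_growth = multiple ** (1 / years) - 1
print(f"Implied average growth: {annual_growth:.1%} per year")  # about 8.4%

debt_to_output_now = 0.6  # hypothetical starting ratio, not a real figure
for year in (2025, 2040):
    growth = (1 + annual_growth) ** (year - base_year)
    print(f"{year}: debt/output shrinks to {debt_to_output_now / growth:.2f}")
```

Even smoothed out, that is roughly 8.4 percent real growth per person, every year, for two decades. If the bet pays off, a debt that looks mountainous against today's output is a rounding error against 2040's, which is exactly Kling's lottery-winner analogy.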

My Story: An Anecdotal Argument for Immigration Reform

Wednesday, October 19th, 2005

Ilya Shapiro and his Russian parents escaped to Canada, where he grew up. In My Story: An Anecdotal Argument for Immigration Reform he explains how difficult it is to become a permanent resident or citizen of the United States:

And yet there is no way to become a permanent resident without a spouse or employer acting as a sponsor (or without winning the ‘green card lottery,’ for which neither Canadians nor Russians — were I to reacquire that passport and avoid being sent to Chechnya — are eligible). Unlike every other immigrant-accepting country, the United States makes no provision for ‘independent immigration.’ That is, the executive and legislative branches have not established a set of criteria by which immigration workers can evaluate would-be immigrants — no ‘points system’ like the one that enabled my engineer parents to come to Canada.

While I am hugely grateful for the opportunity to live and work in America (and in our nation’s capital), I am not presently able to use the wonderful education and skills I have been given for the higher purpose that has long directed my path: the service of my country. I cannot work in the State or Defense Departments, in the challenging and critical Justice Department jobs for which I am otherwise qualified, in Executive Office positions, or in any other legal or policy-making posts for which I have prepared my whole life. I cannot even ‘put my money where my mouth is’ (in terms of my support of our engagement in Iraq) by serving in the military JAG Corps — or even enlisting as a simple infantryman.

Smarter on Drugs

Wednesday, October 19th, 2005

Michael S. Gazzaniga thinks people are Smarter on Drugs:

Scientists have known for years that more commonplace chemicals such as adrenaline, glucose and caffeine increase memory and performance. We all know it, too: procrastinators find clarity of mind in the adrenaline rush to meet a deadline; we try not to work ‘on an empty stomach’; and we are willing to pay a premium for a venti latte — all testimony to our appreciation of these legal activities.

Self-medicating with Starbucks is one thing. But consider the following. In July 2002 Jerome Yesavage and his colleagues at Stanford University discovered that donepezil, a drug approved by the FDA to slow the memory loss of Alzheimer’s patients, improves the memory of the normal population. The researchers trained pilots in a flight simulator to perform specific maneuvers and to respond to emergencies that developed during their mock flight, after giving half the pilots donepezil and half a placebo. One month later they retested the pilots and found that those who had taken the donepezil remembered their training better, as shown by improved performance. The possibility exists that donepezil could become a Ritalin for college students.

He compares donepezil (Aricept) to Ritalin because “it is commonly thought to boost SAT scores by more than 100 points, for both the hyperactive and the normal user.”

Total Film’s 50 Greatest Horror Movies

Wednesday, October 19th, 2005

The editors of Total Film have presented their list of The 50 Greatest Horror Movies of all time. Here’s the first half of the list:

1 THE TEXAS CHAIN SAW MASSACRE 1974
2 HALLOWEEN 1978
3 SUSPIRIA 1977
4 DAWN OF THE DEAD 1978
5 THE SHINING 1980
6 PSYCHO 1960
7 THE WICKER MAN 1973
8 ROSEMARY’S BABY 1968
9 DON’T LOOK NOW 1973
10 CANNIBAL HOLOCAUST 1980
11 THE THING 1982
12 CARRIE 1976
13 THE EXORCIST 1973
14 THE BLAIR WITCH PROJECT 1999
15 WITCHFINDER GENERAL 1968
16 THE HAUNTING 1963
17 THE EVIL DEAD 1981
18 PEEPING TOM 1960
19 ALIEN 1979
20 BRIDE OF FRANKENSTEIN 1935
21 NIGHT OF THE LIVING DEAD 1968
22 CURSE OF THE CAT PEOPLE 1944
23 SWITCHBLADE ROMANCE 2003
24 A NIGHTMARE ON ELM STREET 1984
25 AN AMERICAN WEREWOLF IN LONDON 1981

I’ll have to hunt down a few of those, which I haven’t seen yet — I spent last Halloween weekend catching up on horror movies — but first I must fulfill my obligation to disagree with those rankings.

I won’t quibble over one and two; they’re obviously horror classics. The first Halloween, by the way, is remarkably low-gore. Let’s skip to three. Last year I anxiously awaited seeing Suspiria for the first time — thank you, DVR and obscure cable channel — and I can say it was a total waste of time. It wouldn’t make my top 50.

Dawn of the Dead definitely deserves to be high on the list — even the fast-zombie remake — but the original Night of the Living Dead deserves to be higher — and way, way higher than 21.

The Shining is definitely super-creepy. Psycho, on the other hand, has one utterly, fantastically horrifying shower scene — and not much else. I’d rank it much lower. I don’t know The Wicker Man.

Rosemary’s Baby is brilliant. I don’t know Don’t Look Now or Cannibal Holocaust, but I have my doubts. The Thing, Carrie, and The Exorcist all belong high on the list. I only caught Carrie for the first time last year — again, huzzah for the DVR! — and it might be one of the best horror movies I’ve ever seen. It’s so much more than that one famous blood-bath at the end.

The Blair Witch Project worked for me. The Haunting didn’t. At all. I don’t know Witchfinder General.

Sam Raimi’s Evil Dead is a classic, of sorts, but it’s better known for its extremely quotable sequels, the tongue-in-cheek Evil Dead 2 and Army of Darkness. I don’t know Peeping Tom.

Alien may be my favorite “horror” movie of all time, but I understand why not everyone would rate it as one of the top horror movies of all time: it has all the trappings of serious science fiction.

Bride of Frankenstein may be a classic, but it’s awful. Of course, the original Frankenstein is really, really awful — but it introduced an iconic character design for the monster, and it had some wonderfully gothic imagery. Still, I can’t believe the abnormal brain bit from Young Frankenstein was in the original.

I haven’t caught Curse of the Cat People yet, and I don’t know Switchblade Romance, but I did rewatch A Nightmare on Elm Street last Halloween, and I wasn’t impressed. I haven’t seen An American Werewolf in London in years, but I remember it as good ‘n’ creepy.

Chasing Ground

Wednesday, October 19th, 2005

Alex Tabarrok points out “an excellent article on the housing market based around a discussion of the development firm Toll Brothers,” which explains why U.S. housing prices might continue to rise:

‘In Britain you pay seven times your annual income for a home; in the U.S. you pay three and a half.’ The British get 330 square feet, per person, in their homes; in the U.S., we get 750 square feet. Not only does Toll say he believes the next generation of buyers will be paying twice as much of their annual incomes; in terms of space, he also seems to think they’re going to get only half as much. ‘And that average, million-dollar insane home in the burbs? It’s going to be $4 million.’
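A small arithmetic aside, mine rather than Tabarrok's or the article's: Toll's prediction of twice the price-to-income ratio and half the space amounts to saying that American buyers end up roughly where British buyers already are. The sketch below just restates the quoted figures:

```python
# Restating the figures quoted above: doubling the U.S. price-to-income
# ratio and halving space per person lands close to today's British numbers.

us_price_to_income, uk_price_to_income = 3.5, 7.0
us_sqft_per_person, uk_sqft_per_person = 750, 330

print(f"US price/income doubled: {us_price_to_income * 2} (UK today: {uk_price_to_income})")
print(f"US space halved: {us_sqft_per_person / 2} sq ft (UK today: {uk_sqft_per_person})")
```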

Increasing regulation restricts the housing supply and drives up prices — and communities are more likely to regulate new construction if they don’t benefit from it:

European towns also have less incentive to encourage development, Wachter says, because they generally do not, unlike their American equivalents, depend on their local tax base to pay for education and services, which tend to be federalized.

Definitely read the whole article.

Recipe for Destruction

Wednesday, October 19th, 2005

Ray Kurzweil and Bill Joy declare the 1918 flu’s genome a Recipe for Destruction:

To shed light on how the virus evolved, the United States Department of Health and Human Services published the full genome of the 1918 influenza virus on the Internet in the GenBank database.

This is extremely foolish. The genome is essentially the design of a weapon of mass destruction. No responsible scientist would advocate publishing precise designs for an atomic bomb, and in two ways revealing the sequence for the flu virus is even more dangerous.

First, it would be easier to create and release this highly destructive virus from the genetic data than it would be to build and detonate an atomic bomb given only its design, as you don’t need rare raw materials like plutonium or enriched uranium. Synthesizing the virus from scratch would be difficult, but far from impossible. An easier approach would be to modify a conventional flu virus with the eight unique and now published genes of the 1918 killer virus.

Second, release of the virus would be far worse than an atomic bomb. Analyses have shown that the detonation of an atomic bomb in an American city could kill as many as one million people. Release of a highly communicable and deadly biological virus could kill tens of millions, with some estimates in the hundreds of millions.

New species of flying reptile named for fang teeth

Wednesday, October 19th, 2005

When you hear that paleontologists have found a new species of flying reptile and named it for its fang teeth, you assume they chose a name like megadon (lit. big teeth):

Palaeobiologists at the University of Portsmouth in southern England dubbed the remains of the pterosaur found on a beach on the Isle of Wight three years ago Caulkicephalus trimicrodon.

Caulkhead is the informal name for natives of the Isle of Wight, off the southern coast of England, and trimicrodon means three small teeth.

‘It has massive fang-like front teeth, behind which are three small teeth. Behind those are bigger teeth and then rows of smaller teeth,’ said Dr David Martill, who described the specimen in the journal Cretaceous Research.

‘It was a fish-eater, with a crest on the tip of its snout and a wing span of 5 meters (about 5.5 yards) which would have made it one of the largest flying animals of its time,’ he added in a statement.

When I look at the artist’s rendering, I do not think three small teeth.

Scientists Study Gorilla Who Uses Tools

Tuesday, October 18th, 2005

Scientists Study Gorilla Who Uses Tools:

It had been thought that the premeditated use of stones and sticks to accomplish a task like cracking nuts was restricted to humans and the smaller, more agile chimpanzees. Then in late September, keepers at a Dian Fossey Gorilla Fund International sanctuary in this eastern Congo city saw 2 1/2-year-old female gorilla Itebero smashing palm nuts between rocks in the ‘hammer and anvil’ technique, considered among the most complex tool use behaviors.

Academic strife: the American University in the slough of despond

Tuesday, October 18th, 2005

In Academic strife: the American University in the slough of despond, Norman Levitt confronts “cultural competence”:

Like its predecessors ‘affirmative action,’ ‘diversity,’ and ‘multiculturalism’, it attempts to cloak problematical and even disturbing policy initiatives in linguistic vestments that suggest that no right-minded person could possibly demur. A ‘culturally competent’ academic, one might naively surmise, would be one who has absorbed and is able to propound some of the deep values — ethical, aesthetic or epistemological — that embody the stellar achievements of Western culture, one who could explain, for instance, why Dante or Kant or Ingres is present, at least subtly, in the assumptions under which we all live. Or something like that.

This, alas, would be a comical error.

“Here is an illustrative if fragmentary list,” as Levitt says, “of transgressions that would likely strip an academic of any chance of being designated culturally competent”:

  1. Suggesting that affirmative action might conflict with other standards of justice and equity, or that opponents of affirmative action are not ipso facto Klansmen waiting for their white sheets to come back from the laundry;
  2. Taking issue with the claim that Malcolm X was a paragon of humanitarianism and political genius;
  3. Disputing the wisdom of feminist theory as regards the social constructedness of gender;
  4. Asserting that the early demographic history of the Americas is more accurately revealed by scientific anthropology than by the Native American folklore and myth celebrated by tribal militants;
  5. Expressing doubts that ‘queer theory’ should be made the epicenter of literary studies.