Reading The Coming Anarchy Online

Sunday, December 27th, 2009

Curzon has compiled links to the articles that became Robert Kaplan’s The Coming Anarchy — they’re now available through The Atlantic’s archives:

  1. The Coming Anarchy
  2. Was Democracy Just a Moment? (blog post here)
  3. Idealism Won’t Stop Mass Murder
  4. Special Intelligence
  5. And Now for the News: The Disturbing Relevance of Gibbon’s Decline and Fall
  6. Proportionalism: A Realistic Approach to Foreign Policy
  7. Kissinger, Metternich, and Realism
  8. Conrad’s Nostromo and the Third World (first paragraph; remainder)
  9. The Dangers of Peace

Ringing of the Bells

Friday, December 25th, 2009

The Swedish Chef, Beaker, and Animal perform the Ringing of the Bells:

Genocidal American Imperialists

Thursday, December 24th, 2009

Wretchard tells a tale of genocidal American imperialists — and data gathered by one’s own lying eyes:

Once upon a time I was sitting with some guys who were excoriating the “genocidal American imperialist”. So I asked them a question: how many people do you actually know who’ve been killed by a genocidal American soldier? Nobody could mention a single instance. “Ok,” I said, “how many people do you know who’ve been liquidated by the New People’s Army?” Almost everyone knew one or more. They could see where this was going and therefore added to the argument certain “adjustments”. The Americans were starving the Filipino people; they “ordered” Filipino lackeys to kill peasants. Etc. etc. So they adjusted their mental model to produce their preconception because they couldn’t generate it from their own experience.

Now this is not to say that they might not actually be right about the “genocidal American imperialist”, but they will have to introduce evidence apart from the data gathered by their own lying eyes. Someone asked me whether ten years of freezing weather would dispel a belief in global warming. I said “no.” When people find that the mothership doesn’t arrive on the scheduled date they just wait some more.

Respect means different things

Wednesday, December 23rd, 2009

Respect means different things in different cultures, Wretchard reminds us:

I think I’ve told the story of running into a Sulu Muslim named “Pershing”. Upon inquiry, the man (who was a high-ranking sort of guy) explained that his grandfather had named him after the toughest hombre he could think of.

My guess is that the Afghans would understand the idea that Americans were retaliating for an attack against the American tribe. In fact, my ignorant guess is that they would find the idea perfectly natural. To convey that any attack against the American tribe would result in a reprisal, and that therefore all must live in watchful harmony so that the incidents would never be repeated, might, in my ignorant view, actually be comprehensible.

So respect in this matter might not take the form of treating them the way a British socialist would like to be “respected”, but the way a man of the mountains might want to be respected. When we come to them with what seems to them a legalistically bizarre set of rules, which constitute our idea of “humanity”, misunderstandings might arise. They attack Americans; the American President decides to withdraw. The Americans don’t explain that they are there to revenge themselves for an attack by al-Qaeda (in fact they deny it, for revenge is terribly un-PC in the West), but rather to advance the cause of gender equality and democracy and to fight climate change, all of which play well in a Labour meeting hall but may be gibberish out in the sticks. And so forth and so on.

In the end, they may think we are mad and respect us the less for it. And maybe we are mad, we just don’t know it.

Why Are Europeans White?

Tuesday, December 22nd, 2009

Why are Europeans white? Because they live in a not-so-sunny region, far from the equator, of course — but there’s more to it, Frank Sweet claims:

Northern Europeans are lighter than everyone to the south (Mediterraneans), to the east (Mongols and east-Asians), to the west (Native Americans across the Atlantic), and to the north (Inuit, Sami, Chukchi, Aleut).

Clearly, there once was a factor at work in Europe other than dim sunlight.

Here is another map of skin tone. Again, the blob surrounding the Baltic Sea is like nothing else on the planet. That this pale population surrounds the Baltic gives the first hint. It must have something to do with the oceans.

When did the inhabitants of the Baltic region lose their melanin?

It must have happened after 16 KYA (16 thousand years ago). The Baltic region was covered by ice before then and nobody lived there.

In fact, it happened after 13 KYA. Cave art from that time always shows normally pigmented people. Notice that in this painting from 13 KYA, the hunters are the same color as the deer.

It must have happened before 4.6 KYA because depigmented people first began to appear in art at that time. These Egyptian statues were painted in 2613 BC. They portray Prince Rahotep and his consort Nefret, of the Old Kingdom, early Fourth Dynasty. Notice that he is brown but she is pink.

And so, the next step in solving the puzzle is to ask, “What happened in Europe between 13 KYA and 4.6 KYA?”

What happened was the invention and spread of agriculture. Before 10 KYA people everywhere lived by hunting and gathering. Then, almost simultaneously, cereal growing was invented in four spots around the globe: Iraq (wheat, barley, rye), China (rice), Nigeria (sorghum), and Mexico (corn or maize).

What does skin tone have to do with eating cereal?

All meats have some vitamin D. Fish have very high amounts. But grains have no vitamin D at all. People who eat grains do not get vitamin D from food; they must get it from sunlight.

This usually works out fine because grains grow only where it is warm. And this means only in latitudes with bright sunlight, with one exception.

It is where the warm waters of the Gulf Stream wash ashore. The Baltic is the only place on earth where ocean currents keep it warm enough to grow grain despite dim sunlight.

When the inhabitants of this region switched to grain about 6 KYA, they suddenly stopped getting enough vitamin D to survive. They had stopped eating mostly meat and fish in a place where sunlight was too dim to produce vitamin D in normally pigmented skin.

And so they adapted by retaining into adulthood the infantile trait of extreme paleness.

Air Force Philosophy

Tuesday, December 22nd, 2009

The fact that the US Air Force was broadcasting unencrypted video feeds from its drones raises some questions about the USAF’s management philosophy:

The Navy evolved from a situation where when the ship was over the horizon, it was gone. No calling it back except with another ship and no real hope in that case of catching the first. Further, the first ship had to have the authority to do what needed to be done with the confidence that unless it was totally outrageous, it would be ignored or blessed later.

The USAF was born with a radio in its ear. Everything was under close control and only some senior Colonel or a General could make important decisions. And that same Colonel or General decided later what an important decision was. The attitude is that it has to be decided up the line. That leads to a failure to learn to trust your subordinates. Ergo, “Can you guarantee that all the encryption keys make it down to the lowest levels in the Army or USMC [United States Marine Corps]? No way.”

Fella doesn’t know many Sailors or Marines.

At least the Army could send a courier on a fast horse to catch the dispatched unit. But that didn’t always work. Witness JEB Stuart before Gettysburg.

USAF enlisted I encountered back in the day were not trained to make decisions outside their sandbox. Navy Chiefs and senior Marine NCOs? Yep. Ever hear of the Strategic Corporal? How about the Strategic Airman?

No offense to the Junior Service but I’m just sayin’…
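
For what it’s worth, the encryption itself is the easy part; scrambling a feed is a few lines in any modern crypto library. Here is a minimal sketch in Python, using the third-party cryptography package’s Fernet interface (generic symmetric encryption for illustration, not the actual drone datalink), and it shows exactly where the quoted objection bites: every ground receiver needs a copy of the key.

    from cryptography.fernet import Fernet

    # One shared symmetric key. Generating it is trivial; getting a copy
    # onto every receiver in the field is the key-distribution problem
    # the quote is pointing at.
    key = Fernet.generate_key()

    # On the sender: encrypt each chunk of video before broadcast.
    sender = Fernet(key)
    token = sender.encrypt(b"...one chunk of video...")

    # On the ground: a receiver with the key decrypts trivially;
    # one without it sees only ciphertext.
    receiver = Fernet(key)
    assert receiver.decrypt(token) == b"...one chunk of video..."

Broadcasting in the clear sidesteps that logistics problem entirely, which is presumably how the USAF talked itself into it.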

When Will White People Stop Making Movies Like Avatar?

Tuesday, December 22nd, 2009

When will white people stop making movies like Avatar? Annalee Newitz asks:

This is a classic scenario you’ve seen in non-scifi epics from Dances With Wolves to The Last Samurai, where a white guy manages to get himself accepted into a closed society of people of color and eventually becomes its most awesome member. But it’s also, as I indicated earlier, very similar in some ways to District 9. In that film, our (anti)hero Wikus is trying to relocate a shantytown of aliens to a region far outside Johannesburg. When he’s accidentally squirted with fluid from an alien technology, he begins turning into one of the aliens against his will. Deformed and cast out of human society, Wikus reluctantly helps one of the aliens to launch their stalled ship and seek help from their home planet.

If we think of Avatar and its ilk as white fantasies about race, what kinds of patterns do we see emerging in these fantasies?

In both Avatar and District 9, humans are the cause of alien oppression and distress. Then, a white man who was one of the oppressors switches sides at the last minute, assimilating into the alien culture and becoming its savior. This is also the basic story of Dune, where a member of the white royalty flees his posh palace on the planet Dune to become leader of the worm-riding native Fremen (the worm-riding rite of passage has an analog in Avatar, where Jake proves his manhood by riding a giant bird). An interesting tweak on this story can be seen in the 1980s flick Enemy Mine, where a white man (Dennis Quaid) and the alien he’s been battling (Louis Gossett Jr.) are stranded on a hostile planet together for years. Eventually they become best friends, and when the alien dies, the human raises the alien’s child as his own. When humans arrive on the planet and try to enslave the alien child, he lays down his life to rescue it. His loyalties to an alien have become stronger than to his own species.

These are movies about white guilt. Our main white characters realize that they are complicit in a system which is destroying aliens, AKA people of color — their cultures, their habitats, and their populations. The whites realize this when they begin to assimilate into the “alien” cultures and see things from a new perspective. To purge their overwhelming sense of guilt, they switch sides, become “race traitors,” and fight against their old comrades. But then they go beyond assimilation and become leaders of the people they once oppressed. This is the essence of the white guilt fantasy, laid bare. It’s not just a wish to be absolved of the crimes whites have committed against people of color; it’s not just a wish to join the side of moral justice in battle. It’s a wish to lead people of color from the inside rather than from the (oppressive, white) outside.

Think of it this way. Avatar is a fantasy about ceasing to be white, giving up the old human meatsack to join the blue people, but never losing white privilege. Jake never really knows what it’s like to be a Na’vi because he always has the option to switch back into human mode. Interestingly, Wikus in District 9 learns a very different lesson. He’s becoming alien and he can’t go back. He has no other choice but to live in the slums and eat catfood. And guess what? He really hates it. He helps his alien buddy to escape Earth solely because he’s hoping the guy will come back in a few years with a “cure” for his alienness. When whites fantasize about becoming other races, it’s only fun if they can blithely ignore the fundamental experience of being an oppressed racial group. Which is that you are oppressed, and nobody will let you be a leader of anything.

How Success Killed Duke Nukem

Tuesday, December 22nd, 2009

Clive Thompson explains how success killed Duke Nukem:

To videogame fans, that logo is instantly recognizable. It’s the insignia of Duke Nukem 3D, a computer game that revolutionized shoot-’em-up virtual violence in 1996. Featuring a swaggering, steroidal, wisecracking hero, Duke Nukem 3D became one of the top-selling videogames ever, making its creators very wealthy and leaving fans absolutely delirious for a sequel. The team quickly began work on that sequel, Duke Nukem Forever, and it became one of the most hotly anticipated games of all time.

It was never completed. Screenshots and video snippets would leak out every few years, each time whipping fans into a lather — and each time, the game would recede from view. Normally, videogames take two to four years to build; five years is considered worryingly long. But the Duke Nukem Forever team worked for 12 years straight. As one patient fan pointed out, when development on Duke Nukem Forever started, most computers were still using Windows 95, Pixar had made only one movie — Toy Story — and Xbox did not yet exist.

On May 6, 2009, everything ended. Drained of funds after so many years of work, the game’s developer, 3D Realms, told its employees to collect their stuff and put it in boxes. The next week, the company was sued for millions by its publisher for failing to finish the sequel.

Sometimes you need to learn to let go — which is hard to do when you’ve learned to set ludicrously high expectations, and you have the money to burn:

When Duke Nukem 3D came out, Broussard’s Duke Nukem engine — called Build — produced the best-looking game around. Barely a year later, though, it looked antiquated. Broussard’s key rival in the Dallas gaming scene, id Software, had announced its Quake II engine, which produced graphics that made Build seem blocky and crude. Broussard decided to license the Quake II engine, figuring it would save him precious time; programming an engine from scratch can take years. Though 3D Realms never confirmed how much it paid for the license — Miller referred to it as “a truckload of money” on a gaming news site — the price was said to be as high as $500,000. When the engine was released in December 1997, Broussard’s team quickly began creating game levels, monsters, and weapons around it.

By May 1998, the team had created enough material to show off at E3, the annual videogame industry convention. Duke Nukem Forever was set in Vegas; in the game’s plot, Duke operates a strip club and then has to fight off invading aliens. Broussard showed a trailer featuring a dozen different scenes, including Duke fighting on the back of a moving truck, jet airplanes crashing, and furious firefights with aliens. Critics were awed: “It sets a new benchmark for making a 3-D game more like a Hollywood movie,” Newsday proclaimed. Broussard was clearly obsessed with making his product as aesthetically appealing as possible. When he brought a few journalists over to a computer to show off bits of the game, he pointed out the way you could see individual wrinkles on characters’ faces and mused over how to make his campfire more realistic. (“As soon as we mix in some white smoke and some black smoke, I think we’ll be there,” he said.)

Behind the scenes, though, Broussard was already unhappy with the results and was craving better technology. A few months after the Quake II engine was released, another competitor, Epic MegaGames, unveiled a rival engine called Unreal. Its graphics were more realistic still, and Unreal was better suited to crafting wide-open spaces. 3D Realms was struggling mightily to get Quake II to render the open desert around Las Vegas. One evening just after E3, while the team sat together, a programmer threw out a bombshell: Maybe they should switch to Unreal? “The room got quiet for a moment,” Broussard recalled. Switching engines again seemed insane — it would cost another massive wad of money and require them to scrap much of the work they’d done.

But Broussard decided to make the change. Only weeks after he showed off Duke Nukem Forever, he stunned the gaming industry by announcing the shift to the Unreal engine. “It was effectively a reboot of the project in many respects,” Chris Hargrove, then one of the game’s programmers, told me (though he agreed with the decision). Broussard soon began pushing for even more and cooler game-building tools: He ripped out the ceiling of a room at the 3D Realms office to assemble a motion-capture lab, which would help his team in rendering “complex motions like strippers,” he noted on the 3D Realms Web site.

Broussard simply couldn’t tolerate the idea of Duke Nukem Forever coming out with anything other than the latest and greatest technology and awe-inspiring gameplay. He didn’t just want it to be good. It had to surpass every other game that had ever existed, the same way the original Duke Nukem 3D had.

But because the technology kept getting better, Broussard was on a treadmill. He’d see a new game with a flashy graphics technique and demand the effect be incorporated into Duke Nukem Forever. “One day George started pushing for snow levels,” recalls a developer who worked on Duke Nukem Forever for several years starting in 2000. Why? “He had seen The Thing” — a new game based on the horror movie of the same name, set in the snowbound Antarctic — “and he wanted it.” The staff developed a running joke: If a new title comes out, don’t let George see it. When the influential shoot-’em-up Half-Life debuted in 1998, it opened with a famously interactive narrative sequence in which the player begins his workday in a laboratory, overhearing a coworker’s conversation that slowly sets a mood of dread. The day after Broussard played it, an employee told me, the cofounder walked into the office saying, “Oh my God, we have to have that in Duke Nukem Forever.”

Broussard and Miller had spent $20 million of their own money on Duke Nukem Forever before they went hat in hand to Take-Two, their game publisher, to ask for $6 million to help finish the game. They didn’t get it.

Nor’easter View from Space

Tuesday, December 22nd, 2009

What does a Nor’easter look like from space? White. Lots of white.

The Mid-Atlantic states were completely white on Sunday, December 20, 2009, in the wake of a record-breaking snow storm. The storm deposited between 12 and 30 inches of snow in Virginia, Maryland, and Washington, D.C. on December 19, according to the National Weather Service. For many locations, the snowfall totals broke records for the most snow to fall in a single December day.

The Moderate Resolution Imaging Spectroradiometer (MODIS) on NASA’s Aqua satellite captured this view of the Chesapeake Bay region as the clouds were clearing on December 20. The snow highlights the courses of the Potomac and Susquehanna Rivers from the Appalachian Mountains to the Chesapeake Bay. The ridges and valleys of the Appalachian Mountains are similarly highlighted. The forested peaks are darker than the snow-covered valleys.

The massive snow storm was a Nor’easter, a powerful storm characterized by a strong low-pressure center that forms in the Gulf of Mexico or the Atlantic Ocean and moves northward up the Eastern seaboard. In the Northern Hemisphere, winds flow in toward the center of a low-pressure area in a counter-clockwise spiral, which means that as the storm heads north, the leading winds come in off the ocean from the northeast.
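
That last bit of geometry is easy to verify with a cross product. A minimal sketch (an idealized counter-clockwise vortex, nothing like a real storm’s wind field): at a point due north of the low’s center, i.e., ahead of a storm tracking up the coast, the air moves westward, and westward-moving air is a wind blowing from the east, off the ocean.

    import numpy as np

    # Idealized cyclonic (counter-clockwise) rotation in the Northern
    # Hemisphere: velocity = omega x r, with omega pointing up out of the map.
    omega = np.array([0.0, 0.0, 1.0])  # rotation axis
    r = np.array([0.0, 1.0, 0.0])      # a point due north of the center
                                       # (x = east, y = north)

    print(np.cross(omega, r))  # [-1.  0.  0.] : air moving due west,
                               # i.e., a wind FROM the east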

Do Balrogs Have Wings?

Monday, December 21st, 2009

Asking a question like “Do balrogs have wings?” is one way to recapture the feel of medieval theological disputation:

However much fan art depicts it, for the true Lord of the Rings fanboy, there’s only one definitive source: the book itself. What does Fellowship have to say for itself?

Exhibit D:

‘His enemy halted again, facing him, and the shadow about it reached out like two vast wings.’

The Fellowship of the Ring II 5 The Bridge of Khazad-dûm

Exhibit E:

‘…suddenly it drew itself up to a great height, and its wings were spread from wall to wall…’

The Fellowship of the Ring II 5 The Bridge of Khazad-dûm


The relevant Encyclopedia of Arda article continues:

These are quite probably the most hotly debated words Tolkien ever wrote. This seems strange at first, because in fact most people agree that the meaning isn’t particularly ambiguous, and that it’s fairly obvious what the statement means. The dispute begins, though, with a curious fact: like an optical illusion, this quotation has two obvious interpretations. Whatever you think it means, and however sure you are, there are plenty of people who see it quite differently.

The two interpretations:

To one group of readers, ‘its wings were spread from wall to wall’ (2) relates to the immediately preceding ‘the shadow about it reached out like two vast wings’ (1). To them, it just reinforces the preceding statement, and says nothing about any other kind of wings.

On the opposite side of the debate, ‘its wings were spread’ (2) is not related to the preceding statement at all. Instead, it’s a definite reference to the Balrog’s real, physical wings.

This is the heart of the debate. As Obi-Wan was fond of saying, Luke, you’re going to find that many of the truths we cling to depend greatly on our own point of view.

The Elves Leave Middle Earth – Sodas Are No Longer Free

Monday, December 21st, 2009

Steve Blank’s latest headline did its job and pulled me in — The Elves Leave Middle Earth – Sodas Are No Longer Free:

Last week as a favor to a friend, I sat in on a board meeting of a fairly successful 3½-year-old startup. Given all that could go wrong in this economy, they were doing well. Their business had just crossed cash-flow breakeven, had grown past 50 employees, had just raised a substantive follow-on round of financing, and had recently hired a Chief Financial Officer. It was an impressive performance.

Then the new CFO got up to give her presentation — all kind of expected: Sarbanes-Oxley compliance, a new accounting system, beefed-up IT and security, Section 409A (valuation) compliance, etc. Then she dropped the other shoe.

“Do you know how much our company is spending on free sodas and snacks?” And to answer her own question she presented the spreadsheet totaling it all up.

There were some experienced VCs in the room and I was waiting for them to “educate” her about startup culture. But my jaw dropped when the board agreed that the “free stuff” had to go.

“We’re too big for that now” was the shared opinion. But we’ll sell them soda “cheap.”

I had lived through this same conversation four times in my career, and each time it ended as an example of unintended consequences. No one on the board or the executive staff was trying to be stupid. But to save $10,000 or so, they unintentionally launched an exodus of their best engineers.

This company had grown from the founders, who hired an early team of superstars, many now managing their own teams. All these engineers were still heads-down, working their tails off, just as they had been doing since the first few months of the company. Too busy working, most were oblivious to the changes that success and growth had brought to the company.

One day the engineering team was clustered in the snack room looking at the soda machine. The sign said, “Soda now 50 cents.” The uproar began. Engineers started complaining about the price of the soda. Someone noticed that instead of the informal reimbursement system for dinners when they were working late, there was now a formal expense report system. Some had already been irritated when “professional” managers had been hired over their teams with reportedly more stock than the early engineers had. Lots of email was exchanged about “how things were changing for the worse.” A few engineers went to see the CEO.

But the damage had been done. The most talented and senior engineers looked up from their desks and noticed the company was no longer the one they loved. It had changed. And not in a way they were happy with.

The best engineers quietly put the word out that they were available, and in less than a month the best and the brightest began to drift away.

Femina Sapiens

Monday, December 21st, 2009

In the twentieth century, the big-brained female — femina sapiens — found herself living in an utterly reshaped habitat, Kay S. Hymowitz reminds us:

“Consider a typical woman born around 1900,” [Stefania Albanesi and Claudia Olivetti] write. “She married at 21 and gave birth to more than three live children between age 23 and 33. The high fetal mortality rate implied an even greater number of pregnancies, so that she would be pregnant for 36 percent of this time. Health risks in connection to pregnancy and childbirth were severe. Septicemia, toxaemia, hemorrhages and obstructed labor could lead to prolonged physical disability and, in the extreme, death.” It wasn’t just the Pill, then; antibiotics, blood banks, improvements in prenatal and obstetric care, and the mass production of safe baby formula fundamentally altered the human environment in ways that laid the foundation for contemporary women’s achievement.

Machinery invented by the brainy Homo sapiens also revolutionized the female lot. Until 1900, the vast majority of people in the Western world lived in conditions much like those in sub-Saharan Africa and parts of the Middle East today. Few had access to electricity; only about a quarter of all American households had running water. In this environment, American women did what women tied to their domiciles with three-plus children have always done: cooking, making and cleaning clothes, hauling water, and the like. But by the mid-twentieth century, human innovation had considerably lightened those essential household tasks. Using U.S. Census data, University of Montreal economist Emanuela Cardia has shown how home technology, including appliances and bathroom plumbing, played a significant role in moving women into the labor force.

Tycoon, Contractor, Soldier, Spy

Monday, December 21st, 2009

Erik Prince — tycoon, contractor, soldier, spy — founded Blackwater with a much more limited concept that it outgrew after 9/11:

Blackwater’s origins were humble, bordering on the primordial. The company took form in the dismal peat bogs of Moyock, North Carolina — not exactly a hotbed of the defense-contracting world.

In 1995, Prince’s father, Edgar, died of a heart attack (the Evangelical James C. Dobson, founder of the socially conservative Focus on the Family, delivered the eulogy at the funeral). Edgar Prince left behind a vibrant auto-parts manufacturing business in Holland, Michigan, with 4,500 employees and a line of products ranging from a lighted sun visor to a programmable garage-door opener. At the time, 25-year-old Erik was serving as a navy seal (he saw service in Haiti, the Middle East, and Bosnia), and neither he nor his sisters were in a position to take over the business. They sold Prince Automotive for $1.35 billion.

Erik Prince and some of his navy friends, it so happens, had been kicking around the idea of opening a full-service training compound to replace the usual patchwork of such facilities. In 1996, Prince took an honorable discharge and began buying up land in North Carolina. “The idea was not to be a defense contractor per se,” Prince says, touring the grounds of what looks and feels like a Disneyland for alpha males. “I just wanted a first-rate training facility for law enforcement, the military, and, in particular, the special-operations community.”

Business was slow. The navy seals came early — January 1998 — but they didn’t come often, and by the time the Blackwater Lodge and Training Center officially opened, that May, Prince’s friends and advisers thought he was throwing good money after bad. “A lot of people said, ‘This is a rich kid’s hunting lodge,’” Prince explains. “They could not figure out what I was doing.”

Today, the site is the flagship for a network of facilities that train some 30,000 attendees a year. Prince, who owns an unmanned, zeppelin-esque airship and spent $45 million to build a fleet of customized, bomb-proof armored personnel carriers, often commutes to the lodge by air, piloting a Cessna Caravan from his home in Virginia. The training center has a private landing strip. Its hangars shelter a petting zoo of aircraft: Bell 412 helicopters (used to tail or shuttle diplomats in Iraq), Black Hawk helicopters (currently being modified to accommodate the security requests of a Gulf State client), a Dash 8 airplane (the type that ferries troops in Afghanistan). Amid the 52 firing ranges are virtual villages designed for addressing every conceivable real-world threat: small town squares, littered with blown-up cars, are situated near railway crossings and maritime mock-ups. At one junction, swat teams fire handguns, sniper rifles, and shotguns; at another, police officers tear around the world’s longest tactical-driving track, dodging simulated roadside bombs.

In keeping with the company’s original name, the central complex, constructed of stone, glass, concrete, and logs, actually resembles a lodge, an REI store on steroids. Here and there are distinctive touches, such as door handles crafted from imitation gun barrels. Where other companies might have Us Weekly lying about the lobby, Blackwater has counterterror magazines with cover stories such as “How to Destroy Al Qaeda.”

In fact, it was al-Qaeda that put Blackwater on the map. In the aftermath of the group’s October 2000 bombing of the U.S.S. Cole, in Yemen, the navy turned to Prince, among others, for help in re-training its sailors to fend off attackers at close range. (To date, the company says, it has put some 125,000 navy personnel through its programs.) In addition to providing a cash infusion, the navy contract helped Blackwater build a database of retired military men—many of them special-forces veterans — who could be called upon to serve as instructors.

When al-Qaeda attacked the U.S. mainland on 9/11, Prince says, he was struck with the urge to either re-enlist or join the C.I.A. He says he actually applied. “I was rejected,” he admits, grinning at the irony of courting the very agency that would later woo him. “They said I didn’t have enough hard skills, enough time in the field.” Undeterred, he decided to turn his Rolodex into a roll call for what would in essence become a private army.

That Old College Lie

Monday, December 21st, 2009

Claiborne Pell — of Pell Grant fame — died at age 90 earlier this year:

What the encomiums to Pell failed to mention is that his grants have been, in all the ways that matter most, a failure. As any parent can tell you, colleges are increasingly unaffordable. Students are borrowing at record levels and loan default rates are rising. More and more low-income students are getting priced out of higher education altogether. The numbers are stark: When Pell grants were named for the senator in 1980, a typical public four-year university cost $2,551 annually. Pell Grants provided $1,750, almost 70 percent of the total. Even private colleges cost only about $5,600 back then. Low-income students could matriculate with little fear of financial hardship, as Pell intended. Over the next three decades, Congress poured vast sums into the program, increasing annual funding from $2 billion to nearly $20 billion. Yet today, Pell Grants cover only 33 percent of the cost of attending a public university. Why? Because prices have increased nearly 500 percent since 1980. Average private college costs, meanwhile, rose to over $34,000 per year.
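
Those figures are easy to check against one another. A quick back-of-the-envelope pass in Python, using nothing beyond the numbers the quote itself states:

    # 1980 figures from the article
    tuition_1980 = 2551  # typical public four-year university, $/yr
    pell_1980 = 1750     # Pell Grant, $/yr
    print(pell_1980 / tuition_1980)  # 0.686, i.e., "almost 70 percent"

    # "Prices have increased nearly 500 percent since 1980"
    tuition_2009 = tuition_1980 * (1 + 5.0)  # roughly $15,300
    # "Pell Grants cover only 33 percent of the cost"
    print(0.33 * tuition_2009)  # roughly $5,050, the grant level the
                                # article's two percentages jointly imply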

It’s all part of that old college lie, Kevin Carey says:

The average graduation rate at four-year colleges in the bottom half of the Barron’s taxonomy of admissions selectivity is only 45 percent. And that’s just the average: at scores of colleges, graduation rates are below 30 percent, and wide disparities persist for students of color. Along with community colleges, where only one in three students earns a degree, these low-performing institutions educate the large majority of Pell Grant recipients. Less than 40 percent of low-income students who start college get a degree of any kind within six years.

Are colleges just enforcing high academic standards? Hardly:

A 2006 study from the American Institutes for Research found that only 31 percent of adults with bachelor’s degrees are proficient in “prose literacy” — being able to compare and contrast two newspaper editorials, for example. More than a quarter have math skills so feeble that they can’t calculate the cost of ordering supplies from a catalogue.

America’s higher education has a reputation for being the best in the world, but this is driven by the high quality of a few prestigious institutions and their students. No one really knows how good most colleges are — how well they teach and how much their students learn:

The information deficit turns college into what economists call a “reputational good.” If you go to the store and buy a shirt, you can learn pretty much everything you need to know before you buy it: the material, where it was made, how to clean it, and so on. College is different. You’re paying up-front for professors you’ve never met and degree programs you probably haven’t even chosen yet. Instead, you rely on what other people think of the college. Of course, some students simply have to go to the college that’s nearest to them or least expensive. But if you have the luxury of choosing, in all likelihood, you choose based on reputation.

If college reputations were based on objective, publicly available measures of student learning, that would be okay. But they’re not, because no such measurements exist. Instead, reputations are largely based on wealth, admissions selectivity, price, and a generalized sense of fame that is highly influenced by who’s been around the longest and who produces the most research. Not coincidentally, these are the factors that drive the influential U.S. News & World Report rankings that always rate old, wealthy, renowned institutions like Harvard and Princeton as America’s best colleges.

The influence of reputation is exacerbated by the fact that most colleges are non-profit. For-profit institutions succeed by maximizing the difference between revenues and expenditures. While they have strong incentives to get more money, they also have strong incentives to spend less money, by operating in the most efficient manner possible. Non-profit colleges aren’t profit-maximizing; they are reputation-maximizing. And reputations are expensive to buy.

The economist Howard Bowen wrote the classic treatise on how reputation-seeking influences university behavior. He called it the “revenue-to-cost” phenomenon. Essentially, colleges don’t figure out how much money they need to spend and then go get it. Instead, they get as much money as they can and then spend it. Since reputations are relational — the goal is to be better than the other guy — there is no practical limit on how much colleges can spend in pursuit of self-glorification. As former Harvard President Derek Bok wrote, “Universities share one characteristic with compulsive gamblers and exiled royalty: There is never enough money to satisfy their desires.” Inevitably, much of that money comes from students.

The information deficit rewards and sustains these inclinations. In the absence of independent information about quality, consumers assume that price and quality are the same thing. At the trend-setting high end of the market, higher education has become a luxury good, the educational equivalent of a Prada shoe. These are unusually nice shoes, of course, just as Harvard is an unusually good university. But in both cases consumers aren’t paying for quality alone — they’re also paying extra for scarcity and a prominent brand name, the primary value of which is to signal to the rest of the world that they’re rich and connected enough to pay the price.

While most colleges aren’t in Harvard’s league and never will be, they pay attention to industry leaders. Luxury schools set standards for faculty salaries, student amenities, and other expensive things that ripple through the higher education sector as a whole. The status-seeking mindset is infectious. Colleges all want to become more important, and they all know how to get there — spend and charge more.

Indeed, they have little choice. Ten percent of the U.S. News rankings are based on spending per student, with additional points for high faculty salaries and other costly items. If an innovative college found a way to become more efficient and charge less while maintaining academic quality, its U.S. News ranking would actually go down.

Carey argues that publishing more data on college outcomes would result in a better market, but Arnold Kling notes that this presumes that the problem is on the supply side — that colleges want to hold back information:

The Masonomics view is that the problem is more on the demand side — the role that signaling plays in creating perceptions of value.

A point that I keep making about higher education is that it is, like the Harvard-Goldman filter, a form of recursive credentialism. To get certain jobs, you need certain credentials. And the most important credential of all is that you must signal your support for credentialism.

A commenter by the name of agnostic adds that supply and demand are both at work in the higher ed bubble:

The big problem on the supply side is that, just as with the recent finance bubble, managers of assets (the college officials who admit and oversee students) are paid according to volume of assets managed — more students means more tuition and more donations (and maybe more grants if those new students are from “disadvantaged” groups).

They are not paid according to ROI or anything like that. So it wouldn’t matter if we did what Carey says and publish more data like probability of flunking out, probability of graduating in 4 years, loss given flunking out, etc.

These incentives push sub-elite colleges to take in as many students as possible, just as banks took in whatever garbage they could get their hands on.

And as Charles Calomiris pointed out in the context of finance, the true demander of grade inflation, re-centering of the SAT, etc., is the buy side. If it were the sell side — students, their teachers, parents, etc. — every buyer (college) would know it was a joke and adjust the grades and scores they received accordingly.

Rather, the buy side wants students with inflated grades and test scores because sub-elite colleges need to pass similar regulatory hurdles to admit students — maybe not as formalized as financial reg rules, but still, you can’t admit a bunch of students whose average GPA and SAT is 1.0 and 900. Inflate them to 2.0 and 1000, and would-be regulators or castigators of college admissions boards now have less of a basis to complain.

Hell, the colleges even cherry-pick the best score you got on each sub-test of the SAT. If you take it more than once, only your best math score shows up and only your best verbal, even if on different tests. That’s a smoking gun that the buy side is driving grade inflation, not the sell side.
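
That cherry-picking is usually called superscoring, and the arithmetic of why it flatters the numbers is simple: a maximum taken per sub-test across sittings can only equal or beat the best single sitting. A toy illustration with invented scores:

    # Hypothetical SAT sittings, as (math, verbal) pairs
    sittings = [(640, 710), (700, 650)]

    best_single_sitting = max(m + v for m, v in sittings)  # 1350
    superscore = (max(m for m, v in sittings) +
                  max(v for m, v in sittings))             # 1410

    # The superscore can never be lower than any single sitting, so
    # reporting it systematically inflates the stats colleges show.
    assert superscore >= best_single_sitting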

Five Laws of Human Nature

Monday, December 21st, 2009

Michael Marshall shares five laws of human nature — including Parkinson’s law and Student syndrome — with various sub-laws:

Parkinson’s law

Civil servant, historian and theorist Cyril Northcote Parkinson suggested in a 1955 article that work expands to fill the time available for its completion, backed up with statistical evidence drawn from his historical research. More recent mathematical analyses have lent support to the idea.

Parkinson also came up with the “law of triviality”, which states that the amount of time an organisation spends discussing an issue is inversely proportional to its importance. He argued that nobody dares to expound on important issues in case they’re wrong, but everyone is happy to opine at length about the trivial.

This in turn may be a result of Sayre’s law, which states that in any dispute, the intensity of feeling is inversely proportional to the value of the stakes at issue.

Parkinson also proposed a coefficient of inefficiency, which attempts to define the maximum size a committee can reach before it becomes unable to make decisions. His suggestion that it lay “somewhere between 19.9 and 22.4” has stood the test of time: more recent research suggests that committees cannot include many more than 20 members before becoming utterly hapless.
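
Taken at face value, the law of triviality is an inverse-proportionality claim (debate time proportional to 1/importance), which makes it easy to caricature in a few lines of Python. A toy model, with an invented constant and invented importance scores; the bike shed and the reactor are Parkinson’s own examples:

    # Law of triviality as a toy model: minutes of debate ~ K / importance
    K = 100  # arbitrary constant

    agenda = {
        "siting the nuclear reactor": 10.0,  # importance, made-up scale
        "staff parking policy": 2.0,
        "color of the bike shed": 0.5,
    }

    for item, importance in agenda.items():
        print(f"{item}: {K / importance:.0f} minutes of debate")

    # siting the nuclear reactor: 10 minutes
    # staff parking policy: 50 minutes
    # color of the bike shed: 200 minutes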

Student syndrome

“If it weren’t for the last minute, I wouldn’t get anything done.” So said an anonymous wit, and none but the most ferociously well-organised can disagree.

In fact, procrastination is a major problem for some people, especially those who are easily distracted or are uncertain of their ability to complete a task.

One of the best-known examples of vigorous procrastination is student syndrome. As anyone who has ever been (or known) a student will know, it is standard practice to apply yourself to a task only at the last possible moment before the deadline.

Student syndrome is so common that some experts in project management recommend not assigning long periods of time to particular tasks, because the people who are supposed to do them will simply wait until just before the deadline to start work, and the project will overrun anyway (International Journal of Project Management, vol 18, p 173).

Some of the blame for student syndrome may be laid at the feet of the planning fallacy: the tendency for people to underestimate how long it will take to do something.
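
Put student syndrome and the planning fallacy together and a toy simulation shows why the project managers cited above say that generous per-task allotments don’t help: the start date slides to absorb whatever buffer you add. All the numbers below are invented for illustration:

    import random

    random.seed(0)

    def finish_time(allotted_days, estimated_work=5.0):
        """Student syndrome: start only when the estimated work just
        fits before the deadline. Planning fallacy: the actual work
        runs 0-80% over the estimate."""
        start = allotted_days - estimated_work
        actual_work = estimated_work * random.uniform(1.0, 1.8)
        return start + actual_work

    for allotted in (5, 10, 20):  # ever more generous allotments
        overruns = [finish_time(allotted) - allotted for _ in range(10_000)]
        print(f"allotted {allotted:2d} days: average overrun "
              f"{sum(overruns) / len(overruns):.2f} days")

    # The average overrun (about 2 days here) is the same no matter how
    # much time is allotted; the buffer is simply procrastinated away.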

If you often get caught out by how long things take, we recommend considering Hofstadter’s law, coined by the cognitive scientist Douglas Hofstadter: “It always takes longer than you expect, even when you take into account Hofstadter’s law.”