For Whom the Bell Dengs, It Dengs for Lee

Sunday, October 25th, 2009

Ask not for whom the bell Dengs, it Dengs for Lee, says Joseph Fouche:

Charlie Rose interviewed the former Prime Minister of Singapore Lee Kuan Yew the other night. Rose asked Lee who, out of all the world leaders he had met, he most admired. Lee answered that he admired Deng Xiaoping for his adaptability. Lee related an anecdote about Deng’s first visit to Singapore in 1978. Deng was surprised by Singapore’s prosperity, which his brief had not adequately covered. Deng asked Lee how he had made Singapore so prosperous. Lee replied that they had attracted foreign direct investment due to Singapore’s cheap (at that time) labor costs. They then became subcontractors, then contractors, then competitors, learning as they went. Deng observed that Lee had created an egalitarian society using capitalism, an observation Lee seconded. Deng then went back to China and, Lee implies, applied the Singapore model to China with the side effects the world has experienced since.

If this implied influence on Deng is accurate, it makes Lee one of the most influential figures of the twentieth century, all from a little dot south of the Malay Peninsula. Lee is often considered the most effective authoritarian of the late twentieth century, the sort of man who, if such men could be produced on demand, would doom democracy.

Charlie Rose has interviewed Lee Kuan Yew three times. The most recent interview is not yet online, but the 2004 and 2000 interviews are.

It’s Complicated

Saturday, October 24th, 2009

The Flynn effect (the observation that IQs have been steadily rising, enough that the tests must be regularly recalibrated) likely stems from the fact that life’s more complicated now:

In England 40 years ago, my father was paid, in cash, every Thursday, and was broke by the following Wednesday. He had a quarterly gas bill and a quarterly electricity bill. He paid weekly rent on a property owned by the town. Since he did not believe in life insurance, own a bank account or invest in the stock market, that was the entire extent of his financial concerns.

He read one newspaper, Cecil King’s Daily Mirror. He had two TV channels available to him, both of course black and white. He owned one suit, and I think no more than three sets of underwear. My wife, growing up in mainland China in the 1960s, had an even more spare existence. She had just one toy, which of course she adored.

Now look at us. I have just spent three days doing my income taxes. My financial affairs — the affairs of a modest working family — occupy an entire drawer in a set of filing cabinets. (Filing cabinets! In my house!) Never mind a generation: just in the past eight years I have gone from having one telephone bill to having five: one for a wireless service and two for fixed lines — each of which, for reasons I cannot be bothered to understand, is served by both Verizon and AT&T.

With the help of the Internet I read, or at least skim through, about twenty newspapers or news-websites every morning, ranging from the Wall Street Journal to the Taipei Times. My house contains four working computers. My kids’ bedrooms are silted up with toys, to which they pay little attention. When we take them to McDonald’s, their place-mats are decked out with puzzles, mazes and word games. A stimulating environment? You could say so.

Bootstrapping Complexity

Saturday, October 24th, 2009

Andreas Lloyd liked the ideas Kevin Kelly wrote about in Out of Control, but he found Kelly’s presentation frustrating — so he remixed the book into a new, shorter volume, Bootstrapping Complexity:

Remixing is perhaps too strong a word because he mostly just dropped entire chapters, with a little re-arranging here and there. It is a very sharp but intelligent edit. But the effect is striking. Instead of a rambling book about one dozen things, Lloyd’s remix of my book focuses it on the cybernetic and feedback aspects of the systems I was reporting on in the early 1990s. I suggested this focus needed a better title than Out of Control, which I never was happy with anyhow, so Lloyd came up with a new one for this version of the book. He calls it Bootstrapping Complexity.

Three Prevalent Views of Human Nature

Friday, October 23rd, 2009

John Derbyshire offers up three prevalent views of human nature, in chronological order by origin:

The “Abrahamic” view is the one promoted by the big old Western faiths: Judaism, Christianity, and Islam. The Darwinian view is the one implied (though not dispositively proved) by Darwinism. The third view I have labeled “Boasian” after anthropologist Franz Boas, who was the first to use it as a basis for a comprehensive modern account of human nature.
Abrahamic: Our species Homo sapiens is the special creation of God, either as a one-off miracle or by God-guided evolution. Human nature is a mix of attributes, some biological, some inserted by God. The God-given attributes are unique to our species. They are the same in all human populations, forming the foundation of our essential equality. Their existence is independent of our biological nature, even to the degree that they can continue to exist after our deaths. Being non-biological, they certainly do not evolve, even if other features of the living world do, so that our evolution, if it ever took place, ended (except perhaps for some incidental biological features) when God decreed we have these attributes. God rules!

Darwinian: Our species Homo sapiens arose, like all other species, from the ordinary processes of evolution, which have continued to the present day. Human nature is a collection of characteristics all susceptible to biological explanation. These characteristics show variation in any one population. A human population that breeds mostly within itself for many generations will develop distinctive profiles of variation, as a result of ordinary biological laws, causing it to diverge from other such populations. Neither individual human beings nor human populations are equal. Some human-nature characteristics can be shaped to some degree by “cultural” (i.e. social or environmental) forces; some cannot. Biology rules!

Boasian: Our species Homo sapiens arose, like all other species, from the ordinary processes of evolution. However, these processes ceased in the very early history of the species, leaving us with a human nature uniform across all populations and unchanging over time, forming the foundation of our essential equality. This human nature is infinitely resilient, like a water-filled balloon. Any of its characteristics can be pushed into almost any shape by “cultural” forces (see above), but will submit to radical re-shaping if different forces are applied. Observed variations in human-nature characteristics have probably (in the case of individuals) and certainly (in the case of populations) no biological foundation. Culture rules!

A thing you notice when these three views of human nature are lined up is how far the Darwinian explanation stands from the other two. I have worked my phrasing somewhat to bring this out, but it wasn’t difficult to do so. A Darwinian view of human nature really is quite sensationally revolutionary. In particular, it makes a hash of intrinsic human equality. We may of course — and we should, and I hope we ever shall! — hold equal treatment under the law to be an organizing principle of our civilization; but that is a social agreement, like driving on the right, not a pre-existing fact in the world.

We might even speculate that the Abrahamic and the Boasian views are really the same, or that the second is a scientistic nineteenth-century derivative of the first, as Marxism was of traditional religious millenarianism. As the authors of math textbooks say: I leave this as an exercise for the reader.

The White City

Friday, October 23rd, 2009

The progressive urbanist role-model is not a red city or even a blue city, but a white city:

Among the media, academia and within planning circles, there’s a generally standing answer to the question of what cities are the best, the most progressive and best role models for small and mid-sized cities. The standard list includes Portland, Seattle, Austin, Minneapolis, and Denver. In particular, Portland is held up as a paradigm, with its urban growth boundary, extensive transit system, excellent cycling culture, and a pro-density policy. These cities are frequently contrasted with those of the Rust Belt and South, which are found wanting, often even by locals, as “cool” urban places.

But look closely at these exemplars and a curious fact emerges. If you take away the dominant Tier One cities like New York, Chicago and Los Angeles you will find that the “progressive” cities aren’t red or blue, but another color entirely: white.

In fact, not one of these “progressive” cities even reaches the national average for African American percentage population in its core county. Perhaps not progressiveness but whiteness is the defining characteristic of the group.
[...]
As the college educated flock to these progressive El Dorados, many factors are cited as reasons: transit systems, density, bike lanes, walkable communities, robust art and cultural scenes. But another way to look at it is simply as White Flight writ large. Why move to the suburbs of your stodgy Midwest city to escape African Americans and get criticized for it when you can move to Portland and actually be praised as progressive, urban and hip? Many of the policies of Portland are not that dissimilar from those of upscale suburbs in their effects. Urban growth boundaries and other mechanisms raise land prices and render housing less affordable exactly the same as large lot zoning and building codes that mandate brick and other expensive materials do. They both contribute to reducing housing affordability for historically disadvantaged communities. Just like the most exclusive suburbs.
[...]
In fact, lack of ethnic diversity may have much to do with what allows these places to be “progressive”. It’s easy to have Scandinavian policies if you have Scandinavian demographics.

(Hat tip to Steve Sailer.)

The Cold War Never Ended

Friday, October 23rd, 2009

It may have ended 20 years ago in the then-Soviet Union, but the Cold War never ended here in the West:

The countries held captive by Moscow began their long road to economic and cultural recovery, and to reunification with liberal Europe. But in the West, where Cold War divisions defined politics and society for 40 years, the moment was not greeted as a welcome opportunity for intellectual reconciliation, for fact-checking decades of exaggerations and misperceptions. Instead, then as now, despite the overwhelming volume of new data and the exhilaration of hundreds of millions finding freedom, the battle to control the Cold War narrative raged on unabated. Reagan haters and Reagan hagiographers, Sovietophiles and anti-communists, isolationists and Atlanticists, digested this massive moment in history, then carried on as if nothing much had changed. A new flurry of books timed to coincide with the 20th anniversary of communism’s collapse reinforces the point that the Cold War will never truly be settled by the side that won.

It is bizarre to revisit pre-1989 journalism and punditry on Soviet communism. The suffering of the bit players, those pitiable citizens stranded behind the Iron Curtain, was largely ignored in favor of larger political goals. If Ronald Reagan believed the Kremlin to be the beating heart of an “evil empire,” many of his angriest critics believed, then Moscow couldn’t be all bad. Writing in The Nation in 1984, historian Stephen F. Cohen hissed that, in a perfect world, “fairness would not allow us to defame a nation that has suffered and achieved so much.”

Although uniformly anti-Soviet, some conservatives too were guilty of a Cold War–induced moral blindness, defending authoritarian governments in Africa, Latin America, Asia, and Iberia as bulwarks against communist expansion. Columnist Pat Buchanan celebrated the authoritarian leaders Augusto Pinochet of Chile and Francisco Franco of Spain as “soldier-patriots” and referred quaintly to the racist regime in South Africa as the “Boer Republic.” Others accused America’s most anti-Soviet president of impuissance. As early as 1983, neoconservative writer Norman Podhoretz proclaimed that Reagan’s policies toward the Soviet Union amounted to “appeasement by any other name.”

When the whole rotten experiment suddenly failed, eventually bringing to an end not just Moscow’s Warsaw Pact client governments but the proxy civil wars it fought in the Third World, instead of engaging in overdue self-criticism many commentators clung to shopworn shibboleths. In 1990 the academic Peter Marcuse, also writing in The Nation, bizarrely claimed that East Germany “had never sent dissidents to gulags and rarely to jail” and expressed outrage that the “goal of the German authorities is the simple integration of East into West without reflection,” instead of heeding the pleas of the intellectual class who were at work on a more humane, less Russian brand of socialism.

The weeks and months following the fall of the Wall saw relentless worries, from left and right, about the corrosive influence of Western capitalism, consumerism, and commercial television on the untainted comrades of the Ost. The “prospect of rampant consumerism,” CBS News reported in July 1990, “has East Germany’s newly elected Christian Democratic Prime Minister, Lothar de Maizière, worried.” By 1993 Ukrainian National Self Defense, a right-wing populist movement that loathed Russian power, was rallying against the “Americanization of Ukraine through Coca-Cola culture.” Even the famously anti-communist Pope John Paul II warned that “the Western countries run the risk of seeing this collapse of Communism as a one-sided victory of their own economic system, and thereby failing to make necessary corrections in that system.”

When the “shock” of capitalism didn’t jump-start the moribund economies of the East within a calendar year, many in the Western news media declared the entire project dead on arrival. In 1990 ABC Evening News told viewers that East Germany was already a “victim of an overdose of capitalism.” In Southeast Poland, CBS reported, “the transition from communism to capitalism is making more people more miserable every day.” Every new election, even in firmly Western-oriented countries such as Hungary and Poland, was greeted with scare stories about backsliding into communism, lurching into neo-Nazism, or both. Even some of the early 20th-anniversary retrospectives last summer trotted out the same familiar story lines, exponential gains in freedom and prosperity notwithstanding.

With the proliferation of “Old Hopes Replaced by New Fears” stories, the long-running intellectual battle over the Cold War retreated into the halls of academia, where the newly (and, it turned out, briefly) opened Soviet archives further undermined the accepted narratives about Alger Hiss, Julius and Ethel Rosenberg, I.F. Stone, and scores of other causes célèbres of the anti-anti-communists. Western intellectuals were more interested in Francis Fukuyama’s contention that we were witnessing “the end of history” than in who was most responsible for bringing that history to an alleged close.

Quarterbacks

Thursday, October 22nd, 2009

Rush Limbaugh infamously argued that the media was overrating black quarterbacks for political reasons — and it’s hard to say he was wrong:

Six years later, 2009 is turning out to be a bust for black quarterbacks in the NFL. Not a single one is having a good season.

Seven of the 36 most active quarterbacks are black. David Garrard is probably doing best so far: on Sunday, he got Jacksonville back to .500, but he’s only #20 in passer rating.

On Sunday, Jason Campbell got benched at halftime by the Redskins. Former #1 draft pick JaMarcus Russell did win a game for Oakland, by beating Donovan McNabb 13-9. Seneca Wallace is back on the bench in Seattle. In Tampa Bay, Byron Leftwich has been replaced by young Josh Johnson, who is 32nd in passer rating.

With 140 yards rushing in six games, Garrard is the only black quarterback with at least 100 yards on the ground 30% of the way into the season.

Meanwhile, white quarterbacks are having a great year, with seven posting passer ratings over 100, versus only one at the end of last year, although presumably top-end ratings will come down as sample sizes increase and the weather worsens.

You could argue that black quarterbacks did better 20 years ago in 1989, when Warren Moon finished 4th in passer rating and Randall Cunningham 14th.

Overall, it looks like the first half of this decade, 2000-2004, was the peak for black quarterbacks in the NFL, while 2005-2009 has marked a surprising regression.

Steve Sailer was not expecting that. In fact, he was expecting the crusty old white coaches to learn to work with gifted young black quarterbacks — as in Any Given Sunday:

What happened?

Well, I don’t watch enough football to have much of an opinion, but here’s a hypothesis. When my older kid played football in a league for 9- and 10-year-olds, the coach came out into the huddle and called plays and the teams usually took about two minutes between plays to get themselves organized. Football is just really complicated. The best team in the league just simplified matters by putting their best athlete, a black kid, at quarterback and letting him do whatever he wanted with the ball.

Similarly, whenever my younger kid got roped into playing Madden, a game he never paid much attention to, he’d always pick Michael Vick as his quarterback and just have him run around with the ball, because that was a lot easier than trying to have a quarterback throw to receivers running routes.

From that perspective, all this “future of football” stuff about quarterbacks who can run is backwards: having one player Do It All isn’t the future of football, it’s the past. You can’t stop a great athlete in PeeWee Football, but you can in the NFL. They apply a lot of brainpower to the problem of stopping one man.

No, the future of football is like the present in the NFL, just more so: having all eleven players execute in tandem ever more sophisticated schemes.

Part of the problem is that getting a mobile black quarterback became a quick fix for having a lousy offense. Is your offensive line so porous that a 30-year-old white guy would get killed? Put a fast young black guy in at quarterback and let him outrun the defenders. At minimum, it will excite your fans.

After a few years of this, maybe you’ve finally fixed your offensive line, but now your fast black quarterback is banged up and isn’t quite as fast anymore, but he’s been confirmed in his instinct to take off with the ball and run rather than to step up into the pocket.

This happens at lower levels, too. If you are a high school or college coach, why try to train a fast black quarterback to be an NFL pocket passer when you can win now by just letting him freelance?

In contrast, the white sideline dads of America with tall, strong sons have given up on basketball, and they don’t trust their coaches to take the long view of their sons’ potential. So, they are paying out of their own pockets to hire personal quarterbacking tutors. (When USC played Notre Dame this weekend, Matt Barkley of USC said he’d known Jimmy Clausen of Notre Dame for years because they have the same off-season quarterback coach, Steve Clarkson.)
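One statistical footnote to Sailer’s aside that top-end passer ratings will come down as sample sizes increase: that is ordinary regression toward the mean, and a minimal simulation makes it concrete. All the numbers below (the spread of true quarterback skill, the game-to-game noise, the cutoff of 100) are invented for illustration, not real NFL data:

    import random

    random.seed(0)

    TRUE_MEAN, TRUE_SD = 85.0, 8.0   # invented spread of QBs' true passer ratings
    GAME_NOISE = 20.0                # invented game-to-game noise in one game's rating

    def qbs_over_100(n_games, n_qbs=32):
        """Count QBs whose average observed rating tops 100 after n_games."""
        count = 0
        for _ in range(n_qbs):
            skill = random.gauss(TRUE_MEAN, TRUE_SD)
            season = [random.gauss(skill, GAME_NOISE) for _ in range(n_games)]
            count += sum(season) / n_games > 100
        return count

    seasons = 1000
    for n_games in (6, 16):
        avg = sum(qbs_over_100(n_games) for _ in range(seasons)) / seasons
        print(f"after {n_games:2d} games: {avg:.1f} of 32 QBs over a 100 rating, on average")

Averaging more games shrinks the luck component, so the crowd above 100 thins out by season’s end even though nobody’s underlying skill changed.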

We’d really rather just not think about it

Thursday, October 22nd, 2009

We’d really rather just not think about it, John Derbyshire says, referring to racial inequality:

Fifty years ago it all seemed cut and dried. Just strike down old unjust laws, give the minority a helping hand, give the non-minority some education about civil rights and past disgraces, and in a few years things will come right.

We coasted along under those assumptions for a generation. When it became obvious that things were not coming right in the matter of equal test results, scholars and jurists got to work on the problem.

Liberals [...] naturally assumed it was just a matter of spending more money on schools. This theory was tested to destruction in several places, most sensationally in Kansas City from 1985–97. Under a judge’s order, the school district spent two billion dollars over twelve years, pretty much rebuilding the school system — and the actual schools themselves — from the ground up. The new, lavish facilities included “an Olympic-sized swimming pool with an underwater viewing room, television and animation studios, a robotics lab, a 25-acre wildlife sanctuary, a zoo, a model United Nations with simultaneous translation capability, and field trips to Mexico and Senegal.” The experiment was a complete failure. Drop-out rates rose and test scores fell across the entire twelve years.

Conservatives, thoroughly race-whipped by the liberal media elites, preferred to go along with whatever liberals said, except that they made, and still make, mild throat-clearing noises about school vouchers. It has turned out in practice, however, that the only people keen on school vouchers are the striving poor, a small (and dwindling) demographic with no political weight, and whom nobody in the media or academic elites gives a fig about. The non-striving underclass has zero interest in education; middle-class suburbanites like their schools the way they are, thanks all the same; and teacher unions see vouchers as threats to the public-education gravy train their members ride to well-padded retirement.

As test gaps persisted and lawsuits multiplied, the scholars retreated into metaphysics. The word “culture” was wafted around a lot. It seemed to denote a sort of phlogiston or luminiferous aether, pervading and determining everything, but via mechanisms nobody could explain. We heard about self-esteem issues, “the burden of ‘acting white’,” “stereotype threat,” and a whole raft of other sunbeams-from-cucumbers hypotheses. Stephan and Abigail Thernstrom, two distinguished scholars in the field, produced a much-praised book about test-score gaps with a conclusion in which nothing was concluded. “Choice [of where to live] should not be a class-based privilege.” Where, in a free society, has it ever not been? How will you stop people moving, if they can afford to? “Families must help their children to the best of their ability.” Oh. “Vouchers are a matter of basic equity.” See above. “Big-city superintendents and principals operate in a bureaucratic and political straitjacket.” True, no doubt; but New Haven, pop. 124,000, is not a big city. Test-score gaps are in plain sight out here in the ‘burbs. John Ogbu wrote a book about it. Six years ago.

And the test-score gaps just sat there, and sat there, and sat there, grinning back at us impudently.

At last, we just stopped thinking about the whole disagreeable business. Unfortunately, by that time a great body of law had been built on the theories and pseudotheories of the preceding decades, and couldn’t be wished away. Hence Ricci v. DeStefano.

Getting it Wrong

Thursday, October 22nd, 2009

Apparently “educators” have been pushing “errorless learning” for years:

For example, a classroom teacher might drill students repeatedly on the same multiplication problem, with very little delay between the first and second presentations of the problem, ensuring that the student gets the answer correct each time.

The idea embedded in this approach is that if students make errors, they will learn the errors and be prevented (or slowed) in learning the correct information.

Now peer-reviewed research reveals that getting it wrong improves learning:

People remember things better, longer, if they are given very challenging tests on the material, tests at which they are bound to fail. In a series of experiments, the researchers showed that if students make an unsuccessful attempt to retrieve information before receiving an answer, they remember the information better than in a control condition in which they simply study the information. Trying and failing to retrieve the answer is actually helpful to learning.
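The practical upshot is easy to sketch: a drill should demand a retrieval attempt before it reveals the answer, then recycle the misses. A minimal flashcard loop along those lines (the cards are invented examples, and this is only a sketch of the idea, not the researchers’ procedure):

    # Minimal retrieval-practice drill: force an attempt, then give feedback.
    cards = [
        ("7 x 8", "56"),
        ("capital of Australia", "Canberra"),
        ("9 x 6", "54"),
    ]

    def drill(cards):
        """One pass over the deck; returns the cards that were missed."""
        missed = []
        for prompt, answer in cards:
            guess = input(f"{prompt}? ").strip()      # retrieval attempt comes first
            if guess == answer:
                print("correct")
            else:
                print(f"no, the answer is {answer}")  # feedback only after the attempt
                missed.append((prompt, answer))
        return missed

    remaining = cards
    while remaining:                                  # recycle misses until all correct
        remaining = drill(remaining)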

Study of baby teeth yields new findings on nuclear fallout

Thursday, October 22nd, 2009

Study of baby teeth yields new findings on nuclear fallout:

The new research was spurred by the 2001 reappearance of 85,000 teeth that had been donated for the 1960s study, which was conducted by Washington University scientists. The teeth were found in an old bunker at the university’s Tyson Research Center where they had been stuffed into envelopes that included information about the donors, one of whom was Edward Ketterer.

“The toll from bomb fallout is probably far greater than prior estimates,” says Joseph Mangano, the lead study author and director of the Radiation and Public Health Project. “Because 40 percent of Americans will be diagnosed with cancer in their lifetime, it is crucial to understand causes such as bomb fallout, so actions to prevent the disease can be taken.”

Edward Ketterer’s contribution proved to be crucial to the new study. That’s because he is one of the 77 male donors diagnosed with cancer who served as case studies.

He passed away in 2006 at the age of 47, just a year after being diagnosed with invasive transitional cell carcinoma. His parents believe his exposure to nuclear fallout may have contributed to his death.

“His doctor always called him his mystery patient because no one understood how he ended up with this cancer, which is a very ugly, unpredictable kind of cancer,” said Joan Ketterer, a retired nurse.

The study is a spin-off of the St. Louis Tooth Survey in which more than 300,000 kids sent their teeth to the Greater St. Louis Citizens Committee for Nuclear Information. Washington University scientists analyzed most of the teeth for strontium-90, which was created by the bomb blasts and absorbed by the teeth and bones of infants.

They suspected that the children were exposed by drinking milk from cows and goats that grazed on grass contaminated by fallout. They called it the “milk pathway.”

The study concluded that St. Louis children born in 1964 had about 50 times more strontium-90 in their baby teeth than those born in 1950, before the start of atomic testing in Nevada.

(Hat tip to Nyrath.)

Something Rotten

Thursday, October 22nd, 2009

Zdeno describes something rotten:

For almost half a decade, my life was a Johnny Cash song. I would drink to the point of blacking out four nights a week, sleep past noon every day, and devote most of my waking hours to chasing loose women and an altered state of mind. I exaggerate only slightly when I say that I accomplished, learned and produced nothing of value throughout this entire dark age of my life.

Was I a bum? A liquor-soaked storefront panhandler? A toothless vagrant, shuffling up and down the streets of Baltimore, peddling handjobs for crack-cocaine?

Not quite. I was a student at one of our continent’s better Universities. And my experience was hardly unique. If I learned one thing over those years, it’s that the modern University is anything but an institution of higher learning, and trust me: Unless you are still inside the beast, or so fresh from the rear of her digestive system that the smell still lingers, you do not fully understand how completely and utterly ridiculous the contemporary higher-education system has become.

Universities do have their good points — we fill them with our best and brightest, after all — but consider the bad:

While pockets of practical, truth-seeking scholarship still remain — engineering, the hard sciences, perhaps a few nooks and crannies in business and economics — the majority of students are studying the 21st century equivalents of Chrysopoeia, Alectormancy and Theodicy. Some of the system’s worst excesses have been culled in the past decade or two, as truth has a way of seeping in the cracks of even the most impressive edifices of falsity, but new methods of waging war against truth and clear thinking are being dreamed up every day. You’ll notice, for example, that no one actually lost their job over the Sokal Affair.

Boxing Day

Wednesday, October 21st, 2009

John Derbyshire wouldn’t call himself a great boxing fan, but enrolling his 9-year-old in Fitness Through Boxing reminded him of his own long-gone glory days:

I had a brief moment of glory at age thirteen when the gym teacher at our boys-only school organized a boxing tournament, with a ring set up in the school auditorium. Though a fundamentally unathletic kid, I was going through a growth spurt, and, as often happens, different parts were growing at different speeds. The part of me that was growing fastest at this particular moment in time was my arms. I looked like a gibbon.

At our low skill level this gave me a great advantage. With decent wind and some grasp of basic technique, I could hold off any opponent till he tired enough to give me an easy opening. I won all my bouts.

The glory didn’t last long — does it ever? The gym teacher left that year, his successors had no interest in boxing, and society soon passed into a zone where the idea of thirteen-year-old boys punching each other’s faces for educational purposes became as unthinkable as the dense fug of tobacco smoke in our school’s staff room.

John and his son both like the boxing gym:

There is an agreeable and good-humored atmosphere in a boxing gym that cannot but be healthful for a growing boy to inhale. Robert A. Heinlein famously remarked that “an armed society is a polite society.” Well, a trained fighter is always armed. It is an odd paradox of human nature, seen in sergeants’ messes as well as boxing gyms, that there is never more ease of manner, concentration on mastering tasks and skills, and warm fellowship among men than when they have come together in a group to perform lawful acts of physical violence.

It is of course an open question how much longer boxing will be lawful in our feminized, lawyered-up society. Rob makes his customers sign a sheaf of waivers before they can put on the gloves. For a while longer yet, though, a boy can still come to a place like this and learn how to take on others in physical combat with skill, courage, and discipline, as men have done for longer than time itself.

Amen.

Pandora

Wednesday, October 21st, 2009

A few years ago, Steve Sailer tried out Pandora, the “Music Genome Project” for Internet radio — which does not recommend songs based on shared tastes but rather relies on experts’ assessments of songs across 250 factors — and found that it worked pretty well:

But one response was off: I put in Revolution Rock by the Clash, which isn’t a rock song at all, but a lazy, joyous reggae ramble. Pandora came back with the punk Career Opportunities by the Clash, which suggests that one of their employees had cut corners and categorized Revolution Rock by title rather than by music.
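The failure mode makes sense if you assume the recommendation step is nearest-neighbor matching over those hand-scored attributes. (That assumption, the three attributes, and every score below are invented for illustration; Pandora’s actual matching logic is proprietary, and the real genome runs to hundreds of attributes.) Scored honestly, Revolution Rock should pull reggae neighbors, not punk:

    import math

    # Invented stand-in for Music Genome scores, three attributes instead of hundreds.
    songs = {
        "Revolution Rock (Clash)":      {"distortion": 0.2, "tempo": 0.4, "reggae": 0.9},
        "Career Opportunities (Clash)": {"distortion": 0.9, "tempo": 0.9, "reggae": 0.0},
        "One Love (Bob Marley)":        {"distortion": 0.1, "tempo": 0.4, "reggae": 1.0},
    }

    def similarity(a, b):
        """Cosine similarity between two attribute dicts with the same keys."""
        dot = sum(a[k] * b[k] for k in a)
        na = math.sqrt(sum(v * v for v in a.values()))
        nb = math.sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb)

    seed = "Revolution Rock (Clash)"
    for title in sorted((t for t in songs if t != seed),
                        key=lambda t: -similarity(songs[seed], songs[t])):
        print(f"{similarity(songs[seed], songs[title]):.2f}  {title}")

An employee who scored Revolution Rock off its title rather than its sound would, in effect, give it the punk row’s numbers, and every match would come back punk, which is exactly what Sailer saw.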

Anyway, a recent New York Times Magazine piece shares this anecdote:

[Pandora founder Tim Westergren] likes to tell a story about a Pandora user who wrote in to complain that he started a station based on the music of Sarah McLachlan, and the service served up a Celine Dion song. “I wrote back and said, ‘Was the music just wrong?’ Because we sometimes have data errors,” he recounts. “He said, ‘Well, no, it was the right sort of thing — but it was Celine Dion.’ I said, ‘Well, was it the set, did it not flow in the set?’ He said, ‘No, it kind of worked — but it’s Celine Dion.’ We had a couple more back-and-forths, and finally his last e-mail to me was: ‘Oh, my God, I like Celine Dion.’ ”

This anecdote almost always gets a laugh. “Pandora,” he pointed out, “doesn’t understand why that’s funny.”

A basic problem that you can’t get around in Pandora:

If you like a song not so much because of its style but because it’s an expert execution of that style, then Pandora isn’t as useful as a recommendation site based on shared tastes.

Pandora performs a sort of factor analysis on your musical tastes, Sailer notes — although he layers quite a bit of his own “insight” on top of it:

Listening to these songs that I picked out a few years ago plus other ones similar to them, I would say I have post-British Empire upper middle class public schoolboy tastes in music. This may seem odd, but my tastes in songs would seem most natural for a Scottish or northern English lad at a southern English boarding school for toffs, or maybe at Sandhurst, the military academy.
[...]
Very strange, but it also fits a lot of my taste in authors (Waugh, Orwell, Wodehouse, etc.). I now remember how much I liked the autobiography of David Niven, who was a Sandhurst grad. And the autobiography of Churchill, another public schoolboy/Sandhurst man.

So, it’s no surprise that The Clash were always my favorites. After all, Joe Strummer, despite his appalling teeth, was an upper middle class public schoolboy whose dad, a friend of Kim Philby’s, was a diplomat (i.e., spy) for the fading British Empire.

You could use Pandora’s database for scholarly purposes, he suggests:

For example, T.S. Eliot pointed out that an artist creates his own “school” of predecessors that nobody noticed had anything in common before. I’ve always felt that the ancestors of the punk rock of 1976 included, from the 1968–1973 era, Communication Breakdown by Led Zeppelin, Paranoid by Black Sabbath, and Saturday Night’s Alright for Fighting by Elton John, three songs that sounded like they had more in common after you’d heard the Ramones, Sex Pistols, and Clash than before. This giant proprietary database would presumably allow those kinds of academic hypotheses to be tested objectively.
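Here is a sketch of what such a test might look like, assuming you could export genome-style attribute vectors: compare each candidate ancestor’s distance to a punk centroid against its distance to a centroid of typical rock of its own era. Every vector below is invented (and rigged to confirm the hypothesis); only the shape of the test is the point:

    import math

    def dist(a, b):
        """Euclidean distance between attribute vectors (smaller = more alike)."""
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def centroid(vectors):
        return [sum(col) / len(vectors) for col in zip(*vectors)]

    # Invented vectors (distortion, tempo, vocal snarl); real genome data would
    # supply hundreds of expert-scored dimensions per song.
    punk = [[0.9, 0.9, 0.9], [0.95, 0.85, 0.95], [0.85, 0.9, 0.8]]  # Ramones, Pistols, Clash
    era_rock = [[0.5, 0.5, 0.3], [0.6, 0.4, 0.4], [0.4, 0.6, 0.3]]  # typical 1968-73 rock
    candidates = {
        "Communication Breakdown": [0.9, 0.95, 0.8],
        "Paranoid": [0.85, 0.9, 0.7],
        "Saturday Night's Alright for Fighting": [0.7, 0.9, 0.6],
    }

    punk_c, rock_c = centroid(punk), centroid(era_rock)
    for title, vec in candidates.items():
        verdict = "proto-punk" if dist(vec, punk_c) < dist(vec, rock_c) else "of its era"
        print(f"{title}: {verdict} (punk {dist(vec, punk_c):.2f}, era {dist(vec, rock_c):.2f})")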

An Epidemic of Fear

Wednesday, October 21st, 2009

We seem to have replaced actual epidemics with an epidemic of fear:

Before smallpox was eradicated with a vaccine, it killed an estimated 500 million people. And just 60 years ago, polio paralyzed 16,000 Americans every year, while rubella caused birth defects and mental retardation in as many as 20,000 newborns. Measles infected 4 million children, killing 3,000 annually, and a bacterium called Haemophilus influenzae type b caused Hib meningitis in more than 15,000 children, leaving many with permanent brain damage. Infant mortality and abbreviated life spans — now regarded as a third world problem — were a first world reality.

Today, because the looming risk of childhood death is out of sight, it is also largely out of mind, leading a growing number of Americans to worry about what is in fact a much lesser risk: the ill effects of vaccines. If your newborn gets pertussis, for example, there is a 1 percent chance that the baby will die of pulmonary hypertension or other complications. The risk of dying from the pertussis vaccine, by contrast, is practically nonexistent — in fact, no study has linked DTaP (the three-in-one immunization that protects against diphtheria, tetanus, and pertussis) to death in children. Nobody in the pro-vaccine camp asserts that vaccines are risk-free, but the risks are minute in comparison to the alternative.

Still, despite peer-reviewed evidence, many parents ignore the math and agonize about whether to vaccinate. Why? For starters, the human brain has a natural tendency to pattern-match — to ignore the old dictum “correlation does not imply causation” and stubbornly persist in associating proximate phenomena. If two things coexist, the brain often tells us, they must be related. Some parents of autistic children noticed that their child’s condition began to appear shortly after a vaccination. The conclusion: “The vaccine must have caused the autism.” Sounds reasonable, even though, as many scientists have noted, it has long been known that autism and other neurological impairments often become evident at or around the age of 18 to 24 months, which just happens to be the same time children receive multiple vaccinations. Correlation, perhaps. But not causation, as studies have shown.

And if you need a new factoid to support your belief system, it has never been easier to find one. The Internet offers a treasure trove of undifferentiated information, data, research, speculation, half-truths, anecdotes, and conjecture about health and medicine. It is also a democratizing force that tends to undermine authority, cut out the middleman, and empower individuals. In a world where anyone can attend what McCarthy calls the “University of Google,” boning up on immunology before getting your child vaccinated seems like good, responsible parenting. Thanks to the Internet, everyone can be their own medical investigator.
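The timing point is worth making concrete. Even if vaccines caused no autism at all, a condition that becomes evident at 18 to 24 months will usually surface soon after some scheduled shot, simply because the shots are dense on the calendar. A minimal simulation, with every rate invented for illustration and no causal link in the model by construction:

    import random

    random.seed(1)

    SHOT_MONTHS = [2, 4, 6, 12, 15, 18, 24]          # invented well-visit schedule

    def coincidence_rate(children=100_000):
        """Fraction of affected children whose onset falls soon after a shot,
        in a model where vaccines have zero causal effect by construction."""
        affected = coincident = 0
        for _ in range(children):
            if random.random() < 0.01:               # onset, independent of shots
                affected += 1
                onset = random.uniform(18, 24)       # months; uniform for simplicity
                last_shot = max(m for m in SHOT_MONTHS if m <= onset)
                if onset - last_shot <= 3:           # "shortly after a vaccination"
                    coincident += 1
        return coincident / affected

    print(f"{coincidence_rate():.0%} of cases follow a shot, with zero causation")

Roughly half the simulated cases land “shortly after a vaccination” even though the model contains no vaccine effect at all, which is the pattern-matching trap the article describes.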

Socially Equitable Communitarianism

Wednesday, October 21st, 2009

Porphyrogenitus cites Mark Steyn’s thoughts on socially equitable communitarianism:

It’s better to pay more in taxes and to share the burdens as a community. It’s kinder, gentler, more compassionate, more equitable. Unfortunately, as recent European election results demonstrate, nothing makes a citizen more selfish than socially equitable communitarianism: Once a fellow’s enjoying the fruits of government health care and all the rest, he couldn’t give a hoot about the broader societal interest; he’s got his, and if it’s going to bankrupt the state a generation hence, well, as long as they can keep the checks coming till he’s dead, it’s fine by him. “Social democracy” is, in that sense, explicitly anti-social.

Somewhere along the way these countries redefined the relationship between government and citizen into something closer to pusher and junkie. And once you’ve done that, it’s very hard to persuade the junkie to cut back his habit.