Jarhead the Bear Cub

Monday, August 16th, 2010

It’s all fun and games when Winnie the Pooh gets his head stuck in a honey pot, but for Jarhead the bear cub it could have meant a slow, painful death:

The bear, his mother and two siblings regularly raided trash bins in Weirsdale, a small community in the vast Ocala National Forest.
[...]
The 6-month-old cub couldn’t eat or drink normally, and was days away from death when he was freed, biologists say.

It took officials 10 days to track down the bear family, using baited traps and following up on frantic leads from concerned residents.
[...]
After an overnight stay with his groggy mother, Jarhead and the whole family were moved to a less populated part of the forest for safety.

The Original Mad Man

Monday, August 16th, 2010

Greg Beato calls Albert Lasker the original mad man:

At the time, most ads simply presented a product to the public, perhaps with an illustrated mascot and an uninspired tagline. “How would you like to have a fairy in your home?” went one that depicted a cute little girl. “Use Fairy Soap.” For Lasker, however, just saying a product’s name wasn’t enough. Advertising was news, he decided. It had to communicate something important. And then, with the help of a copywriter named John E. Kennedy, he further developed his philosophy. Advertising was, as Kennedy put it, “salesmanship in print.” You had to appeal to a consumer’s self-interest. You had to give potential customers a “reason why,” a directive that would make them understand how a product could dramatically improve their life in some way.

“A century later, this doesn’t sound like a particularly powerful insight,” the authors of The Man Who Sold America observe. But in 1898, Lasker’s approach to commerce was as revolutionary an idea as Amazon and eBay would be 100 years later. Combining persuasive, consumer-centric copy with a media network that could reach millions of people at once allowed manufacturers to move goods as never before. Over the course of his career, Lasker oversaw dozens of hugely successful campaigns and established Lord & Thomas as one of the leading agencies in the United States. Lord & Thomas made oranges and raisins proprietary by creating the brands Sunkist and Sun-Maid. It helped Goodyear quadruple the sale of its tires in less than four years. It figured out a better way to market Kotex, offering it in a “wrapped box” that women could pick up and purchase without having to request it from a sales clerk. It turned Pepsodent and Palmolive into category leaders. It helped Lucky Strike increase sales from 25 million to 150 million cigarettes a day by positioning it as an alternative to sweets for women watching their figures. It brought Amos & Andy to NBC’s radio network, hired an unknown Bob Hope to serve as its pitchman for Pepsodent, and helped pioneer the soap opera genre with its radio show The Story of Mary Marlin.

In 1904, a 24-year-old Lasker purchased a 25 percent ownership stake in Lord & Thomas. Eight years later, he owned it completely. Under his guidance, the agency didn’t just craft pretty images and slogans to publicize its clients’ products. Instead, it served a more strategic role. In many instances, it encouraged the companies it worked with to reformulate products in ways that made them more compelling to consumers. In others, it hatched schemes to get consumers to use its clients’ products in ways the client had never pursued. To get people to buy more oranges, for example, and thus benefit its client, the California Fruit Growers Exchange, it hit upon the strategy of popularizing orange juice. First, it worked with a manufacturer to develop better juice extractors — electric ones for commercial use and simple glass ones for the home. Then, it launched its “drink an orange” campaign, with ads that advised consumers to look for the inexpensive glass extractors at their local retailer. This single campaign, the authors of The Man Who Sold America report, increased orange consumption per serving in the United States “from half an orange to between two and three.”

Churchill’s Empire

Monday, August 16th, 2010

Johann Hari heaps praise on Richard Toye’s Churchill’s Empire for portraying Winston Churchill as a thug who recognized a greater thug in Hitler. Much of Hari’s approach consists of quoting Churchill’s colorful rhetoric out of context, Max Boot points out:

There are indications of a remarkable lack of perspective in Hari’s (and Toye’s) indictment, which misses two larger points about imperialism. First, for most of his life Churchill championed the empire at a time when imperialism was considered the norm. Empires have existed since ancient Mesopotamia and much of the world was ruled by them until the late 1940s. Hari is right that even in Churchill’s day not everyone favored imperialism but most did — including many Americans such as Theodore Roosevelt. By the standards of its day, the British Empire was, with the possible exception of the American Empire, the most liberal and enlightened in the world — certainly far more humane than the empires carved out by the Belgians and Germans in Africa. It is absurd to second-guess Churchill’s pro-imperial views from the vantage point of 21st-century political correctness, which extols nationalism (perhaps wrongly) as the epitome of human development.

This brings us to the second point that Hari and his ilk overlook — namely the alternatives to British imperialism. Not only the alternative of other European empires, most of them far more brutal; but also the alternative of other indigenous regimes, most of which were even worse. Empire was not just a European phenomenon, after all; many of the native powers that British soldiers fought, whether the Zulus or the Moghuls, were imperialists in their own right. That, in fact, is one of the reasons why Britain was able to win and police its empire at such low cost — many of its subject peoples considered British rule preferable to that of local dynasties.

Once the British empire and other Western regimes passed from the scene, what replaced them? In India there was civil strife that killed over a million people. At least India managed to establish a more or less democratic government, thanks to the legacy of British rule. That’s more than can be said for most countries where the British did not stay as long. Many places once ruled by British, French, or other European bureaucrats fell under the sway of native tyrants, whose rule turned out to be far less competent and far more bloody. Idi Amin, who took over the former British colony of Uganda, comes to mind. Given the historical record of much of the post-independence world, it is by no means so obvious that Churchill’s preferred alternative — British rule — was not, in the end, superior.

The String Theory

Sunday, August 15th, 2010

A few weeks ago Buckethead mentioned that CoolTools had compiled a list of the best magazine articles ever, and I started reading the then-top article on the list, David Foster Wallace’s “Federer As Religious Experience” — which didn’t especially move me.

Then, today, Aretae mentioned a “beautiful essay on tennis from 1996” that he came across via Tyler Cowen: “The String Theory,” also by David Foster Wallace and also on the list. It looks at the very, very good players in the top 100 who still struggle in obscurity:

The realities of the men’s professional-tennis tour bear about as much resemblance to the lush finals you see on TV as a slaughterhouse does to a well-presented cut of restaurant sirloin. For every Sampras-Agassi final we watch, there’s been a weeklong tournament, a pyramidical single-elimination battle between 32, 64, or 128 players, of whom the finalists are the last men standing. But a player has to be eligible to enter that tournament in the first place. Eligibility is determined by ATP computer ranking. Each tournament has a cutoff, a minimum ranking required to be entered automatically in the main draw. Players below that ranking who want to get in have to compete in a kind of pretournament tournament. That’s the easiest way to describe qualies. I’ll try to describe the logistics of the Canadian Open’s qualies in just enough detail to communicate the complexity without boring you mindless.
[...]
The qualie circuit is to professional tennis sort of what AAA baseball is to the major leagues: Somebody playing the qualies in Montreal is an undeniably world-class tennis player, but he’s not quite at the level where the serious TV and money are. In the main draw of the du Maurier Omnium Ltée, a first-round loser will earn $5,400, and a second-round loser $10,300. In the Montreal qualies, a player will receive $560 for losing in the second round and an even $0.00 for losing in the first. This might not be so bad if a lot of the entrants for the qualies hadn’t flown thousands of miles to get here. Plus, there’s the matter of supporting themselves in Montreal. The tournament pays the hotel and meal expenses of players in the main draw but not of those in the qualies. The seven survivors of the qualies, however, will get their hotel expenses retroactively picked up by the tournament. So there’s rather a lot at stake — some of the players in the qualies are literally playing for their supper or for the money to make airfare home or to the site of the next qualie.

You could think of Michael Joyce’s career as now kind of on the cusp between the majors and AAA ball. He still has to qualify for some tournaments, but more and more often he gets straight into the main draw. The move from qualifier to main-draw player is a huge boost, both financially and psychically, but it’s still a couple of plateaus away from true fame and fortune. The main draw’s 64 or 128 players are still mostly the supporting cast for the stars we see in televised finals. But they are also the pool from which superstars are drawn. McEnroe, Sampras, and even Agassi had to play qualies at the start of their careers, and Sampras spent a couple of years losing in the early rounds of main draws before he suddenly erupted in the early nineties and started beating everybody.

Still, even most main-draw players are obscure and unknown. An example is Jakob Hlasek, a Czech who is working out with Marc Rosset on one of the practice courts this morning when I first arrive at Stade Jarry. I notice them and go over to watch only because Hlasek and Rosset are so beautiful to see — at this point, I have no idea who they are. They are practicing ground strokes down the line — Rosset’s forehand and Hlasek’s backhand — each ball plumb-line straight and within centimeters of the corner, the players moving with the compact nonchalance I’ve since come to recognize in pros when they’re working out: The suggestion is of a very powerful engine in low gear. Jakob Hlasek is six foot two and built like a halfback, his blond hair in a short square Eastern European cut, with icy eyes and cheekbones out to here: He looks like either a Nazi male model or a lifeguard in hell and seems in general just way too scary ever to try to talk to. His backhand is a one-hander, rather like Ivan Lendl’s, and watching him practice it is like watching a great artist casually sketch something. I keep having to remember to blink. There are a million little ways you can tell that somebody’s a great player — details in his posture, in the way he bounces the ball with his racket head to pick it up, in the way he twirls the racket casually while waiting for the ball. Hlasek wears a plain gray T-shirt and some kind of very white European shoes. It’s midmorning and already at least 90 degrees, and he isn’t sweating. Hlasek turned pro in 1983, six years later had one year in the top ten, and for the last few years has been ranked in the sixties and seventies, getting straight into the main draw of all the tournaments and usually losing in the first couple of rounds. Watching Hlasek practice is probably the first time it really strikes me how good these professionals are, because even just fucking around Hlasek is the most impressive tennis player I’ve ever seen.
I’d be surprised if anybody reading this article has ever heard of Jakob Hlasek. By the distorted standards of TV’s obsession with Grand Slam finals and the world’s top five, Hlasek is merely an also-ran. But last year, he made $300,000 on the tour (that’s just in prize money, not counting exhibitions and endorsement contracts), and his career winnings are more than $4 million, and it turns out his home base was for a time Monte Carlo, where lots of European players with tax issues end up living.

“Seeing” plays a vital role in tennis:

Except for the serve, power in tennis is not a matter of strength but of timing. This is one reason why so few top tennis players look muscular. Any normal adult male can hit a tennis ball with a pro pace; the trick is being able to hit the ball both hard and accurately. If you can get your body in just the right position and time your stroke so you hit the ball in just the right spot — waist-level, just slightly out in front of you, with your own weight moving from your back leg to your front leg as you make contact — you can both cream the ball and direct it. Since “…just the right…” is a matter of millimeters and microseconds, a certain kind of vision is crucial. Agassi’s vision is literally one in a billion, and it allows him to hit his ground strokes as hard as he can just about every time. Joyce, whose hand-eye coordination is superlative, in the top 1 percent of all athletes everywhere (he’s been exhaustively tested), still has to take some incremental bit of steam off most of his ground strokes if he wants to direct them.

If you’ve played tennis at least a little, you probably have some idea of how hard a game it is to play really well. Wallace, who played at a pretty high level himself, says:

I submit to you that you really have no idea at all. I know I didn’t. And television doesn’t really allow you to appreciate what real top-level players can do — how hard they’re actually hitting the ball, and with what control and tactical imagination and artistry. I got to watch Michael Joyce practice several times right up close, like six feet and a chain-link fence away. This is a man who, at full run, can hit a fast-moving tennis ball into a one-foot square area seventy-eight feet away over a net, hard. He can do this something like more than 90 percent of the time. And this is the world’s seventy-ninth-best player, one who has to play the Montreal qualies.

So, how do you get that good?

But it’s better for us not to know the kinds of sacrifices the professional-grade athlete has made to get so very good at one particular thing. Oh, we’ll invoke lush clichés about the lonely heroism of Olympic athletes, the pain and analgesia of football, the early rising and hours of practice and restricted diets, the preflight celibacy, et cetera. But the actual facts of the sacrifices repel us when we see them: basketball geniuses who cannot read, sprinters who dope themselves, defensive tackles who shoot up with bovine hormones until they collapse or explode. We prefer not to consider closely the shockingly vapid and primitive comments uttered by athletes in postcontest interviews or to consider what impoverishments in one’s mental life would allow people actually to think the way great athletes seem to think. Note the way “up close and personal” profiles of professional athletes strain so hard to find evidence of a rounded human life — outside interests and activities, values beyond the sport. We ignore what’s obvious, that most of this straining is farce. It’s farce because the realities of top-level athletics today require an early and total commitment to one area of excellence. An ascetic focus. A subsumption of almost all other features of human life to one chosen talent and pursuit. A consent to live in a world that, like a child’s world, is very small.

Atheists and Fundamentalist Christians

Sunday, August 15th, 2010


Ten Reasons to Be Cautious

Saturday, August 14th, 2010

Brett Arends of the Wall Street Journal gives 10 reasons to be cautious about the economy. His sixth point is that the jobs picture is much worse than they’re telling you:

Forget the “official” unemployment rate of 9.5%. Alternative measures? Try this: Just 61% of the adult population, age 20 or over, has any kind of job right now. That’s the lowest since the early 1980s — when many women stayed at home through choice, driving the numbers down.

Among men today, it’s 66.9%. Back in the ’50s, incidentally, that figure was around 85%, though allowances should be made for the higher number of elderly people alive today. And many of those still working right now can only find part-time work, so just 59% of men age 20 or over currently have a full-time job. This is bullish?

(Today’s bonus question: If a laid-off contractor with two kids, a mortgage and a car loan is working three night shifts a week at his local gas station, how many iPads can he buy for Christmas?)

Left Coast’s Right Turn

Friday, August 13th, 2010

Steve Sailer describes the left coast’s right turn early in the 20th century:

Hollywood was not always so ideologically homogeneous. Consider one of the best films of the industry’s best year, 1939 — “Mr. Smith Goes to Washington.” Leading man Jimmy Stewart, director Frank Capra, and studio head Harry Cohn were all Republicans, while its screenwriter Sidney Buchman was a card-carrying Stalinist. Today, though, acceptable views run the gamut all the way from Eleanor Roosevelt Democrats like Barbra Streisand on the Left to Harry Truman Democrats like Tom Hanks (who named a son “Truman”) on the Right. What happened?

Keep in mind that Hollywood’s relationship with the outside world is tenuous. It’s a self-absorbed community, and its politics are skin-deep, serving functions within the industry that aren’t always obvious to outsiders. Today’s liberal monoculture is in large part an outgrowth of the compromise resolution to the ancient struggle between studio executives and screenwriters that culminated in the endlessly discussed but little understood blacklist of Marxists in the 1950s.

One of the blacklist’s main roots has disappeared down the memory hole because it doesn’t burnish the heroic image created to flatter the Communist victims. A 1919 theater strike won the playwrights of the Dramatists Guild the right to retain copyright in their works. To this day, dramatists own their plays and merely license them to producers. Further, they have the right to approve or reject the cast, director, and any proposed changes in the dialogue. Contractually, a playwright is a rugged individualist, an Ayn Rand hero.

With the introduction of the talkies in 1927, Hollywood began importing trainloads of New York dramatists. Salaries were generous and the climate superb, but the dramatists found the collaborative nature of moviemaking frustrating, even demeaning. Screenwriters were employees in a vast factory, which owned their creations. The studios could, and generally would, have other hired hacks radically rewrite each script, all under the intrusive supervision of some mogul’s semiliterate brother-in-law.

In the 1930s, Hollywood’s Communist Party, under the command of its charismatic commissar, screenwriter John Howard Lawson, improbably but enthusiastically championed the intellectual property rights of scriptwriters. The ink-stained wretches found that the Marxist concept of alienation described their plight. They felt just like the once psychologically fulfilled hand-craftsmen forced into becoming dispossessed factory drones who cannot recognize their creativity in their employer’s output.

Insanely ironic as it seems now, many screenwriters became Communists because they despised the movie business’s need for co-operation. How turning command of the entire economy over to a dictatorship would restore the unfettered joys of individual craftsmanship was a little fuzzy, but, hey, if you couldn’t trust Stalin, whom could you trust?

The possibility of studios blacklisting writers first surfaced in the 1930s, when the moguls’ cartel turned aside the leftist screenwriters’ push to align themselves with the Dramatists Guild by threatening to fire union supporters. “It wouldn’t be a blacklist because it would all be done over the telephone,” Jack Warner explained.

Decades later, after the formal blacklist era, this labor-management conflict was resolved by a tacit compromise. The blacklisted writers were elevated in the collective memory to the role of martyrs. Their leftism (but not their Stalinism, which was conveniently forgotten) was enshrined as the appropriate ideology of all respectable movie-folk. In return, the producers hung on to their property rights in screenplays.

Law and Disorder in Johannesburg

Friday, August 13th, 2010

I find Louis Theroux mildly annoying, but his Law and Disorder in Johannesburg certainly conveys a feel for life in a lawless part of South Africa:

Are you ready for a world without antibiotics?

Friday, August 13th, 2010

Are you ready for a world without antibiotics?

Last September, Walsh published details of a gene he had discovered, called NDM 1, which passes easily between types of bacteria called enterobacteriaceae such as E. coli and Klebsiella pneumoniae and makes them resistant to almost all of the powerful, last-line group of antibiotics called carbapenems. Yesterday’s paper revealed that NDM 1 is widespread in India and has arrived here as a result of global travel and medical tourism for, among other things, transplants, pregnancy care and cosmetic surgery.

“In many ways, this is it,” Walsh tells me. “This is potentially the end. There are no antibiotics in the pipeline that have activity against NDM 1-producing enterobacteriaceae. We have a bleak window of maybe 10 years, where we are going to have to use the antibiotics we have very wisely, but also grapple with the reality that we have nothing to treat these infections with.”

The Point of No Return

Friday, August 13th, 2010

Jeffrey Goldberg of The Atlantic says that he is not engaging in a thought exercise or a one-man war game when he discusses the plausibility and potential consequences of an Israeli strike on Iran:

Israel has twice before successfully attacked and destroyed an enemy’s nuclear program. In 1981, Israeli warplanes bombed the Iraqi reactor at Osirak, halting — forever, as it turned out — Saddam Hussein’s nuclear ambitions; and in 2007, Israeli planes destroyed a North Korean–built reactor in Syria. An attack on Iran, then, would be unprecedented only in scope and complexity.

I have been exploring the possibility that such a strike will eventually occur for more than seven years, since my first visit to Tehran, where I attempted to understand both the Iranian desire for nuclear weapons and the regime’s theologically motivated desire to see the Jewish state purged from the Middle East, and especially since March of 2009, when I had an extended discussion about the Iranian nuclear program with Benjamin Netanyahu, hours before he was sworn in as Israel’s prime minister. In the months since then, I have interviewed roughly 40 current and past Israeli decision makers about a military strike, as well as many American and Arab officials.

In most of these interviews, I have asked a simple question: what is the percentage chance that Israel will attack the Iranian nuclear program in the near future? Not everyone would answer this question, but a consensus emerged that there is a better than 50 percent chance that Israel will launch a strike by next July.

(Of course, it is in the Israeli interest to let it be known that the country is considering military action, if for no other reason than to concentrate the attention of the Obama administration. But I tested the consensus by speaking to multiple sources both in and out of government, and of different political parties. Citing the extraordinary sensitivity of the subject, most spoke only reluctantly, and on condition of anonymity. They were not part of some public-relations campaign.)

The reasoning offered by Israeli decision makers was uncomplicated: Iran is, at most, one to three years away from having a breakout nuclear capability (often understood to be the capacity to assemble more than one missile-ready nuclear device within about three months of deciding to do so). The Iranian regime, by its own statements and actions, has made itself Israel’s most zealous foe; and the most crucial component of Israeli national-security doctrine, a tenet that dates back to the 1960s, when Israel developed its own nuclear capability as a response to the Jewish experience during the Holocaust, is that no regional adversary should be allowed to achieve nuclear parity with the reborn and still-besieged Jewish state.

Genius, as he calls himself, adds that Iran can bring Israel to its knees without a single explosion:

They will mobilize for war against us, we’ll have to mobilize in response, and our entire country will grind to a halt. Then they’ll demobilize, mobilize again, demobilize, mobilize again, etc., which they can do forever because — as crap as their economy is — they have oil and we don’t have it. An Iranian mobilization could also last for weeks — what do the mullahs care? — but a full Israeli mobilization for war means that our crops rot in the fields, our buses don’t get driven and the simple, day-to-day business of life doesn’t get done. A single full mobilization for war every year could wipe us off the map, let alone having to mobilize the whole military on a monthly or weekly basis.

And if that alone doesn’t change the rules of the game, almost all of Israel is now in range of Hezballah’s missiles. In the era of the Mullahs’ Bomb, Hezballah can begin to strike at Israel with relative impunity. Any response will mean another Iranian mobilization, triggering what we know we don’t want. Does anyone doubt that Tel Aviv will be hit by missiles from Lebanon in the next war? I fully expect to see the inside of a bomb shelter in the not-too-distant future.

(Hat tip to Aretae.)

Gary Kurtz on Star Wars

Thursday, August 12th, 2010

Gary Kurtz produced Star Wars and The Empire Strikes Back, but he and Lucas parted ways when the toys began to take precedence over the films:

For Kurtz, the popular notion that “Star Wars” was always planned as a multi-film epic is laughable. He says that he and Lucas, both USC film school grads who met through mutual friend Francis Ford Coppola in the late 1960s, first sought to do a simple adaptation of “Flash Gordon,” the comic-strip hero who had been featured in movie serials that both filmmakers found charming.

“We tried to buy the rights to ‘Flash Gordon’ from King Features but the deal would have been prohibitive,” Kurtz said. “They wanted too much money, too much control, so starting over and creating from scratch was the answer.”

Lucas came up with a sprawling treatment that pulled from “Flash Gordon,” Arthurian legend, “The Hidden Fortress” and other influences. The document would have required a five-hour film but there was a middle portion that could be carved out as a stand-alone movie. Kurtz championed the project in pitch meetings with studios and worked intensely on casting, scouting locations and finding a way to create a believable alien universe on a tight budget.

“Star Wars” opened with a title sequence that announced it as “Episode IV” as a winking nod to the old serials, not a film franchise underway, Kurtz said.

“Our plan was to do ‘Star Wars’ and then make ‘Apocalypse Now’ and do a black comedy in the vein of ‘M*A*S*H,’” Kurtz said. “Fox insisted on a sequel or maybe two [to ‘Star Wars’]. Francis [Ford Coppola] … had bought the ["Apocalypse Now"] rights so George could make it. He eventually got tired of waiting and did it on his own, of course.”

The team of Lucas and Kurtz would not hold together during their own journey through the jungles of collaborative filmmaking. Kurtz chooses his words carefully on the topic of their split.

After the release of “Empire” (which was shaped by material left over from that first Lucas treatment), talk turned to a third film and after a decade and a half the partners could no longer find a middle ground.

“We had an outline and George changed everything in it,” Kurtz said. “Instead of bittersweet and poignant he wanted a euphoric ending with everybody happy. The original idea was that they would recover [the kidnapped] Han Solo in the early part of the story and that he would then die in the middle part of the film in a raid on an Imperial base. George then decided he didn’t want any of the principals killed. By that time there were really big toy sales and that was a reason.”

The discussed ending of the film that Kurtz favored presented the rebel forces in tatters, Leia grappling with her new duties as queen and Luke walking off alone “like Clint Eastwood in the spaghetti westerns,” as Kurtz put it.

Kurtz said that ending would have been a more emotionally nuanced finale to an epic adventure than the forest celebration of the Ewoks that essentially ended the trilogy with a teddy bear luau.

He was especially disdainful of the Lucas idea of a second Death Star, which he felt would be too derivative of the 1977 film. “So we agreed that I should probably leave.”

Kurtz went straight over to “The Dark Crystal,” a three-year project with old friend Jim Henson, whom Kurtz had brought in on the creation of Yoda for “Empire.”

Useful Idiots

Thursday, August 12th, 2010

I’m a bit surprised that the modern BBC would do a piece on the useful idiots who did the Communists’ work for them:

The phrase ‘useful idiots’, supposedly Lenin’s, refers to Westerners duped into saying good things about bad regimes.

In political jargon it was used to describe Soviet sympathisers in Western countries and the attitude of the Soviet government towards them.

Useful idiots, in a broader sense, refers to Western journalists, travellers and intellectuals who gave their blessing – often with evangelistic fervour – to tyrannies and tyrants, thereby convincing politicians and public that utopias rather than Belsens thrived.

For instance, Walter Duranty, who served as the Moscow bureau chief of the New York Times from 1922 through 1936, won a Pulitzer Prize in 1932 for a set of stories written in 1931 on the Soviet Union — stories denying the famine. It turns out he had been involved in sado-masochistic activities with Aleister Crowley’s “magickal” crowd, and he was probably being blackmailed by the Communists.

Useful idiots have also served Chairman Mao’s China, General Pinochet’s Chile, Apartheid-controlled South Africa, Saddam Hussein’s Iraq, and President Ahmadinejad’s Iran.

Why are smart people so consistently fooled by evil regimes? Michael Moynihan of Reason asks. Borepatch answers that Intellectuals see in the mirror the very image of a Philosopher King:

Their entire intellectual training has led them to believe that it is the Intellectual Class that is fit to rule, by virtue of their very Smart thinking. After all, did not Plato himself say that only the philosophers were virtuous, and therefore fit to rule? And have not our Intellectual Elite been told their entire school lives that they are the “Best And Brightest?”

And so all a tyrant needs to do is to flatter this sense of entitlement, and the Intellectual will convince himself that this is indeed the New Jerusalem. No brainwashing is required; a light rinse will do.

But at every stage, the tyrant must appeal to the Intellectual’s sense of superiority. This is why rightist tyrants (say, Peronists) struggle with the Intellectual class. By allowing a more or less unfettered market to run as it will, he offends the vanity of the Intellectual. The Market is a very bad thing indeed, says the Intellectual, because it does not need me, a Very Smart Person, to run it. The tyrant offers the illusion of access to the Control Room of society — indeed the very goal that the Intellectual has trained for all his life. How could he not be bewitched?

Of course, the tyrant has no intention of actually letting the intellectual control anything; that’s why Lenin coined the term “useful idiots”. So the answer to Reason’s question is quite simple: it’s a cheap appeal to vanity. The more interesting question is why do Intellectuals fall for this so often?

I think that it’s because Intellectuals have been much more successful in the West, where they tend not to end up in Gulags, Re-education camps, or mass graves. In Europe in particular, and increasingly in the USA, they have captured the media and government, and actually have increasingly found themselves in the Control Room of society. It hasn’t been working out so well.

Marijuana may cause Canada’s economic comedown

Thursday, August 12th, 2010

Canada’s economy is doing remarkably well despite the global recession, but this could all change if California legalizes marijuana:

This November, in an effort to increase tax revenue, California will hold a referendum on whether or not to legalise the cultivation and use of marijuana. If passed, the change in law would be devastating to the Canadian economy, halting the flow of billions of dollars from the US into Canada and eventually forcing hundreds of thousands into unemployment.

Over the past 20 years, Canada has developed a substantial and highly profitable marijuana industry that is almost completely dependent on the US market. Between 60 and 90% of the marijuana produced domestically is exported to the US via cross-border smuggling operations. It’s exactly like the alcohol prohibition of the 1920s, only far more sophisticated and more profitable. The establishment of a legal industry based in the US would likely cripple these exports overnight.

Due to its contraband nature, it’s difficult to determine exactly how much marijuana contributes to the Canadian economy, but a number of agencies and economists have estimated that it is in the range of $20bn per year (£12.5bn), making it Canada’s single largest agricultural product. The bulk of production is based in British Columbia, where it employs a labour force of 250,000, roughly one in 14 adults. Although strict financial controls are often credited as the source of Canada’s economic resilience, it’s worth pointing out that marijuana production often insulates communities from larger economic phenomena.
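A quick back-of-the-envelope check of those figures, using only the numbers quoted in the article (the inputs are the article’s estimates, not independent data):

```python
# Sanity-check the BC marijuana figures quoted above.
labor_force = 250_000        # estimated BC marijuana labour force
adult_share = 1 / 14         # "roughly one in 14 adults"
annual_revenue = 20e9        # ~$20bn/year, Canada-wide estimate

# Implied adult population of British Columbia
implied_adults = labor_force / adult_share   # 250,000 * 14
print(f"Implied BC adult population: {implied_adults:,.0f}")

# Revenue per worker if the full $20bn were attributed to that labour force
revenue_per_worker = annual_revenue / labor_force
print(f"Revenue per worker: ${revenue_per_worker:,.0f}")
```

The implied 3.5 million adults is at least plausible against BC’s roughly 4.5 million total population at the time, and $80,000 of revenue per worker suggests the “single largest agricultural product” claim is being driven by price, not headcount.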

US Tax Hikes of the 1930s

Thursday, August 12th, 2010

Nathan Lewis discusses the US tax hikes of the 1930s:

I make a big deal of the tax hikes of the 1930s — first the Smoot-Hawley Tariff, and then the Hoover tax hike of 1932 — mostly because they are largely ignored by economists who would rather believe that it was all a mysterious aftereffect of a decline in stock prices. Stocks go down, and everyone loses their “animal spirits,” and poof! — ten years of tragedy leading to a World War. You would think that people would find that a little simplistic.

(On the other hand, I think economists like it because it places no blame on the government. Economists know on which side their bread is buttered.)

Also, it was not just U.S. policy, but very similar policy around the world. Virtually all countries had tariff hikes on a scale similar to the Smoot-Hawley Tariff (60% tariff on just about everything), and many countries also had domestic tax hikes on the scale of the 1932 Hoover hike. In fact, Hoover was imitating policies in Britain and Germany, which were really the first with the big domestic tax hikes.

The Smoot-Hawley Tariff began as an agricultural tariff:

Farmers weren’t participating in the great economic boom of the 1920s. Indeed, it appears that the introduction of motorized tractors resulted in agricultural overcapacity. About 1/3rd of farmland, in those days, was set aside for horse pasture. The horses then pulled the plows and wagons. When horses were replaced with motorized tractors and trucks, the farmland available for growing crops increased by 50%. At least, that is one story — there was a slight decline in corn, wheat and soybean prices toward the end of the 1920s, but not a lot. To gain congressional support for the tariff, however, the supporters started to add products from other Congressmen’s districts. This all began in the 1920s — Herbert Hoover campaigned on an increased tariff for agricultural products in 1928. After his victory, in 1929 Hoover asked Congress for a new tariff in which rates for agricultural products rose and rates for industrial products declined.

In May 1929, the House passed a tariff in which rates for both agricultural and industrial goods increased. This went to the Senate, which then debated the tariff. The initial stock market decline in 1929 lines up exactly with when tariff supporters in the Senate got enough votes to pass the tariff — by adding more and more items to the list of goods subject to the tariff. In May 1930, the House passed the new bill. At first, Hoover opposed the bill, calling it “vicious, extortionate, and obnoxious.” However, in U.S. history, the Republican party has typically supported tariffs, while the Democratic party has favored lower tariffs. Republicans have often seen tariffs as a type of economic support, as they protect existing businesses from foreign competition. In the initial downturn of 1929-early 1930, the tariff was then seen as an additional economic booster. Hoover was pressured by his own party to pass the tariff into law, and he did so in June 1930. The new tariff applied to over 20,000 goods and imposed a 60% rate on more than 3,200 products, quadrupling previous rates.

Between 1929 and 1934, with tariffs blasting higher worldwide, world trade decreased by 66%.

This was followed by the Revenue Act of 1932:

This raised the top income tax rate from 25% to 63%, with increases on all incomes above $6,000. The estate tax rose to 45% from 20%, and the corporate tax rate rose from 12% to 13.75%. Exemptions for individuals were reduced, with the basic exemption for a married couple falling from $3,500 to $2,500, a change aimed at bringing 1.7 million new taxpayers into the income tax system. However, the big increases were in excise taxes, which were projected to raise 51% of the $912m in new revenue anticipated from the tax hike. (In actuality, revenue declined as the economy imploded.)
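Restating the Revenue Act figures above as simple arithmetic puts their scale in perspective (all inputs are the article’s numbers):

```python
# The Revenue Act of 1932, by the numbers quoted above.
top_rate_before, top_rate_after = 0.25, 0.63
exemption_before, exemption_after = 3_500, 2_500   # married couple, dollars
total_new_revenue = 912e6                          # expected increase, dollars
excise_share = 0.51                                # share expected from excise taxes

# The top marginal rate was multiplied, not just nudged
print(f"Top-rate multiplier: {top_rate_after / top_rate_before:.2f}x")

# The exemption cut that pulled new taxpayers into the system
print(f"Exemption cut: ${exemption_before - exemption_after:,}")

# Dollar amount expected from excise taxes alone
print(f"Expected excise revenue: ${excise_share * total_new_revenue / 1e6:.0f}m")
```

So the headline income-tax hike was a 2.52x increase in the top rate, yet roughly $465m of the expected $912m was supposed to come from excise taxes, which fall on everyone, not just high earners.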

This was the result of a debate that began with the idea of introducing a new national sales tax, in an effort to “broaden the tax base.” Ultimately, the national sales tax idea was abandoned on the argument that tax rates should fall more heavily on luxuries than on necessities.

Illegal Immigrants Account for 1 in 12 US Births

Thursday, August 12th, 2010

If you’ve been near a delivery room in the past few years, you may already have noticed that illegal immigrants account for 1 in 12 US births:

Undocumented immigrants make up slightly more than 4% of the U.S. adult population. However, their babies represented twice that share, or 8%, of all births on U.S. soil in 2008, according to the nonpartisan Pew Hispanic Center’s report.