We just ran out of Nazis

Wednesday, November 30th, 2011

Maybe the source of the stagnation in our space program over the last 40 years, Buckethead suggests, is not government mismanagement, lack of vision, underfunding, red tape or any of that.

Maybe we just ran out of Nazis. (I guess we hit Peak Nazi back in the 1960s.)


Wednesday, November 30th, 2011

General Mills might be described as Big Gluten, but they’ve made the unusual move of pushing gluten-free foods, because there’s a small but dedicated and growing market:

Gluten-free packaged foods — in which wheat has been replaced by alternative ingredients like rice, sorghum and tapioca flours, among others — were almost impossible to find in the 1990s. Most of what did exist was dreadful: think cardboard. It was also hard to find people who understood the disease itself. Doctors believed it wasn’t much of a problem in this country.

“Nobody really was ready to accept the 1 percent prevalence of celiac disease,” says Dr. Stefano Guandalini, founder and medical director of the University of Chicago Celiac Disease Center, who came to the U.S. from Italy in 1996 and found very little awareness of celiac disease. Even experts ignored it, Guandalini says, noting that a prominent medical textbook published as recently as 1999 questioned how widespread it was. “The chapter on celiac disease,” Guandalini says, “quotes a prevalence of 1 in 10,000 in the U.S. and adds that this is mostly a European condition — and the prevalence is decreasing. This is the formal, official teaching in ’99.”

But Guandalini didn’t buy it. And neither did Dr. Alessio Fasano, another Italian who was practicing at the University of Maryland. The genes were here, Fasano recalls thinking, courtesy of our European ancestors, and so was the gluten, a natural component of wheat that provides the elastic qualities that make for delicious baked goods. But the protein is also difficult to digest. And even a healthy intestine does not completely break gluten down. For those with celiac disease, the undigested gluten essentially causes the body’s immune system to lash out at itself, leading to malabsorption, bloating and diarrhea — the classic gastrointestinal symptoms — but also, at times, joint pain, skin rashes and other problems. In Italy, Fasano routinely saw celiac disease. Surely it was in the U.S. too. Hence, in 1996 Fasano published a paper, asking, in the title, a simple question: “Where Have All the American Celiacs Gone?”

The same year that he published the paper, he founded the University of Maryland Center for Celiac Research. He started small; Fasano had only one patient the first year. In a 1998 paper, however, he reported that he had randomly screened 2,000 blood samples for the antibodies that typically indicate a diagnosis of celiac disease and discovered that 1 in 250 tested positive.

Still, doubts lingered. So Fasano set out to do a more comprehensive study — or, as he called it, “the most insane, large epidemiological study” on celiac disease in the U.S. to date. More than 13,000 subjects in 32 states were screened for the antibodies. Those who tested positive underwent further blood tests and, when possible, a small-bowel biopsy to confirm the presence of celiac disease. The results, published in 2003, were stunning: 1 in every 133 people had celiac disease. And among those related to celiac patients, the rates were as high as 1 in 22. People were listening now — and everything about gluten-free living was about to change. “Believe it or not,” he says, “the history of celiac disease as a public health problem in the United States started in 2003.”

As awareness of the disease became more widespread, Fasano expected celiac diagnoses to increase. That, in fact, is what has happened. Since 2009, Quest Diagnostics, a leading testing company, has seen requests for celiac blood tests jump 25 percent. But Fasano didn’t anticipate other developments. He now estimates that 18 million Americans have some degree of gluten sensitivity. And experts have been surprised, in general, by the rising prevalence of celiac disease overall. “It’s not just that we’re better at finding it,” says Dr. Joseph A. Murray, a gastroenterologist at the Mayo Clinic in Rochester, Minn. “It truly has become more common.”

Comparing blood samples from the 1950s to the 1990s, Murray found that young people today are nearly five times as likely to have celiac disease, for reasons he and other researchers cannot explain. And it’s on the rise not only in the U.S. but also in other places where the disease was once considered rare, like Mexico and India. “We don’t know where it’s going to end,” Murray says. “Celiac disease has public health consequences.” And therefore, it has a market.

Gluten-free products aren’t just selling these days; they appear to be recession-proof. According to a recent Nielsen report on consumer trends, the volume of gluten-free products sold in the past year is up 37 percent. Spins, a market-research-and-consulting firm for the natural-products industry, says the gluten-free market is a $6.3 billion industry and growing, up 33 percent since 2009. Niche companies like Amy’s Kitchen, Glutino, Enjoy Life, Bob’s Red Mill and Udi’s Gluten Free Foods are reporting incredible growth.

Major corporations have also been moving into the marketplace: Anheuser-Busch introduced Redbridge, a gluten-free beer, in 2006, and Kellogg rolled out gluten-free Rice Krispies this year. Other companies have begun adding labels that indicate when their products are gluten-free — that is, when they contain fewer than 20 parts per million gluten (the proposed federal standard). Both Frito-Lay and Post Foods have begun such labeling in the past year. It’s the golden age of gluten-free.

Celiacs aren’t the only ones who are grateful. Athletes, in particular, have taken to the diet. Some claim to have more energy when they cut out gluten, a belief that intrigues some experts and riles others. Guandalini dismisses the idea as “totally bogus.” Yet no one can argue with the success of the world’s No. 1 men’s tennis player, Novak Djokovic. Within months of revealing this year that he had a gluten allergy and had altered his diet accordingly, Djokovic posted a remarkable 64-2 record. By September, sportswriters barely let a moment pass without asking about, as one called it, his “off-court eating habits.” After his victory at the U.S. Open final, a reporter wanted to know what he ate for dinner the night before, for breakfast that morning and what he planned to eat that night. “I’ll give you a simple answer,” Djokovic said with a smile. “Last night I didn’t have any gluten, and tonight I will have a bunch of gluten.”

Alan Moore on V for Vendetta Masks

Wednesday, November 30th, 2011

Tom Lamont of The Guardian interviews Alan Moore on the V for Vendetta masks that are all the rage with trendy protesters these days:

But Moore has been caught off-guard in recent years, and particularly in 2011, by the inescapable presence of a certain mask being worn at protests around the world. A sallow, smirking likeness of Guy Fawkes – created by Moore and the artist David Lloyd for their 1982 series V for Vendetta. It has a confused lineage, this mask: the plastic replica that thousands of demonstrators have been wearing is actually a bit of tie-in merchandise from the film version of V for Vendetta, a Joel Silver production made (quite badly) in 2006. Nevertheless, at the disparate Occupy sit-ins this year – in New York, Moscow, Rio, Rome and elsewhere – as well as the repeated anti-government actions in Athens and the gatherings outside G20 and G8 conferences in London and L’Aquila in 2009, the V for Vendetta mask has been a fixture. Julian Assange recently stepped out wearing one, and last week there was a sort of official embalmment of the mask as a symbol of popular feeling when Shepard Fairey altered his famous “Hope” image of Barack Obama to portray a protester wearing one.

It all comes back to Moore – a private man with knotty greying hair and a magnificent beard, who prefers to live without an internet connection and who has not had a working telly for months “on an obscure point of principle” about the digital signal in his hometown of Northampton. He has never yet properly commented on the Vendetta mask phenomenon, and speaking on the phone from his home, Moore seems variously baffled, tickled, roused and quite pleased that his creation has become such a prominent emblem of modern activism.

“I suppose when I was writing V for Vendetta I would in my secret heart of hearts have thought: wouldn’t it be great if these ideas actually made an impact? So when you start to see that idle fantasy intrude on the regular world… It’s peculiar. It feels like a character I created 30 years ago has somehow escaped the realm of fiction.”

V for Vendetta tells of a future Britain (actually 1997, nearly two decades into the future when Moore wrote it) under the heel of a dictatorship. The population are depressed and doing little to help themselves. Enter Evey, an orphan, and V, a costumed vigilante who takes an interest in her. Over 38 chapters, each titled with a word beginning with “V”, we follow the brutal, loquacious antihero and his apprentice as they torment the ruling powers with acts of violent resistance. Throughout, V wears a mask that he never removes: bleached skin and rosy cheeks, pencil beard, eyes half shut above an inscrutable grin. You’ve probably come to know it well.

“That smile is so haunting,” says Moore. “I tried to use the cryptic nature of it to dramatic effect. We could show a picture of the character just standing there, silently, with an expression that could have been pleasant, breezy or more sinister.” As well as the mask, Occupy protesters have taken up as a marrying slogan “We are the 99%”; a reference, originally, to American dissatisfaction with the richest 1% of the US population having such vast control over the country. “And when you’ve got a sea of V masks, I suppose it makes the protesters appear to be almost a single organism – this “99%” we hear so much about. That in itself is formidable. I can see why the protesters have taken to it.”

Moore first noticed the masks being worn by members of the Anonymous group, “bothering Scientologists halfway down Tottenham Court Road” in 2008. It was a demonstration by the online collective against alleged attempts to censor a YouTube video. “I could see the sense of wearing a mask when you were going up against a notoriously litigious outfit like the Church of Scientology.”

But with the mask’s growing popularity, Moore has come to see its appeal as about something more than identity-shielding. “It turns protests into performances. The mask is very operatic; it creates a sense of romance and drama. I mean, protesting, protest marches, they can be very demanding, very gruelling. They can be quite dismal. They’re things that have to be done, but that doesn’t necessarily mean that they’re tremendously enjoyable – whereas actually, they should be.”

At one point in V for Vendetta, V lectures Evey about the importance of melodrama in a resistance effort. Says Moore: “I think it’s appropriate that this generation of protesters have made their rebellion into something the public at large can engage with more readily than with half-hearted chants, with that traditional, downtrodden sort of British protest. These people look like they’re having a good time. And that sends out a tremendous message.”

Silicon Valley Schooling

Wednesday, November 30th, 2011

The New York Times recently noted the irony that many Silicon Valley technologists send their children to a school that does not compute — that is, to the Waldorf School of the Peninsula, which, like all Waldorf schools, emphasizes free play in a homelike environment and discourages modern media and technology.

(As I’ve noted before, they tend not to play up their theosophical roots.)

Other Silicon Valley technologists are sending their children to Bullis elementary school, where they don’t have to pay close to $20k tuition — but instead “donate” $5k to the publicly funded charter school:

Bullis’s popularity shows that even parents in wealthy, top-performing school districts such as Los Altos have become disenchanted and are seeking alternatives. Bullis has higher state standardized test scores and offers more art and extracurricular activities than the Los Altos district, which is cutting music and increasing class size. Bullis has achieved this success while receiving about 60 percent of the conventional system’s public funding.

Parents in Los Altos Hills created Bullis in 2003 because they were angry after the district closed their neighborhood school, said Mark Breier, a founder of the school and former chief executive of Beyond.com.

To get advice on starting Bullis, Breier said he consulted with Silicon Valley luminaries and charter advocates. They included Reed Hastings, chief executive of Netflix Inc. and former president of the California State Board of Education, and venture capitalist John Doerr of Kleiner Perkins Caufield & Byers.

The founding parents won a charter from the Santa Clara County Board of Education after the Los Altos district twice rejected them. After giving spots to current students and their siblings, Bullis reserves half of its slots for residents of the neighborhood that fed into the old school.

On a recent school day at Bullis, a kindergarten class studied Mandarin. Second-graders, sitting cross-legged under pictures of Bach, Mozart, Liszt and Stravinsky, learned to read music. A seventh-grade math class worked on algebra — a year or two before most U.S. schools — while an advanced student did linear equations at a high-school level. The school offers electives in Broadway dance and the stock market.

“We’re lucky to have so many different things we can study here,” said third-grader Ishani Sood, 8, taking a break from her Mandarin class.

A foundation set up to help fund the school asks Bullis parents to donate at least $5,000 for each child they enroll. Those who can’t afford to pay should discuss the reason with a foundation member, “recognizing that other school families will need to make up the difference,” the foundation said on its website.

Naturally they’re under attack for their lack of “diversity” — because the kind of diversity Silicon Valley thrives on is not the right kind.

The Magic of Education

Wednesday, November 30th, 2011

Bryan Caplan, like most professors, has spent most of his life in academia and has very little real-world experience, so, he admits, he can’t very well teach his students real-world skills — yet employers care deeply about the grades he hands out. That’s the magic of education:

Yes, I can train graduate students to become professors.  No magic there; I’m teaching them the one job I know.  But what about my thousands of students who won’t become economics professors?  I can’t teach what I don’t know, and I don’t know how to do the jobs they’re going to have.  Few professors do.

Many educators soothe their consciences by insisting that “I teach my students how to think, not what to think.”  But this platitude goes against a hundred years of educational psychology.  Education is very narrow; students learn the material you specifically teach them… if you’re lucky.

Other educators claim they’re teaching good work habits.  But especially at the college level, this doesn’t pass the laugh test.  How many jobs tolerate a 50% attendance rate — or let you skate by with twelve hours of work a week?  School probably builds character relative to playing videogames.  But it’s hard to see how school could build character relative to a full-time job in the Real World.

At this point, you may be thinking: If professors don’t teach a lot of job skills, don’t teach their students how to think, and don’t instill constructive work habits, why do employers so heavily reward educational success?  The best answer comes straight out of the ivory tower itself.  It’s called the signaling model of education — the subject of my book in progress, The Case Against Education.

According to the signaling model, employers reward educational success because of what it shows (“signals”) about the student.  Good students tend to be smart, hard-working, and conformist — three crucial traits for almost any job.  When a student excels in school, then, employers correctly infer that he’s likely to be a good worker.  What precisely did he study?  What did he learn how to do?  Mere details.  As long as you were a good student, employers surmise that you’ll quickly learn what you need to know on the job.

In the signaling story, what matters is how much education you have compared to competing workers.  When education levels rise, employers respond with higher standards; when education levels fall, employers respond with lower standards.  We’re on a treadmill.

Hitler Wins

Tuesday, November 29th, 2011

One of the most popular forms of alternate history is the story in which Hitler wins — but not all such stories are alternate history:

For more than half a century it has been an enjoyable creative exercise to imagine what kind of ALTERNATE HISTORY might have evolved had Germany won WORLD WAR TWO, and many novels and stories have been written to explore that assumption. But even before the rise of Adolf Hitler (1889-1945), novels like Milo HASTINGS’s City of Endless Night (June-November 1919 True Story as “Children of ‘Kultur’”; rev 1920) – some of the imagery of which influenced Fritz LANG’s Metropolis (1926) – envision the Germany of the future in stridently DYSTOPIAN terms; indeed, the first explicit Hitler-Wins tales were not exercises in the reimagining of history but Dreadful Warnings in the tradition of the FUTURE WAR tale: graphic anticipations of what might actually come to pass, unless something is done. The difference between these texts and later ALTERNATE HISTORY tales is profound. (For further discussion of the distinction between alternate history and the FUTURE WAR/FUTURE HISTORY, see bottom paragraph of text.)

The exceptionally nightmarish Swastika Night (1937) as by Murray Constantine (> Katherine BURDEKIN) is, therefore, not set in an alternate world, and nor are several others published 1939-1945. Other examples of FUTURE WAR fictions – or, as in the case of Swastika Night, with its long FEMINIST perspective over several centuries, more properly FUTURE HISTORY fictions – are Loss of Eden (1940; vt If Hitler Comes 1941) by Douglas BROWN and Christopher SERPELL, Then We Shall Hear Singing (1942) by Storm JAMESON, Grand Canyon (1942) by Vita SACKVILLE-WEST, If We Should Fail (1942) by Marion WHITE, I, James Blunt (1943 chap) by H V MORTON, The Bells Rang (1943) by Anthony ARMSTRONG and Bruce Graeme (1900-1982), When Adolf Came (1943) by Martin HAWKIN, the film The Silent Village (1943) directed by Humphrey Jennings (1907-1950), and Erwin LESSNER’s Phantom Victory: The Fourth Reich 1945-1960 (1944). The only genuine ALTERNATE HISTORY tale from these years seems to be We Band of Brothers (1939) by George Cecil FOSTER writing as Seaforth, in which conflict breaks out in 1938, ending a year later in the retirement of a successful Hitler and the founding of something like the United Nations. A subcategory – tales in which Hitler seems about to win, but loses an important battle or secret at the last moment – includes many borderline tales of warfare and espionage; among the serious examples are detailed fictional prognoses like Fred ALLHOFF’s Lightning in the Night (31 August-16 November 1940 LIBERTY; 1979), which predicts a US readiness to use nuclear weapons against Germany as a final resort, and Invasion: Being an Eyewitness Account of the Nazi Invasion (1940) by Hendrik Willem VAN LOON.

The death of Hitler in 1945 marked the end of the real WORLD WAR TWO in Europe, but for any number of reasons – the astonishing intensity (and intoxicating vacancy) of the evil he represented; the dreadful clarity of the consequences had the Allies failed; the melodramatic intensity of the conflict itself, with the whole war seeming (then and later) to turn on linchpin decisions and events; and (shamingly) the cheap aesthetic appeal of Nazism, with its Art Deco gear, its sanserif, Babylonian architecture, its brutal elites, its autobahns and Blitzes and Panzer strikes, its extremely attractive helmets, its secrecy and PARANOIA – the war very soon became a focus for speculative thought, and it was only a few months before the first alternate-world Hitler-wins tale was published (in HUNGARY): László Gáspár’s Mi, I. Adolf ["We, Adolf 1"] (1945). After Noel COWARD’s play, “Peace in our Time” (performed 1947; 1948), which is set in an ALTERNATE HISTORY LONDON just after the Nazis have won the Battle of Britain, the first significant example in English was SARBAN’s The Sound of His Horn (1952), which sinuously intertwines sadism and aesthetics into a vision of decadence with roots in Germany’s mythic past. The sardonic MEDIEVAL FUTURISM of the book, which Sarban may have taken from Swastika Night (see above), may have influenced – and certainly served as a tonal precedent for – several works both within the field, like Keith ROBERTS’s “Weihnachtsabend” (in New Worlds Quarterly 4, anth 1972, ed Michael MOORCOCK), and outside it, as in non-alternate-history fictional portrayals of Germany in faux-pastoral terms like The Birthday King (1962) by Gabriel Fielding (1916-1986) or Le Roi des Aulnes (1970; trans Barbara Bray as The Erl-King 1972 UK) by Michel Tournier (1924– ). A speculative essay of note is “If Hitler had Won World War II” (19 December 1961 Look) by William L Shirer.

The most famous single Hitler-wins sf tale is probably Philip K DICK’s The Man in the High Castle (1962), where the German and Japanese victory becomes a kind of poisonous backdrop for a complex tale set in a psychically devastated America; and the most telling commentary on the moral underside of the subgenre is Norman SPINRAD’s The Iron Dream (1972), in which the young Hitler, a failure at politics, becomes a pulp novelist whose tale Lord of the Swastika exploits, to savagely ironic effect, some of the responses of many readers to tales of “genuine” Nazi triumph.

(Hat tip to Dave Gottlieb, who mentioned it in a comment.)

The Walking Debt

Tuesday, November 29th, 2011

If only one man, Jon Stewart says, could embody the corporate-industrial-government complex in all its cluster$#@!itude:

(Hat tip to Borepatch.)

Apple’s Supply Chain

Tuesday, November 29th, 2011

Apple began innovating in its supply-chain management almost immediately upon Steve Jobs’s return in 1997:

At the time, most computer manufacturers transported products by sea, a far cheaper option than air freight. To ensure that the company’s new, translucent blue iMacs would be widely available at Christmas the following year, Jobs paid $50 million to buy up all the available holiday air freight space, says John Martin, a logistics executive who worked with Jobs to arrange the flights. The move handicapped rivals such as Compaq that later wanted to book air transport. Similarly, when iPod sales took off in 2001, Apple realized it could pack so many of the diminutive music players on planes that it became economical to ship them directly from Chinese factories to consumers’ doors. When an HP staffer bought one and received it a few days later, tracking its progress around the world through Apple’s website, “It was an ‘Oh s—’ moment,” recalls Fawkes.

That mentality — spend exorbitantly wherever necessary, and reap the benefits from greater volume in the long run — is institutionalized throughout Apple’s supply chain, and begins at the design stage. Ive and his engineers sometimes spend months living out of hotel rooms in order to be close to suppliers and manufacturers, helping to tweak the industrial processes that translate prototypes into mass-produced devices. For new designs such as the MacBook’s unibody shell, cut from a single piece of aluminum, Apple’s designers work with suppliers to create new tooling equipment. The decision to focus on a few product lines, and to do little in the way of customization, is a huge advantage. “They have a very unified strategy, and every part of their business is aligned around that strategy,” says Matthew Davis, a supply-chain analyst with Gartner (IT) who has ranked Apple as the world’s best supply chain for the last four years.

When it’s time to go into production, Apple wields a big weapon: More than $80 billion in cash and investments. The company says it plans to nearly double capital expenditures on its supply chain in the next year, to $7.1 billion, while committing another $2.4 billion in prepayments to key suppliers. The tactic ensures availability and low prices for Apple — and sometimes limits the options for everyone else. Before the release of the iPhone 4 in June 2010, rivals such as HTC couldn’t buy as many screens as they needed because manufacturers were busy filling Apple orders, according to a former manager at HTC. To manufacture the iPad 2, Apple bought so many high-end drills to make the device’s internal casing that other companies’ wait time for the machines stretched from six weeks to six months, according to a manager at the drillmaker.

Life as an Apple supplier is lucrative because of the high volumes but painful because of the strings attached. When Apple asks for a price quote for parts such as touchscreens, it demands a detailed accounting of how the manufacturer arrived at the quote, including its estimates for material and labor costs, and its own projected profit. Apple requires many key suppliers to keep two weeks of inventory within a mile of Apple’s assembly plants in Asia, and sometimes doesn’t pay until as long as 90 days after it uses a part, according to an executive who has consulted for Apple and would not speak on the record for fear of compromising the relationship.

Apple sounds suspiciously like Wal-Mart.

Fallen Axis

Monday, November 28th, 2011

The Onion reports that in an alternate universe, the Aryan Broadcasting Company-owned Sci-Fi Channel is broadcasting Fallen Axis, which portrays what would have happened if Germany had lost the war:

“Not only is Fallen Axis a chilling, what-if story of a world gone mad, it also asks a number of important questions about what Germany’s victory meant, and why its sacred mission was so critical to the fatherland and all of humankind,” said Hans von Winterstein, TV critic for the Deutsche-American Zeitung. “And Rolf Staal’s performance as former cowboy actor Henry Fonda II, the monstrous American president who attempts to spread his country’s insidious political and economic liberalism across the globe, will horrify even the most stoic among us.”

Producers said depicting the fictional, non-German-controlled America cost upwards of 40 million reichsmarks per episode, with much of the budget going toward recreating the cities of Washington, D.C. and New York exactly as they would have appeared before the famous tide-turning Luftwaffe strike of 1951. In addition, test audiences reported being impressed by the show’s painstaking portrayal of a topsy-turvy 2009 in which American big-band music plays on every radio, Mickey Mouse spouts pro-Semitic propaganda from every cinema screen, and dilution of the supreme race runs rampant.

The show is considered by many to be another boon to the Sci-Fi Channel’s fall schedule, which also includes Battlestar Gleichschaltung, a weekly drama about a starship crew that enforces the total coordination of intergalactic society and commerce, and the hit reality series Jew Hunters, in which a team of paranormal investigators scour banks and former Polish ghettos in search of Jewish spirits.

Peace is an interesting ideal

Monday, November 28th, 2011

Peace is an interesting ideal, Rory Miller says, depending on how you define it:

Like a lot of ideals, it’s squishy enough that you can have other ideals directly opposed to your stated ends and throw enough words into the justification to miss the point.

The thing that gets me about peace activists is that peace is not a thing. It is the absence of another thing. Depending on how you define it, the absence of war or violence or conflict. Depending on how you define those, ‘peace’ ranges from a difficult improbability to an absurd impossibility. In any case where you are looking at an absence, you must look at the thing you want to remove.

You can’t effectively work for peace without taking a good hard look at war or violence or conflict (or all three, depending on your definition). And not a knee-jerk, disapproving look, either. A good hard look at why, if something is so bad, it is so prevalent. Why, if something must be fixed, it is so endemic in the natural world.

It is exactly like any other group attempting to censor or ban any other thing. Prohibition was an ideal, largely put forward by self-righteous teetotalers. People talk about violence, it seems to me, the way that they talked about sex in the fifties. They don’t. Most talk around it. If you have anything to say from experience, you are marginalized.

It kills dialogue. More to the point, it kills progress. Medicine advances as we learn more about disease. We solve problems by studying problems, not by meditating on an imaginary, problem-free end state. I guess, in a way, that is the defining difference between a peace-maker and a peace activist.

Crime fighting is an ideal, he says, just like peace:

And we won’t make progress until we take a good hard look at why crime is prevalent. Which means acknowledging that it works. It satisfies needs. It’s not just that there is little opportunity for honest employment in certain areas. There are damn few jobs, much less entry-level jobs, where you can make thousands of dollars a week, get automatic deference and an instant family.

Crime fighting is an attempt, instead of lowering the rewards of the criminal lifestyle, to raise the risks. Catch ’em, book ’em, hard time. You have to take a look, a hard look at whether that is a risk or even a punishment in this subculture… or just the way rugby players think about the occasional injury. I don’t think surveys will help… but I recall the young man about to be transported to prison for the first time at the tender age of eighteen. He was excited. In his family, doing time in prison was the rite of passage to manhood. Jail didn’t count.

And this is where we get to criminals. We look at them from our point of view and our world. Most of the things that make a career criminal would be and are profoundly dysfunctional in polite society. So we look at our world and us and the criminal and try to ‘fix’ what is ‘broken’.

There is nothing broken. For the most part (possible mental illness and stuff aside) the serious criminal is not incomplete. There is no pathology. He is perfectly adapted for his world. The things that we think of as normal and good, the things we try to instill when we rehabilitate, might be profoundly dangerous behaviors when he goes back to his old haunts and sees his old friends.

We pretend we are fixing a person, but in reality we are trying to reshape him into a person that makes us more comfortable. Altering a human for our purposes, not his. In the process making him more likely to die in his natural environment and he damn well knows it.

The few people I know who have truly rehabilitated themselves, started by deciding they wanted to live in the non-criminal world. That’s rare.

How The Empire Strikes Back Should Have Ended

Sunday, November 27th, 2011

How The Empire Strikes Back should have ended:

“This is not a negotiation.”

Wild Cards

Sunday, November 27th, 2011

When I first watched Heroes, it reminded me of G.R.R. Martin’s Wild Cards anthology, which brought a number of sci-fi authors together to write gritty and “realistic” superhero stories in a shared fictional universe.

Then Martin’s Song of Ice and Fire series made its way to HBO. With the success of both Heroes and A Game of Thrones, then, we shouldn’t be surprised that Syfy Films has acquired the rights to Wild Cards:

The tales, written initially by science fiction and fantasy authors who also included Roger Zelazny and Lewis Shiner, among others, provided an alternate history of Earth and told superhero stories grounded in realism, a strategy that would be emulated in both comics and, later, in movies such as the recent Christopher Nolan-directed Batman films.

“We had a love of comic books and superheroes that we grew up on,” Martin, who had a fan letter published in a Marvel comic in the 1960s, tells The Hollywood Reporter. “But we approached the material differently. We wanted to do it in a grittier, more adult manner than what we were seeing in the ’80s. It’s something that many other people have been doing in the decades ever since.”

One of the unique aspects of the books (the series has changed publishers several times; it is now on volume 22) is the way the characters evolve. Some age, some marry, some die, new ones are introduced, building a tapestry of stories.

“One of the things we have going is the sense of history,” he says. “The comics in the mainstream are doing retcons [retroactive continuity] all the time. [Heroes] get married, then one day, the publisher changes his mind, and then they’re no longer married. To my mind, it’s very frustrating. [Our stories] are in real time. It’s a world that is changing in parallel to our own.”

“This is, beyond Marvel and DC, really the only universe where you have fully realized, fully integrated characters that have been built and developed over the course of 25 years,” says Gregory Noveck, Syfy Films’ senior vp production who joined the division in May and who targeted the books for acquisition. “The trick for us is to find what’s the best movie.”

It really does seem like a better fit for a series.

(Hat tip to Mitro at the Alternate History Weekly Update.)

How The Wizard of Oz Should Have Ended

Saturday, November 26th, 2011

How The Wizard of Oz should have ended:

(Hat tip to Borepatch.)

Freedom Betrayed

Friday, November 25th, 2011

Herbert Hoover is largely forgotten — or misremembered — today, but he was quite an accomplished man, Gerald Russello reminds us:

His many books include an English translation, with his wife, of an important Latin mining treatise. He was also a highly effective organizer, leading the Red Cross’s relief efforts during World War I, and raising massive amounts of funds to relieve Finland during World War II. And of course, he was a successful politician as well. Even after being defeated by FDR in 1932, Hoover roared back with a bestselling book attacking Nazism, socialism, and the New Deal–style liberalism he saw as an antecedent to socialism.

Hoover’s most important accomplishment may be his three-volume Freedom Betrayed, which is just now getting published:

Hoover wrote it over the decades after losing the 1932 election, but for various reasons was reluctant for most of his life to publish the “magnum opus,” as he called it, and so it has waited quietly in the archives of the Hoover Institution.

Hoover represents an older American tradition, one almost eclipsed since the New Deal. Having seen the horrors of war during World War I, he had no interest in seeing American lives lost in another bloody conflict. He was anti-interventionist, even in World War II, and he was keenly aware of the Communist infiltration of the federal government, which he thought more likely given Roosevelt’s left-leaning policies. On the second point, his suspicions largely proved right, as we know from the released Venona cables and other data from Soviet Russia: the Communists indeed were actively recruiting Americans and trying to change American policy, and there were sympathetic ears even in Washington elite circles.

The former position is trickier to defend, even now in the age of the Tea Party and Patrick Buchanan. Hoover, in a detailed analysis, argues that America faced no threat from European powers, which should be left to work out problems for themselves. Hoover was no anti-Semite, nor was he indifferent to the fate of the oppressed peoples of Europe or a member of America First. Hoover favored the Wagner-Rogers Bill, which would waive immigration restrictions for German Jews, and raised money to place German-Jewish scholars in American universities. But he represented a tradition, traceable to George Washington, that looks with a skeptical eye at claims for foreign entanglements and calls to become the world’s policeman. He favored letting Germany and Russia exhaust themselves first, as he stated in a public radio address in June 1941 after the Nazi invasion of Russia. His voice was ultimately drowned out by the Pearl Harbor attack, though he collects scrupulous evidence in this volume of some intelligence pointing to such an attack, a question that is still hotly debated.

Letting Germany and Russia exhaust themselves seems like such an obvious strategy, especially with Japan jumping to the front of the line.


Thursday, November 24th, 2011

It turns out that I’ve discussed Thanksgiving before — once or twice: