Hatred Begins at Home

Friday, August 17th, 2007

In Hatred Begins at Home Peggy Noonan looks at “what turns young Westerners into jihadis” — but she starts with an anecdote from another time and place:

Whenever I think of war, I think of this: It was 1982 or ’83, I was in Northern Ireland, and a local reporter was showing me around Derry, then a center of the Protestant-Catholic conflict. The neighborhood we were in was beat up, poor, with Irish Republican Army graffiti on tired walls. There were some scraggly kids on the street.

Suddenly an armored British army vehicle slowly rounded the corner, and the street came alive with kids pouring out of houses, grabbing the heavy metal lids of garbage bins, and smashing them against the pavement. They made quite a racket.

A woman came out. She was 35 or 40, her short hair standing up, uncombed. It was late afternoon, but she was in an old robe, and you could tell it was the robe she lived in. She stood there and smirked as the soldiers went by. She’d come out to register her dislike for the Brits, and to show the children she approved of their protest.

As I watched this nothing sort of scene, I thought: That’s where it comes from. That’s what keeps it alive.

A Connecticut Yankee In King Arthur’s Court

Thursday, August 16th, 2007

A few weeks back, while discussing the notion of bootstrapping society, a colleague mentioned that he had a first edition of A Connecticut Yankee In King Arthur’s Court, and he’d lend it to me if it would inspire me to read it.

I waved off the offer, not wanting to risk even a small chance of harming a rare book, but he made sure I got the book, and I made sure I read it.

One great advantage to reading the first edition is that it comes lavishly illustrated by Daniel Beard. (And one curiosity is that the first edition’s cover says A Yankee in King Arthur’s Court — the Connecticut doesn’t appear until the title page.)

At any rate, the book matched my expectations in some ways and defied them in others. For instance, from what I’d osmotically absorbed over the years, I’d assumed the book was largely about the protagonist’s Yankee ingenuity and technical superiority over the primitives of King Arthur’s court:

Twain’s book precipitated an entire sub-genre of science fiction, characterized by the depiction of a modern time traveller arriving at an ancient society, anachronistically introducing modern technologies and institutions and completely changing its character.

The best-known example is L. Sprague de Camp's Lest Darkness Fall, in which an American archaeologist of the 1930s arrives in Ostrogothic Italy and manages to prevent the Dark Ages by introducing printing and other modern inventions. Leo Frankowski wrote the Conrad Stargard series, in which a 20th-century Pole arrives in 13th-century Poland and by rapid industrialization manages to defeat the Mongol invasion, as well as completely annihilating the Teutonic Knights. Poul Anderson presented an antithesis in his story The Man Who Came Early, in which a modern American who finds himself in Viking Iceland fails to introduce modern technologies despite being an intelligent, competent and well-trained engineer, and finds that in a 10th-century environment 10th-century technologies work best.

In actuality, the book is more about the modern liberal ideals of America, as it repeatedly attacks the notions of hereditary aristocracy, slavery, and an established church.

DNA, A Prescott Educational Film

Thursday, August 16th, 2007

Step 3: Redefine the Game World

Thursday, August 16th, 2007

In Step 3: Redefine the Game World, Ryan Dancey, who used to be brand manager for the paper-and-pencil Dungeons & Dragons game, explains that one of the key advantages of such paper-and-pencil games is persistence — which seems like something online games should handle just fine:

The place we stand [and] fight is on persistence. The MMORPGs [Massively Multiplayer Online Role-Playing Games] have a big problem with persistence. In effect, they are a write-once, read-many application. That means that the developers spend a lot of time creating an environment for everyone to play in at the same time. If the participants are given the power of persistence, a small group of players (those who play the most, or those who spend the most time figuring out how to manipulate the game environment) will dominate, and new players, or less active players, will find themselves in a nearly incomprehensible environment.

Ultima Online, the first of the modern MMORPG games, tried this strategy, and exposed most of its fail points. It was, at least in the beginning, a fairly persistent world, with a lot of emergent behavior. The idea was that the world would be a living place, and that player actions would have a lot of impact on the world. Unfortunately, what happened rather quickly is that the “game” of beating the world’s limits became more important than the “game” of telling the world’s stories. A small group (as a percentage of the total) of the users constantly found ways to derail the simulation, forcing numerous resets, arbitrary limits, and other top-down control mechanisms, combined with a social-play pattern that made no sense, outside the unique environment of UO itself.

For example, one thing that happened in UO due to persistence was that roaming groups of players would take it on themselves to kill, and keep killing, new low-level players. As a result, starting a new PC (or coming into the game as a new player) became almost intolerable. Every few minutes, for no “story”-based reason, the PC would die and have to go through the mechanics of returning to the game. This tactic became known as “griefing”: the play pattern of deriving fun from making some other player’s life miserable. It plays a part in virtually all MMORPG experiences to a greater or lesser degree.

Another example was illogical manipulation of the environment. A group of players noticed that a certain kind of monster fed mostly on sheep, but if there were no sheep available, it would feed on other creatures. By playing as a group, they were able to slaughter all the ‘normal’ food for this creature in a given area, and keep the food from successfully “respawning”. The monster, as a result, eventually followed the dictates of its programming and began attacking PC characters. Since the creature in question was a Dragon, and only the very highest level PCs, working as a team, had the ability to fight one and win, this essentially meant that any character, at any time, could suddenly get killed by a threat it was (within the context of the story) not supposed to have to be dealing with. The result? People banded together to try to keep the sheep alive. Woo-hoo! That’s some fun gaming there!

The result was widespread unhappiness — in fact, the result was laying the groundwork for EverQuest, which gave people a lot of what they wanted (massively multiplayer roleplaying) without a lot of the griefing, at the expense of persistence.

Most of the games that followed UO tried to “learn” from this experience by downplaying the elements of persistence. World of Warcraft is at best “quasi-persistent”. The only persistent in-game effect over which players have control is the creation of certain objects, which can be exchanged between players, or sold for cash, but which cannot be left in the environment for others to find, given to (or used by) NPCs, etc.
[...]
The MMORPG format has a probably unfixable problem with player driven persistence that is inherent to the platform: There is little (if any) human moderation of player actions. If it were possible for a PC to carve its initials on walls, in short order, every carvable surface in the game would be so covered, because all it takes is a small minority of the participants to decide that’s “fun” to translate into mass defacement.

Example: Players in World of Warcraft discovered that a highly contagious, and very deadly, disease was spread by a monster in a specific dungeon. They also discovered that characters of a certain class and level could become infected with this disease, but could be kept alive for long periods by higher-level characters with sufficient healing resources. Groups on every server, acting independently but on the same strategy, managed to get an infected character out of the dungeon and back to the major cities of the world, whereupon the disease, which was never designed to be exposed to anyone not in that dungeon, infected all the characters playing in those locations. For several hours, the game became almost unplayable, until Blizzard patched the game and removed the exploit.

Blows Against the Empire

Wednesday, August 15th, 2007

Philip K. Dick is best known for the films loosely based on his stories: Blade Runner, Total Recall, Minority Report, etc.

Now, the Library of America has bestowed a certain amount of respectability on his work by compiling Four Novels of the 1960s, a collection including The Man in the High Castle, The Three Stigmata of Palmer Eldritch, Do Androids Dream of Electric Sheep?, and Ubik.

In Blows Against the Empire, New Yorker writer Adam Gopnik examines the troubled writer and his work:

Of all American writers, none have got the genre-hack-to-hidden-genius treatment quite so fully as Philip K. Dick, the California-raised and based science-fiction writer who, beginning in the nineteen-fifties, wrote thirty-six speed-fuelled novels, went crazy in the early seventies, and died in 1982, only fifty-three. His reputation has risen through the two parallel operations that genre writers get when they get big. First, he has become a prime inspiration for the movies, becoming for contemporary science-fiction and fantasy movies what Raymond Chandler was for film noir: at least eight feature films, including “Total Recall,” “Minority Report,” “A Scanner Darkly,” and, most memorably, Ridley Scott’s “Blade Runner,” have been adapted from Dick’s books, and even more — from Terry Gilliam’s “Brazil” to the “Matrix” series — owe a defining debt to his mixture of mordant comedy and wild metaphysics.

But Dick has also become for our time what Edgar Allan Poe was for Gilded Age America: the doomed genius who supplies a style of horrors and frissons.

[...]

Dick’s early history is at once tormented, hustling, and oddly lit by the bright California sunshine of the late fifties. Born in 1928, he had a twin, a sister named Jane, who died when she was only a month old; like Elvis Presley, who also had a twin sibling who died, Dick seems to have been haunted for the rest of his life by his missing Other. He seems to have blamed his mother, unfairly, for her death, poisoning their relations. He had one of those classic, bitter American childhoods, with warring parents, and was dragged back and forth across the country. He had loved science fiction since boyhood — he later told of how at twelve he had a dream of searching in Astounding Stories for a story called “The Empire Never Ended” that would reveal the mysteries of existence — and he began writing quickie sci-fi novels for Ace in the fifties and sixties. “I love SF,” he said once. “I love to read it; I love to write it. The SF writer sees not just possibilities but wild possibilities. It’s not just ‘What if’ — it’s ‘My God; what if’ — in frenzy and hysteria. The Martians are always coming.” The hysteria suited him. He seems to have been a man of intellectual passion and compulsive appetite (he was married five times), the kind of guy who can’t drink one cup of coffee without drinking six, and then stays up all night to tell you what Schopenhauer really said and how it affects your understanding of Hitchcock and what that had to do with Christopher Marlowe.

By the way, Blade Runner fans will want to pick up the new five-disc ultimate collector’s edition.

Sikh To Death

Tuesday, August 14th, 2007

In Sikh To Death, “war nerd” Gary Brecher shares some of the sect’s martial history:

Then along comes the founder of Sikhism, Nanak, and says, “There is no Muslim, there is no Hindu.” Meaning the Hell with both of you. Sikhs were radicals from the start. All the little traditions people know about them started out as in-your-face rebel yells in the Punjab. Like those beards: only the Mughal were allowed to wear long hair and beards. So the Sikh all let theirs grow longer than John and Yoko’s. That name, “Singh,” every Sikh guy has? It means “Lion” but the real point is that it replaced all the caste names they had before. Like Malcolm making his last name “X.”
[...]
The Sikhs evolved a theory of warfare called “the two-and-a-half strikes.” You got a full point for ambushes and hit-and-run attacks, but only a half point for pitched battles where you lost a lot of your own men. Nathan Bedford Forrest, Francis Marion and Patton himself would have agreed.

By 1810 the Sikhs had driven the Mughals out of the Punjab. They owned the place, literally: They had an independent Sikh kingdom running there, and by all accounts it was the one place in India where something sorta resembling law and order actually prevailed.

The only reason the Sikhs didn’t go on to run all of India and maybe the world is simple: They ran into the Brits. Same reason the Zulu didn’t get to own all of southern Africa. A lot of big, strong tribes were on the move in Queen Victoria’s time, and the same thing happened to most of them: They met the Brits, and that was all she wrote.

Ranjit Singh, the ruler of the Punjab, was smart enough to sign a treaty with the Brits, keep a strong army to back it up, and avoid the sort of little faked “border incidents” the Raj loved to use to start a war. When he died in 1839, the Punjab fell into the usual bickering, and the Brits pounced.

I keep telling you, the Brits circa 1840 weren’t the cute little Monty Python guys you imagine. They were stone killers, the best since the Romans, totally ruthless, no more conscience than a drain contractor. They saw the Sikhs fighting among themselves and went for it.

Even then, even with Sikh traitors fighting for the Brits, the Sikhs had the best of the first Anglo-Sikh war. The Brits lost more than 2,000 men in the first battle, Ferozeshah, in 1845, and were on the verge of offering unconditional surrender when reinforcements arrived and overwhelmed the Khalsa, the Sikh army. The second war, in 1849, was easier, because the Brits, who knew more about occupation than our lame Bremer clones ever will, used the three years in between to bribe, assassinate and divide the Sikh elite. Even so, the Sikh cavalry, fighting basically without any leaders, slaughtered the British cavalry at the battle of Chillianwalla, smacking down the redcoats’ little ceremonial swords with their big scimitars. I’ve read Brit officers’ accounts of that battle, and they say something you get in all accounts of the Sikh: how big and strong the bastards are. The Brits said they felt like children beside the Sikh horsemen, and there’s a really funny picture of a white officer surrounded by Sikh soldiers, looking like a pasty little midget with his bodyguards.

And you know the best thing about the Sikhs? They don’t waste time holding grudges. The Brits won; they accepted it, worked with it, and in a few years they were the core of the Raj’s army.

Conspiracy Theories

Tuesday, August 14th, 2007

Bruce Schneier cites an intriguing New Scientist article on conspiracy theories and why people believe them:

So what kind of thought processes contribute to belief in conspiracy theories? A study I carried out in 2002 explored a way of thinking sometimes called “major event-major cause” reasoning. Essentially, people often assume that an event with substantial, significant or wide-ranging consequences is likely to have been caused by something substantial, significant or wide-ranging.

I gave volunteers variations of a newspaper story describing an assassination attempt on a fictitious president. Those who were given the version where the president died were significantly more likely to attribute the event to a conspiracy than those who read the one where the president survived, even though all other aspects of the story were equivalent.

To appreciate why this form of reasoning is seductive, consider the alternative: major events having minor or mundane causes — for example, the assassination of a president by a single, possibly mentally unstable, gunman, or the death of a princess because of a drunk driver. This presents us with a rather chaotic and unpredictable relationship between cause and effect. Instability makes most of us uncomfortable; we prefer to imagine we live in a predictable, safe world, so in a strange way, some conspiracy theories offer us accounts of events that allow us to retain a sense of safety and predictability.

Mixed Feelings

Tuesday, August 14th, 2007

In Mixed Feelings, Sunny Bains tells a number of fascinating stories about sensory prosthetics:

For six weird weeks in the fall of 2004, Udo Wächter had an unerring sense of direction. Every morning after he got out of the shower, Wächter, a sysadmin at the University of Osnabrück in Germany, put on a wide beige belt lined with 13 vibrating pads — the same weight-and-gear modules that make a cell phone judder. On the outside of the belt were a power supply and a sensor that detected Earth’s magnetic field. Whichever buzzer was pointing north would go off. Constantly.

“It was slightly strange at first,” Wächter says, “though on the bike, it was great.” He started to become more aware of the peregrinations he had to make while trying to reach a destination. “I finally understood just how much roads actually wind,” he says. He learned to deal with the stares he got in the library, his belt humming like a distant chain saw. Deep into the experiment, Wächter says, “I suddenly realized that my perception had shifted. I had some kind of internal map of the city in my head. I could always find my way home. Eventually, I felt I couldn’t get lost, even in a completely new place.”

The effects of the “feelSpace belt” — as its inventor, Osnabrück cognitive scientist Peter König, dubbed the device — became even more profound over time. König says while he wore it he was “intuitively aware of the direction of my home or my office. I’d be waiting in line in the cafeteria and spontaneously think: I live over there.” On a visit to Hamburg, about 100 miles away, he noticed that he was conscious of the direction of his hometown. Wächter felt the vibration in his dreams, moving around his waist, just like when he was awake.
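The belt's core behavior, picking whichever of its 13 pads currently points north, is a simple angular mapping. Here's a minimal sketch in Python (the 13 evenly spaced pads come from the article; the heading convention and function name are my assumptions):

```python
def active_pad(heading_deg, num_pads=13):
    """Index of the pad that currently points north.

    heading_deg: the wearer's compass heading in degrees
    (0 = facing north, increasing clockwise).
    Pad 0 is assumed to sit at the front of the belt, with the
    remaining pads evenly spaced clockwise around the waist.
    """
    # North lies at (-heading) degrees clockwise from the wearer's
    # front: facing east (90 deg), north is 270 deg clockwise.
    angle_from_front = (-heading_deg) % 360
    pad_width = 360 / num_pads          # ~27.7 degrees per pad
    return round(angle_from_front / pad_width) % num_pads

print(active_pad(0))    # facing north: the front pad buzzes (0)
print(active_pad(90))   # facing east: a pad on the wearer's left
```

A real implementation would also smooth the compass readings; the modulo already handles the wrap from the last pad back to pad 0.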

There are all kinds of senses humans don’t naturally have:

Direction isn’t something humans can detect innately. Some birds can, of course, and for them it’s no less important than taste or smell are for us. In fact, lots of animals have cool, “extra” senses. Sunfish see polarized light. Loggerhead turtles feel Earth’s magnetic field. Bonnethead sharks detect subtle changes (less than a nanovolt) in small electrical fields. And other critters have heightened versions of familiar senses — bats hear frequencies outside our auditory range, and some insects see ultraviolet light.

In the 1960s, Paul Bach-y-Rita installed a 20-by-20 array of metal rods in the back of an old dentist chair, and people could “see” pictures poked into their backs:

Having long ago abandoned the vaguely Marathon Man-like dentist chair, the team now uses a mouthpiece studded with 144 tiny electrodes. It’s attached by ribbon cable to a pulse generator that induces electric current against the tongue. (As a sensing organ, the tongue has a lot going for it: nerves and touch receptors packed close together and bathed in a conducting liquid, saliva.)

So what kind of information could they pipe in? Mitch Tyler, one of Bach-y-Rita’s closest research colleagues, literally stumbled upon the answer in 2000, when he got an inner ear infection. If you’ve had one of these (or a hangover), you know the feeling: Tyler’s world was spinning. His semicircular canals — where the inner ear senses orientation in space — weren’t working. “It was hell,” he says. “I could stay upright only by fixating on distant objects.” Struggling into work one day, he realized that the tongue display might be able to help.

The team attached an accelerometer to the pulse generator, which they programmed to produce a tiny square. Stay upright and you feel the square in the center of your tongue; move to the right or left and the square moves in that direction, too. In this setup, the accelerometer is the sensor and the combination of mouthpiece and tongue is the transducer, the doorway into the brain.

The researchers started testing the device on people with damaged inner ears. Not only did it restore their balance (presumably by giving them a data feed that was cleaner than the one coming from their semicircular canals) but the effects lasted even after they’d removed the mouthpiece — sometimes for hours or days.
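The tilt-to-square mapping the team describes can be sketched the same way. Assuming the 144 electrodes form a 12-by-12 grid and taking an illustrative 30 degrees as the full-scale tilt (neither detail is given in the article beyond the electrode count):

```python
def square_column(tilt_deg, grid_size=12, max_tilt=30.0):
    """Column of the tongue display where the square is drawn.

    tilt_deg: lateral body tilt in degrees, negative = leaning left,
    positive = leaning right. Upright puts the square mid-tongue;
    tilting slides it toward the corresponding edge.
    """
    # Clamp to the displayable range, then scale linearly onto columns.
    t = max(-max_tilt, min(max_tilt, tilt_deg))
    return round((t + max_tilt) / (2 * max_tilt) * (grid_size - 1))

print(square_column(0))     # upright: near the center column
print(square_column(45))    # hard right lean: pinned to column 11
```

Forward/backward lean would get the same treatment on rows, and the actual device presumably drives a small block of electrodes rather than a single column.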

The author tried out a rig connecting camera goggles to the tongue zapper:

I cranked up the voltage of the electric shocks to my tongue. It didn’t feel bad, actually — like licking the leads on a really weak 9-volt battery. Arnoldussen handed me a long white foam cylinder and spun my chair toward a large black rectangle painted on the wall. “Move the foam against the black to see how it feels,” she said.

I could see it. Feel it. Whatever — I could tell where the foam was. With Arnoldussen behind me carrying the laptop, I walked around the Wicab offices. I managed to avoid most walls and desks, scanning my head from side to side slowly to give myself a wider field of view, like radar. Thinking back on it, I don’t remember the feeling of the electrodes on my tongue at all during my walkabout. What I remember are pictures: high-contrast images of cubicle walls and office doors, as though I’d seen them with my eyes. Tyler’s group hasn’t done the brain imaging studies to figure out why this is so — they don’t know whether my visual cortex was processing the information from my tongue or whether some other region was doing the work.

I later tried another version of the technology, meant for divers. It displayed a set of directional glyphs on my tongue intended to tell them which way to swim. A flashing triangle on the right would mean “turn right,” vertical bars moving right say “float right but keep going straight,” and so on. At the University of Wisconsin lab, Tyler set me up with the prototype, a joystick, and a computer screen depicting a rudimentary maze. After a minute of bumping against the virtual walls, I asked Tyler to hide the maze window, closed my eyes, and successfully navigated two courses in 15 minutes. It was like I had something in my head magically telling me which way to go.

This leads into a device for helping pilots:

First we set a baseline. Schnell sat me down in front of OPL’s elaborate flight simulator and had me fly a couple of missions over some virtual mountains, trying to follow a “path” in the sky. I was awful — I kept oversteering. Eventually, I hit a mountain.

Then he brought out his SOES, a mesh of hard-shell plastic, elastic, and Velcro that fit over my arms and torso, strung with vibrating elements called tactile stimulators, or tactors. “The legs aren’t working,” Schnell said, “but they never helped much anyway.”

Flight became intuitive. When the plane tilted to the right, my right wrist started to vibrate — then the elbow, and then the shoulder as the bank sharpened. It was like my arm was getting deeper and deeper into something. To level off, I just moved the joystick until the buzzing stopped. I closed my eyes so I could ignore the screen.

Finally, Schnell set the simulator to put the plane into a dive. Even with my eyes open, he said, the screen wouldn’t help me because the visual cues were poor. But with the vest, I never lost track of the plane’s orientation. I almost stopped noticing the buzzing on my arms and chest; I simply knew where I was, how I was moving. I pulled the plane out.

At Australia’s Bunny Fence, Variable Cloudiness Prompts Climate Study

Tuesday, August 14th, 2007

At Australia’s Bunny Fence, Variable Cloudiness Prompts Climate Study:

The rabbit-proof fence — or bunny fence — in Western Australia was completed in 1907 and stretches about 2,000 miles. It acts as a boundary separating native vegetation from farmland. Within the fence area, scientists have observed a strange phenomenon: above the native vegetation, the sky is rich in rain-producing clouds. But the sky on the farmland side is clear.

Why?

One theory is that the dark native vegetation absorbs and releases more heat into the atmosphere than the light-colored crops. These native plants release heat that combines with water vapor from the lower atmosphere, resulting in cloud formation.

Another hypothesis is that the warmer air on the native scrubland rises, creating a vacuum in the lower atmosphere that is then filled by cooler air from cropland across the fence. As a result, clouds form on the scrubland side.

A third idea is that a high concentration of aerosols — particles suspended in the atmosphere — on the agricultural side results in small water droplets and a decrease in the probability of rainfall. On the native landscape, the concentration of aerosols is lower, translating into larger droplets and more rainfall.

(Hat tip to FuturePundit.)

East River Fights Bid to Harness Its Currents for Electricity

Tuesday, August 14th, 2007

There’s good news, and there’s bad news, for “green” power company Verdant Power, as the East River fights Verdant’s bid to harness its currents (tides, really) for electricity:

North of the bridge, black cables snake out of the churning surface of the East River. They connect a makeshift control room inside an old shipping container on the island to a battery of futuristic mechanisms that could shape an energy future that does not pollute or use foreign oil — if a five-year-old company named Verdant Power can work out all the bugs.

Weeks after they were formally dedicated by Mayor Michael R. Bloomberg, six underwater turbines that turn the river’s currents into electricity have been shut down for repairs and a basic redesign. The East River’s powerful tides have been wreaking havoc with the giant turbine blades since the first two were installed in December.

“But the good thing is that there’s more power in the East River than we thought,” said Mollie E. Gardner, a geologist for Verdant Power, which owns the equipment.
[...]
It has been a rough eight months for Verdant. Days after the first two turbines were lowered into the water, the East River’s powerful currents sheared off the tips of several blades about a third of the way down.

New blades were ordered, made of a cast aluminum that theoretically would hold up better. They replaced the ones that were broken, and were also installed on four more turbines that were lowered into the river’s eastern channel earlier this year.

Together, the turbines were capable of producing about 1,000 kilowatt hours a day of clean electricity. But the East River tides have proved too formidable even for the stronger blades, putting excessive strain on the bolts that hold them to the turbine hubs.

To keep them from coming apart, all six of the 20-foot-tall mechanisms, which resemble ship propellers on masts, have been shut down for repairs and may not be back in operation until November.

“The only way for us to learn is to get the turbines into the water and start breaking them,” said Trey Taylor, the habitually optimistic founder of Verdant Power.
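As a sanity check on the quoted figure, 1,000 kilowatt-hours a day across six turbines works out to surprisingly modest average power. (The per-turbine split is my arithmetic, not the article's, and since tidal turbines only generate near peak flow, instantaneous output runs higher than the average.)

```python
# Back-of-the-envelope check on the quoted 1,000 kWh/day figure.
daily_energy_kwh = 1000
hours_per_day = 24
num_turbines = 6

avg_power_kw = daily_energy_kwh / hours_per_day   # ~41.7 kW
per_turbine_kw = avg_power_kw / num_turbines      # ~6.9 kW

print(f"{avg_power_kw:.1f} kW average across the array")
print(f"{per_turbine_kw:.1f} kW average per turbine")
```

Roughly 7 kW per 20-foot machine, averaged over the day, which puts the engineering struggle in perspective: the blades were shearing off for the sake of small-wind-turbine levels of output.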

Tide-mills have their advantages:

Hydro turbines have a few advantages over windmills. While winds are erratic, tides can be charted by the minute, which allows power companies to know exactly when the turbines will be generating power.

Can a 60-year-old drug cure obesity?

Tuesday, August 14th, 2007

Can a 60-year-old drug cure obesity?:

After learning that blocking the brain’s histamine-1 receptor causes weight gain, Tel Aviv-based nutrition expert Nir Barak went hunting for a drug that would stimulate that receptor. He found Betahistine, which has been used to treat vertigo since the 1940s. Betahistine was pulled off the U.S. market in 1970 when the FDA began scrutinizing drugs more rigorously and demanded a new round of clinical trials.

Betahistine’s manufacturer, Unimed, never complied. A generic version is still sold in Europe, but the compound is no longer under patent protection in the United States.

When Barak discovered there had been no inherent issue with the drug’s safety, he knew he’d hit the jackpot. So did Bio-Light Israel Life Science Investments, which has funded all clinical development trials for Barak’s new company, Obecure.

Barak got weight-loss guru Robert Kushner to conduct a double-blind clinical trial. Results haven’t yet been finalized, but Kushner says participants lost up to 12 percent of their body weight.

“They were telling me, ‘It wasn’t hard. I wasn’t thinking about food. I was content,’” Kushner says. “And there were no side effects to speak of.”

Stimulating the histamine-1 receptor appears to reduce the craving not only for food in general but for fatty foods in particular. Less fat usually means less cholesterol, so Obecure is also targeting the $28 billion market for cholesterol-reducing drugs like Lipitor.

“I can’t think of a better time in history to have such a product out,” says Kathryn Harrigan, a professor at Columbia Business School.

The bigger question is, Can a 60-year-old drug make money? It’s off patent, right?

The electronic cigarette

Tuesday, August 14th, 2007

A Chinese company has developed the electronic cigarette:

For smokers who want to quit, there are pills, patches, and gum. But how about an electronic nicotine delivery device that looks and feels like smoking — without the smell or the carcinogens?

That’s what Hong Kong-based Golden Dragon Group is selling. Known as Ruyan (meaning “like smoking”), the electronic cigarette is a $208 battery-powered atomizer.

Cartridges containing pure nicotine, available in three strengths and good for some 350 puffs each, cost about $4.
[...]
Ruyan was launched in China in 2004, and last year its sales reached $36.5 million. Turkey quickly became Golden Dragon’s second-largest market, followed by Israel and Australia.

Working with an unnamed U.S. partner to get FDA approval, Golden Dragon expects to double current sales by the end of the year. Morgan Stanley analyst David Adelman says the e-cigarette would be lucky to snare even 1 percent of the U.S. cigarette market.

Still, that would add up to a healthy (cough) $750 million.

Who bears the responsibility for motorcycle accidents?

Monday, August 13th, 2007

Who bears the responsibility for motorcycle accidents?

The Wall Street Journal reports that “adjusted for miles traveled, [motorcycle] riders were 34 times more likely to die in a highway accident than occupants of passenger cars in 2004.” The article also points out that motorcycles do much worse in accidents than cars.

Imagine that a car driver makes a mistake and crosses a traffic lane without looking. As a result the driver hits a motorcycle and does, let’s say, £1 million of damage to the motorcyclist. But let’s also assume that if the motorcyclist were driving a car he would have suffered trivial damages. So my question is who should be responsible for the damage to the motorcyclist? The car driver’s negligence caused the accident. But if the motorcyclist were driving a more crash-worthy vehicle he wouldn’t have suffered anywhere near £1 million of damage so shouldn’t he bear most of the cost of the accident?

Tek Jansen Comic

Monday, August 13th, 2007

OK, heroes, prepare yourselves for Stephen Colbert’s Tek Jansen #1:

Solar plexus! Bursting out from the hit Comedy Central show, THE COLBERT REPORT–it’s STEPHEN COLBERT’S TEK JANSEN! In this stunning continuation of Stephen Colbert’s critically acclaimed, yet unpublished prose novel, everyone’s favorite sci-fi hero must stand against the enemies of freedom no matter what dark planet they crawl from! Each issue features two stories: A main serialized story written by John Layman and Tom Peyer with art by Scott Chantler & an independent backup story written by Jim Massey with art by Robbi Rodriguez.

Stirling Engines

Monday, August 13th, 2007

Cory Doctorow of Boing Boing is fascinated by this papercraft Stirling engine that runs on coffee and shares a translation of its German description:

The Stirling Engine
Runs on a cup of coffee or an ice pack

The revolutionary concept for this hot-air engine was invented in 1816 by the Scottish minister Robert Stirling and has been updated for today. The principle is as ingenious as it is simple: In a sealed cylinder, heated from the underside, a piston pushes the enclosed air back and forth between the hot and the cold side. The air thus expands and contracts with every cycle, and that movement is converted into rotary motion via a moving piston and crankshaft.

As an energy source, any type of warmth or cooling that produces a temperature differential can be used, from an open fire to solar energy or any other unused source of heat or cold.

Set this fully functional Stirling engine on a cup of boiling-hot coffee (tea or hot water also works, of course), give the flywheel a small push to the left, and the apparatus simply begins to pump up and down – for up to an hour!

That isn’t all it can do: Set it on an ice pack or ice cubes from the freezer, turn the flywheel to the right, and it will also pump up and down for an even longer time.

(Caption: It is really a marvel)

Kit made from sturdy punched cardboard with gold stamping, complete with all accessories, including laser-cut aluminium plates, low-friction plastic axle bearings and bent spring-steel wire.

Height 16.5 cm, width and depth 12.6 cm.

The folks at the American Stirling Company have a similar device, the MM-5 Coffee Cup Engine Kit, only it isn’t made from card stock.

Stirling engines are fascinating for a number of reasons. Unlike typical gasoline or diesel engines, they aren’t internal combustion engines; they’re external combustion engines.

One side-effect of this is that they can run on any source of heat, not just literal combustion — solar heat, waste heat given off by heated coffee or electrical equipment, etc.

Stirling engines are also disturbingly simple, consisting of little more than a sealed cylinder of air (or helium or hydrogen). One side is heated, the other cooled, and the piston moves as the gas expands and contracts.

This simple design also makes the Stirling engine remarkably efficient.
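That efficiency claim deserves a caveat: like any heat engine, a Stirling engine is bounded by the Carnot limit set by its temperature differential, which is why the coffee-cup models turn so feebly while flame-heated ones can be genuinely efficient. A quick illustration (the temperatures are my assumptions, and real engines fall well short of this ideal bound):

```python
def carnot_efficiency(t_hot_c, t_cold_c):
    """Maximum possible efficiency of any heat engine running
    between a hot and a cold reservoir (temperatures in Celsius)."""
    t_hot_k = t_hot_c + 273.15
    t_cold_k = t_cold_c + 273.15
    return 1 - t_cold_k / t_hot_k

# A cup of hot coffee vs. room temperature: a tiny differential,
# so only a modest efficiency is even theoretically possible.
coffee = carnot_efficiency(80, 22)
# A combustion-heated Stirling engine can do far better.
flame = carnot_efficiency(700, 22)
print(f"coffee cup: {coffee:.1%} max, flame-heated: {flame:.1%} max")
```

The point is that "efficient" here means efficient relative to that thermodynamic ceiling; a toy running on a coffee cup's few dozen degrees of differential can't do much work no matter how well it's built.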

So why don’t we see more Stirling engines in use? Well, the first major drawback is that a Stirling engine, like a steam engine, needs time to get going. You can’t simply step on the gas and get more power. Also, Stirling engines tend to have low power density; they tend to be big for the amount of power they put out.

Today, Stirling engines are primarily used in reverse, as cryocoolers.