The Other Telegraph

Wednesday, March 16th, 2011

Until 1793, Freeman Dyson reminds us, African drummers were ahead of Europeans in their ability to transmit information rapidly over long distances. Then along came the telegraph — but not the one most people think of:

In 1793, Claude Chappe, a patriotic citizen of France, wishing to strengthen the defense of the revolutionary government against domestic and foreign enemies, invented a device that he called the telegraph. The telegraph was an optical communication system with stations consisting of large movable pointers mounted on the tops of sixty-foot towers. Each station was manned by an operator who could read a message transmitted by a neighboring station and transmit the same message to the next station in the transmission line.

The distance between neighbors was about seven miles. Along the transmission lines, optical messages in France could travel faster than drum messages in Africa. When Napoleon took charge of the French Republic in 1799, he ordered the completion of the optical telegraph system to link all the major cities of France from Calais and Paris to Toulon and onward to Milan. The telegraph became, as Claude Chappe had intended, an important instrument of national power. Napoleon made sure that it was not available to private users.

Unlike the drum language, which was based on spoken language, the optical telegraph was based on written French. Chappe invented an elaborate coding system to translate written messages into optical signals. Chappe had the opposite problem from the drummers. The drummers had a fast transmission system with ambiguous messages. They needed to slow down the transmission to make the messages unambiguous. Chappe had a painfully slow transmission system with redundant messages. The French language, like most alphabetic languages, is highly redundant, using many more letters than are needed to convey the meaning of a message. Chappe’s coding system allowed messages to be transmitted faster. Many common phrases and proper names were encoded by only two optical symbols, with a substantial gain in speed of transmission. The composer and the reader of the message had code books listing the message codes for eight thousand phrases and names. For Napoleon it was an advantage to have a code that was effectively cryptographic, keeping the content of the messages secret from citizens along the route.
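
A quick sanity check on why two optical symbols could cover eight thousand phrases: with n distinguishable configurations per symbol, a two-symbol code addresses n² phrases. The figure of 92 basic signs per symbol used below is a commonly cited value for the Chappe semaphore, not something stated in the passage, so treat this as a hedged back-of-the-envelope sketch:

```python
# Rough capacity check for a two-symbol phrase code like Chappe's.
# The 92 distinguishable signs per symbol is a commonly cited figure
# for the Chappe semaphore, assumed here for illustration.
SIGNS_PER_SYMBOL = 92

def code_points(symbols_per_phrase: int) -> int:
    """Number of distinct phrases addressable by a fixed-length code."""
    return SIGNS_PER_SYMBOL ** symbols_per_phrase

print(code_points(1))  # 92 -- a single symbol is far too few
print(code_points(2))  # 8464 -- comfortably covers 8,000 phrases and names
```

Under that assumption, two symbols are exactly enough headroom for code books of eight thousand phrases and names.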

Morse launched his electric telegraph in 1838 and perfected his famous code in 1844.

(I’ve mentioned telegraphy without electricity before. It may qualify as an idea behind its time.)

Deaths per TWh by Energy Source

Wednesday, March 16th, 2011

If you look at deaths per TWh by energy source, you find that nuclear energy is — surprise! — quite safe:

Coal (161)
Oil (36)
Natural Gas (4)
Hydro (1.4)
Nuclear (0.04)
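
Taken at face value, the list implies a striking ratio. A quick sketch in Python, using only the figures quoted above, computes how many times deadlier each source is than nuclear per unit of energy:

```python
# Deaths per TWh from the list above; units are deaths per terawatt-hour.
deaths_per_twh = {
    "Coal": 161,
    "Oil": 36,
    "Natural Gas": 4,
    "Hydro": 1.4,
    "Nuclear": 0.04,
}

# How many times deadlier each source is than nuclear, per unit of energy.
for source, rate in deaths_per_twh.items():
    print(f"{source}: {rate / deaths_per_twh['Nuclear']:.0f}x nuclear")
```

Coal comes out roughly four thousand times deadlier than nuclear per terawatt-hour of energy produced.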

(Hat tip to Nyrath.)

A Simple Game of Rock, Paper, Scissors

Wednesday, March 16th, 2011

Power in Egypt — and the world has many Egypts, Mencius Moldbug says — is a simple game of rock, paper, scissors:

In Egypt there are three kinds of people: sheep (liberals, upper-class), dogs (nationalists, lower-class), and wolves (Islamists, beyond class). Sheep (with a big hand from Twitter and State) beat dogs, dogs kill wolves, wolves eat sheep.

If our twittering hipster is especially hip, she’s seen Persepolis and met the wolves. And indeed, the wolf form is natural to humanity. It is our society, the civilized European system with lots of sheep and some dogs and a very, very rare wolf, that is anomalous. And if it keeps behaving as it is, the anomaly will not take long to rectify.

The “Arab Spring” is springtime indeed for the violent, ruthless young man with a mission. Mubarak’s dogs, equally violent — indeed once Nasser’s wolves themselves, for fat authority turns wolves into dogs — tamed the most violent of wolves with the most wolfish of methods. The dog, half wolf himself, speaks the language of the wolf. The sheep looks at the wolf — and sees a sheep. And there has never been any shortage of wolves who speak sheep. Baa! Baa!

In the dog state, so long as they minded their own business, within very broad definitions of their own, a sheep could live as a sheep. Now we see the sheep state, young heaven for wolves. Even the dogs turn into wolves — what’s an old Mubarak thug to do? Thuggery is all he knows. The old firm has disbanded. The jihad is hiring. Allahu akbar! Indeed, Islam is the future in Egypt — if I were an Egyptian, I’d be working on my raisin right now. Sovereignty is conserved; power creates its own popularity. In anarchy, violence is power, and the wolves have it.

The tragedy of Egypt is that if the dogs and sheep did not respond to different masters, if the sheep did not have Twitter and Harvard to follow, the sheep would do what sheep do naturally and follow the dogs. Who would in turn love and cherish the sheep, and kill the wolves. This is the difference between Mubarak’s Egypt and Elizabeth’s England — both societies with a small educated elite, a vast base of varlets, an absolute ruler and an active, efficient secret police.

In other words, if Egypt’s natural intelligentsia was not Americanized, if it was not drawn away from its own country and its own leadership by the lure of Twitter, it would have no choice but to participate in the government of its own country. Which would, in turn, lose much if not all of its peasant-thug character, having better talent to draw on than peasant thugs. If this hypothesis is correct, it’s the apparent solution — the Americanization of Egypt — which creates the problem.

So the American liberal, who is not after all dumb, if he was genuinely concerned about the Egyptian liberal, would observe reality and tell his tawny friends: chill out. Deal with it. You cannot rule Egypt; we are not the British Empire, we are not going to rule it for you. Yet someone will rule Egypt, as they have since the Scorpion King was a little boy. Do you even begin to know how much worse than Mubarak it can get? If you don’t like peasant thug secret policemen, apply for a visa or just come illegally. Learn a little Spanish and pass for Mexican. Or, you know, just deal. I mean, it’s not like our permanent government is that great either.

But no. And here is the American’s sin: from his own cupidity, from his ennui and folly, instead of using the power of America in the best interest of Egyptians, or even in the best interest of Americans with an Egyptian passport, what does he do? To entertain himself, to get his TV jollies, shouting hosannahs and clapping himself on the back, he assists his Egyptian friends in committing horrible and spectacular political suicide. Is the American moral? Is he realistic? He is both criminal and insane. His nightly news is quite dramatic; his gas goes up by a dollar a gallon; his friends are devoured by wolves. Hell, it’s America, we’re bored and rich.

Thus brains on the road. And thus, Libya — which is to Egypt as Egypt is to New Jersey, at least culturally. Thus America, twittering away, says to Libya: “Come on! Have a revolution! It’s fun! Don’t miss out! Besides, we’re all done with Egypt and we’re getting bored bored BORED!”

Drums That Talk

Tuesday, March 15th, 2011

James Gleick (Chaos) opens his new book, The Information: A History, a Theory, a Flood, with a simple example of applied information theory, drums that talk:

The example is a drum language used in a part of the Democratic Republic of Congo where the human language is Kele. European explorers had been aware for a long time that the irregular rhythms of African drums were carrying mysterious messages through the jungle. Explorers would arrive at villages where no European had been before and find that the village elders were already prepared to meet them.

Sadly, the drum language was only understood and recorded by a single European before it started to disappear. The European was John Carrington, an English missionary who spent his life in Africa and became fluent in both Kele and drum language. He arrived in Africa in 1938 and published his findings in 1949 in a book, The Talking Drums of Africa. Before the arrival of the Europeans with their roads and radios, the Kele-speaking Africans had used the drum language for rapid communication from village to village in the rain forest. Every village had an expert drummer and every villager could understand what the drums were saying. By the time Carrington wrote his book, the use of drum language was already fading and schoolchildren were no longer learning it. In the sixty years since then, telephones made drum language obsolete and completed the process of extinction.

Carrington understood how the structure of the Kele language made drum language possible. Kele is a tonal language with two sharply distinct tones. Each syllable is either low or high. The drum language is spoken by a pair of drums with the same two tones. Each Kele word is spoken by the drums as a sequence of low and high beats. In passing from human Kele to drum language, all the information contained in vowels and consonants is lost. In a European language, the consonants and vowels contain all the information, and if this information were dropped there would be nothing left. But in a tonal language like Kele, some information is carried in the tones and survives the transition from human speaker to drums. The fraction of information that survives in a drum word is small, and the words spoken by the drums are correspondingly ambiguous. A single sequence of tones may have hundreds of meanings depending on the missing vowels and consonants. The drum language must resolve the ambiguity of the individual words by adding more words. When enough redundant words are added, the meaning of the message becomes unique.

In 1954 a visitor from the United States came to Carrington’s mission school. Carrington was taking a walk in the forest and his wife wished to call him home for lunch. She sent him a message in drum language and explained it to the visitor. To be intelligible to Carrington, the message needed to be expressed with redundant and repeated phrases: “White man spirit in forest come come to house of shingles high up above of white man spirit in forest. Woman with yam awaits. Come come.” Carrington heard the message and came home. On the average, about eight words of drum language were needed to transmit one word of human language unambiguously. Western mathematicians would say that about one eighth of the information in the human Kele language belongs to the tones that are transmitted by the drum language. The redundancy of the drum language phrases compensates for the loss of the information in vowels and consonants. The African drummers knew nothing of Western mathematics, but they found the right level of redundancy for their drum language by trial and error. Carrington’s wife had learned the language from the drummers and knew how to use it.

The story of the drum language illustrates the central dogma of information theory. The central dogma says, “Meaning is irrelevant.” Information is independent of the meaning that it expresses, and of the language used to express it. Information is an abstract concept, which can be embodied equally well in human speech or in writing or in drumbeats. All that is needed to transfer information from one language to another is a coding system. A coding system may be simple or complicated. If the code is simple, as it is for the drum language with its two tones, a given amount of information requires a longer message. If the code is complicated, as it is for spoken language, the same amount of information can be conveyed in a shorter message.
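
Dyson's point that a simpler code needs a longer message can be checked with a little arithmetic: the fewest fixed-length symbols needed to distinguish M possible messages with an alphabet of A symbols is ⌈log M / log A⌉. A minimal sketch, where the message count of 8,000 is purely illustrative:

```python
import math

def min_symbols(meanings: int, alphabet: int) -> int:
    """Fewest fixed-length symbols needed to tell `meanings` messages apart."""
    return math.ceil(math.log(meanings) / math.log(alphabet))

MESSAGES = 8000  # an illustrative message space, not a figure from the text

print(min_symbols(MESSAGES, 2))   # two-tone drum beats
print(min_symbols(MESSAGES, 26))  # letters of a Latin alphabet
```

With a two-tone alphabet, 8,000 messages need 13 symbols; with 26 letters, only 3. The drums pay for their simple code in message length, just as Dyson describes.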

That’s Freeman Dyson, by the way, explaining Gleick’s book.

Seconds Before the Big One

Monday, March 14th, 2011

Earthquake alarms could provide warning seconds before the Big One:

Within seconds of an earthquake’s first subtle motions, scientists can now predict with some certainty how strong and widespread the shaking will be. By integrating new science with modern communications technologies, the authorities could get a few tens of seconds’ warning, perhaps even half a minute, to those in harm’s way. That may not sound like much, but it is enough to send shutdown warnings to power plants and rail networks, automatically open elevator doors and alert firefighters.

The Loma Prieta quake was centered south of the Bay in the rugged Santa Cruz Mountains. After the ground started to shake, it took more than 30 seconds for the damaging vibrations to travel the 60 miles to San Francisco and Oakland, the scenes of more than 80 percent of the fatalities. If an earthquake early-warning system had existed back then, it could have provided perhaps a 20-second warning to the heart of the region. This is enough time to slow and stop trains, issue “go around” commands to airplanes on final approach and turn streetlights red — preventing cars from entering hazardous structures such as bridges and tunnels. Workers in hazardous work environments could move to safe zones, and sensitive equipment could enter a hold mode, reducing damage and loss. Schoolchildren and office workers could get under desks before the shaking arrived. The region would be ready to ride out the violence to come.
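
The numbers in that paragraph can be checked with back-of-the-envelope arithmetic. Assuming damaging S-waves travel at roughly 3.5 km/s (a commonly cited crustal shear-wave speed, not a figure from the article) and that detection and alerting eat several seconds, the 60-mile distance yields roughly the 20-second warning the author describes:

```python
# Back-of-the-envelope warning time for a Loma Prieta-style quake.
# Assumed values: S-waves at ~3.5 km/s (a commonly cited crustal
# shear-wave speed) and ~8 s to detect, locate, and issue the alert.
KM_PER_MILE = 1.609
S_WAVE_SPEED_KM_S = 3.5   # assumption
PROCESSING_DELAY_S = 8.0  # assumption

distance_km = 60 * KM_PER_MILE          # epicenter to San Francisco/Oakland
travel_time = distance_km / S_WAVE_SPEED_KM_S
warning = travel_time - PROCESSING_DELAY_S

print(f"S-wave travel time: {travel_time:.0f} s")
print(f"Usable warning:     {warning:.0f} s")
```

With these assumed values, the S-waves take just under 30 seconds to arrive, leaving on the order of 20 seconds of usable warning, consistent with the article's figures.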

Such networks are being deployed all over the world in locations as diverse as Mexico, Taiwan, Turkey and Romania. Japan’s system is among the most advanced. The nationwide network issues warnings via most television and radio stations, several cell phone providers, and the public address system of malls and other public spaces. In the three and a half years since the system came online, more than a dozen earthquakes have already triggered widespread alerts. People in factories, schools, trains and automobiles were given a few precious moments to prepare; following the alerts, there were no reports of panic or highway accidents. The U.S. is behind the rest of the world, but a new test bed being deployed in California should soon lead to a full-scale warning system in that fault-ridden state.

That piece, by the way, was scheduled to come out in next month’s Scientific American, but they put it up online early, for obvious reasons.

A martyred and plagiarized heretic

Monday, March 14th, 2011

Matt Ridley (The Rational Optimist) calls William Tyndale a martyred and plagiarized heretic:

Let me confess a prejudice. The authorised Bible has always been a problem for me. Not because I don’t like it — atheists can still revel in the rhythm of the prose of their tribal scripture — but because it was written by six committees of 47 scholars in total. It seems to be an exception to the rule that anything written by committees is written badly. Adam Nicolson, as the erstwhile historian of the Dome, is alive to this paradox.

Then recently I read a wonderful book by Brian Moynahan called ‘If God Spare My Life’ (now republished as ‘Book of Fire’), based partly on the scholarship of Professor David Daniell of University College London. I discovered a resolution of the paradox. The authorised version is an exception that proves the rule, for more than three-quarters of the prose is in fact the work of a single man.

The King James Bible is usually described as a translation, but look carefully at the king’s instructions to his committees: he asked them not so much to translate from scratch as to revise and reconcile different English translations by reference to the Greek and especially Hebrew texts. In the scholars’ words, their job was ‘to make a good one better, or out of many good ones, one principal good one.’

They drew on several English versions of the bible. According to two Canadian academics, just 2.8% of the New Testament text is original to the King James, 13.4% came from other published English bibles and a remarkable 83.7% from William Tyndale’s translation of 1525 (as revised in 1534). The Old Testament was less Tyndale-dominated, but still about 75.7% his for those books he had finished translating before he died in 1536. That’s a greater plagiarism than cost the German defence minister his job.

Did I say died? Murdered — for translating the bible — at the behest of the very church which 75 years later adopted so much of his text without acknowledgement.

Tyndale was an English priest who spent most of his life in hiding in Germany and the Low Countries, where he translated and printed scripture for distribution in England, to the fury of Thomas More, who bought and burned his works as fast as Tyndale could smuggle them across the Channel. Eventually More — though himself already under arrest in the Tower — managed to get the Louvain authorities to track down Tyndale, arrest him, try him for heresy, defrock him and kill him by strangulation and burning.

Not only is the bulk of the authorised bible plagiarized from Tyndale; the most memorable phrases are his: ‘let there be light’, ‘we live and move and have our being’, ‘fight the good fight’, ‘the powers that be’, ‘a law unto themselves’, ‘the spirit is willing, but the flesh is weak’, ‘flowing with milk and honey’, ‘the apple of his eye’, ‘signs of the times’, ‘broken-hearted’, ‘eat, drink and be merry’, ‘salt of the earth’, ‘fat of the land’, ‘my brother’s keeper’.

Shaolin Kung Fu

Monday, March 14th, 2011

Shaolin Kung Fu is known around the world for its powerful… brand:

Scholars dismiss much of this as legend embroidered with bits of truth. Bare-handed martial arts existed in China long before the fifth century and likely arrived at Shaolin with ex-soldiers seeking refuge. For much of its history, the temple was essentially a wealthy estate with a well-trained private army. The more the monks fought, the more proficient they became as fighters, and the more their fame grew. Yet they were not unbeatable. The temple was sacked repeatedly during its history. The most devastating blow came in 1928, when a vengeful warlord burned down most of the temple, including its library. Centuries of scrolls detailing kung fu theory and training as well as treatises on Chinese medicine and Buddhist scriptures—essentially the temple’s soul—were destroyed, leaving the legacy of Shaolin kung fu to be passed down master to disciple, through men such as Yang Guiwu.

Today, however, temple officials seem more interested in building the Shaolin brand than in restoring its soul. Over the past decade Shi Yongxin, the 45-year-old abbot, has built an international business empire—including touring kung fu troupes, film and TV projects, an online store selling Shaolin-brand tea and soap—and franchised Shaolin temples abroad, including one planned in Australia that will be attached to a golf resort. Furthermore, many of the men manning the temple’s numerous cash registers—men with shaved heads and wearing monks’ robes—admit they’re not monks but employees paid to look the part.

Over tea in his office at the temple, Yongxin calmly makes the case that all of these efforts further Buddhism. “We make more people know about Zen Buddhism,” he says. A slightly jowly, sad-eyed man, he has a politician’s gift for imbuing his remarks with the sense that he believes deeply in what he’s saying. “By registering the Shaolin brand name in other countries, promoting Shaolin traditional cultures, including kung fu, we’re having people around the world know better and believe in Zen Buddhism.”

It is an argument he has made many times in both the Chinese and foreign press, and he isn’t the first abbot to face criticism that Shaolin has pursued riches over enlightenment. A 17th-century magistrate railed against the temple’s “lofty mansions and splendid furnishings.” And yet, whether a force for evangelizing or profitmaking, the Shaolin Temple has helped foster an undeniable kung fu renaissance, which has coincided with China’s own resurgence as an international power. Nowhere is this more evident than in Dengfeng, a sprawling city of 650,000 just six miles from the temple gates. Here some 60 martial arts academies have sprouted over the past two decades and now boast more than 50,000 students. A drive down a main road passes some of the biggest schools. They rise like Vegas casinos, with towering dormitories adorned with murals of kung fu fighters, dragons, and tigers.

Religious Toxicity

Sunday, March 13th, 2011

Eric S. Raymond (The Cathedral and the Bazaar) bristles at the notion of comparing his militant atheism to militant Islam simply because both are anti-Christian. He has his own metrics for religious toxicity:

To understand how militant atheists think about religion, you first have to understand that modern atheism is not simply against religion. It is for something; it opposes religion from a set of principles and values. Those principles first found expression in the French Enlightenment of the 1750s and the writings of men like Voltaire and Diderot. In later centuries they were further developed by (among others) Robert Ingersoll and Bertrand Russell.

Modern ‘militant atheists’ (including me) see themselves as the heirs of Voltaire, the children of the Enlightenment. Our rejection of theism is motivated by specific features of theistic religions. Two, in particular, stand out: (a) religious anti-rationality, and (b) religious violence. Not all religions are afflicted by these in equal measure.

To an atheist, religion A is worse than religion B when religion A requires belief in more anti-rational things than religion B does. More miracles, more superstition, more craziness. Religion A can also be worse than religion B by having a stronger tendency to erupt in violence — pogroms, witch-burnings, religious wars, conversion by the sword.

These compound into a sort of religious threat potential, the estimated likelihood that in any given year the believers are going to boil over into an irrationally murderous mob intent on putting unbelievers to the sword.

Atheists tend to broadly agree about the relative threat potential of major religions. Among those that come in very low on the toxicity scale we can include, for example, the more austere Theravada varieties of Buddhism. These are essentially systems of prescriptive psychology with almost no component of belief in a supernatural, and have no history of warfare or conversion by the sword. Threat potential: near zero.

We class other religions as low in toxicity but suspicious because of their historical roots. A good example of this class is the Baha’i Faith, which is a rather nice inoffensive little religion if you ignore that streak of Shi’a Islam in its past. Some of the quieter and more mystical Christian denominations, like Quakers, fall in this category as well — indeed, many Quakers are barely theistic themselves. I know of several atheists who deliberately adopted Quaker ritual for their weddings and didn’t surprise their atheist friends even a bit by doing so. Threat potential: low.

One Christian subgroup also gives us an example of a religion that maxes out the doctrinal-craziness scale while seeming relatively harmless on the violence front. That would be the Mormons. I mean, really — Amerinds as the Ten Lost Tribes of Israel? God lives on the planet Kolob and you get your own world to rule when you die? How do these people even take themselves seriously? Oh well, at least they seem to plan on inheriting the Earth by out-reproducing unbelievers rather than killing them. That’s something, even though it could easily change in the future. Threat potential: low to middling.

There’s also pretty general agreement on which religions are the toxic worst. These would be the religions that combine particularly crazy superstitions with a blood-soaked historical record. We atheists think of these as deadly memetic plagues, occasionally found in relatively well-behaved quiescent phases but prone to bloom into full-fledged insane murderousness whenever the next charismatic nutcase wanders along to remind them what they’re really about.

And which two religions are at the very top of the threat-potential list? No prizes for guessing that they are Christianity and Islam, not necessarily in that order. Both have relatively tolerable minorities (Christianity’s Quakers and Unitarians, Islam’s Sufis) but have extremely dangerous and powerful fundamentalist groups that effectively dominate the discourse inside their communities.

Virtual Janissaries

Sunday, March 13th, 2011

Jehu’s reactionary plan for victory starts from the fact that conservatives have more children than progressives:

Simply put, our side of the culture war is reproducing itself, and theirs is not. With a few enhancements (celebrating large families, supporting those among our friends that choose to have them, and having more children ourselves), this is the core of our plan for victory. Notice that this is not a tactic, nor a grand strategy in the Napoleonic sense. No, this is something far scarier to anyone who has studied history, particularly military history. This is logistics. Logistical superiority is decisive over protracted conflicts when the side possessing it has the will.

In His great wisdom, He has made many of the great sins self-limiting.

The next big component is to prevent the other side from stealing our children for use against us as virtual janissaries. The major weapon in the arsenal here is homeschooling. Homeschool families, in addition to having a much better TFR (total fertility rate, averaging 3.5, which may be an underestimate given that not all of the families surveyed are likely to be done having children), are much more likely to pass their world view on to their children.

The reason for this shouldn’t be surprising. The average kid spends around 6-8 hours a day for an average of 180 days a year in school. If you count other para-school activities where they’re under the auspices of the school, it gets even worse. This has most parents and churches massively outgunned. Most people’s world views aren’t formed by careful contemplation but rather by rote repetition. Here’s another secret. It is practically impossible to teach without also teaching a world view. I found this out directly when teaching engineering, which is one of the least ideological subjects I can think of. So since controlling the public schools isn’t feasible until we have hegemony, and schools are inherently agencies of indoctrination, it follows that we should work to withdraw our children from them and undermine their public support, with an eye towards destroying them or coopting them (in the off chance that we start to succeed on a faster time scale than I had hoped).

Fortunately, they really are quite wretched at their stated goals, which is really quite surprising when you think about it. Here they are, with massive staffs and tons of educated people available to them, and massive budgets and resources available to them as well. And they still can’t produce results better than homeschool moms with a high school education or less. This must infuriate them, as it would me, if your average DIY’er could do better engineering than me with my vaunted PhD and years of industry experience.

I can practically see the sly grin forming:

The next part of the strategy, having secured our own children, is to steal theirs. How would we do this? The answer is, the stupid, or perhaps blessed bastards will give them to us. All we have to do is to take them by offering them what they claim they want — i.e., a better education for less money. Getting their young sons is particularly easy. You see, the leftist indoctrination is so anti-male, and especially anti-young boy, that all you have to do to make loyal allies of them is one simple thing. Don’t hate them. A mere lack of animus is really all it takes.

The Short History Of ‘Hello’

Saturday, March 12th, 2011

As a child, I was surprised to learn that ‘hello’ was a fairly new word:

The Oxford English Dictionary says the first published use of “hello” goes back only to 1827. And it wasn’t mainly a greeting back then. Ammon says people in the 1830s said hello to attract attention (“Hello, what do you think you’re doing?”), or to express surprise (“Hello, what have we here?”). Hello didn’t become “hi” until the telephone arrived.

The dictionary says it was Thomas Edison who put hello into common usage. He urged the people who used his phone to say “hello” when answering. His rival, Alexander Graham Bell, thought the better word was “ahoy.”

“Ahoy,” it turns out, had been around longer — at least 100 years longer — than hello. It too was a greeting, albeit a nautical one, derived from the Dutch “hoi,” meaning “hello.” Bell felt so strongly about “ahoy” he used it for the rest of his life.

And so, by the way, does the entirely fictional “Monty” Burns, evil owner of the Springfield Nuclear Power Plant on The Simpsons.

I remember reading the Winnie the Pooh stories as a child, and Pooh would call out “Halloo!”, which struck me as wrong — not quite “hello,” and used in place of “hey!”

The first phone books included authoritative How To sections on their first pages and “hello” was frequently the officially sanctioned greeting.

In fact, the first phone book ever published, by the District Telephone Company of New Haven, Connecticut, in 1878 (with 50 subscribers listed) told users to begin their conversations with “a firm and cheery ‘hulloa.’”

The phonebook’s recommended Way To End A Phone Conversation — “that is all” — did not take off:

This strikes me as an eminently more honest and forthright way to end a phone call than “good-bye.” “Good-bye,” “bye-bye,” and all the other variants are ultimately contractions of the phrase “God Be with you” (or “with ye”). I don’t know about you, but I don’t really mean to say that when I end a conversation. I suppose I could say “ciao” — which does have a certain etymological background of coming from the Italian schiavo, which means “I am your slave,” and I don’t much want to say that either…

Mistakes made by people who write about politics

Saturday, March 12th, 2011

Foseti addresses some mistakes made by people who write about politics, like ignoring that regulations are written by normal people with normal human motivations:

There are still some old timers around the various agencies that I work with who take the concept of public service seriously. They fly coach even if they’re eligible for a more expensive ticket. They get the taxi driver to fill out the receipt, etc. But all of these people will retire in the next few years.

Stop acting like regulations are written with an eye to costs and benefits. Those of us that write rules cannot possibly understand the total costs and benefits of any particular rule. Lots of people write as if we carefully weigh the costs and benefits and come to an optimal solution that is beneficial for society. No regulation has ever been written in this way. Agencies may debate costs and benefits amongst themselves, but there are so many variables that this debate degenerates quickly. Since I’m a good bureaucrat, I can argue that the benefits of virtually any potential regulation outweigh the costs. Any other decent bureaucrat could take the other side.

Regulations are written to increase the authority or funding for the agency writing the regulation. I have yet to see an exception to this law (Foseti’s Law?).

In general political columnists and economists pay way too much attention to the person at the top of the agency and not to the permanent staff of the agency. If you want to understand an agency, you have to understand its staff.

I’ve worked at three agencies. At all three agencies, I worked under two different heads of the agencies. I’ve been in government under two different Presidents. So far, I haven’t seen the day-to-day work of anyone (outside of someone I know who works at the Justice Department) change in any way when the President changed or when the head of the agencies changed.

Virtually everyone that writes regulations has the same political philosophy.

Virtually everyone that writes regulations cannot be fired from their job. Authority without responsibility is not a recipe for success in any circumstance.

Monopoly Live

Friday, March 11th, 2011

The ironically named Monopoly Live takes some of the human element out of the (in)famous board game, by having a computer adjudicate all the rules — something Dark Tower did in 1981, when computers were novel.

So, it takes away the dice and play money, but it adds some things, too:

It sprinkles in random events, like a horse race where players must bet on winners.

The computer also tracks how fast or slow play is going, and may intervene to make it lively. If, say, very little property is getting bought, it will announce an auction in the middle of turns.

Monopoly was intended as an anti-monopoly propaganda piece — specifically an anti-land-monopoly piece — but the interviewed expert doesn’t seem to know that:

Mary Flanagan, a game designer and distinguished professor of digital humanities at Dartmouth, said that games tended to reflect the societies that they were played in. For instance, the original Monopoly, issued in 1935 by Parker Brothers, now a subsidiary of Hasbro, reflected “American ingenuity, the sense of needing to have hope, and reinforcing capitalism in the face of real economic despair,” she said.

The OK Plateau

Friday, March 11th, 2011

While training to become a mental athlete, Joshua Foer reached a plateau:

At one point, not long after I started training, my memory stopped improving. No matter how much I practiced, I couldn’t memorize playing cards any faster than 1 every 10 seconds. I was stuck in a rut, and I couldn’t figure out why. “My card times have hit a plateau,” I lamented.

“I would recommend you check out the literature on speed typing,” he replied.

When people first learn to use a keyboard, they improve very quickly from sloppy single-finger pecking to careful two-handed typing, until eventually the fingers move effortlessly and the whole process becomes unconscious. At this point, most people’s typing skills stop progressing. They reach a plateau. If you think about it, it’s strange. We’ve always been told that practice makes perfect, and yet many people sit behind a keyboard for hours a day. So why don’t they just keep getting better and better?

In the 1960s, the psychologists Paul Fitts and Michael Posner tried to answer this question by describing the three stages of acquiring a new skill. During the first phase, known as the cognitive phase, we intellectualize the task and discover new strategies to accomplish it more proficiently. During the second, the associative phase, we concentrate less, making fewer major errors, and become more efficient. Finally we reach what Fitts and Posner called the autonomous phase, when we’re as good as we need to be at the task and we basically run on autopilot. Most of the time that’s a good thing. The less we have to focus on the repetitive tasks of everyday life, the more we can concentrate on the stuff that really matters. You can actually see this phase shift take place in f.M.R.I.’s of subjects as they learn new tasks: the parts of the brain involved in conscious reasoning become less active, and other parts of the brain take over. You could call it the O.K. plateau.

Psychologists used to think that O.K. plateaus marked the upper bounds of innate ability. In his 1869 book “Hereditary Genius,” Sir Francis Galton argued that a person could improve at mental and physical activities until he hit a wall, which “he cannot by any education or exertion overpass.” In other words, the best we can do is simply the best we can do. But Ericsson and his colleagues have found over and over again that with the right kind of effort, that’s rarely the case. They believe that Galton’s wall often has much less to do with our innate limits than with what we consider an acceptable level of performance. They’ve found that top achievers typically follow the same general pattern. They develop strategies for keeping out of the autonomous stage by doing three things: focusing on their technique, staying goal-oriented and getting immediate feedback on their performance. Amateur musicians, for example, tend to spend their practice time playing music, whereas pros tend to work through tedious exercises or focus on difficult parts of pieces. Similarly, the best ice skaters spend more of their practice time trying jumps that they land less often, while lesser skaters work more on jumps they’ve already mastered. In other words, regular practice simply isn’t enough. To improve, we have to be constantly pushing ourselves beyond where we think our limits lie and then pay attention to how and why we fail. That’s what I needed to do if I was going to improve my memory.

With typing, it’s relatively easy to get past the O.K. plateau. Psychologists have discovered that the most efficient method is to force yourself to type 10 to 20 percent faster than your comfort pace and to allow yourself to make mistakes. Only by watching yourself mistype at that faster speed can you figure out the obstacles that are slowing you down and overcome them. Ericsson suggested that I try the same thing with cards. He told me to find a metronome and to try to memorize a card every time it clicked. Once I figured out my limits, he instructed me to set the metronome 10 to 20 percent faster and keep trying at the quicker pace until I stopped making mistakes. Whenever I came across a card that was particularly troublesome, I was supposed to make a note of it and see if I could figure out why it was giving me cognitive hiccups. The technique worked, and within a couple of days I was off the O.K. plateau, and my card times began falling again at a steady clip. Before long, I was committing entire decks to memory in just a few minutes.
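The metronome method above is simple enough to sketch in code. This is a minimal illustration, not anything from Foer or Ericsson: the function names, the 15 percent speedup, and the sample cards are all assumptions chosen to mirror the passage’s description (train 10–20 percent faster than your baseline, and record which items cause mistakes).

```python
def training_interval(baseline_seconds: float, speedup: float = 0.15) -> float:
    """Return the per-item training pace: 10-20% faster than the baseline.

    The passage prescribes a 10-20 percent speedup, so anything outside
    that range is rejected.
    """
    if not 0.10 <= speedup <= 0.20:
        raise ValueError("the method calls for a 10-20% speedup")
    return baseline_seconds * (1 - speedup)

def trouble_spots(attempts):
    """Collect the items that caused mistakes at the faster pace.

    `attempts` is a list of (item, succeeded) pairs from one timed run;
    the failures are what you study afterward, as Foer did with
    troublesome cards.
    """
    return [item for item, succeeded in attempts if not succeeded]

# Foer's plateau was one card every 10 seconds, so train at ~8.5 seconds.
interval = training_interval(10.0)
failures = trouble_spots([
    ("queen of spades", True),
    ("seven of clubs", False),   # note this card and ask why it failed
    ("two of hearts", True),
])
```

The point of structuring it this way is that the failures list, not the successes, is the training signal: the metronome forces errors to surface, and the review of those errors is what moves the plateau.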

More than anything, what differentiates top memorizers from the second tier is that they approach memorization like a science. They develop hypotheses about their limitations; they conduct experiments and track data. “It’s like you’re developing a piece of technology or working on a scientific theory,” the three-time world champ Andi Bell once told me. “You have to analyze what you’re doing.”

This ties in with George Leonard’s Mastery, which Aretae recently mentioned.

A Bad Little Man

Friday, March 11th, 2011

While this news story describes 8-year-old wrestler Stevo Poulin as a bad little man with a mean mohawk, from the video it’s clear that he’s simply far, far more technical than his opponents:

Back to the Dungeon

Friday, March 11th, 2011

Every Friday night, from eighth grade, in 1979, through his senior year in high school, in 1984, Ethan Gilsdorf played Dungeons & Dragons. He stopped playing when he went off to college, but he rediscovered his hobby years later:

When I hit 40, I discovered my cache of D&D rule books and dice some two decades after I’d last laid eyes on it. Stirred by nostalgia, I wrote “Fantasy Freaks and Gaming Geeks,” a travel memoir/pop culture investigation that records a year spent “re-geeking” myself and reintegrating D&D and its ilk back into my life. Thanks to the widespread acceptance of gaming and fantasy subcultures — from “Lord of the Rings” to “Harry Potter” to MMOs (online role-playing games) like “World of Warcraft” (aka WoW) — that re-geeking was easier than I expected. I emerged from my hobbit hole and saw a kinder, more tolerant world. A real world where it’s safe to make peace with one’s “inner geek” without risk of reprisal from the jocks. And I’m not the only one making peace with my 20-sided die. “Your story is my story,” countless men tell me.
“D&D is intrinsically nostalgic,” Tavis Allison told me in a recent e-mail. Allison, 40, is a fundraiser for a hospital in New York City and an avid D&Der who also moderated a panel discussion called “Dungeons & Dragons in Contemporary Art” at a New York City gallery this fall. “The art in the oldest books is weird and crude and like a medieval manuscript; even when it was new it reeked of some strange past, and part of the appeal of fantasy in general is this longing for a past that never was. Can you be nostalgic for something you never had?”

Yes. Yes you can.

Pure and simple, for many, D&D represents a lost age: It was an individualized, user-driven, DIY, human-scaled creative space separate from the world of adults and the intrusion of corporate forces. As Allison rightly noted, D&D recalls that day “before orcs and wookiees were the intellectual property of vast transmedia corporations.” Back when you had lots more free time than money — before girlfriends, job, kids. Life.