Math Suggests College Frenzy Will Soon Ease

Sunday, March 9th, 2008

Math Suggests College Frenzy Will Soon Ease:

Projections show that by next year or the year after, the annual number of high school graduates in the United States will peak at about 2.9 million after a 15-year climb. The number is then expected to decline until about 2015. Most universities expect this to translate into fewer applications and less selectivity, with most students probably finding it easier to get into college.

Geek Love

Sunday, March 9th, 2008

Adam Rogers, a senior editor at Wired, has written a piece for the New York Times dedicated to Gary Gygax — Geek Love:

We live in Gary Gygax’s world. The most popular books on earth are fantasy novels about wizards and magic swords. The most popular movies are about characters from superhero comic books. The most popular TV shows look like elaborate role-playing games: intricate, hidden-clue-laden science fiction stories connected to impossibly mathematical games that live both online and in the real world. And you, the viewer, can play only if you’ve sufficiently mastered your home-entertainment command center so that it can download a snippet of audio to your iPhone, process it backward with beluga whale harmonic sequences and then podcast the results to the members of your Yahoo group.

CCRKBA Says Press Purposely Downplays Key Role of Armed Student in Jerusalem

Saturday, March 8th, 2008

CCRKBA Says Press Purposely Downplays Key Role of Armed Student in Jerusalem:

An armed student at Jerusalem’s Mercaz Harav seminary played a crucial role in stopping a gun-wielding terrorist Thursday, but the American press is downplaying his heroism because it proves that armed students can stop campus gunmen, the Citizens Committee for the Right to Keep and Bear Arms said today.

Yitzhak Dadon, 40, was described as “a private citizen who had a gun license and was able to shoot the gunman with his pistol” by reporter Etgar Lefkovitz with the Jerusalem Post. However, many news agencies in the United States are downplaying Dadon’s decisive role in the incident.

Here’s the Ynet version of the story:

A yeshiva student who shot the Jerusalem terrorist says he was busy studying when suddenly shots rang out, prompting him to grab his gun and eventually kill the Palestinian attacker

“We realized something happened so I cocked my handgun,” Yitzhak Dadon told Ynet Thursday evening.

“I went up on the roof and waited for the terrorist. Meanwhile, I saw blood and shattered glass,” Dadon said. “The terrorist continued firing in the air, so I waited to see him again, and then I shot him twice in the head.”

Dadon says the terrorist continued firing even after he was hurt.

“He kept on firing until an IDF officer arrived and shot him again,” Dadon said.

The gunman infiltrated a rabbinical seminary at the entrance of Jerusalem and opened fire after nightfall Thursday, police said. The ZAKA emergency response service has confirmed at least eight people have been killed.

Paramedics said they treated several people for injuries – among them four in serious to critical condition.

Optimize for now!

Friday, March 7th, 2008

David of 37signals says, Optimize for now!

One of the easiest ways to shoot down good ideas, interesting policies, or worthwhile experiments is by injecting the assumption that whatever you’re doing needs to last forever and ever. Which means that the concept has to scale from 5 people to 5,000 or from 100,000 users to 100 million. That’s a terrible way to get from those 5 people to 5,000 or reach those 100 million users.

To reach the top, you have to be willing to use all the tricks that make sense at the earlier stages. That’s your advantage over the guys who are already sitting up there. So you’re not Google and don’t do a billion dollars in profit every quarter. But I bet you that you’re way more capable of quick, sweeping changes. When you have 100 million users on your email platform, you can’t do the same quick iterations that constantly push upgrades out. When you just have your first few hundred or thousand, you can.

So stop worrying too much about whether giving everyone in your company a credit card at 10 people is going to work when you’re a hundred times bigger. If it doesn’t, you change, come up with something that does work for that size.

The same with your infrastructure. We started on a single server for everything when Basecamp was first launched. There was no point in growing a huge farm of machines if the thing was going to flop anyway. Today we have many more machines and redundancies and surveillance and more because we’re at a different level.

The best way to get to the point of needing more is by optimizing for today. Use the strengths of your current situation instead of being so eager to adopt the hassles of tomorrow.

From Ridiculous to Revolutionary

Friday, March 7th, 2008

I’ve been a fan of Clay Shirky ever since I read his piece on Power Laws, Weblogs, and Inequality a few years ago. Now he has a new book out, Here Comes Everybody, on The Power of Organizing Without Organizations. Katherine Mangu-Ward of Reason writes about his book and about how frivolous technologies go From Ridiculous to Revolutionary:

“Nothing says ‘police state’ like detaining kids for eating ice cream.”
[...]
They were a flashmob, of course. Flashmobs started out as a critique of hipster culture. Bill Wasik, an editor at Harper’s, started sending out messages (as “Bill from New York”) to large groups, suggesting that they do things like make bird noises on a ledge in Central Park. He intended it as a kind of elaborate thumb in the eye of hipster conformism. Others caught on, perhaps omitting the irony, and did things like staging a silent rave in Victoria Station. The New York Times runs a smug story on how flashmobbers “have nothing better to do” with their time. And, as the cliché goes, once The New York Times has heard of a trend, it must be so over, right?

But then, suddenly, flashmobs found their true calling: On a blog in Belarus, someone proposes a flashmob. The plan is to get together in October Square—the preferred site for political action, and a place where concerted action is banned—and eat ice cream.

Black clad secret police appear and drag dairyphilic kids bodily out of the square. “The problem with a group eating ice cream wasn’t the ice cream, it was the group,” says Shirky. “Lukashenko, the leader of Belarus, is the last great Eastern European dictator [but] the Lukshenko government can’t penetrate the conspiracy, right? The whole thing is being done on the web. And they can’t stop the group from entering October Square because they’re not a group when they enter October Square.”

Photos went up online almost immediately. The thinking was that it’s tough to be really consistently oppressive and brutal when the world is watching. (Shirky: “The bug in the system is that the West cares quite a bit less about Eastern European dictatorship than it used to.”)

Shirky has created a new blog specifically for the book — not an uncommon marketing strategy — with some intriguing, timely posts, like this one on WikiLeaks:

There is a tension between freedom of speech in general, and restriction of certain kinds of speech; how can society let people say what they like, while still restricting things like libel or publication of trade secrets? And although the law around these issues hasn’t changed, the economics of media have been so transformed that the old legal bargains between freedom and restriction are breaking, and we have no easy way of replacing them.

The current way we have structured this bargain relies on the motivations of media professionals. Since media outlets are costly and complex to set up and run, every such outlet has a natural constituency, the professional publishers and editors and engineers who have a long-term commitment to the business. Because these professionals have a long-term commitment, it is possible to balance broad freedom of speech with specific classes of restrictions, with laws that punish media professionals for publishing libelous material or trade secrets. The threat of these punishments motivates them to act as filters, not publishing such material in their newspapers or airing it on their stations. And because there are so few media outlets, society can rein in certain kinds of speech with very little legal leverage.

Except none of those things are true anymore. Creating media is no longer costly or complex as an absolute case, it doesn’t require trained professionals, and it doesn’t require long-term commitment. Amateurs now have direct access, without going through a professional bottleneck.

Media, in its most elemental form, is the means of repeating a message thousands or millions of times, a capability that has become vanishingly cheap and held in common by amateurs and professionals. This mass amateurization is an end to the scarcity of media outlets. Now, if you have something to say in public, you don’t need to ask anyone for help or permission. We can try to find you and punish you, but this will always be post hoc — the self-interest of media professionals in keeping their jobs is no longer a way of preventing the amateurs from speaking out.

Cool it!

Thursday, March 6th, 2008

Big IT firms are saying Cool it! — because data centers use a lot of electricity, which generates heat, which demands still more electricity for air conditioning:

As one industry falls, another rises. The banks of the Columbia River in Oregon used to be lined with aluminium smelters. Now they are starting to house what might, for want of a better phrase, be called data smelters. The largest has been installed by Google in a city called The Dalles. Microsoft and Yahoo! are not far behind. Google’s plant consumes as much power as a town of 200,000 people. And that is why it is there in the first place. The cheap hydroelectricity provided by the Columbia River, which once split apart aluminium oxide in order to supply the world with soft-drinks cans and milk-bottle tops, is now being used to shuffle and store masses of information. Computing is an energy-intensive industry. And the world’s biggest internet companies are huge energy consumers — so big that they are contemplating some serious re-engineering in order to curb their demand.

Gary Gygax Passes to the Happy Hunting Grounds

Tuesday, March 4th, 2008

Gary Gygax Passes to the Happy Hunting Grounds, according to the “Troll Lord” of Troll Lord Games, Stephen Chenault:

It is almost too much to get my mind about. But I’ve just had news that our dear Dungeon Master has passed away. Ernie called this morning, he thought we should let the fans know. He’s just sent an email out.

Gary was in his home when he gathered himself up to cross the great divide.

He was a very dear friend of mine. And I will miss him so.

God Speed My Friend.

Steve

Addendum: NPR interviews Stephen Chenault:

Imagine a mournful horn echoing across thousands of fantasy worlds: E. Gary Gygax, the co-creator of the role-playing game Dungeons and Dragons, died Wednesday morning. He was 69.

Gary Gygax was an icon to fans of the game, many of whom would show up at his home in Lake Geneva, Wis.

What began as a fantasy game published in book form in the early 1970s, eventually morphed and tumbled onto kitchen tables and dorm room floors. Players assumed the character of elves and dwarves, magicians and swordsmen, and confronted the primal conflict between good and evil.

“D&D,” as fans call it, is the granddaddy of popular online games that attract hundreds of thousands of gamers to the Internet today.

Stephen Chenault, owner of Troll Lord Games, was a close friend of Gary Gygax. He talks to Melissa Block about Gygax and the beloved game he helped create.

Experimental Medication Kicks Depression in Hours Instead of Weeks

Tuesday, March 4th, 2008

Experimental Medication Kicks Depression in Hours Instead of Weeks — if by “experimental medication” they mean “street drug”:

People with treatment-resistant depression experienced symptom relief in as little as two hours with a single intravenous dose of ketamine, a medication usually used in higher doses as an anesthetic in humans and animals, in a preliminary study. Current antidepressants routinely take eight weeks or more to exert their effect in treatment-resistant patients and four to six weeks in more responsive patients — a major drawback of these medications. Some participants in this study, who previously had tried an average of six medications without relief, continued to show benefits over the next seven days after just a single dose of the experimental treatment, according to researchers conducting the study at the National Institutes of Health’s National Institute of Mental Health.

Single-Sex Public Education

Tuesday, March 4th, 2008

Elizabeth Weil writes about Single-Sex Public Education:

Foley Intermediate School began offering separate classes for boys and girls a few years ago, after the school’s principal, Lee Mansell, read a book by Michael Gurian called Boys and Girls Learn Differently! After that, she read a magazine article by Sax and thought that his insights would help improve the test scores of Foley’s lowest-achieving cohort, minority boys. Sax went on to publish those ideas in Why Gender Matters: What Parents and Teachers Need to Know about the Emerging Science of Sex Differences. Both books feature conversion stories of children, particularly boys, failing and on Ritalin in coeducational settings and then pulling themselves together in single-sex schools. Sax’s book and lectures also include neurological diagrams and scores of citations of obscure scientific studies, like one by a Swedish researcher who found, in a study of 96 adults, that males and females have different emotional and cognitive responses to different kinds of light. Sax refers to a few other studies that he says show that girls and boys draw differently, including one from a group of Japanese researchers who found girls’ drawings typically depict still lifes of people, pets or flowers, using 10 or more crayons, favoring warm colors like red, green, beige and brown; boys, on the other hand, draw action, using 6 or fewer colors, mostly cool hues like gray, blue, silver and black. This apparent difference, which Sax argues is hard-wired, causes teachers to praise girls’ artwork and make boys feel that they’re drawing incorrectly. Under Sax’s leadership, teachers learn to say things like, “Damien, take your green crayon and draw some sparks and take your black crayon and draw some black lines coming out from the back of the vehicle, to make it look like it’s going faster.” “Now Damien feels encouraged,” Sax explained to me when I first met him last spring in San Francisco. “To say: ‘Why don’t you use more colors? 
Why don’t you put someone in the vehicle?’ is as discouraging as if you say to Emily, ‘Well, this is nice, but why don’t you have one of them kick the other one — give us some action.’”

An interesting case study:

Wright was one of the first principals in the country to address the racial and socioeconomic achievement gaps by separating boys from girls. In 1999, he was sent to the failing Thurgood Marshall Elementary School, in Seattle, to try to turn the place around. One of the first things he noticed was that three boys were getting suspended for every girl, “and for the most ridiculous things in the world — a boy would burp, or he’d pass gas, or a girl would say, ‘He hit me.’ ” Nationwide, boys are nearly twice as likely as girls to be suspended, and more likely to drop out of high school than girls (65 percent of boys complete high school in four years; 72 percent of girls do). Boys make up two-thirds of special-education students. They are 1.5 times more likely to be held back a grade and 2.5 times more likely to be given diagnoses of A.D.H.D. So Wright met with his fourth-grade teachers and recalls telling them, “O.K., here’s what we’re going to do: how about you take all the boys and you take all the girls?” Wright says that in 2001, after Marshall’s first year in a single-sex format, the percentage of boys meeting the state’s academic standards rose from 10 percent to 35 percent in math and 10 percent to 53 percent in reading and writing.

If Saul Bass did the titles for Star Wars

Tuesday, March 4th, 2008

If Saul Bass did the titles for Star Wars, they’d look something like this:

As this Saul Bass Retrospective explains, he made his name with the title sequence to The Man with the Golden Arm. He also did the title sequence to Alfred Hitchcock’s Psycho. I had no idea he did the title sequence to Alien.

His title sequence to Anatomy of a Murder perhaps best exemplifies the signature style being spoofed — pardon, being paid homage to — by that Star Wars sequence:

Or perhaps his title sequence to It’s a Mad Mad Mad Mad World:

Pixie dust unsuitable for household lighting

Monday, March 3rd, 2008

Pixie dust is unsuitable for household lighting:

It’s a funky looking thing, which was widely reported around the gadget blogs, and was alleged by its designer, Clay Moulton, to give the equivalent light output of a 40-watt incandescent bulb for four hours from the energy of a weight dropping about four feet, or 122 cm. When the weight gets to the bottom, you just lift it back to the top and away you go again.

Now, it stands to reason that a mere 1.2-metre drop isn’t going to give you forty actual watts for four hours unless the weight is incredibly heavy. Ignoring losses, it would by definition take forty watts of power over another four hours to lift the weight back up again, which is 160 watt-hours, which is quite a lot. A normal adult human in reasonable shape can manage about 75 watts of output when pedalling away on a bike connected to a generator; it’d take more than two hours of such pedalling to raise that weight back to the top of the Gravia light’s tube, if the weight were heavy enough to make a constant 40 watts on the way back down.
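The arithmetic above is easy to verify. A quick back-of-the-envelope check, using only the figures already quoted (40 W for four hours, a 1.22 m drop, a 75 W cyclist) and ignoring all losses:

```python
# Sanity check of the Gravia claim: 40 W of light for 4 hours
# from a weight falling 1.22 m, ignoring all losses.
G = 9.81     # gravitational acceleration, m/s^2
DROP = 1.22  # stated drop height, m

energy_j = 40 * 4 * 3600           # 40 W for 4 h = 576,000 J (160 Wh)
mass_kg = energy_j / (G * DROP)    # E = m*g*h  =>  m = E / (g*h)

# Time for a 75 W cyclist to lift that mass back to the top:
pedal_hours = energy_j / 75 / 3600

print(f"required mass: {mass_kg:,.0f} kg")     # roughly 48 tonnes
print(f"pedalling time: {pedal_hours:.1f} h")  # a bit over two hours
```

The weight would have to mass nearly fifty tonnes for the claim to hold, which is exactly why the quoted author assumed the brightness was overstated.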

So I just assumed the lamp’s brightness was greatly overstated, and wasn’t even four-watts-of-LEDs-that-are-sort-of-equivalent-to-forty-watts-of-incandescent. But since they’d clearly actually made the thing and it’d won an award, I presumed it did work, if only as a night-light. Fair enough.

But neither Clay Moulton nor anybody else has, actually, built a Gravia.

The damn thing doesn’t exist.

And Mr Moulton, who apparently designed the thing as part of his Virginia Tech master’s thesis, didn’t even bother to check whether his design could possibly bloody work at all, even if you built it with LEDs from ten years in the future.

Actual Performance, Perceived Performance

Monday, March 3rd, 2008

Jeff Atwood comments on a study about actual performance versus perceived performance in software:

Although all the progress bars took exactly the same amount of time in the test, two characteristics made users think the process was faster, even if it wasn’t:
  1. progress bars that moved smoothly towards completion
  2. progress bars that sped up towards the end
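As a toy illustration of the second finding (this is my sketch, not code from the study): updating the bar at fine granularity gives the smooth motion, and passing the true completion fraction through a convex curve makes the displayed bar accelerate toward the end. The exponent of 1.5 here is an arbitrary choice; any exponent greater than 1 has the same effect.

```python
def displayed_progress(actual: float, exponent: float = 1.5) -> float:
    """Map true completion (0.0..1.0) to a displayed fraction.

    Any exponent > 1 keeps the endpoints fixed (0 -> 0, 1 -> 1)
    but makes the bar's apparent speed increase toward the end.
    """
    return actual ** exponent

# The bar starts and ends in the same places, but covers more
# ground per step as it approaches completion:
for step in range(11):
    actual = step / 10
    print(f"{actual:.1f} -> {displayed_progress(actual):.2f}")
```

The bar never lies about where it starts or finishes; it only redistributes the perceived speed so the final stretch feels fast.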

Scientist Turns Microscope on Herself

Monday, March 3rd, 2008

Scientist Turns Microscope on Herself:

One of the most fascinating talks at the TED conference so far was given by Jill Bolte Taylor, a neuroanatomist, who gave a riveting account of a stroke she experienced in 1996.

Taylor’s knowledge of the brain made her the perfect witness to her body’s gradual shutdown. Over the course of four hours she watched her body deteriorate in stages, all the while processing its breakdown as if she were a curious explorer taking field notes. The first to go was her perception of herself as separate from the objects around her.

I should step back and say that before she described what happened to her brain and body, she brought out a real brain on stage, with spinal cord attached to it, and explained the distinctions between the functions performed by the right and left hemispheres of the brain. The right hemisphere, she said, is all about the present. It processes information from the sensory systems to give us a picture of the current moment — what it looks, smells, sounds and feels like.

The left hemisphere makes a collage of the present moment, picks out details and categorizes them and associates them with everything in the past that we’ve ever learned and then projects it into the future to determine possibilities. It’s in the left hemisphere, she says, that brain chatter resides, along with the voice that says “I am.” This is the part of the brain that says we’re something separate from the scenery around us, and this is the part of the brain she temporarily lost during her stroke.

So on the morning of December 10, 1996, Taylor awoke with pounding, caustic pain behind her left eye. It came in waves, gripping and releasing her. Nonetheless, she started her morning routine, oblivious to what was happening. She jumped on an exercise machine and looked down at her hands and says they looked like primitive claws to her. She didn’t recognize her body as hers.

“It was as though my consciousness had shifted away from my consciousness of personality to where a mysterious person was having this experience,” she said.

She also couldn’t define the boundaries of where her body ended and the things around her began. The molecules of her arm blended with the molecules in the wall. It made her feel enormous and expansive and connected to all of the energy around her, which gave her a sense of peace.

“Imagine what it would feel like to lose thirty-seven years of emotional baggage,” she said.

It occurred to her that she had to get to work, but then her right arm became paralyzed and that’s when she finally realized she was having a stroke. She says rather than feel panic, her brain said, “Wow, this is so cool” — proof that scientists don’t think like the rest of us.

She decided to call her office but didn’t know the number. So she pulled out a stack of business cards, sifting for one with her work number. It took 45 minutes to get through a third of the cards. By then, however, the hemorrhage had grown and she didn’t know how to work the phone. She waited for a moment of clarity to return — it came in waves — but when she tried to dial the number from one of the cards it just looked like squiggles. She matched the shapes of the squiggles on the card to the squiggles on the phone and eventually reached a colleague. When he answered the phone, all she heard him say was, “Whaa, whaa, whaa” — a bit like the sound the adults in Peanuts cartoons make. When she opened her mouth to respond, the same sound came from her.

Later when she was in the ambulance she felt the energy in her body lift and her spirit surrender.

“In that moment I knew that I was no longer the choreographer of my life,” she said. She woke up later that afternoon, surprised that she was still alive. Two and a half weeks later, surgeons removed a blood clot the size of a golf ball from her skull.

It took her eight years to completely recover.

In praise of the humble but world-changing tuber

Monday, March 3rd, 2008

The Economist speaks in praise of the humble but world-changing tuber:

Unlikely though it seems, the potato promoted economic development by underpinning the industrial revolution in England in the 19th century. It provided a cheap source of calories and was easy to cultivate, so it liberated workers from the land. Potatoes became popular in the north of England, as people there specialised in livestock farming and domestic industry, while farmers in the south (where the soil was more suitable) concentrated on wheat production. By a happy accident, this concentrated industrial activity in the regions where coal was readily available, and a potato-driven population boom provided ample workers for the new factories. Friedrich Engels even declared that the potato was the equal of iron for its “historically revolutionary role”.

The potato promoted free trade by contributing to the abolition of Britain’s Corn Laws — the cause which prompted the founding of The Economist in 1843. The Corn Laws restricted imports of grain into the United Kingdom in order to protect domestic wheat producers. Landowners supported the laws, since cheap imported grain would reduce their income, but industrialists opposed them because imports would drive down the cost of food, allowing people to spend more on manufactured goods. Ultimately it was not the eloquence of the arguments against the Corn Laws that led to their abolition — and more’s the pity. It was the tragedy of the Irish potato famine of 1845, in which 1m Irish perished when the potato crop on which they subsisted succumbed to blight. The need to import grain to relieve the situation in Ireland forced the government, which was dominated by landowners who backed the Corn Laws, to reverse its position.

This paved the way for liberalisation in other areas, and free trade became British policy. As the Duke of Wellington complained at the time, “rotten potatoes have done it all.”

Better Dead Than High

Monday, March 3rd, 2008

Radley Balko notes that drug warriors see people as Better Dead Than High:

Over the years, drug warriors from former Drug Czar William Bennett to current Czar John Walters to recent DEA Administrator Karen Tandy have defended the efficacy of alcohol prohibition. All three have called the experiment a “success,” and the notion that it failed a “myth.”

They insist alcohol prohibition was a success because it reduced alcohol consumption. That assertion itself is debatable, but even assuming they’re right, the argument itself is revealing.

Americans didn’t pass prohibition because there’s something inherently evil about alcohol. They passed it because of the alleged deleterious effects associated with drinking.

To call Prohibition a “success,” you’d have to ignore the precipitous rise in homicides and other violent crime during the period; the rise in hospitalizations due to alcohol poisoning; the number of people blinded or killed by drinking toxic, black-market gin; the corrupting influence of Prohibition on government officials, from beat cops to the halls of Congress to Harding’s attorney general; and the corresponding erosion of the rule of law.

Of course, the 18th Amendment was passed because prohibitionists convinced the country that their movement would alleviate many of these problems. But once Prohibition was in place — and still today among its defenders — it became not about the negative effects of alcohol, but about preventing people from drinking as an end unto itself. Stop people from drinking, and we’ve won. Never mind that the cure was worse than the disease.

In December 2006, the ONDCP put out a triumphant press release celebrating a five-year decline in the use of illicit drugs among teens.

“There has been a substance abuse sea change among American teens,” Walters said in the release. “They are getting the message that dangerous drugs damage their lives and limit their futures. We know that if people don’t start using drugs during their teen years, they are very unlikely to go on to develop drug problems later in life.”

But the following February, the Centers for Disease Control reported that deaths from drug overdoses rose nearly 70 percent over the previous five years.

Half the overdose deaths were attributable to cocaine, heroin, and prescription drugs (the number of overdose deaths caused by marijuana — the drug most targeted by the ONDCP — remained at zero). One of the biggest increases (113%) came among those aged 15-22, the same teenagers Walters was celebrating just three months earlier.

To look at those two figures and conclude that the drug war is moving in the right direction betrays a near-religious devotion to preventing recreational drug use, at any cost.

Prohibition advocates are again measuring success not on how well the drug war is preventing real, tangible harm, but simply on how effectively they’re preventing people from getting high.