Understanding Dictatorships

Saturday, September 12th, 2009

Understanding dictatorships means understanding the difference between totalitarian dictators, like Hitler and Stalin, and Third World dictators, who are often weak and lack control, as in Iran:

The most common misunderstanding centers on the fact that all governments, even dictatorships, need some form of legitimacy to justify their rule to their own people. Otherwise they must revert to brute force, which is both expensive and corrupting to the police and army, who then abuse their respective powers and cause growing public resentment and anger. But while force and fear are temporarily effective, they are not enough for the longer term. A foreign threat thus helps dictators, as it is used to justify their despotic rule. Economic blockades can also reinforce dictatorial power and indeed even make governments richer as they profit from the consequent smuggling and black markets. In the eyes and minds of the conquered, American soldiers certainly do not have “legitimacy,” as we have repeatedly learned.

Understanding how such dictatorships actually function would help Washington to avoid more foreign policy disasters. If Americans better understood the weaknesses of most foreign tyrannies, we’d be less inclined to see them as great threats. Also, we would have to face the reality that administering them effectively would mean establishing a permanent corps of occupation forces on the British or Roman model. Even then modern communications and weaponry might make our rule fail. Tribal societies cannot be easily converted into democracies.

In the old days legitimacy came from the divine right of kings or priests who gained their authority from God. In tribal societies, custom and inherited status have played much the same role. Tribes are ruled by a council of elders on the theory that they have the experience, knowledge, and wisdom to make intelligent decisions.

The Roman emperors claimed religious and Senatorial authority. Later they provided a “rule of law” with safety and free trade for their subjects who previously had only known wars, piracy, and civil strife. Remember that Saint Paul could not be tortured by the police because he was a Roman citizen. Yet even the Romans needed to provide bread and circuses (welfare) to the masses in order to maintain support for their legitimacy.

In modern times democracy provides that legitimacy, hence the extreme measures — including fake elections — dictatorships will go to in order to claim the semblance of lawful control. Wartime, however, was always recognized as needing centralized, unrestrained rule. In ancient Greece even democracies, when at war, would elect a dictator for a year at a time on the theory that only a single ruler could act forcefully without delays and second guessing by committees, elders, or legislators.

In Iraq we learned belatedly that Saddam Hussein ruled through tribal leaders, in particular by bribing and accommodating them. Intimidation was certainly part of his rule, but not the base of it. Washington’s usual war propaganda went all out with stories of his (in particular his sons’) torturing the innocent and killing at will. However, in tribal societies, rape and wanton killings bring about vengeance and are not done lightly. Hussein ruled mainly through his own tribe, relying upon them in key positions of power, a method in accordance with tribal traditions.

Nor was it considered “corrupt” to use government power to profit one’s family, clan, or tribe. Everybody did it! Look at Saudi Arabia, Kuwait, or Afghanistan today. Profiting oneself, one’s clan, and one’s tribe is a tradition stretching back thousands of years. What America calls “corruption” has been the world’s way of life until relatively recently. Saddam Hussein’s original theory of government was Baathist Arab socialism, a form of national socialism or fascism first supported in the Arab world as a way to modernize their nations. But after the First Gulf War in 1991 Hussein reverted to tribal control.

The British, on the other hand, ruled their empire by playing different tribes against each other. They well understood that after generations of war, rape, and pillage most tribes hated their neighbors far more than they hated any foreign enemy. Only in modern times, with the rise of nationalism, did Third World nations finally overthrow European imperialism.

Washington’s plans to create democracy and legitimate government for Iraq and Afghanistan in a few years crash against these traditions. For thousands of years tribal systems have provided for personal and economic security. Such traditions do not change quickly. Clans and tribes provided for widows and orphans (insurance), shared economic scarcity, provided for common defense, and offered vengeance for harm done to their members. (For details see my earlier article “Tribes, Veils and Democracy.”) However, tribal societies are also inherently unjust for smaller tribes and thus are usually unstable and unable to bring much economic development.

The weakness of most Third World dictatorships is evidenced by their dysfunction and poverty. In the case of Iran, for example, gasoline costs only 20 cents per gallon, although much is imported and the government is too incompetent to build more refineries. A strong government would not be subsidizing it. The mullahs used to have legitimacy on the basis of religion, traditional values, and nationalism. Now, however, they’ve lost most of it and depend upon the force of their militia and “Revolutionary Guards.” The reward for these enforcers has been the control of many businesses and even the black market. Yet that easy money corrupts them and makes them more abusive. Still, Iran is demonized in America as if it were a competent state — it isn’t. And its government won’t last.

The former appeal of communism to many Third World leaders was because its ideology gave it a form of legitimacy, justifying the most brutal repression to break down tribal loyalties in the name of throwing off imperialist rule and promising fast economic development. Communist revolutionaries were very cognizant of the political strength of tribal custom and religious fundamentalism. They saw both as being inimical to both their rule and to economic development and tried always to suppress them.

Although communism was effective in throwing off European (and American) colonialism when allied with nationalism, it was so inefficient and unresponsive a system that most regimes collapsed or adopted free-market measures once Soviet subsidies ceased.

Read the whole thing for Jon Basil Utley’s personal experiences in Batista’s Cuba, the Russia of 1987, and Peru after a coup by leftist generals.

A reader going by the alias of Stretchy added this:

“I thought then how Finland had a strong democratic government, not afraid to charge riders for the real costs of public transport.”

A strong democratic government would not be afraid to charge patients the real costs of health care, would not be afraid to charge retirees the real costs of retirement, would not be afraid to make home buyers pay the real costs of a new home, would not be afraid to let farmers receive the real value of their wares, etc.

Why the high-IQ lack common sense

Friday, September 11th, 2009

Bruce Charlton explains why the high-IQ lack common sense:

In previous editorials I have written about the absent-minded and socially-inept ‘nutty professor’ stereotype in science, and the phenomenon of ‘psychological neoteny’ whereby intelligent modern people (including scientists) decline to grow up and instead remain in a state of perpetual novelty-seeking adolescence. These can be seen as specific examples of the general phenomenon of ‘clever sillies’ whereby intelligent people with high levels of technical ability are seen (by the majority of the rest of the population) as having foolish ideas and behaviours outside the realm of their professional expertise. In short, it has often been observed that high IQ types are lacking in ‘common sense’ — and especially when it comes to dealing with other human beings.

General intelligence is not just a cognitive ability; it is also a cognitive disposition. So, the greater cognitive abilities of higher IQ tend also to be accompanied by a distinctive high IQ personality type including the trait of ‘Openness to experience’, ‘enlightened’ or progressive left-wing political values, and atheism.

Drawing on the ideas of Kanazawa, my suggested explanation for this association between intelligence and personality is that an increasing relative level of IQ brings with it a tendency differentially to over-use general intelligence in problem-solving, and to over-ride those instinctive and spontaneous forms of evolved behaviour which could be termed common sense. Preferential use of abstract analysis is often useful when dealing with the many evolutionary novelties to be found in modernizing societies, but is not usually useful for dealing with social and psychological problems for which humans have evolved ‘domain-specific’ adaptive behaviours. And since evolved common sense usually produces the right answers in the social domain, this implies that, when it comes to solving social problems, the most intelligent people are more likely than those of average intelligence to have novel but silly ideas, and therefore to believe and behave maladaptively.

I further suggest that this random silliness of the most intelligent people may be amplified to generate systematic wrongness when intellectuals are in addition ‘advertising’ their own high intelligence in the evolutionarily novel context of a modern IQ meritocracy. The cognitively stratified context of communicating almost exclusively with others of similar intelligence, generates opinions and behaviours among the highest IQ people which are not just lacking in common sense but perversely wrong. Hence the phenomenon of ‘political correctness’ (PC); whereby false and foolish ideas have come to dominate, and moralistically be enforced upon, the ruling elites of whole nations.

His closing note:

I should in all honesty point out that I recognize this phenomenon from the inside. In other words, I myself am a prime example of a ‘clever silly’; having spent much of adolescence and early adult life passively absorbing high-IQ-elite-approved, ingenious-but-daft ideas that later needed, painfully, to be dismantled. I have eventually been forced to acknowledge that when it comes to the psycho-social domain, the commonsense verdict of the majority of ordinary people throughout history is much more likely to be accurate than the latest fashionably brilliant insight of the ruling elite. So, this article has been written on the assumption, eminently challengeable, that although I have nearly always been wrong in the past — I now am right.

A Good Place to Start a Fight

Friday, September 11th, 2009

Eric Falkenstein discusses the politics of insincerity:

A major problem in politics is that it is not optimal for any party to say what they mean. People pound the table as to how innocuous a certain policy is, and how ‘crazy’ anyone must be to be against it. Others highlight a different endgame, a principle, or the insincerity of the policy. This is why Michael Kinsley famously said a ‘gaffe’ is when a politician accidentally tells the truth. Ignorance and bad faith make truth-telling a dominated strategy.

It is important to distinguish between private and public sphere here, as in my private life I can adopt a truth-telling strategy because when I encounter the ignorant and those of bad faith, I can simply avoid those neighbors and friends going forward. In contrast, one must build coalitions in public, and one cannot simply abstain from interacting with such parties. Thus, insincerity is needed much more in public contacts than private contacts [one still needs some insincerity in private, like saying 'your butt doesn't look big in that' to your spouse].

Ignorant people will misinterpret your assertions or plans. The idea that getting rid of the minimum wage helps the poor, or that giving people money to destroy old cars is a waste of money, is a complex assertion that takes an equilibrium argument and is primarily theoretical. The benefits are seen and the costs are unseen. Alternatively, the idea that it is optimal for governments to have 5-year plans for industrial production seemed obvious at one time, based on the fact that one plans before building a bridge. In this case, the error is not in undercounting the unseen, but a flawed analogy.

Then there are those with bad faith. Often these aren’t people out to get you; rather, they see your immediate aim as not in the best interest of their overall plan, and so want to stop it at all costs. Your failure is not their direct aim, but rather consistent with their objective. Their opposition can be direct (‘no new taxes!’), but it can also be indirect, helping the ignorant develop antipathy by clever caricature (‘he wants to hurt small businesses!’).

Thus, people often speak in metaphors based on principles no one is against. For example, in litigation, when asked ‘what is your endgame?’, an honest response would state one’s direct claim against the defendant. This would be a specific demand, but that presents a problem. Perhaps your endgame is something that an ignorant person would find highly dubious or self-serving if discovered. For example, you could merely want to enforce a noncompete agreement, stifling a new competitor. Perhaps your endgame is costing your ex-employee a lot of money to signal to current employees the futility of trying to negotiate for more within the firm. Clearly, these are not sympathetic aims, even if your plan for crushing some plebes is part of a greater good via using your ultimate booty to fund a charity in Africa. So instead of saying something specific you say ‘to protect our intellectual property and enforce valid contracts’. You start broad, and when pressed, get less broad, but always keep at a level where any Sunday school teacher would agree with your goals.

In health care, I think the bottom line is that most people see this as a foot in the door to greater government control of a large segment of our economy, one that will be used for more egalitarian, and politicized, allocation of resources. Democrats in this country like egalitarian redistributions, and ‘politicized’ is just a pejorative for ‘democratized’. Republicans emphasize the inefficiencies of egalitarian distributions, the violations of liberty. As health care is expensive and already highly regulated, it’s sort of like the Balkans of historical Europe, a good place to start a fight on this more fundamental issue.

Lego Thinks Beyond the Brick

Friday, September 11th, 2009

What happens when a McKinsey alum takes over Lego?

Jorgen Vig Knudstorp, 40, a father of four and a McKinsey & Company alumnus who took over as Lego’s chief executive in 2004, made it clear that results, not simply feeling good about making the best toys, would be essential if Lego was to succeed.

“We needed to build a mind-set where nonperformance wasn’t accepted,” Mr. Knudstorp says. Now, “there’s no place to hide if performance is poor,” he says. “You will be embarrassed, and embarrassment is stronger than fear.”
[...]
Founded in 1932 by a local carpenter, Ole Kirk Christiansen, on the principle of “play well,” or “leg godt” in Danish, this privately held company had a very Scandinavian aversion to talking about profits, much less orienting the company around them.

Mr. Christiansen’s family still owns Lego and its business may still be fun and games, but working here isn’t. Before Mr. Knudstorp’s arrival, deadlines came and went, and development time for new toys could stretch out for years; in 2004, the company racked up a $344 million loss.

Now, employee pay is tied to measuring up to management’s key performance indicators (K.P.I.’s, in Lego-speak). And cost-saving touches are encouraged when it comes to designing new toys.

That has helped to lower development time by 50 percent, with some new products moving from idea to box in as little as a year. Mr. Knudstorp’s bottom-line-oriented team, meanwhile, has shifted some manufacturing and distribution from Billund to cheaper locales in Central Europe and Mexico.

Lego is following Apple’s lead:

Last month, it opened its first “concept store” in Concord, N.C., where parents can bring children for birthday parties and classes with master builders; another concept store is set to open near Baltimore this fall. It’s all part of a broader retail expansion that will give Lego 47 retail stores worldwide by year-end, up from 27 in 2007.

They’re also moving “beyond the brick”:

In 2010, the first board game designed by Lego will go on sale in the United States, while its new virtual reality system, Lego Universe, will make its debut on the Web, with children able to act out roles from Lego games and build toys from virtual bricks.

Video games — yes, Lego is there, too — are increasingly important to the company, as are Lego’s legions of adult fans, who can now buy kits to build architect-designed models of Frank Lloyd Wright’s Fallingwater and the Guggenheim Museum. What’s more, the company is in talks with Warner Brothers about a mixed live-action and animation Lego-themed movie that would move the company and its Lego brand even further into the Hollywood orbit.

“Developing a movie doesn’t come cheap,” says Soren Torp Laursen, a 23-year Lego employee who heads its North American operations. “But five years ago, we were in the midst of a crisis, and now we’re in a growth phase. We are definitely taking bigger risks than we previously did.”

Those bigger risks seem to be paying off, at least so far:

Amid a 5 percent drop in total United States toy sales last year and the industry’s worst holiday season in three decades, according to Sean McGowan, an analyst at Needham & Company, Lego’s sales surged 18.7 percent in 2008. And despite a worsening global recession, Lego powered through the first half of 2009, with a 23 percent sales increase over the period a year earlier. It earned $355 million before taxes last year, and $178 million in the first half of 2009.

The numbers are all the more impressive given the sales declines this year at the two biggest toymakers, Mattel and Hasbro.

It looks like everything about the company has improved, including its supply chain:

John Barbour, a former top executive of Toys “R” Us, recalls “a series of truly frustrating meetings” with Lego officials in Billund and New York at the beginning of the decade, which climaxed when Mr. Barbour bluntly told them that Toys “R” Us cared more about the Lego brand than they did.

The most popular toys would run out, he recalls, and Lego was simply unable to ship more or manage the complex process of producing the plastic pieces for its most complicated sets.

That began to change in 2004, after Mr. Knudstorp took over in Billund and Mr. Laursen arrived at Lego’s regional headquarters in Enfield, Conn. Besides reaching out to top retailers and cutting costs, they untangled a supply chain that churns out 29 billion pieces a year.

The changes also filtered down to the ranks of Lego’s toy designers, says Paal Smith-Meyer, head of Lego’s new-business group. The number of different bricks or elements that go into Lego toys has shrunk to less than 7,000 from roughly 13,000, and designers are encouraged to reuse parts, so that a piece of an X-wing fighter from the “Star Wars” series might end up in Indiana Jones’s jeep or a pirate ship.

That’s very different from when Mr. Smith-Meyer joined Lego a decade ago. Though creating a mold to make a new plastic element might cost 50,000 euros on average, he recalls that 90 percent of new elements were developed and used just one time.

Nowadays, Mr. Smith-Meyer says, “you have to design for Lego. If you want to design for yourself, go be an artist.”

People Lie About Alpha

Friday, September 11th, 2009

If you take risks and make money, Eric Falkenstein notes, then, after the fact, everyone says that you took good risks, but if you take risks and lose money, well, you were just being foolish.

So people lie about alpha. They pretend that their returns from taking risks (beta-bets) are risk-adjusted returns (pure alpha). But then, it has long been the case that successful people are good at doing one thing while saying they are doing another:

Augustus Caesar was successful because unlike Julius Caesar he appeased the senators by making it seem like he restored the Republic (where the senate is in charge), when in practice he probably had more power than Julius Caesar. When unions are successful they promote their agenda by appealing to how they are helping their customers, assiduously maintaining quality via their exclusionary rules. Affirmative Action was successful because proponents said it definitely does not imply quotas. The key is that many large strategies involve duplicity.
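To make the alpha-versus-beta distinction concrete, here is a minimal sketch (mine, not Falkenstein’s, with an invented return series) of the standard check: regress a fund’s excess returns on the market’s excess returns. The slope is beta, the reward for bearing market risk; the intercept is alpha, the risk-adjusted part. A fund that simply levers up market exposure can post impressive raw returns while its alpha stays near zero.

```python
import numpy as np

# Invented data: a "closet beta" fund that is just 1.5x the market plus noise.
rng = np.random.default_rng(0)
market = rng.normal(0.005, 0.04, 120)           # monthly market excess returns
fund = 1.5 * market + rng.normal(0, 0.01, 120)  # levered exposure, no skill

# CAPM-style regression: fund = alpha + beta * market + noise
beta, alpha = np.polyfit(market, fund, 1)       # returns (slope, intercept)

print(f"raw mean monthly return: {fund.mean():.4f}")  # can look great
print(f"beta:  {beta:.2f}")    # ~1.5: the returns are mostly market risk
print(f"alpha: {alpha:.4f}")   # ~0: little is left after risk adjustment
```

Claiming the raw number as skill, rather than the near-zero intercept, is exactly the lie described above.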

Glitches and Denial

Friday, September 11th, 2009

In self-defense, catastrophic failures come from glitches, Rory Miller says — from choking:

I’ve put a man through scenarios, a big, tough jail guard with probably a hundred fights under his belt and he could not point a real gun at another human being. He wasn’t even aware that he wasn’t doing it. I’ve seen another who curled up and ‘died’ when hit with a plastic bullet. I’ve seen the 6’4″ former marine who ran and hid from inmates and the 5’2″ single mom with no training or experience who fought like a tiger. And the blackbelt with a roomful of trophies who still freezes though no one could ever call him a rookie. What you believe about yourself, all the stories, all the logical progression (“I’ve been training for this for ten years, I’ve been hit by blackbelts, surely I won’t freeze!”) doesn’t have a whole lot of bearing on how you will perform.

Paranoid survivor

Thursday, September 10th, 2009

The Economist calls Andy Grove Dr. Andrew Grove — and a paranoid survivor with a barbed wit:

Instead [of blowing his own trumpet as former boss of Intel] he started by displaying a headline from the Wall Street Journal heralding the recent takeover of General Motors by the American government as the start of “a new era”. He gave a potted history of his own industry’s spectacular rise, pointing out that plenty of venerable firms — with names like Digital, Wang and IBM — were nearly or completely wiped out along the way.

Then, to put a sting in his Schumpeterian tale, he displayed a fabricated headline from that same newspaper, this one supposedly drawn from a couple of decades ago: “Presidential Action Saves Computer Industry”. A fake article beneath it describes government intervention to prop up the ailing mainframe industry. It sounds ridiculous, of course. Computer firms come and go all the time, such is the pace of innovation in the industry. Yet for some reason this healthy attitude towards creative destruction is not shared by other industries. This is just one of the ways in which Dr Grove believes that his business can teach other industries a thing or two. He thinks fields such as energy and health care could be transformed if they were run more like the computer industry — and made greater use of its products.

Grocery Check-Out Times

Thursday, September 10th, 2009

Dan Meyer spent 90 minutes watching, counting, and timing groceries as they slid across a scanner — and he found out that each extra item costs you less than three seconds, but each extra person in front of you costs 48 extra seconds:

I don’t find it at all surprising that the y-intercept is non-zero, and that one more person adds as much time as 17 more items — or that the express lane isn’t faster.
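The implied model is linear, and worth writing down. Here is a quick sketch using the figures above (roughly 2.8 seconds per item and 48 seconds of per-person overhead; the function itself is just illustrative):

```python
# Figures from the post: ~2.8 seconds per extra item, ~48 seconds of
# fixed overhead per extra person (paying, bagging, chatting).
SECONDS_PER_ITEM = 2.8
SECONDS_PER_PERSON = 48.0

def expected_wait(people_ahead: int, total_items: int) -> float:
    """Seconds of waiting implied by the linear model."""
    return people_ahead * SECONDS_PER_PERSON + total_items * SECONDS_PER_ITEM

# One extra person costs about as much as 17 extra items:
print(SECONDS_PER_PERSON / SECONDS_PER_ITEM)  # ~17.1

# Why the express lane can lose: four shoppers with 10 items each
# take longer than one shopper with a 40-item cart.
print(expected_wait(4, 40))  # express lane: 304 seconds
print(expected_wait(1, 40))  # one full cart: 160 seconds
```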

(Hat tip to Tyler Cowen.)

City of Dreams

Thursday, September 10th, 2009

Drake Bennett of The Boston Globe writes about Paul Romer’s city of dreams, the charter city:

Central to Romer’s proposal, no matter what the exact structure, is the belief that the developed world has lessons to teach poorer countries about how to deter violence, spur trade, incorporate new technologies, and train a workforce, and that we shouldn’t let political correctness blind us to that fact. Along with the land, a charter city’s labor would mostly come from the host nation, but, as in colonial Hong Kong, the laws would be based on those that have worked elsewhere.

And like charter schools, charter cities would work in part by showing up their neighbors, drawing workers and business away from existing cities and forcing them to adapt and modernize.

“It’s about choices,” Romer says. “No one would have to move to a charter city, but the point is we want to provide people with the sort of choices that many in poor countries currently don’t have.”

If charter cities act like charter schools, shouldn’t we expect the same resistance from Third World dictators that we see from teachers’ unions?

Italic Handwriting

Thursday, September 10th, 2009

Inga Dubay and Barbara Getty have apparently written a series of books on italic handwriting, which is baffling when you consider that their advice can be condensed down to this: don’t use loopy cursive; just attach your letters when you print, if you feel like it.

This deserves a New York Times piece? Anyway, if you already print and occasionally join your letters, you may want to know that the loopy cursive you’ve given up is known in educational circles as the Palmer method.

Credit Laundering

Thursday, September 10th, 2009

Insight — which is more than mere knowledge — generally comes through personal connections, Cringely argues, rather than books — and so far we’ve had to create campuses and pay $50,000 per year to enjoy such personal connections:

That no longer makes sense.

Education, which — along with health care — seems to exist in an alternate economic universe, ought to be subject to the same economic realities as anything else. We should have a marketplace for insight. Take a variety of experts (both professors and lay specialists) and make them available over the Internet by video conference. Each expert charges by the minute with those charges adjusting over time until a real market value is reached. The whole setup would run like iTunes and sessions would be recorded for later review.

Remember, all lectures are also available online for free. What costs is the personal touch.

Say a particularly good professor wants to make $200,000 per year by working no more than 20 hours per week or about 1000 hours per year. That gives them a billing rate of $200 per hour.

Now look back at your university career. How much one-on-one time did you actually get with the professors who really influenced your life? I did the calculation and came up with about two hours per week, max. Imagine a four-year undergraduate career running 30 weeks per year — 120 total weeks of school — times two hours of insight per week for a total of 240 hours. At $200 per hour the cost comes to $48,000 or $12,000 per year.
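Cringely’s arithmetic is easy to check. A quick sketch of the same numbers (the 50-week year is an assumption to match his “about 1000 hours per year”):

```python
# Reproducing the back-of-the-envelope numbers from the quote.
target_salary = 200_000             # dollars per year
billable_hours = 20 * 50            # 20 hours/week, ~50 weeks/year (assumed)
rate = target_salary / billable_hours
print(rate)                         # 200.0 dollars per hour

school_weeks = 4 * 30               # four years, 30 weeks each
insight_hours = school_weeks * 2    # two one-on-one hours per week
print(insight_hours * rate)         # 48000.0 total, or $12,000 per year
```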

That’s a huge savings compared to the $200,000+ an MIT-level education would cost today (remember the MIT online degree — there is one — costs the same as if you were attending in Cambridge). And ideally the pool of insightful experts would be far greater than any one university could ever employ. And that’s the point of this exercise; it can’t be an emulation of a traditional university, because that would inevitably disappoint — it has to be in at least one way clearly, obviously, stupendously better than what’s available now.

This brings up the subject of Straighterline, a new online quasi-university charging $99 per month for “all you can eat”:

Straighterline has a problem with accreditation — they can’t get it. So they cut deals with no-name schools to effectively launder their credits, passing them on to third-party schools. I see nothing wrong with this, but in time Straighterline or schools like it will have to take a more direct approach to the problem of gaining acceptance. The University of Phoenix did that through the simple expedient of offering real classes all over the place and charging a lot more than $99 per month for all-you-can-learn. Exciting as that price is, it is precisely what scares the crap out of many established colleges.

If I were running Straighterline, then, I’d get ready to file a big restraint of trade lawsuit against some big vulnerable school caught up in, say, an NCAA athletic recruiting scandal. “Pick your targets carefully,” Pa Cringely always said.

The other thing I would strongly recommend is that Straighterline put some big bucks into recruiting its own stellar faculty. Spend whatever it takes to get the top people in some discipline to start. Hire academics if you can and lay practitioners if you can’t. Most academic contracts don’t prohibit teaching part-time elsewhere, and if they do try to stop the practice, well, that’s just a further example of restraint of trade.

Lost world of fanged frogs and giant rats discovered in Papua New Guinea

Thursday, September 10th, 2009

Biologists from Oxford University, the London Zoo, and the Smithsonian Institution have discovered a lost world in Papua New Guinea’s Bosavi crater:

A team of scientists from Britain, the United States and Papua New Guinea found more than 40 previously unidentified species when they climbed into the kilometre-deep crater of Mount Bosavi and explored a pristine jungle habitat teeming with life that has evolved in isolation since the volcano last erupted 200,000 years ago. In a remarkably rich haul from just five weeks of exploration, the biologists discovered 16 frogs which have never before been recorded by science, at least three new fish, a new bat and a giant rat, which may turn out to be the biggest in the world.
[...]
They found the three-kilometre-wide crater populated by spectacular birds of paradise. In the absence of big cats and monkeys, which are found in the remote jungles of the Amazon and Sumatra, the main predators are giant monitor lizards, while kangaroos have evolved to live in trees. New species include a camouflaged gecko, a fanged frog and a fish called the Henamo grunter, named because it makes grunting noises from its swim bladder.

View the photo gallery.

Thin-Film Solar Startup Debuts With $4 Billion in Contracts

Thursday, September 10th, 2009

A startup with a secret recipe for printing cheap solar cells on aluminum foil debuted today — with $4 billion in contracts:

Nanosolar’s technology consists of sandwiches of copper, indium, gallium and selenide (CIGS) that are 100 times thinner than the silicon solar cells that dominate the solar photovoltaics market. Its potential convinced Google founders Sergey Brin and Larry Page to back the company as angel investors in its early days.

Two big announcements marked its coming out party: The company has $4 billion in contracts and can make money selling its products for $1 per watt of a panel’s capacity. That’s cheap enough to compete with fossil fuels in markets across the world.

Small solar farms should face fewer NIMBY hurdles than big coal or nuclear plants.

Traditional solar cells can reach higher efficiencies — 40 percent versus 16 percent — but they require a lot of silicon, and they’re not cheap.
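For a sense of why dollars per watt, not efficiency, is the headline number, here is a rough levelized-cost sketch. Only the $1-per-watt panel price comes from the article; every other figure is an invented assumption for illustration:

```python
# Rough levelized-cost-of-energy sketch. Only the $1/watt panel price
# is from the article; the rest are illustrative assumptions.
panel_cost_per_watt = 1.00   # article's figure, dollars per watt of capacity
balance_of_system = 1.00     # assumed: inverters, mounting, wiring, land
capacity_factor = 0.20       # assumed: average output vs. nameplate rating
lifetime_years = 25          # assumed panel lifetime

lifetime_kwh_per_watt = capacity_factor * 24 * 365 * lifetime_years / 1000
cost_per_kwh = (panel_cost_per_watt + balance_of_system) / lifetime_kwh_per_watt
print(f"{cost_per_kwh:.3f} $/kWh")  # ~0.046 under these assumptions
```

A watt of capacity is a watt of capacity; the less efficient panel just needs more area to deliver it, which is why the cheaper process can still win.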

Storing All the Stuff We Accumulate

Thursday, September 10th, 2009

Why are we storing all the stuff we accumulate?

The first modern self-storage facilities opened in the 1960s, and for two decades storage remained a low-profile industry, helping people muddle through what it terms “life events.” For the most part, storage units were meant to temporarily absorb the possessions of those in transition: moving, marrying or divorcing, or dealing with a death in the family. And the late 20th century turned out to be a golden age of life events in America, with peaking divorce rates and a rush of second- and third-home buying. At the same time, the first baby boomers were left to face down the caches of heirlooms and clutter in their parents’ basements.

But by the end of the ’90s, there seemed to be almost limitless, pent-up demand for storage around the country, more than life events readily explained. Storage was seen as an invincible investment and became the go-to solution for developers with awkward, leftover scraps of land. After an industry report found that Hawaii ranked among the states with the least amount of storage space in the nation, storage barons rushed in, almost doubling the available square footage there between 2004 and 2007. One man converted a network of caves on Oahu, used to house munitions during World War II, into a storage facility. (The caves are naturally climate-controlled, perfect for wine.) Around the United States, newcomers to the industry were building even against the advice of their expert consultants. “We were cranking these things out at exponential rates,” an industry veteran named Tom Litton told me. “It was just nuts.”

Litton’s parents owned one of the earliest storage facilities, in Tucson. He now has two of his own, both in California, and manages 23 others. Among the ones he built and has since sold is a stunning 1,000-unit glass-fronted complex in Antioch. It could pass for a small corporate headquarters and is one of seven storage facilities within five miles of Statewide in either direction along Highway 4.

Across America, from 2000 to 2005, upward of 3,000 self-storage facilities went up every year. Somehow, Americans managed to fill that brand-new empty space. In June, Public Storage, the industry’s largest chain, reported that its 2,100 facilities in 38 states were, on average, still about 91 percent full. It raises a simple question: where was all that stuff before?

“A lot of it just comes down to the great American propensity toward accumulating stuff,” Litton explained. Between 1970 and 2008, real disposable personal income per capita doubled, and by 2008 we were spending nearly all of it — all but 2.7 percent — each year. Meanwhile, the price of much of what we were buying plunged. Even by the early ’90s, American families had, on average, twice as many possessions as they did 25 years earlier. By 2005, according to the Boston College sociologist Juliet B. Schor, the average consumer purchased one new piece of clothing every five and a half days.

Schor has been hacking intrepidly through the jumble of available data quantifying the last decade’s consumption spree. Between 1998 and 2005, she found, the number of vacuum cleaners coming into the country every year more than doubled. The number of toasters, ovens and coffeemakers tripled. A 2006 U.C.L.A. study found middle-class families in Los Angeles “battling a nearly universal overaccumulation of goods.” Garages were clogged. Toys and outdoor furniture collected in the corners of backyards. “The home-goods storage crisis has reached almost epic proportions,” the authors of the study wrote. A new kind of customer was being propelled, hands full, into self-storage.

“A lot of the expansion we experienced as an industry was people choosing to store,” Litton told me. A Self Storage Association study showed that, by 2007, the once-quintessential client — the family in the middle of a move, using storage to solve a short-term, logistical problem — had lost its majority. Fifty percent of renters were now simply storing what wouldn’t fit in their homes — even though the size of the average American house had almost doubled in the previous 50 years, to 2,300 square feet.

Consider our national furniture habit. In an unpublished paper, Schor writes that “anecdotal evidence suggests an ‘Ikea effect.’ ” We’ve spent more on furniture even as prices have dropped, thereby amassing more of it. The amount entering the United States from overseas doubled between 1998 and 2005, reaching some 650 million pieces a year. Comparing Schor’s data with E.P.A. data on municipal solid waste shows that the rate at which we threw out old furniture rose about one-thirteenth as fast during roughly the same period. In other words, most of that new stuff — and any older furniture it displaced — is presumably still knocking around somewhere. In fact, some seven million American households now have at least one piece of furniture in their storage units. Furniture is the most commonly stored thing in America.

The marketing consultant Derek Naylor told me that people stockpile furniture while saving for bigger or second homes but then, in some cases, “they don’t want to clutter up their new home with all the things they have in storage.” So they buy new, nicer things and keep paying to store the old ones anyway. Clem Tang, a spokesman for Public Storage, explains: “You say, ‘I paid $1,000 for this table a couple of years ago. I’m not getting rid of it, or selling it for 10 bucks at a garage sale. That’s like throwing away $1,000.’ ” It’s not a surprising response in a society replacing things at such an accelerated rate — this inability to see our last table as suddenly worthless, even though we’ve just been out shopping for a new one as though it were.

“My parents were Depression babies,” Litton told me, “and what they taught me was, it’s the accumulation of things that defines you as an American, and to throw anything away was being wasteful.” The self-storage industry reconciles these opposing values: paying for storage is, paradoxically, thrifty. “That propensity toward consumption is what fueled the world’s economy,” Litton said. The self-storage industry almost had to expand; it grew along with the volume of container ships reaching our ports. (Some storage facilities I visited in California are, in fact, made of shipping containers, which became surplus goods themselves as our trade deficit grew.)

By 2007, a full 15 percent of customers told the Self Storage Association they were storing items that they “no longer need or want.” It was the third-most-popular use for a unit and was projected to grow to 25 percent of renters the following year. The line between necessity and convenience — between temporary life event and permanent lifestyle — totally blurred.

How the moustache won an empire

Wednesday, September 9th, 2009

Piers Brendon (The Decline and Fall of the British Empire) pokes fun at how the moustache won an empire:

As Britain’s influence stretched across the globe, the moustaches worn by our fighting men and leaders flourished, but by the time of the postcolonial humiliation of Suez in 1956, the prime minister of the day, Anthony Eden, sported an apologetic, hardly noticeable growth.