Plastic bags are thought to endanger marine animals

Tuesday, May 14th, 2019

Plastic bags are thought to endanger marine animals, but they may protect us humans:

San Francisco County was the first major US jurisdiction to enact such a regulation, implementing a ban in 2007 and extending it to all retailers in 2012. There is evidence, however, that reusable grocery bags, a common substitute for plastic bags, contain potentially harmful bacteria, especially coliform bacteria such as E. coli. We examine deaths and emergency room admissions related to these bacteria in the wake of the San Francisco ban. We find that both deaths and ER visits spiked as soon as the ban went into effect. Relative to other counties, deaths in San Francisco increase by 50-100 percent, and ER visits increase by a comparable amount. Subsequent bans by other cities in California appear to be associated with similar effects.

“The curious task of economics is to demonstrate to men how little they really know about what they imagine they can design.”

Debt is free and Western criticisms of excessive infrastructure investment are nonsense

Friday, May 10th, 2019

T. Greer describes the central problems with China’s Belt and Road Initiative:

There is also a gap between how BRI projects are supposed to be chosen and how they actually have been selected. Xi and other party leaders have characterized BRI investment in Eurasia as following along defined “economic corridors” that would directly connect China to markets and peoples in other parts of the continent. By these means the party hopes to channel capital into areas where it will have the largest long-term benefit and will make cumulative infrastructure improvements possible.

This has not happened: one analysis of 173 BRI projects concluded that with the exception of the China-Pakistan Economic Corridor (CPEC) “there appears to be no significant relationship between corridor participation and project activity… [suggesting that] interest groups within and outside China are skewing President Xi’s signature foreign policy vision.”

This skew is an inevitable result of China’s internal political system. BRI projects are not centrally directed. Instead, lower state bodies like provincial and regional governments have been tasked with developing their own BRI projects. The officials in charge of these projects have no incentive to approve financially sound investments: by the time any given project materializes, they will have been transferred elsewhere. BRI projects are shaped first and foremost by the political incentives their planners face in China: There is no better way to signal one’s loyalty to Xi than by laboring for his favored foreign-policy initiative. From this perspective, the most important criterion for a project is how easily the BRI label can be slapped onto it…

The problems China has had with the BRI stem from contradictions inherent in the ends party leaders envision for the initiative and the means they have supplied to reach them. BRI projects are chosen through a decentralized project-management system and then funded through concessional loans offered primarily by PRC policy banks. This is a recipe for cost escalation and corruption. In countries like Cambodia, a one-party state ruled by autocrats, this state of affairs is viable, for there is little chance that leaders will be held accountable for lining their pockets (or, more rarely, the coffers of their local communities) at the entire nation’s expense. But most BRI countries are not Cambodia. In democracies this way of doing things is simply not sustainable, and in most BRI countries it is only a matter of time before an angry opposition, eager to pin malfeasance on its opponents, comes to power, armed with evidence of misplaced or exploitative projects.

He goes on to cite Andrew Batson’s explanation:

Local governments discovered they could borrow basically without limit to fund infrastructure projects, and despite many predictions of doom, those debts have not yet collapsed. The lesson China has learned is that debt is free and that Western criticisms of excessive infrastructure investment are nonsense, so there is never any downside to borrowing to build more infrastructure. China’s infrastructure-building complex, facing diminishing returns domestically, is now applying that lesson to the whole world.

In Belt and Road projects, foreign countries simply take the place of Chinese local governments in this model (those who detect a neo-imperial vibe around the Belt and Road are, in this sense, onto something). Even the players are the same. In the 1990s, China Development Bank helped invent the local-government financing vehicle structure that underpinned the massive domestic infrastructure boom. Now, China Development Bank is one of the biggest lenders for overseas construction projects.

Those who defend the Belt and Road against the charge of debt-trap diplomacy are technically correct. But those same defenders also tend to portray the lack of competitive tenders and over-reliance on Chinese construction companies in Belt and Road projects as “problems” that detract from the initiative’s promise. They miss the central role of the SOE infrastructure-complex interest group in driving the Belt and Road. Structures that funnel projects funded by state banks to Chinese SOEs aren’t “problems” from China’s perspective – they are the whole point.

Economists love property taxes, but no one else does

Wednesday, April 17th, 2019

Economists love property taxes, but no one else does:

When the value of land rises, it’s generally not because of something the landowner has done. The resulting rents and other monetary gains, Adam Smith wrote in 1776, “are a species of revenue which the owner, in many cases, enjoys without any care or attention of his own.”

This made the landowner, Smith continued in The Wealth of Nations, an excellent target for taxation.

[...]

Income taxes and Social Security contributions are withheld from paychecks before the recipients get their hands on the money. Sales taxes (and value-added taxes outside the U.S.) are remitted by merchants and other businesses. It’s only with property taxes that a regular person gets a bill and has to pay it.

There’s clearly something to that, although for many homeowners property taxes are bundled into mortgage payments and thus a bit less visible. Still, I can think of at least two other reasons for property taxes’ unpopularity that are actually side effects of what economists like about them. To wit:

  1. Property tax bills can rise without property owners doing anything, and
  2. Rising tax bills can push property owners (homeowners in particular) to make economic decisions they might prefer to avoid.

People can adjust their spending, and often their income. But they can’t help it if, say, house prices go up 80 percent in just three years — as they did in California from 1975 to 1978. Well, actually, they could help it, by going to the polls in June 1978 and approving Proposition 13, a set of restrictions on property tax rates and assessments that have shaped the state’s economy and government ever since.

[...]

Also, taxing property is in general more problematic politically than it was back when Henry George’s ideas were in vogue in the late 1800s and early 1900s — because homeowners have gone from a minority of the U.S. population to a majority with an especially high propensity to vote.

The contemporary world is always testing his belief in central banking

Sunday, April 14th, 2019

Tyler Cowen considers some arguments for a gold standard:

Historical data indicates that industrial production volatility was not higher before 1914, when the U.S. was on the gold standard, than after 1947, when it mostly wasn’t. And there are similar results for the volatility of unemployment. That’s not quite an argument for the gold standard, but it should cause opponents of the gold standard to think twice. Whatever the imperfections of a gold standard might be, monetary authorities make a lot of mistakes, too.

Furthermore, in the broader historical context, including the more distant past, the gold standard doesn’t look so bad. The age of the gold standard (and sometimes silver standard, and sometimes bimetallism) in the 19th century was largely one of peace and economic growth, running from 1815 until World War I. The fiat money era that followed was a disaster, as the 1920s brought monetary chaos, competitive devaluations, and even some hyperinflations and deflations, a few of which were driven by the desire to restore the old gold par at incorrect rates. It would have been better had the world managed to keep its gold-centered monetary order of 1913.

Even the Bretton Woods arrangement, which has a good record in terms of stability and growth, involved gold convertibility of a sort, albeit with no domestic convertibility and lots of pressures to discourage actual conversion from foreigners. Once the tie of the dollar to gold broke entirely in the early 1970s, inflation and interest rates ran high, and monetary chaos again followed. From the vantage point of, say, 1979, some form of gold standard really did seem better.

What was not obvious then was that monetary policy was going to be so good and so stable for the next four decades, albeit with a number of mistakes. Today’s case for the gold standard is based on the view that these recent decades of good fiat money management are a historical outlier and cannot be sustained. I don’t share that opinion, but neither do I think it is crazy or a sign of extreme ignorance.

So why don’t I favor a gold standard? First, governments have a long history of interfering with gold standards, for better or worse. So it doesn’t really remove politics from monetary policy. Second, central banks should respond with extreme countercyclical pressure when a financial crisis hits, such as in 2008. That is harder to do with a gold standard, and usually it requires the suspension of gold convertibility. Third, the price of gold is now greatly influenced by demand from China and India, and it seems unwise for that to partially drive what is in essence U.S. monetary policy. Most generally, I still think central bank governance can do a better job than a gold-based system that sometimes creates excess deflationary pressures.

Nonetheless, the contemporary world is always testing my belief in central banking.

The key is not options but obliquity

Wednesday, March 27th, 2019

Eric Falkenstein explains why Taleb’s Antifragile book is a fraud:

In Nassim Taleb’s book Antifragile he emphasizes that ‘if you see a fraud and do not say fraud, you are a fraud.’ I am thus compelled to note that Antifragile is a fraud because its theme is based on intentional misdirection. The most conspicuous and popular examples he presents are also explicitly mentioned as not the essence of antifragility. Indeed, incoherence is Taleb’s explicit strategy; as the Wikipedia entry on Antifragility notes, Taleb presents his book in a way that makes it difficult to criticize. He tried to squeeze a weltanschauung onto the Procrustean bed of his Black Swan and generated a synecdoche that confuses the part with the whole.

[...]

There are two ways to generate an option payoff. One is to buy an option; another is via dynamic replication, which involves doubling down on a position as it becomes more in-the-money. The outsized success of winners over losers in dynamic systems generates large convexities, but to be a winner, the key is not buying options but rather obliquity, the process of achieving success indirectly via a combination of vision, excellence, resilience, and flexibility. To describe the essence of this as being convex distracts people from focusing on their strengths, areas where they have, in a sense, insider information. Meanwhile, simple hormesis helps generate the efficiency and resiliency that allow firms/organisms to outgrow their competitors, which is why everyone mentions examples of hormesis, waves their hands, and hopes no one notices the bait-and-switch.

Promoting the new idea that acquiring options on the next Black Swan is the basis of “our own existence as a species on this planet” is the sort of hyperbole you hear at TED talks. It is the sort of thing bureaucrats love because they are generally too high up to have much domain-specific expertise, and the incoherent but plausible-sounding theme allows one to talk about strategy without actually knowing anything specific. Then you give examples of your great idea that are really something else entirely, and fade to black…
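
Falkenstein’s second route to an option payoff, dynamic replication, is easy to make concrete. Below is a minimal Python sketch (my own illustration, not his; the strike, volatility, and other parameters are arbitrary assumptions) that never buys an option but simply holds more of the asset as the position moves into the money, using the textbook Black-Scholes delta:

    import math, random

    # A sketch of dynamic replication (illustrative assumptions throughout):
    # hold N(d1) units of the asset, i.e. buy as the position moves
    # into-the-money and sell as it moves out, financing at zero interest.

    random.seed(0)

    def norm_cdf(x):
        return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

    S0, K, sigma, T = 100.0, 100.0, 0.2, 1.0  # spot, strike, vol, horizon (assumed)
    steps = 252
    dt = T / steps

    def replicate_one_path():
        """Delta-replicate along one simulated price path; return (S_T, P&L)."""
        S, cash, shares = S0, 0.0, 0.0
        for i in range(steps):
            t_left = T - i * dt
            d1 = (math.log(S / K) + 0.5 * sigma ** 2 * t_left) / (sigma * math.sqrt(t_left))
            target = norm_cdf(d1)           # call delta: grows as S rises above K
            cash -= (target - shares) * S   # "double down" on the way up
            shares = target
            S *= math.exp(-0.5 * sigma ** 2 * dt + sigma * math.sqrt(dt) * random.gauss(0.0, 1.0))
        return S, cash + shares * S

    for S_T, pnl in sorted(replicate_one_path() for _ in range(10)):
        print(f"terminal price {S_T:7.2f}   strategy P&L {pnl:8.2f}")
    # Low terminal prices lose a roughly fixed amount (about the option premium);
    # high ones gain about S_T - K minus that premium: a convex, option-like payoff.

Across simulated paths the strategy’s profit and loss traces out roughly max(S_T - K, 0) minus a near-constant cost (about the Black-Scholes premium); the convexity comes from the trading rule, not from any contract.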

Their overriding goal is not enlightenment

Thursday, March 14th, 2019

The admissions scandal is an opportunity to separate the lofty mythology of college from the sordid reality:

Despite the grand aspirations that students avow on their admission essays, their overriding goal is not enlightenment, but status.

Consider why these parents would even desire to fake their kids’ SAT scores. We can imagine them thinking, I desperately want my child to master mathematics, writing and history — and no one teaches math, writing and history like Yale does! But we all know this is fanciful. People don’t cheat because they want to learn more. They cheat to get a diploma from Yale or Stanford — modernity’s preferred passport to great careers and high society.

What, then, is the point of sneaking into an elite school, if you lack the ability to master the material? If the cheaters planned to major in one of the rare subjects with clear standards and well-defined career paths — like computer science, electrical engineering or chemistry — this would be a show-stopping question. Most majors, however, ask little of their students — and get less. Standards were higher in the 1960s, when typical college students toiled about 40 hours a week. Today, however, students work only two-thirds as hard. Full-time college has become a part-time job.

If computer-science students slacked off like this, employers would soon notice. Most of their peers, however, have little reason to dread a day of reckoning — because, to be blunt, most of what college students study is irrelevant in the real world. Think of all the math, history, science, poetry and foreign language you had to study in school — if you can. Indeed, you’ve probably long since forgotten most of what you learned about these subjects. Few of us use it, so almost all of us lose it. The average high school student studies a foreign language for a full two years, but, according to my own research, less than 1% of American adults even claim they gained fluency in a classroom.

Why do employers put up with such a dysfunctional educational system? Part of the answer is that government and donors lavish funding on the status quo with direct subsidies, student loans and alumni donations. As a result, any unsubsidized alternative, starved of resources, must be twice as good to do half as well. The deeper answer, though, is that American higher education tolerably performs one useful service for American business: certification. Most students at places like Yale and Stanford aren’t learning much, but they’re still awesome to behold if you’re looking to fill a position. Ivy Leaguers are more than just smart; when tangible rewards are on the line, they’re hardworking conformists. They hunger for conventional success. From employers’ point of view, it doesn’t matter if college fosters these traits or merely flags them. As long as elite students usually make excellent employees, the mechanism doesn’t matter.

So why cheat your kid into the Ivy League or a similarly elite school? For the lifelong benefits of corrupt certification. When I was in high school, my crusty health teacher loved to single out a random teen and scoff, “You’re wanted … for impersonating a student.” If you can get your less-than-brilliant, less-than-driven child admitted, he’ll probably get to impersonate a standardly awesome Ivy League graduate for the rest of his life. Of course, the superrich parents the FBI is accusing could have just let their kids skip college and live off their trust funds, but it’s not merely a matter of money. It’s also about youthful self-esteem — and parental bragging rights.

The Complexity of the World repeatedly makes fools of them

Thursday, February 7th, 2019

Bryan Caplan is a fan of dystopian fiction, but he had overlooked Henry Hazlitt’s The Great Idea (subsequently republished as Time Will Run Back) until last December, because he had feared a long-winded, clunky version of Economics in One Lesson — but he gave it a chance, and his gamble paid off:

I read the whole thing (almost 400 pages) on a red-eye flight – feeling wide awake the whole way.

The book’s premise: Centuries hence, mankind groans under a world Communist government centered in Moscow. People live in Stalinist fear and penury. Censorship is so extreme that virtually all pre-revolutionary writings have been destroyed; even Marx has been censored, to prevent anyone from reverse engineering whatever “capitalism” was. However, due to a marital dispute, Peter Uldanov, the dictator’s son, was raised in an island paradise, free of both the horrors and the rationalizations of his dystopian society. When the dictator nears death, he brings Peter to Moscow and appoints him his heir. The well-meaning but naive Peter is instantly horrified by Communism, and sets out to fix it. In time, he rediscovers free-market economics, and sets the world to rights.

Yes, this sounds trite to me, too. But Hazlitt is a master of pacing. It takes almost 200 pages before any of Peter’s reforms start to work. Until then, it’s one false start after another, because so many of the seemingly dysfunctional policies of the Stalinist society are remedies for other dysfunctional policies.

[...]

In most literary dialogues, at least one of the characters has the answers. (“Yes, Socrates, you are quite right!”) What’s novel about Hazlitt’s dialogues is that all the characters are deeply confused. Even when they sound reasonable, the Complexity of the World repeatedly makes fools of them.

The Great Idea was originally published in 1951. Stalin was still alive.

Pave the muddy paths

Monday, February 4th, 2019

We often think of “law” and “legislation” as synonyms, Mike Munger notes, but Hayek argued otherwise:

Habits that are shared might be called “customs,” informal rules that might be written down nowhere. These are agreements, in the sense that we all agree that is the way we do things, even though we never actually sat down and signed anything.

A while back I wrote about the Pittsburgh left turn as an example of such a custom. It is important that the habit of waiting for someone to turn left in front of you be “agreed” on, in the sense that the expectation is widely shared — and met — because otherwise it wouldn’t be effective in making traffic move faster. These customs can come to govern behavior, however, precisely because they shape expectations, and violating expectations may be expensive or dangerous.

Those customs, if they consistently lead to useful outcomes, are “laws.” They are discoverable by experience and emerge in the form of traditions. But it is useful to write them down so that they can be enforced more effectively and can be easily learned by new generations. Laws that are written down are rules, commands, and prohibitions we call “legislation.”

The problem is that legislation need not arise from law at all.

Hayek was rightly concerned about the conceit that experts know what is best for everyone else:

I often illustrate this with what I call the Hayek Sidewalk Plan. Imagine that a new university has been built, and you are on the committee charged with laying out the sidewalks. What would you do?

You might walk around, look at aerial maps of the campus, and draw lines to try to guess where people will want to walk. Or you might want to have a purely aesthetic conception of the problem, and put the sidewalks in places or in patterns that are pleasing to the eye as you look out the windows of the administration building.

But all of that is legislation. No individual, or small committee of individuals, could possibly have enough information or foresight to be able to know in advance where people are going to want to walk. After all, universities are peopled by broadly diverse groups, with heterogeneous plans and purposes. People are often willing to walk on the sidewalks, if that serves their purpose at that point. But you probably don’t want to build a sidewalk from every doorway to every other doorway on the campus.

What would a law look like, in this setting? No one person, after all, has any effect walking on the grass, and all the different plans and purposes, taken one at a time, contain no information that you can use. But there is a physical manifestation of the aggregation of all these plans and purposes working themselves out over time. I don’t intend to make a path, and neither do you. But if enough of us, over time, find it useful to walk in the same place to accomplish our own idiosyncratic purposes, a visible record of the shared pattern emerges: a muddy path.

So, the law for the Hayek Sidewalk Plan committee will be discoverable if we adjourn for six months or so and then have a drone take some overhead photographs. It is clear now where people, acting as individuals but observable together in the shared result called a muddy path, want the sidewalks to be placed. And the task of the committee is simply to “legislate” by paving the muddy paths.

If we think of the process of discovering law as “looking for the muddy paths,” and legislation as “paving the muddy paths,” we have a simple but quite powerful way of thinking about the rule of law.
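
The muddy-path mechanism lends itself to a toy simulation. In the Python sketch below (my own illustration, not Munger’s; the lawn size, doorway locations, and paving threshold are arbitrary assumptions), walkers cross a campus lawn between random pairs of doorways, each on his own errand, and the committee then “paves” wherever the footfall is heaviest:

    import random
    from collections import Counter

    # A toy Hayek Sidewalk Plan: walkers cross a lawn between random pairs of
    # doorways, each taking a simple L-shaped route for his own purposes.
    # Nobody intends to make a path, but aggregate footfall reveals one.

    random.seed(7)
    GRID = 20  # the lawn is a GRID x GRID field of cells
    DOORS = [(0, 3), (0, 16), (19, 5), (19, 14), (9, 0), (10, 19)]  # arbitrary doorways

    footfall = Counter()

    def walk(a, b):
        """Walk from doorway a to doorway b: first along rows, then along columns."""
        (r, c), (r2, c2) = a, b
        while r != r2:
            r += 1 if r2 > r else -1
            footfall[(r, c)] += 1
        while c != c2:
            c += 1 if c2 > c else -1
            footfall[(r, c)] += 1

    for _ in range(5000):
        walk(*random.sample(DOORS, 2))

    # "Adjourn for six months," then pave the most-trodden cells.
    threshold = 0.5 * max(footfall.values())
    for r in range(GRID):
        print("".join("#" if footfall[(r, c)] > threshold else "." for c in range(GRID)))

The printed grid is the drone photograph: nobody planned the “#” cells, but they record where the sidewalks should go.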

Affordability has its costs

Saturday, February 2nd, 2019

Besides its obvious shortcomings, Los Angeles has a number of subtle problems that go back to decisions made long ago:

Much of the Los Angeles area would be better today if early city fathers had realized how valuable the property would eventually become. Los Angeles has quite high population density these days, but lacks urban amenities. The San Fernando Valley on the north side of the city of Los Angeles, for instance, was built up under the assumption that it would remain a rural retreat from the big city, but it now has over 1.75 million residents.

In contrast, Chicago was laid out after its 1871 fire by men like Daniel Burnham who took “Make no little plans” as their motto. L.A. wasn’t. And it’s hard to fix urban-planning mistakes afterward.

To take a seemingly trivial example, Chicago, where I lived from 1982 to 2000, was set up with most streets having sidewalks, and the sidewalks are usually wide enough for two people to walk abreast while conversing. In contrast, sidewalks on residential streets in Los Angeles often peter out at the developers’ whims, and those that exist are usually a little too narrow for two people. So pedestrians end up conversing over their shoulders.

One reason for the sidewalk shortage is that Los Angeles was the first major city in America to develop after the automobile.

Another is that much of it was laid out to be affordable after the stock-market crash of 1929. That introduced a more democratic, less elitist ethos. There’s a lot to be said for the remarkable living standards of average people in postwar L.A., but the city is paying the price today for cutting corners back then.

Chicago, in contrast, was mostly built during the era before the New Deal when upscale bourgeois values dominated tastes. For instance, my Chicago condo was in a three-story brick building on an elegant block of other three-story brick buildings. It was a very respectable-looking block, with every building striving to live up to proper bourgeois standards.

This doesn’t mean that everybody can keep up appearances at all times. My Chicago condo had been built in 1923 with optimistic touches like nine-foot ceilings. During the Depression, the owners must have been ruined as the units were split up into two apartments. But a couple of generations later, the building was rehabbed, and the tall ceilings and other generous touches were still there.

Los Angeles, in contrast, reflects an odd combination of mass-market needs and celebrity tastes.

In 1915, Charlie Chaplin, rapidly becoming the most famous man in the world, lived in Chicago a couple of blocks from where my old condo would go up. But in 1916, as filmmakers realized the advantages of sunshine, he moved from Chicago to Los Angeles.

The movies did in the chance of Los Angeles developing physically along bourgeois lines. Film people valued privacy and self-expression. Screenwriter Nathanael West’s 1939 novel The Day of the Locust complained of the excessive diversity of Hollywood houses:

But not even the soft wash of dusk could help the houses. Only dynamite would be of any use against the Mexican ranch houses, Samoan huts, Mediterranean villas, Egyptian and Japanese temples, Swiss chalets, Tudor cottages, and every possible combination of these styles that lined the slopes of the canyon.

One of the most popular architects of celebrity homes was an African-American named Paul Revere Williams, whose view, in contrast to that of the more academically celebrated Los Angeles architects such as Schindler and Neutra, was that his movie-star clients paid him to make their whims come true. So if, say, Frank Sinatra desired a Japanese Modern house with superb acoustics for his state-of-the-art stereo, Williams would figure out how to give the client what he wanted.

Another need celebrities have is privacy from tourists. Not having a sidewalk in front of your house for your stalkers to assemble upon makes sense if you are a world-famous actor.

The peculiar needs of movie stars influence everybody else’s tastes in L.A., with generally unfortunate results. If you are in constant danger of being pestered by crazed fans, it can be a good idea to go everywhere by car. But not being able to walk down your own street without risking being hit by traffic is a dumb idea if you are a nobody.

One lesson from Los Angeles ought to be that it’s hard to retrofit urban-planning mistakes made for reasons of affordability and expedience.

For example, the Los Angeles River, a riverbed that is dry most of the year, almost washed the city away in the 1938 flood. The Army Corps of Engineers was called in and rapidly built the notorious concrete ditch that is now the L.A. River to keep, say, Lockheed from being carried out to sea in the next deluge, causing America to lose the upcoming war.

After the war, newer desert communities like Scottsdale and Palm Springs realized that it makes more sense to convert natural flood channels into parks and golf courses that can absorb runoff. Moreover, the 1994 earthquake in Los Angeles demonstrated that putting up apartment buildings on the old sand and gravel riverbed had been a bad idea, as numerous apartment buildings near the river collapsed.

For decades, public-spirited Angelenos have generated countless plans to replace the ugly concrete culvert. But to do that would require a broader channel, which would demand using eminent domain to purchase all the very expensive real estate along the river. And so nothing ever gets done.

Similarly, it’s hard to undo affordable-housing construction, unless it happens to be in a hugely valuable location, such as along the beach. Gentrification is most likely where there’s something to gentrify.

For instance, Van Nuys in the heart of the San Fernando Valley was built as an affordable place for people who couldn’t afford cars. I recall it in the 1960s being a dump.

Driving through Van Nuys last week, I found it still the same dump.

Affordability has its costs.

If some idiot from the South tried to be polite, the system broke down

Friday, February 1st, 2019

As you travel the world, some of the local rules you can look up or read about, but often the rules are just assumed because “everyone” knows them:

In an earlier column I described an experience of mine in Erlangen, Germany, where I didn’t know about the practice of collecting a deposit on shopping carts. No one told me about this, and I thought I recognized the context of “grocery store” as familiar, one where I knew the rules. But I didn’t.

I had another experience in Germany, one that made me think of the importance of what Hayek called “the particular circumstances of time and place.” Erlangen, where I taught at Friedrich Alexander University, is a city of bicycles. There are roads, but most are narrow and there are so many bikes that it can be frustrating to drive.

The bike riders, as is true in many American cities, paid little attention to the traffic lights. Often, there were so many bikes that it was not possible to cross the street without getting in the way. But I noticed that people did cross, just walking right out into the street.

I tried this several times during my first stay in Erlangen. But being from the southern United States, I’m polite and deferential. So, I would start across the street, but then look up the street, and if a bike was close and coming fast I’d stop.

And get hit by a large, sturdy German on a large, sturdy German bicycle. And then I got yelled at, in German. What had I done wrong? Eventually, I figured it out: there had evolved a convention for crossing the street and for riding bicycles. The pedestrian simply walked at a constant speed, without even looking. The bicyclist would ride directly at the pedestrian, actually aiming at the spot where the pedestrian was at that point in time. Since the pedestrian kept moving in a predictable fashion, the cyclist would pass directly and safely behind the pedestrian.

If some idiot from the southern United States, in an effort to impose his own views of “polite” behavior on people whose evolved rules were different, tried to be polite and stop, the system broke down. Though that idiot (me) was stopping to avoid being hit, I was actually being rude by violating the rules. These rules were not written down and could not easily be changed.

In fact, a number of my German colleagues even denied that it was a rule, at first. But then they would say, “Well, right, you can’t stop. That would be dumb. So, okay, I guess it is a rule, after all.”

More precisely, this rule — like many other important rules you encounter in “foreign” settings — is really a convention. A convention, according to Lewis (1969), is a persistent (though not necessarily permanent) regularity in the resolution of recurring coordination problems, in situations characterized by recurrent interactions where outcomes are (inter)dependent.

Conventions, then, exist when people all agree on a rule of behavior, even if no one ever said the rule out loud or wrote it down. No one actor can choose an outcome, and no actor can challenge the regularity by unilaterally deviating from the conventional behavior. But deviation can result in substantial harm, as when someone tries to drive on the left in a country where “we” drive on the right, or in social sanction, as when other actors intentionally punish a deviation that is observed and publicized.
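
Lewis’s definition can be made concrete with a toy model. In the Python sketch below (my own illustration, not Munger’s or Lewis’s; the population size, rounds, and payoffs are arbitrary assumptions), pairs of agents repeatedly play a pure coordination game, earning a payoff only when they match, and each agent best-responds to the behavior it has seen most often:

    import random

    # Toy model of a Lewis convention: pairs of agents repeatedly play a pure
    # coordination game, "drive on the Left or the Right," earning 1 when they
    # match and 0 when they don't. Each best-responds to what it has seen most.

    random.seed(11)
    N_AGENTS = 100
    N_ROUNDS = 5000

    # No shared expectation at the start: each agent has a random slight leaning.
    memory = [{"L": random.random(), "R": random.random()} for _ in range(N_AGENTS)]

    def choose(mem):
        """Best-respond to the side this agent has encountered most often."""
        return "L" if mem["L"] >= mem["R"] else "R"

    for _ in range(N_ROUNDS):
        i, j = random.sample(range(N_AGENTS), 2)
        a, b = choose(memory[i]), choose(memory[j])
        memory[i][b] += 1  # each agent remembers what the other did
        memory[j][a] += 1

    sides = [choose(m) for m in memory]
    print("share driving on the Right:", sides.count("R") / N_AGENTS)
    # Typically prints 0.0 or 1.0: the population locks into one regularity,
    # though no one decreed it, and a lone deviator would earn 0 every round.

The population settles on one regularity or the other even though no one chose it, and a lone deviator simply loses every round; that is the precise sense in which no actor can challenge the convention unilaterally.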

According to David Hume, convention is

a general sense of common interest; which sense all the members of the society express to one another, and which induces them to regulate their conduct by certain rules. I observe that it will be to my interest [e.g.] to leave another in the possession of his goods, provided he will act in the same manner with regard to me. When this common sense of interest is mutually expressed and is known to both, it produces a suitable resolution and behavior. And this may properly enough be called a convention or agreement betwixt us, though without the interposition of a promise; since the actions of each of us have a reference to those of the other, and are performed upon the supposition that something is to be performed on the other part. (Hume, 1978; III.ii.2)

Notice how different this is from the “gamer” conception of laws and rules. For the gamer, all the rules can be — in fact, must be — written down and can be examined and rearranged. For the world traveler, the experience of finding out the rules can involve trial and error, and even the natives likely do not fully understand that the rules and norms of their culture are unique.

One of my favorite examples is actually from the United States, the so-called Pittsburgh Left Turn. In an article in the Pittsburgh City Paper in 2006, Chris Potter wrote:

As longtime residents know, the Pittsburgh Left takes place when two or more cars — one planning to go straight, and the other to turn left — face off at a red light without a “left-turn only” lane or signal. The Pittsburgh Left occurs when the light turns green, and the driver turning left takes the turn without yielding to the oncoming car.

Pittsburgh is an old city, many of whose streets were designed before automobiles held sway. [That means] that street grids are constricted, with little room for amenities like left-turn-only lanes. The absence of such lanes means drivers have to solve traffic problems on their own. Instead of letting one car at the head of an intersection bottle up traffic behind it, the Pittsburgh Left gives the turning driver a chance to get out of everyone else’s way. In exchange for a few seconds of patience, the Pittsburgh Left allows traffic in both directions to move smoothly for the duration of the signal. Of course, the system only works if both drivers know about it. No doubt that’s why newcomers find it so vexing.

The Pittsburgh Left is a very efficient convention. On two-lane streets, turning left can block traffic as the turning car waits for an opening. And left-turn arrows are expensive and add time to each traffic light cycle. Far better to let the left turners — if there are any — go first. If there are no left turners, traffic just proceeds normally, not waiting on a left arrow.

Of course, if some idiot from the southern United States (yes, me again) is driving in Pittsburgh, that person expects to go when the light turns green. I blew my horn when two cars turned left in front of me. And people on the sidewalk yelled at me, as did the left-turning drivers. Once again, I didn’t know the rules, because I was a foreigner, at least in terms of the rules of the road in Pittsburgh.

Actually, it’s worse than that. The Pittsburgh Left is technically illegal, according to the Pennsylvania Driver’s Handbook (p. 47): “Drivers turning left must yield to oncoming vehicles going straight ahead.” The written rules, the gamer rules, appear to endorse one pattern of action. But the actual rules, the ones you have to travel around to learn, may be quite different. Real rules are not written down, and the people living in that rule system may not understand either the nature or effects of the rules. It is very difficult to change conventions, because they represent the expectations people have developed in dealing with each other over years or decades.

Hayek understood this clearly, and argued for what I have called the “world traveler” conception over what I have called the “gamer” conception of rules and laws. As Hayek said in 1988, in The Fatal Conceit:

To understand our civilisation, one must appreciate that the extended order resulted not from human design or intention but spontaneously: it arose from unintentionally conforming to certain traditional and largely moral practices, many of which men tend to dislike, whose significance they usually fail to understand, whose validity they cannot prove, and which have nonetheless fairly rapidly spread by means of an evolutionary selection — the comparative increase of population and wealth — of those groups that happened to follow them.… This process is perhaps the least appreciated facet of human evolution.

Throw out your used books

Sunday, January 27th, 2019

You should simply throw out your used books, Tyler Cowen argues, instead of gifting them:

If you donate the otherwise-trashed book somewhere, someone might read it. OK, maybe that person will read one more book in life, but more likely that book will substitute for that person reading some other book instead. Or substitute for watching a wonderful movie.

So you have to ask yourself — this book — is it better on average than what an attracted reader might otherwise spend time with? Even within any particular point of view most books simply aren’t that good, and furthermore many books end up being wrong. These books are traps for the unwary, and furthermore gifting the book puts some sentimental value on it, thereby increasing the chance that it is read. Gift very selectively! And ponder the margin.

You should be most likely to give book gifts to people whose reading taste you don’t respect very much. That said, sometimes a very bad book can be useful because it might appeal to “bad” readers and lure them away from even worse books. Please make all the appropriate calculations.

Few even had wallets

Tuesday, January 22nd, 2019

A century ago the market economy was important, but a lot of economic activity still took place within the family, Peter Frost notes, especially in rural areas:

In the late 1980s I interviewed elderly French Canadians in a small rural community, and I was struck by how little the market economy mattered in their youth. At that time none of them had bank accounts. Few even had wallets. Coins and bills were kept at home in a small wooden box for special occasions, like the yearly trip to Quebec City. The rest of the time these people grew their own food and made their own clothes and furniture. Farms did produce food for local markets, but this surplus was of secondary importance and could just as often be bartered with neighbors or donated to the priest. Farm families were also large and typically brought together many people from three or four generations.

By the 1980s things had changed considerably. Many of my interviewees were living in circumstances of extreme social isolation, with only occasional visits from family or friends. Even among middle-aged members of the community there were many who lived alone, either because of divorce or because of relationships that had never gone anywhere. This is a major cultural change, and it has occurred in the absence of any underlying changes to the way people think and feel.

Whenever I raise this point I’m usually told we’re nonetheless better off today, not only materially but also in terms of enjoying varied and more interesting lives. That argument made sense back in the 1980s — in the wake of a long economic boom that had doubled incomes, increased life expectancy, and improved our lives through labor-saving devices, new forms of home entertainment, and stimulating interactions with a broader range of people.

Today, that argument seems less convincing. Median income has stagnated since the 1970s and may even be decreasing if we adjust for monetization of activities, like child care, that were previously nonmonetized. Life expectancy too has leveled off and is now declining in the U.S. because of rising suicide rates among people who live alone. Finally, cultural diversity is having the perverse effect of reducing intellectual diversity. More and more topics are considered off-limits in public discourse and, increasingly, in private conversation.

Liberalism is no longer delivering the goods — not only material goods but also the goods of long-term relationships and rewarding social interaction.

Previously they had been a lumpenproletariat of single men and women

Monday, January 21st, 2019

Liberal regimes tend to erode their own cultural and genetic foundations, thus undermining the cause of their success:

Liberalism emerged in northwest Europe. This was where conditions were most conducive to dissolving the bonds of kinship and creating communities of atomized individuals who produce and consume for a market. Northwest Europeans were most likely to embark on this evolutionary trajectory because of their tendency toward late marriage, their high proportion of adults who live alone, their weaker kinship ties and, conversely, their greater individualism. This is the Western European Marriage Pattern, and it seems to go far back in time. The market economy began to take shape at a later date, possibly with the expansion of North Sea trade during early medieval times and certainly with the take-off of the North Sea trading area in the mid-1300s (Note 1).

Thus began a process of gene-culture coevolution: people pushed the limits of their phenotype to exploit the possibilities of the market economy; selection then brought the mean genotype into line with the new phenotype. The cycle then continued anew, with the mean phenotype always one step ahead of the mean genotype.

This gene-culture coevolution has interested several researchers. Gregory Clark has linked the demographic expansion of the English middle class to specific behavioral changes in the English population: increasing future time orientation; greater acceptance of the State monopoly on violence and consequently less willingness to use violence to settle personal disputes; and, more generally, a shift toward bourgeois values of thrift, reserve, self-control, and foresight. Heiner Rindermann has presented the evidence for a steady rise in mean IQ in Western Europe during the late medieval and early modern era. Henry Harpending and I have investigated genetic pacification during the same timeframe in English society. Finally, hbd*chick has written about individualism in relation to the Western European Marriage Pattern (Note 2).

This process of gene-culture coevolution came to a halt in the late 19th century. Cottage industries gave way to large firms that invested in housing and other services for their workers, and this corporate paternalism eventually became the model for the welfare state, first in Germany and then elsewhere in the West. Working people could now settle down and have families, whereas previously they had largely been a lumpenproletariat of single men and women. Meanwhile, middle-class fertility began to decline, partly because of the rising cost of maintaining a middle-class lifestyle and partly because of sociocultural changes (increasing acceptance and availability of contraception, feminism, etc.).

This reversal of class differences in fertility seems to have reversed the gene-culture coevolution of the late medieval and early modern era.

Liberalism delivered the goods

Sunday, January 20th, 2019

How did liberalism become so dominant?

In a word, it delivered the goods. Liberal regimes were better able to mobilize labor, capital, and raw resources over long distances and across different communities. Conservative regimes were less flexible and, by their very nature, tied to a single ethnocultural community. Liberals pushed and pushed for more individualism and social atomization, thereby reaping the benefits of access to an ever larger market economy.

The benefits included not only more wealth but also more military power. During the American Civil War, the North benefited not only from a greater capacity to produce arms and ammunition but also from a more extensive railway system and a larger pool of recruits, including young migrants of diverse origins — one in four members of the Union army was an immigrant (Doyle 2015).

During the First World War, Britain and France could likewise draw on not only their own manpower but also that of their colonies and elsewhere. France recruited half a million African soldiers to fight in Europe, and Britain over a million Indian troops to fight in Europe, the Middle East, and East Africa (Koller 2014; Wikipedia 2018b). An additional 300,000 laborers were brought to Europe and the Middle East for non-combat roles from China, Egypt, India, and South Africa (Wikipedia 2018a). In contrast, the Central Powers had to rely almost entirely on their own human resources. The Allied powers thus turned a European civil war into a truly global conflict.

The same imbalance developed during the Second World War. The Allies could produce arms and ammunition in greater quantities and far from enemy attack in North America, India, and South Africa, while recruiting large numbers of soldiers overseas. More than a million African soldiers fought for Britain and France, their contribution being particularly critical to the Burma campaign, the Italian campaign, and the invasion of southern France (Krinninger and Mwanamilongo 2015; Wikipedia 2018c). Meanwhile, India provided over 2.5 million soldiers, who fought in North Africa, Europe, and Asia (Wikipedia 2018d). India also produced armaments and resources for the war effort, notably coal, iron ore, and steel.

Liberalism thus succeeded not so much in the battle of ideas as on the actual battlefield.

If you make a community truly open it will eventually become little more than a motel

Saturday, January 19th, 2019

The emergence of the middle class was associated with the rise of liberalism and its belief in the supremacy of the individual:

John Locke (1632–1704) is considered to be the “father of liberalism,” but belief in the individual as the ultimate moral arbiter was already evident in Protestant and pre-Protestant thinkers going back to John Wycliffe (1320s–1384) and earlier. These are all elaborations and refinements of the same mindset.

Liberalism has been dominant in Britain and its main overseas offshoot, the United States, since the 18th century. There is some difference between right-liberals and left-liberals, but both see the individual as the fundamental unit of society and both seek to maximize personal autonomy at the expense of kinship-based forms of social organization, i.e., the nuclear family, the extended family, the kin group, the community, and the ethnie. Right-liberals are willing to tolerate these older forms and let them gradually self-liquidate, whereas left-liberals want to use the power of the State to liquidate them. Some left-liberals say they simply want to redefine these older forms of sociality to make them voluntary and open to everyone. Redefine, however, means eliminate. If you make a community truly “open” it will eventually become little more than a motel: a place where people share space, where they may or may not know each other, and where very few if any are linked by longstanding ties — certainly not ties of kinship.

For a long time, liberalism was merely dominant in Britain and the U.S. The market economy coexisted with kinship as the proper way to organize social and economic life. The latter form of sociality was even dominant in some groups and regions, such as the Celtic fringe, Catholic communities, the American “Bible Belt,” and rural or semi-rural areas in general. Today, those subcultures are largely gone. Opposition to liberalism is for the most part limited, ironically, to individuals who act on their own.