Few even had wallets

Tuesday, January 22nd, 2019

A century ago the market economy was important, but a lot of economic activity still took place within the family, Peter Frost notes, especially in rural areas:

In the late 1980s I interviewed elderly French Canadians in a small rural community, and I was struck by how little the market economy mattered in their youth. At that time none of them had bank accounts. Few even had wallets. Coins and bills were kept at home in a small wooden box for special occasions, like the yearly trip to Quebec City. The rest of the time these people grew their own food and made their own clothes and furniture. Farms did produce food for local markets, but this surplus was of secondary importance and could just as often be bartered with neighbors or donated to the priest. Farm families were also large and typically brought together many people from three or four generations.

By the 1980s things had changed considerably. Many of my interviewees were living in circumstances of extreme social isolation, with only occasional visits from family or friends. Even among middle-aged members of the community there were many who lived alone, either because of divorce or because of relationships that had never gone anywhere. This is a major cultural change, and it has occurred in the absence of any underlying changes to the way people think and feel.

Whenever I raise this point I’m usually told we’re nonetheless better off today, not only materially but also in terms of enjoying varied and more interesting lives. That argument made sense back in the 1980s — in the wake of a long economic boom that had doubled incomes, increased life expectancy, and improved our lives through labor-saving devices, new forms of home entertainment, and stimulating interactions with a broader range of people.

Today, that argument seems less convincing. Median income has stagnated since the 1970s and may even be decreasing if we adjust for monetization of activities, like child care, that were previously nonmonetized. Life expectancy too has leveled off and is now declining in the U.S. because of rising suicide rates among people who live alone. Finally, cultural diversity is having the perverse effect of reducing intellectual diversity. More and more topics are considered off-limits in public discourse and, increasingly, in private conversation.

Liberalism is no longer delivering the goods — not only material goods but also the goods of long-term relationships and rewarding social interaction.

Previously they had been a lumpenproletariat of single men and women

Monday, January 21st, 2019

Liberal regimes tend to erode their own cultural and genetic foundations, thus undermining the cause of their success:

Liberalism emerged in northwest Europe. This was where conditions were most conducive to dissolving the bonds of kinship and creating communities of atomized individuals who produce and consume for a market. Northwest Europeans were most likely to embark on this evolutionary trajectory because of their tendency toward late marriage, their high proportion of adults who live alone, their weaker kinship ties and, conversely, their greater individualism. This is the Western European Marriage Pattern, and it seems to go far back in time. The market economy began to take shape at a later date, possibly with the expansion of North Sea trade during early medieval times and certainly with the take-off of the North Sea trading area in the mid-1300s (Note 1).

Thus began a process of gene-culture coevolution: people pushed the limits of their phenotype to exploit the possibilities of the market economy; selection then brought the mean genotype into line with the new phenotype. The cycle then continued anew, with the mean phenotype always one step ahead of the mean genotype.

This gene-culture coevolution has interested several researchers. Gregory Clark has linked the demographic expansion of the English middle class to specific behavioral changes in the English population: increasing future time orientation; greater acceptance of the State monopoly on violence and consequently less willingness to use violence to settle personal disputes; and, more generally, a shift toward bourgeois values of thrift, reserve, self-control, and foresight. Heiner Rindermann has presented the evidence for a steady rise in mean IQ in Western Europe during the late medieval and early modern era. Henry Harpending and I have investigated genetic pacification during the same timeframe in English society. Finally, hbd*chick has written about individualism in relation to the Western European Marriage Pattern (Note 2).

This process of gene-culture coevolution came to a halt in the late 19th century. Cottage industries gave way to large firms that invested in housing and other services for their workers, and this corporate paternalism eventually became the model for the welfare state, first in Germany and then elsewhere in the West. Working people could now settle down and have families, whereas previously they had largely been a lumpenproletariat of single men and women. Meanwhile, middle-class fertility began to decline, partly because of the rising cost of maintaining a middle-class lifestyle and partly because of sociocultural changes (increasing acceptance and availability of contraception, feminism, etc.).

This reversal of class differences in fertility seems to have reversed the gene-culture coevolution of the late medieval and early modern era.

Liberalism delivered the goods

Sunday, January 20th, 2019

How did liberalism become so dominant?

In a word, it delivered the goods. Liberal regimes were better able to mobilize labor, capital, and raw resources over long distances and across different communities. Conservative regimes were less flexible and, by their very nature, tied to a single ethnocultural community. Liberals pushed and pushed for more individualism and social atomization, thereby reaping the benefits of access to an ever larger market economy.

The benefits included not only more wealth but also more military power. During the American Civil War, the North benefited not only from a greater capacity to produce arms and ammunition but also from a more extensive railway system and a larger pool of recruits, including young migrants of diverse origins — one in four members of the Union army was an immigrant (Doyle 2015).

During the First World War, Britain and France could likewise draw on not only their own manpower but also that of their colonies and elsewhere. France recruited half a million African soldiers to fight in Europe, and Britain over a million Indian troops to fight in Europe, the Middle East, and East Africa (Koller 2014; Wikipedia 2018b). An additional 300,000 laborers were brought to Europe and the Middle East for non-combat roles from China, Egypt, India, and South Africa (Wikipedia 2018a). In contrast, the Central Powers had to rely almost entirely on their own human resources. The Allied powers thus turned a European civil war into a truly global conflict.

The same imbalance developed during the Second World War. The Allies could produce arms and ammunition in greater quantities and far from enemy attack in North America, India, and South Africa, while recruiting large numbers of soldiers overseas. More than a million African soldiers fought for Britain and France, their contribution being particularly critical to the Burma campaign, the Italian campaign, and the invasion of southern France (Krinninger and Mwanamilongo 2015; Wikipedia 2018c). Meanwhile, India provided over 2.5 million soldiers, who fought in North Africa, Europe, and Asia (Wikipedia 2018d). India also produced armaments and resources for the war effort, notably coal, iron ore, and steel.

Liberalism thus succeeded not so much in the battle of ideas as on the actual battlefield.

If you make a community truly open it will eventually become little more than a motel

Saturday, January 19th, 2019

The emergence of the middle class was associated with the rise of liberalism and its belief in the supremacy of the individual:

John Locke (1632–1704) is considered to be the “father of liberalism,” but belief in the individual as the ultimate moral arbiter was already evident in Protestant and pre-Protestant thinkers going back to John Wycliffe (1320s–1384) and earlier. These are all elaborations and refinements of the same mindset.

Liberalism has been dominant in Britain and its main overseas offshoot, the United States, since the 18th century. There is some difference between right-liberals and left-liberals, but both see the individual as the fundamental unit of society and both seek to maximize personal autonomy at the expense of kinship-based forms of social organization, i.e., the nuclear family, the extended family, the kin group, the community, and the ethnie. Right-liberals are willing to tolerate these older forms and let them gradually self-liquidate, whereas left-liberals want to use the power of the State to liquidate them. Some left-liberals say they simply want to redefine these older forms of sociality to make them voluntary and open to everyone. Redefine, however, means eliminate. If you make a community truly “open” it will eventually become little more than a motel: a place where people share space, where they may or may not know each other, and where very few if any are linked by longstanding ties — certainly not ties of kinship.

For a long time, liberalism was merely dominant in Britain and the U.S. The market economy coexisted with kinship as the proper way to organize social and economic life. The latter form of sociality was even dominant in some groups and regions, such as the Celtic fringe, Catholic communities, the American “Bible Belt,” and rural or semi-rural areas in general. Today, those subcultures are largely gone. Opposition to liberalism is for the most part limited, ironically, to individuals who act on their own.

This is the mindset that enabled northwest Europeans to exploit the possibilities of the market economy

Friday, January 18th, 2019

There is reason to believe that northwest Europeans were pre-adapted to the market economy:

They were not the first to create markets, but they were the first to replace kinship with the market as the main way of organizing social and economic life. Already in the fourteenth century, their kinship ties were weaker than those of other human populations, as attested by marriage data going back to before the Black Death and in some cases to the seventh century (Frost 2017). The data reveal a characteristic pattern:

  • men and women marry relatively late
  • many people never marry
  • children usually leave the nuclear family to form new households
  • households often have non-kin members

This behavioral pattern was associated with a psychological one:

  • weaker kinship and stronger individualism
  • framing of social rules in terms of moral universalism and moral absolutism, as opposed to kinship-based morality (nepotism, amoral familialism)
  • greater tendency to use internal controls on behavior (guilt proneness, empathy) than external controls (public shaming, community surveillance, etc.)

This is the mindset that enabled northwest Europeans to exploit the possibilities of the market economy. Because they could more easily move toward individualism and social atomization, they could go farther in reorganizing social relationships along market-oriented lines. They could thus mobilize capital, labor, and raw resources more efficiently, thereby gaining more wealth and, ultimately, more military power.

This new cultural environment in turn led to further behavioral and psychological changes. Northwest Europeans have adapted to it just as humans elsewhere have adapted to their own cultural environments, through gene-culture coevolution.


Northwest Europeans adapted to the market economy, especially those who formed the nascent middle class of merchants, yeomen, and petty traders. Over time, this class enjoyed higher fertility and became demographically more important, as shown by Clark (2007, 2009a, 2009b) in his study of medieval and post-medieval England: the lower classes had negative population growth and were steadily replaced, generation after generation, by downwardly mobile individuals from the middle class. By the early 19th century most English people were either middle-class or impoverished descendants of the middle class.

This demographic change was associated with behavioral and psychological changes to the average English person. Time orientation shifted toward the future, as seen in an increased willingness to save money and defer gratification. There was also a long-term decline in personal violence, with male homicide falling steadily from 1150 to 1800 and, parallel to this, a decline in blood sports and other violent though legal practices (cock fighting, bear and bull baiting, public executions). This change can largely be attributed to the State’s monopoly on violence and the consequent removal of violence-prone individuals through court-ordered or extrajudicial executions. Between 1500 and 1750, court-ordered executions removed 0.5 to 1.0% of all men of each generation, with perhaps just as many dying at the scene of the crime or in prison while awaiting trial (Clark 2007; Frost and Harpending 2015).
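Those per-generation figures compound. As a back-of-envelope check (the 25-year generation length is an assumption; the 1–2% total rate takes the source’s suggestion that extrajudicial deaths roughly matched court-ordered executions):

```python
# Back-of-envelope compounding of the execution figures quoted above.
# Assumptions: 25-year generations; total removal of 1% to 2% of men per
# generation (court-ordered executions plus roughly as many deaths at the
# scene of the crime or in prison while awaiting trial).

generations = (1750 - 1500) // 25  # about 10 generations between 1500 and 1750

for rate in (0.01, 0.02):
    surviving = (1 - rate) ** generations
    print(f"{rate:.0%} per generation -> {1 - surviving:.1%} of men removed overall")
```

At these assumed rates, roughly 10% to 18% of men would have been removed over the period, which gives a sense of the cumulative selection pressure implied by the quoted figures.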

Similarly, Rindermann (2018) has argued that mean IQ steadily rose in Western Europe during late medieval and post-medieval times. More people were able to reach higher stages of mental development. Previously, the average person could learn language and social norms well enough, but their ability to reason was hindered by cognitive egocentrism, anthropomorphism, finalism, and animism (Rindermann 2018, p. 49). From the sixteenth century onward, more and more people could better understand probability, cause and effect, and the perspective of another person, whether real or hypothetical. This improvement preceded universal education and improvements in nutrition and sanitation (Rindermann 2018, pp. 86-87).

Macroeconomics is a combination of voodoo complex systems and politics

Wednesday, January 16th, 2019

In a recent interview, Shane Parrish asked Naval Ravikant, What big ideas have you changed your mind on in the last few years?

There’s a lot on kind of the life level. There’s a couple, obviously, on the business level. I think on a more practical basis, I’ve just stopped believing in macroeconomics. I studied economics and computer science in school. There was a time when I thought I was going to be a PhD in economics and all of that. The further I get, the more I realize macroeconomics is a combination of voodoo complex systems and politics. You can find macroeconomists that take every side of every argument. I think that discipline has become corrupted because it doesn’t make falsifiable predictions, which is the hallmark of science.

You never have the counterexample on the economy. You can never take the US economy and run two different experiments at the same time. Because there’s so much data, people kind of cherry-pick for whatever political narrative they’re trying to push. To the extent that people spend all their time watching the macroeconomy or the Fed forecasts or which way the stocks are going to go the next year, is it going to be a good year or bad year, that’s all junk. It’s no better than astrology. In fact, it’s probably even worse because it’s less entertaining. It’s just more stress-inducing. I think of macroeconomics as a junk science. All apologies to macroeconomists.

That said, microeconomics and game theory are fundamental. I don’t think you can be successful in business or even navigating through most of our modern capitalist society without an extremely good understanding of supply and demand and labor versus capital and game theory and tit for tat and those kinds of things. Macroeconomics is a religion that I gave up, but there are many others. I’ve changed my mind on death, on the nature of life, on the purpose of life, on marriage. I was originally not someone who wanted to be married and have kids. There have been a lot of fundamental changes. The most practical one is I gave up macro and I embraced micro.

I would say that’s not just true in macroeconomics; that’s true in everything. I don’t believe in macro-environmentalism, I believe in micro-environmentalism. I don’t believe in macro-charity. I believe in micro-charity.

I don’t believe in macro improving the world. There’s a lot of people out there who get really fired up about “I’m going to change the world, I’m going to change this person, I’m going to change the way people think.”

I think it’s all micro. It’s like: change yourself, then maybe change your family and your neighbor, before you get into abstract concepts like “I’m going to change the world.”

Culture is too important to be left to the sociologists

Wednesday, December 19th, 2018

Culture matters, Virginia Postrel reminds us:

The mid-20th century period in which the modern libertarian movement arose is now looked upon with great nostalgia, especially in the United States. As my friend Brink Lindsey puts it, the right wants to live there and the left wants to work there.

When Donald Trump says “Make America Great Again,” the “again” refers to the world in which he grew up. The war was over, standards of living were rising, and new technologies from vaccines to synthetic fibers promised a better future.

Social critics of the day deplored mass production, mass consumption, and mass media, but the general public enjoyed their fruits. The burgeoning middle class happily replaced tenements with “little boxes made of ticky-tacky.” Snobs might look down on the suburbs, but families were delighted to settle in them. Faith in government was high, and other institutions—universities, churches, corporations, unions, and civic groups—enjoyed widespread respect.

It looked like a satisfactory equilibrium. But it wasn’t. The 1950s, after all, produced the 1960s.

Consider a series of best-selling books: The Lonely Crowd, by David Riesman, published in 1950; The Organization Man by William Whyte, published in 1956; Atlas Shrugged by Ayn Rand, published in 1957; and The Feminine Mystique by Betty Friedan, published in 1963. All of these books, and undoubtedly others I’ve overlooked, took up the same essential theme: the frustration of the person of talent and integrity in a society demanding conformity and what Riesman called “other-directedness.”

These books succeeded in the economic marketplace, as well as the marketplace of ideas, because they tapped a growing sense of discontent with the prevailing social and business ethos. Their audience might have been a minority of the population, but it was a large, gifted, and ultimately influential one. Despite the era’s prosperity—or perhaps because of it—many people had come to resent social norms that demanded that they keep their heads down, do what was expected of them, and be content to be treated as homogeneous threads in the social fabric. The ensuing cultural upheaval, which peaked in the late 1970s, took many different forms, with unanticipated results.

One of the most paradoxical examples I’ve run across comes from Dana Thomas’s 2015 book Gods and Kings, on the fashion designers Alexander McQueen and John Galliano. It’s about Galliano, who was born in Gibraltar and grew up in South London as the son of a plumber. His career, Thomas comments in passing, was made possible by two cultural phenomena: Thatcherism and punk.

How could that be? After all, Thatcherism and punk are usually seen as antagonistic. I asked Thomas about it in an interview. “Both were breaking down British social rules and constraints,” she said. Punk brought together kids of all classes, while Thatcher’s economic reforms encouraged entrepreneurship.

    If you had an idea and you had the backing then you could make it happen, no matter what your dad did in life or your mother did in life or where you came from or what your background was, or where you grew up or what your accent sounded like. These were all barriers before. So it double-whammied for Galliano. It was great. Because it allowed him to get out of South London, get into a good art school and be seen as a bona fide talent on his own standing, as opposed to where he came from. And he was also able to get the backing to start his company, because there was more money out there. It gave him more freedom. Before punk and before Thatcherism, chances were the son of a plumber was not going to wind up being the head of a couture house.

If you care about the open society, how could you not be interested in a phenomenon like that? How exactly do such transformations take place, and what are their unexpected ripple effects? What processes of experimentation and feedback are at work? Could a young designer do the same thing today and, if not, why not? Are these moments of cultural and economic opportunity inherently fleeting?

(Hat tip to Arnold Kling.)

Why is American mass transit so bad?

Friday, December 14th, 2018

Why is American mass transit so bad? It’s a long story:

One hundred years ago, the United States had a public transportation system that was the envy of the world. Today, outside a few major urban centers, it is barely on life support. Even in New York City, subway ridership is well below its 1946 peak. Annual per capita transit trips in the U.S. plummeted from 115.8 in 1950 to 36.1 in 1970, where they have roughly remained since, even as population has grown.

This has not happened in much of the rest of the world.


What happened? Over the past hundred years the clearest cause is this: Transit providers in the U.S. have continually cut basic local service in a vain effort to improve their finances. But they only succeeded in driving riders and revenue away. When the transit service that cities provide is not attractive, the demand from passengers that might “justify” its improvement will never materialize.


[The Age of Rail] was an era when transit could usually make money when combined with real-estate speculation on the newly accessible lands, at least in the short term. But then as now, it struggled to cover its costs over the long term, let alone turn a profit. By the 1920s, as the automobile became a fierce competitor, privately run transit struggled.

But public subsidy was politically challenging: There was a popular perception of transit as a business controlled by rapacious profiteers—as unpopular as cable companies and airlines are today. In 1920, the President’s Commission on Electric Railways described the entire industry as “virtually bankrupt,” thanks to rapid inflation in the World War I years and the nascent encroachment of the car.

The Depression crushed most transit companies, and the handful of major projects that moved forward in the 1930s were bankrolled by the New-Deal-era federal government: See the State and Milwaukee-Dearborn subways in Chicago, the South Broad Street subway in Philadelphia, and the Sixth Avenue subway in New York. But federal infrastructure investment would soon shift almost entirely to highways.


It is not a coincidence that, while almost every interurban and streetcar line in the U.S. failed, nearly every grade-separated subway or elevated system survived. Transit agencies continued to provide frequent service on these lines so they remained viable, and when trains did not have to share the road and stop at intersections, they could also be time competitive with the car. The subways and els of Chicago, Philadelphia, New York, and Boston are all still around, while the vast streetcar and interurban networks of Los Angeles, Minneapolis, Atlanta, Detroit, and many others are long gone. Only when transit didn’t need to share the road with the car, and frequent service continued, was it able to survive.


All of these [systems introduced in the 1970s] featured fast, partially automated trains running deep into the suburbs, often in the median of expressways. With their plush seating and futuristic design, they were designed to attract people who could afford to drive.

But these high-tech systems were a skeleton without a body, unable to provide access to most of the urban area without an effective connecting bus network. The bus lines that could have fed passengers to the stations had long atrophied, or they never existed at all. In many cases, the new rapid transit systems weren’t even operated by the same agency as the local buses, meaning double fares and little coordination. With no connecting bus services and few people within walking distance in low-density suburbs, the only way to get people to stations was to provide vast lots for parking. But even huge garages can’t fit enough people to fill a subway. Most people without cars were left little better off than they had been before the projects, and many people with cars chose to drive the whole way rather than parking at the station and getting on the train.


Service drives demand. When riders started to switch to the car in the early postwar years, American transit systems almost universally cut service to restore their financial viability. But this drove more people away, producing a vicious cycle until just about everybody who could drive, drove. In the fastest-growing areas, little or no transit was provided at all, because it was deemed to be not economically viable. Therefore, new suburbs had to be entirely auto-oriented.

Do the rich capture all the gains from economic growth?

Tuesday, November 13th, 2018

Do the rich capture all the gains from economic growth? Russ Roberts explains why it matters how you measure these things:

But the biggest problem with the pessimistic studies is that they rarely follow the same people to see how they do over time. Instead, they rely on a snapshot at two points in time. So for example, researchers look at the median income of the middle quintile in 1975 and compare that to the median income of the middle quintile in 2014, say. When they find little or no change, they conclude that the average American is making no progress.

But the people in the snapshots are not the same people. These snapshots fail to correct for changes in the composition of workers and changes in household structure that distort the measurement of economic progress. There is immigration. There are large changes in the marriage rate over the period being examined. And there is economic mobility as people move up and down the economic ladder as their luck and opportunities fluctuate.

How important are these effects? One way to find out is to follow the same people over time. When you follow the same people over time, you get very different results about the impact of the economy on the poor, the middle, and the rich.

Studies that use panel data — data generated by following the same people over time — consistently find that the largest gains over time accrue to the poorest workers and that the richest workers get very little of the gains. This is true in survey data. It is true in data gathered from tax returns.
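Roberts’s point can be made concrete with a toy simulation (all numbers hypothetical): if every tracked individual’s income doubles but new low earners enter the population, a bottom-quintile snapshot can look flat even while the panel shows large gains.

```python
# Toy illustration (hypothetical incomes): quintile snapshots vs. panel data.

def bottom_quintile_mean(incomes):
    """Mean income of the poorest fifth of the population."""
    xs = sorted(incomes)
    k = max(1, len(xs) // 5)
    return sum(xs[:k]) / k

# Period 1: five workers observed.
cohort = [20, 40, 60, 80, 100]

# Period 2, panel view: every original worker's income has doubled.
cohort_later = [x * 2 for x in cohort]

# Period 2, snapshot view: new low-income entrants (immigration, new
# households) are now part of the population being photographed.
snapshot_later = cohort_later + [20, 25]

print(bottom_quintile_mean(cohort))          # period-1 snapshot
print(bottom_quintile_mean(snapshot_later))  # period-2 snapshot: no progress
print(min(cohort_later))                     # panel: poorest original worker doubled
```

In this sketch the snapshot’s bottom quintile is unchanged between the two periods, even though every individual followed from period 1 doubled their income; the change in the population’s composition, not stagnation, drives the snapshot result.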

Some of the most important books Nick Szabo has read

Friday, November 9th, 2018

Nick Szabo shared a list of some of the most important books he’s read on Twitter:

  1. The Selfish Gene, by Richard Dawkins
  2. Metaphors We Live By, by George Lakoff and Mark Johnson
  3. The Wealth of Nations, by Adam Smith
  4. The Fatal Conceit, by F. A. Hayek

Wired to look for chances to earn money

Monday, October 29th, 2018

Americans have a blind spot when it comes to saving:

Americans seem to excel at working. But saving? Not so much. As of last year, the median American household had only $1,100 saved for retirement, according to an analysis from the Federal Reserve Bank of St. Louis.

While many factors likely contribute to the poor U.S. savings rate, a recent Cornell University study published in the journal Nature Communications pointed to another factor that may be at least partially to blame: our brains. More specifically, the researchers found that our brains may be wired to look for chances to earn money — but fail to recognize chances to save, even when they are right in front of us.

The study measured something we can’t usually measure ourselves: how much attention we pay to earning and saving opportunities. First, participants had to identify colors shown quickly on a computer: one “earning” color that let them gain 30 cents, a neutral color that had no monetary effect and one “saving” color that let them avoid losing 30 cents.

When the “earning” color was shown, a staggering 87.5% of participants identified it more quickly and accurately than when the “saving” color was shown. Even in trials that framed “saving” as earnings that would come slightly later, participants were still better at immediate earning.

In the study’s second part, participants had to identify which color appeared first. Three out of four said they saw the “earning” color appear first — when in fact, the “saving” color did. This suggests our “earning” bias may even be strong enough to warp our perception of time.

How precision engineers created the modern world

Wednesday, October 24th, 2018

Simon Winchester’s The Perfectionists explains how precision engineers created the modern world:

The story of precision begins with metal.

And the story begins, according to Winchester, at a specific place and time: North Wales, “on a cool May day in 1776.” The Age of Steam was getting underway. So was the Industrial Revolution — almost but not quite the same thing. In Scotland, James Watt was designing a new engine to pump water by means of the power of steam. In England, John “Iron-Mad” Wilkinson was improving the manufacture of cannons, which were prone to exploding, with notorious consequences for the sailors manning the gun decks of the navy’s ships. Rather than casting cannons as hollow tubes, Wilkinson invented a machine that took solid blocks of iron and bored cylindrical holes into them: straight and precise, one after another, each cannon identical to the last. His boring machine, which he patented, made him a rich man.

Watt, meanwhile, had patented his steam engine, a giant machine, tall as a house, at its heart a four-foot-wide cylinder in which blasts of steam forced a piston up and down. His first engines were hugely powerful and yet frustratingly inefficient. They leaked. Steam gushed everywhere. Winchester, a master of detail, lists the ways the inventor tried to plug the gaps between cylinder and piston: rubber, linseed oil–soaked leather, paste of soaked paper and flour, corkboard shims, and half-dried horse dung — until finally John Wilkinson came along. He wanted a Watt engine to power one of his bellows. He saw the problem and had the solution ready-made. He could bore steam-engine cylinders from solid iron just as he had naval cannons, and on a larger scale. He made a massive boring tool of ultrahard iron and, with huge iron rods and iron sleighs and chains and blocks and “searing heat and grinding din,” achieved a cylinder, four feet in diameter, which as Watt later wrote “does not err the thickness of an old shilling at any part.”

By “an old shilling” he meant a tenth of an inch, which is a reminder that measurement itself — the science and the terminology — was in its infancy. An engineer today would say a tolerance of 0.1 inches.

James Watt’s fame eclipses Iron-Mad Wilkinson’s, but it is Wilkinson’s precision that enabled Watt’s steam engine to power pumps and mills and factories all over England, igniting the Industrial Revolution. As much as the machinery itself, the discovery of tolerance is crucial to this story. The tolerance is the clearance between, in this case, cylinder and piston. It is a specification on which an engineer (and a customer) can rely. It is the foundational concept for the world of increasing precision. When machine parts could be made to a tolerance of one tenth of an inch, soon finer tolerances would be possible: a hundredth of an inch, a thousandth, a ten-thousandth, and less.

Watt’s invention was a machine. Wilkinson’s was a machine tool: a machine for making machines and their parts. More and better machines followed, some so basic that we barely think of them as machines: toilets, locks, pulley blocks for sailing ships, muskets. The history of machinery has been written before, of course, as has the history of industrialization. These can be histories of science or economics. By focusing instead on the arrow of increasing precision, Winchester is, in effect, walking us around a familiar object to expose an unfamiliar perspective.

Can precision really be a creation of the industrial world? The word comes from Latin by way of Middle French, but first it meant “cutting off” or “trimming.” The sense of exactitude comes later. It seems incredible that the ancients lacked this concept, so pervasive in modern thinking, but they measured time with sundials and sandglasses, and they counted space with hands and feet, and the “stone” has survived into modern Britain as a measure of weight.

Any assessment of ancient technology has to include, however, a single extraordinary discovery — an archaeological oddball the size of a toaster, named the “Antikythera mechanism,” after the island near Crete where Greek sponge divers recovered it in 1900 from a shipwreck 150 feet deep. Archaeologists were astonished to find, inside a shell of wood and bronze dated to the first or second century BC, a complex clockwork machine comprising at least thirty bronze dials and gears with intricate meshing teeth. In the annals of archaeology, it’s a complete outlier. It displays a mechanical complexity otherwise unknown in the ancient world and not matched again until fourteenth-century Europe. To call it “clockwork” is an anachronism: clocks came much later. Yet the gears seem to have been made — by hand — to a tolerance of a few tenths of a millimeter.

After a century of investigation and speculation, scientists have settled on the view that the Antikythera mechanism was an analog computer, intended to demonstrate astronomical cycles. Dials seem to represent the sun, the moon, and the five planets then known. It might have been able to predict eclipses of the moon. Where planetary motion is concerned, however, it seems to have been highly flawed. The engineering is better than the underlying astronomy. As Winchester notes, the Antikythera mechanism represents a device that is amazingly precise, yet not very accurate.
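The distinction between precision and accuracy that Winchester draws can be made concrete with a toy numeric example (the readings and the true value below are invented): precision is how tightly repeated measurements cluster, accuracy is how close they sit to the true value.

```python
import statistics

true_value = 10.0
readings = [10.42, 10.41, 10.43, 10.42, 10.41]  # tightly clustered, but offset

spread = statistics.pstdev(readings)                # small spread -> precise
bias = abs(statistics.mean(readings) - true_value)  # large bias -> inaccurate

print(f"precision (std dev): {spread:.3f}")
print(f"accuracy (bias):     {bias:.3f}")
```

Like the Antikythera mechanism's gearing, the readings here are consistent with one another (high precision) while systematically wrong about the underlying quantity (low accuracy).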

What makes precision a feature of the modern world is the transition from craftsmanship to mass production. The genius of machine tools — as opposed to mere machines — lies in their repeatability. Artisans of shoes or tables or even clocks can make things exquisite and precise, “but their precision was very much for the few,” Winchester writes. “It was only when precision was created for the many that precision as a concept began to have the profound impact on society as a whole that it does today.” That was John Wilkinson’s achievement in 1776: “the first construction possessed of a degree of real and reproducible mechanical precision — precision that was measurable, recordable, repeatable.”

Perhaps the canonical machine tool — surely the oldest — is the lathe, a turning device for cutting and shaping table legs, gun barrels, and screws. Wooden lathes date back to ancient China and Egypt. However, metal lathes, enormous and powerful, turning out metal machine parts, did not come into their own until the end of the eighteenth century. You can explain that in terms of available energy: water wheels and steam engines. Or you can explain it as Winchester does, in terms of precision. The British inventor Henry Maudslay made the first successful screw-cutting lathe in 1800, and to Winchester the crucial part of his invention is the slide rest: the carriage that holds the cutting tools and adjusts their position as delicately as possible, with the help of gears. Maudslay’s lathe, described by one historian as “the mother tool of the industrial age,” achieved a tolerance of one ten-thousandth of an inch. Metal screws and other pieces could be turned out by the hundreds and then the thousands, every one exactly the same.

Because they were replicable, they were interchangeable. Because they were interchangeable, they made possible a world of mass production and the warehousing and distribution of component parts. A French gunsmith, Honoré Blanc, is credited with showing in 1785 that flintlocks for muskets could be made with interchangeable parts. Before an audience, he disassembled twenty-five flintlocks into twenty-five frizzle springs, twenty-five face plates, twenty-five bridles, and twenty-five pans, randomly shuffled the pieces, and then rebuilt “out of this confusion of components” twenty-five new locks. Particularly impressed was the American minister to France, Thomas Jefferson, who posted by packet ship a letter explaining the new method for the benefit of Congress:

It consists in the making every part of them so exactly alike that what belongs to any one, may be used for every other musket in the magazine…. I put several together myself taking pieces at hazard as they came to hand, and they fitted in the most perfect manner. The advantages of this, when arms need repair, are evident.

As it was, when a musket broke down in the field, a soldier needed to find a blacksmith.

Replication and standardization are so hard-wired into our world that we forget how the unstandardized world functioned. A Massachusetts inventor named Thomas Blanchard in 1817 created a lathe that made wooden lasts for shoes. Cobblers still made the shoes, but now the sizes could be systematized. “Prior to that,” says Winchester, “shoes were offered up in barrels, at random. A customer shuffled through the barrel until finding a shoe that fit, more or less comfortably.” Before long, Blanchard’s lathe was making standardized gun stocks at the Springfield Armory and then at the Harpers Ferry Armory, which began turning out muskets and rifles by the thousands on machines powered by water turbines at the confluence of the Shenandoah and Potomac Rivers. “These were the first truly mechanically produced production-line objects made anywhere,” Winchester writes. “They were machine-made in their entirety, ‘lock, stock, and barrel.’” It is perhaps no surprise that the military played from the first, and continues to play, a leading and deadly part in the development of precision-based technologies and methods.

Why Paul Romer and William Nordhaus won the Nobel Prize in economics

Monday, October 8th, 2018

Tyler Cowen explains why Paul Romer won the Nobel Prize in economics and why William Nordhaus did as well:

These are excellent Nobel Prize selections, Romer for economic growth and Nordhaus for environmental economics. The two picks are brought together by the emphasis on wealth, the true nature of wealth, and how nations and societies fare at the macro level. These are two highly relevant picks. Think of Romer as having outlined the logic behind how ideas leverage productivity into ongoing spurts of growth, as for instance we have seen in Silicon Valley. Think of Nordhaus as explaining how economic growth interacts with the value of the environment.

The Lazy Goldmaker is Azeroth’s most famous financial guru

Thursday, September 13th, 2018

The Lazy Goldmaker is World of Warcraft’s financial guru:

In August, shortly after the release of World of Warcraft’s seventh expansion, Battle For Azeroth, The Lazy Goldmaker posted one of his meticulous spreadsheets to the WoW economy subreddit. It contains a set of expertly appraised auction house margins for all of Azeroth’s many tradeskills—blacksmithed weapons, stat-buffing cooking recipes, excavated gems.

The Goldmaker himself chooses to remain anonymous, but he does disclose that he is 30 years old and Norwegian. It was during the Burning Crusade, more than a decade ago, that he first became interested in the economic side of Blizzard’s immortal MMO, and he’s been operating The Lazy Goldmaker blog—where he posts columns, analysis, and other musings—since 2016, shortly after the launch of the Legion expansion.

World of Warcraft lets The Goldmaker experiment — he’ll spend hours tinkering with untapped capital, say, the profit yields of the new Inscription recipes — and he’ll report back on his blog detailing each of his successes and failures, much to the glee of his international bulwark of disciples. After all, it’s not like he’s risking anything truly disastrous or life-changing. As the Goldmaker reiterates to me, we’re talking about the currency of elves, dwarves, and orcs in a computer game. He can afford to be a little cavalier with his investments, because “it’s just pixels at the end of the day.”

“I’m always looking for markets that players aren’t focusing on,” he says. “Because there are only so many people in the gold-making scene, so there’s always going to be something that players aren’t looking at.”

You can read the fundamentals of how The Goldmaker breaks down his economic principles in a beginner’s guide he posted to his website this March. “World of Warcraft is a game about constantly improving your character,” he writes, and as a financial opportunist, it’s your job to provide avenues to either help those characters boost their power levels or beautify their models. So, as an upstart auction house shark, you’ll learn to farm efficient materials in Azeroth, target specific high-value recipes that you can turn around quickly, and buy out supplies when they’re abundant and repost them when they’re scarce.
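The buy-out-when-abundant, repost-when-scarce strategy above amounts to a simple margin rule. Here is a minimal sketch of that rule; the function, prices, deposit, and margin threshold are all hypothetical, not taken from The Lazy Goldmaker's guide.

```python
def should_flip(buyout_price, typical_price, deposit, margin=0.2):
    """Buy when an auction is listed well below its typical price.

    Returns True if reselling at the typical price clears the desired
    margin after paying the auction-house deposit.
    """
    profit = typical_price - buyout_price - deposit
    return profit >= margin * buyout_price

# Herbs are abundant right now: cheap listings are worth buying out
# and reposting later when supply dries up.
print(should_flip(buyout_price=50, typical_price=80, deposit=2))   # True
print(should_flip(buyout_price=75, typical_price=80, deposit=2))   # False
```

The `margin` parameter is the cushion that makes cavalier experimentation affordable: a flip that fails still only costs pixels.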

Doomsday prepping for less crazy folk

Friday, August 24th, 2018

Michal Zalewski discusses doomsday prepping for less crazy folk:

The prepper culture begs to be taken with a grain of salt. In a sense, it has all the makings of a doomsday cult: a tribe of unkempt misfits who hoard gold bullion, study herbalism, and preach about the imminent collapse of our society.

Today, we see such worries as absurd. It’s not that life-altering disasters are rare: every year, we hear about millions of people displaced by wildfires, earthquakes, hurricanes, or floods. Heck, not a decade goes by without at least one first-class democracy lapsing into armed conflict or fiscal disarray. But having grown up in a period of unprecedented prosperity and calm, we take our way of life for granted – and find it difficult to believe that an episode of bad weather or a currency crisis could destroy almost everything we worked for to date.

I suspect that we dismiss such hazards not only because they seem surreal, but also because worrying about them makes us feel helpless and lost. What’s more, we follow the same instincts to tune out far more pedestrian and avoidable risks; for example, most of us don’t plan ahead for losing a job, for dealing with a week-long water outage, or for surviving the night if our home goes up in smoke.

For many, the singular strategy for dealing with such dangers is to pray for the government to bail us out. But no matter if our elected officials prefer to school us with passages from Milton Friedman or from Thomas Piketty, the hard truth is that no state can provide a robust safety net for all of life’s likely contingencies; in most places, government-run social programs are severely deficient in funding, in efficiency, and in scope. Large-scale disasters pit us against even worse odds; from New Orleans in 2005 to Fukushima in 2011, there are countless stories of people left behind due to political dysfunction, poorly allocated resources, or lost paperwork.

And so, the purpose of this guide is to combat the mindset of learned helplessness by promoting simple, level-headed, personal preparedness techniques that are easy to implement, don’t cost much, and will probably help you cope with whatever life throws your way.