An alternative universe in which golf courses are a prime subject for intellectualizing

March 23rd, 2018

The New York Times laments that its obituaries have been dominated by white men since 1851 and decides to rectify this by eulogizing the overlooked Ada Lovelace — who wasn’t actually overlooked in her lifetime, and whose death made the front page of the New York Times on December 15, 1852:

Nor has Ada, the Countess Lovelace, been ignored over the last 40 or 50 years. In the late 1970s the Pentagon named its new programming language Ada after her and tried to impose it on all defense contractors.


The real Ada was, like her friend Charles Babbage, a major celebrity in her own time. It hardly hurt that she was the only legitimate child of the most famous man in Europe in the post-Waterloo era, the poet Lord Byron. Nor did it hurt that she was born an aristocrat and married an aristocrat.

In particular, her rather masculine turn of intellect made her all the more renowned during her life.

In fact, when an 1844 bestseller anticipating Darwin’s theory of evolution was published under a pseudonym, Ada Lovelace was one of the prime suspects. This is where the story takes a very Steve Sailer turn:

As it turned out to the surprise of most, the real author was a hard-working but fairly obscure Scottish journalist and golf course architect named Robert Chambers, who had come up with the idea that everything is the product of “development” while recovering from overwork by playing golf daily on The Old Course at St. Andrews, a links that had developed over centuries of play without much in the way of intelligent design until about Chambers’ day.

The reason the contributions of Lovelace (and Babbage) to the theory of computer science were overlooked in the late 19th and early 20th centuries was, of course, that there were no computers. Babbage’s famous (at the time) and well-funded Analytical Engine project had failed. Similarly, nobody much cared about Leonardo da Vinci’s helicopter sketch until after the helicopter had been invented.

It would be interesting to look into whether Lovelace’s idea that her friend Babbage’s engine could turn into a general purpose computer contributed to Babbage’s notorious problem with specification creep. If he’d been able to say Enough! to what his engine was supposed to do, he might have gotten it finished. But I don’t know if Lovelace’s ideas worsened Babbage’s failings.

Interestingly, Chambers was mostly overlooked during his own lifetime (he did not reveal himself as the author of the bestseller until decades later) and has been overlooked since, although Secord’s book does much to revive the man’s memory.

The link between Chambers’ evolutionary thinking and his obsession with links golf courses that had originally evolved without a designer has likewise been forgotten, even though Chambers’ great-grandson, golf architect Sir Guy Campbell, cowrote a history of golf in the 1950s with Darwin’s grandson Bernard Darwin, spelling out how the St. Andrews links had evolved over the eons.

One could imagine an alternative universe in which golf courses are a prime subject for intellectualizing and thus Chambers is a famous figure in intellectual history. But that’s not the one we live in.

Jordan Peterson is trying to save Western civilization by devising a post-Christian system of ethics

March 22nd, 2018

Tanner Greer totally reevaluated everything he said about Jordan Peterson and now argues that Jordan Peterson is trying to save Western civilization by devising a post-Christian system of ethics:

The spectacular rise of Jordan Peterson has caught much of the world flat-footed. Discussions of the psychology professor from the University of Toronto tend to focus on the enormous popular movement his lectures have spawned, rather than the actual ideas presented in the lectures themselves. As a result, no one seems to know who the “real” Jordan Peterson is.

In a way, this is understandable. Peterson is a man of several personae. One Peterson is the inventor of an innovative and compelling neuropsychological model of human behavior. This is the Peterson presented in a dozen research articles reviewed and published by his academic peers.

Another Peterson dispenses pieces of practical advice and dispels progressive dogmas with a quiet, fatherly charisma. This is the Peterson made famous in podcasts, television interviews, and his best selling self-help book.

And this project is grand. It is nothing less than the revitalization of Western civilization itself.

Read the whole thing, of course.

How does the number of steps to buy a gun relate to overall homicide and suicide rates?

March 21st, 2018

A recent New York Times story lamented how few steps there were to buy a gun in the US versus other countries, so sociologist David Yamane decided to ask the obvious question: “Is there a patterned relationship between the number of steps someone has to go through to buy a gun in these 15 countries and the countries’ overall rates of homicide and suicide?”

Homicide rates are weakly related to the number of steps it takes to buy a gun in these 15 countries. The polynomial trendline increases through the middle of the range then decreases at the high end (Japan), but the correlation is weak (0.071).

The relationship between suicide rates and the number of steps it takes to buy a gun is slightly stronger (0.085), but still weak and not in the direction suicide prevention advocates would like. The polynomial trendline increases fairly consistently through the range then jumps up somewhat at the end (again, Japan).

Looking at the combined rate of homicide and suicide, we see a still stronger though still weak correlation (0.123) with steps to buy a gun, with the polynomial trendline starting at the United States (2 steps and 14.58 combined rate) and arcing its way upward and leveling off toward Japan (13 steps and 18.71 combined rate). In between you can find two countries with 8 steps but dramatically different death rates by homicide and suicide (Austria’s 12.61 rate and Brazil’s 32.34 rate). Ditto for 7 steps: Germany 9.95 combined rate vs. Russia’s 45.91 combined rate.

The closest countries to the United States are Austria (8 steps, 12.61 combined rate) and Yemen (2 steps, 16.67 combined rate).
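Yamane’s comparison can be replicated in miniature. The sketch below uses only the seven country data points quoted above, not his full 15-country dataset, so the coefficient will not match his 0.123; it does, however, show the same pattern of a weak positive correlation:

```python
# Correlation between steps-to-buy-a-gun and the combined homicide+suicide
# rate, using only the seven countries named in the quoted passage.
def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient, no external libraries."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx ** 0.5 * vy ** 0.5)

# (steps to buy a gun, combined homicide+suicide rate), as quoted above
data = {
    "United States": (2, 14.58),
    "Japan": (13, 18.71),
    "Austria": (8, 12.61),
    "Brazil": (8, 32.34),
    "Germany": (7, 9.95),
    "Russia": (7, 45.91),
    "Yemen": (2, 16.67),
}
steps = [s for s, _ in data.values()]
rates = [r for _, r in data.values()]
r = pearson_r(steps, rates)
print(f"r = {r:.3f}")  # weakly positive for this subset
```

The Austria/Brazil and Germany/Russia pairs in the data (same number of steps, wildly different death rates) are exactly what drags the correlation down.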

Will any shows from the Golden Age of TV endure?

March 20th, 2018

Will any shows from the Golden Age of TV endure?

If you sometimes feel overwhelmed by the amount of television out there — by the increasing number of shows being praised by your peers, by the cascade of critically acclaimed programming on the ever-enlarging expanse of channels and pay tiers and streaming services — you’re not alone. At the Television Critics Association’s winter meeting in January, John Landgraf, the CEO of FX, highlighted the ongoing explosion in scripted programming. According to a report on Landgraf’s speech in Variety, 2017 saw 487 scripted series air on networks, cable, pay cable, and streaming services — up from 455 in 2016, which was up from 422 in 2015. Only 153 of the 2017 series aired on network TV — ABC, NBC, etc. — while 175 were on basic cable. Streaming services are the biggest driver in the latest TV boom; outlets like Netflix, Amazon, and Hulu accounted for another 117 series. HBO and the other premium cable channels made up the final 42.

“Overall, the total series output on television since 2002 has grown by 168 percent,” Variety reported. By way of comparison, America’s population is up about 13 percent in the same time. The number of hours in the day has remained static, at 24. Simply put: There’s vastly more content (to use a vulgarity that reduces art to a consumable but feels proper when describing the aforementioned torrent) than ever before — and that’s not including the ever-increasing number of feature films or video games that take hundreds of hours to play or YouTube channels making millionaires out of 6-year-old kids. The fragmented nature of our viewing habits means a TV show on a pay cable station can get by with a few hundred thousand viewers if critics like it and it pulls in awards; the biggest “hits” in the world of scripted entertainment are watched by less than 5 percent of the population, if we are to trust the ratings. Of course, with a plethora of viewing options — live airing, DVRed recording, streaming on TVs and laptops and iPhones — relying on something as prosaic as the Nielsen ratings to measure popularity is a mug’s game. We need to scan Google searches and Twitter trends and Facebook topics to see what’s really driving the conversation at any given time.


For several decades, the syndication model provided repetition that helped create a common cultural currency. That model has now weakened — syndication has become less appealing to audiences — as the marketplace has been flooded with new programs and as new technologies have created new viewing options. This will likely make the sitcom almost obsolete as anything other than a day-of laugh-delivery device. The Simpsons at the peak of its powers is a show rooted in its time, one that relies as heavily on pop-culture references as it does on repeated lines of clever dialogue becoming inside jokes among initiates. Strip the show from its moment — as future audiences will experience it — and take away the repetition needed to impress the cleverness of its wordplay on viewers, and what are you left with? Something that lasts? A masterpiece that rewards critical scrutiny for future generations? Or something that fades into the ether, a pleasant memory for those born between 1970 and 1990, and perhaps an artifact of interest to scholars studying the 1990s, but few others?
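The Variety figures quoted above are internally consistent, which a quick sketch can confirm. All of the counts come from the passage; the growth rates and the implied 2002 baseline are derived from them:

```python
# Check that the 2017 per-outlet breakdown sums to the reported 487 total,
# and derive the year-over-year growth implied by the quoted counts.
by_outlet_2017 = {
    "network": 153,
    "basic cable": 175,
    "streaming": 117,
    "premium cable": 42,
}
total_2017 = sum(by_outlet_2017.values())
assert total_2017 == 487  # matches the quoted 2017 total

totals = {2015: 422, 2016: 455, 2017: 487}
growth_16 = (totals[2016] - totals[2015]) / totals[2015]  # ~7.8%
growth_17 = (totals[2017] - totals[2016]) / totals[2016]  # ~7.0%

# "grown by 168 percent" since 2002 implies a baseline of roughly 487 / 2.68
baseline_2002 = totals[2017] / 2.68  # ~182 scripted series
print(f"{growth_16:.1%}, {growth_17:.1%}, ~{baseline_2002:.0f} series in 2002")
```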

We could never have imagined what happened in Venezuela

March 19th, 2018

We never could have imagined — or prepped for — what happened in Venezuela, a Venezuelan “prepper” explains:

An economic collapse this long seemed like something that was entirely out of the question. It was entirely unpredictable. I would have expected a pandemic or a coup d’état long before this hungry zombie-like scenario.

We knew something disturbing was going to happen sooner or later. We could feel it in the atmosphere…but nothing like this. We never thought it would be impossible to find a battery, or engine oil, or gasoline (Jeez, this was an oil-producing country!!) or that kids were going to be endangered in the very door of their schools.

He lists a number of supplies he should have stockpiled and preparations he should have made. A few stand out:

A large, buried diesel custom-made aluminum tank with a proper sized generator (there is not too much space left in our place: we live in a subdivision, houses are wall to wall next to each other) with a homemade silencer, and adequately rigged to the wiring of the house for the largest systems, like freezers and air conditioning.

Enclosing our garage before steel rebar disappeared from the white market and production was diverted to the black and grey markets. (I hate fencing, it is like living in a birdcage, but this would have helped a lot for peace of mind.)

Perhaps a chicken coop with a couple of hens. Egg prices have been so inflated these days that a single egg costs more than the minimum wage. A hen produces more than a laborer. Do you remember those stories about eggs, chocolate, and potatoes acting as currency in WWII? They are becoming currency here too.

Another SUV, with a much taller ground clearance, larger tires, diesel-powered with no electronics and a huge front fender. Something heavy, strong, black or dark grey, windows covered by that clear bulletproof plastic sheeting, able to plow a pack of thugs on motorcycles out of the way without a blink.

Kitty Hawk’s Cora

March 18th, 2018

Kitty Hawk Corporation’s new Cora air taxi “is powered by 12 independent lift fans, which enable her to take off and land vertically like a helicopter” and has a range of “about 62 miles” while flying at “about 110 miles per hour” at an altitude “between 500 ft to 3000 ft above the ground”:

What’s a fire alarm for?

March 17th, 2018

What is the function of a fire alarm?

One might think that the function of a fire alarm is to provide you with important evidence about a fire existing, allowing you to change your policy accordingly and exit the building.

In the classic experiment by Latane and Darley in 1968, eight groups of three students each were asked to fill out a questionnaire in a room that shortly after began filling up with smoke. Five out of the eight groups didn’t react or report the smoke, even as it became dense enough to make them start coughing. Subsequent manipulations showed that a lone student will respond 75% of the time; while a student accompanied by two actors told to feign apathy will respond only 10% of the time. This and other experiments seemed to pin down that what’s happening is pluralistic ignorance. We don’t want to look panicky by being afraid of what isn’t an emergency, so we try to look calm while glancing out of the corners of our eyes to see how others are reacting, but of course they are also trying to look calm.

(I’ve read a number of replications and variations on this research, and the effect size is blatant. I would not expect this to be one of the results that dies to the replication crisis, and I haven’t yet heard about the replication crisis touching it. But we have to put a maybe-not marker on everything now.)

A fire alarm creates common knowledge, in the you-know-I-know sense, that there is a fire, after which it is socially safe to react. When the fire alarm goes off, you know that everyone else knows there is a fire, and you know you won’t lose face if you proceed to exit the building.

The fire alarm doesn’t tell us with certainty that a fire is there. In fact, I can’t recall one time in my life when, exiting a building on a fire alarm, there was an actual fire. Really, a fire alarm is weaker evidence of fire than smoke coming from under a door.

But the fire alarm tells us that it’s socially okay to react to the fire. It promises us with certainty that we won’t be embarrassed if we now proceed to exit in an orderly fashion.

That’s Eliezer Yudkowsky leading up to his real point, that there’s no fire alarm for Artificial General Intelligence.

How psychopaths see the world

March 16th, 2018

A new study looks at how psychopaths see the world:

Here are people who can understand what their victims are thinking but just don’t care. Hence their actions. But Baskin-Sommers found that there’s more to their minds than it seems.

Most of us mentalize automatically. From infancy, other minds involuntarily seep into our own. The same thing, apparently, happens less strongly in psychopaths. By studying the Connecticut inmates, Baskin-Sommers and her colleagues, Lindsey Drayton and Laurie Santos, showed that these people can deliberately take another person’s perspective, but on average, they don’t automatically do so to the extent that most other people do. “This is the first time we’re seeing evidence that psychopaths don’t have this automatic ability that most of us have,” Baskin-Sommers says.


The U.S. prison system doesn’t assess psychopathy at intake, so Baskin-Sommers administered a standard test herself to 106 male inmates from the Connecticut prison. Of them, 22 proved to be psychopaths, 28 were not, and the rest fell in a gray zone.


The psychopaths proved to be “glib, narcissistic, and conniving,” she adds. “They can be aggressive, and they like to tell us gruesome details of murders, I think to shock us. But it’s not like that all the time. They do a lot of impression management.”

After assessing the 106 volunteers, she then gave them a computer-based task. They saw a picture of a human avatar in prison khakis, standing in a room, and facing either right or left. There were either two red dots on the wall in front of the avatar, or one dot in front of them and one dot behind them. Their job was to verify how many dots either they or the avatar could see.

Normally, people can accurately say how many dots the avatar sees, but they’re slower if there are dots behind the avatar. That’s because what they see (two dots) interferes with their ability to see through the avatar’s eyes (one dot). This is called egocentric interference. But they’re also slower to say how many dots they can see if that number differs from the avatar’s count. This shows how readily humans take other perspectives: Volunteers are automatically affected by the avatar’s perspective, even when it hurts their own performance. This is called altercentric interference.

Baskin-Sommers found that the psychopathic inmates showed the usual level of egocentric interference — that is, their own perspective was muscling in on the avatar’s. But they showed much less altercentric interference than the other inmates — the avatar’s perspective wasn’t messing with their own, as it would for most other people.
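The two interference measures can be made concrete with a toy calculation. The reaction times below are invented purely for illustration (the article reports no numbers); only the logic comes from the passage: interference is the slowdown on trials where the two perspectives disagree, relative to trials where they agree.

```python
# Toy illustration of the two interference measures in the dot-perspective task.
# All reaction times here are hypothetical, chosen to mirror the qualitative
# pattern described in the passage.
def interference(rt_disagree_ms, rt_agree_ms):
    """Slowdown (ms) when one's own view and the avatar's view conflict."""
    return rt_disagree_ms - rt_agree_ms

# Judging the AVATAR's view: one's own view intrudes (egocentric interference).
# The psychopathic inmates showed the usual amount of this.
egocentric = interference(rt_disagree_ms=740, rt_agree_ms=690)               # 50 ms

# Judging one's OWN view: the avatar's view intrudes (altercentric
# interference), except that the psychopathic inmates showed much less of it.
altercentric_typical = interference(rt_disagree_ms=760, rt_agree_ms=700)     # 60 ms
altercentric_psychopath = interference(rt_disagree_ms=715, rt_agree_ms=700)  # 15 ms

print(egocentric, altercentric_typical, altercentric_psychopath)
```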

This sounds a bit like another condition:

Other groups of people also show differences in their theory of mind. For example, in one study, Frith asked people to predict where a girl might search for a marble that had been moved without her knowledge. The onlookers knew the marble’s whereabouts, so could they override their own knowledge to step into the girl’s shoes? Eye-tracking software revealed that neurotypical adults look at the same place the girl would, but people with Asperger’s syndrome are less likely to. They don’t seem to spontaneously anticipate others’ actions. “It is a bit worrying if [Baskin-Sommers and her colleagues] are proposing the very same underlying mechanism to explain callousness in psychopathy that we used previously to explain communication problems in autism, albeit based on a different test,” Frith says. “These are very different conditions, after all.”

Giving us what we actually pay attention to

March 15th, 2018

Andrew Marantz of the New Yorker looks at Reddit and the struggle to detoxify the Internet:

The_Donald, with more than half a million subscribers, is by far the biggest pro-Trump subreddit, but it ranks just below No. 150 on the list of all subreddits; it’s roughly the same size as r/CryptoCurrency and r/ComicBooks. “Some people on The_Donald are expressing their genuine political beliefs, and obviously that’s something we want to encourage,” Huffman said. “Others are maybe not expressing sincere beliefs, but are treating it more like a game—If I post this ridiculous or offensive thing, can I get people to upvote it? And then some people, to quote ‘The Dark Knight,’ just want to watch the world burn.” On some smaller far-right subreddits, the discourse is more unhinged. One, created in July of 2016, was called r/Physical_Removal. According to its “About Us” section, it was a subreddit for people who believe that liberals “qualify to get a helicopter ride.” “Helicopter ride,” an allusion to Augusto Pinochet’s reputed habit of throwing Communists out of helicopters, is alt-right slang for murder.

The_Donald accounts for less than one per cent of Reddit’s traffic, but it occupies far more than one per cent of the Reddit-wide conversation. Trolls set a cunning trap. By ignoring their provocations, you risk seeming complicit. By responding, you amplify their message. Trump, perhaps the world’s most skilled troll, can get attention whenever he wants, simply by being outrageous. Traditional journalists and editors can decide to resist the bait, and sometimes they do, but that option isn’t available on user-generated platforms. Social-media executives claim to transcend subjectivity, and they have designed their platforms to be feedback machines, giving us not what we claim to want, nor what might be good for us, but what we actually pay attention to.

There are no good solutions to this problem, and so tech executives tend to discuss it as seldom as possible, and only in the airiest of platitudes. Twitter has rebuffed repeated calls to ban President Trump’s account, despite his many apparent violations of company policy. (If tweeting that North Korea “won’t be around much longer” doesn’t break Twitter’s rule against “specific threats of violence,” it’s not clear what would.) Last fall, on his Facebook page, Mark Zuckerberg addressed—sort of, obliquely—the widespread critique that his company was exacerbating political polarization. “We’ll keep working to ensure the integrity of free and fair elections around the world, and to ensure our community is a platform for all ideas and force for good in democracy,” he wrote, then stepped away as a global howl of frustration grew in the comments.

I asked a few social-media executives to talk to me about all this. I didn’t expect definitive answers, I told them; I just wanted to hear them think through the questions. Unsurprisingly, no one jumped at the chance. Twitter mostly ignored my e-mails. Snapchat’s P.R. representatives had breakfast with me once, then ignored my e-mails. Facebook’s representatives talked to me for weeks, asking precise, intelligent questions, before they started to ignore my e-mails.

Reddit has more reason to be transparent. It’s big, but doesn’t feel indispensable to most Internet users or, for that matter, to most advertisers. Moreover, Anderson Cooper’s CNN segment was hardly the only bit of vividly terrible press that Reddit has received over the years. All social networks contain vitriol and bigotry, but not all social networks are equally associated with these things in the public imagination. Recently, I typed “Reddit is” into Google. Three of the top suggested auto-completions were “toxic,” “cancer,” and “hot garbage.”

A proton battery combines the best aspects of hydrogen fuel cells and conventional batteries

March 14th, 2018

Researchers from RMIT University in Melbourne, Australia have produced a working-prototype proton battery, which combines the best aspects of hydrogen fuel cells and battery-based electrical power:

The latest version combines a carbon electrode for solid-state storage of hydrogen with a reversible fuel cell to provide an integrated rechargeable unit.

The successful use of an electrode made from activated carbon in a proton battery is a significant step forward and is reported in the International Journal of Hydrogen Energy.

During charging, protons produced by water splitting in a reversible fuel cell are conducted through the cell membrane and directly bond with the storage material with the aid of electrons supplied by the applied voltage, without forming hydrogen gas.

In electricity supply mode this process is reversed; hydrogen atoms are released from the storage and lose an electron to become protons once again. These protons then pass back through the cell membrane where they combine with oxygen and electrons from the external circuit to re-form water.

A major potential advantage of the proton battery is much higher energy efficiency than conventional hydrogen systems, making it comparable to lithium ion batteries. The losses associated with hydrogen gas evolution and splitting back into protons are eliminated.

Several years ago the RMIT team showed that a proton battery with a metal alloy electrode for storing hydrogen could work, but its reversibility and rechargeability were too low. Also, the alloy employed contained rare-earth elements, and was thus heavy and costly.

The latest experimental results showed that a porous activated-carbon electrode made from phenolic resin was able to store around 1 wt% hydrogen in the electrode. This is an energy per unit mass already comparable with commercially-available lithium ion batteries, even though the proton battery is far from being optimised. The maximum cell voltage was 1.2 volt.
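The “comparable with commercially-available lithium ion batteries” claim can be sanity-checked with a back-of-envelope calculation from the two figures quoted above, 1 wt% hydrogen storage and a 1.2 V maximum cell voltage. Note this is a theoretical electrode-only figure that ignores the mass of the rest of the cell:

```python
# Theoretical specific energy of the proton battery electrode, from the
# quoted 1 wt% hydrogen loading and 1.2 V cell voltage. Each stored hydrogen
# atom contributes one proton and one electron to the cell reaction.
F = 96485.0   # Faraday constant, C/mol
M_H = 1.008   # molar mass of atomic hydrogen, g/mol

grams_h_per_kg_electrode = 10.0            # 1 wt% of 1 kg of electrode
moles_h = grams_h_per_kg_electrode / M_H   # ~9.92 mol of H per kg
charge_c = moles_h * F                     # coulombs per kg of electrode
capacity_ah = charge_c / 3600.0            # ~266 Ah/kg
energy_wh = capacity_ah * 1.2              # at the 1.2 V maximum cell voltage

print(f"~{energy_wh:.0f} Wh per kg of electrode")
```

That works out to roughly 320 Wh/kg of electrode material, in the same range as typical lithium-ion cells (on the order of 150 to 250 Wh/kg at the cell level), which is consistent with the researchers’ comparison.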

Waking up to hidden motives

March 13th, 2018

I haven’t read The Elephant in the Brain (yet), but I enjoyed Robin Hanson’s talk with Sam Harris about hidden motives:

They discuss selfishness, hypocrisy, norms and meta-norms, cheating, deception, self-deception, education, the evolutionary logic of conversation, social status, signaling and counter-signaling, common knowledge, AI, and many other topics.

I especially enjoyed the misguided questions from the audience.

Let’s talk about bombs for a minute

March 13th, 2018

Let’s talk about bombs for a minute, Greg Ellifritz suggests:

This week, a Utah high school student was arrested after he attempted to detonate a large backpack bomb in his school. Luckily, the bomb malfunctioned and the school was evacuated before anyone was hurt.

Those of you who have taken my “Response to a Terrorist Bombing” class might remember how I discussed that in worldwide terrorist events, the trend is moving more and more towards combining bombs and guns in the attack.

If you find yourself in the middle of a mass shooting, you must be prepared for the coming bomb blasts. If you survive a bomb blast, you must be looking out for people with guns shooting up the evacuation site. That’s simply the reality of modern terrorist attacks worldwide.

This particular incident had only a bombing component (likely because it was committed by a lone high school student without any true support of a terrorist network). I predict we will see more and more of these as well.

After the Las Vegas concert shooting and the Florida school shooting, people are becoming more conscious of the potential carnage that can be inflicted by a deranged gunman armed with a semi-automatic rifle and a lot of ammunition. There are currently multiple social and political pressures being applied to limit the purchase and/or possession of these rifles. While I don’t personally think that tactic will be effective at reducing mass casualties in a terrorist attack, I believe it will become harder and harder to legally acquire semi-automatic rifles in the future.

What will the terrorist resort to if he can’t get a rifle and lots of ammo? You guessed it…bombs. Look at terrorist attacks worldwide. In countries with very strict gun control, we see terrorists use bombs more often. Bombs are easy to make and can cause massive casualties if placed in the right location at the right time. Bombs also bring a disproportionate amount of media attention, which is exactly what the killers and terrorists crave.

If you predict that semi-automatic rifles will become harder to legally acquire in the future, then you have to be prepared for more terrorist bombing incidents.

Be careful what you wish for.

The culture will simply be that which is best at reproducing itself

March 12th, 2018

Amazon Studios recently announced plans to adapt the first novel of Iain M. Banks’ Culture Series, Consider Phlebas.

Philosophy professor Joseph Heath offers an appreciation of Banks’ Culture:

In this context, what distinguishes Banks’s work is that he imagines a scenario in which technological development has also driven changes in the social structure, such that the social and political challenges people confront are new. Indeed, Banks distinguishes himself in having thought carefully about the social and political consequences of technological development. For example, once a society has semi-intelligent drones that can be assigned to supervise individuals at all times, what need is there for a criminal justice system? Thus in the Culture, an individual who commits a sufficiently serious crime is assigned — involuntarily — a “slap drone,” who simply prevents that person from committing any crime again. Not only does this reduce recidivism to zero, the prospect of being supervised by a drone for the rest of one’s life also serves as a powerful deterrent to crime.

This is an absolutely plausible extrapolation from current trends — even just looking at how ankle monitoring bracelets work today. But it also raises further questions. For instance, once there is no need for a criminal justice system, one of the central functions of the state has been eliminated. This is one of the social changes underlying the political anarchism that is a central feature of the Culture. There is, however, a more fundamental postulate. The core feature of Banks’s universe is that he imagines a scenario in which technological development has freed culture from all functional constraints — and thus, he imagines a situation in which culture has become purely memetic. This is perhaps the most important idea in his work, but it requires some unpacking.

The term “meme” was introduced by Richard Dawkins, in an attempt to articulate some cultural equivalent to the role that the “gene” plays in biological evolution. The basic building-block of life for Dawkins, one may recall, is “the replicator,” understood simply as “that which reproduces itself.” His key observation is that one can find replicators not just in the biological sphere, but in human social behaviour. In many cases, these “memes” produce obvious benefits to their host, so it is not difficult to see how they succeed in reproducing themselves — consider, for instance, the human practice of using fire to cook food, which is reproduced culturally. In other cases, however, cultural patterns get reproduced, not because they offer any particular benefits — in some cases they are even costly to the host — but because they have a particularly effective “trick,” when it comes to getting themselves reproduced.


Historically, in this process of competition among cultures, a dominant source of competitive advantage has been the ability to promote a desirable social structure, or an effective system of cooperation. Consider the enormous influence that Roman culture exercised in the West. The fact that, one thousand years after the fall of Rome, schoolboys were still memorizing Cicero, the Justinian code remained de facto law throughout vast regions, and Latin was still the written language of the learned classes of Europe, is an extraordinary legacy. The major reason for imitation of the Romans was simply that their culture was one that sustained the greatest, most long-lasting empire the West has ever seen.

Similarly, Han culture was able to spread throughout China in large part through the institutions that it promoted, not just the imperial system, but the vast bureaucracy that sustained it, along with the competitive examination system that promoted effective administration.

Societies with strong institutions become wealthier, more powerful militarily, or some combination of the two. These are the ones whose culture reproduces, either because it is imitated, or because it is imposed on others. And yet the dominant trend in human societies, over the past century, has been significant convergence with respect to institutional structure. Most importantly, there has been practically universal acceptance of the need for a market economy and a bureaucratic state as the only desirable social structure at the national level. One can think of this as the basic blueprint of a “successful” society. This has led to an incredible narrowing of cultural possibilities, as cultures that are functionally incompatible with capitalism or bureaucracy are slowly extinguished or transformed.

This winnowing down of cultural possibilities is what constitutes the trend that is often falsely described as “Westernization.” Much of it is actually just a process of adaptation that any society must undergo, in order to bring its culture into alignment with the functional requirements of capitalism and bureaucracy. It is not that other cultures are becoming more “Western,” it is that all cultures, including Western ones, are converging around a small number of variants.

One interesting consequence of this process is that the competition between cultures is becoming defunctionalized. The institutions of modern bureaucratic capitalism solve many of the traditional problems of social integration in an almost mechanical way. As a result, when considering the modern “hypercultures” — e.g. American, Japanese, European — there is little to choose from a functional point of view. None are particularly better or worse, from the standpoint of constructing a successful society. And so what is there left to compete on? All that is left are the memetic properties of the culture, which is to say, the pure capacity to reproduce itself.


Now consider Banks’s scenario. Consider the process that is generating modern hypercultures, and imagine it continuing for another three or four hundred years. The first consequence is that the culture will become entirely defunctionalized. Banks imagines a scenario in which all of the endemic problems of human society have been given essentially technological solutions (in much the same way that drones have solved the problem of criminal justice). Most importantly, he imagines that the fundamental problem of scarcity has been solved, and so there is no longer any obligation for anyone to work (although, of course, people remain free to do so if they wish). All important decisions are made by a benevolent technocracy of AIs (or the “Minds”).

And so what is left for humanity (or, more accurately, humanoids)? At the individual level, Banks imagines a life very much like the one described by Bernard Suits in The Grasshopper — everything becomes a game, and thus at some level, non-serious. But where Banks went further than Suits was in thinking about the social consequences. What happens when culture becomes freed from all functional constraints? It seems clear that, in the interplanetary competition that develops, the culture that emerges will be the most virulent, or the most contagious. In other words, “the Culture” will simply be that which is best at reproducing itself, by appealing to the sensibilities and tastes of humanoid life-forms.

Try several sports before specializing

March 11th, 2018

The logic of early specialization is straightforward:

Training is necessary to develop skill, but there is a limit to how much a person can train, not just because there are only 24 hours in a day, but because training is physically and psychologically exhausting. A person can train intensively for only a few hours a day without injuring themselves or getting burned out. Thus, the child who starts training early should have a virtually insurmountable training advantage over the child who starts later. However, the findings of a study recently published in the Journal of Sports Sciences suggest that later specialization may actually lead to better performance in the long term.

Professor Arne Güllich, director of the Institute of Applied Sport Science at the University of Kaiserslautern in Germany, compared the training histories of 83 athletes who medaled in the Olympics, or a World Championship event, to those of 83 athletes who competed at that level but did not medal. (The groups were matched on age, gender, and sport to control for any influence of these factors on the results. For every medalist in a given event, the sample included a non-medalist in that event of the same gender and roughly the same age.) The results showed that both the medalists and non-medalists started practicing in their main sport before the age of 12. However, the medalists started training in their main sport an average of 18 months later than the non-medalists. (The medalists started at age 11.8, on average, compared to age 10.3 for the non-medalists.) The medalists also accumulated significantly less training in their sport during adolescence and significantly more training in other sports. This pattern of results held across a wide range of sports, from skiing to basketball to archery.
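The matched-pairs design described above can be sketched in a few lines. The numbers below are made up for illustration (the study's actual data are not reproduced here); each tuple pairs a medalist with a non-medalist matched on sport, gender, and age, recording the age at which each began training in their main sport.

```python
import math
import statistics

# Hypothetical (medalist_start_age, matched_non_medalist_start_age) pairs.
pairs = [
    (12.0, 10.5), (11.5, 10.0), (13.0, 11.0), (11.0, 10.5),
    (12.5, 9.5), (10.5, 10.0), (12.0, 10.5), (11.8, 10.3),
]

# Within-pair differences: positive means the medalist specialized later.
diffs = [m - n for m, n in pairs]
mean_diff = statistics.mean(diffs)

# Paired t statistic: mean within-pair difference over its standard error.
se = statistics.stdev(diffs) / math.sqrt(len(diffs))
t = mean_diff / se

print(f"mean difference: {mean_diff:.2f} years, t = {t:.2f}")
```

Matching each medalist to a non-medalist in the same event means the comparison is made within pairs, so differences in sport, gender, and age cancel out rather than confounding the result.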

Along with reducing the risk of burnout and injury, allowing children to sample a range of activities before specializing allows a process known as gene-environment correlation to operate to its full extent. This is the idea that our genetically-influenced traits have an influence on the environments that we seek out and create for ourselves. As recently argued by the behavioral geneticist Elliot Tucker-Drob, gene-environment correlation is fundamental to understanding how expertise develops in children. For example, given the opportunity to try several sports, a child may discover that she has a high level of physical endurance and gravitate towards soccer because it places a premium on this attribute. She may also prefer soccer over other sports because she is extroverted and enjoys having teammates. In turn, after some initial success in the sport, the child’s coach may encourage her to continue playing soccer, setting in motion a “virtuous cycle” of effort followed by improvement, followed by further effort and improvement. However, this natural selection process will never unfold if the child isn’t given ample opportunity to try several sports before specializing.

Why some people become sudden geniuses

March 10th, 2018

In 1860 Eadweard Muybridge was thrown from a stagecoach — and became a creative genius:

He abandoned bookselling and became a photographer, one of the most famous in the world. He was also a prolific inventor. Before the accident, he hadn’t filed a single patent. In the following two decades, he applied for at least 10.

In 1877 he took a bet that allowed him to combine invention and photography. Legend has it that his friend, a wealthy railroad tycoon called Leland Stanford, was convinced that horses could fly. Or, more accurately, he was convinced that when they run, all their legs leave the ground at the same time. Muybridge said they didn’t.

To prove it he placed 12 cameras along a horse track and installed a tripwire that would set them off automatically as Stanford’s favourite racing horse, Occident, ran. Next he invented the inelegantly named “zoopraxiscope”, a device which allowed him to project several images in quick succession and give the impression of motion. To his amazement, the horse was briefly suspended, mid-gallop. Muybridge had filmed the first movie – and with it proven that yes, horses can fly.

The abrupt turnaround of Muybridge’s life, from ordinary bookseller to creative genius, has prompted speculation that it was a direct result of his accident. It’s possible that he had “sudden savant syndrome”, in which exceptional abilities emerge after a brain injury or disease. It’s extremely rare, with just 25 verified cases on the planet.

There’s Tony Cicoria, an orthopaedic surgeon who was struck by lightning at a New York park in 1994. It went straight through his head and left him with an irresistible desire to play the piano. To begin with he was playing other people’s music, but soon he started writing down the melodies that were constantly running through his head. Today he’s a pianist and composer, as well as a practicing surgeon.

Another case is Jon Sarkin, who was transformed from a chiropractor into an artist after a stroke. The urge to draw landed almost immediately. He was having “all kinds” of therapy at the hospital – speech therapy, art therapy, physical therapy, occupational therapy, mental therapy – “And they stuck a crayon in my hand and said ‘want to draw?’ And I said ‘fine’,” he says.


Most strikingly there’s Jason Padgett, who was attacked at a bar in Tacoma, Washington in 2002. Before the attack, Padgett was a college dropout who worked at a futon store. His primary passions in life were partying and chasing girls. He had no interest in maths – at school, he didn’t even get into algebra class.

But that night, everything changed. Initially he was taken to the hospital with a severe concussion. “I remember thinking that everything looked funky, but I thought it was just the narcotic pain shot they gave me,” he says. “Then the next morning I woke up and turned on the water. It looked like little tangent lines [a straight line that touches a single point on a curve], spiralling down.”


There are two leading ideas. The first is that when you’re bashed on the head, the effects are similar to a dose of LSD. Psychedelic drugs are thought to enhance creativity by increasing the levels of serotonin, the so-called “happiness hormone”, in the brain. This leads to “synaesthesia”, in which more than one region is simultaneously activated and senses which are usually separate become linked.


But there is an alternative. The first clue emerged in 1998, when a group of neurologists noticed that five of their patients with dementia were also artists – remarkably good ones. Specifically, they had frontotemporal dementia, which is unusual in that it only affects some parts of the brain. For example, visual creativity may be spared, while language and social skills are progressively destroyed.


To find out what was going on, the scientists performed 3D scans of their patients’ brains. In four out of five cases, they found lesions on the left hemisphere. Nobel Prize-winning research from the 1960s shows that the two halves of the brain specialise in different tasks; in general, the right side is home to creativity and the left is the centre of logic and language.

But the left side is also something of a bully. “It tends to be the dominant brain region,” says Brogaard. “It tends to suppress very marginal types of thinking — highly original, highly creative thinking, because it’s beneficial for our decision-making abilities and our ability to function in normal life.” The theory goes that as the patients’ left hemispheres became progressively more damaged, their right hemispheres were free to flourish.

This is backed up by several other studies, including one in which creative insight was roused in healthy volunteers by temporarily dialling down activity in the left hemisphere and increasing it in the right. “[the lead researcher] Allen Snyder’s work was replicated by another person, so that’s the theory that I think is responsible,” says Darold Treffert, a psychiatrist from the University of Wisconsin Medical School, who has been studying savant syndrome for decades.


It’s been estimated that as many as one in 10 people with autism have savant syndrome and there’s mounting evidence the disorder is associated with enhanced creativity. And though it’s difficult to prove, it’s been speculated that numerous intellectual giants, including Einstein, Newton, Mozart, Darwin and Michelangelo, were on the spectrum.

One theory suggests that autism arises from abnormally low levels of serotonin in the left hemisphere in childhood, which prevents the region from developing normally. Just like with sudden savant syndrome, this allows the right hemisphere to become more active.

Interestingly, many people with sudden savant syndrome also develop symptoms of autism, including social problems, obsessive compulsive disorder (OCD) and all-consuming interests.