We’re following the doctor

April 23rd, 2017

I noticed that The Poseidon Adventure was leaving HBO soon, and I’d never seen the classic 1970s disaster movie, so I started watching it, not expecting a Christian parable:

Right at the start we’re introduced to the hero, the Rev. Frank Scott (Gene Hackman), a renegade priest who we soon come to realize is a modern-day stand-in for Jesus Christ.

Some of the parallels are subtle. Scott is introduced during an onboard religious service by a priest named John, as in John the Baptist. Before disaster strikes the ship, Scott sits at a table with a former prostitute. He raises his glass to toast “Love.” After the ship turns over, someone looks at him and says, “Jesus Christ, what happened?”

[...]

There is just one way out. It’s to climb up a huge Christmas tree. Yes, salvation can be achieved only by way of the tree. Scott is shown dragging it like Jesus carrying the cross. “Life! Life is up there!” he admonishes the passengers. But half of them won’t listen to him, and even his followers are put off by his confidence and stridency: “Who do you think you are, God himself?”

No sooner have Scott’s followers climbed the tree to safety than the walls collapse and water floods the ballroom. Interestingly, director Ronald Neame (“The Prime of Miss Jean Brodie,” “The Odessa File”) doesn’t film the resulting chaos from the viewpoint of the doomed passengers. He shoots their scrambling and flailing from a cold distance, in much the same way that Cecil B. DeMille filmed the doomed Egyptians in “The Ten Commandments” (1956).

Neame brings the same distance to a later scene, in which Scott and his followers come upon a group of survivors led by the ship’s doctor. Scott tells them that they are headed in the wrong direction, but they walk by like zombies. “We’re following the doctor,” one says. They are people in the trance of a false doctrine.

Any doubt that Scott is a Christ figure is eradicated in the climactic scene in which Scott sacrifices his life for the remaining passengers. His method of self-sacrifice is telling. After an agonized and angry prayer (“What more do you want from us?”), he leaps onto a steaming valve and closes it, using his body weight to turn it shut. After hanging from the valve for a few extra seconds (so we catch the crucifixion reference), he drops to his death.

What I couldn’t help but notice was that the cast consisted almost entirely of male character actors and female models. Even the fat lady, Shelley Winters, started her career as a bombshell.

A more comprehensive and devious approach

April 22nd, 2017

An enterprising group of hackers targeted a Brazilian bank with a more comprehensive and devious approach than usual:

At 1 pm on October 22 of last year, the researchers say, hackers changed the Domain Name System registrations of all 36 of the bank’s online properties, commandeering the bank’s desktop and mobile website domains to take users to phishing sites. In practice, that meant the hackers could steal login credentials at sites hosted at the bank’s legitimate web addresses. Kaspersky researchers believe the hackers may have even simultaneously redirected all transactions at ATMs or point-of-sale systems to their own servers, collecting the credit card details of anyone who used their card that Saturday afternoon.

“Absolutely all of the bank’s online operations were under the attackers’ control for five to six hours,” says Dmitry Bestuzhev, one of the Kaspersky researchers who analyzed the attack in real time after seeing malware infecting customers from what appeared to be the bank’s fully valid domain. From the hackers’ point of view, as Bestuzhev puts it, the DNS attack meant that “you become the bank. Everything belongs to you now.”
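The attack works because clients trust whatever the Domain Name System hands back: once the registrar records point at the attackers’ servers, every lookup of the bank’s legitimate addresses silently delivers victims to phishing sites. A minimal sketch of the kind of resolution check a defender might run, using hypothetical placeholder hostnames and documentation-range IP addresses rather than any real bank’s:

```python
import socket

# Known-good addresses for our (hypothetical) domain. If DNS registrations
# are hijacked, lookups start returning addresses outside this set.
EXPECTED_IPS = {"203.0.113.10", "203.0.113.11"}  # placeholder IPs

def dns_points_where_expected(hostname: str) -> bool:
    """Resolve hostname and check every returned address against the
    known-good set. Returns False on hijack or resolution failure."""
    try:
        _, _, ips = socket.gethostbyname_ex(hostname)
    except socket.gaierror:
        return False
    return bool(ips) and set(ips) <= EXPECTED_IPS
```

Nothing on the target’s own servers needs to be touched for this to fail: the point of the Kaspersky account is that controlling the DNS registrations alone was enough to “become the bank.”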

It conquered the office

April 21st, 2017

Adam Smith famously used a pin factory to illustrate the advantages of specialization, Virginia Postrel reminds us — just before the Industrial Revolution really kicked off:

By improving workers’ skills and encouraging purpose-built machinery, the division of labor leads to miraculous productivity gains. Even a small and ill-equipped manufacturer, Smith wrote in The Wealth of Nations, could boost each worker’s output from a handful of pins a day to nearly 5,000.

In the early 19th century, that number jumped an order of magnitude with the introduction of American inventor John Howe’s pin-making machine. It was “one of the marvels of the age, reported on in every major journal and encyclopedia of the time,” writes historian of technology Steven Lubar. In 1839, the Howe factory had three machines making 24,000 pins a day — and the inventor was clamoring for pin tariffs to offset the nearly 25 percent tax that pin makers had to pay on imported brass wire, a reminder that punitive tariffs hurt domestic manufacturers as well as consumers.

[...]

Nowadays, we think of straight pins as sewing supplies. But they weren’t always a specialty product. In Smith’s time and for a century after, pins were a multipurpose fastening technology. Straight pins functioned as buttons, snaps, hooks and eyes, safety pins, zippers, and Velcro. They closed ladies’ bodices, secured men’s neckerchiefs, and held on babies’ diapers. A prudent 19th century woman always kept a supply at hand, leading a Chicago Tribune writer to opine that the practice encouraged poor workmanship in women’s clothes: “The greatest scorner of woman is the maker of the readymade, who would not dare to sew on masculine buttons with but a single thread, yet will be content to give the feminine hook and eye but a promise of fixedness, trusting to the pin to do the rest.”

Most significantly, pins fastened paper. Before Scotch tape or command-v, authors including Jane Austen used them to cut and paste manuscript revisions. The Bodleian Library in Oxford maintains an inventory of “dated and datable pins” removed from manuscripts going as far back as 1617.

[...]

But a better solution was on its way. In 1899, an inventor in the pin-making capital of Waterbury, Connecticut, patented a “machine for making paper clips.” William Middlebrook’s patent application, observed Henry Petroski in The Evolution of Useful Things, “showed a perfectly proportioned Gem.”

It was that paper clip design that conquered the office and consigned pins to their current home in the sewing basket.

People are disposed to mistake predicting troubles for causing troubles

April 20th, 2017

Enoch Powell gave his infamous Rivers of Blood speech on April 20, 1968. Here’s how the BBC reported it:

The Conservative right-winger Enoch Powell has made a hard-hitting speech attacking the government’s immigration policy.

Addressing a Conservative association meeting in Birmingham, Mr Powell said Britain had to be mad to allow in 50,000 dependents of immigrants each year.

He compared it to watching a nation busily engaged in heaping up its own funeral pyre.

The MP for Wolverhampton South West called for an immediate reduction in immigration and the implementation of a Conservative policy of “urgent” encouragement of those already in the UK to return home.

“It can be no part of any policy that existing families should be kept divided. But there are two directions in which families can be reunited,” he said.

Mr Powell compared enacting legislation such as the Race Relations Bill to “throwing a match on to gunpowder”.

He said that as he looked to the future he was filled with a sense of foreboding.

“Like the Roman, I seem to see the river Tiber foaming with much blood,” he said.

He estimated that by the year 2000 up to seven million people — or one in ten of the population — would be of immigrant descent.

How did that prediction pan out?

The Census in 2001 showed 4.6 million people living in the UK were from an ethnic minority, or 7.9% of the population.

Here’s the opening to the actual speech:

The supreme function of statesmanship is to provide against preventable evils. In seeking to do so, it encounters obstacles which are deeply rooted in human nature.

One is that by the very order of things such evils are not demonstrable until they have occurred: at each stage in their onset there is room for doubt and for dispute whether they be real or imaginary. By the same token, they attract little attention in comparison with current troubles, which are both indisputable and pressing: whence the besetting temptation of all politics to concern itself with the immediate present at the expense of the future.

Above all, people are disposed to mistake predicting troubles for causing troubles and even for desiring troubles: “If only,” they love to think, “if only people wouldn’t talk about it, it probably wouldn’t happen.”

Perhaps this habit goes back to the primitive belief that the word and the thing, the name and the object, are identical.

At all events, the discussion of future grave but, with effort now, avoidable evils is the most unpopular and at the same time the most necessary occupation for the politician. Those who knowingly shirk it deserve, and not infrequently receive, the curses of those who come after.

(I’ve mentioned this speech before.)

RIP WeaponsMan

April 19th, 2017

The “quiet professional” behind the WeaponsMan blog, Kevin O’Brien, has passed away, his brother reports:

He was born in 1958 to Robert and Barbara O’Brien. We grew up in Westborough, Mass. Kevin graduated from high school in 1975 and joined the Army in (I believe) 1979. He learned Czech at DLI and became a Ranger and a member of Special Forces.

Kevin’s happiest times were in the Army. He loved the service and was deeply committed to it. We were so proud when he earned the Green Beret. He was active duty for eight years and then stayed in the Reserves and National Guard for many years, including a deployment to Afghanistan in 2003. He told me afterward that the Afghan tour was when he felt he had made his strongest contribution to the world.

[...]

In the winter of 2015, we began building our airplane together. You could not ask for a better building partner.

Last Thursday night was our last “normal” night working on the airplane. I could not join him Friday night, but on Saturday morning I got a call from the Portsmouth Regional Hospital. He had called 911 on Friday afternoon and was taken to the ER with what turned out to be a massive heart attack. Evidently he was conscious when he was brought in, but his heart stopped and he was revived after 60 minutes of CPR. He never reawakened.

He will be missed.

Better villains, bigger explosions

April 19th, 2017

A Twitter follower offered David Frum a memorable explanation of the weak hold of the First World War upon the American consciousness.

“Americans prefer the sequel: better villains, bigger explosions.”

The shadow of Enoch Powell looms ever-larger over Britain

April 18th, 2017

“If the history of the world is but the biography of great men, as Thomas Carlyle put it, the history of Britain since the 1960s is but the biography of two great men and one woman,” The Economist declares, apparently channeling the spirit of Mencius Moldbug:

As Labour home secretary from 1965-67, Roy Jenkins took the government out of the bedroom with a series of liberalising laws on divorce, homosexuality and censorship. As Tory prime minister from 1979-90 Margaret Thatcher unleashed the power of markets. The main job of their successors was to come to terms with these twin revolutions: Tony Blair converted Labour to Thatcherism and David Cameron converted the Tories to Jenkinsism.

Before Brexit it looked as if that was it: the party that could produce the best synthesis of Thatcher and Jenkins would win. But today a third figure hovers over British politics: a man who was born in 1912 — eight years before Jenkins and 13 before Thatcher — but whose influence seems to grow by the day. One of Enoch Powell’s most famous observations was that “all political lives, unless they are cut off in midstream at some happy juncture, end in failure.” His political life is enjoying a posthumous success.

Powell put two issues at the heart of his politics: migration and Europe. He convulsed the country in 1968 when he declared in a speech in his native Birmingham that mass immigration would produce social breakdown — that “like the Roman, I seem to see the River Tiber foaming with much blood.” And he campaigned tirelessly against the European Economic Community. These two passions were united by his belief in the nation state. He thought that nations were the building blocks of society and that attempts to subvert them, through supranational engineering or global flows of people, would end in disaster.

Powell didn’t have the same direct influence as Thatcher or Jenkins. Thatcher was prime minister for 11 tumultuous years. Jenkins lived his life at the centre of the establishment. Powell spent only 15 months of his 37-year political career in office, as minister for health; nothing of substance bears his name on the statute books. In his new book, “The Road to Somewhere”, David Goodhart, a liberal critic of multiculturalism who has been accused of “liberal Powellism”, thinks that his “rivers of blood” speech was doubly counter-productive: it toxified the discussion of immigration for a generation and set the bar to successful immigration too low (no rivers foaming with blood, no problem).

Yet Brexit is soaked in the blood of Powellism. Some of the leading Brexiteers acknowledge their debt to Powell: Nigel Farage regards him as a political hero and says that the country would be better today if his words had been heeded. Powell lit the fire of Euroscepticism in 1970 and kept it burning, often alone, for decade upon decade. He provided the Eurosceptics with their favourite arguments: that Europe was a mortal threat to British sovereignty; that Britain’s future lay in going it alone, “her face towards the oceans and the continents of the world”; that the establishment had betrayed the British people into joining Europe, by selling a political project as an economic one, and would betray them again. History has also been on his side. David Shiels, of Wolfson College, Cambridge, points out that, in Powell’s time, the questions of immigration and Europe were distinct (the immigration that worried him was from the Commonwealth). Europe’s commitment to the free movement of people drove the two things together and gave Powellism its renewed power.

Just as important as his arguments was his style. Powell was the first of the new generation of populists cropping up across the West, a worshipper of Nietzsche in his youth, a professor of classics by the age of 25 who nevertheless considered himself a true voice of the people. He believed that the British establishment had become fatally out of touch on the biggest questions facing the country and used his formidable charisma — insistent voice tinged with Brummie, hypnotic stare — to seduce his audiences.

Nassim Taleb and the Guardian

April 17th, 2017

Nassim Nicholas Taleb hates bankers, academics, and journalists, but he was willing to sit down with Carole Cadwalladr of The Guardian:

And yet here he is, chatting away, surprisingly friendly and approachable. When I say as much as we walk to the restaurant, he asks, “What do you mean?”

“In your book, you’re quite…” and I struggle to find the right word, “grumpy”.

He shrugs. “When you write, you don’t have the social constraints of having people in front of you, so you talk about abstract matters.”

Social constraints, it turns out, have their uses. And he’s an excellent host. We go to his regular restaurant, a no-nonsense, Italian-run, canteen-like place, a few yards from his faculty in central Brooklyn, and he insists that I order a glass of wine.

“And what’ll you have?” asks the waitress.

“I’ll take a coffee,” he says.

“What?” I say. “No way! You can’t trick me into ordering a glass of wine and then have coffee.” It’s like flunking lesson #101 at interviewing school, though in the end he relents and has not one but two glasses and a plate of “pasta without pasta” (though strictly speaking you could call it “mixed vegetables and chicken”), and attacks the bread basket “because it doesn’t have any calories here in Brooklyn”.

This isn’t the “PC police” talking

April 16th, 2017

Scientific American has published an embarrassingly unscientific piece by Eric Siegel on the real problem with Charles Murray and The Bell Curve:

Attempts to fully discredit his most famous book, 1994’s “The Bell Curve,” have failed for more than two decades now. This is because they repeatedly miss the strongest point of attack: an indisputable — albeit encoded — endorsement of prejudice.

So, the science is unassailable, but we should vehemently attack an encoded endorsement of prejudice that is based on that (apparently) unassailable science? “This isn’t the ‘PC police’ talking,” he asserts, but he completely ignores what Murray explicitly says about prejudging people:

Even when the differences are substantial, the variation between two groups will almost always be dwarfed by the variation within groups — meaning that the overlap between two groups will be great. In a free society where people are treated as individuals, “So what?” is to me the appropriate response to genetic group differences. The only political implication of group differences is that we must work hard to ensure that our society is in fact free and that people are in fact treated as individuals.

The obscure religion that shaped the West

April 15th, 2017

Zoroastrianism might be called the obscure religion that shaped the West:

It is generally believed by scholars that the ancient Iranian prophet Zarathustra (known in Persian as Zartosht and Greek as Zoroaster) lived sometime between 1500 and 1000 BC. Prior to Zarathustra, the ancient Persians worshipped the deities of the old Irano-Aryan religion, a counterpart to the Indo-Aryan religion that would come to be known as Hinduism. Zarathustra, however, condemned this practice, and preached that God alone – Ahura Mazda, the Lord of Wisdom – should be worshipped. In doing so, he not only contributed to the great divide between the Iranian and Indian Aryans, but arguably introduced to mankind its first monotheistic faith.

The idea of a single god was not the only essentially Zoroastrian tenet to find its way into other major faiths, most notably the ‘big three’: Judaism, Christianity and Islam. The concepts of Heaven and Hell, Judgment Day and the final revelation of the world, and angels and demons all originated in the teachings of Zarathustra, as well as the later canon of Zoroastrian literature they inspired. Even the idea of Satan is a fundamentally Zoroastrian one; in fact, the entire faith of Zoroastrianism is predicated on the struggle between God and the forces of goodness and light (represented by the Holy Spirit, Spenta Manyu) and Ahriman, who presides over the forces of darkness and evil. While man has to choose to which side he belongs, the religion teaches that ultimately, God will prevail, and even those condemned to hellfire will enjoy the blessings of Paradise (an Old Persian word).

How did Zoroastrian ideas find their way into the Abrahamic faiths and elsewhere? According to scholars, many of these concepts were introduced to the Jews of Babylon upon being liberated by the Persian emperor Cyrus the Great. They trickled into mainstream Jewish thought, and figures like Beelzebub emerged. And after Persia’s conquests of Greek lands during the heyday of the Achaemenid Empire, Greek philosophy took a different course. The Greeks had previously believed humans had little agency, and that their fates were at the mercy of their many gods, who often acted according to whim and fancy. After their acquaintance with Iranian religion and philosophy, however, they began to feel more as if they were the masters of their destinies, and that their decisions were in their own hands.

Decivilization in the 1960s

April 14th, 2017

Steven Pinker discusses decivilization in the 1960s:

After a three-decade free fall that spanned the Great Depression, World War II, and the Cold War, Americans multiplied their homicide rate by more than two and a half, from a low of 4.0 in 1957 to a high of 10.2 in 1980 (U.S. Bureau of Statistics; Fox and Zawitz: 2007). The upsurge included every other category of major crime as well, including rape, assault, robbery, and theft, and lasted (with ups and downs) for three decades. The cities got particularly dangerous, especially New York, which became a symbol of the new criminality. Though the surge in violence affected all the races and both genders, it was most dramatic among black men, whose annual homicide rate had shot up by the mid-1980s to 72 per 100,000.

[...]

The rebounding of violence in the 1960s defied every expectation. The decade was a time of unprecedented economic growth, nearly full employment, levels of economic equality for which people today are nostalgic, historic racial progress, and the blossoming of government social programs, not to mention medical advances that made victims more likely to survive being shot or knifed. Social theorists in 1962 would have happily bet that these fortunate conditions would lead to a continuing era of low crime. And they would have lost their shirts.

[...]

When rock music burst onto the scene in the 1950s, politicians and clergymen vilified it for corrupting morals and encouraging lawlessness. (An amusing video reel of fulminating fogies can be seen in Cleveland’s Rock and Roll Hall of Fame and Museum.) Do we now have to – gulp – admit they were right? Can we connect the values of 1960s popular culture to the actual rise in violent crimes that accompanied them? Not directly, of course. Correlation is not causation, and a third factor, the pushback against the values of the Civilizing Process, presumably caused both the changes in popular culture and the increase in violent behavior. Also, the overwhelming majority of baby boomers committed no violence whatsoever. Still, attitudes and popular culture surely reinforce each other, and at the margins, where susceptible individuals and subcultures can be buffeted one way or another, there are plausible causal arrows from the decivilizing mindset to the facilitation of actual violence.

One of them was a self-handicapping of the criminal justice Leviathan. Though rock musicians seldom influence public policy directly, writers and intellectuals do, and they got caught up in the zeitgeist and began to rationalize the new licentiousness. Marxism made violent class conflict seem like a route to a better world. Influential thinkers like Herbert Marcuse and Paul Goodman tried to merge Marxism or anarchism with a new interpretation of Freud that connected sexual and emotional repression to political repression and championed a release from inhibitions as part of the revolutionary struggle. Troublemakers were increasingly seen as rebels and nonconformists, or as victims of racism, poverty, and bad parenting. Graffiti vandals were now ‘artists,’ thieves were ‘class warriors,’ and neighborhood hooligans were ‘community leaders.’ Many smart people, intoxicated by radical chic, did incredibly stupid things. Graduates of elite universities built bombs to be set off at army social functions, or drove getaway cars while ‘radicals’ shot guards during armed robberies. New York intellectuals were conned by Marxobabble-spouting psychopaths into lobbying for their release from prison (Pinker 2002: 261–262).

Read the whole thing. (It’s an excerpt from The Better Angels of Our Nature: Why Violence Has Declined.)

The lowest layer of the pyramid is the foundation

April 13th, 2017

It’s hard to find a teacher who doesn’t make reference to Bloom’s Taxonomy, Doug Lemov notes, because it’s part of the language of teaching, but there’s a problem:

Bloom’s Taxonomy is often represented as a pyramid with the understanding — intended or accidental — that teachers should try to get to the top. That’s the nature of pyramids, I guess.

Bloom's Taxonomy Pyramid

Generally when teachers talk about “Bloom’s taxonomy,” they talk with disdain about “lower level” questions. They believe, perhaps because of the pyramid image which puts knowledge at the bottom, that knowledge-based questions, especially via recall and retrieval practice, are the least productive thing they could be doing in class. No one wants to be the rube at the bottom of the pyramid.

But this, interestingly, is not what Bloom argued — at least according to Vanderbilt’s description. Saying that knowledge questions are low value and saying that knowledge is the necessary precondition for deep thinking are very different things. More importantly, believing that knowledge questions — even mere recall of facts — are low value doesn’t jibe with the overwhelming consensus of cognitive science, summarized here by Daniel Willingham, who writes,

Data from the last thirty years lead to a conclusion that is not scientifically challengeable: thinking well requires knowing facts, and that’s true not simply because you need something to think about. The very processes that teachers care about most — critical thinking processes such as reasoning and problem solving — are intimately intertwined with factual knowledge that is in long-term memory (not just found in the environment).

In other words, there are two parts to the equation. You not only have to teach a lot of facts to allow students to think deeply, but you also have to reinforce knowledge enough to install it in long-term memory, or you can’t do any of the activities at the top of the pyramid.

US healthcare is famous for three things

April 12th, 2017

US healthcare is famous for three things, Ben Southwood notes:

It’s expensive, it’s not universal, and it has poor outcomes. The US spends around $7,000 per person on healthcare every year, or roughly 18% of GDP; the next highest spender is Switzerland, which spends about $4,500. Before Obamacare, approximately 15% of the US population were persistently uninsured (8.6% still are). And as this chart neatly shows, their overall outcome on the most important variable — overall life expectancy — is fairly poor.

But some of this criticism is wrongheaded and simplistic: when you slice the data up more reasonably, US outcomes look impressive, but being the world’s outrider is much more expensive than following behind. What’s more, most of the solutions people offer just don’t get to the heart of the issue: if you give people freedom they’ll spend a lot on healthcare.

The US undoubtedly spends a huge amount on healthcare. One popular narrative is that because of market failures and/or extreme overregulation in healthcare, prices are excessively high. So Americans with insurance (or covered by Medicare, the universal system for the elderly, or Medicaid, the government system for the poor) get the same as other developed world citizens, but those without get very poor care and die younger. A system like the NHS solves the problem, according to this view, with bulk buying of land, labour, and inputs, better incentives, and universal coverage.

But there are some serious flaws in this theory. Firstly, extending insurance to the previously-uninsured doesn’t, in America, seem to have large benefits. For example, a recent NBER paper found no overall health gains from the massive insurance expansion under Obamacare.* A famous RAND study found minuscule benefits over decades from giving out free insurance to previously uninsured in the 1970s. In fact, over and above the basics, insuring those who choose not to get insurance doesn’t ever seem to have large gains. Indeed, there is wide geographic variation in the life expectancy among the low income in the US, but this doesn’t even correlate with access to medical care! This makes it unlikely that the gap between the US and the rest is explained by universality.

To find the answer, consider the main two ingredients that go into health outcomes. One is health, and the other is treatment. If latent health is the same across the Western world, we can presume that any differences come from differences in treatment. But this is simply not the case. Obesity is far higher in the USA than in any other major developed country. Obviously it is a public health problem, but it’s unrealistic to blame it on the US system of paying for doctors, administrators, hospitals, equipment and drugs.

In fact in the US case it’s not even obesity, or indeed their greater pre-existing disease burden, that is doing most of the work in dragging their life expectancy down; it’s accidental and violent deaths. It is tragic that the US is so dangerous, but it’s not the fault of the healthcare system; indeed, it’s an extra burden that US healthcare spending must bear. Simply normalising for violent and accidental death puts the USA right at the top of the life expectancy rankings.

One of our cultural problems, Arnold Kling adds, is that we spend too much on health care and not enough on public health.

Take your shoes off at the door

April 11th, 2017

It turns out that taking your shoes off when you come inside doesn’t just keep the carpets cleaner. It’s also healthier:

Among samples collected in homes, 26.4% of shoe soles tested positive for C. Diff, about three times the number found on the surfaces of bathrooms and kitchens.

And that’s just one bacterium. In an earlier investigation, Dr. Garey examined past studies to learn if “shoe soles are a vector for infectious pathogens.” The answer was a resounding yes.

Among the studies: Austrian researchers found at least 40% of shoes carried Listeria monocytogenes in 2015. And a 2014 German study found that over a quarter of boots used on farms carried E. coli.

“Essentially, when you wear your shoes in a house, you are bringing in everything you stepped in during the day,” says Jonathan Sexton, a laboratory manager at the Mel & Enid Zuckerman College of Public Health at the University of Arizona.

Wiping your feet, however vigorously, on a welcome mat provides only limited help, he says. “It will remove some of the dirt, but you have to think of the person who wiped their feet before. You might be picking up stuff they left behind.”

Some homeowners may worry that guests in socks or bare feet might also represent a health risk. That’s possible, Dr. Sexton says, but the inside of a shoe has far less bacteria than the outside.

Both researchers agree that the risk is muted. “Shoes in the house are not something to freak out about,” Dr. Sexton says.

A Family Planning Miracle in Colorado

April 10th, 2017

So, what fraction of pregnancies are unintended?

Almost half of Colorado women who got pregnant in 2008 said that the pregnancy happened sooner than they wanted or that they hadn’t wanted to get pregnant at all. That was similar to the US average: the rate of unintended pregnancy has been stuck around 50 percent since the 1960s.

The Colorado Family Planning Initiative cut teen births and abortions almost in half:

They fell by nearly 20 percent among women aged 20-24. (Note: Under normal circumstances, over 80 percent of teen pregnancies and 70 percent of pregnancies among single women aged 20-29 are unsought, so this change means women’s realities are better matching their family desires.) Second-order births to teens—teens who gave birth a second or third time—dropped by 58 percent. High-risk births, including preterm births, also diminished.

Poor families benefited the most, because unsought pregnancy is four times as common and unsought birth seven times as common among poor women as among their more prosperous peers. With fewer families facing the dire circumstances triggered by an unexpected pregnancy or unplanned birth, the state saved $66-70 million in public assistance, according to a team of economists at the University of Colorado.

How did Colorado get such dramatic results? They provided “get it and forget it” forms of contraception, such as long-acting IUDs and implants.

This is seen as a great Progressive victory.