Technology will, by itself, degrade

January 12th, 2020

I didn’t recognize Jonathan Blow by name — he’s the “indie” game designer behind Braid, which I haven’t played, but which I have mentioned — but he recently gave a speech about a topic that interests me, Preventing the Collapse of Civilization:

He presents the key point fifteen minutes in:

This is why technology degrades. It takes a lot of energy to communicate from generation to generation; there are losses.

Nikita Prokopov summarizes it this way:

The software crisis is systemic and generational. Say, the first generation works on thing X. After X is done and becomes popular, time passes and the next generation of programmers comes and works on Y, based on X. They do not need to know, exactly, how X is built, why it was built that way, or how to write an alternative X from scratch. They are not lesser people or lazier, they just have no real need to write X2 since X already exists and allows them to solve more pressing tasks.

The biggest a-ha moment of the talk was that if you are working on Y and Y is based on X, that does not automatically imply that you would also know X. Even if the people who built X are still around, knowledge does not spread automatically and, without actual necessity, it will go away with the people who originally possessed it.

This is counter-intuitive: most people would think that if we’ve built, for example, a space ship or a complex airplane in the past, we could build it again at any time. But no, if we weren’t building a particular plane uninterruptedly, then after just 50 years it is already easier to develop a new one from scratch rather than trying to revive old processes and documentation. Knowledge does not automatically transfer to the next generation.

In programming, we are developing abstractions at an alarming rate. When enough of those are stacked, it becomes impossible to figure out or control what’s going on down the stack. This is where my contribution begins: I believe I have found some pretty vivid examples of how the ladder of abstractions has started to fall and nobody can do anything about it now, because we are all used to working only at the very tip of it.

I still think a good general education would teach how to rebuild civilization. (I haven’t read my copy of How to Invent Everything: A Survival Guide for the Stranded Time Traveler yet, but it looks promising.)

Imagine sending a five-year-old into combat

January 11th, 2020

Hamilton Gregory, author of McNamara’s Folly, discusses the use of low-IQ troops in the Vietnam War:

I mentioned McNamara’s Folly when Gwern reviewed it.

Almost any change led to increased productivity

January 10th, 2020

The term Hawthorne effect was coined in 1958 by Henry A. Landsberger when he was analyzing earlier experiments from 1924–32 at the Hawthorne Works (a Western Electric factory outside Chicago) to describe the surprising finding of the numerous productivity studies:

The original purpose of the Hawthorne studies was to examine the effect that different aspects of the work environment, such as lighting, the timing of breaks, and the length of the workday, had on worker productivity.

In the most famous of the experiments, the focus of the study was to determine if increasing or decreasing the amount of light that workers received would have an effect on how productive workers were during their shifts. Employee productivity seemed to increase due to the changes but then decreased once the experiment was over.

What the researchers in the original studies found was that almost any change to the experimental conditions led to increases in productivity. When illumination was decreased to the levels of candlelight, production increased. In other variations of the experiments, the production also improved when breaks were eliminated entirely and when the workday was lengthened.

The results were surprising and the researchers concluded at the time that workers were actually responding to the increased attention from their supervisors. Researchers suggested that productivity increased due to attention and not because of changes in the experimental variables. Landsberger defined the Hawthorne effect as a short-term improvement in performance caused by observing workers.

Roman numerals were used in academia

January 10th, 2020

The Hindu-Arabic number system was invented in India around the year 500 AD, and during the Early Middle Ages it spread throughout the Arabic-speaking world, but it did not spread quickly throughout Europe:

Crossley examined 1,398 manuscripts created between the years 1200 and 1500 to see how much use they made of the Hindu-Arabic numerals, and found that throughout this period Roman numerals were still largely preferred. For the 13th century, only 7% of manuscripts had the new numbers, rising to 17% for the 14th century and 47% for the 15th century. He also found many instances where writers mixed the two systems, sometimes within the same number: for example, one sometimes found M (for 1000) followed by Arabic numerals.

That crazy, mixed-up example sounds like a superior system, a more readable scientific notation in which you succinctly state the order of magnitude (M = 10³) and then rattle off the significant digits.
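To make the idea concrete, here is a toy reader for such mixed numerals. The convention it implements — a single Roman magnitude marker followed by Arabic digits for the remainder — is my own guess at how "M followed by Arabic numerals" would be read, not a documented medieval practice.

```python
# Toy interpreter for mixed Roman-Arabic numerals, under the assumed
# convention that one Roman marker supplies the order of magnitude and
# the Arabic digits supply the rest, roughly like scientific notation.

ROMAN_MAGNITUDES = {"M": 1000, "C": 100, "X": 10}

def read_mixed(numeral: str) -> int:
    """Read e.g. 'M 456' as 1000 + 456 = 1456."""
    marker, digits = numeral.split()
    return ROMAN_MAGNITUDES[marker] + int(digits)

assert read_mixed("M 456") == 1456
assert read_mixed("C 23") == 123
```

The appeal the post describes falls out of the split: the marker tells you the scale at a glance, and the digits carry the precision.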

The new and old systems continued side by side, but in different domains:

Roman numerals were used in academia where universities taught about abstract properties: square numbers, triangular numbers, etc. Hindu-Arabic numerals were used for the practical world of commerce. This occurred in special, so-called abacus schools where merchants and their employees were taught the new Hindu-Arabic numerals. Such schools were prevalent in Italy. Since they were intimately involved with sometimes quite complicated calculations, the commercial use ultimately led to the development of algebra. It was not until the sixteenth century that the two domains came together. At that time academia at last embraced the study of methods of calculation, in particular algebra, while retaining its theoretical concern with abstract properties of numbers.

Two simple strategies for breaking bad habits are creating friction and changing cues

January 9th, 2020

Two simple strategies for breaking bad habits are creating friction and changing cues:

Physical distance is a simple source of friction. A 2014 study involved a bowl of buttered popcorn and a bowl of apple slices. One group of participants sat closer to the popcorn than to the apple slices; the other sat closer to the apple slices. The first group ate three times more calories. The second group could see and smell the popcorn, but the distance created friction, and they were less likely to eat it.

[...]

For example, researchers looked at the GPS data of people with gym memberships. Those who traveled about 3.7 miles to a gym went five or more times a month. However, those who had to travel around 5.2 miles went only about once a month.

[...]

Cues change naturally when you start new relationships, change jobs, or move. These offer a window of opportunity to act on your goals and desires without being dragged down by the cues that trigger your old habits.

For example, researchers found in a 2017 study that professional athletes whose performance had declined often improved after being traded to or signing with a new team. Another study found that new residents of a small British town who held strong environmental values mostly took the bus or cycled to work. But people who were not recent movers mostly drove, even though they held similar values.

When cues change, it becomes easier to switch up your habits and routines.

Sometimes it feels like Amazon is trying to make the publishers look ridiculous

January 9th, 2020

The 2010s were supposed to bring the ebook revolution, but it never quite arrived:

Instead, at the other end of the decade, ebook sales seem to have stabilized at around 20 percent of total book sales, with print sales making up the remaining 80 percent. “Five or 10 years ago,” says Andrew Albanese, a senior writer at trade magazine Publishers Weekly and the author of The Battle of $9.99, “you would have thought those numbers would have been reversed.”

And in part, Albanese tells Vox in a phone interview, that’s because the digital natives of Gen Z and the millennial generation have very little interest in buying ebooks. “They’re glued to their phones, they love social media, but when it comes to reading a book, they want John Green in print,” he says. The people who are actually buying ebooks? Mostly boomers. “Older readers are glued to their e-readers,” says Albanese. “They don’t have to go to the bookstore. They can make the font bigger. It’s convenient.”

Ebooks aren’t only selling less than everyone predicted they would at the beginning of the decade. They also cost more than everyone predicted they would — and consistently, they cost more than their print equivalents.

[...]

Printing and binding and shipping — the costs that ebooks eliminated — accounted for only two dollars of the cost of a hardcover, publishers argued. So the ebook for a $20 hardcover book should cost no less than $18. And according to publishers, by setting the price of an ebook at $9.99, Amazon was training readers to undervalue books.

[...]

Print books are generally sold under a wholesale model, which works like this: First, the publisher will set a suggested list price for a book; say, $20. Then it will sell the book to resellers and distributors for a discount off that suggested list price. So if Simon & Schuster wants to sell a $20 book to Amazon, Amazon might negotiate a discount of 40 percent for itself and end up paying Simon & Schuster only $12 for that book.

But once Amazon owns the book, it has the right to set whatever price it would like for consumers. The $20 list price that Simon & Schuster set was just a suggestion. Under the wholesale model, Amazon is free to decide to sell the book to readers for as little as a single dollar if it chooses to.

Until 2010, ebooks were sold through the wholesale model too. So if Simon & Schuster was publishing a $20 hardcover, they could choose to set a suggested list price of $18 for the ebook — two dollars less than the hardcover — and then sell that ebook to Amazon at a 40 percent discount for $10.80. And Amazon could, in turn, feel free to sell that ebook for $9.99 and swallow a loss of 81 cents.
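The wholesale arithmetic in the passage above can be checked in a few lines. This is just a sketch of the numbers the article gives (list price, retailer discount, retailer's chosen sale price); the function names are mine.

```python
# Wholesale model, per the article: the publisher sets a suggested list
# price, sells to the retailer at a discount off that price, and the
# retailer then prices the book however it likes.

def wholesale_cost(list_price: float, discount: float) -> float:
    """What the retailer pays the publisher per copy."""
    return round(list_price * (1 - discount), 2)

def retailer_margin(list_price: float, discount: float, sale_price: float) -> float:
    """Retailer profit (or loss, if negative) per copy sold."""
    return round(sale_price - wholesale_cost(list_price, discount), 2)

# The $20 hardcover: at a 40% discount, Amazon pays $12.
assert wholesale_cost(20.00, 0.40) == 12.00

# The $18 ebook: Amazon pays $10.80, sells at $9.99, and eats an 81-cent loss.
assert wholesale_cost(18.00, 0.40) == 10.80
assert retailer_margin(18.00, 0.40, 9.99) == -0.81
```

The key design point of the model is visible in the last line: under wholesale terms, the loss on a discounted sale is the retailer's to absorb, which is what let Amazon hold the $9.99 line.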

[...]

Apple was offering publishers an incentive to root for it over Amazon. With its App Store, Apple had established a resale model that worked differently from the wholesale model publishers were used to. It was called the agency model, and it worked like this: publishers would decide on what the list price for their book should be, and then put it up for sale at that price in the iBooks store. Apple would take a 30 percent commission on every sale.

Apple wasn’t willing to sell ebooks for $18, but it thought a cap of $14.99 was perfectly reasonable. And if publishers decided to go along with Apple’s plan, they could set a list price of $14.99 for an ebook and be sure that no one in the iBooks store would ever discount it without the publisher’s express permission. Apple, meanwhile, would pocket $4.50 from each sale.
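The agency split described above works out the same way. Again a sketch under the article's numbers (a 30 percent commission on a publisher-set price); the helper name is mine.

```python
# Agency model, per the article: the publisher sets the consumer price
# directly, and the retailer takes a fixed commission on each sale.

def agency_split(list_price: float, commission: float = 0.30):
    """Return (retailer_cut, publisher_cut) per copy sold."""
    retailer_cut = round(list_price * commission, 2)
    return retailer_cut, round(list_price - retailer_cut, 2)

# A $14.99 ebook: Apple pockets about $4.50; the publisher keeps $10.49.
apple_cut, publisher_cut = agency_split(14.99)
assert apple_cut == 4.50
assert publisher_cut == 10.49
```

Compare this with the wholesale figures earlier in the post: the publisher's per-copy take is lower than the $10.80 Amazon paid at wholesale, but the publisher gains control of the retail price — which is exactly what the five houses were buying.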

But Apple couldn’t enter the ebook market while charging consumers five dollars more per unit than its biggest competitor was. It needed some assurance that no one would have a cheaper product than it had. So it made a deal with five of the Big Six publishers (Simon & Schuster, Penguin, HarperCollins, Hachette, and Macmillan; Random House, then the biggest trade publishing house, abstained): They could all sign on to Apple’s agency model, as long as they guaranteed that they’d also use that same agency model with every other retailer they worked with. That way, Amazon, too, would be forced to sell its ebooks for $14.99 — and if it refused, publishers could withhold their ebooks from Amazon and make them exclusive to Apple.

[...]

“Amazon can still discount whatever they like on the print side,” explains Jane Friedman, a publishing consultant and the author of The Business of Being a Writer. On the ebook side, however, Amazon now lists publisher-mandated prices, often with the petulant italic addition “Price set by seller.” “So the market is very weird, and often the ebook costs more than the print,” Friedman says. “Sometimes it feels like Amazon is trying to make the publishers look ridiculous.”

This kingpin strategy increased homicides by 80 percent

January 8th, 2020

The record shows that removing leaders often leads to more chaotic violence, Max Abrahms points out:

In January 2016, Mexican marines captured Joaquín “El Chapo” Guzmán, the longtime head of the Sinaloa Cartel. Taking him off the streets made the gang bloodier than ever before. Not only did the amount of violence increase, but the target selection expanded to include innocent bystanders. A gang member who worked for a contemporary of El Chapo compared the type of cartel violence before and after the arrest: “If we wanted to kill you and you turned up with your wife and children, we couldn’t do anything. We couldn’t touch you. Now, they don’t give a damn … If they see you in a taco stand, they’ll come and shoot it up.”

More systematically, the economists Jason Lindo and María Padilla-Romo examined the effects of targeting high-ranking gang members on Mexican homicide rates from 2001 to 2010. This “kingpin strategy,” they found, increased homicides by 80 percent in the municipalities where the leaders had operated for at least one year.

Many militant groups have also become less restrained toward civilians after the death or imprisonment of senior figures. In 1954, the British launched Operation Anvil to stamp out the Mau Mau Uprising in Kenya. Capturing leaders around Nairobi initiated a period of uncoordinated, rudderless violence. South Africa’s African National Congress also became less tactically disciplined when its leadership was marginalized. In 1961, the ANC established an armed wing called Umkhonto we Sizwe, which came to be known as the MK. Leadership stressed the value of “properly controlled violence” to spare civilians. For three years, MK members complied by studiously avoiding terrorist attacks. After Nelson Mandela was sentenced to life imprisonment in 1964, however, young men in the ANC engaged in stone throwing, arson, looting, and brutal killings of civilians. The political scientist Gregory Houston observed that “the removal of experienced and respected leaders … created a leadership vacuum” that empowered undisciplined hotheads. When Filipino police assassinated the Abu Sayyaf founder, Abdurajak Abubakar Janjalani, in 1998, the group devolved into a movement of bandits that preyed on private citizens. When Nigerian police summarily executed the Boko Haram founder, Mohammed Yusuf, in 2009, the terrorist organization also turned ruthless against civilians. And the al-Qaeda–linked rebel group Ahrar al-Sham became even more radical after a 2014 attack on its headquarters, in the northwestern province of Idlib, Syria, took out its leadership.

The theory that removing leaders results in worse violence is supported by more than mere anecdote. In a couple of peer-reviewed studies, I’ve tested whether killing the leader of a militant group makes that group more tactically extreme. Across conflict zones from the Afghanistan-Pakistan to the Israel-Palestine theaters, my co-authors and I found that militant groups significantly increase their attacks against civilians after an operationally successful strike against their leadership. Vengeance is not the main driver, as the overall quantity of violence changes less than the quality does. So-called leadership decapitation does not elicit a paroxysm of violence, but makes it more indiscriminate against innocent civilians.

Leadership decapitation promotes terrorism by empowering subordinates with less restraint toward civilians. In empirical research, I’ve demonstrated that militant groups fare better politically when they direct their violence at military and other government targets rather than civilians. Unlike guerrilla attacks against government targets, terrorist attacks against civilian targets tend to reduce popular support, empower hard-liners, and, most important, lower the odds of government concessions. But lower-level members, compared with their superiors, are less likely to grasp that attacking civilians does not pay.

[...]

Of course, not all militant leaders appreciate the folly of terrorism or possess the organizational clout to prevent operatives from perpetrating it. To a large extent, the effects of targeted killing thus depend on the type of leader killed. As I predicted in October, the death of the Islamic State leader, Abu Bakr al-Baghdadi, did not increase the group’s terrorist attacks, because he had favored maximum carnage against civilians and exercised limited control over his subordinates, particularly “lone wolves” who simply declared their rhetorical allegiance to him. Leadership decapitation is most likely to increase terrorism when the leader understood the strategic value of tactical restraint toward civilians and imposed his targeting restraint on the rank and file. A salient example is the al-Aqsa Martyrs’ Brigades, which ramped up their terrorist attacks against Israeli civilians when their leadership was crushed during the Second Intifada.

California’s mandated background checks had no impact on gun deaths

January 8th, 2020

A joint study conducted by researchers at the Johns Hopkins Bloomberg School of Public Health and the University of California at Davis Violence Prevention Research Program found that California’s mandated background checks had no impact on gun deaths:

In 1991, California simultaneously imposed comprehensive background checks for firearm sales and prohibited gun sales (and gun possession) to people convicted of misdemeanor violent crimes. The legislation mandated that all gun sales, including private transactions, would have to go through a California-licensed Federal Firearms License (FFL) dealer. Shotguns and rifles, like handguns, became subject to a 15-day waiting period to make certain all gun purchasers had undergone a thorough background check.

It was the most expansive state gun control legislation in America, affecting an estimated one million gun buyers in the first year alone. Though costly and cumbersome, politicians and law enforcement agreed the law was worth it.

The legislation would “keep more guns out of the hands of the people who shouldn’t have them,” said then-Republican Gov. George Deukmejian.

“I think the new laws are going to help counter the violence,” said LAPD spokesman William D. Booth.

More than a quarter of a century later, researchers at Johns Hopkins and UC Davis dug into the results of the sweeping legislation. Researchers compared yearly gun suicide and homicide rates over the 10 years following implementation of California’s law with 32 control states that did not have such laws.

They found “no change in the rates of either cause of death from firearms through 2000.”

Respectable enough to be invited to all the dances

January 7th, 2020

The genteel poverty of the March girls in Little Women — respectable enough to be invited to all the dances, but too broke to be the belles of the ball — reflects the remarkable upbringing of their author, Louisa May Alcott, Steve Sailer explains:

Back before Mark Twain, American literature was kind of a who-you-know business, and the Alcotts knew everybody who was anybody in the author industry. Ralph Waldo Emerson lent her family the money to buy their house in Concord, Henry David Thoreau told them it was haunted, and they eventually sold it to Nathaniel Hawthorne.

Considered a genius by America’s leading intellectuals, Louisa’s improvident father, Amos Bronson Alcott, was a figure out of a Mencius Moldbug essay about how WASPs are the real communists.

As Louisa recounted in her satire Transcendental Wild Oats, in the summer of 1841 her father founded a utopian commune called Fruitlands whose inmates were required to eat a vegan diet and not wear cotton (because it was picked by slaves), leather, or wool (because dumb brutes could not consent to be exploited). They could only wear linen, which was pleasant in summer, but not, as it turned out, in winter.

Nor could these animal rights activists employ beasts of burden to pull their plow. By December, with starvation held back only by Mrs. Alcott’s ceaseless labors, Mr. Alcott called the whole thing off.

Louisa was a more sensible soul than her father and enjoyed making money off her writing. So she eventually gave in when her publisher asked her to write a book for girls, even though she complained that she only identified with boys. In her semiautobiographical Little Women, the girls’ father is much improved upon by being rewritten as a beloved paterfamilias who is far away serving as a chaplain in the Union Army.

White officers are not more likely to shoot minority civilians than non-White officers

January 6th, 2020

A recent study published in PNAS looked at officer characteristics and racial disparities in fatal officer-involved shootings:

There is widespread concern about racial disparities in fatal officer-involved shootings and that these disparities reflect discrimination by White officers. Existing databases of fatal shootings lack information about officers, and past analytic approaches have made it difficult to assess the contributions of factors like crime. We create a comprehensive database of officers involved in fatal shootings during 2015 and predict victim race from civilian, officer, and county characteristics. We find no evidence of anti-Black or anti-Hispanic disparities across shootings, and White officers are not more likely to shoot minority civilians than non-White officers.

[Chart: Odds of Civilian Being White vs. Black or Hispanic]

Instead, race-specific crime strongly predicts civilian race. This suggests that increasing diversity among officers by itself is unlikely to reduce racial disparity in police shootings.

The low chance of war with Iran

January 5th, 2020

Richard Fernandez discusses the low chance of war with Iran:

With everyone wondering if Iran and the US will go to war, it’s pertinent to understand that both nations are already in an undeclared conflict going back more than 40 years. “And often, it’s been a war that our political and intelligence elites have denied exists.”

It began on November 4, 1979, when “radicals” loyal to Ayatollah Ruhollah Khomeini seized the U.S. Embassy in Tehran … On April 18, 1983, a suicide bomber drove a truck full of explosives into the U.S. Embassy in Beirut, Lebanon … Iran has also targeted U.S. soldiers on the battlefield, killing more than 1,000 U.S. troops with specialized improvised explosive devices in Iraq, placing a bounty on U.S. service personnel in Afghanistan, and most recently targeting U.S. forces in Syria.

The obvious question is why this conflict, which has claimed thousands of lives, has remained in a state of limbo, and why elites are at pains to deny it exists. One possible answer is that the combatants prefer it that way. Iran, for its part, is heavily engaged in proxy war with Saudi Arabia in far-flung theaters including Syria, Yemen, Iraq, the Bahrain uprising, Lebanon, and even Afghanistan. It can scarcely afford the additional cost of open conflict with the United States if it is to escape over-extension. It is in Iran’s interest to keep its war with America undeclared so that it can pick and choose when to engage.

For analogous but different reasons, Washington preferred it secret too. Undeclared conflicts are the only way to fight “forever wars,” where the object is not the destruction of the enemy but rather its management and containment in such a way that the global public and markets don’t notice.

Intelligence and character aren’t the same things at all

January 5th, 2020

The problem with meritocracy, T. Greer notes, isn’t the merit — it’s the ocracy. He cites some passages from Andrew Yang’s book, of all places:

Intelligence and character aren’t the same things at all. Pretending that they are will lead us to ruin. The market is about to turn on many of us with little care for what separates us from each other. I’ve worked with and grown up alongside hundreds of very highly educated people for the past several decades, and trust me when I say that they are not uniformly awesome. People in the bubble think that the world is more orderly than it is. They overplan. They mistake smarts for judgment. They mistake smarts for character. They overvalue credentials. Head not heart. They need status and reassurance. They see risk as a bad thing. They optimize for the wrong things. They think in two years, not 20. They need other bubble people around. They get pissed off when others succeed. They think their smarts should determine their place in the world. They think ideas supersede action. They get agitated if they’re not making clear progress. They’re unhappy. They fear being wrong and looking silly. They don’t like to sell. They talk themselves out of having guts. They worship the market. They worry too much. Bubble people have their pluses and minuses like anyone else.

[...]

In coming years it’s going to be even harder to forge a sense of common identity across different walks of life. A lot of people who now live in the bubble grew up in other parts of the country. They still visit their families for holidays and special occasions. They were brought up middle-class in normal suburbs like I was and retain a deep familiarity with the experiences of different types of people. They loved the mall, too.

In another generation this will become less and less true. There will be an army of slender, highly cultivated products of Mountain View and the Upper East Side and Bethesda heading to elite schools that has been groomed since birth in the most competitive and rarefied environments with very limited exposure to the rest of the country.

When I was growing up, there was something of an inverse relationship between being smart and being good-looking. The smart kids were bookish and awkward and the social kids were attractive and popular. Rarely were the two sets of qualities found together in the same people. The nerd camps I went to looked the part.

Today, thanks to assortative mating in a handful of cities, intellect, attractiveness, education, and wealth are all converging in the same families and neighborhoods. I look at my friends’ children, and many of them resemble unicorns: brilliant, beautiful, socially precocious creatures who have gotten the best of all possible resources since the day they were born. I imagine them in 10 or 15 years traveling to other parts of the country, and I know that they are going to feel like, and be received as, strangers in a strange land. They will have thriving online lives and not even remember a car that didn’t drive itself. They may feel they have nothing in common with the people before them. Their ties to the greater national fabric will be minimal. Their empathy and desire to subsidize and address the distress of the general public will likely be lower and lower.

Does this mean that Hollywood movies actually reduce crime?

January 4th, 2020

Bryan Caplan discusses the social conservatism of Hollywood:

The message of all this cinema: Follow the path of bourgeois virtue.  Work hard, keep the peace, abstain from alcohol, have very few sexual partners, and keep your whole family far away from anyone who lives otherwise.  Think about how many fictional characters would have lived longer if they never set foot in a bar.

Is this the message the writers intend to send?  Unlikely.  Instead, they try to create engrossing stories — and end up weaving morality tales.

[...]

Does this mean that Hollywood movies actually reduce crime? I doubt it. The viewers most in need of lessons in bourgeois virtue are probably too impulsive to reflect on the moral of the story. They’re captivated instead by the gunplay and machismo. Yet if you’re paying attention, the moral of these stories remains: Unless your parents are criminals, listen to your parents.

Hyperindividualized freak flags became the national uniform

January 3rd, 2020

Nick Gillespie of Reason says, Thank you, Ram Dass!

Ram Dass, the psychedelic pioneer once known as Richard Alpert and notorious for getting fired by Harvard after giving undergraduate students drugs in the early 1960s, died at age 88 a little more than a week ago. So passed one of the figures who helped make postwar America vastly more individualistic, libertarian, weird, and wonderful.

Ram Dass’ journey from being the wealthy, repressed homosexual son of a railroad baron to a conventionally promising academic psychologist to the author of bestselling hippie bibles to the leader of a Hawaii-based New Age community was very public and extreme. But it neatly traces the arc of a square, buttoned-down Organization Man country into a place where hyperindividualized freak flags became the national uniform and the pursuit of spiritual and psychic wisdom became legitimate. Without figures such as Ram Dass — relentless seekers who challenged the boundaries of common decency and bourgeois respectability — we’d all be living in a much duller, grayer world.

Richard Alpert’s father was the president of the New Haven Railroad, and the future Ram Dass grew up rich and entitled. He eventually made his way to Harvard as an assistant professor of psychology, where he crossed paths with Timothy Leary and helped create the “Good Friday experiment,” which catalyzed the nascent psychedelic movement. In short order, he and Leary were kicked out of Harvard and ended up living in a commune in upstate New York where they, along with Ralph Metzner (who himself died earlier this year), published an acid-drenched version of the Tibetan Book of the Dead that inspired the Beatles, among others.

In 1967, Alpert migrated to India and came back to the United States a few years later as Ram Dass. His 1971 book Be Here Now, cheekily dedicated to “the one eye love” and subtitled a “cook book for a sacred life,” helped introduce America to the now-ubiquitous term namaste and other Eastern mystical concepts. He eventually landed in Hawaii and created the Love Serve Remember Foundation.

I’m not sure I’d rate their contributions to society a net positive.

Serving Him in the Real World

January 2nd, 2020

I was not expecting to stumble across an Atlantic video profile on John Correia and Active Self Protection:

“If you can’t be safe, be dangerous.”