The America of the 2020s is not the America of the 1970s

June 6th, 2022

The Invisible Bridge: The Fall of Nixon and the Rise of Reagan, by Rick Perlstein, is the sequel to Nixonland:

Where Nixonland roughly covered 1964-1972, The Invisible Bridge covers 1973-1976. It starts with Watergate, progresses through the bumbling missteps of the Ford administration, and ends with Ford’s narrow defeat of a right-wing insurgency by Ronald Reagan in the 1976 GOP primary.

There’s no throughline at all to this book — no coherent plot. Where Nixonland was a tight, coherent story about liberal rage and right-wing reaction, The Invisible Bridge is a chaotic, meandering tale of exhaustion, confusion, oddity and pointlessness. Which means it’s a book about the 70s. It’s a portrait of a grumpy, bitter country traumatized by social conflict but not yet ready to heal. And as such, it reminds me very much of 2021 and 2022.

Did you know that in 1975, there were two separate assassination attempts against President Gerald Ford in the space of three weeks, both by leftist radicals in northern California? I think I had heard that, but the bizarre reality of those episodes really stands out as the centerpiece of the book. Two wacky lefties tried to kill the President, one after another, and the country basically just shrugged and went on.

Anyway, as in Nixonland, the parallels to the modern day are not exact, but eerie nonetheless. Watergate feels a lot like the coup attempt of January 6, 2021 — an event that horrified people, and kept them glued to their screens for weeks, but where all the action ultimately remained confined to the ranks of the elite. The President being revealed as a crook — and then trying to cover up his criminality by claiming quasi-dictatorial powers, only to be rebuffed by resilient institutions — produced no riots, no mass wave of unrest, no repeat of 1968. Exhaustion had set in by 1973, and people were just kind of relieved to see Nixon go.

Reading this book, it’s possible to see the bumbling, nonthreatening Ford administration as the beginning of a sort of healing process. With a friendly old man in the White House, people could finally afford to tune out politics a little. The lefty radicals were still doing their thing, but they were getting fewer in number and their increasing extremism was turning off more and more Americans. Liberal Dems cruised to a huge midterm victory in 1974 off of Watergate backlash, but managed few legislative victories and ultimately saw their moment pass. Meanwhile, the angry White backlash that had powered Nixon to victory was also losing some of its energy, as White Americans fled the cities for burgeoning suburbs.

There was one group of Americans, however, who felt no exhaustion, and whose activism was just getting started — social conservatives. They wove together a slow-building backlash against libertine sex culture with the remnants of racial resentment, and turned it against abortion and gays. This story really reaches its apotheosis in Perlstein’s next book, Reaganland, but you can see it get its start in The Invisible Bridge. And the Reagan of this book is far from the cuddly, pro-immigration Reagan of the 80s — he’s seen as a genuinely dangerous right-wing radical.

The Invisible Bridge doesn’t tell a coherent story, but it teaches some important lessons about American politics. It suggests that episodes of lefty rage — where progressives expect a better world, don’t get it, and resolve to tear everything down — burn bright and hot but burn out fast. It was only 10 years from the Watts riots to Squeaky Fromme. Meanwhile, conservative America is slower to rouse, but has the stamina to secure gains when everyone else is exhausted.

The America of the 2020s is not the America of the 1970s; much will be different this time around. But many of the fundamental processes at work in that era are still at work today, and understanding them can help shine light on the stuff we see in the news.

Dashed expectations turned to anger

June 5th, 2022

Nixonland: The Rise of a President and the Fracturing of America, by Rick Perlstein, is as authoritative a volume as you’re likely to find on the history of how the unrest of the 1960s began, and how America reacted to it, Noah Smith says:

Perlstein leaves no ambiguity about what touched off the unrest: It was the Watts Riot of August 1965. That event set the stage for the big explosions of rioting in the summer of 1967 and after MLK’s assassination in 1968, both of which saw over a hundred American cities burn.

But the riots didn’t cause the 60s. Instead, Perlstein’s tale makes it clear that the unrest resulted from the confluence of several interrelated trends:

  • Black anger over ghetto conditions in American cities, liberal politicians’ attempts to solve the problem, and rightist backlash against those solutions
  • Cultural liberalization, especially the sexual revolution, among the middle class
  • The Vietnam War and the protests against it

The parallels between then and now are striking and immediately apparent. The widespread hope that the Kennedy/Johnson administration heralded a new era of liberalism in America outpaced reality, much like hope that Obama heralded a post-racial era outpaced reality — even though LBJ pushed through more substantive liberal policy than anyone except FDR, there was just no way even the famed “master of the Senate” could keep up with the wild expectations of the early 60s. And those dashed expectations turned to anger — anger over Vietnam, anger at the police, anger over ghetto conditions, anger at the dominant culture. Much as in the 2010s, dashed liberal expectations turned to anger in the form of BLM protests and riots. 2014 was our 1965, and 2020 was our 1968.

And what’s even more striking is how much the conservative reaction to 1960s liberal rage resembled the Trump era. Conservatives rallied around a leader they felt was reactionary, who would clamp down on urban Black unrest and antiwar hippies alike. Especially striking are Perlstein’s anecdotes about how right-wing counter-protesters felt they were standing up for Nixon personally — similar to the protectiveness MAGA people developed around Trump. (The irony, of course, is that in many regards Nixon governed as a liberal president, creating the EPA and OSHA and proposing universal health care and basic income! The only real similarity with Trump was in his authoritarian, paranoid personality.)

Nixon is, of course, the narrative throughline of this book, but in many ways he was just a symbol for a broader reactionary outpouring that eventually became the conservative movement of the 70s and 80s. That counterrevolution, which Perlstein has made it his life’s work to study, was far more violent, passionate, and downright scary than people realize. Americans were rightfully aghast when they saw Nazi symbols displayed openly at Charlottesville, but few realize how common those same symbols were at right-wing demonstrations in the 60s. People know about the MLK assassination riots, but few today have heard of the Hard Hat Riot. The counterrevolution was not televised.

And of course the culture war that started in the 60s is still with us today. That’s the thesis of Nixonland — which makes it all the more remarkable that the book was published in 2008, before Obama was even elected or Trump was on anyone’s radar.

Americans often say they want community policing

June 4th, 2022

Recent events remind us of Americans’ deep ambivalence and internal contradictions about policing:

Americans often say they want community policing, emphasizing de-escalation and outreach over proactive crime reduction and assertive policing. Many also oppose what they see as the “militarization” of police, rejecting the notion that American law enforcement should procure and train with tools such as sniper rifles and bullet-proof vests, let alone other more specialized equipment.

America in recent years has suffered a wave of anti-policing rhetoric, with the “Ferguson effect” beginning in 2014 and reaching a crescendo in the riots of 2020. Some radicals seek to defund the police altogether.

But when an incident like Uvalde occurs, the public expects members of law enforcement to conduct what even America’s most elite special operations forces consider among the most challenging tactical tasks: a solo dynamic entry, room clearance, and structure search against a heavily armed perpetrator or perpetrators.

And the public is right to ask for this.

But few agencies select officers based on ability and willingness to perform this extremely high-impact/low-probability mission. Few agencies train officers to the high levels of proficiency required. The reality is that most law enforcement agencies require only the minimally mandated firearms qualifications, and at standards that are insufficient to meet the level of the challenge, in the event the worst should happen. Only a select few officers seek outside training and acquire the right tools, often at their own expense, to make themselves ready, lest they be called and found wanting.

Beyond bureaucratic training requirements, the task requires a certain mindset, a comfort with aggression, and a drive not doled out to all people in equal measure.

There are around 700,000 sworn law enforcement officers in the United States. As much as it may pain us to admit it, not all of them will be warriors, a word that is overused in certain circles but nevertheless remains apt. And, of course, police work requires many other interpersonal skills and training, some of which are diametrically opposed to the psychological traits required to storm into a room alone against a determined and heavily armed gunman.

As historian Victor Davis Hanson eloquently writes, America possesses a deep discomfort with those who truly epitomize the combat virtues. While America loves the action hero, we breathe a sigh of relief at the movie’s end not only because the villain has been dispatched, but also because the hero rides away.

If we are honest with ourselves, most Americans don’t want this type of highly capable and dangerous man (and most of them will be men) doing our policing. Not on the good days, when the sun is shining and the birds are chirping.

They were Twentysomethings with a lot of time on their hands and nothing better to do

June 3rd, 2022

An interesting pattern recurs across the careers of great scientists, Dwarkesh Patel notes, an annus mirabilis (miracle year) in which they make multiple, seemingly independent breakthroughs in the span of a single year or two:

Einstein had his annus mirabilis in 1905. While he was still a patent clerk, he wrote four papers that revolutionized our understanding of the photoelectric effect, Brownian motion, special relativity, and mass-energy equivalence.

Newton’s annus mirabilis came to him between 1665 and 1666, when Cambridge responded to the Bubonic plague by sending its students home to quarantine. During that time, Newton, aged 22, developed the theory of gravity along with the language of calculus required to express it.


Many other great scientists — Copernicus, Darwin, von Neumann, and Gauss — also seem to have had an annus mirabilis.

Miracle years happen outside of pure science too. In his memoir, Linus Torvalds talks about how he spent the summer before turning 21 reading an operating systems textbook cover to cover, how later that year he built a terminal emulation program just for fun, and how he spent all his time working on this program until pretty soon it morphed into a full operating system called Linux. That was his annus mirabilis.

Even writers have miracle years. Just recently, the popular fantasy author Brandon Sanderson announced that in the year or two since the pandemic began, he has secretly written five extra novels in addition to the ones his fans knew he was writing.

Perhaps, he suggests, there’s a brief window in a person’s life where he has the intelligence, curiosity, and freedom of youth but also the skills and knowledge of age:

These conditions only coincide at some point in a person’s twenties. It wouldn’t be surprising if the combination of fluid intelligence (which declines steeply after your 20s) and crystallized intelligence (which accumulates slowly up till your 50s and 60s) is highest during this time. Stephan and Levin (1993) find that most Nobel laureates do their prize-winning work in their late 20s or early 30s.

During his miracle year, Einstein was a patent clerk, Newton was a college student dismissed for quarantine, and Darwin was a trust fund kid who had just finished a long voyage aboard the HMS Beagle and still didn’t know what to do with his life. They had no obligations to research some old professor’s hobby horse using his particular technique or paradigm.

Given how many of the great scientific discoveries have come about during miracle years, he argues, we should do everything we can to help smart Twentysomethings have an annus mirabilis:

We should free them from rote menial work, prevent them from being overexposed to the current paradigm, and give them the freedom to explore far-fetched ideas without arbitrary deadlines or time-draining obligations.

It’s depressing that I have just described the opposite of a modern PhD program.

Imagine a person, tall, lean and feline, high-shouldered, with a brow like Shakespeare and a face like Satan

June 1st, 2022

The insidious Dr. Fu-Manchu preferred using “pythons and cobras…fungi and [his] tiny allies, the bacilli…[his] black spiders” and other peculiar animals or natural chemical weapons to kill his enemies:

Imagine a person, tall, lean and feline, high-shouldered, with a brow like Shakespeare and a face like Satan,… Invest him with all the cruel cunning of an entire Eastern race, accumulated in one giant intellect, with all the resources of science past and present… Imagine that awful being, and you have a mental picture of Dr. Fu-Manchu, the Yellow Peril incarnate in one man.

Sax Rohmer, the novelist who created the character, died of the Asian flu in 1959.

Liking it is not a matter of bad taste but of some sort of failure of political and moral sophistication

May 29th, 2022

The crowd, Freddie deBoer reports, has turned from performatively hating David Foster Wallace to performatively hating The Catcher in the Rye:

For the record, I think The Catcher in the Rye is… OK? It’s fine. It’s definitely a book of an earlier era and it felt as such when I read it as a teenager. I was hoping to connect with it on a deep level (uh, not a Mark David Chapman level) the way some adults in my life had, and I didn’t and was kind of bummed out. But it was fine. As is so often the case with these things, there’s a really dumbass reading of the book lurking in the discussion about it, which is that you’re somehow commanded to identify with Holden Caulfield and to want to act like him. This is… not a good interpretation. You certainly can identify with him, but I don’t think that’s suggested very strongly, let alone mandated. As with Fight Club, another boy story for boys about boys being boys, you are invited to empathize with the alienation and loneliness of the main character while recognizing the juvenility and pointlessness of his reaction to it. But, well, now I’m actually engaging with the book, which is more than social media critics of books ever do. They never seem to want to go deeper than saying “TOXIC MASCULINITY” or whatever, which is particularly bizarre here. (Is the idea that Holden Caulfield is supposed to be some sort of symbol of an idealized man? What?) It’s all uselessly Manichean — I know this headline is partially a joke but it makes me wince anyway. The important work is always to say a) this book/author is bad and b) liking it is not a matter of bad taste but of some sort of failure of political and moral sophistication.


Have you never imagined reading a book without wanting it to be a signifier of your entire personality? Do you know how many books I’ve read specifically because I hate the author and their outlook? Or, quelle horreur, you could consider reading a book without knowing what you think about it until you’ve read it! You know, the generative state of being open to forming a summative position based on the gradual aggregation of myriad minor judgments formed along the way? That would seem to be a major part of the point of reading.


It’s a sickness, the assumption that we must always tightly control every last aspect of our self-presentation, no matter how distinct from our true self, because someone on the subway with a $300k education and zero opinions they didn’t steal from podcasts might silently judge us. And as (this philosophy presumes) no one has a durable sense of self worth, being judged by strangers must be terrifying instead of meaningless.

Many have lamented the fact that professional criticism these days is often just a recitation of ways that a work of art does or does not conform to the childish moral calculus of “social justice.” And mountains of worthless reviews and recaps have been produced under these terms. But it’s important to say that this tendency is not solely or even mainly the product of ideological discipline and the desire to evangelize. Rather it stems from insecurity about one’s own subjective opinions. People who don’t trust that they are sophisticated readers or cinephiles or whatever gravitate towards tedious political checklisting because those political claims seem more transcendent and defensible and real than their own claims of taste. But this fundamentally mistakes the purpose of a review, and it’s very hard to understand why someone who is so afraid of standing by their own opinion would think to write one.


And it must always be remembered that, not that long ago, most media elites were not woke, but rather sneering neoliberals who mocked leftists as losers; the fact that media culture turned on a dime to embrace social justice fads makes it a certainty that, when that politics goes out of fashion in the coming decade, the media will flip flop right over again. No, the problem with media culture is not the politics but rather where those politics come from — not just from elite colleges or privileged childhoods lived in affluence, but from insecurity.

For the record, I found The Catcher in the Rye phony and lousy.

I haven’t read any of David Foster Wallace’s novels, but I do keep going back to The String Theory.

David Brooks points out that in The Sum of Small Things, Elizabeth Currid-Halkett argues that the educated class establishes class barriers not through material consumption and wealth display but by establishing practices that can be accessed only by those who possess rarefied information:

To feel at home in opportunity-rich areas, you’ve got to understand the right barre techniques, sport the right baby carrier, have the right podcast, food truck, tea, wine and Pilates tastes, not to mention possess the right attitudes about David Foster Wallace, child-rearing, gender norms and intersectionality.

Serial killing was something of a social contagion

May 26th, 2022

With mass shootings in the news, Steve Sailer wanted to point out that not all bad things are destined to increase forever:

For instance, according to the Radford University Database of known serial killers, the number of serial killers soared during what Robert Heinlein predicted c. 1940 would be known as the Crazy Years (1960s-1970s) before declining more recently.

Rise and Fall of Serial Killers

It appears that the idea of serial killing was something of a social contagion that spread first among whites, then among nonwhites. I wouldn’t be surprised if Hitchcock’s hugely influential 1960 movie Psycho, often thought of as the founder of the “slasher pic” genre, played a role in this real-life phenomenon, although how to measure that is beyond me.

It’s also hard to say what caused the decline over the last generation. It could be that serial killing became less appealing to the handful of sickos attracted to doing it.

Or it could be that fear of being caught increased. According to Bill James, cops were long particularly bad at catching serial killers because they’d been trained not to fall for the idea that somebody was murdered by a random stranger: instead, it had to be somebody who knew the victim, an ex-boyfriend or the like. So if they had five dead women on their hands, they tended to look for five separate killers. This had been a fairly productive prejudice, since it kept them from going down the wrong path most of the time. But the huge publicity attendant to Ted Bundy c. 1980 forced cops to get serious about the serial killer phenomenon.

When fed plasmalogens, aged mice perform more like young mice

May 25th, 2022

Researchers from Xi’an Jiaotong-Liverpool University, Stanford University, Shanghai Jiao Tong University, and the University of Chinese Academy of Sciences report that plasmalogens (found in sea squirts) reverse some signs of aging — in mice:

The effects of the plasmalogen supplement on learning and memory were tested by training mice to use a Morris water maze — a pool of water that contains a platform that serves as a resting area. Generally, mice do not like to swim, so over five days of training, they remember where the platform is and swim directly to it as soon as they are in the pool. However, older mice take longer to find the platform after the same amount of training.

Astonishingly, when fed with plasmalogens, aged mice perform more like young mice, finding the platform much quicker than the control group of aged mice that have not been given the supplement.

To find the reason for the improvement shown by plasmalogen-fed mice, the researchers took a closer look at changes happening within the brain. They found that mice that were fed the plasmalogen supplement had a higher number and quality of synapses—the connections between neurons—than the aged mice not given the supplements.

Electric vehicles can generate electricity while carrying loads downhill

May 24th, 2022

Under the right conditions — going far enough downhill at enough of an angle with a heavy load — electric vehicles can generate a useful amount of energy:

Miauton’s company manufactures the eDumper, a 65-ton dump truck that’s said to be the world’s largest electric vehicle. Its diesel engine and fuel tank have been replaced with electric motors, batteries and cooling machinery, and it’s now working at a quarry near Biel in Switzerland, hauling 70-ton loads of lime and rocks down a mountainside.

Thanks to the expense of the high-tech systems, an eDumper costs about twice as much as a diesel-powered truck. But it never needs any fuel — a savings of between 11,000 and 22,000 gallons of diesel a year, along with its carbon emissions — and it almost never needs recharging. Test drives show it generates about as much electricity going down as it uses going up. Miauton said the company is now making three more eDumpers for mines in Germany, and it has plans for even larger electric dump trucks.
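A back-of-the-envelope sketch helps show why the loaded descent can more than cover the empty climb: the truck descends heavy and climbs light, so gravity does net work in its favor. The 400 m drop and the efficiency figures below are assumptions for illustration, not numbers from the article:

```python
# Rough energy balance for a regenerative-braking dump truck.
# Assumed figures: 65 t truck + 70 t payload on the way down,
# 65 t empty truck on the way up, 400 m of elevation change,
# 75% recovery efficiency and 90% drivetrain efficiency.

G = 9.81  # gravitational acceleration, m/s^2

def recovered_kwh(mass_kg: float, drop_m: float, efficiency: float) -> float:
    """Potential energy released on the descent, converted to kWh."""
    joules = mass_kg * G * drop_m * efficiency
    return joules / 3.6e6  # 1 kWh = 3.6e6 J

# Loaded descent: truck plus payload.
down = recovered_kwh(135_000, 400, 0.75)

# Empty climb: only the truck goes back up; dividing by drivetrain
# efficiency gives the battery energy actually spent.
up = (65_000 * G * 400 / 3.6e6) / 0.90

print(f"recovered going down: {down:.0f} kWh, spent going up: {up:.0f} kWh")
```

With these assumed numbers the descent recovers roughly 110 kWh against roughly 79 kWh spent climbing, which is consistent with the claim that the truck almost never needs recharging: the payload asymmetry is the whole trick.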

The concept of making electricity on a downhill run will soon get an even bigger boost. The Australian mining company Fortescue, a major producer of iron ore, announced in March that it will build “Infinity Trains” to generate electricity while carrying loads of ore from mines in the Outback.

The company currently runs 16 trains in Western Australia driven by 54 locomotives that use a total of around 20 million gallons of diesel fuel every year. Each train has up to 244 cars. They can be almost two miles long and carry more than 37,000 tons of ore.

Fortescue chief executive Elizabeth Gaines said four routes from mines in the inland Pilbara region are sufficiently uphill of their final destination — Port Hedland on the northern coast — that they’re suitable for Infinity Trains. The company plans to have them working on all four routes before 2030 by developing the dynamic braking feature many locomotives already have to convert gravity into electricity, she said in an email. Some routes will generate even more energy than they need for the return trip, and the company will use the extra electricity elsewhere in its operations.

The most innovative proposal for making electricity from gravity may be electric truck hydropower. According to a study published in March, a fleet of electric trucks filled with water high in the mountains can generate electricity as they travel downhill on regular roads. The empty trucks can then drive back for more water, or be used elsewhere.

Study lead author Julian Hunt, a Brazil-based researcher with the International Institute for Applied Systems Analysis, said the system is about as cost-effective for generating electricity as wind, solar and regular hydropower.

She is one of the handful of books that Tolkien explicitly acknowledges as an influence

May 22nd, 2022

It is worth remembering that Tolkien was not simply channelling Beowulf, the Eddas, and the Kalevala in his creative work, a Phuulish fellow notes, but that he was also interacting with more recent material, like H. Rider Haggard’s adventure novels:

By good fortune, She is one of the handful of books that Tolkien explicitly acknowledges as an influence. In a 1966 interview with Henry Resnick, Tolkien remarked:

I suppose as a boy She interested me as much as anything — like the Greek shard of Amyntas [Amenartas], which was the kind of machine by which everything got moving.

The shard of Amenartas is a purported ancient text, included by Rider Haggard as a means of providing some exposition to the story. Well and good. It is the incident that incites the adventure. But the shard is no ordinary ancient text, at least in terms of presentation. Rider Haggard gives facsimiles of the fragment, in actual Greek.


Don’t worry. Rider Haggard helpfully transcribes and translates the text. But the sheer effort the author went to, in terms of making the artefact look real and believable is noteworthy. It rather recalls the One Ring inscription, and the inscription on Balin’s Tomb, not to mention in-universe Tolkienian texts like The Book of Mazarbul and Thror’s Map. In terms of actual historical exposition, there is also a decent comparison between Rider Haggard’s protagonists puzzling out the Shard, and Gandalf learning about the Ring via the forgotten Scroll of Isildur in the archives of Minas Tirith.

(Yes, I am aware that Rider Haggard did not invent this trope. Jules Verne provides a runic manuscript in A Journey to the Centre of the Earth (1871). But Tolkien cites Rider Haggard, not Verne).

Perhaps the single cheekiest Tolkienian shout-out to Rider Haggard is the city of Kôr. In She, the city of Kôr is an ancient ruined city, so ancient that it was already long abandoned when Ayesha turned up, thousands of years before the narrative begins. Kôr predates the Egyptians, in terms of antiquity, and it adds some glorious atmosphere to the setting.

It may therefore interest you to know that Kôr was the original name of the great Noldorin city, Tirion upon Túna. The home of Finwë, Fëanor, et al. Moreover, in Tolkien’s initial conception – found in The Book of Lost Tales – the city ends up abandoned. An early Tolkienian poem, titled Kôr: In a City Lost and Dead, describes the scene, after the Elves have left it.

Sure to be aggressive abroad and despotic at home

May 21st, 2022

In 1866, long before he famously stated that “power tends to corrupt, and absolute power corrupts absolutely,” Lord Acton, an English Catholic, wrote to Robert E. Lee, the former Confederate General:

Without presuming to decide the purely legal question, on which it seems evident to me from Madison’s and Hamilton’s papers that the Fathers of the Constitution were not agreed, I saw in State Rights the only availing check upon the absolutism of the sovereign will, and secession filled me with hope, not as the destruction but as the redemption of Democracy. The institutions of your Republic have not exercised on the old world the salutary and liberating influence which ought to have belonged to them, by reason of those defects and abuses of principle which the Confederate Constitution was expressly and wisely calculated to remedy. I believed that the example of that great Reform would have blessed all the races of mankind by establishing true freedom purged of the native dangers and disorders of Republics. Therefore I deemed that you were fighting the battles of our liberty, our progress, and our civilization; and I mourn for the stake which was lost at Richmond more deeply than I rejoice over that which was saved at Waterloo.

Lee’s response includes his own defense of States’ Rights:

I am conscious of the compliment conveyed in your request for my opinion as to the light in which American politics should be viewed, and had I the ability, I have not the time to enter upon a discussion, which was commenced by the founders of the constitution and has been continued to the present day. I can only say that while I have considered the preservation of the constitutional power of the General Government to be the foundation of our peace and safety at home and abroad, I yet believe that the maintenance of the rights and authority reserved to the states and to the people, not only essential to the adjustment and balance of the general system, but the safeguard to the continuance of a free government. I consider it as the chief source of stability to our political system, whereas the consolidation of the states into one vast republic, sure to be aggressive abroad and despotic at home, will be the certain precursor of that ruin which has overwhelmed all those that have preceded it. I need not refer one so well acquainted as you are with American history, to the State papers of Washington and Jefferson, the representatives of the federal and democratic parties, denouncing consolidation and centralization of power, as tending to the subversion of State Governments, and to despotism.

The New England states, whose citizens are the fiercest opponents of the Southern states, did not always avow the opinions they now advocate. Upon the purchase of Louisiana by Mr. Jefferson, they virtually asserted the right of secession through their prominent men; and in the convention which assembled at Hartford in 1814, they threatened the disruption of the Union unless the war should be discontinued. The assertion of this right has been repeatedly made by their politicians when their party was weak, and Massachusetts, the leading state in hostility to the South, declares in the preamble to her constitution, that the people of that commonwealth “have the sole and exclusive right of governing themselves as a free sovereign and independent state, and do, and forever hereafter shall, exercise and enjoy every power, jurisdiction, and right which is not, or may hereafter be by them expressly delegated to the United States of America in congress assembled.”

Such has been in substance the language of other State governments, and such the doctrine advocated by the leading men of the country for the last seventy years. Judge Chase, the present Chief Justice of the U.S., as late as 1850, is reported to have stated in the Senate, of which he was a member, that he “knew of no remedy in case of the refusal of a state to perform its stipulations,” thereby acknowledging the sovereignty and independence of state action. But I will not weary you with this unprofitable discussion.

There has been a marked shift in public interest from the collective to the individual, and from rationality toward emotion

May 20th, 2022

The surge of post-truth political argumentation suggests that we are living in a special historical period, Marten Scheffer et al. suggest, when it comes to the balance between emotion and reasoning:

To explore if this is indeed the case, we analyze language in millions of books covering the period from 1850 to 2019 represented in Google nGram data. We show that the use of words associated with rationality, such as “determine” and “conclusion,” rose systematically after 1850, while words related to human experience such as “feel” and “believe” declined. This pattern reversed over the past decades, paralleled by a shift from a collectivistic to an individualistic focus as reflected, among other things, by the ratio of singular to plural pronouns such as “I”/”we” and “he”/”they.” Interpreting this synchronous sea change in book language remains challenging. However, as we show, the nature of this reversal occurs in fiction as well as nonfiction. Moreover, the pattern of change in the ratio between sentiment and rationality flag words since 1850 also occurs in New York Times articles, suggesting that it is not an artifact of the book corpora we analyzed. Finally, we show that word trends in books parallel trends in corresponding Google search terms, supporting the idea that changes in book language do in part reflect changes in interest. All in all, our results suggest that over the past decades, there has been a marked shift in public interest from the collective to the individual, and from rationality toward emotion.

The authors blame the change on the failure of “neo-liberalism,” which Alex Tabarrok finds dubious and without a plausible mechanism:

A more plausible explanation is more female writers and the closely related feminization of culture.
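The measurement at the heart of the paper is simple to sketch. Here is a minimal, hypothetical version of the flag-word ratio, using the example words from the abstract but made-up frequencies; the real analysis runs over the Google nGram corpus with much larger word lists:

```python
# Sketch of the flag-word ratio described in the abstract. The word lists
# come from the abstract's examples; the frequencies below are illustrative
# stand-ins, not the paper's actual data.
RATIONALITY_WORDS = {"determine", "conclusion"}  # "rationality" flag words
SENTIMENT_WORDS = {"feel", "believe"}            # "human experience" flag words

def rationality_ratio(freqs: dict) -> float:
    """freqs maps a word to its relative frequency in one year's corpus."""
    rational = sum(freqs.get(w, 0.0) for w in RATIONALITY_WORDS)
    sentiment = sum(freqs.get(w, 0.0) for w in SENTIMENT_WORDS)
    return rational / sentiment if sentiment else float("inf")

# A rising ratio over time matches the post-1850 trend the authors
# describe; a falling ratio matches the reversal of recent decades.
year_1900 = {"determine": 4e-5, "conclusion": 3e-5, "feel": 2e-5, "believe": 3e-5}
year_2019 = {"determine": 2e-5, "conclusion": 1e-5, "feel": 5e-5, "believe": 4e-5}
print(rationality_ratio(year_1900))  # rationality-heavy language
print(rationality_ratio(year_2019))  # sentiment-heavy language
```

The paper's pronoun measure works the same way, with singular pronouns (“I,” “he”) in the numerator and plural pronouns (“we,” “they”) in the denominator.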

Spengler was not so humble

May 19th, 2022

It is easy to pick out the most significant figures of ancient history — say, Socrates or the Buddha — and pronounce that they were figures of comparable historical weight, T. Greer suggests, but how do you pick out which of your contemporaries deserve that honor?

One day a few men of your generation may be vindicated by history. But that history has not happened yet. Humility demands that we decline to declare what only time can prove.

Spengler was not so humble. He repeatedly describes Tolstoy (d. 1910), Ibsen (d. 1906), Nietzsche (d. 1900), Hertz (d. 1894), Dostoevsky (d. 1881), Marx (d. 1883), and Maxwell (d. 1879) as figures of defining “world-historical” importance: in other words, as working on the same plane as Plato, Archimedes, Ovid, Shakespeare, and Newton. He does not argue their merits; to him it is obvious that these are the men who deserve to be thought of as “world-historical” figures, and it is clear from the way he makes his arguments that he expects that his own readers already agree with him.

Ponder that! Spengler began writing Decline of the West in 1914. Tolstoy was only four years dead when Spengler started his book; Marx was only 30 years deceased. But Spengler could state, with the full expectation that his audience would not question him, that these men belonged in the global pantheon of humanity’s greatest figures. But Spengler was hardly alone in this sort of judgement. Ten years later John Erskine would teach his course on the great works of the Western tradition—which was the granddaddy of the Columbia Common Core, the St. John’s curriculum, and the Great Books of the Western World series—and it included all of the names mentioned above as well. To this Erskine would add the names William James, Sigmund Freud, Thomas Hardy, and Charles Darwin.

Now Erskine’s list is not perfect; it has not perfectly weathered the centuries. The fame of William James has sunk with time; today we usually think of Joseph Conrad, not Thomas Hardy, as the supreme English novelist of that era. But the broader point holds: only a decade or two after these men’s deaths intellectuals confidently spoke of them in the same breath as Shakespeare and Plato. And not just subjectively, in the sense we might today (“I think Ursula Le Guin is as good as Shakespeare” or “I think Hayek is better than Plato”) but with full knowledge that the broader public already knew that these people and their works belonged on the list. It was obvious to even those who disliked Nietzsche that he was a seminal figure in Western thought; it was obvious even to those who disagreed with Ibsen that he claimed a similar place in Western literature, and so forth. Their ideas might be argued against, but their genius and their influence were undeniable.

Is there anyone who died in the last decade you could make that sort of claim for?

How about for the last two decades?

The last three?

Or is there anyone at all who is still living today that might be described this way?

In the realm of science, perhaps. But in the world of social, historical, ethical, and political thought, no one comes to mind. Most “great books” curricula stop right around World War II and its immediate aftermath. St. John’s recently added Wittgenstein and de Beauvoir to their curricula, but their works are almost 70 years old. Michel Foucault is the next obvious candidate, and he died 37 years ago.

They took their own accent, the California accent, and ramped it up

May 18th, 2022

Pop-punk was created in the late 1980s and early 1990s at 924 Gilman Street in Berkeley, an all-ages venue normally referred to as “Gilman”:

This is where Bay Area bands like Rancid, Operation Ivy, the Mr. T Experience, and, especially, Green Day all started to get attention. Bay Area pop-punk is a kinder, gentler variety than either the nihilist Londoners or the hardcore California bands like the Circle Jerks and the Dead Kennedys that preceded them. The Gilman bands obviously worshipped the Clash, whose songs showed more craft, hooky melodies, and subtlety than, say, the Sex Pistols. Some of the bands, like Rancid, were responsible for amping up the Clash’s combination of punk and reggae into what’s now called the Third Wave of ska music.

The Bay Area community was goofier, sillier, more suburban, and more inclined to make happy, poppy music than any punk community that came before it. As an ode to the Clash, a lot of their singers adopted a sort of faux-British accent. “I’m an American guy faking an English accent faking an American accent,” Green Day lead singer Billie Joe Armstrong told Rolling Stone in 1994. Tim Armstrong, the (unrelated) lead singer of fellow Bay Area band Rancid, sings with an accent that varies song by song; sometimes it’s nearly featureless, other times it’s a Strummer-esque Brit inflection, other times it sounds nearly New York.

The pop-punk accent really became smooth and polished a little bit later, in the mid-1990s, with bands like Blink-182 and the Offspring, both hailing from Southern California. Their singers (Mark Hoppus and Tom DeLonge from Blink-182, Dexter Holland from the Offspring) totally abandoned any pretenses of Britishness. Instead they took their own accent, the California accent, and ramped it up, pushed it to new extremes. It was almost exactly what happened in London. Pop-punk singers became more Californian than the Californians.

Penelope Eckert, a linguistics professor at Stanford, is one of the foremost scholars examining what’s known as the “California Shift.” The California Shift is a linguistic theory covering the particular changes in dialect that affect the Pacific coast of the United States. Eckert was nice enough to humor me and listen several times to a song I chose based on its particularly egregious “pop-punk voice,” Blink-182’s “First Date.” I love the song, but am aware others may find it horribly annoying. “It really does sound like someone’s messing around,” she told me.

A key change in the California Shift is what’s called the cot/caught merger. Northeasterners and Midwesterners pronounce those words differently, giving the former an “ah” sound and the latter an “aw” sound. “Californians do not,” says Eckert, who is originally from New York. “They have no idea. That vowel is almost completely merged. Think ‘mawwm’ instead of ‘mom.’”

Vowel sounds work like those sliding puzzle games where you have to unscramble a picture by sliding one piece of it at a time. As soon as you move one piece, you’re left with an empty space behind it, which has to be filled by something else. Californians dropped the “cot” vowel sound, pronouncing it like “caught” instead. So something had to fill that space. “The California Shift is this kind of combined change in the pronunciation of short vowels,” says Kennedy. The easiest way to think about it? Look at the words kit, dress, and trap. In the California Shift, “kit” becomes “ket”, “dress” becomes “drass”, and “trap” becomes “trop”.

Linguists talk about this shift in terms of directions; to talk with a California accent is sometimes called “trap-backing,” or “trop-bocking.” Your mouth functions like a resonating chamber. You can alter the frequencies of the sounds you make by changing the size of the chamber and by moving your tongue around. Your tongue’s placement is a major factor in dialects; it can be raised, lowered, moved to the front, or moved to the back. Californians move their tongues back, hence “trop-bocking.”

But there are some more complex things going on in the pop-punk voice. Eckert walked me through the Blink-182 song word by word, pointing out places where DeLonge was playing around with accent. “When they say ‘to pick you up on our very first date,’ the interesting thing about ‘date’ is that he renders it as a monophthong ‘dehhht’ instead of ‘date,’” says Eckert. “In most American English it’s a diphthong.” A diphthong is a vowel sound with two simpler sounds in it; for most Americans, “date” is a kind of compound vowel made up of the “eh” sound and the “ee” sound. Not so much for Tom DeLonge, who eliminates all but the “eh,” making it a single sound, or a monophthong.

The monophthong “date” surprised Eckert, as she says it’s not part of the California Shift. Except! “I’ve heard that some in Chicano English, but not so much in Anglo English,” she says. Chicano English is spoken by native English speakers of Mexican descent—it’s not a Mexican accent, because Chicano English speakers are native English speakers, but sort of their own English dialect. And that goes along with one of DeLonge’s most obvious vocal tics: changing short “ih” sounds as in the word “think” to a long “ee” sound, turning it into something like “theenk.” “Chicano English raises the vowel I to ‘ee’ before nasal consonants,” says Eckert. “So ‘theenk’ is very Chicano. And you have a lot of Anglo wannabes saying that too.”

Another very distinctive element of the California accent that’s extremely present in DeLonge’s vocals is the long “oo” sound in words like “room,” which DeLonge pronounces as something more like “rehm.” That’s almost an efficiency move; the particular combination of shapes your lips have to make to move from the first consonant, R, to the last consonant, M, plus the moves your tongue has to make to form the “oo” sound, are pretty difficult. If you move your tongue closer to your front teeth, it’s a lot less work, but you’ll change the pronunciation of “room” to “rehm.” It’s called “oo-fronting.”

There are plenty more things Eckert taught me about DeLonge’s delicious accent, but one last example would be the way Californians pronounce the letter R in certain words. In a word where the stress falls on a vowel one syllable before a word ending in R, like “whatever” or “over,” most of the country, but most noticeably those in the New York/New Jersey area, stress the consonant in the second-to-last syllable extra hard. But Californians lengthen the R. So a New Yorker will say “whatevah,” but a Californian will say “whateverrrr.” “We talk about New York/New Jersey accents as being ‘R-less’ and California accents as being ‘R-ful,’” laughed Eckert. (Linguistics jokes are pretty good.)

DeLonge does some weird, non-Californian stuff, though. His pronunciation of words like “light” and “spider” comes out somewhere between the vowel sound from “rye” and “roy.” “His pronunciation of it is striking, and different from Californians generally,” says Kennedy. “It may be another attempt at projecting British punk vocals, but if I recall correctly he does this in speech as well, and so it might actually be a skate/surf/punk subculture linguistic feature.”
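The chain shift Eckert and Kennedy describe can be caricatured as a lookup table. This is a deliberately crude sketch, since real vowel shifts are gradient changes in articulation rather than one-for-one respellings:

```python
# Crude caricature of the California Shift as a chain of substitutions:
# each vowel slides into the space vacated by the next, like the sliding
# puzzle described above. Respellings follow the article's examples.
CALIFORNIA_SHIFT = {
    "cot": "caught",   # cot/caught merger that opened the first gap
    "trap": "trop",    # TRAP backs toward the old "cot" space ("trap-backing")
    "dress": "drass",  # DRESS lowers toward TRAP
    "kit": "ket",      # KIT lowers toward DRESS
}

def shift(word: str) -> str:
    """Return the caricatured California pronunciation of a keyword."""
    return CALIFORNIA_SHIFT.get(word, word)

print([shift(w) for w in ["kit", "dress", "trap"]])  # ['ket', 'drass', 'trop']
```

The dict ordering mirrors the causal story in the article: the merger opens a gap, and each subsequent vowel moves to fill the gap left by the one before it.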

When your smoke alarm goes off you don’t have time to look around

May 17th, 2022

Research shows that 30 years ago, you had about 17 minutes to escape a house fire:

Today it’s down to three or four minutes. The reason: Newer homes and the furniture inside them actually burn faster. A lot faster.

“The backing of your carpet is synthetic, your drapes are synthetic, the couch, the pillows are synthetic,” explained John Drengenberg, consumer safety director for UL. “They burn hotter and faster than natural materials do.”

A similar fire set to the sofa pillow in the room simulating an older home burned for several minutes without even catching the rest of the sofa. At 15 minutes the room was still intact; it wound up taking 30 minutes for the room to burn.

“When your smoke alarm goes off you don’t have time to look around, get your wedding pictures,” Drengenberg said. “You get out as quickly as you can.”