The ANC destroyed South Africa

Monday, November 15th, 2021

The African National Congress has destroyed South Africa — and failed to gain a majority of the vote:

The party is, despite its manifest failings, still custodian of the liberator’s mantle among many black South Africans — a recent survey showed that although 60% of ANC voters associated their party with corruption, they would nonetheless vote for it; such is the brand loyalty — but the party’s once hegemonic power is in retreat. The decline over the years is neatly in tandem with the nation’s trajectory towards a failed state. At its peak in 2004, the ANC pulled nearly 70% of the national vote. This week, it could barely pull past 46%.

[…]

President Nelson Mandela’s post-liberation administration winged it for five years on the back of public euphoria about the Rainbow Nation and the administrative sinews left by the departed apartheid state.

Then President Thabo Mbeki, his successor, sought to impose a sere, technocratic and welfarist vision on his realm, drawn directly from his experiences in Left-wing UK universities. Problem was that while he taught the newly enfranchised all about their rights as modern citizens, he somehow did not get around to talking about their duties. As a result, a boundless sense of entitlement has become an irreducible, damaging and informing fact of South African life, killing initiative and personal agency. Meanwhile, the technocrats who could give content to Mbeki’s vision were leaving state service in droves: victims of his racial affirmative policies.

After him came Jacob Zuma, former head of intelligence of the ANC’s military in exile, the army that somehow managed to wage a decades-long war during the apartheid years that few South Africans ever noticed. His cronies came into government trailing the odour of the Angolan military camps: the paranoia, secrecy, expedience, manipulation, fear, brutality, corruption and hopelessness.

It is estimated that during his eight-year term, Zuma benignly presided over the embezzlement of between R400 billion and R1.5 trillion of public money (1 GBP = R21) by a coterie of crooks gathered around his presidency, and by others appointed to the State services under the guise of affirmative action and “cadre deployment” (yes, they still speak like that).

[…]

Last month, the World Bank ranked South Africa’s once excellent ports at the bottom of the 351 ports surveyed and the Universal Postal Union conveyed the warming news that the South African postal service is now officially worse than Nigeria’s.

[…]

For decades now these informal cantons have become ever more self-sufficient: they have private police, hospitals, schools and an army of fixers to mediate between them and a truly appalling bureaucracy. So-called Public-Private Partnerships control large public business and tourist spaces, property developers build public roads, private companies manage water reticulation and major road routes are maintained by private enterprise.

Recent Government policy allows for Independent Power Producers: energy self-sufficiency is now within the grasp of these localised and internally expatriated communities.

Voters tend not to be impressed by either strategy

Sunday, November 14th, 2021

Rather than outline one or two serious national problems that they proposed to take on, the Democrats projected an amount of money to spend, Yuval Levin says, and then stuffed everything that every Democratic interest group desired into one package until they reached that number:

They never gave the public any sense of what mattered to them. And the internal debates about the scope and contents of the package almost all involved arguments about its overall size — about how much to spend and tax rather than what to do or how to do it.

This is just one example of a broader failure to prioritize that is endemic to our politics now. Neither party can quite explain what it wants, except to keep the other party from power. That problem is vastly overdetermined, but three reasons for it do stand out among the rest.

[...]

Throughout his career, Joe Biden has tried to position himself near the center of the Democratic coalition and be a kind of generic Democrat. This is not a bad strategy for a senator with a safe seat, and it obviously worked for him. But it’s not as good a strategy for a president with an internally divided party. A president’s strength as an executive can often be measured by whether his mid-level political appointees know what he would do in their place — whether an assistant secretary in one department or another can say “If the president had my job, I know how he would make the decision I’m now facing.” This was obviously impossible on most issues in the Trump era, since President Trump’s implacable ignorance, pathological amorality, and blinding narcissism made him reactive and unpredictable. This was part of why he was such a weak president and achieved so little that will endure. But it is also practically impossible in the Biden era, because President Biden has generally refused to identify himself with any side of any dispute within the Democratic coalition. Given his history, he would seem to represent the more moderate wing of the party, but that’s not really evident in anything his administration has done, or any role he has played in any legislative process. It’s hard to say what he wants, so he isn’t helping his party tell the public what it wants either.

[...]

The habits of polarization, which have evolved over the past generation in Washington, involve party leaders in Congress asserting themselves rather than party factions negotiating. This helps the parties confront one another more starkly, but it doesn’t help the parties negotiate internal differences. Leaders in this polarized era want to mask and submerge internal divisions, rather than to work them out, and that makes bargaining within each party pretty difficult, as both parties have learned when they have held power. The Democrats tend to respond to this problem by proposing to do everything at once — stuffing every idea they’ve ever had into one big bill. Republicans tend to respond to the same problem by proposing to do nothing — just simply nothing whatsoever. That is basically what Republicans ran on in 2020, for instance. Voters tend not to be impressed by either strategy. And this problem will only become more serious as the internal differences within the parties grow.

[...]

Both parties are changing as the American elite is changing, and a lot of their internal fractures look like tensions between their past and their future. The Democrats are gradually taking the shape of something like a fun-house mirror version of the Eisenhower coalition — upscale whites plus many black voters. (Obviously the black vote was much more divided at mid-century than now, and it was also much more suppressed by Southern racism, but those were key elements of the self-understanding of Eisenhower’s coalition.) Republicans are gradually taking the shape of a fun-house mirror version of the FDR coalition — blue-collar whites and some blue-collar ethnic minorities who will eventually be considered white. (The latter described some ethnic European Catholic minorities for FDR; it describes some Hispanic voters for today’s GOP). Both analogies are lacking, to be sure, but they suggest something about the general course of things.

[...]

The key economic-policy battleground of the immediate future is likely to be the challenge of rising living costs, and if the BBB legislation is any sign, Democrats are not well equipped to fight on that front. They remain committed to addressing high costs through a combination of subsidizing demand and restricting supply. This is essentially the left’s approach to health care, higher education, housing, and now (in this new bill) child-care. Increased demand and reduced supply is, broadly speaking, a recipe for higher prices and therefore higher costs. If the new swing voters are suburban parents, a program that risks drastic increases in child-care costs is a way to lose the future.

Arnold Kling is tempted to write “subsidize demand and restrict supply™,” since he introduced the phrase in Specialization and Trade.
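The mechanism is simple enough to put in a toy model. The sketch below is mine, not Levin’s or Kling’s, and uses made-up linear supply and demand curves: shift demand out (the subsidy) and shift supply in (the restriction), and the equilibrium price has to rise, whatever happens to quantity.

```python
# Illustrative only: a toy linear model with made-up numbers, showing why
# subsidizing demand while restricting supply pushes the price up.

def equilibrium(d_intercept, d_slope, s_intercept, s_slope):
    """Solve d_intercept - d_slope*p = s_intercept + s_slope*p for price and quantity."""
    price = (d_intercept - s_intercept) / (d_slope + s_slope)
    quantity = d_intercept - d_slope * price
    return price, quantity

# Baseline market (hypothetical units).
p0, q0 = equilibrium(d_intercept=100, d_slope=1.0, s_intercept=20, s_slope=1.0)

# Demand subsidy: buyers demand more at every price (demand curve shifts out).
# Supply restriction: sellers offer less at every price (supply curve shifts in).
p1, q1 = equilibrium(d_intercept=120, d_slope=1.0, s_intercept=5, s_slope=1.0)

print(f"baseline:              price={p0:.1f}, quantity={q0:.1f}")
print(f"subsidy + restriction: price={p1:.1f}, quantity={q1:.1f}")
```

With these particular numbers the price rises from 40 to 57.5; the figures are arbitrary, but the direction isn’t.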

You don’t become beautiful by signing up with a modeling agency

Saturday, November 13th, 2021

Social scientists distinguish between what are known as treatment effects and selection effects:

The Marine Corps, for instance, is largely a treatment-effect institution. It doesn’t have an enormous admissions office grading applicants along four separate dimensions of toughness and intelligence. It’s confident that the experience of undergoing Marine Corps basic training will turn you into a formidable soldier. A modelling agency, by contrast, is a selection-effect institution. You don’t become beautiful by signing up with an agency. You get signed up by an agency because you’re beautiful.

At the heart of the American obsession with the Ivy League is the belief that schools like Harvard provide the social and intellectual equivalent of Marine Corps basic training—that being taught by all those brilliant professors and meeting all those other motivated students and getting a degree with that powerful name on it will confer advantages that no local state university can provide. Fuelling the treatment-effect idea are studies showing that if you take two students with the same S.A.T. scores and grades, one of whom goes to a school like Harvard and one of whom goes to a less selective college, the Ivy Leaguer will make far more money ten or twenty years down the road.

The extraordinary emphasis the Ivy League places on admissions policies, though, makes it seem more like a modelling agency than like the Marine Corps, and, sure enough, the studies based on those two apparently equivalent students turn out to be flawed. How do we know that two students who have the same S.A.T. scores and grades really are equivalent? It’s quite possible that the student who goes to Harvard is more ambitious and energetic and personable than the student who wasn’t let in, and that those same intangibles are what account for his better career success. To assess the effect of the Ivies, it makes more sense to compare the student who got into a top school with the student who got into that same school but chose to go to a less selective one. Three years ago, the economists Alan Krueger and Stacy Dale published just such a study. And they found that when you compare apples and apples the income bonus from selective schools disappears.

“As a hypothetical example, take the University of Pennsylvania and Penn State, which are two schools a lot of students choose between,” Krueger said. “One is Ivy, one is a state school. Penn is much more highly selective. If you compare the students who go to those two schools, the ones who go to Penn have higher incomes. But let’s look at those who got into both types of schools, some of whom chose Penn and some of whom chose Penn State. Within that set it doesn’t seem to matter whether you go to the more selective school. Now, you would think that the more ambitious student is the one who would choose to go to Penn, and the ones choosing to go to Penn State might be a little less confident in their abilities or have a little lower family income, and both of those factors would point to people doing worse later on. But they don’t.”

(I’ve cited Malcolm Gladwell’s Getting In before.)
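Krueger’s apples-to-apples point is easy to reproduce with a toy simulation. The sketch below is my own illustration, not Krueger and Dale’s method, and every parameter in it is made up: when an unobserved trait like ambition drives both admission and later income, a naive comparison of attendees with everyone else manufactures an “Ivy premium” even when the true effect of attending is zero, while comparing only students admitted to the selective school (whether or not they attend) makes the premium vanish.

```python
# Toy simulation with made-up parameters: selection effects can create an
# "Ivy premium" even when the true effect of attending is zero.
import random

random.seed(0)

def simulate(n=100_000, true_treatment_effect=0.0):
    attend_inc, other_inc = [], []          # naive comparison groups
    adm_attend_inc, adm_other_inc = [], []  # apples-to-apples: admitted students only
    for _ in range(n):
        ambition = random.gauss(0, 1)                    # unobserved trait
        admitted = ambition + random.gauss(0, 1) > 1.0   # selective school admits the ambitious
        attends = admitted and random.random() < 0.7     # some admits choose the other school
        income = (50 + 10 * ambition
                  + (true_treatment_effect if attends else 0)
                  + random.gauss(0, 5))
        (attend_inc if attends else other_inc).append(income)
        if admitted:
            (adm_attend_inc if attends else adm_other_inc).append(income)
    mean = lambda xs: sum(xs) / len(xs)
    print("naive gap (attendees vs. everyone else):    ",
          round(mean(attend_inc) - mean(other_inc), 2))
    print("gap among the admitted (attend vs. decline):",
          round(mean(adm_attend_inc) - mean(adm_other_inc), 2))

simulate()  # true effect is zero, yet the naive gap is large and the admitted-only gap is ~0
```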

Japanese called the war stimulants “drug to inspire the fighting spirits”

Friday, November 12th, 2021

I recently listened to Peter Attia’s interview with David Nutt, Director of the Neuropsychopharmacology Unit in the Division of Brain Sciences at Imperial College London, in which the good doctor made a few comments on amphetamines in World War 2.

Now, I read (and commented on) Blitzed: Drugs in the Third Reich a couple years ago, so I knew the Germans had used Pervitin, or methamphetamine, extensively in their Blitzkrieg invasion of France, but Nutt noted that the Desert Rats in North Africa used the less-powerful American drug Benzedrine, or amphetamine sulfate, to great effect, by harassing the Germans all night and sleeping through the next day — while the meth-agitated Germans stayed awake, unable to sleep.

He also mentioned that the Japanese used massive amounts of amphetamines — which seems unsurprising but is rarely mentioned. In fact, there were still huge stashes of amphetamines after the war:

The tablets were distributed to pilots for long flights and to soldiers for combat, under the trade name Philopon (also known as Hiropin). In addition, the government gave munitions workers and those laboring in other defense-related factories methamphetamine tablets to increase their productivity.

Japanese called the war stimulants “senryoku zokyo zai” or “drug to inspire the fighting spirits.” Defense workers ingested these drugs to help boost their output. In the all-out push to increase production, strong prewar inhibitions against drug use were swept aside. It is not difficult to understand why. As researchers such as political scientist Lukasz Kamienski have documented, total war required total mobilization, from factory to battlefield. Pilots, soldiers, naval crews, and laborers were all routinely pushed beyond their natural limits to stay awake longer and work harder. In this context, taking stimulants was seen as a patriotic duty.

Kamikaze pilots took large doses of methamphetamine, via injection, before their suicide missions. They were also given pep pills stamped with the crest of the emperor. These consisted of methamphetamine mixed with green tea powder and were called Totsugeki-Jo or Tokkou-Jo, known otherwise as “storming tablets.” Most kamikaze pilots were young, often only in their late teens. Before the injection of Philopon, the pilots undertook a warrior ceremony in which they were presented with sake, wreaths of flowers, and decorated headbands.

[…]

Upon surrendering in 1945, the country had massive stores of Hiropin in warehouses, military hospitals, supply depots, and caves peppered throughout its territories. Some of the supply was sent to public dispensaries for distribution as medicine, but the rest was diverted to the black market rather than destroyed. There, the country’s Yakuza crime syndicate took over much of the distribution, and the drug trade would eventually become its most important source of revenue.

Any tablets not diverted to illicit markets remained in the hands of pharmaceutical companies. In the wake of the traumas and dislocations of the war, a depressed and humiliated population offered an easy target. As Kamienski noted, “The pharmaceutical industry advertised stimulants as a perfect means of boosting the war-weary population and restoring confidence after a painful and debilitating defeat.” The drug companies mounted advertising campaigns to encourage consumers to purchase over-the-counter medicine sold as “wake-a-mine.” The product was pitched as offering “enhanced vitality.” In No Speed Limit: The Highs and Lows of Meth, journalist Frank Owen reports that these companies also sold “hundreds of thousands of pounds” of “military-made liquid meth” left over from the war to consumers, who did not need a prescription to purchase the drug.

With an estimated 5 percent of Japanese people between the ages of 18 and 25 taking the drug, many became intravenous addicts in the early 1950s.

Socrates finds 21st-century political thought shallow and confused

Thursday, November 11th, 2021

Bryan Caplan presents a Socratic dialogue based on the premise that three Greek luminaries have time-traveled from ancient times to the 21st century:

A few months after immersion in the modern world, Pericles is a convinced member of what modernity calls “the left,” while Leonidas is an equally staunch member of “the right.” Socrates, in contrast, finds 21st-century political thought shallow and confused.

[…]

Pericles: It’s not so hard. Leftists like me care about everyone. Rightists like Leonidas only care about people like themselves.

Leonidas: [harumphs] You don’t “care about everyone.” You only care about people on your side — and you expect the rest of us to foot the bill.

[…]

Socrates: I see. Another common view is that the left cares more about the poor, and the right cares more about the rich.

Pericles: More or less. I don’t intrinsically care less about the rich; I just think they already get a lot more than they need or deserve.

Leonidas: I don’t know any rightist who says, “We’ve got to stand up for the rich.” I care about middle and working class people who play by the rules. If we can help them by taxing the rich more, great. But I don’t trust leftists to do that. When they say, “Let’s tax the rich to help the poor,” they mean, “Let’s tax everyone who plays by the rules to help everyone who doesn’t.”

[…]

Pericles: I’m a big fan of dialogue, but not because I feel “safe.” As I said, I think the world faces serious — and maybe even existential — problems. We need dialogue because it’s the only viable way to wrest control of our society and our world back from moneyed interests.

Leonidas: Leftists’ idea of a “dialogue” is them talking down to the rest of us, and shaming anyone who fails to loudly applaud. I’d love to have a series of frank discussions — discussions where the answer is genuinely up for grabs, and pragmatism prevails. And we really need such discussions, because Pericles is right about the level of danger we’re all in. He just can’t see that people like himself are a big part of the problem.

Beijing does not wait to be attacked

Wednesday, November 10th, 2021

When confronted by a mounting threat to its geopolitical interests, Beijing does not wait to be attacked — it shoots first to gain the advantage of surprise:

In 1950, for instance, the fledgling PRC was less than a year old and destitute, after decades of civil war and Japanese brutality. Yet it nonetheless mauled advancing U.S. forces in Korea out of concern that the Americans would conquer North Korea and eventually use it as a base to attack China. In the expanded Korean War that resulted, China suffered almost 1 million casualties, risked nuclear retaliation, and was slammed with punishing economic sanctions that stayed in place for a generation. But to this day, Beijing celebrates the intervention as a glorious victory that warded off an existential threat to its homeland.

In 1962, the PLA attacked Indian forces, ostensibly because they had built outposts in Chinese-claimed territory in the Himalayas. The deeper cause was that the CCP feared that it was being surrounded by the Indians, Americans, Soviets, and Chinese Nationalists, all of whom had increased their military presence near China in prior years. Later that decade, fearing that China was next on Moscow’s hit list as part of efforts to defeat “counterrevolution,” the Chinese military ambushed Soviet forces along the Ussuri River and set off a seven-month undeclared conflict that once again risked nuclear war.

In the late ’70s, Beijing picked a fight with Vietnam. The purpose, remarked Deng Xiaoping, then the leader of the CCP, was to “teach Vietnam a lesson” after it started hosting Soviet forces on its territory and invaded Cambodia, one of China’s only allies. Deng feared that China was being surrounded and that its position would just get worse with time. And from the ’50s to the ’90s, China nearly started wars on three separate occasions by firing artillery or missiles at or near Taiwanese territory, in 1954–55, 1958, and 1995–96. In each case, the goal was — among other things — to deter Taiwan from forging a closer relationship with the U.S. or declaring its independence from China.

You had two jobs

Tuesday, November 9th, 2021

Bryan Caplan realizes he’s been too generous to local governments, which really have two jobs:

  1. Provide K-12 education.
  2. Regulate construction.

And on reflection, local governments do both of these things terribly.

[…]

Voucher systems are clearly more efficient, yet virtually every locality continues to directly supply K-12 education.

[…]

Local governments’ construction regulations are usually quite strict, especially in the most desirable locations. The resulting draconian system of height limits, zoning, minimum lot sizes, minimum parking requirements, and beyond roughly doubles the cost of housing and greatly retards national economic growth.

[…]

While voucher systems’ effect on test scores is debatable, the effect on customer satisfaction is not.

[…]

While you can argue that housing regulations curtail negative externalities, the leading examples are parking and traffic. The optimal response to both is not construction regs, but peakload pricing.

[…]

Tiebout implicitly assumes that non-profit competition works the same way as for-profit competition. It doesn’t. If a business owner figures out how to produce the same good at a lower cost, he pockets all of the savings. If the CEO of a publicly-held corporation figures out how to produce the same good at a lower cost, he pockets a lot of the savings. But if the mayor of a city figures out how to deliver the same government services for lower taxes, he pockets none of the savings. That’s how non-profits “work.”

With non-profit incentives, neither the number of local governments nor the ease of exit leads to anything resembling perfectly competitive results. The “competitors” simply have little incentive to do a good job, so they all tend to perform poorly.

Second, voters are deeply irrational, even at the local level. […] Even at the local level, the probability of voter decisiveness is so low that the expected cost of voter irrationality is approximately zero. If you have more than a hundred voters, “Your vote doesn’t count” is basically correct.
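Caplan’s decisiveness point lends itself to a quick back-of-the-envelope calculation. The sketch below is my own illustration, with arbitrary electorate sizes, vote probabilities, and a hypothetical $10,000 personal stake, not numbers from Caplan: under a simple binomial model, the probability of casting the deciding vote, and with it the expected personal cost of voting badly, shrinks fast as the electorate grows or leans even slightly to one side.

```python
# Back-of-the-envelope sketch (my numbers, not Caplan's): probability that one
# vote is decisive in a two-candidate race, under a simple binomial model in
# which each of the other n voters independently votes "yes" with probability p.
# A single vote matters only if the others split exactly evenly.
import math

def prob_pivotal(n_other_voters: int, p: float) -> float:
    """P(exact tie among an even number of other voters), computed in log space."""
    k = n_other_voters // 2
    log_prob = (math.lgamma(n_other_voters + 1)
                - math.lgamma(k + 1)
                - math.lgamma(n_other_voters - k + 1)
                + k * math.log(p)
                + (n_other_voters - k) * math.log(1 - p))
    return math.exp(log_prob)

STAKE = 10_000  # hypothetical personal cost to you if the "wrong" side wins

for n in (10_000, 1_000_000, 100_000_000):
    for p in (0.50, 0.52):
        pivot = prob_pivotal(n, p)
        print(f"n={n:>11,}  p={p:.2f}  P(decisive)={pivot:.2e}  expected cost=${STAKE * pivot:.6f}")
```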

On the Road is a terrible book about terrible people

Monday, November 8th, 2021

On the Road is a terrible book about terrible people:

Jack Kerouac and his terrible friends drive across the US about seven zillion times for no particular reason, getting in car accidents and stealing stuff and screwing women whom they promise to marry and then don’t.

But this is supposed to be okay, because they are visionaries. Their vision is to use the words “holy”, “ecstatic”, and “angelic” at least three times to describe every object between Toledo and Bakersfield. They don’t pass a barn, they pass a holy vision of a barn, a barn such as there must have been when the world was young, a barn whose angelic red and beatific white send them into mad ecstasies. They don’t almost hit a cow, they almost hit a holy primordial cow, the cow of all the earth, the cow whose dreamlike ecstatic mooing brings them to the brink of a rebirth such as no one has ever known.

[…]

On The Road seems to be a picture of a high-trust society. Drivers assume hitchhikers are trustworthy and will take them anywhere. Women assume men are trustworthy and will accept any promise. Employers assume workers are trustworthy and don’t bother with background checks. It’s pretty neat.

But On The Road is, most importantly, a picture of a high-trust society collapsing. And it’s collapsing precisely because the book’s protagonists are going around defecting against everyone they meet at a hundred ten miles an hour.

They mistake forgotten science for fiction

Sunday, November 7th, 2021

When historically ignorant readers read science fiction decades or centuries after it was written, they can mistake forgotten science for fiction:

When the science in SF survives the passage of time, we regard it as simply ordinary science or as an insightful prediction of the future; when it turns out to be wrong, we may write it off as fiction. Cordwainer Smith, in writing about “the pain of space” in “Scanners Live In Vain” was not (just) imagining some wild Freudian fantasy about leaving the womb, but drawing on pre-spaceflight 1940s extrapolation of hallucinations and cognitive problems in aviation; but since we now know that spaceflight is psychologically safe (and the real cognitive effects like the “overview effect” don’t look like “the pain of space”), contemporary readers read it as purely fictional and ponder the deep symbolism of the fantastical concept.

Similarly, Herbert made use of psi (still taken seriously at the time), extrapolation from the use of pheromones in insects to humans (though pheromones don’t even affect sexual behavior), various woolly ideas about transgenerational memory (never passed from woo to reality — sorry, “epigenetics” ain’t it either), Walter’s theory of warfare (crankery), and multilevel group selection (still highly debated), California Human Potential Movement beliefs about trainability of raw human abilities exemplified by Dianetics etc (a profound disappointment)… As they are presented as part of worldbuilding, it’s easy to simply accept them as fiction, no more intended as real than manticores (or should I say, Martians?).

This works fine for Dune 56+ years later, because they are fun, and aren’t the focus. It holds up well, like The Dragon in the Sea or the eusocial-insect fiction like Hellstrom’s Hive. In contrast, Herbert’s Destination: Void, which has almost no interesting plot or characters, and is a long author-tract about his idiosyncratic interpretations of early cybernetics & speculation about AI, is unreadable today.

So, we should keep this in mind: if there are claims about how the world works in an SF work and they are false, is that because they are fictional or just science we are no longer familiar with?

[…]

So—this alternate paradigm can neatly explain all of the oddities of the Dune breeding program! The reason it is so odd is because Herbert was drawing on the obsolete Mendelian interpretations which were heavy on epistasis and de novo mutations, as opposed to the more plausibly relevant biometric Fisherian paradigm of highly polygenic additive traits with selection on standing variation. Herbert was throughout his life interested in agriculture & genetics, as demonstrated by his demonstration home farm project and the repeated use of agricultural themes in his works (eg Hellstrom’s Hive, where a group of humans develops into eusocial insects, or The Green Brain, where human extermination of insects has catastrophically destabilized global agriculture & provoked evolution of intelligent insects).

Jack Kirby is one of the most influential artists of the 20th century

Saturday, November 6th, 2021

The new Eternals film is a reminder that, while he may not be widely recognized as such, Jack Kirby is one of the most influential artists of the 20th century:

His signature style — a fusion of pivotal artistic movements such as cubism, expressionism, surrealism, avant-garde, op art, indigenous South American, midcentury commercial and futurism, blended into a visual language all his own — and innovation in composition, dynamism and design, can be found today in virtually all forms of visual media and art, from film to advertising to photography.

Kirby was born Jacob Kurtzberg in 1917 to Jewish immigrants from Austria who lived in New York’s Lower East Side tenements and eked out a living in a garment factory. A Pratt Institute dropout at 14, he found success early on when he and studio partner Joe Simon created Captain America for Timely Comics in 1941, reportedly selling almost a million copies a month. His hyperkinetic, hyper-stylized, hyper-everything art seemed barely contained by the page, helping define the nascent art form and establish the superhero genre and comic book industry.

When superheroes’ popularity waned after the war, the versatile Kirby made an indelible mark on a variety of other genres, including western, horror, space adventure and giant monsters. His rampaging behemoths paralleled the rise of creature features like “Them!” in the US and kaiju films like “Godzilla” in Japan, helping make them a lasting genre. But it was teen romance, of all things, that he influenced the most; together with Simon he created “Young Romance” for Prize Comics in 1947, a runaway hit that surpassed a million copies monthly and inspired nearly a hundred copycat series. Breaking with the Archie Comics mold, it introduced Shakespearean melodrama to youth entertainment, a revolution that’s evident in today’s numerous teen shows on TV.

Just as Kirby helped define superhero comics in the 1940s, he helped redefine them in the 1960s. Timely was now called Marvel, and his and Simon’s former office assistant was now the editor-in-chief and head writer, Stan Lee. Lee brought Kirby on for what became an unprecedented, and unsurpassed by either, period of manic creativity. They created together the Fantastic Four (1961), the Hulk (1962), Thor (1962), Ant-Man (1962), Iron Man (1963), Avengers (1963), X-Men (1963), Silver Surfer (1966), Black Panther (1966) and hundreds of other heroes, villains, cast and concepts. Kirby also played a role in the creation of Spider-Man in 1962 and Daredevil in 1964 and auteured solo properties like the Eternals.

More than just a new pantheon, they created a whole new approach. They recast monsters as outcast heroes and added the drama of teen romance, appealing more to high school and college readers. Their stories had greater realism and deeper characterization, featuring heroes with relatable faults and action informed by Kirby’s youth in a street gang and combat experience as an infantry scout in France, for which he received the Bronze Star. Marvel came to be known as “The House of Ideas,” Lee as “Stan the Man” and Kirby as “King of Comics.”

Kirby also created much of the mythology for the other big publisher in comics, DC. His radical “Fourth World” saga, a magnum opus spinning off Superman into four series from 1970 to 1973, was a grand space opera about warring alien gods. It introduced Darkseid (pronounced “dark side”), a towering, imperious villain who commands the mechanized war planet of Apokolips in a quest for galactic conquest. Opposing him is his own son Orion, raised by the noble gods to be their champion, who struggles with his dark heritage and nature. Their religion and source of power is a metaphysical life force called, simply, “the Source.”

Four years later “Star Wars” was released, featuring strikingly similar concepts like Darth Vader and Luke Skywalker, the Dark Side, the Death Star, and the Force. Vader also resembles Kirby and Lee’s Fantastic Four villain Dr. Doom, who hides his horrific burn scars under high-tech armor.

[…]

In 1979, he was commissioned to create concept art for a big-screen adaptation of Roger Zelazny’s science fiction novel, “Lord of Light” (ironically, spurred by the success of “Star Wars”).

The same producer also hired him to design an entire theme park in Colorado called Science Fiction Land. Neither would come to fruition, but Kirby’s art found an even better purpose. The CIA used it for its mock production of the film “Argo,” a now-famous covert operation for the rescue of US embassy members from Tehran during the Iran hostage crisis.

Democrats and the left should work to improve conditions for poor white people as well

Friday, November 5th, 2021

Nothing the internet has done, Freddie deBoer thinks, has been more powerful or consequential than the vast increase in social conformity it’s brought about:

Every incentive in 2021, every last one, pushes us to submit to the will of the crowd. Under those conditions it’s more important than ever that we remember who we are and where we came from.

[…]

Today’s “left,” in media and academia and elsewhere, has abandoned absolutely core commitments related to goals, policy, and process, and slandered anyone who hasn’t. The avatars of this tendency mostly know nothing but operate in a social culture in which one must project an aura of knowing everything, and so we have never had substantive debates about any of this stuff, nor do we have communal history enough to know who’s changed and who hasn’t. Let’s run the big changes down.

Of all of the concepts that underlie left discourse, moral universalism may be the most central and essential, though it is little discussed. Moral universalism is the simple belief that all human beings are equal in value and dignity, and deserve political, legal, and moral equality. (It does not mean, and has never meant, that all people are equal in abilities, nor is it an argument for equality of outcomes.) This might seem like a pretty banal assumption, but remember that recognizably left-wing or socialist principles were first developed during a time when literal dynastic aristocracies were assumed to be of inherently higher value than the common person, to say nothing of various bigotries tied to race, ethnicity, and gender. Moral universalism was a powerful and radical idea relative to that backdrop. It was moral universalism that demanded an end to slavery, to sexism, to caste systems, to socioeconomic inequality: Black people deserve freedom because they are people, women deserve equal rights because they are people, the poor deserve material security and comfort because they are people. This is not merely an elegant philosophical position but the basis of left political strategy; stressing common humanity, rather than fixating on demographic differences, means we can have the biggest tent imaginable. All it requires is believing that we must leave no one behind, as a movement and society.

In contrast, today’s left-of-center is rabidly attached to moral particularism, though they mostly haven’t ever really thought this through. By moral particularism I mean the entrenched and widespread notion that certain classes of people are, by dint of their identity categories, more important than others, more deserving of political action, more noble and holy. People will deny that when asked directly, but all of their rhetoric and priorities demonstrate that tacit belief. In argument after argument, liberals today try to settle matters by insisting that a given group’s greater historical oppression means that they must be “centered,” put first, their interests elevated over those of others. A commitment to moral universalism of course demands that these historical oppressions be addressed, until these groups reach the position of equality, at which point their rights will simply be defended like everyone else’s. But today’s liberal practice, if not the explicit ideology, demands that we must relentlessly prioritize some groups over others, and that spending time or energy devoted to those outside of these groups is somehow to take the side of oppression. Debates within the coalition frequently amount to people trying to insist that they are speaking on behalf of the most oppressed, and that whichever position succeeds in that contest is necessarily the righteous cause. Moral particularism not only does not advance an ethic where everyone deserves equal consideration and equally fair treatment, it actively disdains that notion and calls it fascist.

If you don’t believe me, and your Twitter account occupies any kind of progressive space, go on there and tweet “I think Democrats and the left should work to improve conditions for poor white people as well. Their suffering matters.” The notion of the left working for poor people as poor people, rather than merely as an extension of some identity frame, would be totally uncontroversial among the vast majority of left-leaning people throughout the existence of the modern political spectrum. Today? Go ahead, tweet that out, if you have a lot of liberal and leftist followers. See how that works out for you.

From advocating humanistic psychiatric care to opposing it

Thursday, November 4th, 2021

Liberals and progressives have gone, Michael Shellenberger argues, from advocating humanistic psychiatric care to opposing it:

In 1961, the French historian Michel Foucault published a book, Folie et déraison, which was translated into English in 1965 as Madness and Civilization. The book made Foucault one of the most famous intellectuals in the world, and enormously popular in California, where he taught as a guest lecturer during the mid-1970s. Foucault’s book had a major impact on how we treat, and don’t treat, the seriously mentally ill.

Foucault argued that the supposedly humanistic treatment of the mad as suffering from mental illness was, in fact, a more insidious form of social control. Before 1500, the mad wandered freely in Europe, Foucault argued. After 1500, Europeans began to medicalize madness, treat it like an illness, as a way not just to control the mad but also to establish what was rational, normal, and healthy for the rest of society. Mental hospitals emerged at a time, Foucault argued, when the state was seeking to impose rational order on societies. And that started with policing the boundary between sane and insane. Foucault even criticized a humanistic asylum in England whose pioneering psychiatrist no longer used physical restraints on his patients, restraints that the mentally ill today testify are terrifying and even constitute a kind of torture. Said the psychiatrist, “these madmen are so intractable only because they have been deprived of air and freedom.”

Foucault wasn’t alone in his attack on psychiatry and mental hospitals. In 1961, an American sociologist, Erving Goffman, published an influential book, Asylums: Essays on the Social Situation of Mental Patients and Other Inmates, which compared mental hospitals to concentration camps. That same year, a psychiatrist named Thomas Szasz published The Myth of Mental Illness, which argued that psychiatrists and others invented the concept of mental illness, with no biological evidence, in order to punish people who were different from the norm.

The anti-psychiatry movement became a cultural phenomenon in 1962 with the publication of Ken Kesey’s best-selling novel, One Flew Over the Cuckoo’s Nest. It revolves around a socially deviant but nonetheless sane man who feigns mental illness so he can go to a mental hospital rather than prison. He is drugged, electro-shocked, and eventually lobotomized. The novel was adapted as a Broadway play and an Oscar-winning 1975 film starring Jack Nicholson.

Szasz formed an alliance with the ACLU, which began to crusade politically, and litigate through the courts, for an end to involuntary treatment of the mentally ill. Because psychiatrists were no more reliable at diagnosing mental illness than flipping coins, argued the ACLU’s most influential attorney on the matter in 1972, they “should not be permitted to testify as expert witnesses.” Said another leading civil rights attorney in 1974, “They [the patients] are better off outside the hospital with no care than they are inside with no care. The hospitals are what really do damage to people.”

In early 1973 the journal Science published an article, “On Being Sane in Insane Places,” by a Stanford psychologist, David Rosenhan, who claimed to have sent research assistants into several mental hospitals where they were misdiagnosed with mental illness. “We now know that we cannot distinguish insanity from sanity,” he concluded. The study received widespread publicity and “essentially eviscerated any vestige of legitimacy to psychiatric diagnosis,” said the chairman of Columbia’s Department of Psychiatry. “Psychiatrists looked like unreliable and antiquated quacks unfit to join in the research revolution,” wrote another psychiatrist.

Rosenhan’s study became one of the most read and reprinted articles in the history of psychiatry, but a journalist in 2019 published a book describing so many discrepancies that she questioned whether it had ever even occurred. She only found one person who said he had participated in the study, and he said he was treated well by the hospital and had been discharged simply because he asked to leave.

Yes, it was an evil empire

Wednesday, November 3rd, 2021

Yes, it was an evil empire, Cathy Young reminds us:

It was the summer of 1983, and I, a Soviet émigré and an American in the making, was chatting with the pleasant middle-aged woman sitting next to me on a bus from Asbury Park, New Jersey, to Cherry Hill. Eventually our conversation got to the fact that I was from the Soviet Union, having arrived in the U.S. with my family three years earlier at age 17. “Oh, really?” said my seatmate. “You must have been pretty offended when our president called the Soviet Union an ‘evil empire’! Wasn’t that ridiculous?” But her merriment at the supposed absurdity of President Ronald Reagan’s recent speech was cut short when I somewhat sheepishly informed her that I thought he was entirely on point.

[…]

The woman on the bus in 1983 did not surprise me. By then, I had already met many Americans for whom “anti-Soviet” was almost as much of a pejorative as it had been in the pages of Pravda, the official newspaper of the Soviet Communist Party. My favorite was a man in the café at the Rutgers Student Center who shrugged off the victims of the gulag camps by pointing out that capitalism kills people too — with cigarettes, for example. When I recovered from shock, I told him that smoking was far more ubiquitous in the Soviet Union, and anti-smoking campaigns far less developed. That momentarily stumped him.

My mother was also at Rutgers at the time as a piano instructor. She once got into a heated argument over lunch with a colleague and friend after he lamented America’s appalling treatment of the old and the sick. She ventured that, from her ex-Soviet vantage point, it didn’t seem that bad. “Are you telling me that it’s just as bad in the Soviet Union?” her colleague retorted, only to be dumbstruck when my mother clarified that, actually, she meant it was much worse. She tried to illustrate her point by telling him about my grandmother’s sojourn in an overcrowded Soviet hospital ward: More than once, when the woman in the next bed rolled over in her sleep, her arm flopped across my grandma’s body. Half-decent care required bribing a nurse, and half-decent food had to be brought from home. My mother’s normally warm and gracious colleague shocked her by replying, “I’m sorry, but I don’t believe you.” Her perceptions, he told her, were obviously colored by antipathy toward the Soviet regime. Eventually, he relented enough to allow that perhaps my grandmother did have a very bad experience in a Soviet hospital — but surely projecting it onto all of Soviet medicine was uncalled for.

Halloween used to be kid stuff

Tuesday, November 2nd, 2021

Halloween used to be kid stuff:

By 2005, just over half of adults celebrated Halloween. Today, that number has grown to over 70 percent. Those between 18 and 34 years old participate at the highest rate, and they’re also the holiday’s biggest spenders, shelling out over twice as much on their costumes as older adults and children.

Halloween celebrations have changed, too: less trick-or-treating and more parties and bar hopping. Today, alcohol is as important as candy to the Halloween economy.

The ports of Los Angeles and Long Beach ranked below ports in Tanzania and Kenya

Monday, November 1st, 2021

Stifling regulations have left America with the most inefficient ports in the world:

A recent review of container-port efficiency ranked the ports of Los Angeles and Long Beach below ports in Tanzania and Kenya, near the bottom of the list of 351 top ports. America’s ports are effectively third-world. The 50 most efficient ports in the world are mostly in Asia and the Middle East; none are in America.