Bill Nye Saves The World features Rachel Bloom performing My Sex Junk, and, well, I don’t even know what to say:
Sam Harris interviews Charles Murray and admits that he assumed there was something to the accusations against Murray, until he went through his own witch trials and read Murray’s work.
The conventional wisdom is that we need to send even more people to college, but Devin Helton is skeptical enough that he went through a master spreadsheet of employment in the United States and made his own assessment of what percent of jobs truly require college.
Here is a table with my results, compared to what the actual attendance rates are:
There is no plausible way that 60% of jobs will innately require a degree in ten years. If 60% of jobs require a college degree on paper, that requirement will be entirely artificial (due to credentialing laws and competitive signaling spiral/degree inflation — see for example DC’s new regulation that childcare workers must have college degrees).
The most surprising thing I noticed was how many jobs require almost no specialized study or training. Even in contrarian, anti-college intellectual circles, it is popular to say we need more vocational education and apprenticeships. But skilled trades are only around 15% of jobs. The majority of jobs require no special training. They are jobs like cashier, driver, orderly, real estate agent, customer service agent, store clerk, house painter, or laborer.
Less than 15% of jobs can be plausibly said to need more study than the classic high school education.
If we want to make the working class better off, we should subsidize wages, not unnecessary education:
Consider the goods and services that make up a good and comfortable life: high-tech gizmos, gas heating, indoor plumbing, a well-built home, access to a skilled doctor, good restaurants, good beer, parks, well-built infrastructure, a stroll down a street with pretty buildings, etc. If you look at the production process for those goods and services, only a small percent of the workers involved need a college degree. And most degrees granted do not improve the production process — how does granting millions of degrees in “business”, “communications” or “social science” lead to more and better of these products? It doesn’t. And in fact, by channeling so many people into the college pipeline, we have lost out on the skills that did make for the good life. We have lost the artisans that once created beautiful streetscapes and ornate architectural detailing. We have less money to spend on infrastructure. We have more debt, and more stress.
Furthermore, even in the engineering fields, much of the know-how exists exclusively inside the productive organization — not inside the textbooks. Every engineer, when starting a job, goes through a big adjustment period while learning how things are actually done. They learn why the schoolbook version was simplified or outdated, and they learn the real techniques, tricks, and tooling they actually need to make things work.
In the past few decades, America has become more educated in terms of degrees. But in reality, people like my dad were training Chinese engineers to replace them, as the boomers retired and the high-tech jobs moved overseas. And now Forbes tells us that the Kindle cannot be made in America, because the essential technological production no longer exists here. According to policy wonks — who measure skills and education by the number of years people spend sitting in a chair — we have become more educated. But if you look at the actual knowledge needed to build high-tech goods, the issue is a lot murkier.
Helton proposes several reforms:

- Separate schooling from credentialing.
- Create a set of free, online high school and college degree programs that any American could enroll in, and pursue at their own pace.
- At age 13, give everyone a $100k education voucher.
- Legalize and normalize apprenticeship contracts.
The Conservative right-winger Enoch Powell has made a hard-hitting speech attacking the government’s immigration policy.
Addressing a Conservative association meeting in Birmingham, Mr Powell said Britain had to be mad to allow in 50,000 dependents of immigrants each year.
He compared it to watching a nation busily engaged in heaping up its own funeral pyre.
The MP for Wolverhampton South West called for an immediate reduction in immigration and the implementation of a Conservative policy of “urgent” encouragement of those already in the UK to return home.
“It can be no part of any policy that existing families should be kept divided. But there are two directions in which families can be reunited,” he said.
Mr Powell compared enacting legislation such as the Race Relations Bill to “throwing a match on to gunpowder”.
He said that as he looked to the future he was filled with a sense of foreboding.
“Like the Roman, I seem to see the river Tiber foaming with much blood,” he said.
He estimated that by the year 2000 up to seven million people — or one in ten of the population — would be of immigrant descent.
How did that prediction pan out?
The Census in 2001 showed 4.6 million people living in the UK were from an ethnic minority, or 7.9% of the population.
Here’s the opening to the actual speech:
The supreme function of statesmanship is to provide against preventable evils. In seeking to do so, it encounters obstacles which are deeply rooted in human nature.
One is that by the very order of things such evils are not demonstrable until they have occurred: at each stage in their onset there is room for doubt and for dispute whether they be real or imaginary. By the same token, they attract little attention in comparison with current troubles, which are both indisputable and pressing: whence the besetting temptation of all politics to concern itself with the immediate present at the expense of the future.
Above all, people are disposed to mistake predicting troubles for causing troubles and even for desiring troubles: “If only,” they love to think, “if only people wouldn’t talk about it, it probably wouldn’t happen.”
Perhaps this habit goes back to the primitive belief that the word and the thing, the name and the object, are identical.
At all events, the discussion of future grave but, with effort now, avoidable evils is the most unpopular and at the same time the most necessary occupation for the politician. Those who knowingly shirk it deserve, and not infrequently receive, the curses of those who come after.
(I’ve mentioned this speech before.)
“If the history of the world is but the biography of great men, as Thomas Carlyle put it, the history of Britain since the 1960s is but the biography of two great men and one woman,” The Economist declares, apparently channeling the spirit of Mencius Moldbug:
As Labour home secretary from 1965-67, Roy Jenkins took the government out of the bedroom with a series of liberalising laws on divorce, homosexuality and censorship. As Tory prime minister from 1979-90 Margaret Thatcher unleashed the power of markets. The main job of their successors was to come to terms with these twin revolutions: Tony Blair converted Labour to Thatcherism and David Cameron converted the Tories to Jenkinsism.
Before Brexit it looked as if that was it: the party that could produce the best synthesis of Thatcher and Jenkins would win. But today a third figure hovers over British politics: a man who was born in 1912 — eight years before Jenkins and 13 before Thatcher — but whose influence seems to grow by the day. One of Enoch Powell’s most famous observations was that “all political lives, unless they are cut off in midstream at some happy juncture, end in failure.” His political life is enjoying a posthumous success.
Powell put two issues at the heart of his politics: migration and Europe. He convulsed the country in 1968 when he declared in a speech in his native Birmingham that mass immigration would produce social breakdown — that “like the Roman, I seem to see the River Tiber foaming with much blood.” And he campaigned tirelessly against the European Economic Community. These two passions were united by his belief in the nation state. He thought that nations were the building blocks of society and that attempts to subvert them, through supranational engineering or global flows of people, would end in disaster.
Powell didn’t have the same direct influence as Thatcher or Jenkins. Thatcher was prime minister for 11 tumultuous years. Jenkins lived his life at the centre of the establishment. Powell spent only 15 months of his 37-year political career in office, as minister for health; nothing of substance bears his name on the statute books. In his new book, “The Road to Somewhere”, David Goodhart, a liberal critic of multiculturalism who has been accused of “liberal Powellism”, argues that Powell’s “rivers of blood” speech was doubly counter-productive: it toxified the discussion of immigration for a generation and set the bar for successful immigration too low (no rivers foaming with blood, no problem).
Yet Brexit is soaked in the blood of Powellism. Some of the leading Brexiteers acknowledge their debt to Powell: Nigel Farage regards him as a political hero and says that the country would be better today if his words had been heeded. Powell lit the fire of Euroscepticism in 1970 and kept it burning, often alone, for decade upon decade. He provided the Eurosceptics with their favourite arguments: that Europe was a mortal threat to British sovereignty; that Britain’s future lay in going it alone, “her face towards the oceans and the continents of the world”; that the establishment had betrayed the British people into joining Europe, by selling a political project as an economic one, and would betray them again. History has also been on his side. David Shiels, of Wolfson College, Cambridge, points out that, in Powell’s time, the questions of immigration and Europe were distinct (the immigration that worried him was from the Commonwealth). Europe’s commitment to the free movement of people drove the two things together and gave Powellism its renewed power.
Just as important as his arguments was his style. Powell was the first of the new generation of populists cropping up across the West, a worshipper of Nietzsche in his youth, a professor of classics by the age of 25 who nevertheless considered himself a true voice of the people. He believed that the British establishment had become fatally out of touch on the biggest questions facing the country and used his formidable charisma — insistent voice tinged with Brummie, hypnotic stare — to seduce his audiences.
Attempts to fully discredit Charles Murray’s most famous book, 1994’s “The Bell Curve,” have failed for more than two decades now. This is because they repeatedly miss the strongest point of attack: an indisputable — albeit encoded — endorsement of prejudice.
So, the science is unassailable, but we should vehemently attack an encoded endorsement of prejudice that is based on that (apparently) unassailable science? “This isn’t the ‘PC police’ talking,” he asserts, but he completely ignores what Murray explicitly says about prejudging people:
Even when the differences are substantial, the variation between two groups will almost always be dwarfed by the variation within groups — meaning that the overlap between two groups will be great. In a free society where people are treated as individuals, “So what?” is to me the appropriate response to genetic group differences. The only political implication of group differences is that we must work hard to ensure that our society is in fact free and that people are in fact treated as individuals.
Zoroastrianism might be called the obscure religion that shaped the West:
It is generally believed by scholars that the ancient Iranian prophet Zarathustra (known in Persian as Zartosht and in Greek as Zoroaster) lived sometime between 1500 and 1000 BC. Prior to Zarathustra, the ancient Persians worshipped the deities of the old Irano-Aryan religion, a counterpart to the Indo-Aryan religion that would come to be known as Hinduism. Zarathustra, however, condemned this practice, and preached that God alone – Ahura Mazda, the Lord of Wisdom – should be worshipped. In doing so, he not only contributed to the great divide between the Iranian and Indian Aryans, but arguably introduced to mankind its first monotheistic faith.
The idea of a single god was not the only essentially Zoroastrian tenet to find its way into other major faiths, most notably the ‘big three’: Judaism, Christianity and Islam. The concepts of Heaven and Hell, Judgment Day and the final revelation of the world, and angels and demons all originated in the teachings of Zarathustra, as well as the later canon of Zoroastrian literature they inspired. Even the idea of Satan is a fundamentally Zoroastrian one; in fact, the entire faith of Zoroastrianism is predicated on the struggle between God and the forces of goodness and light (represented by the Holy Spirit, Spenta Manyu) and Ahriman, who presides over the forces of darkness and evil. While man has to choose to which side he belongs, the religion teaches that ultimately, God will prevail, and even those condemned to hellfire will enjoy the blessings of Paradise (an Old Persian word).
How did Zoroastrian ideas find their way into the Abrahamic faiths and elsewhere? According to scholars, many of these concepts were introduced to the Jews of Babylon after their liberation by the Persian emperor Cyrus the Great. They trickled into mainstream Jewish thought, and figures like Beelzebub emerged. And after Persia’s conquests of Greek lands during the heyday of the Achaemenid Empire, Greek philosophy took a different course. The Greeks had previously believed humans had little agency, and that their fates were at the mercy of their many gods, who often acted according to whim and fancy. After their acquaintance with Iranian religion and philosophy, however, they began to feel more as if they were the masters of their destinies, and that their decisions were in their own hands.
Steven Pinker discusses decivilization in the 1960s:
After a three-decade free fall that spanned the Great Depression, World War II, and the Cold War, Americans multiplied their homicide rate by more than two and a half, from a low of 4.0 in 1957 to a high of 10.2 in 1980 (U.S. Bureau of Justice Statistics; Fox and Zawitz 2007). The upsurge included every other category of major crime as well, including rape, assault, robbery, and theft, and lasted (with ups and downs) for three decades. The cities got particularly dangerous, especially New York, which became a symbol of the new criminality. Though the surge in violence affected all the races and both genders, it was most dramatic among black men, whose annual homicide rate had shot up by the mid-1980s to 72 per 100,000.
The rebounding of violence in the 1960s defied every expectation. The decade was a time of unprecedented economic growth, nearly full employment, levels of economic equality for which people today are nostalgic, historic racial progress, and the blossoming of government social programs, not to mention medical advances that made victims more likely to survive being shot or knifed. Social theorists in 1962 would have happily bet that these fortunate conditions would lead to a continuing era of low crime. And they would have lost their shirts.
When rock music burst onto the scene in the 1950s, politicians and clergymen vilified it for corrupting morals and encouraging lawlessness. (An amusing video reel of fulminating fogies can be seen in Cleveland’s Rock and Roll Hall of Fame and Museum.) Do we now have to – gulp – admit they were right? Can we connect the values of 1960s popular culture to the actual rise in violent crimes that accompanied them? Not directly, of course. Correlation is not causation, and a third factor, the pushback against the values of the Civilizing Process, presumably caused both the changes in popular culture and the increase in violent behavior. Also, the overwhelming majority of baby boomers committed no violence whatsoever. Still, attitudes and popular culture surely reinforce each other, and at the margins, where susceptible individuals and subcultures can be buffeted one way or another, there are plausible causal arrows from the decivilizing mindset to the facilitation of actual violence.
One of them was a self-handicapping of the criminal justice Leviathan. Though rock musicians seldom influence public policy directly, writers and intellectuals do, and they got caught up in the zeitgeist and began to rationalize the new licentiousness. Marxism made violent class conflict seem like a route to a better world. Influential thinkers like Herbert Marcuse and Paul Goodman tried to merge Marxism or anarchism with a new interpretation of Freud that connected sexual and emotional repression to political repression and championed a release from inhibitions as part of the revolutionary struggle. Troublemakers were increasingly seen as rebels and nonconformists, or as victims of racism, poverty, and bad parenting. Graffiti vandals were now ‘artists,’ thieves were ‘class warriors,’ and neighborhood hooligans were ‘community leaders.’ Many smart people, intoxicated by radical chic, did incredibly stupid things. Graduates of elite universities built bombs to be set off at army social functions, or drove getaway cars while ‘radicals’ shot guards during armed robberies. New York intellectuals were conned by Marxobabble-spouting psychopaths into lobbying for their release from prison (Pinker 2002: 261–262).
Read the whole thing. (It’s an excerpt from The Better Angels of Our Nature: Why Violence Has Declined.)
US healthcare is famous for three things, Ben Southwood notes:
It’s expensive, it’s not universal, and it has poor outcomes. The US spends around $7,000 per person on healthcare every year, or roughly 18% of GDP; the next highest spender is Switzerland, which spends about $4,500. Before Obamacare, approximately 15% of the US population were persistently uninsured (8.6% still are). And as this chart neatly shows, their outcome on the most important variable — overall life expectancy — is fairly poor.
But some of this criticism is wrongheaded and simplistic: when you slice the data up more reasonably, US outcomes look impressive, but being the world’s outrider is much more expensive than following behind. What’s more, most of the solutions people offer just don’t get to the heart of the issue: if you give people freedom they’ll spend a lot on healthcare.
The US undoubtedly spends a huge amount on healthcare. One popular narrative is that because of market failures and/or extreme overregulation in healthcare, prices are excessively high. So Americans with insurance (or covered by Medicare, the universal system for the elderly, or Medicaid, the government system for the poor) get the same as other developed world citizens, but those without get very poor care and die younger. A system like the NHS solves the problem, according to this view, with bulk buying of land, labour, and inputs, better incentives, and universal coverage.
But there are some serious flaws in this theory. Firstly, extending insurance to the previously-uninsured doesn’t, in America, seem to have large benefits. For example, a recent NBER paper found no overall health gains from the massive insurance expansion under Obamacare.* A famous RAND study found minuscule benefits over decades from giving out free insurance to previously uninsured in the 1970s. In fact, over and above the basics, insuring those who choose not to get insurance doesn’t ever seem to have large gains. Indeed, there is wide geographic variation in the life expectancy among the low income in the US, but this doesn’t even correlate with access to medical care! This makes it unlikely that the gap between the US and the rest is explained by universality.
To find the answer, consider the two main ingredients that go into health outcomes. One is health, and the other is treatment. If latent health is the same across the Western world, we can presume that any differences come from differences in treatment. But this is simply not the case. Obesity is far higher in the USA than in any other major developed country. Obviously it is a public health problem, but it’s unrealistic to blame it on the US system of paying for doctors, administrators, hospitals, equipment and drugs.
In fact in the US case it’s not even obesity, or indeed their greater pre-existing disease burden, that is doing most of the work in dragging their life expectancy down; it’s accidental and violent deaths. It is tragic that the US is so dangerous, but it’s not the fault of the healthcare system; indeed, it’s an extra burden that US healthcare spending must bear. Just simply normalising for violent and accidental death puts the USA right to the top of the life expectancy rankings.
One of our cultural problems, Arnold Kling adds, is that we spend too much on health care and not enough on public health.
Almost half of Colorado women who got pregnant in 2008 said that the pregnancy happened sooner than they wanted or that they hadn’t wanted to get pregnant at all. That was similar to the US average: the rate of unintended pregnancy has been stuck around 50 percent since the 1960s.
The Colorado Family Planning Initiative cut teen births and abortions almost in half:
They fell by nearly 20 percent among women aged 20-24. (Note: Under normal circumstances, over 80 percent of teen pregnancies and 70 percent of pregnancies among single women aged 20-29 are unsought, so this change means women’s realities are better matching their family desires.) Second-order births to teens — teens who gave birth a second or third time — dropped by 58 percent. High-risk births, including preterm births, also diminished.
Poor families benefited the most, because unsought pregnancy is four times as common and unsought birth seven times as common among poor women as among their more prosperous peers. With fewer families facing the dire circumstances triggered by an unexpected pregnancy or unplanned birth, the state saved $66-70 million in public assistance, according to a team of economists at the University of Colorado.
How did Colorado get such dramatic results? They provided “get it and forget it” forms of contraception, such as long-acting IUDs and implants.
This is seen as a great Progressive victory.
The US entered the Great War 100 years ago, but why?
The war lasted only another year and a half, but in that time, an astounding 117,000 American soldiers were killed and 202,000 wounded.
America intervened nearly three years after it began, and the “doughboys,” as our troops were called, engaged in serious combat for only a few months. More Americans in uniform died away from the battlefield — thousands from the Spanish flu — than with weapons in hand. After victory was achieved, Wilson’s audacious hope of making a peace that would advance democracy and national self-determination blew up in his face when the Senate refused to ratify the treaty he had signed at the Palace of Versailles.
But attention should be paid. America’s decision to join the Allies was a turning point in world history. It altered the fortunes of the war and the course of the 20th century — and not necessarily for the better. Its entry most likely foreclosed the possibility of a negotiated peace among belligerent powers that were exhausted from years mired in trench warfare.
Although the American Expeditionary Force did not engage in combat for long, the looming threat of several million fresh troops led German generals to launch a last, desperate series of offensives. When that campaign collapsed, Germany’s defeat was inevitable.
How would the war have ended if America had not intervened? The carnage might have continued for another year or two until citizens in the warring nations, who were already protesting the endless sacrifices required, forced their leaders to reach a settlement. If the Allies, led by France and Britain, had not won a total victory, there would have been no punitive peace treaty like that completed at Versailles, no stab-in-the-back allegations by resentful Germans, and thus no rise, much less triumph, of Hitler and the Nazis. The next world war, with its 50 million deaths, would probably not have occurred.
The pacifists failed:
Since the war began, feminists and socialists had worked closely with progressive members of Congress from the agrarian South and the urban Midwest to keep America out. They mounted street demonstrations, attracted prominent leaders from the labor and suffrage movements, and ran antiwar candidates for local and federal office. They also gained the support of Henry Ford, who chartered a ship full of activists who crossed the Atlantic to plead with the heads of neutral nations to broker a peace settlement.
They may even have had a majority of Americans on their side. In the final weeks before Congress declared war, anti-militarists demanded a national referendum on the question, confident voters would recoil from fighting and paying the bills so that one group of European powers could vanquish another.
Once the United States did enter the fray, Wilson, with the aid of the courts, prosecuted opponents of the war who refused to fall in line. Under the Espionage and Sedition Acts, thousands were arrested for such “crimes” as giving speeches against the draft and calling the Army “a God damned legalized murder machine.”
The intervention led to big changes in America, as well as the world. It began the creation of a political order most citizens now take for granted, even as some protest against it: a state equipped to fight war after war abroad while keeping a close watch on allegedly subversive activities at home.
To make the world “safe for democracy” required another innovation: a military-industrial establishment funded, then partly and now completely, by income taxes.
The Pentagon is making the CIA and State Department obsolete, Ryan Landry argues:
The State Department and CIA have been anxious since election night, not just because of the threat of a change of policy, but because these agencies now face an existential threat on some level.
As Trump’s cabinet was assembled in late fall, the nervousness grew as Blue Empire realized how many generals and former generals would serve in high-ranking positions. Unbeknownst to most Americans, the fear is based on change within Red Empire. The Pentagon’s growth into a one-stop shop for policy formulation and implementation threatens Blue’s existence or, at a minimum, its influence.
Since the planes flew into the Twin Towers, the Department of Defense has been in a steady growth mode not strictly limited to increases in spending and budgets. The very scope and nature of the DOD has changed. America’s wars of choice, the imperial wars after World War II, have forced the Pentagon to send a first-world military into regions with zero infrastructure. The U.S. military had to change and find ways to support such a military in many undeveloped areas from scratch. Random reports by American media outlets repeat the same phrase over and over. The Pentagon has become a one-stop shop for the presidency to seek advice, consider policies, and execute plans. For all the recent talk of a Deep State, the Pentagon has grown into a fully operational imperium in imperio.
If the president inquires, the Pentagon offers a soup-to-nuts service for policy that extends beyond warmaking. As one of the largest users of oil and oil products, the Pentagon needed to focus on securing oil, and therefore energy policy advisers and researchers sprang forth. The Pentagon has a budget and a literal army of individuals to enact policy with a speed that other departments can only dream of enjoying. The need to defend America’s computer networks has had the secondary consequence of creating a private network of ex-military IT contractors and systems experts.
Mission creep has provoked recent open reactions from rivals at State and CIA. The CIA’s fear of former national security adviser Michael Flynn was not due to his supposed Russia connections, but rather to his approach to intelligence. Flynn voiced concerns about and criticism of the CIA’s approach. Flynn also built up the Defense Intelligence Agency to integrate intelligence units and analysis directly with Army units for faster, more efficient operations. The CIA was out to get Flynn as part of a turf battle and a legitimate fear of reform — or, if not reform, a simple reorientation of who is used when and where in the empire.
Defense has also spent a generation pushing its officers to further their education. This is not simple credentialism, but the creation of a corps of officers who can rival their mirror images at State.
Choo Waihong grew up in Singapore before training and working as a corporate lawyer in Canada, the US, and London. She felt drawn back to China, but stumbled across the kingdom of women, a series of villages dotted around a mountain and Lugu Lake, where a Tibetan tribe called the Mosuo still practices its matriarchal ways:
As an unmarried woman in a community where marriage is non-existent, Waihong felt at home.
“All Mosuo women are, essentially, single,” she says. “But I think I’m seen as an oddity because I’m not from here, and I live alone, rather than with a family. I get a lot of dinner invitations, and my friends are always egging me on to find a nice Mosuo lover.” Has she? “That would be telling.”
With life centred on the maternal family, motherhood is, unsurprisingly, revered. For a young Mosuo woman, it is life’s goal. “I’ve had to advise many young women on ovulation, so keen are they to get pregnant,” she says. “You are seen as complete once you become a mother.” In this respect, Waihong, who doesn’t have children, is regarded more keenly. “My sense is that I’m pitied,” she says, “but people are too polite to tell me.”
My self-imposed ground rules are that I can’t delete accurate quotes from my work that I wish I had worded more felicitously, but I am permitted to extend quotes with material that immediately adjoins the quoted text, to correct factual mistakes, and to make suggestions to the author, as copy editors routinely do.
Bo and Ben Winegard tell a tale of two Bell Curves:
To paraphrase Mark Twain, an infamous book is one that people castigate but do not read. Perhaps no modern work better fits this description than The Bell Curve by political scientist Charles Murray and the late psychologist Richard J. Herrnstein. Published in 1994, the book is a sprawling (872 pages) but surprisingly entertaining analysis of the increasing importance of cognitive ability in the United States.
There are two versions of The Bell Curve. The first is a disgusting and bigoted fraud. The second is a judicious but provocative look at intelligence and its increasing importance in the United States. The first is a fiction. And the second is the real Bell Curve. Because many, if not most, of the pundits who assailed The Bell Curve have not bothered to read it, the fictitious Bell Curve has thrived and continues to inspire furious denunciations. We have suggested that almost all of the proposals of The Bell Curve are plausible. Of course, it is possible that some are incorrect. But we will only know which ones if people responsibly engage the real Bell Curve instead of castigating a caricature.