RIP WeaponsMan

Wednesday, April 19th, 2017

The “quiet professional” behind the WeaponsMan blog, Kevin O’Brien, has passed away, his brother reports:

He was born in 1958 to Robert and Barbara O’Brien. We grew up in Westborough, Mass. Kevin graduated from high school in 1975 and joined the Army in (I believe) 1979. He learned Czech at DLI and became a Ranger and a member of Special Forces.

Kevin’s happiest times were in the Army. He loved the service and was deeply committed to it. We were so proud when he earned the Green Beret. He was active duty for eight years and then stayed in the Reserves and National Guard for many years, including a deployment to Afghanistan in 2003. He told me afterward that the Afghan tour was when he felt he had made his strongest contribution to the world.

[...]

In the winter of 2015, we began building our airplane together. You could not ask for a better building partner.

Last Thursday night was our last “normal” night working on the airplane. I could not join him Friday night, but on Saturday morning I got a call from the Portsmouth Regional Hospital. He had called 911 on Friday afternoon and was taken to the ER with what turned out to be a massive heart attack. Evidently he was conscious when he was brought in, but his heart stopped and he was revived after 60 minutes of CPR. He never reawakened.

He will be missed.

Better villains, bigger explosions

Wednesday, April 19th, 2017

A Twitter follower offered David Frum a memorable explanation of the weak hold of the First World War upon the American consciousness.

“Americans prefer the sequel: better villains, bigger explosions.”

The shadow of Enoch Powell looms ever-larger over Britain

Tuesday, April 18th, 2017

“If the history of the world is but the biography of great men, as Thomas Carlyle put it, the history of Britain since the 1960s is but the biography of two great men and one woman,” The Economist declares, apparently channeling the spirit of Mencius Moldbug:

As Labour home secretary from 1965 to 1967, Roy Jenkins took the government out of the bedroom with a series of liberalising laws on divorce, homosexuality and censorship. As Tory prime minister from 1979 to 1990, Margaret Thatcher unleashed the power of markets. The main job of their successors was to come to terms with these twin revolutions: Tony Blair converted Labour to Thatcherism and David Cameron converted the Tories to Jenkinsism.

Before Brexit it looked as if that was it: the party that could produce the best synthesis of Thatcher and Jenkins would win. But today a third figure hovers over British politics: a man who was born in 1912 — eight years before Jenkins and 13 before Thatcher — but whose influence seems to grow by the day. One of Enoch Powell’s most famous observations was that “all political lives, unless they are cut off in midstream at some happy juncture, end in failure.” His political life is enjoying a posthumous success.

Powell put two issues at the heart of his politics: migration and Europe. He convulsed the country in 1968 when he declared in a speech in his native Birmingham that mass immigration would produce social breakdown — that “like the Roman, I seem to see the River Tiber foaming with much blood.” And he campaigned tirelessly against the European Economic Community. These two passions were united by his belief in the nation state. He thought that nations were the building blocks of society and that attempts to subvert them, through supranational engineering or global flows of people, would end in disaster.

Powell didn’t have the same direct influence as Thatcher or Jenkins. Thatcher was prime minister for 11 tumultuous years. Jenkins lived his life at the centre of the establishment. Powell spent only 15 months of his 37-year political career in office, as minister for health; nothing of substance bears his name on the statute books. In his new book, “The Road to Somewhere”, David Goodhart, a liberal critic of multiculturalism who has been accused of “liberal Powellism”, argues that Powell’s “rivers of blood” speech was doubly counter-productive: it toxified the discussion of immigration for a generation and set the bar for successful immigration too low (no rivers foaming with blood, no problem).

Yet Brexit is soaked in the blood of Powellism. Some of the leading Brexiteers acknowledge their debt to Powell: Nigel Farage regards him as a political hero and says that the country would be better today if his words had been heeded. Powell lit the fire of Euroscepticism in 1970 and kept it burning, often alone, for decade upon decade. He provided the Eurosceptics with their favourite arguments: that Europe was a mortal threat to British sovereignty; that Britain’s future lay in going it alone, “her face towards the oceans and the continents of the world”; that the establishment had betrayed the British people into joining Europe, by selling a political project as an economic one, and would betray them again. History has also been on his side. David Shiels, of Wolfson College, Cambridge, points out that, in Powell’s time, the questions of immigration and Europe were distinct (the immigration that worried him was from the Commonwealth). Europe’s commitment to the free movement of people drove the two things together and gave Powellism its renewed power.

Just as important as his arguments was his style. Powell was the first of the new generation of populists cropping up across the West, a worshipper of Nietzsche in his youth, a professor of classics by the age of 25 who nevertheless considered himself a true voice of the people. He believed that the British establishment had become fatally out of touch on the biggest questions facing the country and used his formidable charisma — insistent voice tinged with Brummie, hypnotic stare — to seduce his audiences.

Nassim Taleb and the Guardian

Monday, April 17th, 2017

Nassim Nicholas Taleb hates bankers, academics, and journalists, but he was willing to sit down with Carole Cadwalladr of The Guardian:

And yet here he is, chatting away, surprisingly friendly and approachable. When I say as much as we walk to the restaurant, he asks, “What do you mean?”

“In your book, you’re quite…” and I struggle to find the right word, “grumpy”.

He shrugs. “When you write, you don’t have the social constraints of having people in front of you, so you talk about abstract matters.”

Social constraints, it turns out, have their uses. And he’s an excellent host. We go to his regular restaurant, a no-nonsense, Italian-run, canteen-like place, a few yards from his faculty in central Brooklyn, and he insists that I order a glass of wine.

“And what’ll you have?” asks the waitress.

“I’ll take a coffee,” he says.

“What?” I say. “No way! You can’t trick me into ordering a glass of wine and then have coffee.” It’s like flunking lesson #101 at interviewing school, though in the end he relents and has not one but two glasses and a plate of “pasta without pasta” (though strictly speaking you could call it “mixed vegetables and chicken”), and attacks the bread basket “because it doesn’t have any calories here in Brooklyn”.

This isn’t the “PC police” talking

Sunday, April 16th, 2017

Scientific American has published an embarrassingly unscientific piece by Eric Siegel on the real problem with Charles Murray and The Bell Curve:

Attempts to fully discredit his most famous book, 1994’s “The Bell Curve,” have failed for more than two decades now. This is because they repeatedly miss the strongest point of attack: an indisputable — albeit encoded — endorsement of prejudice.

So, the science is unassailable, but we should vehemently attack an encoded endorsement of prejudice that is based on that (apparently) unassailable science? “This isn’t the ‘PC police’ talking,” he asserts, but he completely ignores what Murray explicitly says about prejudging people:

Even when the differences are substantial, the variation between two groups will almost always be dwarfed by the variation within groups — meaning that the overlap between two groups will be great. In a free society where people are treated as individuals, “So what?” is to me the appropriate response to genetic group differences. The only political implication of group differences is that we must work hard to ensure that our society is in fact free and that people are in fact treated as individuals.

The obscure religion that shaped the West

Saturday, April 15th, 2017

Zoroastrianism might be called the obscure religion that shaped the West:

It is generally believed by scholars that the ancient Iranian prophet Zarathustra (known in Persian as Zartosht and in Greek as Zoroaster) lived sometime between 1500 and 1000 BC. Prior to Zarathustra, the ancient Persians worshipped the deities of the old Irano-Aryan religion, a counterpart to the Indo-Aryan religion that would come to be known as Hinduism. Zarathustra, however, condemned this practice, and preached that God alone – Ahura Mazda, the Lord of Wisdom – should be worshipped. In doing so, he not only contributed to the great divide between the Iranian and Indian Aryans, but arguably introduced to mankind its first monotheistic faith.

The idea of a single god was not the only essentially Zoroastrian tenet to find its way into other major faiths, most notably the ‘big three’: Judaism, Christianity and Islam. The concepts of Heaven and Hell, Judgment Day and the final revelation of the world, and angels and demons all originated in the teachings of Zarathustra, as well as the later canon of Zoroastrian literature they inspired. Even the idea of Satan is a fundamentally Zoroastrian one; in fact, the entire faith of Zoroastrianism is predicated on the struggle between God and the forces of goodness and light (represented by the Holy Spirit, Spenta Manyu) and Ahriman, who presides over the forces of darkness and evil. While man has to choose to which side he belongs, the religion teaches that ultimately, God will prevail, and even those condemned to hellfire will enjoy the blessings of Paradise (an Old Persian word).

How did Zoroastrian ideas find their way into the Abrahamic faiths and elsewhere? According to scholars, many of these concepts were introduced to the Jews of Babylon upon their liberation by the Persian emperor Cyrus the Great. They trickled into mainstream Jewish thought, and figures like Beelzebub emerged. And after Persia’s conquests of Greek lands during the heyday of the Achaemenid Empire, Greek philosophy took a different course. The Greeks had previously believed humans had little agency, and that their fates were at the mercy of their many gods, who often acted according to whim and fancy. After their acquaintance with Iranian religion and philosophy, however, they began to feel more as if they were the masters of their destinies, and that their decisions were in their own hands.

Decivilization in the 1960s

Friday, April 14th, 2017

Steven Pinker discusses decivilization in the 1960s:

After a three-decade free fall that spanned the Great Depression, World War II, and the Cold War, Americans multiplied their homicide rate by more than two and a half, from a low of 4.0 in 1957 to a high of 10.2 in 1980 (U.S. Bureau of Justice Statistics; Fox and Zawitz, 2007). The upsurge extended to every other category of major crime as well, including rape, assault, robbery, and theft, and lasted (with ups and downs) for three decades. The cities got particularly dangerous, especially New York, which became a symbol of the new criminality. Though the surge in violence affected all the races and both genders, it was most dramatic among black men, whose annual homicide rate had shot up by the mid-1980s to 72 per 100,000.

[...]

The rebounding of violence in the 1960s defied every expectation. The decade was a time of unprecedented economic growth, nearly full employment, levels of economic equality for which people today are nostalgic, historic racial progress, and the blossoming of government social programs, not to mention medical advances that made victims more likely to survive being shot or knifed. Social theorists in 1962 would have happily bet that these fortunate conditions would lead to a continuing era of low crime. And they would have lost their shirts.

[...]

When rock music burst onto the scene in the 1950s, politicians and clergymen vilified it for corrupting morals and encouraging lawlessness. (An amusing video reel of fulminating fogies can be seen in Cleveland’s Rock and Roll Hall of Fame and Museum.) Do we now have to – gulp – admit they were right? Can we connect the values of 1960s popular culture to the actual rise in violent crimes that accompanied them? Not directly, of course. Correlation is not causation, and a third factor, the pushback against the values of the Civilizing Process, presumably caused both the changes in popular culture and the increase in violent behavior. Also, the overwhelming majority of baby boomers committed no violence whatsoever. Still, attitudes and popular culture surely reinforce each other, and at the margins, where susceptible individuals and subcultures can be buffeted one way or another, there are plausible causal arrows from the decivilizing mindset to the facilitation of actual violence.

One of them was a self-handicapping of the criminal justice Leviathan. Though rock musicians seldom influence public policy directly, writers and intellectuals do, and they got caught up in the zeitgeist and began to rationalize the new licentiousness. Marxism made violent class conflict seem like a route to a better world. Influential thinkers like Herbert Marcuse and Paul Goodman tried to merge Marxism or anarchism with a new interpretation of Freud that connected sexual and emotional repression to political repression and championed a release from inhibitions as part of the revolutionary struggle. Troublemakers were increasingly seen as rebels and nonconformists, or as victims of racism, poverty, and bad parenting. Graffiti vandals were now ‘artists,’ thieves were ‘class warriors,’ and neighborhood hooligans were ‘community leaders.’ Many smart people, intoxicated by radical chic, did incredibly stupid things. Graduates of elite universities built bombs to be set off at army social functions, or drove getaway cars while ‘radicals’ shot guards during armed robberies. New York intellectuals were conned by Marxobabble-spouting psychopaths into lobbying for their release from prison (Pinker 2002: 261–262).

Read the whole thing. (It’s an excerpt from The Better Angels of Our Nature: Why Violence Has Declined.)

The lowest layer of the pyramid is the foundation

Thursday, April 13th, 2017

It’s hard to find a teacher who doesn’t make reference to Bloom’s Taxonomy, Doug Lemov notes, because it’s part of the language of teaching, but there’s a problem:

Bloom’s Taxonomy is often represented as a pyramid with the understanding — intended or accidental — that teachers should try to get to the top. That’s the nature of pyramids, I guess.

Bloom's Taxonomy Pyramid

Generally, when teachers talk about “Bloom’s taxonomy,” they talk with disdain about “lower level” questions. They believe, perhaps because of the pyramid image, which puts knowledge at the bottom, that knowledge-based questions, especially via recall and retrieval practice, are the least productive thing they could be doing in class. No one wants to be the rube at the bottom of the pyramid.

But this, interestingly, is not what Bloom argued — at least according to Vanderbilt’s description. Saying that knowledge questions are low value and saying that knowledge is the necessary precondition for deep thinking are very different things. More importantly, believing that knowledge questions — even mere recall of facts — are low value doesn’t jibe with the overwhelming consensus of cognitive science, summarized here by Daniel Willingham, who writes,

Data from the last thirty years lead to a conclusion that is not scientifically challengeable: thinking well requires knowing facts, and that’s true not simply because you need something to think about. The very processes that teachers care about most — critical thinking processes such as reasoning and problem solving — are intimately intertwined with factual knowledge that is in long-term memory (not just found in the environment).

In other words, there are two parts to the equation. You not only have to teach a lot of facts to allow students to think deeply, but you also have to reinforce that knowledge enough to install it in long-term memory, or you can’t do any of the activities at the top of the pyramid.

US healthcare is famous for three things

Wednesday, April 12th, 2017

US healthcare is famous for three things, Ben Southwood notes:

It’s expensive, it’s not universal, and it has poor outcomes. The US spends around $7,000 per person on healthcare every year, or roughly 18% of GDP; the next highest spender is Switzerland, which spends about $4,500. Before Obamacare, approximately 15% of the US population was persistently uninsured (8.6% still is). And as this chart neatly shows, its overall outcome on the most important variable — overall life expectancy — is fairly poor.

But some of this criticism is wrongheaded and simplistic: when you slice the data up more reasonably, US outcomes look impressive, but being the world’s outrider is much more expensive than following behind. What’s more, most of the solutions people offer just don’t get to the heart of the issue: if you give people freedom they’ll spend a lot on healthcare.

The US undoubtedly spends a huge amount on healthcare. One popular narrative is that because of market failures and/or extreme overregulation in healthcare, prices are excessively high. So Americans with insurance (or covered by Medicare, the universal system for the elderly, or Medicaid, the government system for the poor) get the same as other developed world citizens, but those without get very poor care and die younger. A system like the NHS solves the problem, according to this view, with bulk buying of land, labour, and inputs, better incentives, and universal coverage.

But there are some serious flaws in this theory. Firstly, extending insurance to the previously uninsured doesn’t, in America, seem to have large benefits. For example, a recent NBER paper found no overall health gains from the massive insurance expansion under Obamacare.* A famous RAND study found minuscule benefits over decades from giving free insurance to the previously uninsured in the 1970s. In fact, over and above the basics, insuring those who choose not to get insurance doesn’t ever seem to have large gains. Indeed, there is wide geographic variation in life expectancy among low-income Americans, but this doesn’t even correlate with access to medical care! This makes it unlikely that the gap between the US and the rest is explained by universality.

To find the answer, consider the two main ingredients that go into health outcomes. One is underlying health, and the other is treatment. If latent health were the same across the Western world, we could presume that any differences came from differences in treatment. But this is simply not the case. Obesity is far higher in the USA than in any other major developed country. Obviously it is a public health problem, but it’s unrealistic to blame it on the US system of paying for doctors, administrators, hospitals, equipment and drugs.

In fact, in the US case it’s not even obesity, or indeed the country’s greater pre-existing disease burden, that is doing most of the work in dragging life expectancy down; it’s accidental and violent deaths. It is tragic that the US is so dangerous, but it’s not the fault of the healthcare system; indeed, it’s an extra burden that US healthcare spending must bear. Simply normalising for violent and accidental deaths puts the USA right at the top of the life expectancy rankings.

One of our cultural problems, Arnold Kling adds, is that we spend too much on health care and not enough on public health.

Take your shoes off at the door

Tuesday, April 11th, 2017

It turns out that taking your shoes off when you come inside doesn’t just keep the carpets cleaner. It’s also healthier:

Among samples collected in homes, 26.4% of shoe soles tested positive for C. diff, about three times the rate found on the surfaces of bathrooms and kitchens.

And that’s just one bacterium. In an earlier investigation, Dr. Garey examined past studies to learn if “shoe soles are a vector for infectious pathogens.” The answer was a resounding yes.

Among the studies: in 2015, Austrian researchers found that at least 40% of shoes carried Listeria monocytogenes. And a 2014 German study found that over a quarter of boots used on farms carried E. coli.

“Essentially, when you wear your shoes in a house, you are bringing in everything you stepped in during the day,” says Jonathan Sexton, a laboratory manager at the Mel & Enid Zuckerman College of Public Health at the University of Arizona.

Wiping your feet on a welcome mat, however vigorously, provides only limited help, he says. “It will remove some of the dirt, but you have to think of the person who wiped their feet before. You might be picking up stuff they left behind.”

Some homeowners may worry that guests in socks or bare feet might also represent a health risk. That’s possible, Dr. Sexton says, but the inside of a shoe has far less bacteria than the outside.

Both researchers agree that the risk is muted. “Shoes in the house are not something to freak out about,” Dr. Sexton says.

A Family Planning Miracle in Colorado

Monday, April 10th, 2017

So, what fraction of pregnancies are unintended?

Almost half of Colorado women who got pregnant in 2008 said that the pregnancy happened sooner than they wanted or that they hadn’t wanted to get pregnant at all. That was similar to the US average: the rate of unintended pregnancy has been stuck around 50 percent since the 1960s.

The Colorado Family Planning Initiative cut teen births and abortions almost in half:

They fell by nearly 20 percent among women aged 20-24. (Note: Under normal circumstances, over 80 percent of teen pregnancies and 70 percent of pregnancies among single women aged 20-29 are unsought, so this change means women’s realities are better matching their family desires.) Second-order births to teens—teens who gave birth a second or third time—dropped by 58 percent. High-risk births, including preterm births, also diminished.

Poor families benefited the most, because unsought pregnancy is four times as common and unsought birth seven times as common among poor women as among their more prosperous peers. With fewer families facing the dire circumstances triggered by an unexpected pregnancy or unplanned birth, the state saved $66-70 million in public assistance, according to a team of economists at the University of Colorado.

How did Colorado get such dramatic results? By providing “get it and forget it” forms of contraception, such as long-acting IUDs and implants.

This is seen as a great Progressive victory.

To Be a Genius, Think Like a 94-Year-Old

Sunday, April 9th, 2017

To be a genius, think like a 94-year-old — more specifically, like Dr. John Goodenough:

In 1946, a 23-year-old Army veteran named John Goodenough headed to the University of Chicago with a dream of studying physics. When he arrived, a professor warned him that he was already too old to succeed in the field.

Recently, Dr. Goodenough recounted that story for me and then laughed uproariously. He ignored the professor’s advice and today, at 94, has just set the tech industry abuzz with his blazing creativity. He and his team at the University of Texas at Austin filed a patent application on a new kind of battery that, if it works as promised, would be so cheap, lightweight and safe that it would revolutionize electric cars and kill off petroleum-fueled vehicles. His announcement has caused a stir, in part, because Dr. Goodenough has done it before. In 1980, at age 57, he coinvented the lithium-ion battery that shrank power into a tiny package.

We tend to assume that creativity wanes with age. But Dr. Goodenough’s story suggests that some people actually become more creative as they grow older. Unfortunately, those late-blooming geniuses have to contend with powerful biases against them.

[...]

On the contrary, there’s plenty of evidence to suggest that late blooming is no anomaly. A 2016 Information Technology and Innovation Foundation study found that inventors peak in their late 40s and tend to be highly productive in the last half of their careers. Similarly, professors at the Georgia Institute of Technology and Hitotsubashi University in Japan, who studied data about patent holders, found that, in the United States, the average inventor sends in his or her application to the patent office at age 47, and that the highest-value patents often come from the oldest inventors — those over the age of 55.

[...]

Years ago, he decided to create a solid battery that would be safer. Of course, in a perfect world, the “solid-state” battery would also be low-cost and lightweight. Then, two years ago, he discovered the work of Maria Helena Braga, a Portuguese physicist who, with the help of a colleague, had created a kind of glass that can replace liquid electrolytes inside batteries.

Dr. Goodenough persuaded Dr. Braga to move to Austin and join his lab. “We did some experiments to make sure the glass was dry. Then we were off to the races,” he said.

Some of his colleagues were dubious that he could pull it off. But Dr. Goodenough was not dissuaded. “I’m old enough to know you can’t close your mind to new ideas. You have to test out every possibility if you want something new.”

When I asked him about his late-life success, he said: “Some of us are turtles; we crawl and struggle along, and we haven’t maybe figured it out by the time we’re 30. But the turtles have to keep on walking.” This crawl through life can be advantageous, he pointed out, particularly if you meander through different fields, picking up clues as you go along. Dr. Goodenough started in physics and hopped sideways into chemistry and materials science, while also keeping his eye on the social and political trends that could drive a green economy. “You have to draw on a fair amount of experience in order to be able to put ideas together,” he said.

He also credits his faith for keeping him focused on his mission to defeat pollution and ditch petroleum. On the wall of his lab, a tapestry of the Last Supper depicts the apostles in fervent conversation, like scientists at a conference arguing over a controversial theory. The tapestry reminds him of the divine power that fuels his mind. “I’m grateful for the doors that have been opened to me in different periods of my life,” he said. He believes the glass battery was just another example of the happy accidents that have come his way: “At just the right moment, when I was looking for something, it walked in the door.”

Last but not least, he credited old age with bringing him a new kind of intellectual freedom. At 94, he said, “You no longer worry about keeping your job.”

Short- and Long-Term Memories

Saturday, April 8th, 2017

All memories start as short-term memories and then slowly convert into long-term memories — or so we thought:

Two parts of the brain are heavily involved in remembering our personal experiences.

The hippocampus is the place for short-term memories while the cortex is home to long-term memories.

This idea became famous after the case of Henry Molaison in the 1950s.

His hippocampus was damaged during epilepsy surgery, and he was no longer able to make new memories, but his memories from before the operation were still there.

So the prevailing idea was that memories are formed in the hippocampus and then moved to the cortex where they are “banked”.

[...]

The results, published in the journal Science, showed that memories were formed simultaneously in the hippocampus and the cortex.

[...]

The mice do not seem to use the cortex’s long-term memory in the first few days after it is formed.

They forgot the shock event when scientists turned off the short-term memory in the hippocampus.

However, they could then make the mice remember by manually switching the long-term memory on (so it was definitely there).

“It is immature or silent for the first several days after formation,” Prof Tonegawa said.

The researchers also showed the long-term memory never matured if the connection between the hippocampus and the cortex was blocked.

So there is still a link between the two parts of the brain, with the balance of power shifting from the hippocampus to the cortex over time.

Should America have entered World War I?

Friday, April 7th, 2017

The US entered the Great War 100 years ago, but why?

The war lasted only another year and a half, but in that time, an astounding 117,000 American soldiers were killed and 202,000 wounded.

[...]

America intervened nearly three years after the war began, and the “doughboys,” as our troops were called, engaged in serious combat for only a few months. More Americans in uniform died away from the battlefield — thousands from the Spanish flu — than with weapons in hand. After victory was achieved, Wilson’s audacious hope of making a peace that would advance democracy and national self-determination blew up in his face when the Senate refused to ratify the treaty he had signed at the Palace of Versailles.

But attention should be paid. America’s decision to join the Allies was a turning point in world history. It altered the fortunes of the war and the course of the 20th century — and not necessarily for the better. Its entry most likely foreclosed the possibility of a negotiated peace among belligerent powers that were exhausted from years mired in trench warfare.

Although the American Expeditionary Force did not engage in combat for long, the looming threat of several million fresh troops led German generals to launch a last, desperate series of offensives. When that campaign collapsed, Germany’s defeat was inevitable.

How would the war have ended if America had not intervened? The carnage might have continued for another year or two until citizens in the warring nations, who were already protesting the endless sacrifices required, forced their leaders to reach a settlement. If the Allies, led by France and Britain, had not won a total victory, there would have been no punitive peace treaty like that completed at Versailles, no stab-in-the-back allegations by resentful Germans, and thus no rise, much less triumph, of Hitler and the Nazis. The next world war, with its 50 million deaths, would probably not have occurred.

The pacifists failed:

Since the war began, feminists and socialists had worked closely with progressive members of Congress from the agrarian South and the urban Midwest to keep America out. They mounted street demonstrations, attracted prominent leaders from the labor and suffrage movements, and ran antiwar candidates for local and federal office. They also gained the support of Henry Ford, who chartered a ship full of activists who crossed the Atlantic to plead with the heads of neutral nations to broker a peace settlement.

They may even have had a majority of Americans on their side. In the final weeks before Congress declared war, anti-militarists demanded a national referendum on the question, confident voters would recoil from fighting and paying the bills so that one group of European powers could vanquish another.

Once the United States did enter the fray, Wilson, with the aid of the courts, prosecuted opponents of the war who refused to fall in line. Under the Espionage and Sedition Acts, thousands were arrested for such “crimes” as giving speeches against the draft and calling the Army “a God damned legalized murder machine.”

The intervention led to big changes in America, as well as the world. It began the creation of a political order most citizens now take for granted, even as some protest against it: a state equipped to fight war after war abroad while keeping a close watch on allegedly subversive activities at home.

To make the world “safe for democracy” required another innovation: a military-industrial establishment funded, then partly and now completely, by income taxes.

The Pentagon Is Making The CIA And State Department Obsolete

Thursday, April 6th, 2017

The Pentagon is making the CIA and State Department obsolete, Ryan Landry argues:

The State Department and CIA have been anxious since election night, not just because of the threat of a change in policy, but because these agencies now face something of an existential threat.

As Trump’s cabinet was assembled in late fall, the nervousness grew as Blue Empire realized how many generals and former generals would serve in high-ranking positions. Unbeknownst to most Americans, the fear is based on change within Red Empire. The Pentagon’s growth into a one-stop shop for policy formulation and implementation threatens Blue’s existence, or at a minimum its influence.

Since the planes flew into the Twin Towers, the Department of Defense has been in a steady growth mode not strictly limited to increases in spending and budgets. The very scope and nature of the DOD has changed. America’s wars of choice, the imperial wars after World War II, have forced the Pentagon to send a first-world military into regions with zero infrastructure. The U.S. military had to change, finding ways to support such a force from scratch in undeveloped areas. Random reports by American media outlets repeat the same phrase over and over: the Pentagon has become a one-stop shop for the presidency to seek advice, consider policies, and execute plans. For all the recent talk of a Deep State, the Pentagon has grown into a fully operational imperium in imperio.

If the president inquires, the Pentagon offers a soup-to-nuts service for policy that extends beyond warmaking. As one of the largest users of oil and oil products, the Pentagon needed to focus on securing oil, and so energy policy advisers and researchers sprang forth. The Pentagon has a budget and a literal army of individuals to enact policy with a speed that other departments can only dream of enjoying. The need to defend America’s computer networks has had the secondary consequence of creating a private ex-military IT network of contractors and systems experts.

Mission creep has provoked recent open reactions from rivals at State and CIA. CIA’s fear of former national security adviser Michael Flynn was due not to his supposed Russia connections but to his approach to intelligence. Flynn voiced concerns about and criticism of the CIA’s approach. Flynn also built up the Defense Intelligence Agency to integrate intelligence units and analysis directly with Army units for faster, more efficient operations. CIA was out to get Flynn as part of a turf battle and a legitimate fear of reform, or at least of a reorientation of who is used when and where in the empire.

Defense has also spent a generation pushing its officers to further their education. This is not simple credentialism; it is also creating a corps of officers who can rival their mirror images at State.