World War II films aren’t about World War II

Friday, July 21st, 2017

Many World War II films reveal at least as much about the times in which they are made as they do about the conflict itself:

“It’s possible that 20 years from now we’ll look back at ‘Dunkirk’ and say, ‘That movie was so 2017,’ and everyone will know exactly what that means,” said film historian Mark Harris, author of “Five Came Back,” a book about Hollywood and World War II that was also the subject of a recent Netflix documentary.

Around the beginning of the war, films served a practical purpose, rallying American solidarity behind the conflict. In 1940, Hitchcock’s “Foreign Correspondent” featured a reporter, in a radio-broadcast scene, calling for action with guns and battleships: “It’s as if the lights were out everywhere except in America,” he says. Chaplin, who directed and played the lead speaking role in 1940’s “The Great Dictator,” about an Adolf Hitler-like figure, delivers a final speech directly into the camera that includes the line: “Let us fight to free the world.”

During the war, filmmakers churned out movies in close to real time, going from script to screen in as few as six months, said Mr. Harris.

“Films made about World War II during the war are special because we don’t know we’re going to win,” said Thomas Doherty, a professor of American studies at Brandeis University who wrote “Projections of War: Hollywood, American Culture, and World War II.” “I’m always surprised when I look at World War II movies made during the war just how stern the lessons are. The guy you really like is often killed in the film.”

Soon, the anxieties of the atomic age begin to surface. “In Harm’s Way,” a 1965 film starring John Wayne as a naval officer in the Pacific after Pearl Harbor, ends with a shot of the ocean that morphs into what looks like a mushroom cloud. Mixed feelings around the Vietnam War enter the picture with movies like 1967’s “The Dirty Dozen,” a subversive take on conflict told through the story of death-row convicts on a mission to kill Nazis.

Veterans of World War II and Vietnam and civilian Baby Boomers might have taken different messages from 1970’s “Patton,” at once a portrait of a victorious general and a man driven by ego and ambition. Douglas Cunningham, co-editor of “A Wiley Companion to the War Film” and a teacher of film history at Westminster College in Salt Lake City, Utah, recalled a scene where Patton slaps the helmet of a soldier suffering from shell shock. “By 1970, you would have had plenty of folks returning from Vietnam traumatized in ways that would have been familiar to some members of that audience,” he said.

In time the Holocaust became a central part of the screen version of World War II, with movies like 1982’s “Sophie’s Choice,” about an Auschwitz survivor, and Spielberg’s 1993 drama “Schindler’s List.”

Movies have furthered an idea that the Holocaust was known to most American soldiers during the war. A scene hinting at that connection occurs in Spielberg’s “Saving Private Ryan,” when a Jewish soldier holds up the Star of David on his dog tag and repeats the German word for Jews—“Juden”—to captured enemy soldiers. “This is the way America sees World War II now—that it was all about the Holocaust and the Holocaust was the governing point,” said Robert Burgoyne, professor of film studies at the University of St Andrews and author of two books on U.S. history as told through the movies. “The Holocaust was not known to American culture generally. It is simply a kind of rewriting of World War II according to the contemporary generation’s perspective.”

In 1998, “Saving Private Ryan” presented the war to a new generation, starting with its harrowing opening of Allied troops storming Omaha Beach on D-Day. “In terms of stoking interest in World War II, these are the most important 20 minutes in cinema history,” said Rob Citino, senior historian at The National World War II Museum in New Orleans.

One in five Americans are prescribed opioids

Thursday, July 20th, 2017

More than one in five people were prescribed an opioid painkiller at least once in 2015 — at least among those insured by Blue Cross and Blue Shield:

The report, which covers 30 million people with Blue Cross and Blue Shield insurance in 2015, supports what experts have been saying: much, if not most, of the opioid overdose epidemic is being driven by medical professionals who are prescribing the drugs too freely.

“Twenty-one percent of Blue Cross and Blue Shield (BCBS) commercially insured members filled at least one opioid prescription in 2015,” the report says. “Data also show BCBS members with an opioid use disorder diagnosis spiked 493 percent over a seven year period.”

The report excludes people with cancer or terminal illnesses. What it found fits in with similar surveys of people with Medicare, Medicaid or other government health insurance, said Dr. Trent Haywood, chief medical officer for the Blue Cross and Blue Shield Association (BCBSA).

I may be screwing this person over

Wednesday, July 19th, 2017

A recent Freakonomics podcast looks at civic-minded Harvard physician Richard Clarke Cabot’s long-running Cambridge-Somerville Youth Study, which matched troubled boys with mentors, versus a matched control group that received no mentoring:

They found a null effect. They found there were no differences between the treatment and control boys on offending.

When computers came on the scene and they could analyze the data in finer detail, they made an interesting discovery:

On all seven measures — we’re talking, how long did you live? Were you a criminal? Were you mentally healthy, physically healthy, alcoholic, satisfied with your job, satisfied with your marriage? On all seven measures, the treatment group did statistically significantly worse than the control group.

The lesson:

And that’s one of the important things: people who are engaged in social interventions really don’t spend much time thinking, “I may be screwing this person over.” They are self-conscious about “Maybe this won’t work, but I’ve got to try!”

Think you drink a lot?

Tuesday, July 18th, 2017

Think you drink a lot? This chart will tell you:

These figures come from Philip J. Cook’s Paying the Tab, an economically minded examination of the costs and benefits of alcohol control in the U.S. Specifically, they’re calculations made using the National Epidemiologic Survey on Alcohol and Related Conditions (NESARC) data.

[Chart: Drinks per Capita by Decile]

“One consequence is that the heaviest drinkers are of greatly disproportionate importance to the sales and profitability of the alcoholic-beverage industry,” he writes. “If the top decile somehow could be induced to curb their consumption level to that of the next lower group (the ninth decile), then total ethanol sales would fall by 60 percent.”

(Hat tip to P.D. Mangan.)
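Cook’s 60 percent figure is simple decile arithmetic. Here is a minimal Python sketch; the per-decile numbers are illustrative placeholders with roughly the chart’s shape (heavily skewed toward the top decile), not Cook’s exact figures:

    # Hypothetical weekly drinks per capita by consumption decile, lowest to
    # highest. Placeholder values, not Cook's actual numbers, but with the
    # same heavy skew toward the top decile.
    drinks_per_week = [0.0, 0.0, 0.0, 0.02, 0.14, 0.63, 2.17, 6.25, 15.28, 73.85]

    total = sum(drinks_per_week)

    # Induce the top decile to drink like the ninth decile:
    curbed = drinks_per_week[:-1] + [drinks_per_week[-2]]

    drop = 1 - sum(curbed) / total
    print(f"Total consumption falls by {drop:.0%}")  # ~60% for a skew like this

Because the top decile accounts for the bulk of all drinks, trimming that one group to ninth-decile levels wipes out well over half of total consumption.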

Trevor Butterworth considers this data journalism gone wrong:

If we look at the section where he arrives at this calculation, and go to the footnote, we find that he used data from 2001-2002 from NESARC, the National Institute on Alcohol Abuse and Alcoholism, which had a representative sample of 43,093 adults over the age of 18. But following this footnote, we find that Cook corrected these data for under-reporting by multiplying the number of drinks each respondent claimed they had drunk by 1.97 in order to comport with the previous year’s sales data for alcohol in the US. Why? It turns out that alcohol sales in the US in 2000 were double what NESARC’s respondents — a nationally representative sample, remember — claimed to have drunk.

While the mills of US dietary research rely on the great National Health and Nutrition Examination Survey to digest our diets and come up with numbers, we know, thanks to the recent work of Edward Archer, that recall-based survey data are highly unreliable: we misremember what we ate, we misjudge by how much; we lie. Were we to live on what we tell academics we eat, life for almost two thirds of Americans would be biologically implausible.

But Cook, who is trying to show that the distribution is uneven, ends up trying to solve an apparent recall problem by creating an aggregate multiplier to plug the sales data gap. And the problem is that this requires us to believe that every drinker misremembered by a factor of almost two. This might not be much of a stretch for moderate drinkers; but did everyone who drank, say, four or eight drinks per week systematically forget that they actually had eight or sixteen? That seems like a stretch.
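The correction step Butterworth objects to is easy to state in code. A minimal sketch, with variable names and numbers that are mine rather than Cook’s:

    # Sketch of the aggregate under-reporting correction described above.
    # The figures are illustrative stand-ins, not the NESARC or sales data.
    reported = [2, 4, 8, 20, 60]       # self-reported weekly drinks per respondent
    survey_total = sum(reported)
    sales_total = 1.97 * survey_total  # stand-in: sales roughly double the survey total

    k = sales_total / survey_total     # one aggregate multiplier, ~1.97
    corrected = [d * k for d in reported]

    # The hidden assumption: every respondent under-reported by the same
    # factor, so the four-drink-a-week respondent is treated as drinking eight.
    print(corrected)

A single multiplier preserves the total but silently assumes the under-reporting is uniform across light and heavy drinkers, which is exactly the premise Butterworth finds implausible.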

Deep down, they really want a king or queen

Saturday, July 15th, 2017

Ross Douthat recently teased liberals that they really like Game of Thrones because, deep down, they really want a king or queen. One respondent, Phillips, considers this a strong misreading of what Martin’s story and the show are offering:

To say that Game of Thrones is attractive to liberals because of secret monarchical longings, you have to ignore…everything GoT is doing. GoT does not make being a Stark bannerman or a Daenerys retainer look fun! Those people get flayed and beheaded! GoT presents a vision of monarchy that is exaggeratedly dystopian even compared to most of the historical reality of monarchy. I think that dystopian exaggeration is in fact key to the show’s appeal to liberals in many ways. It lets you fantasize about the negation of your principles while simultaneously confirming their rightness. GoT presents a vision of a world in which illiberal instincts can be freely indulged, in which the id is constrained only by physical power. All the violent, nasty stuff liberal society (thankfully) won’t let us do, but that’s still seething in our lizard brains, gets acted out. And not just acted out — violence and brutality are the organizing principles on which the world is based.

But this is where the dystopianism comes in, because the show chides you for harboring the very fantasies it helps you gratify. It wallows in their destructive consequences — makes that wallowing, in fact, simultaneous with the fulfillment of the fantasies. Will to power leads to suffering and chaos, which lead to more opportunities for the will to power to be acted upon, etc. This is a vastly more complex and interesting emotional appeal than “people secretly want kings.” The liberal order is always being implicitly upheld by the accommodation of our base desire for its opposite. To me, this is the most interesting ongoing thing about GoT, a franchise I’m otherwise completely tired of. Everyone wants to move to Hogwarts; only a lunatic would actually want to LIVE in Westeros. In an escapist genre, that’s interesting. It’s not subliminal royalism; it’s dark escapism, an escape that ultimately tends toward reconciliation with the existing order.

And what do liberals secretly love more than an excuse to reconcile with the existing order? Westeros makes Prime Day look utopian!

It is “a very good description of what a lot of prestige television has done,” Douthat agrees, but Game of Thrones is different:

These shows [The Sopranos, Mad Men, and Breaking Bad] invite liberal viewers into various illiberal or pre-liberal or just, I suppose, red-state worlds, which are more violent and sexist and id-driven than polite prestige-TV-viewing liberal society, and which offer viewers the kind of escapism that Phillips describes … in which there is a temporary attraction to being a mobster or hanging out with glamorous chain-smoking ’50s admen or leaving your put-upon suburban life behind and becoming Heisenberg the drug lord. But then ultimately because these worlds are clearly wicked, dystopic or just reactionary white-male-bastions you can return in relief to the end of history, making Phillips’ “reconciliation with the existing order” after sojourning for a while in a more inegalitarian or will-to-power world.

[...]

“Game of Thrones,” however, is somewhat different. Yes, it makes the current situation in Westeros look hellish, by effectively condensing all of the horrors of a century of medieval history into a few short years of civil war. And yes, it’s much darker and bloodier and has a much higher, “wait, I thought he was a hero” body count than a lot of fantasy fiction, which lets people describe it as somehow Sopranos-esque.

But fundamentally “The Sopranos” was a story without any heroes, a tragedy in which the only moral compass (uncertain as Dr. Melfi’s arrow sometimes was) was supplied by an outsider to its main characters’ world. Whereas “Game of Thrones” is still working within the framework of its essentially romantic genre — critiquing it and complicating it, yes, but also giving us a set of heroes and heroines to root for whose destinies are set by bloodlines and prophecies, and who are likely in the end to save their world from darkness and chaos no less than Aragorn or Shea Ohmsford or Rand al’Thor.

Put another way: On “The Sopranos,” there is no right way to be a mafioso. But on “Game of Thrones” there is a right way to be a lord or king and knight, and there are characters who model the virtues of each office, who prove that chivalry and wise lordship need not be a myth. Sometimes they do so in unexpected ways — the lady knight who has more chivalry than the men who jeer at her, the dwarf who rules more justly than the family members who look down on him. But this sort of reversal is typical of the genre, which always has its hobbits and stable boys and shieldmaidens ready to surprise the proud and prejudiced. And it coexists throughout the story with an emphasis on the importance of legitimacy and noblesse oblige and dynastic continuity, which is often strikingly uncynical given the dark-and-gritty atmosphere.

Consider that the central family, the Starks, are wise rulers whose sway over the North has endured for an implausible number of generations — “there has always been a Stark in Winterfell,” etc. — and whose people seem to genuinely love them. Their patriarch is too noble for his own good but only because he leaves his native fiefdom for the corruption of the southern court, and his naivete is still presented as preferable to the cynicism of his Lannister antagonists, who win temporary victories but are on their way to destroying their dynasty through their amorality and single-minded self-interestedness.

Confronted with sandwiches named Padrino and Pomodoro

Thursday, July 13th, 2017

David Brooks has come to think that the structural barriers between the classes are less important than the informal social barriers that segregate the lower 80 percent:

Recently I took a friend with only a high school degree to lunch. Insensitively, I led her into a gourmet sandwich shop. Suddenly I saw her face freeze up as she was confronted with sandwiches named “Padrino” and “Pomodoro” and ingredients like soppressata, capicollo and a striata baguette. I quickly asked her if she wanted to go somewhere else and she anxiously nodded yes and we ate Mexican.

American upper-middle-class culture (where the opportunities are) is now laced with cultural signifiers that are completely illegible unless you happen to have grown up in this class. They play on the normal human fear of humiliation and exclusion. Their chief message is, “You are not welcome here.”

In her thorough book “The Sum of Small Things,” Elizabeth Currid-Halkett argues that the educated class establishes class barriers not through material consumption and wealth display but by establishing practices that can be accessed only by those who possess rarefied information.

To feel at home in opportunity-rich areas, you’ve got to understand the right barre techniques, sport the right baby carrier, have the right podcast, food truck, tea, wine and Pilates tastes, not to mention possess the right attitudes about David Foster Wallace, child-rearing, gender norms and intersectionality.

The educated class has built an ever more intricate net to cradle us in and ease everyone else out. It’s not really the prices that ensure 80 percent of your co-shoppers at Whole Foods are, comfortingly, also college grads; it’s the cultural codes.

Status rules are partly about collusion, about attracting educated people to your circle, tightening the bonds between you and erecting shields against everybody else. We in the educated class have created barriers to mobility that are more devastating for being invisible. The rest of America can’t name them, can’t understand them. They just know they’re there.

Unemployment is the greater evil

Thursday, July 13th, 2017

Policymakers seem intent on making the joblessness crisis worse, Ed Glaeser laments:

The past decade or so has seen a resurgent progressive focus on inequality — and little concern among progressives about the downsides of discouraging work. Advocates of a $15 minimum hourly wage, for example, don’t seem to mind, or believe, that such policies deter firms from hiring less skilled workers. The University of California–San Diego’s Jeffrey Clemens examined states where higher federal minimum wages raised the effective state-level minimum wage during the last decade. He found that the higher minimum “reduced employment among individuals ages 16 to 30 with less than a high school education by 5.6 percentage points,” which accounted for “43 percent of the sustained, 13 percentage point decline in this skill group’s employment rate.”

The decision to prioritize equality over employment is particularly puzzling, given that social scientists have repeatedly found that unemployment is the greater evil. Economists Andrew Clark and Andrew Oswald have documented the huge drop in happiness associated with unemployment — about ten times larger than that associated with a reduction in earnings from the $50,000–$75,000 range to the $35,000–$50,000 bracket. One recent study estimated that unemployment leads to 45,000 suicides worldwide annually. Jobless husbands have a 50 percent higher divorce rate than employed husbands. The impact of lower income on suicide and divorce is much smaller. The negative effects of unemployment are magnified because it so often becomes a semipermanent state.

Time-use studies help us understand why the unemployed are so miserable. Jobless men don’t do a lot more socializing; they don’t spend much more time with their kids. They do spend an extra 100 minutes daily watching television, and they sleep more. The jobless also are more likely to use illegal drugs. While fewer than 10 percent of full-time workers have used an illegal substance in any given week, 18 percent of the unemployed have done drugs in the last seven days, according to a 2013 study by Alejandro Badel and Brian Greaney.

Joblessness and disability are also particularly associated with America’s deadly opioid epidemic. David Cutler and I examined the rise in opioid deaths between 1992 and 2012. The strongest correlate of those deaths is the share of the population on disability. That connection suggests a combination of the direct influence of being disabled, which generates a demand for painkillers; the availability of the drugs through the health-care system; and the psychological misery of having no economic future.

Increasing the benefits received by nonemployed persons may make their lives easier in a material sense but won’t help reattach them to the labor force. It won’t give them the sense of pride that comes from economic independence. It won’t give them the reassuring social interactions that come from workplace relationships. When societies sacrifice employment for a notion of income equality, they make the wrong choice.

Politicians, when they do focus on long-term unemployment, too often advance poorly targeted solutions, such as faster growth, more infrastructure investment, and less trade. More robust GDP growth is always a worthy aim, but it seems unlikely to get the chronically jobless back to work. The booms of the 1990s and early 2000s never came close to restoring the high employment rates last seen in the 1970s. Between 1976 and 2015, Nevada’s GDP grew the most and Michigan’s GDP grew the least among American states. Yet the two states had almost identical rises in the share of jobless prime-age men.

Infrastructure spending similarly seems poorly targeted to ease the problem. Contemporary infrastructure projects rely on skilled workers, typically with wages exceeding $25 per hour; most of today’s jobless lack such skills. Further, the current employment in highway, street, and bridge construction in the U.S. is only 316,000. Even if this number rose by 50 percent, it would still mean only a small reduction in the millions of jobless Americans. And the nation needs infrastructure most in areas with the highest population density; joblessness is most common outside metropolitan America. (See “If You Build It…,” Summer 2016.)

Finally, while it’s possible that the rise of American joblessness would have been slower if the U.S. had weaker trade ties to lower-wage countries like Mexico and China, American manufacturers have already adapted to a globalized world by mechanizing and outsourcing. We have little reason to be confident that restrictions on trade would bring the old jobs back. Trade wars would have an economic price, too. American exporters would cut back hiring. The cost of imported manufactured goods would rise, and U.S. consumers would pay more, in exchange for — at best — uncertain employment gains.

The techno-futurist narrative holds that machines will displace most workers, eventually. Social peace will be maintained only if the armies of the jobless are kept quiet with generous universal-income payments. This vision recalls John Maynard Keynes’s 1930 essay “Economic Possibilities for Our Grandchildren,” which predicts a future world of leisure, in which his grandchildren would be able to satisfy their basic needs with a few hours of labor and then spend the rest of their waking hours edifying themselves with culture and fun.

But for many of us, technological progress has led to longer work hours, not playtime. Entrepreneurs conjured more products that generated more earnings. Almost no Americans today would be happy with the lifestyle of their ancestors in 1930. For many, work also became not only more remunerative but more interesting. No Pennsylvania miner was likely to show up for extra hours (without extra pay) voluntarily. Google employees do it all the time.

Joblessness is not foreordained, because entrepreneurs can always dream up new ways of making labor productive. Ten years ago, millions of Americans wanted inexpensive car service. Uber showed how underemployed workers could earn something providing that service. Prosperous, time-short Americans are desperate for a host of other services — they want not only drivers but also cooks for their dinners and nurses for their elderly parents and much more. There is no shortage of demand for the right kinds of labor, and entrepreneurial insight could multiply the number of new tasks that could be performed by the currently out-of-work. Yet over the last 30 years, entrepreneurial talent has focused far more on delivering new tools for the skilled than on employment for the unlucky. Whereas Henry Ford employed hundreds of thousands of Americans without college degrees, Mark Zuckerberg primarily hires highly educated programmers.

Korea established a pattern

Wednesday, July 12th, 2017

Korea established a pattern that has been unfortunately followed in American wars in Vietnam, Iraq, and Afghanistan:

These are wars without declaration and without the political consensus and the resolve to meet specific and changing goals. They are improvisational wars. They are dangerous.

The wars of the last 63 years, ranging from Korea to Vietnam to Afghanistan to Iraq (but excepting Operation Desert Storm, which is an outlier from this pattern), have been marked by:

  • Inconsistent or unclear military goals with no congressional declaration of war.
  • Early presumptions on the part of the civilian leadership and some top military officials that this would be an easy operation. An exaggerated view of American military strength, a dismissal of the ability of the opposing forces, and little recognition of the need for innovation.
  • Military action that, except during the first year in Korea, largely lacked geographical objectives of seize and hold.
  • Military action with restricted rules of engagement and political constraints on the use of a full arsenal of firepower.
  • Military action against enemy forces that have sanctuaries which are largely off-limits.
  • Military action that is rhetorically in defense of democracy — ignoring the reality of the undemocratic nature of regimes in Seoul, Saigon, Baghdad, and Kabul.
  • With the exception of some of the South Korean and South Vietnamese military units, these have been wars with in-country allies that were not dependable.
  • Military action that civilian leaders modulate, often clumsily, between domestic political reassurance and international muscle-flexing. Downplaying the scale of deployment and length of commitment for the domestic audience and threatening expansion of these for the international community.
  • Wars fought by increasingly less representative sectors of American society, which further encourages most Americans to pay little attention to the details of these encounters.
  • Military action that is costly in lives and treasure and yet does not enjoy the support that wars require in a democracy.

Some of the restraints and restrictions on the conduct of these wars have been politically and even morally necessary. But it is neither politically nor morally defensible to send the young to war without a public consensus that the goals are understood and essential, and the restraints and the costs are acceptable.

Mattis cited this Atlantic piece in his recent interview with the Mercer Island High School Islander.

University establishments are the next best thing to a cult

Wednesday, July 12th, 2017

The Toronto Star reports that a certain controversial U of T professor is making nearly $50,000 a month through crowdfunding:

Prof. Jordan Peterson, who made headlines last fall when he publicly refused to use gender neutral pronouns, has been using the fundraising platform Patreon since last March to subsidize costs associated with filming and uploading videos of his lectures to YouTube.

He is now harnessing his online clout with eyes on a new goal — to offer an online university degree in the humanities for which students pay only for examinations.

“I’m fighting this as a battle of ideas,” Peterson told the Star. “Hopefully I can bring high-quality education to millions of people — for nothing. Wouldn’t that be cool.”

Peterson said he views university establishments as “the next best thing to a cult” due to their focus on what he calls “postmodern” themes such as equity. He says his independent project will contrast with the university model by providing straight humanities education.

For roughly his first seven months on Patreon, Peterson earned about $1,000 per month. That changed last October, when he saw a dramatic increase in support, which has not slowed. The professor surpassed a fundraising goal of $45,000 on June 10, and is now aiming for $100,000 per month. On Monday, Peterson was making $49,460 every month from 4,432 patrons.

He is currently the 32nd-highest-earning Patreon creator, out of more than 75,000 people who are using the site to fundraise.

“Obviously people are pretty happy with the approach that I’ve been taking to psychological matters and, I suppose, to some degree, political matters online,” Peterson said.

He does have quite a few YouTube videos now. Here’s his message to Millennials on how to change the world — properly:

Mattis called him back

Tuesday, July 11th, 2017

The Mercer Island High School newspaper, the Islander, snagged an interview with James Mattis:

In a photo published alongside this article by The Washington Post on May 11, Trump’s bodyguard, Keith Schiller, could be seen carrying a stack of papers with a yellow sticky note stuck on the top. Written on it, in black ink, was the name “Jim ‘Mad Dog’ Mattis” and a phone number.

Paul Redmond of Orange County, California, contacted The Post the next day, informing them that they’d accidentally published what seemed to be Mattis’ private number.

The photo was quickly removed, but not before many, including MIHS Islander sophomore staff writer Teddy Fischer, saved it.

Calling the number, he left a message asking if Mattis would be interested in conducting a phone interview with The Islander. A few days later, when Teddy said Mattis had agreed, I didn’t believe him.

But, after receiving three more calls from the defense secretary to set up a date and time for the interview, Teddy and I got to work preparing questions.

[...]

When asked why, out of thousands of calls, Mattis chose to respond to us, he returned to his love of teaching.

“I’ve always tried to help students because I think we owe it to you young folks to pass on what we learned going down the road so that you can make your own mistakes,” he said, “not the same ones we made.”

Mattis is a history buff enshrining himself in history. Through teaching and reaching out to students like Teddy, he’s sharing history, and the wisdom he’s gained in creating it, as it’s being made.

The Joker leads a media war against Gotham’s elite

Tuesday, July 11th, 2017

Marvel’s sales tanked, the Yawfle notes, when the writers decided to put ham-fisted political messages above good stories. Now it’s DC’s turn:

For Batman: White Knight, writer-illustrator Sean Murphy (The Wake, Punk Rock Jesus) created a version of Gotham with real, modern-day problems, and then let Batman solve them by making him the villain. How? In the comic mini-series’ alternate reality, it’s the Joker — cured of his insanity — who sees that Bruce Wayne is just another part of the city’s vicious cycle of crime and sets out to stop him.

“My main goal was to undo the comic tropes while changing Gotham from a comic book city into a real city — a city dealing with everything from Black Lives Matter to the growing wage gap,” Murphy says. “[But] rather than write a comic about the wage gap, I gave those ideas to the Joker, who leads a kind of media war against Gotham’s elite by winning people over with his potent observations and rhetoric.”

I don’t think Murphy intended this to be a Rorschach test, but half his audience will probably see this new “heroic” Joker as perfectly villainous.

Being right about the future is wrong

Sunday, July 9th, 2017

A recent Slate piece sees White Nationalist roots in Trump’s Warsaw speech defending Western civilization:

Likewise, the prosaic warning that unnamed “forces” will sap the West of its will to defend itself recalls Bannon’s frequent references to The Camp of the Saints, an obscure French novel from 1973 that depicts a weak and tolerant Europe unable to defend itself from a flotilla of impoverished Indians…

Steve Sailer caricatures this position:

You see, there was this evil novel back in 1973 that pretty accurately predicted what Chancellor Merkel would choose to do in 2015. Being right about the future is wrong. Making accurate predictions is bad. As punishment for spawning one novelist who was so vile as to grasp the future Europe was headed toward, we must make sure this dystopia comes true. Because you deserve it for being so despicable as to understand our intentions toward you.

They created a spiritual plague

Sunday, July 9th, 2017

Dylan Levi King has translated a Chinese Leftist’s contemptuous history of the evolution of the White Left, which emphasizes something we rarely hear about in history class:

While the German white left was busy bullshitting new theories in politics, the French white left stayed in the game, too — but they turned their attention to art and literature. We can see their level of achievement in this short exclamation from Maupassant:

I’ve got the pox! at last! the real thing! not the contemptible clap, not the ecclesiastical crystalline, not the bourgeois coxcombs or the leguminous cauliflowers — no — no, the great pox, the one which Francis I died of. The majestic pox, pure and simple; the elegant syphilis …. I’ve got the pox … and I am proud of it, by thunder, and to hell with the bourgeoisie.

Charles Baudelaire says it this way: “We have all of us got the spirit of republicanism in our veins, as we have the pox in our bones; we are democratized and syphilized.” It’s no wonder that the Germans called syphilis the French disease.

France’s second generation of the white left took the German white left’s Freudian ideas and white left version of liberalism to push a vision of free love. This is the reason that syphilis spread so widely in the white left camp. We know Maupassant had the disease, but so did Van Gogh, and Gauguin, too, and let’s not forget Oscar Wilde. When their fans went to the whorehouse, they turned up their noses at the whores that didn’t have syphilis — they wanted to go mad from the disease, just like their idols.

From the third generation of the white left, the idea of sickness as a badge gains credibility. Illness shows ideological devotion. Morbidity is the main feature of the second generation of the French white left. As important as their achievements in art and literature, in the ideological arena, they only managed to borrow from and degrade the philosophy of earlier times.

To sum up, Germany’s second generation white left provided the theoretical basis for the next generation and the French turned to art and literature. They created a spiritual plague. This is an important term to define:

The white left flaunts ideals, which may or may not be false, and turns its sickness into a morbid badge — this is how its members get noticed and find self-satisfaction.

That is all you need to know. If you understand what I have just written, you will understand why the women of the contemporary white left sweep into Middle East refugee camps with their messages of love.

The second generation of the white left is the most important generation in the history of the evolution of this philosophy. They have combined this philosophy with fashion and psychology and created an “infectious ideology” that has spread as fast as syphilis.

Why has Italy been spared mass terror attacks in recent years?

Thursday, July 6th, 2017

Why has Italy been spared mass terror attacks in recent years?

Some experts say Italy has been able to combat the threat of Isis domestically by mastering legal and policing tools developed through years of experience in mafia investigations, which in turn were born out of the so-called “years of lead” — the period between the late 1960s and early 1980s marked by acts of political terrorism by left- and right-wing militants.

According to figures released by the Italian interior ministry, counter-terrorism authorities stopped and questioned 160,593 people between March 2016 and March 2017. They stopped and interrogated about 34,000 at airports and arrested about 550 suspected terrorists, and 38 have been sentenced on terrorism charges. More than 500 websites have been shut down and nearly half a million have been monitored.

Giampiero Massolo, who served as the director of Italian intelligence from 2012 to 2016, said there was not a particular “Italian way” to combat terrorism.

“We learned a very harsh lesson during our terrorism years,” he said. “From that we drew the experience of how important it is to maintain a constant dialogue at the operating level between intelligence and law enforcement forces. In fact, prevention is key to try to be effective in counter-terrorism.”

He added: “Another feature is to have a good control of the territory. From this point of view, the absence of [French] banlieues-like spots in Italian major cities, and …[the predominance] of small and medium towns makes it easier to monitor the situation.”

There are also more specific practices. Arturo Varvelli, a senior research fellow and terrorism expert at the thinktank Ispi, said the lack of second- and third-generation Italians who might be susceptible to Isis propaganda meant authorities instead focused on non-citizens, who could be deported at the first signs of concern. Since January, 135 individuals had been expelled, he said.

Italian authorities also rely on intercepted phone calls, which, unlike in the UK, can be used as evidence in court and — in cases related to mafia and terrorism — can be obtained on the basis of suspicious activity rather than solid evidence.

Much like the fight against Italian organised crime — the Camorra around Naples, the Cosa Nostra in Sicily, and the ’Ndràngheta in the south — infiltrating and disrupting terror networks requires breaking close social and even family relationships.

People suspected of being jihadis are encouraged to break ranks and cooperate with Italian authorities, who use residency permits and other incentives, Galli said. There has been a recognition, too, of the dangers of keeping terror suspects in jail where, much like mafia bosses before them, prison is seen as a prime territory for recruiting and networking.

“I think we have developed experience in how to deal with a criminal network. We have lots of undercover agents who do a great job of intercepting communication,” she said.

While Italian authorities are seen as having broad powers, police do not have special powers to detain terror suspects without charge. Terror suspects may be held for up to four days without charge, just like any other suspect. However, Italy has been criticised by the European court of human rights for holding defendants too long once they have been charged and are awaiting trial.

Galli said there was no groundswell of concern about whether Italy’s tactics violated civil liberties. The broad use of surveillance — including intercepted communication — is seen as sufficiently targeted to terror and mafia suspects, unlike public criticism in Italy of sweeping data collection methods used in the US and UK.

How the Democrats lost their way on immigration

Monday, July 3rd, 2017

Peter Beinart explains how the Democrats lost their way on immigration:

If the right has grown more nationalistic, the left has grown less so. A decade ago, liberals publicly questioned immigration in ways that would shock many progressives today.

In 2005, a left-leaning blogger wrote, “Illegal immigration wreaks havoc economically, socially, and culturally; makes a mockery of the rule of law; and is disgraceful just on basic fairness grounds alone.” In 2006, a liberal columnist wrote that “immigration reduces the wages of domestic workers who compete with immigrants” and that “the fiscal burden of low-wage immigrants is also pretty clear.” His conclusion: “We’ll need to reduce the inflow of low-skill immigrants.” That same year, a Democratic senator wrote, “When I see Mexican flags waved at pro-immigration demonstrations, I sometimes feel a flush of patriotic resentment. When I’m forced to use a translator to communicate with the guy fixing my car, I feel a certain frustration.”

The blogger was Glenn Greenwald. The columnist was Paul Krugman. The senator was Barack Obama.

[...]

Unfortunately, while admitting poor immigrants makes redistributing wealth more necessary, it also makes it harder, at least in the short term. By some estimates, immigrants, who are poorer on average than native-born Americans and have larger families, receive more in government services than they pay in taxes. According to the National Academies report, immigrant-headed families with children are 15 percentage points more likely to rely on food assistance, and 12 points more likely to rely on Medicaid, than other families with children. In the long term, the United States will likely recoup much if not all of the money it spends on educating and caring for the children of immigrants. But in the meantime, these costs strain the very welfare state that liberals want to expand in order to help those native-born Americans with whom immigrants compete.

What’s more, studies by the Harvard political scientist Robert Putnam and others suggest that greater diversity makes Americans less charitable and less willing to redistribute wealth. People tend to be less generous when large segments of society don’t look or talk like them. Surprisingly, Putnam’s research suggests that greater diversity doesn’t reduce trust and cooperation just among people of different races or ethnicities—it also reduces trust and cooperation among people of the same race and ethnicity.