It all comes down to eye glances

Friday, July 14th, 2017

MIT researchers are trying to figure out how people really drive:

In 2012, government-sponsored researchers rigged up 2,600 regular drivers’ vehicles with cameras and sensors in six states, then left them alone for more than a year. The result is a large, objective, and detailed database of actual driving behavior, the kind of info that’s very useful if you want to figure out exactly what causes crashes.

The MIT researchers and their colleagues took that database and added another twist. While many scientists looking to crack why a crash happened might look at the five or six seconds before the event, these researchers backed it all the way up, to around 20 seconds beforehand.

“Upstream, further prior to an event, we begin to see failures in attention allocation that are indicative of less awareness in the operating environment in the crash events,” says Bryan Reimer, an engineer who studies driver behavior at MIT. In other words: The problems that cause crashes start well before the crunch.

It all comes down to eye glances. Sure, the more time you spend looking off the road, the likelier your chance of crashing. But the time you spend looking on the road matters, too. If your glances at, say, the texts in your lap are longer than the darting ones you make back to the highway in front of you, you gradually lose awareness of where you are in space.

Usually, drivers are pretty good at managing that attentional and situational awareness, judging when it’s appropriate to look down at the radio, for example. But smartphones and in-car infotainment systems present a new issue: The driver isn’t really deciding when to engage with the product. “If the phone goes brrrrring, you feel socially or emotionally compelled to respond to it,” says Reimer. The problem is that the cueing arrives with no regard to when’s a good time.

Confronted with sandwiches named Padrino and Pomodoro

Thursday, July 13th, 2017

David Brooks has come to think that the structural barriers between the classes are less important than the informal social barriers that segregate the lower 80 percent:

Recently I took a friend with only a high school degree to lunch. Insensitively, I led her into a gourmet sandwich shop. Suddenly I saw her face freeze up as she was confronted with sandwiches named “Padrino” and “Pomodoro” and ingredients like soppressata, capicollo and a striata baguette. I quickly asked her if she wanted to go somewhere else and she anxiously nodded yes and we ate Mexican.

American upper-middle-class culture (where the opportunities are) is now laced with cultural signifiers that are completely illegible unless you happen to have grown up in this class. They play on the normal human fear of humiliation and exclusion. Their chief message is, “You are not welcome here.”

In her thorough book The Sum of Small Things, Elizabeth Currid-Halkett argues that the educated class establishes class barriers not through material consumption and wealth display but by establishing practices that can be accessed only by those who possess rarefied information.

To feel at home in opportunity-rich areas, you’ve got to understand the right barre techniques, sport the right baby carrier, have the right podcast, food truck, tea, wine and Pilates tastes, not to mention possess the right attitudes about David Foster Wallace, child-rearing, gender norms and intersectionality.

The educated class has built an ever more intricate net to cradle us in and ease everyone else out. It’s not really the prices that ensure 80 percent of your co-shoppers at Whole Foods are, comfortingly, also college grads; it’s the cultural codes.

Status rules are partly about collusion, about attracting educated people to your circle, tightening the bonds between you and erecting shields against everybody else. We in the educated class have created barriers to mobility that are more devastating for being invisible. The rest of America can’t name them, can’t understand them. They just know they’re there.

Unemployment is the greater evil

Thursday, July 13th, 2017

Policymakers seem intent on making the joblessness crisis worse, Ed Glaeser laments:

The past decade or so has seen a resurgent progressive focus on inequality — and little concern among progressives about the downsides of discouraging work. Advocates of a $15 minimum hourly wage, for example, don’t seem to mind, or believe, that such policies deter firms from hiring less skilled workers. The University of California–San Diego’s Jeffrey Clemens examined states where higher federal minimum wages raised the effective state-level minimum wage during the last decade. He found that the higher minimum “reduced employment among individuals ages 16 to 30 with less than a high school education by 5.6 percentage points,” which accounted for “43 percent of the sustained, 13 percentage point decline in this skill group’s employment rate.”

The decision to prioritize equality over employment is particularly puzzling, given that social scientists have repeatedly found that unemployment is the greater evil. Economists Andrew Clark and Andrew Oswald have documented the huge drop in happiness associated with unemployment — about ten times larger than that associated with a reduction in earnings from the $50,000–$75,000 range to the $35,000–$50,000 bracket. One recent study estimated that unemployment leads to 45,000 suicides worldwide annually. Jobless husbands have a 50 percent higher divorce rate than employed husbands. The impact of lower income on suicide and divorce is much smaller. The negative effects of unemployment are magnified because it so often becomes a semipermanent state.

Time-use studies help us understand why the unemployed are so miserable. Jobless men don’t do a lot more socializing; they don’t spend much more time with their kids. They do spend an extra 100 minutes daily watching television, and they sleep more. The jobless also are more likely to use illegal drugs. While fewer than 10 percent of full-time workers have used an illegal substance in any given week, 18 percent of the unemployed have done drugs in the last seven days, according to a 2013 study by Alejandro Badel and Brian Greaney.

Joblessness and disability are also particularly associated with America’s deadly opioid epidemic. David Cutler and I examined the rise in opioid deaths between 1992 and 2012. The strongest correlate of those deaths is the share of the population on disability. That connection suggests a combination of the direct influence of being disabled, which generates a demand for painkillers; the availability of the drugs through the health-care system; and the psychological misery of having no economic future.

Increasing the benefits received by nonemployed persons may make their lives easier in a material sense but won’t help reattach them to the labor force. It won’t give them the sense of pride that comes from economic independence. It won’t give them the reassuring social interactions that come from workplace relationships. When societies sacrifice employment for a notion of income equality, they make the wrong choice.

Politicians, when they do focus on long-term unemployment, too often advance poorly targeted solutions, such as faster growth, more infrastructure investment, and less trade. More robust GDP growth is always a worthy aim, but it seems unlikely to get the chronically jobless back to work. The booms of the 1990s and early 2000s never came close to restoring the high employment rates last seen in the 1970s. Between 1976 and 2015, Nevada’s GDP grew the most and Michigan’s GDP grew the least among American states. Yet the two states had almost identical rises in the share of jobless prime-age men.

Infrastructure spending similarly seems poorly targeted to ease the problem. Contemporary infrastructure projects rely on skilled workers, typically with wages exceeding $25 per hour; most of today’s jobless lack such skills. Further, the current employment in highway, street, and bridge construction in the U.S. is only 316,000. Even if this number rose by 50 percent, it would still mean only a small reduction in the millions of jobless Americans. And the nation needs infrastructure most in areas with the highest population density; joblessness is most common outside metropolitan America. (See “If You Build It…,” Summer 2016.)
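
To make the scale argument concrete, here is a quick back-of-the-envelope check in Python. The 316,000 employment figure and the hypothetical 50 percent increase come from the passage above; the count of jobless Americans is an illustrative placeholder of my own (the excerpt says only "millions"), not a cited statistic.

```python
# Back-of-the-envelope check of the infrastructure argument above.
# The employment figure and the 50% increase come from the excerpt;
# the jobless count is an illustrative placeholder, not a cited statistic.

construction_jobs = 316_000        # current U.S. highway, street, and bridge construction employment
hypothetical_increase = 0.50       # the 50% rise the author entertains

new_jobs = construction_jobs * hypothetical_increase   # 158,000 additional jobs

illustrative_jobless = 5_000_000   # stand-in for "millions of jobless Americans"
share_absorbed = new_jobs / illustrative_jobless

print(f"Additional construction jobs: {new_jobs:,.0f}")
print(f"Share of {illustrative_jobless:,} jobless absorbed: {share_absorbed:.1%}")
```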

Finally, while it’s possible that the rise of American joblessness would have been slower if the U.S. had weaker trade ties to lower-wage countries like Mexico and China, American manufacturers have already adapted to a globalized world by mechanizing and outsourcing. We have little reason to be confident that restrictions on trade would bring the old jobs back. Trade wars would have an economic price, too. American exporters would cut back hiring. The cost of imported manufactured goods would rise, and U.S. consumers would pay more, in exchange for — at best — uncertain employment gains.

The techno-futurist narrative holds that machines will displace most workers, eventually. Social peace will be maintained only if the armies of the jobless are kept quiet with generous universal-income payments. This vision recalls John Maynard Keynes’s 1930 essay “Economic Possibilities for Our Grandchildren,” which predicts a future world of leisure, in which his grandchildren would be able to satisfy their basic needs with a few hours of labor and then spend the rest of their waking hours edifying themselves with culture and fun.

But for many of us, technological progress has led to longer work hours, not playtime. Entrepreneurs conjured more products that generated more earnings. Almost no Americans today would be happy with the lifestyle of their ancestors in 1930. For many, work also became not only more remunerative but more interesting. No Pennsylvania miner was likely to show up for extra hours (without extra pay) voluntarily. Google employees do it all the time.

Joblessness is not foreordained, because entrepreneurs can always dream up new ways of making labor productive. Ten years ago, millions of Americans wanted inexpensive car service. Uber showed how underemployed workers could earn something providing that service. Prosperous, time-short Americans are desperate for a host of other services — they want not only drivers but also cooks for their dinners and nurses for their elderly parents and much more. There is no shortage of demand for the right kinds of labor, and entrepreneurial insight could multiply the number of new tasks that could be performed by the currently out-of-work. Yet over the last 30 years, entrepreneurial talent has focused far more on delivering new tools for the skilled than on employment for the unlucky. Whereas Henry Ford employed hundreds of thousands of Americans without college degrees, Mark Zuckerberg primarily hires highly educated programmers.

Korea established a pattern

Wednesday, July 12th, 2017

Korea established a pattern that has been unfortunately followed in American wars in Vietnam, Iraq, and Afghanistan:

These are wars without declaration and without the political consensus and the resolve to meet specific and changing goals. They are improvisational wars. They are dangerous.

The wars of the last 63 years, ranging from Korea to Vietnam to Afghanistan to Iraq (but excepting Operation Desert Storm, which is an outlier from this pattern) have been marked by:

  • Inconsistent or unclear military goals with no congressional declaration of war.
  • Early presumptions on the part of the civilian leadership and some top military officials that this would be an easy operation. An exaggerated view of American military strength, a dismissal of the ability of the opposing forces, and little recognition of the need for innovation.
  • Military action that, except during the first year in Korea, largely lacked geographical objectives of seize and hold.
  • Military action with restricted rules of engagement and political constraints on the use of a full arsenal of firepower.
  • Military action against enemy forces that have sanctuaries which are largely off-limits.
  • Military action that is rhetorically in defense of democracy — ignoring the reality of the undemocratic nature of regimes in Seoul, Saigon, Baghdad, and Kabul.
  • With the exception of some of the South Korean and South Vietnamese military units, these have been wars with in-country allies that were not dependable.
  • Military action that civilian leaders modulate, often clumsily, between domestic political reassurance and international muscle-flexing. Downplaying the scale of deployment and length of commitment for the domestic audience and threatening expansion of these for the international community.
  • Wars fought by increasingly less representative sectors of American society, which further encourages most Americans to pay little attention to the details of these encounters.
  • Military action that is costly in lives and treasure and yet does not enjoy the support that wars require in a democracy.

Some of the restraints and restrictions on the conduct of these wars have been politically and even morally necessary. But it is neither politically nor morally defensible to send the young to war without a public consensus that the goals are understood and essential, and the restraints and the costs are acceptable.

Mattis cited this Atlantic piece in his recent interview with the Mercer Island High School Islander.

University establishments are the next best thing to a cult

Wednesday, July 12th, 2017

The Toronto Star reports that a certain controversial U of T professor is making nearly $50,000 a month through crowdfunding:

Prof. Jordan Peterson, who made headlines last fall when he publicly refused to use gender neutral pronouns, has been using the fundraising platform Patreon since last March to subsidize costs associated with filming and uploading videos of his lectures to YouTube.

He is now harnessing his online clout with eyes on a new goal — to offer an online university degree in the humanities for which students pay only for examinations.

“I’m fighting this as a battle of ideas,” Peterson told the Star. “Hopefully I can bring high-quality education to millions of people — for nothing. Wouldn’t that be cool.”

Peterson said he views university establishments as “the next best thing to a cult” due to their focus on what he calls “postmodern” themes such as equity. He says his independent project will contrast with the university model by providing straight humanities education.

For about his first seven months on Patreon, Peterson earned about $1,000 per month. That changed last October, when he saw a dramatic increase in support, which has not slowed. The professor surpassed a fundraising goal of $45,000 on June 10, and is now aiming for $100,000 per month. On Monday, Peterson was making $49,460 every month from 4,432 patrons.

He is currently the 32nd highest-earning Patreon creator, of more than 75,000 people who are using the site to fundraise.

“Obviously people are pretty happy with the approach that I’ve been taking to psychological matters and, I suppose, to some degree, political matters online,” Peterson said.

He does have quite a few YouTube videos now. Here’s his message to Millennials on how to change the world — properly:

Mattis called him back

Tuesday, July 11th, 2017

The Mercer Island High School newspaper, the Islander, snagged an interview with James Mattis:

In a photo published alongside this article by The Washington Post on May 11, Trump’s bodyguard, Keith Schiller, could be seen carrying a stack of papers with a yellow sticky note stuck on the top. Written on it, in black ink, was the name “Jim ‘Mad Dog’ Mattis” and a phone number.

Paul Redmond of Orange County, California, contacted The Post the next day, informing them that they’d accidentally published what seemed to be Mattis’ private number.

The photo was quickly removed, but not before many, including MIHS Islander sophomore staff writer Teddy Fischer, saved it.

Calling the number, he left a message asking if Mattis would be interested in conducting a phone interview with The Islander. A few days later, when Teddy said Mattis had agreed, I didn’t believe him.

But, after receiving three more calls from the defense secretary to set up a date and time for the interview, Teddy and I got to work preparing questions.

[...]

When asked why, out of thousands of calls, Mattis chose to respond to us, he returned to his love of teaching.

“I’ve always tried to help students because I think we owe it to you young folks to pass on what we learned going down the road so that you can make your own mistakes,” he said, “not the same ones we made.”

Mattis is a history buff enshrining himself in history. Through teaching and reaching out to students like Teddy, he’s sharing history, and the wisdom he’s gained in creating it, as it’s being made.

The Joker leads a media war against Gotham’s elite

Tuesday, July 11th, 2017

Marvel’s sales tanked, the Yawfle notes, when the writers decided to put ham-fisted political messages above good stories. Now it’s DC’s turn:

For Batman: White Knight, writer-illustrator Sean Murphy (The Wake, Punk Rock Jesus) created a version of Gotham with real, modern-day problems, and then let Batman solve them by making him the villain. How? In the comic mini-series’ alternate reality, it’s the Joker — cured of his insanity — who sees that Bruce Wayne is just another part of the city’s vicious cycle of crime and sets out to stop him.

“My main goal was to undo the comic tropes while changing Gotham from a comic book city into a real city — a city dealing with everything from Black Lives Matter to the growing wage gap,” Murphy says. “[But] rather than write a comic about the wage gap, I gave those ideas to the Joker, who leads a kind of media war against Gotham’s elite by winning people over with his potent observations and rhetoric.”

I don’t think Murphy intended this to be a Rorschach test, but half his audience will probably see this new “heroic” Joker as perfectly villainous.

I attributed his craziness to the Zeitgeist

Monday, July 10th, 2017

E. Michael Jones talks about his writing mentor Bob Summers and how his behavior began to change in a dramatic way around 1970:

His actions became increasingly bizarre. He would withdraw all of his (and Joan’s) money from the bank, kidnap his son, fly to California, put himself up at expensive hotels until his money ran out, then end up back in Philadelphia after someone sent him a bus ticket. On one trip back, he got off the bus in Iowa on a hot day and had a stroke. Deprived of the ability to speak, he settled into a depression as deep as his former elation had been high. He tried to kill himself a number of times and finally succeeded. He was discovered dangling from a pipe in the basement of a house where one of his friends ran a Philadelphia version of Esalen, which is to say, a place where sensitivity sessions and sexual contact were supposed to lead to new levels of consciousness.

I used to think it was Bob’s ideas that drove him crazy. During the time I knew him, Bob had abandoned traditional playwriting and had become a devotee of something he was calling psychodrama. I remember listening to him mention names like Moreno, Fritz Perls, and Julian Beck, whose troupe came to town and did Frankenstein, as the introduction to the concept he had for a new play. It was to be called “King of Tetch,” as in “tetched in the head,” and during the course of the play, Bob would go crazy on stage. In the end, he didn’t need a play to go crazy. He was going crazy anyway.

Since Bob was a playwright, I suppose he planned to make money off of the inevitable. I remember thinking it was a crazy idea at the time, but it was a time when crazy ideas were at a premium and, besides, I knew other people who were going crazy at that time too. So I attributed his craziness to the Zeitgeist, and, behind all of the other figures Bob mentioned, I attributed the ideas that drove him crazy to Wilhelm Reich, who was undergoing his New York Times documented (or promoted) revival at the time. Bob was an eastern European Jew, who shared ethnic sympathy with Reich and Reich’s project. South Street was a lot like Prague and Vienna immediately after World War I. Reich’s theories had driven Reich crazy. Why shouldn’t they have the same effect on Bob? Bob, I concluded as part of my education in the ’70s, had acted out Reich’s theories of sexual liberation and that had driven him crazy.

I still believe that. Deborah Hayden’s book Pox, however, leads me to believe that the connection between Bob Summers and Wilhelm Reich may have been more than simply ideas having consequences. Both of them, I now believe, were suffering from the same disease. Both Reich and Bob Summers went crazy at the end of lives dedicated to sexual liberation. Both of them probably died of complications arising from syphilis. William Osler could have had both Bob Summers and Wilhelm Reich in mind when he described the syphilitic as manifesting “a change in character… which may astonish the friends and relatives” and warned to watch for “important indications of moral perversions manifested in offenses against decency.” Osler is talking about the final stages of syphilis, specifically paresis or general paralysis of the insane, when the spirochetes which have been active all along since the period of initial infection finally succeed in destroying the brain. The most interesting aspect of the disease from a cultural point of view is the period “close to the onset of paresis,” when, in Hayden’s words, “mood shifts become more extreme as euphoria, electric excitement, bursts of creative energy, and grandiose self-reflections alternate with severe often suicidal depression. Delusions of grandeur, paranoia, exaltation, irritability, rages and irrational social behavior define the progression toward insanity. The patient may suddenly begin to gamble, go on absurd spending sprees, or imagine owning vast riches.”

Bob was around 25 years older than me. That means that he was born around 1923; that means that he was 20 years old when penicillin was invented. That means that he couldn’t have taken it as a cure until roughly four or five years later. By then, even if he had taken it, penicillin would have been too late to keep the disease from spreading to where it often did damage, namely, the brain. Because penicillin has all but eradicated the disease and most certainly has removed it as the central concern of whole cultures in the way that syphilis was at the beginning of the 20th century, the average doctor has lost his knowledge of the progression of the disease. This is a fortiori true of the man in the street. As a result, large areas of cultural history and biography are becoming increasingly incomprehensible to contemporary readers and thinkers.

Syphilis emerged into history at the birth of the modern era. It is most commonly described as having been brought back from the New World by Columbus. Hayden makes the case that Columbus, whose health never recovered after his first voyage and who heard angels speaking to him at the end, was himself infected with syphilis and died of paresis when the spirochete, the corkscrew shaped bacillus otherwise known as the pale treponema, destroyed his brain.

Tertiary neurosyphilis, he notes, is the most interesting form of the disease from a cultural point of view:

Just before the onset of paralysis, the sufferer is beset with delusions of grandeur, a sense of understanding everything, a sense that he is on the verge of some monumental discovery which will forever change the course of history, as well as a sense that some divine electricity is coursing through his veins. Since in this preliminary stage of tertiary syphilis, powers of expression are not impaired, a syphilitic who is also an artist may well produce a work of art that reflects this state of mind or, rather, this state of brain. Bob Summers felt that “King of Tetch” was just this kind of work. Wilhelm Reich felt that he had unlocked the secrets of the universe with the discovery of orgone energy, something that could now be accumulated in his orgone boxes, which would make power stations unnecessary. Hayden feels that Beethoven’s Ninth Symphony was composed under these circumstances, after syphilis had destroyed Beethoven’s hearing and was in the process of destroying his brain as well. “Seid umschlungen Millionen!” The grandiosity of Schiller’s poem is matched by the grandiosity of Beethoven’s musical score, which, at least in terms of the Ode to Joy chorus, is based on a moronic melody (melody was never Beethoven’s strong suit anyway), as the film Immortal Beloved makes clear. The brain of the syphilitic approaching general paralysis of the insane is like the light bulb that grows brighter just before it burns out completely. The syphilitic experiences, in Hayden’s words,

“episodes of creative euphoria, electrified, joyous energy when grandiosity led to a new vision. The heightened perception, dazzling insights, and almost mystical knowledge experienced during this time were expressed while precision of form of expression was still possible. At the end of the 19th century, it was believed that, in rare instances, syphilis could produce genius.”

During the period, preliminary to final decline,

“the syphilitic may be plagued by sensations of electric currents in the head,… and auditory hallucinations such as being serenaded by angels. This warning stage often has an explosive aspect, a sense of enormous contained energy, while the patient retains an ability to achieve the most rigorous control of expression. Syphilis is not suspected because of the extreme clarity of mind without dementia.”

In the period from 1881 to 1882, Nietzsche wrote to his friends about how “Each cloud contains some form of electric charge which suddenly takes hold of me, reducing me to utter misery.” The sense that some sort of divine electricity was running through his veins was so strong in Nietzsche’s mind that he felt that he ought to be displayed at an electricity exhibition in Paris. In August 1881 Nietzsche wrote to his friend Peter Gast that he felt like a human lightning bolt, “like a zig-zag doodle drawn on paper by a superior power wanting to try out a new pen.”

A feeling of boundless intellectual power accompanied the sense that electrical currents were flowing through his veins. On December 18, 1888 Nietzsche wrote to Carl Fuchs explaining that

“Never before have I known anything remotely like these months from the beginning of September until now. The most amazing tasks are as easy as a game; my health, like the weather, coming up every day with boundless brilliance and certainty. I cannot tell you how much has been finished-everything. The world will be standing on its head for the next few years: since the Old God has abdicated, I shall rule the world from now on.”

The onset of the tertiary syphilis or dementia paralytica in Nietzsche’s life is dated from January 3, 1889, when, upset at seeing a horse beaten in Turin, Italy, Nietzsche embraced the horse’s neck and collapsed into madness. His writing days over, Nietzsche spent the next 11 years of his life, up until his death in 1900, under medical care, in and out of asylums for the insane. All of his most significant writings, including those in which Christ was deposed and Dionysos/Zarathustra/Nietzsche put in his place, took place in the period of creative euphoria that lasted from 1881 to 1889, when he felt the divine electricity that is the sure sign of the onset of paresis coursing through his veins.

“I am one of those machines that could explode… Each time I had wept too much the previous day while I was walking, and not tears of sentimentality but jubilation. I sang and talked nonsense, possessed by a new attitude. I am the first man to arrive at it.”

The literary history of modern Europe, but most especially that of the 19th century, is littered with unacknowledged evidence of syphilis. The most famous example is Dracula. I am, as far as I know, the first one to argue that Dracula is about syphilis. Bram Stoker died of syphilis, something which his grandson acknowledges at the end of his biography almost as an afterthought, as if it had no connection to Stoker’s work in general and his classic Dracula in particular. I make the argument in the second part of Monsters from the Id, my book on horror.

[...]

Hayden gives some explanation of why the suppression of syphilis happens so frequently in biography. Biographies — and Reich’s case is no exception in this regard — are generally written by devotees, people who are inspired by the subject’s work. If the work is a function of syphilis, the devotee has based his life on an illusion. “The reluctance to attribute a shameful disease like syphilis to a great person,” is understandable according to Hayden, because of “the danger that the work will in some way be linked to the disease,” and as a result “an oeuvre” would be “tainted and denigrated.” Fears like this “contribute to sparse references to syphilis” in biographies. Add to that the general ignorance about a disease no longer as threatening as it used to be and you end up with large biographical lacunae. Claude McKay, author of Home to Harlem and initiator of the Harlem Renaissance, contracted syphilis in Berlin in the early ’20s, but his biographer missed that fact, even though McKay wrote poems about it. There is no indication that the syphilis proceeded to McKay’s brain; however, the thought that it jeopardized his work is never far away.

Nietzsche is a good case proving the same point. For some inexplicable reason, there is still controversy over whether Nietzsche had syphilis, in spite of an unmistakable symptomology and accounts from people like his friend Peter Gast, who claimed that Nietzsche told him that he deliberately infected himself with syphilis by having sex with a prostitute. The reluctance to accept the fact is a reflexive defense of the ideas that Nietzsche promoted. Those who see Nietzsche as the prophet of man’s emancipation from a tyrannical God are not going to be receptive to Stoker’s idea that the delusions of grandeur necessary to any theory of rebellious atheism are really just a sign that the onset of paresis is near. Were Nietzsche’s ideas on the will and its relationship to the intellect the logical consequence of the Reformation’s denigration of reason? Perhaps. But the ideas were pushed into the form Nietzsche gave them by the grandiosity which neurosyphilis’ attack on the brain engendered in the mind.

If Nietzsche’s defenders can stall a case as obvious as his in the court of literary and historical opinion, imagine the uproar that would be generated by claiming 1) that Abraham Lincoln had syphilis and 2) that the disease affected his conduct of the Civil War. Hayden claims that Lincoln contracted syphilis as a young man and that he infected his wife Mary Todd Lincoln, causing the insanity that plagued her at the end of her life. Does that mean that the sacred cause of the Union was a function of tertiary syphilis?

[...]

Not surprisingly, the best test case for Hayden’s theory that syphilis changed the course of history is Adolf Hitler. The best indication that Hitler had syphilis is his own writing, namely, Mein Kampf.

[...]

If the internal evidence of an autobiographical text has any significance, then the obsessions which get expressed in Mein Kampf give a clear indication that Hitler had syphilis, that he probably contracted it from a Jewish prostitute, and that he extrapolated from that experience a theory of race hatred that would, in Hayden’s terms, change the course of history.

[...]

The story died, in other words, not so much because there was no evidence to support the theory, but because it would have been inconvenient to the two groups which were most interested in Hitler research: the Nazis and the Anti-Nazis. The Old Nazis, according to Wiesenthal, “bridled at the image of a syphilitic paranoiac as the greatest leader of all time” because “this would have besmirched their idol.” But the Anti-Nazis were just as opposed to the same sort of investigation because they were “afraid that an enormously complex pattern of events might suddenly be reduced to the pathological degeneration of a single individual instead of being seen as the sickness of a whole society.” Wiesenthal concludes by saying that he “can see no other reason why the question of whether or not Hitler had syphilis has received so little attention from serious historical researchers.”

Being right about the future is wrong

Sunday, July 9th, 2017

A recent Slate piece sees White Nationalist roots in Trump’s Warsaw speech defending Western civilization:

Likewise, the prosaic warning that unnamed “forces” will sap the West of its will to defend itself recalls Bannon’s frequent references to The Camp of the Saints, an obscure French novel from 1973 that depicts a weak and tolerant Europe unable to defend itself from a flotilla of impoverished Indians…

Steve Sailer caricatures this position:

You see, there was this evil novel back in 1973 that pretty accurately predicted what Chancellor Merkel would choose to do in 2015. Being right about the future is wrong. Making accurate predictions is bad. As punishment for spawning one novelist who was so vile as to grasp the future Europe was headed toward, we must make sure this dystopia comes true. Because you deserve it for being so despicable as to understand our intentions toward you.

They created a spiritual plague

Sunday, July 9th, 2017

Dylan Levi King has translated a Chinese Leftist’s contemptuous history of the evolution of the White Left, which emphasizes something we rarely hear about in history class:

While the German white left was busy bullshitting new theories in politics, the French white left stayed in the game, too — but they turned their attention to art and literature. We can see their level of achievement in this short exclamation from Maupassant:

I’ve got the pox! at last! the real thing! not the contemptible clap, not the ecclesiastical crystalline, not the bourgeois coxcombs or the leguminous cauliflowers — no — no, the great pox, the one which Francis I died of. The majestic pox, pure and simple; the elegant syphilis …. I’ve got the pox … and I am proud of it, by thunder, and to hell with the bourgeoisie.

Charles Baudelaire says it this way: “We have all of us got the spirit of republicanism in our veins, as we have the pox in our bones; we are democratized and syphilized.” It’s no wonder that the Germans called syphilis the French disease.

France’s second generation of the white left took the German white left’s Freudian ideas and white left version of liberalism to push a vision of free love. This is the reason that syphilis spread so widely in the white left camp. We know Maupassant had the disease, but so did Van Gogh, and Gauguin, too, and let’s not forget Oscar Wilde. When their fans went to the whorehouse, they turned up their noses at the whores that didn’t have syphilis — they wanted to go mad from the disease, just like their idols.

From the third generation of the white left, the idea of sickness as a badge gains credibility. Illness shows ideological devotion. Morbidity is the main feature of the second generation of the French white left. As important as their achievements in art and literature, in the ideological arena, they only managed to borrow from and degrade the philosophy of earlier times.

To sum up, Germany’s second generation white left provided the theoretical basis for the next generation and the French turned to art and literature. They created a spiritual plague. This is an important term to define:

The white left flaunts ideals, which may or may not be false, and turns its sickness into a morbid badge — this is how its members get people to notice them and how they achieve self-satisfaction.

That is all you need to know. If you understand what I have just written, you will understand why the women of the contemporary white left sweep into Middle East refugee camps with their messages of love.

The second generation of the white left is the most important generation in the history of the evolution of this philosophy. They have combined this philosophy with fashion and psychology and created an “infectious ideology” that has spread as fast as syphilis.

The Yawfle stares and stares

Saturday, July 8th, 2017

The Yawfle stares, and stares, and stares — and aims to provide you, the discriminating and judgmental reader, with all the tech news you could possibly use:

But that’s not enough! We also provide this coverage without the cringe-worthy SJW stylings found on brand X tech news sites.


Here at The Yawfle you can read about the latest software, superhero movies, and space launches and not be subjected to laborious discussion of the minority composition of the teams writing the software, making the movies, or launching the rockets. Here, rather than being told we’re all about to drown from rising sea levels by next Tuesday, you’ll find thoughtful pieces on how computer models actually work.

Amazon is just beginning to use robots in its warehouses

Saturday, July 8th, 2017

Amazon is just beginning to use robots in its warehouses and they’re already making a huge difference:

Amazon acquired Kiva for $775 million in 2012 but only started using the orange robots in its warehouses in late 2014. The deal was expected to make inventory management more efficient. It’s now beginning to become clear by how much.

The “click to ship” cycle used to be around 60-75 minutes when employees had to manually sift through the stacks, pick the product, pack it, and ship it. Now, robots handle the same job in 15 minutes, according to a Deutsche Bank note published Tuesday (June 14) based on Amazon’s metrics.

These robots are not only more efficient but they also take up less space than their human counterparts. That means warehouse design can eventually be modified to have more shelf space and narrower aisles. At the end of the third quarter of 2015, Amazon was using 30,000 Kiva robots across 13 warehouses. Each Kiva-equipped warehouse can hold 50% more inventory per square foot than centers without robots. In turn, the company’s operating costs have been sliced by 20% — or almost $22 million — per warehouse.

If Kiva robots are dispatched to the rest of the 110 Amazon warehouses, the tech giant could save almost $2.5 billion, according to Deutsche Bank. However, since it takes $15-$20 million to install robots in each warehouse, the one-time savings is expected to be closer to $800 million.
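
The quoted savings figures follow from simple multiplication; here is a minimal Python sketch that reproduces the Deutsche Bank arithmetic, assuming the roughly $22 million annual saving and the $15-20 million installation cost both scale linearly across the 110 remaining warehouses (my reading of the numbers in the excerpt, not a detail the note itself is quoted as spelling out).

```python
# Reconstruction of the savings arithmetic quoted above. All inputs come from
# the excerpt; treating them as linear per-warehouse estimates is an
# assumption made for illustration.

savings_per_warehouse = 22e6               # ~20% operating-cost cut, "almost $22 million" per warehouse
remaining_warehouses = 110                 # warehouses not yet using Kiva robots
install_cost_per_warehouse = (15e6, 20e6)  # one-time robot installation cost range

gross_annual_savings = savings_per_warehouse * remaining_warehouses
print(f"Gross annual savings: ${gross_annual_savings / 1e9:.2f}B")   # ~$2.4B ("almost $2.5 billion")

for install in install_cost_per_warehouse:
    net_first_year = gross_annual_savings - install * remaining_warehouses
    print(f"Net of ${install / 1e6:.0f}M install cost per warehouse: ${net_first_year / 1e9:.2f}B")

# The low end of the install-cost range lands near the ~$800 million one-time figure cited.
```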

Disney’s biggest business is cable TV, for now

Friday, July 7th, 2017

Disney’s biggest business is cable TV, and kids are tuning out:

The troubles are twofold: a lack of hits and the broader move by audiences away from traditional television to digital alternatives. The shift to streaming services such as Netflix Inc. and web-based platforms like Google’s YouTube is particularly pronounced among younger viewers targeted by these Disney networks.

[...]

Disney Channel programming is focused on children, while Freeform, which changed its name from ABC Family in January of 2016, is aimed at teenagers and young adults.

Cable TV has long been Disney’s biggest business, accounting for 30% of its revenue and 43% of profits last fiscal year. About 26% of cable revenue and profits come from entertainment networks like Disney Channel and Freeform, Morgan Stanley estimates, while the rest is generated by ESPN. (Disney doesn’t disclose the breakdown).

[...]

Also at stake for Disney is the exposure its TV channels offer for toys, clothes and other products that the company relies on for hundreds of millions of dollars annually in revenue.

As consumers “cut the cord,” Disney’s once fast-growing cable business has slowed down. Cable revenue is flat and operating income down 6% in the first half of the current fiscal year, which has alarmed Wall Street.

Disney Chief Executive Robert Iger has said that strengthening online accessibility for television programs is a priority and that the company is preparing to offer its channels, in part or whole, directly to consumers online rather than just through costly cable packages.

[...]

For the first six months of this year, the commercial-free Disney Channel’s ratings among its core 2-11 and 6-14 demographics fell 23% in prime time and 13% and 18%, respectively, during the full day, compared with the same period a year ago, according to Nielsen. Ratings are also down at the smaller Disney Jr. and Disney XD networks, which fall under Mr. Marsh’s Disney Channel umbrella.

Have parents caught on to how Disney’s “family friendly” programming consists largely of bullying followed by laughter?

Why has Italy been spared mass terror attacks in recent years?

Thursday, July 6th, 2017

Why has Italy been spared mass terror attacks in recent years?

Some experts say Italy has been able to combat the threat of Isis domestically by mastering legal and policing tools developed through years of experience in mafia investigations, which in turn were born out of the so-called “years of lead” — the period between the late 1960s and early 1980s marked by acts of political terrorism by left- and right-wing militants.

According to figures released by the Italian interior ministry, counter-terrorism authorities stopped and questioned 160,593 people between March 2016 and March 2017. They stopped and interrogated about 34,000 at airports and arrested about 550 suspected terrorists, and 38 have been sentenced on terrorism charges. More than 500 websites have been shut down and nearly half a million have been monitored.

Giampiero Massolo, who served as the director of Italian intelligence from 2012 to 2016, said there was not a particular “Italian way” to combat terrorism.

“We learned a very harsh lesson during our terrorism years,” he said. “From that we drew the experience of how important it is to maintain a constant dialogue at the operating level between intelligence and law enforcement forces. In fact, prevention is key to try to be effective in counter-terrorism.”

He added: “Another feature is to have a good control of the territory. From this point of view, the absence of [French] banlieues-like spots in Italian major cities, and …[the predominance] of small and medium towns makes it easier to monitor the situation.”

There are also more specific practices. Arturo Varvelli, a senior research fellow and terrorism expert at the thinktank Ispi, said the lack of second- and third-generation Italians who might be susceptible to Isis propaganda meant authorities instead focused on non-citizens, who could be deported at the first signs of concern. Since January, 135 individuals had been expelled, he said.

Italian authorities also rely on intercepted phone calls, which, unlike in the UK, can be used in evidence in court and — in cases related to mafia and terrorism — can be obtained on the basis of suspicious activity rather than solid evidence.

Much like the fight against Italian organised crime — the Camorra around Naples, the Cosa Nostra in Sicily, and the ’Ndràngheta in the south — infiltrating and disrupting terror networks requires breaking close social and even family relationships.

People suspected of being jihadis are encouraged to break ranks and cooperate with Italian authorities, who use residency permits and other incentives, Galli said. There has been a recognition, too, of the dangers of keeping terror suspects in jail where, much like mafia bosses before them, prison is seen as a prime territory for recruiting and networking.

“I think we have developed experience in how to deal with a criminal network. We have lots of undercover agents who do a great job of intercepting communication,” she said.

While Italian authorities are seen as having broad powers, police do not have special powers to detain terror suspects without charge. Terror suspects may be held for up to four days without charge, just like any other suspect. However, Italy has been criticised by the European court of human rights for holding defendants too long once they have been charged and are awaiting trial.

Galli said there was no groundswell of concern about whether Italy’s tactics violated civil liberties. The broad use of surveillance — including intercepted communication — is seen as sufficiently targeted to terror and mafia suspects, unlike public criticism in Italy of sweeping data collection methods used in the US and UK.

If you want to learn about Korea, you should read this

Wednesday, July 5th, 2017

Colin Marshall went looking for a book that could teach him something more about Korean culture, but all the Korean books at the used bookstore — this was in Los Angeles — were just Korean translations of Western literature. His Korean language exchange partner handed him a Korean-language edition of Hermann Hesse’s Demian. “If you want to learn about Korea, you should read this.”

Writing in the Korea Times, a college student by the name of Shin Seul-ki importunes the reader to “follow your heart,” opening with a “thought-provoking passage” from Demian, in fact the novel’s own opening sentences: “I wanted only to try to live in accord with the promptings which came from my true self. Why was that so very difficult?” She answers that question with an accusatory finger pointed toward the Korean education system, which “requires students to spend nearly every waking hour figuring out not what they want to do but just studying for their college entrance exam. School doesn’t offer students a chance to find their true calling. School just pushes them into an ‘education arms race’ before finding their vision. Students study something hard for their bright future; however, paradoxically they don’t know what makes their futures brilliant.”

Korean education — along with Korean social hierarchies, Korean corporate culture, the Korean political sphere, and so on — has certainly stifled more than a few true selves, but under Shin’s argument lies a common Korean misperception: that Westerners somehow have the whole calling, vision, and future thing figured out, having long since cast off mere “routine” in favor of genuine “life.” She ends her article with a reference to Steve Jobs, the subject of a national obsession due to his vivid embodiment of the very creativity, nonconformism, effectiveness, and sheer wealth many Koreans still see their country as lacking. Walter Isaacson’s biography Steve Jobs must not rank far below Demian (maybe somewhere near the strange, much-abridged localization of the Talmud) as a holy Western text to which Koreans, frustrated and frightened by their lives for reasons they can’t quite pin down, have flocked for answers.

In response to a Quora thread entitled “What’s the deal with South Koreans and Herman Hesse?”, a longtime Korea-resident Westerner named Gord Sellar describes the novel as “about someone who (transgressively, but in a way celebrated by the novel) moves beyond the world of appearances towards the world of the self,” touching on the theme of people who bear a “Mark of Cain” that prevents their fulfillment “by ‘normal’ social interactions.” And “for those having grown up in South Korea — a place where appearance and form are often conventionally prioritized over essence or content — this particular theme probably has a special appeal.” As does the kind of 19th-century European setting with “parents objecting to love marriages or forbidding relationships or marriages, women seeking out husbands on the basis of their career potential or income, and people (often women) ending up in desperate trouble or in penury because of a cruel parent or a tragic family accident.”

Any story of “old Europe struggling with modernity” will resonate with a Korea doing plenty of modernity-grappling of its own. Demian in particular, Sellar writes, also taps inadvertently into the particular Korean storytelling sensibility: “They are much more enamored of sad endings, and they tend to be much more patient with stories that unfold in such a way that the protagonists never had a real hope of changing the outcome.” This has introduced certain difficulties into the marketing of Korean literature to Westerners, who “have little patience for stories that feature characters who can’t take some hand in their fate” and “tend to be less patient with melodramatically sad turns of plot,” but it means certain strains of anguish-oriented German fiction, best exemplified by Goethe’s The Sorrows of Young Werther (from the object of whose unrequited passion one of Korea’s biggest conglomerates took its name), have grown popular indeed here.