Our sin tends to be timidity, not rashness

Saturday, July 22nd, 2017

Arthur C. Brooks’ advice for young people heading out into the world is to be prudent — because prudence means something more than what we’ve been led to believe:

When I finally read the German philosopher Josef Pieper’s “The Four Cardinal Virtues,” which had sat unread on my shelf for years, I was shocked to learn that I didn’t hate prudence; what I hated was its current — and incorrect — definition. The connotation of prudence as caution, or aversion to risk, is a modern invention. “Prudence” comes from the Latin “prudentia,” meaning sagacity or expertise. The earliest English uses from the 14th century had little to do with fearfulness or habitual reluctance. Rather, it signified righteous decision making that is rooted in acuity and practical wisdom. Mr. Pieper argued that we have bastardized this classical concept. We have refashioned prudence into an excuse for cowardice, hiding behind the language of virtue to avoid what he calls “the embarrassing situation of having to be brave.” The correct definition, Mr. Pieper argued, is the willingness to do the right thing, even if that involves fear and risk. In other words, to be rash is only one breach of true prudence. It is also a breach to be timid. So which offense is more common today? [...] Our sin tends to be timidity, not rashness. On average, we say “no” too much when faced with an opportunity or dilemma.

The math students dropped out because they could not understand anything

Saturday, July 22nd, 2017

June Huh took a path less taken to the peak of the math world:

Huh was born in 1983 in California, where his parents were attending graduate school. They moved back to Seoul, South Korea, when he was two. There, his father taught statistics and his mother became one of the first professors of Russian literature in South Korea since the onset of the Cold War.

After that bad math test in elementary school, Huh says he adopted a defensive attitude toward the subject: He didn’t think he was good at math, so he decided to regard it as a barren pursuit of one logically necessary statement piled atop another. As a teenager he took to poetry instead, viewing it as a realm of true creative expression. “I knew I was smart, but I couldn’t demonstrate that with my grades, so I started to write poetry,” Huh said.

Huh wrote many poems and a couple of novellas, mostly about his own experiences as a teenager. None were ever published. By the time he enrolled at Seoul National University in 2002, he had concluded that he couldn’t make a living as a poet, so he decided to become a science journalist instead. He majored in astronomy and physics, in perhaps an unconscious nod to his latent analytic abilities.

When Huh was 24 and in his last year of college, the famed Japanese mathematician Heisuke Hironaka came to Seoul National as a visiting professor. Hironaka was in his mid-70s at the time and was a full-fledged celebrity in Japan and South Korea. He’d won the Fields Medal in 1970 and later wrote a best-selling memoir called The Joy of Learning, which a generation of Korean and Japanese parents had given their kids in the hope of nurturing the next great mathematician. At Seoul National, he taught a yearlong lecture course in a broad area of mathematics called algebraic geometry. Huh attended, thinking Hironaka might become his first subject as a journalist.

Initially Huh was among more than 100 students, including many math majors, but within a few weeks enrollment had dwindled to a handful. Huh imagines other students quit because they found Hironaka’s lectures incomprehensible. He says he persisted because he had different expectations about what he might get out of the course.

“The math students dropped out because they could not understand anything. Of course, I didn’t understand anything either, but non-math students have a different standard of what it means to understand something,” Huh said. “I did understand some of the simple examples he showed in classes, and that was good enough for me.”

After class Huh would make a point of talking to Hironaka, and the two soon began having lunch together. Hironaka remembers Huh’s initiative. “I didn’t reject students, but I didn’t always look for students, and he was just coming to me,” Hironaka recalled.

Huh tried to use these lunches to ask Hironaka questions about himself, but the conversation kept coming back to math. When it did, Huh tried not to give away how little he knew. “Somehow I was very good at pretending to understand what he was saying,” Huh said. Indeed, Hironaka doesn’t remember ever being aware of his would-be pupil’s lack of formal training. “It’s not anything I have a strong memory of. He was quite impressive to me,” he said.

As the lunchtime conversations continued, their relationship grew. Huh graduated, and Hironaka stayed on at Seoul National for two more years. During that period, Huh began working on a master’s degree in mathematics, mainly under Hironaka’s direction. The two were almost always together. Hironaka would make occasional trips back home to Japan and Huh would go with him, carrying his bag through airports and even staying with Hironaka and his wife in their Kyoto apartment.

[...]

Meanwhile, Hironaka continued to tutor Huh, working from concrete examples that Huh could understand rather than introducing him directly to general theories that might have been more than Huh could grasp. In particular, Hironaka taught Huh the nuances of singularity theory, the field where Hironaka had achieved his most famous results. Hironaka had also been trying for decades to find a proof of a major open problem — what’s called the resolution of singularities in characteristic p. “It was a lifetime project for him, and that was principally what we talked about,” Huh said. “Apparently he wanted me to continue this work.”

In 2009, at Hironaka’s urging, Huh applied to a dozen or so graduate schools in the U.S. His qualifications were slight: He hadn’t majored in math, he’d taken few graduate-level classes, and his performance in those classes had been unspectacular. His case for admission rested largely on a recommendation from Hironaka. Most admissions committees were unimpressed. Huh got rejected at every school but one, the University of Illinois, Urbana-Champaign, where he enrolled in the fall of 2009.

At Illinois, Huh began the work that would ultimately lead him to a proof of the Rota conjecture.

World War II films aren’t about World War II

Friday, July 21st, 2017

Many World War II films reveal at least as much about the times in which they are made as they do about the conflict itself:

“It’s possible that 20 years from now we’ll look back at ‘Dunkirk’ and say, ‘That movie was so 2017,’ and everyone will know exactly what that means,” said film historian Mark Harris, author of “Five Came Back,” a book about Hollywood and World War II that was also the subject of a recent Netflix documentary.

Around the beginning of the war, films served a practical purpose, rallying American solidarity behind the conflict. In 1940, Hitchcock’s “Foreign Correspondent” featured a reporter, in a radio-broadcast scene, calling for action with guns and battleships: “It’s as if the lights were out everywhere except in America,” he says. Chaplin, who directed and played the lead speaking role in 1940’s “The Great Dictator,” about an Adolf Hitler-like figure, delivers a final speech directly into the camera that includes the line: “Let us fight to free the world.”

During the war, filmmakers churned out movies in close to real time, going from script to screen in as few as six months, said Mr. Harris.

“Films made about World War II during the war are special because we don’t know we’re going to win,” said Thomas Doherty, a professor of American studies at Brandeis University who wrote “Projections of War: Hollywood, American Culture, and World War II.” “I’m always surprised when I look at World War II movies made during the war just how stern the lessons are. The guy you really like is often killed in the film.”

Soon, the anxieties of the atomic age begin to surface. “In Harm’s Way,” a 1965 film starring John Wayne as a naval officer in the Pacific after Pearl Harbor, ends with a shot of the ocean that morphs into what looks like a mushroom cloud. Mixed feelings around the Vietnam War enter the picture with movies like 1967’s “The Dirty Dozen,” a subversive take on conflict told through the story of death-row convicts on a mission to kill Nazis.

Veterans of World War II and Vietnam and civilian Baby Boomers might have taken different messages from 1970’s “Patton,” at once a portrait of a victorious general and a man driven by ego and ambition. Douglas Cunningham, co-editor of “A Wiley Companion to the War Film” and a teacher of film history at Westminster College in Salt Lake City, Utah, recalled a scene where Patton slaps the helmet of a soldier suffering from shellshock. “By 1970, you would have had plenty of folks returning from Vietnam traumatized in ways that would have been familiar to some members of that audience,” he said.

In time the Holocaust became a central part of the screen version of World War II, with movies like 1982’s “Sophie’s Choice,” about an Auschwitz survivor, and Spielberg’s 1993 drama “Schindler’s List.”

Movies have furthered an idea that the Holocaust was known to most American soldiers during the war. A scene hinting at that connection occurs in Spielberg’s “Saving Private Ryan,” when a Jewish soldier holds up the Star of David on his dog tag and repeats the German word for Jews—“Juden”—to captured enemy soldiers. “This is the way America sees World War II now—that it was all about the Holocaust and the Holocaust was the governing point,” said Robert Burgoyne, professor of film studies at the University of St Andrews and author of two books on U.S. history as told through the movies. “The Holocaust was not known to American culture generally. It is simply a kind of rewriting of World War II according to the contemporary generation’s perspective.”

In 1998, “Saving Private Ryan” presented the war to a new generation, starting with its harrowing opening of Allied troops storming Omaha Beach on D-Day. “In terms of stoking interest in World War II, these are the most important 20 minutes in cinema history,” said Rob Citino, senior historian at The National World War II Museum in New Orleans.

Experts’ brains transform data into action

Friday, July 21st, 2017

Neuroscientists Jason Sherwin and Jordan Muraskin are studying what happens inside the brain of a baseball player trying to hit a pitch:

Sherwin and Muraskin think they’ve identified a pattern of brain activation in professional hitters. One key area is the fusiform gyrus, a small spot at the bottom of the brain that is crucial for object recognition. For baseball players, this region is much more active during hitting. Recent data also suggests that in experts the fusiform gyrus may be more connected to the motor cortex, which controls movement. Sajda says this has important implications because the increased connection could indicate that experts’ brains are more efficient at transforming data about the pitch into movement.

The expert hitters also tend to use their frontal cortex — a part of the brain that is generally in charge of deliberate decision-making — less than nonexperts do when hitting. (When we decide to order a baked potato rather than french fries, it’s a good bet that our frontal cortex is deeply involved. However, this part of the brain tends to make decisions more slowly and meticulously; it is not adept at split-second choices.)

This diminished frontal participation is crucial, they say. “Players seem to make the decision in their motor cortex rather than their frontal cortex,” Sajda says. “Their brains recognize and act on pitches more efficiently.”

Another key area that appears to be more energized among expert hitters is the supplementary motor area (SMA), a small region at the top of the brain. It is involved in the coordination of sequences of preplanned movements such as hitting. In expert hitters, this area is especially active as the pitcher winds up and as the pitch approaches the plate. In essence, the researchers say, experts are better at preparing to swing.

Muraskin thinks that the SMA plays a key role in helping hitters choose when not to swing. Many good hitters — the Nationals’ Daniel Murphy is known for this — have a preternatural ability to wait for the “right” pitch, the pitch they can hit. In other words, they excel at inhibiting their swing. “When you choose not to swing, that’s a choice,” Muraskin says. “It is a learned expertise.”

One in five Americans are prescribed opioids

Thursday, July 20th, 2017

More than one in five people were prescribed an opioid painkiller at least once in 2015 — at least among those insured by Blue Cross and Blue Shield:

The report, which covers 30 million people with Blue Cross and Blue Shield insurance in 2015, supports what experts have been saying: much, if not most, of the opioid overdose epidemic is being driven by medical professionals who are prescribing the drugs too freely.

“Twenty-one percent of Blue Cross and Blue Shield (BCBS) commercially insured members filled at least one opioid prescription in 2015,” the report says. “Data also show BCBS members with an opioid use disorder diagnosis spiked 493 percent over a seven year period.”

The report excludes people with cancer or terminal illnesses. What it found fits in with similar surveys of people with Medicare, Medicaid or other government health insurance, said Dr. Trent Haywood, chief medical officer for the Blue Cross and Blue Shield Association (BCBSA).

Grass pyramids cut noise pollution

Thursday, July 20th, 2017

Airport noise travels far in a flat country like the Netherlands:

The tricky thing about dampening airport noise is that the noise is very low in frequency, with a very long wavelength of around 36 feet, so a simple barricade will do little to stop the drone. But in 2008, airport staff noticed that noise levels were reduced every fall by an unexpected phenomenon: plowed fields. After examining the scene, they discovered that the ridges and furrows of the fields were spaced in a way that partially silenced the hum.

So, the firm H+N+S Landscape Architects teamed up with artist Paul De Kort to produce a series of 150 artificial pyramids of grass, each 6 feet tall and 36 feet apart (the approximate wavelength of airport hubbub). This ingenious method, based on the groundbreaking work of acoustician Ernst Chladni, has effectively reduced noise pollution in the region by half.
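Those numbers hang together: a 36-foot wave in air works out to a rumble of roughly 30 Hz, far too low for an ordinary fence to block but just the scale that ground relief can scatter. A quick sanity check in Python, assuming the textbook figure of about 343 m/s for the speed of sound:

    # Rough check of the frequency behind a 36-foot wavelength, assuming
    # sound travels at about 343 m/s in air (a standard textbook value).
    SPEED_OF_SOUND_M_S = 343.0
    FEET_TO_METERS = 0.3048

    wavelength_m = 36 * FEET_TO_METERS              # about 11 meters
    frequency_hz = SPEED_OF_SOUND_M_S / wavelength_m

    print(f"wavelength ~ {wavelength_m:.1f} m, frequency ~ {frequency_hz:.0f} Hz")
    # Prints roughly 11.0 m and 31 Hz: a low rumble that a simple barricade
    # can't block, but ridges spaced on the order of the wavelength can scatter.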

Buitenschot Land Art Park

To the amusement of the people in the area, the 80-acre swath of ridges adds entertainment to utility. Paths for pedestrians and bicycles slice between the grass ridges, and De Kort has even incorporated works of art into the park, including “Listening Ear,” a dish with a gap in the middle that amplifies sound, and “Chladni-Pond,” a diamond-shaped pond where park guests can power a wave mechanism with their feet.

I may be screwing this person over

Wednesday, July 19th, 2017

A recent Freakonomics podcast looks at civic-minded Harvard physician Richard Clarke Cabot’s long-running Cambridge-Somerville Youth Study, which matched troubled boys with mentors — versus a matched control group who received no mentoring:

They found a null effect. They found there were no differences between the treatment and control boys on offending.

When computers came on the scene and researchers could analyze the data in finer detail, they made an interesting discovery:

On all seven measures — we’re talking: how long did you live? Were you a criminal? Were you mentally healthy, physically healthy, alcoholic, satisfied with your job, satisfied with your marriage? On all seven measures, the treatment group did statistically significantly worse than the control group.

The lesson:

And that’s one of the important things: people who are engaged in social interventions really don’t spend much time thinking, “I may be screwing this person over.” They are conscious only of, “Maybe this won’t work, but I’ve got to try!”

You can get away with as little as one minute of effort

Wednesday, July 19th, 2017

Scientists out of McMaster University recently conducted research on the shortest interval training ever:

To see just how little you can get away with when it comes to interval training for health purposes, the researchers brought in 25 less-than-in-shape young men (future studies will focus on women). They tested their levels of aerobic fitness and their ability to use insulin in the right way to control blood sugar, and biopsied their muscles to see how well they functioned on a cellular level.

Then they split them into a control group, a moderate-intensity-exercise group, and a sprint interval training (SIT) group.

The control group did nothing differently at all.

The moderate-intensity group did a typical I’m-at-the-gym routine of a two-minute warm-up, 45 minutes on the stationary bike, and a three-minute cool down, three times a week.

The SIT group did the shortest interval training ever recorded thus far by science. Participants warmed up for two minutes on a stationary bike, then sprinted full-out for 20 seconds, then rode for two minutes very slowly. They repeated this twice (for a total of three sets). The whole workout took 10 minutes, with only one minute being high-intensity.

All of the groups kept at it for 12 weeks, or about twice as long as most previous studies.

The results?

The control group, as expected, had no change in results.

The two other groups enjoyed results that were basically identical to each other’s. In both, scientists found a 20 percent increase in cardiovascular endurance, good improvements in insulin resistance, and significant increases in the cells responsible for energy production and oxygen in the muscles (thanks, biopsies).

That is remarkable. By the end, the moderate-intensity group had ridden for 27 hours, while the SIT group had ridden for 6 total hours, just 36 minutes of which was arduous.

This means one group spent about 10 total minutes on each workout, while the other spent 50 minutes. The SIT group got the same benefits in a fifth of the time.
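The arithmetic holds up under the stated schedule of three sessions a week for twelve weeks. A minimal tally (my own assumption: the 27-hour figure counts only the 45-minute rides, not the warm-ups and cool-downs):

    # Back-of-the-envelope tally of the riding times reported above, assuming
    # three sessions a week for twelve weeks (36 workouts in total) and
    # counting only the 45-minute ride for the moderate-intensity group.
    sessions = 3 * 12                                # 36 workouts

    moderate_ride_hours = sessions * 45 / 60         # 27.0 hours on the bike
    sit_total_hours = sessions * 10 / 60             # 6.0 hours of whole workouts
    sit_hard_minutes = sessions * 3 * 20 / 60        # 36.0 minutes of sprinting

    print(moderate_ride_hours, sit_total_hours, sit_hard_minutes)
    # 27.0 hours versus 6.0 hours, only 36 minutes of which were all-out.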

The games get increasingly difficult as the player’s heart rate increases

Tuesday, July 18th, 2017

Boston Children’s Hospital researchers have developed videogames for children who need to learn how to control their emotions better:

The videogames track a child’s heart rate, which is displayed on the screen. The games get increasingly difficult as the player’s heart rate increases. To be able to resume playing without extra obstacles, the child has to calm down and reduce their heart rate.

[...]

The impact of the games was tested in two studies.

In a pilot study, they first tested the game in a psychiatric inpatient unit with children who had anger-management issues, said Joseph Gonzalez-Heydrich, director of the developmental neuropsychiatry clinic at Boston Children’s. They found improvements in just five days and published the results in the journal Adolescent Psychiatry in 2012.

“A lot of these kids we are seeing are not interested in psychotherapy and talking,” said Dr. Gonzalez-Heydrich, who is head of the scientific advisory board of Mighteor, and said he has a small amount of equity in the company. “But they will work really hard to get good at a videogame.”

In a subsequent outpatient study, the researchers randomized 20 youths to 10 cognitive behavior therapy (CBT) sessions plus videogame therapy that required them to control their heart rate, and 20 youths to CBT with the same videogame but without the heart-rate link. All the adolescents had anger or aggression problems, said Dr. Gonzalez-Heydrich, who was senior author of the study.

Therapists interviewed the children’s primary caregiver before and two weeks after their last therapy session. They found the children’s ratings on aggression and opposition were reduced much more in the group that played the game with the built-in biofeedback. The ratings for anger went down about the same in both groups. The findings were presented at the American Academy of Child and Adolescent Psychiatry conference in 2015. The study is currently under review for publication.
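The mechanic itself is plain biofeedback: read the heart rate, and gate the game’s difficulty on it, piling on obstacles when the child gets agitated and easing off once the heart rate comes back down. A minimal sketch of that loop in Python (the thresholds and the read_heart_rate/set_extra_obstacles hooks are invented for illustration, not taken from the Mighteor games):

    # Hypothetical sketch of a heart-rate-gated difficulty loop in the spirit
    # of the biofeedback mechanic described above. The thresholds and the
    # read_heart_rate/set_extra_obstacles hooks are invented for illustration.
    import time

    CALM_BPM = 90        # assumed threshold; the real games calibrate per child
    ELEVATED_BPM = 110   # assumed threshold above which extra obstacles appear

    def run_biofeedback_loop(read_heart_rate, set_extra_obstacles, poll_seconds=1.0):
        """Poll a heart-rate sensor and scale game difficulty with arousal."""
        while True:
            bpm = read_heart_rate()
            if bpm >= ELEVATED_BPM:
                # Agitated: pile on obstacles until the player settles down.
                set_extra_obstacles(True)
            elif bpm <= CALM_BPM:
                # Calm again: clear the extra obstacles so normal play resumes.
                set_extra_obstacles(False)
            # Between the two thresholds nothing changes, a hysteresis band
            # that keeps difficulty from flickering around a single cut-off.
            time.sleep(poll_seconds)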

Think you drink a lot?

Tuesday, July 18th, 2017

Think you drink a lot? This chart will tell you:

These figures come from Philip J. Cook’s Paying the Tab, an economically-minded examination of the costs and benefits of alcohol control in the U.S. Specifically, they’re calculations made using the National Epidemiologic Survey on Alcohol and Related Conditions (NESARC) data.

Drinks per Capita by Decile

“One consequence is that the heaviest drinkers are of greatly disproportionate importance to the sales and profitability of the alcoholic-beverage industry,” he writes. “If the top decile somehow could be induced to curb their consumption level to that of the next lower group (the ninth decile), then total ethanol sales would fall by 60 percent.”

(Hat tip to P.D. Mangan.)

Trevor Butterworth considers this data journalism gone wrong:

If we look at the section where he arrives at this calculation, and go to the footnote, we find that he used 2001-2002 data from NESARC, a survey run by the National Institute on Alcohol Abuse and Alcoholism with a representative sample of 43,093 adults over the age of 18. But following this footnote, we find that Cook corrected these data for under-reporting by multiplying the number of drinks each respondent claimed they had drunk by 1.97 in order to comport with the previous year’s sales data for alcohol in the US. Why? It turns out that alcohol sales in the US in 2000 were double what NESARC’s respondents — a nationally representative sample, remember — claimed to have drunk.

While the mills of US dietary research rely on the great National Health and Nutrition Examination Survey to digest our diets and come up with numbers, we know, thanks to the recent work of Edward Archer, that recall-based survey data are highly unreliable: we misremember what we ate, we misjudge by how much; we lie. Were we to live on what we tell academics we eat, life for almost two thirds of Americans would be biologically implausible.

But Cook, who is trying to show that the distribution of drinking is uneven, ends up trying to solve an apparent recall problem by creating an aggregate multiplier to plug the sales data gap. And the problem is that this requires us to believe that every drinker misremembered by a factor of almost two. This might not be much of a stretch for moderate drinkers; but did everyone who drank, say, four or eight drinks per week systematically forget that they actually had eight or sixteen? That seems like a stretch.
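Concretely, the correction Butterworth objects to is a single scaling factor: divide recorded sales by the survey total and multiply every respondent’s self-report by that ratio. A toy version with made-up numbers (only the roughly 2x sales-to-survey ratio comes from the text):

    # Toy version of the aggregate under-reporting correction described above:
    # every respondent's self-reported drinking is scaled by one common factor
    # so that the survey total matches recorded sales. The individual figures
    # are invented; only the roughly 2x sales-to-survey ratio reflects the text.
    reported_drinks_per_week = [0, 1, 3, 7, 14, 30]   # hypothetical respondents

    survey_total = sum(reported_drinks_per_week)      # 55 drinks reported
    sales_total = 108.0                               # pretend sales imply ~2x more

    multiplier = sales_total / survey_total           # about 1.96
    adjusted = [drinks * multiplier for drinks in reported_drinks_per_week]

    print(round(multiplier, 2), [round(d, 1) for d in adjusted])
    # The seven-drinks-a-week respondent becomes roughly fourteen, which is
    # exactly the across-the-board leap Butterworth finds hard to believe.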

In the early stages of Alzheimer’s glycation damages an enzyme called MIF

Monday, July 17th, 2017

Abnormally high blood sugar levels are linked to Alzheimer’s, and now the mechanism has become clearer:

Diabetes patients have an increased risk of developing Alzheimer’s disease compared to healthy individuals. In Alzheimer’s disease, abnormal proteins aggregate to form plaques and tangles in the brain, which progressively damage it and lead to severe cognitive decline.

Scientists already knew that glucose and its breakdown products can damage proteins in cells via a reaction called glycation, but the specific molecular link between glucose and Alzheimer’s was not understood.

But now scientists from the University of Bath’s Departments of Biology and Biochemistry, Chemistry, and Pharmacy and Pharmacology, working with colleagues at the Wolfson Centre for Age-Related Diseases, King’s College London, have unraveled that link.

By studying brain samples from people with and without Alzheimer’s using a sensitive technique to detect glycation, the team discovered that in the early stages of Alzheimer’s, glycation damages an enzyme called MIF (macrophage migration inhibitory factor), which plays a role in immune response and insulin regulation.

MIF is involved in the response of brain cells called glia to the build-up of abnormal proteins in the brain during Alzheimer’s disease, and the researchers believe that inhibition and reduction of MIF activity caused by glycation could be the ‘tipping point’ in disease progression. It appears that as Alzheimer’s progresses, glycation of these enzymes increases.

Impregnable to the waves and every day stronger

Sunday, July 16th, 2017

Pliny the Elder, in his Natural History, described Rome’s concrete as “impregnable to the waves and every day stronger” — which, it turns out, was literally true:

Writing in the journal American Mineralogist, Jackson and colleagues describe how they analysed concrete cores from Roman piers, breakwaters and harbours.

Previous work had revealed lime particles within the cores that surprisingly contained the mineral aluminous tobermorite — a rare substance that is hard to make.

The mineral, said Jackson, formed early in the history of the concrete, as the lime, seawater and volcanic ash of the mortar reacted together in a way that generated heat.

But now Jackson and the team have made another discovery. “I went back to the concrete and found abundant tobermorite growing through the fabric of the concrete, often in association with phillipsite [another mineral],” she said.

She said this revealed another process that was also at play. Over time, seawater that seeped through the concrete dissolved the volcanic crystals and glasses, with aluminous tobermorite and phillipsite crystallising in their place.

These minerals, say the authors, helped to reinforce the concrete, preventing cracks from growing, with structures becoming stronger over time as the minerals grew.

By contrast, modern concrete, based on Portland cement, is not supposed to change after it hardens — meaning any reactions with the material cause damage.

Deep down, they really want a king or queen

Saturday, July 15th, 2017

Ross Douthat recently teased liberals that they really like Game of Thrones because, deep down, they really want a king or queen. One respondent, Phillips, considers that a strong misreading of what Martin’s story and the show are offering:

To say that Game of Thrones is attractive to liberals because of secret monarchical longings, you have to ignore…everything GoT is doing. GoT does not make being a Stark bannerman or a Daenerys retainer look fun! Those people get flayed and beheaded! GoT presents a vision of monarchy that is exaggeratedly dystopian even compared to most of the historical reality of monarchy. I think that dystopian exaggeration is in fact key to the show’s appeal to liberals in many ways. It lets you fantasize about the negation of your principles while simultaneously confirming their rightness. GoT presents a vision of a world in which illiberal instincts can be freely indulged, in which the id is constrained only by physical power. All the violent, nasty stuff liberal society (thankfully) won’t let us do, but that’s still seething in our lizard brains, gets acted out. And not just acted out — violence and brutality are the organizing principles on which the world is based.

But this is where the dystopianism comes in, because the show chides you for harboring the very fantasies it helps you gratify. It wallows in their destructive consequences — makes that wallowing, in fact, simultaneous with the fulfillment of the fantasies. Will to power leads to suffering and chaos, which lead to more opportunities for the will to power to be acted upon, etc. This is a vastly more complex and interesting emotional appeal than “people secretly want kings.” The liberal order is always being implicitly upheld by the accommodation of our base desire for its opposite. To me, this is the most interesting ongoing thing about GoT, a franchise I’m otherwise completely tired of. Everyone wants to move to Hogwarts; only a lunatic would actually want to LIVE in Westeros. In an escapist genre, that’s interesting. It’s not subliminal royalism; it’s dark escapism, an escape that ultimately tends toward reconciliation with the existing order.

And what do liberals secretly love more than an excuse to reconcile with the existing order? Westeros makes Prime Day look utopian!

It is “a very good description of what a lot of prestige television has done,” Douthat agrees, but Game of Thrones is different:

These shows [The Sopranos, Mad Men, and Breaking Bad] invite liberal viewers into various illiberal or pre-liberal or just, I suppose, red-state worlds, which are more violent and sexist and id-driven than polite prestige-TV-viewing liberal society, and which offer viewers the kind of escapism that Phillips describes … in which there is a temporary attraction to being a mobster or hanging out with glamorous chain-smoking ’50s admen or leaving your put-upon suburban life behind and becoming Heisenberg the drug lord. But then ultimately because these worlds are clearly wicked, dystopic or just reactionary white-male-bastions you can return in relief to the end of history, making Phillips’ “reconciliation with the existing order” after sojourning for a while in a more inegalitarian or will-to-power world.

[...]

“Game of Thrones,” however, is somewhat different. Yes, it makes the current situation in Westeros look hellish, by effectively condensing all of the horrors of a century of medieval history into a few short years of civil war. And yes, it’s much darker and bloodier and has a much higher, “wait, I thought he was a hero” body count than a lot of fantasy fiction, which lets people describe it as somehow Sopranos-esque.

But fundamentally “The Sopranos” was a story without any heroes, a tragedy in which the only moral compass (uncertain as Dr. Melfi’s arrow sometimes was) was supplied by an outsider to its main characters’ world. Whereas “Game of Thrones” is still working within the framework of its essentially romantic genre — critiquing it and complicating it, yes, but also giving us a set of heroes and heroines to root for whose destinies are set by bloodlines and prophecies, and who are likely in the end to save their world from darkness and chaos no less than Aragorn or Shea Ohmsford or Rand al’Thor.

Put another way: On “The Sopranos,” there is no right way to be a mafioso. But on “Game of Thrones” there is a right way to be a lord or king and knight, and there are characters who model the virtues of each office, who prove that chivalry and wise lordship need not be a myth. Sometimes they do so in unexpected ways — the lady knight who has more chivalry than the men who jeer at her, the dwarf who rules more justly than the family members who look down on him. But this sort of reversal is typical of the genre, which always has its hobbits and stable boys and shieldmaidens ready to surprise the proud and prejudiced. And it coexists throughout the story with an emphasis on the importance of legitimacy and noblesse oblige and dynastic continuity, which is often strikingly uncynical given the dark-and-gritty atmosphere.

Consider that the central family, the Starks, are wise rulers whose sway over the North has endured for an implausible number of generations — “there has always been a Stark in Winterfell,” etc. — and whose people seem to genuinely love them. Their patriarch is too noble for his own good, but only because he leaves his native fiefdom for the corruption of the southern court, and his naivete is still presented as preferable to the cynicism of his Lannister antagonists, who win temporary victories but are on their way to destroying their dynasty through their amorality and single-minded self-interestedness.

The problem is managing that blend while maintaining stability

Saturday, July 15th, 2017

Kaja Perina explores the mad genius mystery:

Grothendieck’s mindset embodied what polymath Alan Turing described as mathematical reasoning itself, “the combination of two faculties, which we may call ‘intuition and ingenuity.’” Grothendieck’s approach was to “dissolve” problems by finding the right level of generality at which to frame them. Mathematician Barry Mazur, now at Harvard, recalls conversations with Grothendieck as having been “largely, perhaps only, about viewpoint, never about specifics. It was principally ‘the right vantage,’ a way of seeing mathematics, that he sought, and perhaps only on a lesser level its byproducts.”

Grothendieck’s unique vantage point and thought style contributed to his genius. But they were also his undoing. The prospect of mathematical madness has been debated ever since Pythagoras, often described as the first pure mathematician, went on to lead a strange cult. Isaac Newton, Kurt Gödel, Ludwig Boltzmann, Florence Nightingale, and John Nash all attained mathematical prominence before succumbing to some type of psychopathology, including depression, delusions, and religious mysticism of the sort engendered by psychosis.

[...]

Thinking styles lie on a continuum. On one end is mechanistic, rule-based thinking, which is epitomized in minds that gravitate to math, science, engineering, and tech-heavy skill-sets. Mechanistic cognition is bottom-up, concerned with the laws of nature and with objects as they exist in the world, and stands in contrast to mentalistic thinking. Mentalistic cognition exists to decode and engage with the minds of others, both interpersonally and in terms of larger social forces. It is more holistic (top-down) and humanistic, concerned, broadly speaking, with people, not with things. This mindset makes loose, sometimes self-referential inferences about reality. If “hypermentalistic,” too much meaning will be attributed to events: All coincidences are meaningful and all events are interconnected.

Every mind lies somewhere on this diametric cognitive spectrum. And as with many spectra, at each extreme the signature thought style is dialed up too high to be fully functional. Autism, in this conceptualization, is an extreme form of mechanistic thinking. It stands in contrast to psychotic disorders, characterized by false beliefs in the sentience of inanimate objects and delusions about the self and others. Reading minds is the lingua franca of mentalistic cognition, and symptoms of psychosis are essentially mind-reading on steroids.

Extreme cognitive styles map onto genius in that autism is in some cases associated with high intelligence. General intelligence, after all, includes the ability to quickly master rule-based, highly abstract thinking. And psychotic spectrum disorders, including bipolar disorder, schizotypy, and schizophrenia, are disproportionately diagnosed in highly creative individuals (they’ve been most often measured in artists, musicians, and writers) or in their first-degree relatives. Grothendieck’s broad-spectrum thought style represents both off-the-charts intelligence and unparalleled creativity. It is within this rarefied space that genius may reside. And the overshoot toward either pole—or to both—may, by the same token, engender mental illness.

[...]

The genius-madness debate has gone off course in asking whether creative individuals are at greater risk for developing mental illness than are their noncreative peers. Some are, some are not. The matter is confounded by the degree of giftedness in play. While creative types are more mentally stable than are noncreatives, the correlation reverses in the presence of exceptional creativity. Dean Keith Simonton, a professor of psychology at the University of California at Davis, finds that extraordinarily creative individuals are more likely to exhibit psychopathology than are noncreative people. He dubs this the “Mad Genius Paradox.”

An inability to filter out seemingly irrelevant information is a hallmark of both creative ideation and disordered thought. The state, known as reduced latent inhibition, allows more information to reach awareness, which can in turn foster associations between unrelated concepts. The barrage accounts for both the nonsensical ideas seen in psychosis and for novel thinking.

Over the centuries mathematical and artistic minds (and those with both gifts, such as the writer David Foster Wallace) have opined that their accomplishments flowed from the same liminal zone that harbored their greatest challenges. “The ideas I had about supernatural beings came to me the same way that my mathematical ideas did,” John Nash stated when asked why he’d once believed in space aliens.

And yet divergent thinking, while necessary for creative leaps, is hardly sufficient. In that direction lies a mind’s unravelling. Cognitive control and high intelligence must also be present, both to manage the informational cascade and to make novel use of it. “There are abnormalities of the brain that, when they co-exist with certain cognitive strengths, allow visionary thought to occur,” says psychologist Shelley Carson, who lectures on creativity at Harvard and is the author of Your Creative Brain.

“High productivity is associated with both intelligence and with high creativity, whether of a schizotypal or an autistic nature,” states Rex Jung, a neuropsychologist who studies creativity and intelligence at the University of New Mexico in Albuquerque. “These unusual characteristics are all distributed around the edges of a normal bell curve, making the possibility of [these minds] producing something new much more likely.”

The exceptional intelligence required for genius-level contributions to mathematics may not just optimize divergent thinking. It may also delay or prevent mental illness in those who are susceptible, at least for a significant period of time. Among men, the typical age of onset of schizophrenia or other psychotic spectrum disorders is in the late teens or early twenties. Yet Grothendieck, Newton, and Nash did not demonstrate thinking that could be characterized as delusional or psychotic until later in life: Nash at 30, Newton and Grothendieck well into midlife. From a neuronal perspective, the normal process of demyelination that begins in the mid-forties leads to a weakening of executive networks that are neuroprotective, explains Jung. Myelin function impacts processing speed, which is a key individual difference in intelligence, so “it makes sense that someone who is highly intelligent and has a propensity to mental illness might begin to experience symptoms in this age range.”

Carson believes that these men were initially protected from illness not only by their brilliance but also by their drive to create.

This idea is echoed in the words of Pierre Cartier, among Grothendieck’s most accomplished contemporaries, who wrote that while he wished to avoid diagnosing his peer, he nonetheless considered Grothendieck’s output a buffer in his precarious mental state. “His capacity for scientific creation was the best antidote to depression, and the immersion in a living scientific milieu (Bourbaki and IHES) helped this to take place by giving it a collective dimension.”

It has been said that the ultimate mathematician is one who can see analogies between analogies. Such was the case for Grothendieck. He used metaphors, often of buildings, such as la belle demeure parfaite (the perfect mansion), to describe the solutions he sought. This is captured even in the language used by others to describe Grothendieck’s work: “Mathematicians like to walk along narrow little paths in unknown landscapes, looking for beautiful scenery or just for precious stones, but [Grothendieck] started by building a highway,” wrote Valentin Poenaru, a mathematician who knew him well. “Where some might build an acrobatic bridge between two distant mountain tops, he would just fill up the space between.”

Might the twin poles of ultramechanistic and ultracreative thinking be the variables needed for mathematical genius in particular? As Carson sees it, “People who have cognitive disinhibition added to high systematizing ability ‘see’ systems that others cannot see—visions interlocking and put together.”

[...]

Cognitive disinhibition is integral to the creative process: It underlies loose, associative thinking as well as the inability to filter out seemingly extraneous information. This type of thinking occurs when the executive control network, responsible for higher-order cognitive tasks, ramps down in favor of the default mode network. This brain network is active in the absence of explicit attentional goals; it comes online when daydreaming or parsing the minds of others—which accounts for its dominance in mentalistic thinking. Imaging studies confirm that both highly creative individuals and those at risk for psychotic spectrum disorders exhibit unusual patterns of connectivity between the two networks.

The default network and the central executive network normally operate in dynamic tension, a hallmark of mental health as well as of high IQ. Anthony Jack, a neuroscientist who studies these regions in his Brain, Mind and Consciousness Lab at Case Western Reserve University, explains that “creativity is the only desirable area in which the seesaw is disrupted. During ‘aha’ moments you have engagement of both. Genius comes from blending the two sides. The problem is managing that blend while maintaining stability.”

Slipper bread goes way, way back

Friday, July 14th, 2017

After reading about David Brooks’ beleaguered friend “with only a high school degree” who had to confront “ingredients like soppressata, capicollo and a striata baguette” at the gourmet sandwich shop, I thought of my own privileged progeny growing up with ciabatta, which I did not remember from my own childhood — for a very good reason:

Ciabatta (literally slipper bread) is an Italian white bread made from wheat flour, water, salt, and yeast, created in 1982 by a baker in Verona, Veneto, Italy, in response to the popularity of French baguettes.