Instructional Videos

Tuesday, October 18th, 2016

Instructional videos are popular and effective, because we’re designed to learn through imitation:

Last year, it was estimated that YouTube was home to more than 135 million how-to videos. In a 2008 survey, “instructional videos” were ranked as the site’s third most popular content category — albeit a “distant third” behind “performance and exhibition” and “activism and outreach.” More recent data suggest that distance may have closed: In 2015, Google noted that “how to” searches on YouTube were increasing 70 percent annually. The genre is by now so mature that it makes for easy satire.


A 2014 study showed that when a group of marmosets were presented with an experimental “fruit” apparatus, most of those that watched a video of marmosets successfully opening it were able to replicate the task. They had, in effect, watched a “how to” video. Of the 12 marmosets who managed to open the box, just one figured it out sans video (in the human world, he might be the one making YouTube videos).


“We are built to observe,” as Proteau tells me. There is, in the brain, a host of regions that come together under a name that seems to describe YouTube itself, called the action-observation network. “If you’re looking at someone performing a task,” Proteau says, “you’re in fact activating a bunch of neurons that will be required when you perform the task. That’s why it’s so effective to do observation.”


This ability to learn socially, through mere observation, is most pronounced in humans. In experiments, human children have been shown to “over-imitate” the problem-solving actions of a demonstrator, even when superfluous steps are included (chimps, by contrast, tend to ignore these). Susan Blackmore, author of The Meme Machine, puts it this way: “Humans are fundamentally unique not because they are especially clever, not just because they have big brains or language, but because they are capable of extensive and generalised imitation.” In some sense, YouTube is catnip for our social brains. We can watch each other all day, every day, and in many cases it doesn’t matter much that there’s not a living creature involved. According to Proteau’s research, learning efficiency is unaffected, at least for simple motor skills, by whether the model being imitated is live or presented on video.

There are ways to learn from videos better:

The first has to do with intention. “You need to want to learn,” Proteau says. “If you do not want to learn, then observation is just like watching a lot of basketball on the tube. That will not make you a great free throw shooter.” Indeed, as Emily Cross, a professor of cognitive neuroscience at Bangor University told me, there is evidence — based on studies of people trying to learn to dance or tie knots (two subjects well covered by YouTube videos) — that the action-observation network is “more strongly engaged when you’re watching to learn, as opposed to just passively spectating.” In one study, participants in an fMRI scanner asked to watch a task being performed with the goal of learning how to do it showed greater brain activity in the parietofrontal mirror system, cerebellum and hippocampus than those simply being asked to watch it. And one region, the pre-SMA (for “supplementary motor area”), a region thought to be linked with the “internal generation of complex movements,” was activated only in the learning condition — as if, knowing they were going to have to execute the task themselves, participants began internally rehearsing it.

It also helps to arrange for the kind of feedback that makes a real classroom work so well. If you were trying to learn one of Beyonce’s dance routines, for example, Cross suggests using a mirror, “to see if you’re getting it right.” When trying to learn something in which we do not have direct visual access to how well we are doing — like a tennis serve or a golf swing — learning by YouTube may be less effective.


The final piece of advice is to look at both experts and amateurs. Work by Proteau and others has shown that subjects seemed to learn sample tasks more effectively when they were shown videos of both experts performing the task effortlessly, and the error-filled efforts of novices (as opposed to simply watching experts or novices alone). It may be, Proteau suggests, that in the “mixed” model, we learn what to strive for as well as what to avoid.

The ending of the liberal interregnum

Saturday, October 15th, 2016

Razib Khan shares a talk from Alice Dreger, author of Galileo’s Middle Finger: Heretics, Activists, and One Scholar’s Search for Justice, and notes a passage where she waxes eloquent about the Enlightenment and freedom of thought:

At a certain point the cultural Left no longer made any pretense to being liberal, and transformed themselves into “progressives.” They have taken Marcuse’s thesis in Repressive Tolerance to heart.

Though I hope that Dreger and her fellow travelers succeed in rolling back the clock, I suspect that the battle here is lost. She points out, correctly, that the total politicization of academia will destroy its existence as a producer of truth in any independent and objective manner. More concretely, she suggests it is likely that conservatives will simply start to defund and direct higher education even more stridently than they do now, because they will correctly see higher education as purely a tool toward the politics of their antagonists. I happen to be a conservative, and one who is pessimistic about the persistence of a public liberal space for ideas that offend. If progressives give up on liberalism of ideas, and it seems that many are (the most famous defenders of the old ideals are people from earlier generations, such as Nadine Strossen and Wendy Kaminer, with Dreger being a young example), I can’t see those of us in the broadly libertarian wing of conservatism making the last stand alone.

Honestly, I don’t want any of my children learning “liberal arts” from the high priests of the post-colonial cult. In the near future the last resistance on the Left to the ascendancy of identity politics will probably be extinguished, as the old guard retires and dies naturally. The battle will be lost. Conservatives who value learning, and intellectual discourse, need to regroup. Currently there is a populist mood in conservatism that has been cresting for a generation. But the wave of identity politics is likely to swallow the campus Left with its intellectual nihilism. Instead of expanding outward it is almost certain that academia will start cannibalizing itself in internecine conflict when all the old enemies have been vanquished.

Let the private universities, such as Oberlin, wallow in their identity politics contradictions. Dreger already points to the path we will probably have to take: gut the public universities even more than we have. Leave STEM and some professional schools intact, and transform them for all practical purposes into technical universities. All the other disciplines? Some private universities, the playgrounds of the rich and successful, will continue to be traditionalist in maintaining “liberal arts,” which properly parrot the latest post-colonial cant. But much learning will be privatized, and knowledge will spread through segregated “safe spaces.” Those of us who read and think will continue to read and think, like we always have. We just won’t have institutional backing, because there’s not going to be a societal consensus for such support.

I hope I’m wrong.

He shares two more conclusions in a comment:

It’s getting worse, not better, and it’s not about tenure or money. It’s about social sanction and approval. So two sad conclusions:

1) Truth can only move in hidden channels now if it conflicts with power. No one gives a shit if you appeal to truth; they know that it has no intrinsic value except in the service of status and power. I admire Heterodox Academy, but part of me wonders if they’d be better served by being stealth and just creating a secret society that doesn’t put the academy on notice that some people know that reality is different from the official narratives.

2) The post-modernists are right to a first approximation: everything is power. So “we” have to capture and crush; it’s only victory or defeat. The odds are irrelevant. I put we in quotes because it doesn’t matter who you are, the game is on, whether you think you are a player or not.

Open data and crowd-sourcing mean that a whole ecosystem of knowledge can emerge that doesn’t need to be nakedly exposed and put people’s livelihoods and reputations at risk from the kommissars.

Some of my friends have argued this for a long time, and I resisted because I’m a liberal in the old sense. But reality is reality, and the fact is that no one wants the truth, and they’ll destroy you to deny it.

For every Alice Dreger there are 1,000 who support her. But they’ll stand aside while the 100 tear her to shreds, and talk sadly amongst themselves about what happened to her career…

Insulin and Alzheimer’s

Thursday, October 13th, 2016

Insulin resistance may be a powerful force in the development of Alzheimer’s Disease:

In the body, one of insulin’s responsibilities is to unlock muscle and fat cells so they can absorb glucose from the bloodstream. When you eat something sweet or starchy that causes your blood sugar to spike, the pancreas releases insulin to usher the excess glucose out of the bloodstream and into cells. If blood sugar and insulin spike too high too often, cells will try to protect themselves from overexposure to insulin’s powerful effects by toning down their response to insulin — they become “insulin resistant.” In an effort to overcome this resistance, the pancreas releases even more insulin into the blood to try to keep glucose moving into cells. The more insulin levels rise, the more insulin resistant cells become. Over time, this vicious cycle can lead to persistently elevated blood glucose levels, or type 2 diabetes.

In the brain, it’s a different story. The brain is an energy hog that demands a constant supply of glucose. Glucose can freely leave the bloodstream, waltz across the blood-brain barrier, and even enter most brain cells — no insulin required. In fact, the level of glucose in the cerebrospinal fluid surrounding your brain is always about 60% as high as the level of glucose in your bloodstream — even if you have insulin resistance — so, the higher your blood sugar, the higher your brain sugar.

Not so with insulin — the higher your blood insulin levels, the more difficult it can become for insulin to penetrate the brain. This is because the receptors responsible for escorting insulin across the blood-brain barrier can become resistant to insulin, restricting the amount of insulin allowed into the brain. While most brain cells don’t require insulin in order to absorb glucose, they do require insulin in order to process glucose. Cells must have access to adequate insulin or they can’t transform glucose into the vital cellular components and energy they need to thrive.

Despite swimming in a sea of glucose, brain cells in people with insulin resistance literally begin starving to death.

Which brain cells go first? The hippocampus is the brain’s memory center. Hippocampal cells require so much energy to do their important work that they often need extra boosts of glucose. While insulin is not required to let a normal amount of glucose into the hippocampus, these special glucose surges do require insulin, making the hippocampus particularly sensitive to insulin deficits. This explains why declining memory is one of the earliest signs of Alzheimer’s, despite the fact that Alzheimer’s Disease eventually destroys the whole brain.

Looking into the brains of habitual short sleepers

Wednesday, October 12th, 2016

A recent study looked into the brains of habitual short sleepers:

The team compared data from people who reported a normal amount of sleep in the past month with those who reported sleeping six hours or less a night. They further divided the short sleepers into two groups: those who reported daytime dysfunction, such as feeling too drowsy to perform common tasks or keep up enthusiasm, and those who reported feeling fine.

Both groups of short sleepers exhibited connectivity patterns more typical of sleep than wakefulness while in the MRI scanner. Anderson says that although people are instructed to stay awake while in the scanner, some short sleepers may have briefly drifted off, even those who denied dysfunction. “People are notoriously poor at knowing whether they’ve fallen asleep for a minute or two,” he says. For the short sleepers who deny dysfunction, one hypothesis is that their wake-up brain systems are perpetually in over-drive. “This leaves open the possibility that, in a boring fMRI scanner they have nothing to do to keep them awake and thus fall asleep,” says Jones. This hypothesis has public safety implications, according to Curtis. “Other boring situations, like driving an automobile at night without adequate visual or auditory stimulation, may also put short sleepers at risk of drowsiness or even falling asleep behind the wheel,” he says.

Looking specifically at differences in connectivity between brain regions, the researchers found that short sleepers who denied dysfunction showed enhanced connectivity between sensory cortices, which process external sensory information, and the hippocampus, a region associated with memory. “That’s tantalizing because it suggests that maybe one of the things the short sleepers are doing in the scanner is performing memory consolidation more efficiently than non-short sleepers,” Anderson says. In other words, some short sleepers may be able to perform sleep-like memory consolidation and brain tasks throughout the day, reducing their need for sleep at night. Or they may be falling asleep during the day under low-stimulation conditions, often without realizing it.

Where Creativity Comes From

Tuesday, October 11th, 2016

The old adage about inventiveness is that it stems from necessity:

Based on his studies of orangutans, primatologist Carel van Schaik of the University of Zurich has come to a very different view. “When food is scarce, orangutans go into energy-saving mode. They minimize movement and focus on unappealing fall-back foods,” he observed. Their strategy in this scenario is quite the opposite of innovation, but it makes sense. “Trying something new can be risky — you can get injured or poisoned — and it requires a serious investment of time, energy and attention, while the outcome is always uncertain,” van Schaik explains.

Research on humans faced with scarcity echoes van Schaik’s orangutan findings. In 2013, Science published a study by economist Sendhil Mullainathan of Harvard University and psychologist Eldar Shafir of Princeton University describing how reminding people with a low income of their financial trouble reduced their capacity to think logically and solve problems in novel situations. A subsequent study found that Indian sugarcane farmers performed much better on the same cognitive performance test after receiving the once-a-year payment for their produce, temporarily resolving their monetary concerns. (Farmers who did not take the test previously did comparably well after getting paid, so it is unlikely that the improvement was simply the consequence of prior experience with the test.) People will do whatever it takes to survive, of course, which may occasionally lead to innovations. But as these and other studies suggest, if one’s mind is constantly occupied with urgent problems, such as finding food or shelter or paying bills, there will not be much capacity left to come up with long-term solutions to better one’s livelihood.

So where does creativity come from? Insights have come from the surprising observation that orangutans can be incredibly creative in captivity. “If food is provided for and predators are absent, they suddenly have a lot of time on their hands, free from such distractions,” van Schaik explains. Furthermore, in their highly controlled environments, exploration rarely has unpleasant consequences, and there are many unusual objects to play around with. Under such circumstances, orangutans appear to lose their usual fear of the unknown. In a study published in the American Journal of Primatology in 2015, van Schaik and his colleagues compared the response of wild and captive orangutans to a newly introduced object, a small platform in the shape of an orangutan nest. While captive orangutans approached the new object almost immediately, most wild orangutans, though habituated to the presence of humans, didn’t even go near it during several months of testing — only one eventually dared to touch it. Such fear of novelty may pose a significant obstacle to creativity: if an animal avoids approaching any new objects, innovations become rather unlikely. “So if you ask me, opportunity is the mother of invention,” van Schaik remarks.

Thought in its First, Molten State

Monday, October 10th, 2016

Philosophy never seems to be making progress:

One of Gottlieb’s central insights is that, as he wrote in his previous volume, “The Dream of Reason,” which covered thought from the Greeks to the Renaissance, “the history of philosophy is more the history of a sharply inquisitive cast of mind than the history of a sharply defined discipline.” You might say that philosophy is what we call thought in its first, molten state, before it has had a chance to solidify into a scientific discipline, like psychology or cosmology. When scientists ask how people think or how the universe was created, they are addressing the same questions posed by philosophy hundreds or even thousands of years earlier. This is why, Gottlieb observes, people complain that philosophy never seems to be making progress: “Any corner of it that comes generally to be regarded as useful soon ceases to be called philosophy.”

Growing Plants on Mars

Sunday, October 9th, 2016

Growing plants on Mars ain’t easy:

Drew Palmer, an assistant professor of Biological Sciences, Brooke Wheeler, an assistant professor at the College of Aeronautics, and astrobiology majors from the Department of Physics and Space Sciences are growing Outredgeous lettuce (a variety of red romaine) in different settings — Earth soil, analog Martian surface material known as regolith simulant, and regolith simulant with nutrients added — to find the magic formula for the type and amount of nutrients needed to grow a plant in inhospitable Martian dirt.

“We have to get the regolith right or anything we do won’t be valid,” said Andy Aldrin, director of the Buzz Aldrin Space Institute.

Unlike Earth soil, Martian regolith contains no helpful organic matter and has fewer minerals plants need for food, such as phosphates and nitrates. Adding to the challenges, real Martian regolith in its pure state is harmful for both plants and humans because of high chlorine content in the form of perchlorates.

The current Mars regolith simulant isn’t perfect. Until a real sample of Mars dirt comes back to Earth, which could happen on a mission estimated to be at least 15 years from now, Florida Tech researchers will spend the next year trying to create an accurate regolith analogue by applying chemical sensing data from the Mars rovers.

Eventually, it may be possible with the addition of fertilizer and removal of the perchlorates to grow various plants in a Martian soil. Florida Tech scientists are partnering with NASA scientists who have experience growing plants on the International Space Station to help figure out ways to make Martian farming a reality.

The Mindful Child

Saturday, October 8th, 2016

Meditation training may be most helpful for children:

It’s long been known that meditation helps children feel calmer, but new research is helping quantify its benefits for elementary school-age children. A 2015 study found that fourth- and fifth-grade students who participated in a four-month meditation program showed improvements in executive functions like cognitive control, working memory, cognitive flexibility — and better math grades. A study published recently in the journal Mindfulness found similar improvements in mathematics in fifth graders with attention deficit hyperactivity disorder. And a study of elementary school children in Korea showed that eight weeks of meditation lowered aggression, social anxiety and stress levels.

These investigations, along with a review published in March that combed the developmental psychology and cognitive neuroscience literature, illustrate how meditative practices have the potential to actually change the structure and function of the brain in ways that foster academic success.

Fundamental principles of neuroscience suggest that meditation can have its greatest impact on cognition when the brain is in its earliest stages of development.

This is because the brain develops connections in prefrontal circuits at its fastest rate in childhood. It is this extra plasticity that creates the potential for meditation to have greater impact on executive functioning in children. Although meditation may benefit adults more in terms of stress reduction or physical rejuvenation, its lasting effects on things like sustained attention and cognitive control are significant but ultimately less robust.

A clinical study published in 2011 in The Journal of Child and Family Studies demonstrates this concept superbly. The research design allowed adults and children to be compared directly since they were enrolled in the same mindfulness meditation program and assessed identically. Children between 8 and 12 who had A.D.H.D. diagnoses, along with parents, were enrolled in an eight-week mindfulness-training program. The results showed that mindfulness meditation significantly improved attention and impulse control in both groups, but the improvements were considerably more robust in the children.

Outside of the lab, many parents report on the benefits of early meditation. Heather Maurer of Vienna, Va., who was trained in transcendental meditation, leads her 9-year-old daughter, Daisy, through various visualization techniques and focused breathing exercises three nights a week, and says her daughter has become noticeably better at self-regulating her emotions, a sign of improved cognitive control. “When Daisy is upset, she will sit herself down and concentrate on her breathing until she is refocused,” Ms. Maurer said.

Amanda Simmons, a mother who runs her own meditation studio in Los Angeles, has seen similar improvements in her 11-year-old son, Jacob, who is on the autism spectrum. Jacob also has A.D.H.D. and bipolar disorder, but Ms. Simmons said many of his symptoms have diminished since he began daily meditation and mantra chants six months ago. “The meditation seems to act like a ‘hard reboot’ for his brain, almost instantly resolving mood swings or lessening anger,” Ms. Simmons said. She believes it has enabled him to take a lower dose of Risperdal, an antipsychotic drug used to treat bipolar disorder.

Whether children are on medication or not, meditation can help instill self-control and an ability to focus. Perhaps encouraging meditation and mind-body practices will come to be recognized as being as essential to smart parenting as teaching your child to work hard, eat healthfully and exercise regularly.

To learn some meditation techniques you can teach your child, read Three Ways for Children to Try Meditation at Home.

A Cognitively Restricted Subculture

Wednesday, October 5th, 2016

This passage from a Guardian piece on James Flynn made me do a double-take:

He is also an ardent democratic socialist who left an academic career in the US because he believed he was held back by his political views and his activity in the civil rights movement.

Flynn thought his academic career was held back by his pro-civil rights views?

Despite Flynn’s progressive bona fides, The Guardian has its concerns:

It is already evident to me, after reading the book [Does Your Family Make You Smarter?], that the Flynn effect doesn’t settle as much as some of us thought or hoped it did. And that by 21st-century standards, perhaps Flynn doesn’t quite measure up as a liberal hero.

The answer to the question in the title, Flynn explains, is that your family environment’s effect on your IQ almost disappears by the age of 17. An important exception is in the vocabulary component of IQ tests, where the effect persists into the mid-20s and can make a big difference, at least in the US, to the chances of getting into a top university. The home has most influence in early childhood but is swamped by later environments at school, university and work. And they will more closely match your genes because you will seek out (and be chosen for) environments that match your “genetic potential”, whether it’s basketball, carpentry or mathematics.


I have many more questions but one in particular looms over discussions about IQ and we both know we can’t avoid it. It was, after all, to challenge the late Arthur Jensen, professor of educational psychology at the University of California, Berkeley — who claimed the genes of African Americans were responsible for their inferior IQ scores — that Flynn began to examine the evidence on intelligence. But a sentence from his new book is nagging away at me. American blacks, it says, “come from a cognitively restricted subculture”.

This is hugely sensitive territory because, while it may be good to say genes don’t make people stupid, it isn’t so good to tell anyone their way of life does. Flynn, however, makes no apologies. “It’s whites, not blacks, who complain,” he says. “Blacks know the score. Facts are facts.” On recorded IQ tests, he says, African Americans have persistently lagged behind most other ethnicities in America (including, according to some commentators, black immigrants from, for example, the Caribbean) and this cannot be explained by the Flynn effect since, as he puts it, “blacks don’t live in a time warp”.

He then tells what sounds like a version of those dodgy jokes about the Irishman, the Scotsman and the Englishman. Except this isn’t a joke. “Go to the American suburbs one evening,” says Flynn, “and find three professors. The Chinese professor’s kids immediately do their homework. The Jewish professor’s kids have to be yelled at. The black professor says: ‘Why don’t we go out and shoot a few baskets?’”

As I emit a liberal gasp, he continues: “The parenting is worse in black homes, even when you equate them for socio-economic status. In the late 1970s, an experiment took 46 black adoptees and gave half to black professional families and half to white professionals with all the mothers having 16 years of education. When their IQs were tested at eight-and-a-half, the white-raised kids were 13.5 IQ points ahead. The mothers were asked to do problem-solving with their children. Universally, the blacks were impatient, the whites encouraging. Immediate achievement is rewarded in black subculture but not long-term achievement where you have to forgo immediate gratification.”

He tells me of research showing that “when American troops occupied Germany at the end of the second world war, black soldiers left behind half-black children and white soldiers left behind all-white. By 11, the two groups had identical average IQs. In Germany, there was no black subculture.”

Flynn refuses to speculate about the lingering effects of slavery and subsequent discrimination that have prevented African Americans from entering colleges and professional careers. Universities, he thinks, should do more research on racial differences and a new version of that 1970s study. “I have shown — this wicked person who actually looks at the evidence — that blacks gained 5.5 IQ points on whites between 1972 and 2002. There’s been no changes in family structure [the incidence of single-parent families], no gains in income. I suspect it’s an improvement in parenting. But I can’t prove it.”

I leave that sunlit garden in a troubled frame of mind. Flynn has made a great contribution to human knowledge and understanding. But he hasn’t settled the nature-against-nurture debate — and I wonder if he is now muddying the waters, constructing theories about parenting from flimsy evidence.

A Schizophrenic Computer

Tuesday, October 4th, 2016

You can “teach” a neural net a series of simple stories, but if the neural net is set to “hyperlearn” from examples, you get a schizophrenic computer:

For ordinary brains, while there’s significant evidence that people do pretty much remember everything, the brain stores memories differently. In particular, intense experiences, which are signaled to the brain by the presence of dopamine, are remembered differently than others. Which is why, for example, you probably can’t remember what you had for lunch last Tuesday, but you still have strong memories of your first kiss.

The hyperlearning hypothesis posits that for schizophrenics, this system of classifying experiences breaks down because of excessive levels of dopamine. Rather than classifying some memories as important and others as less essential, the brain classes everything as important. According to the hypothesis, this is what leads to schizophrenics getting trapped into seeing patterns that aren’t there, or simply drowning in so many memories that they can’t focus on anything.
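The salience-gating idea can be sketched in a toy linear associative memory. This is a hedged illustration of the hypothesis, not the DISCERN model itself; the dimensions, salience values, and names below are all invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 64, 10                       # pattern size, number of stored "stories"
keys = rng.standard_normal((n, d))  # retrieval cues
vals = rng.standard_normal((n, d))  # story contents

def recall_quality(salience, i):
    """Write every story into a linear hetero-associative memory with a
    per-story strength (the 'dopamine' salience gate), then cue story i
    and measure how cleanly its content comes back (cosine similarity)."""
    M = sum(s * np.outer(v, k) for s, v, k in zip(salience, vals, keys))
    out = M @ keys[i]
    return float(out @ vals[i] / (np.linalg.norm(out) * np.linalg.norm(vals[i])))

gated = np.full(n, 0.1)             # normal case: most stories stored weakly...
gated[0] = 1.0                      # ...only the salient one at full strength
flat = np.ones(n)                   # "hyperlearning": everything at full strength

clean = recall_quality(gated, 0)
noisy = recall_quality(flat, 0)
print(clean > noisy)                # gated recall suffers less crosstalk
```

With selective salience the cued story comes back nearly clean; when everything is written at full strength, recall picks up crosstalk from every other stored story, a crude analogue of drowning in equally important memories.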

In order to simulate the hyperlearning hypothesis, the team put the DISCERN network back through the paces of learning, only this time, they increased its learning rate — in other words, it wasn’t forgetting as many things. They “taught” it several stories, then asked it to repeat them back. They then compared the computer’s results to the results of schizophrenic patients, as well as healthy controls.

What they discovered is that, like the schizophrenics, the DISCERN program had trouble remembering which story it was talking about, and got elements of the different stories confused with each other. The DISCERN program also showed other symptoms of schizophrenia, such as switching back and forth between third and first person, abruptly changing sentences, and just providing jumbled responses.

Preschool Teacher Bias

Monday, October 3rd, 2016

A recent study asked teachers to watch preschool classroom video and detect challenging behavior before it became problematic:

Each video included four children: a black boy and girl and a white boy and girl.

Here’s the deception: There was no challenging behavior.

While the teachers watched, eye-scan technology measured the trajectory of their gaze. Gilliam wanted to know: When teachers expected bad behavior, who did they watch?

“What we found was exactly what we expected based on the rates at which children are expelled from preschool programs,” Gilliam says. “Teachers looked more at the black children than the white children, and they looked specifically more at the African-American boy.”

Indeed, according to recent data from the U.S. Department of Education, black children are 3.6 times more likely to be suspended from preschool than white children. Put another way, black children account for roughly 19 percent of all preschoolers, but nearly half of preschoolers who get suspended.
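The two statistics are roughly consistent with each other, which a quick back-of-the-envelope check shows. This is a sketch under two assumptions: "nearly half" is taken as 47 percent, and all non-black preschoolers stand in for the white comparison group (the reported 3.6x compares black to white children specifically).

```python
pop_share = 0.19    # black share of preschool enrollment
susp_share = 0.47   # assumed value for "nearly half" of suspensions

# Relative risk of suspension, black vs. everyone else
relative_risk = (susp_share / pop_share) / ((1 - susp_share) / (1 - pop_share))
print(round(relative_risk, 1))  # → 3.8, in the ballpark of the reported 3.6x
```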

One reason that number is so high, Gilliam suggests, is that teachers spend more time focused on their black students, expecting bad behavior. “If you look for something in one place, that’s the only place you can typically find it.”

The Yale team also asked subjects to identify the child they felt required the most attention. Forty-two percent identified the black boy, 34 percent identified the white boy, while 13 percent and 10 percent identified the white and black girls respectively.

The only possible explanation for this is teacher bias. There’s no way boys could be three times as much trouble as girls, after all.

Vegetable Television

Friday, September 30th, 2016

Ariel Levy calls ayahuasca the drug of choice for the Age of Kale:

The day after Apollo 14 landed on the moon, Dennis and Terence McKenna began a trek through the Amazon with four friends who considered themselves, as Terence wrote in his book “True Hallucinations,” “refugees from a society that we thought was poisoned by its own self-hatred and inner contradictions.” They had come to South America, the land of yagé, also known as ayahuasca: an intensely hallucinogenic potion made from boiling woody Banisteriopsis caapi vines with the glossy leaves of the chacruna bush. The brothers, then in their early twenties, were grieving the recent death of their mother, and they were hungry for answers about the mysteries of the cosmos: “We had sorted through the ideological options, and we had decided to put all of our chips on the psychedelic experience.”

They started hiking near the border of Peru. As Dennis wrote, in his memoir “The Brotherhood of the Screaming Abyss,” they arrived four days later in La Chorrera, Colombia, “in our long hair, beards, bells, and beads,” accompanied by a “menagerie of sickly dogs, cats, monkeys, and birds” accumulated along the way. (The local Witoto people were cautiously amused.) There, on the banks of the Igara Paraná River, the travellers found themselves in a psychedelic paradise. There were cattle pastures dotted with Psilocybe cubensis — magic mushrooms — sprouting on dung piles; there were hammocks to lounge in while you tripped; there were Banisteriopsis caapi vines growing in the jungle. Taken together, the drugs produced hallucinations that the brothers called “vegetable television.” When they watched it, they felt they were receiving important information directly from the plants of the Amazon.

The McKennas were sure they were on to something revelatory, something that would change the course of human history. “I and my companions have been selected to understand and trigger the gestalt wave of understanding that will be the hyperspacial zeitgeist,” Dennis wrote in his journal. Their work was not always easy. During one session, the brothers experienced a flash of mutual telepathy, but then Dennis hurled his glasses and all his clothes into the jungle and, for several days, lost touch with “consensus reality.” It was a small price to pay. The “plant teachers” seemed to have given them “access to a vast database,” Dennis wrote, “the mystical library of all human and cosmic knowledge.”

Occam’s razor might suggest that the drug creates a sense of profound understanding, without conferring any actual profound understanding — but it does seem to have profound positive effects:

The self-help guru Tim Ferriss told me that the drug is everywhere in San Francisco, where he lives. “Ayahuasca is like having a cup of coffee here,” he said. “I have to avoid people at parties because I don’t want to listen to their latest three-hour saga of kaleidoscopic colors.”

Ferriss, the author of such “life-hacking” manuals as “The 4-Hour Workweek” and “The 4-Hour Body,” told me, “It’s mind-boggling how much it can do in one or two nights.” He uses ayahuasca regularly, despite a harrowing early trip that he described as “the most painful experience I’ve ever had by a factor of a thousand. I felt like I was being torn apart and killed a thousand times a second for two hours.” This was followed by hours of grand-mal seizures; Ferriss had rug burns on his face the next day. “I thought I had completely fried my motherboard,” he continued. “I remember saying, ‘I will never do this again.’ ” But in the next few months he realized that something astounding had happened to him. “Ninety per cent of the anger I had held on to for decades, since I was a kid, was just gone. Absent.”

Why the Father of Modern Statistics Didn’t Believe Smoking Caused Cancer

Tuesday, September 27th, 2016

Ronald Fisher, the notoriously cantankerous father of modern statistics, was appalled when the British Medical Journal’s editorial board announced, in 1957, that the time for amassing evidence and analyzing data was over:

Now, they wrote, “all the modern devices of publicity” should be used to inform the public about the perils of tobacco.

According to Fisher, this was nothing short of statistically illiterate fear mongering.

He was right, in the narrow sense, that no one had yet proven a causal link between smoking and cancer:

Fisher never objected to the possibility that smoking caused cancer, only the certainty with which public health advocates asserted this conclusion.

“None think that the matter is already settled,” he insisted in his letter to the British Medical Journal. “Is not the matter serious enough to require more serious treatment?”

[Image: R. A. Fisher smoking a pipe]

While most of the afflictions that had been killing British citizens for centuries were trending downward, the result of advances in medicine and sanitation, one disease was killing more and more people each year: carcinoma of the lung.

The figures were staggering. Between 1922 and 1947, the number of deaths attributed to lung cancer increased 15-fold across England and Wales. Similar trends were documented around the world. Everywhere, the primary target of the disease seemed to be men.

What was the cause? Theories abounded. More people than ever were living in large, polluted cities. Cars filled the nation’s causeways, belching noxious fumes. Those causeways were increasingly being covered in tar. Advances in X-ray technology allowed for more accurate diagnoses. And, of course, more and more people were smoking cigarettes.

Which of these factors was to blame? All of them? None of them? British society had changed so dramatically and in so many ways since the First World War, it was impossible to identify a single cause. As Fisher would say, there were just too many confounding variables.

In 1947, the British Medical Research Council hired Austin Bradford Hill and Richard Doll to look into the question.

Though Doll was not well known at the time, Hill was an obvious choice. A few years earlier, he had made a name for himself with a pioneering study on the use of antibiotics to treat tuberculosis. Just as Fisher had randomly distributed fertilizer across the fields at Rothamsted, Hill had given out streptomycin to tubercular patients at random while prescribing bed rest to others. Once again, the goal was to make sure that the patients who received one treatment were, on average, identical to those who received the other. Any large difference in outcomes between the two groups had to be the result of the drug. It was medicine’s first published randomized control trial.

Despite Hill’s groundbreaking work with randomization, the question of whether smoking (or anything else) causes cancer was not one you could ask with a randomized control trial. Not ethically, anyway.

“That would involve taking a group of say 6,000 people, selecting 3,000 at random and forcing them to smoke for 5 years, while forcing the other 3,000 not to smoke for 5 years, and then comparing the incidence of lung cancer in the two groups,” says Donald Gillies, an emeritus professor of philosophy of science and mathematics at University College London. “Clearly this could not be done, so, in this example, one has to rely on other types of evidence.”

Hill and Doll tried to find that evidence in the hospitals of London. They tracked down over 1,400 patients, half of whom were suffering from lung cancer, the other half of whom had been hospitalized for other reasons. Then, as Doll later told the BBC, “we asked them every question we could think of.”

These questions covered their medical and family histories, their jobs, their hobbies, where they lived, what they ate, and any other factor that might possibly be related to lung cancer. The two epidemiologists were shooting in the dark. The hope was that one of the many questions would touch on a trait or behavior that was common among the lung cancer patients and rare among those in the control group.

At the beginning of the study, Doll had his own theory.

“I personally thought it was tarring of the roads,” Doll said. But as the results began to come in, a different pattern emerged. “I gave up smoking two-thirds of the way through the study.”

Hill and Doll published their results in the British Medical Journal in September of 1950. The findings were alarming, but not conclusive. Though the study found that smokers were more likely than non-smokers to have lung cancer, and that the prevalence of the disease rose with the quantity smoked, the design of the study still left room for Fisher’s dreaded “confounding” problem.

The problem was in the selection of the control. Hill and Doll had picked a comparison group that resembled the lung cancer patients in age, sex, approximate residence, and social class. But did this cover the entire list of possible confounders? Was there some other trait, forgotten or invisible, that the two researchers had failed to ask about?

To get around this problem, Hill and Doll designed a study where they wouldn’t have to choose a control group at all. Instead, the two researchers surveyed over 30,000 doctors across England. These doctors were asked about their smoking habits and medical histories. And then Hill and Doll waited to see which doctors would die first.

By 1954, a familiar pattern began to emerge. Among the British doctors, 36 had died of lung cancer. All of them had been smokers. Once again, the death rate increased with the rate of smoking.

The “British Doctor Study” had a distinct advantage over the earlier survey of patients. Here, the researchers could show a clear “this then that” relationship (what medical researchers call a “dose-response”). Some doctors smoked more than others in 1951. By 1954, more of those doctors were dead.

The back-to-back Doll and Hill studies were notable for their scope, but they were not the only ones to find a consistent connection between smoking and lung cancer. Around the same time, the American epidemiologists E. C. Hammond and Daniel Horn conducted a study very similar to Hill and Doll’s survey of British doctors.

Their results were remarkably consistent. In 1957, the Medical Research Council and the British Medical Journal decided that enough evidence had been gathered. Citing Doll and Hill, the journal declared that “the most reasonable interpretation of this evidence is that the relationship is one of direct cause and effect.”

Ronald Fisher begged to differ.

In some ways, the timing was perfect. In 1957, Fisher had just retired and was looking for a place to direct his considerable intellect and condescension.

Neither the first nor the last retiree to start a flame war, Fisher launched his opening salvo by questioning the certainty with which the British Medical Journal had declared the argument over.

“A good prima facie case had been made for further investigation,” he wrote. “The further investigation seems, however, to have degenerated into the making of more confident exclamations.”

The first letter was followed by a second and then a third. In 1959, Fisher amassed these missives into a book. He denounced his colleagues for manufacturing anti-smoking “propaganda.” He accused Hill and Doll of suppressing contrary evidence. He hit the lecture circuit, relishing the opportunity to once again hold forth before the statistical establishment and to be, in the words of his daughter, “deliberately provocative.”

Provocation aside, Fisher’s critique came down to the same statistical problem that he had been tackling since his days at Rothamsted: confounding variables. He did not dispute that smoking and lung cancer tended to rise and fall together—that is, that they were correlated. But Hill and Doll and the entire British medical establishment had committed “an error…of an old kind, in arguing from correlation to causation,” he wrote in a letter to Nature.

Most researchers had evaluated the association between smoking and cancer and concluded that the former caused the latter. But what if the opposite were true?

What if, he wrote, the development of acute lung cancer were preceded by an undiagnosed “chronic inflammation”? And what if this inflammation led to mild discomfort, but no conscious pain? If that were the case, wrote Fisher, then one would expect those suffering from not-yet-diagnosed lung cancer to turn to cigarettes for relief. And here was the British Medical Journal suggesting that smoking be banned in movie theaters!

“To take the poor chap’s cigarettes away from him,” he wrote, “would be rather like taking away [the] white stick from a blind man.”

If that particular explanation seems like a stretch, Fisher offered another. If smoking doesn’t cause cancer and cancer doesn’t cause smoking, then perhaps a third factor causes both. Genetics struck him as a possibility.

To make this case, Fisher gathered data on identical twins in Germany and showed that twin siblings were more likely to mimic one another’s smoking habits. Perhaps, Fisher speculated, certain people were genetically predisposed to crave cigarettes.

Was there a similar familial pattern for lung cancer? Did these two predispositions come from the same hereditary trait? At the very least, researchers ought to look into this possibility before advising people to toss out their cigarettes.

And yet nobody was.

“Unfortunately, considerable propaganda is now being developed to convince the public that cigarette smoking is dangerous,” he wrote. “It is perhaps natural that efforts should be made to discredit evidence which suggests a different view.”

Though Fisher was in the minority, he was not alone in taking this “different view.” Joseph Berkson, the chief statistician at the Mayo Clinic throughout the 1940s and 50s, was also a prominent skeptic on the smoking-cancer question, as was Charles Cameron, president of the American Cancer Society. For a time, many of Fisher’s peers in academic statistics, including Jerzy Neyman, questioned the validity of a causal claim. But before long, the majority buckled under the weight of mounting evidence and overwhelming consensus.

But not Fisher. He died in 1962 (of cancer, though not of the lung). He never conceded the point.

Feed a virus, starve a bacterial infection?

Saturday, September 24th, 2016

A new study supports the folk wisdom to “feed a cold and starve a fever” — if you assume a fever is bacterial:

In the first series of experiments, the investigators infected mice with the bacterium Listeria monocytogenes, which commonly causes food poisoning. The mice stopped eating, and they eventually recovered. But when the mice were force-fed, they died. The researchers then broke the food down by component and found fatal reactions when the mice were given glucose, but not when they were fed proteins or fats. Giving mice the chemical 2-DG, which prevents glucose metabolism, was enough to rescue even mice that were fed glucose and allowed them to survive the infection.

When the researchers did similar studies in mice with viral infections, they found the opposite effect. Mice infected with the flu virus A/WSN/33 survived when they were force-fed glucose, but died when they were denied food or given 2-DG.

Migrant Competence

Sunday, September 18th, 2016

James Thompson explores migrant competence:

Europe is experiencing enormous inflows of people from Africa and the Middle East, and in the midst of conflicting rhetoric, of strong emotions and of a European leadership broadly in favour of taking more migrants (and sometimes competing to do so) one meme keeps surfacing: that European Jews are the appropriate exemplars of migrant competence and achievements.

European history in the 20th Century shows why present-day governments feel profound shame at their predecessors having spurned European Jews fleeing Nazi Germany. However, there are strong reasons for believing that European Jews are brighter than Europeans, and have greater intellectual and professional achievements. There may be cognitive elites elsewhere, but they have yet to reveal themselves. Expectations based on Jewish successes are unlikely to be met.

I am old enough to know that political decisions are not based on facts, but on presumed political advantages. The calculation of those leaders who favour immigration seems to be that the newcomers will bring net benefits, plus the gratitude and votes of those migrants, plus the admiration of some of the locals for policies which are presented as being acts of generosity, thus making some locals feel good about themselves for their altruism. One major ingredient of the leadership’s welcome to migrants is the belief that they will quickly adapt to the host country, and become long term net contributors to society. Is this true?

With Heiner Rindermann, he analyzed the gaps, their possible causes, and their impact in “The Cognitive Competences of Immigrant and Native Students across the World”:

In Finland the natives had reading scores of 538, first-generation immigrants only 449, second-generation 493. The original first-generation difference of 89 points was equivalent to around 2–3 school years of progress; the second-generation difference of 45 points (1–2 school years) is still of great practical significance in occupational terms.

In contrast, in Dubai natives had reading scores of 395; first-generation immigrants 467; second-generation 503. This 108-point difference between natives and second-generation immigrants is equivalent to 16 IQ points or 3–5 years of schooling.

Rather than look at the scales separately, Rindermann created a composite score based on PISA, TIMSS and PIRLS data so as to provide one overall competence score for both the native-born population and the immigrants who had settled in each particular country. For each country you can see the native-versus-immigrant gap. By working out what proportion of the national population are immigrants, you can recalculate the national competence (IQ) for that country. Rindermann proposes that native-born competences need to be distinguished from immigrant competences in national-level data.
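The recalculation described here is a population-share-weighted average of the two group means. A minimal sketch, using the UK scores quoted later in this post (natives 519, migrants 499) and an assumed 13 percent immigrant share, which is illustrative rather than a figure from the article:

```python
def national_competence(native_score, immigrant_score, immigrant_share):
    """Blend native and immigrant mean scores by population share."""
    return (1 - immigrant_share) * native_score + immigrant_share * immigrant_score

# UK scores from the text (natives 519, migrants 499); the 13% immigrant
# share is an assumed illustrative value, not from the article.
uk = national_competence(519, 499, 0.13)
print(round(uk, 1))  # 516.4
```

The larger the immigrant share and the wider the native-immigrant gap, the further the national figure drifts from the native-born figure, which is exactly why Rindermann argues the two should be reported separately.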

The analysis of scholastic attainments in first- and second-generation immigrants shows that the Gulf has gained from immigrants and Europe has lost. This is because those emigrating to the Gulf have higher abilities than the locals, while those emigrating to Europe have lower abilities than the locals.

The economic consequences can be calculated by looking at the overall correlations between country competence and country GDP.
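The "overall correlations" step is just a Pearson correlation computed across countries. A minimal stdlib sketch; the competence and GDP figures below are made up for illustration, not taken from the article:

```python
import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient, the statistic behind such estimates."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Illustrative (made-up) data: six countries' composite competence scores
# and GDP per capita in thousands of dollars.
competence = [395, 450, 471, 489, 516, 519]
gdp_per_capita = [15, 12, 22, 35, 44, 41]

r = pearson_r(competence, gdp_per_capita)
print(round(r, 2))  # ~0.88 with these made-up numbers
```

A correlation alone does not settle the direction of causation, of course; it only quantifies how tightly the two country-level series move together.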

The natives of the United Kingdom have a competence score of 519 (migrants to UK 499), Germany 516 (migrants to Germany 471), the United States 517 (migrants to US 489). There, in a nutshell, is the problem: those three countries have not selected their migrants for intellectual quality. The difference sounds in damages: lower ability leads to lower status, lower wages and higher resentment at perceived differences. On the latter point, if the West cannot bear to mention competence differences, then differences in outcome are seen as being due solely to prejudice.