Why the Father of Modern Statistics Didn’t Believe Smoking Caused Cancer

September 27th, 2016

Ronald Fisher, the notoriously cantankerous father of modern statistics, was appalled when the British Medical Journal’s editorial board announced, in 1957, that the time for amassing evidence and analyzing data was over:

Now, they wrote, “all the modern devices of publicity” should be used to inform the public about the perils of tobacco.

According to Fisher, this was nothing short of statistically illiterate fear mongering.

He was right, in the narrow sense, that no one had yet proven a causal link between smoking and cancer:

Fisher never denied the possibility that smoking caused cancer; he objected only to the certainty with which public health advocates asserted that conclusion.

“None think that the matter is already settled,” he insisted in his letter to the British Medical Journal. “Is not the matter serious enough to require more serious treatment?”

R.A. Fisher Smoking Pipe

While most of the afflictions that had been killing British citizens for centuries were trending downward, the result of advances in medicine and sanitation, one disease was killing more and more people each year: carcinoma of the lung.

The figures were staggering. Between 1922 and 1947, the number of deaths attributed to lung cancer increased 15-fold across England and Wales. Similar trends were documented around the world. Everywhere, the primary target of the disease seemed to be men.

What was the cause? Theories abounded. More people than ever were living in large, polluted cities. Cars filled the nation’s roads, belching noxious fumes. Those roads were increasingly being covered in tar. Advances in X-ray technology allowed for more accurate diagnoses. And, of course, more and more people were smoking cigarettes.

Which of these factors was to blame? All of them? None of them? British society had changed so dramatically and in so many ways since the First World War, it was impossible to identify a single cause. As Fisher would say, there were just too many confounding variables.

In 1947, the British Medical Research Council hired Austin Bradford Hill and Richard Doll to look into the question.

Though Doll was not well known at the time, Hill was an obvious choice. A few years earlier, he had made a name for himself with a pioneering study on the use of antibiotics to treat tuberculosis. Just as Fisher had randomly distributed fertilizer across the fields at Rothamsted, Hill had given streptomycin to some tubercular patients at random while prescribing bed rest to others. Once again, the goal was to make sure that the patients who received one treatment were, on average, identical to those who received the other. Any large difference in outcomes between the two groups had to be the result of the drug. It was medicine’s first published randomized controlled trial.
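The mechanics of Hill’s allocation are simple to sketch: chance, not the physician, decides who receives the drug, so the two arms end up statistically interchangeable. A minimal illustration (patient names and counts here are invented, not from the trial):

```python
import random

def randomize(patients, seed=1948):
    """Shuffle the roster and split it into two equal arms at random."""
    rng = random.Random(seed)            # fixed seed so the split is reproducible
    shuffled = list(patients)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]   # (drug arm, bed-rest arm)

patients = [f"patient_{i:03d}" for i in range(100)]
streptomycin, bed_rest = randomize(patients)
print(len(streptomycin), len(bed_rest))   # prints: 50 50
```

Because assignment ignores every patient characteristic, any trait — known or unknown — is, on average, balanced across the two arms, which is exactly what rules out Fisher’s confounders.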

Despite Hill’s groundbreaking work with randomization, the question of whether smoking (or anything else) causes cancer was not one you could answer with a randomized controlled trial. Not ethically, anyway.

“That would involve taking a group of say 6,000 people, selecting 3,000 at random and forcing them to smoke for 5 years, while forcing the other 3,000 not to smoke for 5 years, and then comparing the incidence of lung cancer in the two groups,” says Donald Gillies, an emeritus professor of philosophy of science and mathematics at University College London. “Clearly this could not be done, so, in this example, one has to rely on other types of evidence.”

Hill and Doll tried to find that evidence in the hospitals of London. They tracked down over 1,400 patients, half of whom were suffering from lung cancer, the other half of whom had been hospitalized for other reasons. Then, as Doll later told the BBC, “we asked them every question we could think of.”

These questions covered their medical and family histories, their jobs, their hobbies, where they lived, what they ate, and any other factor that might possibly be related to lung cancer. The two epidemiologists were shooting in the dark. The hope was that one of the many questions would touch on a trait or behavior that was common among the lung cancer patients and rare among those in the control group.

At the beginning of the study, Doll had his own theory.

“I personally thought it was tarring of the roads,” Doll said. But as the results began to come in, a different pattern emerged. “I gave up smoking two-thirds of the way through the study.”

Hill and Doll published their results in the British Medical Journal in September of 1950. The findings were alarming, but not conclusive. Though the study found that smokers were more likely than non-smokers to have lung cancer, and that the prevalence of the disease rose with the quantity smoked, the design of the study still left room for Fisher’s dreaded “confounding” problem.

The problem was in the selection of the control. Hill and Doll had picked a comparison group that resembled the lung cancer patients in age, sex, approximate residence, and social class. But did this cover the entire list of possible confounders? Was there some other trait, forgotten or invisible, that the two researchers had failed to ask about?

To get around this problem, Hill and Doll designed a study where they wouldn’t have to choose a control group at all. Instead, the two researchers surveyed over 30,000 doctors across England. These doctors were asked about their smoking habits and medical histories. And then Hill and Doll waited to see which doctors would die first.

By 1954, a familiar pattern began to emerge. Among the British doctors, 36 had died of lung cancer. All of them had been smokers. Once again, the death rate increased with the rate of smoking.

The “British Doctors Study” had a distinct advantage over the earlier survey of patients. Here, the researchers could show a clear “this then that” relationship, with death rates rising in step with exposure (what medical researchers call a “dose-response”). Some doctors smoked more than others in 1951. By 1954, more of those doctors were dead.

The back-to-back Doll and Hill studies were notable for their scope, but they were not the only ones to find a consistent connection between smoking and lung cancer. Around the same time, the American epidemiologists E. C. Hammond and Daniel Horn conducted a study very similar to Hill and Doll’s survey of British doctors.

Their results were remarkably consistent. In 1957, the Medical Research Council and the British Medical Journal decided that enough evidence had been gathered. Citing Doll and Hill, the journal declared that “the most reasonable interpretation of this evidence is that the relationship is one of direct cause and effect.”

Ronald Fisher begged to differ.

In some ways, the timing was perfect. In 1957, Fisher had just retired and was looking for a place to direct his considerable intellect and condescension.

Neither the first nor the last retiree to start a flame war, Fisher launched his opening salvo by questioning the certainty with which the British Medical Journal had declared the argument over.

“A good prima facie case had been made for further investigation,” he wrote. “The further investigation seems, however, to have degenerated into the making of more confident exclamations.”

The first letter was followed by a second and then a third. In 1959, Fisher amassed these missives into a book. He denounced his colleagues for manufacturing anti-smoking “propaganda.” He accused Hill and Doll of suppressing contrary evidence. He hit the lecture circuit, relishing the opportunity to once again hold forth before the statistical establishment and to be, in the words of his daughter, “deliberately provocative.”

Provocation aside, Fisher’s critique came down to the same statistical problem that he had been tackling since his days at Rothamsted: confounding variables. He did not dispute that smoking and lung cancer tended to rise and fall together—that is, that they were correlated. But Hill and Doll and the entire British medical establishment had committed “an error…of an old kind, in arguing from correlation to causation,” he wrote in a letter to Nature.

Most researchers had evaluated the association between smoking and cancer and concluded that the former caused the latter. But what if the opposite were true?

What if, Fisher asked, the development of acute lung cancer was preceded by an undiagnosed “chronic inflammation”? And what if this inflammation led to a mild discomfort, but no conscious pain? If that were the case, he wrote, then one would expect those suffering from still-undiagnosed lung cancer to turn to cigarettes for relief. And here was the British Medical Journal suggesting that smoking be banned in movie theaters!

“To take the poor chap’s cigarettes away from him,” he wrote, “would be rather like taking away [the] white stick from a blind man.”

If that particular explanation seems like a stretch, Fisher offered another. If smoking doesn’t cause cancer and cancer doesn’t cause smoking, then perhaps a third factor causes both. Genetics struck him as a possibility.

To make this case, Fisher gathered data on identical twins in Germany and showed that twin siblings were more likely to mimic one another’s smoking habits. Perhaps, Fisher speculated, certain people were genetically predisposed to crave cigarettes.

Was there a similar familial pattern for lung cancer? Did these two predispositions come from the same hereditary trait? At the very least, researchers ought to look into this possibility before advising people to toss out their cigarettes.

And yet nobody was.

“Unfortunately, considerable propaganda is now being developed to convince the public that cigarette smoking is dangerous,” he wrote. “It is perhaps natural that efforts should be made to discredit evidence which suggests a different view.”

Though Fisher was in the minority, he was not alone in taking this “different view.” Joseph Berkson, the chief statistician at the Mayo Clinic throughout the 1940s and 50s, was also a prominent skeptic on the smoking-cancer question, as was Charles Cameron, president of the American Cancer Society. For a time, many of Fisher’s peers in academic statistics, including Jerzy Neyman, questioned the validity of a causal claim. But before long, the majority buckled under the weight of mounting evidence and overwhelming consensus.

But not Fisher. He died in 1962 (of cancer, though not of the lung). He never conceded the point.

Retire at 30

September 26th, 2016

In 2005, Peter Adeney, better known as Mr. Money Mustache, retired at 30 years old:

Leading up to retirement, Adeney and his wife, Simi, both software engineers, stashed two-thirds of their combined $134,000 take-home pay in savings. After just 10 years in the workforce, the couple had accrued about $600,000 in investments and paid off a house worth $200,000, Adeney told Nick Paumgarten of The New Yorker, giving them a solid cushion to retire on.

[...]

He suggests learning to live on less — cutting down your wardrobe, buying used cars — and finding ways to add meaning to life that don’t rely on material possessions. One personal challenge he took on was learning carpentry.
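The quoted figures pass a rough sanity check: two-thirds of a $134,000 take-home is about $89,000 a year, which over ten years comes to nearly the $800,000 the couple accumulated ($600,000 invested plus a $200,000 house) even before investment returns. A minimal sketch; the 4% return rate is my assumption for illustration, not a figure from the article:

```python
def future_value(annual_savings, years, annual_return):
    """Future value of a constant annual contribution, compounded once per year."""
    total = 0.0
    for _ in range(years):
        total = (total + annual_savings) * (1 + annual_return)
    return total

annual_savings = 2 / 3 * 134_000   # about $89,333 saved per year
no_growth = future_value(annual_savings, years=10, annual_return=0.0)
with_growth = future_value(annual_savings, years=10, annual_return=0.04)
print(round(no_growth))            # contributions alone, no returns
print(round(with_growth))          # with the assumed 4% annual return
```

Even at a 0% return the contributions alone reach roughly $893,000, so the reported nest egg requires no heroic assumptions about market performance.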

Hobbits, Hooligans, and Vulcans

September 25th, 2016

Jason Brennan divides people into three groups based on their orientation to politics:

“Hobbits,” who are apathetic and ignorant; “Hooligans,” who are engaged but hopelessly biased, convinced that fans of other political teams are “stupid, evil, selfish, or at best, deeply misguided”; and “Vulcans,” who “think scientifically and rationally about politics” and whose “opinions are strongly grounded in social science and philosophy.”

That third group is largely theoretical.

Brennan’s Against Democracy follows “previous libertarian broadsides against democracy,” such as Bryan Caplan’s The Myth of the Rational Voter and Ilya Somin’s Democracy and Political Ignorance.

Feed a virus, starve a bacterial infection?

September 24th, 2016

A new study supports the folk wisdom to “feed a cold and starve a fever” — if you assume a fever is bacterial:

In the first series of experiments, the investigators infected mice with the bacterium Listeria monocytogenes, which commonly causes food poisoning. The mice stopped eating, and they eventually recovered. But when the mice were force fed, they died. The researchers then broke the food down by component and found fatal reactions when the mice were given glucose, but not when they were fed proteins or fats. Giving mice the chemical 2-DG, which prevents glucose metabolism, rescued even the mice that were fed glucose, allowing them to survive the infection.

When the researchers did similar studies in mice with viral infections, they found the opposite effect. Mice infected with the flu virus A/WSN/33 survived when they were force fed glucose, but died when they were denied food or given 2-DG.

Charles Murray II on Conversations with Bill Kristol

September 23rd, 2016

Bill Kristol converses with Charles Murray about the current political moment, a universal basic income, constitutionalism and nationalism, and The Bell Curve:

Stop wasting money teaching millions of students content they already know

September 22nd, 2016

Large percentages of students perform above grade level:

Based on the Wisconsin and California Smarter Balanced, Florida FSA, and multistate MAP data, we estimate that 20–40 percent of elementary and middle school students perform at least one grade level above their current grade in reading, with 11–30 percent scoring at least one grade level above in math.

Moreover, we also found large percentages of students performing well above grade level—more than one grade level ahead. Using MAP data, we estimate that 8–10 percent of Grade 4 students perform at the Grade 8 level in reading/English/language arts, with 2–5 percent scoring at similar levels in math. Relying specifically on the MAP data, one out of every ten fifth-graders is performing at the high school level in reading, and nearly one child in forty at this age is performing at the high school level in mathematics. Because of the MAP test’s computer-adaptive format and high measurement ceiling, these results cannot be explained away by the correction that commonly applies to pencil-and-paper grade-level achievement tests. On the latter tests, a fifth-grader with a ninth-grade-level equivalent score amounts to a ninth-grader’s completing a fifth-grade test. By contrast, a MAP test score that is equivalent to ninth-grade performance is in fact based on ninth-grade content knowledge and skills.

Converting these percentages to numbers of children provides a sobering picture of the number of students who are not well served under the current grade-based educational paradigm. In Wisconsin alone, somewhere between 278,000 and 330,000 public-school students are performing more than a full grade above where they are placed in school. And as mentioned above, in the much larger state of California, that number is between 1.4 million and 2 million students.

Federal and state education policies are largely irrelevant for this huge number of students. Getting kids to grade-level proficiency has been a focus of U.S. education policy and practice for well over a decade. Yet the U.S. likely wastes tens of billions of dollars each year in efforts to teach students content they already know.

This structure centered on age-based grade levels, therefore, needs serious rethinking. One option is whole-grade or single-subject acceleration. Indeed, this is consistent with the literature, which has documented uniformly positive benefits when academic acceleration is implemented thoughtfully. Academic acceleration is particularly beneficial for students pursuing professional careers that require substantial academic preparation and credentialing, a point that has been recognized for more than eighty years. Acceleration would also reduce the difficulty of differentiated instruction because students within a given classroom are selected to be far more homogeneous in ability and prior knowledge than they are in the traditional system.

[...]

The current K–12 education system essentially ignores the learning needs of a huge percentage of its students. Knowing this, twenty years from now we may look back and wonder why we kept using age-based grade levels to organize K–12 education for so long.

Andrew Sullivan’s Distraction Sickness

September 21st, 2016

Andrew Sullivan doesn’t quite call for a Butlerian Jihad, but he does recognize that he developed a distraction sickness from modern technology:

Since the invention of the printing press, every new revolution in information technology has prompted apocalyptic fears. From the panic that easy access to the vernacular English Bible would destroy Christian orthodoxy all the way to the revulsion, in the 1950s, at the barbaric young medium of television, cultural critics have moaned and wailed at every turn. Each shift represented a further fracturing of attention — continuing up to the previously unimaginable kaleidoscope of cable TV in the late-20th century and the now infinite, infinitely multiplying spaces of the web. And yet society has always managed to adapt and adjust, without obvious damage, and with some more-than-obvious progress. So it’s perhaps too easy to view this new era of mass distraction as something newly dystopian.

But it sure does represent a huge leap from even the very recent past. The data bewilder. Every single minute on the planet, YouTube users upload 400 hours of video and Tinder users swipe profiles over a million times. Each day, there are literally billions of Facebook “likes.” Online outlets now publish exponentially more material than they once did, churning out articles at a rapid-fire pace, adding new details to the news every few minutes. Blogs, Facebook feeds, Tumblr accounts, tweets, and propaganda outlets repurpose, borrow, and add topspin to the same output.

We absorb this “content” (as writing or video or photography is now called) no longer primarily by buying a magazine or paper, by bookmarking our favorite website, or by actively choosing to read or watch. We are instead guided to these info-nuggets by myriad little interruptions on social media, all cascading at us with individually tailored relevance and accuracy. Do not flatter yourself in thinking that you have much control over which temptations you click on. Silicon Valley’s technologists and their ever-perfecting algorithms have discovered the form of bait that will have you jumping like a witless minnow. No information technology ever had this depth of knowledge of its consumers — or greater capacity to tweak their synapses to keep them engaged.

And the engagement never ends. Not long ago, surfing the web, however addictive, was a stationary activity. At your desk at work, or at home on your laptop, you disappeared down a rabbit hole of links and resurfaced minutes (or hours) later to reencounter the world. But the smartphone then went and made the rabbit hole portable, inviting us to get lost in it anywhere, at any time, whatever else we might be doing. Information soon penetrated every waking moment of our lives.

And it did so with staggering swiftness. We almost forget that ten years ago, there were no smartphones, and as recently as 2011, only a third of Americans owned one. Now nearly two-thirds do. That figure reaches 85 percent when you’re only counting young adults. And 46 percent of Americans told Pew surveyors last year a simple but remarkable thing: They could not live without one. The device went from unknown to indispensable in less than a decade. The handful of spaces where it was once impossible to be connected — the airplane, the subway, the wilderness — are dwindling fast. Even hiker backpacks now come fitted with battery power for smartphones. Perhaps the only “safe space” that still exists is the shower.

Am I exaggerating? A small but detailed 2015 study of young adults found that participants were using their phones five hours a day, at 85 separate times. Most of these interactions were for less than 30 seconds, but they add up. Just as revealing: The users weren’t fully aware of how addicted they were. They thought they picked up their phones half as much as they actually did. But whether they were aware of it or not, a new technology had seized control of around one-third of these young adults’ waking hours.

The interruptions often feel pleasant, of course, because they are usually the work of your friends. Distractions arrive in your brain connected to people you know (or think you know), which is the genius of social, peer-to-peer media. Since our earliest evolution, humans have been unusually passionate about gossip, which some attribute to the need to stay abreast of news among friends and family as our social networks expanded. We were hooked on information as eagerly as sugar. And give us access to gossip the way modernity has given us access to sugar and we have an uncontrollable impulse to binge. A regular teen Snapchat user, as the Atlantic recently noted, can have exchanged anywhere between 10,000 and even as many as 400,000 snaps with friends. As the snaps accumulate, they generate publicly displayed scores that bestow the allure of popularity and social status. This, evolutionary psychologists will attest, is fatal. When provided a constant source of information and news and gossip about each other — routed through our social networks — we are close to helpless.

More than an Off-Duty Police Officer

September 20th, 2016

The “off-duty police officer” who stopped the jihadi knife attack in Minnesota was more than an off-duty police officer:

He owns a firing range and firearms training facility called Tactical Advantage. He’s considered an expert in firearms training and education and has helped teach classes on law enforcement skills at St. Cloud State University for nine years, his company website says.
He’s a member of the United States Practical Shooters Association and has won medals in various shooting competitions.

Yeah, he’s a USPSA shooter. I don’t want to say he’s living the dream, but…

Spending Money and Solving Problems

September 19th, 2016

Today we live in a financial age, Peter Thiel says:

The right is obsessed with tax cuts, and the left is obsessed with funding increases. Republicans joke about the incompetence of government to please wealthy donors who don’t want to pay for it; Democrats enable incompetence because they are beholden to public-sector unions that expect their members to get paid whether or not they do the job.

Lost between the two extremes is the vast majority of citizens’ common-sense expectation that the country’s transportation, health care and defense systems should actually work.

[...]

The establishment doesn’t want to admit it, but Trump’s heretical denial of Republican dogma about government incapacity is exactly what we need to move the party — and the country — in a new direction. For the Republican Party to be a credible alternative to the Democrats’ enabling, it must stand for effective government, not for giving up on government.

I believe that effective government will require less bureaucracy and less rulemaking; we may need to have fewer public servants, and we might need to pay some of them more. At a minimum, we should recognize that success cannot be reduced to the overall size of the budget: Spending money and solving problems are not the same thing.

When Americans lived in an engineering age rather than a financial one, they mastered far bigger tasks for far less money. We can’t go back in time, but we can recover the common sense that guided our grandparents who accomplished so much. One elementary principle is accountability: We can’t expect the government to get the job done until voters can say both to incompetent transit workers and to the incompetent elites who feel entitled to govern: “You’re fired.”

Migrant Competence

September 18th, 2016

James Thompson explores migrant competence:

Europe is experiencing enormous inflows of people from Africa and the Middle East, and in the midst of conflicting rhetoric, of strong emotions and of a European leadership broadly in favour of taking more migrants (and sometimes competing to do so) one meme keeps surfacing: that European Jews are the appropriate exemplars of migrant competence and achievements.

European history in the 20th Century shows why present-day governments feel profound shame at their predecessors having spurned European Jews fleeing Nazi Germany. However, there are strong reasons for believing that European Jews are brighter than Europeans, and have greater intellectual and professional achievements. There may be cognitive elites elsewhere, but they have yet to reveal themselves. Expectations based on Jewish successes are unlikely to be repeated.

I am old enough to know that political decisions are not based on facts, but on presumed political advantages. The calculation of those leaders who favour immigration seems to be that the newcomers will bring net benefits, plus the gratitude and votes of those migrants, plus the admiration of some of the locals for policies which are presented as being acts of generosity, thus making some locals feel good about themselves for their altruism. One major ingredient of the leadership’s welcome to migrants is the belief that they will quickly adapt to the host country, and become long term net contributors to society. Is this true?

With Heiner Rindermann he analyzed the gaps, possible causes, and impact of The Cognitive Competences of Immigrant and Native Students across the World:

In Finland the natives had reading scores of 538, first-generation immigrants only 449, second-generation 493. The original first-generation difference of 89 points was equivalent to around 2–3 school years of progress; the second-generation difference of 45 points (1–2 school years) is still of great practical significance in occupational terms.

In contrast, in Dubai natives had reading scores of 395; first-generation immigrants 467; second-generation 503. This 105 point difference is equivalent to 16 IQ points or 3–5 years of schooling.

Rather than look at the scales separately, Rindermann created a composite score based on PISA, TIMSS, and PIRLS data so as to provide one overall competence score for both the native-born population and the immigrants who had settled in each particular country. For each country you can see the native-versus-immigrant gap. By working out what proportion of the national population are immigrants, you can recalculate the national competence (IQ) for that country. Rindermann proposes that native-born competences need to be distinguished from immigrant competences in national-level data.
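The recalculation described here is just a population-weighted mean of the two group scores. A minimal sketch, using Finland’s reading scores from the excerpt above; the 5 percent immigrant share is an assumed figure for illustration, not from the study:

```python
def national_competence(native_score, immigrant_score, immigrant_share):
    """Population-weighted mean of native and immigrant competence scores."""
    return (1 - immigrant_share) * native_score + immigrant_share * immigrant_score

# Finland's PISA reading scores from the excerpt; the 0.05 share is an assumption.
overall = national_competence(native_score=538, immigrant_score=449, immigrant_share=0.05)
print(round(overall))   # prints: 534
```

The larger the immigrant share and the wider the gap, the further the overall national score drifts from the native-born score, which is the arithmetic behind Rindermann’s proposal to report the two separately.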

The analysis of scholastic attainments in first- and second-generation immigrants shows that the Gulf has gained from immigrants while Europe has lost. This is because those emigrating to the Gulf have higher abilities than the locals, whereas those emigrating to Europe have lower abilities than the locals.

The economic consequences can be calculated by looking at the overall correlations between country competence and country GDP.

[...]

The natives of the United Kingdom have a competence score of 519 (migrants to UK 499), Germany 516 (migrants to Germany 471), the United States 517 (migrants to US 489). There, in a nutshell, is the problem: those three countries have not selected their migrants for intellectual quality. The difference sounds in damages: lower ability leads to lower status, lower wages and higher resentment at perceived differences. On the latter point, if the West cannot bear to mention competence differences, then differences in outcome are seen as being due solely to prejudice.

Do Immigrants Import Their Economic Destiny?

September 17th, 2016

Do immigrants import their economic destiny?

This is one of the great policy questions in our new age of mass migration, and it’s related to one of the great questions of social science: Why do some countries have relatively liberal, pro-market institutions while others are plagued by corruption, statism, and incompetence? Three lines of research point the way to a substantial answer:

  • The Deep Roots literature on how ancestry predicts modern economic development,
  • The Attitude Migration literature, which shows that migrants tend to bring a lot of their worldview with them when they move from one country to another,
  • The New Voters-New Policies literature, which shows that expanding the franchise to new voters really does change the nature of government.

Together, these three data-driven literatures suggest that if you want to predict how a nation’s economic rules and norms are likely to change over the next few decades, you’ll want to keep an eye on where that country’s recent immigrants hail from.

Being white, and a minority, in Georgia

September 15th, 2016

Being white, and a minority, in Georgia is a new combination:

A generation ago, this Atlanta suburb was 95 percent white and rural with one little African-American neighborhood that was known as “colored town.’’ But after a tidal wave of Hispanic and Asian immigrants who were attracted to Norcross by cheap housing and proximity to a booming job market, white people now make up less than 20 percent of the population in Norcross and surrounding neighborhoods. It’s a shift so rapid that many of the longtime residents feel utterly disconnected from the place where they raised their children.

“It’s not that much anger, but you don’t feel comfortable knowing that all this is around you,” said Billy Weathers, 79, who has lived in the area for his whole life and doesn’t speak a lick of Spanish.

Many say they feel isolated in their own hometown, pushed to change their ways, to assimilate to the new arrivals instead of the other way around. They resent the shift, even knowing it’s nobody’s fault, really. And they have mostly kept their feelings to themselves. Who, they wonder, would listen to folks like us, anyway?

[...]

“There used to be a place where we could go out to eat to get southern cooking,” said Billy’s wife, JoAnn Weathers, 79. “Well there’s no more southerners left here. . . . They came from other countries and completely changed our lives.”

It’s an attitude that many in the elites of both parties are quick to dismiss as out-of-date, wrongheaded, and frankly kind of embarrassing. It sounds like racial prejudice, and sometimes is. But to simply ignore or belittle this sense of loss and isolation is to close your eyes and ears to nativist sentiments that predated Trump’s rise, and even if he loses, aren’t going away. The demographic tide all but guarantees it.

[...]

The number of Hispanics in Georgia nearly doubled from 2000 to 2010, bringing their total to roughly 9 percent of the state’s population. That’s still a smaller proportion than in the nation, where the Hispanic population reached a record 55 million in 2014 — 17.4 percent of the country’s overall population, according to Census data.

But in this part of Georgia, the pace of change has been breathtaking.

[...]

Norcross, once a sleepy bedroom community of about 3,000 people living on winding roads, attracted waves upon waves of these newcomers.

Part of the attraction was the cheap housing. Part of it was the easy access to interstate highways and the jobs in Atlanta. But also, once the area had a beachhead of immigrants, more came to live near family or acquaintances, according to Mary Odem, an associate professor at Emory University who has studied the immigrant influx in Georgia.

Now roughly 16,000 people live within the Norcross city limits, and about 40 percent are Hispanic. In 1980, only 23 people in the city were foreign-born.

This kind of population change, if sustained throughout the Atlanta area, will vastly improve the odds of Democrats running for office in Georgia.

[...]

Bell, like many others interviewed, said he distinguishes between immigrants who are making an effort to fit into the existing culture and those who he thinks aren’t trying to assimilate. The Vietnamese and the Koreans, he said, are at least keeping to themselves.

“The Latinos just throw it in your face. They’re here for the money. They don’t want to be American,” Bell said. “They don’t care about America.”

He listed big changes that he’s noticed: More renters in the neighborhood who seem to him to care little about the upkeep of their property, single-family houses that he says are filled with multiple families, garbage bins overflowing and litter in the streets.

Just bite in

September 14th, 2016

Group socialisation theory was Judith Rich Harris’s attempt to solve a puzzle she had encountered while writing child development textbooks for college students:

My textbooks endorsed the conventional view of child development — that what makes children turn out the way they do is ‘nature’ (their genes) and ‘nurture’ (the way their parents bring them up). But after a while it dawned on me that there just wasn’t enough solid evidence to support that view, and there was a growing pile of evidence against it. The problem was not with the ‘nature’ part — genes were having their expected effect. But ‘nurture’ wasn’t working the way it was supposed to. In studies that provided some way of controlling for or eliminating the effects of heredity, the environment provided by parents had little or no effect on how the children turned out.

And yet, genes accounted for only about 50 per cent of the variation in personality and social behaviour. The environment must be playing some role. But it wasn’t the home environment. So I proposed that the environment that has lasting effects on personality and social behaviour is the one the child encounters outside the home. This makes sense if you think about the purpose of childhood. What do children have to accomplish while they’re growing up? They have to learn how to behave in a way that is acceptable to the other members of their society. How do they do this? Not by imitating their parents! Parents are adults, and every society prescribes different behaviours for children and adults. A child who behaved like his or her parents (in any context other than a game) would be seen as impertinent, unruly or weird.

Before it became The Nurture Assumption, her work started out as a 1995 Psychological Review piece, which won the George A. Miller award for an outstanding article in general psychology — and there was a certain irony to that:

In 1960 I was a graduate student in the Department of Psychology at Harvard. One day I got a letter saying that the Department had decided to kick me out of their PhD programme. They doubted I would ever make a worthwhile contribution to psychology, the letter said, due to my lack of ‘originality and independence’. The letter was signed by the acting chairman of the Department, George A. Miller!

Sometimes, when life hands you a lemon, you should just bite in. Getting kicked out of Harvard was a devastating blow at the time, but in retrospect, it was the best thing that Harvard ever did for me. It freed me from the influence of ‘experts’. It kept me from being indoctrinated. Many years later, it enabled me to write The Nurture Assumption.

Why Universities Should Get Rid of PowerPoint and Why They Won’t

September 13th, 2016

Universities measure student satisfaction, but they do not measure learning:

When we do attempt to measure learning, the results are not pretty. US researchers found that a third of American undergraduates demonstrated no significant improvement in learning over their four-year degree programs. They tested students in the beginning, middle and end of their degrees using the Collegiate Learning Assessment, an instrument that tests skills any degree should improve – analytic reasoning, critical thinking, problem solving and writing.

Paul Ralph’s main argument is against PowerPoint:

A review of research on PowerPoint found that while students liked PowerPoint better than overhead transparencies, PowerPoint did not increase learning or grades. Liking something doesn’t make it effective, and there’s nothing to suggest transparencies are especially effective learning tools either.

Research comparing teaching based on slides against other methods such as problem-based learning – where students develop knowledge and skills by confronting realistic, challenging problems – predominantly supports alternative methods.

PowerPoint slides are toxic to education for three main reasons:

  1. Slides discourage complex thinking. Slides encourage instructors to present complex topics using bullet points, slogans, abstract figures and oversimplified tables with minimal evidence. They discourage deep analysis of complex, ambiguous situations because it is nearly impossible to present a complex, ambiguous situation on a slide. This gives students the illusion of clarity and understanding.
  2. Reading evaluations from students has convinced me that when most courses are based on slides, students come to think of a course as a set of slides. Good teachers who present realistic complexity and ambiguity are criticised for being unclear. Teachers who eschew bullet points for graphical slides are criticised for not providing proper notes.
  3. Slides discourage reasonable expectations. When I used PowerPoint, students expected the slides to contain every detail necessary for projects, tests and assignments. Why would anyone waste time reading a book or going to a class when they can get an A by perusing a slide deck at home in their pyjamas?

“Good teachers who present realistic complexity and ambiguity are criticised for being unclear.”

Rainbow Modeling Compound

September 12th, 2016

Play-Doh is composed not only of flour, water, and salt — that much you probably already knew — but also boric acid and mineral oil:

The non-toxic, non-staining, reusable modeling compound that came to be known as “Play-Doh” was a pliable, putty-like substance concocted by Noah McVicker of Cincinnati-based soap manufacturer Kutol Products. It was devised at the request of Kroger Grocery, which wanted a product that could clean coal residue from wallpaper. Following World War II, with the transition from coal-based home heating to natural gas and the resulting decrease in internal soot, and the introduction of washable vinyl-based wallpaper, the market for wallpaper cleaning putty decreased substantially.

McVicker’s nephew, Joe McVicker, joined Kutol with the remit to save the company from bankruptcy. Joe McVicker was the brother-in-law of nursery school teacher Kay Zufall, and Zufall had seen a newspaper article about making art projects with the wallpaper cleaning putty. Her students enjoyed it, and she persuaded Bill Rhodenbaugh (who also sold the putty) and Joe McVicker to manufacture it as a child’s toy. Zufall and her husband came up with the name Play-Doh; Joe McVicker and Rhodenbaugh had wanted to call it “Rainbow Modeling Compound”. Joe McVicker took Play-Doh to an educational convention for manufacturers of school supplies, and Woodward & Lothrop, a department store in Washington, DC, began selling the compound.

In 1956, the McVickers formed the Rainbow Crafts Company to make and sell Play-Doh. Also in 1956, a three-pack of 7-ounce cans was added to the product line, and, after in-store demonstrations, Macy’s of New York and Marshall Field’s of Chicago opened retail accounts. In 1957, chemist Dr. Tien Liu reduced Play-Doh’s salt content (thus allowing models to dry without losing their color), and Play-Doh ads were telecast on Captain Kangaroo, Ding Dong School, and Romper Room. In 1958, Play-Doh’s sales reached nearly $3 million.