Where did we lose the microbes?

Saturday, November 28th, 2015

Italian microbiologists compared the intestinal microbes of young villagers in Burkina Faso with those of children in Florence, Italy:

The villagers, who subsisted on a diet of mostly millet and sorghum, harbored far more microbial diversity than the Florentines, who ate a variant of the refined, Western diet. Where the Florentine microbial community was adapted to protein, fats, and simple sugars, the Burkina Faso microbiome was oriented toward degrading the complex plant carbohydrates we call fiber.

“It was the most different human microbiota composition we’d ever seen,” Sonnenburg told me. To his mind it carried a profound message: The Western microbiome, the community of microbes scientists thought of as “normal” and “healthy,” the one they used as a baseline against which to compare “diseased” microbiomes, might be considerably different from the community that prevailed during most of human evolution.

And so Sonnenburg wondered: If the Burkina Faso microbiome represented a kind of ancestral state for humans — the Neolithic in particular, or subsistence farming — and if the transition between that state and modern Florence represented a voyage from an agriculturalist’s existence to 21st-century urban living, then where along the way had the Florentines lost all those microbes?

Humans can’t digest soluble fiber, so we enlist microbes to dismantle it for us, sopping up their metabolites. The Burkina Faso microbiota produced about twice as much of these fermentation by-products, called short-chain fatty acids, as the Florentine. That gave a strong indication that fiber, the raw material fermented solely by microbes, was somehow boosting microbial diversity in the Africans.

Indeed, when Sonnenburg fed mice plenty of fiber, microbes that specialized in breaking it down bloomed, and the ecosystem became more diverse overall. When he fed mice a fiber-poor, sugary, Western-like diet, diversity plummeted. (Fiber-starved mice were also meaner and more difficult to handle.) But the losses weren’t permanent. Even after weeks on this junk food-like diet, an animal’s microbial diversity would mostly recover if it began consuming fiber again.

This was good news for Americans — our microbial communities might re-diversify if we just ate more whole grains and veggies. But it didn’t support the Sonnenburgs’ suspicion that the Western diet had triggered microbial extinctions. Then they saw what happened when pregnant mice went on the no-fiber diet: temporary depletions became permanent losses.

When we pass through the birth canal, we are slathered in our mother’s microbes, a kind of starter culture for our own community. In this case, though, pups born to mice on American-type diets — no fiber, lots of sugar — failed to acquire the full endowment of their mothers’ microbes. Entire groups of bacteria were lost during transmission. When Sonnenburg put these second-generation mice on a fiber-rich diet, their microbes failed to recover. The mice couldn’t regrow what they’d never inherited. And when these second-generation animals went on a fiberless diet in turn, their offspring inherited even fewer microbes. The microbial die-outs compounded across generations.
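
The compounding loss across generations is easy to see in a toy model. A minimal sketch — the starting taxon count and the per-birth transmission failure rate are illustrative assumptions, not the study’s measurements:

```python
import random

def generations_of_diversity(species=1000, loss_rate=0.3, generations=4, seed=1):
    """Toy model of the Sonnenburg result: on a low-fiber diet, a fraction
    of microbial taxa fails to transmit at each birth, and fiber can only
    revive taxa that were actually inherited -- so losses compound."""
    rng = random.Random(seed)
    counts = [species]
    for _ in range(generations):
        # each taxon independently survives transmission with prob 1 - loss_rate
        species = sum(1 for _ in range(species) if rng.random() > loss_rate)
        counts.append(species)
    return counts

print(generations_of_diversity())  # diversity ratchets down each generation
```

The key design point is the one-way door: a taxon that fails to transmit is gone for every subsequent generation, no matter how much fiber comes later.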

Agriculture Linked to DNA Changes in Ancient Europe

Friday, November 27th, 2015

After agriculture arrived in Europe 8,500 years ago, people’s DNA underwent widespread changes, altering their height, digestion, immune system and skin color:

Previous studies had suggested that Europeans became better able to digest milk once they began raising cattle. Dr. Reich and his colleagues confirmed that LCT, a gene that aids milk digestion, did experience intense natural selection, rapidly becoming more common in ancient Europeans. But it didn’t happen when farming began in Europe, as had been supposed. The earliest sign of this change, it turns out, dates back only 4,000 years.

While agriculture brought benefits like a new supply of protein in milk, it also created risks. Early European farmers who depended mainly on wheat and other crops risked getting low doses of important nutrients.

So a gene called SLC22A4 proved advantageous as soon as Europeans started to farm, Dr. Reich and his colleagues found. It encodes a protein on the surface of cells that draws in an amino acid called ergothioneine. Wheat and other crops have low levels of ergothioneine, and the new variant increases its absorption. That would have increased the chances of survival among the farmers who had the gene.

Yet this solution created a problem of its own. The same segment of DNA that carries SLC22A4 also contains a variation that raises the risk of digestive disorders like irritable bowel syndrome. These diseases, then, may be an indirect consequence of Europe’s pivot toward agriculture.

Dr. Reich and his colleagues also tracked changes in the color of European skin.

The original hunter-gatherers, descendants of people who had come from Africa, had dark skin as recently as 9,000 years ago. Farmers arriving from Anatolia were lighter, and this trait spread through Europe. Later, a new gene variant emerged that lightened European skin even more.

Why? Scientists have long thought that light skin helped capture more vitamin D in sunlight at high latitudes. But early hunter-gatherers managed well with dark skin. Dr. Reich suggests that they got enough vitamin D in the meat they caught.

He hypothesizes that it was the shift to agriculture, which reduced the intake of vitamin D, that may have triggered a change in skin color.

The new collection of ancient DNA also allowed Dr. Reich and his colleagues to track the puzzling evolution of height in Europe. After sorting through 169 height-related genes, they found that Anatolian farmers were relatively tall, and the Yamnaya even taller.

Northern Europeans inherited a larger amount of Yamnaya DNA, making them taller, too. But in southern Europe, people grew shorter after the advent of farming.

Dr. Reich said it wasn’t clear why natural selection favored short stature in the south and not in the north. Whatever the reason, this evolutionary history still shapes differences in height across the continent today.

Electrically Accelerated and Enhanced Remineralisation (EAER)

Tuesday, November 17th, 2015

Tooth decay is normally removed by drilling, followed by filling in the cavity with an amalgam or composite resin, but a new treatment, called Electrically Accelerated and Enhanced Remineralisation (EAER), accelerates the natural movement of calcium and phosphate into the damaged tooth:

A two-step process first prepares the damaged area of enamel, then uses a tiny electric current to push minerals into the repair site. It could be available within three years.

Three Levels of Moral Beliefs

Wednesday, November 11th, 2015

Our basic problem, Hayek explains, is that we have three levels of moral beliefs:

We have in the first instance our intuitive moral feelings, which are adapted to the small person-to-person society, where we act toward people that we know and are served by people that we know. Then we have a society run by moral traditions, which unlike what modern rationalists believe are not intellectual discoveries of men who designed them, but are an example of a process that I now prefer to describe by the biological term of group selection. Those groups that quite accidentally developed favorable habits, such as a tradition of private property and the family, succeed but they never understood this. So we owe our present extended order of human cooperation very largely to a moral tradition, which the intellectual does not approve of because it had never been intellectually designed. And it has to compete with a third level of moral beliefs; the morals that intellectuals design in the hope that they can better satisfy man’s instincts than the traditional rules do. And we live in a world where the three moral traditions are in constant conflict: the innate ones, the traditional ones, and the intellectually designed ones… You can explain the whole of social conflicts of the last 200 years by the conflict of the three moral traditions.

The principal criticism of liberal individualist society is that it is selfish:

Altruism is an instinct we’ve inherited from the small society where we know for whom we work, whom we serve. When we pass from this — as I like to call it — concrete society, where we are guided by what we see, to the abstract society which far transcends our range of vision, it becomes necessary that we are guided not by the knowledge of the effect of what we do but by some abstract symbols. The only symbol that takes us to where we can make the best contribution is profit. And in fact by pursuing profit we are as altruistic as we can possibly be, because we extend our concern to people who are beyond our range of personal conception. This is a condition which makes it possible even to produce what I call an extended order; an order which is not determined by our aim, by our knowing what are the most urgent needs, but by an impersonal mechanism that by a system of communication puts a label on certain things which is wholly impersonal. Now this is exactly where the traditional moral — which is not altruistic, which emphasizes private property — and the instinctive moral — which is altruistic — come into constant conflict. The very transition from a concrete society, where each serves the needs of others whom he knows, to an extended abstract society, where people serve the needs of others whom they do not know, of whose existence they are not aware, was only made possible by the abandonment of altruism and solidarity as the main guiding factors, which I admit are still the factors dominating our instincts; and what restrains our instincts is the tradition of private property and the family, the two traditional rules of morals, which are in conflict with instinct.

David Sloan Wilson notes that Hayek departs from orthodox economics:

Hayek places economics on an evolutionary foundation, including our genetically evolved adaptations to life in small-scale society, cultural evolution based on unplanned variation and selection, and intentional thought processes that result in planned variation and selection.

Discussions of Hayek, he argues, are therefore discussions of economics from an evolutionary perspective:

This will come as a surprise to a lot of Hayek enthusiasts, who manage to endorse his view of economics, deny evolution, and maintain a pious stance toward religion all at the same time. This absurd combination of beliefs is what passes for economic discourse in the popular sphere — and economic experts who know better somehow allow it to happen.

Wilson seems compelled to treat religion as primitive superstition and contrast it against the useful products of cultural evolution, which is amusing if you’ve been reading about Moses the Microbiologist (in The Paleo Manifesto) and fasting in Eastern Orthodox Christianity.

No One Left to Blame

Tuesday, November 10th, 2015

Steven D. Levitt and Stephen J. Dubner invite us to think like a freak about the unpleasant topic of suicide:

There are about 38,000 suicides a year in the United States, more than twice the number of homicides. Suicide is one of the top ten causes of death for nearly every age group. Because talking about suicide carries such a strong moral taboo, these facts are little known.

As of this writing, the U.S. homicide rate is lower than it’s been in fifty years. The rate of traffic fatalities is at a historic low, having fallen by two-thirds since the 1970s. The overall suicide rate, meanwhile, has barely budged — and worse yet, suicide among 15- to 24-year-olds has tripled over the past several decades.

One might think, therefore, that by studying the preponderance of cases, society has learned everything possible about what leads people to commit suicide.

David Lester, a psychology professor at Richard Stockton College in New Jersey, has likely thought about suicide longer, harder, and from more angles than any other human. In more than twenty-five hundred academic publications, he has explored the relationship between suicide and, among other things, alcohol, anger, antidepressants, astrological signs, biochemistry, blood type, body type, depression, drug abuse, gun control, happiness, holidays, Internet use, IQ, mental illness, migraines, the moon, music, national-anthem lyrics, personality type, sexuality, smoking, spirituality, TV watching, and wide-open spaces.

Has all this study led Lester to some grand unified theory of suicide? Hardly. So far he has one compelling notion. It’s what might be called the “no one left to blame” theory of suicide. While one might expect that suicide is highest among people whose lives are the hardest, research by Lester and others suggests the opposite: suicide is more common among people with a higher quality of life.

“If you’re unhappy and you have something to blame your unhappiness on — if it’s the government, or the economy, or something — then that kind of immunizes you against committing suicide,” he says. “It’s when you have no external cause to blame for your unhappiness that suicide becomes more likely. I’ve used this idea to explain why African-Americans have lower suicide rates, why blind people whose sight is restored often become suicidal, and why adolescent suicide rates often rise as their quality of life gets better.”

That said, Lester admits that what he and other experts know about suicide is dwarfed by what is unknown.

Good News Is Unplanned

Monday, November 2nd, 2015

Good news is unplanned, according to Matt Ridley’s The Evolution of Everything — which discusses the evolution of government:

States emerged from protection rackets in which a gang monopolizing violence demanded payment of goods and services — taxes — in exchange for promises to defend local farmers and artisans from predation by rival gangs. “Tudor monarchs and the Taliban are cut from exactly the same cloth,” summarizes Ridley.

But two to three centuries ago, the fractured polities of Western Europe provided an open, speculative space where novel ideas about property rights, free trade, freedom of religion, freedom of the press, and limits on government could mutate and grow. Where those bottom-up conceptual mutations took hold, technological innovation sped forward, incomes rose, and civil liberties were recognized. Once established, liberal societies are veritable evolution machines that frenetically generate new mutations and swiftly recombine them to produce a vast array of new products, services, and social institutions that enable ever more people to flourish. So far liberal societies are outcompeting—in the sense of being richer and more appealing—those polities that are closer to the original protection rackets.

“Perhaps,” Ridley hopefully suggests, “the state is now evolving steadily towards benign and gentle virtue.” He adds, “Perhaps not.”

A Slow-Motion, Ever-Evolving Riot

Tuesday, October 27th, 2015

Perhaps we should see the school-shooting epidemic as a slow-motion, ever-evolving riot:

What explains a person or a group of people doing things that seem at odds with who they are or what they think is right? Granovetter took riots as one of his main examples, because a riot is a case of destructive violence that involves a great number of otherwise quite normal people who would not usually be disposed to violence.

Most previous explanations had focussed on explaining how someone’s beliefs might be altered in the moment. An early theory was that a crowd cast a kind of intoxicating spell over its participants. Then the argument shifted to the idea that rioters might be rational actors: maybe at the moment a riot was beginning people changed their beliefs. They saw what was at stake and recalculated their estimations of the costs and benefits of taking part.

But Granovetter thought it was a mistake to focus on the decision-making processes of each rioter in isolation. In his view, a riot was not a collection of individuals, each of whom arrived independently at the decision to break windows. A riot was a social process, in which people did things in reaction to and in combination with those around them. Social processes are driven by our thresholds — which he defined as the number of people who need to be doing some activity before we agree to join them. In the elegant theoretical model Granovetter proposed, riots were started by people with a threshold of zero — instigators willing to throw a rock through a window at the slightest provocation. Then comes the person who will throw a rock if someone else goes first. He has a threshold of one. Next in is the person with the threshold of two. His qualms are overcome when he sees the instigator and the instigator’s accomplice. Next to him is someone with a threshold of three, who would never break windows and loot stores unless there were three people right in front of him who were already doing that — and so on up to the hundredth person, a righteous upstanding citizen who nonetheless could set his beliefs aside and grab a camera from the broken window of the electronics store if everyone around him was grabbing cameras from the electronics store.
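
The threshold cascade described above can be sketched in a few lines — a hypothetical simulation, not Granovetter’s own code:

```python
def riot_size(thresholds):
    """Simulate Granovetter's threshold model: each person joins once
    the number already rioting meets or exceeds their threshold."""
    joined = 0
    while True:
        # count people whose threshold is satisfied by the current crowd
        willing = sum(1 for t in thresholds if t <= joined)
        if willing == joined:  # no one new joins; the cascade stops
            return joined
        joined = willing

# 100 people with thresholds 0, 1, 2, ..., 99: the cascade is total
print(riot_size(list(range(100))))           # -> 100

# remove only the threshold-1 person and the chain breaks after one rock
print(riot_size([0] + list(range(2, 100))))  # -> 1
```

The second call shows Granovetter’s famous fragility result: two crowds with nearly identical beliefs can produce a full riot or a lone rock-thrower, depending on one person in the chain.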

Granovetter was most taken by the situations in which people did things for social reasons that went against everything they believed as individuals. “Most did not think it ‘right’ to commit illegal acts or even particularly want to do so,” he wrote, about the findings of a study of delinquent boys. “But group interaction was such that none could admit this without loss of status; in our terms, their threshold for stealing cars is low because daring masculine acts bring status, and reluctance to join, once others have, carries the high cost of being labeled a sissy.” You can’t just look at an individual’s norms and motives. You need to look at the group.

His argument has a second implication. We misleadingly use the word “copycat” to describe contagious behavior — implying that new participants in an epidemic act in a manner identical to the source of their infection. But rioters are not homogeneous. If a riot evolves as it spreads, starting with the hotheaded rock thrower and ending with the upstanding citizen, then rioters are a profoundly heterogeneous group.

Finally, Granovetter’s model suggests that riots are sometimes more than spontaneous outbursts. If they evolve, it means they have depth and length and a history. Granovetter thought that the threshold hypothesis could be used to describe everything from elections to strikes, and even matters as prosaic as how people decide it’s time to leave a party. He was writing in 1978, long before teen-age boys made a habit of wandering through their high schools with assault rifles. But what if the way to explain the school-shooting epidemic is to go back and use the Granovetterian model — to think of it as a slow-motion, ever-evolving riot, in which each new participant’s action makes sense in reaction to and in combination with those who came before?

But the riot has now engulfed the boys who were once content to play with chemistry sets in the basement. The problem is not that there is an endless supply of deeply disturbed young men who are willing to contemplate horrific acts. It’s worse. It’s that young men no longer need to be deeply disturbed to contemplate horrific acts.

Adopted Children Do Worse In School, Despite Having Better Parents

Thursday, October 22nd, 2015

Adoptive parents go to great lengths to raise their adoptive children, so why are their young kids’ behavior and test scores worse on average? I can’t imagine — but researchers, willing to dig deep, have come up with this explanation:

One clue might be attachment theory, which holds that a strong bond with at least one nurturing adult—usually the mother—is essential to a child thriving. That adult can be the adoptive parent, but the adoption itself might mean that the bond with the birth parent was disrupted or never formed, Zill writes. In the worst cases, these children might have experienced a traumatic event prior to their adoption. Early trauma can affect the parts of the brain that control mood and learning.

Infants and toddlers with a so-called “disorganized attachment” to their earliest caregivers—those who feel frightened of or dissociated from their parents—are more psychologically vulnerable later in life. Among other things, they have more problems regulating their emotions and managing conflicts without resorting to hostility. Parents who create disorganized attachment with their kids might be the sorts of parents who get their kids taken away and adopted out.

That last line raises some intriguing questions. I’d investigate that line of thinking a bit more thoroughly.

IQ testing across space and time

Tuesday, October 20th, 2015

Raw IQs have been steadily increasing for decades, and IQ tests have been renormed along the way, but this Flynn Effect has been more pronounced in the more abstract, less culturally loaded sections of the test:

The kind of cognitive faculties that come up in normal conversation, such as vocabulary, arithmetic and general knowledge, have only seen small Flynn Effects, which is why the Flynn Effect isn’t easily noticeable in much of daily life (although I’ll point out below where it can be seen).

Flynn Effect and Cultural Load

One of the big changes in daily life over recent centuries has been the growth of what I might call humans having to deal with “machine logic.” People today deal far more often each day than in the past with semi-intelligent machines that can only be dealt with in a certain way according to their logic. You deal with the ATM rather than with a bank teller, with a gasoline pump rather than with a pump jockey, with elevator buttons rather than with elevator operators. You can’t wave your hands around with these machines until they figure out what you want done. You have to follow a precise logical series of steps.

Generation after generation, children grow up in an environment ever denser with the kind of systems logic that the more Flynn-Effected Wechsler subtests ask about. Growing up, kids these days get more practice with the kind of thinking tested on the Raven’s and on some of the Wechsler subtests. And they legitimately are better at it.

The Flynn Effect is a side effect of the developers of the IQ test being on “the right side of history.”

How a Video Game Helped People Make Better Decisions

Tuesday, October 20th, 2015

Carey K. Morewedge and his colleagues developed a couple “serious” computer games to help people make better decisions:

Participants who played one of our games, each of which took about 60 minutes to complete, showed a large immediate reduction in their commission of the biases (by more than 31%), and showed a large reduction (by more than 23%) at least two months later.

The games target six well-known cognitive biases. Though these biases were chosen for their relevance to intelligence analysis, they affect all kinds of decisions made by professionals in business, policy, medicine, and education as well. They include:

  • Bias blind spot — seeing yourself as less susceptible to biases than other people
  • Confirmation bias — collecting and evaluating evidence that confirms the theory you are testing
  • Fundamental attribution error — unduly attributing someone’s behavior to enduring aspects of that person’s disposition rather than to the circumstance in which the person was placed
  • Anchoring — relying too heavily on the first piece of information considered when making a judgment
  • Projection — assuming that other people think the same way we do
  • Representativeness — relying on some simple and often misleading rules when estimating the probability of uncertain events

We ran two experiments. In the first experiment, involving 243 adult participants, one group watched a 30-minute video, “Unbiasing Your Biases,” commissioned by the program sponsor, the Intelligence Advanced Research Projects Activity (IARPA), a U.S. research agency under the Director of National Intelligence. The video first defined heuristics — information-processing shortcuts that produce fast and efficient, though not necessarily accurate, decisions. The video then explained how heuristics can sometimes lead to incorrect inferences. Then, bias blind spot, confirmation bias, and fundamental attribution error were described and strategies to mitigate them were presented.

Another group played a computer game, “Missing: The Pursuit of Terry Hughes,” designed by our research team to elicit and mitigate the same three cognitive biases. Game players make decisions and judgments throughout the game as they search for Terry Hughes — their missing neighbor. At the end of each level of the game, participants received personalized feedback about how biased they were during game play. They were given a chance to practice and they were taught strategies to reduce their propensity to commit each of the biases.

We measured how much each participant committed the three biases before and after the game or the video. In the first experiment, both the game and the video were effective, but the game was more effective than the video. Playing the game reduced the three biases by about 46% immediately and 35% over the long term. Watching the video reduced the three biases by about 19% immediately and 20% over the long term.

In a second experiment, involving 238 adult participants, one group watched the video “Unbiasing Your Biases 2” to address anchoring, projection, and representativeness. Another group played the computer detective game “Missing: The Final Secret,” in which they were to exonerate their employer of a criminal charge and uncover criminal activity of her accusers. Along the way, players made decisions that tested their propensity to commit anchoring, projection, and representativeness. After each level of the game, their commission of those biases was measured and players were provided with personalized feedback, practice, and mitigation strategies.

Again, the game was more effective than the video. Playing the game reduced the three biases by about 32% immediately and 24% over the long term. Watching the video reduced the three biases by about 25% immediately and 19% over the long term.

The games, which were specifically designed to debias intelligence analysts, are being deployed in training academies in the U.S. intelligence services. But because this approach affects the decision maker rather than specific decisions, such games can be effective in many contexts and decisions — and with lasting effect. (A commercial version of the games is in production.)

Reason Interviews Andy Weir

Friday, October 2nd, 2015

“I want us to have a self-sufficient population somewhere other than Earth,” Andy Weir (The Martian) says, “because 25 years of being a computer programmer has taught me the value of backing things up”:

New Technique Can Cheaply and Efficiently Detect All Known Viruses in a Blood Sample

Monday, September 28th, 2015

Think of VirCapSeq-VERT as a massive exercise in fishing for viruses:

To make the hooks, the team identified and synthesized distinctive stretches of DNA from the genomes of every known group of viruses that affects humans and other vertebrates. They ended up with two million of these hooks, each of which was baited to snag a different virus. If you dangle them in a blood sample, yank them out, and then sequence everything that’s attached to them, you end up with the full genome of every virus present.
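
As a loose computational analogy — the probe and read sequences below are made up — the capture step works like filtering sequencing reads by bait subsequences:

```python
def capture(reads, probes):
    """Toy analogy to hybridization capture: keep only sequencing reads
    that contain any probe ('hook') subsequence."""
    return [r for r in reads if any(p in r for p in probes)]

probes = ["ATTGCG", "GGAACT"]  # stand-ins for the two million viral baits
reads = ["TTATTGCGAA", "CCCCCCCCCC", "AGGAACTTAG"]
print(capture(reads, probes))  # -> ['TTATTGCGAA', 'AGGAACTTAG']
```

Real capture happens chemically, by base-pairing, not by exact string matching — but the selection logic is the same: anything that doesn’t stick to a hook gets washed away before sequencing.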

The team tested the system using tissue samples spiked with genes from many infamous viruses, including those responsible for Ebola, dengue, flu, and MERS. They also tried analyzing a nasal swab from a patient and a stool sample from a bat. VirCapSeq-VERT successfully identified all the viruses in these samples, even when they were present at minuscule amounts.

What Our Primate Relatives Say About War

Saturday, September 26th, 2015

Inter-group conflict may be important among chimpanzees, but Homo sapiens turned it into an art:

Although the details remain highly controversial, a series of new studies in archaeology and anthropology have debunked Rousseau’s myth of the peaceful savage. Death rates (as the percentage of adult males killed in intergroup conflict) among indigenous and prehistoric societies make the wars of the 20th century seem like skirmishes. Although humans were not always at war, human societies were always organized around its ever-present threat.

A sensitivity to human evolution and the behavior of chimpanzees and bonobos forces modern students of conflict to face two hard truths. First, we evolved in conditions of resource competition where fear of others, aggression and violence offered adaptive solutions to protect and provide for ourselves and our kin. We therefore need to amend Clausewitz. Humans do indeed wage war for political purposes, but long before war for raison d’etat there was war for resources. International politics is therefore not the root of war but merely an example of it—the continuation of seeking access to valuable resources by other means. Accordingly, when we consider “Why war?,” we have an answer: war is one of Mother Nature’s solutions to compete successfully for resources.

Second, the human traits of egoism, dominance, and in-group/out-group bias are at least partly adaptations to the ecological conditions prevalent in human evolution. It is not assumed that we simply inherited these wholesale from a common ancestor, or the common ancestor we share with the chimpanzee. Clearly, we have undergone many physiological and behavioral changes since then and ecology has been as or more important. But although humans and chimpanzees appear to have travelled much of the road to war together, we have gone far further. The particular socioecological setting in which humans evolved meant that aggression and war were significant behavioral adaptations. These same settings led to remarkable levels of cooperation as well, but note that this cooperation is selectively directed towards in-group members, the better to avoid exploitation by rival groups and organize for war. These adaptations, lamentably, remain with us today and influence our behavior, politics and society.

When Radiation Isn’t the Real Risk

Friday, September 25th, 2015

A small group of scientists met in Tokyo this past spring to evaluate the deadly aftermath of Fukushima:

No one has been killed or sickened by the radiation — a point confirmed last month by the International Atomic Energy Agency. Even among Fukushima workers, the number of additional cancer cases in coming years is expected to be so low as to be undetectable, a blip impossible to discern against the statistical background noise.

But about 1,600 people died from the stress of the evacuation — one that some scientists believe was not justified by the relatively moderate radiation levels at the Japanese nuclear plant.

Evacuations can be prompted by the linear no-threshold model of radiation deaths. If one sievert of radiation causes fatal cancers in 5 percent of the people exposed, then one millisievert must cause fatal cancer in 0.005 percent of the people exposed:

By avoiding what would have been an average cumulative exposure of 16 millisieverts, the number of cancer deaths prevented was perhaps 160, or 10 percent of the total who died in the evacuation itself.
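That back-of-the-envelope figure follows directly from linear scaling. A sketch of the arithmetic — the population size is the round number implied by the article’s figures, not a reported count:

```python
# Linear no-threshold (LNT) scaling: risk is assumed proportional to dose.
FATAL_CANCER_RISK_PER_SV = 0.05  # 5 percent per sievert, per the article

def lnt_expected_deaths(dose_sv, population):
    """Expected fatal cancers under the LNT assumption."""
    return FATAL_CANCER_RISK_PER_SV * dose_sv * population

# 16 mSv average avoided dose, across roughly 200,000 people
deaths_avoided = lnt_expected_deaths(0.016, 200_000)
print(round(deaths_avoided))  # -> 160, the article's estimate
```

The whole dispute is over whether that multiplication is legitimate at small doses: LNT extrapolates a straight line from high-dose data down through zero, which is exactly the assumption the next paragraph questions.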

But that estimate assumes the validity of the current standards. If low levels of radiation are less harmful, then the fallout might not have caused any increase in the cancer rate.

The idea of hormesis goes further, proposing that weak radiation can actually reduce a person’s risk.

We’ve discussed Fukushima’s incredible death toll before — and the fact that radiation is good for you.

Gluten-Free Diet Has No Benefit for Children With Autism

Wednesday, September 23rd, 2015

A gluten-free, casein-free diet had no benefit for children with autism, a small study found:

Fourteen young children between three and five years old with a diagnosis of autism were put on a gluten- and casein-free diet for 30 weeks, working with a registered dietitian to make sure they were getting the necessary nutrition.

After they got used to the diet, children were “challenged” weekly for 12 weeks with a food that contained gluten, casein, both, or a placebo. None of the researchers, parents or children knew whether they were getting a real food challenge or a placebo.

Though the researchers initially wanted more children in the study, ultimately only 14 completed it because of both the difficulty of persuading families to sign up and the number of dropouts, the researchers said. Some families left because their children complained about the diet.

The scientists recorded a range of behaviors in the lab after each food challenge and asked parents to monitor others at home, including a range of autism symptoms, sleep patterns and bowel movements.

The data showed no significant change in any of the outcomes between when they were challenged with gluten or casein and when they were given a placebo.