The study, funded jointly by the European Framework 6 programme and the Sheepdrove Trust, found that concentrations of antioxidants such as polyphenolics were between 18% and 69% higher in organically grown crops. Numerous studies have linked antioxidants to a reduced risk of chronic diseases, including cardiovascular and neurodegenerative diseases and certain cancers.
Substantially lower concentrations of the toxic heavy metal cadmium were also detected in organic crops (on average 48% lower).
Nitrogen concentrations were also found to be significantly lower in organic crops: total nitrogen was 10% lower, nitrate 30% lower, and nitrite 87% lower than in conventional crops. The study also found that pesticide residues were four times more likely to be found in conventional crops than in organic ones.
Environmental history’s popularity derives from progressive political concerns. This is unfortunate, T. Greer notes:
There are few fields whose findings have such a clear and wide-ranging impact on every other aspect of human civilization. The rise and fall of dynasties, the great deeds of armies and generals, the wealth and poverty of nations, and the daily life of men and women across human history were molded by the ecological setting in which they occurred.
A book I often recommend to those who doubt that environmental history is essential to making sense of human civilization is Alfred Crosby’s Ecological Imperialism: The Biological Expansion of Europe, 900-1900. Over the last three decades a torrent of books and articles have been written to explain why the West was able to rise above ‘the rest’ and establish global supremacy. While in the same vein as these works, the question that animates Ecological Imperialism is slightly different: why were Europeans so successful at reproducing European society in some locales (completely displacing the previous inhabitants) but unable to accomplish this same feat in other locations? Why were white settlers in North America, New Zealand, Australia, Argentina, and South Africa fantastically more successful than their fellow colonists in Brazil, Mozambique, Panama, or New Guinea?
The answer, says Crosby, is ecology. European expansion was not just a movement of peoples, but of entire environments. For settlers to survive and thrive, they had to be able to secure food, construct buildings, and move about from place to place. European civilization allowed Europeans to do this on a scale new to human history, but the successes of Western civilization cannot be separated from the environment from which they sprang. Its great cities, armies, and ships were ultimately built upon a unique suite of European flora and fauna. When transplanted far from their homeland, these alien organisms did better in some lands than in others. Places where climate and local disease proved hostile to European biota, like central Africa, are places where Western imperialists could only establish an ephemeral presence. Places like Brazil or Mexico, less deadly to European life but unsuitable for large-scale colonization without adopting indigenous crops and farming techniques, produced creole cultures that mixed European and Amerindian traditions. In Australia, New Zealand, North America, and the South African coastline the only environmental constraint European settlers faced was distance. The geography, climate, and ecology of these places were perfect for European biota, allowing them to displace native life without conscious effort.
The displacement was complete and utter. American readers may be surprised to find out how much of America’s ‘natural and wild’ wilderness is made up of European aliens of recent import. Sparrows, starlings, house flies, honeybees, garden snails, earthworms, common rats, white clovers, dandelions, Kentucky bluegrass, stinging nettles, knot-grass, broadleaf plantains, Bermuda grass, periwinkles, mayweed, ground ivy, knapweed, milk thistles and almost every type of grass you can find east of the Mississippi originated in Europe and came to the United States in the two centuries after 1650. And that is just a small sample of the hundreds of plants and animals that came to America along with the Europeans. What started as a few alien weeds accidentally carried across the sea grew to dominate entire ecosystems. By 1940 an ecological survey in Southern California could report that “63% of herbaceous vegetation in the grassland types, 66% in the woodland, and 54% in the chaparral” were naturalized plants.
The expansion of European biota across the land was not simply a consequence of European migration. In most cases it was an essential precursor to large-scale European immigration itself. European colonization of New Zealand’s South Island is a case in point. The first settlers in New Zealand did not believe that sheep could ever prosper there, for both islands, being carpeted by ferns or covered with dense forest, had no grass to speak of and nothing sheep could survive on. Initial attempts to remedy the situation by introducing flowering plants to New Zealand failed. The plants would grow but could not reproduce: New Zealand had no insect species adapted to pollinate them! It was not until settlers brought honeybees to the islands that the situation changed, leading to an explosion of European plants across both islands. The new grasses and clovers were perfect feed for English sheep, and it was not long before the sheep had reached such numbers that they became one of New Zealand’s chief exports and an economic pull for more migrants.
Ecological Imperialism does sound comically left-wing, doesn’t it?
Cool it in the bedroom, a new study recommends:
For the new study, published in June in Diabetes, researchers affiliated with the National Institutes of Health persuaded five healthy young male volunteers to sleep in climate-controlled chambers at the N.I.H. for four months. The men went about their normal lives during the days, then returned at 8 every evening. All meals, including lunch, were provided, to keep their caloric intakes constant. They slept in hospital scrubs under light sheets.
For the first month, the researchers kept the bedrooms at 75 degrees, considered a neutral temperature that would not prompt moderating responses from the body. The next month, the bedrooms were cooled to 66 degrees, a temperature that the researchers expected might stimulate brown-fat activity (but not shivering, which usually begins at more frigid temperatures). The following month, the bedrooms were reset to 75 degrees, to undo any effects from the chillier room, and for the last month, the sleeping temperature was a balmy 81 degrees. Throughout, the subjects’ blood-sugar and insulin levels and daily caloric expenditures were tracked; after each month, the amount of brown fat was measured.
The cold temperatures, it turned out, changed the men’s bodies noticeably. Most striking, after four weeks of sleeping at 66 degrees, the men had almost doubled their volumes of brown fat. Their insulin sensitivity, which is affected by shifts in blood sugar, improved. The changes were slight but meaningful, says Francesco S. Celi, the study’s senior author and now a professor at Virginia Commonwealth University. “These were all healthy young men to start with,” he says, “but just by sleeping in a colder room, they gained metabolic advantages” that could, over time, he says, lessen their risk for diabetes and other metabolic problems. The men also burned a few more calories throughout the day when their bedroom was chillier (although not enough to result in weight loss after four weeks). The metabolic enhancements were undone after four weeks of sleeping at 81 degrees; in fact, the men then had less brown fat than after the first scan.
The message of these findings, Celi says, is that you can almost effortlessly tweak your metabolic health by turning down the bedroom thermostat a few degrees.
Spandrell recently found “a very neat paper on sex relations” — Women’s Mating Strategies, published in Evolutionary Anthropology — and decided “to pull an Isegoria and silently quote some pieces of the paper over several posts” — starting with this:
What does a woman want? The traditional evolutionist’s answer to Freud’s famous query is that a woman’s extensive investment in each child implies that she can maximize her fitness by restricting her sexual activity to one or at most a few high-quality males. Because acquiring resources for her offspring is of paramount importance, a woman will try to attract wealthy, high-status men who are willing and able to help her. She must be coy and choosy, limiting her attentions to men worthy of her and emphasizing her chastity so as not to threaten the paternity confidence of her mate.
The lady has been getting more complicated of late, however. As Sarah Hrdy predicted, we now have evidence that women, like other female primates, are also competitive, randy creatures. Women have been seen competing with their rivals using both physical aggression and more subtle derogation of competitors. While they are still sometimes coy and chaste, women have also been described recently as sexy and sometimes promiscuous creatures, manipulating fatherhood by the timing of orgasm, and using their sexuality to garner resources from men.
The real answer to Freud’s query, of course, is that a woman wants it all: a man with the resources and inclination to invest, and with genes that make him attractive to other women so that her sons will inherit his success. Her strategies for attaining these somewhat conflicting aims, and her success in doing so, are shaped by her own resources and options and by conflicts of interest with men and other women.
Scott Adams (Dilbert) was raised as a Methodist and was a believer until age eleven:
Then I lost faith and became an annoying atheist for decades. In recent years I’ve come to see religion as a valid user interface to reality. The so-called “truth” of the universe is irrelevant because our tiny brains aren’t equipped to understand it anyway.
Our human understanding of reality is like describing an elephant to a space alien by saying an elephant is grey. That is not nearly enough detail. And you have no way to know if the alien perceives color the same way you do. After enduring your inadequate explanation of the elephant, the alien would understand as much about elephants as humans understand about reality.
In the software world, user interfaces keep human perceptions comfortably away from the underlying reality of zeroes and ones that would be incomprehensible to most of us. And the zeroes and ones keep us away from the underlying reality of the chip architecture. And that raises a further question: What the heck is an electron and why does it do what it does? And so on. We use software, but we don’t truly understand it at any deep level. We only know what the software is doing for us at the moment.
Religion is similar to software, and it doesn’t matter which religion you pick. What matters is that the user interface of religious practice “works” in some sense. The same is true if you are a non-believer and your filter on life is science alone. What matters to you is that your worldview works in some consistent fashion.
If you’re deciding how to fight a disease, science is probably the interface that works best. But if you’re trying to feel fulfilled, connected, and important as you navigate life, religion seems to be a perfectly practical interface. But neither science nor religion requires an understanding of reality at the detail level. As long as the user interface gives us what we need, all is good.
Some of you non-believers will rush in to say that religion has caused wars and other acts of horror so therefore it is not a good user interface to reality. I would counter that no one has ever objectively measured the good and the bad of religion, and it would be impossible to do so because there is no baseline with which to compare. We only have one history. Would things have gone better with less religion? That is unknowable.
If you think there might have been far fewer wars and atrocities without religion, keep in mind that some of us grow up to be Josef Stalin, Pol Pot, and Genghis Khan. There’s always a reason for a war. If you add up all the people who died in holy wars, it would be a rounding error compared to casualties from wars fought for other reasons.
What I know for sure is that plenty of people around me are reporting that they find comfort and social advantages with religion. And science seems to support a correlation between believing, happiness, and health. Anecdotally, religion seems to be a good interface.
Today when I hear people debate the existence of God, it feels exactly like debating whether the software they are using is hosted on Amazon’s servers or Rackspace. From a practical perspective, it probably doesn’t matter to the user one way or the other. All that matters is that the user interface does what you want and expect.
There are words in nearly every language to describe believers, non-believers, and even the people who can’t decide. But is there a label for people who believe human brains are not equipped to understand reality so all that matters is the consistency and usefulness of our user interface?
A new study finds that very few scientists — fewer than 1% — manage to publish a paper every year:
But these 150,608 scientists dominate the research journals, having their names on 41% of all papers. Among the most highly cited work, this elite group can be found among the co-authors of 87% of papers.
Many of these prolific scientists are likely the heads of laboratories or research groups; they bring in funding, supervise research, and add their names to the numerous papers that result.
The single biggest thing you as an expectant parent can do to have a child with a large vocabulary, Razib Khan reminds us, is to select a mate with a large vocabulary.
Can video games make you smarter? Yeah, sort of.
The New York Times’ fitness writer seems surprised that intense exercise is more effective than milder exercise. A new study found one mechanism:
At Scripps, the scientists had been focusing on catecholamines and their relationship with a protein found in both mice and people that is genetically activated during stress, called CRTC2. This protein, they discovered, affects the body’s use of blood sugar and fatty acids during moments of stress and seems to have an impact on health issues such as insulin resistance.
The researchers also began to wonder about the role of CRTC2 during exercise.
Scientists long have known that the sympathetic nervous system plays a part in exercise, particularly if the activity is intense. Strenuous exercise, the thinking went, acts as a kind of stress, prompting the fight or flight response and the release of catecholamines, which goose the cardiovascular system into high gear. And while these catecholamines were important in helping you to instantly fight or flee, it was generally thought they did not play an important role in the body’s longer-term response to exercise, including changes in muscle size and endurance. Intense exercise, in that case, would have no special or unique effects on the body beyond those that can be attained by easy exercise.
But the Scripps researchers were unconvinced. “It just didn’t make sense” that the catecholamines served so little purpose in the body’s overall response to exercise, said Michael Conkright, an assistant professor at Scripps, who, with his colleague Dr. Nelson Bruno and other collaborators, conducted the new research. So, for a study published last month in The EMBO Journal, he and his collaborators decided to look deeper inside the bodies of exercising mice and, in particular, into what was going on with their CRTC2 proteins.
To do so, they first bred mice that were genetically programmed to produce far more of the CRTC2 protein than other mice. When these mice began a program of frequent, strenuous treadmill running, their endurance soared by 103 percent after two weeks, compared to an increase of only 8.5 percent in normal mice following the same exercise routine. The genetically modified animals also developed tighter, larger muscles than the other animals, and their bodies became far more efficient at releasing fat from muscles for use as fuel.
These differences all were the result of a sequence of events set off by catecholamines, the scientists found in closely examining mouse cells. When the CRTC2 protein received and read certain signals from the catecholamines, it would turn around and send a chemical message to genes in muscle cells that would set in motion processes resulting in larger, stronger muscles.
In other words, the catecholamines were involved in improving fitness after all.
What this finding means, Dr. Conkright said, is that “there is some truth to that idea of ‘no pain, no gain.’”
There is value in trash — if you can unlock it:
That’s what this facility in northern Oregon is designed to do. Run by a startup called S4 Energy Solutions, it’s the first commercial plant in the US to use plasma gasification to convert municipal household garbage into gas products like hydrogen and carbon monoxide, which can in turn be burned as fuel or sold to industry for other applications. (Hydrogen, for example, is used to make ammonia and fertilizers.)
Here’s how it works: The household waste delivered into this hangar will get shredded, then travel via conveyor to the top of a large tank. From there it falls into a furnace that’s heated to 1,500 degrees Fahrenheit and mixes with oxygen and steam. The resulting chemical reaction vaporizes 75 to 85 percent of the waste, transforming it into a blend of gases known as syngas (so called because they can be used to create synthetic natural gas). The syngas is piped out of the system and segregated. The remaining substances, still chemically intact, descend into a second vessel that’s roughly the size of a Volkswagen Beetle.
This cauldron makes the one above sound lukewarm by comparison. Inside, two electrodes aimed toward the middle of the vessel create an electric arc that, at 18,000 degrees, is almost as hot as lightning. This intense, sustained energy becomes so hot that it transforms materials into their constituent atomic elements. The reactions take place at more than 2,700 degrees, which means this isn’t incineration—this is emission-free molecular deconstruction. (The small amount of waste material that survives falls to the bottom of the chamber, where it’s trapped in molten glass that later hardens into inert blocks.)
The seemingly sci-fi transformation occurs because the trash is blasted apart by plasma—the forgotten-stepsister state of matter. Plasma is like gas in that you can’t grip or pour it. But because extreme heat ionizes some atoms (adding or subtracting electrons), causing conductivity, it behaves in ways that are distinct from gas.
U.Va. psychologist Timothy Wilson and colleagues have found that people do not enjoy being alone with their thoughts:
The period of time that Wilson and his colleagues asked participants to be alone with their thoughts ranged from six to 15 minutes. Many of the first studies involved college student participants, most of whom reported that this “thinking period” wasn’t very enjoyable and that it was hard to concentrate. So Wilson conducted another study with participants from a broad selection of backgrounds, ranging in age from 18 to 77, and found essentially the same results.
“That was surprising — that even older people did not show any particular fondness for being alone thinking,” Wilson said.
He does not necessarily attribute this to the fast pace of modern society, or the prevalence of readily available electronic devices, such as smartphones. Instead, he thinks the devices might be a response to people’s desire to always have something to do.
In his paper, Wilson notes that broad surveys have shown that people generally prefer not to disengage from the world, and, when they do, they do not particularly enjoy it. According to these surveys, Americans prefer to spend their time watching television, socializing or reading, and spend little or no time “relaxing or thinking.”
During several of Wilson’s experiments, participants were asked to sit alone in an unadorned room at a laboratory with no cell phone, reading materials or writing implements, and to spend six to 15 minutes — depending on the study — entertaining themselves with their thoughts. Afterward, they answered questions about how much they enjoyed the experience and if they had difficulty concentrating, among other questions.
Most reported they found it difficult to concentrate and that their minds wandered, though nothing was competing for their attention. On average the participants did not enjoy the experience. A similar result was found in further studies when the participants were allowed to spend time alone with their thoughts in their homes.
“We found that about a third admitted that they had ‘cheated’ at home by engaging in some activity, such as listening to music or using a cell phone, or leaving their chair,” Wilson said. “And they didn’t enjoy this experience any more at home than at the lab.”
An additional experiment randomly assigned participants to spend time with their thoughts or the same amount of time doing an external activity, such as reading or listening to music, but not to communicate with others. Those who did the external activities reported that they enjoyed themselves much more than those asked to just think, that they found it easier to concentrate and that their minds wandered less.
The real “grabber” is this bit though:
The researchers took their studies further. Because most people prefer having something to do rather than just thinking, they then asked, “Would they rather do an unpleasant activity than no activity at all?”
The results show that many would. Participants were given the same circumstances as most of the previous studies, with the added option of also administering a mild electric shock to themselves by pressing a button.
Twelve of 18 men in the study gave themselves at least one electric shock during the study’s 15-minute “thinking” period. By comparison, six of 24 women shocked themselves. All of these participants had received a sample of the shock and reported that they would pay to avoid being shocked again.
“What is striking,” the investigators write, “is that simply being alone with their own thoughts for 15 minutes was apparently so aversive that it drove many participants to self-administer an electric shock that they had earlier said they would pay to avoid.”
Weapons Man notes a similar empirical discovery by the men developing the original Special Forces Qualification Course, a discovery that has led every subsequent edition of the SFQC to include some type of isolation period:
In the field exercise portion, soldiers were isolated in the woods for approximately five days and four nights. There would always be a number of people who had never been alone before for a single night of their young lives, and who found this aspect of the survival training extremely difficult. Some would endure. Some would fire the flare that would draw instructors to their location and write an ignominious end to their Green Beret aspirations.
Certainly the introverts and the self-sufficient (two sets with a large intersection, but not entirely the same) did well in the old survival exercise, at least on the isolation axis of measurement. Isolation wasn’t the sole purpose of the drill. One also had 14 or 15 mandatory tasks to accomplish, some of them difficult and time-consuming, and had to obey rules like not linking up with other students — or at least avoid getting caught breaking them. But it was one important aspect of Special Forces training that produced operators capable of individual operations, although those were almost never undertaken deliberately. It also identified the men for whom the de facto isolation of being the only American amid a group of strange foreigners of a different race, language, and culture would not be too stressful.
Needless to say, those who came through the isolation exercise best were usually those for whom being isolated and alone for several days was nothing new, including hunters, hikers, single-handed sailors, and other adventuresome youth.
Writing is cognitively unnatural:
For almost all human existence, nobody wrote anything; even after that, for millennia, only a tiny elite did so. And it remains an odd way to communicate. You can’t see your readers’ facial expressions. They can’t ask for clarification. Often, you don’t know who they are, or how much they know. How to make up for all this?
Pinker’s answer builds on the work of two language scholars, Mark Turner and Francis-Noël Thomas, who label their approach “joint attention”. Writing is a modern twist on an ancient, species-wide behaviour: drawing someone else’s attention to something visible. Imagine stopping during a hike to point out a distant church to your hiking companion: look, over there, in the gap between those trees – that patch of yellow stone? Now can you see the spire? “When you write,” Pinker says, “you should pretend that you, the writer, see something in the world that’s interesting, and that you’re directing the attention of your reader to that thing.”
Perhaps this seems stupidly obvious. How else could anyone write? Yet much bad writing happens when people abandon this approach. Academics can be more concerned with showcasing their knowledge; bureaucrats can be more concerned with covering their backsides; journalists can be more concerned with breaking the news first, or making their readers angry. All interfere with “joint attention”, making writing less transparent.
I’d say perfectly natural, rather than stupidly obvious.
Consistent with prior research, IQ was most strongly related to openness to experience. Of its 9 dimensions, 8 were positively related to IQ: intellectual engagement, intellectual creativity, mental quickness, intellectual competence, introspection, ingenuity, intellectual depth, and imagination. Interestingly, IQ was much more strongly related to intellectual engagement and mental quickness than to imagination, ingenuity, or intellectual depth, and IQ was not related to sensitivity to beauty.
Out of 45 dimensions of personality, 23 dimensions were not related to IQ. This included gregariousness, friendliness, assertiveness, poise, talkativeness, social understanding, warmth, pleasantness, empathy, cooperation, sympathy, conscientiousness, efficiency, dutifulness, purposefulness, cautiousness, rationality, perfectionism, calmness, impulse control, imperturbability, cool-headedness, and tranquility. These qualities were not directly relevant to IQ.
8 dimensions of personality outside the openness to experience domain were positively related to IQ, including organization, toughness, provocativeness, leadership, self-disclosure, emotional stability, moderation, and happiness — although the correlations were much smaller than with intellectual engagement and mental quickness. IQ was negatively related to orderliness, morality, nurturance, tenderness, and sociability, but again, these negative correlations were much smaller than the correlations between IQ and intellectual engagement or mental quickness.
Meat eating was behind the evolutionary success of humankind, because this higher-quality diet meant that women could wean their children earlier:
Among natural fertility societies, the average duration of breast-feeding is 2 years and 4 months. This is not much in relation to the maximum lifespan of our species, around 120 years. It is even less if compared to our closest relatives: female chimpanzees suckle their young for 4–5 years, whereas the maximum lifespan for chimpanzees is only 60 years.
Many researchers have tried to explain the relatively shorter breast-feeding period of humans based on social and behavioral theories of parenting and family size. But the Lund group has now shown that humans are in fact no different than other mammals with respect to the timing of weaning. If you enter brain development and diet composition into the equation, the time when our young stop suckling fits precisely with the pattern in other mammals.
This is the type of mathematical model that Elia Psouni and her colleagues have built. They entered data on close to 70 mammalian species of various types into the model — data on brain size and diet. Species for which at least 20 per cent of the energy content of their diet comes from meat were categorised as carnivores. The model shows that the young of all species cease to suckle when their brains have reached a particular stage of development on the path from conception to full brain-size. Carnivores, due to their high quality diet, can wean earlier than herbivores and omnivores.
The model also shows that humans do not differ from other carnivores with respect to timing of weaning. All carnivorous species, from small animals such as ferrets and raccoons to large ones like panthers, killer whales and humans, have a relatively short breast-feeding period. The difference between us and the great apes, which has puzzled previous researchers, seems to depend merely on the fact that as a species we are carnivores, whereas gorillas, orangutans and chimpanzees are herbivores or omnivores.
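The excerpt gives the model's shape but not its published parameters, so the logic above can only be sketched: classify each species by the 20% meat-energy threshold quoted in the article, then predict weaning age as a diet-dependent function of adult brain size, with carnivores weaning earlier. In the toy version below, the 20% cutoff comes from the text; the power-law coefficients, the brain masses, and the diet fractions are purely illustrative placeholders, not the values from Psouni's paper.

```python
# Illustrative sketch of a Psouni-style weaning model.
# Only the 20% meat-energy threshold comes from the article;
# all coefficients below are made-up placeholders.

def diet_category(meat_energy_fraction: float) -> str:
    """Species getting >= 20% of dietary energy from meat count as carnivores."""
    return "carnivore" if meat_energy_fraction >= 0.20 else "non-carnivore"

# Hypothetical scaling: weaning_age_days = k * adult_brain_mass_g ** b,
# with a smaller prefactor k for carnivores (higher-quality diet
# -> the brain-development threshold for weaning is reached earlier).
COEFFS = {
    "carnivore": (25.0, 0.45),      # (k, b) -- placeholder values
    "non-carnivore": (60.0, 0.45),  # same exponent, larger prefactor
}

def predicted_weaning_age_days(adult_brain_mass_g: float,
                               meat_energy_fraction: float) -> float:
    k, b = COEFFS[diet_category(meat_energy_fraction)]
    return k * adult_brain_mass_g ** b

if __name__ == "__main__":
    # Placeholder inputs: humans fall on the carnivore side of the
    # threshold, chimpanzees on the herbivore/omnivore side, so the
    # model predicts earlier weaning for humans relative to brain size.
    human = predicted_weaning_age_days(1350.0, 0.30)
    chimp = predicted_weaning_age_days(400.0, 0.05)
    print(f"human (carnivore): {human:.0f} days; chimp (non-carnivore): {chimp:.0f} days")
```

The point of the sketch is the structural claim, not the numbers: once diet category and brain development are in the equation, humans stop looking anomalous and line up with the other carnivores.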