You should train yourself to thrive on stress:
A simple idea underpins science — trust, but verify:
A rule of thumb among biotechnology venture-capitalists is that half of published research cannot be replicated. Even that may be optimistic. Last year researchers at one biotech firm, Amgen, found they could reproduce just six of 53 “landmark” studies in cancer research. Earlier, a group at Bayer, a drug company, managed to repeat just a quarter of 67 similarly important papers. A leading computer scientist frets that three-quarters of papers in his subfield are bunk. In 2000-10 roughly 80,000 patients took part in clinical trials based on research that was later retracted because of mistakes or improprieties.
In the 1950s, when modern academic research took shape after its successes in the second world war, it was still a rarefied pastime. The entire club of scientists numbered a few hundred thousand. As their ranks have swelled, to 6m-7m active researchers on the latest reckoning, scientists have lost their taste for self-policing and quality control. The obligation to “publish or perish” has come to rule over academic life. Competition for jobs is cut-throat. Full professors in America earned on average $135,000 in 2012—more than judges did. Every year six freshly minted PhDs vie for every academic post. Nowadays verification (the replication of other people’s results) does little to advance a researcher’s career. And without verification, dubious findings live on to mislead.
Careerism also encourages exaggeration and the cherry-picking of results. In order to safeguard their exclusivity, the leading journals impose high rejection rates: in excess of 90% of submitted manuscripts. The most striking findings have the greatest chance of making it onto the page. Little wonder that one in three researchers knows of a colleague who has pepped up a paper by, say, excluding inconvenient data from results “based on a gut feeling”. And as more research teams around the world work on a problem, the odds shorten that at least one will fall prey to an honest confusion between the sweet signal of a genuine discovery and a freak of the statistical noise. Such spurious correlations are often recorded in journals eager for startling papers. If they touch on drinking wine, going senile or letting children play video games, they may well command the front pages of newspapers, too.
Conversely, failures to prove a hypothesis are rarely even offered for publication, let alone accepted. “Negative results” now account for only 14% of published papers, down from 30% in 1990. Yet knowing what is false is as important to science as knowing what is true. The failure to report failures means that researchers waste money and effort exploring blind alleys already investigated by other scientists.
The hallowed process of peer review is not all it is cracked up to be, either. When a prominent medical journal ran research past other experts in the field, it found that most of the reviewers failed to spot mistakes it had deliberately inserted into papers, even after being told they were being tested.
The concept of hormesis — that a low dose of poison or some other stressor is good for you — makes a wonderful excuse for many vices.
Having a little too much to drink or having a cigar is good for me, as long as I don’t overdo it.
Cold showers are also good for you, if not as much fun as other stressors:
Cold water therapy, one form of hydrotherapy, offers numerous health benefits. Cold showers provide a gentle form of stress that leads to thermogenesis (internal generation of body heat), turning on the body’s adaptive repair systems to strengthen immunity, enhance pain and stress tolerance, ward off depression, overcome chronic fatigue syndrome, stop hair loss, and stimulate anti-tumor responses.
Some people advocate starting with a warm shower, and switching over to cool or cold water only at the end of the shower. This is fine, particularly if you are afraid that a pure cold shower would just be too uncomfortable or intolerable. But I prefer just jumping right in. When you start with cold water, you will experience the phenomenon of cold shock, an involuntary response characterized by sudden, rapid breathing and an increased heart rate. This in itself is very beneficial. The extent of cold shock has been shown to decrease with habituation, and exposure to colder water (10°C or 50°F) appears to be more effective than just cool water (15°C or 59°F) in promoting habituation. The habituation itself is what is most beneficial, both objectively and subjectively. There is an analogy here with high-intensity resistance exercise and interval training, both of which elevate heart rate and lead to long-term adaptations to stress, with improved cardiovascular capacity and athletic performance.
But cold showers provide a different and probably complementary type of habituation to that which results from exercise. A study of winter swimmers compared them with a control group in their physiological response to being immersed in cold water: Both groups responded to cold water by thermogenesis (internal production of body heat), but the winter swimmers did so by raising their core temperature and did not shiver until much later than the controls, whereas the control subjects responded by shivering to increase their peripheral temperatures. The winter swimmers also tolerated much larger temperature differences and conserved their energy better. Other studies confirm that the benefits of habituation show up only after several weeks of cold showering. For example, adaptation to cold leads to increased output of the beneficial “short-term stress” hormones adrenaline and thyroxine, leading to mobilization of fatty acids and substantial fat loss over a 1-2 week period.
So regular cold showers, like high-intensity exercise and intermittent fasting, appear to provide similar, but not identical, hormetic benefits.
I’m in no hurry to try an ice bath.
(Hat tip to our Slovenian guest.)
Children believe in Santa, because they have plenty of evidence that Santa exists. They’ll also believe in the Candy Witch if she comes and replaces their candy with toys.
A few years ago I read Poe’s Some Words with a Mummy for Halloween, and it turned out to make for excellent election-year reading.
Humans are born with an innate number sense or approximate number system:
A few years ago, researchers played newborn infants — as young as seven hours! — recordings of spoken syllables repeated a fixed number of times. In one trial, babies would hear “tuuuuu” four times, for example, whereas in another they’d hear “tu” twelve times. At the same time, the babies were shown pictures of geometric shapes, such as four squares or twelve circles. Somewhat amazingly (at this age, after all, they’re basically blind, sucking potato sacks), the babies matched the number of sounds they heard with the number of shapes they saw. On the trials where they had heard four syllables, they would look longer at pictures of four shapes, and on those with 12 syllables, they’d look longer at pictures of 12 shapes.
The better a baby’s number sense at six months old, the stronger the child’s mathematical abilities three years later, a new PNAS study finds:
The study hinges on a method in which researchers track babies’ eye movements as they watch two video screens at the same time. One screen always shows the same number of dots, but the dots change in size and location. The other screen shows the same thing, except that the number of dots changes as well.
Babies like novelty. In an earlier study using this method, the researchers showed that 6-month-old babies tend to look longer at the screen in which the number of dots changes (the left side of the video above) than the other screen, presumably because they notice the difference in the number of dots and like watching it change.
Conservation is hard, Greg Cochran notes — but so is driving a prey animal to extinction:
Even if the population as a whole would be better off if a given prey species persisted in fair numbers, any single individual would benefit from cheating — even from eating the very last mammoth.
More complicated societies, with private property and draconian laws against poaching, do better, but even they don’t show much success in preserving a tasty prey species over the long haul. Consider the aurochs, the wild ancestor of the cow. The Indian version seems to have been wiped out 4–5,000 years ago. The Eurasian version was still common in Roman times, but was rare by the 13th century, surviving only in Poland. Theoretically, only members of the Piast dynasty could hunt aurochsen — but they still went extinct in 1627.
How then did edible species survive in pre-state societies? I can think of several ways in which some species managed to survive voracious humans, but none of them involve green intent.
First you have to realize that driving a prey species to extinction is unusual: it doesn’t happen often with normal predators. Specialized predators obviously can’t do it — when their prey gets scarce, so do they. On the other hand, unspecialized predators generally won’t be as efficient. On the gripping hand, at any given moment, a predator and its prey have been co-evolving (and co-existing) for millions of years. Both are highly optimized — which means that further improvements would be difficult — and it shouldn’t be easy for the predator to suddenly develop a crushing superiority. This argument doesn’t apply to newly introduced predators, of course.
Mass extinction is even less likely, because even an unspecialized predator should become rare when the total amount of prey (all relevant species) goes way down. Unless this potent predator is really an omnivore — but that means even less specialization in predation. Omnivores (bears, for example) usually aren’t that effective.
If we go back far enough, protohumans simply weren’t very good hunters, because they weren’t smart. Lions manage to be pretty good predators without being particularly smart, but humans, who don’t have impressive natural armament, have to succeed in hunting through tools and social cooperation. They were probably death on turtles early on, but in general early humans advanced slowly, giving prey species lots of time to adapt — African and Eurasian species, that is.
The pace of innovation gradually increased, and I can think of some species in Africa and Eurasia that were probably ganked by humans a long time ago — but it wasn’t dramatic. Progress in hunting, new tactics and weapons, was still slow enough to allow adaptive response in prey species. Consider the Neanderthals: I can’t think of a single species they wiped out. Wimps.
By the Upper Paleolithic, modern humans were innovating much more rapidly, and human-driven extinction starts to become really important. It wasn’t just better hunting that mattered. Better food preparation — getting more out of each carcass — increased human density, and thus hunting intensity. You might think that greater efficiency would mean that we didn’t need to bring down as many beasts — not so, in a Malthusian world.
Developing new ways of gathering food other than hunting, such as fishing and better preparation of plant foods, meant that human density could stay high even as mammal biomass crashed. Innovations in clothing and housing let people colonize the high Arctic, and eventually the Americas. Invention of boats and rafts led to the colonization of Australia and numerous islands.
We were omnivores and generalists: population collapse of prey species couldn’t stop us. We could kill anything — but the biggest threat of extinction was to large animals, which were worth a lot (mucho calories for the tribe) and bred slowly. Worst off were those animals that had never had a chance to adapt to humans.
There were some modifying factors. It probably wasn’t just adaptation to humans that saved much of the African megafauna: African pathogens may have played a role too, keeping human numbers down and possibly even creating natural game preserves (I’m thinking of sleeping sickness). Contrariwise, Australia and the Americas were almost disease-free, as far as humans were concerned.
War is bad for us, good for our prey. The no-man’s land between hostile tribes is oddly full of game, since people are afraid to go there. In much the same way, rabbits flourished next to the Berlin Wall, while Asiatic black bears and musk deer inhabit the Korean DMZ.
The Secret Race deserves to be read alongside The Sports Gene, Malcolm Gladwell says:
“Lance and Ferrari showed me there were more variables than I’d ever imagined, and they all mattered: wattages, cadence, intervals, zones, joules, lactic acid, and, of course, hematocrit,” Hamilton writes. “Each ride was a math problem: a precisely mapped set of numbers for us to hit…. It’s one thing to go ride for six hours. It’s another to ride for six hours following a program of wattages and cadences, especially when those wattages and cadences are set to push you to the ragged edge of your abilities.”
Hematocrit, the last of those variables, was the number they cared about most. It refers to the percentage of the body’s blood that is made up of oxygen-carrying red blood cells. The higher the hematocrit, the more endurance you have. (Mäntyranta had a very high hematocrit.) The paradox of endurance sports is that an athlete can never work as hard as he wants, because if he pushes himself too far his hematocrit will fall. Hamilton had a natural hematocrit of forty-two per cent — which is on the low end of normal. By the third week of the Tour de France, he would be at thirty-six per cent, which meant a six-per-cent decrease in his power — in the force he could apply to his pedals. In a sport where power differentials of a tenth of a per cent can be decisive, this “qualifies as a deal breaker.”
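The back-of-envelope arithmetic in the passage can be sketched out. This is purely an illustration of the quoted rule of thumb (roughly one per cent of power per percentage point of hematocrit), not a physiological model; the function name and the one-point-per-per-cent assumption are mine:

```python
def approx_power_loss_pct(baseline_hct: float, current_hct: float) -> float:
    """Back-of-envelope estimate of per-cent power loss, assuming each
    percentage-point drop in hematocrit costs roughly one per cent of power,
    as the excerpt describes."""
    return baseline_hct - current_hct

# Hamilton's numbers from the excerpt: 42% at the start of the Tour,
# 36% by the third week.
print(approx_power_loss_pct(42.0, 36.0))  # 6.0
```

In a sport where, per the excerpt, differentials of a tenth of a per cent can decide races, a six-per-cent gap is what the book calls a deal breaker.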
For the members of the Postal Service squad, the solution was to use the hormone EPO and blood transfusions to boost their hematocrits as high as they could without raising suspicion. (Before 2000, there was no test for EPO itself, so riders were not allowed to exceed a hematocrit of fifty per cent.) Then they would add maintenance doses over time, to counteract the deterioration in their hematocrit caused by races and workouts. The procedures were precise and sophisticated. Testosterone capsules were added to the mix to aid recovery. They were referred to as “red eggs.” EPO (a.k.a. erythropoietin), a naturally occurring hormone that increases the production of red blood cells, was Edgar — short for Edgar Allan Poe. During the Tour de France, and other races, bags of each rider’s blood were collected in secret locations at predetermined intervals, then surreptitiously ferried from stage to stage in refrigerated containers for strategic transfusions. The window of vulnerability after taking a drug — the interval during which doping could be detected — was called “glowtime.” Most riders who doped (and in the Armstrong era, it now appears, nearly all the top riders did) would take two thousand units of Edgar subcutaneously every couple of days, which meant they “glowed” for a dangerously long time. Armstrong and his crew practiced microdosing, taking five hundred units of Edgar nightly and injecting the drug directly into the vein, where it was dispersed much more quickly.
“The Secret Race” is full of paragraphs like this:
The trick with getting Edgar in your vein, of course, is that you have to get it in the vein. Miss the vein — inject it in the surrounding tissue — and Edgar stays in your body far longer; you might test positive. Thus, microdosing requires a steady hand and a good sense of feel, and a lot of practice; you have to sense the tip of the needle piercing the wall of the vein, and draw back the plunger to get a little bit of blood so you know you’re in. In this, as in other things, Lance was blessed: he had veins like water mains. Mine were small, which was a recurring headache.
Hamilton was eventually caught and was suspended from professional cycling. He became one of the first in his circle to implicate Lance Armstrong, testifying before federal investigators and appearing on “60 Minutes.” He says that he regrets his years of using performance-enhancing drugs. The lies and duplicity became an unbearable burden. His marriage fell apart. He sank into a depression. His book is supposed to serve as his apology. At that task, it fails. Try as he might — and sometimes he doesn’t seem to be trying very hard — Hamilton cannot explain why a sport that has no problem with the voluntary induction of anorexia as a performance-enhancing measure is so upset about athletes infusing themselves with their own blood.
“Dope is not really a magical boost as much as it is a way to control against declines,” Hamilton writes. Doping meant that cyclists finally could train as hard as they wanted. It was the means by which pudgy underdogs could compete with natural wonders. “People think doping is for lazy people who want to avoid hard work,” Hamilton writes. For many riders, the opposite was true:
EPO granted the ability to suffer more; to push yourself farther and harder than you’d ever imagined, in both training and racing. It rewarded precisely what I was good at: having a great work ethic, pushing myself to the limit and past it. I felt almost giddy: this was a new landscape. I began to see races differently. They weren’t rolls of the genetic dice, or who happened to be on form that day. They didn’t depend on who you were. They depended on what you did — how hard you worked, how attentive and professional you were in your preparation.
Shirley S. Wang reports on the science of trips and falls:
The body has three main systems that help us stay balanced. The visual system takes in information from the outside world and transmits it to the brain. The proprioceptive system, which incorporates sensory systems throughout the body, tells us how the body’s parts are oriented relative to each other. And the vestibular system, located in the inner ear, focuses primarily on how the head is moving. Generally, if at least two of these systems are impaired, people tend to have trouble with balance.
As people age, the vestibular system becomes less sensitive. Instead, individuals tend to rely more on their vision, which is relatively slow compared with the vestibular system. As a result, older people don’t process information as quickly to correct for missteps, Dr. Cullen says.
After a fall, older people often say they tripped or slipped. Researchers at Simon Fraser University, in Burnaby, British Columbia, wanted to observe what really happens. The team outfitted a long-term-care facility with video cameras and recorded residents going about their daily lives, capturing 227 falls by 130 individuals over about three years. Tripping caused just one in five of the incidents. The biggest cause, accounting for 41% of the total, was incorrect weight shifting, like leaning over too far, says Stephen Robinovitch, a professor in the biomedical physiology and kinesiology and engineering science departments. Other, less frequent causes included loss of support from an external object, like a walker, or bumping into something.
Scientists at American Cyanamid’s Lederle laboratory linked animal nutrition and antibiotics — by accident:
The Lederle team was investigating a long-time poultry nutrition mystery: when chickens rooted through bacteria-rich manure — their own or other animals’ — they laid more eggs and enjoyed lower mortality rates and less illness. The team turned its microscopes on those henhouse organisms and discovered that one of them produced a substance that resembled B12. Was B12 the mysterious factor that distinguished animal proteins from plant-based counterparts? Team members tested that proposition by feeding animals B12, in this case a batch created with residues from the manufacture of the antibiotic Aureomycin.
They assumed that the B12, like any vitamin, would enhance the animals’ health, but the results astonished them: The animals that ate that Aureomycin-based B12 sample grew fifty percent faster than those fed B12 manufactured from other residues. Initially team members believed that they had discovered yet another vitamin. Further tests revealed the startling truth: They’d inadvertently employed a batch of B12 that contained not just manufacturing residues but tiny amounts of Aureomycin, too.
The two-for-one, marveled a reporter for Science News Letter, “cast the antibiotic in a spectacular new role” for the “survival of the human race in a world of dwindling resources and expanding populations.” Farmers wasted no time abandoning expensive animal proteins in favor of both B12 and infinitesimal, inexpensive doses of antibiotics. Their livestock reached market weight more quickly, and farmers’ production costs dropped. Consumers enjoyed lower prices for pork and poultry.
David Brandt farms 1,200 acres in central Ohio, where he uses 14 different plant species as cover crops in the off-season, reducing his need for synthetic fertilizers and herbicides. And he never, ever tills his soil. He has been dubbed the Obi-Wan Kenobi of soil:
We start in Brandt’s field, where we encounter waist-high, deep-green corn plants basking in the afternoon heat. A mat of old leaves and stems covers the soil — remnants of the winter cover crops that have kept the field devoid of weeds. At Brandt’s urging, we scour the ground for what he calls “haystacks” — little clusters of dead, strawlike plant residue bunched up by earthworms. Sure enough, the stacks are everywhere. Brandt scoops one up, along with a fistful of black dirt. “Look there — and there,” he says, pointing into the dirt at pinkie-size wriggling earthworms. “And there go some babies,” he adds, indicating a few so tiny they could curl up on your fingernail.
Then he directs our gaze onto the ground where he just scooped the sample. He points out a pencil-size hole going deep into the soil — a kind of worm thruway that invites water to stream down. I don’t think I’m the only one gaping in awe, thinking of the thousands of miniature haystacks around me, each with its cadre of worms and its hole into the earth. I look around to find several NRCS people holding their own little clump of dirt, oohing and ahhing at the sight.
Then we cross the street to the neighbor’s field. Here, the corn plants look similar to Brandt’s, if a little more scraggly, but the soil couldn’t be more different. The ground, unmarked by haystacks and mostly bare of plant residue altogether, seems seized up into a moist, muddy crust, but the dirt just below the surface is almost dry. Brandt points to a pattern of ruts in the ground, cut by water that failed to absorb and gushed away. Brandt’s land managed to trap the previous night’s rain for whatever the summer brings. His neighbor’s field lost not just the precious water but also the chemical inputs it carried away.
He also adds wheat to the ubiquitous corn-soy rotation favored by his peers throughout the Corn Belt:
Bringing in a third crop disrupts weed and pest patterns, and a 2012 Iowa State University study found that by doing so, farmers can dramatically cut down on herbicide and other agrichemical use.
In her recent interview, HBD Chick doesn’t sound like a misanthrope:
You asked earlier if there was a single book or article that got me interested in human biodiversity, and I said that there really wasn’t, that it was more of a gradual thing; but that classic article of Steve’s — “Cousin Marriage Conundrum” — really set me off in one direction within HBD! It was that article, plus Stanley Kurtz and Parapundit’s writings on the issue, that really piqued my interest in cousin marriage (and mating patterns in general) and the effects that it can have on a society.
To sum up Steve’s article, he pointed out that, in societies with a lot of cousin marriage, like in Iraq and Afghanistan, the extended family is much more important to people than here in the West, so it’s difficult to establish and maintain things like liberal democracy and a low-corruption, low-nepotism society, since everybody is more focused on accruing benefits for their respective extended families than on what is best for the commonweal. Which got me to thinking: if those societies don’t manage democracy and are corrupt because they have cousin marriage, perhaps we in the West have democracy and aren’t so corrupt because we don’t practice cousin marriage. Which, to make a long story short, seems to be the case — at least I think I’ve accumulated an awful lot of circumstantial evidence that strongly indicates this to be the case.
Teenage drivers drive worse when listening to their own music:
Researchers at Ben-Gurion University in Israel recruited 85 drivers about 18 years old; just over half were male. The subjects were each assigned to drive six challenging road trips that were about 40 minutes long, accompanied by an experienced driving instructor. Music was played on four trips, two with selections from the drivers’ playlists, mostly fast-paced vocals, and two with background music, which was a blend of easy listening, soft rock and light jazz in instrumental and vocal arrangements designed to increase driver safety. No music was played on two trips. Subjects rated their mood after each trip and in-car data recorders analyzed driver behavior and errors.
All 85 subjects committed at least three errors in one or more of the six trips; 27 received a verbal warning and 17 required steering or braking by an instructor to prevent an accident. When the music was their own, 98% made errors; without the music, 92% made errors; and while listening to the safe-driving music, 77% made errors. Speeding, following too closely, inappropriate lane use, one-handed driving and weaving were the common violations.
The male subjects were more aggressive drivers and made more serious errors than female subjects. The teens played their own music at a very loud volume but significantly decreased the sound level when listening to the safe-driving music, researchers said. Mood ratings were highest on trips with driver-preferred music.
Jon Krakauer (Into the Wild) returns to the question of how Chris McCandless died. While the proximal cause was starvation, it would appear that McCandless starved in the wild because he was paralyzed by an obscure toxin found in wild potato (Hedysarum alpinum) seeds:
The one constant about ODAP poisoning, however, very simply put, is this: those who will be hit the hardest are always young men between the ages of 15 and 25 who are essentially starving or ingesting very limited calories, who have been engaged in heavy physical activity, and who suffer trace-element shortages from meager, unvaried diets.