When Confidence Trumps Competence

Wednesday, November 26th, 2014

Another study shows that people prefer confidence to accuracy when choosing an expert to trust:

Researchers at Washington State University did an exhaustive analysis of non-celebrity “pundits” who made predictions about the outcomes of sporting events. They rated each social media post that involved a prediction for its confidence level. For example, a prediction that one team would “crush” another is more confident than merely projecting a “win.” They checked predictions against actual game results to gauge accuracy, and also analyzed the number of followers each built over time.

The results were surprising. While accuracy of predictions did lead to a small but statistically significant increase in the number of followers, confidence was nearly three times as powerful.

The potent effects of confidence on trust aren’t new. As I described in Convince With Confidence, Carnegie Mellon researchers had subjects participate in a weight-guessing game in which they could purchase the assistance of “advisers.” Subjects tended to choose the more confident advisers, even when, after multiple rounds, those advisers had proved less accurate than others.

Why Asians wear surgical masks in public

Tuesday, November 25th, 2014

Surgeons wear masks to protect patients from their mouth-borne germs, not the other way around, but Asians wear surgical masks in public for a number of reasons, and they’ve been doing it for a century:

The custom of facemask-wearing began in Japan during the early years of the 20th century, when a massive pandemic of influenza killed between 20 and 40 million people around the world — more than died in World War I. There were outbreaks of the disease on every inhabited continent, including Asia (where it devastated India, leading to the deaths of a full 0.5% of the population). Covering the face with scarves, veils and masks became a prevalent (if ineffective) means of warding off the disease in many parts of the world, until the epidemic finally faded at the end of 1919.

In Japan, a few years later, the Great Kanto Earthquake of 1923 triggered a massive inferno that consumed nearly 600,000 homes in the most populous part of the nation. After the quake, the sky was filled with smoke and ash for weeks, and air quality suffered for months afterward. Facemasks came out of storage and became a typical accessory on the streets of Tokyo and Yokohama. A second global flu epidemic in 1934 cemented Japan’s love affair with the facemask, which began to be worn with regularity during the winter months — primarily, given Japan’s obsession with social courtesy, by cough-and-cold victims seeking to avoid transmitting their germs to others, rather than healthy people looking to prevent the onset of illness.

Then, in the 1950s, Japan’s rapid post-World War II industrialization led to rampant air pollution and booming growth of the pollen-rich Japanese cedar, which flourished due to rising ambient levels of carbon dioxide. Mask-wearing went from seasonal affectation to year-round habit. Today, Japanese consumers buy $230 million in surgical masks a year, and neighboring countries facing chronic pollution issues — most notably China and Korea — have also adopted the practice.

Traditional Chinese Medicine puts a premium on proper breathing and clean air, which may explain some of the masks’ popularity.

This is my boomstick!

Monday, November 24th, 2014

Early gunpowder wasn’t really gunpowder so much as thunderpowder:

Strange as it may seem, the Battle of Crecy, which showed the longbow at its best, was also the scene of an incident that sounded the death knell, not only of the bow, but of all merely mechanical means of missile propulsion. This battle saw the first recorded use of artillery in an engagement between major armies and heralded explosives as a means of missile propulsion. However, the justified praise of the longbow was so great at this time that were it not for the meticulous writings of a few historians of the day, it would have gone unnoticed that Edward III employed stampede cannon on his flanks. These devices represented artillery in its crudest form, and were mainly used, as the name implies, to scare the enemy’s horses and strike terror into the untrained foot soldier. Missile throwing ability was secondary. Earliest cannon design appears to have been that of an iron tube encased in wood to give it further support, and still keep it light. The explosive was a crude black powder to which generally was added various kinds of wax, the mixture being made into balls. The balls, when discharged, produced an effect somewhat like an oversized Roman candle. The cannon’s front end was supported by a metal fork and, to take care of recoil, the butt simply was placed against a convenient knoll. Firearm development stems from this modest beginning.


Bacon spoke of the simple deceits which are practiced by jugglers and ventriloquists, and commented that “popular opinion does anything that men wish it to do, so long as men are agreed about it.”

“In addition to these marvels, there are certain others which do not involve particular constructions. We can prepare from saltpeter and other materials an artificial fire which will burn at whatever distance we please… Beyond these are still other stupendous things in Nature. For the sound of thunder may be artificially produced in the air with greater resulting horror than if it had been produced by natural causes. A moderate amount of the proper material, of the size of a thumb, will make a horrible sound and violent coruscation.”


Although Bacon suggests several military uses for his explosive (for instance, “an enemy might be either blown up bodily or put to flight by the terror caused by the explosion”), there was nothing to be found in any of his writings to show he ever once contemplated its use as a missile-throwing agent. The identity of the individual who first thought of propelling a projectile through a tube from the force generated by gunpowder still remains a mystery.

Loud weapons work.

Clausewitz, Nonlinearity and the Unpredictability of War

Saturday, November 22nd, 2014

Classical mathematics concentrated on linear equations for a sound pragmatic reason, Ian Stewart noted: it couldn’t solve anything else. Modern chaos theorists like to emphasize this point.

James Clerk Maxwell noted another chaotic concept over a century ago:

When the state of things is such that an infinitely small variation of the present state will alter only by an infinitely small quantity the state at some future time, the condition of the system, whether at rest or in motion, is said to be stable; but when an infinitely small variation in the present state may bring about a finite difference in the state of the system in a finite time, the condition of the system is said to be unstable. It is manifest that the existence of unstable conditions renders impossible the prediction of future events, if our knowledge of the present state is only approximate, and not accurate…. it is a metaphysical doctrine that from the same antecedents follow the same consequents. No one can gainsay this. But it is not of much use in a world like this, in which the same antecedents never again concur, and nothing ever happens twice… The physical axiom which has a somewhat similar aspect is “That from like antecedents follow like consequents.” But here we have passed from sameness to likeness, from absolute accuracy to a more or less rough approximation.

In describing war, Clausewitz resorts to a striking metaphor of nonlinearity:

In the last section of Chapter 1, Book One, he claims that war is “a remarkable trinity” (eine wunderliche Dreifaltigkeit) composed of (a) the blind, natural force of violence, hatred, and enmity among the masses of people; (b) chance and probability, faced or generated by the commander and his army; and (c) war’s rational subordination to the policy of the government.(28) Clausewitz compares these three tendencies to three varying legal codes interacting with each other (the complexity of which would have been obvious to anyone who lived under the tangled web of superimposed legal systems in the German area before, during, and after the upheavals of the Napoleonic years). Then he concludes with a visual metaphor: “Our task therefore is to develop a theory that maintains a balance between these three tendencies, like an object suspended between three magnets.” (29) What better image could he have conjured to convey his insight into the profoundly interactive nature of war than this emblem of contemporary nonlinear science? (30)

Although the passage is usually taken to mean only that we should not overemphasize any one element in the trinity, Clausewitz’s metaphor also implicitly confronts us with the chaos inherent in a nonlinear system sensitive to initial conditions. The demonstration usually starts with a magnet pendulum hanging over one magnet; when the pendulum is pulled aside and let go, it comes to rest quickly. Positioned over two equally powerful magnets, the pendulum swings toward first one, then the other, and still settles into a rest position as it is captured by one of the points of attraction. But when a pendulum is released over three equidistant and equally powerful magnets, it moves irresolutely to and fro as it darts among the competing points of attraction, sometimes kicking out high to acquire added momentum that allows it to keep gyrating in a startlingly long and intricate pattern. Eventually, the energy dissipates under the influence of friction in the suspension mountings and the air, bringing the pendulum’s movement asymptotically to rest. The probability is vanishingly small that an attempt to repeat the process would produce exactly the same pattern. Even such a simple system is complex enough for the details of the trajectory of any actual “run” to be, effectively, irreproducible.

My claim here is not that Clausewitz somehow anticipated today’s “chaos theory,” but that he perceived and articulated the nature of war as an energy-consuming phenomenon involving competing and interactive factors, attention to which reveals a messy mix of order and unpredictability. His final metaphor of Chapter 1, Book One captures this understanding perfectly. The pendulum and magnets system is orderly, because it is a deterministic system that obeys Newton’s laws of motion; in the “pure theory” (with an idealized frictionless pendulum), we only need to know the relevant quantities accurately enough to know its future. But in the real world, “a world like this” in Maxwell’s phrase, it is not possible to measure the relevant initial conditions (such as position) accurately enough to replicate them in order to get the same pattern a second time, because all physical measurements are approximations limited by the instrument and standard of measurement. And what is needed is infinitely fine precision, for an immeasurably small change in the initial conditions can produce a significantly different pattern. Nor is it possible to isolate the system from all possible influences around it, and that environment will have changed since the measurements were taken. Anticipation of the overall kind of pattern is possible, but quantitative predictability of the actual trajectory is lost.

There are a number of interconnected reasons for the pendulum and magnets picture to be emblematic for Clausewitz, and all of them go to the heart of the problem of understanding what he meant by a “theory” of war. First of all, the image is not that of any kind of Euclidean triangle or triad, despite its understanding as such by many readers. Given his attacks on the formulation of rigidly “geometric” principles of war by some of his contemporaries, such an image would have been highly inapt. (31) Clausewitz’s message is not that there are three passive points, but three interactive points of attraction that are simultaneously pulling the object in different directions and forming complex interactions with each other. In fact, even the standard translation given above is too static, for the German original conveys a sense of on-going motion: “Die Aufgabe ist also, dass sich die Theorie zwischen diesen drei Tendenzen wie zwischen drei Anziehungspunkten schwebend erhalte.” (32) Literally: “The task is therefore that the theory would maintain itself floating among these three tendencies as among three points of attraction.” The connotations of schweben involve lighter-than-air, sensitive motion; a balloon or a ballerina “schwebt.” The image is no more static than that of wrestlers. The nature of war should not be conceived as a stationary point among the members of the trinity, but as a complex trajectory traced among them.

Secondly, Clausewitz’s employment of magnetism is a typical resort to “high-tech” imagery. The relationship of magnetism to electricity was just beginning to be clarified in a way that made it a cutting-edge concept for its time. It is quite possible that he actually observed a demonstration of a pendulum and three magnets as envisioned in the metaphor, for he was a man of considerable scientific literacy. (33) His famous incorporation of the notion of “friction,” also a high-technology concept for his day, is another example of this characteristic of his thought.

Thirdly, and perhaps most importantly, the metaphor offers us insight into a mind realistically willing to abandon the search for simplicity and analytical certainty where they are not obtainable. The use of this image displays an intuitive grasp of dynamic processes that can be isolated neither from their context nor from chance, and are thus characterized by inherent complexities and probabilities. It encodes Clausewitz’s sense of war in a realistic dynamical system, not an idealized analytical abstraction.
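The three-magnet demonstration is easy to sketch numerically. The toy model below is my own, with made-up parameters (the spring constant standing in for the pendulum’s restoring force, the damping, the magnet layout, and the height offset are all assumptions), but it exhibits exactly the behavior the passage describes: the damped bob eventually comes to rest on one of the three points of attraction, and which one it chooses can flip with a minuscule change in the release point:

```python
import math

# Three unit-strength magnets on a circle of radius 1, a weak spring pulling
# the bob back toward the center (a flat approximation of the pendulum's
# restoring force), linear damping, and a vertical offset HEIGHT between the
# bob and the magnet plane. All parameters are illustrative choices.
MAGNETS = [(math.cos(a), math.sin(a))
           for a in (math.pi / 2,
                     math.pi / 2 + 2 * math.pi / 3,
                     math.pi / 2 + 4 * math.pi / 3)]
SPRING, DAMPING, HEIGHT = 0.2, 0.2, 0.3

def settle(x, y, dt=0.005, t_max=200.0):
    """Integrate the damped bob until t_max; return its final position."""
    vx = vy = 0.0
    for _ in range(int(t_max / dt)):
        ax = -SPRING * x - DAMPING * vx
        ay = -SPRING * y - DAMPING * vy
        for mx, my in MAGNETS:
            dx, dy = mx - x, my - y
            f = (dx * dx + dy * dy + HEIGHT ** 2) ** -1.5
            ax += dx * f
            ay += dy * f
        vx += ax * dt          # semi-implicit Euler: update velocity first,
        vy += ay * dt
        x += vx * dt           # then position with the new velocity
        y += vy * dt
    return x, y

def nearest_magnet(x, y):
    """Index of the magnet closest to (x, y)."""
    return min(range(3), key=lambda i: (MAGNETS[i][0] - x) ** 2 +
                                       (MAGNETS[i][1] - y) ** 2)

# Two releases differing by one part in ten million can end on different
# magnets -- the effectively irreproducible trajectory the passage describes.
a = settle(0.30, 0.10)
b = settle(0.30, 0.10 + 1e-7)
print(nearest_magnet(*a), nearest_magnet(*b))
```

With damping, every run settles; without being able to fix the release point to infinite precision, no run can be guaranteed to repeat.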

Accidental Rewilding

Thursday, November 20th, 2014

Many primeval forests aren’t in fact primeval but the result of recent accidental rewilding:

In the Americas — North, Meso and South — the first Europeans to arrive in the 15th and 16th centuries reported dense settlement and large-scale farming. Some of them were simply not believed. Spaniards such as the explorer Francisco de Orellana and the missionary Brother Gaspar de Carvajal, who travelled the length of the Amazon river in 1542, claimed that they had seen walled cities in which many thousands of people lived, raised highways and extensive farming along its banks. When later expeditions visited the river, they found no trace of them, just dense forest to the water’s edge and small scattered bands of hunter-gatherers. Orellana and Carvajal’s reports were dismissed as the ravings of fantasists, seeking to boost commercial interest in the lands they had explored.

It was not until the late 20th century that investigations by archaeologists such as Anna Roosevelt at the University of Illinois at Chicago and Michael Heckenberger at the University of Florida suggested that Orellana and Carvajal’s accounts were probably accurate. In parts of the Americas previously believed to have been scarcely inhabited, Heckenberger and his colleagues found evidence of garden cities surrounded by major earthworks and wooden palisades, built on grids and transected by broad avenues. In some places they unearthed causeways, bridges and canals. The towns were connected to their satellite villages by road networks that were planned and extensive. These were advanced agricultural civilisations, maintaining fish farms as well as arable fields and orchards. As in Slovenia, what appeared to be primordial forest had grown over the traces of a vanished population.

It appears that European diseases such as smallpox, measles, diphtheria, and the common cold were brought to the Caribbean coast of South America by explorers and early colonists and then passed down indigenous trade routes into the heart of the continent, where they raged through densely peopled settlements before any Europeans reached them. So feracious is the vegetation of the Amazon that it would have obliterated all visible traces of the civilisations built by its people within a few years of their dissolution. The great várzea (floodplain) forests, whose monstrous trees inspired such wonder among 18th and 19th century expeditions, were probably not the primordial ecosystems the explorers imagined them to be.

Gruesome events — some accidental, others deliberately genocidal — wiped out the great majority of the hemisphere’s people and the rich and remarkable societies that they’d created. In many parts of the Americas, the only humans who remained were — like the survivors in a post-holocaust novel — hunter-gatherers. Some belonged to tribes that had long practised that art, others were forced to re-acquire lost skills as a result of civilisational collapse. Imported disease made cities lethal: only dispersed populations had a chance of avoiding epidemics. Dispersal into small bands of hunter-gatherers made economic complexity impossible. The forests blotted out memories of what had gone before. Humanity’s loss was nature’s gain.

The impacts of the American genocides might have been felt throughout the northern hemisphere. Dennis Bird and Richard Nevle, earth scientists at Stanford University, have speculated that the recovering forests drew so much carbon dioxide out of the atmosphere — about 10 parts per million — that they could have helped to trigger the cooling between the 16th and 19th centuries known as the Little Ice Age. The short summers and long cold winters, the ice fairs on the Thames and the deep cold depicted by Pieter Bruegel might have been caused partly as a result of the extermination of the Native Americans.

(I don’t recall ever seeing the word feracious before. As you might infer, it means fruitful or fertile.)

Farmed Bluefin

Wednesday, November 19th, 2014

The Japanese treasure the rich red meat of hon-maguro or true tuna:

At an auction in Tokyo, a single bluefin once sold for $1.5 million, or $3,000 a pound.

All this has put the wild Pacific bluefin tuna in a perilous state. Stocks today are less than one-fifth of their peak in the early 1960s, around the time Japanese industrial freezer ships began prowling the oceans, according to an estimate by an international governmental committee monitoring tuna fishing in the Pacific. The wild population is now estimated by that committee at 44,848 tons, or roughly nine million fish, down nearly 50% in the past decade.


Not long ago, full farming of tuna was considered impossible. Now the business is beginning to take off, as part of a broader revolution in aquaculture that is radically changing the world’s food supply.


With a decadeslong global consumption boom depleting natural fish populations of all kinds, demand is increasingly being met by farm-grown seafood. In 2012, farmed fish accounted for a record 42.2% of global output, compared with 13.4% in 1990 and 25.7% in 2000. A full 56% of global shrimp consumption now comes from farms, mostly in Southeast Asia and China. Oysters are started in hatcheries and then seeded in ocean beds. Atlantic salmon farming, which only started in earnest in the mid-1980s, now accounts for 99% of world-wide production — so much so that it has drawn criticism for polluting local water systems and spreading diseases to wild fish.

Until recently, the Pacific bluefin tuna defied this sort of domestication. The bluefin can weigh as much as 900 pounds and barrels through the seas at up to 30 miles an hour. Over a month, it may roam thousands of miles of the Pacific. The massive creature is also moody, easily disturbed by light, noise or subtle changes in the water temperature. It hurtles through the water in a straight line, making it prone to fatal collisions in captivity.


Tuesday, November 18th, 2014

Ordinary people have three kinds of cones in their eyes, attuned to red, green, and blue, but a few people, mostly women, are tetrachromats, with four kinds of cones:

For years, researchers weren’t sure tetrachromacy existed. If it did, they stipulated, it could only be found in people with two X chromosomes. This is because of the genes behind color vision. People who have regular color vision have three cones, tuned to the wavelengths of red, green, and blue. These are connected to the X chromosome — most men have only one, but most women have two. Mutations in the X chromosome cause a person to perceive more or less color, which is why men more commonly have congenital colorblindness than women (if their one X chromosome has a mutation). But the theory stood that if a person received two mutated X chromosomes, she could have four cones instead of the usual three.

Note the use of most in that paragraph:

The original story stated that all men have one X and one Y chromosome and that all women have two X chromosomes. This statement neglected to include those with Klinefelter Syndrome and transgender individuals. We regret the error.

(Hat tip to T. Greer.)

Why Are So Few Blockbuster Drugs Invented Today?

Monday, November 17th, 2014

Why are so few blockbuster drugs invented today?

On Sept. 25, 1990, James D. Watson, the Nobel Prize-winning co-discoverer of the double-helix structure of DNA, and at the time the director of the National Center for Human Genome Research, wrote a letter to this paper making a prediction: “The ability to sequence DNA quickly and cheaply will also provide the technological basis for a new era in drug development.”

At that moment, the idea that the human genome would lead to a multitude of cures for diseases seemed inevitable and irresistible. DNA is, after all, nature’s instruction booklet for building living things; open that book and read its instructions, the thinking ran, and the botched instructions that result in diseases would be revealed. From there, a logical series of steps would arrive at a cure. Once a malfunctioning gene was isolated, scientists would find the protein coded by that gene. Then they’d use that protein as a target. Finally, they’d run tests of tens of thousands of unique chemical entities that drug companies have stockpiled over the years, to find one that fit the target like a key in a lock, to correct its function.

But this golden road to pharmaceutical riches, known as target-based drug discovery, has often proved to be more of a garden path. The first disappointment has been that most diseases affecting large numbers of people are not caused by a handful of mutations that can be unearthed as easily as digging potatoes in a field. Geneticists have called this the problem of “missing heritability,” because despite what they promised in the 1990s, they have found no single genetic variants that are necessary and sufficient to cause most forms of widespread diseases like diabetes, heart disease, Alzheimer’s or cancer.

The second disappointment is that even when a genetic variation can be plainly linked to a disease, the process for figuring out what to do about it rarely works as efficiently as advertised. Compounds that appear to hit a designated target right between the eyes still often fail to be safe and effective in animal and human studies. Biology is just way too complicated.

“If you read them now, the claims made for genomics in the 1990s sound a bit like predictions made in the 1950s for flying cars and anti-gravity devices,” Jack Scannell, an industry analyst, told me. But rather than speeding drug development, genomics may have slowed it down. So far it has produced fewer returns on greater investments. Scannell and Brian Warrington, who worked for 40 years inventing drugs for pharmaceutical companies, published a grim paper in 2012 that showed the plummeting efficiency of the pharmaceutical industry. They found that for every billion dollars spent on research and development since 1950, the number of new drugs approved has fallen by half roughly every nine years, meaning a total decline by a factor of 80. They called this Eroom’s Law, because it resembled an inversion of Moore’s Law (the observation, first made by the Intel co-founder Gordon E. Moore in 1965, that the number of transistors in an integrated circuit doubles approximately every two years).
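The arithmetic behind that factor of 80 is easy to check. (The 57-year figure below is my own back-calculation from the quoted numbers, not something from the paper.)

```python
import math

# The quoted rule of thumb: new drugs approved per billion R&D dollars
# halve roughly every nine years.
HALVING_YEARS = 9

def decline_factor(years):
    """Total decline after `years` of steady halving."""
    return 2 ** (years / HALVING_YEARS)

# A factor-of-80 total decline is log2(80), or about 6.3, halvings --
# roughly 57 years of steady erosion, consistent with a study window
# opening in 1950.
years_for_80x = math.log2(80) * HALVING_YEARS
print(round(years_for_80x))           # about 57 years
print(decline_factor(years_for_80x))  # back to a factor of 80
```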


So far, most drug companies have continued to devote a vast majority of their funding to target-based research, even as more traditional methods of drug discovery have proved more productive. A study published last year by David Swinney found that only 17 of 50 novel drugs approved by the F.D.A. between 1999 and 2008 came from target-based research, compared with 28 from what Swinney calls “phenotypic” discovery, made by studying living cells in Petri dishes, animals and humans. Many of the drugs in this latter category — Alamast for allergies, Amitiza for constipation, Abreva for herpes cold sores, Ranexa for angina, Veregen for genital warts and Keppra, Excegran and Inovelon for seizures — were discovered by chemists who didn’t set out knowing what the drugs’ targets were, or even how they worked. In one now well known case of nontargeted research, scientists developed a drug for angina and found that while it wasn’t effective for relieving chest pain, it did cause erections in the study’s male volunteers. The researchers changed course, and Viagra was born.

Alternative Scientific History

Wednesday, November 12th, 2014

Science asked its readers what one piece of scientific knowledge from today they would share, if they could go back in time, and how might it change the course of history?

The responses show an almost laughable ignorance of the real world:

If I could travel back in time, I would transport to Syracuse, Sicily, in 222 B.C.E. to introduce the fundamental theorem of calculus to Archimedes 10 years before his death. As the great mathematical genius of his era, he would have been most poised to understand and disseminate the knowledge of linking the concept of a derivative of a function with the concept of the integral.… So much technology of today, from the internal combustion engine to the principles of economics, has been made possible due to calculus.

The internal combustion engine was made possible due to calculus?

As the future scientific envoy, I have an audience with Emperor Qin and present my gift: Women are capable of doing the same thing as men; they even can do better. Certainly, with adequate data, glorious accomplishment stories, and plenty of examples, such as Madame Curie, Mrs. Thatcher, Deng Yaping, and Oprah Winfrey, I can convince Emperor Qin to give women more chances to receive education and give full play to their talent in science and technology, culture, politics, and the military. In that way, more than 2000 years later, China would surely be a super power stronger than today.…


I would go back to ancient Rome on the morning of 15 March, 44 B.C.E., to the steps of the Roman Senate, and share Bayes’ Theorem with Julius Caesar. In the days leading up to his assassination, Rome was awash with rumors of an assassination plot. According to legend, an old soothsayer had forewarned Caesar himself of a great danger that threatened him on the Ides of March, and Caesar’s own wife Calpurnia had a premonition of her husband’s murder and tried to warn him of the danger. But were these dark forebodings and dire prophecies just idle gossip (noise) or a credible forecast of the future (signal)? Given this uncertainty, I would advise Caesar to guess the prior probability of an assassination plot and then update his prior based on the sundry rumors swirling around Rome. Had Caesar applied Bayesian reasoning, it is likely he would have followed his wife’s advice and stayed home on that fateful day. Had he done so, Bayes’ Rule might have changed the course of history, for the Roman Republic might have yet been saved, and perhaps we would all still be speaking Latin.
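For what it’s worth, the Bayesian update the reader has in mind is a one-liner. The numbers here are mine, purely for illustration: a 20% prior that a plot exists, and rumors assumed three times likelier in a world with a real plot than in one without.

```python
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Posterior P(hypothesis | evidence) via Bayes' rule."""
    numer = prior * p_e_given_h
    return numer / (numer + (1 - prior) * p_e_given_not_h)

# Illustrative numbers only: start from a 20% prior that a plot exists.
# Each independent rumor is assumed three times likelier if a plot is
# real (0.9) than if it is not (0.3).
p = 0.2
for rumor in range(3):
    p = bayes_update(p, 0.9, 0.3)
    print(f"after rumor {rumor + 1}: P(plot) = {p:.2f}")
# Three credible rumors lift a 20% hunch to roughly an 87% warning --
# stay home on the Ides.
```

Whether Caesar would have assigned his wife’s premonition a likelihood ratio of three is, of course, the whole problem.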


In 1687, Sir Isaac Newton published his Principia outlining the fundamentals of what quickly became called Newtonian mechanics. I would travel back to Cambridge, England, 5 years before this date and teach Einstein’s theory of relativity to Isaac Newton. The obvious change in history resulting from this action would of course be a massive head start for the field of modern physics.… However, I would argue that a less obvious but possibly more important consequence of this historical change would be its effect on how we teach science. Currently, high school students and first year undergraduates are taught the limited version of physics discovered by Newton. Only students who choose to continue in the discipline learn Einstein’s more generalized form of mechanics and how classical mechanics is encompassed in this modern understanding. If Newton had discovered both his and Einstein’s contributions at the same time, the result would be an educational system that introduces a more complete view of physics to a wider audience of people from an earlier age.…

You see, more people would move beyond simple Newtonian physics, if only Newton had understood relativity! Clearly!

Instead of a piece of technical knowledge, I would share something that would provide perspective: the photo of Earth taken by the Apollo 17 astronauts in 1972. “The Blue Marble,” as it is often called, shows both the unity and finitude of the planet and its resources. The photo is emblematic of the modern environmental movement’s birth in the 1970s. I would bring this photo to early 19th-century Britain, during the Industrial Revolution, when consumption of Earth’s resources began to increase dramatically. Providing this information 150 years earlier would be an opportunity for the soon-to-be industrialized culture of western Europe to reconsider its relationship with the planet.

I’m sure early 19th-century Britain would react to the photo in the exact same way as late 20th-century America.

“Hello, Professors [Svante Arrhenius and Arvid Högbom], I travel back in time from 2013 to tell you that…since the early 20th century, Earth’s mean surface temperature has increased by about 0.8°C. The primary cause is greenhouse gases produced by human activities. The Intergovernmental Panel on Climate Change indicated that during the 21st century the global surface temperature is likely to rise another 1.1°C to 2.9°C, even for their lowest emissions scenario. Global warming isn’t just about things getting hotter; other changes include stormier, drier, and even colder conditions.” The next day, they wrote to the government and the scientific associations to call people’s attention to global warming and adaptations to eliminate it. Actions like reducing fossil fuel use, planting trees, and conserving water were known by people all over the world. Instead of destroying the planet, every single man on Earth began to protect and sustain it in their daily life.

I can imagine the swift Swedish reaction to the threat of mildly warmer temperatures.

I would choose Thomas Edison in the beginning of the year 1900 in New York City. I would describe the events of the future and how he and I could help keep our environment cleaner. I would give him designs to solar panels and hope that the future of solar technology would make America and other countries independent of oil production. Thomas Edison’s name alone could create Edison Panels that would be on every Victorian home in the world, especially in hard-to-reach locations…. Fewer trees would be cut, and the world would remain more rural and yet prosper from a new power source.

Edison supposedly did recommend harnessing the sun’s energy, for what it’s worth. Solar-thermal energy might be practical early in the Industrial Revolution, but photovoltaic would be a long, long way off.

I would be inclined to bring the Romans knowledge of movable type and paper, or maybe glasswork and lenses. I’m not sure that you can do much with an understanding of the heliocentric solar system, the periodic table, evolution, etc.

Sleep Extension

Wednesday, November 12th, 2014

Sleep deprivation has substantial effects on mood, mental and cognitive skills, and motor abilities, and this certainly applies to athletes:

It seems like certain kinds of athletic tasks are more affected by sleep deprivation. Although one-off efforts and high-intensity exercise see an impact, sustained efforts and aerobic work seem to suffer an even larger setback. Gross motor skills are relatively unaffected, while athletes in events requiring fast reaction times have a particularly hard time when they get less sleep.

Until recently, no one had studied the opposite of sleep deprivation, sleep extension:

The Cardinal men’s basketball team volunteered to be Mah’s study cohort. Eleven players used motion-sensing wristbands to determine how long they slept on average—just over 6.5 hours a night. For two weeks, the team kept to their normal schedules, while Mah’s researchers measured their performances on sprint drills, free throws, and three-point shooting. Then, the players were told to try and sleep as much as they could for five to seven weeks, with a goal of 10 hours in bed each night. Their actual time asleep, as measured by the sensors attached to their wrists, went from an average of 6.5 hours to nearly 8.5 hours.

The results were startling. By the end of the extra-sleep period, players had improved their free throw shooting by 11.4 percent and their three-point shooting by 13.7 percent. There was an improvement of 0.7 seconds on the 282-foot sprint drill—every single player on the team was quicker than before the study had started.

A 13-percent performance enhancement is the sort of gain that one associates with drugs or years of training—not simply making sure to get tons of sleep.

Professional athletes have to travel, and they often have to travel across time zones. Over the season, they appear to get fatigued and make certain kinds of errors more often:

Researchers at Vanderbilt University examined the plate discipline of hitters in baseball over the course of the season, and found that hitters swing at more pitches outside the strike zone late in the season than they do earlier in the season. Why? Dr. Scott Kutscher, the leader of the research team, said in a press release, “We theorize that this decline is tied to fatigue that develops over the course of the season due to a combination of frequency of travel and paucity of days off.”

Kutscher’s team has found that this decay in plate discipline has become more pronounced in baseball since 2006—the year that Major League Baseball banned stimulants. (For years, bowls of amphetamines, known as “greenies,” were a fixture in baseball clubhouses.)

Whoa, whoa, whoa. The “greenies” popular in the 1960s were just banned eight years ago? Wow.

Pesticide Exposure and Depression

Tuesday, November 11th, 2014

A recent study has linked pesticide exposure and depression in male private pesticide applicators — or farmers:

There’s a significant correlation between pesticide use and depression, that much is very clear, but not all pesticides. The two types that Kamel says reliably moved the needle on depression are organochlorine insecticides and fumigants, which increase the farmer’s risk of depression by a whopping 90% and 80%, respectively. The study lays out the seven specific pesticides, falling generally into one of those two categories, that demonstrated a categorically reliable correlation to increased risk of depression.

These types aren’t necessarily uncommon, either; one, called malathion, was used by 67% of the tens of thousands of farmers surveyed. Malathion is banned in Europe, for what that’s worth.

I asked whether farmers were likely to simply have higher levels of depression than the norm, given the difficulties of the job — long hours, low wages, a lack of power due to government interference, that kind of thing — and, according to Kamel, that wasn’t a problem at all. “We didn’t have to deal with overreporting [of depression] because we weren’t seeing that,” she says. In fact, only 8% of farmers surveyed sought treatment for depression, lower than the norm, which is somewhere around 10% in this country.

Lagoon and Spray

Sunday, November 9th, 2014

Until recently, hogs roamed in outdoor pens or fields, where their droppings fertilized crops, but now hog-farming has gone big, and not everything scales well:

Most of the farms that survived did so by going big—raising thousands of animals that spend their entire lives inside barns. Today, Duplin County, North Carolina, the top swine producer in the country, is home to 530 hog operations with a collective capacity of 2.35 million animals. According to a 2008 GAO estimate, hogs in five eastern North Carolina counties produced 15.5 million tons of manure in one year.

To handle all that waste, farmers in North Carolina use a standard practice called the lagoon and spray field system. They flush feces and urine from barns into open-air pits called lagoons, which turn the color of Pepto-Bismol when pink-colored bacteria colonize the waste. To keep the lagoons from overflowing, farmers spray liquid manure on their fields nearby.

The result, says Steve Wing, an epidemiologist at the University of North Carolina at Chapel Hill, is this: “The eastern part of North Carolina is covered with shit.”

Drugs that Extend Lifespan

Wednesday, November 5th, 2014

A recent study set out to screen for FDA-approved drugs that might extend lifespan:

The screening consisted of an assay based on neuronal cells in a medium with 15 mM (millimolar) glucose. This amount of glucose is about three times the normal human blood glucose level, though a level easily achieved by out-of-control diabetics, and is toxic to neural tissue. The assay set out to find which drugs promoted survival in that level of glucose, and came up with 30 of them.
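The “about three times normal” figure checks out if we take normal fasting blood glucose as roughly 5 mM. A quick sketch of the unit conversion to the mg/dL figures used clinically (assuming glucose’s molar mass of about 180 g/mol):

```python
# Convert the assay's 15 mM glucose to mg/dL and compare against a
# typical fasting level to check the "three times normal" claim.
GLUCOSE_MW = 180.16  # molar mass of glucose in g/mol (equivalently mg/mmol)

def mm_to_mg_dl(mm):
    """Convert a glucose concentration from mmol/L to mg/dL."""
    return mm * GLUCOSE_MW / 10  # mg/L divided by 10 gives mg/dL

assay = mm_to_mg_dl(15)   # concentration used in the assay
normal = mm_to_mg_dl(5)   # typical fasting blood glucose
print(f"assay: {assay:.0f} mg/dL, normal: {normal:.0f} mg/dL, "
      f"ratio: {assay / normal:.1f}x")
```

That works out to roughly 270 mg/dL versus 90 mg/dL, which is indeed hyperglycemic territory for a diabetic.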

Then the researchers tested each of the 30 cell-survival promoting drugs on the roundworm C. elegans, the animal of choice in many anti-aging studies. (The animal is tiny and has a short lifespan, making it ideal for this sort of thing: cheap, easily manipulated, fast results.) Six compounds were found that extended lifespan: caffeine, ciclopirox olamine, tannic acid, acetaminophen, bacitracin, and baicalein.

Acetaminophen overdoses kill hundreds and hospitalize thousands each year, but lower doses may protect against glucose toxicity.

Stimulation Seeking and Intelligence

Tuesday, November 4th, 2014

Preschoolers who seek stimulation — who physically explore their environment and engage in verbal and nonverbal stimulation with other children and adults — end up more intelligent:

The prediction that high stimulation seeking 3-year-olds would have higher IQs by 11 years old was tested in 1,795 children on whom behavioral measures of stimulation seeking were taken at 3 years, together with cognitive ability at 11 years. High 3-year-old stimulation seekers scored 12 points higher on total IQ at age 11 compared with low stimulation seekers and also had superior scholastic and reading ability. Results replicated across independent samples and were found for all gender and ethnic groups. Effect sizes for the relationship between age 3 stimulation seeking and age 11 IQ ranged from 0.52 to 0.87. Findings appear to be the first to show a prospective link between stimulation seeking and intelligence. It is hypothesized that young stimulation seekers create for themselves an enriched environment that stimulates cognitive development.
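The 12-point IQ gap squares with the reported effect sizes if we assume the conventional IQ standard deviation of 15: dividing the raw gap by the scale’s SD gives a standardized effect (Cohen’s d) that falls inside the 0.52–0.87 range. A minimal check:

```python
# Express the reported 12-point IQ gap as a standardized effect size
# (Cohen's d), assuming the conventional IQ standard deviation of 15.
IQ_SD = 15

def cohens_d(mean_difference, sd=IQ_SD):
    """Standardized mean difference: the raw gap divided by the scale's SD."""
    return mean_difference / sd

d = cohens_d(12)
print(f"d = {d:.2f}")  # 0.80, within the reported 0.52-0.87 range
```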

This salient bit went unmentioned in the abstract:

The larger population from which the participants were drawn consisted of 1,795 children from the island of Mauritius (a country lying in the Indian Ocean between Africa and India).

(Hat tip to Richard Harper.)

Liberals deny science, too

Monday, November 3rd, 2014

Liberals deny science, too, Chris Mooney reports:

The new study, by University of Texas-Brownsville sociologist Mark Horowitz and two colleagues, surveyed 155 academic sociologists. 56.7 percent of the sample was liberal, another 28.6 percent identified as radical, and only 4.8 percent were conservative. Horowitz, who describes himself as a politically radical, social-justice oriented researcher, said he wanted to probe their views of the possible evolutionary underpinnings of various human behaviors. “I wanted to get at the really ideological blank slate view, it’s sort of a preemptive assumption that everything is taught, everything is learned,” he explained.

Sure enough, the study found that these liberal academics showed a pretty high level of resistance to evolutionary explanations for phenomena ranging from sexual jealousy to male promiscuity.

In fairness, the sociologists were willing to credit some evolutionary-style explanations. Eighty-one percent found it either plausible or highly plausible that “some people are born genetically with more intellectual potential than others,” and 70 percent ascribed sexual orientation to “biological roots.” Meanwhile, nearly 60 percent of sociologists in the sample considered it “plausible” that human beings have a “hardwired” taste preference for foods that are full of fat and sugar, and just under 50 percent thought it plausible that we have an innate fear of snakes and spiders (for very sound, survival-focused reasons).

Yet the study also found that these scholars were less willing to consider evolutionary explanations for other aspects of human behavior, especially those relating to male-female differences. Less than 50 percent considered it plausible that “feelings of sexual jealousy have a significant evolutionary biological component,” for instance, and just 36.4 percent considered it plausible that men “have a greater tendency towards promiscuity than women due to an evolved reproductive strategy.” While it is hard to be absolutely definitive on either of these issues (we weren’t there to observe evolution happen), evolutionary psychologists have certainly argued in published studies that people exhibit jealousy in sexual relationships in order to ensure reproductive fidelity and preserve the resources that come from a partner, and that men are more promiscuous because they are not constrained in how often they can attempt to reproduce.