Kids love dinosaurs

Wednesday, December 13th, 2017

As a near-universal rule, kids love dinosaurs. Or, as psychologists might say, many children develop an intense interest in dinosaurs:

Researchers don’t know exactly what sparks them — the majority of parents can’t pinpoint the moment or event that kicked off their kids’ interest — but almost a third of all children have one at some point, typically between the ages of 2 and 6 (though for some the interest lasts further into childhood). And while studies have shown that the most common intense interest is vehicles — planes, trains, and cars — the next most popular, by a wide margin, is dinosaurs.

[...]

“I hear it over and over” from parents, he says: “‘They know all the names! I don’t know how they remember that stuff.’” But Lacovara does, or at least he has some theories. “I think for many of these children, that’s their first taste of mastery, of being an expert in something and having command of something their parent or coach or doctor doesn’t know,” he says. “It makes them feel powerful. Their parent may be able to name three or four dinosaurs and the kid can name 20, and the kid seems like a real authority.”

Intense interests are a big confidence booster for kids, agrees Kelli Chen, a pediatric psychiatric occupational therapist at Johns Hopkins.

They’re also particularly beneficial for cognitive development. A 2008 study found that sustained intense interests, particularly in a conceptual domain like dinosaurs, can help children develop increased knowledge and persistence, a better attention span, and deeper information-processing skills. In short, they make better learners and smarter kids. There’s decades of research to back that up: Three separate studies have found that older children with intense interests tend to be of above-average intelligence.

[...]

And it’s probably not a coincidence that the age range for developing intense interests overlaps with the peak ages of imagination-based play (which is from age 3 through age 5).

[...]

In a study published in 2007, researchers who followed up with the parents of 177 kids found that the interests only lasted between six months and three years.

There are a number of reasons kids stop wanting to learn anything and everything about a particular topic, and one of the biggest is, ironically, school. As they enter a traditional educational environment, they’re expected to hit a range of targets in various subjects, which doesn’t leave much room for a specialization.

[...]

“Maybe at home the interest was being reinforced, and the positive feedback loop was, ‘Johnny knows that’s a pterodactyl, Johnny’s a genius!’ When you’re getting praise over and over again for having information about a subject, you’re on a runaway train to Dinosaurland,” Chatel says. “But then school begins and the positive feedback loops shift to, ‘Johnny played so well with others, Johnny shared his toys and made a friend.’”

It starts much, much too early for me

Saturday, December 9th, 2017

Studies have shown the benefits of later school starts, but what about really late school starts?

Here we report on the implementation and impact of a 10 a.m. school start time for 13-16-year-old students. A four-year observational study using a before-after-before (A-B-A) design was carried out in an English state-funded high school. School start times were changed from 8:50 a.m. in study year 0, to 10 a.m. in years 1-2, and then back to 8:50 a.m. in year 3. Measures of student health (absence due to illness) and academic performance (national examination results) were used for all students. Implementing a 10 a.m. start saw a decrease in student illness after two years of over 50% (p < .0005, effect size Cohen’s d = 1.07), and reverting to an 8:50 a.m. start reversed this improvement, leading to an increase of 30% in student illness (p < .0005, Cohen’s d = 0.47). The 10 a.m. start was associated with a 12% increase in the value-added number of students making good academic progress (in standard national examinations) that was significant (p < .0005) and equivalent to 20% of the national benchmark.
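For readers who don’t work with effect sizes day to day: Cohen’s d, cited twice in the abstract above, is simply the difference between two group means expressed in units of their pooled standard deviation (this gloss is mine, not the study’s):

```latex
d = \frac{\bar{x}_1 - \bar{x}_2}{s_{\text{pooled}}},
\qquad
s_{\text{pooled}} = \sqrt{\frac{(n_1 - 1)\,s_1^2 + (n_2 - 1)\,s_2^2}{n_1 + n_2 - 2}}
```

By Cohen’s usual rule of thumb, 0.5 is a medium effect and 0.8 a large one, so a d of 1.07 for the drop in illness is a very large effect by behavioral-science standards.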

My teenage self would be nodding in agreement — as would Brian Setzer:

Hey, man, I don’t feel like goin’ to school no more / Me neither. They can’t make you go. No you daddyo yeah! / I ain’t goin’ to school it starts too early for me / Well listen man I ain’t goin’ to school no more it starts much, much too early for me / I don’t care about readin’, writin’, ‘rithmetic or history

An Amish mutation leads to a long life

Monday, November 27th, 2017

Amish who carried the null SERPINE1 mutation lived about 10 years longer:

That’s a huge increase in lifespan, arguably much greater than almost any single factor we know in humans.

The carriers of the gene mutation produce less PAI-1, which results in a greater tendency for blood clots to break down. Those who are homozygous (-/-) for the mutation have an even greater tendency to break down blood clots, which results in a bleeding disorder. That’s the immediate consequence of less PAI-1.

However, the heterozygous (+/-) carriers had longer telomeres, which is a sign of slower aging. They also had less diabetes risk, a 0% diabetes rate compared to 7% in non-carriers, even though body mass index was the same. And they had better cardiovascular risk markers, including lower blood pressure and lower carotid artery thickness, a measure of atherosclerosis.

Clearly, PAI-1 does a lot to promote aging, and having less of it appears to result in longer life.

Some aspects of prodigy and autism do overlap

Friday, November 24th, 2017

Prodigies aren’t typically autistic, but some aspects of prodigy and autism do overlap:

Prodigies, like many autistic people, have a nearly insatiable passion for their area of interest. Lauren Voiers, an art prodigy from the Cleveland area, painted well into the night as a teenager; sometimes she didn’t sleep at all before school began. That sounds a lot like the “highly restricted, fixated interests” that are part of autism’s diagnostic criteria.

Prodigies also have exceptional working memories. In a 2012 study led by one of us, Dr. Ruthsatz, all eight of the prodigies examined scored in the 99th percentile in this area. As the child physicist Jacob Barnett once put it during an interview on “60 Minutes,” “Every number or math problem I ever hear, I have permanently remembered.” Extreme memory has long been linked to autism as well. Dr. Leo Kanner, one of the scientists credited with identifying autism in the 1940s, noted that the autistic children he saw could recite “an inordinate number of nursery rhymes, prayers, lists of animals, the roster of presidents, the alphabet forward and backward.” A study on talent and autism published in 2015 in The Journal of Autism and Developmental Disorders found that over half of the more than 200 autistic subjects had unusually good memories.

Finally, both prodigies and autistic people have excellent eyes for detail. Simon Baron-Cohen, an autism researcher, and his colleagues have described an excellent eye for detail as “a universal feature of the autistic brain.” It’s one of the categories on the Autism-Spectrum Quotient, a self-administered test Dr. Baron-Cohen helped develop that measures autistic traits. The prodigies in Dr. Ruthsatz’s 2012 study got high marks in this trait on the test. One of the subjects, Jonathan Russell, a 20-year-old music prodigy who lives in New York, described how startled he gets when the chimes on the subway are slightly off key.

Beyond the cognitive similarities, many child prodigies have autistic relatives. In the 2012 study, half of the prodigies had an autistic relative at least as close as a niece or grandparent. Three had received a diagnosis of autism themselves when young, which they seemed to have since grown out of.

There might even be evidence of a genetic link between the conditions. In a 2015 study published in Human Heredity, Dr. Ruthsatz and her colleagues examined the DNA of prodigies and their families. They found that the prodigies and their autistic relatives both seemed to have a genetic mutation or mutations on the short arm of Chromosome 1 that were not shared by their neurotypical relatives. Despite a small sample size (the finding rested on five extended prodigy families), the data was statistically significant.

Engineers at Caltech have created a stable ring of plasma in open air

Saturday, November 18th, 2017

Engineers at Caltech have created a stable ring of plasma in open air using just a stream of water and a crystal plate:

“We were told by some colleagues this wasn’t even possible. But we can create a stable ring and maintain it for as long as we want, no vacuum or magnetic field or anything,” says co-author Francisco Pereira of the Marine Technology Research Institute in Italy, a visiting scholar at Caltech.

The stream of water is an 85-micron-diameter jet blasting from a specially designed nozzle at 9,000 pounds per square inch that strikes the crystal plate with an impact velocity of around 1,000 feet per second. For reference, that’s a stream narrower than a human hair moving about as fast as a bullet fired from a handgun.

Stable Plasma Torus at Caltech

In their study, Gharib and his team experimented with crystal plates of both quartz and lithium niobate, each of which can induce the triboelectric effect—in which an electric charge builds up because of friction with another material. When the jet hits the crystal, the water creates a smooth, laminar flow of positively charged ions across the negatively charged surface. At the shear region, where the stream strikes the surface and flows outward across it, the triboelectric effect triggers a high flow of electrons through the water to its surface. This flow of electrons ionizes the atoms and molecules in the surrounding gas near the surface of the water, creating a donut, or torus, of glowing plasma that is dozens of microns in diameter and visible under a microscope.

Gharib and his team fired the water jet at surfaces of different textures and found that the smoother the surface, the clearer the structure of the plasma ring. The ring is stable, and as long as the water continues to flow, the ring maintains its shape and size.

In addition, engineers working with the plasma noticed that their cell phones encountered high levels of radio frequency noise—static—while they were in the same room as the experiment. It turns out that the plasma ring emits distinct radio frequencies. “That’s never been seen before. We think it’s because of the piezo properties of the materials that we used in our experiments,” Pereira says, referring to the materials’ ability to be electrically polarized through mechanical stress—in this case, the flowing of water.
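As a rough sanity check on the jet figures quoted earlier (this back-of-envelope estimate is mine, not from the Caltech paper), the 9,000 psi driving pressure and the roughly 1,000 ft/s impact velocity are consistent with the simple Bernoulli relation for an ideal water jet, v ≈ sqrt(2ΔP/ρ):

```python
import math

# Back-of-envelope check; assumes an ideal, lossless Bernoulli jet.
pressure_psi = 9_000                    # driving pressure quoted above
pressure_pa = pressure_psi * 6_894.76   # convert psi to pascals
rho_water = 1_000.0                     # density of water, kg/m^3

v = math.sqrt(2 * pressure_pa / rho_water)  # ideal jet velocity, m/s
print(f"{v:.0f} m/s is about {v / 0.3048:.0f} ft/s")
# ~352 m/s, or roughly 1,150 ft/s -- the same order as the ~1,000 ft/s
# impact velocity quoted, with real-world losses explaining the gap.
```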

Consciousness began when the gods stopped speaking

Thursday, November 16th, 2017

Julian Jaynes presented his (in)famous theory of consciousness in his 1976 book, The Origin of Consciousness in the Breakdown of the Bicameral Mind:

The book sets its sights high from the very first words.  “O, what a world of unseen visions and heard silences, this insubstantial country of the mind!” Jaynes begins. “A secret theater of speechless monologue and prevenient counsel, an invisible mansion of all moods, musings, and mysteries, an infinite resort of disappointments and discoveries.”

To explore the origins of this inner country, Jaynes first presents a masterful precis of what consciousness is not. It is not an innate property of matter. It is not merely the process of learning. It is not, strangely enough, required for a number of rather complex processes. Conscious focus is required to learn to put together puzzles or execute a tennis serve or even play the piano. But after a skill is mastered, it recedes below the horizon into the fuzzy world of the unconscious. Thinking about it makes it harder to do. As Jaynes saw it, a great deal of what is happening to you right now does not seem to be part of your consciousness until your attention is drawn to it. Could you feel the chair pressing against your back a moment ago? Or do you only feel it now, now that you have asked yourself that question?

Consciousness, Jaynes tells readers, in a passage that can be seen as a challenge to future students of philosophy and cognitive science, “is a much smaller part of our mental life than we are conscious of, because we cannot be conscious of what we are not conscious of.” His illustration of his point is quite wonderful. “It is like asking a flashlight in a dark room to search around for something that does not have any light shining upon it. The flashlight, since there is light in whatever direction it turns, would have to conclude that there is light everywhere. And so consciousness can seem to pervade all mentality when actually it does not.”

Perhaps most striking to Jaynes, though, is that knowledge and even creative epiphanies appear to us without our control. You can tell which water glass is the heavier of a pair without any conscious thought — you just know, once you pick them up. And in the case of problem-solving, creative or otherwise, we give our minds the information we need to work through, but we are helpless to force an answer. Instead it comes to us later, in the shower or on a walk. Jaynes told a neighbor that his theory finally gelled while he was watching ice moving on the St. John River. Something that we are not aware of does the work.

The picture Jaynes paints is that consciousness is only a very thin rime of ice atop a sea of habit, instinct, or some other process that is capable of taking care of much more than we tend to give it credit for. “If our reasonings have been correct,” he writes, “it is perfectly possible that there could have existed a race of men who spoke, judged, reasoned, solved problems, indeed did most of the things that we do, but were not conscious at all.”

Jaynes believes that language needed to exist before what he has defined as consciousness was possible. So he decides to read early texts, including The Iliad and The Odyssey, to look for signs of people who aren’t capable of introspection — people who are all sea, no rime. And he believes he sees that in The Iliad. He writes that the characters in The Iliad do not look inward, and they take no independent initiative. They only do what is suggested by the gods. When something needs to happen, a god appears and speaks. Without these voices, the heroes would stand frozen on the beaches of Troy, like puppets.

Speech was already known to be localized in the left hemisphere, instead of spread out over both hemispheres. Jaynes suggests that the right hemisphere’s lack of language capacity is because it used to be used for something else — specifically, it was the source of admonitory messages funneled to the speech centers on the left side of the brain. These manifested themselves as hallucinations that helped guide humans through situations that required complex responses — decisions of statecraft, for instance, or whether to go on a risky journey.

The combination of instinct and voices — that is, the bicameral mind — would have allowed humans to manage for quite some time, as long as their societies were rigidly hierarchical, Jaynes writes. But about 3,000 years ago, stress from overpopulation, natural disasters, and wars overwhelmed the voices’ rather limited capabilities. At that point, in the breakdown of the bicameral mind, bits and pieces of the conscious mind would have come to awareness, as the voices mostly died away. That led to a more flexible, though more existentially daunting, way of coping with the decisions of everyday life — one better suited to the chaos that ensued when the gods went silent. By The Odyssey, the characters are capable of something like interior thought, he says. The modern mind, with its internal narrative and longing for direction from a higher power, appears.

We failed in the direction of truth

Sunday, November 12th, 2017

Razib Khan is excited to read Steven Pinker’s Enlightenment Now, because he’s looking for a little hope:

At this point, I am very pessimistic as to the prospects for the Enlightenment project.

This is pretty obvious to anyone who reads me closely. I’ve been writing and discussing with people on the internet, and in private, for many years now, and have come to the conclusion most people are decent, but they’re also craven and intellectually unserious outside of their domain specificity when they are intellectual. Many of our institutions are quite corrupt, and those which are supposedly the torchbearers of the Enlightenment, such as science, are filled with people who are also blind to their own biases or dominated by those who will plainly lie to advance their professional prospects or retain esteem from colleagues.

[...]

In psychology, much of the replication crisis was simply due to personal self-interest (more publications). But some of it was obviously political (see stereotype threat). Similarly, look at the fiasco in nutrition science. Some of it was personal, but there were also political demands from on high that there be something done. So “scholars” set some guidelines that people followed for decades, even if later they were shown to be totally ineffective. I’m not even going to get into the travesty that is modern biomedical science, with professional advancement and institutional interests combined in a deadly cocktail.

Also, I enjoy science popularizing (or did, I don’t read science books much anymore) as much as the next person, but isn’t it interesting how much of modern science confirms the mainstream elite cultural norms of ~2020? Curiously, if you read science popularizations in newspapers in 1920 they would also confirm the elite cultural norms of 1920…. But this time we’re right!

Other institutions aren’t doing better. The media is going through economic collapse, and journalists and their paymasters are reacting by pandering to their audiences. Instead of illuminating, they’re confirming. That’s what the audience wants, and I’m sure it’s more satisfying to journalists anyway. But can you blame them with the economics that are before us?

[...]

People have always been biased and subject to motivated reasoning. We’ve had our disputes whatever our ideology, whether it be conservative, moderate, or liberal. But the Enlightenment perspective of critical rationalism, which took philosophical realism seriously, meant that ultimately people who disagreed often assumed that fundamentally they were trying to converge on the same facts, the same reality. Reality existed, and you couldn’t just wish it away. Discussion might forward two individuals to a convergence!

We’re not there anymore. Whether it be Bush-era contempt for “Reality-Based Community”, or the rising crest of “Critical Theory”, the acid of subjectivism is eroding the vast edifice of aspirational realism which grew organically in the wake of the Enlightenment. This isn’t a Left vs. Right phenomenon, it’s a human dynamic, because for most of human history what is true has been determined by what the tribe dictates to be true, and what the tribe dictates to be true has often not been based on a critical evaluation of facts and theories. What the tribe dictates to be true is computationally less intensive than thinking things through yourself, and, it’s often right-enough.

The reality is that this cultural cognition and conformity has always held. It’s just that it seems that for a few centuries substantial latitude was given in public to a relative amount of heterodoxy from broad tribal visions. And it was always a work in progress. But there was a goal, and an ideal, even if we habitually failed. We failed in the direction of truth.

We live in a post-modern age now. Feelings are paramount, facts must bow before them. But the curious fact is that the post-modern age is just the pre-modern age. When I first read the Christian author Alister McGrath I literally scoffed at his contention that atheism would fail before the ascendancy of post-modernism. Ten years on I will admit that I now believe he was right and I was wrong. Though I don’t think the New Atheism failed miserably, I do think that the problems it is encountering from the cultural Left are due to its cold modernist baggage.

No truth, no liberalism. No liberalism, and democracy becomes the mob. The passions of the mob do eventually fail, and in its wake a more oligarchic and hierarchical system will emerge. We may simply be seeing the end of the liberal individualist interregnum, as history reverts to its despotic collectivist norm.

[...]

Finally, understanding that most people don’t need to be right or utter the truth, but simply need to win, has made me much more cheerful and less sour observing everyday stupidities. It is no great insight to observe that I’ve never been one who has had much esteem for the admiration of my peers. I like to do my own thing. But tribal acclamation must be the best of all things for most humans, and now I understand why they fight unfairly and stupidly with such ease and naturalness: their aim is not to be right in the eyes of nature, but to rise in the esteem of their fellow humans. That is the summum bonum.

Preregistration of clinical trials causes medicines to stop working

Saturday, November 11th, 2017

Something must be done to combat this public health hazard, Chris Blattman says, as he notes that preregistration of clinical trials causes medicines to stop working. From the PLOS paper:

We identified all large NHLBI supported RCTs between 1970 and 2012 evaluating drugs or dietary supplements for the treatment or prevention of cardiovascular disease.

Relative Risk of Primary Outcome by Publication Year

17 of 30 studies (57%) published prior to 2000 showed a significant benefit of intervention on the primary outcome in comparison to only 2 among the 25 (8%) trials published after 2000. There has been no change in the proportion of trials that compared treatment to placebo versus active comparator. Industry co-sponsorship was unrelated to the probability of reporting a significant benefit. Pre-registration in clinicaltrials.gov was strongly associated with the trend toward null findings.
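Those proportions are lopsided enough that a quick contingency-table check makes the point. A minimal sketch using scipy (my own check on the counts quoted above, not part of the paper’s analysis):

```python
from scipy.stats import fisher_exact

# Trials reporting a significant benefit on the primary outcome,
# before vs. after 2000, using the counts from the abstract above.
table = [[17, 30 - 17],   # pre-2000:  17 positive, 13 null
         [ 2, 25 -  2]]   # post-2000:  2 positive, 23 null

odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio ~ {odds_ratio:.1f}, p = {p_value:.4f}")
# A pre-2000 trial was roughly 15 times more likely to come up positive.
```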

Bacterial fats deserve the blame for heart disease

Wednesday, November 8th, 2017

Heart disease is associated with clogged arteries, but it may be bacterial fats, not dietary ones, that deserve the blame:

Using careful chemical analysis of atheromas collected from patients by a colleague at Hartford Hospital, they found lipids with a chemical signature unlike those from animals at all. Instead, these strange lipids come from a specific family of bacteria.

“I always call them greasy bugs because they make so much lipid. They are constantly shedding tiny blebs of lipids. Looks like bunches of grapes,” on a bacterial scale, says Frank Nichols, a UConn Health periodontist who studies the link between gum disease and atherosclerosis. The bacteria, called Bacteroidetes, make distinctive fats. The molecules have unusual fatty acids with branched chains and odd numbers of carbons (mammals typically don’t make either branched chain fatty acids or fatty acids with odd numbers of carbons).

Xudong Yao, a UConn associate professor of chemistry who analyzed the lipid samples, says the chemical differences between bacterial and human lipids result in subtle weight differences between the molecules. “We used these weight differences and modern mass spectrometers to selectively measure the quantity of the bacterial lipids in human samples to link the lipids to atherosclerosis,” he says. “Establishment of such a link is a first step to mark the lipids as indicators for early disease diagnosis.”

The marked chemical differences between Bacteroidetes lipids and the human body’s native lipids may be the reason they cause disease, suggests Nichols. The immune cells that initially stick to the blood vessel walls and collect the lipids recognize them as foreign. These immune cells react to the lipids and set off alarm bells.

Nichols and Yao’s team also showed that despite being non-native lipids, the Bacteroidetes lipids could be broken down by an enzyme in the body that processes lipids into the starting material to make inflammation-enhancing molecules. So the Bacteroidetes lipids have a double whammy on the blood vessels: the immune system sees them as a signal of bacterial invasion, and then enzymes break them down and super-charge the inflammation.

Despite the havoc they wreak, it’s not the Bacteroidetes bacteria themselves invading. Usually these bacteria stay happily in the mouth and gastrointestinal tract. If conditions are right, they can cause gum disease in the mouth, but not infect the blood vessels. But the lipids they produce pass easily through cell walls and into the bloodstream.

The next step in the research is to analyze thin slices of atheroma to localize exactly where the bacterial lipids are accumulating. If they can show the Bacteroidetes-specific lipids are accumulating within the atheroma, but not in the normal artery wall, that would be convincing evidence that these unusual lipids are associated specifically with atheroma formation, and therefore contribute to heart disease.

(Hat tip to Mangan.)

Americans have never eaten much fruit

Monday, November 6th, 2017

Humans did not evolve to eat modern sugary fruit year round in abundance, Mangan notes, and even in the early modern era it wasn’t a large part of the diet:

In the 18th and 19th centuries, Americans did not eat very much in the way of fruits and vegetables. Meat was abundant, and even the poor ate plenty of it. Fruits and vegetables had a short growing season and were ripe for only a short period of time, and in the absence of refrigeration and transport, spoiled, as Nina Teicholz writes:

Even in the warmer months, fruit and salad were avoided, for fear of cholera. (Only with the Civil War did the canning industry flourish, and then only for a handful of vegetables, the most common of which were sweet corn, tomatoes, and peas.)

So it would be “incorrect to describe Americans as great eaters of either [fruits or vegetables],” wrote the historians Waverley Root and Richard de Rochemont. Although a vegetarian movement did establish itself in the United States by 1870, the general mistrust of these fresh foods, which spoiled so easily and could carry disease, did not dissipate until after World War I, with the advent of the home refrigerator. By these accounts, for the first 250 years of American history, the entire nation would have earned a failing grade according to our modern mainstream nutritional advice.

What about apples — fruit, obviously — didn’t Americans eat them? Johnny Appleseed is famous for spreading apple trees around the country. But it turns out that much of the apple crop was turned into apple cider. Not only did cider provide alcohol, but it was a way to preserve and concentrate apples in the absence of refrigeration and transport.

Wild Bananas

Fruits (and vegetables) are thought to be healthy due to the phytochemicals, namely polyphenols, that they contain:

However, coffee, tea, red wine, and chocolate all generally provide far more polyphenols than fruit. With the exception of chocolate, they have the added benefit of being entirely sugar-free, and even chocolate can be consumed without sugar or in low-sugar forms such as dark chocolate. So, if you want to consume polyphenols, and you consume coffee, etc., then fruit would be superfluous.

From feeling old to feeling young

Monday, November 6th, 2017

At age 72, Dr. Alan S. Green saw miraculous improvement in health, from feeling old to feeling young, after taking rapamycin once per week:

I attended college on a tennis scholarship and ran a marathon in just under 4 hours at age 40. But by age 70 my main physical activity was reduced to walking my two Shiba Inu dogs in the park. Then by age 72, I experienced angina and shortness of breath on small hills. As a trained pathologist I accepted the reality that I was in rather poor shape. My fasting blood sugar was up, my creatinine blood level was elevated, indicating renal insufficiency, and I couldn’t fit into any of my pants. I then began trying to learn about aging. I discovered a story more extraordinary and improbable than anything I had ever encountered in my lifetime.

[...]

Based upon empirical medicine principles, I decided rapamycin 6 mg once a week would be an aggressive treatment and 3 mg once every 10 days would be a conservative treatment. I decided to go with aggressive treatment. In January 2016, I began the rapamycin-based Koschei formula with intent to take it for one year; in what could euphemistically be called a “proof-of-concept” experiment. I didn’t have to wait one year; by 4 months the results were miraculous. I lost 20 pounds, my waistline went from 38 inches to 33. I bought a pair of size 32 jeans and didn’t have to wear joggers anymore. I could walk 5 miles a day and ride a bike up hills without any hint of angina. Creatinine went from elevated to normal and fasting blood sugar went down. I thought I was Lazarus back from the dead. It’s now over 1 year and I feel great. I’ve also had no mouth sores, the most common clinical side-effect. For me, rapamycin is the world’s greatest medicine.

Dennis Mangan asked Dr. Green a number of questions:

PDM: I note that of the drugs you advocate for anti-aging, metformin, aspirin, and ACE inhibitors/AR blockers are cheap, while rapamycin is more expensive. Does any other drug come close to rapamycin in efficacy or is it indispensable? Of the four drugs, what fraction of anti-aging effect is due to rapamycin in your estimation?

ASG: Rapamycin is only $3.50 for 1 mg if you buy it online with a prescription from Canada; therefore the cost might come to $50-100 a month.

My rough guess of the relative value of each as an anti-aging drug would be as follows: rapamycin 75, ACE inhibitor/AR blocker 18, metformin 6, aspirin 1.

[...]

PDM: I was fascinated to learn about angiotensin disruption for anti-aging, which I’m not sure if I had heard of before, and also that it fits the growth vs longevity paradigm. (On second thought, I had heard of it, but I forgot. Must be the effects of age.) Do you think hypertension is a “normal” manifestation of aging and that everyone can expect to have it to some degree as they age?

ASG: The two best characterized systems which promote aging are the mTOR system and the renin-angiotensin system. Angiotensin II is the primary cause of hypertension; but angiotensin II also promotes atherosclerosis, damages mitochondria, and increases ROS in tissues. I think all older persons probably suffer from higher activity of angiotensin II than is healthy. So probably most old people have some degree of hypertension and would benefit from being on an angiotensin blocker/inhibitor (ARB/ACE). The important thing is to use one that crosses the blood-brain barrier.

Dogs are not super-cooperative wolves

Tuesday, October 24th, 2017

Dogs are not super-cooperative wolves:

She and her colleagues challenged their canines to a simple task, which other scientists have used on all kinds of brainy animals — chimps, monkeys, parrots, ravens, and even elephants. There’s a food-bearing tray that lies on the other side of their cage, tempting and inaccessible. A string is threaded through rings on the tray, and both of its ends lie within reach of the animals. If an individual grabs an end and pulls, it would just yank the string out and end up with a mouthful of fibers — not food. But if two animals pull on the ends together, the tray slides close, and they get to eat.

All in all, the dogs did terribly. Just one out of eight pairs managed to pull the tray across, and only once out of dozens of trials. By contrast, five out of seven wolf pairs succeeded, on anywhere between 3 and 56 percent of their attempts. Even after the team trained the animals, the dogs still failed, and the wolves still outshone them. “We imagined that we would find some differences but we didn’t expect them to be quite so strong,” Marshall-Pescini says.

It’s not that the dogs were uninterested: They explored the strings as frequently as the wolves did. But the wolves would explore the apparatus together — biting, pawing, scratching, and eventually pulling on it. The dogs did not. They tolerated each other’s presence, but they were much less likely to engage with the task at the same time, which is why they almost never succeeded.

“The dogs are really trying to avoid conflict over what they see as a resource,” says Marshall-Pescini. “This is what we found in food-sharing studies, where the dominant animal would take the food and the subordinate wouldn’t even try to approach. With wolves, there’s a lot of arguing and it sounds aggressive, but they end up sharing. They have really different strategies in situations of potential conflict. [With the dogs], you see that if you avoid the other individual, you avoid conflict, but you can’t cooperate either.”

“Amazingly, no one had ever studied whether carnivores could solve this type of cooperative task, and it’s fun to see that the wolves coordinated,” says Brian Hare from Duke University, who studies dog behavior and the influence of domestication. He has argued that during the domestication process, dogs began using their traditional inherited mental skills with a new social partner: humans.

Simultaneously, dogs perhaps became less attentive to each other, adds Marshall-Pescini. After all, wolves need to work together to kill large prey, and sharing food helps to keep their social bonds intact. But when they started scavenging on human refuse, they could feed themselves on smaller portions by working alone. If they encountered another forager, “maybe the best strategy was to continue searching rather than to get into conflict with another dog,” she says.

But dogs can be trained. When owners raise dogs in the same household, and train them not to fight over resources, the animals start to tolerate each other, and unlock their ancient wolflike skills. This might be why, in 2014, Ljerka Ostojic, from the University of Cambridge, found that pet dogs, which had been trained in search and rescue, had no trouble with the string-pulling task that flummoxed Marshall-Pescini’s dogs.

“It speaks to the fact that living among other dogs, without interaction with humans, is arguably less natural for dogs — as if domestication both refined attention, coordination, and even pro-sociality between species, and weakened social skills within the species,” says Alexandra Horowitz, who studies dog cognition at Barnard College. “A pack of dogs living together, without human intervention, is impaired compared to dogs living with humans.”

Hosting experiments in governance styles

Wednesday, October 18th, 2017

The Seasteading Institute and its for-profit spin-off, Blue Frontiers, have racked up some real-world achievements in the past year, Nature (!) reports:

They signed a memorandum of understanding with the government of French Polynesia in January that lays the groundwork for the construction of their prototype. And they gained momentum from a conference of interested parties in Tahiti in May, which hundreds of people attended. The project’s focus has shifted from building a libertarian oasis to hosting experiments in governance styles and showcasing a smorgasbord of sustainable technologies for, among other things, desalination, renewable energy and floating food-production. The shift has brought some gravitas to the undertaking, and some ecologists have taken interest in the possibilities of full-time floating laboratories.

But the project still faces some formidable challenges. The team must convince the people of French Polynesia that the synthetic islands will benefit them; it must raise enough money to actually build the prototype, which it estimates will cost up to US$60 million; and once it is built, the group must convince the world that artificial floating islands are more than just a gimmick. Producing solid science and broadly useful technology would go a long way towards making that case.

Brain drain is real

Friday, October 6th, 2017

Brain Drain via Lymphatic System

In 1816, an Italian anatomist reported finding lymphatic vessels on the surface of the brain, but for two centuries the dogma has remained that the brain is an exceptional organ, with no way to remove waste:

Then in 2015, two studies of mice found evidence of the brain’s lymphatic system in the dura. Coincidentally, that year, Dr. Reich saw a presentation by Jonathan Kipnis, Ph.D., a professor at the University of Virginia and an author of one of the mouse studies.

“I was completely surprised. In medical school, we were taught that the brain has no lymphatic system,” said Dr. Reich. “After Dr. Kipnis’ talk, I thought, maybe we could find it in human brains?”

To look for the vessels, Dr. Reich’s team used MRI to scan the brains of five healthy volunteers who had been injected with gadobutrol, a magnetic dye typically used to visualize brain blood vessels damaged by diseases, such as multiple sclerosis or cancer. The dye molecules are small enough to leak out of blood vessels in the dura but too big to pass through the blood-brain barrier and enter other parts of the brain.

At first, when the researchers set the MRI to see blood vessels, the dura lit up brightly, and they could not see any signs of the lymphatic system. But, when they tuned the scanner differently, the blood vessels disappeared, and the researchers saw that the dura also contained smaller but almost equally bright spots and lines, which they suspected were lymph vessels. The results suggested that the dye leaked out of the blood vessels, flowed through the dura and into neighboring lymphatic vessels.

To test this idea, the researchers performed another round of scans on two subjects after first injecting them with a second dye made up of larger molecules that leak much less out of blood vessels. In contrast with the first round of scans, the researchers saw blood vessels in the dura but no lymph vessels regardless of how they tuned the scanner, confirming their suspicions.

They also found evidence for blood and lymph vessels in the dura of autopsied human brain tissue. Moreover, their brain scans and autopsy studies of brains from nonhuman primates confirmed the results seen in humans, suggesting the lymphatic system is a common feature of mammalian brains.

“These results could fundamentally change the way we think about how the brain and immune system inter-relate,” said Walter J. Koroshetz, M.D., NINDS director.

Dr. Reich’s team plans to investigate whether the lymphatic system works differently in patients who have multiple sclerosis or other neuroinflammatory disorders.

It is to be feared that it may have become too popular

Friday, October 6th, 2017

Techniques of Systems Analysis presents a toy problem of allocating resources to planes and bombs in order to attack a couple potentially sheltered airfields with area defenses and local defenses. There are increasing returns to more planes attacking a particular airfield, as they can saturate defenses, but there are decreasing returns to hitting a particular airfield with more bombs:

It is probably clear to the reader that any reasonable person, including for example the ancient Greeks, could have followed our qualitative reasoning and understood that when one is poor

  1. most of the money should be spent on decreasing attrition (buying planes)
  2. that one should concentrate on one target,

and conversely that when one is rich

  1. one should spend more money on bombs because the enemy’s defenses are automatically saturated by the number of planes in the attack
  2. that one can now afford to attack both targets.

The exciting thing that we have done is to make the above qualitative remarks numerical; that is, we have changed what we called an “intuitive judgment” into what we called a “considered opinion.” How exciting this is can be seen from the fact that the ability to make this type of calculation and end up with Charts 17 and 18 is as much of an intellectual invention as the steam engine or the telegraph is a technical invention.

Techniques of Systems Analysis Charts 17 and 18

In fact, the concepts needed for this kind of analysis were invented in roughly the same time period as these two gadgets were. Moreover, they were not used for this kind of a question until late in the nineteenth century. In fact, it is only in the post-World War II period, which saw a great expansion in the intellectual tools, computing ability, and suitable problems for this kind of analysis, that it really became popular as an aid to the military planner. It is to be feared that it may have become too popular. Many people got so excited about the possibilities that they went overboard and claimed entirely too much for the technique.

One trouble was that people did not generally realize that even modern computing methods are not really powerful enough to evaluate complicated systems without the aid of a good deal of skillful “intuitive” supervision and guidance and, even more to the point, that the problems of uncertainty can swamp or negate a good deal of straightforward analysis. In many cases it was necessary to idealize the problem so much to make it tractable to analysis that the resulting considered opinion was less valuable than almost any reasonable intuitive judgment which was based on an examination of the unidealized problem.
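The flavor of that qualitative-to-quantitative step is easy to reproduce with a toy calculation. Below is a minimal sketch, using made-up saturation and diminishing-returns curves of my own (not the data behind Charts 17 and 18), of the allocation problem described above: split a budget between planes and bombs, and between one airfield and two, and see which plan maximizes expected damage.

```python
# Toy model only: the functional forms and constants below are illustrative
# assumptions of mine, not the numbers from Techniques of Systems Analysis.
PLANE_COST, BOMB_COST = 10, 1          # arbitrary cost units

def survival(planes_per_raid):
    """Fraction of a raid surviving the defenses; big raids saturate them."""
    return planes_per_raid**2 / (planes_per_raid**2 + 25)

def damage(bombs_delivered):
    """Expected damage to one airfield; extra bombs help less and less."""
    return 1 - 0.8 ** bombs_delivered

def campaign_value(budget, planes, targets):
    bombs = (budget - planes * PLANE_COST) // BOMB_COST
    if planes < targets or bombs < targets:
        return 0.0
    delivered = (bombs // targets) * survival(planes // targets)
    return targets * damage(delivered)

for budget in (60, 300):               # a "poor" and a "rich" attacker
    value, planes, targets = max(
        (campaign_value(budget, p, t), p, t)
        for p in range(1, budget // PLANE_COST + 1)
        for t in (1, 2))
    print(f"budget {budget}: {planes} planes on {targets} target(s), value {value:.2f}")
```

With these invented curves the poor attacker ends up concentrating on a single airfield while the rich one splits across both, which is the same qualitative pattern the passage describes. The analysts’ contribution, as the authors note, was doing this with real attrition and damage estimates, where the uncertainties are far less forgiving.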