This isn’t the “PC police” talking

Sunday, April 16th, 2017

Scientific American has published an embarrassingly unscientific piece by Eric Siegel on the real problem with Charles Murray and The Bell Curve:

Attempts to fully discredit his most famous book, 1994's “The Bell Curve,” have failed for more than two decades now. This is because they repeatedly miss the strongest point of attack: an indisputable — albeit encoded — endorsement of prejudice.

So, the science is unassailable, but we should vehemently attack an encoded endorsement of prejudice that is based on that (apparently) unassailable science? “This isn’t the ‘PC police’ talking,” he asserts, but he completely ignores what Murray explicitly says about prejudging people:

Even when the differences are substantial, the variation between two groups will almost always be dwarfed by the variation within groups — meaning that the overlap between two groups will be great. In a free society where people are treated as individuals, “So what?” is to me the appropriate response to genetic group differences. The only political implication of group differences is that we must work hard to ensure that our society is in fact free and that people are in fact treated as individuals.

Take your shoes off at the door

Tuesday, April 11th, 2017

It turns out that taking your shoes off when you come inside doesn’t just keep the carpets cleaner. It’s also healthier:

Among samples collected in homes, 26.4% of shoe soles tested positive for C. diff, about three times the number found on the surfaces of bathrooms and kitchens.

And that’s just one bacterium. In an earlier investigation, Dr. Garey examined past studies to learn if “shoe soles are a vector for infectious pathogens.” The answer was a resounding yes.

Among the studies: Austrian researchers found at least 40% of shoes carried Listeria monocytogenes in 2015. And a 2014 German study found that over a quarter of boots used on farms carried E. coli.

“Essentially, when you wear your shoes in a house, you are bringing in everything you stepped in during the day,” says Jonathan Sexton, a laboratory manager at the Mel & Enid Zuckerman College of Public Health at the University of Arizona.

Wiping your feet, however vigorously, on a welcome mat provides only limited help, he says. “It will remove some of the dirt, but you have to think of the person who wiped their feet before. You might be picking up stuff they left behind.”

Some homeowners may worry that guests in socks or bare feet might also represent a health risk. That’s possible, Dr. Sexton says, but the inside of a shoe has far fewer bacteria than the outside.

Both researchers agree that the risk is muted. “Shoes in the house are not something to freak out about,” Dr. Sexton says.

To Be a Genius, Think Like a 94-Year-Old

Sunday, April 9th, 2017

To be a genius, think like a 94-year-old — more specifically, like Dr. John Goodenough:

In 1946, a 23-year-old Army veteran named John Goodenough headed to the University of Chicago with a dream of studying physics. When he arrived, a professor warned him that he was already too old to succeed in the field.

Recently, Dr. Goodenough recounted that story for me and then laughed uproariously. He ignored the professor’s advice and today, at 94, has just set the tech industry abuzz with his blazing creativity. He and his team at the University of Texas at Austin filed a patent application on a new kind of battery that, if it works as promised, would be so cheap, lightweight and safe that it would revolutionize electric cars and kill off petroleum-fueled vehicles. His announcement has caused a stir, in part, because Dr. Goodenough has done it before. In 1980, at age 57, he coinvented the lithium-ion battery that shrank power into a tiny package.

We tend to assume that creativity wanes with age. But Dr. Goodenough’s story suggests that some people actually become more creative as they grow older. Unfortunately, those late-blooming geniuses have to contend with powerful biases against them.

[...]

On the contrary, there’s plenty of evidence to suggest that late blooming is no anomaly. A 2016 Information Technology and Innovation Foundation study found that inventors peak in their late 40s and tend to be highly productive in the last half of their careers. Similarly, professors at the Georgia Institute of Technology and Hitotsubashi University in Japan, who studied data about patent holders, found that, in the United States, the average inventor sends in his or her application to the patent office at age 47, and that the highest-value patents often come from the oldest inventors — those over the age of 55.

[...]

Years ago, he decided to create a solid battery that would be safer. Of course, in a perfect world, the “solid-state” battery would also be low-cost and lightweight. Then, two years ago, he discovered the work of Maria Helena Braga, a Portuguese physicist who, with the help of a colleague, had created a kind of glass that can replace liquid electrolytes inside batteries.

Dr. Goodenough persuaded Dr. Braga to move to Austin and join his lab. “We did some experiments to make sure the glass was dry. Then we were off to the races,” he said.

Some of his colleagues were dubious that he could pull it off. But Dr. Goodenough was not dissuaded. “I’m old enough to know you can’t close your mind to new ideas. You have to test out every possibility if you want something new.”

When I asked him about his late-life success, he said: “Some of us are turtles; we crawl and struggle along, and we haven’t maybe figured it out by the time we’re 30. But the turtles have to keep on walking.” This crawl through life can be advantageous, he pointed out, particularly if you meander around through different fields, picking up clues as you go along. Dr. Goodenough started in physics and hopped sideways into chemistry and materials science, while also keeping his eye on the social and political trends that could drive a green economy. “You have to draw on a fair amount of experience in order to be able to put ideas together,” he said.

He also credits his faith for keeping him focused on his mission to defeat pollution and ditch petroleum. On the wall of his lab, a tapestry of the Last Supper depicts the apostles in fervent conversation, like scientists at a conference arguing over a controversial theory. The tapestry reminds him of the divine power that fuels his mind. “I’m grateful for the doors that have been opened to me in different periods of my life,” he said. He believes the glass battery was just another example of the happy accidents that have come his way: “At just the right moment, when I was looking for something, it walked in the door.”

Last but not least, he credited old age with bringing him a new kind of intellectual freedom. At 94, he said, “You no longer worry about keeping your job.”

Short- and Long-Term Memories

Saturday, April 8th, 2017

All memories start as short-term memories and then slowly convert into long-term memories — or so we thought:

Two parts of the brain are heavily involved in remembering our personal experiences.

The hippocampus is the place for short-term memories while the cortex is home to long-term memories.

This idea became famous after the case of Henry Molaison in the 1950s.

His hippocampus was damaged during epilepsy surgery and he was no longer able to make new memories, but those from before the operation were still there.

So the prevailing idea was that memories are formed in the hippocampus and then moved to the cortex where they are “banked”.

[...]

The results, published in the journal Science, showed that memories were formed simultaneously in the hippocampus and the cortex.

[...]

The mice do not seem to use the cortex’s long-term memory in the first few days after it is formed.

They forgot the shock event when scientists turned off the short-term memory in the hippocampus.

However, they could then make the mice remember by manually switching the long-term memory on (so it was definitely there).

“It is immature or silent for the first several days after formation,” Prof Tonegawa said.

The researchers also showed the long-term memory never matured if the connection between the hippocampus and the cortex was blocked.

So there is still a link between the two parts of the brain, with the balance of power shifting from the hippocampus to the cortex over time.

A Tale of Two Bell Curves

Monday, March 27th, 2017

Bo and Ben Winegard tell a tale of two Bell Curves:

To paraphrase Mark Twain, an infamous book is one that people castigate but do not read. Perhaps no modern work better fits this description than The Bell Curve by political scientist Charles Murray and the late psychologist Richard J. Herrnstein. Published in 1994, the book is a sprawling (872 pages) but surprisingly entertaining analysis of the increasing importance of cognitive ability in the United States.

[...]

There are two versions of The Bell Curve. The first is a disgusting and bigoted fraud. The second is a judicious but provocative look at intelligence and its increasing importance in the United States. The first is a fiction. And the second is the real Bell Curve. Because many, if not most, of the pundits who assailed The Bell Curve have not bothered to read it, the fictitious Bell Curve has thrived and continues to inspire furious denunciations. We have suggested that almost all of the proposals of The Bell Curve are plausible. Of course, it is possible that some are incorrect. But we will only know which ones if people responsibly engage the real Bell Curve instead of castigating a caricature.

Masters of reality, not big thinkers

Sunday, March 26th, 2017

Joel Mokyr’s A Culture of Growth attempts to answer the big question: Why did science and technology (and, with them, colonial power) spread west to east in the modern age, instead of the other way around?

He reminds us that the skirmishing of philosophers and their ideas, the preoccupation of popular historians, is in many ways a sideshow — that the revolution that gave Europe dominance was, above all, scientific, and that the scientific revolution was, above all, an artisanal revolution. Though the élite that gets sneered at, by Trumpites and neo-Marxists alike, is composed of philosophers and professors and journalists, the actual élite of modern societies is composed of engineers, mechanics, and artisans — masters of reality, not big thinkers.

Mokyr sees this as the purloined letter of history, the obvious point that people keep missing because it’s obvious. More genuinely revolutionary than either Voltaire or Rousseau, he suggests, are such overlooked Renaissance texts as Tommaso Campanella’s “The City of the Sun,” a sort of proto-Masonic hymn to people who know how to do things. It posits a Utopia whose inhabitants “considered the noblest man to be the one that has mastered the most skills… like those of the blacksmith and mason.” The real upheavals in minds, he argues, were always made in the margins. He notes that a disproportionate number of the men who made the scientific and industrial revolution in Britain didn’t go to Oxford or Cambridge but got artisanal training out on the sides. (He could have included on this list Michael Faraday, the man who grasped the nature of electromagnetic induction, and who worked some of his early life as a valet.) What answers the prince’s question was over in Dr. Johnson’s own apartment, since Johnson was himself an eccentric given to chemistry experiments — “stinks,” as snobbish Englishmen call them.

As in painting and drawing, manual dexterity counted for as much as deep thoughts — more, in truth, for everyone had the deep thoughts, and it took dexterity to make telescopes that really worked. Mokyr knows Asian history, and shows, in a truly humbling display of erudition, that in China the minds evolved but not the makers. The Chinese enlightenment happened, but it was strictly a thinker’s enlightenment, where Mandarins never talked much to the manufacturers. In this account, Voltaire and Rousseau are mere vapor, rising from a steam engine as it races forward. It was the perpetual conversation between technicians and thinkers that made the Enlightenment advance. TED talks are a licensed subject for satire, but in Mokyr’s view TED talks are, in effect, what separate modernity from antiquity and the West from the East. Guys who think big thoughts talking to guys who make cool machines — that’s where the leap happens.

How to Gain New Skills

Friday, March 24th, 2017

In his How to Gain New Skills guide for students, Ulrich Boser (Learn Better) discusses an experiment that took place years ago at a Catholic all-girls school in New York City:

As part of the experiment, the girls were taught how to play darts for the first time, and the two psychologists conducting the study divided the young women into some groups. Let’s call members of the first group “Team Performance,” and they were told that they should learn the game of darts by trying to throw the darts as close to the center of the board as possible. In other words, the researchers informed the women that the best way to win was to rack up some points.

The psychologists also pulled together another group of young women. Let’s call them “Team Learning Method,” and they learned to play darts very differently. The researchers had these girls focus on the process of gaining expertise, and the women started by focusing on how exactly to throw the darts, mastering some basic processes like “keep your arm close to your body.” Then, after the women showed some proficiency, they were encouraged to aim at the bull’s eye, slowly shifting from some process goals to some outcome goals like hitting the target.

Finally, there was the control group. Their instructions? The researchers told them to learn to “do their best.” In other words, these young women could take any approach to learning darts that they wanted. Let’s think of this group as “Team Conventional Wisdom.”

To learn more about the experiment, I met up with Anastasia Kitsantas, who ran the study together with psychologist Barry Zimmerman. While the experiment took place some years ago, Kitsantas still has the darts stashed away in her office at George Mason University, and on a rainy afternoon, she pulled out the little yellow missiles from an office cabinet to show them to me, laying the darts out like an important relic from some forgotten South American tribe.

Kitsantas held onto the darts because of the study’s surprisingly large outcomes, and by the end of the experiment, the young women on Team Learning Method dramatically outperformed the others, with scores nearly twice as high as Team Conventional Wisdom. The women also enjoyed the experience much more. “Several of the students asked me to teach them more about darts after the experiment. They kept asking me for weeks,” Kitsantas told me.

The best basketball player in the world is not the tallest

Thursday, March 23rd, 2017

Even a strong predictor of outcome is seldom able to pick out the very top performer, Stephen Hsu notes — e.g., taller people are on average better at basketball, but the best player in the world is not the tallest:

This seems like a trivial point (as are most things, when explained clearly); however, it still eludes the vast majority. For example, in the Atlantic article I linked to in the earlier post Creative Minds, the neuroscientist professor who studies creative genius misunderstands the implications of the Terman study. She repeats the common claim that Terman’s study fails to support the importance of high cognitive ability to “genius”-level achievement: none of the Termites won a Nobel prize, whereas Shockley and Alvarez, who narrowly missed the (verbally loaded) Stanford-Binet cut for the study, each won for work in experimental physics. But luck, drive, creativity, and other factors, all at least somewhat independent of intelligence, influence success in science. Combine this with the fact that there are exponentially more people a bit below the Terman cut than above it, and Terman’s results do little more than confirm that cognitive ability is positively but not perfectly correlated with creative output.

Strong Predictor Graph

In the SMPY study, the probability of having published a literary work or earned a patent increased with ability even within the top 1%. The “IQ over 120 doesn’t matter” meme falls apart if one measures individual likelihood of success, as opposed to the total number of individuals at, e.g., IQ 120 vs IQ 145, who have achieved some milestone. The base population of the former is 100 times that of the latter!
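Hsu’s two claims — a strong predictor rarely picks out the very top performer, and individual odds can rise with ability even while the bigger pool below supplies more total successes — can be checked with a toy simulation. All the numbers below (the 15-point noise term, the top-1% bar, the ability bands) are illustrative assumptions, not figures from SMPY or Terman:

```python
import random

random.seed(0)

# Toy model: achievement = ability + an equally weighted, independent
# "everything else" term (luck, drive, creativity, ...).
N = 100_000
people = []
for _ in range(N):
    ability = random.gauss(100, 15)          # IQ-like scale
    achievement = ability + random.gauss(0, 15)
    people.append((ability, achievement))

# A strong-but-imperfect predictor rarely picks out the top performer:
best_ability = max(people, key=lambda p: p[0])
best_achiever = max(people, key=lambda p: p[1])
print(best_ability == best_achiever)         # almost always False

# Individual odds of clearing a top-1% achievement bar rise with ability...
bar = sorted(p[1] for p in people)[-N // 100]

def success_rate(lo, hi):
    group = [p for p in people if lo <= p[0] < hi]
    return sum(p[1] >= bar for p in group) / len(group)

rate_120 = success_rate(115, 125)
rate_145 = success_rate(140, 150)
print(rate_120, rate_145)                    # the 145 band's odds are far higher

# ...yet the ~120 pool is vastly larger, so it supplies more total successes.
n_120 = sum(1 for p in people if 115 <= p[0] < 125)
n_145 = sum(1 for p in people if 140 <= p[0] < 150)
print(n_120 / n_145)                         # the base-rate imbalance
```

With these assumptions the ~145 band clears the bar at many times the rate of the ~120 band, yet the ~120 band, being dozens of times more numerous, still contributes more successes in total — exactly the distinction between individual likelihood and head counts that the quoted passage draws.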

Please remember this perverse outcome

Sunday, March 19th, 2017

Charlie Munger was not impressed with academic psychology, but he was impressed with Robert Cialdini‘s Influence:

Cialdini had made himself into a super-tenured “Regents’ Professor” at a very young age by devising, describing, and explaining a vast group of clever experiments in which man manipulated man to his detriment, with all of this made possible by man’s intrinsic thinking flaws.

I immediately sent copies of Cialdini’s book to all my children — I also gave Cialdini a share of Berkshire stock [Class A] to thank him for what he had done for me and the public. Incidentally, the sale by Cialdini of hundreds of thousands of copies of a book about social psychology was a huge feat, considering that Cialdini didn’t claim that he was going to improve your sex life or make you any money.

Part of Cialdini’s large book-buying audience came because, like me, it wanted to learn how to become less often tricked by salesmen and circumstances. However, as an outcome not sought by Cialdini, who is a profoundly ethical man, a huge number of his books were bought by salesmen who wanted to learn how to become more effective in misleading customers. Please remember this perverse outcome when my discussion comes to incentive-caused bias as a consequence of the superpower of incentives.

Cialdini’s Pre-Suasion came out recently.

Collecting psychology experiments as a boy collects butterflies

Saturday, March 18th, 2017

Charlie Munger was always interested in psychology, but he didn’t turn to psychology textbooks for a long, long time:

Motivated as I was, by midlife I should probably have turned to psychology textbooks, but I didn’t, displaying my share of the outcome predicted by the German folk saying: “We are too soon old and too late smart.” However, as I later found out, I may have been lucky to avoid for so long the academic psychology that was then laid out in most textbooks. These would not then have guided me well with respect to cults and were often written as if the authors were collecting psychology experiments as a boy collects butterflies — with a passion for more butterflies and more contact with fellow collectors and little craving for synthesis in what is already possessed. When I finally got to the psychology texts, I was reminded of the observation of Jacob Viner, the great economist, that many an academic is like the truffle hound, an animal so trained and bred for one narrow purpose that it is no good at anything else. I was also appalled by hundreds of pages of extremely nonscientific musing about comparative weights of nature and nurture in human outcomes. And I found that introductory psychology texts, by and large, didn’t deal appropriately with a fundamental issue: Psychological tendencies tend to be both numerous and inseparably intertwined, now and forever, as they interplay in life. Yet the complex parsing out of effects from intertwined tendencies was usually avoided by the writers of the elementary texts. Possibly the authors did not wish, through complexity, to repel entry of new devotees to their discipline. And, possibly, the cause of their inadequacy was the one given by Samuel Johnson in response to a woman who inquired as to what accounted for his dictionary’s mis-definition of the word “pastern.” “Pure ignorance,” Johnson replied. And, finally, the text writers showed little interest in describing standard antidotes to standard, psychology-driven folly, and they thus avoided most discussion of exactly what most interested me.

Bright Eyes

Monday, March 13th, 2017

Researchers have confirmed that baseline pupil size is related to cognitive ability. Bright people look bright.

No invertebrate on land would have been a match for it

Friday, March 10th, 2017

The earliest tetrapods had much bigger eyes than their fishy forebears, and those bigger eyes evolved before walking legs:

Eyes don’t fossilize, but you can estimate how big they would have been by measuring the eye sockets of a fossilized skull. MacIver and his colleagues, including fossil eye expert Lars Schmitz, did this for the skulls of 59 species — from finned fish to intermediate fishapods to legged tetrapods. They showed that over 12 million years, the group’s eyes nearly tripled in size. Why?

Eyes are expensive organs: it takes a lot of energy to maintain them, and even more so if they’re big. If a fish is paying those costs, the eyes must provide some kind of benefit. It seems intuitive that bigger eyes let you see better or further, but MacIver’s team found otherwise. By simulating the kinds of shallow freshwater environments where their fossil species lived — day to night, clear to murky — they showed that bigger eyes make precious little difference underwater. But once those animals started peeking out above the waterline, everything changed. In the air, a bigger eye can see 10 times further than it could underwater, and scan an area that’s 5 million times bigger.

In the air, it’s also easier for a big eye to pay for itself. A predator with short-range vision has to constantly move about to search the zone immediately in front of its face. But bigger-eyed species could spot prey at a distance, and recoup the energy they would otherwise have spent on foraging. “Long-range vision gives you a free lunch,” says MacIver. “You can just look around, instead of moving to inspect somewhere else.”

Tiktaalik with Eyes Above Surface

Those early hunters would have seen plenty of appetizing prey. Centipedes and millipedes had colonized the land millions of years before, and had never encountered fishapod predators. “I imagine guys like Tiktaalik lurking there like a crocodile, waiting for a giant millipede to walk by, and chomping on it,” says MacIver. “No invertebrate on land would have been a match for it.”

Absorbent Beads Could Save Energy

Friday, March 10th, 2017

Porous zeolite beads could cut the energy used in large-scale drying operations in half, according to UC Davis plant scientist Kent Bradford:

The beads were developed by Rhino Research in Thailand. Bradford and his collaborators there have spent several years testing and refining the technology with local farmers in that country as well as in India, Nepal, Kenya, and other tropical nations, where as much as a third of crops are lost before reaching consumers. In those areas, the beads are placed alongside, say, harvested rice or maize seeds, separated in mesh sacks or screened-in compartments within containers. They then capture water from the air, significantly reducing the moisture that leads to rot and fungal infections.

Now the researchers are working to bring the technology to richer nations at the industrial scale, exploring its use to dry harvested almonds, walnuts, rice, and grains. Typically, farming operations blow hot air through harvested crops as they pass through drying towers or silos. But experiments show that ambient air can work just as well, if it’s first dried by passing it through the beads. The researchers also believe this approach can improve the quality of the end product, because uneven air heating frequently scorches parts of the batch, ruining the taste of nuts and other foods.

The beads themselves still need to be heated in the end, in order to remove the water so they can be reused. But that can be done in a compact space like an oven, which is far more efficient than blowing around heated air.

The Biological Origins of Higher Civilizations

Thursday, March 9th, 2017

Elfnonationalist explores the biological origins of higher civilizations:

It is my opinion that the most successful civilized nations of Europe, namely, Britain, France, the Netherlands, and Germany, (and to a lesser degree, Northern Italy, Spain, Scandinavia, and Russia) have been so successful, not necessarily due to early adoption of manorialism, but rather due to this balance of genetic input from both genetically pacified farmers, who were accustomed to a settled, relatively peaceful existence, as well as the more mobile, “barbaric” in Nietzschean terms, Indo-Europeans who were descended primarily from hunters and fishers who had recently adopted a highly competitive pastoralist lifestyle on the Pontic steppe (see David W. Anthony’s The Horse, the Wheel, and Language). The aristocracies of early Greece and Rome would have also possessed this ideal mix of genetically inherited traits, being descended from Indo-European invaders who married local Neolithic farmers, introducing the early Greek and Italic languages into the Mediterranean basin. This aristocracy is practically gone now, however, through an overwhelming genetic absorption into the conquered Neolithic farmer populace, who were ultimately descended primarily from early Near-Eastern agriculturalists.

The end result of the ideal genetic admixture which I have described is a people who are both civilized and politically organized, and also willing to innovate, take risks (like exploring the New World), and challenge old notions of thought, as was done in the scientific revolution.

Moral Outrage Is Self-Serving

Sunday, March 5th, 2017

Moral outrage is self-serving, Bowdoin psychology professor Zachary Rothschild and University of Southern Mississippi psychology professor Lucas A. Keefer have found:

Triggering feelings of personal culpability for a problem increases moral outrage at a third-party target. For instance, respondents who read that Americans are the biggest consumer drivers of climate change “reported significantly higher levels of outrage at the environmental destruction” caused by “multinational oil corporations” than did the respondents who read that Chinese consumers were most to blame.

The more guilt over one’s own potential complicity, the more desire “to punish a third-party through increased moral outrage at that target.” For instance, participants in study one read about sweatshop labor exploitation, rated their own identification with common consumer practices that allegedly contribute, then rated their level of anger at “international corporations” who perpetuate the exploitative system and desire to punish these entities. The results showed that increased guilt “predicted increased punitiveness toward a third-party harm-doer due to increased moral outrage at the target.”

Having the opportunity to express outrage at a third party decreased guilt in people threatened through “ingroup immorality.” Study participants who read that Americans were the biggest drivers of man-made climate change showed significantly higher guilt scores than those who read the blame-China article when they weren’t given an opportunity to express anger at or assign blame to a third party. However, having this opportunity to rage against hypothetical corporations led respondents who read the blame-America story to express significantly lower levels of guilt than the China group. Respondents who read that Chinese consumers were to blame had similar guilt levels regardless of whether they had the opportunity to express moral outrage.

“The opportunity to express moral outrage at corporate harm-doers” inflated participants’ perceptions of their personal morality. Asked to rate their own moral character after reading the article blaming Americans for climate change, respondents saw themselves as having “significantly lower personal moral character” than those who read the blame-China article—that is, when they weren’t given an out in the form of third-party blame. Respondents in the America-shaming group wound up with similar levels of moral pride as the China control group when they were first asked to rate the level of blame deserved by various corporate actors and their personal level of anger at these groups. In both this and a similar study using the labor-exploitation article, “the opportunity to express moral outrage at corporate harm-doing (vs. not) led to significantly higher personal moral character ratings,” the authors found.

Guilt-induced moral outrage was lessened when people could assert their goodness through alternative means, “even in an unrelated context.” Study five used the labor exploitation article, asked all participants questions to assess their level of “collective guilt” (i.e., “feelings of guilt for the harm caused by one’s own group”) about the situation, then gave them an article about horrific conditions at Apple product factories. After that, a control group was given a neutral exercise, while others were asked to briefly describe what made them a good and decent person; both exercises were followed by an assessment of empathy and moral outrage. The researchers found that for those with high collective-guilt levels, having the chance to assert their moral goodness first led to less moral outrage at corporations. But when the high-collective-guilt folks were given the neutral exercise and couldn’t assert they were good people, they wound up with more moral outrage at third parties. Meanwhile, for those low in collective guilt, affirming their own moral goodness first led to marginally more moral outrage at corporations.