The vampire is consumption in human form

Thursday, February 14th, 2019

The vampire is consumption in human form:

“Peter Plogojowitz…died…and had been buried…within a week, nine people, both old and young, died also…while they were yet alive…Plogojowitz…(came) to them in their sleep, laid himself on them…so that they would give up the ghost… They exhumed…(his) body…which was completely fresh…hair…nails…had grown on him; the old skin…had peeled …a fresh new one had emerged under it…I saw some fresh blood in his mouth…he had sucked from the people killed by him…(we) sharpened a stake…as he was pierced…much blood, completely fresh, flowed… through his…mouth.” (1725)

Tuberculosis, a disease of antiquity with a unique nom de guerre — consumption — helped give impetus to the vampire myth. The ancient and near-universal presence of “bloodsuckers” has been chronicled since the earliest times of recorded human existence.

In Western culture, ancient Greeks called their mythological bloodsuckers Lamia. The Libyan princess Lamia had an illicit love affair with Zeus. Hera, the wife whom Zeus spurned, killed all of Lamia’s children and drove Lamia into exile. The tale — told by Aristophanes (446–386 B.C.E.) and Aristotle (384–322 B.C.E.) — described how Lamia sought revenge by sucking the lifeblood of babies. In the second century B.C.E., a manuscript fragment by Titinus suggested garlic hung around the neck of children would protect them from Lamia.

After the arrival of Christianity, undead demons were accommodated by the new religion and renamed Vrykolakas, roaming undead beings afflicting humanity. Many cultures followed, with Albanians, Montenegrins, Bulgarians, Croatians, Hungarians, Serbians, and Russians naming their own ethnic vampires. In fact, it was the Walachian prince Vlad Tepes (the Impaler, 1431–1476) who was Bram Stoker’s model for Dracula. The German schrattl, or shroud-eater, was thought to rise from the grave spreading disease. Mythic undead creatures also populated Native North and South American cultures. In the 13th and 14th centuries, undead Icelandic beings called Grettirs were added to the other myths.

Although it has been opined that “East is East and West is West and never the twain shall meet,” the vampire myth brought cultural extremes together. Chinese jiangshi were corpses of the drowned, the hanged, or suicides, returning to drain humans of their life force. Japanese and Indian cultures described similar undead bloodsuckers. As fatal communicable diseases and death were universal — and otherwise inexplicable — so were naïve myths of vampires. Myths would evolve into metaphors as well.

“…the body swells…Discoloured natural fluids and liquefied tissues are made frothy by gas and some exude from the natural orifices, forced out by the increasing pressure in the body cavities… eyes bulge… little wonder that Bacon was convinced that purposeful dynamic spirits wrought this awful change.”

“Blood migrates…in the course of decomposition…the gases in the abdomen increase in pressure…and are forced upwards and decomposing blood escapes from the mouth and nostrils.”

In 1739, Austrians occupying Serbia and Walachia investigated reports of a gruesome local custom: exhuming dead bodies and re-killing them. The practice was a consequence of ignorance of the natural processes of bodily decomposition. During decomposition, intestinal bacterial gas flows through blood vessels and tissues, pushing blood-stained fluids out through the nose and mouth. Seven days after death, cadaveric skin loosens and the top layer sheds off in sheets, a process called skin slippage.

At a 1732 exhumation it was written, “They dug up Arnod Paole forty days after his death…and they found…fresh blood had flowed from his eyes, nose, mouth and ears; that the shirt (was) completely bloody…the skin, had fallen off…he was a true vampire, they drove a stake through his heart.” The presence of blood after death was interpreted as a sign of reanimated life — albeit at someone else’s expense. In the context of consumption, the visible confluence of paleness and hemoptysis lent credence to postmortem appearances suggesting life after death. That perception would be further fueled by ignorance regarding the transmission of infectious diseases.

“The Vampire is consumption in human form, embodying an evil that slowly and secretly drains the life from its victims.”

“For as long as what caused tuberculosis was not understood…tuberculosis was thought to be an insidious, implacable theft of a life…Any disease that is treated as a mystery and acute enough to be feared will be felt to be morally…contagious.”

– Susan Sontag

Prior to the germ theory, and from the time of Aristotle, miasmas — currents of contaminated air circulated by winds — were considered the cause of communicable diseases. As a result, the connection between consumption and the vampire myth persisted into the late nineteenth century.

“Mercy died, apparently of tuberculosis, in January 1892…Mercy was a vampire…Mercy’s brother Edwin was a strapping young man of 18…in 1891 Mercy and Edwin both became ill… the boy went off to Colorado, where he recovered. His sister (Mercy) eventually was carried to her grave by the illness…Edwin returned still in weak health…why was such a strong man’s life draining away? Why had the same thing happened to Mercy only a few months before…nothing less terrible than a vampire was sucking their children’s blood and taking their lives with it…to their infinite horror, Mercy’s body, which had blood and seemed unnaturally preserved, with color still in the cheeks…(They) removed the corpse’s heart and burned it on a rock.”

Consumption — the captain of all these men of death — became and persisted as a metaphor for vampirism. Charles Dickens wrote that, for individuals with tuberculosis, “life and death are so strangely blended that death takes the glow and hue of life, and life the gaunt and grisly form of death” (Nicholas Nickleby). Dickens and others who witnessed victims of consumption — precariously hovering between life and death — recognized the stigma:

“The emaciated figure strikes one with terror; the forehead covered with drops of sweat; the cheeks painted with a livid crimson, the eyes sunk; the little fat that raised them in their orbits entirely wasted; the pulse quick and tremulous; the nails long, bending over the ends of the fingers; the palms of the hands dry and painfully hot to the touch; the breath offensive, quick and laborious, and the cough so incessant as scarce to allow the wretched sufferer time to tell his complaints.”

Science, ascendant in the late nineteenth and early twentieth centuries, identified Mycobacterium tuberculosis as the agent of tuberculosis. Science also shed light on bodily decomposition after death. The cure of tuberculosis would follow, and the myths and metaphors surrounding the disease would disappear.

The great vice of the Greeks was extrapolation

Wednesday, February 13th, 2019

I recently read Ignition!, by John D. Clark, and I found it an odd mix of fun, opinionated bits and dry chemistry:

“Now it is clear that anyone working with rocket fuels is outstandingly mad. I don’t mean garden-variety crazy or a merely raving lunatic. I mean a record-shattering exponent of far-out insanity.”

“It is, of course, extremely toxic, but that’s the least of the problem. It is hypergolic with every known fuel, and so rapidly hypergolic that no ignition delay has ever been measured. It is also hypergolic with such things as cloth, wood, and test engineers, not to mention asbestos, sand, and water — with which it reacts explosively. It can be kept in some of the ordinary structural metals — steel, copper, aluminium, etc. — because of the formation of a thin film of insoluble metal fluoride which protects the bulk of the metal, just as the invisible coat of oxide on aluminium keeps it from burning up in the atmosphere. If, however, this coat is melted or scrubbed off, and has no chance to reform, the operator is confronted with the problem of coping with a metal-fluorine fire. For dealing with this situation, I have always recommended a good pair of running shoes.”

“If your propellants flow into the chamber and ignite immediately, you’re in business. But if they flow in, collect in a puddle, and then ignite, you have an explosion which generally demolishes the engine and its immediate surroundings. The accepted euphemism for this sequence of events is a ‘hard start.’”

“Their guess turned out to be right, but one is reminded of E. T. Bell’s remark that the great vice of the Greeks was not sodomy but extrapolation.”

“…a molecule with one reducing (fuel) end and one oxidizing end, separated by a pair of firmly crossed fingers, is an invitation to disaster.”

“I looked around and signaled to my own gang, and we started backing away gently, like so many cats with wet feet.”

“And there is one disconcerting thing about working with a computer — it’s likely to talk back to you. You make some tiny mistake in your FORTRAN language — putting a letter in the wrong column, say, or omitting a comma — and the 360 comes to a screeching halt and prints out rude remarks, like “ILLEGAL FORMAT,” or “UNKNOWN PROBLEM,” or, if the man who wrote the program was really feeling nasty that morning, “WHAT’S THE MATTER STUPID? CAN’T YOU READ?” Everyone who uses a computer frequently has had, from time to time, a mad desire to attack the precocious abacus with an axe.”

Rubisco has become a victim of its own success

Saturday, February 9th, 2019

Plants convert sunlight into energy through photosynthesis, but this process involves a glitch — and an “expensive” fix, called photorespiration:

Researchers from the University of Illinois and U.S. Department of Agriculture Agricultural Research Service report in the journal Science that crops engineered with a photorespiratory shortcut are 40 percent more productive in real-world agronomic conditions.


Photosynthesis uses the enzyme Rubisco — the planet’s most abundant protein — and sunlight energy to turn carbon dioxide and water into sugars that fuel plant growth and yield. Over millennia, Rubisco has become a victim of its own success, creating an oxygen-rich atmosphere. Unable to reliably distinguish between the two molecules, Rubisco grabs oxygen instead of carbon dioxide about 20 percent of the time, resulting in a plant-toxic compound that must be recycled through the process of photorespiration.

“Photorespiration is anti-photosynthesis,” said lead author Paul South, a research molecular biologist with the Agricultural Research Service, who works on the RIPE project at Illinois. “It costs the plant precious energy and resources that it could have invested in photosynthesis to produce more growth and yield.”
Photorespiration normally takes a complicated route through three compartments in the plant cell. Scientists engineered alternate pathways to reroute the process, drastically shortening the trip and saving enough resources to boost plant growth by 40 percent. This is the first time that an engineered photorespiration fix has been tested in real-world agronomic conditions.


The team engineered three alternate routes to replace the circuitous native pathway. To optimize the new routes, they designed genetic constructs using different sets of promoters and genes, essentially creating a suite of unique roadmaps. They stress tested these roadmaps in 1,700 plants to winnow down the top performers.

Over two years of replicated field studies, they found that these engineered plants developed faster, grew taller, and produced about 40 percent more biomass, most of which was found in 50-percent-larger stems.

The team tested their hypotheses in tobacco: an ideal model plant for crop research because it is easier to modify and test than food crops, yet unlike alternative plant models, it develops a leaf canopy and can be tested in the field. Now, the team is translating these findings to boost the yield of soybean, cowpea, rice, potato, tomato, and eggplant.
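The 20 percent oxygenation figure quoted above implies a concrete tax on carbon fixation. A back-of-the-envelope sketch, assuming the textbook stoichiometry that recycling the by-product of oxygenation re-releases roughly half a CO2 per oxygenation event (and ignoring the additional ATP/NADPH bill):

```python
def net_co2_fixed(total_reactions, oxygenation_fraction=0.20):
    """Toy accounting of Rubisco's dual activity.

    Assumed stoichiometry: each carboxylation fixes one CO2, while
    recycling the 2-phosphoglycolate produced by oxygenation
    re-releases roughly half a CO2 per oxygenation event.
    """
    carboxylations = total_reactions * (1 - oxygenation_fraction)
    oxygenations = total_reactions * oxygenation_fraction
    return carboxylations - 0.5 * oxygenations

# Per 100 Rubisco reactions: 80 fix CO2, while the 20 oxygenations
# claw back about 10 fixed carbons during cleanup.
print(net_co2_fixed(100))
```

On those assumptions, a fifth of Rubisco's catalytic effort not only fixes nothing but undoes carbon already fixed, which is why shortening the photorespiratory detour pays off so handsomely.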

An enterprising evil grad student could weaponize kudzu and then hold the world ransom for…one million dollars!

Keeping someone in solitary for more than 15 days constitutes torture

Friday, February 8th, 2019

Professional gambler Rich Alati took an unusual bet:

On 10 September last year, the American was sitting at a poker table at the Bellagio in Las Vegas, when he was asked a question by a fellow professional player, Rory Young: how much would it take for him to spend time in complete isolation, with no light, for 30 days? An hour later a price had been agreed: $100,000.

Young would hand over the money if Alati could last 30 days in a soundproofed bathroom with no light. He would be delivered food from a local restaurant, but the meals would come at irregular intervals to prevent him from keeping track of time. There would be no TV, radio, phone or access to the outside world but he would be allowed some comforts: a yoga mat, resistance band, massage ball, and, appropriately for a bathroom, lavender essential oils as well as a sugar and salt scrub. If Alati failed he would have to pay Young $100,000.


Dr Michael Munro, a psychologist Young consulted before agreeing to the bet, told Young: “Even if he lasts for 30 days, it will be extremely taxing on his mental health for the short and potentially long term.”

There’s good reason for such caution. Solitary confinement is often used as punishment, most notably in the United States, where inmates in solitary are isolated in their cells 23 hours a day. The United Nations’ Nelson Mandela Rules state that keeping someone in solitary for more than 15 days constitutes torture.


But Alati was confident. He had practiced meditation and yoga, and was certain his experiences at silent retreats would help him. On 21 November, a crowd of families and friends gathered at the house where the challenge would take place. Alati and Young’s lawyers were there as well as cameramen from a production company interested in buying television rights to the story. For that reason, as well as safety, the entire bet would be recorded. Alati’s father was given the power to pull Alati out at any time should he show signs of not being “in the right headspace,” as Alati puts it.


Around the 10-day mark, Young started to worry that Alati might make the 30 days, noting he looked “totally fine”. He worried he had miscalculated: Young hadn’t known Alati – a gregarious, fast talker – for long before they had made the bet. “His personality did not reflect that of someone who was proficient with meditation,” Young said.

On day 15, Young’s voice came on over the loudspeaker. Alati jumped out of bed, happy to hear a voice that wasn’t his own. Young told Alati that he had been in for around two weeks and that he had an offer for him: Alati could leave if he paid out $50,000.


Alati waited for a few days until Young came back on the loudspeaker and asked if he had any offers of his own. Alati said he wouldn’t come out for less than $75,000, to which Young countered with an offer of $40,000. They settled on $62,400. Alati had been in the silence and dark for 20 days.

Widespread use would provide an entire new category for the Darwin Awards

Thursday, January 31st, 2019

The Four Thieves Vinegar Collective is a volunteer network of anarchists and hackers developing DIY medicines:

Four Thieves claims to have successfully synthesized five different kinds of pharmaceuticals, all of which were made using the MicroLab. The device attempts to mimic, at a fraction of the price, an expensive machine usually found only in chemistry laboratories, using readily available off-the-shelf parts. In the case of the MicroLab, the reaction chambers consist of a small mason jar mounted inside a larger mason jar with a 3D-printed lid whose printing instructions are available online. A few small plastic hoses and a thermistor to measure temperature are then attached through the lid to circulate fluids through the contraption and induce the chemical reactions necessary to manufacture various medicines. The whole process is automated using a small computer that costs about $30.

To date, Four Thieves has used the device to produce homemade Naloxone, a drug used to prevent opiate overdoses better known as Narcan; Daraprim, a drug that treats infections in people with HIV; Cabotegravir, a preventative HIV medicine that may only need to be taken four times per year; and mifepristone and misoprostol, two chemicals needed for pharmaceutical abortions.


As for the DEA, none of the pharmaceuticals produced by the collective are controlled substances, so their possession is only subject to local laws about prescription medicines. If a person has a disease and a prescription for the drug to treat that disease, they shouldn’t run into any legal issues if they were to manufacture their own medicine. Four Thieves is effectively just liberating information on how to manufacture certain medicines at home and developing the open source tools to make it happen. If someone decides to make drugs using the collective’s guides then that’s their own business, but Four Thieves doesn’t pretend that the information it releases is for “educational purposes only.”


The catalyst for the Four Thieves Vinegar Collective was a trip Laufer took to El Salvador in 2008, when he was still in graduate school. While visiting a rural medical clinic as part of an envoy documenting human rights violations in the country, he learned that it had run out of birth control three months prior. When the clinic contacted the central hospital in San Salvador, it was informed that the other hospital had also run out of birth control. Laufer told me he was stunned that the hospitals were unable to source birth control, a relatively simple drug to manufacture that’s been around for over half a century. He figured that if drug dealers in the country were able to use underground labs to manufacture illicit drugs, a similar approach could be taken to life-saving medicines.
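The automation described in the excerpt — a cheap computer toggling heat against a thermistor reading — is, at its core, a thermostat. A minimal sketch of that kind of bang-bang control loop (the function name, rates, and setpoint here are hypothetical illustrations, not taken from the MicroLab's actual firmware):

```python
def heater_command(temp_c, setpoint_c, hysteresis_c=0.5, currently_on=False):
    """Bang-bang control with hysteresis: switch the heater on below
    the lower bound, off above the upper bound, and otherwise keep
    the current state to avoid rapid toggling."""
    if temp_c < setpoint_c - hysteresis_c:
        return True
    if temp_c > setpoint_c + hysteresis_c:
        return False
    return currently_on

# Toy simulation: the heater adds 0.4 C per step, ambient losses
# remove 0.1 C per step; the vessel should settle near 60 C.
temp, on = 20.0, False
for _ in range(200):
    on = heater_command(temp, setpoint_c=60.0, currently_on=on)
    temp += (0.4 if on else 0.0) - 0.1
print(round(temp, 1))
```

A real controller would read the thermistor through an ADC and drive a relay or pump, but the decision logic is roughly this simple — which is exactly the critics' point about how much else (purity, by-products, validation) such a loop does not control.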

This doesn’t seem wise:

Eric Von Hippel, an economist at MIT who researches “open innovation,” is enthusiastic about the promise of DIY drug production, but only under certain conditions. He cited a pilot program in the Netherlands that is exploring the independent production of medicines tailor-made for individual patients as a good example of safe DIY drug production. These drugs are made in the hospital by trained experts. Von Hippel believes it can be dangerous when patients undertake drug production on their own.
“If one does not do chemical reactions under just-right conditions, one can easily create dangerous by-products along with the drug one is trying to produce,” von Hippel told me in an email. “Careful control of reactor conditions is unlikely in DIY chemical reactors such as the MicroLab design offered for free by the Four Thieves Vinegar Collective.”

His colleague, Harold DeMonaco, a visiting scientist at MIT, agreed. DeMonaco suggested that a more rational solution to the problems addressed would be for patients to work with compounding pharmacies. Compounding pharmacies prepare personalized medicine for their customers and DeMonaco said they are able to synthesize the same drugs Four Thieves is producing at low costs, but with “appropriate safeguards.”

“Unless the system is idiot proof and includes validation of the final product, the user is exposed to a laundry list of rather nasty stuff,” DeMonaco told me in an email. “Widespread use [of Four Thieves’ devices] would provide an entire new category for the Darwin Awards.”

He regularly asks students to throw spears at him

Monday, January 28th, 2019

Anthropologists have long concluded that Neanderthals used their thick, heavy spears only at close range, because the academics could only throw those spears about 10 meters. What happens when athletes throw Neanderthal spears?

On a very cold January morning, in an athletic field in central England, Annemieke Milks watched as six javelin-throwers hurled a pair of wooden spears. Their target was a hay bale, “meant to approximate the kill zone of a large animal like a horse,” says Milks, an archeologist at University College London. And their spears were replicas of the oldest complete hunting weapons ever found — a set of 300,000-year-old, six-and-a-half-foot sticks found in a mine at Schöningen, Germany.

The athletes managed to throw their replicas over distances of 65 feet. That’s a far cry from modern javelin feats — the world record for men, set in 1996, is 323.1 feet. But it’s twice what many scientists thought that primitive spears were capable of. It suggests that, contrary to popular belief, early spear-makers — Neanderthals, or perhaps other ancient species like Homo heidelbergensis — could probably have hunted their prey from afar.


“The 10-meter distance was repeated over and over again, but not backed up with much evidence.” It came from an influential ethnographic review that considered the spear-throwing skills of many modern populations, but didn’t include adept groups like the Tasmanian and Tiwi peoples of Australia. And it was bolstered by studies and anecdotal reports in which spears were thrown by anthropologists—hardly a decent stand-in for a skilled Neanderthal hunter.

For example, John Shea, an archeologist at Stony Brook University, tells me that he regularly takes his students into an athletic field and asks them to throw replica Schöningen spears at him. “If they hit me, I pledge to give them $20,” he says. “I’ve been doing this ‘experiment’ for 25 years and I’ve neither got so much as a scratch on me nor parted with any cash. The spears come sailing in so low and slow I can usually just step sideways out of the way, bat them away with a stick or, if I am feeling really cocky, catch them in mid-air.”

A German sport scientist and javelin-thrower named Hermann Rieder had more success: In a small study, he managed to hit targets from around 16 feet away and suggested that the spears were useful weapons at longer distances.


It’s sometimes said that heavy spears would slow mid-flight and hit their targets with dull thuds. But Milks found that the replicas slowed very little, and landed with a kinetic wallop comparable to projectiles launched by bows or spear-throwing tools.

But Steve Churchill, an anthropologist from Duke University, notes that the javelin-throwers only hit their target a quarter of the time, and less so at the furthest distances. He’s also unclear as to how many of those “hits” would have been strong enough to, say, penetrate an animal’s hide. In his own experience (and he freely admits that he’s not a trained thrower), Schöningen replicas wobble a lot and tend to strike targets at glancing angles. They might fly far, in other words, but do they fly true? “This is a very good study,” he says, but “I don’t see a lot here to convince me that the Schöningen spears were effective long range weapons.”

Milks counters that professional javelin-throwers go for distance, and aren’t trained to hit targets. Despite that, some of them clearly got the sense that the heavy spears behave unusually, vibrating along their axis and flexing on impact. The more experienced athletes compensated for this by putting spin on the spears. “That brought home how important it is to use skilled throwers,” Milks says. “What I really want to do now is to go to hunter-forager groups and have them show us what these spears are capable of. They use spears from age 6, which is something I can’t replicate with javelin athletes.”

It was the usual horror story

Friday, January 25th, 2019

I can’t say I know much about Mother Jones, but I was surprised to see them publish a “scary” look into the science of smoking pot:

It’s been a few years since Alex Berenson has “committed journalism,” as he likes to say. As a New York Times reporter, Berenson did two tours covering the Iraq War, an experience that inspired him to write his first of nearly a dozen spy novels. Starting with the 2006 Edgar Award-winning The Faithful Spy, his books were so successful that he left the Times in 2010 to write fiction full time. But his latest book, out January 8, strays far from the halls of Langley and the jihadis of Afghanistan. Tell Your Children is nonfiction that takes a sledgehammer to the promised benefits of marijuana legalization, and cannabis enthusiasts are not going to like it one bit.

The book was seeded one night a few years ago when Berenson’s wife, a psychiatrist who evaluates mentally ill criminal defendants in New York, started talking about a horrific case she was handling. It was “the usual horror story, somebody who’d cut up his grandmother or set fire to his apartment — typical bedtime chat in the Berenson house,” he writes. But then, his wife added, “Of course he was high, been smoking pot his whole life.”

Berenson, who smoked a bit in college, didn’t have strong feelings about marijuana one way or another, but he was skeptical that it could bring about violent crime. Like most Americans, he thought stoners ate pizza and played video games — they didn’t hack up family members. Yet his Harvard-trained wife insisted that all the horrible cases she was seeing involved people who were heavy into weed. She directed him to the science on the subject.

We look back and laugh at Reefer Madness, which was pretty over-the-top, after all, but Berenson found himself immersed in some pretty sobering evidence: Cannabis has been associated with legitimate reports of psychotic behavior and violence dating at least to the 19th century, when a Punjabi lawyer in India noted that 20 to 30 percent of patients in mental hospitals were committed for cannabis-related insanity. The lawyer, like Berenson’s wife, described horrific crimes — including at least one beheading — and attributed far more cases of mental illness to cannabis than to alcohol or opium. The Mexican government reached similar conclusions, banning cannabis sales in 1920 — nearly 20 years before the United States did — after years of reports of cannabis-induced madness and violent crime.

Over the past couple of decades, studies around the globe have found that THC — the active compound in cannabis — is strongly linked to psychosis, schizophrenia, and violence. Berenson interviewed far-flung researchers who have quietly but methodically documented the effects of THC on serious mental illness, and he makes a convincing case that a recreational drug marketed as an all-around health product may, in fact, be really dangerous — especially for people with a family history of mental illness and for adolescents with developing brains.

A 2002 study in BMJ (formerly the British Medical Journal) found that people who used cannabis by age 15 were four times as likely to develop schizophrenia or a related syndrome as those who’d never used. Even when the researchers excluded kids who had shown signs of psychosis by age 11, they found that the adolescent users had a threefold higher risk of demonstrating symptoms of schizophrenia later on. One Dutch marijuana researcher that Berenson spoke with estimated, based on his own work, that marijuana could be responsible for as much as 10 percent of psychosis in places where heavy use is common.

These studies are hardly Reagan-esque, drug warrior hysteria. In 2017, the National Academies of Sciences, Engineering, and Medicine issued a report nearly 500 pages long on the health effects of cannabis and concluded that marijuana use is strongly associated with the development of psychosis and schizophrenia. The researchers also noted that there’s decent evidence linking pot consumption to worsening symptoms of bipolar disorder and to a heightened risk of suicide, depression, and social anxiety disorders: “The higher the use, the greater the risk.”

Given that marijuana use is up 50 percent over the past decade, if the studies are accurate, we should be experiencing a big increase in psychotic diseases. And we are, Berenson argues. He reports that from 2006 to 2014, the most recent year for which data is available, the number of ER visitors co-diagnosed with psychosis and a cannabis use disorder tripled, from 30,000 to 90,000.

Legalization advocates would say Berenson and the researchers have it backwards: Pot doesn’t cause mental illness; mental illness drives self-medication with pot. But scientists find that theory wanting. Longitudinal studies in New Zealand, Sweden, and the Netherlands spanning several decades identified an association between cannabis and mental illness even when accounting for prior signs of mental illness. In an editorial published alongside the influential 2002 BMJ study on psychosis and marijuana, two Australian psychiatrists wrote that these and other findings “strengthen the argument that use of cannabis increases the risk of schizophrenia and depression, and they provide little support for the belief that the association between marijuana use and mental health problems is largely due to self-medication.”

One of the book’s most convincing arguments against the self-medication theory is that psychosis and schizophrenia are diseases that typically strike people during adolescence or in their early 20s. But with increasing pot use, the number of people over 30 coming into the ER with psychosis has also shot up, suggesting that cannabis might be a cause of mental illness in people with no prior history of it.

Malcolm Gladwell wrote a similar piece in the New Yorker, emphasizing how little we know about marijuana compared to legal drugs, and Berenson himself has an opinion piece in the New York Times, where he points out that many of the same people pressing for marijuana legalization argued that the risks of opioid addiction could be easily managed.

Your dominant frequency is how many times per second your brain pulses alpha waves

Thursday, January 24th, 2019

Magnetic EEG/ECG-guided Resonant Therapy, or MeRT, aims to return a person’s brain to the beat of its natural information-processing rhythm, or its dominant frequency:

Your dominant frequency is how many times per second your brain pulses alpha waves. “We’re all somewhere between 8 and 13 hertz. What that means is that we encode information 8 to 13 times per second. You’re born with a signature. There are pros and cons to all of those. If you’re a slower thinker, you might be more creative. If you’re faster, you might be a better athlete,” Won says.

Navy SEALs tend to have higher-than-average dominant frequencies, around 11 or 13 Hz. But physical and emotional trauma can disrupt that, causing the back of the brain and the front of the brain to emit electricity at different rates. The result: lopsided brain activity. MeRT seeks to detect arrhythmia, find out which regions are causing it, and nudge the off-kilter ones back onto the beat.

“Let’s just say in the left dorsal lateral prefrontal cortex, towards the front left side of the brain, if that’s cycling at 2 hertz, where we are 3 or 4 standard deviations below normal, you can pretty comfortably point to that and say that these neurons aren’t firing correctly. If we target that area and say, ‘We are going to nudge that area back to, say, 11 hertz,’ some of those symptoms may improve,” says Won. “In the converse scenario, in the right occipital parietal lobe where, if you’ve taken a hit, you may be cycling too fast. Let’s say it’s 30 hertz. You’re taking in too much information, oversampling your environment. And if you’re only able to process it using executive function 11 times per second, that information overload might manifest as anxiety.”

If the theory behind MeRT is true, it could explain, at least partially, why a person may suffer from many mental-health symptoms at once: anxiety, depression, attention deficits, etc. The pharmaceutical industry treats them with separate drugs, but they may all share a similar cause, and thus respond to a single treatment. That, anyway, is what Won’s preliminary results are suggesting.

“You don’t see these type of outcomes with psychopharma or these other types of modalities, so it was pretty exciting,” he said.

There are lots of transcranial direct stimulation therapies out there, with few results to boast of. What distinguishes MeRT from other attempts to treat mental disorders with electrical fields is the use of EEG as a guide.

Berger’s telepathy theories never panned out

Wednesday, January 23rd, 2019

The idea of electric therapy goes way, way back:

The idea that electricity, properly administered, could treat illness goes back to 1743 when a German physician named Johann Gottlob Kruger of the University of Halle successfully treated a harpsichordist with arthritis via electrical stimulation to the hand. John Wesley, the father of Methodism, also experimented with electricity as a therapeutic and declared it “The nearest an Universal medicine of any yet known in the world.”

But the idea remained mostly an idea with no real science to back it up, until the 20th century.

Enter Hans Berger, a German scientist who wanted to show that human beings were capable of telepathy via an unseen force he referred to as “psychic energy.” He believed this energy derived from an invisible relationship between blood flow, metabolism, emotion, and the sensation of pain and thought that if he could find physical evidence that psychic energy existed, perhaps humanity could learn to control it.

To test his theory, he needed a way to record the brain’s electrical activity. In 1924, he applied a galvanometer, a tool originally built to measure the heart’s electrical activity, to the skull of a young brain-surgery patient. The galvanometer was essentially a string of silver-coated quartz filament flanked by magnets. The filament would move as it encountered electromagnetic activity, which could be graphed. Berger discovered that the brain produced electrical oscillations at varying strengths. He dubbed the larger ones, of 8 to 12 Hz, alpha waves and the smaller ones beta waves, and named the graphing of these waves an electroencephalogram, or EEG.

Berger’s telepathy theories never panned out, but the EEG became a healthcare staple, used to detect abnormal brain activity, predict potential seizures, and more.

Anger is no longer his go-to emotion

Tuesday, January 22nd, 2019

In many ways, SEALS represent the perfect test group for experimental brain treatment:

At the lab, Tony (whose name has been changed to protect his identity) met Dr. Erik Won, president and CEO of the Newport Brain Research Laboratory, the company that’s innovating Magnetic EEG/ECG-guided Resonant Therapy, or MeRT. Won’s team strapped cardiac sensors on Tony and placed an electroencephalography cap on his skull to measure his brain’s baseline electrical activity. Then came the actual therapy. Placing a flashlight-sized device by Tony’s skull, they induced an electromagnetic field that sent a small burst of current to his brain. Over the course of 20 minutes, they moved the device around his cranium, delivering jolts that, at their most aggressive, felt like a firm finger tapping.

For Tony, MeRT’s effects were obvious and immediate. He walked out of the first session to a world made new. “Everything looked different,” he told me. “My bike looked super shiny.”

He began to receive MeRT five times a week — each session lasting about an hour, with waiting room time — and quickly noticed a change in his energy. “I was super boosted,” he said. His mood changed as well.

Today, he admits that he still has moments of frustration but says that anger is no longer his “go-to emotion.” He’s developed the ability to cope. He still wants help with his memory, but his life is very different. He’s taken up abstract painting and welding, two hobbies he had no interest in at all before the therapy. He’s put in a new kitchen. Most importantly, his sleep is very different: better.

Tony’s experience was similar to those of five other special-operations veterans who spoke with Defense One. All took part in a double-blind randomized clinical trial that sought to determine how well MeRT treats Persistent Post-Concussion Symptoms and Post-Traumatic Stress Disorder, or PTSD. Five out of the six were former Navy SEALS.

In many ways, SEALS represent the perfect test group for experimental brain treatment. They enter the service in superb health and then embark on a course of training that heightens mental and physical strength and alertness. Then come their actual jobs, which involve a lot of “breaching”: getting into a place that the enemy is trying to keep you out of. It could be a compound in Abbottabad, Pakistan — or every single door in that compound. Breaching is so central to SEAL work that it’s earned them the nickname “door kickers.” But it often involves not so much kicking as setting off explosives at closer-than-comfortable range. “I got blown up a lot in training,” says Tony, and a lot afterwards as well. Put those two factors together and you have a population with a high-functioning baseline but with many incidents of persistent post-concussive syndrome, often on top of heavy combat-related PTSD and other forms of trauma.

One by one, these former SEALs found their way to Won’s lab. One — let’s call him Bill — sought to cure his debilitating headaches. Another, Ted, a SEAL trainer, had no severe symptoms but wanted to see whether the therapy could improve his natural physical state and performance. A fourth, Jim, also a former SEAL, suffered from severe inability to concentrate, memory problems, and low affect, which was destroying his work performance. “I was forcing myself to act normal,” Jim said. “I didn’t feel like I was good at anything.”

Yet another, a former member of the Air Force Security Forces named Cathy, had encountered blasts and a “constant sound of gunfire” during her deployments to Iraq and Afghanistan. She suffered from memory problems, depression, anger, bouts of confusion, and migraines so severe she had to build a darkroom in her house.

Like Cathy, the rest had difficulty sleeping. Even Ted, who had no severe PTSD-related problems, reported that he “slept like crap,” before the treatment began.

All said that they saw big improvements after a course of therapy that ran five days a week for about four weeks. Bill reported that his headaches were gone, as did Cathy, who said her depression and mood disorders had lessened considerably. Jim’s memory and concentration improved so dramatically that he had begun pursuing a second master’s degree and won a spot on his college’s football team. Ted said he was feeling “20 years younger” physically and found himself better able to keep pace with the younger SEALS he was training. All of it, they say, was a result of small, precisely delivered, pops of electricity to the brain. Jim said the lab had also successfully treated back and limb pain by targeting the peripheral nervous system with the same technique.

Won, a former U.S. Navy Flight Surgeon, and his team have treated more than 650 veterans using MeRT. The walls of the lab are adorned with acrylic paintings from veterans who have sought treatment. The colors, themes, and objects in the paintings evolve, becoming brighter, more optimistic, some displaying greater motor control, as the painter progresses through the therapy.

The lab is about one-third of the way through a double-blind clinical trial that may lead to FDA approval, and so Won was guarded in what he could say about the results of their internal studies. But he said that his team had conducted a separate randomized trial on 86 veterans. After two weeks, 40 percent saw changes in their symptoms; after four weeks, 60 percent did, he said.

Few even had wallets

Tuesday, January 22nd, 2019

A century ago the market economy was important, but a lot of economic activity still took place within the family, Peter Frost notes, especially in rural areas:

In the late 1980s I interviewed elderly French Canadians in a small rural community, and I was struck by how little the market economy mattered in their youth. At that time none of them had bank accounts. Few even had wallets. Coins and bills were kept at home in a small wooden box for special occasions, like the yearly trip to Quebec City. The rest of the time these people grew their own food and made their own clothes and furniture. Farms did produce food for local markets, but this surplus was of secondary importance and could just as often be bartered with neighbors or donated to the priest. Farm families were also large and typically brought together many people from three or four generations.

By the 1980s things had changed considerably. Many of my interviewees were living in circumstances of extreme social isolation, with only occasional visits from family or friends. Even among middle-aged members of the community there were many who lived alone, either because of divorce or because of relationships that had never gone anywhere. This is a major cultural change, and it has occurred in the absence of any underlying changes to the way people think and feel.

Whenever I raise this point I’m usually told we’re nonetheless better off today, not only materially but also in terms of enjoying varied and more interesting lives. That argument made sense back in the 1980s — in the wake of a long economic boom that had doubled incomes, increased life expectancy, and improved our lives through labor-saving devices, new forms of home entertainment, and stimulating interactions with a broader range of people.

Today, that argument seems less convincing. Median income has stagnated since the 1970s and may even be decreasing if we adjust for monetization of activities, like child care, that were previously nonmonetized. Life expectancy too has leveled off and is now declining in the U.S. because of rising suicide rates among people who live alone. Finally, cultural diversity is having the perverse effect of reducing intellectual diversity. More and more topics are considered off-limits in public discourse and, increasingly, in private conversation.

Liberalism is no longer delivering the goods — not only material goods but also the goods of long-term relationships and rewarding social interaction.

Previously they had been a lumpenproletariat of single men and women

Monday, January 21st, 2019

Liberal regimes tend to erode their own cultural and genetic foundations, thus undermining the cause of their success:

Liberalism emerged in northwest Europe. This was where conditions were most conducive to dissolving the bonds of kinship and creating communities of atomized individuals who produce and consume for a market. Northwest Europeans were most likely to embark on this evolutionary trajectory because of their tendency toward late marriage, their high proportion of adults who live alone, their weaker kinship ties and, conversely, their greater individualism. This is the Western European Marriage Pattern, and it seems to go far back in time. The market economy began to take shape at a later date, possibly with the expansion of North Sea trade during early medieval times and certainly with the take-off of the North Sea trading area in the mid-1300s (Note 1).

Thus began a process of gene-culture coevolution: people pushed the limits of their phenotype to exploit the possibilities of the market economy; selection then brought the mean genotype into line with the new phenotype. The cycle then continued anew, with the mean phenotype always one step ahead of the mean genotype.

This gene-culture coevolution has interested several researchers. Gregory Clark has linked the demographic expansion of the English middle class to specific behavioral changes in the English population: increasing future time orientation; greater acceptance of the State monopoly on violence and consequently less willingness to use violence to settle personal disputes; and, more generally, a shift toward bourgeois values of thrift, reserve, self-control, and foresight. Heiner Rindermann has presented the evidence for a steady rise in mean IQ in Western Europe during the late medieval and early modern era. Henry Harpending and myself have investigated genetic pacification during the same timeframe in English society. Finally, hbd*chick has written about individualism in relation to the Western European Marriage Pattern (Note 2).

This process of gene-culture coevolution came to a halt in the late 19th century. Cottage industries gave way to large firms that invested in housing and other services for their workers, and this corporate paternalism eventually became the model for the welfare state, first in Germany and then elsewhere in the West. Working people could now settle down and have families, whereas previously they had largely been a lumpenproletariat of single men and women. Meanwhile, middle-class fertility began to decline, partly because of the rising cost of maintaining a middle-class lifestyle and partly because of sociocultural changes (increasing acceptance and availability of contraception, feminism, etc.).

This reversal of class differences in fertility seems to have reversed the gene-culture coevolution of the late medieval and early modern era.

This is the mindset that enabled northwest Europeans to exploit the possibilities of the market economy

Friday, January 18th, 2019

There is reason to believe that northwest Europeans were pre-adapted to the market economy:

They were not the first to create markets, but they were the first to replace kinship with the market as the main way of organizing social and economic life. Already in the fourteenth century, their kinship ties were weaker than those of other human populations, as attested by marriage data going back to before the Black Death and in some cases to the seventh century (Frost 2017). The data reveal a characteristic pattern:

  • men and women marry relatively late
  • many people never marry
  • children usually leave the nuclear family to form new households
  • households often have non-kin members

This behavioral pattern was associated with a psychological one:

  • weaker kinship and stronger individualism;
  • framing of social rules in terms of moral universalism and moral absolutism, as opposed to kinship-based morality (nepotism, amoral familialism);
  • greater tendency to use internal controls on behavior (guilt proneness, empathy) than external controls (public shaming, community surveillance, etc.)

This is the mindset that enabled northwest Europeans to exploit the possibilities of the market economy. Because they could more easily move toward individualism and social atomization, they could go farther in reorganizing social relationships along market-oriented lines. They could thus mobilize capital, labor, and raw resources more efficiently, thereby gaining more wealth and, ultimately, more military power.

This new cultural environment in turn led to further behavioral and psychological changes. Northwest Europeans have adapted to it just as humans elsewhere have adapted to their own cultural environments, through gene-culture coevolution.


Northwest Europeans adapted to the market economy, especially those who formed the nascent middle class of merchants, yeomen, and petty traders. Over time, this class enjoyed higher fertility and became demographically more important, as shown by Clark (2007, 2009a, 2009b) in his study of medieval and post-medieval England: the lower classes had negative population growth and were steadily replaced, generation after generation, by downwardly mobile individuals from the middle class. By the early 19th century most English people were either middle-class or impoverished descendants of the middle class.

This demographic change was associated with behavioral and psychological changes to the average English person. Time orientation shifted toward the future, as seen in a greater willingness to save money and defer gratification. There was also a long-term decline in personal violence, with male homicide falling steadily from 1150 to 1800 and, parallel to this, a decline in blood sports and other violent though legal practices (cock fighting, bear and bull baiting, public executions). This change can largely be attributed to the State’s monopoly on violence and the consequent removal of violence-prone individuals through court-ordered or extrajudicial executions. Between 1500 and 1750, court-ordered executions removed 0.5 to 1.0% of all men of each generation, with perhaps just as many dying at the scene of the crime or in prison while awaiting trial (Clark 2007; Frost and Harpending 2015).
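Those per-generation figures compound. A back-of-envelope calculation (my arithmetic, not Clark’s or Frost and Harpending’s; it assumes ten 25-year generations between 1500 and 1750, and a combined removal rate of 1–2% per generation once deaths before trial are counted) shows how much of each cohort the process removes overall:

```python
# Back-of-envelope: cumulative fraction of men removed when executions
# (plus deaths at the scene or awaiting trial) claim a fixed share of
# each generation. Rates and generation length are my assumptions.
def cumulative_removed(rate_per_generation, generations):
    surviving = (1 - rate_per_generation) ** generations
    return 1 - surviving

generations = (1750 - 1500) / 25  # ten 25-year generations
for rate in (0.01, 0.02):
    print(f"{rate:.0%} per generation -> "
          f"{cumulative_removed(rate, generations):.1%} removed overall")
```

On these assumptions, roughly 10 to 18 percent of men are removed over the 250-year span — a plausibly selection-relevant fraction.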

Similarly, Rindermann (2018) has argued that mean IQ steadily rose in Western Europe during late medieval and post-medieval times. More people were able to reach higher stages of mental development. Previously, the average person could learn language and social norms well enough, but their ability to reason was hindered by cognitive egocentrism, anthropomorphism, finalism, and animism (Rindermann 2018, p. 49). From the sixteenth century onward, more and more people could better understand probability, cause and effect, and the perspective of another person, whether real or hypothetical. This improvement preceded universal education and improvements in nutrition and sanitation (Rindermann 2018, pp. 86-87).

Decoupling is not a worry for anything but a very small explosion

Thursday, January 17th, 2019

The U.S. government conducted more than 1,000 nuclear tests, most of them in the Nevada desert or on faraway Pacific islands, but it also set off a couple of nukes under Mississippi:

In 1959, the American physicist Albert Latter theorized that setting off a bomb in an underground cavity could muffle the blast. After tests with conventional explosives, Latter wrote that a detonation as big as 100 kilotons—more than six times bigger than the bomb dropped on Hiroshima—“would make a seismic signal so weak it would not even be detected by the Geneva system.” His theory, known as “decoupling,” became a rallying point for people who wanted to keep testing, says Jeffrey Lewis, of the James Martin Center for Nonproliferation Studies in Monterey, California.

“They wanted to come up with a reason that we couldn’t verify an agreement with the Soviets,” says Lewis, who’s also the publisher of the Arms Control Wonk blog. But in 1963, after the Cuban Missile Crisis brought the world nose-to-nose with the unthinkable, the superpowers signed the Limited Test Ban Treaty. It kept future tests underground, and researchers turned to making sure those tests would be spotted.

The Atomic Energy Commission wanted to test Latter’s theory using actual nukes. And salt deposits were considered the ideal places for tests, since they could be excavated more easily than rock and the resulting cavity would endure for years. So the search was on for a salt dome in territory similar to where the Russians tested their bombs, Auburn University historian David Allen Burke says.

“It had to be a certain diameter. It had to be a certain size. It needed to be a very large salt dome that was still a distance underground and not where it could interfere with water or petroleum or anything else,” says Burke, who wrote a book about the Mississippi tests.

That led the agency to southern Mississippi, which is full of salt domes. The government leased a nearly 1,500-acre patch of forest atop one of those domes and got to work.


The first blast, code-named Salmon, was a 5.3-kiloton device that would blow a cavity into the salt dome half a mile underground. The second, Sterling, was only 380 tons, and would go off in the cavity left behind by Salmon. AEC crews drilled a 2,700-foot hole down into the salt dome, lowered the first bomb into it, plugged it with 600 feet of concrete… and waited.

The Salmon test was put off nearly a month by a string of technical problems and bad weather, including Hurricane Hilda, which hit one state over in Louisiana. People living up to five miles from the test site were evacuated and recalled twice in preparation for blasts that never happened. They got paid $10 a head for adults and $5 for children for their trouble.


Far from Latter’s predictions that a blast as big as 100 kilotons could be kept off the scopes, Lewis says, it turned out that decoupling “is not a worry for anything but a very small explosion.” However, the data helped shape a later treaty which limited underground tests to 150 kilotons.


Federal records now indicate cancer rates in Lamar County are lower than both the state and national average.

(Hat tip to Hans Schantz.)

The plasmids force their hosts to lay down their arms

Tuesday, January 15th, 2019

Bacteria evolve drug-resistance in the usual way, but they also spread genes for drug-resistance horizontally, through plasmids:

As a self-defense mechanism, Acinetobacter kills other bacteria that get too close, which doesn’t help the plasmids reproduce. So, the plasmids force their hosts to lay down their arms, allowing them to then pass copies of themselves into the neighboring bacteria.

In response, the researchers mutated the plasmids so they couldn’t stop the bacteria from defending themselves. In another test, they mutated the Acinetobacter itself so its defenses couldn’t be lowered, and in both cases the outcome was the same. The plasmids — and by extension, antibiotic resistance — were unable to spread.