White sharks will immediately vacate their preferred hunting ground

Friday, April 19th, 2019

A recent study finds that great white sharks clear out when killer whales arrive:

“When confronted by orcas, white sharks will immediately vacate their preferred hunting ground and will not return for up to a year, even though the orcas are only passing through,” said marine ecologist Salvador Jorgensen of Monterey Bay Aquarium.

The team collected data from two sources: the comings and goings of 165 great white sharks GPS tagged between 2006 and 2013; and 27 years of population data of orcas, sharks and seals collected by Point Blue Conservation Science at Southeast Farallon Island off the coast of San Francisco.

Great White Shark with Liver Eaten by Orca

In addition, orcas have been observed preying on great white sharks around the world, including near the Farallon Islands. It’s still a little unclear why, but the orca-killed sharks that wash ashore (one is pictured at the top of the page) are missing their livers — their delicious, oil-rich, full-of-vitamins livers.

Consumption of wine was inversely associated with the risk of common cold

Monday, April 8th, 2019

Does drinking alcohol protect against the common cold?

To examine whether intakes of wine, beer, spirits, and total alcohol are associated with the risk of common cold, in 1998–1999 the authors analyzed data from a cohort study carried out in a population of 4,272 faculty and staff of five Spanish universities.

Usual alcohol intake was assessed at baseline by means of a standardized frequency questionnaire that was validated in a random sample of the population. The authors detected 1,353 cases of common cold.

Total alcohol intake and beer and spirits consumption were not related to the occurrence of common cold, whereas consumption of wine was inversely associated with the risk of common cold. When drinkers of >14 glasses of wine per week were compared with teetotalers, the relative risk was 0.6 (95% confidence interval: 0.4, 0.8) after adjustment for age, sex, and faculty/staff status. The association was stronger for red wine.

These results remained unaltered after adjustment for total alcohol intake and for other potential risk factors for common cold. Findings suggest that wine intake, especially red wine, may have a protective effect against common cold. Beer, spirits, and total alcohol intakes do not seem to affect the incidence of common cold.
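A relative risk of 0.6 means the heavy wine drinkers caught colds at 60 percent of the teetotalers' rate. The study's figure was adjusted for age, sex, and faculty/staff status, but the unadjusted version of the measure is simple arithmetic — the counts below are made up for illustration, not taken from the study:

```python
def relative_risk(cases_exposed, n_exposed, cases_unexposed, n_unexposed):
    """Crude relative risk: incidence in the exposed group divided
    by incidence in the unexposed group."""
    return (cases_exposed / n_exposed) / (cases_unexposed / n_unexposed)

# Hypothetical illustration: 12 colds among 100 heavy wine drinkers
# versus 20 colds among 100 teetotalers yields the study's ratio of
# 0.6 -- a 40 percent lower risk in the wine-drinking group.
print(relative_risk(12, 100, 20, 100))  # ~0.6
```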

It’s rare to see joint as well as bone growth

Thursday, April 4th, 2019

Having previously regenerated bone in mice using the BMP2 protein, scientists then added BMP9:

When using the combination on mice with amputated toes, over 60 percent of the stump bones formed a layer of cartilage within three days. Without the proteins, the amputated toes would’ve healed over as normal.

That cartilage is a key part of joints, and shows definite progress in limb regeneration. Even in animals who can naturally regrow lost limbs, it’s rare to see joint as well as bone growth.

“These studies provide evidence that treatment of growth factors can be used to engineer a regeneration response from a non-regenerating amputation wound,” explain the researchers in their paper.

The results of the study showed that the regeneration process was most advanced when BMP2 was applied first, with BMP9 added a week after – in this case it led to the growth of more complete joint structures, even with some connections to the bone.

Serial numbers on Lugers were deliberately confusing

Friday, March 29th, 2019

While discussing the German “Luger” pistol, Dunlap brought up a point that surprised me:

Serial numbers on Lugers were deliberately confusing, as the Germans did not like to have people adding up numbers and estimating production figures, so they organized a code-series system, which is no military secret now, but which I have never completely solved.

This surprised me, because I’d read about the Germans failing to do just that with their tanks:

The statisticians had one key piece of information, which was the serial numbers on captured mark V tanks. The statisticians believed that the Germans, being Germans, had logically numbered their tanks in the order in which they were produced. And this deduction turned out to be right. It was enough to enable them to make an estimate of the total number of tanks that had been produced up to any given moment.

The basic idea was that the highest serial number among the captured tanks could be used to calculate the overall total. The German tanks were numbered as follows: 1, 2, 3 … N, where N was the desired total number of tanks produced. Imagine that they had captured five tanks, with serial numbers 20, 31, 43, 78 and 92. They now had a sample of five, with a maximum serial number of 92. Call the sample size S and the maximum serial number M. After some experimentation with other series, the statisticians reckoned that a good estimator of the number of tanks would probably be provided by the simple equation (M-1)(S+1)/S. In the example given, this translates to (92-1)(5+1)/5, which is equal to 109.2. Therefore the estimate of tanks produced at that time would be 109.

By using this formula, statisticians reportedly estimated that the Germans produced 246 tanks per month between June 1940 and September 1942. At that time, standard intelligence estimates had believed the number was far, far higher, at around 1,400. After the war, the allies captured German production records, showing that the true number of tanks produced in those three years was 245 per month, almost exactly what the statisticians had calculated, and less than one fifth of what standard intelligence had thought likely.
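The quoted estimator is easy to check numerically. Here is a minimal sketch (the function name is mine, not from the article) that reproduces the worked example and then runs a quick simulation against a known production total:

```python
import random

def tank_estimate(serials):
    """The estimator quoted above: (M - 1)(S + 1)/S, where M is the
    highest serial number observed and S is the sample size."""
    m = max(serials)  # highest serial number seen
    s = len(serials)  # number of captured tanks
    return (m - 1) * (s + 1) / s

# The worked example: five captured tanks with these serial numbers.
print(tank_estimate([20, 31, 43, 78, 92]))  # 109.2, i.e. about 109 tanks

# Sanity check: repeatedly sample 5 serials from a true production
# run of 250 and confirm the estimates center near the truth.
random.seed(0)
estimates = [tank_estimate(random.sample(range(1, 251), 5))
             for _ in range(10_000)]
print(sum(estimates) / len(estimates))  # very close to 250
```

The simulated average lands very close to the true total, which is why a handful of captured serial numbers could pin monthly production down so much more tightly than conventional intelligence estimates.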

There is an almost palpable sense of fear in this landscape

Tuesday, March 12th, 2019

New evidence suggests that the “peaceful” Maya fought bitter wars:

In February 2018, National Geographic broke the story of the PACUNAM LiDAR Initiative, a sweeping aerial survey of some 800 square miles (2,100 square kilometers) of the Maya Biosphere Reserve in northern Guatemala. Using revolutionary laser technology, the survey revealed the long-hidden ruins of a sprawling pre-Columbian civilization that was far more complex and interconnected than most Maya specialists had supposed.

Mayan Ruins in Jungle

“You could walk over the top of a major ruin and miss it,” says Thomas Garrison, an Ithaca College archaeologist who’s part of the PACUNAM project. “But LiDAR picks up the patterns and makes the features pop out with astounding clarity.”

Three-dimensional maps generated by the survey yielded surprises even at Tikal, the largest and most extensively explored archaeological site in Guatemala. The ancient city was at least four times bigger than previously thought, and partly surrounded by a massive ditch and rampart stretching for miles.

Mayan Ruins via LIDAR

“This was surprising,” says Houston, “because we had a tendency to romanticize Maya warfare as something that was largely ritualized and concentrated toward the end of the civilization. But the fortifications we’re seeing now suggest an elevated level of conflict over centuries. Rulers were so deeply worried about defense that they felt the need to invest in all these hilltop fortifications. There is an almost palpable sense of fear in this landscape.”

The microblade was best for lacerated wounds

Monday, March 11th, 2019

University of Washington archaeologists have re-created and tested three types of stone-age spear tips:

So Wood traveled to the area around Fairbanks, Alaska, and crafted 30 projectile points, 10 of each kind. She tried to stay as true to the original materials and manufacturing processes as possible, using poplar projectiles, and birch tar as an adhesive to affix the points to the tips of the projectiles. While ancient Alaskans used atlatls (a kind of throwing board), Wood used a maple recurve bow to shoot the arrows for greater control and precision.

Stone, Microblade, and Bone Spear Tips

For the bone tip, modeled on a 12,000-year-old ivory point from an Alaskan archaeological site, Wood used a multipurpose tool to grind a commercially purchased cow bone.

For the stone tip, she used a hammerstone to strike obsidian into flakes, then shaped them into points modeled on those found at another site in Alaska from 13,000 years ago.

And for the composite microblade tip — modeled on microblade technologies seen in Alaska since at least 13,000 years ago, and on a rare, preserved grooved antler point from a more recent Alaskan site used more than 8,000 years ago — Wood used a saw and sandpaper to grind a caribou antler to a point. She then used the multipurpose tool to gouge out a groove around its perimeter, into which she inserted obsidian microblades.

Wood then tested how well each point could penetrate and damage two different targets: blocks of ballistic gelatin (a clear synthetic gelatin meant to mimic animal muscle tissue) and a fresh reindeer carcass, purchased from a local farm. Wood conducted her trials over seven hours on a December day, with an average outdoor temperature of minus 17 degrees Fahrenheit.

(That outdoor temperature seems like it would affect the target quite a bit.)

In Wood’s field trial, the composite microblade points were more effective than simple stone or bone on smaller prey, showing the greatest versatility and ability to cause incapacitating damage no matter where they struck the animal’s body. But the stone and bone points had their own strengths: Bone points penetrated deeply but created narrower wounds, suggesting their potential for puncturing and stunning larger prey (such as bison or mammoth); the stone points could have cut wider wounds, especially on large prey (moose or bison), resulting in a quicker kill.

[...]

“We have shown how each point has its own performance strengths,” she said. Bone points punctured effectively, flaked stone created a greater incision, and the microblade was best for lacerated wounds. “It has to do with the animal itself; animals react differently to different wounds. And it would have been important to these nomadic hunters to bring the animal down efficiently. They were hunting for food.”

Skip the ice

Tuesday, February 26th, 2019

Icing postworkout became practically mandatory after physician Gabe Mirkin coined the term RICE — Rest, Ice, Compression, Elevation — in 1978, and its popularity continues today in marathon medical tents and professional locker rooms:

Ice is meant to slow blood flow, which reduces inflammation and pain. But, it turns out, that can also be counterproductive, as it inhibits the rebuilding of muscle and the restoration process. “Instead of promoting healing and recovery,” Aschwanden writes, “icing might actually impair it.” And that’s led to a growing backlash against icing, which even Mirkin has joined. Instead of rushing to the cold stuff, Aschwanden advises athletes to wait it out and leave time for the body to heal.

This isn’t a new discovery:

As early as 2006, exercise physiologist Motoi Yamane and researchers at Chukyo University in Aichi, in Japan, found that icing leg muscles after cycling or forearm handgrip exercises interfered with performance gains. Recently Yamane published a follow-up study at Aichi Mizuho College — again, using weighted handgrip exercises — that corroborates his earlier results: RICE is disadvantageous after training and messes with both muscular and vascular adaptations of resistance training.

Exercise physiologist Jonathan Peake and his colleagues at Queensland University of Technology in Brisbane, Australia agree. They’re among the latest researchers to test ice baths on athletes. In a recent study presented as an abstract at the 2014 American College of Sports Medicine conference, the researchers put two groups of young men on a bi-weekly resistance-training program. The first group took ice baths after each training session (ten minutes in water at around 50 degrees), while the other group did a low-intensity active warm-down on a bicycle. It turned out that icing suppressed the cell-signaling response that regulates muscle growth. Three months later, the scientists found that the ice-bath group didn’t gain nearly as much muscle as the bicycle warm-down group.

Peake concluded that it’s probably not a good idea to use ice baths after every training session, particularly when athletes are in season. In a parallel study presented March 30 at the Experimental Biology meeting, Peake also looked at muscle biopsies in a rat contusion injury model (researchers dropped weights on rats’ leg muscles to cause bruising). An ice bath on the bruised muscles was enough to suppress inflammation and delay muscle fiber regeneration. For these minor muscle injuries, icing was detrimental rather than beneficial, prolonging the healing process that inflammation drives.

The two new studies hammer a couple more nails in the RICE coffin, according to Dr. Gabe Mirkin. He was the sports medicine doctor who originally coined the acronym — rest, ice, compression, elevation — in 1978, and he has since quit recommending it to athletes. “We never rest or ice athletes anymore. RICE is fine for someone who doesn’t need to get back to training quickly, but it’s terrible for competitive athletes,” he said.

More movement, Dr. Mirkin says, as shown in Peake’s research, is the best way to speed up muscle recovery. The new research extends a growing body of evidence from the last several years making clear that the only advantage of icing muscles is temporary pain relief. “About all icing is good for is a placebo effect,” Dr. Mirkin says. “There’s no evidence that icing speeds healing or makes you stronger; in fact, it makes you weaker so you can’t do your next hard workout.”

Past incompetence predicts future progress

Monday, February 25th, 2019

Gregory Cochran is (darkly) optimistic about how much low-hanging fruit is out there in the world of medical research:

If we look at cases where an innovation or discovery was possible — even easy — for a long time before it was actually developed, we might be able to find patterns that would help us detect the low-hanging fruit dangling right in front of us today.

For now, one example. We know that gastric and duodenal ulcer, and most cases of stomach cancer, are caused by an infectious organism, Helicobacter pylori. It apparently causes amnesia as well. This organism was first seen in 1875 — nobody paid any attention.

Letulle showed that it induced gastritis in guinea pigs in 1888. Walery Jaworski rediscovered it in 1889 and suspected that it might cause gastric disease. Nobody paid any attention. Krienitz associated it with gastric cancer in 1906. Who cares?

Around 1940, some American researchers rediscovered it, found it more common in ulcerated stomachs, and published their results. Some of them thought that this might be the cause of ulcers — but Palmer, a famous pathologist, couldn’t find it when he looked in the early 50s, so it officially disappeared again. He had used the wrong stain. John Lykoudis, a Greek country doctor, noticed that a heavy dose of antibiotics coincided with his ulcer’s disappearance, and started treating patients with antibiotics — successfully. He tried to interest pharmaceutical companies — wrote to Geigy, Hoechst, Bayer, etc. No joy. JAMA rejected his article. The local medical society referred him for disciplinary action and fined him.

The Chinese noticed that antibiotics could cure ulcers in the early 70s, but they were Commies, so it didn’t count.

Think about it: peptic and duodenal ulcers were fairly common, and so were effective antibiotics, starting in the mid-40s. Every internist in the world — every surgeon, every GP — was accidentally curing ulcers, not just once or twice, but again and again. For decades. Almost none of them noticed it, even though it was happening over and over, right in front of their eyes. Those who did notice were ignored until the mid-80s, when Robin Warren and Barry Marshall finally made the discovery stick. Even then, it took something like 10 years for antibiotic treatment of ulcers to become common, even though it was cheap and effective. Or perhaps because it was cheap and effective.

This illustrates an important point: doctors are lousy scientists, lousy researchers. They’re memorizers, not puzzle solvers. Considering that Western medicine was an ineffective pseudoscience — actually, closer to a malignant pseudoscience — for its first two thousand years, we shouldn’t be surprised. Since we’re looking for low-hanging fruit, this is good news. It means that the great discoveries in medicine are probably not mined out. From our point of view, past incompetence predicts future progress. The worse, the better!

There’s never been a case of a runner dying of dehydration on a marathon course

Monday, February 25th, 2019

What to drink during exercise, and how much, is an ongoing debate among athletes and health professionals:

While daily water-intake recommendations vary (the National Institutes of Health suggest that men consume three liters per day and women 2.2 liters), athletes are invariably told to drink at every opportunity. This hydration preoccupation — often prompted by science of limited rigor and fueled by marketing from sports-drink companies — has led people to drink even when they’re not thirsty, especially when working out. And according to Aschwanden, that could be a big problem. “The body is highly adapted to cope with losing multiple liters of fluid,” she writes.

In fact, the evidence cited in her book shows that drinking too much water poses a much greater risk than drinking too little. Overhydration can dilute blood-sodium levels to dangerously and even fatally low concentrations (a condition known as hyponatremia). This became a recurring problem, for example, at the Comrades Marathon — a famous 90-kilometer race in South Africa — after it added water stations for the first time in 1981. “There’s never been a case of a runner dying of dehydration on a marathon course,” recounts Aschwanden. “But since 1993, at least five marathoners have died from hyponatremia that developed during a race.” Drinking when thirsty, she advises, is a much better approach than forced water consumption.

Why do Bedouins wear black in the desert?

Sunday, February 24th, 2019

Why do Bedouins wear black in the desert?

The question so intrigued four scientists — all non-Bedouins — that they ran an experiment. Their study, “Why Do Bedouins Wear Black Robes in Hot Deserts?”, was published in the journal Nature in 1980.

“It seems likely,” the scientists wrote, “that the present inhabitants of the Sinai, the Bedouins, would have optimised their solutions for desert survival during their long tenure in this desert. Yet one may have doubts on first encountering Bedouins wearing black robes and herding black goats. We have therefore investigated whether black robes help the Bedouins to minimise solar heat loads in a hot desert.”

The research team — C Richard Taylor and Virginia Finch of Harvard University and Amiram Shkolnik and Arieh Borut of Tel Aviv University — quickly discovered that, as you might suspect, a black robe does convey more heat inward than a white robe does. But they doubted that this was the whole story.

Taylor, Finch, Shkolnik, and Borut measured the overall heat gain and loss suffered by a brave volunteer. They described the volunteer as “a man standing facing the sun in the desert at midday while he wore: 1) a black Bedouin robe; 2) a similar robe that was white; 3) a tan army uniform; and 4) shorts (that is, he was semi-nude)”.

Each of the test sessions (black-robed, white-robed, uniformed and half-naked) lasted 30 minutes. They took place in the Negev desert at the bottom of the rift valley between the Dead Sea and the Gulf of Eilat. The volunteer stood in temperatures that ranged from a just-semi-sultry 35°C (95°F) to a character-building 46°C (115°F). Though he is now nameless, this was his day in the sun.

The results were clear. As the report puts it: “The amount of heat gained by a Bedouin exposed to the hot desert is the same whether he wears a black or a white robe. The additional heat absorbed by the black robe was lost before it reached the skin.”

Bedouins’ robes, the scientists noted, are worn loose. Inside, the cooling happens by convection — either through a bellows action, as the robes flow in the wind, or by a chimney sort of effect, as air rises between robe and skin. Thus it was conclusively demonstrated that, at least for Bedouin robes, black is as cool as any other colour.

Jackals moved north because wolves were eradicated

Thursday, February 21st, 2019

Jackals now vastly outnumber wolves in Europe:

Smaller than North American coyotes, the golden jackal weighs an average 20 pounds. It is native to the Middle East and southern Asia, ranging as far east as Thailand and inhabiting Iraq, Iran, Afghanistan, Pakistan and India.

The species arrived at the southern edge of Central and Eastern Europe about 8,000 years ago, fossil evidence suggests, and started to expand slowly in the 19th century. But the current boom really began in the 1950s and has accelerated over the past 20 years.

Jackals are one of the least studied canine predators. Like wolves and coyotes, jackals have family-based packs, but the groups tend to be smaller, with four to six animals, while wolf packs may include 15 animals.

A monogamous pair of jackals forms the core of a pack; the young may stay with the parents, or leave to establish their own packs.

Jackals are not as prominent in tales and proverbs as some other animals, although there’s an old quote, variously attributed, that it is better to live like a lion for a day than a jackal for 100 years. Hemingway described “personal columnists” as jackals, which no doubt refers to their scavenging habits.

Jackals did have one moment of past glory. The Egyptian god Anubis was sometimes said to have a jackal’s head. That claim to fame has been lost: The North African animal that may have inspired the sculptures of Anubis has been reclassified as the African wolf.

Golden Jackal in Croatia

Substantial populations of jackals now live in a number of European countries, including Greece, Slovenia, Croatia, Hungary, Romania, Ukraine, Austria, Italy, and above all, Bulgaria, which has the largest population.

Jackal wanderers — or advance scouts — have been found in France, Italy, Germany, Switzerland, Poland, Belarus, Estonia, the Netherlands and Denmark.

Scientists think jackals began to move north because wolves were targeted for eradication, particularly in the Balkans. That opened a door, since jackals seem to avoid areas well populated by wolves.

[...]

The jackals’ expansion is a huge natural experiment, similar to but more surprising than the spread of coyotes in North America. Coyotes were well established in the West and Southwest before they started arriving in the Northeast and Southeast, and lately in Mexico.

The vampire is consumption in human form

Thursday, February 14th, 2019

The vampire is consumption in human form:

“Peter Plogojowitz…died…and had been buried…within a week, nine people, both old and young, died also…while they were yet alive…Plogojowitz…(came) to them in their sleep, laid himself on them…so that they would give up the ghost… They exhumed…(his) body…which was completely fresh…hair…nails…had grown on him; the old skin…had peeled …a fresh new one had emerged under it…I saw some fresh blood in his mouth…he had sucked from the people killed by him…(we) sharpened a stake…as he was pierced…much blood, completely fresh, flowed… through his…mouth.” (1725)

Tuberculosis was a disease of antiquity with a unique nom de guerre — consumption — and helped give impetus to the vampire myth. The ancient and universal presence of “bloodsuckers” will be chronicled from early times of recorded human existence.

In Western culture, ancient Greeks called their mythological bloodsuckers Lamia. The Libyan princess Lamia had an illicit love affair with Zeus. Hera, the wife whom Zeus spurned, killed all of Lamia’s children and drove Lamia into exile. The tale — told by Aristophanes (446–386 B.C.E.) and Aristotle (384–322 B.C.E.) — described how Lamia sought revenge by sucking the lifeblood of babies. In the second century B.C.E., a manuscript fragment by Titinus suggested garlic hung around the neck of children would protect them from Lamia.

After the arrival of Christianity, undead demons were accommodated by the new religion and renamed Vrykolakas, roaming undead beings afflicting humanity. Many cultures followed, with Albanians, Montenegrins, Bulgarians, Croatians, Hungarians, Serbians, and Russians naming their own ethnic vampires. In fact, it was the Walachian prince Vlad Tepes (the Impaler, 1431–1476) who was Bram Stoker’s model for Dracula. The German schrattl, or shroud-eater, was thought to rise from the grave spreading disease. Mythic undead creatures populated Native North and South American cultures as well. In the 13th and 14th centuries, undead Icelandic beings called Grettirs were added to the other myths.

Although it has been opined that “East is East and West is West and never the twain shall meet,” the vampire myth brought cultural extremes together. Chinese jiangshi were corpses of the drowned, the hanged, or victims of suicide, returning to drain humans of their life force. Japanese and Indian cultures described similar undead bloodsuckers. As fatal communicable diseases and death were universal — and otherwise inexplicable — so were naïve myths of vampires. Myths would evolve into metaphors as well.

“…the body swells…Discoloured natural fluids and liquefied tissues are made frothy by gas and some exude from the natural orifices, forced out by the increasing pressure in the body cavities… eyes bulge… little wonder that Bacon was convinced that purposeful dynamic spirits wrought this awful change.”

“Blood migrates…in the course of decomposition…the gases in the abdomen increase in pressure…and are forced upwards and decomposing blood escapes from the mouth and nostrils.”

In 1739, Austrians occupying Serbia and Walachia investigated reports of a gruesome local custom: Exhuming dead bodies and re-killing them. The practice was a consequence of their ignorance of natural processes of bodily decomposition. During decomposition intestinal bacterial gas flows through blood vessels and tissues pushing blood stained fluids through the nose and mouth. Seven days after death, cadaveric skin loosens, the top layer sheds off in sheets in a process called skin slippage.

At a 1732 exhumation it was written, “They dug up Arnod Paole forty days after his death…and they found…fresh blood had flowed from his eyes, nose, mouth and ears; that the shirt (was) completely bloody…the skin, had fallen off…he was a true vampire, they drove a stake through his heart.” The presence of blood after death was interpreted as a sign of reanimated life — albeit at someone else’s expense. In the context of consumption, the visible confluence of paleness and hemoptysis lent credence to postmortem appearances suggesting life after death. That perception would be further fueled by ignorance regarding the transmission of infectious diseases.

“The Vampire is consumption in human form, embodying an evil that slowly and secretly drains the life from its victims.”

“For as long as what caused tuberculosis was not understood…tuberculosis was thought to be an insidious, implacable theft of a life…Any disease that is treated as a mystery and acute enough to be feared will be felt to be morally…contagious.”

– Susan Sontag

Prior to germ theory, from the time of Aristotle, miasmas — currents of contaminated air circulated by winds — were considered the cause of communicable diseases. As a result, connections between consumption and the vampire myth persisted into the late nineteenth century.

“Mercy died, apparently of tuberculosis, in January 1892…Mercy was a vampire…Mercy’s brother Edwin was a strapping young man of 18…in 1891 Mercy and Edwin both became ill… the boy went off to Colorado, where he recovered. His sister (Mercy) eventually was carried to her grave by the illness…Edwin returned still in weak health…why was such a strong man’s life draining away? Why had the same thing happened to Mercy only a few months before…nothing less terrible than a vampire was sucking their children’s blood and taking their lives with it…to their infinite horror, Mercy’s body, which had blood and seemed unnaturally preserved, with color still in the cheeks…(They) removed the corpse’s heart and burned it on a rock.”

Consumption — the captain of all these men of death — was chosen and persisted as a metaphor for vampirism. Charles Dickens wrote that, for individuals with tuberculosis, life and death are so strangely blended that death takes “the glow and hue of life, and life the gaunt and grisly form of death” (Nicholas Nickleby). Dickens and others who witnessed victims of consumption — precariously hovering between life and death — recognized the stigma:

“The emaciated figure strikes one with terror; the forehead covered with drops of sweat; the cheeks painted with a livid crimson, the eyes sunk; the little fat that raised them in their orbits entirely wasted; the pulse quick and tremulous; the nails long, bending over the ends of the fingers; the palms of the hands dry and painfully hot to the touch; the breath offensive, quick and laborious, and the cough so incessant as scarce to allow the wretched sufferer time to tell his complaints.”

Science, ascendant in the late nineteenth and early twentieth centuries, identified Mycobacterium tuberculosis as the agent of tuberculosis. Science also shed light on bodily decomposition after death. The cure of tuberculosis would follow, and the myths and metaphors surrounding tuberculosis would disappear.

The great vice of the Greeks was extrapolation

Wednesday, February 13th, 2019

I recently read Ignition!, by John D. Clark, and I found it an odd mix of fun, opinionated bits and dry chemistry:

“Now it is clear that anyone working with rocket fuels is outstandingly mad. I don’t mean garden-variety crazy or a merely raving lunatic. I mean a record-shattering exponent of far-out insanity.”

“It is, of course, extremely toxic, but that’s the least of the problem. It is hypergolic with every known fuel, and so rapidly hypergolic that no ignition delay has ever been measured. It is also hypergolic with such things as cloth, wood, and test engineers, not to mention asbestos, sand, and water — with which it reacts explosively. It can be kept in some of the ordinary structural metals — steel, copper, aluminium, etc. — because of the formation of a thin film of insoluble metal fluoride which protects the bulk of the metal, just as the invisible coat of oxide on aluminium keeps it from burning up in the atmosphere. If, however, this coat is melted or scrubbed off, and has no chance to reform, the operator is confronted with the problem of coping with a metal-fluorine fire. For dealing with this situation, I have always recommended a good pair of running shoes.”

“If your propellants flow into the chamber and ignite immediately, you’re in business. But if they flow in, collect in a puddle, and then ignite, you have an explosion which generally demolishes the engine and its immediate surroundings. The accepted euphemism for this sequence of events is a ‘hard start.’”

“Their guess turned out to be right, but one is reminded of E. T. Bell’s remark that the great vice of the Greeks was not sodomy but extrapolation.”

“…a molecule with one reducing (fuel) end and one oxidizing end, separated by a pair of firmly crossed fingers, is an invitation to disaster.”

“I looked around and signaled to my own gang, and we started backing away gently, like so many cats with wet feet.”

“And there is one disconcerting thing about working with a computer — it’s likely to talk back to you. You make some tiny mistake in your FORTRAN language — putting a letter in the wrong column, say, or omitting a comma — and the 360 comes to a screeching halt and prints out rude remarks, like “ILLEGAL FORMAT,” or “UNKNOWN PROBLEM,” or, if the man who wrote the program was really feeling nasty that morning, “WHAT’S THE MATTER STUPID? CAN’T YOU READ?” Everyone who uses a computer frequently has had, from time to time, a mad desire to attack the precocious abacus with an axe.”

Rubisco has become a victim of its own success

Saturday, February 9th, 2019

Plants convert sunlight into energy through photosynthesis, but this process involves a glitch — and an “expensive” fix, called photorespiration:

Researchers from the University of Illinois and U.S. Department of Agriculture Agricultural Research Service report in the journal Science that crops engineered with a photorespiratory shortcut are 40 percent more productive in real-world agronomic conditions.

[...]

Photosynthesis uses the enzyme Rubisco — the planet’s most abundant protein — and sunlight energy to turn carbon dioxide and water into sugars that fuel plant growth and yield. Over millennia, Rubisco has become a victim of its own success, creating an oxygen-rich atmosphere. Unable to reliably distinguish between the two molecules, Rubisco grabs oxygen instead of carbon dioxide about 20 percent of the time, resulting in a plant-toxic compound that must be recycled through the process of photorespiration.

“Photorespiration is anti-photosynthesis,” said lead author Paul South, a research molecular biologist with the Agricultural Research Service, who works on the RIPE project at Illinois. “It costs the plant precious energy and resources that it could have invested in photosynthesis to produce more growth and yield.”

Photorespiration normally takes a complicated route through three compartments in the plant cell. Scientists engineered alternate pathways to reroute the process, drastically shortening the trip and saving enough resources to boost plant growth by 40 percent. This is the first time that an engineered photorespiration fix has been tested in real-world agronomic conditions.

[...]

The team engineered three alternate routes to replace the circuitous native pathway. To optimize the new routes, they designed genetic constructs using different sets of promoters and genes, essentially creating a suite of unique roadmaps. They stress tested these roadmaps in 1,700 plants to winnow down the top performers.

Over two years of replicated field studies, they found that these engineered plants developed faster, grew taller, and produced about 40 percent more biomass, most of which was found in 50-percent-larger stems.

The team tested their hypotheses in tobacco: an ideal model plant for crop research because it is easier to modify and test than food crops, yet unlike alternative plant models, it develops a leaf canopy and can be tested in the field. Now, the team is translating these findings to boost the yield of soybean, cowpea, rice, potato, tomato, and eggplant.
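A rough back-of-the-envelope sketch (not from the study) shows why a 20 percent oxygenation rate is so expensive. The 0.5-unit recycling cost per oxygenation below is an assumed round number for illustration, chosen to land in the commonly cited range of photorespiratory losses for C3 plants:

```python
# Toy model: Rubisco fixes carbon when it grabs CO2, but about 20% of
# the time it grabs O2 instead, producing a toxic compound that costs
# previously fixed carbon to recycle (photorespiration).

def net_fixation(reactions, oxygenation_rate=0.20, recycle_cost=0.5):
    """Net carbon fixed after paying the photorespiration recycling cost.

    reactions        -- total Rubisco reactions
    oxygenation_rate -- fraction that grab O2 instead of CO2 (source: ~20%)
    recycle_cost     -- fixed-carbon units spent recycling each oxygenation
                        (assumed round number for illustration)
    """
    carboxylations = reactions * (1 - oxygenation_rate)
    oxygenations = reactions * oxygenation_rate
    return carboxylations - oxygenations * recycle_cost

ideal = 1000                    # all reactions grab CO2
actual = net_fixation(1000)     # 800 fixed, minus 100 spent recycling = 700
print(f"net fixation: {actual:.0f} of {ideal} ({(1 - actual/ideal):.0%} lost)")
```

Under these toy numbers the plant loses about 30 percent of its potential fixation, which is why shortening the recycling route pays off so dramatically.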

An enterprising evil grad student could weaponize kudzu and then hold the world ransom for…one million dollars!

Keeping someone in solitary for more than 15 days constitutes torture

Friday, February 8th, 2019

Professional gambler Rich Alati took an unusual bet:

On 10 September last year, the American was sitting at a poker table at the Bellagio in Las Vegas, when he was asked a question by a fellow professional player, Rory Young: how much would it take for him to spend time in complete isolation, with no light, for 30 days? An hour later a price had been agreed: $100,000.

Young would hand over the money if Alati could last 30 days in a soundproofed bathroom with no light. He would be delivered food from a local restaurant, but the meals would come at irregular intervals to prevent him from keeping track of time. There would be no TV, radio, phone or access to the outside world but he would be allowed some comforts: a yoga mat, resistance band, massage ball, and, appropriately for a bathroom, lavender essential oils as well as a sugar and salt scrub. If Alati failed he would have to pay Young $100,000.

[...]

Dr Michael Munro, a psychologist Young consulted before agreeing to the bet, told Young: “Even if he lasts for 30 days, it will be extremely taxing on his mental health for the short and potentially long term.”

There’s good reason for such caution. Solitary confinement is often used as punishment, most notably in the United States, where inmates in solitary are isolated in their cells 23 hours a day. The United Nations’ Nelson Mandela Rules state that keeping someone in solitary for more than 15 days constitutes torture.

[...]

But Alati was confident. He had practiced meditation and yoga, and was certain his experiences at silent retreats would help him. On 21 November, a crowd of families and friends gathered at the house where the challenge would take place. Alati and Young’s lawyers were there as well as cameramen from a production company interested in buying television rights to the story. For that reason, as well as safety, the entire bet would be recorded. Alati’s father was given the power to pull Alati out at any time should he show signs of not being “in the right headspace,” as Alati puts it.

[...]

Around the 10-day mark, Young started to worry that Alati might make the 30 days, noting he looked “totally fine”. He worried he had miscalculated: Young hadn’t known Alati – a gregarious, fast talker – for long before they had made the bet. “His personality did not reflect that of someone who was proficient with meditation,” Young said.

On day 15, Young’s voice came on over the loudspeaker. Alati jumped out of bed, happy to hear a voice that wasn’t his own. Young told Alati that he had been in for around two weeks and that he had an offer for him: Alati could leave if he paid out $50,000.

[...]

Alati waited for a few days until Young came back on the loudspeaker and asked if he had any offers of his own. Alati said he wouldn’t come out for less than $75,000, to which Young countered with an offer of $40,000. They settled on $62,400. Alati had been in the silence and dark for 20 days.