Domes are overrated

Sunday, December 8th, 2019

It is an unwritten rule of space journalism that any article about Moon or Mars bases needs to have a conceptual drawing of habitation domes, Casey Handmer notes, but domes are overrated:

Domes feature compound curvature, which complicates manufacturing. If assembled from triangular panels, junctions contain multiple intersecting acute angled parts, which makes sealing a nightmare. In fact, even residential dome houses are notoriously difficult to insulate and seal! A rectangular room has 6 faces and 12 edges, which can be framed, sealed, and painted in a day or two. A dome room has a new wall every few feet, all with weird triangular faces and angles, and enormously increased labor overhead.


It turns out that the main advantage of domes — no internal supports — becomes a major liability on Mars. While rigid geodesic domes on Earth are compressive structures, on Mars, a pressurized dome actually supports its own weight and then some. As a result, the structure is under tension and the dome is attempting to tear itself out of the ground. Since lifting force scales with area, while anchoring force scales with circumference, domes on Mars can’t be much wider than about 150 feet, and even then would require extensive foundation engineering.
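The scaling argument is easy to illustrate with a few lines of arithmetic. In the sketch below, the pressure differential and the per-meter anchoring capacity are illustrative assumptions, not figures from Handmer's article; the point is only that lift grows as r² while anchoring grows as r, so their ratio grows linearly with radius and eventually exceeds 1.

```python
import math

# Net pressure lift on a dome grows with its footprint area (pi * r^2),
# while hold-down capacity grows with its perimeter (2 * pi * r).
# Both constants below are assumed, round numbers for illustration.

DELTA_P = 100_000        # Pa: ~1 atm inside vs. near-vacuum outside
ANCHOR_STRENGTH = 2e6    # N per meter of perimeter (assumed foundation capacity)

def net_lift(radius_m):
    """Upward force from internal pressure acting on the dome footprint."""
    return DELTA_P * math.pi * radius_m ** 2

def anchor_capacity(radius_m):
    """Total hold-down force available along the dome's perimeter."""
    return ANCHOR_STRENGTH * 2 * math.pi * radius_m

for r in (10, 25, 50, 100):
    ratio = net_lift(r) / anchor_capacity(r)
    print(f"r = {r:3d} m: lift/anchor ratio = {ratio:.2f}")
```

With these assumed numbers the ratio hits 1.0 at a 40 m radius (roughly a 260-foot diameter), the same order of magnitude as the article's ~150-foot limit; different anchoring assumptions shift the break-even point, but not the unfavorable r²-versus-r scaling.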

Once a dome is built and the interior occupied, it can’t be extended. Allocation of space within the dome is zero sum, and much of the volume is occupied by weird wedge-shaped segments that are hard to use. Instead, more domes will be required, but since they don’t tessellate, tunnels of some kind would be needed to connect to other structures. Each tunnel has to mate with curved walls, a rigid structure that must accept variable mechanical tolerances, be broad enough to enable large vehicles to pass, yet narrow enough to enable a bulkhead to be sealed in the event of an inevitable seal failure. Since it’s a rigid structure, it has to be structurally capable of enduring pressure cycling across areas with variable radii of curvature without fatigue, creep, or deflection mismatch.

A concerned citizen is largely helpless

Saturday, November 30th, 2019

In Loserthink Scott Adams cites a celebrity’s climate change tweet as an example of a bright person talking about something without training in economics or business:

Now let’s say you had experience in economics and business, as I do. In those domains, anyone telling you they can predict the future in ten years with their complicated multivariate models is automatically considered a fraud.

[...]

You might be debating me in your mind right now and thinking that, unlike the field of finance, the scientific process drives out bias over time. Studies are peer reviewed, and experiments that can’t be reproduced are discarded.

Is that what is happening?

Here I draw upon my sixteen years working in corporate America. If my job involved reviewing a complicated paper from a peer, how much checking of the data and the math would I do when I am already overworked? Would I travel to the original measuring instruments all over the world and check their calibrations? Would I compare the raw data to the “adjusted” data that is used in the paper? Would I do a deep dive on the math and reasoning, or would I skim it for obvious mistakes? Unless scientists are a different kind of human being than the rest of us, they would intelligently cut corners whenever they think they could get away with it, just like everyone else. Assuming scientists are human, you would expect lots of peer-reviewed studies to be flawed. And that turns out to be the situation. As the New York Times reported in 2018, the peer review process is defective to the point of being laughable.

[...]

My point is that a concerned citizen is largely helpless in trying to understand how settled the science of climate change really is. But that doesn’t stop us from having firm opinions on the topic.

[...]

Whenever you have a lot of money in play, combined with the ability to hide misbehavior behind complexity, you should expect widespread fraud to happen. Take, for example, the 2019 Duke University settlement in which the university agreed to pay $112.5 million for repeatedly submitting research grant requests with falsified data. Duke had a lot of grant money at stake, and lots of complexity in which to hide bad behavior. Fraud was nearly guaranteed.

If you have been on this planet for a long time, as I have, and you pay attention to science, you know that the consensus of scientists on the topic of nutrition was wrong for decades.

[...]

Over time, it became painfully obvious to me that nutrition science wasn’t science at all. It was some unholy marriage of industry influence, junk science, and government. Any one of those things is bad, but when you put those three forces together, people die. That isn’t hyperbole. Bad nutrition science has probably killed a lot of people in the past few decades.

What were these watermelons good for?

Monday, November 25th, 2019

Native to Africa, watermelons have been grown throughout the continent since ancient times — despite the fact that watermelons were not sweet until much, much later:

In southwest Libya, 5,000-year-old seeds were excavated, and watermelon remnants from 1500 B.C. have been discovered in the foundational deposits beneath walls of a Sudanese temple. Archeologists have also found seeds and paintings of various species of watermelon in ancient Egyptian tombs dating back as far as 4,000 years. These species include wild watermelons, as well as the oblong predecessors of the “dessert” watermelon.

But if not a flavorful fruit, what were these watermelons good for? According to the work of Harry S. Paris, a horticulturalist at the Agricultural Research Organization in Israel, ancient Egyptians likely harvested the round fruit for its water. Wild, or “spontaneous” plants, Paris writes, can be sources of clean water during the long, dry season, and can provide food for livestock and animals.

[...]

Living travelers, too, needed reliable water sources to sustain them. According to Paris, it’s likely that travelers took watermelons with them as a kind of nature-made canteen. Along with trade, he writes, the watermelon’s role as a portable fresh water supply helped the fruit find its way into new regions.

Once the Greeks got a hold of the pepo (as they called it) around 400 B.C., they, too, put it to use. While some varieties were eaten (and others had to be boiled, fried, or simply avoided), the watermelon made a splash in the medical world. Pliny the Elder found pepones to be incredibly refreshing, and, according to one translation, “also laxative.” The first-century physician Dioscorides also noted that the pepon was cooling, wet, and diuretic.

[...]

But by the first few centuries A.D., posits Paris, the watermelon had likely sweetened up. Writings in Hebrew from the end of the second century, as well as sixth-century Latin texts, group the watermelon with other sweet fruits, including pomegranates, figs, and grapes.

Eating marmot is thought to be good for health

Monday, November 18th, 2019

China has been hit with the plague:

On Tuesday, Beijing authorities announced a municipal hospital had taken in a married couple from Inner Mongolia, a sparsely populated autonomous region in northwest China, seeking treatment for pneumonic plague. One patient is stable while the other is in critical condition but not deteriorating, according to Beijing’s health commission.

The Chinese Center for Disease Control and Prevention assured the public on Weibo, a Chinese social media site that is the equivalent of Twitter, that chances of a plague outbreak are “extremely low.” The city’s health commission has quarantined the infected patients, provided preventative care for those exposed to the couple and sterilized the relevant medical facilities, the center said.

Police are also guarding the quarantined emergency room of Chaoyang hospital, where the infected patients were first received and diagnosed, according to Caixin, an independent Chinese news outlet.

[...]

China has a checkered record in managing public health crises. In 2002, the central government initially refused to acknowledge a nationwide outbreak of severe acute respiratory syndrome, or SARS, an illness with flu- and pneumonia-like symptoms.

[...]

Mongolia, which borders the autonomous region where the infected Chinese couple lives, reported two fatal cases of bubonic plague just this year, after the patients ate raw marmot, a species of wild rodent that often carries the offending bacterium. In Mongolia, eating marmot is thought to be good for health.

At least it’s not African rabies.

Violence is rare and commonly occurs due to confusion and helplessness

Sunday, November 17th, 2019

Anne Nassauer — Assistant Professor of Sociology at the John F. Kennedy Institute for North American Studies at Freie Universität Berlin — notes that video surveillance footage shows how rare violence really is:

Today, videos from closed-circuit television, body cameras, police dash cameras, or mobile phones are increasingly used in the social sciences. For lack of other data, researchers previously relied on people’s often vague, partial, and biased recollections to understand how violence happened. Now, video footage shows researchers second-to-second how an event unfolded, who did what, was standing where, communicating with whom, and displaying which emotions, before violence broke out or a criminal event occurred. And while we would assume such footage highlights the cruel, brutal, savage nature of humanity, looking at violence up-close actually shows the opposite. Examining footage of violent situations – from the very cameras set up because we believe that violence lurks around every corner – suggests violence is rare and commonly occurs due to confusion and helplessness, rather than anger and hate.

Armed robberies are a case in point. We would assume robbers would resort to violence if clerks fail to hand over what is in the register; after all, that is the fundamental proposition of the situation. Instead, video surveillance shows that robbers become afraid of the unexpected situation they are in and run away. It shows that criminals, like most people, rely on situational routines that offer familiarity and reassurance. In my research on surveillance footage of robberies, clerks laughed at a robber’s assault rifle, and robbers, rather than shooting or hitting the victim, were startled and gave up. When a robber showed slight gloominess, a clerk cheered him up, and the robber became even sadder, discussed his financial problems with the clerk, and left. If clerks treat robbers like children, surveillance footage shows how robbers may react according to this role, becoming hesitant and pleading to be taken seriously. This means even in an armed robbery, where perpetrators are prepared and committed to the crime and clerks usually fear for their lives, robbers as well as clerks tend to make sense of the situation together, avoid violence, and fall into shared rhythms and routines.

We can see similar patterns when looking at video recordings of protest violence and violent uprisings. In some protest marches, certain groups attend with the clear goal to use violence; they mask up and come prepared with stones to throw at police. In other protests, police decided on a zero-tolerance strategy and plan to use force at the slightest misstep by activists. Despite such preparations for and willingness to use violent means, violence rarely actually breaks out, and people usually engage in peaceful interactions. If violence does erupt, we see that it does so not because people are violent or cruel, but because routine interactions break down, which leads to confusion, distress, uncertainty, anxiety, and fear, and ultimately violent altercations.

Similarly, research on street fights or mass shootings shows that most people who have the will to fight and kill are actually bad at “doing” violence – as are the great majority of humans. Only very few people in very specific situations manage to be violent effectively, and it is those outliers that make the news. Contrary to common belief, rates of violence and crime in most Western countries have never been as low as they are today.

Such findings have implications; fear of people’s cruel nature and violence lurking around every corner perpetuate everyday actions, drive voting behavior, and impact policymaking through worst-case-scenario thinking. Fearing fellow humans as inherently violent and cruel not only lacks empirical grounding, but research also shows it leads people to make bad decisions. Surveillance videos and recent research on violence challenge this notion that we need to fear each other. They counter the idea that we need elaborate protection from each other and constant state surveillance, which not only tends to cost public funds but also often curtails civil and human rights (e.g., privacy, free speech, free movement, right of asylum). The optimistic outlook offered by scientific analyses of videos might mean we can spend our time more wisely; instead of fearing each other and investing time and resources to protect ourselves from exaggerated dangers, we could enjoy society and our remaining civil rights and freedoms a little more.

Randall Collins (The Sociological Eye), whom she cites, makes similar points in Violence: A Micro-sociological Theory.

The Thule Society lives on

Saturday, November 16th, 2019

A “trans-Neptunian object” located in the Kuiper belt was recently named Ultima Thule and then rapidly renamed Arrokoth:

It is a contact binary 36 km (22 mi) long, composed of two planetesimals 22 km (14 mi) and 15 km (9 mi) across, nicknamed “Ultima” and “Thule”, respectively, that are joined along their major axes. Ultima, which is flatter than Thule, appears to be an aggregate of 8 or so smaller units, each approximately 5 km (3 mi) across, that fused together before Ultima and Thule came into contact. Because there have been few to no disruptive impacts on Arrokoth since it formed, the details of its formation have been preserved. With the New Horizons space probe’s flyby at 05:33 on 1 January 2019 (UTC time), Arrokoth became the farthest and most primitive object in the Solar System visited by a spacecraft.

[...]

Before the flyby on 1 January 2019, NASA invited suggestions from the public on a nickname to be used. The campaign involved 115,000 participants from around the world, who suggested some 34,000 names. Of those, 37 reached the ballot for voting and were evaluated for popularity – this included eight names suggested by the New Horizons team and 29 suggested by the public. Ultima Thule, which was selected on 13 March 2018, was proposed by about 40 different members of the public and obtained the seventh highest number of votes among the nominees. It is named after the Latin phrase ultima Thule (literally “farthest Thule”), an expression referencing the most distant place beyond the borders of the known world. Once it was determined the body was a bilobate contact binary object, the New Horizons team nicknamed the larger lobe “Ultima” and the smaller “Thule”.

The nickname was criticized due to its use by Nazi occultists as the supposed mythical origin of the Aryan race, although it is commonly used in ancient Greek and Latin literature as well as the historical Inuit culture of the Thule people. The Thule Society was a key sponsor of what became the Nazi Party, and some modern-day neo-Nazis and members of the alt-right continue to use the term. A few members of the New Horizons team were aware of that association when they selected the nickname, and have since defended their choice. Responding to a question at a press conference, Alan Stern said, “Just because some bad guys once liked that term, we’re not going to let them hijack it.”

Oh, but we are.

The Church made us WEIRD

Saturday, November 9th, 2019

A new study, The Church, intensive kinship, and global psychological variation, published in Science, makes the point that HBD Chick has been making for some time now:

A growing body of research suggests that populations around the globe vary substantially along several important psychological dimensions and that populations characterized as Western, Educated, Industrialized, Rich, and Democratic (WEIRD) are particularly unusual. People from these societies tend to be more individualistic, independent, and impersonally prosocial (e.g., trusting of strangers) while revealing less conformity and in-group loyalty. Although these patterns are now well documented, few efforts have sought to explain them. Here, we propose that the Western Church (i.e., the branch of Christianity that evolved into the Roman Catholic Church) transformed European kinship structures during the Middle Ages and that this transformation was a key factor behind a shift towards a WEIRDer psychology.

Church vs. Cousin Marriage

The two Jonathan co-authors are new colleagues of Tyler Cowen’s at GMU economics.

Was it hand hygiene, fragility of the patients, or room cleaning procedures?

Tuesday, October 29th, 2019

Harvard Medical School graduate and lecturer Dr. Stephanie Taylor and colleagues studied 370 patients in one unit of a hospital to try to isolate the factors associated with patient infections:

They tested and retested 8 million data points controlling for every variable they could think of to explain the likelihood of infection. Was it hand hygiene, fragility of the patients, or room cleaning procedures? Taylor thought it might have something to do with the number of visitors to the patient’s room.

While all those factors had modest influence, one factor stood out above them all, and it shocked the research team. The one factor most associated with infection was (drum roll): dry air. At low relative humidity, indoor air was strongly associated with higher infection rates. “When we dry the air out, droplets and skin flakes carrying viruses and bacteria are launched into the air, traveling far and over long periods of time. The microbes that survive this launching tend to be the ones that cause healthcare-associated infections,” said Taylor. “Even worse, in addition to this increased exposure to infectious particles, the dry air also harms our natural immune barriers which protect us from infections.”

Since that study was published, there is now more research in peer-reviewed literature observing a link between dry air and viral infections, such as the flu, colds and measles, as well as many bacterial infections, and the National Institutes of Health (NIH) is funding more research. Taylor finds one of the most interesting studies from a team at the Mayo Clinic, which humidified half of the classrooms in a preschool and left the other half alone over three months during the winter. Influenza-related absenteeism in the humidified classrooms was two-thirds lower than in the standard classrooms—a dramatic difference. Taylor says this study is important because its design included a control group: the half of classrooms without humidity-related intervention.

Scientists attribute the influence of dry air to a new understanding about the behavior of airborne particles, or “infectious aerosol transmissions.” They used to assume the microbes in desiccated droplets were dead, but advances in the past several years changed that thinking. “With new genetic analysis tools, we are finding out that most of the microbes are not dead at all. They are simply dormant while waiting for a source of rehydration,” Taylor explained. “Humans are an ideal source of hydration, since we are basically 60% water. When a tiny infectious particle lands on or in a patient, the pathogen rehydrates and begins the infectious cycle all over again.”

This isn’t exactly news though.

It would be frighteningly easy to have much larger wars than any we have ever seen in history

Tuesday, October 22nd, 2019

In Only the Dead: The Persistence of War in the Modern Age Ohio State University professor of political science Bear Braumoeller argues that war is not declining:

Braumoeller used the Correlates of War data set, which scholars from around the world study to measure uses of force up to and including war.

What he found with the statistical analyses was that any decline in the deadliness of war that we think we see in the data is within the normal range of variation — in other words, our period of relative peace right now could easily be occurring simply by chance.

[...]

Once an armed conflict has had more than 1,000 battle deaths (the criterion for inclusion in the Correlates of War database), there’s about a 50 percent chance it will be as devastating to combatants as the 1990 Iraq War, which killed 20,000 to 35,000 fighters.

There’s a 2 percent chance — about the probability of drawing three of a kind in a five-card poker game — that such a war could end up being as devastating to combatants as World War I. And there’s about a 1 percent chance that its intensity would surpass that of any international war fought in the last two centuries.

“This is pretty bleak. Not only has war not disappeared, but it would be frighteningly easy to have much larger wars than any we have ever seen in history,” Braumoeller said.
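The poker comparison above is straightforward to sanity-check with a bit of combinatorics (this is my own check, not from the article): exactly three of a kind in a five-card hand means one rank supplies the triple and two other distinct ranks supply the kickers.

```python
from math import comb

# Probability of exactly three of a kind (not a full house or four of a kind)
# in a random 5-card poker hand.
three_of_a_kind_hands = (
    13              # rank for the triple
    * comb(4, 3)    # choose 3 of that rank's 4 suits
    * comb(12, 2)   # two distinct kicker ranks
    * 4 * 4         # a suit for each kicker
)
total_hands = comb(52, 5)

p = three_of_a_kind_hands / total_hands
print(f"{p:.4%}")  # about 2.11%, in line with the "about 2 percent" figure
```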

The brain seems to be wired to be periodically distractible

Thursday, October 10th, 2019

To pay attention, the brain uses filters, not a spotlight:

For a long time, because attention seemed so intricately tied up with consciousness and other complex functions, scientists assumed that it was first and foremost a cortical phenomenon. A major departure from that line of thinking came in 1984, when Francis Crick, known for his work on the structure of DNA, proposed that the attentional searchlight was controlled by a region deep in the brain called the thalamus, parts of which receive input from sensory domains and feed information to the cortex. He developed a theory in which the sensory thalamus acted not just as a relay station, but also as a gatekeeper — not just a bridge, but a sieve — staunching some of the flow of data to establish a certain level of focus.

[...]

[Michael Halassa, a neuroscientist at the McGovern Institute for Brain Research at the Massachusetts Institute of Technology] was drawn to a thin layer of inhibitory neurons called the thalamic reticular nucleus (TRN), which wraps around the rest of the thalamus like a shell. By the time Halassa was a postdoctoral researcher, he had already found a coarse level of gating in that brain area: The TRN seemed to let sensory inputs through when an animal was awake and attentive to something in its environment, but it suppressed them when the animal was asleep.

In 2015, Halassa and his colleagues discovered another, finer level of gating that further implicated the TRN as part of Crick’s long-sought circuit — this time involving how animals select what to focus on when their attention is divided among different senses. In the study, the researchers used mice trained to run as directed by flashing lights and sweeping audio tones. They then simultaneously presented the animals with conflicting commands from the lights and tones, but also cued them about which signal to disregard. The mice’s responses showed how effectively they were focusing their attention. Throughout the task, the researchers used well-established techniques to shut off activity in various brain regions to see what interfered with the animals’ performance.

As expected, the prefrontal cortex, which issues high-level commands to other parts of the brain, was crucial. But the team also observed that if a trial required the mice to attend to vision, turning on neurons in the visual TRN interfered with their performance. And when those neurons were silenced, the mice had more difficulty paying attention to sound. In effect, the network was turning the knobs on inhibitory processes, not excitatory ones, with the TRN inhibiting information that the prefrontal cortex deemed distracting. If the mouse needed to prioritize auditory information, the prefrontal cortex told the visual TRN to increase its activity to suppress the visual thalamus — stripping away irrelevant visual data.

The attentional searchlight metaphor was backward: The brain wasn’t brightening the light on stimuli of interest; it was lowering the lights on everything else.

[...]

With tasks similar to those they’d used in 2015, the team probed the functional effects of various brain regions on one another, as well as the neuronal connections between them. The full circuit, they found, goes from the prefrontal cortex to a much deeper structure called the basal ganglia (often associated with motor control and a host of other functions), then to the TRN and the thalamus, before finally going back up to higher cortical regions. So, for instance, as visual information passes from the eye to the visual thalamus, it can get intercepted almost immediately if it’s not relevant to the given task. The basal ganglia can step in and activate the visual TRN to screen out the extraneous stimuli, in keeping with the prefrontal cortex’s directive.

[...]

Halassa’s findings indicate that the brain casts extraneous perceptions aside earlier than expected. “What’s interesting,” said Ian Fiebelkorn, a cognitive neuroscientist at Princeton University, is that “filtering is starting at that very first step, before the information even reaches the visual cortex.”

[...]

According to his findings, the focus of the attentional spotlight seems to get relatively weaker about four times per second, presumably to prevent animals from staying overly focused on a single location or stimulus in their environment. That very brief suppression of what’s important gives other, peripheral stimuli an indirect boost, creating an opportunity for the brain to shift its attention to something else if necessary. “The brain seems to be wired to be periodically distractible,” he said.

It’s a very vulnerable point, and plants have targeted it

Wednesday, October 9th, 2019

Monarch butterflies eat only milkweed, a poisonous plant that should kill them, and even store the toxins in their own bodies as a defense against hungry birds:

Only three genetic mutations were necessary to turn the butterflies from vulnerable to resistant, the researchers reported in the journal Nature. They were able to introduce these mutations into fruit flies, and suddenly they were able to eat milkweed, too.

[...]

Insects began dining on plants over 400 million years ago, spurring the evolution of many botanical defenses, including harsh chemicals. Certain plants, including milkweed, make particularly nasty toxins known as cardiac glycosides.

The right dose can stop a beating heart or disrupt the nervous system. For thousands of years, African hunters have put these poisons on the tips of arrows. Agatha Christie wrote a murder mystery featuring foxglove, which produces cardiac glycosides.

The toxins gum up so-called sodium pumps, an essential component of all animal cells. “It’s a very vulnerable point, and plants have targeted it,” said Susanne Dobler, a molecular biologist at the University of Hamburg in Germany.

These pumps move positively charged sodium atoms out of cells, giving their interiors a negative charge. Heart cells need sodium pumps to build enough electrical charge to deliver a heartbeat. Nerves use the pumps to produce signals to the brain. If the pumps fail, then those functions come to a halt.

[...]

The researchers compared the genes that serve as blueprints for the sodium pump in poison-resistant species, like the milkweed beetle and the milkweed bug. Most of these species, it turned out, had gained the same three mutations.

[...]

Monarchs share one of the mutations with a related butterfly that doesn’t eat milkweed, and a second mutation with a closer relative that eats milkweed but doesn’t store cardiac glycosides in its wings. The third mutation arose in an even more recent ancestor.

Gaining these mutations gradually altered the sodium pumps in the monarchs’ cells, Dr. Dobler suspected, so that the cardiac glycosides couldn’t disrupt them. As the butterflies became more resistant, they were able to enjoy a new supply of food untouched by most other insects.

[...]

Noah Whiteman, an evolutionary biologist at the University of California, Berkeley, led the effort to test this hypothesis. “These three mutations may be the thing that unlocked the door” for the butterflies, he said.

He and his colleagues figured out how to use Crispr, the gene-editing technology, to introduce the mutations into fruit flies. The flies survive on rotting fruit, and even a small dose of cardiac glycosides can be deadly to them.

The researchers began by giving the flies the first mutation to arise in the ancestors of monarchs. The larvae that carried this mutation were able to survive on a diet of yeast laced with low levels of cardiac glycosides.

The second mutation let the flies withstand even more toxins, and the third made them entirely resistant. With all three mutations, the flies even ate dried milkweed powder.

The third mutation had another striking effect. When the flies with the gene developed into adults, their bodies carried low levels of cardiac glycoside, useful as a defense against predation.

O brave new world that has such insects in it!

Heat training can boost your cool-weather performance

Monday, October 7th, 2019

A 2010 study from the University of Oregon found that 10 days of training at 104 degrees Fahrenheit boosted cyclists’ VO2max by 5 percent, Alex Hutchinson notes, even when the subjects were later tested in cool temperatures, and a new study out of Swansea University supports this finding:

The study involved 22 cyclists (all male, alas), all of whom were serious amateur cyclists training an average of 14 hours a week and competing regularly. The adaptation protocol was 10 consecutive days of cycling in the lab for 60 minutes at an intensity equal to 50 percent of their VO2max, with half of them in the heat group at a room temperature of 100.4 F (38 degrees Celsius) and the other half in a control group at 68 F (20 C). They also continued with their normal training outside the lab, subtracting their lab rides to maintain roughly the same training volume as usual. The outcome measure on the test days was VO2max, a marker of aerobic fitness that has a reasonably good correlation with race performance, tested at 68 F (20 C).

If you looked at the data right after the heat adaptation period, or even a couple of days later, you’d conclude that it makes you worse. The VO2max readings were lower. But three days after the heat adaptation, VO2max readings started to climb, and four days afterwards, they peaked at 4.9 percent higher than baseline, strikingly similar to the 2010 Oregon study. The control group, meanwhile, hardly saw any change.

Traumatic brain injury causes intestinal damage

Friday, October 4th, 2019

University of Maryland School of Medicine (UMSOM) researchers have found a two-way link between traumatic brain injury (TBI) and intestinal changes:

Researchers have known for years that TBI has significant effects on the gastrointestinal tract, but until now, scientists have not recognized that brain trauma can make the colon more permeable, potentially allowing harmful microbes to migrate from the intestine to other areas of the body, causing infection. People are 12 times more likely to die from blood poisoning after TBI, which is often caused by bacteria, and 2.5 times more likely to die of a digestive system problem, compared with those without such injury.

In this study, the researchers examined mice that received an experimental TBI. They found that the intestinal wall of the colon became more permeable after trauma, changes that were sustained over the following month.

It is not clear how TBI causes these gut changes. A key factor in the process may be enteric glial cells (EGCs), a class of cells that exist in the gut. These cells are similar to brain astroglial cells, and both types of glial cells are activated after TBI. After TBI, such activation is associated with brain inflammation that contributes to delayed tissue damage in the brain. Researchers don’t know whether activation of EGCs after TBI contributes to intestinal injury or is instead an attempt to compensate for the injury.

The researchers also focused on the two-way nature of the process: how gut dysfunction may worsen brain inflammation and tissue loss after TBI. They infected the mice with Citrobacter rodentium, a species of bacteria that is the rodent equivalent of E. coli, which infects humans. In mice with a TBI that were infected with the bacteria, brain inflammation worsened. Furthermore, in the hippocampus, a key region for memory, the mice that had TBI and were then infected lost more neurons than animals without infection.

The CIA paid $240,000 to buy the world’s entire supply of LSD

Saturday, September 28th, 2019

The director of the CIA’s infamous MK-ULTRA program, Sidney Gottlieb, was the unwitting godfather of the entire LSD counterculture:

In the early 1950s, he arranged for the CIA to pay $240,000 to buy the world’s entire supply of LSD. He brought this to the United States, and he began spreading it around to hospitals, clinics, prisons and other institutions, asking them, through bogus foundations, to carry out research projects and find out what LSD was, how people reacted to it and how it might be able to be used as a tool for mind control.

Now, the people who volunteered for these experiments and began taking LSD, in many cases, found it very pleasurable. They told their friends about it. Who were those people? Ken Kesey, the author of One Flew Over the Cuckoo’s Nest, got his LSD in an experiment sponsored by the CIA, by MK-ULTRA, by Sidney Gottlieb. So did Robert Hunter, the lyricist for the Grateful Dead, which went on to become a great purveyor of LSD culture. Allen Ginsberg, the poet who preached the value of the great personal adventure of using LSD, got his first LSD from Sidney Gottlieb. Although, of course, he never knew that name.

CR is unpleasant to most humans

Thursday, September 26th, 2019

Rapamycin is an immunosuppressant for transplant patients, but it’s also been found to increase lifespan in lab animals. Dr. Alan Green, who prescribes rapamycin for anti-aging purposes, recommends Blagosklonny’s paper, Disease or not, aging is easily treatable:

Is aging a disease? It does not matter because aging is already treated using a combination of several clinically-available drugs, including rapamycin. Whether aging is a disease depends on arbitrary definitions of both disease and aging. For treatment purposes, aging is a deadly disease (or more generally, pre-disease), despite being a normal continuation of normal organismal growth. It must and, importantly, can be successfully treated, thereby delaying classic age-related diseases such as cancer, cardiovascular and metabolic diseases, and neurodegeneration.

[...]

As the simplest example, calorie restriction (CR) slows aging in diverse organisms, including primates [43-50]. Similarly, intermittent fasting (IF) and ketogenic diet (severe carbohydrate restriction) extend life span in mammals [48, 51-54]. CR (as well as carbohydrate restriction and IF) improves health in humans [45, 48, 53, 55-62]. However, CR is unpleasant to most humans and its life-extending capacity is limited. Nutrients activate the mTOR (mechanistic Target of Rapamycin) nutrient-sensing pathway [63-65] and, as we will discuss, mTOR drives aging and is inhibitable by rapamycin. Rapamycin-based anti-aging therapies have been recently implemented by Dr. Alan Green (https://rapamycintherapy.com).

There’s a bit of circularity there.