Can Hypothermia Save Gunshot Victims?

Friday, December 9th, 2016

E.P.R., or emergency preservation and resuscitation, has long proved successful in animal experiments, but overcoming the institutional, logistical, and ethical obstacles to performing it on a human being has taken more than a decade:

When [the first patient to undergo E.P.R.] loses his pulse, the attending surgeon will, as usual, crack his chest open and clamp the descending aorta. But then, instead of trying to coax the heart back into activity, the surgeon will start pumping the body full of ice-cold saline at a rate of at least a gallon a minute. Within twenty minutes (depending on the size of the patient, the number of wounds, and the amount of blood lost), the patient’s brain temperature, measured using a probe in the ear or nose, will sink to somewhere in the low fifties Fahrenheit.

At this point, the patient, his circulatory system filled with icy salt water, will have no blood, no pulse, and no brain activity. He will remain in this state of suspended animation for up to an hour, while surgeons locate the bullet holes or stab wounds and sew them up. Then, after as much as sixty minutes without a heartbeat or a breath, the patient will be resuscitated. A cardiac surgeon will attach a heart-lung bypass machine and start pumping the patient full of blood again, cold, at first, but gradually warming, one degree at a time, over the course of a couple of hours. As soon as the heartbeat returns, perhaps jump-started with the help of a gentle electric shock, and as long as the lungs seem capable of functioning, at least with the help of a ventilator, the patient will be taken off bypass.

Even if everything works perfectly, it will take between three and five days to determine whether the patient’s brain has been damaged, and, if so, to what extent. There will be more surgeries, followed by months of rehabilitation.

You can see why the homicide rate keeps going down.

Cancer drug sparks growth of new eggs

Tuesday, December 6th, 2016

Women who had been treated for Hodgkin’s lymphoma with the chemotherapy drug ABVD had up to 10 times the eggs of healthy women:

Lead researcher Professor Evelyn Telfer, of the University of Edinburgh’s School of Biological Sciences, said: “We were astonished when we saw what had happened to the tissue. It looked like pre-pubescent tissue with a high density of follicles and clustering that you don’t normally see in an adult.

“We knew that ABVD does not have a sterilising effect like some cancer drugs can, but to find new eggs being made, in such huge numbers, that was very surprising to see.

“It looks like something is being activated probably in the germline or stem cells and we need to find out what that mechanism is. It could be that the harshness of the treatment triggers some kind of shock effect or perturbation which stimulates the stem cells into producing new eggs.

“I think it’s a pretty big deal. It is the first time that we have ever been able to see new follicles being formed within the ovary, and it may only be a small number of women, but it is significant that the same effect was seen in all of the women on ABVD. The outcome may be significant and far-reaching.”

Scientists analysed samples of ovarian tissue donated by 14 women who had undergone chemotherapy, alongside tissues from 12 healthy women.

They found that the tissue from eight of the cancer patients who had been treated with ABVD had between four and 10 times more eggs compared with tissue from women who had received a different chemotherapy, or healthy women of a similar age.

The ovarian tissue was seen to be in healthy condition, appearing similar to tissue from young women’s ovaries.

Although the eggs are still in an immature state, the scientists are now trying to discover how they were created in the first place, then work out a way to bring them to maturity. It is unclear if the eggs in their current form would be functional.

But if research can reveal the mechanism, it would help scientists understand how women could produce more eggs during their lifetime, which was until now thought to be impossible.

Future studies will examine the separate impact of each of the four drugs that combine to make ABVD — known as adriamycin, bleomycin, vinblastine and dacarbazine — to better understand the biological mechanisms involved.

(Hat tip to Mangan.)

Mammals Are Downright Drab

Sunday, December 4th, 2016

Compared to colorful fish, lizards, birds, and insects, we mammals are downright drab:

Unless you are a color scientist you are probably accustomed to dealing with chemical colors. For example, if you take a handful of blue pigment powder, mix it with water, paint it onto a chair, let it dry, then scrape it off the chair, and grind it back into powder, you expect it to remain blue at all stages in the process (except if you get a bit of chair mixed in with it).

By contrast, if you scraped the scales off a blue morpho butterfly’s wings, you’d just end up with a pile of grey dust and a sad butterfly. By themselves, blue morpho scales are not “blue,” even under regular light. Rather, their scales are arranged so that light bounces between them, like light bouncing from molecule to molecule in the air.

[...]

This kind of structural color works great if your medium is scales, feathers, carapaces, berries, or even CDs, but just doesn’t work with hair, which we mammals have.

Compared to other animals, mammals also have bad color perception, which may be explained by the nocturnal bottleneck hypothesis:

The hypothesis states that mammals were mainly or even exclusively nocturnal through most of their evolutionary history, starting with their origin 225 million years ago, and only ending with the demise of the dinosaurs 65 million years ago. While some mammal groups have later evolved to fill diurnal niches, the 160 million years spent as nocturnal animals has left a lasting legacy on basal anatomy and physiology, and most mammals are still nocturnal.

The Real War on Science

Wednesday, November 30th, 2016

John Tierney’s liberal friends sometimes ask him why he doesn’t devote more of his science journalism to the sins of the Right:

My friends don’t like my answer: because there isn’t much to write about. Conservatives just don’t have that much impact on science. I know that sounds strange to Democrats who decry Republican creationists and call themselves the “party of science.” But I’ve done my homework. I’ve read the Left’s indictments, including Chris Mooney’s bestseller, The Republican War on Science. I finished it with the same question about this war that I had at the outset: Where are the casualties?

Read the whole thing.

Too Much Radioactive Stuff in One Place

Sunday, November 27th, 2016

The original radioactive boy scout piece from Harper’s goes into a fair amount of detail, if you keep reading:

Armed with information from his friends in government and industry, David typed up a list of sources for fourteen radioactive isotopes. Americium-241, he learned from the Boy Scout atomic-energy booklet, could be found in smoke detectors; radium-226, in antique luminous dial clocks; uranium-238 and minute quantities of uranium-235, in a black ore called pitchblende; and thorium-232, in Coleman-style gas lanterns.

To obtain americium-241, David contacted smoke-detector companies and claimed that he needed a large number of the devices for a school project. One company agreed to sell him about a hundred broken detectors for a dollar apiece. (He also tried to “collect” detectors while at scout camp.) David wasn’t sure where the americium-241 was located, so he wrote to BRK Electronics in Aurora, Illinois. A customer-service representative named Beth Weber wrote back to say she’d be happy to help out with “your report.” She explained that each detector contains only a tiny amount of americium-241, which is sealed in a gold matrix “to make sure that corrosion does not break it down and release it.” Thanks to Weber’s tip, David extracted the americium components and then welded them together with a blowtorch.

As it decays, americium-241 emits alpha rays composed of protons and neutrons. David put the lump of americium inside a hollow block of lead with a tiny hole pricked in one side so that alpha rays would stream out. In front of the lead block he placed a sheet of aluminum. Aluminum atoms absorb alpha rays and in the process kick out neutrons. Since neutrons have no charge, and thus cannot be measured by a Geiger counter, David had no way of knowing whether the gun was working until he recalled that paraffin throws off protons when hit by neutrons. David aimed the apparatus at some paraffin, and his Geiger counter registered what he assumed was a proton stream. His neutron gun, crude but effective, was ready.

With neutron gun in hand, David was ready to irradiate. He could have concentrated on transforming previously non-radioactive elements, but in a decision that was both indicative of his personality and instrumental to his later attempt to build a breeder reactor, he wanted to use the gun on radioisotopes to increase the chances of making them fissionable. He thought that uranium-235, which is used in atomic weapons, would provide the “biggest reaction.” He scoured hundreds of miles of upper Michigan in his Pontiac looking for “hot rocks” with his Geiger counter, but all he could find was a quarter trunkload of pitchblende on the shores of Lake Huron. Deciding to pursue a more bureaucratic approach, he wrote to a Czechoslovakian firm that sells uranium to commercial and university buyers, whose name was provided, he told me, by the NRC. Claiming to be a professor buying materials for a nuclear-research laboratory, he obtained a few samples of a black ore—either pitchblende or uranium dioxide, both of which contain small amounts of uranium-235 and uranium-238.

David pulverized the ores with a hammer, thinking that he could then use nitric acid to isolate uranium. Unable to find a commercial source for nitric acid—probably because it is used in the manufacture of explosives and thus is tightly controlled—David made his own by heating saltpeter and sodium bisulfate, then bubbling the gas that was released through a container of water, producing nitric acid. He then mixed the acid with the powdered ore and boiled it, ending up with something that “looked like a dirty milk shake.” Next he poured the “milk shake” through a coffee filter, hoping that the uranium would pass through the filter. But David miscalculated uranium’s solubility, and whatever amount was present was trapped in the filter, making it difficult to purify further.

Frustrated at his inability to isolate sufficient supplies of uranium, David turned his attention to thorium-232, which when bombarded with neutrons produces uranium-233, a man-made fissionable element (and, although he might not have known it then, one that can be substituted for plutonium in breeder reactors). Discovered in 1828 and named after the Norse god Thor, thorium has a very high melting point, and is thus used in the manufacture of airplane engine parts that reach extremely high temperatures. David knew from his merit-badge pamphlet that the “mantle” used in commercial gas lanterns—the part that looks like a doll’s stocking and conducts the flame—is coated with a compound containing thorium-232. He bought thousands of lantern mantles from surplus stores and, using the blowtorch, reduced them into a pile of ash.

David still had to isolate the thorium-232 from the ash. Fortunately, he remembered reading in one of his dad’s chemistry books that lithium is prone to binding with oxygen—meaning, in this context, that it would rob thorium dioxide of its oxygen content and leave a cleaner form of thorium. David purchased $1,000 worth of lithium batteries and extracted the element by cutting the batteries in half with a pair of wire cutters. He placed the lithium and thorium dioxide together in a ball of aluminum foil and heated the ball with a Bunsen burner. Eureka! David’s method purified thorium to at least 9,000 times the level found in nature and 170 times the level that requires NRC licensing.

At this point, David could have used his americium neutron gun to transform thorium-232 into fissionable uranium-233. But the americium he had was not capable of producing enough neutrons, so he began preparing radium for an improved irradiating gun.

Radium was used in paint that rendered luminescent the faces of clocks and automobile and airplane instrument panels until the late 1960s, when it was discovered that many clock painters, who routinely licked their brushes to make a fine point, died of cancer. David began visiting junkyards and antiques stores in search of radium-coated dashboard panels or clocks. Once he found such an item, he’d chip paint from the instruments and collect it in pill vials. It was slow going until one day, driving through Clinton Township to visit his girlfriend, Heather, he noticed that his Geiger counter went wild as he passed Gloria’s Resale Boutique/Antique. The proprietor, Gloria Genette, still recalls the day when she was called at home by a store employee who said that a polite young man was anxious to buy an old table clock with a tinted green dial but wondered if she’d come down in price. She would. David bought the clock for $10. Inside he discovered a vial of radium paint left behind by a worker either accidentally or as a courtesy so that the clock’s owner could touch up the dial when it began to fade. David was so overjoyed that he dropped by the boutique later that night to leave a note for Gloria, telling her that if she received another “luminus [sic] clock” to contact him immediately. “I will pay any some [sic] of money to obtain one.”

To concentrate the radium, David secured a sample of barium sulfate from the X-ray ward at a local hospital (staff there handed over the substance because they remembered him from his merit-badge project) and heated it until it liquefied. After mixing the barium sulfate with the radium paint chips, he strained the brew through a coffee filter into a beaker that began to glow. This time, David had judged the solubility of the two substances correctly; the radium solution passed through to the beaker. He then dehydrated the solution into crystalline salts, which he could pack into the cavity of another lead block to build a new gun.

Whether David fully realized it or not, by handling purified radium he was truly putting himself in danger. Nevertheless, he now proceeded to acquire another neutron emitter to replace the aluminum used in his previous neutron gun. Faithful to Erb’s instructions, he secured a strip of beryllium (which is a much richer source of neutrons than aluminum) from the chemistry department at Macomb Community College—a friend who attended the school swiped it for him—and placed it in front of the lead block that held the radium. His cute little americium gun was now a more powerful radium gun. David began to bombard his thorium and uranium powders in the hopes of producing at least some fissionable atoms. He measured the results with his Geiger counter, but while the thorium seemed to grow more radioactive, the uranium remained a disappointment.

Once again, “Professor Hahn” sprang into action, writing his old friend Erb at the NRC to discuss the problem. The NRC had the answer: David’s neutrons were too “fast” for the uranium.

He would have to slow them down using a filter of water, deuterium, or tritium. Water would have sufficed, but David likes a challenge. Consulting his list of commercially available radioactive sources, he discovered that tritium, a radioactive material used to boost the power of nuclear weapons, is found in glow-in-the-dark gun and bow sights, which David promptly bought from sporting-goods stores and mail-order catalogues. He removed the tritium contained in a waxy substance inside the sights, and then, using a variety of pseudonyms, returned the sights to the store or manufacturer for repair—each time collecting another tiny quantity of tritium. When he had enough, David smeared the waxy substance over the beryllium strip and targeted the gun at uranium powder. He carefully monitored the results with his Geiger counter over several weeks, and it appeared that the powder was growing more radioactive by the day.

Now seventeen, David hit on the idea of building a model breeder reactor. He knew that without a critical pile of at least thirty pounds of enriched uranium he had no chance of initiating a sustained chain reaction, but he was determined to get as far as he could by trying to get his various radioisotopes to interact with one another. That way, he now says, “no matter what happened there would be something changing into something—some kind of action going on there.” His blueprint was a schematic of a checkerboard breeder reactor he’d seen in one of his father’s college textbooks. Ignoring any thought of safety, David took the highly radioactive radium and americium out of their respective lead casings and, after another round of filing and pulverizing, mixed those isotopes with beryllium and aluminum shavings, all of which he wrapped in aluminum foil. What were once the neutron sources for his guns became a makeshift “core” for his reactor. He surrounded this radioactive ball with a “blanket” composed of tiny foil-wrapped cubes of thorium ash and uranium powder, which were stacked in an alternating pattern with carbon cubes and tenuously held together with duct tape.

David monitored his “breeder reactor” at the Golf Manor laboratory with his Geiger counter. “It was radioactive as heck,” he says. “The level of radiation after a few weeks was far greater than it was at the time of assembly. I know I transformed some radioactive materials. Even though there was no critical pile, I know that some of the reactions that go on in a breeder reactor went on to a minute extent.”

Finally, David, whose safety precautions had thus far consisted of wearing a makeshift lead poncho and throwing away his clothes and changing his shoes following a session in the potting shed, began to realize that, sustained reaction or not, he could be putting himself and others in danger. (One tip-off was when the radiation was detectable through concrete.) Jim Miller, a nuclear-savvy high-school friend in whom David had confided, warned him that real reactors use control rods to regulate nuclear reactions. Miller recommended cobalt, which absorbs neutrons but does not itself become fissionable. “Reactors get hot, it’s just a fact,” Miller, a nervous, skinny twenty-two-year-old, said during an interview at a Burger King in Clinton Township where he worked as a cook. David purchased a set of cobalt drill bits at a local hardware store and inserted them between the thorium and uranium cubes. But the cobalt wasn’t sufficient. When his Geiger counter began picking up radiation five doors down from his mom’s house, David decided that he had “too much radioactive stuff in one place” and began to disassemble the reactor. He placed the thorium pellets in a shoebox that he hid in his mother’s house, left the radium and americium in the shed, and packed most of the rest of his equipment into the trunk of the Pontiac 6000.

In light of recent news, this jumps out:

Back in 1995, the EPA arranged for David to undergo a full examination at the nearby Fermi nuclear power plant. David, fearful of what he might learn, refused. Now, though, he’s looking ahead. “I wanted to make a scratch in life,” he explains when I ask him about his early years of nuclear research. “I’ve still got time. I don’t believe I took more than five years off of my life.”

Cognitive Control As a Double-Edged Sword

Saturday, November 26th, 2016

A recent study looks at cognitive control as a double-edged sword:

Cognitive control, the ability to limit attention to goal-relevant information, aids performance on a wide range of laboratory tasks. However, there are many day-to-day functions which require little to no control and others which even benefit from reduced control. We review behavioral and neuroimaging evidence demonstrating that reduced control can enhance the performance of both older and, under some circumstances, younger adults. Using healthy aging as a model, we demonstrate that decreased cognitive control benefits performance on tasks ranging from acquiring and using environmental information to generating creative solutions to problems. Cognitive control is thus a double-edged sword – aiding performance on some tasks when fully engaged, and many others when less engaged.

Female monkeys use wile to rally troops

Friday, November 25th, 2016

Female vervet monkeys manipulate males into fighting battles by lavishing attention on brave soldiers while giving noncombatants the cold shoulder:

After a skirmish with a rival gang, usually over food, females would groom males that had fought hardest, while snapping at those that abstained.

When the next battle came along, both those singled out for attention and those aggressively shunned would participate more vigorously in combat, according to a study published in the journal Proceedings of the Royal Society B.

Young children are terrible at hiding

Tuesday, November 22nd, 2016

Young children are terrible at hiding:

Curiously, they often cover only their face or eyes with their hands, leaving the rest of their bodies visibly exposed.

For a long time, this ineffective hiding strategy was interpreted as evidence that young children are hopelessly “egocentric” creatures. Psychologists theorized that preschool children cannot distinguish their own perspective from someone else’s. Conventional wisdom held that, unable to transcend their own viewpoint, children falsely assume that others see the world the same way they themselves do. So psychologists assumed children “hide” by covering their eyes because they conflate their own lack of vision with that of those around them.

But research in cognitive developmental psychology is starting to cast doubt on this notion of childhood egocentrism. We brought young children between the ages of two and four into our Minds in Development Lab at USC so we could investigate this assumption. Our surprising results contradict the idea that children’s poor hiding skills reflect their allegedly egocentric nature.

Each child in our study sat down with an adult who covered her own eyes or ears with her hands. We then asked the child whether or not she could see or hear the adult, respectively. Surprisingly, children denied that they could. The same thing happened when the adult covered her own mouth: Now children denied that they could speak to her.

A number of control experiments ruled out that the children were confused or misunderstood what they were being asked. The results were clear: Our young subjects comprehended the questions and knew exactly what was asked of them. Their negative responses reflected their genuine belief that the other person could not be seen, heard, or spoken to when her eyes, ears, or mouth were obstructed. Despite the fact that the person in front of them was in plain view, they flat-out denied being able to perceive her. So what was going on?

It seems like young children consider mutual eye contact a requirement for one person to be able to see another. Their thinking appears to run along the lines of “I can see you only if you can see me, too” and vice versa. Our findings suggest that when a child “hides” by putting a blanket over her head, this strategy is not a result of egocentrism. In fact, children deem this strategy effective when others use it.

Built into their notion of visibility, then, is the idea of bidirectionality: Unless two people make eye contact, it is impossible for one to see the other. Contrary to egocentrism, young children simply insist on mutual recognition and regard.

Children’s demand for reciprocity demonstrates that they are not at all egocentric.

The Canary in the Coalmine Theory of the Arts

Monday, November 21st, 2016

Kurt Vonnegut presented his canary in the coalmine theory of the arts in Physicist, Purge Thyself:

This theory says that artists are useful to society because they are so sensitive. They are super-sensitive. They keel over like canaries in poison coal mines long before more robust types realize that there is any danger whatsoever.

Adam Perkins uses this idea to illustrate the personality differences between visionaries and implementers:

The ‘Big Five’ dimensions of personality are extraversion, neuroticism, conscientiousness, agreeableness and openness to experience. Studies show that openness to experience captures individual differences in the capacity to imagine new concepts and things — i.e., to be creative.

So far, so good: this finding tallies nicely with biographical information showing that geniuses tend to be unusually adventurous, curious and open-minded. For example, instead of spending his family’s wealth on wine, women and song as was customary for young English gentlemen of means in the early 19th century, Charles Darwin spent it on five years sailing round the world in a cramped and smelly boat called HMS Beagle, even though his voyage had no specific purpose and he suffered from chronic seasickness. But readers may also suspect that openness to experience is not the only personality dimension that is important when it comes to being one of Vonnegut’s canaries because their super-sensitivity is particularly acute for detecting danger. And threat-sensitivity is not captured by openness to experience but instead by the personality dimension of neuroticism.

Epidemiological evidence fits with the idea of visionary ability being linked to high scores on neuroticism because it shows that creative professionals have a higher than average risk of psychiatric illness and of suicide. Neurotic tendencies also seem to be commonplace in the life stories of geniuses. But these observations could just be an artefact of the pressure of constantly trying to come up with new ideas — it doesn’t mean that high scores on neuroticism necessarily aid creativity. Moreover, given the amount of hard graft that it takes to succeed as a visionary, whether Charles Darwin, Winston Churchill, Vincent van Gogh, Jane Austen or Bruce Springsteen, it seems likely that a person weighed down with negative thoughts and feelings would have a worse chance of making a difference to the world than a calm, cheerful, happy-go-lucky individual who bounces out of bed every morning feeling refreshed and energetic.

So how could it be that high scores on neuroticism aid creativity? One theory is that neuroticism stems from individual differences in patterns of self-generated thought (SGT) which, in turn, depend on variation in the functioning of a brain system known as the default mode network (DMN). The DMN activates when we are not engaged with the world around us, such as when we are daydreaming. This theory was created by Jonny Smallwood, Danilo Arnone, Dean Mobbs and me in response to the finding that some people have less positive thoughts when engaged in daydreaming.

Moreover, it turns out that these individuals — akin to high scorers on neuroticism — display more activity in a part of the brain that controls conscious perception of threat. The key insight is that this pattern of threat-related brain activity was observed while participants were daydreaming in a threat-free situation so these individuals can be viewed as possessing an especially active imagination when it comes to threats. This raises the possibility that the creative advantage associated with high scores on neuroticism stems from highly neurotic individuals having a problem-focussed style of daydreaming, which might help them find solutions to those problems, compared to people whose attitude to problems is “out of sight, out of mind” (i.e., low scorers on neuroticism).

[...]

It could even be said that high scorers on neuroticism are more conscious of reality than the rest of the population. When combined with other important qualities such as adventurousness (i.e., high scores on openness to experience) and plenty of neural horsepower (i.e., high IQ) it is possible that this higher state of consciousness emerges as visionary characteristics that give the bearer a better ability to see new ways of developing music, painting and so on.

Radioactive Boy Scout Dead at 39

Sunday, November 20th, 2016

In 1994 David Charles Hahn attempted to build a homemade breeder nuclear reactor for a Boy Scout project in his mom’s Michigan backyard shed. He seems to have recently died at age 39 — of unknown causes.

Metabolic Effects of a 4-Day Outdoor Trip

Monday, November 14th, 2016

Researchers looked at the metabolic effects of a 4-day outdoor trip under simulated Paleolithic conditions:

Background: The observation that common Western diseases emerge with much greater prevalence as societies migrate from natural-living cultures to modernized societies has been well documented. For approximately 84,000 generations humans lived under hunter-gatherer conditions but recently endured dramatic change from our native lifestyle with the occurrence of the agricultural, industrial, and digital revolutions. The massive technological advancement that occurred within a relatively recent timeframe enabled humans to live in a manner that is remarkably different from our pre-agricultural past. Consequently, the shift from a natural to a modern lifestyle likely promotes a gene-environment mismatch, which causes metabolic dysregulation and, ultimately, disease.

Methods: Using a within-participant design, we examined whether, compared to baseline, changes in lifestyle towards a more Paleolithic-style pattern for a four-day and four-night period related to changes in a variety of metabolic parameters. Two groups of 14 volunteers were isolated for a period of four days and four nights in the natural park Südeifel on the border between Germany and Luxembourg. Participants lived outdoors without tents. The daily hiking distance was 16.4 km (approx. 24,963 steps/day), the daily activity time was 5.49 h/day, and the mean caloric intake was 1,747 kcal/day.

Results: After four days of simulated Paleolithic conditions, body weight (-2.9%), body mass index (-2.7%), body fat (-10.4%), visceral fat (-13.6%) and waist-hip ratio (-2.2%) significantly decreased, while muscle mass significantly increased (+2.3%). Additionally, fasting glucose (-6.5%), basal insulin (-44.4%), homeostasis model assessment index (-49.3%) and fatty liver index (-41%) significantly dropped. In contrast, C-reactive protein significantly increased (+67.1%).

Conclusion: Our study indicates that a short nature trip, where modern humans adjust their behavioral patterns to simulate a more Paleolithic-like condition, could serve as an effective strategy to help prevent or improve modern metabolic disease. Particularly, the major findings of an expeditious reduction of homeostasis model assessment-index and fatty liver index scores in only four days reveal the potential for meaningful benefits with such an intervention, even when compared to the effects of longer-term, single-intervention studies such as dietary or fitness programs on similar metabolic parameters.

(Hat tip to Mangan.)

Starship Troupers

Saturday, November 12th, 2016

Starship research is enjoying something of a boom:

Serious work in the field dates back to 1968, when Freeman Dyson, an independent-minded physicist, investigated the possibilities offered by rockets powered by a series of nuclear explosions. Then, in the 1970s, the British Interplanetary Society (BIS) designed Daedalus, an unmanned vessel that would use a fusion rocket to attain 12% of the speed of light, allowing it to reach Barnard’s Star, six light-years away, in 50 years. That target, though not the nearest star to the sun, was the nearest then suspected of having at least one planet.

[...]

During the cold war America spent several years and much treasure (peaking in 1966 at 4.4% of government spending) to send two dozen astronauts to the Moon and back. But on astronomical scales, a trip to the Moon is nothing. If Earth — which is 12,742km, or 7,918 miles, across — were shrunk to the size of a sand grain and placed on the desk of The Economist’s science correspondent, the Moon would be a smaller sand grain about 3cm away. The sun would be a larger ball nearly 12 metres down the hall. And Alpha Centauri B would be around 3,200km distant, somewhere near Volgograd, in Russia.
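The Economist's scale model checks out arithmetically. Here is a quick sketch reproducing the quoted figures; the 1 mm sand-grain diameter is my own assumption (the article doesn't specify one), chosen because it matches all three quoted distances:

```python
# Verify the sand-grain scale model: Earth (12,742 km) shrunk to a 1 mm grain.
EARTH_DIAMETER_KM = 12_742
GRAIN_MM = 1.0
scale = (EARTH_DIAMETER_KM * 1e6) / GRAIN_MM  # real mm per model mm

real_distances_km = {
    "Moon": 384_400,
    "Sun": 149_600_000,
    "Alpha Centauri": 4.37 * 9.461e12,  # ~4.37 light-years, in km
}

for body, d_km in real_distances_km.items():
    model_mm = d_km * 1e6 / scale  # convert real km to model mm
    print(f"{body}: {model_mm:,.0f} mm in the model")
```

This yields roughly 30 mm for the Moon, 12 metres for the Sun, and 3,200 km for Alpha Centauri, matching the quoted figures.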

Chemical rockets simply cannot generate enough energy to cross such distances in any sort of useful time. Voyager 1, a space probe launched in 1977 to study the outer solar system, has travelled farther from Earth than any other object ever built. A combination of chemical rocketry and gravitational kicks from the solar system’s planets has boosted its velocity to 17km a second. At that speed, it would (were it pointing in the right direction) take more than 75,000 years to reach Alpha Centauri.
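The 75,000-year figure is easy to reproduce. A quick check, assuming a distance of ~4.37 light-years to Alpha Centauri:

```python
# Travel time to Alpha Centauri at Voyager 1's cruise speed of ~17 km/s.
LIGHT_YEAR_KM = 9.461e12
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # ~3.156e7 s

distance_km = 4.37 * LIGHT_YEAR_KM
speed_km_per_s = 17.0
years = distance_km / speed_km_per_s / SECONDS_PER_YEAR
print(f"{years:,.0f} years")  # roughly 77,000 -- "more than 75,000" indeed
```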

Nuclear power can bring those numbers down. Dr Dyson’s bomb-propelled vessel would take about 130 years to make the trip, although with no ability to slow down at the other end (which more than doubles the energy needed) it would zip through the alien solar system in a matter of days. Daedalus, though quicker, would also zoom right past its target, collecting what data it could along the way. Icarus, its spiritual successor, would be able at least to slow down. Only Project Longshot, run by NASA and the American navy, envisages actually stopping on arrival and going into orbit around the star to be studied.

But nuclear rockets have problems of their own. For one thing, they tend to be big. Daedalus would weigh 54,000 tonnes, partly because it would have to carry all its fuel with it. That fuel itself has mass, and therefore requires yet more fuel to accelerate it, a problem which quickly spirals out of control. And the fuel in question, an isotope of helium called 3He, is not easy to get hold of. The Daedalus team assumed it could be mined from the atmosphere of Jupiter, by humans who had already spread through the solar system.

A different approach, pioneered by the late Robert Forward, was championed by Dr Benford and his brother Gregory, who, like Forward, is both a physicist and a science-fiction author. The idea is to leave the troublesome fuel behind. Their ships would be equipped with sails. Instead of filling them with wind, an orbiting transmitter would fill them with energy in the form of lasers or microwave beams, giving them a ferocious push to a significant fraction of the speed of light which would be followed (with luck) by an uneventful cruise to wherever they were going.

“Cheaper”, though, is a relative term. Jim Benford reckons that even a small, slow probe designed to explore space just outside the solar system, rather than flying all the way to another star, would require as much electrical power as a small country — beamed, presumably, from satellites orbiting Earth. A true interstellar machine moving at a tenth of the speed of light would consume more juice than the entirety of present-day civilisation. The huge distances involved mean that everything about starships is big. Cost estimates, to the extent they mean anything at all, come in multiple trillions of dollars.

That illustrates another question about starships, beyond whether they are possible. Fifty years of engineering studies have yet to turn up an obvious technical reason why an unmanned starship could not be built (crewed ships might be doable too, although they throw up a host of extra problems). But they have not answered the question of why anyone would want to go to all the trouble of building one.

Set Phasers to “Vaporize”

Friday, November 11th, 2016

Star Trek’s phasers have a top setting that will vaporize a human:

That’s not just overkill, that’s an insane level of overkill. It’s like using a TOW anti-tank missile to target an individual.

And this is one of the things that Star Trek got wrong. Not that it’s necessarily impossible for a weapon the size of a keychain to vaporize a human, but that the process of vaporizing the human wouldn’t utterly trash the surroundings. Face it: you’re converting, oh, 180 pounds of water to steam, and converting the calcium in the bones, the metal and plastic in his clothes, tools, weapons, etc. into plasma. And if the target is also holding a phaser, you’re converting that into vapor, which means that its battery (or whatever the power source is) is going to explode.

Phaser-vaporizing someone on board a spaceship is going to be a disaster, because by converting 180 pounds of water into steam, you’re increasing the volume by a factor of around 1,000. Imagine if the room the target was in suddenly had to hold the volume of 1,000 more people. The pressure will blow the hull apart. A blaster, by contrast, will simply poke a hole in the target, maybe burning their clothes.
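The quoted ~1,000× expansion factor is the right order of magnitude; a back-of-envelope ideal-gas estimate (my own numbers, assuming steam at 100 °C and 1 atm) actually comes out somewhat higher, around 1,700×:

```python
# Ideal-gas estimate of the liquid-water-to-steam expansion factor.
R = 0.08206                    # L*atm / (mol*K), ideal gas constant
T = 373.15                     # K, boiling point of water at 1 atm
P = 1.0                        # atm
MOLAR_MASS_G = 18.0            # g/mol, water
LIQUID_DENSITY_G_PER_ML = 1.0  # liquid water, approximately

steam_L_per_mol = R * T / P                                  # ~30.6 L/mol
liquid_L_per_mol = MOLAR_MASS_G / LIQUID_DENSITY_G_PER_ML / 1000.0
expansion = steam_L_per_mol / liquid_L_per_mol
print(f"{expansion:.0f}x")  # ~1700x; superheated steam expands even more
```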

Star Trek always made the result of someone getting vaporized pretty… well, sterile. Zap, bright light, gone. But it wouldn’t be like that. If you want to know what someone getting phasered at full power would look like, YouTube provides. Behold the phenomenon of the “Arc Flash,” where enough electrical energy can be dumped into a human to convert said human into a steam explosion. Obviously, this might be considered slightly grisly, so gather the kids around (occurs at 1:14; you can adjust settings to .25 speed to watch the guy go from “normal” to “Hey, he’s a glowing blob, just like in Star Trek” to “Where’d he go?” in three frames):

It’s kinda unclear just what the hell happened here, but it sure looks like the guy was converted mostly into a cloud and a bit of a spray. In any event, there’s no missing the fact that something really quite energetic happened to the guy. If the captain of a Klingon scout vessel vaporizes one of his crew on the bridge, they’re going to be scrubbing it down for *days,* assuming that the steam and overpressure doesn’t kill everyone else on the bridge.

It turns out that the arc flash did not vaporize the worker:

(Hat tip to Nyrath.)

Arctic Foxes Grow Their Own Gardens

Wednesday, November 9th, 2016

Arctic foxes grow their own gardens:

The underground homes, often a century old, are topped with gardens exploding with lush dune grass, diamondleaf willows, and yellow wildflowers — a flash of color in an otherwise gray landscape.

Arctic Fox at Entrance of its Den in Alaska's Arctic National Wildlife Refuge

“These animals are fertilizing and basically growing a garden.”

The gardens create such a stark contrast on the tundra that the scientists who recently published the first scientific study of the dens have dubbed the foxes “ecosystem engineers.”

Conducted in 2014 near Churchill, Manitoba, the experiments revealed that the foxes’ organic waste supports almost three times as much botanical biomass in summer months as the rest of the tundra.

[...]

Some dens are over a century old, and the best are elevated: ridges, mounds, riverbanks. But with so much permafrost — frozen ground — and such a flat environment, prime sites can take years to develop.

And since digging new homes wastes valuable energy, real estate is limited — so foxes reuse locations — and in a strange time-share, foxes sometimes steal sites belonging to ground squirrels.

With litters averaging eight to ten pups (some as large as 16), the foxes deposit large amounts of nutrients in and around their dens through a combination of urination, defecation, and leftover kills.

In winter, foxes don’t drink water or eat snow or ice, since doing so would lower their core temperature. Instead they get water from their food, which concentrates the nutrients in their urine, making it more potent.

Feral Pigs and Rabid Bats

Wednesday, November 9th, 2016

Brazil has an unusual feral pig problem:

There have been feral pigs in Brazil for up to 200 years, research suggests, when a few domestic pigs escaped and went wild in the Pantanal region. But a large-scale, country-wide invasion can be traced back to the 1990s, when wild boars were imported from Europe and Canada for use in high-quality meat products. In Brazil, many farmers bred these boars with the domestic pigs that already existed in the country. Eventually, the government stopped permitting the importation of wild boars, and many of the interbred pigs were released — accidentally or intentionally — into the wild.

The feral pigs cause enough ecological and agricultural damage as it is, but now the authors of the new study are concerned that their continued spread could boost bat populations in some areas and contribute to a spike in rabies infections in people. This could happen in a variety of ways, they’ve suggested. While vampire bats have been known to bite sleeping humans and infect them directly, bushmeat hunters — and their hunting dogs — could also be exposed through contact with infected pigs.

And rabies isn’t the only concern either, Pedrosa added. Vampire bats are known reservoirs for a handful of other infectious diseases as well, including several viruses that can cause serious respiratory illness in humans.

They haven’t addressed the problem the way Texans have:

The Brazilian government has established a program allowing the killing of feral pigs, he noted, but added that rigorous restrictions on the purchase of firearms have kept the number of participants fairly small so far.

Rabid bats are a problem in Brazil:

In 2005, a spate of vampire-bat attacks on humans in Brazil made international headlines, causing 23 rabies deaths in two months and leading more than 1,300 people to seek medical treatment for rabies.

[...]

Today, the incidence of rabies infections in vampire bats varies by location — it tends to be anywhere from about 1 percent up to 10 percent, according to the authors of the new paper.

[...]

For the new paper, the researchers analyzed thousands of photographs and videos used to monitor wildlife in Brazil’s Pantanal region, a tropical wetland area mostly occupying the Brazilian state of Mato Grosso do Sul, and the Atlantic Forest, which runs down the Atlantic coast. They found that, in addition to preying on livestock like cattle, the bats also feed on wild animals including tapirs, deer and feral pigs. The videos and photos from the Pantanal region suggested there was about a 2 percent chance that a pig might be attacked by a vampire bat on any given night. In the Atlantic Forest, this chance rose to 11 percent.