Anti-Helicopter Mines?

Friday, January 20th, 2017

The U.S. Army is now concerned about anti-helicopter mines:

Bulgaria, which seems to have developed these devices as far back as the late 1990s, offers several mines such as the AHM-200, a 200-pound device which looks like a mortar tube mounted on a tripod. The mine, which is emplaced on the surface rather than buried in the dirt, has an acoustic sensor which arms the weapon when it picks up the sound of the helicopter as far away as 1,500 feet. At a range of 500 feet, a Doppler radar tracks the target. When the helicopter gets within 300 feet, the mine detonates both an explosively formed projectile and an explosive charge packed with steel balls.

A 2012 Russian news video shows what looks like a similar device. A Russian expert in the video claims that anti-helicopter mines were developed because shoulder-fired anti-aircraft missiles are ineffective against helicopters flying lower than 300 feet.

Other nations have also developed anti-helicopter mines. Poland has one, while Austria has developed an infrared-guided version.

How Amazon innovates in ways that Google and Apple can’t

Saturday, January 14th, 2017

Timothy B. Lee explains how Amazon innovates in ways that Google and Apple can’t:

Amazon has figured out how to combine the entrepreneurial culture of a small company with the financial resources of a large one. And that allows it to tackle problems most other companies can’t.

[...]

Google’s approach — solve the hard technical problems first, worry about the business model later — is rooted in the engineering background of Google founders Larry Page and Sergey Brin. In contrast, Amazon CEO Jeff Bezos spent almost a decade working for several Wall Street firms before starting Amazon — a background that gives him a more pragmatic outlook, one focused on developing products customers will actually want to pay for.

Bezos has worked to create a culture at Amazon that’s hospitable to experimentation.

“I know examples where a random Amazon engineer mentions ‘Hey I read about an idea in a blog post, we should do that,’” Eric Ries says. “The next thing he knows, the engineer is being asked to pitch it to the executive committee. Jeff Bezos decides on the spot.”

A key factor in making this work, Ries says, is that experiments start small and grow over time. At a normal company, when the CEO endorses an idea, it becomes a focus for the whole company, which is a recipe for wasting a lot of resources on ideas that don’t pan out. In contrast, Amazon creates a small team to experiment with the idea and find out if it’s viable. Bezos famously instituted the “two-pizza team” rule, which says that teams should be small enough to be fed with two pizzas.

Ries says that new teams get limited funding and clear milestones; if a team succeeds in smaller challenges, it’s given more resources and a larger challenge to tackle.

But Amazon doesn’t spend too much time on internal testing. “They prioritize launching early over everything else,” one engineer wrote in an epic 2011 rant comparing Amazon’s culture to other technology companies. Launching early with what Ries has dubbed a “minimum viable product” allows Amazon to learn as quickly as possible whether an idea that sounds good on paper is actually a good idea in the real world.

Of course, this method isn’t foolproof; Amazon has had plenty of failures, like its disastrous foray into the smartphone market. But by getting a product into the hands of paying customers as quickly as possible and taking their feedback seriously, Amazon avoids wasting years working on products that don’t serve the needs of real customers.

This seems to be the approach Amazon is taking with Amazon Go, its new convenience store concept. It’s a technology that could work in many different types of retail stores, but Amazon’s initial approach is modest: a single, relatively small convenience store. Media reports suggest that Amazon plans to open 2,000 retail stores, but the company disputes this. The Amazon way, after all, isn’t to open one store because there’s a plan for 2,000. It’s to open one store and then open thousands more if the first one is a big success.

In the abstract this approach — minimize bureaucracy, start out with small experiments, expand them if they’re successful — sounds so good that it’s almost banal. But it’s surprisingly difficult for big companies to do this, especially when they’re entering new markets.

Over time, big companies develop cultures and processes optimized for the market where they had their original success. Companies have a natural tendency to establish uniform standards across the enterprise.

[...]

“It doesn’t matter what technology” teams use at Amazon, one of the company’s former engineers wrote in 2011. Bezos has explicitly discouraged the kind of standardization you see at companies like Google and Apple, encouraging teams to operate independently using whatever technology makes the most sense.

Bezos has worked hard to make Amazon a modular, flexible organization with a minimum of company-wide policies.

(Hat tip to Arnold Kling.)

The U.S. Army’s Radical Idea to Save Its Tanks from Enemy Missiles

Friday, January 13th, 2017

The U.S. Army’s radical idea to save its tanks from enemy missiles involves a shield:

OBJECTIVE: Develop and demonstrate a model for a mechanism capable of moving an armor panel of at least 1 square foot with an areal density of 100 pounds per square foot (PSF) 10” horizontally in less than 5 seconds. The movement is intended to be repeatable and controlled from the interior of the vehicle and shall not pose harm to dismounted personnel.

DESCRIPTION: Conventional armor solutions currently being integrated are “not adaptable” in providing increased threat capability and protection from a greatly expanded set of threats. A solution is needed for threats that are not feasibly addressed with conventional armor systems. Conventional armor systems are essentially static and unable to respond to unanticipated changes in threats deployed against the system; essentially the army has limited potential to increase the capabilities of current static armor recipes in order to balance size, weight, and performance requirements.

Increased threat defeat using conventional armor is prohibitive due to the significant weight burdens associated with increased protection. Any increase in weight has secondary effects such as limited off-road mobility and increased logistics burden.

This SBIR topic solicits new, innovative approaches to incorporate mechanisms into an armor system to provide protection against increased threats. For the purpose of this effort the system shall be designed to interface with a 1” plate of Rolled Homogenous Armor (RHA) Plate that represents a surrogate vehicle structure. The mechanism needs to be capable of moving a 100 PSF armor panel 10 inches horizontally in less than 5 seconds. The mechanism needs to be able to withstand automotive loading as well as environmental conditions typical of a combat vehicle. The proposal should discuss in detail how the system could be incorporated onto a vehicle platform and what the projected Space, Weight, Power, and cooling (SWAP-C) impact at the vehicle level would be.

The proposal shall not include a system that could be described as an Active Protection System (APS). A system is considered an APS if either of the two statements applies: 1. A light-weight hit avoidance vehicle defense system which, when integrated on a ground combat vehicle, can detect, track, and then interdict by diversion, disruption, neutralization, or destruction of incoming line-of-sight threat munitions. 2. A system that deploys a counter-measure that does not provide any inherent protection to the vehicle system when the counter-measure does not perform as designed.
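
As a sanity check on the actuation requirement, a 1-square-foot panel at 100 PSF weighs about 45 kg, and 10 inches in 5 seconds is an average speed of roughly 5 cm/s. The sketch below estimates the force and power a drive mechanism might need; the friction coefficient is an assumed value for illustration and does not come from the solicitation.

    # Back-of-envelope estimate for the armor-translation requirement.
    # Panel mass, travel, and time come from the solicitation above;
    # the sliding-friction coefficient is an assumption.
    G = 9.81                       # m/s^2
    panel_mass_kg = 100 * 0.4536   # 100 lb panel (1 ft^2 at 100 PSF)
    travel_m = 10 * 0.0254         # 10 inches
    time_s = 5.0
    mu_assumed = 0.2               # assumed sliding-friction coefficient

    avg_speed = travel_m / time_s                     # ~0.05 m/s
    friction_force = mu_assumed * panel_mass_kg * G   # ~89 N
    avg_power = friction_force * avg_speed            # a few watts

    print(f"average speed  : {avg_speed:.3f} m/s")
    print(f"friction force : {friction_force:.0f} N (at mu={mu_assumed})")
    print(f"average power  : {avg_power:.1f} W")

On these assumptions the steady-state power is only a few watts; the hard part of the requirement is the shock, vibration, and environmental robustness the solicitation calls out, not raw actuation power.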

Lego Boost

Wednesday, January 4th, 2017

Lego’s new Boost line was designed to be less complex than its Mindstorms line:

My favorite was Vernie, a bowtie-clad robot with amazing moving eyebrows. There’s also a cat, a space rover, a factory and a guitar.

LEGO Boost

While most of the pieces resemble the billions out there in the wild, Lego Boost kits come with a special Move Hub. Inside is a computer, a wireless chip and a tilt sensor. Attach that, along with included motors and a special sensor that detects color and distance, and the creations come to life.

Actually, there’s one additional step: coding. Lego Boost connects to an Android or iOS tablet app—at launch, no phones, however. The app demonstrates how to assemble simple lines of instruction. No typing required. Like real-world Lego bricks, these digital blocks of code stack up to make your Lego creation respond to stimuli or perform a routine. In a few minutes, I was able to make Vernie do a little dance, and the cat meow when I gave it a milk bottle made from bricks.

The rest of us are end users

Wednesday, January 4th, 2017

More people believe in magic than we would care to admit, Richard Fernandez says:

ISIS is currently carrying out a campaign against wizards in their midst and is executing those they suspect of dabbling in it. But that is understandable given their world view.

[...]

When the last cellphone in the Caliphate is destroyed or worn out no one will know how to make another. Their 8th century is capable of producing fanaticism but probably couldn’t make a ball point pen. Objects in the ISIS universe are “magical” — put there by Allah in the possession of the infidel for holy warriors to plunder and enjoy until the power which inheres in them gradually fades away.

Surprisingly much of the modern world is not very different. Many people treat technology like magic even in the West. How does a cell phone work? Dunno. Where does it come from? The store. Civilization depends on the knowledge of a small fraction of the world’s 7.5 billion population. The know-how to make pharmaceuticals, complex devices, aircraft, computers, industrial chemicals from scratch is probably confined to a few million people concentrated in North America, Europe, Russia and North Asia. The rest of us are end users.

If a global catastrophe destroyed all of civilization’s works yet spared these few million, they could re-create every object in the world again. By contrast, if only these few million perished, the remaining billions, though untouched, could continue only until things broke down. It is knowledge which sustains civilization.

Mix of Graphene With ‘Silly Putty’ Yields Extremely Sensitive Sensor

Monday, December 12th, 2016

Mixing graphene — a material made of single-atom-thick layers of carbon — with homemade “Silly Putty” produces a sensor so sensitive that it can detect the tiny footsteps of spiders:

Dr. Coleman’s lab has a long tradition of incorporating household products into nanotechnology research. For instance, they have made graphene using a kitchen blender. The idea of mixing graphene with silly putty came from one of Dr. Coleman’s students. He greenlit the project, thinking it would be a good outreach tool. The material turned out to have unusual and interesting properties.

Silly Putty manufacturer Crayola didn’t respond to a request for comment.

To do their tests, the scientists hooked their G-putty up to a recording device with wires. When pressure was applied—by a spider’s walking or a heart’s pulsing—they showed that G-putty’s electrical resistance changed in a measurable way, giving scientists the “basis of a sensor,” according to Dr. Coleman. G-putty, he says, is about 10 times as sensitive as other similar technologies. (In a wearable, it would be connected to some sort of battery, he said.)
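
The article doesn’t give circuit details, but the usual way to turn a resistance change into a signal is a voltage divider read by an ADC. The sketch below, with made-up component values and a hypothetical 1% resistance change, shows how a squeeze on the G-putty becomes a measurable voltage shift.

    # Illustrative voltage-divider readout for a piezoresistive sensor.
    # Component values and the 1% change are hypothetical; the article only
    # says that G-putty's resistance changes measurably under pressure.
    V_SUPPLY = 3.3            # volts
    R_FIXED = 10_000.0        # ohms, fixed divider resistor (assumed)
    R_PUTTY_REST = 10_000.0   # ohms, G-putty at rest (assumed)

    def divider_voltage(r_putty: float) -> float:
        """Voltage across the G-putty element in a simple divider."""
        return V_SUPPLY * r_putty / (R_FIXED + r_putty)

    v_rest = divider_voltage(R_PUTTY_REST)
    v_pressed = divider_voltage(R_PUTTY_REST * 0.99)   # 1% drop under light pressure
    print(f"output shifts by {abs(v_rest - v_pressed) * 1000:.1f} mV")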

Molten-Salt Reactors

Thursday, December 8th, 2016

China hopes to build the world’s largest nuclear power industry, with both conventional nuclear plants and a variety of next-generation reactors, including thorium molten-salt reactors, high-temperature gas-cooled reactors (which, like molten-salt reactors, are both highly efficient and inherently safe), and sodium-cooled fast reactors (which can consume spent fuel from conventional reactors to make electricity):

Alvin Weinberg first came to Oak Ridge in 1945, just after its laboratories had been built in the eastern Tennessee hills to make weapons-grade uranium and plutonium. A veteran of the Manhattan Project, Weinberg became director of the rapidly growing national lab in 1955 and held the position until 1973. He was a pioneering nuclear physicist and a philosopher of nuclear power who used the phrase “Faustian bargain” to describe the tension between industrialized society’s thirst for abundant energy and the extreme vigilance needed to keep nuclear power safe. To make this energy source both clean and extremely cheap, he believed, the link between nuclear power and nuclear weapons would have to be severed. And the way to break that link was the thorium molten-salt reactor.

Under Weinberg’s leadership, a team of enthusiastic young chemists, physicists, and engineers operated a small, experimental molten-salt reactor from 1965 to 1969. That reactor at Oak Ridge ran on uranium; Weinberg’s eventual goal was to build one that would run exclusively on thorium, which, unlike uranium, cannot easily be made into a bomb. But the molten-salt experiment was abandoned in the early 1970s. One big reason was that Weinberg managed to alienate his superiors by warning of the dangers of conventional nuclear power at a time when dozens of such reactors were already under construction or in the planning stages.

By the end of the century, the U.S. had built 104 nuclear reactors, but construction of new ones had all but come to a halt, and the technology remained stuck in the 1970s. Because conventional reactors require huge, costly containment vessels that can blow up in extreme conditions, and because they use extensive external cooling systems to make sure the solid-fuel core doesn’t overheat and cause a runaway reaction leading to a meltdown, they are hugely expensive. Two new reactors being built now in Georgia could cost $21 billion, 50 percent over the original estimate of $14 billion. All that for 40-year-old technology.

Today, though, as climate change accelerates and government officials and scientists seek a nuclear technology without the expensive problems that have stalled the conventional version, molten salt is enjoying a renaissance. Companies such as Terrestrial Energy, Transatomic Power, Moltex, and Flibe Energy are vying to develop new molten-salt reactors. Research programs on various forms of the technology are under way at universities and institutes in Japan, France, Russia, and the United States, in addition to the one at the Shanghai Institute. Besides the work going into developing solid-fuel reactors that are cooled by molten salt (like the one I toured virtually in Shanghai), there are even more radical designs that also use radioactive materials dissolved in molten salt as the fuel (as Weinberg’s experiment did).

Like all nuclear plants, molten-salt reactors excite atoms in a radioactive material to create a controlled chain reaction. The reaction unleashes heat that boils water, creating steam that drives a turbine to generate electricity. Solid-fuel reactors cooled with molten salt can run at higher temperatures than conventional reactors, making them more efficient, and they operate at atmospheric pressures—meaning they do not require expensive vessels of the sort that ruptured at Chernobyl. Molten-salt reactors that use liquid fuel have an even more attractive advantage: when the temperature in the core reaches a certain threshold, the liquid expands, which slows the nuclear reactions and lets the core cool. To take advantage of this property, the reactor is built like a bathtub, with a drain plug in the bottom; if the temperature in the core gets too high, the plug melts and the fuel drains into a shielded tank, typically underground, where it is stored safely as it cools. These reactors should be able to tap more of the energy available in radioactive material than conventional ones do. That means they should dramatically reduce the amount of nuclear waste that must be handled and stored.
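
The efficiency claim follows from basic thermodynamics: the hotter the coolant leaving the core, the larger the fraction of its heat a turbine can convert into electricity. A rough comparison, using a typical pressurized-water-reactor outlet temperature of about 320°C and a molten-salt outlet of about 700°C (round numbers chosen for illustration, not figures from the article):

    # Carnot-limit comparison for two coolant outlet temperatures (illustrative).
    def carnot_efficiency(t_hot_c: float, t_cold_c: float = 30.0) -> float:
        """Ideal (Carnot) efficiency between a hot source and a ~30 C sink."""
        t_hot_k = t_hot_c + 273.15
        t_cold_k = t_cold_c + 273.15
        return 1.0 - t_cold_k / t_hot_k

    print(f"water-cooled (~320 C outlet): {carnot_efficiency(320):.0%} ideal limit")
    print(f"molten salt  (~700 C outlet): {carnot_efficiency(700):.0%} ideal limit")

Real plants reach only a fraction of the Carnot limit, but the relative advantage of the hotter cycle carries over.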

Because they don’t require huge containment structures and need less fuel to produce the same amount of electricity, these reactors are more compact than today’s nuclear plants. They could be mass-produced, in factories, and combined in arrays to form larger power plants.

All of that should make them cheaper to build. Unlike wind and solar, which have gotten far less expensive over time, nuclear plants have become much more expensive. According to the U.S. Energy Information Administration, the inflation-adjusted cost of building a nuclear plant rose from $1,500 per kilowatt of capacity in the early 1960s to more than $4,000 a kilowatt by the mid-1970s. In its latest calculation, in 2013, the EIA found that the figure had risen to more than $5,500—more expensive than a solar power plant or onshore wind farm, and far more than a natural-gas plant. That up-front cost is amplified by the large size of the reactors; at the average cited by the EIA, a one-gigawatt plant would cost $5.5 billion, a risky investment for any company.

Those up-front costs are balanced by the fact that nuclear plants are relatively cheap to operate: at new plants the levelized cost of electricity, which measures the cost of power generated over the lifetime of the plant, is $95 per megawatt-hour, according to the EIA—comparable to the cost of electricity from coal-fired plants, and less than solar power ($125 a megawatt-hour). Still, natural-gas plants are far cheaper to build, and the cost of the electricity they produce ($75 a megawatt-hour, according to the EIA) is also lower. Tightening regulations on carbon emissions makes nuclear more attractive, but lowering the cost of construction is critical to the future of zero-carbon nuclear power.
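
The arithmetic behind these figures is straightforward. The sketch below reproduces the up-front cost of a one-gigawatt plant from the EIA’s per-kilowatt number and shows the shape of a levelized-cost calculation; the discount rate, lifetime, operating costs, and capacity factor are illustrative assumptions, not EIA inputs.

    # Up-front cost: dollars per kilowatt of capacity times plant capacity.
    cost_per_kw = 5_500      # EIA 2013 figure cited above, $/kW
    plant_kw = 1_000_000     # one gigawatt
    print(f"overnight cost: ${cost_per_kw * plant_kw / 1e9:.1f} billion")

    # Levelized cost of electricity: discounted lifetime costs divided by
    # discounted lifetime generation. All inputs below are illustrative.
    def lcoe(capital, annual_cost, annual_mwh, years=40, rate=0.10):
        disc = [(1 + rate) ** -t for t in range(1, years + 1)]
        costs = capital + sum(annual_cost * d for d in disc)
        energy = sum(annual_mwh * d for d in disc)
        return costs / energy    # $/MWh

    # A 1 GW plant at ~90% capacity factor generates about 7.9 million MWh a year.
    print(f"example LCOE: ${lcoe(5.5e9, 200e6, 7.9e6):.0f}/MWh")

With those made-up operating numbers the example lands in the same neighborhood as the EIA’s $95 per megawatt-hour figure, and it makes clear why the up-front capital term dominates nuclear economics.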

Why Saddam and Gaddafi Failed to get the Bomb

Saturday, November 26th, 2016

Målfrid Braut-Hegghammer, author of Unclear Physics, explains why Saddam and Gaddafi failed to get the Bomb:

While dictators with weak states can easily decide that they want nuclear weapons, they will find it difficult to produce them. Why? Personalist dictators like Saddam and Gaddafi weaken formal state institutions in order to concentrate power in their own hands. This helps them remain in power for longer, but makes their states inefficient. Weak states have fewer instruments to set up and manage complex technical programs. They lack the basic institutional capability to plan, execute, and review complicated technical projects. As a result, their leaders can be led to believe that the nuclear weapons program is doing great while, in fact, nothing is working out. In Libya, for example, scientists worked throughout the 1980s to produce centrifuges, with zero results.

[...]

As my book shows, these programs were afflicted with capacity problems at every stage, from initial planning to their final dismantlement. These problems were worse in Libya than in Iraq, because Gaddafi dismantled most state institutions as part of his Cultural Revolution during the 1970s. Saddam created a bloated state that was difficult to navigate for his officials, with competing agencies and programs blaming each other for various problems as these emerged. This made oversight difficult, from Saddam’s point of view, and caused endless infighting and backstabbing inside the Iraqi nuclear program. As a result, scientists spent days in endless meetings, blaming each other for delays, rather than working together as a team to solve problems they were facing.

Even when Saddam tried to put more pressure on his scientists to deliver results, he failed. After Israel destroyed a research reactor complex in Iraq in June 1981, Saddam became more determined to get nuclear weapons. But the program made little progress. In 1985, his leading scientists promised Saddam that they would achieve a major breakthrough by 1990 – without specifying what exactly they would achieve by that time. By 1987, it was clear that they would not be able to make a significant breakthrough by the deadline. This created plenty of shouting and conflict inside the program, and led to an in-house restructuring, but even at this stage no one was willing to tell Saddam the bad news. When the delays could no longer be denied, the scientists blamed another agency. This was a strategic blunder – because this agency was led by Saddam’s son-in-law, Hussein Kamil. Saddam put Kamil in charge of the nuclear weapons program. Even Kamil, who was notoriously brutal against his employees, became so frustrated with the nuclear program that he threatened to imprison anyone found to intentionally cause delays. Tellingly, this threat was never implemented.

In contrast, Libyan scientists often did not show up for work. The regime couldn’t just fire them, partly because there were too few scientists in Libya to begin with. The regime was unable to educate enough scientists and engineers, and had to hire foreigners (including many Egyptians). Some of the Egyptian scientists went on strike during a 1977 conflict between the two states – and, apparently, managed to negotiate better conditions. Not quite what we would expect from a brutal dictator, is it? But, as the history of Libya’s nuclear program demonstrates, the regime invested enormous sums in buying equipment without getting significantly closer to the nuclear weapons threshold. In fact, nothing worked – including phones, photo-copiers and expensive laboratory equipment. Some of the equipment broke, and no one knew how to fix it, whereas other stuff was left unopened because the technical staff was concerned that fluctuating voltage in their electrical system could break the equipment. The Soviet research reactor also faced problems, because the Libyans were unable to filter the water cooling the reactor system, which meant the pipes became clogged with sand.

The Iraqi and Libyan programs failed for different reasons. The Iraqi program was beginning to make some progress after the internal restructuring. Kamil decided to ignore Saddam’s rule to not seek help from abroad, and bought equipment for the nuclear weapons program from Germany and other countries in the late 1980s. But then, Saddam miscalculated badly and decided to invade Kuwait in the summer of 1990. After the invasion, the Iraqis launched a crash nuclear program. Kamil told Saddam that they were on the threshold of acquiring nuclear weapons in the fall of 1990, which wasn’t true. But, if Saddam hadn’t invaded Kuwait, which led to the 1991 Gulf War, he would most likely have acquired nuclear weapons. The Libyan program never even got close.

A Twist on Wing Design

Tuesday, November 15th, 2016

MIT researchers are testing a shape-changing wing that could replace the hinged flaps and ailerons of conventional flight controls:

They constructed the wing from tiny lightweight structural pieces made with Kapton foil on an aluminum frame, arranged in a lattice of cells like a honeycomb. The skin of the wing is made with overlapping strips of the flexible foil, layered like fish scales, allowing the pieces to slide across each other as the wing flexes, they said.

Flexible Wing from MIT

Two small motors apply a twisting pressure to each wingtip to control maneuvers in flight. They say this elastic airfoil can morph continuously to reduce drag, increase stall angle, reduce vibration, and control flutter.

The Soft Robotics abstract:

We describe an approach for the discrete and reversible assembly of tunable and actively deformable structures using modular building block parts for robotic applications. The primary technical challenge addressed by this work is the use of this method to design and fabricate low density, highly compliant robotic structures with spatially tuned stiffness.

This approach offers a number of potential advantages over more conventional methods for constructing compliant robots. The discrete assembly reduces manufacturing complexity, as relatively simple parts can be batch-produced and joined to make complex structures. Global mechanical properties can be tuned based on sub-part ordering and geometry, because local stiffness and density can be independently set to a wide range of values and varied spatially. The structure’s intrinsic modularity can significantly simplify analysis and simulation. Simple analytical models for the behavior of each building block type can be calibrated with empirical testing and synthesized into a highly accurate and computationally efficient model of the full compliant system.

As a case study, we describe a modular and reversibly assembled wing that performs continuous span-wise twist deformation. It exhibits high performance aerodynamic characteristics, is lightweight and simple to fabricate and repair. The wing is constructed from discrete lattice elements, wherein the geometric and mechanical attributes of the building blocks determine the global mechanical properties of the wing. We describe the mechanical design and structural performance of the digital morphing wing, including their relationship to wind tunnel tests that suggest the ability to increase roll efficiency compared to a conventional rigid aileron system. We focus here on describing the approach to design, modeling, and construction as a generalizable approach for robotics that require very lightweight, tunable, and actively deformable structures.

Starship Troupers

Saturday, November 12th, 2016

Starship research is enjoying something of a boom:

Serious work in the field dates back to 1968, when Freeman Dyson, an independent-minded physicist, investigated the possibilities offered by rockets powered by a series of nuclear explosions. Then, in the 1970s, the British Interplanetary Society (BIS) designed Daedalus, an unmanned vessel that would use a fusion rocket to attain 12% of the speed of light, allowing it to reach Barnard’s Star, six light-years away, in 50 years. That target, though not the nearest star to the sun, was the nearest then suspected of having at least one planet.

[...]

During the cold war America spent several years and much treasure (peaking in 1966 at 4.4% of government spending) to send two dozen astronauts to the Moon and back. But on astronomical scales, a trip to the Moon is nothing. If Earth — which is 12,742km, or 7,918 miles, across — were shrunk to the size of a sand grain and placed on the desk of The Economist’s science correspondent, the Moon would be a smaller sand grain about 3cm away. The sun would be a larger ball nearly 12 metres down the hall. And Alpha Centauri B would be around 3,200km distant, somewhere near Volgograd, in Russia.
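
The scale model is easy to check: shrinking Earth’s 12,742 km diameter to a millimetre-sized sand grain fixes the scale factor, and the other distances follow (the 1 mm grain size is the only assumed number).

    # Scale-model check: shrink Earth to a 1 mm sand grain (assumed size).
    EARTH_DIAMETER_KM = 12_742
    scale = 1e-3 / (EARTH_DIAMETER_KM * 1e3)   # model metres per real metre

    distances_km = {
        "Moon":           384_400,
        "Sun":            149_600_000,
        "Alpha Centauri": 4.37 * 9.461e12,     # 4.37 light-years in km
    }
    for name, km in distances_km.items():
        print(f"{name:15s}: {km * 1e3 * scale:,.2f} m in the model")

That puts the Moon about 3 cm away, the Sun nearly 12 metres down the hall, and Alpha Centauri roughly 3,200 km distant, matching the article’s figures.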

Chemical rockets simply cannot generate enough energy to cross such distances in any sort of useful time. Voyager 1, a space probe launched in 1977 to study the outer solar system, has travelled farther from Earth than any other object ever built. A combination of chemical rocketry and gravitational kicks from the solar system’s planets has boosted its velocity to 17km a second. At that speed, it would (were it pointing in the right direction) take more than 75,000 years to reach Alpha Centauri.
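
The 75,000-year figure is simple division: Alpha Centauri is about 4.37 light-years away, and Voyager 1 covers 17 km every second.

    # Travel time to Alpha Centauri at Voyager 1's speed.
    LIGHT_YEAR_KM = 9.461e12
    distance_km = 4.37 * LIGHT_YEAR_KM      # Alpha Centauri
    speed_km_s = 17.0                       # Voyager 1
    seconds = distance_km / speed_km_s
    years = seconds / (365.25 * 24 * 3600)
    print(f"about {years:,.0f} years")      # roughly 77,000 years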

Nuclear power can bring those numbers down. Dr Dyson’s bomb-propelled vessel would take about 130 years to make the trip, although with no ability to slow down at the other end (which more than doubles the energy needed) it would zip through the alien solar system in a matter of days. Daedalus, though quicker, would also zoom right past its target, collecting what data it could along the way. Icarus, its spiritual successor, would be able at least to slow down. Only Project Longshot, run by NASA and the American navy, envisages actually stopping on arrival and going into orbit around the star to be studied.
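
Why does stopping cost so much more? Braking at the destination requires the same delta-v again, and the propellant for the braking burn must itself be hauled up to cruise speed first, so the required mass ratio squares rather than doubles. A quick look via the Tsiolkovsky rocket equation, using Daedalus’s 12% of light speed as the cruise velocity and an assumed exhaust velocity (and ignoring relativistic corrections):

    import math

    # Tsiolkovsky rocket equation: propellant penalty for stopping vs. a flyby.
    # The exhaust velocity is an assumed placeholder, not a Daedalus figure.
    def mass_ratio(delta_v: float, v_exhaust: float) -> float:
        """Initial mass divided by final (dry) mass for a given delta-v."""
        return math.exp(delta_v / v_exhaust)

    v_cruise = 0.12 * 3.0e5     # 12% of light speed, in km/s
    v_exhaust = 1.0e4           # assumed exhaust velocity, km/s

    flyby = mass_ratio(v_cruise, v_exhaust)          # accelerate only
    stopping = mass_ratio(2 * v_cruise, v_exhaust)   # accelerate, then brake
    print(f"flyby mass ratio    : {flyby:.0f}")
    print(f"stopping mass ratio : {stopping:.0f}  (the flyby ratio squared)")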

But nuclear rockets have problems of their own. For one thing, they tend to be big. Daedalus would weigh 54,000 tonnes, partly because it would have to carry all its fuel with it. That fuel itself has mass, and therefore requires yet more fuel to accelerate it, a problem which quickly spirals out of control. And the fuel in question, an isotope of helium called ³He, is not easy to get hold of. The Daedalus team assumed it could be mined from the atmosphere of Jupiter, by humans who had already spread through the solar system.

A different approach, pioneered by the late Robert Forward, was championed by Dr Benford and his brother Gregory, who, like Forward, is both a physicist and a science-fiction author. The idea is to leave the troublesome fuel behind. Their ships would be equipped with sails. Instead of filling them with wind, an orbiting transmitter would fill them with energy in the form of lasers or microwave beams, giving them a ferocious push to a significant fraction of the speed of light, which would be followed (with luck) by an uneventful cruise to wherever they were going.

“Cheaper”, though, is a relative term. Jim Benford reckons that even a small, slow probe designed to explore space just outside the solar system, rather than flying all the way to another star, would require as much electrical power as a small country — beamed, presumably, from satellites orbiting Earth. A true interstellar machine moving at a tenth of the speed of light would consume more juice than the entirety of present-day civilisation. The huge distances involved mean that everything about starships is big. Cost estimates, to the extent they mean anything at all, come in multiple trillions of dollars.
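
Those power figures come straight from photon momentum: a perfectly reflective sail receives a thrust of only 2P/c from a beam of power P, so even modest thrust means terawatts. The sketch below makes that concrete for a hypothetical one-tonne probe held at roughly 1 g during the boost; the probe mass and acceleration are assumptions for illustration, not Benford’s numbers.

    # Beam power needed to push a light sail: thrust F = 2P/c, so P = F*c/2.
    # The probe mass and acceleration are illustrative assumptions.
    C = 3.0e8                  # speed of light, m/s
    probe_mass_kg = 1_000.0    # hypothetical one-tonne probe
    accel_m_s2 = 9.8           # hold roughly 1 g during the boost phase

    thrust_n = probe_mass_kg * accel_m_s2
    beam_power_w = thrust_n * C / 2
    print(f"beam power: {beam_power_w / 1e12:.1f} TW")   # about 1.5 TW

That is a sizable fraction of the world’s entire average electrical output, for a single tonne of payload, which is why the estimates for a true interstellar craft run to more power than present-day civilisation uses.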

That illustrates another question about starships, beyond whether they are possible. Fifty years of engineering studies have yet to turn up an obvious technical reason why an unmanned starship could not be built (crewed ships might be doable too, although they throw up a host of extra problems). But they have not answered the question of why anyone would want to go to all the trouble of building one.

Jeff Bezos discusses space flight and his vision for Blue Origin

Sunday, October 30th, 2016

Jeff Bezos discusses space flight and his vision for Blue Origin at the 2016 Pathfinder Awards at Seattle’s Museum of Flight:

Colonizing Venus With Floating Cities

Wednesday, October 26th, 2016

Colonizing Venus with floating cities raises the question: why build cities floating in the atmosphere?

Because the atmosphere has raw materials for construction and for making breathable air, because the planet provides gravity that you’d otherwise need a large rotating habitat or tether-and-counterweight system to achieve, and because it offers a human-friendly temperature range that removes the need for complex thermal control systems. Venus is effectively the only other source of nitrogen in the inner system aside from Earth itself, and although Venus is very dry, the atmosphere does contain water, as well as sulfuric acid, which can be converted into water or itself used in industrial processes. And the atmosphere is primarily CO2, which can provide carbon for polymers.

Once bootstrapped, small supplemental shipments of minerals and machinery would allow enormous expansion, the atmosphere itself providing building material and lifting gas. The Venusian habitats could provide atmospheric gases and polymer building materials to the inner system. In reality, it is probably one of the easiest places to establish large Earthlike habitats. The plentiful sunlight, ease of constructing large habitats under Earthlike gravity, and constant supply of CO2, nitrogen, and water from the atmosphere could make it an agricultural center for supplying the inner system as well…not only with food, but also chemicals and materials derived from plants. And yes, the rocket fuel for getting things into orbit could also be synthesized from the atmosphere, and the thick atmosphere makes Venus itself one of the easiest planets to land on (even if you never actually touch land).

Converting CO2 to building material sounds like science fiction, but it is a fact that plants do it all around us.

[...]

The advantages of Venus are Earthlike gravity for small, non-rotating habitats; an environment with Earthlike temperatures and pressures, plus protection from radiation and micrometeorites, which reduces structural mass and the consequences of a breach (seal the compromised area off, then put on hazmat suits and patch the breach); plentiful availability of nitrogen (which Mars or orbital habitats would need a constant supply of); and sunlight for crops without concentrator mirrors.

Raining In High-Frequency Traders

Tuesday, October 25th, 2016

What is the relationship between high-frequency traders and liquidity?

Ever since high-frequency trading rose to prominence, a debate has raged over whether the ensuing arms race between super-fast traders helped or hindered markets. One side argues that it helps because the massive number of transactions the fastest traders engage in lowers costs by reducing the spreads between bids and offers. Critics counter that, in reality, spreads widen since slower traders need to charge higher spreads as insurance against getting caught flatfooted by a fast-moving event.

[...]

Starting in 2010, high-frequency traders began using ultrafast microwave links to relay prices and other information between Chicago and New York. To begin with, only some traders had access to microwave networks. Until 2013, others had to rely on less speedy fiber-optic cable.

But microwave transmissions are disrupted by water droplets and snowflakes, so during heavy storms traders using the networks switch to fiber. Messrs. Shkilko and Sokolov used weather-station data from along the microwaves’ paths to determine when storms occurred and then looked at what happened to bid-ask spreads in a variety of securities during those periods.

They narrowed, suggesting that the slowing down of the fastest high-frequency traders improved market liquidity.
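
The study design is easy to picture in code: flag the periods when rain or snow was degrading the microwave links, then compare average bid-ask spreads inside and outside those windows. A minimal sketch of that comparison, with a hypothetical data file and column names:

    import pandas as pd

    # Minimal sketch of the storm/no-storm spread comparison.
    # "quotes.csv" and its columns are hypothetical placeholders.
    quotes = pd.read_csv("quotes.csv", parse_dates=["timestamp"])
    # expected columns: timestamp, bid, ask, storm_on_path (bool, from weather data)

    midpoint = (quotes["ask"] + quotes["bid"]) / 2
    quotes["spread_bps"] = (quotes["ask"] - quotes["bid"]) / midpoint * 1e4

    # Narrower average spreads when storm_on_path is True would match the finding.
    print(quotes.groupby("storm_on_path")["spread_bps"].mean())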

Someone Is Learning How to Take Down the Internet

Monday, October 24th, 2016

Someone is learning how to take down the Internet, Bruce Schneier suggests:

Recently, some of the major companies that provide the basic infrastructure that makes the Internet work have seen an increase in DDoS attacks against them. Moreover, they have seen a certain profile of attacks. These attacks are significantly larger than the ones they’re used to seeing. They last longer. They’re more sophisticated. And they look like probing. One week, the attack would start at a particular level of attack and slowly ramp up before stopping. The next week, it would start at that higher point and continue. And so on, along those lines, as if the attacker were looking for the exact point of failure.

The attacks are also configured in such a way as to see what the company’s total defenses are. There are many different ways to launch a DDoS attack. The more attack vectors you employ simultaneously, the more different defenses the defender has to counter with. These companies are seeing more attacks using three or four different vectors. This means that the companies have to use everything they’ve got to defend themselves. They can’t hold anything back. They’re forced to demonstrate their defense capabilities for the attacker.

[...]

One company told me about a variety of probing attacks in addition to the DDoS attacks: testing the ability to manipulate Internet addresses and routes, seeing how long it takes the defenders to respond, and so on. Someone is extensively testing the core defensive capabilities of the companies that provide critical Internet services.

Who would do this? It doesn’t seem like something an activist, criminal, or researcher would do. Profiling core infrastructure is common practice in espionage and intelligence gathering. It’s not normal for companies to do that. Furthermore, the size and scale of these probes — and especially their persistence — points to state actors. It feels like a nation’s military cybercommand trying to calibrate its weaponry in the case of cyberwar. It reminds me of the U.S.’s Cold War program of flying high-altitude planes over the Soviet Union to force their air-defense systems to turn on, to map their capabilities.

Brian Krebs offers some specifics:

At first, it was unclear who or what was behind the attack on Dyn. But over the past few hours, at least one computer security firm has come out saying the attack involved Mirai, the same malware strain that was used in the record 620 Gbps attack on my site last month. At the end of September 2016, the hacker responsible for creating the Mirai malware released the source code for it, effectively letting anyone build their own attack army using Mirai.

Mirai scours the Web for IoT devices protected by little more than factory-default usernames and passwords, and then enlists the devices in attacks that hurl junk traffic at an online target until it can no longer accommodate legitimate visitors or users.

According to researchers at security firm Flashpoint, today’s attack was launched at least in part by a Mirai-based botnet. Allison Nixon, director of research at Flashpoint, said the botnet used in today’s ongoing attack is built on the backs of hacked IoT devices — mainly compromised digital video recorders (DVRs) and IP cameras made by a Chinese hi-tech company called XiongMai Technologies. The components that XiongMai makes are sold downstream to vendors who then use them in their own products.

“It’s remarkable that virtually an entire company’s product line has just been turned into a botnet that is now attacking the United States,” Nixon said, noting that Flashpoint hasn’t ruled out the possibility of multiple botnets being involved in the attack on Dyn.

Many of these devices allow users to change the default usernames and passwords on a Web-based administration panel — but the devices also have default usernames and passwords for telnet and SSH, which aren’t editable from the Web-based admin tools:

“The issue with these particular devices is that a user cannot feasibly change this password,” Flashpoint’s Zach Wikholm told KrebsOnSecurity. “The password is hardcoded into the firmware, and the tools necessary to disable it are not present. Even worse, the web interface is not aware that these credentials even exist.”

Flashpoint’s researchers said they scanned the Internet on Oct. 6 for systems that showed signs of running the vulnerable hardware, and found more than 515,000 of them were vulnerable to the flaws they discovered.

Colonizing Venus

Monday, October 24th, 2016

Colonizing Venus may be easier than colonizing Mars:

In many ways Venus is the hell planet. Results of spacecraft investigation of the surface and atmosphere of Venus are summarized by Bougher, Hunten, and Phillips [1997]:

  • Surface temperature 735 K: lead, tin, and zinc melt at the surface, with hot spots in excess of 975 K
  • Atmospheric pressure 96 bar (1,300 psi); similar to the pressure at a depth of a kilometer under the ocean
  • The surface is cloud-covered, with little or no solar energy
  • Poisonous atmosphere of primarily carbon dioxide, with nitrogen and clouds of sulfuric acid droplets.

However, viewed in a different way, the problem with Venus is merely that the ground level is too far below the one atmosphere level. At cloud-top level, Venus is the paradise planet. As shown in figure 2, at an altitude slightly above fifty km above the surface, the atmospheric pressure is equal to the Earth surface atmospheric pressure of 1 Bar. At this level, the environment of Venus is benign.

  • above the clouds, there is abundant solar energy
  • temperature is in the habitable “liquid water” range of 0-50°C
  • atmosphere contains the primary volatiles required for life (Carbon, Hydrogen, Oxygen, Nitrogen, and Sulfur)
  • Gravity is 90% of the gravity at the surface of Earth.

While the atmosphere contains droplets of sulfuric acid, technologies to avoid acid corrosion are well known, and have been used by chemists for centuries. In short, the atmosphere of Venus is the most Earthlike environment in the solar system. Although humans cannot breathe the atmosphere, pressure vessels are not required to maintain one atmosphere of habitat pressure, and pressure suits are not required for humans outside the habitat.

It is proposed here that, in the near term, human exploration of Venus could take place from aerostat vehicles in the atmosphere, and that, in the long term, permanent settlements could be made in the form of cities designed to float at about fifty kilometer altitude in the atmosphere of Venus.

On Venus, breathable air (i.e., an oxygen-nitrogen mixture at roughly a 21:78 mixture ratio) is a lifting gas. The lifting power of breathable air in the carbon dioxide atmosphere of Venus is about half a kilogram per cubic meter. Since air is a lifting gas on Venus, the entire lifting envelope of an aerostat can be breathable gas, allowing the full volume of the aerostat to be habitable volume. For comparison, on Earth, helium lifts about one kilogram per cubic meter, so a given volume of air on Venus will lift about half as much as the same volume of helium will lift on Earth.
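
The half-kilogram figure follows directly from the ideal gas law: at the same temperature and pressure, gas density scales with molecular weight, and CO2 (44 g/mol) is much heavier than breathable air (about 29 g/mol). A quick check at one bar, using an assumed 50°C at cloud level on Venus and 20°C on Earth:

    # Ideal-gas densities at ~1 bar: lift = density(outside) - density(inside).
    R = 8.314         # J/(mol K)
    P = 101_325.0     # Pa, about 1 bar

    def density(molar_mass_kg_per_mol: float, temp_k: float) -> float:
        return P * molar_mass_kg_per_mol / (R * temp_k)

    # Venus cloud level, assumed ~50 C: CO2 outside, breathable air inside.
    lift_venus = density(0.044, 323) - density(0.029, 323)
    # Earth at ~20 C: air outside, helium inside.
    lift_earth_he = density(0.029, 293) - density(0.004, 293)

    print(f"breathable air on Venus lifts about {lift_venus:.2f} kg per cubic meter")
    print(f"helium on Earth lifts about {lift_earth_he:.2f} kg per cubic meter")

Both results land near the figures quoted above: roughly half a kilogram per cubic meter on Venus and about one kilogram per cubic meter for helium on Earth.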

Settling Venus sounds oddly feasible:

In the long term, permanent settlements could be made in the form of cities designed to float at about fifty kilometer altitude in the atmosphere of Venus.

The thick atmosphere provides about one kilogram per square centimeter of mass shielding from galactic cosmic radiation and from solar particle event radiation, eliminating a key difficulty in many other proposed space settlement locations. The gravity, slightly under one Earth gravity, is likely to be sufficient to prevent the adverse effects of microgravity. At roughly one atmosphere of pressure, a habitat in the atmosphere will not require a high-strength pressure vessel.
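
The kilogram-per-square-centimeter of shielding is just the atmospheric pressure overhead divided by Venus’s gravity: a gas column exerting about one bar of pressure must have roughly that much mass above each square centimeter.

    # Column mass of atmosphere above the habitat: pressure / gravity.
    pressure_pa = 101_325.0   # ~1 bar at the habitat altitude
    g_venus = 8.87            # Venus gravity, m/s^2
    column_kg_per_m2 = pressure_pa / g_venus
    print(f"about {column_kg_per_m2 / 1e4:.1f} kg per square centimeter")   # ~1.1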

Humans would still require provision of oxygen, which is mostly absent from the Venusian atmosphere, but in other respects the environment is perfect for humans (although on the habitat exterior humans would still require sufficient clothing to avoid direct skin exposure to aerosol droplets).

Since breathable air is a lifting gas, the entire lifting envelope of an aerostat can be breathable gas, allowing the full volume of the aerostat to be habitable volume. For objects the size of cities, this represents an enormous amount of lifting power. A one-kilometer diameter spherical envelope will lift 700,000 tons (two Empire State Buildings). A two-kilometer diameter envelope would lift 6 million tons.

So, if the settlement is contained in an oxygen-and-nitrogen envelope the size of a modest city, the mass that can be lifted will in fact be large enough to include the mass of a modest city. The result would be an environment as spacious as a typical city.

The lifting envelope does not need to hold a significant pressure differential. Since at the altitudes of interest the external pressure is nearly one bar, atmospheric pressure inside the envelope would be the same as the pressure outside. The envelope material itself would be a rip-stop material, with high-strength tension elements to carry the load. With zero pressure differential between interior and exterior, even a rather large tear in the envelope would take thousands of hours to leak significant amounts of gas, allowing ample time for repair. (For safety, the envelope would also consist of several individual units).

Solar power is abundant in the atmosphere of Venus, and, in fact, solar arrays can produce nearly as much power pointing downward (toward the reflective clouds) as they produce pointing toward the sun. The Venus solar day, 116.8 terrestrial days, is extremely long; however, the atmospheric winds circle the planet much more rapidly, rotating around the planet in four days. Thus, on the habitat, the effective solar “night” would be roughly fifty hours, and the solar “day” the same. This is longer than an Earth day, but is still comfortable compared to, for example, the six-month night experienced in terrestrial near-polar locations. If the habitat is located at high latitudes, the day and night duration could be shortened toward a 24-hour cycle.
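
The fifty-hour figure comes from those winds rather than from the planet’s rotation: a habitat drifting with the superrotating atmosphere circles Venus in about four Earth days and spends roughly half of each circuit in sunlight.

    # Day/night cycle for a habitat drifting with the cloud-level winds.
    circuit_hours = 4 * 24                    # the winds circle Venus in ~4 days
    print(f"solar day and night: about {circuit_hours / 2:.0f} hours each")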