The Myth of “Natural” Resources

Friday, April 30th, 2010

This Popular Mechanics headline made Shannon Love laugh:

Bioengineers Turn Trees into Tires
Billions of gallons of oil are used worldwide every year to manufacture tires. Bioengineers are developing a plant-based substitute that could replace some of that oil within five years.

If only we could find some plant-based substitute for synthetic rubber:

I think it humorous that we started out making tires from trees but then so successfully and overwhelmingly switched to synthetic rubber that we now find the idea of making rubber from plant materials exciting and revolutionary.

Love says that this demonstrates The Myth of “Natural” Resources — that there’s not much natural about them:

The history of rubber provides a good example of our ability to create a resource when needed.

To begin with, the “natural” rubber that comes out of plants in the form of latex is useless. It’s a gummy sticky mess that smears and oozes onto everything and then turns brittle and crumbling when dry. To make it even basically useful humans must take action to heat the latex to increase the degree of polymerization in isoprene that makes up the bulk of the sap. Heated, treated rubber was all that raw rubber was used to make for thousands of years — until the discovery of vulcanization, a process in which humans added sulfur to the rubber while heating to make it solid but still elastic. In the last century, humans developed a wide range of processing techniques to turn the raw rubber into many different types of materials, each with different properties. The rubber we actually use is massively altered from that which comes out of the trees.

Okay, you may be thinking, I can see how we invent technology to turn the raw sap into many different materials, but we still need to take the sap from nature in the first place, don’t we?

Well, no. First, we don’t depend on the naturally occurring rubber plants in their natural environment to produce the sap. For centuries humans have been altering the natural environment to create a better environment just for rubber plants. We have informally and formally bred the plants so they will devote more resources to producing latex for us and less to ensuring their own reproduction.

(Also, an important but usually overlooked facet of resources is the human action needed to move a resource from its point of creation to where we use it. Without the ability to transport the sap or rubber from the tropics where the plants grow to anywhere on earth, the utility of rubber is severely restricted.)

Second, we don’t need the latex sap from plants at all. On December 7, 1941, one half of the world’s supply of rubber came from the Dutch East Indies. When the Japanese overran Indonesia in a matter of weeks, they reduced the Allies’ supplies of plant-sourced rubber by that same 50%. Given the importance of rubber for tires, waterproofing, electrical insulation, gaskets, etc., the loss of the rubber could have proven devastating to the Allied war effort. 1940s-era technology just wouldn’t work without a material with the properties of rubber. Yet within 18 months the Allies had higher stocks of rubber than when the war started.

How? Well, they just made it. They made it from oil, turpentine and anything else they had lying around. The truth of the matter is that for nearly a century, organic chemists have been able to turn almost any carbon-containing compound into any other carbon-containing compound. The only reason that natural rubber was used at all was because it required the least number of tradeoffs in other resources. With the pressure of the war, the tradeoffs shifted and the Allies shifted to synthetics. After the war, synthetics just got cheaper and cheaper until today the majority of all rubber materials is created from oil and coal using the chemists’ alchemy.

We don’t even have to pump any additional oil to make rubber. Most rubber and other materials made from oil are made from the heavy fractions left over from processing fuel. Without oil-based synthetic materials, refining oil for fuel would produce large amounts of hazardous waste. With the production of oil-based synthetics, not a drop of oil is wasted.

Further, as demonstrated by the Popular Mechanics story above, we don’t even have to use oil or coal. They are currently just the most convenient and cheapest sources of complex carbon compounds.

Rubber is in no way unusual in being replaceable. All supposedly “natural” resources are really artificial resources that we can generate in functionally unlimited quantities. Anything that qualifies as a resource: water, land, iron, aluminum, oil, any organic substance, etc., is created by human action and therefore is not limited by anything in nature.

Long term, we have never faced and never will face a situation where we must permanently ration a fixed and ever-dwindling resource. Anyone who says otherwise is either lying or (more likely) massively ignorant of the history of technology.

This truth does raise a question: If humans create “natural” resources, then why do we even have the concept? Why does it seem obvious to most that resources are finite attributes of nature?

Partly this no doubt results from the fact that, on time scales of months or years, sudden interruptions in an established system of producing a resource cause severe problems. It is usually not possible to create a replacement resource quickly in response to such an unpredictable event. This leads people to believe that the resource is fixed and limited, because it just seems to disappear. A good example of this illusion is the “energy crisis” of 1973–84, in which it was widely accepted that the crisis resulted from a physical depletion of the world’s stock of oil. In reality, political interference in the creation of oil caused the crisis, and the crisis disappeared when the political interference did.

Long term, it is normal (but invisible to most) that we are constantly shifting how we create every resource. What is useless dirt or ooze in one generation evolves into a vital resource in the next, and then is considered worthless in the next. People used to fight over dead-fall wood for use in household hearths. Today, dead branches are a nuisance that you have to pay people to haul off for you.

Everything we now call a resource was once useless. 200 years ago, aluminum was unknown and bauxite was just a red clay. For nearly a century after its discovery, aluminum was so rare and expensive it was considered a precious metal. Gradually we learned how to efficiently produce aluminum until today it’s so cheap we use it for disposable drinking containers. Moreover, in the past only one specific aluminum compound was considered a useful ore. Now there are dozens. We even learned how to do without aluminum altogether, such as by substituting carbon-fiber composites in aircraft and other traditional uses of aluminum. This process happens with every resource, without exception. This is simply how technology works, but most people don’t understand this.

Deconstructing the Outrage

Friday, April 30th, 2010

Victor Davis Hanson deconstructs the outrage over the Arizona law that makes being an illegal immigrant illegal:

I have been trying to collate all the furor over the Arizona law, much of it written by those who do not live in locales that have been transformed by illegal immigration. These writers are more likely to show solidarity from a distance than to visit or live in the areas that have been so radically changed by the phenomenon.

On the unfortunate matter of “presenting papers”: I have done that numerous times this year — boarding airplanes, purchasing things on a credit card, checking into a hotel, showing a doorman an I.D. when locked out, going to the DMV, and, in one case, pulling off a rural road to use my cell phone in a way that alarmed a chance highway patrolman. An I.D. check to allay “reasonable suspicion” or “probable cause” is very American.

On the matter of racial profiling: No one wishes to harass citizens by race or gender, but, again unfortunately, we already profile constantly. When I had top classics students, I quite bluntly explained to graduating seniors that those who were Mexican American and African American had very good chances of entering Ivy League or other top graduate schools from Fresno, those who were women and Asians so-so chances, and those who were white males with CSUF BAs very little chance, despite straight A’s and top GRE scores. The students themselves knew all that better than I — and, except the latter category, had packaged and self-profiled themselves for years in applying for grants, admissions, fellowships, and awards. I can remember being told by a dean in 1989 exactly the gender and racial profile of the person I was to hire before the search had even started, and not even to “waste my time” by interviewing a white male candidate. Again, the modern university works on the principle that faculty, staff, and students are constantly identified by racial and gender status. These were not minor matters, but questions that affected hundreds of lives for many decades to come. (As a postscript I can also remember calling frantically to an Ivy League chair to explain that our top student that he had accepted had just confessed to me that in fact he was an illegal alien, and remember him “being delighted” at the news, as if it were an added bonus.)

On the matter of equality, fairness, and compassion, it is even more problematic. Literally thousands of highly skilled would-be legal immigrants from Latin America, Africa, Asia, and Europe wait patiently while others cut in front and illegally obtain what others legally wait for — residence in the U.S. Meanwhile, millions of Mexican-American, African-American, and poor white citizens have seen their wages fall because of competition from illegal aliens who will work for far less compensation. It is a bit strange that those of the upper classes are outraged over Arizona without empathy for entry-level U.S. workers or lower-middle-class taxpayers who end up paying the most for illegal immigration. But then, those who express the most moral outrage often are the least sensitive to the moral questions involved (see next).

Richard Fernandez notes that any illegal immigrant who got treated like a legal immigrant or visitor would sue for the violation of his human rights — and win:

In this world, it is immoral to ask a poor migrant to produce papers. That’s Nazism. But a rich migrant of the same color is not only subjected to the “papers please” requirement but asked to post bonds, show property ownership, exhibit bank accounts, subject himself to security clearances, health checks and attend interviews with a nameless consular official sitting behind an armored glass window. And then if he does all that, he can be asked to wait. And wait. And wait.

Which is more anti-minority?

Friday, April 30th, 2010

Which is more anti-minority?, Aretae asks:

  1. Arizona’s law making it illegal to be an illegal immigrant?
  2. Portland or assorted rich coastal California towns using land use regulations to make absolutely certain that no poor people (read Non-Asian Minorities) with families can live anywhere nearby?

Commenter Mark Horning makes the point that the Arizona law was very carefully written:

It makes illegal under Arizona law exactly what is already illegal in all 50 states and the territories and district. They copied the language verbatim from the existing federal statute.

The only legal difference is that now an Arizona POST-certified officer can enforce the law under Arizona statute, instead of needing a federal officer to enforce it under federal statute. No different than charging someone with counterfeiting under Arizona law as opposed to federal law. If you have a problem with that, the problem is with the federal statute that has been on the books since 1940.

The requirement for “reasonable suspicion” is the same language as set forth by the SCOTUS in Terry v. Ohio. (Also see Hiibel v. Nevada.) If the cops have reasonable suspicion that you have broken a law, they can detain you until you identify yourself (a Terry stop). They already have this power under Supreme Court precedent. Again, the problem is not with the law; it’s with Terry v. Ohio.

Six Easy Steps to Avert the Collapse of Civilization

Friday, April 30th, 2010

David Eagleman suggests six easy steps to avert the collapse of civilization:

  1. Try Not to Cough on One Another
  2. Don’t Lose Things
  3. Tell Each Other Faster
  4. Mitigate Tyranny
  5. Get More Brains Involved
  6. Try Not to Run Out of Energy

The Cancer of Bureaucracy

Friday, April 30th, 2010

Bruce Charlton decries the cancer of bureaucracy:

Everyone living in modernizing ‘Western’ societies will have noticed the long-term, progressive growth and spread of bureaucracy infiltrating all forms of social organization: nobody loves it, many loathe it, yet it keeps expanding. Such unrelenting growth implies that bureaucracy is parasitic and its growth uncontrollable — in other words it is a cancer that eludes the host immune system.

Old-fashioned functional, ‘rational’ bureaucracy that incorporated individual decision-making is now all-but extinct, rendered obsolete by computerization. But modern bureaucracy evolved from it, the key ‘parasitic’ mutation being the introduction of committees for major decision-making or decision-ratification. Committees are a fundamentally irrational, incoherent, unpredictable decision-making procedure; which has the twin advantages that it cannot be formalized and replaced by computerization, and that it generates random variation or ‘noise’ which provides the basis for natural selection processes.

Modern bureaucracies have simultaneously grown and spread in a positive-feedback cycle; such that interlinking bureaucracies now constitute the major environmental feature of human society which affects organizational survival and reproduction. Individual bureaucracies must become useless parasites which ignore the ‘real world’ in order to adapt to rapidly changing ‘bureaucratic reality’.

Within science, the major manifestation of bureaucracy is peer review, which — cancer-like — has expanded to obliterate individual authority and autonomy. There has been local elaboration of peer review and metastatic spread of peer review to include all major functions such as admissions, appointments, promotions, grant review, project management, research evaluation, journal and book refereeing and the award of prizes.

Peer review eludes the immune system of science since it has now been accepted by other bureaucracies as intrinsically valid, such that any residual individual decision-making (no matter how effective in real-world terms) is regarded as intrinsically unreliable (self-interested and corrupt). Thus the endemic failures of peer review merely trigger demands for ever-more elaborate and widespread peer review.

Just as peer review is killing science with its inefficiency and ineffectiveness, so parasitic bureaucracy is an un-containable phenomenon; dangerous to the extent that it cannot be allowed to exist unmolested, but must be utterly extirpated. Or else modernizing societies will themselves be destroyed by sclerosis, resource misallocation, incorrigibly-wrong decisions and the distortions of ‘bureaucratic reality’. However, unfortunately, social collapse is the more probable outcome, since parasites can evolve more rapidly than host immune systems.

That’s the abstract. Read the whole thing.

Ultimate Parkour Challenge

Thursday, April 29th, 2010

MTV’s Ultimate Parkour Challenge doesn’t seem quite ready for prime time, but the spectacular moments are quite spectacular — and scary:

Gambling with Other People’s Money

Thursday, April 29th, 2010

In the United States, Russ Roberts says, we like to believe we are a capitalist society based on individual responsibility.

But we are what we do. Not what we say we are. Not what we wish to be. But what we do. And what we do in the United States is make it easy to gamble with other people’s money — particularly borrowed money — by making sure that almost everybody who makes bad loans gets his money back anyway. The financial crisis of 2008 was a natural result of these perverse incentives. We must return to the natural incentives of profit and loss if we want to prevent future crises.

The Promises of Frameworks

Thursday, April 29th, 2010

Web-programmer Eric Harrison talks about the unmet promises of frameworks:

These frameworks all promise rapid development through pre-built methods and APIs that take me from Point A (the idea of the app I have to build) to Point B (the finished product). The problem I’ve found is that in the applications I need to build, Point B is never where I need to go. For example, let’s say that I’m building a Content Management System for work. The application flow looks something like this:

  1. User views a page that has tabular data retrieved from the database.
  2. User clicks on a record in that table to pull up a form to edit that data.
  3. Application retrieves data from database, displays it in a form.
  4. User edits the data, clicks Save.
  5. Application takes the data from the request, checks the validity, then stores the data in the database.
  6. [Repeat as needed]

Pretty simple workflow, and is often one of the things that these frameworks are built to handle. They’ll have ActiveRecord interfaces to automatically understand the data in the database. The framework will have a magic tabular output mechanism to take the list of records and make a large HTML table and display all of the data. The framework will also have some magic methods to create a form that is linked to the ActiveRecord entity and handle all the data validation and processing.

This all works great… as long as your application has a single table with a very strict set of rules. As soon as you add relationships between multiple tables and crazy business logic (which exists in every single application I’ve ever built), these frameworks start to fall apart. You then spend hours poring through documentation figuring out how to extend the ActiveRecord class with your own ActiveRecordSpecialTable class that handles the many-to-many relationships. Then you have to dive in to the output mechanisms and figure out how to trick the framework into displaying an edit form with data from multiple sources. And so on, and so forth. Forever.

Meanwhile, by merely applying your years of web development experience, you could have made two simple scripts. One to display tabular data and one to display a form. You could have quickly crafted a back end script to pull data from the database using an SQL statement like:

SELECT posts.*, users.name
FROM posts
LEFT JOIN users ON posts.user_id = users.id
Nice, simple, and easy to maintain.
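The “two simple scripts” approach can be sketched in a few lines. This is a minimal illustration, not code from the original post — the table names (`posts`, `users`) and columns are assumptions matching the SQL fragment above — but it shows the point: a plain query plus a loop replaces the ORM machinery.

```python
import sqlite3
from html import escape

def render_table(db_path="app.db"):
    """Run a plain SQL join and emit an HTML table -- no ORM, no framework."""
    conn = sqlite3.connect(db_path)
    rows = conn.execute(
        "SELECT posts.id, posts.title, users.name "
        "FROM posts LEFT JOIN users ON posts.user_id = users.id"
    ).fetchall()
    conn.close()
    # Escape every value so user-supplied text can't inject markup.
    cells = "".join(
        "<tr>" + "".join(f"<td>{escape(str(v))}</td>" for v in row) + "</tr>"
        for row in rows
    )
    return f"<table>{cells}</table>"
```

When the business logic gets crazy, you change the SQL string and the loop directly, instead of fighting the framework’s abstraction.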

The problem I noticed is that these frameworks are very good at taking my application from Point A to Point B, but very bad at taking my application from Point B to Point C.

Getting Through Allergy Season

Thursday, April 29th, 2010

Some advice on getting through allergy season:

Increase circulation to your sinuses and throat areas by complaining endlessly about your allergies in that nasal little voice of yours.

From The Onion.

All Meaning Would Vanish

Thursday, April 29th, 2010

Po Bronson suggests a Twilight Zone-type premise:

What if surgeons never got to work on humans, they were instead just endlessly in training, cutting up cadavers? What if the same went for all adults – we only got to practice at simulated versions of our jobs? Lawyers only got to argue mock cases, for years and years. Plumbers only got to fix fake leaks in classrooms. Teachers only got to teach to videocameras, endlessly rehearsing for some far off future. Book writers like me never saw our work put out to the public – our novels sat in drawers. Scientists never got to do original experiments; they only got to recreate scientific experiments of yesteryear. And so on.

Rather quickly, all meaning would vanish from our work. Even if we enjoyed the activity of our job, intrinsically, it would rapidly lose depth and relevance. It’d lose purpose. We’d become bored, lethargic, and disengaged.

In other words, we’d turn into teenagers.

In Escaping the Endless Adolescence, Joe Allen argues that our urge to protect teenagers from real life – because they’re not ready yet – has tragically backfired:

By insulating them from adult-like work, adult social relationships, and adult consequences, we have only delayed their development. We have made it harder for them to grow up. Maybe even made it impossible to grow up on time.

Basically, we long ago decided that teens ought to be in school, not in the labor force. Education was their future. But the structure of schools is endlessly repetitive. “From a Martian’s perspective, high schools look virtually the same as sixth grade,” said Allen. “There’s no recognition, in the structure of school, that these are very different people with different capabilities.” Strapped to desks for 13-plus years, students find school incredibly monotonous, artificial, and cookie-cutter.

As Allen writes, “We place kids in schools together with hundreds, sometimes thousands, of other kids typically from similar economic and cultural backgrounds. We group them all within a year or so of one another in age. We equip them with similar gadgets, expose them to the same TV shows, lessons, and sports. We ask them all to take almost the exact same courses and do the exact same work and be graded relative to one another. We give them only a handful of ways in which they can meaningfully demonstrate their competencies. And then we’re surprised they have some difficulty establishing a sense of their own individuality.”

And we wonder why it’s taking so long for them to mature.

Paul Graham made a similar point in explaining why nerds are unpopular in high school — but not so much before or after:

I think the important thing about the real world is not that it’s populated by adults, but that it’s very large, and the things you do have real effects. That’s what school, prison, and ladies-who-lunch all lack. The inhabitants of all those worlds are trapped in little bubbles where nothing they do can have more than a local effect. Naturally these societies degenerate into savagery. They have no function for their form to follow.

In fact, all the evidence that teenagers have “raging hormones” or other intrinsic problems is modern:

I’m suspicious of this theory that thirteen-year-old kids are intrinsically messed up. If it’s physiological, it should be universal. Are Mongol nomads all nihilists at thirteen? I’ve read a lot of history, and I have not seen a single reference to this supposedly universal fact before the twentieth century. Teenage apprentices in the Renaissance seem to have been cheerful and eager. They got in fights and played tricks on one another of course (Michelangelo had his nose broken by a bully), but they weren’t crazy.

As far as I can tell, the concept of the hormone-crazed teenager is coeval with suburbia. I don’t think this is a coincidence. I think teenagers are driven crazy by the life they’re made to lead. Teenage apprentices in the Renaissance were working dogs. Teenagers now are neurotic lapdogs. Their craziness is the craziness of the idle everywhere.

Pepper Pain

Thursday, April 29th, 2010

Capsaicin is the magic ingredient that makes hot chili peppers hot. The human body produces its own endogenous capsaicin-like substances — fatty acids called oxidized linoleic acid metabolites, or OLAMs — but they don’t simply generate a spicy heat; they cause pain:

Dr Kenneth Hargreaves, senior researcher at the Dental School at the University of Texas, and his team next set out to see if they could block these newly discovered pain pathways.

Lab work on mice showed that by knocking out a gene for the receptors, there was no sensitivity to capsaicin.

Armed with this knowledge they set about making drugs to do the same.

Dr Hargreaves said: “This is a major breakthrough in understanding the mechanisms of pain and how to more effectively treat it.

“We have discovered a family of endogenous capsaicin-like molecules that are naturally released during injury, and now we understand how to block these mechanisms with a new class of non-addictive therapies.”

Ultimately, he hopes the drugs will be able to treat different types of chronic pain, including that associated with cancer and inflammatory diseases such as arthritis and fibromyalgia.

Don’t Mess with Texas’s Governor, Coyotes

Thursday, April 29th, 2010

If you’re a fan of Texas governor Rick Perry, Ashby Jones says, you’ll likely become an even bigger fan after learning that he shot and killed a coyote while jogging in Austin. According to the AP:

Perry says he needed just one shot from his laser-sighted pistol to take down a coyote that was menacing his dog during an early morning jog in an undeveloped area near Austin.

Perry told The Associated Press he sometimes carries his pistol, loaded with hollow-pointed bullets, when he jogs on trails because he’s scared of snakes — and that he’d seen coyotes in that area.

When the coyote came out of the brush toward his daughter’s labrador retriever puppy on a February jog, he charged it and shot it with his .380 Ruger pistol.

He put a laser sight on a .380? The Ruger LCP is a small gun:

The Ruger® LCP™ is a compact .380 Auto from the industry leader in rugged, reliable firearms. From backup firearms for law enforcement to licensed carry for personal protection, the LCP is the perfect choice.

Designed with both male and female shooters in mind, the LCP is as affordable as it is reliable. At just 9.40 ounces (with an empty magazine), the LCP is lightweight and ideal for all-day carry — ensuring you have it when and where you need it.

Anyway, there’s some talk about the legality of what he did — which demonstrates that this happened in Austin, not Texas proper:

An Austin City ordinance prohibits firing handguns within city limits, with some exceptions, according to the Chronicle. Another ordinance says a person in the city limits may not knowingly shoot, kill, or hunt a wild animal.

The issue, it seems, will likely turn on whether Perry was inside the city limits when he fired the shot. Zoning records, according to the Chron, indicate that Perry’s rental home is within the city limits.

But hold on, says Perry’s office. Perry spokesman Mark Miner cited a section of the Health and Safety Code that says a coyote that is about to attack domestic animals may be killed by a person witnessing the attack.

“If he had to do it again, he would,” Miner said.

Near-Letter Quality Printers

Wednesday, April 28th, 2010

Do you remember near-letter quality printers?

Anyone who owned a computer during the 1980s would know that there were basically two types of printers: 9-pin dot matrix and 24-pin dot matrix. In the West, 9-pin dot-matrix printers were sold as the basic model, while 24-pin printers were sold as offering more readable near-letter-quality (NLQ) printing. It’s easy to see the reason for 9 pins; it’s the minimum that can produce readable English text:

What you might not know is that 24-pin dot matrix printers were not developed to print clearer English text, but were developed in Japan to print Kanji. The 72-dpi resolution of a 9-pin printer simply cannot print readable Japanese text. As the following blow-up of 24-pixel text shows, this resolution is just high enough to render these complex characters – making it the Japanese equivalent of the 8-bit days. (Because I’ve used a Windows font to capture the image, the characters are not as well defined as they would be from a Japanese printer or mobile phone of the same resolution).
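The pin counts map directly to vertical resolution: a 9-pin head lays down a column of nine dots per pass, which is just enough for a legible Latin glyph, while kanji need the finer 24-dot column. A toy sketch of the idea (the bitmap below is hand-drawn for illustration, not taken from any real printer font):

```python
# A Latin capital "A" on a 9-dot-tall grid -- roughly what one pass of a
# 9-pin print head can resolve. Kanji, with far more strokes packed into
# the same cell, need the 24-dot column of a 24-pin head.
GLYPH_A = [
    "..###..",
    ".#...#.",
    "#.....#",
    "#.....#",
    "#######",
    "#.....#",
    "#.....#",
    "#.....#",
    ".......",
]

def render(glyph):
    """Turn the '.'/'#' bitmap into printed dots, as a dot-matrix head would."""
    return "\n".join(row.replace(".", " ").replace("#", "o") for row in glyph)

print(render(GLYPH_A))
```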

Compared to 72-pixel font:

The Data-Driven Life

Wednesday, April 28th, 2010

Gary Wolf reports on the trackers who follow the data-driven life:

Often, pioneering trackers struggle with feelings of being both aided and tormented by the very systems they have built. I know what this is like. I used to track my work hours, and it was a miserable process. With my spreadsheet, I inadvertently transformed myself into the mean-spirited, small-minded boss I imagined I was escaping through self-employment. Taking advantage of the explosion of self-tracking services available on the Web, I started analyzing my workday at a finer level. Every time I moved to a new activity — picked up the phone, opened a Web browser, answered e-mail — I made a couple of clicks with my mouse, which recorded the change. After a few weeks I looked at the data and marveled. My day was a patchwork of distraction, interspersed with valuable, but too rare, periods of focus. In total, the amount of uninterrupted close attention I was able to muster in a given workday was less than three hours. After I got over the humiliation, I came to see how valuable this knowledge was. The efficiency lesson was that I could gain significant benefit by extending my day at my desk by only a few minutes, as long as these minutes were well spent. But a greater lesson was that by tracking hours at my desk I was making an unnecessary concession to a worthless stereotype. Does anybody really believe that long hours at a desk are a vocational ideal? I got nothing from my tracking system until I used it as a source of critical perspective, not on my performance but on my assumptions about what was important to track.

There’s much more to the original article.

PepsiCo Reduces Sodium by Restructuring Salt

Wednesday, April 28th, 2010

Pepsi — which is primarily a snack-food company, not a soft-drink company — plans on reducing sodium content by restructuring salt:

“Early on in our research, it became apparent that the majority of salt on a snack doesn’t even have time to dissolve in your saliva because you swallow it so rapidly,” explained Mehmood Khan, senior vice president and chief scientific officer and a former Mayo Clinic endocrinologist. A Wall Street Journal story later reported only about 20 percent of the salt on a chip dissolves on the tongue, and the remaining 80 percent is swallowed without contributing to taste.

“There was an opportunity for our scientists,” said Khan. “If we could figure out a way of getting the salt crystals to dissolve faster, then we could decrease the amount of salt we put on a snack with no compromise on taste.”

Well, they did. Khan said PepsiCo researchers collaborated with scientists from around the world and found ways of changing the crystal size and structure to make the salt crystal dissolve more quickly, effectively putting the sodium on your tongue, not in your digestive system. He said it took an understanding of crystal chemistry.

When asked if the resulting product needed FDA or GRAS approval, Khan said no. “It’s still sodium chloride. Once it’s dissolved, it’s no different than any other salt.”
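The arithmetic behind the idea is simple: if only a fraction of the salt dissolves in time to be tasted, raising that fraction lets you cut total sodium while keeping the tasted amount constant. A quick sketch — the 20% figure comes from the article, while the 50% target and the serving size are illustrative assumptions:

```python
def sodium_needed(tasted_target, dissolve_fraction):
    """Total salt required so that tasted_target units actually dissolve on the tongue."""
    return tasted_target / dissolve_fraction

old_total = 100.0                        # arbitrary units of salt per serving
tasted = old_total * 0.20                # article: only ~20% dissolves in time to taste
new_total = sodium_needed(tasted, 0.50)  # assume faster-dissolving crystals reach 50%
reduction = 1 - new_total / old_total
print(f"same taste with {reduction:.0%} less sodium")  # prints: same taste with 60% less sodium
```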

I was going to recommend the overly complicated and highly technical process of putting salt on the outside of snacks, rather than mixed in with the other ingredients, but I suppose that’s impractical.