Doomed by Diversity

Monday, June 18th, 2012

Nicholas Stix proclaims Eve Carson doomed by diversity:

Eve Carson was the golden girl — pretty, popular, brilliant, altruistic. Funny, too.

Born and raised in Athens, Georgia, Carson had been her high school class president, and was elected student government president at the University of North Carolina-Chapel Hill. A senior, double-majoring in political science and medicine, she was an aspiring doctor. Attending UNC on a Morehead-Cain Scholarship, she was a North Carolina Fellow and a Phi Beta Kappa, taught science in an elementary school, tutored in a middle school, mentored girls in a track/character-building program, and volunteered summers in Third World programs. Whew!

But all that ended when a couple of black thugs turned her into yet another statistic.

At 3:30 a.m., on March 5, 2008, Carson was kidnapped by Demario “Rio” Atwater and Laurence Lovette Jr., as she left her house to get into her SUV.

Carson’s killers had been stalking the parking lot of a sorority house down the street. Caroline Harper, a UNC alumna who was then living in the sorority house, testified at Lovette’s trial that

…she had just finished talking to her boyfriend on her cell phone about 3:30 a.m. on March 5 when she saw two black men in their late teens or early 20s standing in the parking lot of her sorority house….

“They were standing there looking at me,” Harper said. “It was just a couple of seconds before I got really frightened and drove away.”

[Lovette trial spotlights ATM use by Anne Blythe, The News and Observer, December 9, 2011.]

While Atwater held Carson at gunpoint in the back seat and sexually molested her, Lovette drove her SUV from bank to bank. A surveillance photograph showed Lovette using a drive-through cash machine. The killers were ultimately able to withdraw a total of $1,400 from her account.

The killers shot Eve Carson in the face and the head, one with a .25 caliber pistol, and the other with a shotgun, obliterating the right side of her pretty face. When the prosecution showed the jury autopsy photographs, some jurors wept.

Unusually, Eve Carson’s murder got national publicity, probably because of her own leftwing activism. A U.S. senator, a congressman, a mayor, and the chancellor of her university variously published or read eulogies for her. At Carson’s funeral in Athens, UNC Chancellor James Moeser spoke of how she embodied “the Carolina way” of “excellence with a heart,” and called her “a force of nature.”

But here’s the back story: If the criminal justice system had done its job, Eve Carson’s killers would both have already been in jail.

Atwater and Lovette were both on probation. Each had been arrested repeatedly prior to Carson’s murder—without ever being returned to jail for violating probation. Atwater, who had been convicted in February 2005 of felony breaking-and-entering, was even convicted of a further felony, possession of a firearm, in June 2007—yet still wasn’t incarcerated!

Meanwhile, Lovette’s probation officer, Chalita Nicole Thomas, who had never met with him, was herself arrested 11 times over the course of four years—at least twice for DUI, and once for carrying a concealed weapon. [Probation officer never met with Lovette | abc11.com, March 26, 2008]

The Main Stream Media declined to report what her other eight arrests were for—and then “disappeared” Thomas’ story. [See video of Thomas’s arrest record.]

The Soul Is Dead

Monday, June 18th, 2012

In 1882, Nietzsche declared “God is dead.” In the next few decades, some neuroscientist will declare, “The soul is dead,” Tom Wolfe predicts (writing in 1996):

Nietzsche said this was not a declaration of atheism, although he was in fact an atheist, but simply the news of an event. He called the death of God a “tremendous event,” the greatest event of modern history. The news was that educated people no longer believed in God, as a result of the rise of rationalism and scientific thought, including Darwinism, over the preceding 250 years. But before you atheists run up your flags of triumph, he said, think of the implications. “The story I have to tell,” wrote Nietzsche, “is the history of the next two centuries.” He predicted (in Ecce Homo) that the twentieth century would be a century of “wars such as have never happened on earth,” wars catastrophic beyond all imagining. And why? Because human beings would no longer have a god to turn to, to absolve them of their guilt; but they would still be racked by guilt, since guilt is an impulse instilled in children when they are very young, before the age of reason. As a result, people would loathe not only one another but themselves. The blind and reassuring faith they formerly poured into their belief in God, said Nietzsche, they would now pour into a belief in barbaric nationalistic brotherhoods: “If the doctrines…of the lack of any cardinal distinction between man and animal, doctrines I consider true but deadly”—he says in an allusion to Darwinism in Untimely Meditations—”are hurled into the people for another generation…then nobody should be surprised when…brotherhoods with the aim of the robbery and exploitation of the non–brothers…will appear in the arena of the future.”

Nietzsche’s view of guilt, incidentally, is also that of neuroscientists a century later. They regard guilt as one of those tendencies imprinted in the brain at birth. In some people the genetic work is not complete, and they engage in criminal behavior without a twinge of remorse—thereby intriguing criminologists, who then want to create Violence Initiatives and hold conferences on the subject.

Nietzsche said that mankind would limp on through the twentieth century “on the mere pittance” of the old decaying God–based moral codes. But then, in the twenty–first, would come a period more dreadful than the great wars, a time of “the total eclipse of all values” (in The Will to Power). This would also be a frantic period of “revaluation,” in which people would try to find new systems of values to replace the osteoporotic skeletons of the old. But you will fail, he warned, because you cannot believe in moral codes without simultaneously believing in a god who points at you with his fearsome forefinger and says “Thou shalt” or “Thou shalt not.”

Why should we bother ourselves with a dire prediction that seems so far–fetched as “the total eclipse of all values”? Because of man’s track record, I should think. After all, in Europe, in the peaceful decade of the 1880s, it must have seemed even more far–fetched to predict the world wars of the twentieth century and the barbaric brotherhoods of Nazism and Communism. Ecce vates! Ecce vates! Behold the prophet! How much more proof can one demand of a man’s powers of prediction?

A hundred years ago those who worried about the death of God could console one another with the fact that they still had their own bright selves and their own inviolable souls for moral ballast and the marvels of modern science to chart the way. But what if, as seems likely, the greatest marvel of modern science turns out to be brain imaging? And what if, ten years from now, brain imaging has proved, beyond any doubt, that not only Edward O. Wilson but also the young generation are, in fact, correct?

[...]

This sudden switch from a belief in Nurture, in the form of social conditioning, to Nature, in the form of genetics and brain physiology, is the great intellectual event, to borrow Nietzsche’s term, of the late twentieth century. Up to now the two most influential ideas of the century have been Marxism and Freudianism. Both were founded upon the premise that human beings and their “ideals”—Marx and Freud knew about quotation marks, too—are completely molded by their environment. To Marx, the crucial environment was one’s social class; “ideals” and “faiths” were notions foisted by the upper orders upon the lower as instruments of social control. To Freud, the crucial environment was the Oedipal drama, the unconscious sexual plot that was played out in the family early in a child’s existence. The “ideals” and “faiths” you prize so much are merely the parlor furniture you feature for receiving your guests, said Freud; I will show you the cellar, the furnace, the pipes, the sexual steam that actually runs the house. By the mid–1950s even anti–Marxists and anti–Freudians had come to assume the centrality of class domination and Oedipally conditioned sexual drives. On top of this came Pavlov, with his “stimulus–response bonds,” and B. F. Skinner, with his “operant conditioning,” turning the supremacy of conditioning into something approaching a precise form of engineering.

So how did this brilliant intellectual fashion come to so screeching and ignominious an end?

The demise of Freudianism can be summed up in a single word: lithium. In 1949 an Australian psychiatrist, John Cade, gave five days of lithium therapy—for entirely the wrong reasons—to a fifty–one–year–old mental patient who was so manic–depressive, so hyperactive, unintelligible, and uncontrollable, he had been kept locked up in asylums for twenty years. By the sixth day, thanks to the lithium buildup in his blood, he was a normal human being. Three months later he was released and lived happily ever after in his own home. This was a man who had been locked up and subjected to two decades of Freudian logorrhea to no avail whatsoever. Over the next twenty years antidepressant and tranquilizing drugs completely replaced Freudian talk–talk as treatment for serious mental disturbances. By the mid–1980s, neuroscientists looked upon Freudian psychiatry as a quaint relic based largely upon superstition (such as dream analysis — dream analysis!), like phrenology or mesmerism. In fact, among neuroscientists, phrenology now has a higher reputation than Freudian psychiatry, since phrenology was in a certain crude way a precursor of electroencephalography. Freudian psychiatrists are now regarded as old crocks with sham medical degrees, as ears with wire hairs sprouting out of them that people with more money than sense can hire to talk into.

Marxism was finished off even more suddenly—in a single year, 1973—with the smuggling out of the Soviet Union and the publication in France of the first of the three volumes of Aleksandr Solzhenitsyn’s The Gulag Archipelago. Other writers, notably the British historian Robert Conquest, had already exposed the Soviet Union’s vast network of concentration camps, but their work was based largely on the testimony of refugees, and refugees were routinely discounted as biased and bitter observers. Solzhenitsyn, on the other hand, was a Soviet citizen, still living on Soviet soil, a zek himself for eleven years, zek being Russian slang for concentration camp prisoner. His credibility had been vouched for by none other than Nikita Khrushchev, who in 1962 had permitted the publication of Solzhenitsyn’s novella of the gulag, One Day in the Life of Ivan Denisovich, as a means of cutting down to size the daunting shadow of his predecessor Stalin. “Yes,” Khrushchev had said in effect, “what this man Solzhenitsyn has to say is true. Such were Stalin’s crimes.” Solzhenitsyn’s brief fictional description of the Soviet slave labor system was damaging enough. But The Gulag Archipelago, a two–thousand–page, densely detailed, nonfiction account of the Soviet Communist Party’s systematic extermination of its enemies, real and imagined, of its own countrymen, by the tens of millions through an enormous, methodical, bureaucratically controlled “human sewage disposal system,” as Solzhenitsyn called it— The Gulag Archipelago was devastating. After all, this was a century in which there was no longer any possible ideological detour around the concentration camp. Among European intellectuals, even French intellectuals, Marxism collapsed as a spiritual force immediately. Ironically, it survived longer in the United States before suffering a final, merciful coup de grace on November 9, 1989, with the breaching of the Berlin Wall, which signaled in an unmistakable fashion what a debacle the Soviets’ seventy–two–year field experiment in socialism had been. (Marxism still hangs on, barely, acrobatically, in American universities in a Mannerist form known as Deconstruction, a literary doctrine that depicts language itself as an insidious tool used by The Powers That Be to deceive the proles and peasants.)

Freudianism and Marxism—and with them, the entire belief in social conditioning—were demolished so swiftly, so suddenly, that neuroscience has surged in, as if into an intellectual vacuum. Nor do you have to be a scientist to detect the rush.

[...]

Meantime, the notion of a self—a self who exercises self–discipline, postpones gratification, curbs the sexual appetite, stops short of aggression and criminal behavior—a self who can become more intelligent and lift itself to the very peaks of life by its own bootstraps through study, practice, perseverance, and refusal to give up in the face of great odds—this old–fashioned notion (what’s a boot strap, for God’s sake?) of success through enterprise and true grit is already slipping away, slipping away…slipping away…The peculiarly American faith in the power of the individual to transform himself from a helpless cypher into a giant among men, a faith that ran from Emerson (“Self–Reliance”) to Horatio Alger’s Luck and Pluck stories to Dale Carnegie’s How to Win Friends and Influence People to Norman Vincent Peale’s The Power of Positive Thinking to Og Mandino’s The Greatest Salesman in the World —that faith is now as moribund as the god for whom Nietzsche wrote an obituary in 1882. It lives on today only in the decrepit form of the “motivational talk,” as lecture agents refer to it, given by retired football stars such as Fran Tarkenton to audiences of businessmen, most of them woulda–been athletes (like the author of this article), about how life is like a football game. “It’s late in the fourth period and you’re down by thirteen points and the Cowboys got you hemmed in on your own one–yard line and it’s third and twenty–three. Whaddaya do?…”

Sorry, Fran, but it’s third and twenty–three and the genetic fix is in, and the new message is now being pumped out into the popular press and onto television at a stupefying rate. [...] If I may mention just a few things the evolutionary psychologists have illuminated for me over the past two months:

The male of the human species is genetically hardwired to be polygamous, i.e., unfaithful to his legal mate. Any magazine-reading male gets the picture soon enough. (Three million years of evolution made me do it!) Women lust after male celebrities, because they are genetically hardwired to sense that alpha males will take better care of their offspring. (I’m just a lifeguard in the gene pool, honey.) Teenage girls are genetically hardwired to be promiscuous and are as helpless to stop themselves as dogs in the park. (The school provides the condoms.) Most murders are the result of genetically hardwired compulsions. (Convicts can read, too, and they report to the prison psychiatrist: “Something came over me…and then the knife went in.”)

Where does that leave self–control? Where, indeed, if people believe this ghostly self does not even exist, and brain imaging proves it, once and for all?

Enstupidation

Sunday, June 17th, 2012

Fred Reed wonders what purpose the public schools serve — other than to warehouse children while their parents work or watch television:

They certainly don’t teach much, as survey after survey shows. Is there any particular reason for having them? Apart from their baby-sitting function, I mean.

Schooling, sez me, should be adapted to the needs and capacities of those being schooled. For unintelligent children, the study of anything beyond minimal reading is a waste of time, since they will learn little or nothing more. For the intelligent, a public schooling is equivalent to tying an anchor to a student swimmer. The schools are an impediment to learning, a torture of the bright, and a form of negligent homicide against a country that needs trained minds in a competitive world.

Let us start with the truly stupid. Millions of children graduate — “graduate” — from high school — “high school” — unable to read. Why inflict twelve years of misery on them? It is not reasonable to blame them for being witless, but neither does it make sense to pretend that they are not. For them school is custodial, nothing more. Since there is little they can do in a technological society, they will remain in custody all their lives. This happens, and must happen, however we disguise it.

For those of reasonably average acuity, it little profits to go beyond learning to read, which they can do quite well, and to use a calculator. Upon their leaving high school, question them and you find that they know almost nothing. They could learn more, average not being stupid, but modest intelligence implies no interest in study. This is true only of academic subjects such as history, literature, and physics. They will study things that seem practical to them. Far better to teach the modestly acute such things as will allow them to earn a living, be they typing, carpentry, or diesel repair. Society depends on such people. But why inflict upon them the geography of Southeast Asia, the plays of Shakespeare, or the history of the nineteenth century? Demonstrably they remember none of it.

Some who favor the public schools assert that an informed public is necessary to a functioning democracy. True, and beyond doubt. But we do not have an informed public, never have had one, and never will. Nor, really, do we have a functioning democracy.

Women Do Not Belong in the Infantry

Saturday, June 16th, 2012

Women do not belong in the infantry, Nate Smith says:

It’s a simple statement and one that, until recently, nearly every civilized culture seemed to accept as a truism. For reasons as multitudinous as they are apparent and profound, in time of war men have shouldered arms and marched to the clash of legions or the sound of the guns. Women as a rule have not. Even in those scattered and wretched societies whose women prowled the battlefields to torture the wounded and desecrate the dead, no woman was thrown into offensive action against the massed ranks of the enemy. Show me an exception and I’ll show you savages.

I don’t think he’s going to be changing anyone’s mind:

Since the obvious has apparently escaped social reformers and military planners, I will restate it: there are fundamental physical differences between men and women. I could quote facts and figures about the difference in average body weight of men and women, the distribution of muscle mass, and the capacity for heavy lifting and muscular endurance. But since facts and figures haven’t deterred those who argue for women in the infantry, I’ll just use a real world example.

Marine Second Lieutenants at The Basic School — just across the street from the Infantry Officer Course — conduct at least a half-dozen conditioning hikes during their six months of basic officer training. The hikes range from 3 miles to 12 or more, and are conducted with full packs, body armor, personal weapons, and the machine guns and mortars organic to an infantry battalion. Since “Every Marine is a Rifleman”, all lieutenants — male and female — learn the basics of infantry leadership. The hike pace is 3 miles every 50 minutes, followed by a ten minute break. Forever. Or so it seems.

Most service members will admit that conditioning hikes are grueling exercises in physical and mental endurance. I personally despised them, especially when it was my turn to shoulder a 25 pound machine gun or a 45 pound, .50-caliber receiver. Each hike took all of my effort and physical fitness to complete. Unsurprisingly, during my time at The Basic School no female lieutenant completed a hike of greater than 6 miles with the rest of the 180 or so male lieutenants. Not one. And that’s with the male lieutenants carrying all of the radios and heavy weapons.

A hike only gets you to the fight.

Am I disparaging my fellow lieutenants simply because they were women? Of course not. Many of them were smart, fit, and exceptionally disciplined and dedicated. Hell, they chose to lead Marines. I’m certain that the majority of them went on to serve bravely in the stinking streets of Iraq and the austere mountain valleys of Afghanistan. But not with the infantry.

The fact is that an infantryman’s job is a mix between professional athlete, police officer, mechanic, and construction worker. It is a physical job. Infantrymen are affectionately and accurately known as “grunts” because of the sound made when shifting a 120-pound pack closer against one’s agonized shoulders. It isn’t good enough to survive the physical requirements of a 12 mile mountain ruck march if at the end of it an infantryman cannot fling down his pack and sprint in short bursts of speed across an undulating farm field while delivering effective and disciplined fire against a concealed enemy who is desperately trying to kill him.

It would be the rare woman that could meet such an exacting physical standard. Yet, undoubtedly some could. A 73 year old Japanese woman summited Mount Everest this past weekend. There must be a few 20 year old, female athletes that could excel in the infantry. So why not keep the standard the same and allow women who pass it to enlist in the infantry? This brings me to my next obvious point.

First Sneak Peek at the Animated Dark Knight Returns

Friday, June 15th, 2012

Frank Miller’s Dark Knight Returns is coming to the small screen this fall:

Like Alan Moore’s Watchmen, Frank Miller’s Dark Knight Returns is a product of its time.

(Hat tip to io9.)

Point-Shooting Comes Naturally

Friday, June 15th, 2012

The Force Science Research Center’s latest round of hit-probability experiments has produced these unnerving findings:

  • Even “naive shooters,” untrained and unpracticed with handguns, are amazingly accurate in making head shots at close range, and tend to shoot for the head instinctively.
  • Shots intended for an officer’s vested area often end up in unprotected vital parts of the body because of a suspect’s poor gun control.
  • The speed with which an officer can be put behind the reactionary curve, even by assailants who have no expertise with firearms, is startling.

The study used volunteers from Northeast Wisconsin Technical College’s 2-year corrections and law enforcement program in Green Bay:

After a brief safety review with red guns, the participants were given functional weapons with live ammunition and, in a controlled sequence, were told to address targets especially designed by Avery for ultra precise measurement of shot placement.

Those with no experience were allowed to fire half a dozen “familiarization” rounds “to get the feel of sound and recoil” but were not told how to hold the gun, except to “grip it firmly” and to avoid touching the trigger until the muzzle was safely down range. Each shooter used his or her same assigned gun throughout the tests, either a Glock 17, a Springfield XD in 9mm (supplied by Springfield Armory), a Beretta 9mm, or a S&W J-frame short-barrel Special.

The shooters each started from a series of 4 positions, reflecting how offenders commonly have guns when confronted by LEOs:

  1. Hand on the gun, which was concealed at the rear waistband;
  2. Gun hidden at the front waistband, with a garment covering it;
  3. Gun in hand, hidden behind a leg;
  4. Gun held to a baseball hat which the subject was holding by the bill, simulating a hostage situation or an intended suicide with sudden homicidal capabilities.

Holsters were not used, consistent with the recent FBI study documenting that run-of-the-mill street punks rarely carry weapons holstered.

Each shooter presented the gun and fired from each of these starting positions at 9 different distances, ranging from 1 to 25 yards from the target. The controlled lighting was “dimmer than daylight, but not low-light,” Avery said. “They could see their targets clearly.”

The shooters were told that at the sound of a timer they should “shoot as fast as you can, as well as you can, trying to hit the target with every shot but not slowing down in an attempt to gain accuracy,” Avery said. “We wanted them to get the first round off in under 1 second and to complete 3 shots within 1.7 seconds. That’s similar to a real assailant bringing a gun out and firing as rapidly as he can.” They were not told what part of the target to try to hit, just “wherever you feel is best.”

Data from the tests are still undergoing a detailed computer analysis, but based on on-site observations and preliminary reviews, these are some of the highlights Avery and Lewinski consider significant:

POINT SHOOTING. An overwhelming majority of the test subjects used point shooting at all distances when firing rapidly, and almost all used 1-handed techniques at close ranges. At 5-7 yards and beyond, many shifted spontaneously to 2-hand stances, with an increase in hit probability noted.

Even when point shooting, the volunteers still tended to extend their arms fully and bring the gun up to eye level. “Rarely did they use a combat tuck,” Avery said. “Even at 1 yard, they tended to extend their arm to shoot.”

To Avery’s surprise, many initial rounds, especially when the gun was brought from behind the back, tended to go to the right of the target (from the shooter’s perspective). This contradicts conventional wisdom, he said, which holds that shots from a right-handed shooter often end up going to the left. If this apparent discrepancy is sustained in further testing, officers who are taught to move to their left in hopes of avoiding early rounds may, in fact, be stepping into a field of fire.

HEAD SHOTS. At close distances (1-3 yards), more than half the simulated offenders “shot at the head without being told to” and had a “very high hit probability” with at least 1 of their shots, Avery noted. “It was astounding how they could keep the pattern in the head.”

The chest (center mass) was the second most likely target.

Avery explained that people tend to shoot where their attention is directed. Unless they are trained otherwise, they are likely to look at the face, particularly in close-up encounters. “We communicate with each other nonverbally by watching facial gestures, and we look at each other’s eyes, especially at close distances.” Consequently, he speculated, the much-reported tendency of street assailants to target officers’ heads may be less a “deliberate, diabolical plot” and more related to natural instincts.

BRACKETING. Often a shooter missed a desired placement with the first round but was able to “bracket” subsequent rounds for successful hits “without slowing down,” Avery said. “They were able to coordinate their actions, process feedback on hits, and adjust their placement very rapidly, even with no previous training or practice.”

He conceded that due to research limitations this tendency may have been “a little artificial” during the experiments because hit placement was more easily detected on the paper targets than might be true with a clothed human being, especially in low-light conditions. However, even at distances where they could not see their hits, the bracketing tendency was noted.

SPEED. A strong majority of the shooters fired all 3 rounds within 1.5 seconds. That included reaction time in responding to the timer signal. Some were able to react and shoot all 3 shots within 1 second. A “very large majority” fired all 3 with about a quarter-second between shots. Some were longer, up to .35-.40.

An actual assailant who is deciding when to shoot without reacting to an auditory signal and who is likely bringing his gun out and up with his finger already on the trigger could be expected to get a first round off even faster than the volunteers, Avery said.
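
The arithmetic behind those strings is simple enough to lay out: total time is the time to the first shot (reaction included) plus the splits between the remaining rounds. Here is a minimal sketch using only the figures quoted above; the specific combinations are illustrative, not additional data from the study.

    # Illustrative arithmetic only, built from the timing figures quoted above.
    def string_time(first_shot_s, split_s, shots=3):
        """Time to fire a string: first shot (including reaction) plus the splits."""
        return first_shot_s + (shots - 1) * split_s

    # About 1 second to the first round with quarter-second splits gives the
    # ~1.5 s that a strong majority of shooters achieved; slower 0.35-0.40 s
    # splits push the same three-shot string out toward the 1.7 s training
    # benchmark and beyond.
    for split in (0.25, 0.35, 0.40):
        print(f"first shot 1.0 s, splits {split:.2f} s -> "
              f"{string_time(1.0, split):.1f} s for 3 shots")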

DISTANCE VARIABLES. At 5 to 7 yards, many of the shooters “directed fire at a bigger part of the body” than the head, Avery reported. But still, “a lot of shots hit in the head, neck, and upper chest.” He attributed this to “the guns climbing in recoil and the shooters not being able to control that at speed.” He said that “a significant number of rounds impacted above the level of a vest,” even at distances where luck became a strong factor in shot placement.

COLLATERAL DAMAGE. Shooters who missed the intended target altogether often produced “collateral hits on a side target as far as 4 feet away,” Avery observed. This has implications for officers who tend to cluster together. “They need separation to avoid getting hit by accident by shots from a barrage intended for another officer.”

MUZZLE BLAST. At 1 yard, “specks of unburned powder” from muzzle blast frequently “covered the whole head” of the target, Avery recalled. “Some targets were blown apart.” Without adequate eye protection, an officer risks being “flash-banged and flash-blinded, probably out to 3 yards,” even with near misses from a felon’s gun.

QUICK LEARNING. “Within a very short time, at least half the volunteers had a very good grasp” on the basic mechanics of shooting, Avery noted. “A lot of subconscious learning took place within the first 15 shots. For example, without being told, many learned how to set the wrist to control recoil. Some people just have a natural ability to pick up a gun and be able to control it. It was amazing how well many of these people could shoot with no training at all. Flat out amazing!”

“Natural aptitude” was most noticeable among “the more athletic types,” he said. “It was evident that weight training and higher-than-average grip strength give you a clear advantage in shooting, especially at distances beyond 3 yards. But even the smaller, weaker subjects for the most part were able to fire fast and accurately.”

He cited one small female who produced a gun from behind her leg and delivered 3 head shots from 3 yards in less than 1.5 seconds. “And she had never held a gun before,” Avery said.

“These findings,” Lewinski said, “are certain to have significant impact on officer-survival training.”

Homemade Ballistic Mask

Friday, June 15th, 2012

When pedants pointed out that bullet-proof vests weren’t absolutely impervious to all bullets, they became bullet-resistant, and then ballistic.

Ballistic masks also exist, but they have plenty of downsides.

To the geeks at The Post Apoc, the chief downside was cost:

I once saw a kevlar ballistic mask in a tactical catalogue for $400. I’d just finished playing through Army of Two, an Xbox shooter in which the protagonists wear heavy body armor, including ballistic masks. So, naturally, I thought “Sweet! I want one!… but not for $400 effin’ bucks.”

Thus, I decided to build my own. First, a base. I thought about starting with a plaster mold of my own face, but I wanted something a little stouter — something inherently designed to take punishment, but also lightweight and simple. I found a vintage street hockey mask on ebay for $10.

I knew kevlar would be the main ingredient, but I’d never even looked into purchasing any — I assumed it would be expensive. Not so. I typed in “kevlar” on ebay and… oh. You can just purchase rolls of the stuff for next to nothing. Well, I’ll be damned. I bought a square yard of it for $20. I knew I wanted to layer it with a flexible glue — something that wouldn’t stiffen the kevlar and make it brittle and less effective, so I used plain old Mod Podge crafting glue. I laid the first layer on whole, which was WAY too hard. It didn’t want to stick to the plastic at all, and I basically had to continuously smooth it out with my hands for fifteen minutes until the glue dried completely. So, after that I cut the kevlar into strips with tin snips and started laying the strips on in alternating directions, each new layer overlapping the seams of the previous one.

I used the tin snips to clean up the edges of the mask, then went around it with the thickest duct tape I could find, just to hold everything in place a bit better.

I’d heard somewhere that aluminum did a good job of slowing bullets down because it bent and stretched. In testing, this proved negligible at best. I had a sheet of aluminum lying around that my brother had used for another project, so I cut it up into “scales” and armored the mask with it. I started at the edges and worked my way in toward the center so the plates would overlap outward and hopefully deflect the bullets toward the outside edges of the mask, like rain running down a shingled roof. Again, in testing, this did not happen AT ALL — but damn, you’ve got to admit that those metal scales look pretty damn cool. Out at the edges, the metal was only one layer thick, but on the forehead and nose, it was three layers thick. I glued the metal plates on using silicone glue — again, because I wanted everything to be flexible, not rigid and brittle.

I needed something to hold all those plates to one another. Enter, epoxy. I painted a thick, gooey layer of epoxy over the entire mask. Granted, epoxy by itself dries brittle, but this wasn’t intended to add protection, it was only intended to hold the plates in place, and it worked pretty well. After the epoxy dried, I painted the mask with spray on truck bed liner… just to add the final, scary touch.

Then they put it on a watermelon and started shooting:

Journal of a New COBRA Recruit

Thursday, June 14th, 2012

This Journal of a New COBRA Recruit entry explains the famous terrorist organization’s training methodology:

Awful exciting day today. First we got to do our airborne training. They loaded us up into a plane, and we flew up and then jumped out. Our chutes had the big, scary COBRA symbol on them. It was awesome. But it was hard, because we were supposed to keep yelling “COBRA!” all the way down. It was tough to get enough breath to yell right at first. Sarge says it just takes practice.

After that we finally got to do weapons training. About time! They gave me a rifle and pointed at the target. I held the rifle up to my cheek and sighted down the barrel, just like I did when I went deer hunting with Grampa. Boy, did Sarge go apeshit over that! Got in my face and started yelling at me, asking how I expected to scare someone if I just stood there all quiet-like and shot so carefully. Sarge is a great teacher because he doesn’t just criticize. He showed the right way to shoot. What you do is you start shooting your gun wildly and run towards the target as fast as you can and, in your scariest voice, you yell “COBRA!” We worked on that all afternoon, and just before we broke for dinner, I actually hit the target! Sarge and everyone else were so happy for me that they were about to cry. Told me I’d just set the record for marksmanship in COBRA boot camp. I wanted to call Mom and tell her the good news, but she thinks I work for the phone company.

The Code of the Banana Man

Thursday, June 14th, 2012

Rich Cohen shares five lessons from the Banana Man:

Samuel Zemurray, known to friend and foe alike as Sam the Banana Man, made his first fortune in “ripes,” bananas that the big fruit traders considered too mature to reach market in time for sale. Their rule of thumb was “One freckle turning, two freckles ripe” — thus consigning tons of fruit to a great reeking pile at the edge of the wharf, where it was pushed into the sea or simply left to rot.

When Zemurray, a young Russian immigrant, saw that first sad pile of ripes — circa 1895, in the port of Mobile, Ala. — he recognized his opportunity. For a man who spent his early years on a desolate wheat farm in Bessarabia, there was obvious value in even a freckled piece of fruit. By 1903, he was a mini mogul, with $100,000 in the bank.

From there, Sam went into yellows, even greens. In 1909, he headed south to Honduras, where he bought and cleared great swaths of virgin jungle and then, working with a mercenary army recruited in New Orleans, overthrew the government and replaced it with one more to his liking. He built a cracker-jack banana company down there and eventually took over United Fruit in December 1932. By the time he died in 1961, in the grandest house in New Orleans, he had been a hauler and a cowboy, a farmer, a trader, a political battler, a revolutionary, a philanthropist and a CEO.

The lessons:

1. Go see for yourself.

When Sam decided to become a banana grower, he moved to the jungle in Honduras. He planted stems, walked the fields and loaded banana boats. He believed that this was his great advantage over the executives of United Fruit, the market-leading behemoth that he battled for over a decade. U.F. was bigger, but it was run from an office in Boston. Sam was on the ground; he understood his workers, how they felt, what they feared and believed. Telling fruit honchos in Boston why he knew better, Sam would curse and say, “You’re there, I’m here.”

2. Don’t try to be smarter than the problem.

In the late 1920s, United Fruit and Sam’s company were trying to acquire the same piece of land, a fertile expanse that straddled the border of Honduras and Guatemala. But the land seemed to have two rightful owners, one in Honduras, the other in Guatemala. While U.F. hired lawyers and commissioned studies, trying to determine the legal property holder, Zemurray simply purchased the land twice, once from each owner. A simple problem deserves a simple solution.

3. Don’t trust the experts.

In the 1930s, with United Fruit staggered by the Great Depression — its stock price fell from $100 a share to just over $10 — the company’s executives, in search of a game plan, consulted experts, solicited reports and interviewed economists. Zemurray wanted answers to the same questions — by then, he was the biggest holder of United Fruit stock — but he went instead to the New Orleans docks, where he buttonholed the sea captains and fruit jobbers who really understood the situation on the ground.

He learned, for example, that banana-boat captains had been ordered to cross the Gulf of Mexico at half-speed, thus saving fuel. He also learned that, in the course of the extra days on the water, a large percentage of the cargo was going from yellow to ripe. One of Sam’s first orders when he took over U.F. in 1932 was: Don’t slow down; cut the number of crossings. Within six months of Sam’s ascension, the stock had rallied and reached $50 a share.

4. Money can be made again, but a lost reputation is gone forever.

Early in his career, Sam joined in a partnership with United Fruit. The behemoth gave him money and helped to distribute his product; he gave the company the use of his ships. One year, when banana workers went on strike in Nicaragua and blockaded the country’s rivers, U.F. broke the blockade with Zemurray’s ships, his company logo painted in big letters on the side. It made his name hated in Nicaragua. It was one of the events that convinced Sam to dissolve his partnership with U.F., no matter how much he had come to depend on its deep pockets. A person who doesn’t control his own name and image has nothing.

5. When in doubt, do something!

When Zemurray took over United Fruit in 1932, the company was a few months from collapse. The stock price was heading to zero, the best workers fleeing. As soon as he took control, he set off on a whirlwind tour, crisscrossing Central and South America, meeting workers in the field and asking for their ideas. The perception of activity, he explained, is just as important as the nature of that activity. The boys in the fields need to know that there is a person in charge. If they think you know what you’re doing, they’ll follow you anywhere.

He sounds like quite the operations guy. The admonition to go see for yourself fits the modern philosophy of management by walking around — or, if you want to use a more exotic foreign name, gemba kaizen.

Cohen distills it down to the Code of the Banana Man:

Power comes from knowledge, information and experience, which grow from the ground like a banana stem. If you lose touch with that, you’re ripe for the taking.

MBA startups at Stanford reach all-time high

Wednesday, June 13th, 2012

A record-breaking 16% of Stanford’s Graduate School of Business class of 2011 chose to start their own companies at graduation, exceeding the school’s 12% peak during the dot-com boom:

About 30% started companies in Internet services and e-commerce, but 15% ventured into investment and financial services, and 7% each in food and beverages, retail or wholesale, and sport or sports management. Roughly 5% of the Stanford MBA entrepreneurs launched enterprises in healthcare, with another 5% in cleantech and alternative energy.

Green Roofs

Wednesday, June 13th, 2012

Green roofs are neither simple nor cheap:

Over a black roof — flat is easiest but sloped can work — goes insulation, then a waterproof membrane, then a barrier to keep roots from poking holes in the membrane. On top of that there is a drainage layer, such as gravel or clay, then a mat to prevent erosion. Next is a lightweight soil (Chicago City Hall uses a blend of mulch, compost and spongy stuff) and finally, plants.

An extensive roof — less than 6 inches of soil planted with hardy cover such as sedum — can cost $15 per square foot. An intensive roof — essentially a garden, with deeper soil and plants that require watering and weeding — can double that. But because the vegetation is thicker, it will do a better job of cooling a building and collecting rainwater. Plants reduce sewer discharge in two ways. They retain rainfall, and what does run off is delayed until after the waters have peaked.

A study conducted by Columbia University and City University of New York of three test roofs built by Con Edison in Queens found that the green roof — an extensive roof, planted with sedum — cut the rate of heat gained through the roof in summer by 84 percent, and the rate of heat lost through the roof in winter by 34 percent.

Another study (same researchers, same Con Ed test sites) found that green roofs are a very cost-effective way to reduce storm water runoff. If New York has one billion square feet of possibly greenable roof, planting it all could retain 10 to 15 billion gallons of annual rainfall — which would cut a substantial amount of sewage overflow. “If you add in all the other green infrastructure, such as street trees, permeable pavement and ground collection pits, it might be possible to eliminate the combined sewage overflow without building specialized water detention tanks, which are hugely expensive,” said Stuart Gaffin, a research scientist at Columbia’s Center for Climate Systems Research, who co-authored both studies with colleagues from City College.
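
As a rough sanity check, that claim works out to 10 to 15 gallons retained per square foot per year. Assuming New York averages something like 47 inches of rain annually (my ballpark figure, not one from the article), that is on the order of a third to a half of all the rain that falls on that roof area. A minimal sketch of the arithmetic:

    # Back-of-the-envelope check of the retention claim above.
    # The rainfall figure is my assumption, not a number from the article.
    GREENABLE_ROOF_SQFT = 1_000_000_000   # "one billion square feet" of greenable roof
    ANNUAL_RAINFALL_IN = 47               # assumed NYC average rainfall, inches per year
    GALLONS_PER_CUBIC_FOOT = 7.48

    rain_per_sqft = (ANNUAL_RAINFALL_IN / 12) * GALLONS_PER_CUBIC_FOOT   # ~29 gal/sq ft/yr
    total_rainfall = rain_per_sqft * GREENABLE_ROOF_SQFT                 # ~29 billion gallons

    for retained in (10e9, 15e9):   # the article's 10-15 billion gallon range
        print(f"{retained / 1e9:.0f} billion gallons retained is "
              f"{retained / total_rainfall:.0%} of the ~{total_rainfall / 1e9:.0f} "
              f"billion gallons falling on those roofs each year")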

Green roofs have other advantages. They scrub the air: one square meter can absorb all the emissions from a car being driven 12,000 miles a year, said Amy Norquist, chief executive of Greensulate, which installs green roofs. And green roofs can provide the plants that animals, birds and bees need where parks are far apart.

White roofs are both simple and cheap:

But less investment buys less return. White roofs don’t catch rainwater, help biodiversity or clean the air. Gaffin’s group found that the white portion of the Con Ed roof averaged 43 degrees cooler than black at noon on summer days. That’s something, but it’s a smaller cooling effect than green roofs offer. Green roofs improve each year as vegetation becomes denser and taller. But after a few months, a white roof tends to look like city snow — covered with soot. As a white roof dirties, it loses a lot of its cooling ability.

Museum Intervention

Wednesday, June 13th, 2012

A museum intervention is now mandatory for all first-year med students at Yale:

Called Enhancing Observational Skills, the program asks students to look at and then describe paintings — not Pollocks and Picassos but Victorian pieces, with whole people in them. The aim? To improve diagnostic knack.

Linda Friedlaender, the curator of education at the Yale Center for British Art, and Irwin Braverman, at Yale’s medical school, created the program a decade ago and guide groups through the New Haven museum. Each student is assigned a painting — “Mrs. James Guthrie,” say, by Lord Frederic Leighton — which they examine for 15 minutes, recording all they see. Then the group discusses its observations.

There is no redness, no apparent pressure, in Mrs. Guthrie’s fingers as she holds a flower. Does that mean she’s putting it into the vase — or taking it out? The conclusion matters less than the collection of detail. “We are trying to slow down the students,” Ms. Friedlaender told me. “They have an urge to come up with a diagnosis immediately and get the right answer.”

Many have been taught that schooling is a race to the finish. Others learned early that equations beat etchings (picture book writers, once considered the “academicians of the nursery,” have been trampled on the fast track to pre-K). Ms. Friedlaender is realistic: “This is not an aesthetic experience we’re providing. The artwork is a means to an end.”

Surgeon Richard Selzer, in “Letters to a Young Doctor,” wrote: “I have seen sorrow more fully expressed in a buttocks eaten away by bedsores; fear, in the arching of a neck; supplication, in a wrist. Only last week I was informed by a man’s kneecaps that he was going to die. Flashing blue lights, they teletyped that he was running out of oxygen and blood.” The Yale intervention may not endow students with Dr. Selzer’s acute empathy. But a three-year study published in the Journal of the American Medical Association showed that, afterward, they are 10% more effective at diagnosis.

The program has expanded to more than 20 medical schools, including Harvard, Columbia and Cornell. It has also become part of Wharton’s executive education.

If your goal is to teach diagnosis, perhaps you should show students photos or videos of actual patients — but that’s so practical.

The New Neuroscience of Choking

Tuesday, June 12th, 2012

Jonah Lehrer reviews the neuroscience of choking under pressure — which comes about from thinking too much:

The sequence of events typically goes like this: When people get anxious about performing, they naturally become particularly self-conscious; they begin scrutinizing actions that are best performed on autopilot. The expert golfer, for instance, begins contemplating the details of his swing, making sure that the elbows are tucked and his weight is properly shifted. This kind of deliberation can be lethal for a performer.

Sian Beilock, a professor of psychology at the University of Chicago, has documented the choking process in her lab. She uses putting on the golf green as her experimental paradigm. Not surprisingly, Beilock has shown that novice putters hit better shots when they consciously reflect on their actions. By concentrating on their golf game, they can avoid beginner’s mistakes.

A little experience, however, changes everything. After golfers have learned how to putt — once they have memorized the necessary movements — analyzing the stroke is a dangerous waste of time. And this is why, when experienced golfers are forced to think about their swing mechanics, they shank the ball. “We bring expert golfers into our lab, and we tell them to pay attention to a particular part of their swing, and they just screw up,” Beilock says. “When you are at a high level, your skills become somewhat automated. You don’t need to pay attention to every step in what you’re doing.”

But this only raises questions: What triggers all of these extra thoughts? And why does it only happen to some athletes, performers, and students? Everyone gets nervous; not everyone chokes.

A new study in Neuron, by a team of neuroscientists at Caltech and University College London, begins to solve this mystery. The experiment featured a simple arcade game, in which subjects attempted to move a virtual ball into a square target within two seconds. To make the task more difficult, the ball appeared to be weighted and connected to a spring, which flexed and bent as if it were real.
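
To make the task concrete, here is a minimal sketch of the kind of spring-loaded control problem being described: a cursor drags a weighted ball toward a square target, and the trial succeeds only if the ball settles inside the square within two seconds. Every parameter (mass, stiffness, damping, target size) is invented for illustration; this is not the researchers’ actual task code.

    # A toy version of the task described above: a weighted ball attached by a
    # spring to a player-controlled cursor must land inside a square target
    # within two seconds. All parameters are invented for illustration.
    MASS = 1.0        # ball mass (arbitrary units)
    STIFFNESS = 20.0  # spring constant tying the ball to the cursor
    DAMPING = 4.0     # damping, so the ball lags and flexes instead of snapping
    DT = 0.01         # integration step, seconds
    TIME_LIMIT = 2.0  # the two-second deadline mentioned in the article

    def run_trial(cursor_path, target=(1.0, 0.0), half_width=0.1):
        """Simulate one trial; cursor_path maps time (s) to an (x, y) cursor position."""
        ball, vel = [0.0, 0.0], [0.0, 0.0]
        t = 0.0
        while t < TIME_LIMIT:
            cx, cy = cursor_path(t)
            # Spring-damper force pulls the ball toward wherever the cursor is.
            ax = (STIFFNESS * (cx - ball[0]) - DAMPING * vel[0]) / MASS
            ay = (STIFFNESS * (cy - ball[1]) - DAMPING * vel[1]) / MASS
            vel = [vel[0] + ax * DT, vel[1] + ay * DT]
            ball = [ball[0] + vel[0] * DT, ball[1] + vel[1] * DT]
            t += DT
            if abs(ball[0] - target[0]) < half_width and abs(ball[1] - target[1]) < half_width:
                return True, t   # ball reached the square in time
        return False, t          # ran out the two-second clock

    # Example: a cursor that glides straight to the target over one second.
    # The lagging, springy ball still makes it with time to spare.
    hit, when = run_trial(lambda t: (min(t, 1.0), 0.0))
    print("hit:", hit, "at", round(when, 2), "s")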

After a short training period, the subjects were put into an fMRI machine and offered a range of rewards, from nothing to a hundred dollars, if they could successfully place the ball into the square. (The subjects were later given an actual reward based on their score.) At first, their performance steadily improved as the incentives increased; the extra money was motivating. However, this effect only lasted for a little while. Once the rewards passed a certain threshold — and the particular tipping point depended on the individual — the scientists observed a surprising decrease in success. The extra cash hurt performance; the subjects began to choke.

Because the game was unfolding inside a brain scanner — an admittedly imperfect tool, which uses changes in blood flow as a proxy for neural activity — the scientists could begin to decipher the mental mechanics behind this process. They quickly zeroed in on a subcortical region called the ventral striatum, which has been implicated in the processing of various pleasures, from taking cocaine to eating ice cream to receiving cash gifts. (The striatum is dense with dopamine neurons.) As expected, the striatum tracked the financial stakes of the game, so that telling subjects about a bigger payout led to increased activity in the brain area. So far, so obvious: the extra money led people to get more excited about the potential rewards, which led them to work harder. This is why businesses give people bonuses.

However, when the subjects actually began playing the video game, the striatum did something very peculiar. All of a sudden, the activity of the brain area became inversely related to the magnitude of the reward; bigger incentives led to less excitement. Furthermore, activity in the insula was closely correlated with success, with decreased activity leading to decreased performance.

What explains this result? The researchers argue that the subjects were victims of loss aversion, the well-documented psychological phenomenon that losses make us feel bad more than gains make us feel good.

[...]

Although we assume that there’s a simple, linear relationship between financial rewards and productivity — that’s why Wall Street gives its best employees huge bonuses — such rewards can backfire, especially when the task is difficult, or requires expertise. Consider a classic study led by the psychologist Sam Glucksberg in the early nineteen-sixties. He gave subjects a standard test of creativity known as “Duncker’s candle problem.” A “high drive” group was told that the person solving the task in the shortest amount of time would receive twenty dollars. A “low drive” group, in contrast, was reassured that their speed didn’t matter. To Glucksberg’s surprise, the subjects with an incentive to think quickly took, on average, more than three minutes longer to find the answer.

There is something poignant about this deconstruction of choking. It suggests that the reason some performers fall apart on the back nine or at the free-throw line is because they care too much. They really want to win, and so they get unravelled by the pressure of the moment. The simple pleasures of the game have vanished; the fear of losing is what remains.

Picture Book 1936

Tuesday, June 12th, 2012

When I was a freshman in college, we were assigned John Dower’s War Without Mercy: Race and Power in the Pacific War — or, rather, we were assigned most of the book. Which parts were we not assigned? Why, the parts about the Japanese and their racist views of us, of course.

The past is a foreign country, and Imperial Japan is an especially foreign country. Propaganda from Imperial Japan is surprisingly hard to understand, despite its obvious message.

Timothy White is studying the history of Malaysian cinema, and here he looks at films from the Japanese occupation:

There is some documentation of the documentary propaganda films shown in the occupied nations of Asia,(29) and some information concerning the feature films that were made by the Japanese in Southeast Asia during the Occupation. In addition to Abe Yutaka’s Nankai no Hanataba, other Japanese films made in and for Malaya and Singapore include Shima Koji’s Shingaporu Sokogeki (All-out Attack on Singapore, 1943) and Koga Masato’s Marei no Tora (The Tiger of Malaya, 1943).(30)

Some animated cartoons were made for exhibition in Southeast Asia also. One very popular cartoon character was Momotaro, the “Peach boy,” who appeared in a number of cartoons designed not just for domestic consumption within Japan, but for propaganda use in occupied countries as well. For example, Picture Book 1936 (Momotaro vs. Mickey Mouse) presented fanged Mickey Mouse look-alikes riding giant bats, attacking peaceful Pacific islanders (represented by cats and dolls, for some reason); the hero Momotaro jumps out of a picture book, repels the American mice, and cherry trees blossom throughout the island as the grateful natives sing “Tokyo Chorus.” In a more ambitious cartoon, Momotaro’s Sea Eagle, released in 1943, Momotaro leads the attack on Pearl Harbour, then “liberates” Southeast Asia; although Momotaro himself is a human boy, the “liberated peoples” are presented as animals (cute little rabbits, mice, ducks and bears, who willingly and sternly fight behind Momotaro, their liberator and leader), while the Americans and British (and especially General Percival, who surrenders Singapore to Momotaro) are huge, hairy, ugly demons, complete with horns and drooling fangs.(31)

Nippon Banzai, another animated propaganda film designed for use in the occupied nations, employed, along with an almost avant-garde mix of line animation, shadow animation, and live-action footage, the following commentary (in English!): “The peaceful Southeast Asian countries have been trampled underfoot for many years, their inhabitants made to suffer by the devilish British, Americans, and Dutch. In the midst of this hardship, in their hearts they (the inhabitants) have waited for a ray of light, a strong soul. That light, that soul was Japan.”(32)

Picture Book 1936 certainly has the look and feel of an American cartoon of the period, but the cultural references — e.g. Momotaro — are lost on an American audience:

(Hat tip to io9.)

Gatorade Goes Back to the Lab

Tuesday, June 12th, 2012

Almost half a century ago, Gatorade made its name as a sports drink. Then PepsiCo made a fortune selling Gatorade cheaply in grocery stores and convenience stores as just another soft drink. Now Gatorade is going back to its roots:

Determined to walk away from discount-driven sales—or “rented volume,” as [Gatorade president Sarah] Robb O’Hagan calls it—the company decided in 2008 to turn away from couch potatoes who chugged Gatorade to wash down a cheeseburger or cure a hangover. Analysts gasped during a 2009 earnings call when PepsiCo Chief Executive Officer Indra K. Nooyi said such consumers — who had by then reverted to cheaper beverages like soft drinks and tap water — “didn’t really have a right to exist in the Gatorade world.” Harsh, perhaps, but it was Nooyi’s way of saying PepsiCo wasn’t selling out anymore.

[...]

“The huge aha! for me was, ‘We’re an athletic performance brand, we’re selling in convenience stores, grocery stores, Wal-Mart (WMT), but we don’t even show up in a sporting goods store, in a cycling store, in a place where an athlete actually goes to equip themselves to play sports,’” she says. Robb O’Hagan has since brought Gatorade back to athletes and to the science that gave the brand its credibility.

First developed by researchers at the University of Florida in 1965, Gatorade took off quickly with college and professional athletes because it has a formula proven on the playing field. By 1983 it had become the National Football League’s official sports drink. In 2001, PepsiCo bought the $2 billion-a-year brand, and the soft-drink and snack giant spent the better part of the decade pushing Gatorade through its massive distribution system. PepsiCo introduced hundreds of flavors and package deviations, including a breakfast version, Gatorade A.M., pitched by Indianapolis Colts quarterback Peyton Manning. The strategy made sense at the time, Robb O’Hagan allows, but it crashed along with the economy in 2008. In 2007 the sports-drink category had mushroomed to $8 billion a year in the U.S., and Gatorade controlled 80 percent, according to industry newsletter Beverage Digest. Within three years, the sports-drink market had declined by $1 billion, and Gatorade’s market share had eroded to 74.8 percent. Meanwhile, serious athletes were turning away from sports drinks to a raft of emerging products, including Jelly Belly Sport Beans, Bonk Breaker Energy Bars, and the Honey Stinger energy waffles endorsed by Tour de France champion Lance Armstrong. They bought Carbo-Pro powders in large tubs. Gatorade had mostly conceded these markets. “It’s our role to make anything to drive an athlete’s performance that goes inside their body,” Robb O’Hagan says, drawing a comparison to her former employer’s strategy. “Nike’s all about what’s outside your body. We’re about what’s inside.”

They can expect some challenges educating the customers about their new product lines:

Naturally, Gatorade can’t make individualized products for everyone; the company has to find common denominators. Its solution so far has been the G Series. The core line is targeted to “performance” athletes — competitive high school swimmers to adult basketball league players — who make up nearly a quarter of the U.S. population, Robb O’Hagan says. The series includes a 4-ounce carbohydrate-loaded “pre-game fuel” drink pouch designed to be easily torn open and squeezed into your mouth. The flavored recovery water in the series is packed with protein and carbohydrates.

G Series Fit moves down the ladder a bit and is intended for the roughly 55 million Americans aged 18 to 34 who exercise three times or more per week. These people work out to stay healthy, without necessarily competing. The supplements in Fit are scaled back to match less intense workouts. This line is where Gatorade’s departure from beverages is most pronounced so far: The main offering is a fruit-and-nut bar segmented into bite-size, 50-calorie squares. A fruit smoothie provides an after-workout dose of protein to help the athlete recover sooner.

G Series Pro, meanwhile, is a consumer version of products Gatorade had already been producing for professional athletes. A recovery bar contains whey and casein from milk protein for muscle growth. Vitamins and minerals in the bar boost muscle metabolism, Gatorade says, while carbohydrates help store energy in muscles and the liver in the form of glycogen sugar. Gatorade soon will roll out Pro chews — essentially Gummi Bears for endurance athletes — to compete with Gu Chomps and Clif Bloks that are a staple on long-distance courses. The company also sells two all-natural versions of its new drinks that use noncaloric sweeteners. Gatorade drinkers accustomed to buying 32-oz. bottles for 99¢ may experience sticker shock when it comes to the newest products. PepsiCo charges for its innovation. A 12-oz. bottle of the Pro pre-workout carbohydrate drink sells for $2.99.

This product lineup demands an equally dramatic shift in how and where PepsiCo distributes Gatorade in stores. Before Gatorade’s transformation, sales were split fairly evenly among grocery chains, club stores such as Wal-Mart, and convenience stores. “We are setting a different bar for how we are looking at retail,” says brand marketing Vice-President Fairchild, whom Robb O’Hagan recruited from Nike last year.

Gatorade has taken its Pro series into cycling and running specialty stores that cater to endurance athletes as well as health supplement stores such as GNC. “People come in and buy nutrition from me every day and spend hundreds of dollars,” says Julian Angus, 40, owner of Tempo Cyclery in Sarasota, Fla., who remembers a time last decade when only a few companies made products for elite athletes. Margins rival those of clothing and accessories, he says. Still, Angus was skeptical when Gatorade first pitched him on the products and started sending displays. He worried that customers, not realizing these were new offerings, would think they were being charged boutique prices for the same old drinks they could buy at the supermarket.

Performance athletes make up a quarter of the US population?  That seems optimistic.

Gallup’s Health and Healthcare survey asks Americans to say how frequently they participate in moderate sports or recreational activities, vigorous sports or exercise activities, and weight-lifting or weight-training.

Approximately 6 in 10 Americans indicate they regularly engage in moderate exercise (59% in 2007); about half as many regularly engage in vigorous exercise (32%); and about half as many as that report doing regular weight training (15%).

I would say those self-reported numbers set an upper bound.
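
To put rough numbers on that skepticism (the population figures below are my own ballpark assumptions, not from either article):

    # Ballpark comparison of the "performance athlete" claim with the Gallup
    # self-reports above. Both population figures are my rough assumptions.
    US_POPULATION = 310_000_000   # assumed total U.S. population, circa 2012
    US_ADULTS = 235_000_000       # assumed adult population (Gallup surveys adults)

    performance_athletes = 0.25 * US_POPULATION   # "nearly a quarter of the U.S. population"
    vigorous_exercisers = 0.32 * US_ADULTS        # Gallup: 32% report regular vigorous exercise
    weight_trainers = 0.15 * US_ADULTS            # Gallup: 15% report regular weight training

    print(f"claimed performance athletes:      ~{performance_athletes / 1e6:.0f} million")
    print(f"self-reported vigorous exercisers: ~{vigorous_exercisers / 1e6:.0f} million")
    print(f"self-reported weight trainers:     ~{weight_trainers / 1e6:.0f} million")

If essentially everyone who even claims regular vigorous exercise has to count as a “performance” athlete, the quarter-of-the-population figure looks optimistic indeed.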