Angelina Jolie’s Perfect Game

Wednesday, June 25th, 2014

Looking back, the Brangelina publicity strategy is deceptively simple, Anne Helen Petersen explains:

In fact, it’s a model of the strategy that has subconsciously guided star production for the last hundred years. More specifically, that the star should be at once ordinary and extraordinary, “just like us” and absolutely nothing like us. Gloria Swanson is the most glamorous star in the world — who loves to make dinner for her children. Paul Newman is the most handsome man in Hollywood — whose favorite pastime is making breakfast in his socks and loafers.

Jolie’s post-2005 image took the ordinary — she was a working mom trying to make her relationship work — and not only amplified it, but infused it with the rhetoric and imagery of globalism and liberalism. She’s not just a mom, but a mom of six. Instead of teaching her kids tolerance, she creates a family unit that engenders it; instead of reading books on kindness and generosity, she models it all over the globe. As for her partner, he isn’t just handsome — he’s the Sexiest Man Alive. And she doesn’t just have a job; instead, her job is being the most important — and influential — actress in the world.

Her image was built on the infrastructure of the status quo — a straight, white, doting mother engaged in a long-term monogamous relationship — but made just extraordinary enough to truly entice but never offend. The line between the tantalizing and the scandalizing is notoriously difficult to tread (just ask Kanye), but Jolie was able to negotiate it via three tactics: First, and most obviously, she accumulated (or, more generously, adopted and gave birth to) a dynamic group of children who were beautiful to observe; second, she figured out how to talk about her personal life in a way that seemed confessional while, in truth, revealing very little; and third, she exploited the desire for inside access into control of that access.

An Outstanding Weapon

Wednesday, June 25th, 2014

The RPG is a really, really outstanding weapon, Weapons Man says, and it fits in a sweet spot of direct-fire AT and AP support weaponry that’s really missing in the US infantry squad:

Instead, we have more riflemen, and additional-duty weapons like the AT-4. The RPG is cheaper and reusable, and it has a range advantage over most US disposable non-guided weapons. Its effective anti-tank range is about double that of the AT-4, and the disposable AT-4 costs $1,500 a round.

The evolutionary history of the RPG is fascinating. The Soviets began by copying a weapon they’d felt the sharp end of: the German Panzerfaust. There were many versions of this disposable AT weapon available, and by war’s end the Germans were evolving this weapon in the direction of a reusable tube. It was the Panzerfaust that originated the grenade-launch boost and rocket sustainer operating system, and the weapon evolved rapidly under the pressures of mechanized warfare. Early Panzerfäuste had a mere 30 meter range, demanding bravery, or recklessness, from a rifleman under the pressure of hordes of T-34s or Shermans. And the warheads were marginal, at least on the well-protected T-34. By 1945 most of the initial weaknesses had been allayed by the intense development taking place behind the lines, and the industrial and R&D plant fell into Russian hands.

Unlike the USA, where captured German scientists and engineers came to be trusted, with many staying on as employees and seeking American citizenship, the Soviets, who suffered terribly at German hands, never trusted the Germans and held them in rigid captivity. As quickly as possible, they transitioned German projects, including rockets, guided AA missiles, and turbine engines as well as AT weapons, to Soviet design bureaux and shut the Germans out, generally releasing them back into the USSR’s occupied zone of Germany.

The Soviet engineers proved to be quick and imaginative. They continued to improve the Panzerfaust operating system. It is generally believed that a Soviet-produced version of the late-war Panzerfaust 250 was given limited issue as the Ruchnoy Protivotankoviy Granatomet or RPG-1. A Soviet-improved version was widely issued as the RPG-2 in the later 1940s, as part of the systematic re-equipment of the Soviet Army that also saw new rifles, machine guns, and soon, tanks in service.

The limits of the RPG-2 led to the larger, heavier, more solid, and tactically longer-ranged and more accurate RPG-7 in 1961, and the versatility of the RPG-7 has kept it on the world’s front lines to this day. While most of the world knows about the remarkable longevity of the Kalashnikov rifle, its AT counterpart is just as ubiquitous, and won’t be going away any time soon. (In fact, a US firm makes a modified version for Foreign Military Sales).

Louis Awerbuck

Wednesday, June 25th, 2014

Renowned shooting instructor Louis Awerbuck has passed away. Or, rather, he shot himself rather than face a lingering illness. (At least, that’s what the Internet tells me.) This old interview may shed some light:

Q: You’re involved in teaching skills and a mindset that involve defending life and potentially taking life. Do you think about your mortality more than the average person?

LA: Yes, but I…

Q. How often do you think about your mortality, the fact that one day you will die?

LA: Almost permanently now. But I don’t care; it doesn’t matter. I don’t have any family, so it’s not a big deal. It’s literally going back to what you were talking about earlier—the Asian way of thinking… the Japanese way of thinking. Everybody holds life so precious; I don’t. I mean, I’d like to live to a hundred and fifty if I were healthy, but [pauses] death and taxes.

Q: So in your understanding, what’s after death?

LA: I don’t know, but I think there’s got to be something. Otherwise, you wouldn’t have a five-year-old killed, run over by a bus, for no reason. There’s got to be something out there. There’s got to be a reason one person lives to be a drunken murderer for 105 years and a good kid gets run over by a school bus when she’s four years old. There’s got to be something. What it is, I don’t know. I’m not a theologian. I guess it’s just a stepping in-between steps.

LA: Different people are different…

Q: For you?

LA: For me? For preserving my life? Honoring my parents. That’s why I didn’t die fourteen years ago. Not much else. I don’t trust anyone. Can’t trust anyone. So, that’s why I say I really don’t care about my death. I’ve had a hundred years packed into sixty. Why would I? I’ve got nothing to live for. I’ve got nothing to lose. I’ve got no Achilles heel. I’m not the average person. I’m an exception to the rule. The average person— wife and kids, lineage, wants to see their grandchildren play football or through college or whatever. Fine. I’m the end of the line. I’m the end of the blood line, completely.

Q: Most adults wrestle with some sort of fear or anxiety. It can be their financial well-being, their health, or their personal safety. What do you fear most in life?

LA: Probably physical incapacitation, if I were cognizant of it. Dependency, physical dependency, and being cognizant of it. Having Alzheimer’s and knowing I’ve got Alzheimer’s and not being able to [pauses] end it. That’s it. I don’t fear anything else because … Mr. Roosevelt said, “There’s nothing to fear but fear itself.” I don’t want to be dependent on anybody else. There is nothing else.

Q: Any regrets or things you would have done differently in life?

LA: I would have given my parents more time, of my so-called “valuable” time, when I was younger. That’s all. I was going to say I wouldn’t have put in as much of my side of the pound of flesh as I did, but I probably would have, but that’s it. I owe nobody anything. Nobody owes me anything. I’m happy. You get up with daily fears—“I hope the kids are alright, I hope the wife’s alright, I hope I can pay the bills…” I don’t have those worries. I go broke? I’ll make some more money, somehow, somewhere. No wife, no kids, my dog’s dead, so what am I supposed to be concerned about? No family (none living). No lineage. I mean it sounds pathetic, or pathos-tic, but why would I have worries in life? All of the general person’s worries, normal worries.

He considered himself a realist:

Q: You have the advantage of having lived in South Africa as well as America. What’s right about American culture? What about it concerns you?

LA: What concerns me is America is what South Africa was thirty-five years ago, and people are too blind to see it. What’s right about it? It’s still got a Constitution and a Bill of Rights, if people will abide by it. But … it’s never coming back to what it was. If anyone’s that stupid…. The cycle’s over. World powers have cycles, and America’s is over.

Q: So you’re not optimistic about the…

LA: I’m not pessimistic. I’m realistic because I’ve lived through this before. I’ve seen it all before. Without trying to sound supercilious, I’ve seen it all before. It’s just déjà vu, all over again, to quote the lyric. It’s going to go in no other direction. I think people would be shocked to know what is not American-owned in America, and I’m not going to give specifics. But there’s hardly anything “American made” that is American made. They’re trying to do things the right way…. The nice-guys-finish-last syndrome applies. That’s it.

Even six years ago he didn’t see much future:

Q: What does the future hold for Louis Awerbuck and Yavapai Firearms Academy?

LA: The Academy, I don’t know. For me, not much. It’s twilight and the sun’s going down. Am I … despondent? No. I reckon I’ve had a hundred years of good health, but … I’m jaded with mankind. That’s my problem. I’m jaded with mankind. Too many people. Too many years. Too many lies. Too many people with no morals, no ethics. Money, money, money. Me, me, me. Nice guys finish last. I don’t mind finishing last, but I’m tired of running, running the race. There’s no point to it. What is the end of it? What is it all? Nothing that I haven’t seen before.

More knowledge, hopefully. In fact, you can cancel the whole preceding three paragraphs and say, “Hope for more knowledge.” Just learn, learn, learn. It’s the psychology that I’m interested in. But otherwise, nothing.

What do I have left to do that I haven’t done? Nothing. Except maybe golf, but I ain’t going to try to hit a 4-inch golf ball into a 3-inch hole. Snow skiing? And I ain’t jumping out of a perfectly good aircraft, so there’s nothing left to do that I haven’t done that I wanted to do, except learn. That’s it. The show’s over.

Writhing Time

Wednesday, June 25th, 2014

During the first 32 games of the World Cup, Geoff Foster records, there were 302 players who could be seen at some point rolling around in pain, crumpling into a fetal position or lying lifeless on the pitch as the referee stopped the match:

These theatrical episodes ate up a total of 132 minutes of clock, a metric we have decided to call “writhing time.”

To be fair, it is actually possible to get hurt playing soccer. You can clang heads. You can snap a hamstring. You can get spiked in the soft tissue. There were nine injuries in total that forced players to be substituted from the game and to miss, or potentially miss, a match. These were discarded. That left 293 cases of potential embellishment that collectively took up 118 minutes, 21 seconds.

Another trick: how to calculate writhing time. The criterion used here runs from the moment the whistle is blown (because of a potential injury) to the moment that player stands up. If the TV camera cut to a replay, the stand-up moment was estimated. If he was helped off the field, the “writhing” clock stopped when he crossed the sideline.
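
As a rough illustration, the metric described above can be sketched in a few lines of Python. This is a minimal, hypothetical implementation: the incident records, field names, and sample timestamps are invented for the example, not taken from the Journal’s actual data.

```python
# Hypothetical sketch of the "writhing time" metric.
# Times are in seconds of match clock; all field names are invented.

def writhing_seconds(incident):
    """Seconds from the referee's whistle until the player is back up.

    If the player was helped off the field, the clock stops when he
    crosses the sideline instead.
    """
    end = incident.get("crossed_sideline_at") or incident["stood_up_at"]
    return end - incident["whistle_at"]

def total_writhing(incidents):
    """Sum writhing time, discarding genuine injuries (substitutions)."""
    return sum(
        writhing_seconds(i) for i in incidents if not i.get("substituted")
    )

incidents = [
    {"whistle_at": 600, "stood_up_at": 645},                         # 45 s down
    {"whistle_at": 1200, "crossed_sideline_at": 1230,
     "stood_up_at": 1260},                                           # helped off: 30 s
    {"whistle_at": 2000, "stood_up_at": 2400, "substituted": True},  # real injury
]

print(total_writhing(incidents))  # 75
```

The `substituted` flag stands in for the nine genuine injuries the study discarded; everything else counts as potential embellishment.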

The study showed one thing emphatically: the amount of histrionics your players display during a match correlates strongly with what the scoreboard says. Players on teams that were losing their games accounted for 40 “injuries” and nearly 12.5 minutes of writhing time. But players on teams that were winning — the ones who have the most incentive to run out the clock — accounted for 103 “injuries” and almost four times as much writhing.

U.S.A. in the Stage of the Pioneers

Wednesday, June 25th, 2014

In the case of the United States of America, the pioneering period did not consist of a barbarian conquest of an effete civilisation, Sir John Glubb notes, but of the conquest of barbarian peoples:

Thus, viewed from the outside, every example seems to be different. But viewed from the standpoint of the great nation, every example seems to be similar.

The United States arose suddenly as a new nation, and its period of pioneering was spent in the conquest of a vast continent, not an ancient empire. Yet the subsequent life history of the United States has followed the standard pattern which we shall attempt to trace — the periods of the pioneers, of commerce, of affluence, of intellectualism and of decadence.

Hill Tribes

Tuesday, June 24th, 2014

Flat, dry, central Iraq is the Bonneville Salt Flats of insurgency, the War Nerd says, but the hilly north is another story:

No world records set there. In fact, I.S.I.S. seems to be bogging down badly around Kirkuk. To understand why, you need to consider both ethnography and terrain. And in fact, those two things are linked very tightly here, for some grim historical reasons. If you look at an ethnic map of Northern Iraq, you’ll notice that the minority sects and ethnic groups (those two categories tend to run together in the Middle East) are clustered north of the Central Iraqi plain, where the ground rises toward the serious mountains along the Turkish and Iranian borders.

There’s a reason for that, a simple and cruel one: Minority communities that aren’t protected by the hills tend to get wiped out. All over the world, you’ll find groups described as “hill tribes,” and in almost every case, if you go back a few centuries you’ll find that these aren’t “hill tribes” by choice, but defeated tribes who were forced off the plains and into the hills to survive. In places as far apart as Burma and Kurdistan, that pattern holds very clearly.

Meat eating behind evolutionary success of humankind

Tuesday, June 24th, 2014

Meat eating was behind the evolutionary success of humankind, because this higher-quality diet meant that women could wean their children earlier:

Among natural fertility societies, the average duration of breast-feeding is 2 years and 4 months. This is not much in relation to the maximum lifespan of our species, around 120 years. It is even less if compared to our closest relatives: female chimpanzees suckle their young for 4–5 years, whereas the maximum lifespan for chimpanzees is only 60 years.

Many researchers have tried to explain the relatively shorter breast-feeding period of humans based on social and behavioral theories of parenting and family size. But the Lund group has now shown that humans are in fact no different than other mammals with respect to the timing of weaning. If you enter brain development and diet composition into the equation, the time when our young stop suckling fits precisely with the pattern in other mammals.

This is the type of mathematical model that Elia Psouni and her colleagues have built. They entered data on close to 70 mammalian species of various types into the model — data on brain size and diet. Species for which at least 20 per cent of the energy content of their diet comes from meat were categorised as carnivores. The model shows that the young of all species cease to suckle when their brains have reached a particular stage of development on the path from conception to full brain-size. Carnivores, due to their high quality diet, can wean earlier than herbivores and omnivores.
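
The classification step described above can be sketched directly. The 20 per cent meat-energy threshold is stated in the study; the species values below are rough illustrative guesses, not data from the Lund model.

```python
# Illustrative sketch of the diet-classification rule; the 0.20
# threshold comes from the study, the species numbers do not.

SPECIES_MEAT_ENERGY = {
    "human":        0.30,
    "killer whale": 0.95,
    "chimpanzee":   0.05,
    "gorilla":      0.00,
}

def diet_category(meat_energy_fraction):
    # 'Carnivore' if at least 20% of dietary energy comes from meat.
    return "carnivore" if meat_energy_fraction >= 0.20 else "non-carnivore"

for name, frac in SPECIES_MEAT_ENERGY.items():
    print(name, diet_category(frac))
```

On this criterion humans and killer whales fall on the carnivore side, while chimpanzees and gorillas do not — which is the split the model uses to explain the difference in weaning times.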

The model also shows that humans do not differ from other carnivores with respect to timing of weaning. All carnivorous species, from small animals such as ferrets and raccoons to large ones like panthers, killer whales and humans, have a relatively short breast-feeding period. The difference between us and the great apes, which has puzzled previous researchers, seems to depend merely on the fact that as a species we are carnivores, whereas gorillas, orangutans and chimpanzees are herbivores or omnivores.

The City of Industry

Tuesday, June 24th, 2014

The City of Industry is a 12-square-mile suburb of Los Angeles with just 219 residents — but more than 2,500 businesses, providing 80,000 jobs:

It was incorporated on June 18, 1957, to prevent surrounding cities from annexing industrial land for tax revenue.

The City of Industry has no business taxes and is primarily funded through retail sales tax from shopping centers located within the city limits, and property tax on parcels within the city. The city has the highest property tax rate in Los Angeles County, at 1.92%.

Naturally, this does not please Victor Valle, professor of ethnic studies at California Polytechnic State University, San Luis Obispo, whose Genealogies of Power in Southern California describes it as “the gritty crossroads of the global trade revolution that is transforming Southern California factories into warehouses, and adjacent working class communities into economic and environmental sacrifice zones choking on cheap goods and carcinogenic diesel exhaust.”

The Course of Empire

Tuesday, June 24th, 2014

Sir John Glubb plots the course of empire:

The first stage of the life of a great nation, therefore, after its outburst, is a period of amazing initiative, and almost incredible enterprise, courage and hardihood. These qualities, often in a very short time, produce a new and formidable nation. These early victories, however, are won chiefly by reckless bravery and daring initiative.

The ancient civilisation thus attacked will have defended itself by its sophisticated weapons, and by its military organisation and discipline. The barbarians quickly appreciate the advantages of these military methods and adopt them. As a result, the second stage of expansion of the new empire consists of more organised, disciplined and professional campaigns.

In other fields, the daring initiative of the original conquerors is maintained — in geographical exploration, for example: pioneering new countries, penetrating new forests, climbing unexplored mountains, and sailing uncharted seas. The new nation is confident, optimistic and perhaps contemptuous of the ‘decadent’ races which it has subjugated.

The methods employed tend to be practical and experimental, both in government and in warfare, for they are not tied by centuries of tradition, as happens in ancient empires. Moreover, the leaders are free to use their own improvisations, not having studied politics or tactics in schools or in textbooks.

The Limits of Expertise

Monday, June 23rd, 2014

Tom Nichols, professor of national security affairs at the U.S. Naval War College, recently lamented the death of expertise — or, rather, the death of the acknowledgement of expertise:

A fair number of Americans now seem to reject the notion that one person is more likely to be right about something, due to education, experience, or other attributes of achievement, than any other.

Indeed, to a certain segment of the American public, the idea that one person knows more than another person is an appalling thought, and perhaps even a not-too-subtle attempt to put down one’s fellow citizen. It’s certainly thought to be rude: to judge from social media and op-eds, the claim of expertise — and especially any claim that expertise should guide the outcome of a disagreement — is now considered by many people to be worse than a direct personal insult.

The expert isn’t always right, he admits, but an expert is far more likely to be right than you are.

Only this isn’t quite true, as Philip Tetlock’s research has shown:

The results were unimpressive. On the first scale, the experts performed worse than they would have if they had simply assigned an equal probability to all three outcomes — if they had given each possible future a thirty-three-per-cent chance of occurring. Human beings who spend their lives studying the state of the world, in other words, are poorer forecasters than dart-throwing monkeys, who would have distributed their picks evenly over the three choices.

Tetlock also found that specialists are not significantly more reliable than non-specialists in guessing what is going to happen in the region they study. Knowing a little might make someone a more reliable forecaster, but Tetlock found that knowing a lot can actually make a person less reliable. “We reach the point of diminishing marginal predictive returns for knowledge disconcertingly quickly,” he reports. “In this age of academic hyperspecialization, there is no reason for supposing that contributors to top journals — distinguished political scientists, area study specialists, economists, and so on — are any better than journalists or attentive readers of the New York Times in ‘reading’ emerging situations.” And the more famous the forecaster the more overblown the forecasts. “Experts in demand,” Tetlock says, “were more overconfident than their colleagues who eked out existences far from the limelight…. The expert also suffers from knowing too much: the more facts an expert has, the more information is available to be enlisted in support of his or her pet theories, and the more chains of causation he or she can find beguiling. This helps explain why specialists fail to outguess non-specialists. The odds tend to be with the obvious”.

James Shanteau‘s “cross-domain” study of expert performance showed that some fields developed true expertise (“high validity” domains), and others did not:

The importance of predictable environments and opportunities to learn them was apparent in an early review of professions in which expertise develops. Shanteau (1992) reviewed evidence showing that [real, measurable] expertise was found in livestock judges, astronomers, test pilots, soil judges, chess masters, physicists, mathematicians, accountants, grain inspectors, photo interpreters, and insurance analysts.

In contrast, Shanteau noted poor performance by experienced professionals in another large set of occupations: stockbrokers, clinical psychologists, psychiatrists, college admissions officers, court judges, personnel selectors, and intelligence analysts.

Read T. Greer’s whole piece on the limits of expertise.

Private Cities 101

Monday, June 23rd, 2014

The 21st century will be the century of cities, we’re told, and Mark Lutter would like to see it become the century of private cities — or proprietary communities:

Proprietary communities are communities defined through private property. A common example is a mall. It is owned by a proprietor who rents out space for income. However, in order to increase the value of the store space, the proprietor must also provide public goods: security, lighting, and open spaces inside the mall. Proprietary communities typically lease land to residents, with revenue the result of increased land value from the provision of public goods.

Proprietary communities offer a solution to a host of problems commonly assumed to justify government intervention. Private property internalizes externalities. Proprietary communities take advantage of that fact by creating private property over land spaces traditionally thought of as public domain. They work by creating a residual claimant in the provision of public goods. That is, proprietors keep as income the rents collected through leases after costs are deducted.

Economists tend not to worry about the provision of goods or services when such provision has the potential to make people rich. The private sector does a good job of making cars because people who make great cars will enjoy financial rewards. On the other hand, no one can get rich stopping overfishing, for example, which is why it remains a problem.

Proprietary communities offer people a way to get rich by providing public goods. Public goods affect the value of the land on which they are provided. A classic example is schools. Good schools can increase land value by thousands — if not tens of thousands — of dollars. Similarly, police, roads, parks, and sanitation tend to raise land values. Because a proprietor’s or developer’s income depends on the value of the land he is renting out, he has incentives to provide public goods as part of his total offering.

The two closest examples of proprietary cities are Letchworth and Welwyn, small cities of around 30,000 each founded by Ebenezer Howard on Georgist principles before being nationalized after World War II. Walt Disney World is effectively a private city unto itself, demonstrating the scalability of the idea.

Imagining a modern proprietary city is difficult. Order is defined in the process of its emergence, and the market makes fools of those who believe they can predict its path. However, a conservative guess is that a proprietary city might look similar to Sandy Springs, a Georgia city of 93,000 people that outsourced public services to private companies after a bankruptcy crisis, creating a superior provision of public goods at a lower cost.

Like Arnold Kling, I’d like to know more about how and why private cities don’t emerge.

A Providential Turnover?

Monday, June 23rd, 2014

When a new power rises, it often appears to be a providential turnover, Sir John Glubb notes:

Whatever causes may be given for the overthrow of great civilisations by barbarians, we can sense certain resulting benefits. Every race on earth has distinctive characteristics. Some have been distinguished in philosophy, some in administration, some in romance, poetry or religion, some in their legal system. During the pre-eminence of each culture, its distinctive characteristics are carried by it far and wide across the world.

If the same nation were to retain its domination indefinitely, its peculiar qualities would permanently characterise the whole human race. Under the system of empires each lasting for 250 years, the sovereign race has time to spread its particular virtues far and wide. Then, however, another people, with entirely different peculiarities, takes its place, and its virtues and accomplishments are likewise disseminated. By this system, each of the innumerable races of the world enjoys a period of greatness, during which its peculiar qualities are placed at the service of mankind.

To those who believe in the existence of God, as the Ruler and Director of human affairs, such a system may appear as a manifestation of divine wisdom, tending towards the slow and ultimate perfection of humanity.

The Unknown War

Sunday, June 22nd, 2014

On June 22, 1941, Nazi Germany invaded Soviet Russia and kicked off The Unknown War, the war on the Eastern Front, the real war:

Burt Lancaster, who hosts the series from Soviet Russia, was known for his progressive politics:

Lancaster was a vocal supporter of liberal political causes, and frequently spoke out in support of racial minorities, including at the March on Washington in 1963. He was a vocal opponent of the Vietnam War and political movements such as McCarthyism, and he helped pay for the successful defense of a soldier accused of “fragging” (murdering) another soldier during the war. In 1968, Lancaster actively supported the presidential candidacy of antiwar Senator Eugene McCarthy of Minnesota, and frequently spoke on his behalf during the Democratic primaries. He heavily campaigned for George McGovern in the 1972 presidential election. In 1985, Lancaster, a longtime supporter of gay rights, joined the fight against AIDS after his close friend, Rock Hudson, contracted the disease. He campaigned for Michael Dukakis in the 1988 presidential election.

The Causes of Race Outbursts

Sunday, June 22nd, 2014

The modern instinct is to seek a reason for everything, Sir John Glubb says, and to doubt the veracity of a statement for which a reason cannot be found:

So many examples can be given of the sudden eruption of an obscure race into a nation of conquerors that the truth of the phenomenon cannot be held to be doubtful. To assign a cause is more difficult. Perhaps the easiest explanation is to assume that the poor and obscure race is tempted by the wealth of the ancient civilisation, and there would undoubtedly appear to be an element of greed for loot in barbarian invasions.

Such a motivation may be divided into two classes. The first is mere loot, plunder and rape, as, for example, in the case of Attila and the Huns, who ravaged a great part of Europe from A.D. 450 to 453. However, when Attila died in the latter year, his empire fell apart and his tribes returned to Eastern Europe.

Many of the barbarians who founded dynasties in Western Europe on the ruins of the Roman Empire, however, did so out of admiration for Roman civilisation, and themselves aspired to become Romans.

Slobbery as Snobbery

Saturday, June 21st, 2014

Theodore Dalrymple describes the modern fashion for slobbery as snobbery:

A century ago, there would have been one clothes shop for every hundred well-dressed people. Nowadays there is one well-dressed person (if that) for every hundred clothes shops. What accounts for this strange reversal of ratios?

Beyond the fact that clothes are now mass-produced rather than made individually, there is an act of will involved. Practically everyone now dresses not merely in a casual way, but with studied slovenliness for fear of being thought elegant, as elegance is a metonym for undemocratic sentiment or belief. You can dress as expensively as you like, indeed expensive scruffiness is a form of chic, but on no account must you dress with taste and discrimination. To do so might be to draw hostile attention to yourself. Who on Earth do you think you are to dress like that?