How the modern day tomato came to be

Tuesday, September 20th, 2011

Barry Estabrook explains how the modern day tomato came to be:

I was driving down an interstate highway in Southwestern Florida and came up behind what I thought at first was a gravel truck. As I got closer, I saw what I took for Granny Smith apples — and I thought, “Those don’t grow in Florida.” When I got really close, I saw it was full of bright green tomatoes. No pink — just green.

I was mesmerized, and then the truck hit a bump. Three tomatoes came flying off and nearly went through my windshield. I noticed that they hit the pavement on I-75, bounced and then rolled into the ditch.

They didn’t shatter, they didn’t splatter; they stayed intact. I thought, “My God! What have they done to this wonderful fruit?”

Winter tomatoes that we get in our grocery stores and in fast food places are picked when they’re bright green. Any hint of coloration is treasonous in a Florida tomato field in the winter. The industry says they’re “mature green” and supposedly might develop flavor, but there’s no way the pickers can tell the difference between mature and immature.

These green tomatoes are taken back to a warehouse, packed in boxes, which are stacked on pallets and moved into storage areas where they’re exposed to ethylene gas. The gas forces the tomatoes to turn the right color; it doesn’t ripen them.

There are two factors at work here. The first is that the tomatoes are picked when they’re immature, and no matter what you do, an immature tomato will never get any taste, though it might look alluring.

The second problem with industrial tomatoes is that for the last fifty years, they’ve been bred for one thing only, and that’s yield. One farmer told me, “I get paid per pound. I don’t get paid a cent for taste.” Sadly, he was right.

This leaves unanswered the question of why customers won’t pay for taste.

Too Big for TV

Sunday, September 18th, 2011

George R.R. Martin discusses the challenge of writing a budget-conscious screenplay based on a novel he originally wrote to be too big for television:

Which episode of Game of Thrones are you writing next season?

The Battle of the Blackwater, God help me. David and Dan must hate me.

That’s the one where you have to be most conscious of budgetary decisions.

It’s very tough because we don’t have the budget to do the battle in the book. We just don’t.

Well, they have to be able to show the ships and what happens to them, right?

I hope so. We’ll see. I’m writing it. I’m cutting certain things. We’ll see once I turn it in if we can do it. When you look at HBO’s Rome…

Loved Rome.

I loved it too, but what about the battles?

We see Caesar leave the tent to go to war, then he comes back and falls asleep.

Caesar leaves the tent. Pompey leaves the tent. Then we see Pompey’s banner in the mud. And Caesar comes back to the tent. The next episode, Pompey describes the battle to Pullo and Vorenus, drawing it in the dirt with a stick to explain what happened. For the Battle of Actium, they open with Mark Antony floating on a piece of wood — and Rome had a bigger budget than we do. I’ve been trying to tell the fans that. On some level they’re expecting the Battle of Pelennor Fields [from Peter Jackson's The Return of the King].

Fans don’t distinguish as much between mediums now.

They don’t. And television has set that up by being increasingly good. Back in the 1960s or ’70s, you could tell a TV show from a movie in three frames, just by the way it was shot and lit. But you can’t these days.

College vs. Work

Tuesday, September 13th, 2011

John Derbyshire shares an anecdote from a reader about college vs. work:

I work as an energy trader and recently took a customer down to Appalachia to visit some coal mines. On our visit to one of the mines, there was a large sign prominently displayed: Accepting Applications. Once the meeting and mine tour were finished we were in the mine manager’s office and I asked him, “How come you’re hiring? Did you just lose some workers?”

“Hell, no!” was the reply. “We are always looking for people.”

Not sure if you have had the chance to visit Appalachia, but there are large pockets of poverty here, especially now that overall unemployment throughout the country is close to 10 percent. Hard to imagine there would be any job openings going unfilled. So I asked him again, “How come? Don’t you pay enough?”

He explained to me that a high school graduate can start working at the mine and make roughly $40K a year. After 90 days of training (or in the industry lingo, when a worker goes from being a “red hat” to a “black hat”) that pay jumps up to about $50K a year.

Now granted, this isn’t easy work. It’s a 50-hour work week (with overtime of course), which includes night shifts and weekends. But $50K for a high school graduate?

The manager went on to explain to me that, “If you know which end of a wrench to pick up” the company will be glad to train you to be an electrician, equipment operator, etc. in which case your salary will rise to $75–$100K a year.

I asked him, “Then how come you can’t get workers?”

His reply was telling. “All you have to do to get a mine job is come to work every day, work reasonably hard, and pee clean. We just can’t find people who can do this.”

Apparently it’s not even illegal drugs. Legal prescription painkillers are the main cause. People will shop doctors and get multiple prescriptions. Doctors are happy to prescribe these, because they run clinics and make money from Medicaid by selling the pills.

Finally I asked the manager, who was in his mid 50s or so, “What about your kids?”

He replied: “Oh, they both went to college.”

“What are they doing now?”

“Working for the state government.”

“How much do they get paid?”

“About $25 grand a year.”

Why the Gender Gap Won’t Go Away. Ever.

Friday, September 9th, 2011

Kay S. Hymowitz explains why the gender gap won’t go away — ever:

Let’s begin by unpacking that 75-cent statistic, which actually varies from 75 to about 81, depending on the year and the study. The figure is based on the average earnings of full-time, year-round (FTYR) workers, usually defined as those who work 35 hours a week or more.

But consider the mischief contained in that “or more.” It makes the full-time category embrace everyone from a clerk who arrives at her desk at 9 AM and leaves promptly at 4 PM to a trial lawyer who eats dinner four nights a week — and lunch on weekends — at his desk. I assume, in this case, that the clerk is a woman and the lawyer a man for the simple reason that — and here is an average that proofers rarely mention — full-time men work more hours than full-time women do. In 2007, according to the Bureau of Labor Statistics, 27 percent of male full-time workers had workweeks of 41 or more hours, compared with 15 percent of female full-time workers; meanwhile, just 4 percent of full-time men worked 35 to 39 hours a week, while 12 percent of women did. Since FTYR men work more than FTYR women do, it shouldn’t be surprising that the men, on average, earn more.

The way proofers finesse “full-time” can be a wonder to behold. Take a recent article in the Washington Post by Mariko Chang, author of a forthcoming book on the wealth gap between women and men. Chang cites a wage difference between “full-time” male and female pharmacists to show how “even when they work in the same occupation, men earn more.” A moment’s Googling led me to a 2001 study in the Journal of the American Pharmacists Association concluding that male pharmacists worked 44.1 hours a week, on average, while females worked 37.2 hours. That study is a bit dated, but it’s a good guess that things haven’t changed much in the last decade. According to a 2009 article in the American Journal of Pharmaceutical Education, female pharmacists’ preference for reduced work hours is enough to lead to an industry labor shortage.

The other arena of mischief contained in the 75-cent statistic lies in the seemingly harmless term “occupation.” Everyone knows that a CEO makes more than a secretary and that a computer scientist makes more than a nurse. And most people wouldn’t be shocked to hear that secretaries and nurses are likely to be women, while CEOs and computer scientists are likely to be men. That obviously explains much of the wage gap.

But proofers often make the claim that women earn less than men doing the exact same job. They can’t possibly know that. The Labor Department’s occupational categories can be so large that a woman could drive a truck through them. Among “physicians and surgeons,” for example, women make only 64.2 percent of what men make. Outrageous, right? Not if you consider that there are dozens of specialties in medicine: some, like cardiac surgery, require years of extra training, grueling hours, and life-and-death procedures; others, like pediatrics, are less demanding and consequently less highly rewarded. Only 16 percent of surgeons, but a full 50 percent of pediatricians, are women. So the statement that female doctors make only 64.2 percent of what men make is really on the order of a tautology, much like saying that a surgeon working 50 hours a week makes significantly more than a pediatrician working 37.
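The composition effect Hymowitz describes can be made concrete with a toy calculation. Suppose, purely for illustration, that surgeons and pediatricians are paid identically regardless of sex, and borrow the article’s figures of 16 percent female surgeons and 50 percent female pediatricians; the salaries and headcounts below are invented, and the two specialties are assumed equal in size:

```python
# Toy illustration of a composition effect: within each specialty, pay is
# identical for men and women, yet the aggregate "physicians" category
# still shows a large gender pay gap.
# Salaries and headcounts are hypothetical, chosen only to show the mechanism.

specialties = {
    # name: (salary, women per 100 physicians, men per 100 physicians)
    "surgery":    (400_000, 16, 84),   # 16% of surgeons are women (from the article)
    "pediatrics": (180_000, 50, 50),   # 50% of pediatricians are women
}

def avg_pay(sex_index):
    """Average pay across both specialties for one sex (0 = women, 1 = men)."""
    total_pay = sum(salary * counts[sex_index]
                    for salary, *counts in specialties.values())
    total_people = sum(counts[sex_index]
                       for _, *counts in specialties.values())
    return total_pay / total_people

women_avg = avg_pay(0)
men_avg = avg_pay(1)
print(f"women earn {women_avg / men_avg:.1%} of what men earn overall")
# → women earn 73.4% of what men earn overall
```

No one in this toy world is paid differently for the same job, yet pooling the two specialties produces a double-digit gap — which is the sense in which the 64.2 percent figure is closer to a tautology than to evidence of unequal pay for equal work.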

A good example of how proofers get away with using the rogue term “occupation” is Behind the Pay Gap, a widely quoted 2007 study from the American Association of University Women whose executive summary informs us in its second paragraph that “one year out of college, women working full time earn only 80 percent as much as their male colleagues earn.” The report divides the labor force into 11 extremely broad occupations determined by the Department of Education. So ten years after graduation, we learn, women who go into “business” earn considerably less than their male counterparts do. But the businessman could be an associate at Morgan Stanley who majored in econ, while the businesswoman could be a human-resources manager at Foot Locker who took a lot of psych courses. You don’t read until the end of the summary — a point at which many readers will have already Tweeted their indignation — that when you control for such factors as education and hours worked, there’s actually just a 5 percent pay gap. But the AAUW isn’t going to begin a report with the statement that women earn 95 percent of what their male counterparts earn, is it?
[...]
So why do women work fewer hours, choose less demanding jobs, and then earn less than men do? The answer is obvious: kids. A number of researchers have found that if you consider only childless women, the wage gap disappears. June O’Neill, an economist who has probably studied wage gaps as much as anyone alive, has found that single, childless women make about 8 percent more than single, childless men do (though the advantage vanishes when you factor in education). Using Census Bureau data of pay levels in 147 of the nation’s 150 largest cities, the research firm Reach Advisors recently showed that single, childless working women under 30 earned 8 percent more than their male counterparts did.

Where is the Next Steve Jobs?

Thursday, September 1st, 2011

Every company wants to be like Apple, Virginia Postrel notes, and every CEO wants to be the next Steve Jobs, but Apple’s success wasn’t magical, and Steve Jobs didn’t work that magic through pure charisma:

In his new book “Good Strategy, Bad Strategy: The Difference and Why It Matters,” Richard P. Rumelt, a strategy professor at UCLA’s Anderson School of Management, offers another explanation: the ruthless execution of good strategy.

Strategy is not what many people think it is. It is not a fill-in-the-blanks mission statement blathering about how XYZ Corp. will ethically serve its stakeholders by implementing best-in-class integrated sustainable practices to grow as a global leader while maximizing shareholder value. Such bafflegab is “Dilbert”-fodder that generates cynicism and contempt. It is, at best, a big waste of time.

Neither is strategy a declaration that the ABC Co. will increase sales by 20 percent a year for the next five years, with a profit margin of at least 20 percent. Strategy is not the resolve to hunker down and try harder — what Kenichi Ohmae of McKinsey criticized in a 1989 Harvard Business Review article as “do more better.” Effort is not strategy. Neither are financial projections. And neither are wishes.

A strategy “is a way of dealing with a high-stakes challenge,” Rumelt told me in an interview. “It’s a way around the obstacles or problems in a difficult situation.”

Every good strategy, he writes, includes what he calls the kernel: a “diagnosis” of the challenge (“What’s going on here?”), a “guiding policy” for dealing with that challenge (the core idea often called a strategy), and a set of “coherent actions” to carry out that policy (the implementation). [...] “Strategy is scarcity’s child and to have a strategy, rather than vague aspirations, is to choose one path and eschew others,” writes Rumelt.

A strategy is not a goal like maximizing shareholder value or keeping America safe from terrorism. It’s not even a plan. It is a design — a coherent approach to defining and solving a particular problem, in which the different elements have to work together.

In this analysis, Steve Jobs is not only a connoisseur and sponsor of good design. He is himself a successful designer — not of products but of business strategies.

Apple’s recent success has made people forget not only how close the company came to failing but also what Jobs did to turn it around when he returned as chief executive in 1997. He diagnosed Apple’s problem: It was hemorrhaging cash and its product lineup was too diverse, confusing and expensive.

In response, Rumelt explains, Jobs “redesigned the whole business logic around a simplified product line sold through a limited set of outlets.” He cut product offerings down to two: a desktop and a laptop, and no peripherals. He moved most manufacturing to Taiwan, cut software development, and eliminated all but one national retailer, opening a Web store to sell directly to consumers.

And, yes, Jobs also got Microsoft Corp. (MSFT) to invest $150 million in Apple and to commit to continuing to make Mac versions of key software. But that agreement wouldn’t have helped much without the rest of the strategy. “I don’t know how he learned that ruthlessness,” Rumelt says. But it worked.

What Jobs did not do, the book suggests, is equally telling. He avoided all the management responses that masquerade as strategies. “He did not announce ambitious revenue or profit goals; he did not indulge in messianic visions of the future,” Rumelt writes. “And he did not just cut in a blind ax-wielding frenzy.”

The organization’s new, coherent design bought the company time and gave it a clear identity on which to build. Apple’s gutsy decision to open its own retail stores in 2001 made sense only in the context of its new strategy. [...] So if you really want to be like Apple, drop the fluff-filled vision statements and magical wishes. Pretend your company’s existence is at stake, coldly evaluate the environment, and make choices. Stop thinking of strategy as meaningless verbiage or financial goals and treat it as a serious design challenge.

(Hat tip to David Foster.)

Managerial Mystique

Monday, August 29th, 2011

With Steve Jobs stepping down, I thought I’d share the abstract to the recent study on Managerial Mystique: Magical Thinking in Judgments of Managers’ Vision, Charisma, and Magnetism:

Successful businesspeople are often attributed somewhat mystical talents, such as the ability to mesmerize an audience or envision the future. We suggest that this mystique — the way some managers are perceived by observers — arises from the intuitive logic that psychologists and anthropologists call magical thinking.

Consistent with this account, Study 1 found that perceptions of a manager’s mystique are associated with judgments of his or her charismatic vision and ability to forecast future business trends. The authors hypothesized that mystique arises especially when success is observed in the absence of mechanical causes, such as long hours or hard-won skills.

In Study 2, managers who succeeded mysteriously rather than mechanically evoked participants’ attributions of foresight and their expectations of success at visionary tasks yet not at administrative tasks. The authors further hypothesized that as mystique is assumed to spread through contagion, observers desire physical contact with managers who are attributed mystique and with these managers’ possessions.

Study 3 found that managers described as visionary as opposed to diligent are judged to be charismatic and ultimately magnetic. The authors discuss the implications of these judgment patterns for the literatures on perception biases and impression management in organizations.

Sprezzatura comes to mind.

My Little Pony: The RPG

Friday, August 19th, 2011

A few years back, as an April Fools joke, Wizards of the Coast, the folks behind Dungeons & Dragons, announced a My Little Pony roleplaying game.

This annoyed gamer-geek mxyzplk — because My Little Pony: The RPG was a great idea:

I put some thought into this when my daughter was younger. You could quite easily make a commodity RPG based on, for example, Dora the Explorer. Those episodes are very rote: the girl is on a quest and has to pass three different obstacles. You print up some “adventure sheets” with three to-do things, and a harried parent can “run the game” while doing housework. “Here, to get by the rhyming troll you have to write down a poem! Work one out together, Dora, Nora, and Whoever-you-are! Back in 5! Remember to play pretend!” It can be made appropriate down to a very young age. That article came out when my girl was 4 and I easily specced out some kid-compatible mechanics (who rolled higher on a d6 + arts & crafts!).
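The resolution mechanic mxyzplk describes — who rolled higher on a d6 plus a skill rating — is about as simple as an RPG system gets. A minimal sketch, with the skill values and the troll matchup invented for illustration:

```python
import random

def contest(skill_a, skill_b, rng=random):
    """Kid-friendly opposed check: each side rolls a d6 and adds a
    skill rating (e.g. "arts & crafts"); the higher total wins."""
    total_a = rng.randint(1, 6) + skill_a
    total_b = rng.randint(1, 6) + skill_b
    if total_a > total_b:
        return "player A"
    if total_b > total_a:
        return "player B"
    return "tie"

# e.g. Dora (arts & crafts 3) vs. the rhyming troll (arts & crafts 1)
print(contest(3, 1))
```

A harried parent can run this in their head: the higher skill usually wins, but the die leaves room for the troll to get lucky, which is all the drama a four-year-old needs.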

Of course, this is hard for most RPG companies to do. It’s not like they’re part of a huge corporation that owns the rights to a bunch of children’s properties! Oh, wait…

It’s pretty sad that we want to get a new generation into the hobby, but the most obvious and high-value things that could do that are despised, and instead we think all we need is yet another 300-page rulebook slaughterfest game. Get a child psychologist, combine simple to-dos with pony figures, run a TV spot during the show (retask some of the money being flushed down the toilet advertising Green Lantern toys), and voila, the My Little Pony Adventure Game has more people playing it than every other extant RPG within weeks.

I knew that My Little Pony was back on toy-store shelves. I didn’t realize it had a new hit show:

The series had a reboot last year and is properly titled My Little Pony: Friendship is Magic. While it is obviously a child’s cartoon, it is insanely well done: well-written, well-drawn, well-acted, with plenty of puns, sight gags, at least one Chuck Jones reference, and several very catchy songs.

To paraphrase the producer: “We knew parents would end up watching this show with their kids so we wanted to make it fun for them too. This includes male parents as well.” It worked! The series is now very popular with high school and college age students of both sexes.

It gets weirder:

Despite the target demographic of young girls, the show has gained a large following of predominantly male teenagers and adults, calling themselves “bronies”. The appreciation of this unlikely audience is due to a combination of Faust’s direction and characterization, the expressive Flash-based animation style, themes older audiences can appreciate, and a reciprocal relationship between the creators and fans. Elements of the show have become part of the remix culture and have formed the basis for a variety of Internet memes.

Secrets of the Secret Wars

Friday, August 12th, 2011

Jim Shooter reveals the secrets of the Secret Wars:

The road that led to Marvel Super Heroes Secret Wars actually began when Kenner Toys licensed the DC Universe for a boys’ action figure line. Their competitor, Mattel, already had their He-Man action figure line, which was doing very well, but wanted to hedge the bet in case comic book character action figures became the rage. So, they came to Marvel to talk about licensing our characters. One thing they demanded of us was an “event,” a special publication or series to help launch the toy line. I offered an idea that was suggested by a dozen or so correspondents — usually younger ones — in the fan mail every day: one big, epic story with all (or many) of the heroes and villains in it. Everyone agreed.

We went through a number of ideas for names for the toy line and series. Mattel’s focus group tests indicated that kids reacted positively to the words “wars” and “secret.” Okay.

Mattel had a number of other requirements. Doctor Doom, they said, looked too medieval. His armor would have to be made more high-tech. So would Iron Man’s, because their focus groups indicated that kids reacted positively… etc. Okay.

They also said there had to be new fortresses, vehicles and weapons because they wanted playsets, higher price point merchandise and additional play value. Okay.
[...]
Allowing any one of the writers to handle pretty much everyone else’s characters in Secret Wars, contemplated to be the biggest, most continuity-intensive crossover ever done, would have led to bloodshed in the hallowed halls.

So, I wrote it. As Editor in Chief, by definition, I was the company’s designated Keeper of the Franchises, and the ordained Absolute Authority on the characters — all part of the job, at least back then.

Movers and Shakers in Sports & Leisure

Tuesday, August 9th, 2011

There’s a certain dark humo(u)r in Amazon.co.uk’s Movers and Shakers in Sports & Leisure:

Mencius Moldbug notes the remarkable upsurge — in the last 24 hours! — in transatlantic attention to our romantic national pastime:

Now that’s what I call a special relationship! Can an expansion team be far behind? The London Chavs, perhaps? But why is no one buying gloves? UK readers please note: barehanding a long fly ball is no joke.

You’re displaying remarkable perspicuity, however, in voting aluminium. Admittedly, the “Bronx” brand exerts a powerful fashion attraction. But not only is the aluminium bat cheaper, it’s almost impossible to break. There’s not much use in a broken baseball bat. And remember: the strike zone starts at the knees.

He also links to this transcript of the messages going out over Blackberry’s BBM service:

Everyone from all sides of London meet up at the heart of London (central) OXFORD CIRCUS!!, Bare SHOPS are gonna get smashed up so come get some (free stuff!!!) fuck the feds we will send them back with OUR riot! >:O
Dead the ends and colour war for now so
if you see a brother… SALUT!
if you see a fed… SHOOT!
We need more MAN then feds so Everyone run wild, all of london and others are invited! Pure terror and havoc & Free stuff….just smash shop windows and cart out da stuff u want! Oxford Circus!!!!! 9pm, we don’t need pussyhole feds to run the streets and put our brothers in jail so tool up,
its a free world so have fun running wild shopping;)
Oxford Circus 9pm if u see a fed stopping a brother JUMP IN!!! EVERYONE JUMP IN niggers will be lurking about, all blacked out we strike at 9:15pm-9:30pm, make sure ur there see you there. REMEMBA DA LOCATION!!! OXFORD CIRCUS!!!
MUST REBROADCAST TO ALL CONTACTS!!!

Since when are English police called feds?

Hedge Fund Cult Leader Ray Dalio

Tuesday, August 9th, 2011

Kevin Roose describes Ray Dalio’s Bridgewater hedge fund as a billion-dollar cult:

Dalio, a tall, gaunt 61-year-old man with a swoop of gray hair, is an adherent of “radical transparency,” a management theory that calls for total honesty and accountability. He’s also a longtime practitioner of Transcendental Meditation and has built its precepts of self-actualization into Bridgewater’s office culture. (He’s even brought in David Lynch, the film director and unofficial TM spokesman, to lead a seminar for his staff.) Dalio expects employees to openly criticize not just the cafeteria fare but also each other; behind-the-back gossip is strictly prohibited. “Issue logs” track mistakes ranging from significant (poorly executed trades) to small (one employee is said to have been issue-logged for failing to wash his hands after a trip to the bathroom) and can result in “drilldowns,” intense sessions — one insider compares them to a cross between a white-collar deposition and the Spanish Inquisition — during which managers diagnose problems, identify responsible parties (“RPs,” in Daliospeak), and issue blunt correctives. Other employees can withdraw recordings of these proceedings from the firm’s “transparency library.”

Should Bridgewater employees need a refresher on the house rules, they can consult their copy of Principles, the 110-page manifesto Dalio has written to codify his philosophies about life, work, and the pursuit of greatness. The book used to be given to all Bridgewater employees in paper form and is now distributed via a custom app. Dalio’s axioms are studied with Talmudic intensity at the firm, and conversations with employees tend to be sprinkled with company jargon: “ego-barrier,” “probe,” and the ultimate Bridgewater insult, “suboptimal.”

“Empathy and kindness aren’t a top priority there,” says a former Bridgewater employee. The firm’s culture of absolute candor is designed to strip out emotional considerations and emphasize cold, Vulcan logic in all decision-making — the thin-skinned need not apply. But firm loyalists insist it sounds worse than it is. “Every organization is absolutely riddled with problems,” says one, “but we have a way of fixing them.” Dalio’s Principles, acolytes say, allow the firm’s researchers and traders to sidestep office politics and ego-stroking and focus on what really matters: beating the markets.

Which Bridgewater has certainly done. Last year, the firm put up the best numbers in its 36-year history, notching a nearly 45 percent gain in its most aggressive fund on its way to a total haul of more than $15 billion. Those returns — which CNBC noted were greater than the 2010 profits of Google, Amazon, Yahoo, and eBay combined — vaulted Bridgewater even further ahead in the hedge-fund rankings and reportedly netted Dalio a personal windfall of more than $3 billion.

How Debt Has Defined Human History

Monday, August 8th, 2011

David Graeber discusses how debt has defined human history — in the Wall Street Journal:

In fact, contrary to popular belief, credit has been the predominant form of money in world history. In ancient Mesopotamia, elaborate credit systems predated coinage by thousands of years. Periods in which people assume that money really “is” gold and silver, let alone use cash in most everyday transactions, are more the exception than the rule. Ancient empires, for instance, used coins mainly to pay soldiers, and when those empires dissolved in the early Middle Ages, society didn’t really “revert to barter,” as is often believed, but returned to elaborate credit systems — denominated in Roman (and then Carolingian) currency that no longer actually physically existed.

The remarkable thing was that they were able to maintain these credit systems despite the lack of any reliable state authorities willing or able to enforce contracts. How did they do it? Two ways: but both involved insisting that there were values that were more important than mere money.

The first was the cult of personal honor. In most parts of the world, in the Middle Ages (Europe was only a partial exception), merchants had to develop reputations for scrupulous integrity — not just always paying their debts, but forgiving others’ debts if they were in difficulties, and being generally pillars of their communities. Merchants could be trusted with money because they convinced others that they didn’t think money was the most important thing. As a result, “credit,” “honor,” and “decency” became the same thing — an identification which passed into ordinary life as well. Thus in England, where probably 95% of all transactions in a Medieval village were on credit, and decent people tended to avoid the courts, people still speak of “village worthies,” or “men of no account.” The apogee of this system though was the world of Medieval Islam, where checks were already in wide use by 1000 AD, and letters of credit could travel from Mali to Malaysia, all without any state enforcement whatsoever. In Melaka, the great Indian Ocean entrepôt, merchants from as far away as Ethiopia or Korea notoriously avoided written contracts, preferring to seal deals “with a handshake and a glance at heaven.” If there were problems, they were referred to sharia courts with no power to have miscreants arrested or imprisoned, but with the power to destroy a merchant’s reputation, and therefore, credit-worthiness, if he were to refuse to abide by their rulings.

This latter brings us to the second factor: the existence of some sort of overarching institutions, larger than states, usually religious in nature, that ensured that credit systems didn’t fly completely out of hand. For much of human history, the great social evil — the thing that everyone feared would lead to the utter breakdown of society — was the debt crisis. The masses of the poor would become indebted to the rich, they would lose their flocks and fields, begin selling family members into peonage and slavery, leading either to mass flight, uprisings, or a society so polarized that the majority were effectively (sometimes literally) reduced to slaves. In periods where economic transactions were conducted largely through cash, there are many parts of the world where this actually began to happen. Periods dominated by credit money, where everyone recognized that money was just a promise, a social arrangement, almost invariably involve some kind of mechanism to protect debtors. Mesopotamian kings used to rely on their cosmic ability to recreate society to declare clean slates, erase all debts, and simply start over. In ancient Judea this was institutionalized in the seventh-year Jubilee. In the Middle Ages, Christian and Islamic bans on usury and debt peonage, far from being impediments to trade, were actually what made most trade possible, since they ensured ordinary people were not entirely impoverished, and had the means to purchase the merchants’ wares, and because those religious systems became the foundation for networks of honor and trust.

This provides a hint of why we have been experiencing such a succession of debt crises. In this new phase of credit money that we’ve entered since 1971, we did exactly the opposite. Instead of setting up great overarching institutions designed to protect debtors, we created institutions like the S&P or IMF, essentially, designed instead to protect creditors. It has become increasingly apparent that the system simply doesn’t work.

That last portion hints at the fact that Graeber isn’t simply an anthropologist; he’s also an anarchist, which makes his appearance in the Journal unusual.

Graeber actually had plenty to say here, on this very blog, when we discussed his take on the origins of money.

Summer TV’s Top Target: Boys

Friday, August 5th, 2011

I’ve heard good things about Phineas and Ferb, but I didn’t realize the Disney show was part of a strategy to break out of their girl-orientation and target boys:

“We definitely set out to create a boy’s franchise. That was our goal. That group was underserved,” says Adam Sanderson, senior vice president, franchise management, at Disney-ABC Television Group. A nationwide live stadium show set for 85 cities, “Disney’s Phineas and Ferb: The Best Live Tour Ever!,” kicks off this summer.

Disney has put its licensing heft behind Phineas and Ferb, who have appeared on Kraft Macaroni & Cheese, Johnson & Johnson’s Band-Aid bandages and Kellogg’s Fruit Snacks, among other products. Skateboards, guitars and raincoats sold at stores like Wal-Mart, Kohl’s and Target are geared to young adults, so little boys will aspire to have them, too. “Disney’s whole thing was to develop a boy’s brand because it’s always been about princesses,” says Griffin Bentley, vice president of licensing for Mad Engine Inc., of San Diego, which has sold more than two million “Phineas and Ferb” T-shirts since 2008.

The approach seems to be working. Of the series’ 1.3 million viewers ages 6 to 11, roughly 48% are boys, compared with 39% for the Disney Channel overall. In May, a “Phineas and Ferb” show launched at Disney World. A feature film is in development.

The 6-to-11 age range is television programmers’ sweet spot. TV for kids under 6 can be controversial, and shows aimed at preschoolers generally have an educational or an emotional lesson, which appeases parents and cuts down on criticism from advocacy groups looking to curb TV-watching among young children. But there’s no need for tricky educational story lines for older kids. Shows can be purely entertaining.

Starting somewhere around 6, kids start exercising more independence in their TV viewing. A 2010 study from ad-buying firm Horizon Media found 55% of 6- to 11-year-olds have a TV in their rooms. Advertisers, led by fast food and movie studios, spent $121 million last year to reach kids via network and cable TV, according to Kantar Media.

It’s also the age when boys’ viewing takes a sharp turn away from girls’. “Kids watch the same stuff until age 5 or 6, and then they start to diverge,” says Linda Simensky, vice president of children’s programming at PBS, which in January came out with “Wild Kratts,” an animated show about brothers who travel the world looking for animals and acquiring their characteristics.

Some time after viewers hit age 6, story lines that appeal to girls—about friendship, romance, gossiping—start to make boys cringe. Boys like TV shows about robots and action. They prefer shows with male leads.

And according to recent research, they also prefer animation. Cartoons have dominated preschool programming, but few recent animated shows have had the boy-only appeal of “Transformers,” “G.I. Joe” and other retro classics. Meanwhile, live-action shows, with few exceptions, have tended to feature female leads, whether it’s Disney shows with stars like Selena Gomez, Demi Lovato and Miley Cyrus, or Nickelodeon’s “iCarly” (about a girl with a Web show) and “Victorious” (about a girl at a performing-arts high school).

I’m still baffled that a media empire that has had so much success with princesses hasn’t found a way to work in a few masculine knights.

The Secret Origin of the Transformers

Friday, August 5th, 2011

Jim Shooter explains the convoluted origin of the Transformers:

In 1983, a toy company approached Marvel Comics seeking development of a toy property for comics, animation and other entertainment. The toys in question were cars and other vehicles that could be opened and unfolded into ROBOTS. Very cool.

The toy company was KNICKERBOCKER TOYS. They called their toy property, based on technology licensed from a Japanese company, the “MYSTERIONS.”

Marvel Comics was their second choice as a creative services provider. They had gone to DC Comics first. The executive who approached us showed us what DC had created for them. It was a comic book. He only had photocopies. I don’t believe the thing was ever printed.

It was awful.
[...]
So, we made a deal and began work. I wrote the back story and the treatment for the first story. They loved it.

The plan was for us to publish comics and for our studio, Marvel Productions, to produce a number of animated half-hours — six, I think. I forget. We would launch just before the pre-sale of the toys. Then follow it up in the spring when the initial wave of low price point items shipped. The usual.

We were asked to come to a meeting at Knickerbocker’s offices out in the wilds of Jersey somewhere.
[...]
The next day we learned that, just before our meeting, Hasbro had announced that it was acquiring Knickerbocker. Shakeup, indeed.

The deal with Knickerbocker fell victim to the takeover by Hasbro. The Hollywood term for similar events is “turnaround.” Projects begun by previous administrations are automatically put into turnaround, that is, on hold — usually permanently.
[...]
Some months later, the Hasbro exec who was Marvel’s main contact, Bob Prupis, came to my office. He pulled a few toy vehicles out of his bag and proceeded to open and unfold them into ROBOTS.

They were bigger and much more complex than the Mysterions. Different Japanese technology, same general idea.

Hasbro, he said, had the rights to the technology and toys based upon it. The problem, he said, was story. He said that the Japanese storyline associated with the toys wasn’t useful. Japanese kids, apparently, don’t require much justification. Cars become robots, robots become cars. Well, of course they do. What do you mean, “why?”

(P.S. To this day I’ve never read or seen any of the Japanese storyline.)

American kids, he thought, would like to know why. Did I think we could develop this toy concept for comics, animation and other entertainment the way we developed G.I. JOE?

Sure.

I didn’t mention the Mysterions, but, hey, if I could do it once, I figured I could do it again. I had to wonder, though, whether the Knickerbocker Mysterions somehow inspired Hasbro’s acquisition of the Transformers toys and technology.

Following the success of G.I. JOE, these toy developments had become a regular thing.

Marc Miyake has this to add:

As for the Japanese storylines associated with the Transformers before they became the Transformers, I can assure you they were detailed, with logic behind the transformations: e.g., the transforming cassette tapes and so on were part of a boy’s secret arsenal against an alien invasion. And the transforming cars were supposed to fool alien invaders who would otherwise shoot at obvious military vehicles. I speak and read Japanese, and I spent my youth reading the backstories in the catalogs for Takara’s Microman and Diaclone lines. These backstories were expanded upon in spinoff manga: e.g., Yoshihiro Moritou’s Microman, which ran in Japan’s TV Magazine for years. It would have been easy to fuse the backstories and adapt this existing material for the US market. (I spent 7th and 8th grade figuring out how to do that!) But for whatever reason, Hasbro didn’t like the Japanese backstories, and the Transformers backstory you created has endured for 27 years, eclipsing the originals even in Japan itself.

Closing Loops

Tuesday, July 26th, 2011

Brothers Jesse Edwin Evans and Samuel Evans plan to turn their New Chicago Brewing Company into a zero-waste facility:

The heat for brewing New Chicago’s beer will come from an anaerobic digester, which uses bacteria to convert organic waste — produced in the building and by neighboring food businesses — to biogas (and sludge, which becomes fertilizer).

The gas is then cleaned, compressed, and run through a high-pressure turbine (repurposed from a military fighter jet engine) to create electricity and 850-degree steam.

The brewery, in turn, will produce spent grains — which can be used to feed the tilapia, grow mushrooms, and feed the digester — and carbon dioxide, which will be piped to the plants in the building to make them grow faster.

“The project is about closing loops,” Edel says. For that reason, he’s looking carefully at the energy needs and waste outputs of each potential occupant. He wants to demonstrate that even the most energy-intensive businesses can operate at net zero in a sustainable way.

That’s part of the reason brewing is important to the Plant: “It’s an energy-intensive activity, it’s a waste-intensive activity, and it’s a food activity. There are no toxins; it’s pure, clean stuff, and 100 percent of the waste from brewing is useful.”

Once the digester’s up and running, he says, they’ll be selling some power back to ComEd — but “they don’t let you sell them much, because you get classed as a power plant pretty quickly.”

Vertical farms and aquaponics facilities already exist in the U.S., though they’re still relatively rare, but the Plant could very well be the first place to create a series of loops that includes an anaerobic digester, food businesses, brewing, fish farming, and plant growing. Most anaerobic digesters are used on large farms to manage animal waste, though some breweries are also implementing them for wastewater.

Anheuser-Busch began using one at its New Jersey facility in 1985 to turn wastewater into biogas and now has digesters at 10 of its 12 breweries in the U.S.; Sierra Nevada and New Belgium both installed similar digesters around 2002 because their wastewater was overwhelming the municipal water treatment facilities in their respective cities.

Magic Hat Brewing Company began using a digester last year that, like the one the Plant will have, breaks down spent grains as well as wastewater and converts them to natural gas that becomes fuel for the brewing process. Steve Hill, the social networking manager of Magic Hat’s parent company North American Breweries, says that the digester will save the brewery, which produces an annual 155,000 barrels, about $200,000 per year.

Anheuser-Busch’s digesters cost $5 to $10 million apiece to build, according to Gene Bocis, who oversees utility and wastewater systems for A-B’s North American zone; Magic Hat’s was $4 million (though an outside company owns it, so the brewery didn’t have to front the money). The costs are scalable to some extent — Edel estimates that the Plant’s medium-size digester will cost $2.1 million — but even a small digester is likely to be out of the price range of most new breweries. Doug Hurst, who opened Metropolitan Brewing in Ravenswood three years ago, says he thinks most craft breweries are fairly green-minded, but “this isn’t a huge moneymaking business so it’s hard to justify a large initial outlay as a small start-up.”
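The economics are easy to check against the figures quoted above. A minimal back-of-the-envelope sketch in Python, using Magic Hat’s numbers ($4 million digester, roughly $200,000 per year in savings); applying the same annual savings to the Plant’s $2.1 million digester is my assumption, not a figure from the article:

```python
def simple_payback_years(capital_cost: float, annual_savings: float) -> float:
    """Years to recoup the capital outlay at a constant annual savings rate.

    Ignores financing costs, maintenance, and energy-price changes --
    a deliberately crude first-order estimate.
    """
    return capital_cost / annual_savings


# Magic Hat: $4M digester, ~$200K/year in savings
print(simple_payback_years(4_000_000, 200_000))  # 20.0 years

# The Plant's medium-size digester, hypothetically at the same savings rate
print(simple_payback_years(2_100_000, 200_000))  # 10.5 years
```

Even under these generous assumptions, a multi-decade payback makes clear why a small start-up like Metropolitan Brewing would balk at the initial outlay.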

The Debut of the Dazzler

Friday, July 22nd, 2011

Jim Shooter explains the debut of the Dazzler, a third-tier Marvel Comics character:

Sometime in early 1979, Marvel’s in-house counsel and V.P. of business affairs Alice Donenfeld proposed that we create a super-heroine/singer character. She was hoping to set up a joint venture with a record company — we’d produce comics featuring the character and they’d produce and market music using studio musicians, as was done with the Archies.

Disco was big at the time. Virtually every bar with a dance floor played disco, from upscale nightclubs like the Ice Palace and Studio 54, to dance halls like the one seen in Saturday Night Fever to local joints.

I assigned Tom DeFalco and John Romita, Jr. to take a shot at creating the character. In my initial discussions with them, I believe, we came up with the notion of giving her light powers, and therefore, being able to provide her own light show. Hence the “Dazzler” part of the name “Disco Dazzler.” I don’t remember who came up with which parts of the above. I was the one who came up with the energy-transmutation rationale to explain her powers.

John did some nice design sketches — performer’s attire that looked just super-hero enough. The part that Tom delivered was pretty standard. She was a young woman who dreamed and struggled to become a star, born with a “gift,” like the X-Men. She found she could use her powers openly while performing, under the pretense that it was some kind of stage magic, a closely guarded trade secret. [...] Dazzler debuted in X-Men #130.

And nothing much happened after that….

Until one day, later in 1979, I was called to an impromptu meeting upstairs. Present were Alice Donenfeld, President Jim Galton and our Hollywood rep whose name escapes me. They seemed pretty excited.

Alice and the rep had met with Neil Bogart of Casablanca Record and Filmworks who not only was interested in the “Archies” type recording venture, but wanted to launch it with a half-hour animated special. Cool.

Bogart wanted lots of Marvel heroes in the special and he wanted the stars he had under contract to provide voices for the non-Marvel characters. There had to be, therefore, characters for Robin Williams, Cher, Donna Summer, Rodney Dangerfield, Lenny and Squiggy, the Village People and KISS to play.

They had a follow up meeting already scheduled with Bogart. They needed a treatment for the story in four days. [...] So, I did it. For free, by the way, over a weekend. If I was going to be the fool who blew the deal, I didn’t want to be handing in a bill at the same time. What I wrote is posted below.

The treatment was presented to Casablanca and Neil, and the verdict came back, quoted to me by our rep, “This isn’t a half-hour special. This is a FEATURE FILM!”

And it would have been.

However, around that time, Bogart had health issues, Casablanca was being bought out and accounting improprieties were being alleged. The project fell into limbo.

But Marvel owned all rights. Casablanca had no investment, no stake whatsoever in the property or my treatment.

Alice went to the Cannes Film Festival in May of 1980 with my treatment in hand.

She managed to take a meeting with Bo Derek, and got her to read my treatment. On the basis of my treatment, Bo agreed to become attached to the project. She wanted to play Dazzler.

There’s a picture of Bo and her husband John taken at Cannes that was featured on the cover of People Magazine, shown below. If you look closely, you can see that John is holding a stack of Marvel Comics. That’s the first issue of She-Hulk on top.