National versus Corporate Governance

Tuesday, February 18th, 2014

Nations do not feel the same competitive pressures as corporations, Richard Posner notes, and so they aren’t managed as efficiently:

Even a small, miserable, effectively bankrupt nation like Greece does not disappear, as large corporations not infrequently do, because of its inefficiency. Because corporations are simpler and smaller and also more constrained by competition than nations, we can expect them to be managed more efficiently, and specifically to adopt an organizational structure that minimizes agency costs. So let’s glance at the structure of the typical large corporation and compare it to our federal government structure. There will be a board of directors to exercise a general but loose supervision over the corporation (and in turn subject to very loose control by the shareholders), intervening decisively only in crisis situations or where there is vacancy in the office of the Chief Executive Officer. The CEO will be the dominant figure in the corporation, exercising something close to dictatorial power, assisted by a small personal staff. Often he will overshadow the chairman of the board of directors—he may even double as chairman and CEO. There will be a Chief Operating Officer, exercising day to day management, while the CEO, as the public face of the corporation, will formulate policy, provide overall guidance, inspiration, and “vision,” appoint the major subordinate corporate officials (general counsel, chief information officer, chief financial officer, etc.), and maintain personal relations with important investors, customers, competitors, and regulatory officials. The corporation will have several or many operating divisions, reporting to the COO or CEO, each headed by a vice president or equivalent. The employees in each division will serve at the pleasure of their superiors; no one will have fixed tenure.

Compare the federal government. The closest to a board of directors is the Congress, but it differs mainly in having a good deal of policy responsibility, and, since it does not appoint and is not appointed by the President (corresponding to a corporate CEO), there is no presumption that its policy preferences will coincide with the President’s. The President’s exercise of his own policymaking powers will often work at cross-purposes with Congress’s exercise of its powers; nor can he appoint senior officials without the concurrence of a division of the Congress, namely the Senate. A further dilution of presidential power results from the existence of an independent federal judiciary, headed by the Supreme Court. Federal judges and Justices have lifetime tenure, can invalidate legislative and executive action both federal and state, and rarely (because of that tenure) will a President be able to appoint a majority of the Supreme Court Justices or other federal judges.

The President has a personal staff (officially the “Executive Office of the President”) that numbers more than 2000. The staff assists him in overseeing the large number of “divisions” (executive departments, such as State and Defense and free-standing agencies such as the Environmental Protection Agency) into which the federal government is divided. The heads of these divisions, however, do not have hiring or firing authority, as a practical matter, over most of the employees, who are tenured civil servants. The heads serve short terms, and often are appointed for political reasons unrelated to experience or competence. The brevity of their terms and frequent lack of relevant skills or knowledge create tense relations with the tenured bureaucrats. As a result, the President’s control over the more than 4 million federal employees (of whom about 40 percent are military) is as a practical matter quite limited.

A further division of government is brought about by federalism: the division of the nation into 50 states, each with quasi-sovereign powers. The federal government has considerable power over the states, but far less than a CEO or COO would have over the operating divisions of their corporation.

The structure of the federal government reflects the state of the nation in the eighteenth century. The population was roughly 1 percent of what it is today, there were only 13 states and as a result the Congress was small, and the federal government was tiny. It is doubtful that starting over in today’s circumstances a constitutional convention would create a similar system.

The most glaring deficiency is the limited authority of the President and the absence of an official corresponding to the Chief Operating Officer of a private corporation. From an efficiency standpoint the President should be able to promulgate laws (not just regulations), subject to override by supermajorities of Congress; appoint subordinate officials without the concurrence of the Senate; and create a position analogous to that of a COO; this would result in a structure analogous to that of France, which has both a President and a prime minister, the latter being in charge of day to day governmental operations (though France is not an example of a well-governed country). The President should also be authorized to control the finances of the states, alter their boundaries, appoint their principal officials, and veto their laws. And no civil servants should have tenure. The result of all these changes would be to conform American government to the “government” of a large private corporation.

Maybe the President would become the “outside” partner and his Chief Operating Officer the “inside” partner, the former dealing with relations with foreign countries (in the broadest sense, comprehending not only foreign relations in the conventional sense, but also immigration, trade, and military assistance and intervention), the latter with the formulation and implementation of domestic policies.

Of course these are not feasible reforms. Anything that strengthens the President weakens other sources of power, not only in the government itself (legislators, civil servants) but also in the private sector, which through campaign donations and other forms of political activity exerts a powerful and self-interested influence on governmental action, resulting in enormous waste and perversity.

The Crash Reel

Monday, February 17th, 2014

I never followed snowboarding, so I’d never heard of Kevin Pearce before watching The Crash Reel. I also didn’t know that the sport had progressed from the six-foot-tall halfpipes of the 1990s to towering 22-foot superpipes.

Coming into the last Winter Olympics, the 2010 games in Vancouver, Pearce was hailed as the next Shaun White, until he took a bad, bad fall in training, and the resulting traumatic brain injury put him in a coma for a week and a half — and left him different:

“It was pretty wild, seeing that stuff,” Pearce says of the moments with his family. “It was heavy to see that, like ‘Damn, that went down. I was not aware that went down.’ That’s no BS, I really didn’t know that stuff was going on. My brain was thinking one way, and all the people that were helping me recover had a different mindset and were on a different page. Now that I see it, I understand what they went through, what I put them through.”

It was eventually his brother, David, who has Down syndrome and his own struggles accepting himself for who he is, who was able to get through to Kevin by speaking the blatant truth. In his defense of why he doesn’t want his brother to start snowboarding professionally again, David simply told Kevin, “I don’t want you to die.”

Kevin says the accident actually brought him even closer to his brother. “Growing up with a brother with Down syndrome taught me so much about special needs and life in general and how you really need to open your eyes and be aware of all situations,” he says. “My accident had a huge impact on David. It really affected him. And it brought us so much closer. Before, I was living a life where I was winning competitions and living that life, and now, with the life I’m living, we understand each other so much better now. Things are harder. Life is slower. And David understands that.”

Taking the advice from David and his family, and taking note of his own physical limitations, Kevin eventually realized that the ending to his story was not going to be an Olympic gold medal. It wasn’t going to be professional snowboarding, period.

“The real moment I remember really realizing it is when I got back into a competition in Mt. Baker, Washington. It was the first time that I really tried to go fast and race down the mountain after my accident. It allowed me to understand how impaired my snowboarding was. You know, sitting in this chair right now, I feel totally fine. My brain doesn’t tell me that I’m still injured. Yeah, there are issues with my vision and issues with my memory, but those things aren’t telling me that I can’t snowboard. Doctors tell me, parents tell me, but I didn’t believe that sh–. I thought I could do it.”

The film reinforces Steve Sailer’s point that Winter Olympians are the successful products of nice families who engaged in a lot of (not inexpensive) family fun together.

Interestingly, Pearce’s father is a very successful businessman — a very successful, dyslexic businessman, with an artisanal glassblowing business.

Unrealistic Murders

Monday, February 17th, 2014

I’ve never seen the hit UK TV show Midsomer Murders, but a recent NHS study has found that it portrays homicide unrealistically — which concerns the psychologists leading the study, because it could be affecting public health messages:

“The typical fictional homicide is part of a planned series committed by a middle-aged white man or woman who is not intoxicated, sometimes using a bizarre weapon. In contrast, real homicides were almost all single, and were usually carried out by often intoxicated younger men from a more diverse ethnic origin, in an unplanned attack using a kitchen knife. They were also more likely to have a diagnosed mental disorder.”

Steve Sailer has made the same point about American crime dramas.

I couldn’t help but chuckle at the first comment on the Independent’s site:

To be honest, I would rather they had focussed their attention on our right wing politics and their hate campaigns against the poor and disabled…now that is an area where we see a lot of harm being caused, not only by the measures they introduce, but also the likes of Benefits Street and the Daily Mail who just love to hate. They need their fix of hatred every day…influencing society at large to hate.

The Paradox of Order

Monday, February 17th, 2014

Doug Lemov describes the paradox of order — in the context of a well-behaved class, but with implications well beyond that narrow setting:

Students, whether they realize it or not, rely on teachers to create such environments. Nonetheless, many observers misunderstand them and think they occur naturally. It’s folly to think that left to their own devices a room full of people, almost any room full of people, will behave this way.  Classrooms like Erin’s cannot be achieved without meticulous attention to building the behavioral environment step by step.  Ends and means are easily confused, and because effective classroom culture, when it is complete, is nearly invisible for stretches of time, some people will not see the work that goes into it; they will see teachers who don’t talk to their students much about behavior and believe that the answer is not to talk about behavior much with your students. If you try to ignore behavior you will end up talking about little else, whereas if you are intentional about behavioral culture and establish clear expectations, behavioral issues will ultimately fade into the background as you talk about history, art, literature, math and science.  What you see in Erin’s classrooms is not ‘better kids’ who miraculously behave. What you see is meticulous intentionality in its dormant state.

You must have order to have a learning-intensive classroom. When I say that I am not talking merely about kids of color in urban classrooms.


And while some people fail to see that, there is another side to that coin. Orderly behavior without real and rigorous academics is an empty vessel. This is worth noting because the changes that occur in some classrooms when a teacher brings order to them can be so powerful they can be like catnip. Once you learn to get students to sit silently, the temptation can be to have them sit silently when they should be interacting. Once you teach students to line up in an orderly way, the temptation can be to line them up and keep them in lines. Beware: an orderly room must be orderly to allow academic rigor to thrive. Students must be silent so their classmates may speak in a climate of respect. They must line up quickly so they can get where they need to go for maximum learning. The goal is to get them quickly and quietly through the line and on to the learning on the other side, not to keep them in quiet lines because it makes us feel more in control. I often tell skeptics of high behavioral expectations that just because a teacher can cause her classroom to be pin-drop quiet, does not mean she always must. It gives her an option she can exercise at will, not an obligation. And that reminder is good for those of us who believe in order as well.

Practical Gradualism vs. Moral Absolutism

Sunday, February 16th, 2014

Often the crusades which succeed are those which feel morally absolute to their advocates, Tyler Cowen says — and which also seem like practically-minded compromises to moderates and the undecided.

Freedom Mountain Academy

Sunday, February 16th, 2014

Freedom Mountain Academy is a one-room boarding school on a farm in the Appalachian Mountains — where troubled kids earn their freedom:

In traditional schools, students start with all of the ordinary privileges, which are then removed for bad behavior. At FMA, students start with few privileges but gain more and more as they demonstrate commitment and personal responsibility. That little twist from entitlement and punishment to earning and reinforcement rekindles motivation in FMA’s students. The curriculum reinforces this approach by rewarding students’ efforts—all of which is kind of like life.

When students first arrive, they forfeit their electronic devices and all use of electricity. FMA has this rule in order to “eliminate constant trivial pursuits,” said Margaret Cullinane, the school’s director and Kevin’s daughter. And it comes as a rude awakening.

Former student Taylor Meidinger, 16, of Packwood, Washington, said most students can handle the isolation for a week or two. But after a few weeks, Meidinger said, students miss their electronics, the outside world, family, and friends. They usually hate their new environment until they reach the two-week holiday break in December, when they go home. “At this point everybody tries to talk their parents into letting them stay home,” added Meidinger. “Then they come back for the second semester and start to realize what opportunities they have, and they start liking it.”

FMA also incentivizes students to develop a sense of collaboration. The first book they read is Viktor Frankl’s Man’s Search for Meaning, which chronicles Frankl’s struggle to find purpose in his life as an Auschwitz prisoner. Students discuss the book’s theme of suffering and how it relates to their own feelings. At this point they go on their first expedition into the wilderness, where they live off the land for 10 days, learning to work with and rely on one another.

“When you’re out in the wilderness you really can see how we’re all connected to each other,” said former FMA student Travis Ackerman. “They teach you how to grow up and take care of yourself. They don’t just teach you to memorize facts but they push you to think of new ways to solve problems,” he said. Ackerman, who was sent to FMA from Omaha, Nebraska, had to work with his classmates to set up camp, chop wood, start fires, and cook food. Learning to cooperate didn’t teach him rugged individualism, but rugged collaboration. After completing this first expedition, students reach the first of four levels of achievement. Each level grants both privileges and responsibilities in return for students’ adapting and adhering to expectations at FMA. Level one gives them permission to leave the building at will as long as they stay nearby. Reaching level two means students can range farther from the building. At level three students are free to roam the entire campus. For more than 90 percent of the students, this level is the extent of their achievements.

A select few, however, reach level four, at which they can help teach a class of their choosing. Fewer than 10 percent of the students reach this level, though more than 95 percent complete FMA’s nine-month program.


Saturday, February 15th, 2014

When the Bosnian crisis began in 1992, humanitarian groups and the UN came in to help the victims of Serb aggression:

But they quickly began to realise they were being used by western governments as a way of containing a crisis that the politicians did not want to get involved with.

The journalist David Rieff wrote:

“The idea was simple, coarse and brutal. Instead of political action backed by the credible threat of military force, the Western powers would substitute a massive humanitarian effort to alleviate the worst consequences of a conflict they wanted to contain.

‘Containment through charity’ was the way one UN official put it.”

And then at Srebrenica thousands of civilians gathered together in the enclave — believing they were under international protection. But when the Serbian troops led by General Mladic marched in, the UN troops did nothing. The promise of protection had simply made it easier for the Serbs to kill over 8,000 people.


One of the UN’s special envoys in Bosnia, Jose Maria Mendiluce, realised that Glucksmann was right:

“You don’t reply to fascism with relief supplies. Only if we stop being neutral between murderers and victims, if we decide to back Bosnia’s fight for life against the fascist horror of ethnic cleansing, shall we be able to contribute to the survival of the remnants of that country and of our own dignity.”

And then a few months later American air power — under the command of NATO — was used to force the Serbs to negotiate a peace. Almost no-one disagreed. It was a Good War in which the left-wing humanitarians were now allied with their old imperialist enemy — America.

Out of Srebrenica came a strange new hybrid — a humanitarian militarism. And in the 1990s it rose up to capture the imagination of a generation on the left in Europe.

Ever since the collapse of the left in the early 1980s they had been searching for a new vision of how to change the world for the better. Now they found it — a humanitarianism that had the power to right wrongs around the world rather than just alleviate them.

It even had French philosophers behind it.

And one of that generation who was most entranced was Tony Blair, and in 1999 he took this humanitarianism to its moment of greatest triumph.

It was also a moment of triumph for Bernard Kouchner. He became the head of the interim administration in Kosovo — and he set out to create a new democracy.

Many of his staff were leftist revolutionaries from 1968. Even one of the NATO commanders had fought on the streets of Paris.

But Kouchner quickly discovered that victims could be very bad. There was an extraordinary range of ethnic groups in Kosovo.

They all had vendettas with each other — which meant that they were both victims and horrible victimizers at the same time.

It began to be obvious that getting rid of evil didn’t always lead to the simple triumph of goodness.

Which became horribly clear in Iraq in 2003.

One Ad to Rule Them All

Friday, February 14th, 2014

What would happen if J.R.R. Tolkien worked in advertising? One ad to rule them all:

(Images: mock Tolkien-themed ads for Absolut, Chanel, Land Rover, Nike, and TomTom.)

Art Deco Furniture

Friday, February 14th, 2014

I enjoyed this collection of Art Deco furniture, including a custom daybed with bookshelves from the New York apartment of composer and pianist George Gershwin:

Art Deco Day Bed

Armed Intervention

Friday, February 14th, 2014

Humanitarian intervention soon evolved into armed intervention:

At the same time as the humanitarian movement was rising up, so too were the new despots that were going to become some of the main targets for this new idealism.

Many of them — like Pol Pot, Saddam Hussein, and Muammar Gadaffi — were also, in a strange way, products of the failure of the Communist dream. Like Kouchner they too were trying to rework revolutionary theory — but in their case with horrific results.

I have found a sort of fly-on-the-wall documentary made in 1976 which follows Muammar Gadaffi around as he goes about ruling Libya. [...] The interviewer asks Gadaffi to explain why he has sent Libyan troops to fight with the Palestinians against Israel, and why he has sent in Libyan agents to try and overthrow President Sadat of Egypt.

In response Gadaffi launches into an explanation that countries like Libya have a duty to intervene in other nations where the ordinary people are being oppressed by autocrats or oppressive governments — and help free them. That includes helping to liberate Egypt and Tunisia.

But it also means, he says, that politicians like him are justified in intervening in Northern Ireland to help the Provisional IRA. Because they are oppressed by the British government.

They too are victims.

What Gadaffi was arguing was a strange mirror image of the theory that Kouchner and the other ex-leftists in Europe were developing.

For they too were heading towards the idea of “armed intervention”.

In the 1980s the humanitarian movement was flourishing — above all in Afghanistan. But in Afghanistan the movement also came up against a big political problem.

Men and women from what was now called “the doctors’ movement” went in over the mountains to help the victims of the Soviet attacks. They were brave and daring and they saved the lives of many Afghan civilians.

But they also helped the Mujaheddin. Under the theory of the humanitarian movement this was fine. The Mujaheddin were resisting Soviet totalitarianism. They were victims fighting back, so it was morally right to help them.

But others didn’t see it that way.

Here is video of the trial in Kabul in 1983 of a French doctor who had been captured by the Afghan army.

He is called Philippe Augoyard. He worked for Aide Medicale Internationale — which was another version of MSF. The trial is absurd — and in the tradition of all communist show trials the doctor reads out a “confession” and admits to “working with the counter-revolutionary bandits”.

But there is also another part of his confession that was both true and embarrassing for all the ex-Marxists and Maoists in the humanitarian movement. The mujaheddin they were helping were backed, funded and armed by the Americans.

Which meant they were helping American global imperialism.

But then a group of French philosophers came to the rescue. They came up with a theory that said it wasn’t bad to work with American military power. In fact, if the humanitarians could harness America’s armed might, they could use it to change the world in a revolutionary way.

The philosophers were led by another ex-Maoist called Andre Glucksmann. He had turned against the left and had developed his own theory which he called “anti-totalitarianism”.

But he wasn’t alone. Glucksmann was part of a group of intellectuals that rose up in France in the late 1970s called the New Philosophers. They saw Bernard Kouchner as an action hero putting their ideas into practice. Another prominent one was the glamorous Bernard-Henri Levy.

Glucksmann put it in stark terms. Everything that oppressed people around the world he called “Auschwitz”. Even famines were called “Auschwitz”.

It was the ghost of the Second World War again.

Glucksmann then said that people with power had a right to intervene in other societies to prevent “Auschwitzes”. And that included using American power.

Maybe, he said, power exercised by the strong was not always oppression. If it was used decently it could liberate the oppressed.

And — Glucksmann said — this didn’t just mean medical help. It included “armed resistance”.

The American Precariat

Thursday, February 13th, 2014

Americans have lost faith in the American way, David Brooks finds:

Fertility rates, a good marker of confidence, are down. Even accounting for cyclical changes, people are less likely to voluntarily vacate a job in search of a better one. Only 46 percent of white Americans believe they have a good chance of improving their standard of living, the lowest levels in the history of the General Social Survey.

Peter Beinart wrote a fascinating piece for National Journal, arguing that Americans used to have much more faith in capitalism, a classless society, America’s role in the world and organized religion than people from Europe. But now American attitudes resemble European attitudes, and when you just look at young people, American exceptionalism is basically gone.

Fifty percent of Americans over 65 believe America stands above all others as the greatest nation on earth. Only 27 percent of Americans ages 18 to 29 believe that. As late as 2003, Americans were more likely than Italians, Brits and Germans to say the “free market economy is the best system on which to base the future of the world.” By 2010, they were slightly less likely than those Europeans to embrace capitalism.

Thirty years ago, a vast majority of Americans identified as members of the middle class. But since 1988, the percentage of Americans who call themselves members of the “have-nots” has doubled. Today’s young people are more likely to believe success is a matter of luck, not effort, than earlier generations.

What Bad Students Know that Good Economists Don’t

Thursday, February 13th, 2014

What bad students know that good economists don’t, Bryan Caplan says, is that the return to trying to get a degree is far lower than the return to successfully getting a degree:

For students in the bottom quartile of academic ability, paying a year’s tuition is almost as foolish as buying 10,000 lottery tickets.


College is a great investment for great students, a mediocre investment for mediocre students, and a bad investment for bad students.
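Caplan’s distinction can be sketched as a simple expected-value calculation — the return to enrolling is the completion-probability-weighted return to the degree. The numbers below are invented purely for illustration; they are not from Caplan’s post:

```python
# Hypothetical sketch of Caplan's point: the return to *trying* for a
# degree is the completion-weighted return, which for weak students
# falls far below the return to *finishing* one.
# All figures are made up for illustration.

def expected_premium(completion_prob, degree_premium, dropout_premium=0.0):
    """Expected earnings premium from enrolling, given the odds of finishing."""
    return completion_prob * degree_premium + (1 - completion_prob) * dropout_premium

# A strong student (90% completion odds) vs a bottom-quartile student (20%),
# assuming a hypothetical 70% earnings premium for graduates and roughly
# 10% for those who drop out with some credits.
strong = expected_premium(0.9, 0.70, 0.10)
weak = expected_premium(0.2, 0.70, 0.10)

print(f"strong student expected premium: {strong:.0%}")  # 64%
print(f"weak student expected premium:   {weak:.0%}")    # 22%
```

On these invented numbers, the strong student captures nearly the full graduate premium while the bottom-quartile student captures less than a third of it — the gap between the return to trying and the return to succeeding.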

Doctors without Borders

Thursday, February 13th, 2014

Out of Biafra came a new idea of how to save the world:

And the man who would create it was a young French doctor called Bernard Kouchner.

Kouchner had worked for the Red Cross in Biafra, but he had become disgusted by the Red Cross’ refusal to publicise the genocide created by the Nigerian government.

Just as the Red Cross hadn’t revealed the horrors they saw in the Nazi concentration camps during World War Two, because they insisted on being “neutral”.

Kouchner resigned and went back to Paris where he founded a new humanitarian organisation called Medecins Sans Frontieres. Being neutral, Kouchner said, really meant being complicit in the horror. And MSF would never be complicit. It was on the side of the innocent victims.

Kouchner — and many of the others who founded MSF — had been Marxist or Maoist revolutionaries, but they had become disenchanted with those utopian visions. And what they were doing was reworking the politics of third world liberation into a new form.

It was a type of liberation that they believed went beyond the politics of left and right and instead was about saving individuals from the horrors of totalitarianism whether that came from the right or the left.

They weren’t going to be neutral. They were going to take sides. But it was the side of the victims — because the victims were innocent.

Their first slogan was “There are no good and bad victims”.

And in 1979 Kouchner dramatically demonstrated this belief. He hired a ship to go and rescue the Vietnamese boat people who were fleeing the communist regime that now ruled Vietnam.

The left — and many liberals — were shocked. Because these were “bad victims”. Victims of the noble anti-imperialists who had defeated America.


Wednesday, February 12th, 2014

MOOCs represent one of the few places in which people are actually studying pedagogy at universities, and trying to improve it, John H. Cochrane notes:

Technology facilitates data collection. Unless you’re in the business, you would be very surprised to learn that college professors receive essentially no training in how to teach, no supervision of their classroom activities, and little feedback beyond a numerical rating at the end of the quarter. Though we are data-oriented empirical social scientists in our research lives, we do essentially no measurement or study of teaching effectiveness. The data collection and analysis that moocs provide may change that. Also, the presence (and necessity!) of a mooc staff means there are people whose job it is to keep up with those pedagogical lessons.

The Birth of Affirmative Action

Wednesday, February 12th, 2014

Proponents of affirmative action lump it in with other victories of the civil rights movement, Tanner Colby notes:

The phrase “affirmative action” first appeared in President Kennedy’s Executive Order 10925, which called for “affirmative action” to be taken to ensure people were employed “without regard to their race, creed, color, or national origin.” And Lyndon Johnson is usually given credit for enunciating the principles of affirmative action when he called for reparative economic justice for black America in his famous “To Fulfill These Rights” speech at Howard University, saying, “You do not take a person who, for years, has been hobbled by chains and liberate him, bring him up to the starting line of a race and say, ‘You are now free to compete with all the others,’ and still justly believe that you have been completely fair.”

But neither Kennedy nor Johnson ever implemented anything resembling what we now describe as affirmative action — i.e., quotas and set-asides — on the economic front, largely because the Democratic party was beholden to Big Labor, whose unions were adamantly opposed to quotas of any kind.


The way in which affirmative action was implemented speaks volumes about the motivations behind it. Nixon’s first task upon taking office was to resolve the impasse between civil rights leaders and skilled labor unions. In his first address to Congress, the president announced what became known as the Philadelphia Plan, which imposed goals and timetables for race-based hiring in the city’s unions. Prior to the Philadelphia Plan, under Kennedy and Johnson, affirmative action had always meant to take affirmative action to ensure discrimination was not taking place. Now, affirmative action meant imposing racial preferences and quotas. After its launch in Philadelphia, the program was rolled out in dozens of other cities nationwide. In the meantime, the White House was busy stuffing racial-preference policies into the federal bureaucracy wherever it could find room. In the spring of 1969, Nixon expanded affirmative action mandates from government procurement contracts and applied them to any institution that received any federal funds of any kind, which brought universities, research institutions — basically everyone — into the fold. Then Nixon issued Executive Order 11478, which called for affirmative action in all government employment, bringing huge numbers of black workers onto the federal payroll. Racial preferences, as we know them today, were now sewn into the fabric of the country.

And affirmative action “worked.” The most immediate and measurable impact was in government hiring. Blacks had always enjoyed relatively better employment prospects in the public sector, and affirmative action greatly enhanced that. By the early 1970s, 57 percent of black male college graduates and 72 percent of black female college graduates were employed in government positions. The private sector also went on a hiring binge. Impelled by the fear of more urban riots, the Fortune 500 launched a flotilla of affirmative action programs aimed at getting as many black hires in the door as quickly as possible. After decades of economic stagnation, between 1969 and 1972, total black income rose from $38.7 billion to $51.1 billion, a 32 percent jump in just three years.

Richard Nixon put more money in black wallets than JFK, LBJ, and MLK combined. While they never embraced Nixon, black Americans and their white liberal champions fell in love with quotas and set-asides. Many of the moderate and liberal Republicans in the White House had faith in affirmative action, too. Nixon’s Secretary of Labor George Shultz, upon his nomination, acknowledged that black unemployment was the most pressing labor issue in the country, and believed that the Philadelphia Plan would offer a useful model for cities across the country. Even some conservatives were on board. Prominent Nixon supporter William F. Buckley called for “a pro-Negro discrimination” in order to address the problem of unemployment.

The president, however, felt differently. Just weeks after calling the Philadelphia Plan “historic and critical” in Congress, Nixon jotted a note to domestic aide John Ehrlichman calling it “an almost hopeless holding action at best,” saying “let’s limit our public action and $ — to the least we can get away with.” Nixon’s primary concern in the Oval Office was to make his mark as a great foreign policy leader. Vietnam, China, the Soviet Union — these were his main preoccupations. He wanted the home front happy and humming along so that he’d be free to spend his political capital overseas, and his approach to the volatile issues of race reflected this. Staunch opposition to school busing and fair housing would appease suburban white voters. Affirmative action on the jobs front would appease middle-class black voters, end the riots threatening corporate America’s bottom line, and generally keep the racial question tamped down long enough for Nixon to win re-election in ’72.