Shelby Foote on the Confederate Battle Flag

Wednesday, June 24th, 2015

Shelby Foote explains his thoughts on the Confederate battle flag:

The flag is a symbol my great grandfather fought under and in defense of. I am for flying it anywhere anybody wants to fly it. I do know perfectly well what pain it causes my black friends, but I think that pain is not necessary if they would read the confederate constitution and knew what the confederacy really stood for. This country has two grievous sins on its hands. One of them is slavery — whether we’ll ever be cured of it, I don’t know. The other one is emancipation — they told 4 million people, you’re free, hit the road, and they drifted back into a form of peonage that in some ways is worse than slavery. These things have got to be understood before they’re condemned. They’re condemned on the face of it because they take that flag to represent what those yahoos represent as — in their protest against civil rights things. But the people who knew what that flag really stood for should have stopped those yahoos from using it as a symbol of what they stood for. But we didn’t — and now you had this problem of the confederate flag being identified as sort of a roughneck thing, which it is not….

I don’t object to any individual hiding from history, but I do object to their hiding history from me. And that’s what seems to me to be going on here. There are a lot of terrible things that happened in American history, but we don’t wipe ’em out of the history books; we don’t destroy their symbols; we don’t forget they ever happened; we don’t resent anybody bringing it up. The confederate flag has been placed in that position that’s unique with an American symbol. I’ve never known one to be so despised.

(Hat tip to Foseti.)

Why an X rather than a cross?

Wednesday, June 24th, 2015

So, why is the Confederate battle flag based on an X rather than a cross?

William Miles’s disappointment with the Stars and Bars went beyond his strong ideological objections to the Stars and Stripes. He had hoped that the Confederacy would adopt his own design for a national flag — the pattern that later generations mistakenly and ironically insisted on calling the Stars and Bars.… Charles Moise, a self-described “southerner of Jewish persuasion,” wrote Miles and other members of the South Carolina delegation asking that “the symbol of a particular religion” not be made the symbol of the nation.

In adapting his flag to take these criticisms into account, Miles removed the palmetto tree and crescent and substituted a diagonal cross for the St. George’s cross. Recalling (and sketching) his proposal a few months later, Miles explained that the diagonal cross was preferable because “it avoided the religious objection about the cross (from the Jews & many Protestant sects), because it did not stand out so conspicuously as if the cross had been placed upright thus.” … If Miles had not been eager to conciliate southern Jews, the traditional Latin (or St. George’s) cross would have adorned his flag.

Precisely the Wrong Stuff

Wednesday, June 24th, 2015

A key principle of human factors is that it is the unspoken rules of who can say what and when that often lead to crucial things going unsaid:

If we don’t like to think that doctors make mistakes, doctors like to think about it even less.

One of the biggest problems identified was the unwritten but entrenched hierarchy of hospitals. Bromiley, who has worked with experts from various “safety-critical” industries, including the military, told me that the hospital is by far the most hierarchical workplace he has come across. At the top of the tree are consultant surgeons, the rock stars of the hospital corridors: highly driven, competitive, mostly male and not the kind who enjoy confessing to uncertainty. Then come anaesthetists, often quieter of disposition and warier of risk. Further down are nurses, valued for their hard work but not for their brains.

A key principle of human factors is that it is the unspoken rules of who can say what and when that often lead to crucial things going unsaid. The most painful part of the transcript of Flight 173’s final hour is the flight engineer’s interjections. You can sense his concern about the fuel situation, and his hesitancy about expressing it. Fifteen minutes is gonna – really run us low on fuel here. Perhaps he’s assuming the captain and his officers know the urgency of their predicament. Perhaps he’s worried about being seen to speak out of turn. Whatever it is, he doesn’t say what he feels: This is an emergency. We need to get this plane on the ground – NOW. Similarly, the nurses who could see the urgency of Elaine Bromiley’s condition didn’t feel able to tell the doctors that they were on the verge of committing a grave error. So they made tentative suggestions that were easy to ignore.

John Pickles, an ENT surgeon and former medical director of Luton and Dunstable Hospital NHS Foundation Trust, told me that usually when an operation is carried out on the wrong part of the body (a class of error known as “wrong-site surgery”), there is at least one person in the room who knows or suspects a mistake is being made. He recalled the case of a patient in South Wales who had the wrong kidney removed. A (female) medical student had pointed out the impending error but the two (male) surgeons ignored her and carried on. The patient, who was 70 years old, was left with one diseased kidney, and died six weeks later. In other cases nobody spoke up at all.

The pioneers of crew resource management knew that merely warning pilots about fixation error was not sufficient. It is too powerful an instinct to be repressed entirely even when you know about it. The answer lay with the crew. Because even the most experienced captains are prone to human error, the entire aircraft crew needed to act as a collective intelligence, vigilant for problems and responsible for solutions. “It’s the people at the edge of the room, standing back from the situation, who can often see it best,” Bromiley said to me.

He recalled the case of British Midland Flight 92, which had just taken off for its flight from London to Belfast on 8 January 1989 when the pilots discovered one of the engines was on fire. Following procedure, they shut it down. Over the PA, the captain explained that because of a problem with the right engine he was making an emergency landing. The cabin staff, who – like the passengers, but unlike the cockpit crew – could see smoke and flames coming from the left engine, didn’t pass this information on to the cockpit. After the pilots shut down the only functioning engine, British Midland 92 crashed into the embankment of the M1 motorway near Kegworth in Leicestershire. Forty-seven of the 126 people on board died; 74 sustained serious injuries.

The airline industry pinpointed a major block to communication among members of the cockpit crew: the captain. The rank of captain retained the aura of imperial command it inherited from the military and from the early days of flying, when pilots such as Chuck Yeager, immortalised in Tom Wolfe’s book The Right Stuff, were celebrated as audacious mavericks. The pioneers of CRM realised that, in the age of mass air travel, charismatic heroism was precisely the wrong stuff. The industry needed team players. The captain’s aura was a force field, stopping other crew members from speaking their mind at critical moments. It wasn’t just the instrument panel that had to change: it was the culture of the cockpit.

Long before they started doing more good than harm, surgeons were revered as men of genius. In the 18th and 19th centuries, surgical superstars performed operations in packed amphitheatres before hushed, admiring audiences. A great surgeon was a virtuoso performer with the hands of a god. His nurses and assistants were present merely to follow the great man’s commands, much as the planets in an orrery revolve around the sun. The advent of medical science gave this myth a grounding in reality: at least we can be confident that doctors today make people better, most of the time. But it reinforced a mystique that makes doctors, and especially surgeons (who, of course, still perform in operating theatres), hard to question, by either patients or staff.

Better safety involves bringing doctors off their pedestal or, rather, inviting them to step down from it. Modern medicine is more reliant than ever on teamwork. As operations become more complex, more people and procedures are involved. Operating rooms swarm with people; various specialists pronounce judgement or perform procedures, and then leave. Surgical teams are often comprised of individuals who know each other only vaguely, if at all. It is a simple but unavoidable truth that the more people are involved in something, and the less well they know each other, the more likely it is that someone will make an error.

The most significant human factors innovation in health care in recent years is surprisingly prosaic: the checklist. Borrowed from the airline industry, the checklist is a standardised list of procedures to follow for every operation, and for every eventuality. Checklists compensate for the inbuilt tendency of human beings under stress to forget or ignore what is important, including the most basic things (the first item on one aviation checklist is FLY THE AIRPLANE). They also empower the people at the edges of the room: before the operation and at key moments during it, the whole team goes through each point in turn, including emergencies, which gives a cue to more reserved members of the team to speak up.

Checklists are most effective in an atmosphere of informality and openness: it has been shown that simply using the first name of the other team members improves communication, and that giving people a chance to say something at the beginning of a case makes them more likely to speak up during the operation itself.

Naturally, this spirit of openness entails a diminishment of the surgeon’s power – or a dispersal of that power around the team. Some doctors don’t mind this – indeed, they welcome it, because they realise that their team can save them from career-ruining mistakes. Others are more resistant, particularly those who treasure their independence; mavericks don’t do checklists. Even those who see themselves as evolved team players may overestimate their openness. J Bryan Sexton, a psychologist at Johns Hopkins University in the US, has conducted global surveys of operating-room staff. He found that while 64 per cent of surgeons rated their operations as having high levels of teamwork, only 28 per cent of nurses agreed.

One Man, One Vote, One Time

Tuesday, June 23rd, 2015

Ian Smith flew for the RAF in WWII and then returned to Rhodesia and eventually became Prime Minister at an interesting time in African history:

Countries were becoming “independent.”

The post-independence period in all the sub-Saharan countries followed a strangely predictable pattern. Smith called it the “one man, one vote, one time” pattern. In addition to the rise of (generally Communist) dictators for life, the independence movements were also characterized by the rape and slaughter of any remaining white Africans (although it’s supposedly important to protect minorities, protecting whites in Africa is apparently affirmatively bad), massive reductions in economic output and the general decay or outright disappearance of any semblance of civilization. Nevertheless, the Americans and the British (and, of course, the Russians – purely coincidentally, I’m sure) continued to push for independence.

“Freedom” came to Ghana (1957), Nigeria (1960), Congo (1960), Uganda, Tanzania, Kenya, Zanzibar (later part of Tanzania), with largely predictable results.

The writing was on the wall for Rhodesia by the early ’60s. They met to draft a new Constitution in hopes the British would grant them independence. Obviously, given the success of other independence movements, the ruling elite (mostly white) were reluctant to follow the fully-democratic route. In drafting the constitution, the leaders met with and obtained approval from over 600 tribal chiefs.

The history of the various constitutions and negotiations is too long (and frankly too depressing) to repeat. Read it, if you can stomach it.

The British perpetually pushed for full elections. The Rhodesians would have elections and consultations with tribal leaders. Always, the elections were considered unrepresentative by the British. Mugabe would convince his supporters to boycott, for example, and the British would be up in arms. Always, the constitutions stated explicitly that the political system was dedicated to “unimpeded progress to majority rule.” It was never enough. Full elections were nearly impossible anyway, in a country that had never taken a census, in which most citizens had no birth certificates, and most citizens were illiterate.

The British essentially outsourced their foreign policy in Africa to the OAU. That organization became a collective of third-rate communist dictators. A fun group to bargain with.

The international community exerted increasing pressure on Rhodesia. The UN labelled Smith a threat to world peace – apparently not wishing to plunge his country into chaos and destruction is a threat to world peace. The UN blockaded the country, even though they simultaneously didn’t recognize its independence. The UN thereby (according to its own logic) blockaded part of the United Kingdom.

Early on, the blockade was largely ineffective. In fact, it gave a boost to agricultural production (at this point Rhodesia actually started exporting food) and industry. However, the blockade put Rhodesia at the mercy of South Africa and Portugal (via its colonies, particularly Mozambique).

In a few years, the Portuguese government was overthrown. Portugal soon abandoned its colonies and “free” Mozambique declared war on Rhodesia (ah, freedom).

At this point, Rhodesia was entirely reliant on South Africa for things like access to the sea and bullets. Smith viewed apartheid as “unprincipled and totally indefensible” and an entirely untenable long-term solution.

At this point, South Africa basically threw Rhodesia under the bus, in hopes that doing so would appease the international community. If white South Africans weren’t getting it so badly, you’d almost think they deserved what they were getting for their treatment of Rhodesia.

One can’t help but wonder if South Africa and the international community had their own reasons to focus first on Rhodesia. Was the acceptance of ultimate black rule and the fact that blacks and whites had many equal rights too threatening to South Africa? Was the fact that Rhodesia was so successful too threatening to the international community? For whatever reason, Rhodesia had to be dealt with.

(Smith blames the Communists. Indeed, de-colonization resulted in communist governments in most of Africa. If you read this blog regularly, you may not be surprised that US and British foreign policies were entirely dedicated to gaining additional African territory for communists. Smith, however, was unable to believe that the US and Britain were promoting communist interests. If communists were trying to take over all of Africa (and ultimately the resource-rich South Africa), taking Rhodesia first would be a necessary step. This, like much of post-war US and British foreign policy, is probably just a coincidence though. It’s worth noting that when Smith visited the US or Britain and spoke to politicians, the politicians were always shocked by Smith’s arguments. Apparently the State Department and Foreign Office were passing communist propaganda on Rhodesia through to politicians. Another coincidence, I’m sure.)

As the situation became more dire, whites started leaving Rhodesia in larger numbers (economic output began declining accordingly). Smith stayed.

A group of black leaders emerged. The leaders were generally tribal leaders (full democracy in these countries at these times just meant putting the largest tribe in control of the country). Leaders at the time included Nkomo, Sithole, Mugabe, and Muzorewa. The latter was the first Prime Minister after Smith, but wasn’t ruthless enough (and hostile enough to whites, one suspects) to keep Mugabe out.

Fully free elections (to the surprise of the British apparently, but not anyone who was actually paying attention) turned out to be competitions to see who could terrorize the largest number of citizens. The prize, under this enlightened method for choosing a leader, would obviously eventually be Mugabe’s.

Smith stayed in Rhodesia until he was stripped of his citizenship, at which point he left for South Africa. He died in 2007. The Zimbabwean “government” seized his land in 2012.

Fixation Error

Tuesday, June 23rd, 2015

In a crisis, the brain’s perceptual field narrows and shortens:

We become seized by a tremendous compulsion to fix on the problem we think we can solve, and quickly lose awareness of almost everything else. It’s an affliction to which even the most skilled and experienced professionals are prone.

Imagine a stalled car, stuck on a level crossing as a distant train bears down on it. Panic rising, the driver starts and restarts the engine rather than getting out of the car and running. The three doctors bent over Elaine Bromiley’s throat were intent on finding a way to intubate, just as the three pilots in the cockpit of United 173 were determined to establish the status of the landing gear. In neither case did these seasoned professionals look up and register the oncoming train: in the case of Elaine, her oxygen levels, and in the case of United 173, its fuel levels.

When people are fixating, their perception of time becomes highly erratic; minutes stretch and elongate. One of the most striking aspects of the transcript of United 173’s last minutes is the way the captain seems to be under the impression that he has plenty of time, right up until the moment the engines cut out. It’s not that he didn’t have the correct information; it’s that his brain was running to a different clock. Similarly, it’s not that the doctors weren’t aware that Elaine Bromiley’s oxygen supply was a problem; it’s that their sense of how long she had been without it was distorted. When Harmer interviewed him, the anaesthetic consultant confessed that he had no idea how much time had passed.

Imagine, for a moment, being one of those doctors. You have a patient who has stopped breathing. The clock is ticking. The standard procedure isn’t working, but you have employed it dozens of times before and you know it works. Each of the senior colleagues around you is experiencing the same difficulty, which reassures you. You cling to the belief that, between the three of you, you will solve the problem, if it is soluble at all. You vaguely register nurses coming into the room and saying things but you don’t really hear what they say. Perhaps it occurs to you to step back from the patient and demand a rethink, but you don’t want your peers to see you as panicky or naive. So you focus on the one thing you can control: the procedure. You repeat it over and over, hoping for a different result. It is madness, but it is comprehensible madness.

Fiction and the Strategist

Monday, June 22nd, 2015

When the moment of decision arrives the time for study and reflection has ended, T. Greer reminds us:

Decisions made under pressure often rely on heuristics, assumptions, and interpretive frames formed long before crisis arrives. Some of these are created through personal experience; others are gifts of genetic inheritance. But a large part of our inner model of the world and its workings comes from what we have read. This is why the strategist should read. Books allow strategists to learn the painful lessons of defeat without the sort of destruction that usually attends it, provide the conceptual tools needed to make sense of a complex world, and help strategists spot patterns and trends that they might be able to leverage to their own benefit. But — and this is an important but — this is only true if the lessons, ideas, and narratives incorporated into their model of the world are themselves accurate depictions of reality. The fruits of false assumptions about human motivation, war, or politics incorporated in the worldview of the strategist are disaster.

The implication of all this is that one should choose carefully what one reads. This is especially true with works of fiction, whose events and characters are decided by the demands of narrative art, not the connections between cause and effect operative in the real world.

T. Greer is particularly concerned about Ender’s Game and its high place on a recent strategy reading list:

Ender’s Game is not a realistic depiction of politics and war. It was never designed to be. This is because its subject is not strategy, but ethics. Orson Scott Card believes that morality is not found in consequences of our actions, but in the intentions that lead us to act in the first place.

[...]

That is my case against Ender’s Game in a nut-shell, though I can understand why some of its other themes might make it popular with professional strategists. This is particularly true for the folks who first read the book shortly after it was first published. In a culture enamored with “disruptive innovation” and obsessed with “thinking outside of the box” it is easy to forget that these concepts are relatively new ideas. Ender’s “the enemy’s gate is down” preceded both by two decades. A strategist should have something of a maverick mentality, and Ender’s Game seems like a perfect case study in the art.

The problem is that it is nothing of the sort.

[...]

It is important to remember here the reason Card needs Ender to be a tactical genius is not because he wants to teach us enduring lessons about zero gravity combat tactics, but because the premise of his novel calls for an innocent but unparalleled genius to be its protagonist. The Battle School does not exist to teach readers universal principles of strategy, politics, or leadership, but to demonstrate the in-universe brilliance of Ender Wiggin. This point can be generalized to all of the ideas, events, and characters of the novel — indeed, to all novels. Storylines are created by the author to manipulate the emotions and perceptions of the audience. This is true for even simple plot points like Ender’s maverick tag-line, “the enemy’s gate is down”.

Read the whole thing.

Crew Resource Management

Monday, June 22nd, 2015

The story of United Airlines Flight 173 is known to every airline pilot, because it is studied by every trainee:

Shortly after 5pm on the clear-skied evening of 28 December 1978, United Airlines Flight 173 began its descent to Portland International Airport. The plane had taken off from New York that morning and, after making a pre-scheduled stop in Denver, it was reaching its final destination with 189 souls on board.

As the landing gear was lowered there was a loud thump and the aircraft yawed slightly to the right. The flight crew noticed that one of the green landing gear indicator lights wasn’t lit. The captain radioed air-traffic control at Portland, telling them, “We’ve got a gear problem.”

Portland’s control agreed that the plane would orbit the airport while the captain, first officer and second officer worked out what to do. The passengers were told that there would be a delay. The cabin crew began to carry out checks. The flight attendants were instructed to check the visual indicators on the wings, which suggested that the landing gear was locked down.

Nearly half an hour after the captain told Portland about the landing gear problem, he contacted the United Airlines maintenance centre, informing the staff there that he intended to continue the holding pattern for another 15 or 20 minutes. He reported 7,000lbs of fuel aboard, down from 13,000 when he had first spoken to Portland.

United’s controller sounded a mild note of concern. “You estimate that you’ll make a landing about five minutes past the hour. Is that OK?” The captain’s response was ostentatiously relaxed: “Yeah, that’s a good ball park. I’m not gonna hurry the girls [the cabin crew].” United 173 had 30 minutes of fuel left.

The captain and his two officers continued to debate the question of whether the landing gear was down. The captain asked his crew how much fuel they would have left after another 15 minutes of flying. The flight engineer responded, “Not enough. Fifteen minutes is gonna – really run us low on fuel here.” At 18.07 one of the plane’s engines lost power. Six minutes later, the flight engineer reported that both engines were gone. The captain, as if waking up to the situation for the first time, said: “They’re all going. We can’t make Troutdale [a small airport on the approach route to Portland].” “We can’t make anything,” said the first officer. At 18.13, the first officer sent the plane’s final message to air-traffic control: “We’re going down. We’re not going to be able to make the airport.”

[...]

It’s a miracle that only ten people were killed after Flight 173 crashed into an area of woodland in suburban Portland; but the crash needn’t have happened at all. Had the captain attempted to land, the plane would have touched down safely: the subsequent investigation found that the landing gear had been down the whole time. But the captain and officers of Flight 173 became so engrossed in one puzzle that they became blind to the more urgent problem: fuel shortage. This is called “fixation error”.

This led the industry to create a set of principles and procedures known as CRM, or Crew Resource Management:

CRM was born of a realisation that in the late 20th century the most frequent cause of crashes wasn’t technical failure, but human error. Its roots go back to the Second World War, when the US army assigned a psychologist called Alphonse Chapanis to investigate a curious phenomenon. B-17 bombers kept crashing on to the runway on landing, even though there was no apparent mechanical problem with the planes. Rather than blaming the pilots, Chapanis pointed to the instrument panel. The lever to control the landing gear and the lever that operated the flaps were next to each other. Pilots, weary after long flights, were confusing the two, retracting the wheels and causing the crash. Chapanis suggested attaching a wheel to the handle of the landing lever and a triangle to the flaps lever, making each easily distinguishable by touch alone. Problem solved.

Chapanis had recognised that human beings’ propensity to make mistakes when they are tired is much harder to fix than the design of levers. His deeper insight was that people have limits, and many of their mistakes are predictable effects of those limits. That is why the architects of CRM defined its aim as the reduction of human error, rather than pilot error. Rather than trying to hire or train perfect pilots, it is better to design systems that minimise or mitigate inevitable human mistakes.

In the 1990s, a cognitive psychologist called James Reason turned this principle into a theory of how accidents happen in large organisations. When a space shuttle crashes or an oil tanker leaks, our instinct is to look for a single, “root” cause. This often leads us to the operator: the person who triggered the disaster by pulling the wrong lever or entering the wrong line of code. But the operator is at the end of a long chain of decisions, some of them taken that day, some taken long in the past, all contributing to the accident; like achievements, accidents are a team effort. Reason proposed a “Swiss cheese” model: accidents happen when a concatenation of factors occurs in unpredictable ways, like the holes in a block of cheese lining up.

James Reason’s underlying message was that because human beings are fallible and will always make operational mistakes, it is the responsibility of managers to ensure that those mistakes are anticipated, planned for and learned from. Without seeking to do away altogether with the notion of culpability, he shifted the emphasis from the flaws of individuals to flaws in organisation, from the person to the environment, and from blame to learning.

The science of “human factors” now permeates the aviation industry.

Confederate Flag

Sunday, June 21st, 2015

Confederate National Flag

Son of the South explains the origin and evolution of the Confederate Flag:

With the formation of the Confederate States of America in early 1861, one of the first orders of business was to create a flag for the new nation. The Committee on the Flag and Seal was formed, and given this task. There were basically two schools of thought in creating the flag. One was to create something that looked very similar to the existing United States flag. The second school of thought was to create a flag very different from the existing United States flag. At the time, there were still feelings of allegiance to the original US flag, and popular opinion was lining up in support of a flag that was similar to the familiar United States flag. Such a flag was created and proposed. This flag is pictured at [left].

The proposed flag resembled the United States flag, but replaced the “stripes” with 3 “bars”. The flag had 7 stars, one for each state that was part of the confederacy at the time. This flag was dubbed the “Stars and Bars”. The United States flag had been known as the “Stars and Stripes”. This flag had replaced the stripes with bars, so it was logical to call it the “Stars and Bars”. Note that today people often refer to the Confederate battle flag (pictured at the top of the page, on the left of the photograph) as the “Stars and Bars”. Strictly speaking, this is not a correct description of the Confederate battle flag.

Those who preferred a very different flag from that of the United States proposed several different flags, one of which resembled what would later become the Confederate battle flag.

In March of 1861, those who supported a flag similar to that of the United States prevailed, and the “Stars and Bars” became the official National flag of the Confederacy. The flag’s first official use was at the inauguration of Jefferson Davis on March 4, 1861.

As more states joined the Confederacy, more stars were added to the flag. Eventually the Stars and Bars had a total of 13 stars. The thirteen states represented were: South Carolina, Mississippi, Florida, Alabama, Georgia, Louisiana, Texas, Virginia, Arkansas, North Carolina, Tennessee, Missouri, and Kentucky. Note that Missouri and Kentucky never officially seceded, but were slave states, and did have some confederate state governments, although they were in exile for the most part.

With this matter resolved, the participants proceeded with the prosecution of the War. While there were several smaller skirmishes in early 1861, the first major battle of the year, and the war for that matter, was the Battle of Bull Run, which was fought on July 21, 1861. There were over 4,800 men killed or wounded on the two sides.

At the Battle of Bull Run, there were a number of Confederate regiments that used the Confederate national flag as their battle flag. While having a National flag that looks similar to the old United States flag might have been comforting to the people of the newly formed Confederacy, it turned out to be a bad idea in battle. In battle, the purpose of a flag is to help identify who is who: who is on your side, and who is on the other side. From this perspective, having two sides fight under flags that are similar in appearance is a very bad idea. It actually did cause some degree of confusion at the Battle of Bull Run.

The confusion caused by the similarity in the flags was of great concern to Confederate General P.G.T. Beauregard. He suggested that the Confederate national flag be changed to something completely different, to avoid confusion in battle in the future. This idea was rejected by the Confederate government. Beauregard then suggested that there should be two flags. One, the National flag, and the second one being a battle flag, with the battle flag being completely different from the United States flag.

Confederate Battle Flag

Beauregard was successful in having a separate battle flag created. The one chosen was actually similar to one of the flags that had earlier been proposed to be the National flag. The battle flag would be a blue X on a red field. As a battle flag, the flag would be square. The flag had 13 stars, for the thirteen states in the Confederacy. This flag was first used in battle in December 1861. Being a new flag, different from the United States flag, it gained widespread acceptance and allegiance among the Confederate soldiers, and population in general. The flag is referred to as the Confederate battle flag, and as the Battle Flag of the Army of Northern Virginia.

It should be noted, however, that there were many different confederate battle flags used at different times, and by different regiments in the war.

The National flag of the confederacy is almost forgotten today; the battle flag of the Army of Northern Virginia has become the symbol most associated with the confederacy, and it remains a controversial and divisive symbol to this day.

Doctors Make Mistakes

Sunday, June 21st, 2015

Airline pilot Martin Bromiley’s wife died during routine surgery to fix a deviated septum that was causing sinus trouble, and he assumed that the next step would be an investigation:

“You get an independent team in. You investigate. You learn.” When he asked the head of the intensive-care unit about this, the doctor shook his head. “That’s not how we do things in the health service. Not unless somebody complains or sues.”

This doctor was privately sympathetic to Bromiley’s question, however. Shortly after Elaine’s death, he got in touch with Bromiley to say that he had asked a friend of his, Professor Michael Harmer, an eminent anaesthetist, if he would be prepared to lead an investigation. Harmer had said yes. After Bromiley gained the hospital’s consent, Harmer set to work, interviewing everyone involved, from the consultants to the nursing team.

[...]

The truth was that Elaine had died at the hands of highly accomplished, technically proficient doctors with 60 years of experience between them, in a fine, well-equipped modern hospital, because of a simple error.

[...]

Doctors make mistakes. A woman undergoing surgery for an ectopic pregnancy had the wrong tube removed, rendering her infertile. Another had her Fallopian tube removed instead of her appendix. A cardiac operation was performed on the wrong patient. Some 69 patients left surgery with needles, swabs or, in one case, a glove left inside them. These are just some of the incidents that occurred in English hospitals in the six months between April and September 2013.

[...]

The National Audit Office estimates that there may be 34,000 deaths annually as a result of patient safety incidents. When he was chief medical officer, Liam Donaldson warned that the chances of dying as a result of a clinical error in hospital are 33,000 times higher than dying in an air crash. This isn’t a problem peculiar to our health-care system. In the United States, errors are estimated to be the third most common cause of deaths in health care, after cancer and heart disease. Globally, there is a one-in-ten chance that, owing to preventable mistakes or oversights, a patient will leave a hospital in a worse state than when she entered it.

[...]

Within two minutes of Elaine Bromiley’s operation beginning, the anaesthetic consultant realised that the patient’s airway had collapsed, hindering her supply of oxygen. After repeatedly trying and failing to ventilate the airway, he issued a call for help. An ENT surgeon answered it, as did another senior anaesthetist. The three consultants struggled to get a tube down Elaine’s throat, a procedure known as intubation, but encountered a mysterious blockage. So they tried again.

“Can’t ventilate, can’t intubate” is a recognised emergency in anaesthetic practice, for which there are published guidelines. The first instruction in one version of the guidelines is this: “Do not waste time trying to intubate when the priority is oxygenation.” Deprived of oxygen, our brains soon find it hard to function, our hearts to beat: ten minutes is about the longest we can suffer such a shortage before irreversible damage is done. The recommended solution is to carry out a form of tracheotomy, puncturing the windpipe to allow air in. Do not waste time trying to intubate.

Twenty minutes after Elaine’s airway collapsed, the doctors were still trying to get a tube down her throat. The monitors indicated that her brain was starved of oxygen and her heart had slowed to a dangerously low rate. Her face was blue. Her arms periodically shot up to her face, a sign that brain tissue is being irritated. Yet the doctors ploughed on. After 25 minutes, they had finally intubated their patient. But that was too late for Elaine.

If the severity of Elaine’s condition in those crucial minutes wasn’t registered by the doctors, it was noticed by others in the room. The nurses saw Elaine’s erratic breathing; the blueness of her face; the swings in her blood pressure; the lowness of her oxygen levels and the convulsions of her body. They later said that they had been surprised when the doctors didn’t attempt to gain access to the trachea, but felt unable to broach the subject. Not directly, anyway: one nurse located a tracheotomy set and presented it to the doctors, who didn’t even acknowledge her. Another nurse phoned the intensive-care unit and told them to prepare a bed immediately. When she informed the doctors of her action they looked at her, she said later, as if she was overreacting.

Reading this, you may be incredulous and angry that the doctors could have been so stupid, or so careless. But when the person closest to this event, Martin Bromiley, read Harmer’s report, he responded very differently. His main sensation wasn’t shock, or fury. It was recognition.

Any pilot knows that smart people can make dumb mistakes that get people killed.

Not Likely to Start a Race War

Saturday, June 20th, 2015

The Charleston mass-murderer’s alleged manifesto is short — and not especially angry, although it does have its moments:

I was not raised in a racist home or environment. Living in the South, almost every White person has a small amount of racial awareness, simply because of the numbers of negroes in this part of the country. But it is a superficial awareness. Growing up, in school, the White and black kids would make racial jokes toward each other, but all they were were jokes. Me and White friends would sometimes watch things that would make us think that “blacks were the real racists” and other elementary thoughts like this, but there was no real understanding behind it.

The event that truly awakened me was the Trayvon Martin case. I kept hearing and seeing his name, and eventually I decided to look him up. I read the Wikipedia article and right away I was unable to understand what the big deal was. It was obvious that Zimmerman was in the right. But more importantly this prompted me to type in the words “black on White crime” into Google, and I have never been the same since that day. The first website I came to was the Council of Conservative Citizens. There were pages upon pages of these brutal black on White murders. I was in disbelief. At this moment I realized that something was very wrong. How could the news be blowing up the Trayvon Martin case while hundreds of these black on White murders got ignored?

From this point I researched deeper and found out what was happening in Europe. I saw that the same things were happening in England and France, and in all the other Western European countries. Again I found myself in disbelief. As an American we are taught to accept living in the melting pot, and black and other minorities have just as much right to be here as we do, since we are all immigrants. But Europe is the homeland of White people, and in many ways the situation is even worse there.

There’s more, if you’re curious.

Not Seeking Visionary Experiences

Saturday, June 20th, 2015

Psychedelic researcher Dr. James Fadiman is now looking at microdosing:

Microdosing refers to taking extremely small doses of psychedelics, so small that the effects usually associated with such drugs are not evident or are “sub-perceptual,” while going about one’s daily activities.

[...]

He explained that, beginning in 2010, he had been doing a study of microdosing. Since research with LSD remains banned, he couldn’t do it in a lab, but had instead relied on a network of volunteers who administered their own doses and reported back with the results. The subjects kept logs of their doses and daily routines, and sent them via email to Fadiman. The results were quite interesting, he said.

“Micro-dosing turns out to be a totally different world,” he explained. “As someone said, the rocks don’t glow, even a little bit. But what many people are reporting is, at the end of the day, they say, ‘That was a really good day.’ You know, that kind of day when things kind of work. You’re doing a task you normally couldn’t stand for two hours, but you do it for three or four. You eat properly. Maybe you do one more set of reps. Just a good day. That seems to be what we’re discovering.”

Study participants functioned normally in their work and relationships, Fadiman said, but with increased focus, emotional clarity, and creativity. One physician reported that microdosing put him “in touch with a deep place of ease and beauty.” A singer reported being better able to hear and channel music.

In his book, a user named “Madeline” offered this report: “Microdosing of 10 to 20 micrograms (of LSD) allow me to increase my focus, open my heart, and achieve breakthrough results while remaining integrated within my routine. My wit, response time, and visual and mental acuity seem greater than normal on it.”

These results are not yet peer-reviewed, but they are suggestive.

“I just got a report from someone who did this for six weeks,” Fadiman said. “And his question to me was, ‘Is there any reason to stop?’”

It isn’t just Fadiman acolytes who are singing the praises of microdosing. One 65-year-old Sonoma County, California, small businesswoman who had never heard of the man told AlterNet she microdosed because it made her feel better and more effective.

“I started doing it in 1980, when I lived in San Francisco and one of my roommates had some mushrooms in the fridge,” said the woman, who asked to remain anonymous. “I just took a tiny sliver and found that it made me alert and energized all day. I wasn’t high or anything; it was more like having a coffee buzz that lasted all day long.”

This woman gave up on microdosing when her roommate’s supply of ‘shrooms ran out, but she has taken it up again recently.

“I’m very busy these days and I’m 65, so I get tired, and maybe just a little bit surly sometimes,” she admitted. “So when a friend brought over some chocolate mushrooms, I decided to try it again. It makes my days so much better! My mood improves, my energy level is up, and I feel like my synapses are really popping. I get things done, and I don’t notice any side-effects whatsoever.”

She’s not seeking visionary experiences, just a way to get through the day, she said.

In an in-depth post on the High Existence blog, Martijn Schirp examined the phenomenon in some detail, as well as describing his own adventure in microdosing:

“On a beautiful morning in Amsterdam, I grabbed my vial of LSD, diluted down with half high grade vodka and half distilled water, and told my friend to trust me and open his mouth. While semi-carefully measuring the droplets for his microdose, I told him to whirl it around in his mouth for a few minutes before swallowing the neuro-chemical concoction. I quickly followed suit,” Schirp wrote. “We had one of the best walking conversations of our lives.”

James Oroc, author of Tryptamine Palace: 5-MeO-DMT and the Sonoran Desert Toad, exposed another realm where microdosing is gaining popularity. In a Multidisciplinary Association for Psychedelic Studies monograph titled “Psychedelics and Extreme Sports,” Oroc extolled the virtues of microdosing for athletes. Taking low-dose psychedelics improved “cognitive functioning, emotional balance, and physical stamina,” he wrote.

“Virtually all athletes who learn to use LSD at psycholytic [micro] dosages believe that the use of these compounds improves both their stamina and their abilities,” Oroc continued. “According to the combined reports of 40 years of use by the extreme sports underground, LSD can increase your reflex time to lightning speed, improve your balance to the point of perfection, increase your concentration until you experience ‘tunnel vision,’ and make you impervious to weakness or pain. LSD’s effects in these regards amongst the extreme-sport community are in fact legendary, universal, and without dispute.”

Even the father of LSD, Albert Hofmann, seems to have been a fan. In his book, Fadiman notes that Hofmann microdosed himself well into old age and quotes him as saying LSD “would have gone on to be used as Ritalin if it hadn’t been so harshly scheduled.”

Psychonauts, take note. Microdosing isn’t going to take you to another astral plane, but it may help you get through the day.

The idea that LSD would improve athletic performance sounds crazy, but Dock Ellis did pitch a no-hitter on a not-so-microdose.

Educational Romanticism & Economic Development

Friday, June 19th, 2015

What’s the best way to interpret Hausmann’s Education Myth?

Yes, “years of schooling” is a poor proxy for educational outcomes. But it captures very well the policy instrument that governments can actually control easily — building large boxes and herding children into them like cattle. That investment has obviously not caused a convergence in test scores between developed and developing countries.

There’s no evidence that education, however measured, promotes the sort of growth rates that result in eventual convergence with the rich countries. There’s plenty of evidence to suggest education has contributed to the positive but relatively low growth rates, which have been insufficient for convergence. Economic growth research implicitly assumes that the rapid convergence of East Asia with the western countries is ‘normal’ and the slow growth of other non-western countries ‘abnormal’. But maybe the former is the anomaly.

NYPD Chief Bratton

Thursday, June 18th, 2015

From The Guardian, an article about a public official telling the truth:

“We have a significant population gap among African American males because so many of them have spent time in jail and, as such, we can’t hire them,” Bratton said in an interview with the Guardian.

So far, Steve Sailer says, William Bratton has remained strikingly invulnerable to social justice hate mobs over the decades:

This is probably because he has presided over crime declines as top cop in Boston, New York, and Los Angeles. Those are cities where important citizens live. They are not podunk burghs like Ferguson. So leftist Bill de Blasio hired him to run the NYPD again.

Bratton is both a slick politician and rather more of a straight shooter than we’re used to these days.

Sailer cites this blunt, wise 2006 interview:

Q. [Frum] So you know a little bit about our city? You know about our problems? A 27-per-cent increase in the number of homicides from 1995 to today. A Boxing Day slaying where a 15-year-old innocent bystander was gunned down during a gang shootout on a major shopping street. Can I tell you — it would be nice if you were our police chief.

A. [Bratton] Well, thank you. Tell me, the gang violence that you are experiencing, what is the racial or ethnic background of the gangs?

Q. That’s a refreshingly blunt question. Some say it may be as high as 80 per cent Jamaican. But no one knows for sure, because people here don’t like to talk about that.

A. You need to talk about it. It’s all part of the issue. If it’s Jamaican gangs that are committing the crimes, well then, go after the Jamaican gangs. And don’t be afraid to go after them because they’re black. That’s the last thing you need to be concerned with.

Q. Oh boy, I can see the complaints coming in already. You have to understand the climate here. The major local daily in Toronto, the Toronto Star, says it doesn’t believe in “gratuitously” labelling people by ethnic origin.

A. Well, that really helps identify who they are, doesn’t it? The next step will be to refuse to allow the police to identify people by their race or ethnic origin. That type of societal consciousness really goes to extremes.

Outgunned

Thursday, June 18th, 2015

The Marines have not adopted a new sniper rifle in 14 years — and this has had consequences:

It was the summer of 2011 in southern Helmand province, Afghanistan, and mission after mission, Sgt. Ben McCullar of Third Battalion, Second Marines, would insert with his eight-man sniper team into the berms and dunes north of the volatile town of Musa Qala.

Sometimes they would fire at a group of enemy fighters, sometimes the enemy would fire at them first, but almost immediately, McCullar explained, their team would be pinned down by machine guns that outranged almost all of their sniper rifles.

“They’d set up at the max range of their [machine guns] and start firing at us,” McCullar said. “We’d take it until we could call in [close air support] or artillery.”

The story of McCullar and his snipers is not an isolated one. For 14 years, Marine snipers have suffered setbacks in combat that, they say, have been caused by outdated equipment and the inability of the Marine Corps to provide a sniper rifle that can perform at the needed range.

They trace the problem to the relatively small Marine sniper community that doesn’t advocate effectively for itself because it is made up of junior service members and has a high turnover rate. Additionally, snipers say that the Marine Corps’ weapons procurement process is part of an entrenched bureaucracy resistant to change.

The Marines have been using a Remington 700 bolt-action rifle chambered in 7.62×51mm NATO (.308), which they call the M40.

The Army has been using a Remington 700 bolt-action rifle chambered in 7.62×51mm NATO (.308), which they call the M24 — but there’s a crucial difference:

The primary difference between the Army and the USMC rifles is that while the USMC M40 variants use the short-action version of the Remington 700/40x (which is designed for shorter cartridges such as the .222 and .223 Remington, and the .243 and .308 Winchester), the Army M24 uses the Remington 700 Long Action. The long action of the M24 is designed for full-length cartridges, such as the .30-06 Springfield, and magnum cartridges such as the 7 mm Remington Magnum and .300 Winchester Magnum, but shorter cartridges such as the 7.62×51mm NATO (the military version of the .308 Winchester) can also be used. The U.S. Army’s use of the long action was the result of an original intention to chamber the M24 for .300 Winchester Magnum. Despite the fact that the M24 comes fitted with a 7.62×51mm NATO barrel upon issue, retaining the longer action allows them to reconfigure the rifle in the larger—longer-range—calibers if necessary (which has been the case during the longer engagement distances during Operation Enduring Freedom).

So, the Army has switched to the more powerful .300 Winchester Magnum, or even the .338 Lapua Magnum, which profoundly increases the rifle’s effective range in the hands of a skilled shooter. Its latest version of the M24 has been dubbed the M2010 Enhanced Sniper Rifle.

XM2010 with Dark Earth Suppressor

Popular Sculpture

Wednesday, June 17th, 2015

Just as we have popular music and popular cinema, we also have popular sculpture:

Much of it is what we would usually call ornaments. Some of it is minis — i.e., miniatures. Minis are sculpture for the masses in the same way as pop is music for the masses. (If you are trying to explain this to someone suspicious with an arts degree you can call them Kleinplastik, which means almost the same thing but is German and therefore a valid intellectual construct.)

Warhammer 40k Space Marine Minis

Every mini is linked to and feeds back into an overarching fiction, so each mini must encapsulate and even move forward a bit of the story. It has to have continuity with what came before.

In the 40k ’verse older technology is always better — and most of it is lost. R&D is forbidden. To make something better you have to actually find an archive and mine it for already existing designs. This makes sense of the insane tech levels in 40k, especially in human culture. Old and new, recently discovered and long forgotten all mixed together almost incoherently. If game designers want to invent something new they just have something old discovered. This means designers get to invent what they want, so long as it makes artistic sense. It feeds back into the power of the fiction because everything is old and decayed and no-one understands it.

Stories need inherent technology to talk about the future so that we understand it now. Star Trek has post-relativistic speeds, gravity control, matter reorganisation, and AI. A society with those things would look and act like nothing we can recognise. So the tech is used but the implications are ignored.

40k gets around this by inventing an incoherent culture. Its brokenness adds emotional and aesthetic power rather than taking it. One of my favourite things about 40k lore is the backward technology.

If it’s made by a major corporation then it will be affected by what the market wants. Space Marine models outnumber other human models because everyone wants to play Space Marines. The company semi-accidentally hit something that jams right into the adolescent male mind. It does so in an interesting way. It’s like a pop hit of popular sculpture. (If we look back at the fiction constraint, everything developed by the company for this setting needs to live inside a universe that justifies the existence of Space Marines. In the same way, everything developed for the Star Wars universe needs to live inside a universe that can justify Jedi Knights and star fighters.)

It will be affected by what the company thinks it can persuade people to want and by what makes the most money. The company has worked out it has a higher profit margin on very large very expensive kits. Now every army has one. (This means that in the fiction of the game, every imagined culture suddenly has access to unique giant robots that they are assumed to have always had, but that they just didn’t mention until now.)

In terms of sculpture, the Games Workshop mega-kit provides an entirely new aesthetic territory to work on. A small figure of a hero has the main job of persuading you that a very tiny thing can represent a very powerful or potent personality or being. A lot of what very small minis do is shape their form to persuade you that they are larger and have more mass, both more physical and more dramatic weight, than they actually have.

A very large figure is trying to be beautiful or interesting in a different way. It has real size, real mass, and space for enormous amounts of detail. A big part of its job is organising the arrangement of its detail and surfaces in a way that seems both pleasing and correct for its scale. Another job it has is to relate its enormous size to the imagined world whose active participants are usually represented by much smaller things. It must feel as if it can meaningfully interact with these tiny things, as if it represents something made by the same culture and belongs to the same world.