How to fight a war in space (and get away with it)

July 11th, 2019

We depend on satellites, so knocking them out is becoming a military priority:

Today, much more civilian infrastructure relies on GPS and satellite communications, so attacks on them could lead to chaos. The military leans more heavily on satellites too: data and video feeds for armed UAVs, such as the Reaper drones that the US military has flying over Afghanistan and Iraq, are sent via satellite to their human operators. Intelligence and images are also collected by satellites and beamed to operations centers around the world. In the assessment of Chinese analysts, space is used for up to 90% of the US military’s intelligence.

[...]

Non-state actors, as well as more minor powers like North Korea and Iran, are also gaining access to weapons that can bloody the noses of much larger nations in space.

That doesn’t necessarily mean blowing up satellites. Less aggressive methods typically involve cyberattacks to interfere with the data flows between satellites and the ground stations. Some hackers are thought to have done this already.

For example, in 2008, a cyberattack on a ground station in Norway let someone cause 12 minutes of interference with NASA’s Landsat satellites. Later that year, hackers gained access to NASA’s Terra Earth observation satellite and did everything but issue commands.

[...]

There are strong suspicions that Russia has been jamming GPS signals during NATO exercises in Norway and Finland, and using similar tactics in other conflicts. “Russia is absolutely attacking space systems using jammers throughout the Ukraine,” says Weeden. Jamming is hard to distinguish from unintentional interference, making attribution difficult (the US military regularly jams its own communications satellites by accident). A recent report from the US Defense Intelligence Agency (DIA) claims that China is now developing jammers that can target a wide range of frequencies, including military communication bands. North Korea is believed to have bought jammers from Russia, and insurgent groups in Iraq and Afghanistan have been known to use them too.

Spoofing, meanwhile, puts out a fake signal that tricks GPS or other satellite receivers on the ground. Again, it’s surprisingly easy. In the summer of 2013, some students at the University of Texas used a briefcase-sized device to spoof a GPS signal and cause an $80 million private yacht to veer hundreds of meters off course in the Mediterranean. Their exploit wasn’t detected (they later announced it themselves).

[...]

There’s no evidence that anyone has yet used lasers to destroy targets in space, though aircraft-borne lasers have been tested against missiles within the atmosphere. The DIA report suggests that China will have a ground-based laser that can destroy a satellite’s optical sensors in low Earth orbit as early as next year (and that will, by the mid-2020s, be capable of damaging the structure of the satellite). Generally, the intention with lasers is not to blast a satellite out of the sky but to overwhelm its image sensor so it can’t photograph sensitive locations. The damage can be temporary, unless the laser is powerful enough to make it permanent.

Lasers need to be aimed very precisely, and to work well they require complex adaptive optics to make up for atmospheric disturbances, much as some large ground-based telescopes do. Yet there is some evidence, all unconfirmed and eminently deniable, that they are already being used. In 2006, US officials claimed that China was aiming lasers at US imaging satellites passing over Chinese territory.

[...]

In November 2016, the Commercial Spaceflight Center at AGI, an aerospace firm, noticed something strange. Shortly after it was launched, a Chinese satellite, supposedly designed to test high-performance solar cells and new propellants, began approaching a number of other Chinese communications satellites, staying in orbit near them before moving on. It got within a few miles of one—dangerously close in space terms. It paid visits to others in 2017 and 2018. Another Chinese satellite, launched last December, released a second object once it reached geostationary orbit; the object seemed to be under independent control.

The suspicion is that China is practicing for something known as a co-orbital attack, in which an object is sent into orbit near a target satellite, maneuvers itself into position, and then waits for an order. Such exercises could have less aggressive purposes—inspecting other satellites or repairing or disposing of them, perhaps. But co-orbiting might also be used to jam or snoop on enemy satellites’ data, or even to attack them physically.

Russia, too, has been playing about in geostationary orbit. One of its satellites, Olymp-K, began moving about regularly, at one point getting in between two Intelsat commercial satellites. Another time, it got so close to a French-Italian military satellite that the French government called it an act of “espionage.” The US, similarly, has tested a number of small satellites that can maneuver around in space.

Fortnite’s dominance is ebbing

July 10th, 2019

The Wall Street Journal takes a look at the man behind Fortnite:

By age 30, Epic Games Inc. founder and CEO Tim Sweeney had a couple of successful videogames under his belt and was starting to make real money.

“I had a Ferrari and a Lamborghini in the parking lot of my apartment,” he recalled. “People who hadn’t met me thought I must be a drug dealer.”

Today, Mr. Sweeney, at 48, is worth more than $7 billion, according to Bloomberg’s Billionaires Index. Epic was last valued at $15 billion, counting Walt Disney Co. and China’s Tencent Holdings PLC among its investors. And “Fortnite,” its blockbuster game, has racked up 250 million players and $3.9 billion in estimated revenue.

[...]

While the biggest U.S. videogame companies are clustered in Los Angeles, New York and the Bay Area, Epic is based in Cary, N.C., down the road from Raleigh. Mr. Sweeney said the location prevents Epic from being swayed by Silicon Valley groupthink.

[...]

Epic tried something different. It made “Fortnite” free and put it on every major device people use to play games — consoles, computers, smartphones and tablets. It put its own spin on a trendy new genre called Battle Royale, where a large group of players fight until only one person or squad is left standing. It constantly tweaked the game’s virtual world to give players something new to discover. And it took the popular shooter format and made it less violent and more playful, with colorful characters who compete with dance moves as well as firearms.

[...]

By erasing the barriers between players with different devices, Epic effectively turned “Fortnite” into a massive social network. Wearing headsets to talk to one another, groups of friends trade jokes and gossip while battling to survive.

[...]

Mr. Sweeney founded Epic in 1991 from his parents’ basement, at age 20, funding it with $4,000 in personal savings. He later dropped out of the University of Maryland a few credits shy of a mechanical-engineering degree. “I went from mowing lawns to being CEO of Epic,” said Mr. Sweeney, who got his diploma in 2018.

In its early years, the company had some success with a handful of games, including “Unreal Tournament” and “Gears of War,” that followed more traditional shoot-’em-up formats.

[...]

Today, “Fortnite’s” dominance is ebbing. Monthly revenue from sales of virtual perks such as costumes and dance moves for players’ avatars has fallen 56% since peaking at a record $372.2 million in December, according to Nielsen’s SuperData.

All the hand-wringing about getting into good colleges is probably a waste of time

July 10th, 2019

Scott Alexander looks at increasingly competitive college admissions and ends with this summary:

  1. There is strong evidence for more competition for places at top colleges now than 10, 50, or 100 years ago. There is medium evidence that this is also true for upper-to-medium-tier colleges. It is still easy to get into medium-to-lower-tier colleges.
  2. Until 1900, there was no competition for top colleges, medical schools, or law schools. A secular trend towards increasing admissions (increasing wealth + demand for skills?) plus two shocks from the GI Bill and the Vietnam draft led to a glut of applicants that overwhelmed schools and forced them to begin selecting applicants.
  3. Changes up until ten years ago were because of a growing applicant pool, after which the applicant pool (both domestic and international) stopped growing and started shrinking. Increased competition since ten years ago does not involve applicant pool size.
  4. Changes after ten years ago are less clear, but the most important factor is probably the ease of applying to more colleges. This causes an increase in applications-per-admission which is mostly illusory. However, part of it may be real if it means students are stratifying themselves by ability more effectively. There might also be increased competition just because students got themselves stuck in a high-competition equilibrium (ie an arms race), but in the absence of data this is just speculation.
  5. Medical schools are getting harder to get into, but law schools are getting easier to get into. There is no good data for graduate schools.
  6. All the hand-wringing about getting into good colleges is probably a waste of time, unless you are from a disadvantaged background. For most people, admission to a more selective college does not translate into a more lucrative career or a higher chance of admission to postgraduate education. There may be isolated exceptions at the very top, like for Supreme Court justices.

The trees are ready to cut

July 9th, 2019

A new federal program in the 1980s offered farmers money to reforest depleted land:

Pine trees appealed to Mr. George. He bought loblolly seedlings and pulled his pickup into a parking lot where hands-for-hire congregated.

“We figured we’d plant trees and come back and harvest it in 30 years and in the meantime go into town to make a living doing something else,” he said.

Three decades later the trees are ready to cut, and Mr. George is learning how many other Southerners had the same idea.

A glut of timber has piled up in the Southeast. There are far more ready-to-cut trees than the region’s mills can saw or pulp. The surfeit has crushed timber prices in Mississippi, Alabama and several other states.

The volume of Southern yellow pine, used in housing and to make paper, has surged in recent decades as farmers replaced cropland with trees and as clear-cut forests were replanted. By 2020, the amount of wood growing per acre of timberland in many counties will have more than quadrupled since 1980, U.S. forestry officials estimate.

It has been a big loser for some financial investors, among them the country’s largest pension fund. The California Public Employees’ Retirement System spent more than $2 billion on Southern timberland, and harvested trees at depressed prices to pay interest on money borrowed to buy. Calpers sold much of its land this summer at a loss. A spokeswoman for the pension fund declined to comment.

It has also been tough for the individuals and families who own much of the South’s forestland, and who had banked on its operating as a college fund or retirement account. The region has more than six million owners of at least 10 wooded acres, say academics and forestry consultants. Many of the owners were counting on forests as a long-term investment that could be replenished and passed on to heirs.

The marvel of advancing through life’s stations

July 9th, 2019

Much of our pop culture is made by and for folks who rate high on openness, the sort attracted to novelty — world travels, new drugs, and so forth — but not country music:

Emotional highlights of the low-openness life are going to be the type celebrated in “One Boy, One Girl”: the moment of falling in love with “the one,” the wedding day, the birth of one’s children (though I guess the song is about a surprising ultrasound). More generally, country music comes again and again to the marvel of advancing through life’s stations, and finds delight in experiencing traditional familial and social relationships from both sides. Once I was a girl with a mother, now I’m a mother with a girl. My parents took care of me, and now I take care of them. I was once a teenage boy threatened by a girl’s gun-loving father, now I’m a gun-loving father threatening my girl’s teenage boy. Etc. And country is full of assurances that the pleasures of simple, rooted, small-town lives of faith are deeper and more abiding than the alternatives.

(Hat tip to T. Greer.)

The top 20 most watched shows on Netflix include only a few “originals”

July 8th, 2019

I’m not sure I’d say that ‘Stranger Things’ helps illustrate the flaws in Netflix’s strategy:

Last year, Netflix shelled out more than $12 billion to purchase, license and produce content. This year, that figure will rise to $15 billion. It will spend $2.9 billion more on marketing. These costs come as Netflix is expected to report $20.2 billion in revenue in 2019, according to analysts surveyed by Refinitiv.

[...]

From 2012 to 2016, Netflix subscriptions in the U.S. grew about 5% each year and spiked by 10% in 2017. However, in 2018, domestic memberships only grew about 3.6%.

Internationally, Netflix has grown its subscriptions to nearly 81 million, up from just 1.86 million in 2011. Since 2015, the company has seen double-digit growth in this area. Altogether, the company has just under 150 million subscribers.
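Those two endpoints imply a remarkable average growth rate. Here’s a quick back-of-the-envelope check (my own sketch, not from the article; treating the figures as 2011 and 2019 snapshots is my assumption):

    # Implied compound annual growth of Netflix's international
    # subscribers, using the figures quoted above (1.86M in 2011,
    # ~81M by 2019). Treating these as clean yearly endpoints is
    # my assumption, not something the article states.
    start_subs = 1.86e6   # 2011
    end_subs = 81e6       # 2019
    years = 2019 - 2011

    cagr = (end_subs / start_subs) ** (1 / years) - 1
    print(f"Implied average growth: {cagr:.0%} per year")  # ~60% per year

An average of roughly 60% a year over that stretch suggests the early international expansion ran far above the double-digit rates of recent years.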

Also, of the top 20 most watched shows on Netflix, six are “originals,” but only one of those is actually owned by the company, according to data from Nielsen and Pachter.

[Chart: Top 20 Shows on Netflix in 2018 by Minutes]

I knew I was odd, but I guess I don’t watch any of Netflix’s top shows.

They suddenly find themselves in a society that is disgustingly self-centered

July 8th, 2019

T. Greer’s life’s short course has brought him to many places, bound him to sundry peoples, and urged him to varied trades:

Yet out of the lands I’ve lived in and roles I’ve donned, none blaze in my memory like the two years I spent as a missionary for the Church of Jesus Christ. It is a shame that few who review my resume ask about that time; more interesting experiences were packed into those few mission years than in the rest of the lot combined.

To be a missionary is to confront the uncanny. You cannot serve without sounding out the weird bottoms of the human heart. But if missionary life forces you to come full contact with mankind at its most desperate and unsettled, so too it asks you to witness mankind at its most awesome and ethereal. Guilt’s blackest pit, fear’s sharpest grip, rage at its bluntest, hope at its highest, love at its longest and fullest — to serve as a missionary is to be thrust in the midst of the full human panorama, with all of its foulness and all of its glory. I doubt I shall ever experience anything like it again. I cannot value its worth. I learned more of humanity’s crooked timbers in the two years I lived as missionary than in all the years before and all the years since.

Attempting to communicate what missionary life is like to those who have not experienced it themselves is difficult. You’ll notice my opening paragraph restricted itself to broad generalities; it is hard to move past that without cheapening or trivializing the experience.

Yet there is one segment of society that seems to get it. In the years since my service, I have been surprised to find that the one group of people who consistently understands my experience are soldiers. In many ways a Mormon missionary is asked to live something like a soldier: like a soldier, missionaries go through an intense ‘boot camp’ experience meant to reshape their sense of self and duty; are asked to dress and act in a manner that erodes individuality; are ‘deployed’ in far-flung places that leave them isolated from their old friends, family members, and community; are pushed into contact with the full gamut of human personality in their new locales; live within a rigid hierarchy, follow an amazing number of arcane rules and regulations, and hold themselves to insane standards of diligence, discipline, and obedience; and spend years doing a job which is not so much a job as it is an all-encompassing way of life.

The last point is the one most salient to this essay. It is part of the reason both many ex-missionaries (known as “RMs” or “Returned Missionaries” in Mormon lingo) and many veterans have such trouble adapting to life when they return to their homes. This comparison first occurred to me several years ago, when I read a Facebook comment left by a man who had served as a Marine mechanic in Afghanistan. He was commenting on an interview Sebastian Junger had done to promote his book, Tribe: On Homecoming and Belonging.

I really enjoyed the audiobook of Tribe, by the way, but audiobooks don’t lend themselves to excerpts.

Many RMs report a sense of loss and aimlessness upon returning to “the real world.” They suddenly find themselves in a society that is disgustingly self-centered, a world where there is nothing to sacrifice or plan for except one’s own advancement. For the past two years there was a purpose behind everything they did, a purpose whose scope far transcended their individual concerns. They had given everything — “heart, might, mind and strength” — to this work, and now they are expected to go back to racking up rewards points on their credit card? How could they?

The soldier understands this question. He understands how strange and wonderful life can be when every decision is imbued with terrible meaning. Things which have no particular valence in the civilian sphere are a matter of life or death for the soldier. Mundane aspects of mundane jobs (say, those of the former vehicle mechanic) take on special meaning. A direct line can be drawn between everything he does — laying out a sandbag, turning off a light, operating a radio — and the ability of his team to accomplish their mission. Choice of food, training, and exercise before combat can make the difference between the life and death of a soldier’s comrades in combat. For good or for ill, it is through small decisions like these that great things come to pass.

In this sense the life of the soldier is not really his own. His decisions ripple. His mistakes multiply. The mission demands strict attention to things that are of no consequence in normal life. So much depends on him, yet so little is for him.

This sounds like a burden. In some ways it is. But in other ways it is a gift. Now, and for as long as he is part of the force, even his smallest actions have a significance he could never otherwise hope for. He does not live a normal life. He lives with power and purpose — that rare power and purpose given only to those whose lives are not their own.

[...]

This sort of life is not restricted to soldiers and missionaries. Terrorists obviously experience a similar sort of commitment. So do dissidents, revolutionaries, reformers, abolitionists, and so forth. What matters here is conviction and cause. If the cause is great enough, and the need for service so pressing, then many of the other things — obedience, discipline, exhaustion, consecration, hierarchy, and separation from ordinary life — soon follow. It is no accident that great transformations in history are sprung from groups of people living in just this way. Humanity is both at its most heroic and its most horrifying when questing for transcendence.

What do you call a female defender?

July 7th, 2019

The French language has masculine and feminine genders. Somehow this has become confusing when referring to female soccer players and managers. What do you call a female defender?

The language offers at least three options: the masculine form défenseur, the feminine form défenseuse, or another feminine form défenseure, which is pronounced exactly the same as the masculine. And if you follow French coverage of the tournament, you might see all three.

In Le Monde, you would read about a défenseuse or sélectionneuse (the word used for national team managers). A dispatch from Agence France-Presse, meanwhile, will say défenseure and sélectionneure. Television networks TF1 and Canal+, which are broadcasting the tournament here, often use one form in graphics on screen, but let commentators like Mr. Lizarazu employ another during live broadcasts.

Traditionally, you use the masculine form unless you want to explicitly refer to a female. I have no idea where this third, quasi-feminine gender came from.

When it comes to questions of proper usage, the country has its own ancient authority, the 384-year-old Académie Française. Its 35 members are known as the Immortals. They are charged with sporadically producing the definitive dictionary on usage and cutting through the babble of a constantly evolving tongue. They are even issued swords.

But time moves slowly at the Académie. In 1984, as more French speakers adapted their speech to reflect a growing number of women in the workplace, the Académie felt compelled to weigh in on the topic: It ruled out any changes, preferring to stick to the masculine form, except in cases where usage had already taken root. It was important to remember, the Académie argued at the time, that there was no connection between what it called “natural gender” and “grammatical gender.”

Have you confidence in me to trust me with your watch until tomorrow?

July 7th, 2019

The term “confidence man” appears to have been coined in 1849 during the trial of one William Thompson in New York:

A debonair thief, Thompson had a knack for ingratiating himself with complete strangers on the street and then asking, “Have you confidence in me to trust me with your watch until tomorrow?” Many did, which cost them their expensive timepieces. The much-publicized trial and the odd crime at its heart piqued the interest of Herman Melville, who reworked it eight years later for his under-appreciated high-concept final novel, The Confidence-Man. After boarding a Mississippi steamboat on April Fool’s Day, its Mephistophelean titular character adopts a succession of guises with evocative backstories and surnames (Goodman, Truman, Noble) with the aim of getting one over on fellow passengers. Spurred by self-interest and reflective of society at large, the dupes place unquestioning trust in tokens such as attire and profession, making them as complicit in the con as the perpetrator. In The Adman’s Dilemma, which used literary and cultural waypoints to chart the evolution of the common snake-oil salesman into the modern man of advertising, Paul Rutherford bleakly described Melville’s novel as “a study in deception and even a self-deception so complete that there was no possibility of redemption”.

These contests will be byzantine

July 6th, 2019

Suez Deconstructed aims to be a historically rooted how-to manual for statecraft:

The book seeks to convey the experience of “masterminding solutions to giant international crises,” Zelikow writes, by providing “a sort of simulator that can help condition readers just a little more” before confronting their own crises. It sets up that simulation by scrambling the storytelling. First, Suez Deconstructed divides the crisis into three phases: September 1955 through July 1956, July 1956 through October 1956, and October through November of that year. In doing so, the authors hope to show that “most large problems of statecraft are not one-act plays” but instead begin as one problem and then mutate into new ones. This was the case with Suez, which began with Egypt purchasing Soviet arms and which became a multipronged battle over an international waterway. Second, the book proceeds through these phases not chronologically but by recounting the perspectives of each of the six participants: the United States, the Soviet Union, the United Kingdom, France, Israel, and Egypt. The goal — and the effect — is to deprive the reader of omniscience, creating a “lifelike” compartmentalization of knowledge and perspective.

Zelikow encourages readers to assess Suez by examining three kinds of judgments made by the statesmen during the crisis: value judgments (“What do we care about?”), reality judgments (“What is really going on?”), and action judgments (“What can we do about it?”). Asking these questions, Zelikow argues, is the best means of evaluating the protagonists. Through this structure, Suez Deconstructed hopes to provide “a personal sense, even a checklist, of matters to consider” when confronting questions of statecraft.

The book begins this task by describing the world of 1956. The Cold War’s impermeable borders had not yet solidified, and the superpowers sought the favor of the so-called Third World. Among non-aligned nations, Cold War ideology mattered less than anti-colonialism. In the Middle East, its champion was Egyptian President Gamal Abdel Nasser, who wielded influence by exploiting several festering regional disputes. He rhetorically — and, the French suspected, materially — supported the Algerian revolt against French rule. He competed with Iraq, Egypt’s pro-British and anti-communist rival. He threatened to destroy the State of Israel. And through Egypt ran the Suez Canal, which Europe depended on for oil.

Egypt’s conflict with Israel precipitated the Suez crisis. In September 1955, Nasser struck a stunning and mammoth arms deal with the Soviet Union. The infusion of weaponry threatened Israel’s strategic superiority, undermined Iraq, and vaulted the Soviet Union into the Middle East. From that point forward, Zelikow argues, the question for all the countries in the crisis (aside from Egypt, of course) became “What to do next about Nasser?”

Israel responded with dread, while Britain, France, and the United States alternated between confrontation and conciliation. Eventually, the United States abandoned Nasser, but he doubled down by nationalizing the Suez Canal. This was too much for France. Hoping to unseat Nasser to halt Egyptian aid to Algeria, it concocted a plan with Israel and, eventually, Britain for Israel to invade Egypt and for British and French troops to seize the Canal Zone on the pretense of separating Israeli and Egyptian forces. The attack began just before the U.S. presidential election and alongside a revolution in Hungary that triggered a Soviet invasion. The book highlights the Eisenhower administration’s anger at the tripartite plot. Despite having turned on Nasser, Eisenhower seethed at not having been told about the assault, bitterly opposed it, and threatened to ruin the British and French economies by withholding oil shipments.

[...]

Even so, it is possible to extract several key lessons about statecraft. Chief among them is the extent to which policymakers are informed as much by honor and will as by interest. Britain and France, for example, ultimately joined forces to invade Egypt, but they did so for different reasons and with different degrees of resolve. As Zelikow notes, in the mid-1950s, France, recently beaten in Indochina, seemed beleaguered, while Britain “still seemed big,” boasting a “far-flung network of bases and influence.” But appearances could deceive. France was led by men who “had been heroes of the resistance” during World War II and were determined to restore their country’s honor. Outwardly strong, meanwhile, Britain suffered from a gnawing sense of exhaustion.

This imbalance of morale would shape each nation’s actions during the crisis and contribute to Suez’s strange outcome. France’s Socialist-led coalition, Zelikow writes, was “driven by ideas and historical experience.” It possessed a vision of restoring French pride and a dedication to defeating what it saw as “antimodern throwbacks” in Algeria backed by a Mussolini-like figure in Cairo. It was thus undeterred when complications arose and “more creative in [its] policy designs.” But because Washington, Moscow, and Cairo all judged France by its seeming lack of material power and its recent defeats alone, they underestimated its will.

British leaders, equally eager to topple Nasser and more capable of acting independently than the French, nevertheless struggled to overcome their nation’s fatigue. Initially behind the government’s desire to punish Nasser, the British public, as the book details, “[lost] its appetite for military adventure” as diplomacy commenced. British Prime Minister Anthony Eden had long argued for the need to reconcile with anti-colonialism and with Nasser, its chief Middle Eastern apostle. The British public, tired of war, could not long support Eden’s reversal. London ultimately joined French-Israeli strikes not so much out of conviction as to save face — avoiding the embarrassment of abandoning the demands it made of Nasser.

The second lesson that emerges is the centrality of relationships between statesmen, which drove events just as much as, if not more than, money, power, and ideas. One of the central drivers of the war, in fact, was the bond between French and Israeli statesmen. France’s Socialist leaders had all fought in the French Resistance during World War II. They sympathized with Israel, feeling morally obligated to prevent another massacre of the Jewish people and, as one author in the book describes, viewing Israel’s struggle “as a sort of sequel” to the fight against fascism. The Israelis, many of whom were former guerilla fighters themselves, easily related to the French and appreciated their support. Paris and Jerusalem grew closer for practical reasons as well: France sought Israel’s aid in addressing the Algerian revolt. But the relationship extended beyond material interest. As one chapter relates, during French-Israeli negotiations regarding the attack on Egypt, “there was an emotional connection between [the French and Israeli leaders] that documents do not easily capture.” The affection between French and Israeli officials repeatedly propelled the war planning forward.

If intimate ties catalyzed the invasion of Egypt, so, too, did combustible ones — none more so than the rancor between Eden and Dulles. Eden detested Dulles as moralistic, legalistic, and tedious (as related in Suez Deconstructed, he once described Dulles with the quip, “Dull, Duller, Dulles”). Their mutual disregard plagued U.S.-British cooperation. At key moments, Eden believed, Dulles would intervene with a maladroit statement that would harm planning or undermine British leverage. In early October 1956, for example, Dulles stated that there were “no teeth” to the diplomatic plan that the powers had been devising and that when it came to issues of “so-called colonialism,” the United States would “play a somewhat independent role.” For Eden, feeling isolated, this statement “was in some ways the final blow,” spurring him to join the French-Israeli initiative.

The statesmen of the Suez Crisis were haunted by history as much as they were guided by pride and personality — another striking theme that surfaces in Suez Deconstructed. Zelikow begins his overview of the world in 1956 by stating that “[t]hey were a wartime generation,” nations that had “lived through conclusive, cataclysmic wars, some more than one.” Those experiences permeated their approaches to the crisis. French and British leaders could not help but see Nasser as a 1930s potentate.

[...]

It is a rare quality in world leaders to be able to make historical analogies without fully embracing them, thereby becoming trapped.

[...]

The wars of the coming decades, however, are likely to look more like Suez than Berlin or Iraq. They will likely be multi-state conflicts, in which states of every size and strength play major roles. These contests will be byzantine. Like Suez, they will be local skirmishes and global crises simultaneously. They will feature webs of overlapping rivalries and alliances (and rivalries within alliances), strategic and ideological considerations at multiple levels, and high-stakes signaling amid confusion and disinformation.

How would fifty guineas for a night’s work suit you?

July 5th, 2019

I was listening to Stephen Fry’s narration of “The Adventure of the Engineer’s Thumb,” when the young (unemployed) engineer at the center of the story was offered 50 guineas for a night’s work:

The guinea was a coin of approximately one quarter ounce of gold that was minted in Great Britain between 1663 and 1814. The name came from the Guinea region in West Africa, where much of the gold used to make the coins originated. It was the first English machine-struck gold coin, originally worth one pound sterling, equal to twenty shillings, but rises in the price of gold relative to silver caused the value of the guinea to increase, at times to as high as thirty shillings. From 1717 to 1816, its value was officially fixed at twenty-one shillings.

When Britain adopted the gold standard the guinea became a colloquial or specialised term. Although the coin itself no longer circulated, the term guinea survived as a unit of account in some fields. Notable usages included professional fees (medical, legal, etc.), which were often invoiced in guineas, as well as horse racing, greyhound racing, and the sale of rams. In each case a guinea meant an amount of one pound and one shilling (21 shillings), or one pound and five pence (£1.05) in decimalised currency.

A guinea being 21 shillings, 50 guineas came to £52 10s, or £52.50. One pound in 1892 has inflated to well over 100 pounds today, so the offer was worth over 6,000 pounds in 2019 money.
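For anyone who wants to redo the arithmetic, here’s a minimal sketch. The 21-shillings-to-the-guinea rate comes from the excerpt above; the inflation multiplier is my rough assumption based on long-run UK price indices:

    # Convert 50 guineas (1892) to approximate 2019 pounds.
    # 1 guinea = 21 shillings and 1 pound = 20 shillings, per the
    # excerpt above. The 120x inflation multiplier is an assumed
    # round figure; long-run UK price indices put 1892 -> 2019
    # somewhere in the 110-125x range.
    SHILLINGS_PER_POUND = 20
    SHILLINGS_PER_GUINEA = 21
    INFLATION_MULTIPLIER = 120  # assumed, not from the source

    fee_guineas = 50
    fee_1892 = fee_guineas * SHILLINGS_PER_GUINEA / SHILLINGS_PER_POUND
    fee_2019 = fee_1892 * INFLATION_MULTIPLIER

    print(f"£{fee_1892:.2f} in 1892 ≈ £{fee_2019:,.0f} in 2019")
    # £52.50 in 1892 ≈ £6,300 in 2019

Which squares with the “over 6,000 pounds” estimate above.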

Happy Secession Day!

July 4th, 2019

I almost forgot to wish everyone a happy Secession Day:

Brexit 1776

Conspiracies are normal and common

July 4th, 2019

Moldbug’s Cathedral is not a conspiracy, Anomaly UK explains:

It makes more sense to say that the Cathedral is the opposite of a conspiracy. It is what you get when there are no conspiracies.

The word “conspiracy” is basically clickbait, but I’m going to stick with it anyway. Be aware, though, that I don’t mean anything really weird by it. The management of any company is a conspiracy, in that the members discuss plans in private and only publicise them if it is advantageous for them to do so. [Smug Misha] pointed out on twitter that HBO were able to keep the secret of the ending of Game of Thrones for months, despite hundreds of people needing to know it to make the episode.

In this sense, conspiracies are normal and common, though not quite as common as they used to be. That was my argument in the earlier piece: that as recently as a decade or so ago, a political party (or at least a faction within it) could agree an agenda in private and make confidential plans to pursue that agenda. That capability seems, since then, to have been lost. The key debates between leading politicians of the same party over what goals should be pursued and what means should be employed to pursue them are carried out in public.

I stand by that point. But on reflection I think it’s a much bigger deal. This is a recent development in a much longer trend. As I wrote yesterday in a comment, the Cathedral is defined by its lack of secrecy. The distinctive role of the universities and the press is to inform the public, and to do so with authoritative status. It is not defined by its ideology. However, its ideological direction is a predictable consequence of its transparency. A public competition for admiration causes a movement to the extreme: the most attractive position is the one just slightly more extreme than the others. This is the “holiness spiral”.

The breakdown of conspiracy, then, is not just a phenomenon of the last decade that has given us Trump and so on. It is the root cause of the political direction of the last few centuries.

What is the cause of the breakdown of conspiracy? If I had to guess and point at one thing it would be Protestantism. That, after all, was largely a move to remove the secrecy from religion. Once democracy got going, that removed much more secrecy. But it’s still an ongoing process: democracy until recently was mediated by non-public formal and informal institutions. The opening of the guilds can be seen as part of the same trend. Many of the things I have written about in the past may be related — the decline in personal loyalty, for example.

That produces a feedback loop — a belief in equality and openness brings more decision-making into the public sphere, which leads to holiness spirals, which leads to ever increasing belief in equality and openness. But it seems to me that the openness comes first, and the ideology results from it. The Cathedral is a sociological construct, not an ideological one.

[...]

However, the actual powers of the state were immediately in the hands of the civil service and political parties, who were not transparent, and exerted a moderating influence. There were self-perpetuating groups of powerful people — conspiracies — who could limit the choices open to the electorate and therefore slow the long-term political trends driven by the Cathedral. Today, as a result of internal democracy in political parties (particularly in the UK, a very recent development), and of unmediated channels of communication, those conspiracies have been broken open. A politician today is fundamentally in the same business as a journalist or a professor — he is competing for status by means of public statements. The internal debates of political parties are now public debates. In the past, in order to become a politician, other politicians had to accept you. Now you can be a TV star or a newspaper columnist today, and be a politician tomorrow. The incumbents can’t quietly agree to stop you, any more than they could quietly agree to have pizza for lunch.

Lounging about his sitting-room in his dressing-gown, reading the agony column

July 3rd, 2019

I was listening to Stephen Fry’s narration of “The Adventure of the Engineer’s Thumb,” when Watson finds Holmes lounging about his sitting-room in his dressing-gown, reading the agony column of The Times and smoking his before-breakfast pipe.

But what’s an agony column? A simple Google search gives this definition:

a column in a newspaper or magazine offering advice on personal problems to readers who write in.

Sherlock Holmes reads the advice column? Well, not so fast. Wikipedia briefly notes that it can refer to a column of a newspaper that contains advertisements of missing relatives and friends. I had no idea such a thing existed, but I can certainly see why that would draw the attention of the famous consulting detective.

The agony column did in fact originate with The Times. I found a collection of columns from 1800-1870:

Agony Column

I suppose a modern Holmes would check the missed connections on Craigslist.

This broken gene may explain humans’ endurance

July 2nd, 2019

A “broken” gene may explain humans’ endurance:

Some clues came 20 years ago, when Ajit Varki, a physician-scientist at the University of California, San Diego (UCSD), and colleagues unearthed one of the first genetic differences between humans and chimps: a gene called CMP-Neu5Ac Hydroxylase (CMAH). Other primates have this gene, which helps build a sugar molecule called sialic acid that sits on cell surfaces. But humans have a broken version of CMAH, so they don’t make this sugar, the team reported. Since then, Varki has implicated sialic acid in inflammation and resistance to malaria.

In the new study, Varki’s team explored whether CMAH has any impact on muscles and running ability, in part because mice bred with a muscular dystrophy–like syndrome get worse when they don’t have this gene. UCSD graduate student Jonathan Okerblom put mice with a normal and broken version of CMAH (akin to the human version) on small treadmills. UCSD physiologist Ellen Breen closely examined their leg muscles before and after running different distances, some after 2 weeks and some after 1 month.

After training, the mice with the human version of the CMAH gene ran 12% faster and 20% longer than the other mice, the team reports today in the Proceedings of the Royal Society B. “Nike would pay a lot of money” for that kind of increase in performance in their sponsored athletes, Lieberman says.

The team discovered that the “humanized” mice had more tiny blood vessels branching into their leg muscles, and — even when isolated in a dish — the muscles kept contracting much longer than those from the other mice. The humanlike mouse muscles used oxygen more efficiently as well. But the researchers still have no idea how the sugar molecule affects endurance, as it serves many functions in a cell.