Few even had wallets

Tuesday, January 22nd, 2019

A century ago the market economy was important, but a lot of economic activity still took place within the family, Peter Frost notes, especially in rural areas:

In the late 1980s I interviewed elderly French Canadians in a small rural community, and I was struck by how little the market economy mattered in their youth. At that time none of them had bank accounts. Few even had wallets. Coins and bills were kept at home in a small wooden box for special occasions, like the yearly trip to Quebec City. The rest of the time these people grew their own food and made their own clothes and furniture. Farms did produce food for local markets, but this surplus was of secondary importance and could just as often be bartered with neighbors or donated to the priest. Farm families were also large and typically brought together many people from three or four generations.

By the 1980s things had changed considerably. Many of my interviewees were living in circumstances of extreme social isolation, with only occasional visits from family or friends. Even among middle-aged members of the community there were many who lived alone, either because of divorce or because of relationships that had never gone anywhere. This is a major cultural change, and it has occurred in the absence of any underlying changes to the way people think and feel.

Whenever I raise this point I’m usually told we’re nonetheless better off today, not only materially but also in terms of enjoying varied and more interesting lives. That argument made sense back in the 1980s — in the wake of a long economic boom that had doubled incomes, increased life expectancy, and improved our lives through labor-saving devices, new forms of home entertainment, and stimulating interactions with a broader range of people.

Today, that argument seems less convincing. Median income has stagnated since the 1970s and may even be decreasing if we adjust for monetization of activities, like child care, that were previously nonmonetized. Life expectancy too has leveled off and is now declining in the U.S. because of rising suicide rates among people who live alone. Finally, cultural diversity is having the perverse effect of reducing intellectual diversity. More and more topics are considered off-limits in public discourse and, increasingly, in private conversation.

Liberalism is no longer delivering the goods — not only material goods but also the goods of long-term relationships and rewarding social interaction.

Language in this exchange does not reflect Carolina’s values

Monday, January 21st, 2019

The same nonprofit that is suing Harvard University for racial discrimination — against Asians and Whites — is now suing the University of North Carolina:

In their filing, the plaintiffs said their analysis of UNC’s admissions data showed race is a “determinative” factor for many underrepresented minorities, particularly African-American and Hispanic applicants from outside the state.

In a 2003 ruling, the Supreme Court said universities can use race as a “plus” factor in admissions, but must evaluate each applicant individually and not consider race as the defining feature of the application.

The plaintiffs also say the school has violated Supreme Court precedent by failing to seriously attempt race-neutral alternatives to achieving diversity.

Lawyers for UNC said in Friday’s filing that race is not a dominant factor in admissions. UNC said it uses a holistic approach to admissions, with application readers scoring applicants in five categories: academic program, academic performance, extracurricular activities, essays and personal qualities, like “curiosity, integrity, and history of overcoming obstacles.”

The school said race has no numerical weight at any point in the review.

[...]

For applicants to UNC-Chapel Hill in 2012, the average SAT score for admitted Asian or Asian-American students was 1431, compared with 1360 for white applicants and 1229 for African Americans, according to the plaintiffs. They said that differential, as well as a similar gap in grade-point averages, shows the school gives an unfair tip to applicants of certain races or ethnicities despite their weaker academic credentials.

In Friday’s filing, the plaintiffs also said UNC admissions readers frequently highlight the applicant’s race, citing one reader’s comment that even with an ACT score of 26, they should “give these brown babies a shot at these merit $$.” Another reader wrote, “Stellar academics for a Native Amer/African Amer kid,” the plaintiffs said.

Steve Farmer, the university’s vice provost for enrollment and undergraduate admissions, said in response: “Language in this exchange does not reflect Carolina’s values or our admissions process.”

Previously they had been a lumpenproletariat of single men and women

Monday, January 21st, 2019

Liberal regimes tend to erode their own cultural and genetic foundations, thus undermining the cause of their success:

Liberalism emerged in northwest Europe. This was where conditions were most conducive to dissolving the bonds of kinship and creating communities of atomized individuals who produce and consume for a market. Northwest Europeans were most likely to embark on this evolutionary trajectory because of their tendency toward late marriage, their high proportion of adults who live alone, their weaker kinship ties and, conversely, their greater individualism. This is the Western European Marriage Pattern, and it seems to go far back in time. The market economy began to take shape at a later date, possibly with the expansion of North Sea trade during early medieval times and certainly with the take-off of the North Sea trading area in the mid-1300s (Note 1).

Thus began a process of gene-culture coevolution: people pushed the limits of their phenotype to exploit the possibilities of the market economy; selection then brought the mean genotype into line with the new phenotype. The cycle then continued anew, with the mean phenotype always one step ahead of the mean genotype.

This gene-culture coevolution has interested several researchers. Gregory Clark has linked the demographic expansion of the English middle class to specific behavioral changes in the English population: increasing future time orientation; greater acceptance of the State monopoly on violence and consequently less willingness to use violence to settle personal disputes; and, more generally, a shift toward bourgeois values of thrift, reserve, self-control, and foresight. Heiner Rindermann has presented the evidence for a steady rise in mean IQ in Western Europe during the late medieval and early modern era. Henry Harpending and I have investigated genetic pacification during the same timeframe in English society. Finally, hbd*chick has written about individualism in relation to the Western European Marriage Pattern (Note 2).

This process of gene-culture coevolution came to a halt in the late 19th century. Cottage industries gave way to large firms that invested in housing and other services for their workers, and this corporate paternalism eventually became the model for the welfare state, first in Germany and then elsewhere in the West. Working people could now settle down and have families, whereas previously they had largely been a lumpenproletariat of single men and women. Meanwhile, middle-class fertility began to decline, partly because of the rising cost of maintaining a middle-class lifestyle and partly because of sociocultural changes (increasing acceptance and availability of contraception, feminism, etc.).

This reversal of class differences in fertility seems to have reversed the gene-culture coevolution of the late medieval and early modern era.

Liberalism delivered the goods

Sunday, January 20th, 2019

How did liberalism become so dominant?

In a word, it delivered the goods. Liberal regimes were better able to mobilize labor, capital, and raw resources over long distances and across different communities. Conservative regimes were less flexible and, by their very nature, tied to a single ethnocultural community. Liberals pushed and pushed for more individualism and social atomization, thereby reaping the benefits of access to an ever larger market economy.

The benefits included not only more wealth but also more military power. During the American Civil War, the North benefited not only from a greater capacity to produce arms and ammunition but also from a more extensive railway system and a larger pool of recruits, including young migrants of diverse origins — one in four members of the Union army was an immigrant (Doyle 2015).

During the First World War, Britain and France could likewise draw on not only their own manpower but also that of their colonies and elsewhere. France recruited half a million African soldiers to fight in Europe, and Britain over a million Indian troops to fight in Europe, the Middle East, and East Africa (Koller 2014; Wikipedia 2018b). An additional 300,000 laborers were brought to Europe and the Middle East for non-combat roles from China, Egypt, India, and South Africa (Wikipedia 2018a). In contrast, the Central Powers had to rely almost entirely on their own human resources. The Allied powers thus turned a European civil war into a truly global conflict.

The same imbalance developed during the Second World War. The Allies could produce arms and ammunition in greater quantities and far from enemy attack in North America, India, and South Africa, while recruiting large numbers of soldiers overseas. More than a million African soldiers fought for Britain and France, their contribution being particularly critical to the Burma campaign, the Italian campaign, and the invasion of southern France (Krinninger and Mwanamilongo 2015; Wikipedia 2018c). Meanwhile, India provided over 2.5 million soldiers, who fought in North Africa, Europe, and Asia (Wikipedia 2018d). India also produced armaments and resources for the war effort, notably coal, iron ore, and steel.

Liberalism thus succeeded not so much in the battle of ideas as on the actual battlefield.

Longines Chronoscope with Princess Alexandra Kropotkin

Saturday, January 19th, 2019

Longines Chronoscope with Princess Alexandra Kropotkin sounds like the title of a steampunk novel, but it’s actually a 1951 television interview with the daughter of Peter Kropotkin, one of the most prominent left-anarchist figures of the late 19th and early 20th centuries. As Jesse Walker of Reason notes, sometimes the very fact that something exists is reason enough to watch it:

Henry Hazlitt, author of Economics in One Lesson, makes a brief appearance.

If you make a community truly open it will eventually become little more than a motel

Saturday, January 19th, 2019

The emergence of the middle class was associated with the rise of liberalism and its belief in the supremacy of the individual:

John Locke (1632–1704) is considered to be the “father of liberalism,” but belief in the individual as the ultimate moral arbiter was already evident in Protestant and pre-Protestant thinkers going back to John Wycliffe (1320s–1384) and earlier. These are all elaborations and refinements of the same mindset.

Liberalism has been dominant in Britain and its main overseas offshoot, the United States, since the 18th century. There is some difference between right-liberals and left-liberals, but both see the individual as the fundamental unit of society and both seek to maximize personal autonomy at the expense of kinship-based forms of social organization, i.e., the nuclear family, the extended family, the kin group, the community, and the ethnie. Right-liberals are willing to tolerate these older forms and let them gradually self-liquidate, whereas left-liberals want to use the power of the State to liquidate them. Some left-liberals say they simply want to redefine these older forms of sociality to make them voluntary and open to everyone. Redefine, however, means eliminate. If you make a community truly “open” it will eventually become little more than a motel: a place where people share space, where they may or may not know each other, and where very few if any are linked by longstanding ties — certainly not ties of kinship.

For a long time, liberalism was merely dominant in Britain and the U.S. The market economy coexisted with kinship as the proper way to organize social and economic life. The latter form of sociality was even dominant in some groups and regions, such as the Celtic fringe, Catholic communities, the American “Bible Belt,” and rural or semi-rural areas in general. Today, those subcultures are largely gone. Opposition to liberalism is for the most part limited, ironically, to individuals who act on their own.

This is the mindset that enabled northwest Europeans to exploit the possibilities of the market economy

Friday, January 18th, 2019

There is reason to believe that northwest Europeans were pre-adapted to the market economy:

They were not the first to create markets, but they were the first to replace kinship with the market as the main way of organizing social and economic life. Already in the fourteenth century, their kinship ties were weaker than those of other human populations, as attested by marriage data going back to before the Black Death and in some cases to the seventh century (Frost 2017). The data reveal a characteristic pattern:

  • men and women marry relatively late
  • many people never marry
  • children usually leave the nuclear family to form new households
  • households often have non-kin members

This behavioral pattern was associated with a psychological one:

  • weaker kinship and stronger individualism;
  • framing of social rules in terms of moral universalism and moral absolutism, as opposed to kinship-based morality (nepotism, amoral familialism);
  • greater tendency to use internal controls on behavior (guilt proneness, empathy) than external controls (public shaming, community surveillance, etc.)

This is the mindset that enabled northwest Europeans to exploit the possibilities of the market economy. Because they could more easily move toward individualism and social atomization, they could go farther in reorganizing social relationships along market-oriented lines. They could thus mobilize capital, labor, and raw resources more efficiently, thereby gaining more wealth and, ultimately, more military power.

This new cultural environment in turn led to further behavioral and psychological changes. Northwest Europeans have adapted to it just as humans elsewhere have adapted to their own cultural environments, through gene-culture coevolution.

[...]

Northwest Europeans adapted to the market economy, especially those who formed the nascent middle class of merchants, yeomen, and petty traders. Over time, this class enjoyed higher fertility and became demographically more important, as shown by Clark (2007, 2009a, 2009b) in his study of medieval and post-medieval England: the lower classes had negative population growth and were steadily replaced, generation after generation, by downwardly mobile individuals from the middle class. By the early 19th century most English people were either middle-class or impoverished descendants of the middle class.

This demographic change was associated with behavioral and psychological changes to the average English person. Time orientation shifted toward the future, as seen in an increased willingness to save money and defer gratification. There was also a long-term decline in personal violence, with male homicide falling steadily from 1150 to 1800 and, parallel to this, a decline in blood sports and other violent though legal practices (cockfighting, bear and bull baiting, public executions). This change can largely be attributed to the State’s monopoly on violence and the consequent removal of violence-prone individuals through court-ordered or extrajudicial executions. Between 1500 and 1750, court-ordered executions removed 0.5 to 1.0% of all men of each generation, with perhaps just as many dying at the scene of the crime or in prison while awaiting trial (Clark 2007; Frost and Harpending 2015).

Similarly, Rindermann (2018) has argued that mean IQ steadily rose in Western Europe during late medieval and post-medieval times. More people were able to reach higher stages of mental development. Previously, the average person could learn language and social norms well enough, but their ability to reason was hindered by cognitive egocentrism, anthropomorphism, finalism, and animism (Rindermann 2018, p. 49). From the sixteenth century onward, more and more people could better understand probability, cause and effect, and the perspective of another person, whether real or hypothetical. This improvement preceded universal education and improvements in nutrition and sanitation (Rindermann 2018, pp. 86-87).

Decoupling is not a worry for anything but a very small explosion

Thursday, January 17th, 2019

The U.S. government conducted more than 1,000 nuclear tests, most of them in the Nevada desert or on faraway Pacific islands, but it also set off a couple of nukes under Mississippi:

In 1959, the American physicist Albert Latter theorized that setting off a bomb in an underground cavity could muffle the blast. After tests with conventional explosives, Latter wrote that a detonation as big as 100 kilotons—more than six times bigger than the bomb dropped on Hiroshima—“would make a seismic signal so weak it would not even be detected by the Geneva system.” His theory, known as “decoupling,” became a rallying point for people who wanted to keep testing, says Jeffrey Lewis, of the James Martin Center for Nonproliferation Studies in Monterey, California.

“They wanted to come up with a reason that we couldn’t verify an agreement with the Soviets,” says Lewis, who’s also the publisher of the Arms Control Wonk blog. But in 1963, after the Cuban Missile Crisis brought the world nose-to-nose with the unthinkable, the superpowers signed the Limited Test Ban Treaty. It kept future tests underground, and researchers turned to making sure those tests would be spotted.

The Atomic Energy Commission wanted to test Latter’s theory using actual nukes. And salt deposits were considered the ideal places for tests, since they could be excavated more easily than rock and the resulting cavity would endure for years. So the search was on for a salt dome in territory similar to where the Russians tested their bombs, Auburn University historian David Allen Burke says.

“It had to be a certain diameter. It had to be a certain size. It needed to be a very large salt dome that was still a distance underground and not where it could interfere with water or petroleum or anything else,” says Burke, who wrote a book about the Mississippi tests.

That led the agency to southern Mississippi, which is full of salt domes. The government leased a nearly 1,500-acre patch of forest atop one of those domes and got to work.

[...]

The first blast, code-named Salmon, was a 5.3-kiloton device that would blow a cavity into the salt dome half a mile underground. The second, Sterling, was only 380 tons, and would go off in the cavity left behind by Salmon. AEC crews drilled a 2,700-foot hole down into the salt dome, lowered the first bomb into it, plugged it with 600 feet of concrete… and waited.

The Salmon test was put off nearly a month by a string of technical problems and bad weather, including Hurricane Hilda, which hit one state over in Louisiana. People living up to five miles from the test site were evacuated and recalled twice in preparation for blasts that never happened. They got paid $10 a head for adults and $5 for children for their trouble.

[...]

Far from Latter’s predictions that a blast as big as 100 kilotons could be kept off the scopes, Lewis says, it turned out that decoupling “is not a worry for anything but a very small explosion.” However, the data helped shape a later treaty which limited underground tests to 150 kilotons.

[...]

Federal records now indicate cancer rates in Lamar County are lower than both the state and national average.

(Hat tip to Hans Schantz.)

Macroeconomics is a combination of voodoo complex systems and politics

Wednesday, January 16th, 2019

In a recent interview, Shane Parrish asked Naval Ravikant, What big ideas have you changed your mind on in the last few years?

There’s a lot on kind of the life level. There’s a couple, obviously, in the business level. I think on a more practical basis, I’ve just stopped believing in macroeconomics. I studied economics in school and computer science. There was a time when I thought I was going to be a PhD in economics and all of that. The further I get, the more I realize macroeconomics is a combination of voodoo complex systems and politics. I think that discipline, because it doesn’t make falsifiable predictions, which is the hallmark of science, has become corrupted.

You never have the counterexample on the economy. You can never take the US economy and run two different experiments at the same time. Because there’s so much data, people kind of cherry-pick for whatever political narrative they’re trying to push. To the extent that people spend all their time watching the macroeconomy or the Fed forecasts or which way the stocks are going to go the next year, is it going to be a good year or bad year, that’s all junk. It’s no better than astrology. In fact, it’s probably even worse because it’s less entertaining. It’s just more stress-inducing. I think of macroeconomics as a junk science. All apologies to macroeconomists.

That said, microeconomics and game theory are fundamental. I don’t think you can be successful in business or even navigating through most of our modern capitalist society without an extremely good understanding of supply and demand and labor versus capital and game theory and tit for tat and those kinds of things. Macroeconomics is a religion that I gave up, but there are many others. I’ve changed my mind on death, on the nature of life, on the purpose of life, on marriage. I was originally not someone who wanted to be married and have kids. There have been a lot of fundamental changes. The most practical one is I gave up macro and I embraced micro.

I would say that’s not just true in macroeconomics, that’s true in everything. I don’t believe in macro-environmentalism, I believe in micro-environmentalism. I don’t believe in macro-charity. I believe in micro-charity.

I don’t believe in macro improving the world. There’s a lot of people out there who get really fired up about I’m going to change the world, I’m going to change this person, I’m going to change the way people think.

I think it’s all micro. It’s like change yourself, then maybe change your family and your neighbor before you get into abstract concepts about I’m going to change the world.

The plasmids force their hosts to lay down their arms

Tuesday, January 15th, 2019

Bacteria evolve drug-resistance in the usual way, but they also spread genes for drug-resistance horizontally, through plasmids:

As a self-defense mechanism, Acinetobacter kills other bacteria that get too close, which doesn’t help the plasmids reproduce. So, the plasmids force their hosts to lay down their arms, allowing them to then pass copies of themselves into the neighboring bacteria.

In response, the researchers mutated the plasmids so they couldn’t stop the bacteria from defending themselves. In another test, they mutated the Acinetobacter itself so its defenses couldn’t be lowered, and in both cases the outcome was the same. The plasmids — and by extension, antibiotic resistance — were unable to spread.

The barbarian invaders had one thing the civilized Incas did not

Monday, January 14th, 2019

James LaFond praises the barbarians who took down the overly civilized Aztecs and Incas:

The Aztecs were besieged and in crisis, having lost their entire empire before the smallpox killed 9 in 10 of them. So you are correct, this was not a win by disease but by deed. What did them in, primarily — and Barbara Tuchman in The March of Folly makes the best case for this — was their excessively civilized and fatalistic slave mindset, which had the entire nation acting in slavish obedience to a superstitious fool, Montezuma. As Bernal Diaz relates, their city was the greatest in the world, surpassing any in Europe in all ways, from sanitation to food distribution to obedience to the law. All of this contributed to their fragility in the face of the Barbarian invaders, who fought over who was to be their leader right up until the time of battle. Cortez usurped the leadership of the expedition.

Secondarily, the Aztecs had embarked on the folly of empire, enchaining slave races to their cruel will, races all too ready to ally in their hundreds of thousands with Cortez.

Now to the Incas, in which even fewer Aryan Barbarians took down an empire many times more powerful than the Aztecs. The Incas had bronze axes, maces and flails and stone weapons which defeated Spanish helmets. One Inca detachment even defeated a Spanish force in a fairly even battle. Again, the slave mind of a people whose king was a living god defeated the Incas, even as their vast army, which would have slaughtered the Aztecs and was organized much like the Roman Legions, stood obediently outside of the city where their fool leader and all of his officers agreed to meet unarmed with the 150-odd armed invaders. They never imagined that 150 armed men would kill all 20,000 of them [this in itself indicating a lack of heroic mindset] as their leaderless slave army watched from the surrounding hills. To their credit, the Incas, having already suffered heavy disease losses before the encounter through third-party contact, fought on for 40 years. However, their imperial system turned on them as the Spaniards used their road networks and stone fortress cities and recruited allies from their subject peoples. Agriculturally, the Incas were the most advanced civilization on earth at this time, but militarily they were barely into the Bronze Age. However, there have been studies done showing that they could have easily won and kept the Spaniards at bay while reverse engineering technology and using captive Spaniards as craftsmen much like the Japanese did hundreds of years later.

What really killed the Incas was that they had homogenized their people to a degree not seen until postmodern America, forcing tribes to give up their identity and moving them to alien places, enforcing communal food distribution, and finally softening to the point where they were unable to conquer the barbarian peoples to the north, east and south. As with Rome before the Germans, Persia before the Macedonians, and America, with its interstate system, before the drug cartels, the centralized nature of the empire and the highly developed road network blessed their invaders with godspeed.

But these incidents were only partially a lost defense and very much a gained conquest. The Barbarian invaders had one thing the Civilized Incas did not, a heroic ethos, which gave their enemies no rest as the Pizarro Brothers and the ruthless Soto descended on the faltering empire [already in the midst of a plague and a civil war] like wolves on sheep, which is what barbarians are to disorganized civilians.

Where Atahualpa sat stoically at Cajamarca and was burned to death in the very fire that melted down his sacred relics into bullion, even though he could have called in his armies to kill the invaders as he burned, not many years later we are given an account of how Pizarro, his conqueror, behaved in the very same circumstance, when surrounded by Barbarian cutthroats. In the land of the Inca, not so far from where Atahualpa died stoically as an ascendant Sun God, Pizarro and his assistant, wearing only clothes and swords, were attacked by five armored conquistadors. While his secretary groveled, Pizarro cursed him and his assassins, tore a drape from the window as a shield, and, well into his sixties, took some of his killers to hell with him.

What doomed the Incas is they were too civilized and this was permitted by an erasure of the heroic — if indeed they ever had a heroic ideal — from their martial culture. The heroic ideal did rise up [or reemerge] in the form of renegade Incas as a result of this cultural clash, but too little and too late.

The perfect counterpoint to these two civilizations being failed by their leaders and failing to rise up as a people, but remaining slaves to the alien invader to this day, is found in the Anabasis of Xenophon, or The March-down-to-the-sea. When the 10,000 leaderless Greek mercenaries sat by under orders while their leaders were murdered at a parley with the Persian army which vastly outnumbered them, the Greeks simply elected new leaders and fought their way free. Had the Inca army outside Cajamarca been made up of ancient Greeks the Spaniards would have been butchered that afternoon. It was the Anabasis which convinced Alexander that Persia was ripe for the picking.

The behavior of Conquistadors was that of independent rogue operators, often criminals in a state of disobedience to their government handlers, a shining half-century in masculine history when men reverted to the ancient Homeric ideals of heroism, cunning in the face of an alien foe and brutal natural selection among themselves to determine who was fit to lead the small barbarian pack in its descent on the soft, degenerate fold of Civilization, with its quivering neck bared to the ever-reoccurring cycle of cleansing barbarism that remains humanity’s last hope.

I highly recommend Bernal Diaz Del Castillo’s The Discovery And Conquest Of Mexico, by the way. I’ve been meaning to read Xenophon’s Anabasis forever.

(Hat tip to our Slovenian Guest.)

We were looking for the Future Book in the wrong place

Sunday, January 13th, 2019

The interactive book of the future hasn’t caught on, but technology has changed books nonetheless:

Physical books today look like physical books of last century. And digital books of today look, feel, and function almost identically to digital books of 10 years ago, when the Kindle launched. The biggest change is that many of Amazon’s competitors have gone belly up or shrunken to irrelevancy. The digital reading and digital book startup ecosystem that briefly emerged in the early 2010s has shriveled to a nubbin.

Amazon won. Trounced, really. As of the end of 2017, about 45 percent (up from 37 percent in 2015) of all print sales and 83 percent of all ebook sales happen through Amazon channels. There are few alternatives with meaningful mind- or market share, especially among digital books.

Yet here’s the surprise: We were looking for the Future Book in the wrong place. It’s not the form, necessarily, that needed to evolve — I think we can agree that, in an age of infinite distraction, one of the strongest assets of a “book” as a book is its singular, sustained, distraction-free, blissfully immutable voice. Instead, technology changed everything that enables a book, fomenting a quiet revolution. Funding, printing, fulfillment, community-building — everything leading up to and supporting a book has shifted meaningfully, even if the containers haven’t.

[...]

Our Future Book is composed of email, tweets, YouTube videos, mailing lists, crowdfunding campaigns, PDF to .mobi converters, Amazon warehouses, and a surge of hyper-affordable offset printers in places like Hong Kong.

For the black market, everything stays the same

Saturday, January 12th, 2019

Gun ownership is rising across Europe, the Wall Street Journal reports:

The uptick was spurred in part by insecurity arising from terrorist attacks, many of them carried out with firearms, and reflects government efforts to get illegal guns registered by offering amnesty to owners.

Europe is still far from facing the gun prevalence and violence in Latin America or the U.S., which lead the world. World-wide civilian ownership of firearms rose 32% in the decade through 2017, to 857.3 million guns, according to the Small Arms Survey, a research project in Geneva. Europe accounts for less than 10% of the total.

But Europe’s shift has been rapid, and notable in part because of strict national restrictions. In most European countries, gun permits require thorough background checks, monitored shooting practice and tests on regulations. In Belgium, France and Germany, most registered guns may only be used at shooting ranges. Permits to bear arms outside of shooting ranges are extremely difficult to obtain.

Strict registration requirements don’t account for—and may exacerbate—a surge in illegal weapons across the continent, experts say.

Europe’s unregistered weapons outnumbered legal ones in 2017, 44.5 million to 34.2 million, according to the Small Arms Survey. Many illegal weapons come from one-time war zones, such as countries of the former Yugoslavia, and others are purchased online, including from vendors in the U.S.

[...]

Armed robbery and similar crimes often entail illicit guns, while legally registered firearms tend to appear in suicide and domestic-violence statistics, said Nils Duquet of the Flemish Peace Institute, a Belgian research center.

“It’s clear that illegal guns are used mostly by criminals,” he said.

[...]

In Germany, the number of legally registered weapons rose roughly 10%, to 6.1 million, in the five years through 2017, the most recent year for which statistics are available, according to Germany’s National Weapons Registry. Permits to bear arms outside of shooting ranges more than tripled over the same five years, to 9,285.

Permits for less lethal air-powered guns that resemble real guns and shoot tear gas or loud blanks to scare away potential attackers roughly doubled in the three years through the end of 2017, to 557,560, according to the registry.

[...]

In Belgium, firearm permits and membership in sport-shooting clubs have risen over the past three years.

Belgian applications for shooting licenses almost doubled after the terrorist attacks by an Islamic State cell in Paris in Nov. 2015 and four months later in Brussels, offering “a clear indication of why people acquired them,” said Mr. Duquet.

[...]

Belgium has for years tightened regulations in response to gun violence, such as a 2006 killing spree by an 18-year-old who legally acquired a rifle.

“Before 2006, you could buy rifles simply by showing your ID,” recalled Sébastien de Thomaz, who owns two shooting ranges in Brussels and previously worked in a gun store.

“They used to let me shoot with all my stepfather’s guns whenever I joined him at the range,” said Lionel Pennings, a Belgian artist who joins his stepfather at one of Mr. De Thomaz’s shooting ranges on Sundays.

Mr. Pennings recalled that in the past he could easily fire a few rounds with his stepfather’s gun. “Now it’s much stricter,” he said. “You can only use the guns you have a permit for.”

A Belgian would-be gun owner must pass almost a year of shooting and theory tests, plus psychological checks, said Mr. De Thomaz.

The gun-range owner questions the impact of that policy. “With each terror attack, the legislation gets stricter,” he said. “For the black market, everything stays the same.”

Les Gentils, les Méchants

Saturday, January 12th, 2019

Voilà! Les Gentils, les Méchants:

(With a tip of the chapeau to a certain ami.)

Who is supporting you? Big Kale?

Friday, January 11th, 2019

Siddhartha Mukherjee, author of The Emperor of All Maladies: A Biography of Cancer, says it’s time we studied diet as seriously as we study drugs:

Several months before my surgical procedure, a cancer patient asked me whether she should change her diet. She had lost her appetite. One nutritionist had advised her to start consuming highly caloric, sugar-loaded drinks to maintain her body weight. But, she worried, what if the sugar ended up “feeding” her cancer? Her anxiety was built on nearly eight decades of science: In the 1920s, Otto Warburg, a German physiologist, demonstrated that tumor cells, unlike most normal cells, metabolize glucose using alternative pathways to sustain their rapid growth, provoking the idea that sugar might promote tumor growth.

You might therefore expect the medical literature on “sugar feeding cancer” to be rich with deep randomized or prospective studies. Instead, when I searched, I could find only a handful of such trials. In 2012, a team at the Dana-Farber Cancer Institute in Boston divided patients with Stage 3 colon cancer into different groups based on their dietary consumption, and determined their survival and rate of relapse. The study generated provocative data — but far from an open-and-shut case. Patients whose diets consisted of foods with a high glycemic load (a measure of how much blood glucose rises after eating a typical portion of a food) generally had shorter survival than patients with lower glycemic load. But a higher glycemic index (a measure of how much 50 grams of carbohydrate from a food, which may require eating a huge portion, raises blood glucose) or total fructose intake had no significant association with overall survival or relapse.
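The glycemic index/glycemic load distinction above can be made concrete. Glycemic load is conventionally computed as the glycemic index multiplied by the grams of available carbohydrate in a typical serving, divided by 100 — which is why a high-GI food eaten in a small portion can still have a low glycemic load. A minimal sketch (the food values are illustrative approximations, not measured data):

```python
def glycemic_load(glycemic_index: float, carbs_per_serving_g: float) -> float:
    """Glycemic load = glycemic index x available carbohydrate (g) per serving / 100."""
    return glycemic_index * carbs_per_serving_g / 100

# Watermelon: high GI (~72) but only ~11 g carbohydrate per serving,
# so its glycemic load is low despite the high index.
print(glycemic_load(72, 11))   # ~7.9  (low load)

# White rice: similar GI (~73) but a much larger carbohydrate portion (~53 g),
# so the same index yields a high glycemic load.
print(glycemic_load(73, 53))   # ~38.7 (high load)
```

This is why the Dana-Farber result is coherent rather than contradictory: load tracks what patients actually ate per portion, while the index alone does not.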

While the effect of sugar on cancer was being explored in scattered studies, the so-called ketogenic diet, which consists of high fat, moderate protein and low carbohydrate, was also being promoted. It isn’t sugars that are feeding the tumor, the logic runs. It’s insulin — the hormone that is released when glucose enters the blood. By reducing carbohydrates and thus keeping a strong curb on insulin, the keto diet would decrease the insulin exposure of tumor cells, and so restrict tumor growth. Yet the search for “ketogenic diet, randomized study and cancer” in the National Library of Medicine database returned a mere 11 articles. Not one of them reported an effect on a patient’s survival, or relapse.

But what if diet, rather than acting alone, collaborates with a drug to produce an effect on a tumor? In the winter of 2016, I had dinner with Lewis Cantley, director of the Meyer Cancer Center at Weill Cornell Medicine. Decades ago, Cantley discovered an enzyme named PI3 kinase, which regulates the growth and survival of cells in the presence of nutrients. By inhibiting this enzyme using novel drugs, researchers had hoped to target the signals used by tumor cells to grow, thus “starving” the cancer. But the drugs designed thus far were only marginally effective. Why, we wondered over salmon teriyaki in a nondescript Upper East Side joint, might blocking such a central hub of growth activity have had only a modest effect on tumor growth?

The trials gave us a crucial, obvious clue that we had missed: Many patients had become diabetic, a phenomenon seen as a side effect of the drug that had been ignored. Perhaps the drug wasn’t just providing a “starvationlike” signal only to the tumor cells, we speculated. As most drugs do, the molecule circulated through the entire body of the patient and also acted on the liver, which sensed the same starvationlike signal and, as a reflexive response, sent glucose soaring into the blood. The glucose, in turn, most likely incited insulin release in the pancreas. And some patients treated with the medicine returned to the clinic with sky-high levels of glucose and insulin — in essence, in the throes of drug-induced diabetes.

Cantley wondered whether the additional insulin was reactivating the signals within the tumor cells that had been shut off by the PI3 kinase inhibitor, and so allowing the cells to survive — in effect, undoing all the good being done by the drug. On a paper napkin borrowed from the waiter, he drew out a scheme to outwit this vicious cycle. What if we cut off all extra insulin released, by putting patients on a low-carb, ketogenic diet while on the drug? It would be a novel kind of trial — one in which diet itself would become a drug, or a co-drug, with the PI3 kinase inhibitors.

Between 2016 and 2018, postdoctoral researchers in Cantley’s laboratory and mine established that this strategy worked on several mouse cancers, and on human cancers implanted into mice. By 2019, working with clinicians at Columbia, Cornell and Memorial Sloan Kettering, we hope to begin a study in humans with lymphomas, endometrial cancer and breast cancer, to use ketogenic diets in concert with the PI3 kinase inhibitors. (In the meantime, a host of other studies have also demonstrated that other diets could potently modulate the effects of targeted therapies on cancers in mouse models.)

But the experiments on mice also warned us of an important pitfall of such an approach. While the “drug plus diet” model worked on experimental mouse and human cancers, the ketogenic diet had a limited effect by itself. For some cancers in the mouse models, the keto diet alone kept the tumor growth at bay. But for others, like some leukemias implanted into mice, the diet alone accelerated the cancer, while the drug-plus-diet approach slowed it down.

We published this data in the scientific journal Nature early this year. I sent out a tweet with the results, emphasizing that the human trial was about to be started, and that the keto diet alone might have a negative effect on some tumors — in essence, a “folks, don’t try this at home” message. The response over social media was unexpected — brisk, vicious, angry, suspicious and, at times, funny. “Keto is pure hype,” one responder wrote. Another countered: “Who is supporting you? Big Kale?”