The obscure religion that shaped the West

April 15th, 2017

Zoroastrianism might be called the obscure religion that shaped the West:

It is generally believed by scholars that the ancient Iranian prophet Zarathustra (known in Persian as Zartosht and in Greek as Zoroaster) lived sometime between 1500 and 1000 BC. Prior to Zarathustra, the ancient Persians worshipped the deities of the old Irano-Aryan religion, a counterpart to the Indo-Aryan religion that would come to be known as Hinduism. Zarathustra, however, condemned this practice, and preached that God alone – Ahura Mazda, the Lord of Wisdom – should be worshipped. In doing so, he not only contributed to the great divide between the Iranian and Indian Aryans, but arguably introduced to mankind its first monotheistic faith.

The idea of a single god was not the only essentially Zoroastrian tenet to find its way into other major faiths, most notably the ‘big three’: Judaism, Christianity and Islam. The concepts of Heaven and Hell, Judgment Day and the final revelation of the world, and angels and demons all originated in the teachings of Zarathustra, as well as the later canon of Zoroastrian literature they inspired. Even the idea of Satan is a fundamentally Zoroastrian one; in fact, the entire faith of Zoroastrianism is predicated on the struggle between God and the forces of goodness and light (represented by the Holy Spirit, Spenta Manyu) and Ahriman, who presides over the forces of darkness and evil. While man has to choose to which side he belongs, the religion teaches that ultimately, God will prevail, and even those condemned to hellfire will enjoy the blessings of Paradise (an Old Persian word).

How did Zoroastrian ideas find their way into the Abrahamic faiths and elsewhere? According to scholars, many of these concepts were introduced to the Jews of Babylon after their liberation by the Persian emperor Cyrus the Great. They trickled into mainstream Jewish thought, and figures like Beelzebub emerged. And after Persia’s conquests of Greek lands during the heyday of the Achaemenid Empire, Greek philosophy took a different course. The Greeks had previously believed humans had little agency, and that their fates were at the mercy of their many gods, who often acted according to whim and fancy. After their acquaintance with Iranian religion and philosophy, however, they began to feel more as if they were the masters of their destinies, and that their decisions were in their own hands.

Decivilization in the 1960s

April 14th, 2017

Steven Pinker discusses decivilization in the 1960s:

After a three-decade free fall that spanned the Great Depression, World War II, and the Cold War, Americans multiplied their homicide rate by more than two and a half, from a low of 4.0 in 1957 to a high of 10.2 in 1980 (U.S. Bureau of Justice Statistics; Fox and Zawitz, 2007). The upsurge included every other category of major crime as well, including rape, assault, robbery, and theft, and lasted (with ups and downs) for three decades. The cities got particularly dangerous, especially New York, which became a symbol of the new criminality. Though the surge in violence affected all the races and both genders, it was most dramatic among black men, whose annual homicide rate had shot up by the mid-1980s to 72 per 100,000.

[...]

The rebounding of violence in the 1960s defied every expectation. The decade was a time of unprecedented economic growth, nearly full employment, levels of economic equality for which people today are nostalgic, historic racial progress, and the blossoming of government social programs, not to mention medical advances that made victims more likely to survive being shot or knifed. Social theorists in 1962 would have happily bet that these fortunate conditions would lead to a continuing era of low crime. And they would have lost their shirts.

[...]

When rock music burst onto the scene in the 1950s, politicians and clergymen vilified it for corrupting morals and encouraging lawlessness. (An amusing video reel of fulminating fogies can be seen in Cleveland’s Rock and Roll Hall of Fame and Museum.) Do we now have to – gulp – admit they were right? Can we connect the values of 1960s popular culture to the actual rise in violent crimes that accompanied them? Not directly, of course. Correlation is not causation, and a third factor, the pushback against the values of the Civilizing Process, presumably caused both the changes in popular culture and the increase in violent behavior. Also, the overwhelming majority of baby boomers committed no violence whatsoever. Still, attitudes and popular culture surely reinforce each other, and at the margins, where susceptible individuals and subcultures can be buffeted one way or another, there are plausible causal arrows from the decivilizing mindset to the facilitation of actual violence.

One of them was a self-handicapping of the criminal justice Leviathan. Though rock musicians seldom influence public policy directly, writers and intellectuals do, and they got caught up in the zeitgeist and began to rationalize the new licentiousness. Marxism made violent class conflict seem like a route to a better world. Influential thinkers like Herbert Marcuse and Paul Goodman tried to merge Marxism or anarchism with a new interpretation of Freud that connected sexual and emotional repression to political repression and championed a release from inhibitions as part of the revolutionary struggle. Troublemakers were increasingly seen as rebels and nonconformists, or as victims of racism, poverty, and bad parenting. Graffiti vandals were now ‘artists,’ thieves were ‘class warriors,’ and neighborhood hooligans were ‘community leaders.’ Many smart people, intoxicated by radical chic, did incredibly stupid things. Graduates of elite universities built bombs to be set off at army social functions, or drove getaway cars while ‘radicals’ shot guards during armed robberies. New York intellectuals were conned by Marxobabble-spouting psychopaths into lobbying for their release from prison (Pinker 2002: 261–262).

Read the whole thing. (It’s an excerpt from The Better Angels of Our Nature: Why Violence Has Declined.)

The lowest layer of the pyramid is the foundation

April 13th, 2017

It’s hard to find a teacher who doesn’t make reference to Bloom’s Taxonomy, Doug Lemov notes, because it’s part of the language of teaching, but there’s a problem:

Bloom’s Taxonomy is often represented as a pyramid with the understanding — intended or accidental — that teachers should try to get to the top. That’s the nature of pyramids, I guess.

Bloom's Taxonomy Pyramid

Generally when teachers talk about “Bloom’s taxonomy,” they talk with disdain about “lower level” questions. They believe, perhaps because of the pyramid image, which puts knowledge at the bottom, that knowledge-based questions, especially via recall and retrieval practice, are the least productive thing they could be doing in class. No one wants to be the rube at the bottom of the pyramid.

But this, interestingly, is not what Bloom argued — at least according to Vanderbilt’s description. Saying that knowledge questions are low value and saying that knowledge is the necessary precondition for deep thinking are very different things. More importantly, believing that knowledge questions — even mere recall of facts — are low value doesn’t jibe with the overwhelming consensus of cognitive science, summarized here by Daniel Willingham, who writes,

Data from the last thirty years lead to a conclusion that is not scientifically challengeable: thinking well requires knowing facts, and that’s true not simply because you need something to think about. The very processes that teachers care about most — critical thinking processes such as reasoning and problem solving — are intimately intertwined with factual knowledge that is in long-term memory (not just found in the environment).

In other words, there are two parts to the equation. You not only have to teach a lot of facts to allow students to think deeply, but you also have to reinforce that knowledge enough to install it in long-term memory, or you can’t do any of the activities at the top of the pyramid.

US healthcare is famous for three things

April 12th, 2017

US healthcare is famous for three things, Ben Southwood notes:

It’s expensive, it’s not universal, and it has poor outcomes. The US spends around $7,000 per person on healthcare every year, or roughly 18% of GDP; the next highest spender is Switzerland, which spends about $4,500. Before Obamacare, approximately 15% of the US population were persistently uninsured (8.6% still are). And as this chart neatly shows, its outcome on the most important variable — overall life expectancy — is fairly poor.

But some of this criticism is wrongheaded and simplistic: when you slice the data up more reasonably, US outcomes look impressive, but being the world’s outrider is much more expensive than following behind. What’s more, most of the solutions people offer just don’t get to the heart of the issue: if you give people freedom they’ll spend a lot on healthcare.

The US undoubtedly spends a huge amount on healthcare. One popular narrative is that because of market failures and/or extreme overregulation in healthcare, prices are excessively high. So Americans with insurance (or covered by Medicare, the universal system for the elderly, or Medicaid, the government system for the poor) get the same as other developed world citizens, but those without get very poor care and die younger. A system like the NHS solves the problem, according to this view, with bulk buying of land, labour, and inputs, better incentives, and universal coverage.

But there are some serious flaws in this theory. Firstly, extending insurance to the previously uninsured doesn’t, in America, seem to have large benefits. For example, a recent NBER paper found no overall health gains from the massive insurance expansion under Obamacare.* A famous RAND study found minuscule benefits over decades from giving out free insurance to the previously uninsured in the 1970s. In fact, over and above the basics, insuring those who choose not to get insurance doesn’t ever seem to have large gains. Indeed, there is wide geographic variation in life expectancy among the low income in the US, but this doesn’t even correlate with access to medical care! This makes it unlikely that the gap between the US and the rest is explained by universality.

To find the answer, consider the two main ingredients that go into health outcomes. One is latent health, and the other is treatment. If latent health is the same across the Western world, we can presume that any differences in outcomes come from differences in treatment. But latent health is simply not the same. Obesity is far higher in the USA than in any other major developed country. Obviously it is a public health problem, but it’s unrealistic to blame it on the US system of paying for doctors, administrators, hospitals, equipment and drugs.

In fact, in the US case it’s not even obesity, or indeed their greater pre-existing disease burden, that is doing most of the work in dragging their life expectancy down; it’s accidental and violent deaths. It is tragic that the US is so dangerous, but it’s not the fault of the healthcare system; indeed, it’s an extra burden that US healthcare spending must bear. Simply normalising for violent and accidental death puts the USA right at the top of the life expectancy rankings.
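One concrete way to read “normalising for violent and accidental death” is a cause-deleted life table: subtract each age band’s share of deaths from external causes (accidents and violence), then recompute life expectancy from the adjusted mortality. The sketch below illustrates only the mechanics; the age bands, death probabilities, and external-cause shares are entirely hypothetical numbers, not real US data, and this is not necessarily the method the quoted article used.

```python
# Cause-deleted life expectancy: recompute survival after removing the
# share of deaths attributed to accidents and violence in each age band.
# All numbers below are hypothetical, chosen only to show the mechanics.

def life_expectancy(qx, band_width=10):
    """Life expectancy at birth from per-band death probabilities qx,
    assuming decedents live half the band on average."""
    e0, survivors = 0.0, 1.0
    for q in qx:
        deaths = survivors * q
        # survivors of the band live its full width; decedents live half
        e0 += (survivors - deaths) * band_width + deaths * band_width / 2
        survivors -= deaths
    return e0

# hypothetical probability of dying in each 10-year band (ages 0-9 ... 70-79;
# the final 1.0 closes the table at 80)
qx = [0.008, 0.004, 0.012, 0.015, 0.025, 0.060, 0.150, 1.0]
# hypothetical share of each band's deaths that are accidental or violent
external_share = [0.30, 0.45, 0.55, 0.40, 0.20, 0.08, 0.03, 0.0]

qx_deleted = [q * (1 - f) for q, f in zip(qx, external_share)]

print(f"all-cause life expectancy:     {life_expectancy(qx):.1f} years")
print(f"cause-deleted life expectancy: {life_expectancy(qx_deleted):.1f} years")
```

The gap between the two figures is the normalisation effect: external-cause deaths cluster at young ages, so each one removed restores many person-years, which is why accidents and violence drag a national life expectancy figure down so disproportionately.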

One of our cultural problems, Arnold Kling adds, is that we spend too much on health care and not enough on public health.

Take your shoes off at the door

April 11th, 2017

It turns out that taking your shoes off when you come inside doesn’t just keep the carpets cleaner. It’s also healthier:

Among samples collected in homes, 26.4% of shoe soles tested positive for C. diff, about three times the number found on the surfaces of bathrooms and kitchens.

And that’s just one bacterium. In an earlier investigation, Dr. Garey examined past studies to learn if “shoe soles are a vector for infectious pathogens.” The answer was a resounding yes.

Among the studies: Austrian researchers found in 2015 that at least 40% of shoes carried Listeria monocytogenes. And a 2014 German study found that over a quarter of boots used on farms carried E. coli.

“Essentially, when you wear your shoes in a house, you are bringing in everything you stepped in during the day,” says Jonathan Sexton, a laboratory manager at the Mel & Enid Zuckerman College of Public Health at the University of Arizona.

Wiping your feet on a welcome mat, however vigorously, provides only limited help, he says. “It will remove some of the dirt, but you have to think of the person who wiped their feet before. You might be picking up stuff they left behind.”

Some homeowners may worry that guests in socks or bare feet might also represent a health risk. That’s possible, Dr. Sexton says, but the inside of a shoe carries far fewer bacteria than the outside.

Both researchers agree that the risk is muted. “Shoes in the house are not something to freak out about,” Dr. Sexton says.

A Family Planning Miracle in Colorado

April 10th, 2017

So, what fraction of pregnancies are unintended?

Almost half of Colorado women who got pregnant in 2008 said that the pregnancy happened sooner than they wanted or that they hadn’t wanted to get pregnant at all. That was similar to the US average: the rate of unintended pregnancy has been stuck around 50 percent since the 1960s.

The Colorado Family Planning Initiative cut teen births and abortions almost in half:

Births and abortions also fell by nearly 20 percent among women aged 20-24. (Note: Under normal circumstances, over 80 percent of teen pregnancies and 70 percent of pregnancies among single women aged 20-29 are unsought, so this change means women’s realities are better matching their family desires.) Second-order births to teens — teens who gave birth a second or third time — dropped by 58 percent. High-risk births, including preterm births, also diminished.

Poor families benefited the most, because unsought pregnancy is four times as common and unsought birth seven times as common among poor women as among their more prosperous peers. With fewer families facing the dire circumstances triggered by an unexpected pregnancy or unplanned birth, the state saved $66-70 million in public assistance, according to a team of economists at the University of Colorado.

How did Colorado get such dramatic results? They provided “get it and forget it” forms of contraception, such as long-acting IUDs and implants.

This is seen as a great Progressive victory.

To Be a Genius, Think Like a 94-Year-Old

April 9th, 2017

To be a genius, think like a 94-year-old — more specifically, like Dr. John Goodenough:

In 1946, a 23-year-old Army veteran named John Goodenough headed to the University of Chicago with a dream of studying physics. When he arrived, a professor warned him that he was already too old to succeed in the field.

Recently, Dr. Goodenough recounted that story for me and then laughed uproariously. He ignored the professor’s advice and today, at 94, has just set the tech industry abuzz with his blazing creativity. He and his team at the University of Texas at Austin filed a patent application on a new kind of battery that, if it works as promised, would be so cheap, lightweight and safe that it would revolutionize electric cars and kill off petroleum-fueled vehicles. His announcement has caused a stir, in part, because Dr. Goodenough has done it before. In 1980, at age 57, he coinvented the lithium-ion battery that shrank power into a tiny package.

We tend to assume that creativity wanes with age. But Dr. Goodenough’s story suggests that some people actually become more creative as they grow older. Unfortunately, those late-blooming geniuses have to contend with powerful biases against them.

[...]

On the contrary, there’s plenty of evidence to suggest that late blooming is no anomaly. A 2016 Information Technology and Innovation Foundation study found that inventors peak in their late 40s and tend to be highly productive in the last half of their careers. Similarly, professors at the Georgia Institute of Technology and Hitotsubashi University in Japan, who studied data about patent holders, found that, in the United States, the average inventor sends in his or her application to the patent office at age 47, and that the highest-value patents often come from the oldest inventors — those over the age of 55.

[...]

Years ago, he decided to create a solid battery that would be safer. Of course, in a perfect world, the “solid-state” battery would also be low-cost and lightweight. Then, two years ago, he discovered the work of Maria Helena Braga, a Portuguese physicist who, with the help of a colleague, had created a kind of glass that can replace liquid electrolytes inside batteries.

Dr. Goodenough persuaded Dr. Braga to move to Austin and join his lab. “We did some experiments to make sure the glass was dry. Then we were off to the races,” he said.

Some of his colleagues were dubious that he could pull it off. But Dr. Goodenough was not dissuaded. “I’m old enough to know you can’t close your mind to new ideas. You have to test out every possibility if you want something new.”

When I asked him about his late-life success, he said: “Some of us are turtles; we crawl and struggle along, and we haven’t maybe figured it out by the time we’re 30. But the turtles have to keep on walking.” This crawl through life can be advantageous, he pointed out, particularly if you meander around through different fields, picking up clues as you go along. Dr. Goodenough started in physics and hopped sideways into chemistry and materials science, while also keeping his eye on the social and political trends that could drive a green economy. “You have to draw on a fair amount of experience in order to be able to put ideas together,” he said.

He also credits his faith for keeping him focused on his mission to defeat pollution and ditch petroleum. On the wall of his lab, a tapestry of the Last Supper depicts the apostles in fervent conversation, like scientists at a conference arguing over a controversial theory. The tapestry reminds him of the divine power that fuels his mind. “I’m grateful for the doors that have been opened to me in different periods of my life,” he said. He believes the glass battery was just another example of the happy accidents that have come his way: “At just the right moment, when I was looking for something, it walked in the door.”

Last but not least, he credited old age with bringing him a new kind of intellectual freedom. At 94, he said, “You no longer worry about keeping your job.”

Short- and Long-Term Memories

April 8th, 2017

All memories start as a short-term memory and then slowly convert into a long-term memory — or so we thought:

Two parts of the brain are heavily involved in remembering our personal experiences.

The hippocampus is the place for short-term memories while the cortex is home to long-term memories.

This idea became famous after the case of Henry Molaison in the 1950s.

His hippocampus was damaged during epilepsy surgery and he was no longer able to make new memories, but his memories from before the operation were still there.

So the prevailing idea was that memories are formed in the hippocampus and then moved to the cortex where they are “banked”.

[...]

The results, published in the journal Science, showed that memories were formed simultaneously in the hippocampus and the cortex.

[...]

The mice do not seem to use the cortex’s long-term memory in the first few days after it is formed.

They forgot the shock event when scientists turned off the short-term memory in the hippocampus.

However, they could then make the mice remember by manually switching the long-term memory on (so it was definitely there).

“It is immature or silent for the first several days after formation,” Prof Tonegawa said.

The researchers also showed the long-term memory never matured if the connection between the hippocampus and the cortex was blocked.

So there is still a link between the two parts of the brain, with the balance of power shifting from the hippocampus to the cortex over time.

Should America have entered World War I?

April 7th, 2017

The US entered the Great War 100 years ago, but why?

The war lasted only another year and a half, but in that time, an astounding 117,000 American soldiers were killed and 202,000 wounded.

[...]

America intervened nearly three years after it began, and the “doughboys,” as our troops were called, engaged in serious combat for only a few months. More Americans in uniform died away from the battlefield — thousands from the Spanish flu — than with weapons in hand. After victory was achieved, Wilson’s audacious hope of making a peace that would advance democracy and national self-determination blew up in his face when the Senate refused to ratify the treaty he had signed at the Palace of Versailles.

But attention should be paid. America’s decision to join the Allies was a turning point in world history. It altered the fortunes of the war and the course of the 20th century — and not necessarily for the better. Its entry most likely foreclosed the possibility of a negotiated peace among belligerent powers that were exhausted from years mired in trench warfare.

Although the American Expeditionary Force did not engage in combat for long, the looming threat of several million fresh troops led German generals to launch a last, desperate series of offensives. When that campaign collapsed, Germany’s defeat was inevitable.

How would the war have ended if America had not intervened? The carnage might have continued for another year or two until citizens in the warring nations, who were already protesting the endless sacrifices required, forced their leaders to reach a settlement. If the Allies, led by France and Britain, had not won a total victory, there would have been no punitive peace treaty like that completed at Versailles, no stab-in-the-back allegations by resentful Germans, and thus no rise, much less triumph, of Hitler and the Nazis. The next world war, with its 50 million deaths, would probably not have occurred.

The pacifists failed:

Since the war began, feminists and socialists had worked closely with progressive members of Congress from the agrarian South and the urban Midwest to keep America out. They mounted street demonstrations, attracted prominent leaders from the labor and suffrage movements, and ran antiwar candidates for local and federal office. They also gained the support of Henry Ford, who chartered a ship full of activists who crossed the Atlantic to plead with the heads of neutral nations to broker a peace settlement.

They may even have had a majority of Americans on their side. In the final weeks before Congress declared war, anti-militarists demanded a national referendum on the question, confident voters would recoil from fighting and paying the bills so that one group of European powers could vanquish another.

Once the United States did enter the fray, Wilson, with the aid of the courts, prosecuted opponents of the war who refused to fall in line. Under the Espionage and Sedition Acts, thousands were arrested for such “crimes” as giving speeches against the draft and calling the Army “a God damned legalized murder machine.”

The intervention led to big changes in America, as well as the world. It began the creation of a political order most citizens now take for granted, even as some protest against it: a state equipped to fight war after war abroad while keeping a close watch on allegedly subversive activities at home.

To make the world “safe for democracy” required another innovation: a military-industrial establishment funded, then partly and now completely, by income taxes.

The Pentagon Is Making The CIA And State Department Obsolete

April 6th, 2017

The Pentagon is making the CIA and State Department obsolete, Ryan Landry argues:

The State Department and CIA have been anxious since election night, not just because of the threat of a change in policy, but because these agencies now face, on some level, an existential threat.

As Trump’s cabinet was assembled in late fall, the nervousness grew as Blue Empire realized how many generals and former generals would serve in high-ranking positions. Unbeknownst to most Americans, the fear is rooted in change within Red Empire. The Pentagon’s growth into a one-stop shop for policy formulation and implementation threatens Blue’s existence or, at a minimum, its influence.

Since the planes flew into the Twin Towers, the Department of Defense has been in a steady growth mode not strictly limited to increases in spending and budgets. The very scope and nature of the DOD has changed. America’s wars of choice, the imperial wars after World War II, have forced the Pentagon to send a first-world military into regions with zero infrastructure. The U.S. military had to change, finding ways to support such a force in undeveloped areas from scratch. Reports by American media outlets repeat the same phrase over and over: the Pentagon has become a one-stop shop for the presidency to seek advice, consider policies, and execute plans. For all the recent talk of a Deep State, the Pentagon has grown into a fully operational imperium in imperio.

If the president inquires, the Pentagon offers a soup-to-nuts service for policy that extends beyond warmaking. As one of the largest users of oil and oil products, the Pentagon needed to focus on securing oil, and so energy policy advisers and researchers sprang forth. The Pentagon has a budget and a literal army of individuals to enact policy with a speed that other departments can only dream of. The need to defend America’s computer networks has had the secondary consequence of creating a private, ex-military network of IT contractors and systems experts.

Mission creep has provoked open reactions from rivals at State and CIA. The CIA’s fear of former national security adviser Michael Flynn was not due to his supposed Russia connections, but rather to his approach to intelligence. Flynn voiced criticism of the CIA’s approach, and he built up the Defense Intelligence Agency to integrate intelligence units and analysis directly with Army units for faster, more efficient operations. The CIA was out to get Flynn as part of a turf battle and a legitimate fear of reform, or at least of a reorientation of who is used when and where in the empire.

Defense has also spent a generation pushing its officers to further their education. This is not simple credentialism, but the creation of a corps of officers that can rival their mirror images at State.

One-handed zipping

April 5th, 2017

A couple of years ago, Under Armour introduced an ingenious new zipper design created by engineer Scott Peters:

Although the fastening still relies on the interlocking of two bands of metal teeth, the clasps at the bottom have received a thoughtful re-design. The motivation for Peters, he says, was watching his uncle, who suffers from myotonic dystrophy, struggle to engage the conventional clasps. The solution is the inclusion of magnets and a unique catch, so that the two halves automatically align with one another and the zipper can even be done up one-handed.

MagZip

More on the MagZip’s development:

The eureka moment of a magnetic zipper was crucial. But the exact millimeter grooves making the process practical would require painstaking nuance.

“Magnets in and of themselves won’t work. They’ll drive components together, but you have issues of alignment, issues of holding things together without popping out – and pulling them apart can be a nightmare,” Peters explains. “We had to figure out the combination of mechanical design so it self-aligns and easily locks itself in place, enabling you to zip with one hand.”

“We started rapid prototyping, getting parts machined, and testing. We’d make a part, assemble it, and glue it on a zipper to find out what worked and didn’t work. I had one part that actually broke, and when this had broken, it kind of showed me the way... we were able to evolve the design to where it is today, a more open hook-and-catch.”

The kingdom of women

April 4th, 2017

Choo Waihong grew up in Singapore before training and working as a corporate lawyer in Canada, the US, and London. She felt drawn back to China, and there she stumbled across the kingdom of women, a series of villages dotted around a mountain and Lugu Lake, where a Tibetan tribe called the Mosuo still practices its matriarchal ways:

As an unmarried woman in a community where marriage is non-existent, Waihong felt at home.

“All Mosuo women are, essentially, single,” she says. “But I think I’m seen as an oddity because I’m not from here, and I live alone, rather than with a family. I get a lot of dinner invitations, and my friends are always egging me on to find a nice Mosuo lover.” Has she? “That would be telling.”

With life centred on the maternal family, motherhood is, unsurprisingly, revered. For a young Mosuo woman, it is life’s goal. “I’ve had to advise many young women on ovulation, so keen are they to get pregnant,” she says. “You are seen as complete once you become a mother.” In this respect, Waihong, who doesn’t have children, is regarded more keenly. “My sense is that I’m pitied,” she says, “but people are too polite to tell me.”

Why Japan’s Rail Workers Can’t Stop Pointing at Things

April 3rd, 2017

It is hard to miss when taking the train in Japan:

White-gloved employees in crisp uniforms pointing smartly down the platform and calling out — seemingly to no one — as trains glide in and out of the station. Onboard is much the same, with drivers and conductors performing almost ritual-like movements as they tend to an array of dials, buttons and screens.

Shisa kanko on the Shinkansen in Kyoto Station

While these might strike visitors as silly, the movements and shouts are a Japanese-innovated industrial safety method known as pointing-and-calling, a system that reduces workplace errors by up to 85 percent.

Known in Japanese as shisa kanko, pointing-and-calling works on the principle of associating one’s tasks with physical movements and vocalizations to prevent errors by “raising the consciousness levels of workers,” according to the National Institute of Occupational Safety and Health, Japan. Rather than rely on a worker’s eyes or habit alone, each step in a given task is reinforced physically and audibly to ensure the step is both complete and accurate.

Charles Murray’s SPLC page as edited by Charles Murray

April 2nd, 2017

Charles Murray admits that he had some fun editing the SPLC’s page about him:

My self-imposed ground rules are that I can’t delete accurate quotes from my work that I wish I had worded more felicitously, but I am permitted to extend quotes with material that immediately adjoins the quoted text, to correct factual mistakes, and to make suggestions to the author, as copy editors routinely do.

If Andre Agassi’s dad could do everything all over again

April 1st, 2017

I don’t follow tennis, so I didn’t realize that Andre Agassi’s dad was a former Olympic boxer from Iran:

“When people didn’t have my nuanced take on him they just represented him as abusive. But my dad was clear. He said: ‘Andre, I know how I’ve lived and I know who I am and who I’m not. If I could do everything all over again I would change only one thing – I wouldn’t let you play tennis.’ I’d pulled the car over when he said: ‘I would only change one thing.’ I said, ‘Wow, why’s that Dad?’ He said: ‘Because I’d make you play baseball or golf so you can do it longer and make more money.’ I got back on the freeway with a chuckle.”