Student-Athletes

October 22nd, 2014

I am shocked — shocked! — to find cheating going on at UNC!

A blistering report into an academic fraud scandal at the University of North Carolina released Wednesday found that for nearly two decades two employees in the African and Afro-American Studies department ran a “shadow curriculum” of hundreds of fake classes that never met but for which students, many of them Tar Heels athletes, routinely received A’s and B’s.

Nearly half the students in the classes were athletes, the report found, often deliberately steered there by academic counselors to bolster their worrisomely low grade-point averages and to allow them to continue playing on North Carolina’s teams.

I’m so glad we’ve ferreted out this one isolated program, and America’s student-athletes can continue their long tradition of academic excellence.

The Greatness of George Orwell

October 22nd, 2014

Bruce Charlton discusses the greatness of George Orwell — and his fatal flaw:

My generation was fed Orwell at school from our mid-teens — some of the essays such as Shooting an Elephant and Boys’ Weeklies; excerpts from the documentary books such as Down and Out… and …Wigan Pier; and the two late political novels Animal Farm and 1984.

That Orwell was mostly correct about things was not really argued, but assumed; on the basis that he seemed obviously correct to almost everybody; so far as the English were concerned, Orwell was simply expressing the national character better than we ourselves could have done.

Orwell was claimed both by the Left — on the basis that he was explicitly a socialist through most of his life; and he was claimed by the Right — on the basis that his two best known novels are anti-communist warnings against totalitarianism.

In sum: Orwell’s influence was much as any writer reasonably could have hoped for. And his warnings about the dangers of Leftism and the operations of totalitarianism were as lucid, as explicit, and as forceful as any writer could have made them.

And yet Britain today is an ‘Orwellian’ society to a degree which would have seemed incredible even 25 years ago. The same applies to the USA, where Orwell was also revered.

In particular, the exact types of abuses, manipulations and distortions of language which Orwell spelled-out in fiery capital letters 100 feet high have come to pass; have become routine and unremarked — and they are wholly-successful, barely-noticed, stoutly-defended — and to point them out is regarded either as trivial nitpicking or evasive rhetoric.

The current manifestations of the sexual revolution, deploying the most crudely Orwellian appropriations and taboos of terminology, go further than even Orwell envisaged. The notion that sexual differences could so easily be subverted, and their evaluations so swiftly reversed, apparently at will and without any apparent limit, would — I think — have gone beyond the possibilities Orwell could have realistically imagined.

(Indeed, it is characteristic of the Kafka-esque absurdity of modern Western life that a plain description of everyday reality — say in a state bureaucracy, the mass media or university — is simply disbelieved, it ‘does not compute’ and is rejected by the mind. And by this, nihilistic absurdity is safeguarded.)

I think Orwell would never have believed that people would accept, en masse, and so readily go along with (willingly embrace and enforce, indeed), the negative relabelling of normal biological reality, and the substitution of arbitrary and rapidly changing inverted norms: for Orwell, The Proles were sexually normal, like animals, and would continue so. The elites, whatever their personal preferences and practices, left them alone in this — presumably because sexuality was seen as a kind of bedrock.

And this leads to Orwell’s fatal flaw — which was exactly sexuality.

Projected Recoilless Improvised Grenade

October 22nd, 2014

The Projected Recoilless Improvised Grenade (PRIG) was a shoulder-fired weapon developed by the Provisional Irish Republican Army (PIRA) for use against lightly armored vehicles:

The launcher consisted of a length of steel tube adapted to accept a charge of black powder in the middle by way of a capped-off perforated pipe welded in place. The charge was wired to a simple circuit, often utilizing a light bulb holder as an arming switch, and fired by a long-arm micro-switch serving as a trigger.

PRIG Diagram

The warhead itself consisted of a standard food tin filled with 600 g of Semtex, complete with a frontal explosive lens to create an armor-piercing shaped charge. The round was designed to explode on impact, being an adaptation of an earlier improvised stick grenade known as the ‘drogue bomb’, which was sometimes fitted with a trash bag to act as a guide parachute.

PRIG Round

To the rear of the launcher was placed the ‘counter-shot’, incorporated to exploit the recoilless principle (reducing recoil to as little as a .22 LR rifle’s, according to some!). This consisted of two packets of digestive tea biscuits, wrapped in J-cloth.

PRIG Counter-Shot Digestive Biscuits

Transportation, Divergence, and the Industrial Revolution

October 21st, 2014

Nick Szabo explores transportation, divergence, and the Industrial Revolution:

After about 1000 AD northwestern Europe started a gradual switch from using oxen to using horses for farm traction and transportation. This trend culminated in an eighteenth-century explosion in roads carrying horse-drawn carriages and wagons, as well as in canals, and works greatly extending the navigability of rivers, both carrying horse-drawn barges. This reflected a great rise in the use of cultivated fodder, a hallmark of the novel agricultural system that was evolving in northwestern Europe from the start of the second millennium: stationary pastoralism. During the same period, and especially in the seventeenth through nineteenth centuries, most of civilized East Asia, and in particular Chinese civilization along its coast, navigable rivers, and canals, faced increasing Malthusian pressures and evolved in the opposite direction: from oxen towards far more costly and limited human porters. Through the early middle ages China had been far ahead, in terms of division of labor and technology, of the roving bandits of northern Europe, but after the latter region’s transition to stationary pastoralism that gap closed and Europe surged ahead, a growth divergence that culminated in the industrial revolution. In eighteenth-century Europe, and thus in the early industrial revolution, muscle power was the engine of land transportation, and hay was its gasoline.

Metcalfe’s Law states that the value of a network is proportional to the square of the number of its nodes. In an area where good soils, mines, and forests are randomly distributed, the number of nodes valuable to an industrial economy is proportional to the area encompassed. The number of such nodes that can be economically accessed is proportional to the inverse square of the cost per mile of transportation. Combine this with Metcalfe’s Law and we reach a dramatic but solid mathematical conclusion: the potential value of a land transportation network scales as the inverse fourth power of the cost of that transportation. A reduction in transportation costs in a trade network by a factor of two increases the potential value of that network by a factor of sixteen. While a power of exactly 4.0 will usually be too high, due to redundancies, this does show how the cost of transportation can have a radical nonlinear impact on the value of the trade networks it enables. This formalizes Adam Smith’s observations: the division of labor (and thus the value of an economy) increases with the extent of the market, and the extent of the market is heavily influenced by transportation costs (as he extensively discussed in his Wealth of Nations).
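Szabo’s back-of-the-envelope scaling is easy to verify numerically. A minimal sketch (my own illustration, with an arbitrary constant of proportionality — not code from the original post):

```python
def reachable_nodes(cost_per_mile, k=1.0):
    """Nodes economically reachable scale as the inverse square of
    transportation cost (k is an arbitrary constant of proportionality)."""
    return k / cost_per_mile ** 2

def network_value(cost_per_mile, k=1.0):
    """Metcalfe's Law: value ~ (number of nodes)^2, so value ~ 1/cost^4."""
    return reachable_nodes(cost_per_mile, k) ** 2

# Halving transportation cost multiplies the potential value by 2^4 = 16.
print(network_value(0.5) / network_value(1.0))  # 16.0
```

As the passage notes, an exponent of exactly 4.0 overstates the effect in practice, but the nonlinearity survives any reasonable correction.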

Q.E.D.

October 21st, 2014

American political and social life today is pretty much one great big Q.E.D. for the two main theses of The Bell Curve, Charles Murray argues:

Those theses were, first, that changes in the economy over the course of the 20th century had made brains much more valuable in the job market; second, that from the 1950s onward, colleges had become much more efficient in finding cognitive talent wherever it was and shipping that talent off to the best colleges. We then documented all the ways in which cognitive ability is associated with important outcomes in life — everything from employment to crime to family structure to parenting styles. Put those all together, we said, and we’re looking at some serious problems down the road.

Gian-Carlo Rota’s Ten Lessons

October 21st, 2014

Gian-Carlo Rota of MIT shares ten lessons he wishes he had been taught:

  1. Lecturing
  2. Blackboard Technique
  3. Publish the same results several times.
  4. You are more likely to be remembered by your expository work.
  5. Every mathematician has only a few tricks.
  6. Do not worry about your mistakes.
  7. Use the Feynman method.
  8. Give lavish acknowledgments.
  9. Write informative introductions.
  10. Be prepared for old age.

His lesson on lecturing:

The following four requirements of a good lecture do not seem to be altogether obvious, judging from the mathematics lectures I have been listening to for the past forty-six years.

Every lecture should make only one main point
The German philosopher G. W. F. Hegel wrote that any philosopher who uses the word “and” too often cannot be a good philosopher. I think he was right, at least insofar as lecturing goes. Every lecture should state one main point and repeat it over and over, like a theme with variations. An audience is like a herd of cows, moving slowly in the direction they are being driven towards. If we make one point, we have a good chance that the audience will take the right direction; if we make several points, then the cows will scatter all over the field. The audience will lose interest and everyone will go back to the thoughts they interrupted in order to come to our lecture.

Never run overtime
Running overtime is the one unforgivable error a lecturer can make. After fifty minutes (one microcentury as von Neumann used to say) everybody’s attention will turn elsewhere even if we are trying to prove the Riemann hypothesis. One minute overtime can destroy the best of lectures.

Relate to your audience
As you enter the lecture hall, try to spot someone in the audience with whose work you have some familiarity. Quickly rearrange your presentation so as to manage to mention some of that person’s work. In this way, you will guarantee that at least one person will follow with rapt attention, and you will make a friend to boot.

Everyone in the audience has come to listen to your lecture with the secret hope of hearing their work mentioned.

Give them something to take home
It is not easy to follow Professor Struik’s advice. It is easier to state what features of a lecture the audience will always remember, and the answer is not pretty. I often meet, in airports, in the street and occasionally in embarrassing situations, MIT alumni who have taken one or more courses from me. Most of the time they admit that they have forgotten the subject of the course, and all the mathematics I thought I had taught them. However, they will gladly recall some joke, some anecdote, some quirk, some side remark, or some mistake I made.

How Palmer Luckey Created Oculus Rift

October 20th, 2014

If there is a case to be made that unconventional schooling, without busywork or fixed schedules, helps unleash creativity, Palmer Luckey, creator of the Oculus Rift, might well be Exhibit A for the prosecution:

His mother, Julie, home-schooled all four of her children during a period of each of their childhoods (Luckey’s father, Donald, is a car salesman), but Palmer was the only one of the kids who never went back; he liked the flexibility too much. In his ample free time, he devoted most of his considerable energy to teaching himself how to build electronics from scratch.

No one else in Luckey’s family was especially interested in technology, but his parents were happy to give over half of the garage at their Long Beach, California, home to his experiments. There, Luckey quickly progressed from making small electronics to “high-voltage stuff” like lasers and electromagnetic coilguns. Inevitably, there were mishaps. While working on a live Tesla coil, Luckey once accidentally touched a grounded metal bed frame, and blew himself across the garage; another time, while cleaning an infrared laser, he burned a gray spot into his vision.

When Luckey was 15, he started “modding” video game equipment: taking consoles like the Nintendo GameCube, disassembling them, and modifying them with newer parts, to transform them into compact, efficient and hand-crafted devices. “Modding was more interesting than just building things entirely using new technologies,” Luckey told me. “It was this very special type of engineering that required deeply understanding why people had made the decisions they made in designing the hardware.”

Luckey soon became obsessed with PC gaming. How well, he wondered, could he play games? “Not skill level,” he clarified to me, “but how good could the experience be?” By this time, Luckey was making good money fixing broken iPhones, and he spent most of it on high-end gaming equipment in order to make the experience as immersive as possible. At one point, his standard gaming setup consisted of a mind-boggling six-monitor arrangement. “It was so sick,” he recalled.

But it wasn’t enough. Luckey didn’t just want to play on expensive screens; he wanted to jump inside the game itself. He knew the military sometimes trained soldiers using virtual reality headsets, so he set out to buy some — on the cheap, through government auctions. “You’d read that these VR systems originally cost hundreds of thousands of dollars, and you thought, clearly if they’re that expensive, they must be really good,” Luckey said. Instead, they fell miles short of his hopes. The field of view on one headset might be so narrow that he’d feel as if he was looking through a half-opened door. Another might weigh ten pounds, or have preposterously long lag between his head moving and the image reacting onscreen — a feature common to early VR that literally makes users nauseated.

So Luckey decided to do what he’d been doing for years with game consoles: He’d take the technology apart, figure out where it was falling short and modify it with new parts to improve it. Very quickly, he realized that this wasn’t going to be simple. “It turned out that a lot of the approaches the old systems were taking were dead ends,” he said.

The problem was one of fundamental design philosophy. In order to create the illusion of a three-dimensional digital world from a single flat screen, VR manufacturers had typically used complex optical apparatuses that magnified the onscreen image to fill the user’s visual field while also correcting for any distortion. Because these optics had to perform a variety of elaborate tricks to make the magnified image seem clear, they were extremely heavy and costly to produce.

Luckey’s solution to this dilemma was ingeniously simple. Why use bulky, expensive optics, he thought, when he could put in cheap, lightweight lenses and then use software to distort the image, so that it came out clear through them? Plus, he quickly realized that he could combine these lenses with screens from mobile phones, which the smartphone arms race had made bigger, crisper and less expensive than ever before. “That let me make something that was a lot lighter and cheaper, with a much wider field of view, than anything else out there,” he said.
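The software trick Luckey describes — pre-distorting the rendered frame so that a cheap lens’s own distortion cancels it out — can be sketched with a simple radial model. This is my own illustration, not Oculus code; real headset warps use more elaborate polynomials and per-color-channel correction:

```python
def barrel_predistort(x, y, k1=0.22):
    """For an output pixel at normalized, lens-centered coordinates (x, y),
    return where to sample the rendered frame. Sampling further from the
    center compresses the displayed image toward the middle (barrel
    distortion), which the lens's pincushion distortion then stretches
    back out. k1 is an illustrative coefficient, not a real lens spec."""
    r2 = x * x + y * y          # squared radius from the lens center
    scale = 1.0 + k1 * r2       # simple one-term radial polynomial
    return x * scale, y * scale

# The center is untouched; points near the edge sample further out.
print(barrel_predistort(0.0, 0.0))  # (0.0, 0.0)
```

The design point is that this warp is a few lines of shader-style arithmetic per pixel, which is why it could displace heavy, costly corrective optics.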

From 2009 to 2012, while also taking college classes and working at the University of Southern California’s VR-focused Institute for Creative Technologies, Luckey poured countless hours into creating a working prototype from this core vision. He tinkered with different screens, mixed and matched parts from his collection of VR hardware, and refined the motion tracking equipment, which monitored the user’s head movements in real-time. Amazingly, considering the eventual value of his invention, Luckey was also posting detailed reports about his work to a 3-D gaming message board. The idea was sitting there for anyone to steal.

But, as Brendan Iribe put it to me, “Maybe his name is Luckey for a reason.” By that point, no one was interested in throwing more money away on another doomed virtual reality project.

Then, in early 2012, luck struck again when the legendary video game programmer John Carmack stumbled onto his work online and asked Luckey if he could buy one of his prototypes. Luckey sent him one for free. “I played it super cool,” he assured me. Carmack returned the favor in a big way: At that June’s E3 convention — the game industry’s gigantic annual commercial carnival — he showed off the Rift prototype to a flock of journalists, using a repurposed version of his hit game “Doom 3” for the demonstration. The response was immediate and ecstatic. “I was in Boston at a display conference at the time,” Luckey said, “and people there were like, ‘Dude, Palmer, everyone’s writing articles about your thing!’”

The rest, as they say, is virtual history: Over the next 21 months, Luckey partnered with Iribe, Antonov and Mitchell, launched a Kickstarter campaign that netted $2.4 million in funding — nearly ten times its initial goal — and joined the Facebook empire, thereby ensuring the company the kind of financial backing that most early-stage tech companies can only dream of.

The Oculus Rift is now entering its final stages of development — it’s slated for commercial release next year — and this fall Samsung will release a scaled-down product for developers and enthusiasts, powered by Oculus technology, that will clip over the company’s Galaxy Note 4 smartphone. But Luckey knows that success is by no means assured. “To this point, there has never been a successful commercial VR product, ever,” Luckey told me. “Nobody’s actually managed to pull this off.” Spend a few minutes inside the Rift, though, and one can’t help but believe that Luckey will be the one to do it.

Apple’s Next Big Imitative Leap

October 20th, 2014

Apple is just buying time until its next big imitative leap:

Samsung debuted its first, much-maligned and hugely successful Galaxy Note — the first phone with a bigger-than-5-inch screen — in September 2011. For two years afterward, Apple was content to present incremental improvements to the iPhone. Compared with the iPhone 5, the iPhone 5s added just a fingerprint sensor and an improved camera (plus a few other features that most consumers didn’t care about).

Meanwhile Apple carefully observed the “phablet” market, watched other handset makers follow Samsung’s example and erode its market share, and experimented with ways to make a big phone easier to navigate one-handed. It struck just when Samsung started posting lower profits, because of the increased competitive pressure.

It was a perfectly timed attack and, after setting a first-weekend record — 10 million iPhones sold — the iPhone 6 and iPhone 6 Plus are continuing their rampage. Apple chief executive Tim Cook said yesterday that the first sales month for the two new phones was the company’s best ever “by a lot. A whole lot.”

The iPad Air 2’s most important improvements on last year’s device are, again, a fingerprint sensor and a better camera. As with the iPhone 5s in 2013, it may appear as if Apple is stuck in a rut of timid, incremental innovation. My bet, however, is that it’s watching another innovator collect bumps, get bad reviews, then get things right. Once that innovator’s success is assured, Apple will pounce.

This time it isn’t a Samsung product Apple is watching, but Microsoft’s Surface Pro.

Microsoft hit on the idea of producing a tablet-laptop cross in 2012, incurring losses and writing off inventory as it refined the concept. This year, it finally produced a device that reviewers liked — the Surface Pro 3. It’s reasonably convincing both as a laptop and as a tablet, albeit a large and heavy one. Microsoft has not released numbers, saying only that the Pro 3 was its fastest-selling tablet yet — the company underestimated demand, creating shortages in some markets.

The analysis company Gartner puts the Surface Pro in the same category — “premium ultra-mobile” computers — as Apple’s MacBook Air laptops.

How To Think Real Good

October 20th, 2014

After compiling How to do research at the MIT AI Lab, David Chapman went on to write How To Think Real Good, a rather meandering piece that culminates in this list:

  • Figuring stuff out is way hard.
  • There is no general method.
  • Selecting and formulating problems is as important as solving them; these each require different cognitive skills.
  • Problem formulation (vocabulary selection) requires careful, non-formal observation of the real world.
  • A good problem formulation includes the relevant distinctions, and abstracts away irrelevant ones. This makes problem solution easy.
  • Little formal tricks (like Bayesian statistics) may be useful, but any one of them is only a tiny part of what you need.
  • Progress usually requires applying several methods. Learn as many different ones as possible.
  • Meta-level knowledge of how a field works — which methods to apply to which sorts of problems, and how and why — is critical (and harder to get).

I didn’t find that list as interesting as his pull-out points along the way:

  • Understanding informal reasoning is probably more important than understanding technical methods.
  • Finding a good formulation for a problem is often most of the work of solving it.
  • Before applying any technical method, you have to already have a pretty good idea of what the form of the answer will be.
  • Choosing a good vocabulary, at the right level of description, is usually key to understanding.
  • Truth does not apply to problem formulations; what matters is usefulness.
  • All problem formulations are “false,” because they abstract away details of reality.
  • Work through several specific examples before trying to solve the general case. Looking at specific real-world details often gives an intuitive sense for what the relevant distinctions are.
  • Problem formulation and problem solution are mutually-recursive processes.
  • Heuristics for evaluating progress are critical not only during problem solving, but also during problem formulation.
  • Solve a simplified version of the problem first. If you can’t do even that, you’re in trouble.
  • If you are having a hard time, make sure you aren’t trying to solve an NP-complete problem. If you are, go back and look for additional sources of constraint in the real-world domain.
  • You can never know enough mathematics.
  • An education in math is a better preparation for a career in intellectual field X than an education in X.
  • You should learn as many different kinds of math as possible. It’s difficult to predict what sort will be relevant to a problem.
  • If a problem seems too hard, the formulation is probably wrong. Drop your formal problem statement, go back to reality, and observe what is going on.
  • Learn from fields very different from your own. They each have ways of thinking that can be useful at surprising times. Just learning to think like an anthropologist, a psychologist, and a philosopher will beneficially stretch your mind.
  • If all you have is a hammer, everything looks like an anvil. If you only know one formal method of reasoning, you’ll try to apply it in places it doesn’t work.
  • Evaluate the prospects for your field frequently. Be prepared to switch if it looks like it is approaching its inherent end-point.
  • It’s more important to know what a branch of math is about than to know the details. You can look those up, if you realize that you need them.
  • Get a superficial understanding of as many kinds of math as possible. That can be enough that you will recognize when one applies, even if you don’t know how to use it.
  • Math only has to be “correct” enough to get the job done.
  • You should be able to prove theorems and you should harbor doubts about whether theorems prove anything.
  • Try to figure out how people smarter than you think.
  • Figure out what your own cognitive style is. Embrace and develop it as your secret weapon; but try to learn and appreciate other styles as well.
  • Collect your bag of tricks.
  • Find a teacher who is willing to go meta and explain how a field works, instead of lecturing you on its subject matter.

Altucher on Personal Finance

October 19th, 2014

James Altucher provides a “real education” on personal finance:

A) Don’t save money. Make more. If you think this is not so easy then remember: whatever direction you are walking in, eventually you get there.

B) That said, don’t spend money on the BIGGEST expenses in life. House and college (and kids and marriage but, of course, there are exceptions there). Just saving on these two things alone is worth over a million dollars in your bank account.

C) But doesn’t renting flush money down the toilet? No, it doesn’t. Do the math. You can argue all you want but the math is very clear as long as you are not lying to yourself.

D) Haven’t studies shown that college graduates make more money 20 years later?

No, studies have not shown that. They show correlation but not causation and they don’t take into account multi-collinearity (it could be that the children of middle class families have higher paying jobs later and, oh by the way, these children also go to college).

E) Don’t invest in anything that you can’t directly control every aspect of. In other words…yourself.

In other words:

  1. You can’t make or save money from a salary. And salaries have been going down versus inflation for 40 years. So don’t count on a salary. You’re 20, please take this advice alone if you take any advice at all.
  2. Investing is a tax on the middle class. There are at least 5 levels of fees stripped out of your hard-earned cash before your money touches an investment.

F) If you want to make money you have to learn the following skills. None of these skills are taught in college.

I’m not saying college is awful or about money, etc. I’m just saying that the only skills needed to make money will never be learned in college:

  • how to sell (both in a presentation and via copywriting)
  • how to negotiate (which means win-win, not war)
  • creativity (take out a pad, write down a list of ideas, every day)
  • leadership (give more to others than you expect back for yourself)
  • networking (a corollary of leadership)
  • how to live by themes instead of goals (goals will break your heart)
  • reinvention (which will happen repeatedly throughout a life)
  • idea sex (get good at coming up with ideas. Then combine them. Master the intersection)
  • the 1% rule (every week try to get better 1% physically, emotionally, mentally)
  • “the google rule” – always send people to the best resource, even if it’s a competitor. The benefit to you comes back tenfold.
  • give constantly to the people in your network. The value of your network increases linearly if you get to know more people, but EXPONENTIALLY if the people you know get to know and help each other.
  • how to fail so that a failure turns into a beginning
  • simple tools to increase productivity
  • how to master a field. You can’t learn this in school with each “field” being regimented into equal 50 minute periods. Mastery begins when formal education ends. Find the topic that sets your heart on fire. Then combust.
  • stopping the noise: news, advice books, fees upon fees in almost every area of life. Create your own noise instead of falling in line with the others.
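Altucher’s “1% rule” compounds faster than it sounds; a quick check of the arithmetic (mine, not his):

```python
# Improving 1% every week compounds multiplicatively over a year.
weekly_improvement = 1.01
yearly_growth = weekly_improvement ** 52
print(round(yearly_growth, 2))  # 1.68 — about 68% better after a year
```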

If you do all this you will gradually make more and more money and help more and more people. At least, I’ve seen it happen for me and for others.

How to see into the future

October 18th, 2014

So, what is the secret of looking into the future?

Initial results from the Good Judgment Project suggest the following approaches. First, some basic training in probabilistic reasoning helps to produce better forecasts. Second, teams of good forecasters produce better results than good forecasters working alone. Third, actively open-minded people prosper as forecasters.

But the Good Judgment Project also hints at why so many experts are such terrible forecasters. It’s not so much that they lack training, teamwork and open-mindedness — although some of these qualities are in shorter supply than others. It’s that most forecasters aren’t actually seriously and single-mindedly trying to see into the future. If they were, they’d keep score and try to improve their predictions based on past errors. They don’t.

This is because our predictions are about the future only in the most superficial way. They are really advertisements, conversation pieces, declarations of tribal loyalty — or, as with Irving Fisher, statements of profound conviction about the logical structure of the world.

Some participants in the Good Judgment Project were given advice, a few pages in total, which was summarised with the acronym CHAMP:

  • Comparisons are important: use relevant comparisons as a starting point;
  • Historical trends can help: look at history unless you have a strong reason to expect change;
  • Average opinions: experts disagree, so find out what they think and pick a midpoint;
  • Mathematical models: when model-based predictions are available, you should take them into account;
  • Predictable biases exist and can be allowed for. Don’t let your hopes influence your forecasts, for example; don’t stubbornly cling to old forecasts in the face of news.

The Advent of Cholera

October 17th, 2014

Cholera seems to have existed in the Ganges delta for a long time, but it only spread to the rest of the world fairly recently, Gregory Cochran notes, and two factors interfered with an effective policy response:

[Scientists] concluded that contagion was never the answer, and accepted miasmas as the cause, a theory which is too stupid to be interesting. Sheesh, they taught the kids in medical school that measles wasn’t catching — while ordinary people knew perfectly well that it was. You know, esoteric, non-intuitive truths have a certain appeal — once initiated, you’re no longer one of the rubes. Of course, the simplest and most common way of producing an esoteric truth is to just make it up.

On the other hand, 19th century liberals (somewhat like modern libertarians, but way less crazy) knew that trade and individual freedom were always good things, by definition, so they also opposed quarantines — worse than wrong, old-fashioned! And more common in southern, Catholic Europe: enough said! So, between wrong science and classical liberalism, medical reformers spent many years trying to eliminate the reactionary quarantine rules that still existed in Mediterranean ports.

The intellectual tide turned: first heroes like John Snow, and Peter Panum, later titans like Pasteur and Koch. Contagionism made a comeback.

Welcome to Our Reality

October 17th, 2014

How do the Swedes recruit soldiers? Like this:

How do Unschoolers Turn Out?

October 17th, 2014

Peter Gray and Gina Riley surveyed 232 parents who unschool their children:

Getting into college was typically a fairly smooth process for this group; they adjusted to the academics fairly easily, quickly picking up skills such as class note-taking or essay composition; and most felt at a distinct advantage due to their high self-motivation and capacity for self-direction. “The most frequent complaints,” Gray notes on his blog, “were about the lack of motivation and intellectual curiosity among their college classmates, the constricted social life of college, and, in a few cases, constraints imposed by the curriculum or grading system.”

Most of those who went on to college did so without either a high school diploma or general education diploma (GED), and without taking the SAT or ACT. Several credited interviews and portfolios for their acceptance to college, but by far the most common route to a four-year college was to start at a community college (typically begun at age 16, but sometimes even younger).

None of the respondents found college academically difficult, but some found the rules and conventions strange and sometimes off-putting. Young people who were used to having to find things out on their own were taken aback, and even in some cases felt insulted, “when professors assumed they had to tell them what they were supposed to learn,” Gray says.

[...]

The range of jobs and careers was very broad—from film production assistant to tall-ship bosun, urban planner, aerial wildlife photographer, and founder of a construction company—but a few generalizations emerged. Compared to the general population, an unusually high percentage of the survey respondents went on to careers in the creative arts—about half overall, rising to nearly four out of five in the always-unschooled group. Similarly, a high number of respondents (half of the men and about 20 percent of the women) went on to science, technology, engineering or math (STEM) careers.

Why is Football More Popular than Ever?

October 16th, 2014

Why is football more popular than ever?

In practice getting people to watch spot advertising means programming that has to be watched live, and in practice that in turn means sports. Thus it is entirely predictable that advertisers will pay a premium for sports. It is also predictable that the cable industry will pay a premium for sports because must-watch ephemera is a good insurance policy against cord-cutting. Moreover, as a straightforward Ricardian rent type issue, we would predict that this increased demand would accrue to the owners of factor inputs: athletes, team owners, and (in the short run) the owners of cable channels with contracts to carry sports content. Indeed this has basically all happened.

Here’s something else that is entirely predictable from these premises: we should have declining viewership for sports. If you’re the marginal viewer who ex ante finds sports and scripted TV equally compelling, then as sports get more expensive and you keep having to watch ads, while scripted gets dirt cheap, ad-free, and generally more convenient, you would give up sports, watch last season’s episodes of Breaking Bad on Netflix, be blissfully unaware of major advertising campaigns, and pocket the $50 difference between a basic cable package and a $10 Netflix subscription.

The weird thing is that this latter prediction didn’t happen. During exactly the same period over which sports got more expensive in absolute terms and close substitutes got cheaper and less of a hassle, viewership for sports increased. From 2003 to 2013, sports viewership was up 27%. Or rather, baseball isn’t doing so great and basketball is holding its own, but holy moly, people love football. If you look at both the top events and top series on TV, it’s basically football, football, some other crap, and more football. I just can’t understand how, when one thing gets more expensive and something similar gets a lot cheaper and lower-hassle, people flock to the thing that is absolutely more hassle and relatively more money.