Inside the Superforecasters

Monday, November 30th, 2015

Superforecaster Michael Story provides his own inside look at the Good Judgment Project:

The only requirement for entry was an undergraduate degree and the willingness to give it a go, and the weighted average forecasts of the main sample beat the four-year programme’s final projected goals in year one. The project has also produced novel findings for psychological science, the most important being that some people, independent of knowledge about a particular region or subfield, are just really good at forecasting.
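For a rough sense of what a weighted average forecast looks like in practice, here is a minimal sketch in Python. Weighting each forecaster by a past-accuracy score is an assumption made purely for illustration; the project's actual aggregation algorithm was more elaborate.

```python
# Illustrative sketch: combine many individual probability
# forecasts into one weighted average. The track-record weights
# are hypothetical, chosen only to show the mechanics.

def weighted_average(probabilities, weights):
    """Return the weight-normalised average of probability forecasts."""
    total = sum(weights)
    return sum(p * w for p, w in zip(probabilities, weights)) / total

# Three forecasters answer the same question.
probs = [0.70, 0.55, 0.90]    # individual probability forecasts
weights = [2.0, 1.0, 3.0]     # hypothetical past-accuracy weights
print(weighted_average(probs, weights))  # 0.775
```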

The idea that forecasting is a skill independent of domain-specific expertise seems rather novel. What impact do you think it will have?

The idea that forecasting is an independent skill is absolutely revolutionary to organisations involved in intelligence gathering or dissemination. They nearly all recruit by regional expertise and use subject-matter knowledge as a proxy for future predictive power. But the team that Tetlock and his wife, fellow psychology professor Barbara Mellers, created and named ‘the Good Judgment Project‘ discovered that this isn’t nearly as important as one might think. An amateur with a good forecasting record will generally outperform an expert without one.

For example, putting the top 2% of performers, the ‘Superforecasters’, into teams enlarged their lead over the already high-performing sample to 65%. Their forecasts beat the algorithms of competitor teams by 35-60%, the open prediction markets by 20-35%, and (according to unconfirmed leaks from the Intelligence Community) the forecasts of professional government intelligence analysts by 30%. Some people are just very good at forecasting indeed.

What makes someone a superforecaster?

Part of it is personality, part of it is the environment and incentives in which the forecasts take place, and there’s a role for training too.

Superforecasters were, in comparison to the general population of forecasting volunteers, more actively open-minded: they actively tried to disprove their own hypotheses, and had high fluid intelligence, a high need for cognition, and a comfort with numbers.

But even the best forecasters need the right incentives: people who are rewarded for being ‘yes men’ will never give you their best work, regardless of potential.

Can superforecasting be trained?

Given three hours of rationality and source-discovery training, nearly everyone (even relatively poor performers) got better. Being encouraged to think about the ‘outside view’ of a situation, looking for comparisons and base rates, and a healthy dose of Bayesian thinking works wonders for one’s Brier score.
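For a concrete sense of the scoring, here is a minimal sketch of a binary Brier score, the mean squared difference between probability forecasts and what actually happened (the project itself scored multi-category questions with a variant of this). The numbers are made up for illustration.

```python
# Minimal sketch of a binary Brier score. Outcomes are coded
# 0 (didn't happen) or 1 (happened). Lower is better: always
# hedging at 0.5 scores 0.25, so skill shows up as anything below.

def brier_score(forecasts, outcomes):
    """forecasts: probabilities in [0, 1]; outcomes: 0 or 1."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# Made-up numbers: a confident, well-calibrated forecaster
# versus a pure hedger on the same three questions.
print(brier_score([0.9, 0.1, 0.8], [1, 0, 1]))  # 0.02
print(brier_score([0.5, 0.5, 0.5], [1, 0, 1]))  # 0.25
```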

What has your experience been like?

My personal experience of fellow superforecasters was one of similarity. Despite our coming from a great variety of national and educational backgrounds, the first time we all gathered in one place, in Berkeley, California, a chap sitting next to me (now a colleague and pal) whispered that he ‘felt like that scene at the end of E.T. where he returns to his home planet and meets all the other E.T.s’. The personality factors certainly seemed dominant in the room. It’s amazing where an email can take you.
