How to see the future 60% better – part one

On a rainy Saturday morning a few weeks back, Ben Traynor, Dan Denning and I went to a conference at the London School of Economics in central London. It was called “Superforecasting and geopolitical intelligence – who can predict the future, and how?”, and it brought together superforecasters, academics and intelligence experts from all over the world.

About half the attendees were already deeply involved with the Superforecasting project, and they came along to compare notes and meet other experts.

The other half of us had heard some intriguing things about superforecasting: that it had found a way to increase the accuracy of people’s forecasts by 60%; that it showed how ordinary volunteers could make better forecasts than armies of better-resourced, better-informed CIA analysts; and that anybody can improve their forecasting skills if they want to.
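
What does “60% better” actually mean? Forecasting tournaments like Tetlock’s typically score each prediction with a Brier score – the squared gap between the probability you assigned to an event and what actually happened – and “better” means a lower average score. Here’s a minimal sketch in Python of how such a score works; the forecaster, probabilities and outcomes below are invented purely for illustration.

# A toy Brier-score calculation (all numbers invented for illustration).
# A Brier score compares the probability you gave with the outcome
# (1 if the event happened, 0 if it didn't), squared and summed over
# both sides of a yes/no question. 0 is perfect, 2 is maximally wrong,
# and a lower average across many questions means a more accurate forecaster.

def brier_score(prob: float, outcome: int) -> float:
    """Two-sided Brier score for a yes/no question (0 = perfect, 2 = worst)."""
    return (prob - outcome) ** 2 + ((1 - prob) - (1 - outcome)) ** 2

# A hypothetical forecaster said 80% that one event would happen (it did),
# and 30% that another would happen (it didn't).
forecasts = [(0.80, 1), (0.30, 0)]
scores = [brier_score(p, o) for p, o in forecasts]
print(scores)                     # roughly [0.08, 0.18]
print(sum(scores) / len(scores))  # roughly 0.13 - lower is better

Scoring this way rewards well-calibrated confidence: saying 80% on things that happen about 80% of the time beats boldly saying 100% and occasionally being dead wrong.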

As investors, we’re in the business of making predictions and forecasts. So naturally, Ben, Dan and I were keen to learn.

First up – what is a superforecaster?

Taking on the CIA with a $250 Amazon voucher

The story starts after the Iraq War in 2003, when the CIA was totally humiliated by its failure to find weapons of mass destruction in Iraq.

And even worse than getting the call wrong – it didn’t even think it could be wrong. As the Presidential investigation put it, “failing to conclude that Saddam had ended his banned weapons program is one thing – not even considering it as a possibility is another.”

The CIA took the whole thing pretty hard, and the US intelligence community realised it had to change. So it set up a new body called IARPA – the Intelligence Advanced Research Projects Activity – to experiment with radical new techniques and methods for coming up with better predictions.

One of the first things it did was to join up with Philip Tetlock, a professor at the University of Pennsylvania and an expert on predictions and forecasting. Tetlock had an idea about how to improve forecasting accuracy – set up a massive global forecasting tournament, open to anyone. IARPA basically threw a tonne of money at the project to see what would happen.

Five teams, led by some of the top researchers in the world, were created to compete in the tournament. Some were made up of professional CIA analysts, others of academics. And Tetlock entered a team of his own, which was called the Good Judgment Project (or GJP).

The GJP team was made up of large numbers of ordinary people – retired plumbers and teachers and housewives and the like. They were paid in the form of $250 Amazon vouchers.

Every day, the teams were required to submit forecasts on an identical set of questions posed by IARPA about world affairs. The questions were diverse and specific: will Colonel Gaddafi be overthrown before 1 April 2014? How thick will Arctic ice be on 1 January 2013? Will traces of polonium be found in the body of Yasser Arafat by 1 July 2008 (which would show he was poisoned)?

Tetlock says almost 500 questions were posed over the study period and that his team gathered more than 1 million individual forecasts.

After the first year, GJP’s team of superforecasters had beaten the pros by a wide margin. After two years, it was beating the competition so badly that IARPA dropped the other teams to focus all its efforts on GJP.

So what was going on inside the GJP team? How did it turn ordinary volunteers into superforecasters?

Well, there’s a lot to say about that! So I’ll pause here. More in the next issue of Risk and Reward.

By the way, the GJP isn’t finished! It’s running a forecasting challenge on its website that you can take part in if you like. You can learn more about it here.

Have a comment? Reach me at sean@agora.co.uk