How to Gain a Predictive Edge in 2017

You don’t need a crystal ball to get better at forecasting.

As a tumultuous 2016 made perfectly clear, uncertainty is a fixture of our fast-changing world. For businesses already coping with the ever-accelerating pace of technological change, political upheaval and regulatory shifts add even more variables. To operate and succeed in uncharted territory, companies and managers must develop methods for overcoming uncertainty. So how can companies take steps toward reliably anticipating changes in global markets and adapting to disruptive technologies?

Whether we are developing strategic policy, launching a new product, navigating the stock market, or – especially relevant to recent events – predicting election outcomes and geopolitical shifts, we would all benefit from improved forecasting. Unfortunately, even the efforts of experts often prove to be far off the mark.


Research on the best practices for organizational forecasting, co-led by Professor Phil Tetlock of the Wharton School, began with a twenty-year study of expert predictions and culminated in the Good Judgment Project, a large-scale prediction tournament involving both seasoned intelligence analysts and amateurs. The results of the earlier study were sobering: “the average expert was roughly as accurate as a dart-throwing chimpanzee.” How, then, can experts hone their forecasting abilities?

Tetlock’s research, along with subsequent forecasting tournaments, including the recent Disruptive Vehicle Forecasting Challenge run by the Mack Institute’s Program on Vehicle and Mobility Innovation, revealed some key insights.

How to Improve Your Forecasting Results

Forecasting skills can be cultivated. Drawing on these findings, Tetlock and Paul Schoemaker identify three key ways for businesses to enhance predictive performance, combining data, logic, analysis, seasoned judgment, and careful questioning.

Using Diverse Teams to Boost Accuracy

Companies should strive to build teams of forecasters with a natural aptitude for sound reasoning and data analysis – people who are cautious, analytical, aware of their biases, and open-minded. Teams should also be intellectually diverse, mixing generalists and domain specialists to capture the widest range of approaches. The Good Judgment Project found that non-expert generalists often outperformed trained intelligence specialists.
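
One concrete way a diverse team translates into better numbers is by pooling each member’s probability estimate for a question. The sketch below is a hypothetical illustration (the team roles and figures are invented, and a simple unweighted average is only a baseline aggregation method, not Tetlock’s published procedure):

```python
# Hypothetical example: pooling probability forecasts from a diverse team.
# Each value is one forecaster's probability that an event will occur.
team_forecasts = {
    "generalist_1": 0.60,
    "generalist_2": 0.70,
    "domain_specialist": 0.45,
    "data_analyst": 0.65,
}

# A simple unweighted average is a common baseline aggregate.
pooled = sum(team_forecasts.values()) / len(team_forecasts)
print(f"Pooled team forecast: {pooled:.2f}")  # -> 0.60
```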

Training for Good Judgment

Superforecasters are well-versed in reasoning, probability concepts, information asymmetry, and cognitive biases. They are willing to learn from their mistakes, think in gradations, update their hypotheses when new data proves them wrong, and consider problems holistically rather than in isolation. The Good Judgment Project discovered that as little as one hour of training improved forecasting accuracy by around 14% over one year.
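
To make “thinking in gradations” and updating on new data concrete, here is a minimal sketch of a probability revision using Bayes’ rule. The scenario and numbers are made up for illustration only:

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the posterior probability of a hypothesis after seeing evidence."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# Hypothetical example: a forecaster starts at 30% that a competitor will
# launch a product this quarter, then learns of a supplier contract that is
# three times as likely to appear if a launch is imminent.
prior = 0.30
posterior = bayes_update(prior, p_evidence_if_true=0.60, p_evidence_if_false=0.20)
print(f"Updated forecast: {posterior:.2f}")  # roughly 0.56
```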

Tracking Prediction Performance and Providing Continuous Feedback

To improve predictive abilities, it’s critical to provide timely feedback on both outcomes and the processes used to produce predictions, along with incentives for improvement. To replicate successes and identify areas for improvement, companies should collect real-time data – such as videos or transcripts of meetings – and keep records of decision-making processes so that other teams can adopt the best forecasting practices.
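
Keeping score requires a consistent accuracy metric; forecasting tournaments such as the Good Judgment Project scored participants with Brier scores. The sketch below is a simplified, hypothetical illustration of how a team might compute one over a batch of resolved forecasts:

```python
def brier_score(forecasts):
    """Mean squared error between predicted probabilities and actual outcomes.

    `forecasts` is a list of (probability, outcome) pairs, where outcome is
    1 if the event happened and 0 if it did not. Lower scores are better.
    """
    return sum((p - outcome) ** 2 for p, outcome in forecasts) / len(forecasts)

# Hypothetical track record for one forecaster over four resolved questions.
track_record = [(0.80, 1), (0.30, 0), (0.90, 0), (0.60, 1)]
print(f"Brier score: {brier_score(track_record):.3f}")  # -> 0.275
```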

Want to try your hand at forecasting? Join the Good Judgment Project now.