Paul Schoemaker, Mack Institute Senior Fellow, and Phil Tetlock, Management, The Wharton School
Harvard Business Review, May 2016
Excerpt: Imagine that you could dramatically improve your firm’s forecasting ability, but to do so you’d have to expose just how unreliable its predictions — and the people making them — really are. That’s exactly what the U.S. intelligence community did, with dramatic results.
Back in October 2002, the National Intelligence Council issued its official opinion that Iraq possessed chemical and biological weapons and was actively producing more weapons of mass destruction. Of course, that judgment proved colossally wrong. Shaken by its intelligence failure, the $50 billion bureaucracy set out to determine how it could do better in the future, realizing that the process might reveal glaring organizational deficiencies.
The resulting research program included a large-scale, multiyear prediction tournament, co-led by one of us (Phil), called the Good Judgment Project. The series of contests, which pitted thousands of amateurs against seasoned intelligence analysts, generated three surprising insights: First, talented generalists often outperform specialists in making forecasts. Second, carefully crafted training can enhance predictive acumen. And third, well-run teams can outperform individuals. These findings have important implications for the way organizations and businesses forecast uncertain outcomes, such as how a competitor will respond to a new-product launch, how much revenue a promotion will generate, or whether prospective hires will perform well.
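Claims that one forecaster "outperforms" another presuppose a way to score probabilistic predictions against what actually happened; the Good Judgment Project scored its contestants with Brier scores. The sketch below, using hypothetical forecast data rather than anything from the article, shows the common binary form of the calculation and why it rewards calibrated confidence over reflexive hedging.

```python
# Illustrative Brier-score calculation for binary forecasts (hypothetical data).
# Each forecast is a probability p assigned to an event; the outcome is 1 if
# the event occurred, 0 otherwise. Lower is better: 0.0 is a perfect record,
# and 0.25 is what an uninformative 50/50 guess earns on every question.

def brier_score(forecasts: list[tuple[float, int]]) -> float:
    """Mean squared error between stated probabilities and binary outcomes."""
    return sum((p - outcome) ** 2 for p, outcome in forecasts) / len(forecasts)

# Two hypothetical analysts answering the same four questions:
# one confident and well calibrated, one who always hedges at 50%.
calibrated = [(0.9, 1), (0.8, 1), (0.1, 0), (0.3, 0)]
hedger     = [(0.5, 1), (0.5, 1), (0.5, 0), (0.5, 0)]

print(f"calibrated analyst: {brier_score(calibrated):.3f}")  # 0.038
print(f"hedging analyst:    {brier_score(hedger):.3f}")      # 0.250
```

Because the penalty is squared, the rule punishes confident misses heavily while still rewarding forecasters who commit to well-founded probabilities, which is what lets a tournament separate genuine skill from cautious fence-sitting.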
Learn more about the Mack Institute’s Superforecasting tournament here.