If you're caught in a political or business crisis, you might well be tempted to listen to the advice of so-called experts. Don't bother: new research has found that their predictions are little better than guesswork.
A study of forecasts of the outcomes of actual conflicts has found that experts who use their unaided judgment do little better than novices or random guessing.
Kesten C. Green of Monash University in Australia and J. Scott Armstrong of the Wharton School at the University of Pennsylvania used disguised versions of actual crises to see whether the predictions made by experts were significantly better than those made by untrained undergraduate students.
The case studies included a hostile takeover attempt, nations preparing for war, a controversial investment proposal, a nurses' strike, an action by football players for a larger share of the gate, an employee resisting the downgrading of her job, artists demanding taxpayer funding and a new distribution arrangement that a manufacturer proposed to retailers.
What they found was that the experts (conflict, domain and forecasting specialists) correctly forecast the decisions made by the various parties in only a third (32 per cent) of the cases. That's barely better than the strike-rate of the undergraduates, who got it right 29 per cent of the time.
Just using random guesswork would deliver the right outcomes 28 per cent of the time.
Kesten Green said the research has serious implications for foreign policy and business.
"Forecasting problems such as this are the stuff of not only international relations but also of takeover battles, commercial competition, and labor-management disputes. In most cases, experts use their judgment to predict what will happen. How good are their forecasts?
"The short answer is that they are of little value in terms of accuracy. In addition, they lead people into false confidence."
The study, published in Interfaces, a journal of the Institute for Operations Research and the Management Sciences, questions experts' ability to forecast without proven structured methods.
"Accurate prediction is difficult because conflicts tend to be too complex for people to think through in ways that realistically represent their actual progress," the authors argue. "Parties in conflict often act and react many times, and change because of their interactions."
The authors also examined whether veteran experts would make more accurate forecasts than less experienced ones, and found that they did not.
"Common-sense expectations did not prove to be correct," they write. "The 57 forecasts of experts with less than five years' experience were more accurate (36 per cent) than the 48 forecasts of experts with more experience (29 per cent)."
Unsurprisingly, Green and Armstrong conclude that decision-makers should not rely on experts' unaided judgments for forecasting decisions in conflicts.
Instead, they recommend that experts use reliable decision-support tools. They cite two examples of decision aids that can improve forecasts. In an earlier study, Green reported that simulated interaction, a type of role playing for forecasting behavior in conflicts, reduced error by 47 per cent.
Using another technique, structured analogies, the authors also found favorable results. In that study, they asked experts to recall and analyze information on similar situations. When experts were able to think of at least two analogies, forecast error was reduced by 39 per cent. Unlike unaided judgment, this structured technique rewards expertise: experts with more experience contributed much more to making accurate forecasts.