Does your company have good judgement?

by L&D | 26 Oct 2016
Imagine that you could dramatically improve your firm’s forecasting ability, but to do so you’d have to expose just how unreliable its predictions – and the people making them – really are.

An article published in the Harvard Business Review pointed to the U.S. intelligence community’s judgement in the lead-up to the 2003 Iraq War as a prime example of this dilemma.

“Back in October 2002, the National Intelligence Council issued its official opinion that Iraq possessed chemical and biological weapons and was actively producing more weapons of mass destruction. Of course, that judgment proved colossally wrong,” the article stated.

“Shaken by its intelligence failure, the $50bn bureaucracy set out to determine how it could do better in the future, realizing that the process might reveal glaring organizational deficiencies.”

But improving a firm’s forecasting competence even a little can yield a competitive advantage. A company that is right three times out of five on its judgment calls will build an ever-increasing edge over a competitor that gets them right only two times out of five.

Below are a few tips your organisation can use to improve its judgement and, ultimately, its forecasting ability.

Train for good judgment
Most predictions made in companies, whether they concern project budgets, sales forecasts, or the performance of potential hires or acquisitions, are not the result of cold calculus. They are coloured by the forecaster’s understanding of basic statistical arguments, susceptibility to cognitive biases, desire to influence others’ thinking, and concerns about reputation. Indeed, predictions are often intentionally vague to maximize wiggle room should they prove wrong. The good news is that training in reasoning and de-biasing can reliably strengthen a firm’s forecasting competence. The Good Judgment Project (GJP) demonstrated that as little as one hour of training improved forecasting accuracy by about 14% over the course of a year.

Learn the basics
Basic reasoning errors (such as believing that a coin that has landed heads three times in a row is likelier to land tails on the next flip) take a toll on prediction accuracy. So it’s essential that companies lay a foundation of forecasting basics: The GJP’s training in probability concepts such as regression to the mean and Bayesian revision (updating a probability estimate in light of new data), for example, boosted participants’ accuracy measurably. Companies should also require that forecasts include a precise definition of what is to be predicted (say, the chance that a potential hire will meet her sales targets) and the time frame involved (one year, for example). The prediction itself must be expressed as a numeric probability so that it can be precisely scored for accuracy later. That means asserting that one is “80% confident,” rather than “fairly sure,” that the prospective employee will meet her targets.
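To make two of those ideas concrete, here is a short Python sketch (an illustration only, not GJP training material) that revises a probability estimate in light of new evidence and then scores the numeric forecast with a Brier score once the outcome is known. The hiring scenario and all of the numbers are assumed for the example.

```python
# Illustrative sketch only: Bayesian revision of a probability estimate,
# plus Brier scoring of a numeric forecast once the outcome is known.
# The scenario and numbers are hypothetical.

def bayesian_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(hypothesis | evidence) via Bayes' rule."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

def brier_score(forecast_probability, outcome):
    """Squared error of a probability forecast: 0 is perfect, 1 is worst.
    `outcome` is 1 if the predicted event happened, 0 if it did not."""
    return (forecast_probability - outcome) ** 2

# Start 60% confident the prospective hire will meet her sales targets.
prior = 0.60

# New data arrives: a strong first quarter, which we judge twice as likely
# if she is on track (0.8) as if she is not (0.4).
revised = bayesian_update(prior, p_evidence_if_true=0.8, p_evidence_if_false=0.4)
print(f"Revised probability: {revised:.2f}")          # ~0.75

# A year later the target was met (outcome = 1), so the forecast can be
# scored precisely. A vague "fairly sure" could never be scored this way.
print(f"Brier score: {brier_score(revised, 1):.3f}")  # lower is better
```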

Understand cognitive biases
Cognitive biases are widely known to skew judgment, and some have particularly pernicious effects on forecasting. They lead people to follow the crowd, to look for information that confirms their views, and to strive to prove just how right they are. It’s a tall order to de-bias human judgment, but the GJP has had some success in raising participants’ awareness of key biases that compromise forecasting. For example, the project trained beginners to watch out for confirmation bias that can create false confidence, and to give due weight to evidence that challenges their conclusions. And it reminded trainees not to look at problems in isolation but rather to take what Nobel laureate Daniel Kahneman calls “the outside view.” For instance, in predicting how long a project will take to complete, trainees were counselled to first ask how long it typically takes to complete similar projects, to avoid underestimating the time needed.
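To see what taking the outside view might look like in practice, here is a rough Python sketch. It is not the GJP’s exercise; the past-project durations, the inside estimate and the simple 50/50 blend are all assumptions made up for illustration.

```python
# Illustrative sketch of "the outside view": start from the base rate of
# comparable past projects, then temper the optimistic inside estimate.
# The durations and the 50/50 blend are hypothetical choices.

from statistics import median

past_project_durations_weeks = [14, 18, 11, 22, 16, 19]  # similar past projects
inside_estimate_weeks = 10                               # bottom-up plan for this one

outside_view_weeks = median(past_project_durations_weeks)   # base rate: 17.0 weeks
blended_estimate_weeks = 0.5 * inside_estimate_weeks + 0.5 * outside_view_weeks

print(f"Outside view (base rate): {outside_view_weeks} weeks")
print(f"Blended estimate: {blended_estimate_weeks} weeks")  # 13.5 weeks
```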

Training can also help people understand the psychological factors that lead to biased probability estimates, such as the tendency to rely on flawed intuition in lieu of careful analysis. Statistical intuitions are notoriously susceptible to illusions and superstition. Stock market analysts may see patterns in the data that have no statistical basis, and sports fans often regard basketball free-throw streaks, or “hot hands,” as evidence of extraordinary new capability when in fact they’re witnessing a mirage caused by chance variation in a small sample.
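A quick simulation shows how easily such streaks arise by chance. The Python sketch below is purely illustrative: it deals out 20 free throws per game for a shooter whose true skill never changes, and long “hot” runs still turn up in roughly two games out of three.

```python
# Illustrative simulation: streaks from a shooter whose true make
# probability never changes. Long runs still appear by chance alone.

import random

random.seed(1)

def longest_streak(shots):
    """Length of the longest run of consecutive makes (True values)."""
    best = current = 0
    for made in shots:
        current = current + 1 if made else 0
        best = max(best, current)
    return best

make_probability = 0.75   # a steady 75% free-throw shooter
shots_per_game = 20
games = 10_000

streaks = [
    longest_streak([random.random() < make_probability for _ in range(shots_per_game)])
    for _ in range(games)
]

share = sum(s >= 6 for s in streaks) / games
print(f"Games with a 'hot' streak of 6+ makes: {share:.0%}")  # roughly two in three
```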
