Do managers think they know more than they actually do? Are they overconfident? What does it mean if overconfidence is rampant in a company’s executive suite? Questions such as these are critical for those involved in studying management behavior and decision making. The relationship between confidence and the ability to make accurate decisions has often been overlooked by traditional research, though it deserves closer attention. The basic issue is: Are people accurate when they are confident and confident when they are accurate?

As consumers, of course, people deal with this question all the time—and an example can help illustrate the problem. Consider a shopper who wants to buy a new shirt at the best possible price. There are three possibilities. First, she might buy the shirt at a particular store because she is highly confident that she has found the best price and then later learn that the same shirt was available for less money elsewhere. Second, she may be equally confident and largely correct: most other stores charge more, even though one or two may stock the item at a slightly lower price. And third, she may be far from confident in her judgments about the best price and still often be correct.

These three possibilities show that confidence and accuracy do not always go hand in hand. The relationship between the two is called knowledge calibration. Calibration refers to the match between confidence and accuracy, rather than accuracy itself. In other words, accuracy reflects what consumers know; confidence reflects what they think they know; and calibration reflects how well the two correspond to one another.
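The distinction between accuracy, confidence, and calibration can be made concrete with a small numeric sketch. This is a hypothetical illustration with invented numbers, not a method from the paper: it simply compares the average confidence someone states with the fraction of times they are actually right.

```python
# Hypothetical sketch: calibration as the gap between stated confidence
# and actual accuracy. All data below are invented for illustration.

def calibration_gap(predictions):
    """predictions: list of (confidence, was_correct) pairs.
    Returns (mean confidence, accuracy, overconfidence)."""
    mean_conf = sum(conf for conf, _ in predictions) / len(predictions)
    accuracy = sum(1 for _, correct in predictions if correct) / len(predictions)
    return mean_conf, accuracy, mean_conf - accuracy

# A manager who says "90% sure" on ten deals but lands only seven of them:
deals = [(0.9, True)] * 7 + [(0.9, False)] * 3
conf, acc, gap = calibration_gap(deals)
print(f"confidence {conf:.0%}, accuracy {acc:.0%}, overconfidence {gap:.0%}")
# → confidence 90%, accuracy 70%, overconfidence 20%
```

A well-calibrated judge would show a gap near zero; a positive gap means confidence outruns knowledge, which is the overconfidence pattern the article describes.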

A recent paper by J. Wesley Hutchinson, who teaches marketing at Wharton, and Joseph W. Alba, a marketing professor at the University of Florida in Gainesville, sets out to explore these themes. In the paper – titled “Knowledge Calibration: What Consumers Know and What They Think They Know” – the researchers consider the methods and models used in calibration research. They then adopt a time-based perspective to review a wide variety of empirical results. Finally, Alba and Hutchinson examine the theoretical explanations that have been given for calibration phenomena and offer suggestions for future research.

Discussing his research, Hutchinson explains that the relationship between confidence about knowledge and its accuracy has implications beyond consumer research. For instance, CEOs routinely ask sales managers about the likelihood that sales targets will be met in the next few quarters. Stock market analysts, in turn, pose the same question to CEOs and CFOs. “Our research applies to any situation where some uncertainty about predicting the future exists,” says Hutchinson. “When people make predictions, in addition to telling you what they think will happen they also tell you something about their confidence.” For example, a sales manager might suggest that he or she is “90% sure” that a certain target will be hit.

Since hubris is as dangerous in corporate boardrooms as it is in politics or on the sports field, Hutchinson warns that it is crucial to watch out for danger zones that fuel overconfidence. Among them:

  1. When someone making a prediction is very confident, that person is almost always overconfident. Grill a manager who is “90% sure” about landing a deal, and the real odds may turn out to be closer to 70%.
  2. When you think you are guessing, you are probably performing better than chance even though your odds of being right may not be high. “Intuition is a good guide in taking you beyond 50-50 odds to 60% or 65%,” says Hutchinson. “Asking people to go with their gut instinct is not bad advice.”
  3. What are the chances of being right when you are thinking something over, or “sleeping on it”? Hutchinson points out that when managers mull over a decision, unless they uncover new information or generate new insights, the process of repeatedly going over old information makes them overconfident without changing the probability that they are right.
  4. Another danger zone is the so-called truth bias. People assume that their information is correct—though their basis for believing so may be very slim. “We all remember facts or factoids; they are part of our knowledge about the world,” says Hutchinson. “We may forget where we read them, but we assume that what we remember is true. As a result, we exaggerate the validity of things that come to mind.” But memory can play tricks.
  5. The final danger zone is that people can be misled. This is sometimes done deliberately, even maliciously—but often it happens because questions are posed in such a way as to prompt the answer that the questioner wants.

How can managers increase their knowledge calibration and protect themselves against these danger zones that lead to overconfidence? Hutchinson offers several recommendations. Among them:

  1. If you are dealing with an issue about which there is some slight uncertainty, always assume at least a 20% chance that you are wrong. This, Hutchinson notes, has two beneficial effects. First, on average, this will be closer to reality. And second, just considering the effects of being wrong—for example, what course should be followed if a certain supplier doesn’t make the deadline—will improve the decision-making process. “The world is a stranger place than we think, and time-overruns are more common than finishing a task early,” Hutchinson says.
  2. “Recognize that you are not an expert about everything,” says Hutchinson. “Carefully examine the areas of your expertise and find the boundaries of your knowledge.” Research shows that people who have expertise in a certain area tend to overgeneralize it. The fact that someone is an expert in one industry does not imply that that knowledge will extend to other areas. “Expertise is domain specific,” notes Hutchinson. “As people move beyond their immediate area of expertise, their ability drops off sharply but their confidence does not. The disconnect between expertise and confidence, paradoxically, puts experts at risk.”
  3. Look for base rates. Each decision and situation seems unique, but the historical odds across similar situations are often a better guide than the details of the case at hand—details that executives sometimes weigh too heavily. “Don’t be afraid to play the odds rather than trying to beat them,” says Hutchinson. “Don’t be overconfident about trying to influence things that are outside your control.”
  4. Test your opinions by looking for disconfirming information. Hutchinson explains that overconfidence often arises because once we have decided something is right, we tend to look just for information that supports us. “We have our own little yes men in our heads, and they take us down paths where we look more and more correct,” he says. “It is crucial to go out of our way to look for facts that challenge our beliefs.”
  5. Finally, Hutchinson suggests that managers should create external sources of memory. “Memory is wonderful, but it’s not flawless,” he says. “A lot of biases are biases of memory. The more that people create external sources of memory—for example, simply by putting things in writing—the more helpful it is.”
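Hutchinson’s first recommendation—assume at least a 20% chance of being wrong—amounts to never treating an uncertain judgment as more than 80% certain. A minimal sketch, with invented numbers and a hypothetical cap parameter (not a formula from the paper):

```python
# Hypothetical sketch of the "assume at least a 20% chance you are wrong"
# rule: cap any stated probability at 80% before acting on it.

def cap_confidence(stated, max_confidence=0.8):
    """Return the stated probability, capped so that at least a
    20% chance of error is always assumed."""
    return min(stated, max_confidence)

stated = [0.95, 0.9, 0.99, 0.6]            # what managers say (invented)
adjusted = [cap_confidence(p) for p in stated]
print(adjusted)  # → [0.8, 0.8, 0.8, 0.6]
```

Given the article’s point that very confident predictions are almost always overconfident, the capped numbers will, on average, sit closer to reality than the stated ones.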

While some of these danger zones and recommendations might appear to be little more than common sense, they are often forgotten amid the hectic pace of corporate activity. Hutchinson points out that recognizing the dangers and acting on the recommendations can help improve decision-making. They can also help managers achieve a better-calibrated match between their predictions and actual outcomes.