Overconfidence

For years, professors at Duke University conducted a survey in which the chief financial officers of large corporations estimated the returns of the Standard & Poor’s index over the following year. The Duke scholars collected and examined 11,600 such forecasts. The conclusion was straightforward: financial officers of large corporations had no clue about the short-term future of the stock market; the correlation between their estimates and the true value was slightly less than zero!

When they said the market would go down, it was slightly more likely than not that it would go up.

But the CFOs seemed to have no clue that their forecasts were worthless.

In addition to their best guess about S&P returns, the participants made two other estimates: a value that they were 90% sure would be too high, and one that they were 90% sure would be too low. The range between the two values is called an “80% confidence interval” and outcomes that fall outside the interval are labeled “surprises.”

Thinking, Fast and Slow

If the CFOs’ intervals were well calibrated, about 20% of outcomes would be surprises. But in this exercise, as in many others like it, the surprise rate was 67% – more than three times the expected rate. The CFOs were wildly overconfident in their ability to predict the market.
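To make the calibration arithmetic concrete, here is a minimal sketch in Python. The return distribution, the 7% mean, and the interval widths are illustrative assumptions, not the Duke data; the point is only that intervals much narrower than the true spread of outcomes produce surprise rates far above 20%.

```python
import random

def surprise_rate(intervals, outcomes):
    """Fraction of outcomes that fall outside their forecast interval."""
    surprises = sum(1 for (low, high), actual in zip(intervals, outcomes)
                    if not low <= actual <= high)
    return surprises / len(outcomes)

random.seed(0)
N = 11_600          # number of forecasts, as in the Duke survey
TRUE_SD = 0.20      # assumed volatility of actual yearly returns (illustrative)

# Actual returns, drawn from the assumed distribution.
outcomes = [random.gauss(0.07, TRUE_SD) for _ in range(N)]

# A well-calibrated 80% interval covers the middle 80% of the true
# distribution: mean +/- 1.28 standard deviations.
calibrated = [(0.07 - 1.28 * TRUE_SD, 0.07 + 1.28 * TRUE_SD)] * N

# An overconfident forecaster uses a much narrower band (assumed +/- 9%).
overconfident = [(0.07 - 0.09, 0.07 + 0.09)] * N

print(f"calibrated:    {surprise_rate(calibrated, outcomes):.0%} surprises (expect ~20%)")
print(f"overconfident: {surprise_rate(overconfident, outcomes):.0%} surprises")
```

Running this gives roughly 20% surprises for the calibrated intervals and around 65% for the narrow ones – the same ballpark as the 67% the CFOs actually produced.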

WYSIATI (What You See Is All There Is)

Kahneman thinks that overconfidence is a direct consequence of WYSIATI – that is, we are oblivious to how much information we don’t have. When the CFOs made their estimates, they relied on whatever information came to mind and constructed a coherent story to back up the estimate.

In other words, at least three things are going on that create the illusion of competence (or overconfidence).

  1. Ignorance of Ignorance: We don’t know what we don’t know. In the same way that a child thinks they know everything about the world, the CFOs in the experiment assumed they already knew everything relevant to their predictions. They discounted the knowledge hidden from them.
  2. WYSIATI (Kahneman’s acronym): The CFOs took what they already knew as a description of how the world actually is. This is a failure to see the mismatch between one’s knowledge of the world and what the world is really like – not being unaware of one’s ignorance, but being unaware of one’s misunderstandings and misperceptions.
  3. Rationalization: After surveying the facts in their minds, the CFOs found a coherent story that matched their predictions. This is an after-the-fact phenomenon: there is an intuition about what will happen, and to explain why that intuition is valid, the expert draws on whatever facts make a good story.

The Need for Certainty

There is a large cost to certainty when it leads to failure, but there is also a cost to uncertainty. Investors don’t want a CEO who isn’t sure whether he will succeed; they want a CEO who is 100% confident that he will. And there may be some justification for that, since belief itself may tilt the odds in one’s favor. Economists are often the most frustrating advisers, since they usually see the tradeoffs of every decision. One US president, Harry Truman, unimpressed by “on the other hand” answers, demanded a one-armed economist who could give him a straight recommendation.

But companies that take the word of overconfident “experts” can expect costly consequences. The same study of CFOs mentioned above showed that the people who were most confident about the S&P index were also the most optimistic about the prospects of their own firm, and they went on to take more risk than others.

As Taleb argued in his books, blindness to the uncertainty of the environment leads to risk-taking that should have been avoided. But risk-taking is highly valued by society and by the market; truth-telling is not rewarded nearly as well.

One of the lessons of the financial crisis that led to the Great Recession is that there are periods in which competition, among experts and among organizations, creates powerful forces that favor a collective blindness to risk and uncertainty.

Thinking, Fast and Slow

But overconfidence is not limited to the financial sphere. The people who are most likely to appear on the news are the most overconfident, not the most informed. Even in medicine, this problem exists.

A study of patients who died in the ICU compared autopsy results with the diagnosis that physicians had provided while the patients were still alive. Physicians also reported their confidence. The result: “clinicians who were ‘completely certain’ of the diagnosis antemortem were wrong 40% of the time.” Here again, expert overconfidence is encouraged by their clients.

Thinking, Fast and Slow

If a doctor seems unsure, patients read it as a sign of weakness. Experts who acknowledge the extent of their ignorance will likely be outcompeted by those who don’t.

A careful appreciation of ignorance is a cornerstone of rationality, but it isn’t what people want. And extreme uncertainty is paralyzing under dangerous circumstances. When the stakes are high, it is unacceptable for anyone to admit that they are unsure of what they are doing.

Reading between the lines, the real problem here is that some confidence is necessary, but we are not sure how much. For a scientist like Kahneman, intellectual integrity trumps all else, so the threshold for how much confidence one should express is quite low. But a CFO has different incentives: he isn’t paid to stick to the facts and be as cautious as possible; he is paid to take risks and succeed against the odds.

If every entrepreneur truly understood how heavily the odds were stacked against them, we wouldn’t have any new businesses. If every investor understood the true inaccuracy of their predictions, they might never invest. So confidence despite uncertainty is necessary for economic growth. Risk-taking must contain an irrational component; the problem is false expectations. People run into trouble not because they are playing an uncertain game, but because they expect more certainty than actually exists.

"A gilded No is more satisfactory than a dry yes" - Gracian