
Look Before You Leap: Innovation, Cognitive Noise, And Informed Intuition

Heather Wishart-Smith · January 3, 2022
Chamois (Rupicapra rupicapra) and fox in the snow in winter, Gran Paradiso National Park, Italian Alps, Italy (Photo by: Arterra/Universal Images Group via Getty Images)

There’s something to be said about getting sufficient facts ahead of time.

Roughly 2,500 years ago, the Greek fabulist Aesop spelled out the treacheries of poor decision making in the ancient tale The Fox and The Goat. Basically, there's a fox. He falls into a well and can't get out. Luckily for him, a thirsty goat happens by. To make a short story even shorter, the fox tricks the goat into jumping into the well by telling him the water is delicious: the cagey predator then leaps onto the goat's back and (spoiler alert) escapes from the well, leaving the goat in a pickle.

The moral: a decision made without sufficient inquiry into the facts is a fool's errand.

So, when it comes to making decisions (especially strategically important ones), when should you go with your gut and when should you hold your fire? How do you strike the proper balance between reflection and action? To get some insight into the science of how and why people make decisions, I sat down with Dr. Daniel Kahneman, Eugene Higgins Professor of Psychology, Emeritus, and Professor of Psychology and Public Affairs, Emeritus at Princeton University.

In 2002 Dr. Kahneman was awarded the Nobel Prize in Economic Sciences "for having integrated insights from psychological research into economic science, especially concerning human judgment and decision-making under uncertainty." In Spring 2021, he co-authored the book Noise: A Flaw in Human Judgment, which examines the impact of noise and how it can adversely affect judgment.

In the discussion that follows, Dr. Kahneman speaks to intuition, risk acceptance and aversion, hierarchical bias, and the considerations to keep in mind when weighing the virtues of individual versus group decision making.

Corporate innovation is challenged by the fact that innovation by nature involves more unknowns, and more unknowns means more risk. Corporations tend to avoid risk. What recommendations do you have for corporate innovation decision makers as they pertain to achieving the proper balance between fast and slow thinking: quick, intuitive decision making and deliberate, thoughtful decision making?

Daniel Kahneman, Princeton University (Photo by Andreas Rentz/Getty Images for Burda Media)

KAHNEMAN: I think there are two questions here. One is about creativity and the other one is about fast and slow thinking, and they're not necessarily the same thing. With respect to creativity and to risk seeking (or, put another way, avoiding exaggerated risk aversion), it's a matter of incentives and of the organization's attitude toward failures and anticipated failure. By reducing the relative cost of failure to the individual, you increase their willingness to take risks. What matters is the anticipation of how much hindsight there will be, and how much blame and responsibility you'll bear.

Jeff Bezos has talked about this very overtly: it's healthy for an organization to fail often, because if it doesn't fail often, it's not taking enough risks. Failing often means that there are individuals in charge of a project who are clearly over-optimistic about a project that eventually fails, and they have to get the backing of the organization. It has to be considered a badge of honor that they tried; it cannot be held against them that it didn't work, nor against the people who suggested doing it. The whole issue of how you treat failure is the primary mover with respect to creativity in an organization.

How can they avoid the negative effects of both snap judgments and paralysis by analysis, particularly when paralysis by analysis often prevents innovation?

KAHNEMAN: Different problems need different degrees of reflection, so it's hard to give advice in general about how much reflection people should engage in before they take decisions. It really depends on the amount of information that is required. I think there is an interesting concept, which is of due diligence. The notion of due diligence is to spend the appropriate amount of effort on a problem – not more and not less.

There is an appropriate amount of effort so that if you're making an acquisition, there is a certain checklist of things that you ought to know about the organization before you recommend acquiring it. That list is reasonably well understood, and anybody who shortcuts that list by saying that he or she has an intuition about what the outcome will be is taking an inappropriate risk.

“Subjective confidence is not an indication of accuracy.” – Dr. Daniel Kahneman, 2002 Nobel Laureate in Economics

In the same way that you have a checklist of what needs to be done for an acquisition, I think there is a sense within any organization, for any class of decisions, of what has to be done before you make a decision. Going beyond that and wading into the weeds leads to paralysis, but going faster than that leads to inadequate decision making.

It ought to be made clear within any organization what level of due diligence is expected for decisions of a particular kind, down to the level of how much time you want to spend studying the organization to be acquired before you decide whether to engage or not.

Regarding the challenges of decision making in times of uncertainty, is there a preferential difference in how individuals make decisions versus how groups make decisions?

KAHNEMAN: I think the main benefit of groups in the context of decision making is that they shield the individual from responsibility for bad bets. That is, when you have an individual deciding to invest in a particular stock, or to buy a particular company, or to do something along those lines, it's very important to have the group endorse the individual decision so that if things go sour, it's not the individual who gets blamed. That goes back to what I was saying earlier about the amount of risk.

The big advantage of groups in that context is the diffusion of responsibility for failures. I'm starting from the premise that you want to encourage risk-taking, that you have an organization that is overly risk averse. If you want to encourage risk taking, the way to go is to shield the individuals, and the way to do this is to operate within groups. If you feel there is too much risk taking, then that's not the way to do it. It really depends on where you think you are.

What do you see as the proper role of intuition in judgment? How can we best leverage it to our advantage, particularly in innovation?

KAHNEMAN: That's a topic I've thought a lot about. There is a worship of intuition and of intuitive leaders that I think is exaggerated and potentially dangerous.

There is such a thing as genuine intuition, and we know when intuition can be trusted. Gary Klein and I actually wrote a paper together, presenting our conflicting points of view, because he is a great supporter of expert intuition and I'm a skeptic.

It's clear that when the world is regular, when there is a lot of opportunity to learn it, and when the feedback that you get from the world is clear, unambiguous, and rapid, then an expert intuition develops. That happens to chess players, to tactical decision-makers, to physicians who learn to make a particular diagnosis. They live in a world that's predictable.

It does not happen in stock picking. That is, it does not happen where the world is profoundly irregular; there is no opportunity to learn.

Now, the striking thing is that intuition is a feeling. You have a feeling that you know something, although you cannot articulate exactly what you know. The problem is that you get intuitions, so defined, even when you actually don't know what you're talking about. Some intuitions are trustworthy; others are not. Subjectively, you don't have a clue which is which. The only way to decide which intuitions to trust and which to question is to ask the questions I asked before about the environment and about your opportunity to learn it. If the environment is regular and you have had an opportunity to learn it, then your intuitions can be trusted; otherwise, they cannot.

That's an important point. Subjective confidence is not an adequate metric because people frequently feel very confident when they're making mistakes, even when they're making predictable mistakes.

What is the role of intuition? In the book Noise, we present a point of view that what you want to do is not to eliminate the role of intuition in decision making because it's absolutely essential to reach a decision with a certain sense of confidence in its validity, and an intuitive sense that, “This is okay; this is what we're going to do.” However, our point of view is that you should delay intuition.

You should not eliminate it, but you should delay it. You should systematically collect and analyze information on various attributes of the problem, and only after that information collection, when you have a general profile of the situation with all its aspects, should you form a global impression, have a general intuition, and develop confidence in your judgment. The idea of delaying intuition is absolutely essential to our point of view, because otherwise people tend to jump to conclusions; they're susceptible to confirmation bias, and they do not collect the information that they need.

How can organizations overcome hierarchical bias (where the higher up the ladder you go, the more weight your viewpoint is given), and instead better embrace the value of ideas, irrespective of the person’s position in the organization?

KAHNEMAN: I think that here the responsibility falls entirely on the people at the top of the organization. People at the lower levels of the organization have to be free to express their ideas, within reason. The tone, what is and what is not allowed, is entirely determined at the top. There are organizations that actively seek out the opinions of relatively low-level employees. Google was known to do this in its early years: there would be a companywide meeting where people would just say things, and the leaders of the organization would be there to listen.

A culture of promoting this either exists or doesn't exist. In many bureaucratic organizations, that type of culture tends to be stifled. The more levels there are in an organization, the less likely it is to have that kind of excitement about new ideas. The responsibility is at the top. It's not for individuals within the organization to deal with.

There are techniques – certainly the highest-paid person should not be the first to speak. They should be very careful not to make their opinions obvious, and the best way to do this is for them not to have an opinion. That is the best way to run a discussion. If you have already decided, then you're not running a discussion; you're trying to convince other people to follow you. If this is a context in which you are going to be deciding, then delaying intuition applies to the leader.

It helps to determine in advance what you're trying to do: whether you're trying to make the best possible decision, or whether you're trying to line people up behind a decision you've already made. That's not always obvious to the group, and it may not always be obvious to the individual himself or herself. Making it clear would seem to be a useful thing. Clearly, it's more important to hide your tentative opinions if you want to reach a real decision. If you want to bring people over, then hiding your opinion is much less valuable.