A Risk-Averse Business Climate
“We’re afraid to take a chance,” replied the treasurer at a large, U.S.-based manufacturing firm. He was responding to my question about the biggest challenges he faced at work. Executives complain about a lack of fresh ideas from the front lines while employees lament the absence of bold leadership. “Where is the vision? Why can’t we ever take a risk?” ask exasperated employees.
It’s unanimous. Everyone thinks risk aversion limits the organization. Yet, no one will take a chance. What’s going on here?
Risk Aversion Explained
Human beings hate uncertainty because it implies volatility and danger. When we notice information is missing, our brain raises a red flag warning us to pay attention because “this could be important.” From an evolutionary standpoint, this reaction to ambiguity makes sense. As explained in Why Biology and Decision Making Don’t Mix, 200,000 years ago, knowing whether that rustling in the bushes belonged to a tiger or a mouse could mean the difference between life and death. We’re wired to reduce uncertainty because our minds were adapted for another era where physical hazards were part of our moment-by-moment existence.
Unfortunately, the instinct that served us well in the wild interferes with our decision making today. As Rolf Dobelli explains in The Art of Thinking Clearly, “Thinking is a biological phenomenon. Biology has dispelled all doubt. Physically, including cognition, we are hunter-gatherers in Hugo Boss. What has changed markedly since ancient times, when we lived in small communities of about 50 people with no technology and very few tools, is the environment in which we live. Only in the last 10,000 years did the world begin to transform dramatically with the advent of crops, tools, and technology. Since these changes, little remains of the environment for which our brains are optimized.”
More specifically, risk aversion hinders our decision making. We often place too much emphasis on information that is easily accessible and easy to understand, a tendency psychologists call the availability bias. This cognitive blind spot emerges forcefully when we assess risk. We vastly overestimate the likelihood of events like shark attacks and plane crashes because, while exceedingly rare, these events come to mind easily; they are dramatic, even lurid. In contrast, we underestimate the risk of car accidents and the likelihood of succumbing to cancer or heart disease, the two leading causes of death in the U.S.
In business, the availability bias leads to errors such as mistakenly attributing employee turnover to salary issues when, in fact, former employees cite limited career opportunities and a lack of recognition as the reasons for leaving a job. Given the high cost of employee turnover, managers and HR professionals would be well advised to assess why employees really leave and the impact on the company’s bottom line.
A related blind spot, the representativeness heuristic, also contributes to poor investment decisions. As Hersh Shefrin explains in Beyond Greed and Fear, “investors who rely on the representativeness heuristic become overly pessimistic about past losers and overly optimistic about past winners…” Shefrin notes that this mispricing is temporary, meaning that prices tend to even out over time: “Then losers will outperform the general market, while winners will underperform.” By focusing narrowly on winning and losing stocks rather than on historical performance data and other factors, individual and institutional investors limit their ability to make good decisions.
Fortunately, in some cases we are able to step back and evaluate a larger information set, thus avoiding the availability bias. Seeking more information comes with a downside, however. When data is missing, we tend to overestimate its value. Our brain assumes that since we are expending cognitive resources to find more information, the missing information must be useful. It could be, but it might also turn out to be irrelevant.
Mitigating Risk Aversion
This does not mean that we should avoid seeking more information. Rather, we should acknowledge the information in hand, investigate further, and then assess all the available data, perhaps within a reasonable time limit to avoid analysis paralysis.
More generally, we would be wise to follow Rolf Dobelli’s advice by identifying our circle of competence. Within this circle our intuitions are more likely to serve us well. When, however, we face a decision outside that circle, it’s worth taking time to apply hard, slow, rational thinking.
Want to learn more? Download our free white paper, Five Things You Need to Know About Decision-Making.