By Paul Wiefels
Part Two
It’s summertime and I am devoting some of my “beach reading” to topics that personally intrigue me but are a bit off the beaten path from subjects I typically write about. My previous post concerned how flawed thinking processes among healthcare providers can be extremely costly, not to mention deadly. Similarly, flawed thinking among juries can wrongly convict the innocent or let the bad guy off the hook. Flawed thinking at work, in governments, and in our personal lives can be expensive, career-limiting, or simply embarrassing. We put a huge premium on “getting it right,” yet we fall prey to a number of biases and cognitive errors that prevent us from doing just that. More troubling, cognitive psychologists tell us that we don’t just get it wrong occasionally. We get it wrong routinely, tripping again and again over systematic obstacles to decision-making: deviations from rational thought, observation, and analysis that lead us into cognitive cul-de-sacs.
Sadly, there is no magic pill, chant, or nostrum we can summon to combat these tendencies. But we can learn to spot the most obvious ones and to stay alert to the more obscure or subtle. I have no idea how many of these biases exist; the subject experts who study such things don’t seem to know either. My research suggests there may be over 100, depending on how we categorize them. Many of them are, of course, related, including these:
When we encounter a challenge and lock on to a diagnosis early, we may invoke Anchoring: the tendency to fixate on the most salient features of a patient, problem, or situation as first presented, which makes it easy to ignore or fail to recognize new or conflicting information that surfaces later. “I’ve seen this before. This looks like that. Therefore, this is that” is how such a bias unfolds. Anchoring can also be coupled with Availability: because our most recent experiences come to mind first, they inflate the likelihood that we will view the current situation the same way.
Now combine this with a close relative, Confirmation Bias: the tendency to look for evidence that supports or confirms our initial diagnosis or hypothesis rather than evidence that would refute it, even though the latter might be more persuasive or definitive. Confirmation bias is the bane of scientific and market research. It can be found in abundance in advertising and political campaigns, on CNBC, and in financial analyst reports. Notably, we often see it in presentations to potential investors.
Confirmation bias can lead to Diagnosis Momentum. What might have started as a mere possibility now gathers momentum of its own until other possibilities become easy to ignore. Our diagnosis now seems obvious. This leads to a denouement of sorts: the Need for Closure. The conclusions are drawn, the verdict is in. We now pressure ourselves, those around us, our management teams, and the like to “get on with it.” Self-imposed time constraints and a bias for “action” (often prevalent in Western cultures), along with perceived social pressures (often prevalent in Eastern cultures), now propel us forward to escape our lingering feelings of doubt and uncertainty. We may be more comfortable with known unknowns. But we’re still betting big as a result of thinking processes that are inherently flawed. Sound familiar?