Friday, June 19, 2015

Just the Facts

“Prejudice is a great time saver. You can form opinions without having to get the facts.” --E. B. White

Years ago I worked with an insurance company CEO who took immense pleasure in putting people on the spot in meetings. First he would ask someone to explain why something was happening, and when they began to answer he would cut them off with this comment:
“That’s not an explanation, that’s an excuse.”
If they tried to argue that point, he’d cut them off again with the same comment. It was an awkward and uncomfortable dynamic, but the first time I witnessed it I couldn’t help but reflect upon the distinction between an explanation and an excuse. Both involve an attempt to describe why something happened, of course, but an excuse has a motive an explanation doesn’t have—self-protection. One definition of excuse:
An explanation designed to avoid or alleviate guilt or negative judgment
According to Jenise Harmon at PsychCentral, excuses are defensive attempts to deny responsibility, and they often emerge when someone feels threatened. Next time you are trying to figure out why something happened, and you realize you have been methodically discounting and discarding any explanation that might implicate or reflect poorly on you, hit PAUSE. You’ve probably been looking for an excuse, particularly if you don’t believe (or don’t want to believe) you are responsible for whatever happened. The urge to protect yourself when threatened is normal—and most of us learn how to make excuses and shift blame when we are children. As we get older, the reasoning error known as confirmation bias enables us to focus on information that confirms our beliefs while we ignore contradictory information. Psychology expert Kendra Cherry describes it this way:
While we like to imagine that our beliefs are rational, logical, and objective, the fact is that our ideas are often based on paying attention to the information that upholds our ideas and ignoring the information that challenges our existing beliefs.
Confirmation bias interferes not only with how we gather information but also with how we interpret it, which helps explain the attitude polarization that often occurs when people with dissimilar values and beliefs look at the same information and draw very different conclusions. Experts say the root causes include wishful thinking, ego, memory limitations, and our limited ability to process information effectively. And of course there is the persistent and undeniable need we humans have to be right:
Another explanation for confirmation bias is that people tend to weigh up the costs of being wrong, rather than investigate in a neutral, scientific way.
In college I studied the natural sciences, so I learned to approach problems using the scientific method, an evidence-based technique for explaining how and why things happen. The scientific method has three steps:
  • Observe and collect data
  • Analyze and develop a hypothesis
  • Test and challenge the hypothesis
There’s no room for confirmation bias in science. Scientists are trained to be objective and skeptical, to research a situation, propose an explanation, then test and refine it through experimentation. They follow an iterative, fact-driven process in which the objective is to come up with a solid hypothesis and then challenge it. If peer review and other attempts to disprove it fail, then a well-researched, rigorously tested hypothesis might eventually become a theory—a broadly accepted truth, reliable enough to make predictions that can be validated by experimentation.
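
To make the third step concrete, here is a minimal sketch in Python, using entirely made-up claims data, of what testing a hypothesis might look like in practice. The hypothesis, the reserve-to-paid comparison, and every figure below are illustrative assumptions, not a prescribed actuarial method:

    # A minimal sketch of step 3 -- test and challenge the hypothesis.
    # Hypothesis: case-level loss reserves are systematically overstated.
    # Test: on closed claims, compare the final case reserve to what was paid.
    # All figures below are made up for illustration.
    closed_claims = [
        # (final_case_reserve, final_amount_paid) in dollars
        (50_000, 48_000),
        (12_000, 15_500),
        (90_000, 91_000),
        (7_500, 7_200),
        (25_000, 26_300),
    ]

    ratios = [reserve / paid for reserve, paid in closed_claims]
    average_ratio = sum(ratios) / len(ratios)

    print(f"Average reserve-to-paid ratio: {average_ratio:.2f}")
    # A ratio persistently well above 1.0 would support the hypothesis;
    # a ratio near (or below) 1.0 challenges it and sends us back to step 1.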

Unfortunately, you don’t commonly see anything as rigorous and disciplined as the scientific method being used to address challenges in business. In some dark corners of the property-casualty insurance business, for instance, it is entirely acceptable for an executive with a hunch (or a bias, a fear, or even a guilty conscience) to disguise a conjecture or an excuse as a theory. A theory developed with no objective research, no validation, and a healthy dose of confirmation bias. Even worse, such theories are often carelessly advanced. After all, since it’s only a theory, where’s the harm if it is flawed or incorrect? If you manage an insurance claims operation, you know what I am talking about, since you’ve probably squandered innumerable hours debunking theories about claims and the claims-handling process that were promulgated by well-meaning (and some not-so-well-meaning) folks.

Let’s bring this pervasive problem to life by imagining we are observing an insurance company executive management board meeting. The numbers aren’t looking good, so the CEO turns to the Chief Underwriting Officer (CUO) and asks why the loss ratio is trending higher than planned. “I have a theory,” says the CUO, and he spins an elaborate tale in which the claims department, by setting case-level loss reserves too high, is encouraging adjusters to make loss payments that are also too high, artificially inflating the loss ratio. The meeting room goes silent. The CUO holds his breath, mentally bracing himself to try again, when suddenly the CFO mutters, “We should look into that.” The CEO nods and instructs the group to go offline and figure out why case-level loss reserves are being overstated. As observers we are surprised, and the Chief Claims Officer is speechless, but the CUO is beaming like a movie character on death row who has just been informed that the governor has granted a last-minute stay of execution.

Perhaps you’ve witnessed a scenario like this, or been caught up in the disruptive all-hands fire drill it generated. Now imagine a month has passed, the executive management board is meeting once again, and the cross-functional team charged with determining why case-level loss reserves were being overstated reports its findings:
  • No evidence of case-level reserve overstatement
  • No evidence of inappropriate loss payment inflation
  • Some evidence that rates were being discounted to levels significantly below plan
  • Solid evidence that risk selection guidelines were not being followed consistently by underwriters
Finally, facts! Business discussions are usually more useful and fruitful when framed with facts and evidence instead of hunches and suspicions, so this imaginary follow-up executive management board meeting might actually accomplish something. Perhaps the CUO will agree with the findings and accept responsibility for the inflated loss ratio, which would probably end the discussion. Or he might stay the course, disputing the findings and offering another explanation. No matter which way it goes, however, hopefully this time around the CEO will be a more demanding and discerning discussion leader. While rejecting half-baked theories with the admonition “That’s not an explanation, that’s an excuse” might be insensitive and provocative, at least it’s a nod in the direction of the scientific method.
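
For readers who want to see why the third finding matters so much, here is a back-of-the-envelope sketch, again with hypothetical figures, of how rate discounting alone can inflate a loss ratio even when reserves and payments are exactly right. It relies only on the standard definition of the loss ratio as incurred losses divided by earned premium:

    # A back-of-the-envelope sketch (hypothetical figures) showing how
    # discounted rates inflate the loss ratio with no change in claims handling.
    # Loss ratio = incurred losses / earned premium.
    incurred_losses = 60_000_000    # assumed identical in both scenarios
    planned_premium = 100_000_000   # earned premium at planned rate levels
    rate_discount = 0.15            # hypothetical 15% average discount off plan

    planned_ratio = incurred_losses / planned_premium
    actual_ratio = incurred_losses / (planned_premium * (1 - rate_discount))

    print(f"Loss ratio at planned rates:    {planned_ratio:.1%}")  # 60.0%
    print(f"Loss ratio at discounted rates: {actual_ratio:.1%}")   # 70.6%

Lower the denominator by discounting rates and the ratio climbs no matter what the claims department does. That denominator effect is easy to miss when confirmation bias has everyone staring at the numerator.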

To see more about the laws of social behavior, check out Richard Conniff’s amusing article in the Smithsonian Magazine, and for a deeper look at the cognitive biases that interfere with rational decision making, check out this article by George Dvorsky. To understand how and why AIG has embraced evidence-based decision making, take a look at this HBR article. Finally, for readers who are Dragnet fans, Jack Webb as Joe Friday never actually said “Just the facts, Ma’am” on the show, but Dan Aykroyd said it while playing Joe Friday’s nephew in the 1987 movie Dragnet.

Dean K. Harring, CPCU, CIC is a retired insurance executive who now enjoys his time as an advisor, board member, educator and animal portrait artist.  He can be reached at dean.harring@gmail.com or through LinkedIn or Twitter or Harring Watercolors.
