Investigators Must Account for Implicit Bias

By Gerry Zack
Incoming CEO, SCCE & HCCA

It’s a shame that everyone except me is so biased.

That seems to be the reaction when the subject of bias comes up, reinforcing the fact that we are ALL biased. I’m not talking about conscious biases (my dog is better than your dog). It’s the unconscious (implicit) biases that everyone has that can be harmful—especially if you’re an investigator.

Implicit biases are biases we are not really aware of, at least not at the moment they affect our behavior. Having implicit biases does not make you a bad person; that's one of the first things to keep in mind. Everyone has them. They are one category within a broader range of unconscious processes that drive our actions.

There are many types of implicit bias, but two of them are particularly important for investigators. The first is affinity bias. Affinity bias, sometimes called “in-group” bias, is interesting because it is not entirely unconscious. We are all aware of our affinities for certain groups or individuals with whom we share common characteristics: a region we come from, a university we attended, a religion we share, an ethnic or cultural heritage, etc. What we need to address as investigators is the unconscious element of affinity bias that inevitably sneaks into our work in three contexts:

  1. Bias of the interviewer in conducting an interview
  2. Bias of the interviewee in responding to the interviewer
  3. Bias of the investigator in evaluating representations made by others (starting with the initial allegation)

Affinity bias can have a significant detrimental effect on interviews, since both interviewer and interviewee bring implicit biases into the room. At its most extreme, this can lead to incorrect conclusions about the honesty or dishonesty of an interviewee. But even when affinity bias doesn't have that dramatic an impact, it can exert more subtle influences. There is a natural tendency to shorten discussions and move along too quickly when we are uncomfortable, and we tend to be most comfortable around people with whom we have the most in common. The discomfort is not likely something we are consciously aware of, but it's there.

Where this most significantly harms interviews is in the rapport-building stage. Affinity bias can be a significant barrier to building rapport, an essential component of informational interviewing. Conversely, having a lot in common with the interviewee can produce a level of comfort that leaves us less observant of behavioral anomalies and signs of deception.

The good news is that acknowledging affinity bias really is more than half the battle. Taking action to compensate for it is not terribly difficult. For example, when interviewing someone with whom you have little in common, you may need to make a more deliberate effort to find common ground and establish a good tone. Plan for it by preparing carefully for your interviews, and ask yourself periodically, “What action should I take to counter affinity bias in the next interview?” By taking these simple steps, you'll be a more consistent and thorough interviewer.

The second type of bias that can tarnish investigations is confirmation bias, which is the natural tendency to seek out or interpret information in a manner that supports an existing hypothesis or belief. This one is trickier to deal with than affinity bias. It is entirely unconscious and very natural.

Confirmation bias can affect an investigation from the outset. Many investigators think that their job is to find fraud, corruption, and noncompliance. Then an allegation comes in, and they already have two strikes against their impartiality. First, view your job as “fact finder.” It's not as sexy as fraud or corruption finder, but it's important. Second, in connection with the allegation, view your role as equal parts clearing the individual and proving the allegation. Let the facts fall where they may.

Confirmation bias also creeps into an investigation after it has begun, sometimes very early. At this stage, it manifests when an investigator develops a hypothesis (e.g., who did it, how they did it) based on early evidence and then interprets all additional evidence in a manner that supports that hypothesis, sometimes even ignoring conflicting evidence (a variation sometimes called bounded awareness).

Again, it is perfectly natural to do this. We all do it. Yes, even investigators. A key point here is that nobody is naturally impartial. We have to take specific actions to “become” impartial or, more accurately, offset the effects of our natural confirmation bias.

Trying to refrain from developing a hypothesis too early can help, but it is sometimes impossible. Instead, one good technique is to occasionally make a conscious effort to poke holes in your hypothesis, consider where it could be wrong, and even temporarily assume it’s wrong. It’s helpful to write down alternative hypotheses or possible problems with the current theory.

Along the same lines is a concept known as reverse proof, which refers to testing every alternative hypothesis. If each alternative can be disproven, your original hypothesis ideally ends up as the only one still standing. The reverse proof approach is especially useful in investigations involving a lot of indirect (circumstantial) evidence, although it is certainly not limited to such cases. The investigator approaches incriminating evidence the way a defense attorney might: What are the nonfraudulent explanations for the evidence? What explanations could implicate someone else?

Another technique, albeit one that might not be possible in smaller organizations, is to have the case file independently reviewed. A fresh look at the case can reveal new theories or expose gaps in logic.

There are many other important steps to becoming unbiased, too many for a short blog post. But they all begin with the acknowledgment that everyone has implicit biases, and offsetting them takes much more than saying, “I’ll be impartial.”