When Normal Shouldn’t Be

By Adam Turteltaub
adam.turteltaub@corporatecompliance.org

Last year I returned from a two-week trip to SCCE programs in Europe, and while opening my mail I discovered a tax refund check from the state of Indiana. Since I live in Los Angeles, and the SCCE/HCCA is based in Minnesota, I immediately called Indiana to tell them there must be some sort of mistake.

The man who answered the phone couldn’t have been nicer.  He listened to my story and told me I must have been the victim of identity theft.  The thieves had made a mistake with the bank transfer information, which is why I received the check.

He told me to fill out an affidavit of identity theft on the IRS site and send it in to him along with the check.

Then, about a week later, I received a letter from Michigan telling me that they had questions about my tax return. By this time, I was annoyed but no longer panic-stricken. In fact, I knew the routine, and I immediately sent them the affidavit as well.

And now, I just received a notification that my medical records may have been breached.  I knew it was coming – the breach had been in the news.  I wasn’t surprised by the letter.  I wasn’t angry. My blood didn’t boil.  I just sighed and will now explore the identity protection that the health system is providing in addition to the identity protection I have been paying for since I received that Indiana check.

Bottom line: I’ve gotten oddly used to the notion of having my data stolen left, right, and center. I’ve gone from wondering whether it would happen to wondering when it will happen next, and by whom.

It’s a great example of what SCCE Compliance and Ethics Institute speaker Garrett Reisman, a former NASA astronaut, calls the “normalization of deviance.” In a video interview he did as a preview of his presentation, he talks about how falling foam from the Space Shuttle’s external fuel tank grew to be normal. Everyone knew the foam wasn’t supposed to fall off, and much work went into stopping it, but nothing bad happened, and people got used to it. Then one horrific day we lost the shuttle and all the lives on board.

Letting bad things seem normal is not good when lives are at stake.  Nor is it a good thing in business.

There’s the risk that, as more and more breaches occur, management and employees may start seeing them as normal.  “It happens all the time, we’re not the first and not the last” thinking could take over.

Careless behavior is already a substantial problem when it comes to protecting data.  Laptops and jump drives go missing constantly.  Yet despite the frequency of these events, we can’t allow acceptance of the risk to lead to indifference to it.

As with safety, we have to educate to the point that the right behaviors become rote. That way, even if the workforce stops noticing the risk, we’ve already ingrained the habits that mitigate it.

COMMENTS

  1. There’s some great behavioral-science literature out there on the normalization of deviance in organizations — essentially, how a business culture (or a subculture within a business unit) comes to accept misconduct as a normal, necessary, unremarkable part of the business day while still, paradoxically, recognizing that outsiders will see things differently and therefore that the “normal” conduct must remain concealed. The best introduction to the field is Ashforth and Anand’s paper, “The Normalization of Corruption in Organizations.” Really great breadth and depth, and a fun if ultimately depressing read. For those in the healthcare industry, I recommend an excellent piece by John Banja of the Emory Ethics Center, “The Normalization of Deviance in Healthcare Delivery.”

    • Thanks, Scott. All I could find online for free was a very short abstract of the corruption article. Is there a good summary of it somewhere?

      I found the other article at http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2821100/. It’s very intriguing and not just for healthcare.

      One thing of particular note: “Resist the Urge to Be Overly Optimistic.” Patrick Kuhse, a felon who spoke at the Compliance and Ethics Institute years ago, talked about how he and many of the other men he was incarcerated with suffered from what he called “super optimism.”

  2. Great article! Sad but true. I like the analogy of approaching information security with the same communication tactics as a company’s safety program…making behavior rote so that employees will choose information safety even when they’ve become desensitized to the alarm of risk.

  3. Great point, Adam! It reminds me of Solomon Asch’s research into compliance drift toward the majority opinion/action in a group setting. Clearly, there’s longstanding precedent on this issue! I know I can find myself falling prey to this phenomenon at times when the group opinion is strong.
