Blinded by Human Nature and Compliance

By Stephen M. Paskoff, Esq.
Paskoff@eliinc.com

We all like to think we’d act quickly and effectively to stop ongoing illegal, dangerous, or unethical acts once we had notice, or even a whiff, of improper behavior. That’s one reason why the apparent inaction of many in the face of allegedly cruel, sadistic, and criminal conduct at Penn State is so shocking. That could not be me, we say. I know I’d have behaved differently. That was certainly my first reaction. On reflection and after further reading, I’ve changed my mind. There are too many instances, both recent and historical, in which individuals knew of hazards and kept quiet for us to comfortably believe we’d all do “the right thing.” What explains this, and what does it have to do with the organizational compliance initiatives in place to prevent, detect, and correct misconduct?

First, we may be “programmed” or “hard-wired” to obey authority in ways that can lead us either to engage in abominable actions or to fail to halt them. Simply laying out standards of right and wrong, communicating them, and setting up complaint processes, the key steps of many compliance initiatives, won’t change this. This conclusion is suggested by research conducted by the experimental psychologist Stanley Milgram, reported in his 1974 work, Obedience to Authority: An Experimental View. Dr. Milgram conducted a series of experiments in which “subjects” were told they were participating in a study to measure how negative stimuli would affect learning. In truth, Milgram sought to determine how individuals would behave when their natural instincts to act humanely conflicted with organizational commands to get a job done and conform to authoritative directives. In Milgram’s experiments, each time a “student” missed an answer, subjects were repeatedly ordered to deliver increasingly powerful electric shocks that ultimately reached potentially fatal doses. No real shocks were delivered; what was real was how the subjects responded.

As “students” purposely answered incorrectly, the jolts increased in power. Even when the students pleaded for the exercise to stop, the subjects consistently kept ratcheting up the shocks. Almost none of the subjects refused to inflict pain. Dr. Milgram drew participants from a wide array of backgrounds and demographic characteristics. The conclusion: no matter who we are, knowing what we should do may take second place when we unconsciously decide what we must do to adhere to contrary organizational rules, orders or unspoken but well known norms.

Second, even when well-intentioned, organizations may inadvertently build cultures and systems that reinforce the human instinct to obey authority and tolerate misdeeds, risks, and serious crimes. This theme surfaced in Blind Spots: Why We Fail to Do What’s Right and What to Do about It by Max H. Bazerman and Ann E. Tenbrunsel. With respect to compliance programs, one of the authors’ key points is that they lay out standards of conduct which set a ceiling for effective remedial action when they should set a floor.

Individuals do the minimum and conclude they have done enough. In the context of Milgram’s work, they balance their instinct to do the right thing against a stronger natural inclination to avoid challenging authority. The focus shifts to avoiding penalties or personal harm rather than correcting a problem. This pattern runs through entire organizations. For example, notifying an administrative superior of a problem may be what is required, but it may not be enough to halt entrenched practices. A note to file documents awareness of a problem; taken alone, that’s all it does. As this kind of “organizationally proper” but ineffective response continues, so do bad acts. The compromise leads to disaster.

We need to rethink compliance processes so they do not reinforce inaction and token responses. Some key questions to be answered in your organization are:

  • Are your leaders committed to operating in line with values in the hardest cases involving “one of their own,” or are they committed to doing only the legal minimum to protect their members?
  • Are leaders communicating that finding out about and correcting problems is an element of business necessity, not an externally and narrowly conceived compliance mandate?
  • Are these communications frequent, credible, and spread throughout the organization via face-to-face conversations at all job levels?

This is not a simple undertaking as we must fight against human nature. Otherwise, compliance will continue to mean compliance with an unspoken drive to conform rather than acting to prevent catastrophe.


Stephen M. Paskoff, Esq., is the founder, president and CEO of ELI®, a training company that teaches professional workplace conduct, helping clients translate their values into behaviors, increase employee contribution, build respectful and inclusive cultures, and reduce legal and ethical risk.

Mr. Paskoff is a nationally recognized speaker and author on workplace legal issues. He has written extensively on topics related to workplace compliance and legal issues and how to affect culture change in order to build lawful, professional operations that align with an organization’s mission and values.

In addition, Mr. Paskoff is the former Co-Chair of the ABA’s Compliance Training and Communication Subcommittee, which explores best practices in training methodology as well as overall strategies for implementing learning and communication plans to maintain corporate compliance. He currently serves on the Editorial Board of Workforce Management magazine.

Prior to establishing ELI® in 1986, Mr. Paskoff was a trial attorney with the Equal Employment Opportunity Commission and a partner in a management law firm. He is a graduate of Hamilton College and the University of Pittsburgh School of Law and is a member of the Pennsylvania and Georgia bars.

1 COMMENT

  1. One thing that often goes unremarked about the Milgram experiments is the easy start and gradual escalation. Once the subject has given the first, very mild, shock to the “student,” there’s little reason in principle not to give a slightly larger shock. Soon it seems too late to turn back, what with cognitive dissonance and all. So the experiment is about the combination of the authority bias and the slippery slope, and it accurately depicts how toxic leaders can inculcate severe misconduct by starting off encouraging a small breach.
