How to implement the DOJ’s Evaluation of Corporate Compliance Programs

By Ricardo Pellafone, Founder, Broadcat

“This is interesting, but what am I supposed to do with, like, a million questions?”

– you, after reading the DOJ’s Evaluation of Corporate Compliance Programs.

Don’t worry. You’re not alone.

In fact, if that’s what you’re thinking: take heart. At least you’re in the 50% that has actually attempted to read it.

And to be clear, this is the document that the DOJ put out to explain how they tend to evaluate compliance programs. So it’s fair to say you should read it and figure out how to implement it—before you’re sitting across from a prosecutor.

The “implementation” part, however, is admittedly tough.

That’s because the 100+ questions are organized by concept, reflecting how the DOJ thinks—not how you would use it.  And making the jump from “how the DOJ thinks” to “how we’ll use this” is a mammoth task.

We know, because we made a roadmap that does exactly that.

You can cut to the chase and download that here, or you can read on to learn how we broke it down—so you can take a similar approach when you have to translate something from one context to another (like when giving compliance training).

Frame things around how they’ll be used.

If you want something to be useful, you need to frame it around the context in which it’ll be used.

Otherwise you’re asking someone to (1) understand it, (2) translate it into their own context, and then (3) apply it. Most people give up after #1—because #2 is really hard.

(This is why most compliance training fails, by the way; it’s not an “engagement” problem. It’s because the training is framed around risks instead of business processes, and it’s too much work for people to figure out how a bunch of abstract legal concepts apply to what they actually do—while also doing their real jobs. We did a deep dive on that topic at last year’s Compliance and Ethics Institute in Chicago; if you missed it, you can grab a writeup of the presentation here.)

In this case, the context we want is basically “project planning.” So to put the DOJ’s document into your context, you need to understand the content and reframe it around how you do project planning.

And project planning is a lot of “when/who/how often” stuff, because that’s how you parcel out the work and create schedules for getting it done. So we went through every question in the DOJ’s document and coded it based on:

When you’d ask each question,

Who you’d probably task with answering it, and

How often you’d be likely to update your response.

Next, we went through the coded questions—repeatedly—to see how we could sort them into categories based on how they were coded. That’s because at this point we had 100+ data points, and we needed to organize them in some way to make using them more achievable.

And we found that we could sort them into three big practical categories, which we called “Governance and Structure,” “Program Operations,” and “Incident Response.” Each category reflects a consistent approach to when you’ll answer its questions, who will answer them, and how often you’ll need to check in for an update.

And that’s how we organized the final version; it uses the DOJ’s original questions—they’re just sorted by how you’ll actually answer them.

That’s the high-level process for reframing something complicated: (1) figure out how it’ll be used, (2) “code” the pieces of it based on how they’d be used, and (3) identify patterns that let you sort those pieces into new categories to simplify the result.
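If it helps to see the coding-and-sorting step concretely, here is a minimal sketch in Python. The question text, role names, and the rule that maps tags to categories are all invented placeholders (not the actual roadmap logic); the point is just the mechanics of tagging each question with when/who/how often and letting a simple pattern rule put it in a bucket.

```python
# Sketch only: made-up questions and tags to illustrate the coding step.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class CodedQuestion:
    text: str     # the DOJ's original question, verbatim
    when: str     # "proactive" or "reactive"
    who: str      # who you'd task with answering it
    cadence: str  # "on change", "regular rhythm", or "per incident"

# Hypothetical examples -- your real coding would cover all 100+ questions.
questions = [
    CodedQuestion("How is the compliance function structured and resourced?",
                  when="proactive", who="Deputy CCO", cadence="on change"),
    CodedQuestion("How do managers reinforce controls for this risk area?",
                  when="proactive", who="risk owner", cadence="regular rhythm"),
    CodedQuestion("What remediation followed the misconduct?",
                  when="reactive", who="investigator", cadence="per incident"),
]

def category(q: CodedQuestion) -> str:
    """Map a when/who/cadence pattern to one of the three buckets.
    This rule is a placeholder, not the roadmap's actual mapping."""
    if q.when == "reactive":
        return "Incident Response"
    if q.cadence == "on change":
        return "Governance and Structure"
    return "Program Operations"

buckets = defaultdict(list)
for q in questions:
    buckets[category(q)].append(q.text)

for name, items in buckets.items():
    print(f"{name}: {len(items)} question(s)")
```

The reason to code first and categorize second is that the categories fall out of the patterns in your tags, rather than being imposed up front.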

Example: Senior and Middle Management

Let’s look at how this shakes out for an “easy” one. This is a screenshot of the questions from the DOJ’s “Senior and Middle Management” topic:

It’s only 9 questions—so, easy. Right?

Sure. Until you go to use it.

Because you’ll find that those questions will be answered by different people, at different points in time, and will be updated on different rhythms. And you need to sort that out before you can, you know, answer them.

So, we applied the process we described above, and here’s what happened.

First, a couple of questions fell into what we called the “Governance and Structure” category. Questions in this category get at how your program is set up and governed. The pattern we saw for the “when, who, how often” was this:

When: you’ll answer these proactively. They’re not tied to a specific compliance incident—you don’t need to wait for the hotline to ring.

Who: you’ll give these to a senior team member (like your Deputy CCO) engaged with high-level, program-wide issues like reporting lines, budget, compensation, and board oversight.

How often: you’ll update your answers when things change—because the types of issues these questions tackle don’t usually change very often, and you should know when they do.

Next, about half of the questions fell under the “Program Operations” category.

These questions are about day-to-day risk management stuff. The pattern for “when, who, how often” was this:

When: you’ll answer these operations-focused questions proactively, just like the Governance and Structure questions.

Who: you’ll give these to individual risk owners to complete on a risk-by-risk basis. Your program’s operations will vary by risk—so you’ll want to know how things work for each key risk you have.

How often: you’ll do these on a regular rhythm as a health check. These questions get operational, and operations change fast. A time-based cadence helps make sure you don’t miss something.

Finally, let’s look at the most straightforward questions—the ones we’ve put under the “Incident Response” category:

These questions get at specific compliance issues your company has faced. The “when, who, how often” pattern looks like this:

When: you’ll answer these reactively, in response to specific compliance issues.

Who: you’ll give these to the investigator and risk owner(s) relevant to the specific issue.

How often: you’ll do these in response to significant cases—for example, cases you’d flag to your board in a quarterly meeting. (If that’s more than a few a year, you might be over-reporting. Ask outside counsel for a refresher on board duties.)

Now, if you looked at the pictures closely, you probably noticed there’s overlap between the “Program Operations” and “Incident Response” categories. Some of the questions appear in both categories.

That was deliberate. The DOJ’s document contains about a half-dozen compound questions that can be answered partially proactively, partially reactively—so we listed them in both categories.
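If you’re tracking this in a spreadsheet or script, the practical implication is just that a question’s category is a list rather than a single value. Here’s a tiny sketch (with an invented question) of how a compound question lands in both buckets:

```python
# Sketch only: an invented compound question tagged with both categories,
# so it shows up in both buckets when you sort.
compound = {
    "text": "How are lessons from past incidents fed back into training?",
    "categories": ["Program Operations", "Incident Response"],
}

buckets = {
    "Governance and Structure": [],
    "Program Operations": [],
    "Incident Response": [],
}
for cat in compound["categories"]:
    buckets[cat].append(compound["text"])
```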

And that’s it! The other DOJ topics shake out the same way.

Now, if doing this seems like a ton of work, it is. Translating something into a new context requires you to master both contexts, and it takes a ton of time to execute. (That’s why making good compliance training is so hard—you have to frame it around the stuff your audience actually does at work, instead of just telling them about anti-corruption or privacy or whatever.)

But it’s also worth it.

Because compliance is about driving behavior, not just conveying abstract knowledge. And so investing the time to make something easy for your audience to apply—so that they can actually use it—is always time well spent.

