Tuesday, 14 July 2020

Designing an Information Security Campaign

For the past few months, I've been kicking some compliance training ideas around in my head. Since my memory is less than perfect, and (more importantly) since these ideas help nobody while they're inside my head, I've taken some time to put them to paper (literally). Then I had to spend a little more time deciphering my chicken scratch.

In order to give them some coherence, I've organized them into a sample compliance program based around information security.

Here's what I have for you so far.


Key Considerations
  • Frequent “top-ups”, to help keep the information fresh. There’s something happening every month. This also reinforces that information security is an ongoing activity, not a “one and done” event.
  • Quick hits. Although there’s something happening every month, none of it is long.
  • Real KPIs (Key Performance Indicators), based on information security red flags and actual incidents, not on vanity metrics (such as course completions or test scores).
  • Real-world tests, because life doesn’t ask you multiple-choice questions. 
  • A course, but not always, not for everybody, and not necessarily all of it (more on this later).
  • Support elements, available as needed (and only if needed).
  • Story-based delivery. The "When Spies Attack" series illustrates information security concepts in a narrative-driven manner.
  • Variety. Some years “When Spies Attack” will be a comic, other years it’ll be a podcast, maybe it's an episodic game another year.

The Course

The courses, actually. Information security is broken up into smaller, self-contained chunks (remember what I said about "quick hits"?). There's a course on safe computing, covering topics like spotting phishing emails and why not to plug outside devices into your computer. The second course is on safe file handling, which includes both physical files (clear desk policy) and digital files (data protection). The final course covers physical security concerns, such as preventing unauthorized access to secured areas.

I have no intention of forcing people through these courses year after year. To get around that, we have opportunities for exemptions. In a perfect world, nobody would take the courses more than once.
For starters, we'll have a pretest to gauge existing knowledge. If someone can ace the pretest, they'll automatically get credit for the course. People who don't score high enough to skip the whole course can still skip the portions that correspond to the pretest questions they got right.
After someone gets credit for the course, they unlock an "express recertification" option for the following year. The following year, they won't have to redo the course (unless they really want to). Instead, they can quickly review the policy and check a box to acknowledge that they agree to abide by it. The "express recertification" option is available to anybody who hasn't proven themselves to be a risk (more on that in a bit).
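The pretest/exemption rules above boil down to a small decision function. Here's a minimal sketch in Python; the threshold, field names, and return shape are all my own assumptions for illustration, not anything from a real LMS:

```python
# Hypothetical sketch of the pretest / express-recertification rules.
# The threshold and data shapes are assumptions, not real system values.

PASS_THRESHOLD = 0.9  # assumed score needed to test out of a section


def course_assignment(pretest_scores: dict[str, float],
                      is_flagged_risk: bool,
                      credited_last_year: bool) -> dict:
    """Decide what a learner must take this year.

    pretest_scores maps each course section to the learner's score
    (0.0 to 1.0) on the pretest questions covering that section.
    """
    if credited_last_year and not is_flagged_risk:
        # Express recertification: review the policy, check the box, done.
        return {"mode": "express_recertification", "sections": []}

    overall = sum(pretest_scores.values()) / len(pretest_scores)
    if overall >= PASS_THRESHOLD:
        # Aced the pretest: automatic credit for the whole course.
        return {"mode": "credit_by_pretest", "sections": []}

    # Otherwise, skip only the sections whose pretest questions were passed.
    remaining = [s for s, score in pretest_scores.items()
                 if score < PASS_THRESHOLD]
    return {"mode": "partial_course", "sections": remaining}
```

The point of keeping it this small is that the rules stay auditable: anyone can read the function and see exactly why a given learner was (or wasn't) exempted.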


The Policy

Have you seen those books, like Beowulf or Shakespeare, where they have the original (incomprehensible) text on one page, and a plain English translation on the facing page? That's what the policy will look like.
As much as I'd love to have the whole policy written in plain English, I doubt most compliance and legal stakeholders will let us do that. The next best thing is to have "translations" right next to the policy text, so that people understand what they're agreeing to. I've seen banks do this, so why can't we?


Performance Support

Has it ever felt like every system that requires a password has a different set of requirements? Uppercase, lowercase, digits, symbols, the last 53 characters of your genome sequence. Who can keep track?
Here are a few things we can do to make password management less of a burden for people:
  • Display password requirements on the screen whenever people are required to select or change their password. I've seen it done sometimes; I'd like to see it done all the time.
  • Proactively send people tips on making good passwords when their current password starts to get stale (like that "your password will expire in 12 days" message).
  • Identify a password manager app and recommend it to people.

Non-Training Solutions

There are things that IT is almost certainly already doing which can be better integrated into a more comprehensive solution. For instance:
  • Monitoring for unauthorized devices being plugged into work computers.
  • Monitoring network files and email for sensitive data.
Imagine if L&D had access to that data. Birgit plugged a personal thumb drive into her laptop? She just lost her exemption to next year's safe computing course. Nathan tried to email sensitive information in an unencrypted message? He no longer qualifies for express recertification on safe file handling.
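If L&D did get a feed of those IT incidents, tying them back to exemptions could be as simple as a lookup table. A hypothetical sketch (the incident names and course IDs are made up):

```python
# Hypothetical mapping from an IT-detected incident type to the course
# exemption that incident should revoke. All names are illustrative only.
INCIDENT_TO_COURSE = {
    "unauthorized_device": "safe_computing",
    "unencrypted_sensitive_email": "safe_file_handling",
    "tailgating_detected": "physical_security",
}


def revoke_exemptions(exemptions: set[str], incidents: list[str]) -> set[str]:
    """Return the exemptions that survive after this period's incidents."""
    revoked = {INCIDENT_TO_COURSE[i] for i in incidents
               if i in INCIDENT_TO_COURSE}
    return exemptions - revoked
```

So in Birgit's case, one "unauthorized_device" incident strips the safe computing exemption while leaving her other exemptions intact.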


Real-World Testing

Multiple-choice tests aren't realistic, so on their own they don't cut it. We'll still have some in the courses, but the real testing happens outside the course, when people aren't expecting it.
Here are a few ideas, all of which should be scattered through the year (e.g. target some people in January, others in February, and so on):
  • Fake phishing emails. The design and level of sophistication should be at least equal to what our people are being hit with "in the wild".
  • Mystery shoppers. Hire outside testers to attempt to access secure areas through piggybacking, or to obtain sensitive information through shoulder surfing or eavesdropping.
  • Lures. I read about a bank years ago that tested their staff by leaving thumb drives in the parking lot. An alarming number of employees picked them up and plugged them into their work computers, putting the network at risk.
Again, if someone fails these real-world tests, they've proven themselves to be a risk to the organization and flagged themselves as needing more training or guidance.


Evaluation

There will be two tiers of evaluation. The first evaluates the course and/or the policy acknowledgement. It's similar to a standard L1 evaluation, but somewhat more useful. What we're evaluating is whether people know what to do.
I propose we have people rate the degree to which they agree with the following statements:
  1. I know what the policy is.
  2. I know what is expected of me.
  3. I know how to recognize breaches or wrongdoing.
  4. I know how to report them.
  5. I know how to get answers to any questions I may have.
The real evaluation is tied to actual incidents (those things IT is monitoring under "Non-Training Solutions" above) and red flags (areas where people messed up under the "Real-World Testing" heading).

Here's a sample of the dashboard I have in mind:
[Image: Information Security Dashboard]

Notice how it doesn't say anything about how many people completed training? That's deliberate. Those numbers don't matter. What matters is how people behave in the real world, so that's what we're reporting in the dashboard.
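A behavior-based dashboard like this reduces to a few failure rates computed from the real-world test results. A minimal sketch, with a record structure I've assumed purely for illustration:

```python
# Minimal sketch of behavior-based KPIs for the dashboard.
# Each record is assumed to look like:
#   {"test": "phishing" | "mystery_shopper" | "lure", "failed": bool}


def kpi_rates(records: list[dict]) -> dict[str, float]:
    """Compute the failure rate per real-world test type."""
    totals: dict[str, int] = {}
    failures: dict[str, int] = {}
    for r in records:
        totals[r["test"]] = totals.get(r["test"], 0) + 1
        if r["failed"]:
            failures[r["test"]] = failures.get(r["test"], 0) + 1
    # Rate = failures / attempts, per test type.
    return {t: failures.get(t, 0) / n for t, n in totals.items()}
```

Note there's no completion count anywhere in the calculation; the only inputs are what people actually did when tested.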

The Calendar

Here’s how it plays out over the course of a year:

 January      Announcement of annual campaign from head of organization. Include previous year's dashboard results.
 February     Policy acknowledgement
 March        "When Spies Attack" issue 1 (Suspicious Email)
 April        Course 1 (Safe Computing)
 May          Optional micro-module: Selecting Secure Passwords
 June         "When Spies Attack" issue 2 (Piggybacking)
 July         Course 2 (Safe File Handling)
 August       Announcement: reminder about the password manager app
 September    "When Spies Attack" issue 3 (Shoulder Surfing and Eavesdropping)
 October      Announcement: reminder about the password manager app
 November     Course 3 (Physical Security)
 December     "When Spies Attack" issue 4 (Unsecured Documents)


Over to you
Is there anything you can use here? Do you have any ideas for better compliance campaigns? I’d love to hear from you in the comments.
