Maturity in Measuring Security Awareness Program Success
We’re big fans of maturity models in the technology industry – from CMM onward, we talk a lot about the maturity of various processes within our organizations. And I’ve started to hear more and more people talk about the maturity of an organization’s security awareness program. Lance Spitzner has been a big advocate of this, as have a few others. And it’s about time.
The word thrown around most often when talking about maturity is “metrics”. But that’s like saying that the way to have a mature weight-loss program is “diet and exercise” – while it may be a true statement, it’s a little light on the specifics. And, as with all significantly complex undertakings, the devil is in the details.
While defining all of the measurements one needs to take to be effective is a much bigger topic, the easiest place to start (and the one I’ve seen most of us get wrong) is a model for the maturity of how an organization defines success in security awareness.
Here are the five levels we use at MAD to describe how the organization is determining if the security awareness program is “successful”:
Level I – No Measurement
The organization tracks no metrics that determine the success of their user behavior modification efforts.
Level II – Compliance-Driven Measurement
The organization takes metrics based around proving that users participated. This is where most organizations with standard “annual awareness training” are sitting – the goal of the measurement is to prove that a sufficient number of users participated in the program to satisfy the compliance auditors.
Level III – User Satisfaction
The organization tracks not only the compliance-required metrics, but also surveys its users to determine that they are receiving the awareness training and are satisfied with it. While this type of survey is widely used (even by some who claim to be doing security awareness “research”), these surveys are fraught with issues. (Foreshadowing: Kati will have an upcoming post on the problems with surveys and the need for us as an industry to do better here)
Level IV – Behavior-Based Metrics
The organization tracks measurements based on each behavior that they hope to achieve from their users. Separate measurements are made for each of the organization’s priority metrics and are tracked over time.
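To make Level IV concrete, here is a minimal sketch of per-behavior tracking over time. The behavior name, reporting periods, and rates are hypothetical examples, not part of any standard or of the MAD model itself; the point is simply that each priority behavior gets its own measurement history and trend.

```python
from dataclasses import dataclass, field

@dataclass
class BehaviorMetric:
    """One priority behavior, measured repeatedly over time."""
    name: str
    history: list = field(default_factory=list)  # (period, rate) tuples

    def record(self, period: str, rate: float) -> None:
        self.history.append((period, rate))

    def trend(self) -> float:
        """Change from the first to the latest recorded rate.
        Negative is improvement for 'bad' behaviors like phishing clicks."""
        if len(self.history) < 2:
            return 0.0
        return self.history[-1][1] - self.history[0][1]

# Hypothetical phishing-simulation click rates by quarter
phish_clicks = BehaviorMetric("phishing_click_rate")
phish_clicks.record("2023-Q1", 0.18)
phish_clicks.record("2023-Q2", 0.12)
phish_clicks.record("2023-Q3", 0.09)
print(round(phish_clicks.trend(), 2))  # -0.09 (click rate dropped 9 points)
```

A separate `BehaviorMetric` per target behavior (reporting suspicious email, locking screens, etc.) is what distinguishes this level from a single participation number.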
Level V – Composite User Behavior Dashboard
The organization tracks a composite metric, or package of metrics, based on user behavior that lets them know at a moment’s glance the forward motion of their user security efforts and whether those efforts tie to the strategic goals of the security organization.
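One simple way to build such a composite is a weighted average of normalized behavior scores. Everything here is an assumption for illustration – the behavior names, the normalization (1.0 = ideal), and the weights would come from your own program’s strategic priorities:

```python
def composite_score(metrics: dict, weights: dict) -> float:
    """Weighted average of per-behavior scores, each normalized to 0.0-1.0
    where 1.0 represents the desired behavior fully achieved."""
    total_weight = sum(weights[name] for name in metrics)
    return sum(metrics[name] * weights[name] for name in metrics) / total_weight

# Hypothetical current-quarter scores, already normalized
current = {
    "phish_report_rate": 0.62,      # fraction of simulated phish reported
    "phish_click_avoidance": 0.91,  # 1 - click rate
    "mfa_enrollment": 0.97,
}
# Weights reflect (assumed) strategic priority of each behavior
weights = {"phish_report_rate": 2.0, "phish_click_avoidance": 2.0, "mfa_enrollment": 1.0}

print(round(composite_score(current, weights), 3))  # 0.806
```

Tracked quarter over quarter, a single number like this is what gives leadership the “at a glance” view the level describes, while the underlying per-behavior metrics remain available for drill-down.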
In my personal experience, very few organizations achieve the fourth level of this type of maturity, and only a couple actually reach Level V. That’s unfortunate, because it’s those organizations that see significant security benefit from their user behavior programs – they’re the ones actually seeing a reduction in security incidents and better security performance because of their users (rather than in spite of them).