ISSA Journal – The Global Voice of Information Security | May 2008
Security Metrics
Hype, reality, and value demonstration
By Aurobindo Sundaram - ISSA member, Metro Atlanta, USA chapter
This article demonstrates the difference between measurements and metrics; the dangers of using metrics which detract from the security program's value; the use of normalization to make metrics more consistent; and tips for creating the right metrics for different audiences.
Security metrics are all the rage these days. One can hardly open a security- or IT-related magazine without hearing about how security professionals need to "measure" their programs, to create and communicate "metrics," to demonstrate "value" to the organization. Depending on the maturity level of the organization, some of the efforts to create metrics have been haphazard, and have detracted from demonstrating long-term value to the organization. Other efforts have been so centered around technical metrics that they have overlooked fundamental aspects of people and process maturity. This article will explain the different kinds of metrics and some of the efforts around creating "numbers that matter" to executive management, middle management, and the IT organization, and present (a) the continued evolution of metrics processes, (b) lessons learned, and (c) a simple three-pronged measurement framework that focuses on information security health, people behavior, and process maturity, which will allow the security professional to present a holistic view of information security risk to all stakeholders.

By reading this article, the reader will understand the difference between measurements and metrics; the dangers of using metrics which detract from the security program's value; the use of normalization to make metrics more consistent; and tips for creating the right metrics for different audiences (one size never fits all).

Background

Back in the 90s (and early 00s), security professionals were an underappreciated lot: expected to run firewalls, stay in dark corners, and mumble geeky terms every time they were placed in front of management. With the ongoing evolution of the Internet, and the rapidity with which so much content has gone online, the security professional's job has suddenly become crucial to the success of an organization. Security professionals (CISOs, in particular) are being asked to report to boards, to interact with business management, and to assist the sales organization in competitive situations, tasks many of them are ill-prepared for because of their primarily technical, non-business background.

A key aspect of security program value demonstration is communication of key performance indicators (KPIs) or, as they are more commonly known, metrics. Reporting has evolved over the years, and best practice security organizations have adapted their metrics to meet business needs as their security organizations and implementations mature – more mature organizations are more integrated and aligned with the business.

Metrics versus measurements

So, what are metrics? People have different definitions, but the following is most appropriate for this discussion: metrics are measures used to indicate progress or achievement (for one list of definitions, see http://www.google.com/search?q=define%3Ametrics). Further, a definition of measurements is important to this discussion: measurements are a quantitative assessment of a phenomenon. Metrics and measurements are both important, but metrics can be improved, while measurements do not need to be. Be careful when you select KPIs that you select ones you have control over and can improve. Security professionals often refer to pure measurements as metrics – for example, the number of firewall "drop" logs. But the security professional has very little recourse to impact this measurement, since the primary drivers of the measurement are all external, i.e., probes from the Internet. There is no prospect for improvement in the score, and no correlation from month to month.

I am in no way suggesting that measurements have no place in the security professional's toolkit. Measurements can (and must) be used in security operations to help analyze trends from month-to-month, or day-to-day; to demonstrate value
to executive management; and to keep track of internal resource/project utilization. What I am suggesting is that a judicious mix of metrics and measurements be used to demonstrate both tangible and improvable value to the organization, as well as operational aspects of information security.

Metrics evolution over the years

The evolution of metrics can be divided into three overlapping versions: V1, V2, and V3. As the profession has matured, metrics have become more aligned with business risk, rather than simply information technology risk. V1 was simply an IT organization creating predominantly technical indicators intended for technical audiences. While they mapped well to technology risk, they often did not address people and process risks. V2 was an incremental improvement on V1; professionals focused on demonstrating risks to middle management as well as measuring program-based metrics. While the advantages of this approach are that it was focused and established some degree of risk ownership, the disadvantages are that it does not bring out the big picture (i.e., enterprise risk) and does not measure the maturity of an organization's security processes. V3 is a more holistic view of an enterprise's risk preparedness, combining aspects of people, process, and technology security into measures that truly indicate the risk and maturity of an organization. For this reason, a combination of V3 (for executive management) and V2 (for middle management) is the optimum method for benchmarking an information security program. The next few sections will explore each of these evolutions and point out their relative advantages and disadvantages.

V1 – just numbers

In the early days, security professionals would create metrics that were predominantly technical in nature. There was often a misunderstanding of what a metric was versus a measurement. There were also metrics that were only peripherally related to risk. For instance, there would be numbers on how many packets a firewall dropped, which is a measurement rather than a metric because it cannot be controlled or influenced by the security professional. There would be metrics on how long it took to crack a password in brute-force mode – which does not really give a good enough indication of risk. After all, a business manager needs to know if a password is strong or not, not whether a password can be cracked in 27.2 minutes or 27.8 minutes. The disadvantage of these technology-based metrics is that they often did not map very well to business risk (or sometimes, not even to technology risk). Figure 1 demonstrates a few V1 metrics along with comments on their relevance.

Evolution of Metrics V1

| Metric | Value | Author comment |
| --- | --- | --- |
| Number of Internet firewall "drop" entries | 2934 | This is excellent as a measurement. For instance, month-to-month correlation of entries could help in detecting targeted attacks. However, security professionals should be careful not to use this as a metric to be improved, since they have very little control over the behavior of users on the Internet. |
| Productivity gained by blocking spam | $3000 | This is an excellent measurement as well: the quantification of the value of spam blocked to the organization – for instance, $0.01 for every message blocked, so in this case 300,000 messages were blocked – can be used in executive presentations to show the ROI of security. Again, this is not generally a metric that can be improved. |
| Compliance with vulnerability assessment procedures | 92% | This is a true metric, which can improve or regress. |
| Amount of time (on average) to brute-force crack a Windows password | 27 min. | This is a true metric because it can be improved, e.g., by forcing longer passwords, complex passwords, etc. However, it also represents the dangers of trying to shoe-horn everything into a measurement. The business executive only cares what risk the 27 minutes introduces to the organization. This metric might be changed to say that passwords that can be cracked in less than three minutes are a critical risk to the organization. Then, the metric would be "% of users who have weak passwords," where weak is defined as "cracked in less than three minutes." This would be an improvable measurement, hence a metric, and would expose the risk to business management. |

Figure 1 – Metrics Evolution V1. Note that numbers and measurements used in all charts in the article are dummy data.

V2 – business unit/region and program focus

Evolution of Metrics V2.0

| Region | Vulnerability assessment | Anti-virus | Security awareness | Enablers (measurement) | Savings (measurement) | DR/BC | Physical security |
| --- | --- | --- | --- | --- | --- | --- | --- |
| North America | 74% | 97% | 80% | 10.4M | 1.5M | 3 | 3 |
| US | 88% | 96% | 85% | 5.4M | .75M | 3 | 3 |
| Canada | 95% | 94% | 87% | 3M | .5M | 2 | 4 |
| Mexico | 62% | 91% | 37% | 2M | .25M | 2 | 3 |
| Asia | 97% | 95% | 87% | 3.3M | 0.6M | 3 | 3 |
| India | 99% | 96% | 82% | 1.7M | 0.34M | 2 | 1 |
| China | 94% | 94% | 84% | 1.6M | 0.26M | 4 | 3 |

Figure 2 – Metrics Evolution V2.0

As the security profession matured and gained a better business focus, professionals started focusing on program measurements and business units (an example in Figure 2). This is still an excellent method to measure risk and communicate it to middle management; however, it falls a little short when it comes to an enterprise focus.
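The spam-blocking ROI measurement described in Figure 1 reduces to a one-line formula. The sketch below assumes the article's illustrative rate of $0.01 of productivity gained per blocked message; each organization would choose its own rate.

```python
# A minimal sketch of the spam-productivity measurement in Figure 1.
# The $0.01-per-message rate is the article's illustrative assumption,
# not a universal constant.

def spam_productivity_value(messages_blocked, value_per_message=0.01):
    """Dollar value of productivity gained by blocking spam."""
    return messages_blocked * value_per_message

# Figure 1's example: 300,000 blocked messages yield $3,000 of value.
print(spam_productivity_value(300_000))
```

Because this is a measurement rather than a metric, the output is useful for trend analysis and ROI presentations, not as a number to "improve."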
The advantages of using a business unit/region and program focus are:

• Focus: By measuring individual programs, it is easy to detect weak performers and focus effort on them.

• Ownership: Business units have line managers who are ultimately responsible for the security program. Programs also, generally, have an owner, or equivalent. The advantage of dedicated ownership is that executives always know whose throat to choke. Metrics in this evolution are excellent for use with middle management and technology/network management. They are not always suitable for use with executive management.

The disadvantages of using a business unit/region and program focus are:

• Lack of enterprise focus: There is not always an easy way to roll up results to an enterprise risk score. Attempts to do this using averages and weighting can result in inaccuracies.

• Limited focus on maturity: Most metrics in this evolution are technology-focused, not control- or process-focused. Efficiencies of execution, process maturity, and user behavior are not generally measured.

V3 – proposed maturity model

Good practice organizations in the V3 evolution of security metrics work at a holistic, process- and framework-based level. They measure impact more than they do actions; they measure maturity more than they do execution; they measure behavioral change more than they do test-taking.

The following three-pronged model is proposed for measuring and communicating security metrics to the organization: information security health, people behavior, and process maturity. Each of these is explained in more detail below.

1. Information security health index

This generally technical metric gives a single indicator of the wellness of an organization's control framework, which may include an integration of results from the following:

• External vulnerability assessments – A score indicating risk from Internet-facing hosts, e.g., the percentage of hosts without any High or Medium vulnerabilities

• Internal vulnerability assessments – A score indicating risk from Intranet hosts

• Patch management results – A score indicating compliance levels with current patches

• Anti-virus preparedness – A score indicating compliance levels with required anti-virus practices

• Compliance with technical standards – Scores indicating compliance levels with best practice system configuration standards

Most often, the components of this index are scored from 0-100 to aid in consistency of communication. Further, each component has a weight that it contributes to the total health index. For instance, in Figure 3 below, the external vulnerability health score is worth 25% of the total health index because of its criticality. This is quite similar to the weighting done to create stock market benchmark indexes such as the S&P 500. The components of this index are measured at least every month; care should be taken to assign the correct weight to the individual components, based on perceived risk to the organization.

Information Security Health Index

| Month | External Vulnerability Health Score | Web Vulnerability Health Score | Patch Management Health Score | Anti-Virus Health Score | Total Health Index |
| --- | --- | --- | --- | --- | --- |
| January | 86.00 | 78.14 | 89.11 | 98.30 | 88.80 |
| February | 87.00 | 80.90 | 89.10 | 98.41 | 89.78 |
| March | 91.00 | 94.01 | 88.69 | 98.03 | 93.86 |
| Weight | 25% | 25% | 15% | 35% | |

Figure 3 – Metrics Evolution V3 Information Security Health Index

2. People behavior index

This index is an indicator of how information security standards, controls, technology, and training are modifying the behavior of people in the organization. For example, it is not enough if an employee simply takes security awareness training. After all, most organizations require this, so the score would always be close to 100%. It is more important that the employee's behavior changes for the better, e.g., are they less likely to download unauthorized software? Mature organizations realize that compliance is not security, and measure the underlying impact of a control rather than just the control itself. Some examples of components that make up this metric are:

• Security awareness training scores (this is measuring compliance)

• Detection of unauthorized software downloads on desktops (this is measuring behavior; this metric can easily be measured using software management tools such as SMS)

• Results of internally performed phishing and social engineering tests (this metric can require some internal effort and programming, but is well worth the effort)

• Password strength (this measures compliance as well as behavior)

• Amount of sensitive information sent without sufficient protections (for instance, measuring the amount of sensitive email)
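The weighted roll-up behind the health index is straightforward to sketch. The following is a minimal illustration, not the author's implementation; component names are invented for the example, and the scores and weights are the dummy January row from Figure 3.

```python
# A minimal sketch of the weighted health index in Figure 3: each
# component is scored 0-100 and contributes a fixed weight to the
# total, much like the weighting in stock market benchmark indexes.
# Component names are illustrative; data is Figure 3's January row.

def health_index(scores, weights):
    """Weighted sum of component scores (each 0-100); weights sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must total 100%"
    return sum(scores[name] * weights[name] for name in weights)

scores = {"external_vuln": 86.00, "web_vuln": 78.14,
          "patch_mgmt": 89.11, "anti_virus": 98.30}
weights = {"external_vuln": 0.25, "web_vuln": 0.25,
           "patch_mgmt": 0.15, "anti_virus": 0.35}

# Matches Figure 3's January Total Health Index (88.80) to rounding.
print(round(health_index(scores, weights), 1))  # 88.8
```

The same weighted-sum pattern underlies the behavior and maturity indices in the following sections; only the components and weights change.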
User Behavior Index

| Month | Password Strength Index | Awareness Score | Laptop Encryption Compliance | Software Compliance | User Behavior Testing | Total Behavior Index |
| --- | --- | --- | --- | --- | --- | --- |
| January | 98.70 | 99.29 | 92.10 | 98.30 | 68.00 | 90.98 |
| February | 98.00 | 99.62 | 97.16 | 99.66 | 75.00 | 93.67 |
| March | 99.00 | 99.72 | 96.53 | 99.66 | 89.00 | 96.59 |
| Weight | 25% | 20% | 25% | 10% | 20% | |

Figure 4 – Metrics Evolution V3.0 User Behavior Index

It is a challenge for many organizations to reach this plateau because of their focus on programs (as presented in evolution V2). Most security professionals are happy with their 98% security awareness training score (which is certainly important). However, it is important to measure whether people are following the principles they learned in the training, because the true measure of security awareness is whether it changes people's behavior to be more secure. An example of this index and its components is shown in Figure 4.

3. Process maturity index

This is an indicator of how mature the processes for information security are. Maturity is often measured by preferring automated controls over manual ones, because automated controls are less likely to fail once set up properly; by preferring preventative controls over detective ones, because preventative controls stop a security threat, rather than simply sense it; and by judging the continuous improvement posture of different processes. This index is the hardest of all to measure because it is inherently subjective, and each person measuring it might measure it slightly differently (independent maturity assessments are too expensive to conduct on a monthly basis, because they require a rigorous analysis by a trained professional).

The security professional should list all the key security processes that are performed, and informally, but rigorously and consistently, perform the following:

1. Rank them based on the Capability Maturity Model (Figure 5), which provides for five maturity levels.

2. Notate gaps in the process that would prevent it from being ranked in a particular maturity level.

3. Use this to drive improvements for the following month. In addition, there are aspects of process maturity that can be measured, e.g., percentage of corporate systems automatically provisioned for new users.

Capability Maturity Model

| Maturity Level | Description | Security Organization Example |
| --- | --- | --- |
| 1 (Initial) | Processes not documented; unstable environment; unable to repeat past successes | Ad-hoc and informal policies. No management commitment. No measurement of results whatsoever |
| 2 (Repeatable) | Some processes are repeatable; basic project management; project status visible at key milestones | Early stages of V1. Some ad-hoc metrics created. Policy and program management in its infancy |
| 3 (Defined) | Process defined and consistent; organization establishes objectives and tracks them | Late stages of V1, early stages of V2. Formal policies and program management documentation (but less at the process level) |
| 4 (Managed) | Process metrics established; quantitative techniques used to analyze and manage goals | Late stages of V2 and V3. Formal policies and program documentation. Process documentation nearly complete, but not always used for improvement. Structured metrics in place and communicated |
| 5 (Optimized) | Process improvement goals are set, continuously measured, and used in a feedback loop to improve the process | All of level 4 + process implementation continuously reviewed and improved (total feedback cycle) |

Figure 5 – Capability Maturity Model

Process Maturity Index

| Month | Vulnerability Management Processes | User Provisioning Processes | Disaster Recovery Processes | Total Process Index |
| --- | --- | --- | --- | --- |
| January | 37 | 23 | 31 | 29.6 |
| February | 38 | 23 | 40 | 32.6 |
| March | 40 | 23 | 41 | 33.5 |
| Weight | 30% | 40% | 30% | |

Figure 6 – Metrics Evolution V3 Process Maturity Index

Note in Figure 6 that the range for this metric is 0-50, to be more consistent with CMM's 1-5 levels. Values are scaled up by 10, so that small changes in maturity can be reflected and communicated.

The advantages of using this system for enterprise-level metrics are:

1. Simplicity: There are three high-level scores that all executives can understand (high is good, low is not).

2. Comprehensive: The system measures aspects of people (behavior), process (maturity model), and technology, which are the three cornerstones of an information security program. It thus eliminates the general over-reliance on technical metrics.

3. Visualization: It is simple to visualize these metrics. For instance, one way to do it may be to create an Excel graph with the index as the primary bar graph and all the elements as individual line graphs, as shown in Figure 7. Since all metrics are normalized, trends in individual elements can easily be observed.
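The process maturity index from Figure 6 can be sketched the same way as the other two indices. This is a hedged illustration, not the author's tooling: the process names are invented, and the CMM ranks are back-computed from Figure 6's January row (scaled scores of 37, 23, and 31 imply ranks of 3.7, 2.3, and 3.1).

```python
# A sketch of the process maturity index in Figure 6: each key process
# is ranked on the CMM's five levels (decimals allowed for partial
# maturity), scaled up by 10 to a 0-50 range, and weighted by perceived
# risk. Names and ranks below are illustrative only.

def process_maturity_index(cmm_ranks, weights):
    """Weighted 0-50 index from CMM ranks (1-5), scaled up by 10."""
    return sum(cmm_ranks[p] * 10 * weights[p] for p in weights)

ranks = {"vuln_mgmt": 3.7, "user_provisioning": 2.3, "disaster_recovery": 3.1}
weights = {"vuln_mgmt": 0.30, "user_provisioning": 0.40,
           "disaster_recovery": 0.30}

# Matches Figure 6's January Total Process Index (29.6).
print(round(process_maturity_index(ranks, weights), 1))  # 29.6
```

Scaling by 10 is purely presentational, as the text notes: it lets a month-over-month change of a few tenths of a CMM level show up as whole points.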
There are certainly some disadvantages with using this metrics system:

1. The weighting and consolidation of multiple metrics into a single index may lead to situations where, although the index score is high, individual elements are scoring poorly. This can lead to a false sense of security. The appropriate visualization technique should be used to help mitigate this risk (for instance, see Figure 7, where it is obvious that the web vulnerability program needs substantial improvement, even though the total health index barely budged).

2. The weights assigned to different elements must be carefully analyzed to ensure that they are assigned according to perceived risk to the organization. The best way to do this is by discussing risk rankings with business managers, business risk managers, and the internal audit department. If possible, an independent (e.g., internal audit) review of the weighting criteria should be performed.

3. The system is not, by itself, sufficient to measure all aspects of the information security program. It must be supplemented by a drill-down to business units (e.g., as demonstrated in V2) so that the appropriate delegation of accountability can be performed. In fact, it may be possible to derive certain V3 metrics from V2 metrics such as vulnerability management, anti-virus management, and patch management – basically, any program that can produce reports both by business unit and at an enterprise level without substantial modification – thus leading to synergies in analysis.

[Figure 7 – Information Security Health Index: a monthly (January-April) bar graph of the Total Health Index, overlaid with line graphs for the External Vulnerability, Internal Vulnerability, Web Vulnerability, Patch Management, Anti-virus, and Technical Standards health scores.]

Conclusion

Security professionals are being asked to measure the value of their information security programs and demonstrate the continued maturity of their organizations. I have described the difference between metrics and measurements and the evolution of metrics in the information security field, and presented a model that security professionals may consider using in their organizations. I believe that a combination of high-level executive metrics and lower-level business-unit and program-based metrics is sufficient to demonstrate the value of an information security program.

About the Author

Aurobindo Sundaram is the vice president of information security at ChoicePoint, Inc., Alpharetta, GA. He has worked in the information security industry for more than 10 years and is responsible for articulating the vision and supervising the implementation of ChoicePoint's Security Control Framework. He can be reached at aurobindo.sundaram@choicepoint.com.
Examples of Metrics to Use for Different Audiences

| Audience | Appropriate Metric or Measurement | Comments |
| --- | --- | --- |
| Executives/Board of Directors | Information Security/People Behavior/Process Maturity Health Indices | High-level, enterprise-centric, holistic metrics that executives can relate to |
| | Revenue contributed to/protected due to information security posture | This measurement could include RFPs that have been responded to and customers retained due to the excellent information security program. Note that only some organizations will be able to use this measurement. |
| | (Year to year) services, capital, and headcount trends mapped against company revenue | These measurements allow you to show executives relative spending of the security program. Over the long term (year to year, not month to month), you should expect to show the flat to slightly increasing use of automation (i.e., capital) and the decreasing use of expenses (i.e., headcount). |
| Middle Management | Business-unit based drilldown metrics for programs that contribute to the three executive indices | These are actually a set of metrics and measurements that are scoped down to the business manager's responsibility (see evolution V2). This is particularly useful for accountability. |
| | Revenue contributed to/protected due to information security posture (focused by business unit/region) | This measurement could include RFPs that have been responded to and customers retained due to the excellent information security program. Note that only some organizations will be able to use this measurement. |
| | Information Security/People Behavior Health Indices | While these two metrics are generally targeted at executives, the entire organization benefits from seeing metrics that show trends in the information security posture of the organization. Process maturity is not an easily understood term and is best left out of general communications to users. |
| IT/Security Organization | Vulnerability assessment metrics | Web application scan results (for the development organization) and network vulnerability scan results (for the operations organization) are excellent, high-impact metrics that can be used to drive improvement. |
| | Technical standards and patch management metrics | This metric focuses on initial setup and continuous management of systems, and is excellent both technically and to verify that build processes are working efficiently. |
| | Number of attacks on systems | This measurement allows the IT organization to identify trends of attacks on the corporation. |
| Entire Organization/End Users | Productivity gained by blocking spam/virus/malware in email or web traffic | Use a simple formula such as $0.01 of productivity gained for every spam blocked to compute this measurement. |
| | Results of internally performed phishing and social engineering simulations (normalized) | Use a sample (say 100) of users to target with an attack, then measure the success rate (i.e., how many users did the "right" thing). Then use awareness training and targeted messaging to improve this score. |
| | Information Security/People Behavior Health Indices | While these two metrics are generally targeted at executives, the entire organization benefits from seeing metrics that show trends in the information security posture of the organization. Process maturity is not an easily understood term and is best left out of general communications to users. |
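The normalized phishing-simulation metric suggested in the table above can be sketched in a few lines. This is an illustrative assumption of how the normalization might work: target a fixed sample of users, then report the percentage who did the "right" thing, so results are comparable across tests of different sizes. Function and variable names are invented for the example.

```python
# A sketch of a normalized phishing-simulation score: the percentage
# of targeted users who did the "right" thing (reported or ignored the
# lure). Normalizing to 0-100 keeps monthly tests comparable even when
# the sample size changes.

def phishing_score(users_targeted, users_resisted):
    """Percentage (0-100) of targeted users who did the right thing."""
    if users_targeted <= 0:
        raise ValueError("sample must contain at least one user")
    return 100.0 * users_resisted / users_targeted

# e.g., 100 users targeted, 68 reported or ignored the simulated lure.
print(phishing_score(100, 68))  # 68.0
```

This normalized score can then feed directly into the people behavior index as one of its weighted components.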