- 1. Continuous Monitoring 2.0:
Cloud-based Benchmarking in Industry
and the Federal Government
Keren Cummins, Director, Federal Programs
nCircle Company Confidential © 2012 nCircle. All Rights Reserved.
- 2. nCircle at a Glance
• More than 6,500 customers worldwide
• 10 consecutive years of revenue growth
• 150 employees, with significant investment in R&D and continued innovation
• Core business is VA, Configuration Compliance, File Integrity Monitoring,
PCI, Performance Management
• Ranked in Inc. 5000 six years in a row
• Ranked one of San Francisco Bay Area’s Top 100 Fastest Growing Private
Companies
- 3. Agenda
• The evidence for benchmarking as an
essential element of success in continuous
monitoring
• Commercial initiative in cloud-based
benchmarking
• Mapping this initiative into the federal space
• Your feedback!
- 4. Defining Terms
• Continuous Monitoring - in the context of information security, defined
in NIST SP 800-137 as “maintaining ongoing awareness of information
security, vulnerabilities, and threats to support organizational risk
management decisions.”
• Benchmarking - the process of comparing one's business processes
and performance metrics to industry bests and/or best practices from
other industries. Dimensions typically measured are quality, time and
cost.
- 5. Game Changers
• State Department
– 89% risk reduction in the first 12
months across the entire world
• USAID
– FISMA C- to consistent A+’s for five
years
• Center for Medicare/Medicaid
Services (CMS)
– 80% risk reduction at 88 data
centers and as high as 95% at one
major center
- 6. Common Elements
• Breadth of engagement
• Simplicity of result
• Context
• Short cycle time
- 7. Why hasn’t everyone done this?
• Or, why is this hard?
– Metrics are hard
– My organizational structure is different
– My monitoring solution won’t do that
- 8. The Challenge for Security Performance
Management
• How can we replicate benchmarking
success effectively?
– With the organizations and tools that
we already have in place?
– For all our security programs (not just
vulnerability management and
configuration auditing)?
- 10. The CISO needs what the CFO has….
• CISO needs a metrics language to describe a
company’s security performance just like the
CFO describes financial performance
• CISOs can now field a formal security
performance management program built on
objective, fact based metrics that
– Shows how security organization is protecting the
company
– Benchmarks performance vs. internal goals, and
vs. industry peers
– Trends performance over time
- 11. With a Security Performance Management
Program, CISOs can demonstrate that
• There is a comprehensive approach to security
that is…
– Measured against specific goals & standards
– In line with our risk tolerance
– Aggregated by meaningful asset groupings
– Equal to or better than our own
industry's investment & performance
– Controls aligned with GRC objectives
• Based on actual data on an ongoing basis
that we can rely on to make decisions on:
– Investment
– Execution
– Resource allocation
- 12. Security Metrics & Scorecards - the cornerstone
of an effective IT GRC assessment
• Metrics affirm the existence and effectiveness of security
controls
• Scorecards enable and evidence management oversight;
communicate performance and evaluate corrective actions
• Well-constructed metrics and scorecards:
– Continuously monitor controls
– Deliver trusted, timely, and actionable decision making information
– Identify and communicate concentration of risks
– Align security initiatives with business objectives
- 13. An Effective Security Performance Management Solution
Proven Metrics and Scorecards
• Measure performance to goals
• Cover the entire IT ecosystem
• Objective, fact-based metrics
• Relevant & actionable
• Benchmark with peer groups
Key questions:
• How secure and compliant is our enterprise?
• How do we compare to others?
• Are we investing effectively?
[Diagram: IT Security Ecosystem, spanning Event Management & Incident
Response; Antivirus & Endpoint Protection; Network Protection; Endpoint
Encryption; Vulnerability Management; Configuration Auditing; Identity &
Access Management; Patch Management]
- 14. Valuable Peer Benchmarks
[Diagram: weekly performance benchmark showing participant results against
the benchmark performance standard and benchmark performance quadrants]
- 15. Analyze performance against Benchmarks &
Identify underperforming areas
- 16. Over 1,000 companies have joined nCircle Benchmark to date
[Chart: nCircle Benchmark accounts over time, approaching 1,000 as of 7/20/12]
Financial Services Bellwether Metrics

Metric                         Benchmark Average  Benchmark Median  Benchmark Quartiles
Average CVSS score (per host)  172                33                Top 25%: 0–5; Second Quartile: 6–33; Third Quartile: 34–67; Bottom 25%: 68–700
Average days since last scan   23                 9                 Top 25%: 0–1; Second Quartile: 2–9; Third Quartile: 10–32; Bottom 25%: 33–90
Virus definition age (days)    29                 22                Top 25%: 0–2; Second Quartile: 3–22; Third Quartile: 23–40; Bottom 25%: 41–56
Failed logins per attempt      .05%               .04%              Top 25%: .00–.03%; Second Quartile: .040–.049%; Third Quartile: .05–.08%; Bottom 25%: .09–.11%
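To make the quartile thresholds above concrete, here is a minimal Python sketch of how a participant's weekly metric value could be placed into a benchmark quartile. The function name and the choice of the "average days since last scan" row are illustrative only; this is not nCircle's implementation.

```python
# Hypothetical sketch: map a metric value to its benchmark quartile
# using the "average days since last scan" thresholds from the table.
from bisect import bisect_left

# Upper bound of each quartile, in order (Top 25% through Bottom 25%).
DAYS_SINCE_SCAN_BOUNDS = [1, 9, 32, 90]
QUARTILE_LABELS = ["Top 25%", "Second Quartile", "Third Quartile", "Bottom 25%"]

def scan_age_quartile(days):
    """Return the benchmark quartile label for an average scan age in days."""
    # bisect_left finds the first quartile whose upper bound covers the value;
    # values past the last bound are clamped into the bottom quartile.
    idx = min(bisect_left(DAYS_SINCE_SCAN_BOUNDS, days), len(QUARTILE_LABELS) - 1)
    return QUARTILE_LABELS[idx]

print(scan_age_quartile(1))   # Top 25%
print(scan_age_quartile(9))   # Second Quartile
print(scan_age_quartile(45))  # Bottom 25%
```

The same table-driven placement works for any of the bellwether metrics: only the bounds list and labels change per metric.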
- 17. Benchmarking in the Federal Space
• All the same security domains as commercial, plus…
• Agencies generate CyberScope continuous
monitoring data, usually from SCAP XML files
• Generated using a wide and growing variety of SCAP
validated solutions, numerous vendors
• Files uploaded to OMB once/month
• The files are
– Human readable? Not so much
– Not well suited to trending
– Not well suited to comparative analysis
– Readily ingested and processed by nCircle Benchmark
data collectors
data collectors
- 19. Asset Classification & Departmental Benchmark
- 21. SCAP Output
• Continuous Monitoring metrics driven directly
from SCAP data
– Asset based Compliance, Vulnerability and
Classification Scorecards
• Asset Grouping identifies areas of improvement and
concentration of risk or examines specific critical
cyber assets
– Intra- and Inter-Agency (Bureau/Service)
Benchmark Comparisons
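As a rough illustration of the kind of ingestion and asset grouping described above, here is a minimal Python sketch that aggregates per-asset scores from a simplified XML export into asset-group averages. The element and attribute names are hypothetical; real SCAP/ARF documents are far richer than this.

```python
# Hypothetical sketch: roll up per-asset CVSS scores into asset-group
# averages, the kind of aggregation a benchmark scorecard might show.
# The XML structure below is invented for illustration, not a SCAP schema.
import xml.etree.ElementTree as ET
from collections import defaultdict

SAMPLE = """<report>
  <asset group="web-servers" cvss="7.5"/>
  <asset group="web-servers" cvss="4.5"/>
  <asset group="databases" cvss="9.0"/>
</report>"""

def average_cvss_by_group(xml_text):
    """Return {asset group: average CVSS score} from a simplified report."""
    scores = defaultdict(list)
    for asset in ET.fromstring(xml_text).iter("asset"):
        scores[asset.get("group")].append(float(asset.get("cvss")))
    return {group: sum(vals) / len(vals) for group, vals in scores.items()}

print(average_cvss_by_group(SAMPLE))
# {'web-servers': 6.0, 'databases': 9.0}
```

Grouping first and averaging second is what lets the same feed answer both questions on the slide: where risk is concentrated, and how one bureau or service compares to another.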
- 23. Asset Identification & Departmental
Comparison
- 27. Benchmark Federal Notional Diagram
[Diagram: CyberScope reporting and benchmark comparisons. CyberScope ingests
asset, vulnerability, and configuration data; internal benchmark scorecards,
organized by asset group (department, agencies, bureaus, locations), draw on
SCAP sources plus local FISMA requirements]
- 28. Questions?
• Contact information:
Keren Cummins, Director
Federal and MidAtlantic Programs
(301) 379-2493
kcummins@ncircle.com