Myths & Realities of Data Security & Compliance - ISACA Atlanta - Ulf Mattsson Jul 22 2016.
Data breaches are on the rise. The constant threat of cyber attacks combined with the high cost and a shortage of skilled security engineers has put many companies at risk. There is a shift in cybersecurity investment and IT risk and security leaders must move from trying to prevent every threat and acknowledge that perfect protection is not achievable. PCI DSS 3.2 is out with an important update on data discovery and requirements to detect security control failures.
In this session, cybersecurity expert Ulf Mattsson will highlight current trends in the security landscape based on major industry report findings, and discuss how we should re-think our security approach.
1. 1
Myths & Realities of
Data Security & Compliance:
Risk-based Data Protection
Ulf Mattsson, Chief Technology Officer, Compliance Engineering
umattsson@complianceengineers.com
www.complianceengineers.com
2. 2
Ulf Mattsson
Inventor of more than 25 US Patents
Industry Involvement
PCI DSS - PCI Security Standards Council
• Encryption & Tokenization Task Forces, Cloud & Virtualization SIGs
IFIP - International Federation for Information Processing
• WG 11.3 Data and Application Security
CSA - Cloud Security Alliance
ANSI - American National Standards Institute
• ANSI X9 Tokenization Work Group
NIST - National Institute of Standards and Technology
• NIST Big Data Working Group
User Groups
• Security: ISACA & ISSA
• Databases: IBM & Oracle
3. 3
My work with PCI DSS Standards
Payment Card Industry Security Standards Council (PCI SSC)
1. PCI SSC Tokenization Task Force
2. PCI SSC Encryption Task Force
3. PCI SSC Point to Point Encryption Task Force
4. PCI SSC Risk Assessment SIG
5. PCI SSC eCommerce SIG
6. PCI SSC Cloud SIG
7. PCI SSC Virtualization SIG
8. PCI SSC Pre-Authorization SIG
9. PCI SSC Scoping SIG Working Group
10. PCI SSC 2013 – 2014 Tokenization Task Force
6. 6
• The Dilemma for CISO, CIO, CFO, CEO, and Board
• Where are my most valuable data assets?
• Who Has Access to it?
• Is it Secure?
• Insider/External Threats?
• Am I Compliant?
• What is/has been the Financial Cost?
• Am I Adhering to Best Practices? How Do I Compare to My Peers?
• Can I Automate the Lifecycle of Data Security?
The Security & Compliance Issue
14. 14
Keep cardholder data storage to a minimum by implementing data retention
and disposal policies, procedures and processes that include at least the
following for all cardholder data storage:
Discovery Results Supporting Compliance
1. Limiting data storage amount and retention time to that which is required
for legal, regulatory, and/or business requirements
2. Specific retention requirements for cardholder data
3. Processes for secure deletion of data when no longer needed
4. A quarterly process for identifying and securely deleting stored
cardholder data that exceeds defined retention.
Old PCI DSS Requirement 3.1
15. 15
• PCI DSS v2 did not have data flow in the 12
requirements, but mentioned it in “Scope of
Assessment for Compliance with PCI DSS
Requirements.”
• PCI DSS v3.1 added data flow into a requirement.
• PCI DSS v3.2 added data discovery into a requirement.
New PCI DSS 3.2 Standard – Data Discovery
Source: PCI DSS 3.2 Standard: data discovery (A3.2.5, A3.2.5.1, A3.2.6) for service providers
17. 17
• IT risk and security leaders must move from trying to prevent
every threat and acknowledge that perfect protection is not
achievable.
• Organizations need to detect and respond to malicious
behaviors and incidents, because even the best preventative
controls will not prevent all incidents.
• By 2020, 60% of enterprise information security budgets will
be allocated for rapid detection and response approaches, up
from less than 20% in 2015.
Shift in Cybersecurity Investment
Source: Gartner - Shift Cybersecurity Investment to Detection and Response, 7 January 2016
18. 18
Growing Information Security Outsourcing
The information security market is estimated to have
grown 13.9% in revenue in 2015
with the IT security outsourcing segment
recording the fastest growth (25%).
Source: Gartner Forecast: Information Security, Worldwide, 2014-2020, 1Q16 Update
20. 20
Discovery Deployment Example
Example of Customer Provisioning:
• Virtual host to load Software or Appliance
• User ID with “Read Only” Access
• Firewall Access
Discovery Appliance
Admin
22. 22
STEP 4: The scanning execution can be monitored by the Provider and the customer via a Job Scheduler interface.
Discovery Process (Step 4) – Scanning Job Lists
Discover all sensitive PII – Not just PCI data
29. 29
FS-ISAC Summit about “Know Your Data”
• Encryption at rest has become the new norm
• However, that’s not sufficient
• Visibility into how and where it flows during the course
of normal business is critical
Source: On May 18, 2016 Lawrence Chin reported from the FS-ISAC Summit
31. 31
Know Your Data – Identify High Risk Data
Begin by determining the risk profile of all relevant data collected and stored
• Data that is resalable for a profit
• Value of the information to your organization
• Anticipated cost of its exposure
Data Field Risk Level
Credit Card Number 25
Social Security Number 20
CVV 20
Customer Name 12
Secret Formula 10
Employee Name 9
Employee Health Record 6
Zip Code 3
32. 32
Match Data Protection Solutions with Risk Level

Risk Level -> Solution:
• Low Risk (1-5): Monitor
• At Risk (6-15): Monitor, mask, access control limits, format control encryption
• High Risk (16-25): Tokenization, strong encryption

Data Field / Risk Level:
Credit Card Number 25
Social Security Number 20
CVV 20
Customer Name 12
Secret Formula 10
Employee Name 9
Employee Health Record 6
Zip Code 3

Deploy Defenses
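The risk-to-solution mapping above can be sketched as a simple lookup. The field scores come from the slide's table; the function name and the exact return strings are illustrative:

```python
# Map a numeric risk level (1-25) to a protection tier, following the
# Low (1-5) / At Risk (6-15) / High (16-25) bands from the slide.
RISK_LEVELS = {
    "Credit Card Number": 25,
    "Social Security Number": 20,
    "CVV": 20,
    "Customer Name": 12,
    "Secret Formula": 10,
    "Employee Name": 9,
    "Employee Health Record": 6,
    "Zip Code": 3,
}

def protection_tier(risk_level: int) -> str:
    """Return the recommended class of controls for a risk score."""
    if risk_level >= 16:
        return "tokenization / strong encryption"
    if risk_level >= 6:
        return "monitor, mask, access control, format-controlling encryption"
    return "monitor"

for field, level in RISK_LEVELS.items():
    print(f"{field}: {protection_tier(level)}")
```

One size never fits all: the lookup makes it explicit that only the highest-risk fields pay the cost of strong encryption or tokenization, while low-risk fields are simply monitored.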
35. 35
How did Data Security Evolve 1970 - 2010?
(Chart: Total Cost of Ownership, high to low, plotted over time from 1970 to 2010)
• Strong Encryption: 3DES, AES …
• Type Preserving Encryption: FPE, DTP …
• Tokenization in Memory
38. 38
NIST - Increasing Relevance
• Crypto Modules (Hardware & Software Security Modules), used by PCI DSS (Payment Card Industry Data Security Standard): NIST Federal Information Processing Standard FIPS 140; NIST Special Publication 800-57
• AES (Advanced Encryption Standard): NIST U.S. FIPS PUB 197
• FPE (Format Preserving Encryption): NIST Special Publication 800-38G
• HIPAA (HIPAA/HITECH/Breach Notification): NIST SP 800-111
40. 40
Need for Masking Standards
Many of the current techniques
and procedures in use, such as
the HIPAA Privacy Rule’s Safe
Harbor de-identification standard,
are not firmly rooted in theory.
There are no widely accepted
standards for testing the
effectiveness of a de-
identification process or gauging
the utility lost as a result of
de-identification.
42. 42
How Should I Secure Different Data?
(Chart: Use Case, simple to complex, vs. Type of Data, structured to un-structured)
• Structured: Field Tokenization / Encryption for Card Holder Data (PCI), Protected Health Information (PHI), PII
• Un-structured: File Encryption
46. 46
Data Exposed in Cloud & Big Data
Do we know our sensitive data?
• Big Data
• Public Cloud
47. 47
Encryption Usage - Mature vs. Immature Companies
Source: Ponemon - Encryption Application Trends Study • June 2016
Less use of encryption: Public Cloud
48. 48
• Rather than making the protection platform based, the security
is applied directly to the data, protecting it wherever it goes,
in any environment
• Cloud environments by nature have more access points and
cannot be disconnected
• Data-centric protection reduces the reliance on controlling the
high number of access points
Data-Centric Protection Increases Security
49. 49
Protect Sensitive Cloud Data - Example
Internal Network
Administrator
Attacker
Remote
User
Internal
User
Cloud Gateway
Public Cloud
Each sensitive field is protected; each authorized field is in the clear.
Data encryption, tokenization or masking of fields or files (in transit and at rest)
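A minimal sketch of the gateway behavior shown on this slide: sensitive fields are replaced with tokens before a record leaves the internal network, and only an authorized lookup can recover the originals. The field names and the in-memory dict standing in for a token vault are illustrative; real gateways use hardened vaults:

```python
import secrets

SENSITIVE_FIELDS = {"card_number", "ssn"}  # illustrative field list
_vault = {}                                # token -> original value (toy vault)

def tokenize_record(record):
    """Replace each sensitive field with a random token before the record
    is sent to the public cloud; originals never leave the vault."""
    protected = dict(record)
    for field in SENSITIVE_FIELDS & record.keys():
        token = "tok_" + secrets.token_hex(8)
        _vault[token] = record[field]
        protected[field] = token
    return protected

def detokenize(token):
    """Authorized lookup: recover the original value for a token."""
    return _vault[token]
```

A record like `{"name": "Alice", "card_number": "4111..."}` leaves the gateway with the card number replaced, while authorized fields such as the name pass through in the clear.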
50. 50
Cloud Providers Not Becoming Security Vendors
• There is great demand for security providers that can offer
orchestration of security policy and controls that span not just
multicloud environments but also extend to on-premises
infrastructure
• Customers are starting to realize that the responsibility for mitigating
risks associated with user behavior lies with them and not the
CSP — driving them to evaluate a strategy that allows for incident
detection, response and remediation capabilities in cloud
environments
Source: Gartner: Market Trends: Are Cloud Providers Becoming Security Vendors? , May 2016
51. 51
Encryption Usage - Mature vs. Immature Companies
Source: Ponemon - Encryption Application Trends Study • June 2016
Less use of encryption: Big Data
52. 52
Attacking Big Data
HDFS (Hadoop Distributed File System)
Pig (Data Flow) Hive (SQL) Sqoop
ETL Tools BI Reporting RDBMS
MapReduce
(Job Scheduling/Execution System)
OS File System
Big Data
53. 53
Securing Big Data - Examples
• Volume encryption in Hadoop
• HBase, Pig, Hive, Flume and Sqoop using protection API
• MapReduce using protection API
• File and folder encryption in HDFS
• Export de-identified data
• Import de-identified data
• Export identifiable data
• Export audit for reporting
• Data protection at database, application, file, or in a staging area
HDFS (Hadoop Distributed File System)
Pig (Data Flow) Hive (SQL) Sqoop
ETL Tools BI Reporting RDBMS
MapReduce
(Job Scheduling/Execution System)
OS File System
Big Data
Data encryption, tokenization or masking of fields or files (in transit and at rest)
54. 54
Data Protection Implementation Layers

Topology (Local Service, Remote Service) rated on: Performance, Scalability, Security
System Layer (Application, Database, File System) rated on: Performance, Transparency, Security
Legend: Best to Worst (graphical ratings not reproduced)
57. 57
PCI DSS 3.2 – Security Control Failures
PCI DSS 3.2 includes requirements 10.8 and 10.8.1, which require service providers to detect and report on failures of critical security control systems.
PCI Security Standards Council CTO Troy Leach explained
• “without formal processes to detect and alert to critical security control
failures as soon as possible, the window of time grows that allows
attackers to identify a way to compromise the systems and steal
sensitive data from the cardholder data environment.”
• “While this is a new requirement only for service providers, we encourage
all organizations to evaluate the merit of this control for their unique
environment and adopt as good security hygiene.”
58. 58
Example - Report on Failures of Critical Security Controls
(Diagram: API, MTSS, Management Environment)
60. 60
MSSP - Managed Security
Service Provider
• SOC – Security Operations
Center
• Security monitoring
• Firewall integration /
management
• Vulnerability scanning
• SIEM - Security Incident &
Event Monitoring and
management
MTSS - Managed Tool Security
Service
• Professional Services that applies
best practices & expert analysis of
your security tools
• Customized alarms and reports
through SaaS
• Provides overall security tools
management and monitoring
• Ticketing, Resolution & Reporting
• Ensure availability of security
tools
• License analysis
Examples of Security Outsourcing Models
WHO IS MONITORING YOUR MSSP?
61. 61
Benefits of Managed Tool Security Service
Security controls in place and functioning.
Prepared to address information security when it
becomes a Boardroom Issue
Visibility to measure ROI
Confidence in reduced risk of data loss, damaged share
price, stolen IP, etc.
Ability to produce a positive return on capital
investments in tools.
Cost reduction in (people, licenses, maintenance, etc.)
Reduced risk of breach and associated costs (financial,
reputational, regulatory losses)
65. 65
Security Tools and Integrated Services
• SOC Tools: 24/7 Eyes on Glass (EoG) monitoring, Security Operations Center (SOC)
• Managed Tools Security Service
• Discovery: Software as a Service (SaaS) data discovery solution
66. 66
Compliance
Assessments
• PCI DSS & PA Gap
• HIPAA (2013
HITECH)
• SSAE 16-SOC 2&3*
• GLBA, SOX
• FCRA, FISMA
• SB 1385, ISO 27XXX
• Security Posture
Assessments
(based on industry
best practices)
• BCP & DRP (SMB
market)
Professional Security
Services
• Security Architecture
• Engineering/Operations
• Staff Augmentation
• Penetration Testing
• Platform Baseline
Hardening (M/F, Unix,
Teradata, i-Series,
BYOD, Windows)
• IDM/IAM/PAM
architecture
• SIEM design, operation
and implementation
• eGRC Readiness &
Deployment
E-Security & Vendor Products
• Data Discovery
• Managed Tools
Security Service
• Data Loss
Protection
• SIEM & Logging
• Identity and Access
Management
• EndPoint
Protection
• Network Security
Devices
• Encryption
• Unified Threat
• Multi-factor
Authentication
Managed
Security
Services
• MSSP/SOC
• SIEM 365
• Data Center
SOC
• IDM/IAM
Security
Administration
• Healthcare Infrastructure Solutions (2013 3rd Qtr.)
• Vulnerability
Scans
• Penetration
Testing
Samples of Our Services
Welcome to my session, and thank you for inviting me.
Myths & Realities of Data Security & Compliance: Risk-based Data Protection
Where we are now and where things are headed.
How the Latest Trends in Data Security Can Help Your Data Protection Strategy
I have worked mostly in research and software development.
I’ll discuss a variety of research reports, not media headlines about the latest breach.
In my opinion, PCI is leading with good security hygiene.
What I hear from my industry contacts and customers
I view this as a major issue now.
Most organizations do not have a common process for assessing the risks to sensitive or confidential data. Figure 4 reveals the common processes organizations have in place to safeguard sensitive or confidential information. Only one-third of respondents say their organization has a common process for assessing the risks to sensitive data in the cloud, and 43 percent of respondents say they have a process for assessing data on premise.

Organizations are tracking individuals who have access to sensitive or confidential data, but will it prevent unauthorized access to sensitive information? As shown in the figure below, 70 percent of respondents say their organizations have a common process for tracking the individuals who have access to sensitive information on premise, and 29 percent say they have a process for the cloud. Yet only 22 percent of respondents say there is little risk that employees, temporary employees or contractors would have too much access to data (not shown in the figure).

Most organizations are not taking steps to determine potential threats to sensitive information. Only 46 percent of respondents have a common process for discovering and classifying sensitive or confidential data on premise, and 30 percent say they have this process for data in the cloud. Forty-six percent of respondents say they track changes in access patterns to identify unusual activity that could indicate a potential threat at a granular level for data on premise. However, only 19 percent of respondents say they use this process for data in the cloud. Similarly, 45 percent have a process for implementing new controls and preventative measures in the presence of a new threat, and only 19 percent say they have this process for the cloud.
Where is data?
Understanding risk
Outsourcing
Mobile and Cloud
Good practice beyond PCI and Services Providers
PCI DSS 3.2 is out and new requirements include 10.8 and 10.8.1 that outline that service providers need to detect and report on failures of critical security control systems. PCI Security Standards Council CTO Troy Leach explained that “without formal processes to detect and alert to critical security control failures as soon as possible, the window of time grows that allows attackers to identify a way to compromise the systems and steal sensitive data from the cardholder data environment. While this is a new requirement only for service providers, we encourage all organizations to evaluate the merit of this control for their unique environment and adopt as good security hygiene.”
I see that companies use a variety of tools to manage and monitor the security of their network and application infrastructure, picked according to their needs and requirements. They are generally expensive, and it's imperative that the output be actionable and properly directed. In order to assure proper operation, the tools themselves must be kept healthy, current, and properly configured. This is time consuming and requires a broad skillset to perform effectively, a skillset not often present or affordable for many companies. Organizations may have 10 to 25 security products to combat the persistent threats from the hostile world they operate in. The constant threat combined with the high cost and a shortage of skilled security engineers has put many companies at risk. Simply put, companies are unable to maintain and utilize the strategic investment in core security technologies to maximize their potential use.
Compliance Engineering offers a Managed Tool Security Service (MTSS) from a Security Operations Center to address these needs in a secure and cost-effective fashion. This is a fully staffed 24/7/365 operations center that monitors and maintains tool availability and health, applies patches, and performs version upgrades to keep your security tool environment in optimal shape.
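The detect-and-alert process behind requirement 10.8 can be sketched as a periodic health check over critical controls. The control names and probe functions below are hypothetical placeholders, not any particular product's API:

```python
import datetime

def check_controls(controls):
    """Run each control's health probe; return (name, timestamp) for every
    failure so it can be alerted on and ticketed (cf. PCI DSS 10.8)."""
    failures = []
    for name, probe in controls.items():
        try:
            healthy = probe()
        except Exception:
            healthy = False  # a probe that crashes counts as a failure
        if not healthy:
            failures.append((name, datetime.datetime.utcnow().isoformat()))
    return failures

# Illustrative probes; real ones would query the firewall, IDS, SIEM, etc.
controls = {
    "firewall": lambda: True,
    "antivirus": lambda: True,
    "audit_logging": lambda: False,  # simulated failed control
}
for name, when in check_controls(controls):
    print(f"ALERT: critical security control '{name}' failed at {when}")
```

The point is the loop itself: failures are detected as soon as the check runs, shrinking the window of time an attacker has while a control is silently down.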
Is PCI DSS v3.2 changing our data security process?
PCI DSS v3.2 provides several technical, process, and documentation updates and new assessment guidance. One of the important and unique updates is specified data discovery (A3.2.5, A3.2.5.1, A3.2.6) for service providers. While these requirements will not be mandatory for some time, it’s important to know that you and your service providers now have an opportunity to leverage and adopt these controls. Implementing data discovery solutions can significantly reduce scope and cost, which will ultimately make it easier to validate PCI compliance.
Compliance Engineering is excited about being a part of the PCI QSA community and has many years of PCI experience. Compliance Engineering has also developed specialized tools to support the Payment Card Industry, and specializes in being a trusted advisor and solution provider for organizations with PCI environments ranging from simple to complex. It is becoming widely recognized that “unknown” leakage of PCI data, and more broadly other Personally Identifiable Information, within enterprises is the highest-value target for the “bad guys”. While current-market Data Loss Prevention tools are valuable, they do not provide expansive and prescriptive data discovery. Compliance Engineering has developed a next-generation data discovery tool called PII Finder. This agentless SaaS solution combines rigorously tested and client-proven scanning software with, or without, the analysis expertise of our security engineering professionals. PII Finder can execute remote or on-premise scheduled scans of your data stores for a nearly endless variety of Personally Identifiable Information. This process is an essential component of scoping the IT environment for Security & Privacy, PCI, HIPAA, and other industry and regulatory compliance; it is also simply a strong security best practice.
Many organizations may outsource for scalability and some do it for cost reasons
The results of the PII Finder scans stay within your data center.
My opinion: FS-ISAC is the most advanced ISAC compared to other industries.
On May 18, 2016 Lawrence Chin reported from the FS-ISAC Summit about “Know Your Data” that “At the end of the day, your business critical data is the asset that needs to be protected. Consequently, an awareness of where it resides, who has access to it, and how it travels through your network is necessary. To protect data, encryption at rest has become the new norm. However, that’s not sufficient. Visibility into how and where it flows during the course of normal business is critical. Armed with this knowledge, deviations from the baseline can be detected and even stopped.”
Historically, organizations have taken a reactive approach to data security in response to government regulations and industry standards. Recent breaches demonstrate the urgent need to be more proactive and flexible to the ever-changing nature of big data technology and threat landscape.
I think that the first step is to locate sensitive data in databases, file systems, and application environments and then identify the data’s specific retention requirements and apply automated processes for secure deletion of data when it’s no longer needed.
With cost-effective approaches possibly based on agentless technologies and cloud based solutions, these goals are attainable.
21 What’s the first step in developing a risk-based data security plan?
Let's review data risks.
You begin by determining the risk profile of sensitive data collected and stored by an enterprise, and then classify that data according to its designated risk level.

It’s really just a matter of using common sense. Data that is resalable for a profit, typically financial, personally identifiable and confidential information, is high-risk data and requires the most rigorous protection; other data protection levels should be determined according to the data's value to your organization and the anticipated cost of its exposure. Would business processes be impacted? Would it be difficult to manage media coverage and public response to the breach?

One simple way to determine a risk profile is to assign a numeric value for each class of data: high risk = 5, low risk = 1. Use the same values to grade the odds of exposure. Then multiply the data value by the risk of exposure to determine the risk levels in your enterprise.
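That multiplication can be sketched in a couple of lines; the example grades below are made up for illustration:

```python
def risk_level(data_value: int, exposure_odds: int) -> int:
    """Risk level = data value (1-5) x odds of exposure (1-5), giving 1-25."""
    assert 1 <= data_value <= 5 and 1 <= exposure_odds <= 5
    return data_value * exposure_odds

# Illustrative grades:
print(risk_level(5, 5))  # highest value, highest exposure -> 25
print(risk_level(3, 1))  # moderate value, low exposure -> 3
```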
That can sound rather overwhelming…
It doesn’t have to be. Organizations with robust data classification plans typically use an automated tool to assist in the discovery of the subject data. Available tools will examine file metadata and content, index the selected files, and re-examine them on a periodic basis for changes. The indexing process provides a complete listing of, and rapid access to, data that meets the defined criteria used in the scanning and classification process. Most often, the indices created for files or data reflect the classification schema of data sensitivity, data type, and geographic region.
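Such a discovery scan can be sketched as a directory walk with pattern matching, plus a Luhn checksum to cut false positives. The regex and file handling here are illustrative only; commercial discovery tools cover far more PII types and add indexing and scheduling:

```python
import os
import re

# Candidate card numbers: 13-16 digits, optional space/dash separators.
CARD_RE = re.compile(r"\b\d(?:[ -]?\d){12,15}\b")

def luhn_ok(number: str) -> bool:
    """Luhn checksum: filters most random digit runs from real PANs."""
    digits = [int(d) for d in number if d.isdigit()]
    digits.reverse()
    total = 0
    for i, d in enumerate(digits):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def scan_file(path):
    """Yield (path, match) for every Luhn-valid card-like number found."""
    with open(path, errors="ignore") as fh:
        for match in CARD_RE.findall(fh.read()):
            if luhn_ok(match):
                yield path, match

def scan_tree(root):
    """Walk a directory tree, scanning every file."""
    for dirpath, _, files in os.walk(root):
        for name in files:
            yield from scan_file(os.path.join(dirpath, name))
```

A periodic job running `scan_tree` over file shares gives exactly the kind of re-examination for changes described above.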
You also need to consider data volumes, server, connectivity, physical security, HR aspects, geography, compensating controls -- and more.
53 Let's go back to our example of data with different risk levels.
We can now pick a risk value and map it to the most cost-effective solution from a risk-management perspective.
The key thing to remember here is that one-size-fits-all security solutions are never the best fit.
The strongest protection for high-risk data will be strong encryption (or tokenization) of individual data fields. The risk levels here will depend on the value of the data, data volumes, the servers, connectivity, physical security, HR aspects, geography, compensating controls, and other issues.
51 Let's summarize and position the different approaches to protect data:
- 6 approaches
- Position their impact on performance, storage size, security, and transparency
- 3 approaches can be used to protect cardholder data
Look for multi-tasking solutions that provide a complete set of protection technologies that can be deployed when and as needed, in combinations that suit the individual business's needs, in order to protect data now and quickly address changes in data risk levels and new threat vectors.
Format Controlling Encryption and tokenization can also protect production data in a test environment, enabling high-quality test data in a secure way.
High-risk data is best secured using encryption or tokenization of individual data fields.
For example, Data Format Controlling Encryption retains the original format, on a character-by-character basis, of encrypted data, putting an end to the data re-formatting and database schema changes required by other encryption techniques. It's especially well suited to protecting data that's being used for testing or development in a less-controlled environment.
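For illustration only, here is a toy character-by-character transform that shows what "format preserving" means: the output has the same length and stays all-digits, so schemas and field formats are unchanged. This keyed digit shift is NOT secure encryption; real deployments use NIST SP 800-38G format-preserving encryption (FF1/FF3):

```python
# Toy illustration of format preservation ONLY -- a keyed digit shift,
# not cryptographically secure.
def fpe_demo_encrypt(digits: str, key: int) -> str:
    """Shift each digit by a key- and position-derived amount; output keeps
    the same length and character class as the input."""
    return "".join(str((int(d) + key + i) % 10) for i, d in enumerate(digits))

def fpe_demo_decrypt(cipher: str, key: int) -> str:
    """Invert the shift to recover the original digits."""
    return "".join(str((int(d) - key - i) % 10) for i, d in enumerate(cipher))
```

Encrypting a 16-digit card number yields another 16-digit string, which is precisely why no re-formatting or schema change is needed downstream.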
Policy-Based Masking provides the ability to mask selected parts of a sensitive asset. Implemented at the database level rather than the application level, policy-based Data Masking provides a consistent level of security across the enterprise without interfering with business operations, and greatly simplifies data security management.
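A sketch of how such a policy might be applied at the data layer; the roles, field names, and policy shape are illustrative, not any specific product's configuration:

```python
MASK_POLICY = {
    # field -> number of trailing characters left in the clear (illustrative)
    "card_number": 4,
    "ssn": 4,
}

def mask_value(value: str, keep_last: int, pad: str = "*") -> str:
    """Replace all but the last `keep_last` characters with a pad char."""
    if len(value) <= keep_last:
        return value
    return pad * (len(value) - keep_last) + value[-keep_last:]

def apply_masking(record: dict, role: str) -> dict:
    """Return a masked copy for non-privileged roles; a privileged role
    (here 'auditor') sees the record in the clear. The policy decision
    lives at the data layer, not in each application."""
    if role == "auditor":
        return dict(record)
    masked = dict(record)
    for field, keep in MASK_POLICY.items():
        if field in masked:
            masked[field] = mask_value(masked[field], keep)
    return masked
```

Because every application reads through the same policy, masking stays consistent across the enterprise, which is the management simplification described above.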
Health Insurance Portability and Accountability Act.
Format Preserving Encryption Gets NIST Stamp of Approval
Yet after more than a decade of research, there is comparatively little known about the underlying science of de-identification. Many of the current techniques and procedures in use, such as the HIPAA Privacy Rule’s Safe Harbor de-identification standard, are not firmly rooted in theory. There are no widely accepted standards for testing the effectiveness of a de-identification process or gauging the utility lost as a result of de-identification. Given the growing interest in de-identification, there is a clear need for standards and assessment techniques that can measurably address the breadth of data and risks described in this paper.
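One measurable property from the de-identification literature is k-anonymity: every record should be indistinguishable from at least k-1 others on its quasi-identifiers. A minimal sketch of computing k, with illustrative data and column names:

```python
from collections import Counter

def k_anonymity(rows, quasi_identifiers):
    """Smallest equivalence-class size over the quasi-identifier columns."""
    groups = Counter(tuple(row[q] for q in quasi_identifiers) for row in rows)
    return min(groups.values())

# Illustrative records; zip and age band are the quasi-identifiers.
rows = [
    {"zip": "30301", "age_band": "30-40", "diagnosis": "A"},
    {"zip": "30301", "age_band": "30-40", "diagnosis": "B"},
    {"zip": "30302", "age_band": "20-30", "diagnosis": "C"},
]
print(k_anonymity(rows, ["zip", "age_band"]))  # -> 1: the last row is unique
```

Metrics like this are a step toward the testable standards the quote calls for, though k-anonymity alone does not capture every re-identification risk.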
Figure 2. Extensive usage of 14 encryption applications for mature vs. immature companies (consolidated view); average deployment rate for all encryption technologies = 41%.
The high level of interest is driven by the benefits a cloud gateway provides, for example:
• Eliminates the threat of third parties exposing your sensitive information
• Delivers a secure and uncompromised SaaS user experience
• Identifies malicious activity and proves compliance to third parties with detailed audit trails
• Eases the cloud adoption process and acceptance
• Is transparent, with close to 0% overhead impact
• Simplifies compliance requirements
• Allows you to outsource a portion of your IT security requirements
• Eliminates data residency concerns and requirements
• Greatly reduces cloud application security risk
• Enables partner access to your sensitive data
• Controls cloud security from the enterprise
• Protects your business from third-party access
Source: Gartner: Market Trends: Are Cloud Providers Becoming Security Vendors? Published: 31 May 2016
Analyst(s): Sid Deshpande, Jay Heiser, Craig Lawson
Customer demand for richer security capabilities for cloud environments has
driven leading CSPs to offer better security features and enablers for their
platforms. This research explores the changing dynamics between CSPs
and their ecosystem of external security providers.
Key Findings
■ Leading cloud service providers are offering more security features natively on their platforms as
well as providing technical and business enablers to external security providers, leading to both
competition and synergy between CSPs and their security partner ecosystem.
■ Security providers are heavily dependent on CSPs for enabling features, leading to
inconsistency in the depth of features they can offer on each cloud platform.
■ There still exists a "long tail" of CSPs that isn't as security conscious as the leaders, leading to
different types of opportunities for external security providers.
■ There is great demand for security providers that can offer orchestration of security policy and
controls that span not just multicloud environments but also extend to on-premises
infrastructure.
■ Customers are starting to realize that the responsibility for mitigating risks associated with user
behavior lies with them and not the CSP — driving them to evaluate a strategy that allows for
incident detection, response and remediation capabilities in cloud environments.
1. Data protection at database, application or file level
2. Data protection in a staging area
3. Volume encryption in Hadoop
4. HBase, Pig, Hive, Flume and Sqoop using a protection API
5. MapReduce using a protection API
6. File and folder encryption in HDFS
7. Import de-identified data
8. Export de-identified data
9. Export identifiable data
10. Export audit logs for reporting
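Steps 4 and 5 above have Hadoop components call a protection API on sensitive fields. As a minimal sketch (the `protect()` function and its HMAC-based tokenization are assumptions for illustration, not the actual product API), a mapper might de-identify a card-number field before the record lands in HDFS:

```python
import hashlib
import hmac

SECRET = b"demo-key"  # in practice, retrieved from a central key/policy server

def protect(value: str) -> str:
    """Hypothetical protection-API call, sketched here as keyed HMAC tokenization."""
    return hmac.new(SECRET, value.encode(), hashlib.sha256).hexdigest()[:16]

def map_record(line: str) -> str:
    """Mapper: tokenize the card-number field (column 2) before it is written out."""
    fields = line.rstrip("\n").split(",")
    fields[1] = protect(fields[1])
    return ",".join(fields)
```

Because the token is deterministic for a given key, joins and group-bys on the protected field still work downstream, while the raw value never enters the cluster.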
PCI DSS 3.2 is out, and new requirements 10.8 and 10.8.1 require service providers to detect and report on failures of critical security control systems. PCI Security Standards Council CTO Troy Leach explained that “without formal processes to detect and alert to critical security control failures as soon as possible, the window of time grows that allows attackers to identify a way to compromise the systems and steal sensitive data from the cardholder data environment. While this is a new requirement only for service providers, we encourage all organizations to evaluate the merit of this control for their unique environment and adopt as good security hygiene.”
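One possible shape for such detection, sketched under the assumption that each critical control exposes some health check (the control names and check functions are hypothetical), is a loop that records an alert for every failed or crashing check:

```python
import datetime

def check_controls(controls: dict) -> list:
    """Return an alert record for every control whose health check fails.

    `controls` maps a control name to a zero-argument health-check callable
    that returns True when the control is operating normally.
    """
    alerts = []
    now = datetime.datetime.utcnow().isoformat()
    for name, is_healthy in controls.items():
        try:
            ok = is_healthy()
        except Exception:
            ok = False  # a crashing check is itself a control failure
        if not ok:
            alerts.append({"control": name, "status": "FAILED", "detected_at": now})
    return alerts
```

Feeding these alert records into ticketing or SIEM tooling would give the formal detect-and-respond process that requirements 10.8 and 10.8.1 call for.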
I see that companies use a variety of tools to manage and monitor the security of their network and application infrastructure, chosen according to their needs and requirements. These tools are generally expensive, and it is imperative that their output be actionable and properly directed. To assure proper operation, the tools themselves must be kept healthy, current and properly configured. This is time-consuming and requires a broad skillset to perform effectively, a skillset many companies cannot staff or afford. Organizations may run 10-25 security products to combat the persistent threats of the hostile world they operate in. The constant threat, combined with the high cost and a shortage of skilled security engineers, has put many companies at risk. Simply put, companies are unable to maintain and utilize their strategic investment in core security technologies to its full potential.
Compliance Engineering offers a Managed Tool Security Service (MTSS) from a Security Operations Center to address these needs in a secure and cost-effective fashion. This is a fully staffed 24x7x365 operations center that monitors and maintains tool availability and health, applies patches, and performs version upgrades to keep your security tool environment in optimal shape.
Unlike an MSSP, which provides a suite of information security services (virus scanning, spam blocking, hardware/software firewall integration and management, and overall security monitoring and management), the Managed Tool Security Service is a new and unique offering that provides professional services and consulting, security tools management, and expert analysis of your security tools' behavior, delivering customized monitoring, alarms and reports through a SaaS (Software as a Service) application.
MTSS addresses the problem of security tools becoming underutilized within enterprises, for a number of reasons we will cover.
IT security is a complex technical discipline, but with visibility showing that information security systems are available and performing effectively, MTSS provides the ability to state with confidence, and to prove, that security and service delivery controls are in place and functioning.
Other benefits include:
• Visibility to measure ROI
• Cost reduction in administrative oversight, license management and individual tool maintenance
• Reduced risk of breach and associated costs
How the Latest Trends in Data Security Can Help Your Data Protection Strategy
Data breaches are on the rise. The constant threat of cyber attacks, combined with the high cost and a shortage of skilled security engineers, has put many companies at risk. There is a shift in cybersecurity investment: IT risk and security leaders must move away from trying to prevent every threat and acknowledge that perfect protection is not achievable. PCI DSS 3.2 is out with an important update on data discovery and requirements to detect security control failures.
In this webinar, cybersecurity expert Ulf Mattsson will highlight current trends in the security landscape based on major industry report findings, and discuss how we should re-think our security approach.
Hawkeye SCS consists of three integrated tool products: PII Finder, Vision and MTSS