In this presentation, I discuss information security's evolution into the analysis era and the challenges that come with it. This includes several examples of cognitive biases and the negative effects they can have on the analysis process. I also discuss analytic techniques that can enhance analysis, such as differential diagnosis and relational investigation.
4. Chris Sanders
“[Practical Packet Analysis] gives you everything you need, step by step, to become
proficient in packet analysis. I could not find a better book.”
– Amazon Reviewer
5. Outline
Objectives:
What is Analysis?
What is Bias?
Recognizing Bias
Defeating Bias
Analysis Methods
“How to make better technical decisions in any kind
of security analysis.”
6. **Disclaimer**
I’m going to talk about matters of the brain, not
the usual tech stuff.
My research for this presentation involved
consultation with psychologists.
I, however, am not one.
18. Analysis is Everywhere
• Making judgments based upon data
• Security Analysis Happens for:
– Malware Analysts
– Intelligence Analysts
– Incident Response Analysts
– Forensic Analysts
– Programming Logic Analysts
• My main focus is network intrusion analysis,
so this talk is framed through that lens.
19. Network Security Monitoring
• The collection, detection, and analysis of
network security data.
• The goal of NSM is escalation: declaring that
an incident has occurred so that incident
response can begin.
21. The Need for Analytic Technique
• Kansas State University Anthropological Study
on SOCs - Key Finding:
– “SOC analysts often perform sophisticated
investigations where the process required to
connect the dots is unclear even to analysts.”
• Analysis == “Tacit Knowledge”
22. Analysis: Thinking About Thinking
• We need to critically examine how we think
about information security analysis.
• We aren’t alone!
– Scientific
– Medical
– Legal
23. Perception vs. Reality
• Perception:
– “A way of regarding, understanding, or
interpreting something.”
• Reality:
– “The state of things as they actually exist.”
Let’s take a test…
29. Test Results
• Variation of the Stroop Test (John Ridley Stroop, 1935)
• Measures Cognition
– The Process of Perception
• Identifies Gap Between Perception & Reality
• Used to Measure
– Selective Attention
– Cognitive Flexibility
– Processing Speed
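For readers who want to feel the perception–reality gap firsthand, here is a minimal Python sketch of a Stroop-style trial. It is my own illustration rather than the instrument behind these results, and it assumes an ANSI-capable terminal; the color set and trial count are arbitrary.

```python
# Minimal Stroop-style trial: name the INK color, not the word.
# Illustrative sketch only; assumes an ANSI-capable terminal.
import random
import time

COLORS = {  # ANSI escape codes for a few ink colors
    "red": "\033[31m",
    "green": "\033[32m",
    "blue": "\033[34m",
    "yellow": "\033[33m",
}
RESET = "\033[0m"

def run_trials(n=10):
    """Show color words printed in mismatched ink and time how long
    the subject takes to type the ink color (not the word)."""
    names = list(COLORS)
    times, correct = [], 0
    for _ in range(n):
        word = random.choice(names)
        # Incongruent trial: the ink never matches the word itself.
        ink = random.choice([c for c in names if c != word])
        start = time.monotonic()
        answer = input(f"Ink color of {COLORS[ink]}{word.upper()}{RESET}? ")
        times.append(time.monotonic() - start)
        correct += (answer.strip().lower() == ink)
    print(f"accuracy: {correct}/{n}, "
          f"mean response: {sum(times)/len(times):.2f}s")

if __name__ == "__main__":
    run_trials()
```

The measurable lag on incongruent trials is exactly the gap between perception and reality that the next slide names.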
30. What is Bias?
“Prejudice in favor of or against one thing,
person, or group compared with another,
usually in a way considered to be unfair.”
• Perception != Reality
• Perception is Everything, but Fallible
• We tend to perceive what we expect/are
conditioned to perceive
40. Anchoring
• Defined: Relying too heavily on a single piece
of information when making a decision.
• Examples:
– Src/Dst Country -> OMG China!
– IDS Alert Name -> It says this is X, so it must be X.
– Timing -> It’s every 5 minutes!
41. Clustering Illusion
• Defined: Overestimating the value of perceived
patterns in random data.
• Examples:
– The great “beaconing” fallacy
– Unguided visualizations
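The “beaconing” fallacy (and the “it’s every 5 minutes!” anchor above) is easy to demonstrate. The Python sketch below is my own illustration with invented parameters: a naive fixed-interval check flags a meaningful share of hosts whose traffic is purely random, because small samples make “regular” spacing cheap.

```python
# Sketch: apparent "beaconing" emerging from purely random traffic.
import random
import statistics

def looks_periodic(timestamps, cv_threshold=0.15):
    """Naive beacon check: a low coefficient of variation of the
    inter-arrival times suggests a fixed-interval callback."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    return statistics.pstdev(gaps) / statistics.mean(gaps) < cv_threshold

random.seed(1)
hosts, flagged = 1000, 0
for _ in range(hosts):
    # Each host emits 3 events at uniformly random moments in an hour;
    # with only two gaps to compare, chance regularity is common.
    ts = sorted(random.uniform(0, 3600) for _ in range(3))
    flagged += looks_periodic(ts)

print(f"{flagged}/{hosts} purely random hosts pass the beacon check")
```

With only two inter-arrival gaps per host, roughly 15% of random hosts look “regular”; the pattern is a property of the small sample, not of the traffic.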
42. Availability Cascade
• Defined: Strong belief in something due to its
repetition in public discourse.
• Example:
– “Chinese Traffic is Bad.”
– “That rule generates a lot of false positives.”
43. Belief Bias
• Defined: Occurs when a decision is based on
the believability of the conclusion.
• Examples:
– “We wouldn’t be a target for a nation-state
actor.”
– “This is probably a false positive because it’s
unlikely someone would attack our VoIP system.”
44. Confirmation Bias
• Defined: Interpreting data during analysis with
a focus on confirming one’s preconception.
• Ego is a big factor here
• Examples:
– “I think this is nothing.”
– “I think there is something going on here.”
45. Impact Bias
• Defined: Tendency to overestimate the
significance of something based on the
potential impact.
• Signature/Alert Naming + Lack of Experience
Contribute to this.
• Example:
– “The alert says this is a known APT1 back door, so
I need to spend all day looking at this.”
46. Irrational Escalation
• Defined: Justifying increased time investment
based on existing time investment when it
may not make sense.
• Sunk Cost Fallacy
• Example:
– “What do you mean this is nothing? I’ve spent all
day looking at this. I’ll spend all day tomorrow
digging into it; I’m sure I’ll find something else
there.”
47. Framing Effect
• Defined: Interpreting information differently
based on how or from whom it was
presented.
• Important when interacting with other analysts
• Example:
– Old Vet: “Steve doesn’t know what he is doing, so
if he is telling me this it probably doesn’t mean
much.”
– New Guy: “None of the more experienced guys
said anything about this, so it must not matter.”
48. Overconfidence Effect
• Defined: Excessive confidence in one’s own
decisions, especially in light of contrasting
data.
• Example:
– 99% Paradox – “I’m 99% sure this is right.”
– One psych study suggests this statement is
wrong ~40% of the time.
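One way to expose the 99% paradox on a team is to track calibration: log each analyst’s stated confidence next to the eventual verdict, then compare the claimed probability against the observed hit rate. A minimal Python sketch follows; the calls data is invented purely for illustration.

```python
# Sketch: analyst calibration check. Each entry pairs a stated
# confidence with whether the verdict held up (invented example data).
calls = [
    (0.99, True), (0.99, False), (0.99, True), (0.99, True),
    (0.90, True), (0.90, False), (0.70, True), (0.70, False),
]

def calibration(calls):
    """Group calls by stated confidence and compare the claimed
    probability with the observed rate of being right."""
    buckets = {}
    for conf, held in calls:
        buckets.setdefault(conf, []).append(held)
    for conf in sorted(buckets, reverse=True):
        outcomes = buckets[conf]
        rate = sum(outcomes) / len(outcomes)
        print(f"claimed {conf:.0%} -> observed {rate:.0%} "
              f"over {len(outcomes)} calls")

calibration(calls)
```

If “99% sure” calls hold up far less than 99% of the time, the gap itself is the lesson.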
49. Pro-Innovation Bias
• Defined: Excessive optimism and biased
decisions when an invention of one’s own
making is involved in the analysis.
• Invention == System / Code / Concept
• Example:
– “My tool can do that.”
– “I wrote that signature so I know it’s accurate.”
– “This fits perfectly in my model!”
50. There are over 100 types of bias.
How can we overcome them?
52. What Can We Do?
• Preconception and Bias Cannot Be Fully
Avoided
• Therefore:
– Develop Repeatable Analytic Technique
– Recognize Key Assumptions
– Allow them to be Challenged
54. Relational Investigation
• “Link Analysis”
• Commonly Used in Criminal Investigations
• Focuses on Entities, Relationships,
Interactions, and Degrees of Separation
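As a concrete illustration, the Python sketch below builds an entity graph from observed interactions and derives degrees of separation with a breadth-first search. The entities and edges are hypothetical, invented for this example; real link analysis would draw them from logs and case data.

```python
# Link-analysis sketch: entities are nodes, observed interactions are
# edges, and degrees of separation fall out of a breadth-first search.
from collections import deque

edges = [  # hypothetical observations
    ("alice@corp", "10.0.0.5"),       # user logged into host
    ("10.0.0.5", "evil.example"),     # host resolved suspicious domain
    ("evil.example", "203.0.113.9"),  # domain resolved to external IP
    ("bob@corp", "10.0.0.7"),
]

graph = {}
for a, b in edges:
    graph.setdefault(a, set()).add(b)
    graph.setdefault(b, set()).add(a)

def degrees_of_separation(graph, src, dst):
    """Return the hop count between two entities, or None if no
    relationship path connects them."""
    seen, queue = {src}, deque([(src, 0)])
    while queue:
        node, depth = queue.popleft()
        if node == dst:
            return depth
        for nxt in graph.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, depth + 1))
    return None

print(degrees_of_separation(graph, "alice@corp", "203.0.113.9"))  # 3
print(degrees_of_separation(graph, "bob@corp", "evil.example"))   # None
```

Charting these paths makes it obvious which entities deserve a closer look and which conclusions rest on a single weak link.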
61. Incident M&M
• Dr. Ernest Codman at Mass. General Hospital
• Post-Case Meetings to Discuss What
Occurred and How to Improve
• Incident M&M
1. Handler/Analyst Presents Case
2. Followed by Alternative Analysis
62. Alternative Analysis
• Developed by Richards Heuer Jr. (CIA)
• Series of Peer Analysis Methods
• Designed to Help Overcome Bias and Improve
Quality of Analysis
63. Group A / Group B
• Group A – Presenting Analyst/Team
• Group B – Secondary Analyst/Team
• Two Independent Analysis Efforts
• Notes are Compared During the Presentation
• Identify Differing Conclusions from Same Data
64. Red Cell Analysis
• Peer Focus on Attacker’s Viewpoint
• Questioning in Relation to the Attacker’s
Perceived Goals
• Requires Some Offensive Experience
• Best Executed by Red Team if Available
65. What If Analysis
• Focus on Cause/Effect of Actions That May
Not Have Actually Occurred
– What if the attacker had done X? How would you
have changed your approach?
– What if you didn’t stumble across X in Y data?
• Enhances Later Investigations
66. Key Assumptions Check
• Presenter Identifies Assumptions During
Analysis
• Peers Challenge Assumptions
• Pairs Well with “What If” Analysis
– “What if it were possible for that malware to
escape that virtual machine?”
– “Would you come to the same conclusion if you
knew this was APT3 instead of APT1?”
67. Incident M&M Best Practices
• Limit Frequency
• Set Expectations
• Require a Strong Mediator
• Keep it at the Team Level – No Sr. Managers
• Encourage Servant Leadership
• Discourage Personal Attacks
• Write it Down!
68. Conclusion
• The Era of Analysis is Upon Us
• Bias is Inevitable – Learn to Recognize It
• Overcome Analysis Hurdles With:
– Analytic Technique
– Alternative Analysis