Embedded Systems, Asset or Security Threat? (6 May 2014, (ISC)2 Secure Rotterdam)
© Copyright 1989 – 2014, (ISC)2 All Rights Reserved
Embedded systems, a hidden security threat?
Jaap van Ekris, Delta Pi
J.vanEkris@Delta-Pi.nl
Agenda
• What are embedded systems?
• What makes them different?
• How can they disturb my business?
• What to do about it?
In the beginning…
• Mechanical control
• Electromechanical Relays
• PLC Controllers
Drivers for introduction
• More flexible technology
• More complex functionality
• Remote management reduces labour cost
Embedded systems…
• “Traditional” industrial automation
• Deep integration with controlled hardware
  – Production lines
  – Robots
Moving into the cockpit…
• Control by operator moves to “fly by wire”
• Strong move to virtualization of all controls:
  – Control rooms
  – Airplanes
  – Ships
Into the consumer space…
• Point of Sale and checkout systems:
  – Cash registers
  – ATMs
  – OV Chipkaart
• Connected for additional services
Deep into our lives
• Management systems to improve safety and security:
  – Offices
  – Hospitals
  – Tunnels
  – Public spaces
• Tightly connected with energy management and HR
Definition (sort of)
• Controls a physical object
• Used to control equipment in a process
• Usually a PLC or a small bare-bones computer
Distinguishing properties
• Part of a 24x7 solution
• Controlling long-lived, expensive equipment
• Difficult to update or replace
• Large number of sensors and actuators
Distinguishing properties
• Geographic distribution
• Easily accessible to hackers
• Proprietary protocols
• Usually not designed for defense-in-depth
A frequent target
• Protection is inadequate
• Security is not on the management agenda
• Hackers do know their way around
Deeper impact
• Physical damage is possible
• The physical process is often fragile
• Locally updating hardware takes ages
A recent example
• Widely used in the US, UK, France, China and Canada
• Typical replacement technology, retrofitted into existing roads
• Encryption and authentication removed upon customer request
Can we fix this?
• Huge number of traffic lights
• Replacement takes days per crossing, with traffic interruptions
• When does the cure become worse than the disease?
The mindset hasn’t kept up…
• IT is introduced as a technical replacement, a silent killer
• Designed with a 1960s mechanical mindset, not a 2010s security mindset
Small errors, large consequences…
• Petrobras 36
• A software omission missed an overpressure event
• Losses:
  – 11 people died
  – Spillage: 1,500 tons of crude
  – Oil rig: $350 million
  – Production loss: 84,000 barrels of oil a day
Safety systems…
• Most embedded systems have safety consequences
• Are required to check their integrity frequently
• Are not allowed to have configuration changes
• Exhibit fail-to-safe behaviour
Politicians have become aware…
• Smart metering rollout starts in 2015
• There are serious attack scenarios
• The “kill switch” has to be removed
Industry should be aware
IEC 61508-1:2010
Traditional measures
• Build a big firewall
• Disregard the human element:
  – A technician brings along infected equipment
  – Operators use personal USB sticks or laptops
What about the owner?
• Delivers maximum performance on a shoestring budget
• Extremely aware of operational risks threatening services
• Risk management is often a core competence
Balancing risk as a way of life
Availability of the service vs. safety of the installation
Quantitative risk analysis
• “Unplanned unavailability” is the term
• For every intrinsic failure, a chance and an impact are determined
• The biggest availability killers are dealt with
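The per-failure approach above can be sketched as a small expected-loss ranking. The failure modes, chances and impacts below are purely hypothetical illustrations, not figures from this presentation:

```python
# Rank intrinsic failure modes by expected annual loss (chance x impact).
# All failure modes and numbers here are hypothetical, for illustration only.
failures = [
    # (name, chance per year, impact in euros)
    ("PLC power supply failure", 0.5, 20_000),
    ("Sensor drift forcing recalibration", 2.0, 5_000),
    ("Core network switch failure", 0.1, 200_000),
]

def expected_loss(chance_per_year, impact_eur):
    """Expected annual loss of one failure mode: frequency times impact."""
    return chance_per_year * impact_eur

# Deal with the biggest availability killers first.
ranked = sorted(failures, key=lambda f: expected_loss(f[1], f[2]), reverse=True)
for name, chance, impact in ranked:
    print(f"{name}: EUR {expected_loss(chance, impact):,.0f}/year")
```

In practice the chance and impact values come from failure statistics and scenario analysis rather than guesses.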
RAMS and CIA
RAMS:
• Reliability
• Availability
• Maintainability
• Safety
→ Deaths and Dollars

CIA:
• Confidentiality
• Availability
• Integrity
→ Make you look bad?
RAMS versus Security
RAMS:
• Intrinsic failure of a system
• Calculates missed business revenue
• Has an SLA with penalties/bonuses
• Is the responsibility of a business manager

Security:
• Extrinsic attack on a system
• Talks about threats
• Has a best-effort SLA
• Is a problem of the IT department
Is a security risk a safety risk?
• Security does affect “Deaths and Dollars”
• Can we express security in a quantitative way?
An example
• National infrastructure
• The effect of a long failure is devastating for the national economy
• Five control rooms, operated 24x7
• One unsegmented network, which allows for redundancy
• Filled with fail-to-safe components based on Windows® controllers
• Repairmen are very frequent visitors
A scenario
• A repairman or operator introduces a virus or worm by day
• The virus spreads easily to the entire network within hours
• Overnight, 70% of the infrastructure performs an emergency shutdown due to a fail-to-safe reaction
• Unaffected (Unix) stations have to follow due to physical interactions of the emergency shutdowns
A quantitative view
• Chance of occurrence (guesstimate):
  – Once every 100 years
• Impact (scenario analysis):
  – Safe but unavailable
  – Life expectancy of all equipment shortened by a year
  – Recovery:
    • Recovery of essential backbone: week 6
    • Recovery of secondary lines: week 14
    • Complete recovery: week 26
    • Chance of regression: high
• Estimated loss: €10^10 (catastrophic)
Sensitivity analysis
• What are the value ranges and their effects?
• Example:
  – Chance is quite dominant
  – Recovery time is driven by the geographical spread of repairmen
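Since expected loss is simply chance times impact, the dominance of the chance estimate can be shown with a minimal sweep. The once-per-100-years frequency and €10^10 impact come from the scenario above; the alternative frequencies are assumptions made for the sweep:

```python
# Sensitivity sweep over the guessed incident frequency: expected annual
# loss scales linearly with it, so the chance estimate dominates the result.
impact_eur = 1e10  # catastrophic impact from the scenario analysis

losses = {}
for years_between_incidents in (50, 100, 200):  # vary the guesstimate
    chance_per_year = 1 / years_between_incidents
    losses[years_between_incidents] = chance_per_year * impact_eur
    print(f"once per {years_between_incidents} years -> "
          f"EUR {losses[years_between_incidents]:,.0f}/year expected")
```

Halving or doubling the guessed frequency halves or doubles the expected annual loss, which is why this parameter deserves the most scrutiny.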
Segmenting a network
• Prevents or limits the spread of a worm/virus
• Concentrates the recovery effort
• Reduces recovery time significantly
A reference Architecture (CIM)
• Levels 0 and 1 are challenging
• Level 2 is achievable and wise
• Level 3 would be foolish not to do

CIM levels:
– Business planning & logistics (level 4)
– Operations & planning (level 3)
– Process supervisory control (level 2)
– Process control (level 1)
– Field (level 0)
The quantitative impact
• Chance of occurrence (guesstimate):
  – Once every 1000 years
• Impact (scenario analysis):
  – Safe but unavailable
  – Life expectancy of some equipment shortened by a year
  – Recovery:
    • Recovery of essential backbone: day 3
    • Recovery of secondary lines: day 7
    • Complete recovery: day 15
    • Chance of regression: medium/low
• Estimated loss: €10^7 (survivable)
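Putting the unsegmented and segmented scenarios side by side, the effect of segmentation can be sketched as follows (chance and impact figures taken from the two scenario analyses; the comparison itself is an illustration):

```python
# Expected annual loss before and after network segmentation, using the
# chance/impact figures from the two scenario analyses in this deck.
scenarios = {
    "unsegmented": {"chance_per_year": 1 / 100, "impact_eur": 1e10},
    "segmented": {"chance_per_year": 1 / 1000, "impact_eur": 1e7},
}

def annual_loss(scenario):
    """Expected annual loss: chance of occurrence times impact."""
    return scenario["chance_per_year"] * scenario["impact_eur"]

before = annual_loss(scenarios["unsegmented"])
after = annual_loss(scenarios["segmented"])
print(f"unsegmented: EUR {before:,.0f}/year expected")
print(f"segmented:   EUR {after:,.0f}/year expected")
print(f"risk reduction factor: {before / after:,.0f}x")
```

Expressing both scenarios in the same euros-per-year unit is what makes a security measure comparable against operational risks.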
Some RAMS-questions…
• Increase of “unplanned unavailability” due to:
  – False positives on the firewall disrupting the process
  – Failure of the hardware
• Increase of maintenance effort:
  – More updates needed
  – Some might even lead to planned maintenance
• Limitation of flexibility in crisis situations
Can we answer quantitative questions?
• IT hardware reliability is a “soft number”
• Not much statistical data about false positives
• Proprietary protocol performance is uncharted territory
Competing in the same field
Fixing security problems vs. fixing operational problems
Competing in the same field
• Security usually becomes a High Impact Low Probability (HILP) event
• Opens the debate about the impact of security on company performance: “which risk is the biggest threat to our performance?”
• Releases budget for fixing problems
Work to be done…
• Systematically describing attack vectors in scenarios, relating them to FTA, is achievable
• Quantifying attack vectors is difficult
• Quantifying the positive and negative consequences of measures is a challenge
Conclusion
• Embedded systems are everywhere
• They are:
  – An easy attack vector
  – One of our biggest assets
• We have to learn from each other:
  – As security experts, we have to learn to talk about lost dollars and lives
  – As embedded-system owners, we have to see security as a real threat inside our domain
Editor's Notes
• The InfoPlus sign shown is part of GSM-R, the same network used for controlling trains.
• A major challenge is “frequent checking”: what if all my systems are infected and then fail to safe?
• Once had a customer where 70% of all support calls were people using private laptops during nightshifts who couldn’t get the internet to work!
• Odd thing is: they talk about the same effects: losing the power to do business!