The document summarizes the work of a team called Sea++ over 10 weeks. In the first 3 weeks, they interviewed cyber engineers to understand barriers to testing ship navigation systems safely. They developed an initial MVP of a collaboration tool for commanding officers and engineers to plan cyber tests. In weeks 4-7, they sought to understand why integrated cyber testing does not occur on ships post-acquisition. They identified issues such as lack of tools, documentation, and cultural barriers. In weeks 8-10, they focused on what commanding officers need to know about cyber threats and developed a way to convey essential mission impact information to them. Going forward, they see opportunities to expand their work to other maritime sectors.
Sea++ H4D Stanford 2018
1. Ruben Krueger
BS CS
Software
Scott Pratz
MS EE
Strategy
Meilinda Sun
BS CS
Software
Ross Ewald
BS CS
Policy
Sea++
Original Problem:
Test engineers cannot
safely or adequately cyber
test ship nav systems.
99 interviews
Current Problem:
COs must understand
impacts of cyber
vulnerabilities to
missions.
Sponsor: MITRE (in collaboration with the Office of Naval Research)
2.
3.
4. Weeks 1-3: Understanding problem scope
Customer Discovery Objective: Interview cyber
engineers to learn why cyber testing is unsafe
and inadequate.
5. - MITRE Systems Engineer, Former USAF
“[Operational leadership] is often the
barrier to cyber testing, as they worry about
the safety of the crew and integrity of the
asset.”
6. Mission Model Canvas™ Week 1
Key Partners
NavSea Philadelphia
Cyber Command
Private Cybersecurity Firms
Shipbuilders
Commanding officers
Key Activities:
Understanding hardware
testing procedures
Developing evaluation criteria
Gaining trust and gauging
expectations of ship captains
Value Proposition
Development of physical
and electronic security
testing procedures.
Streamline testing
procedures to minimize
operational downtime
Target individual systems
for most useful testing
applications
Incremental aggression
capabilities to maximize
vulnerability detection.
Beneficiaries
Test Engineers
Commanding Officers
Key Resources
Safety parameters of cyber-
physical systems
Integrated navigation system
models/block diagrams
User Design Input
Buy-In / Support
Operational commanders for
conducting testing
NavSea/MITRE (Navy) and
C4IT (CG) to conduct testing
and evaluate effectiveness of
methodology
Deployment
Testing of methodology on
sample test plans
Full electronic software
deployment to customers via
web-app
Mission Budget / Costs
Fixed:
Software design & engineering
Testing and reporting costs
Variable:
Expansion of hardware testing capabilities
Travel
Mission Achievement
More aggressive testing taking place onboard ships
Adoption and continued use metrics of the methodology by engineers
Increased vulnerability detection on ships
User retention during testing lifecycle
7. Develop testing procedures
Target individual systems
Cyber Engineers
Commanding Officers
Mission Model Canvas™ Week 1
Value Proposition Beneficiaries
Streamline testing procedures
Incremental aggression capability
8. Initial MVP - Collaboration Tool Between COs and Engineers
Ship’s Personnel: CONCERNS
Cyber Engineers/Testers: TESTING RQMNTS
Sea++ Platform
Testbed Creation: Environmental/Cyber Monitoring Requirements
Integrated Navigation System Data
Identification of System Vulnerabilities
Hypothesis: Better
testing planning is needed
to improve safety.
MVP: Platform to engage
COs and engineers in the
planning of cyber tests
and to create a
corresponding testbed.
9. “There is no routine entity that tests cyber
security on ships.”
- Member of National Security Council, Navy Officer
10. What we had been doing:
“Can you please direct us to the group that does integrated testing?”
NAVSEA
SPAWAR
NAVAIR
Office of Naval Research
Fleet Cyber
11. Weeks 4-7: Why isn’t anyone testing ships?
Customer Discovery Objective: Understand why
post-acquisitions integrated cyber testing does
not occur.
12. Nonstop Pivoting, Beneficiary Confusion
Why doesn’t post-acquisitions cyber testing occur?
R&D: High-value assets may be damaged
Pentesters: Post-test remediation is time-consuming and inadequate
Test Engineers: Inadequate documentation for a thorough test
Test Engineers: Integrated system of the ship is too complex to understand
Commanding Officers: No tools to understand the impact of cyber
Government Officials: Organizational issues, cultural issues
13. Nonstop Pivoting, Beneficiary Confusion
Why doesn’t post-acquisitions cyber testing occur?
R&D: High-value assets may be damaged
Pentesters: Post-test remediation is time-consuming and inadequate
Test Engineers: Inadequate documentation for a thorough test
Test Engineers: Integrated system of the ship is too complex to understand
Commanding Officers: No tools to understand the impact of cyber
Government Officials: Organizational issues, cultural issues
14. How can we make post-acquisitions cyber testing happen in the Navy?
(1) Chief of Naval Operations
(2) Naval Systems Commands / Fleet Cyber
(3) Commanding Officers
Establish Office to Manage Integrated Testing
15. How can we make post-acquisitions cyber testing happen in the Navy?
(1) Chief of Naval Operations
(2) Naval Systems Commands / Fleet Cyber
(3) Commanding Officers
Establish Office to Manage Integrated Testing
Empower Commanding Officers to Demand Fix
16. Weeks 8-10: What does a commanding officer
need to know about cyber threats?
Commanding officers will address cyber threats
once they understand them.
Customer Discovery Objective: What does a
commanding officer need to know about cyber?
17. “To many commanders, a cyber threat is not
as real as a missile attack.”
- National Security Council Member, Naval Officer
18. Communicating Cyber Impacts to Commanders
Concrete: Make cyber as real as a traditional threat
Essential: Only relevant info about mission impact
Actionable: Clear path to remediation
19. “I understand everything in terms of mission
impact ... I just want sh*t to work.”
- Navy Commanding Officer
20. Communicating Cyber Impacts to Commanders
Concrete: Make cyber as real as a traditional threat
Essential: Only relevant info about mission impact
Actionable: Clear path to remediation
21. “It’s depressing when cyber people tell
us that everything is vulnerable. My
focus is not on assessing vulnerabilities
but on fixing them.”
- Naval Commanding Officer
22. Communicating Cyber Impacts to Commanders
Concrete: Make cyber as real as a traditional threat
Essential: Only relevant info about mission impact
Actionable: Clear path to remediation
23. Communicating Cyber Impacts to Commanders
Concrete: Make cyber as real as a traditional threat
Essential: Only relevant info about mission impact
Actionable: Clear path to remediation
24. Current Mission Model Canvas™
Key Partners
Equipment
Manufacturers
Naval Engineers
MITRE
Fleet Cyber
Chief of Naval
Operations
NavSea
Key Activities:
Development of software
and visualization
integration
Contract for acquisitions
Modifications for private
sector
Value Proposition
Understand how cyber
threats degrade missions
Understand how to
address cyber threats
Beneficiaries
Ship Commanders
Key Resources
Access to private sector
simulations
Buy-In / Support
Chief of Naval
Operations
Office of Naval Research
Ship commanders - Fleet
Cyber
Deployment
Contract for MVP written
and submitted to
acquisitions
Mission Budget / Costs
Development of MVP (materials)
Travel
Installation costs for MVP
Mission Achievement
Increased clarity of cyber-vulnerabilities leading to better decision-
making and increased testing
25. Understand how to address
cyber threats
Naval Operators
(Commanding Officers)
Current Mission Model Canvas™
Value Proposition Beneficiaries
Understand how cyber threats
degrade missions
26. Filling the gap: Convey information to commanding officers
Convey information to Fleet Cyber (Navy Cyber Operational View)
Present IT tools (VRAM, Nessus, etc.)
Convey information to commanding officers
27. Filling the gap: Convey information to commanding officers
Convey information to Fleet Cyber (Navy Cyber Operational View)
Present IT tools (VRAM, Nessus, etc.)
Our MVP
Additional Tools/Capabilities
30. “This is exactly what we needed.”
- Former Program Manager,
Army Cyber Command
31. Size of Opportunity (TAM / SAM / SOM diagram)
Department of Defense: $716 billion
Maritime Commercial Industry: $500 billion
Maritime Government: $194 billion
Commercial Factories: $155 billion
Naval Ships and Subs: $63 billion
32. Moving Forward
Interest from:
- Former program manager at Army Cyber Command
- Ship commanding officers
- Lockheed Martin cyber security personnel
All four of us plan on continuing with Sea++
33. Acknowledgments
Our work would not have been possible without our sponsors at MITRE (Suresh
Damodaran, Matt Mickelson, and Alex Schlichting) and numerous other supporters,
including the teaching team, TAs (especially Will Papper), H4D military liaisons, and our
mentor Daniel Bardenstein. Additionally, a special thanks to individuals at the following
organizations:
36. R&D: High-value assets may be damaged
Test Engineers: Inadequate documentation for a thorough test
Commanding Officers: No tools to understand the impact of cyber
Pentesters: Post-test remediation is time-consuming and inadequate
Test Engineers: Integrated system of the ship is too complex to understand
Government Officials: Organizational issues, cultural issues
38. Pivot
Many of the same problems (not damaging high-value assets, organizational issues)
From: testing of individual systems (electrical, navigational, etc.)
To: testing of all systems (i.e., ship-wide testing) in a “red team” style
39. Pivot
Testing of all systems cannot be solved without understanding how to test individual systems
From: testing of all systems (i.e., ship-wide testing)
To: testing of individual systems (electrical, navigational, etc.)
42. MVP - “Black Box”
● During testing, replace high-risk component with MVP
● Using an AI simulation of data sent and received by the
component, simulate the behavior of the component
● Virtually replicate the behavior of the system without
risking damage to it
● Mitigate risk by removing at-risk and expensive
components from testing
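The bullets above can be sketched as a small software surrogate: record request/response traffic from the real component, fit a simple model to it, then let that model answer in the component's place during a test. This is a minimal illustration under assumed names and signals (`EngineSurrogate`, throttle/rpm), not the team's actual implementation; a real version would use a trained ML model and the component's native bus protocol.

```python
# Minimal "black box" surrogate sketch: learn a component's input -> output
# behavior from recorded traffic, then stand in for it during a cyber test.
# All names and values here are illustrative assumptions.

class EngineSurrogate:
    """Nearest-neighbour surrogate for a physical component."""

    def __init__(self):
        self.samples = []  # list of (input_vector, output_vector) pairs

    def record(self, inputs, outputs):
        """Log one request/response pair observed from the real component."""
        self.samples.append((tuple(inputs), tuple(outputs)))

    def respond(self, inputs):
        """Answer as the component would: return the output paired with
        the closest recorded input (1-nearest-neighbour lookup)."""
        def dist(a, b):
            return sum((x - y) ** 2 for x, y in zip(a, b))
        _, outputs = min(self.samples, key=lambda s: dist(s[0], inputs))
        return outputs


# Training phase: capture traffic while the real engine controller runs.
surrogate = EngineSurrogate()
surrogate.record([0.0], [0])      # throttle 0%   -> 0 rpm
surrogate.record([0.5], [1200])   # throttle 50%  -> 1200 rpm
surrogate.record([1.0], [2400])   # throttle 100% -> 2400 rpm

# Test phase: the cyber test talks to the surrogate, not the real engine,
# so an aggressive test cannot damage the physical asset.
print(surrogate.respond([0.9]))   # closest recorded input is throttle 1.0
```

In practice the lookup would be replaced by a model trained on far richer telemetry, which is exactly the data dependency noted in the customer discovery result on the next slide.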
43. Customer Discovery
Hypothesis: The USS Secure is inadequate for some
kinds of cyber testing.
Result: Some systems, such as an engineering control
system, would probably not be able to be adequately
tested in a purely simulated environment (the USS
Secure). Our sponsor - "Lower fidelity (constructive
or virtual) simulations do not usually come with
adequate attack surface."
44. Customer Discovery
Hypothesis: A machine learning model can accurately
simulate the responses of a physical component (such
as an engine) under a variety of conditions.
Result: Confirmed, if we can get the necessary data
for the model.
45. Customer Discovery
Hypothesis: The device does not currently exist.
Result: Confirmed; the Navy is currently working on
building out USS Secure, a virtual testing environment
for ships, but a physical proxy to replace components
does not exist.
47. Buy-In/Support
Private Sector:
Supporters: Commercial Shipping Companies, Industrial Control System Operators
Advocates: Integrated System Testers
Saboteurs: Component-Level Testing Companies
Public Sector:
Supporters: Fleet Cyber Command, NavSea
Advocates: Crews/Captains
Saboteurs: Time/Budget Opponents (Engineering Testing, Weapons Testing, etc.)
48. Previous Mission Model Canvas™
Key Partners: NavSea; SPAWAR; Fleet Cyber Command; Chief of Naval Operations
Key Activities: Format for tracking the testing process; Determining organizational constraints
Key Resources: Ships to test on; Trust of ship commanders; Safety parameters of cyber-physical systems
Value Proposition: Compile cyber vulnerabilities into one process for the sake of efficiency and security; Ensure that red-team cyber testing occurs consistently and safely on ships; Improved security of naval vessels
Buy-In/Support: US Cyber Command; Fleet Cyber; Chief of Naval Operations; Saboteur: Naval Support Organizations (conflicts of interest between proposed and existing budgets and schedules for cyber testing)
Deployment: Creation of cyber-testing agency within Fleet Cyber
Mission Budget/Costs: Testing and reporting costs; Travel
Mission Achievement: Increased vulnerability detection on ships; Consistent, comprehensive cyber testing on naval vessels
49. CAPT Jim Passarelli: Value Proposition Canvas™
9th Intelligence Squadron
Products & Services: Black box that simulates a physical component
Customer Jobs: Complete missions; Maintain health and safety of crew and ship
Pains: Testing which interferes with mission scheduling; Testing that damages any part of the ship
Gains: Increased operational readiness
Gain Creators: Fast installation and removal of tools to ensure that they are implemented
Pain Relievers: Remove equipment from testing entirely; Fast removal decreases system downtime and restoration period
50. Test Engineer Mr. Gene Lockhart: Value Proposition Canvas™
Products & Services: Black box that simulates a physical component
Customer Jobs: Find cyber vulnerabilities in the electrical system
Pains: High risk of damaging system components; Time spent mitigating safety concerns of physical equipment
Gains: More secure integrated ship systems, better testing available
Gain Creators: Free up time via easy installation and interface-ability with multiple types of industrial physical systems
Pain Relievers: Removes risk of damaging equipment by entirely removing equipment from the systems testing
51. Research Engineer Suresh Damodaran: Value Proposition Canvas™
Products & Services: Black box that simulates a physical component
Customer Jobs: Find cyber vulnerabilities in the electrical system
Pains: Damaging components in sub-systems; Decreasing thoroughness of testing to accommodate risk of damage to equipment
Gains: Faster, more accurate testing of components that will be installed on ships
Gain Creators: Removes physical equipment from the system entirely to decrease machinery damage risk
Pain Relievers: Does not compromise accuracy of results due to accurate modelling of physical systems
52. Get / Keep / Grow
Get: ACQUIRE customers via successful testing; approach potential users and market the value of our solution. ACTIVATE customers by pushing naval offices to adopt the cyber testing device; train users in test device usage to ensure adoption.
Keep: MAINTAIN interest by refining and integrating the product to meet new needs. IMPROVE the product by adding more complex features that improve its capabilities.
Grow: EXPAND the customer base through the Coast Guard, private sector shipping (tankers, cruise ships), and private sector industry (industrial farming, utility companies). OBTAIN REFERRALS via proof of successful military adoption to commercial industry.
53. Week 1 Mission Model Canvas
Key Partners: NavSea Philadelphia; Cyber Command; Private cybersecurity firms; Shipbuilders; MITRE
Key Activities: Understanding hardware testing procedures; Developing evaluation criteria; Gaining trust and gauging expectations of ship captains
Key Resources: Safety parameters of cyber-physical systems; Integrated navigation system models/block diagrams; User design input
Value Proposition: Development of physical and electronic security testing procedures; Streamline testing procedures to minimize operational downtime; Target individual systems for most useful testing applications; Incremental aggression capabilities to maximize vulnerability detection
Beneficiaries: Primary: safety of ship crewmen, safety of large assets; Primary: providing defense contractors with methodology for determining security of navigational systems; Secondary: decrease in potential for international incidents
Buy-In/Support: Operational commanders for conducting testing; NavSea/MITRE (Navy) and C4IT (CG) to conduct testing and evaluate effectiveness
Deployment: Testing of methodology on sample test plans; Full electronic software deployment to customers via web-app
Mission Budget/Costs: Fixed: software design & engineering, testing and reporting costs; Variable: expansion of hardware testing capabilities
Mission Achievement: More aggressive testing taking place onboard ships; Adoption and continued use metrics of the methodology by engineers; Increased vulnerability detection on ships; User retention during testing lifecycle
54. Week 2 Mission Model Canvas
Key Partners: NavSea Philadelphia; Cyber Command; Private cybersecurity firms; Shipbuilders; MITRE
Key Activities: Understanding hardware testing procedures; Understanding evaluation criteria
Key Resources: Ships to test on; Trust of ship commanders; Safety parameters of cyber-physical systems
Value Proposition: Streamline testing procedures to minimize operational downtime; Yield best ID of vulnerabilities for correction; Development of physical and electronic security testing procedures; Target individual systems for most useful testing applications; Incremental aggression capabilities to maximize vulnerability detection
Beneficiaries: Operational commanders; Test engineers; Defense contractors
Buy-In/Support: Operational commanders; NavSea/MITRE (Navy) and C4IT (CG) to conduct testing and evaluate effectiveness of methodology
Deployment: Using methodology on ships; Full electronic software deployment to customers via web-app
Mission Budget/Costs: Software design & engineering; Testing and reporting costs
Mission Achievement: More aggressive testing taking place onboard ships; Adoption and continued use metrics of the methodology by engineers; Increased vulnerability detection on ships
55. Week 3 Mission Model Canvas
Key Partners: NavSea Philadelphia; Cyber Command; Private cybersecurity firms; MITRE
Key Activities: Determining how to streamline the restoration process; Determining necessity of adversarial assessment
Key Resources: Ships to test on; Trust of ship commanders; Safety parameters of cyber-physical systems
Value Proposition: Streamline testing restoration procedures to minimize operational downtime; Yield best ID of vulnerabilities for correction without risking damage to ship; Development of physical and electronic security testing restoration procedures; Target individual systems for most useful testing applications; Incremental aggression capabilities to maximize vulnerability detection
Beneficiaries: Operational commanders; Test engineers
Buy-In/Support: Operational commanders; NavSea/MITRE (Navy) and C4IT (CG) to conduct testing and evaluate effectiveness of methodology
Deployment: Using methodology on ships; Full electronic software deployment to customers via web-app
Mission Budget/Costs: Software design & engineering; Testing and reporting costs; Travel to naval bases
Mission Achievement: More ships undergoing aggressive testing procedures; Adoption and continued use metrics of the methodology by engineers; Increased vulnerability detection on ships
56. Week 4 Mission Model Canvas
Key Partners: NavSea; US Cyber Command; Private cybersecurity firms; MITRE
Key Activities: Format for tracking the testing process and constructing relevant information upon completion; Determining how to streamline the restoration process
Key Resources: Ships to test on; Trust of ship commanders; Safety parameters of cyber-physical systems
Value Proposition: Ensure that red-team cyber testing occurs consistently and safely on ships; Streamline cyber-testing tracking and restoration procedures to minimize operational downtime; Yield comprehensive ID of test parameters and results without risking damage to ship; Development of a tool to aid in tracking of physical and electronic security testing and restoration procedures
Beneficiaries: Operational commanders; Test engineers
Buy-In/Support: Operational commanders; NavSea / MITRE; Fleet Cyber Command
Deployment: Full electronic software deployment to customers via web-app; Creation of cyber-testing agency within Fleet Cyber
Mission Budget/Costs: Software design & engineering; Testing and reporting costs; Travel; Installation of app
Mission Achievement: Reduced restoration time; Increased vulnerability detection on ships; Consistent, comprehensive cyber testing on naval vessels
57. Week 5 Mission Model Canvas
Key Partners: NavSea; US Cyber Command; Private cybersecurity firms; MITRE
Key Activities: Format for tracking the testing process and constructing relevant information upon completion; Determining how to streamline the restoration process
Key Resources: Ships to test on; Trust of ship commanders; Safety parameters of cyber-physical systems
Value Proposition: Ensure that red-team cyber testing occurs consistently and safely on ships; Streamline cyber-testing tracking and restoration procedures to minimize operational downtime; Yield comprehensive ID of test parameters and results without risking damage to ship; Development of a tool to aid in tracking of physical and electronic security testing and restoration procedures
Beneficiaries: Operational commanders; Test engineers
Buy-In/Support: Operational commanders; NavSea / MITRE; Fleet Cyber Command
Deployment: Full electronic software deployment to customers via web-app; Creation of cyber-testing agency within Fleet Cyber
Mission Budget/Costs: Software design & engineering; Testing and reporting costs; Travel; Installation of app
Mission Achievement: Reduced restoration time; Increased vulnerability detection on ships; Consistent, comprehensive cyber testing on naval vessels
58. Week 6 Mission Model Canvas
Key Partners: Fleet Cyber; NavSea
Key Activities: Build physical prototype; Develop machine-learning technology
Key Resources: Training data for AI model; Microcontrollers with clock speeds fast enough to respond to PLCs
Value Proposition: Risk of damaging high-value assets reduced; Ability to run wider battery of tests faster and more securely; Removal of equipment from testing plans
Beneficiaries: Ship commanders; Test engineers; Research engineers
Buy-In/Support: Chief of Naval Operations; Office of Naval Research; Ship commanders; Fleet Cyber Command
Deployment: Contract for MVP written and submitted to acquisitions
Mission Budget/Costs: Development of MVP (materials); Travel; Installation costs for MVP
Mission Achievement: Near-perfect simulation of physical component with the black box; More comprehensive testing and remediation of cyber-physical system
59. Week 7 Mission Model Canvas
Key Partners: Fleet Cyber; NavSea
Key Activities: Build physical prototype and reporting tool; Develop machine-learning technology
Key Resources: Training data for AI model; Access to private sector simulations; Microcontrollers with clock speeds fast enough to respond to PLCs
Value Proposition: Risk of damaging high-value assets reduced and increased clarity on impact of results; Ability to run wider battery of tests faster and more securely; Removal of equipment from testing plans
Beneficiaries: Ship commanders; Test engineers; Research engineers
Buy-In/Support: Chief of Naval Operations; Office of Naval Research; Ship commanders; Fleet Cyber Command; Saboteur: equipment manufacturers (loss of monopoly over diagnostics of their product)
Deployment: Contract for MVP written and submitted to acquisitions
Mission Budget/Costs: Development of MVP (materials); Travel; Installation costs for MVP
Mission Achievement: Simulation of physical component with black box; Integration of existing private sector technologies into testing process and increased clarity of results; Metrics are impossible to know from security issues; More comprehensive testing and remediation of cyber-physical system
60. Week 8 Mission Model Canvas
Key Partners: Equipment Manufacturers; Naval Engineers; MITRE; Fleet Cyber; Chief of Naval Operations
Key Activities: Development of software and visualization integration; Contract for acquisitions; Modifications for private sector
Key Resources: Access to private sector simulations; Microcontrollers with clock speeds fast enough to respond to PLCs
Value Proposition: Risk of damaging high-value assets reduced and increased clarity on impact of results; Ability to run wider battery of tests and convey importance of vulnerabilities to COs; Removal of equipment from testing plans
Beneficiaries: Ship commanders; Test engineers; Research engineers
Buy-In/Support: Chief of Naval Operations; Office of Naval Research; Ship commanders; Fleet Cyber; Saboteur: equipment manufacturers (loss of monopoly over diagnostics of their product)
Deployment: Contract for MVP written and submitted to acquisitions
Mission Budget/Costs: Development of MVP (materials); Travel; Installation costs for MVP
Mission Achievement: Integration of existing private sector technologies into testing process and increased clarity of results; Test engineers implement as primary tracking tool for testing; COs utilize as reference point for cyber threats
Scott:
Original problem description
Via 99 Interviews
Arrived at current problem
Introduce members of team
Scott
Incidents have brought attention to cyber vulnerability potential
Cyber is important because of
Integrated systems
Not built with cyber in mind
Major assets vulnerable
Many ships are currently vulnerable, and identifying/fixing vulnerabilities is extremely important
Scott
Via 99 interviews
About ⅓ of each area
About ⅙ were beneficiaries: military operational unit leaders (beneficiaries changed throughout the process)
Scott
Attempting to identify the core beneficiary: cyber integrated ship testers and COs
What is unsafe and inadequate and why?
What does inadequate/unsafe even mean?
What is the underlying problem?
Scott
Seems to be a disjoint between the commanding officers and the engineers, where engineers want to perform thorough tests and commanding officers are worried that cyber tests will compromise the safety of the ship
Scott
This was our initial Mission Model Canvas, that organizes the key aspects of the identification of a problem/fit into a cohesive document.
Our initial MMC changed dramatically relative to our final MMC
The most important factor that changed was our value proposition and beneficiaries
Scott
Our initial discovery led us to identify two major beneficiaries:
Cyber engineers/testers (describe archetype)
Commanding Officers (describe archetype)
Problems we found
No streamlined testing procedures -> minimize downtime
No ability to target individual systems -> accurate results
Develop procedures -> help streamline and improve safety
Incremental ability -> can cater test to timeframe/thoroughness required
Scott
Develop a tool that takes the following input:
Engineers-
Equipment testing scope
Depth scope
COs
Timeline of testing
Safety of testing
Create a testing methodology that enables comprehensive test plans of the ship that meets the needs of both beneficiaries.
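The tool described in the notes above can be sketched as a simple intersection of the two beneficiaries' inputs: the engineers' testing requirements on one side, the CO's operational constraints on the other, merged into one test plan. Field names, the example tests, and the hours/risk thresholds below are illustrative assumptions, not the actual MVP's schema.

```python
# Sketch of the week-1 collaboration MVP: combine the engineers' testing
# requirements with the CO's operational constraints into one test plan.
# All field names and numbers are invented for illustration.

def build_test_plan(engineer_reqs, co_constraints):
    """Return the tests that fit inside the CO's time window and risk limit."""
    plan = []
    for test in engineer_reqs:
        if (test["hours"] <= co_constraints["max_hours"]
                and test["risk"] <= co_constraints["max_risk"]):
            plan.append(test["name"])
    return plan

# Engineers' side: desired tests with estimated duration and risk (1-5).
engineer_reqs = [
    {"name": "nav GPS spoofing probe", "hours": 4, "risk": 2},
    {"name": "full red-team exercise", "hours": 48, "risk": 5},
]

# CO's side: operational window and tolerable risk to crew and asset.
co_constraints = {"max_hours": 8, "max_risk": 3}

print(build_test_plan(engineer_reqs, co_constraints))
```

The point of the sketch is the negotiation the slide describes: the thorough test the engineers want is filtered by the safety and scheduling limits the CO imposes.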
Meilinda (due to the complex responsibilities of equipment and budget conflicts)
We discovered that we were struggling to identify an integrated cyber tester to present this MVP to.
Where were these cyber testers? What office did they work for?
Only able to find OT&E (operational test and evaluation) - pre-acquisition testers.
What testing is being completed RIGHT NOW?
Only specific offices are testing their own equipment post-acquisitions… why is this happening?
No integrated ship testing developed
Meilinda
As it turns out, during our initial interviews, we kept asking whether we could be pointed to the org that does integrated systems testing, and each interviewee said “I think this is what NAVSEA or FLEET CYBER or some other organization does”.
After weeks of talking with specific naval systems commands
NAVSEA
NAVAIR
SPAWAR
ONR (not systems command)
Fleet Cyber (not systems command)
They just kept directing us to other offices: “I think that’s what ONR/SPAWAR/NAVSEA/NAVAIR does…”
Nobody has taken the responsibility of carrying out these tests.
There was no singular entity who could control the process (budget/equipment responsibilities)
Meilinda
Major Pivot
After we learned nobody was doing testing of entire ships, we decided we needed to ask more questions
This was a major pivot, as we were previously under the impression that integrated ship cyber-testers existed; they do not.
To make testing of entire ships actually happen, we knew we would have to look beyond the test engineer as the sole agent of change
Rather than creating a technical MVP for technical people, we pivoted toward finding a way to make cyber testing of entire ships occur
Meilinda
After talking to various industry professionals from Industrial controls, to pentesters, to commanding officers, we went on a hunt for an MVP
Could not focus on the “correct beneficiary”
Could not identify the most palpable problem
Why was this space so disorganized?
We iterated many times through MVPs, all very different, filling the needs of niche groups of beneficiaries.
Meilinda
Everything eventually pointed back to a cultural problem
Not enough skill/retention
Not enough manpower
Not enough money
Not set up organizationally
How do we solve all these problems? Organizational paradigm shift.
Meilinda
TWO MAJOR APPROACHES
Top-down approach, request CNO designate office to manage cyber testing. Force change and specify office structure and role it will play
Bottom-up approach, give COs the ability to request assistance from offices, and conduct cyber testing at a lower level onboard. Give them the ability to make the change themselves
Top-down is unrealistic: not enough information about office roles, and it is challenging to create a solution that isn’t fully understood.
Meilinda
TWO MAJOR APPROACHES
Top-down approach, request CNO designate office to manage cyber testing. Force change and specify office structure and role it will play
Bottom-up approach, give COs the ability to request assistance from offices, and conduct cyber testing at a lower level onboard. Give them the ability to make the change themselves
Top-down is unrealistic: not enough information about office roles, and it is challenging to create a solution that isn’t fully understood.
Meilinda
COs do not understand cyber threats on an operational level (which is how they understand their assets, as MISSION COMPLETION assets)
How can we convey cyber threats in a way conducive to commanding officers?
How can we give them the information they need to begin to make changes?
Ross
COs are challenged in their ability to conceptualize a vulnerability in an asset that is unseen and unknown (by non-professionals)
COs care about mission completion, and how their ships are able to complete/fail missions
Ross
Finding: Cyber needs to be understood as a traditional vulnerability in a physical system.
Ross:
COs just want to be able to prioritize the information about operational readiness that is most important to them
Ross:
Finding: Enable the prioritization of vulnerabilities by mission needs and how much they impact the mission.
Ross:
Give recommendations on how to fix problems on the ship, or direct Commanding officers on “who to call”
Ross:
Finding: Tool must have features that identify the most effective means of remediation.
Ross:
Restate previous three features:
Make it tangible
Make it relevant to mission
Make it helpful (for fixing)
Ross:
These new findings dramatically changed our MMC from our week 1 MMC:
Remove cyber testers as beneficiary (they don’t exist)
Change value propositions dramatically
Ross:
New major values:
Convey vulnerabilities in terms of mission impact
Recommend solutions to them
Ruben Krueger
Currently, there exist:
Network-level tools that are able to identify network-level vulnerabilities and basic monitoring features
Reporting/Statistics for Fleet Cyber to identify problems in systems, see status
There is nothing that identifies risks and vulnerabilities in terms of mission impacts
Ruben
Our MVP will give a mapping ability to the ship, with the ability to identify vulnerabilities by how they impact missions, not just a list of “10k things wrong with the boat”.
It will also utilize additional third-party scanning that goes much deeper than SIPRNet/NIPRNet scanning, down to industrial control systems
Ruben:
Current Model of our MVP
Who do we present to?
Commanding Officers, Information Technicians, Information Officers
What does it do?
Answers all the captains' needs in order to promote the change in cyber culture
(update picture since we changed the header title last night)
We need to have two images
The “first view” of operational impact ranking
The view above of total vulnerabilities
MAYBE a page showing the “remediation steps” of a specific vulnerability
Describe general view
Mission View <- System View <- Equipment View <- Vulnerability View <- Remediation View
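The drill-down chain above can be sketched as a simple nested mapping, walking from a mission down to the remediation step. Every mission, system, equipment, vulnerability, and remediation name in this sketch is a made-up placeholder, not real ship data.

```python
# Illustrative sketch of the drill-down chain, assuming a simple nested mapping.
# All names and strings here are hypothetical placeholders, not real ship data.
hierarchy = {
    "Navigate safely": {                      # Mission View
        "GPS": {                              # System View
            "Receiver unit": {                # Equipment View
                "Outdated firmware":          # Vulnerability View
                    "Apply vendor patch; contact the maintenance activity",
            }
        }
    }
}

def drill_down(mission: str, system: str, equipment: str, vuln: str) -> str:
    """Walk from a mission down to the remediation step for one vulnerability."""
    return hierarchy[mission][system][equipment][vuln]
```

The nesting mirrors the five views: each level of the dictionary is one view, so a CO starts at missions and only reaches vulnerability details by choosing to drill down.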
Ruben
We have received feedback from multiple individuals describing the need for operational-level awareness that cybersecurity is a real, viable threat to missions.
Ruben:
This is just the "tip of the iceberg" of the market this concept could be applied to
Any "mission completion" asset could benefit from a tool that maps mission needs onto physical equipment
Be sure to define the acronyms below (instructor recommendation)
TAM or Total Available Market is the total market demand for a product or service.
SAM or Serviceable Available Market is the segment of the TAM targeted by your products and services which is within your geographical reach.
SOM or Serviceable Obtainable Market is the portion of SAM that you can capture.
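These three definitions imply a strict nesting, SOM ⊆ SAM ⊆ TAM, which a toy calculation makes concrete. The dollar figure and percentages below are invented purely for the example.

```python
# Toy illustration of the nesting the definitions imply: SOM <= SAM <= TAM.
# The dollar figure and percentages are invented for the example.
tam = 1000               # total market demand, in $M (hypothetical)
sam = tam * 30 // 100    # suppose 30% of the TAM is within our reach
som = sam * 10 // 100    # suppose 10% of the SAM is realistically capturable
assert som <= sam <= tam  # the definitions require this nesting
```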
Ruben
Interest from multiple entities in the operational community
MVP assumes that testing data exists
Big “hangup” will be the ability to deploy additional software scanning tools
Deployment will be challenging because installing software on operational units requires approval
Ruben:
We would like to thank all the organizations that have helped us throughout the process.
We would not have been able to even begin to understand the problem without the help from our sponsor and the many individuals who helped us.
Organizational restructuring is unrealistic:
Difficult to identify the most successful structure
Challenging to deploy the desired structure
Still does not promote better retention of experts or create organic knowledge to evaluate systems
Still does not remedy the disjoint understanding between operators and engineers
Jeff: Okay. I think I understand the slide. There are a number of reasons that cyber testing is not occurring. Am I wrong? Not too sure, given the R&D box that doesn't seem to fit with the others. First, it would help to have a title explaining this: "Cyber testing does not occur for X reasons." Second, you are attempting to convey way too much in one slide. This needs to be a build slide, and even then, you may want to cut the number of reasons down to three. Choose the best three and then drill down on them. Obviously the organizational issue is a big one.
Using ORMF also legitimizes cyber attacks as real attacks
Simple - Commanders need the information presented in a familiar way so they will understand it (i.e. Operational Risk Management Framework).
Actionable - A commander must know to whom they can delegate the task of addressing vulnerabilities
Essential - Commanders care about a cyber vulnerability’s mission impact, not the details of the cyber vulnerability. The tool needs to contain only critical information.
Adaptable - Understanding mission impact of cyber vulnerabilities is crucial across the DoD, so the tool needs to be adaptable for other cyber-physical systems.
We were pivoting almost every week. These were tough weeks: every time we thought we had settled on something, we realized there were other issues at hand and had to pivot again.