Why 'positive security' is a software security game changer
Nov 6, 2019
This deck goes through challenges with software security today, how we got to this position and best ways of addressing these challenges through the lens of 'positive security'.
Working or saving lives?
> We empower developers to write secure code
> Developer >> Pentester >> Developer
> Help organisations build kick-ass security awareness training programs
Agenda
• Today’s challenges with software security
• How did we end up here?
• Shift Left → Start Left – how to scale and make an impact as AppSec
• Build a positive security culture
22M software engineers around the world ~ Evans Data
Source: https://evansdata.com/reports/viewRelease.php?reportID=9
111BN lines of code written by developers every year ~ CSO Online
Source: https://www.csoonline.com/article/3151003/application-development/world-will-need-to-secure-111-billion-lines-of-new-software-code-in-2017.html
90% of security incidents result from defects in the design or code of software ~ DHS
Source: https://www.us-cert.gov/sites/default/files/publications/infosheet_SoftwareAssurance.pdf
21% of data breaches caused by a software vulnerability ~ Verizon
Source: Verizon, Data Breach Investigations Report, 2018 (a recurring finding over the last 10 years)
1 in 3 newly scanned applications had SQL injections over the past 5 years ~ Cisco
Source: Cybersecurity as a Growth Advantage, Cisco, 2016
AppSec in 2000
Corporates had a branding website, the
Internet was mostly for geeks
> AppSec was virtually non-existent in corporate world
> Hacking was focussed on exploiting infrastructure vulnerabilities (buffer overflows, race conditions, format strings)
> Research on first web app weaknesses
> OWASP started and Top 10 released!
> Penetration testing was black magic
We’ve got bigger problems (Y2K) than worrying
about Application Security
AppSec in 2010
Companies started offering web-based services;
Web 2.0 and Mobile are new
> Penetration testing was THE thing
> Web Application Firewalls will stop everything
> Paper-based secure coding guidelines
> Static code analysis (SAST) tools emerge
BUILDERS vs BREAKERS
BUILDERS
> Know their code
> Do not speak “security”
> Speak: Constructors, Java Spring, Swift, Angular.JS
BREAKERS
> Always pointing out problems
> Not developers
> Speak: SQL Injections, XSS, Object Deserialization, IDOR
SHIFT LEFT → START LEFT
Solution – Better Pen-Testing
> Bobby’; DROP TABLE pentesting_attitude;
> Provide a real FIX, not just input_validation();
> Create a JIRA ticket with the advice/fix
> Create a pull request (wishful thinking)
> Share lessons learned with dev teams to distribute knowledge
Less finding problems, more security engineering
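The Bobby Tables line above is the classic SQL injection jab, and "provide a fix" means handing developers working code, not a one-liner about input validation. A minimal sketch of the kind of fix a security engineer could hand over, using Python's stdlib sqlite3 module (the table and data are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user_vulnerable(name):
    # DON'T: string concatenation lets "Bobby'; DROP TABLE ..." escape the literal
    return conn.execute(
        "SELECT role FROM users WHERE name = '" + name + "'").fetchall()

def find_user_fixed(name):
    # DO: a bound parameter is always treated as data, never as SQL
    return conn.execute(
        "SELECT role FROM users WHERE name = ?", (name,)).fetchall()

# A classic injection payload returns every row via the vulnerable query...
print(find_user_vulnerable("x' OR '1'='1"))   # [('admin',)]
# ...and nothing via the parameterized one
print(find_user_fixed("x' OR '1'='1"))        # []
```

The fix is mechanical enough to ship as a pull request, which is exactly why it distributes better than a finding that says "sanitise your inputs".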
Solution – Distribute Knowledge
Secure Coding Guidelines
1. Ensure application logging (Where, What, When, Who, Why)
2. Use context encoding on untrusted user input
Project X - Secure Coding rules for
<insert your favourite coding framework>
1. Use SecureLogger log_object;
2. Don’t use GetParameter(), Use LibSafe_GetParam()
Upon Commit
1. Your code violates security rules: you shall not pass!
2. Your code violates security rules: fill in your get-out-of-jail card (JIRA ticket)
3. Points++ for delivering secure code
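The commit gate above can be sketched as a simple pre-commit check. This is a hypothetical illustration in Python: the banned-call list mirrors the Project X rules, and a real setup would run a proper SAST tool or git hook framework over the staged diff:

```python
import re

# Hypothetical banned-call list mirroring the Project X rules,
# mapping a pattern to the suggested replacement
BANNED = {
    r"\bGetParameter\s*\(": "use LibSafe_GetParam() instead",
}

def check(lines):
    """Return (line_no, advice) for every line that violates a rule."""
    violations = []
    for no, line in enumerate(lines, start=1):
        for pattern, advice in BANNED.items():
            if re.search(pattern, line):
                violations.append((no, advice))
    return violations

# A real hook would run this over `git diff --cached`, print the
# violations, and exit non-zero to block the commit: you shall not pass!
staged = ['user = GetParameter("id")', 'safe = LibSafe_GetParam("id")']
for no, advice in check(staged):
    print(f"line {no}: violates security rules: {advice}")
```

The point of the sketch is the feedback loop: the rule, the rejection, and the suggested fix all arrive at commit time, not weeks later in a pen-test report.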
Solution – Learn from Mistakes
Security Vulnerabilities
● Sensitive data not transported securely
Developer fixes issue
● Use TLS() for any sensitive data
Project X - Secure Coding rules for
<insert your favourite coding framework>
1. Use SecureLogger log_object;
2. Don’t use GetParameter(), Use LibSafe_GetParam()
3. Use TLS() for any sensitive data
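"Use TLS() for any sensitive data" is a placeholder rule; what a codebase can actually enforce is refusing plaintext transport and keeping certificate verification on. A minimal sketch in Python's stdlib (the helper names are illustrative):

```python
import ssl
import urllib.parse

def assert_tls_url(url):
    # Refuse to send sensitive data over anything but HTTPS
    scheme = urllib.parse.urlsplit(url).scheme.lower()
    if scheme != "https":
        raise ValueError(f"refusing plaintext transport: {scheme!r}")
    return url

def make_client_context():
    # Modern stdlib defaults: TLS with certificate
    # AND hostname verification switched on
    ctx = ssl.create_default_context()
    assert ctx.verify_mode == ssl.CERT_REQUIRED
    assert ctx.check_hostname
    return ctx
```

Turning the lesson learned into a helper like this is how the fix gets "remembered" by the codebase rather than by the one developer who made the mistake.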
> Build a positive
security culture
Break down “us” vs “them” culture
Positive
Security Culture
Create a fun culture
> People remember a memorable brand
> Make it fun and geeky!
> AppSec are not marketing experts, get help from
Security Awareness and Comms teams
Positive
Security Culture
Build a community of Security
Champions
> Special interest group for those interested in AppSec
and cyber security
> Self-drives the culture from within Engineering with
support and collaboration from AppSec
> Fun events and competitions – write your best
phishing email, lock picking, hack internal applications
> Work on initiatives that improve the security posture
of applications such as security libraries, mentoring
peers and liaising with AppSec
Security Champions
Jane Doe
> Interested in AppSec
> Great grasp of security concepts
> coding_skills++ - best coder in the team
> Well respected by peers
> Not part of other communities
Works with AppSec doing security engineering
John Smith
> Interested in AppSec
> Good grasp of security concepts
> Good coding skills
> Well liked by peers
> Part of internal communities
Helps spread the word and drive behaviour change
Positive
Security Culture
Reward good behaviour
> Peer and executive recognition
> Speed pass - prove security awareness, introduce security pipelining and skip manual security checks
> Internal bug bounty program - reward developers for
finding security bugs you would pay pen-tester for
Positive
Security Culture
Remember – it’s not easy!
> Crawl…walk…RUN
> Visible management buy-in
> Harder to change mindset of existing employees,
easier to introduce to new starters
Secure Developers Are Superheroes
Takeaways:
● Software security is a big concern in today’s software landscape
● Demand better outcomes in security testing
● Distribute knowledge to scale AppSec
● Focus on positive outcomes, what to do vs what not to do
● Build a positive security culture
Speaker notes
Intro
Why I am presenting this topic
We went from 16M in 2014 to 22M today and 26M tomorrow.
How many of those software “engineers” have been told about the dangers in their job? Developers learn about security by making mistakes and “on the job”
How many civil engineers know that if you’re building a house, the safety and security of the construction is important?
The whole world runs on software. It's not only the banks, but cars (Tesla, BMW, etc.), oil rigs, airplanes, stock exchanges and soon, your mum's water kettle will be running some form of Linux with a crappy PHP interface to remotely manage it.
Roughly 2 million exploitable security bugs written every year
The interesting part about Verizon data breach report is that AppSec has been mentioned in there since 2010 as one of the biggest causes of data breaches. We must be doing something wrong.
We have known about SQL injection for 20 years, still making the same mistakes
Speak about Twitter and what just happened.
> AppSec Virtually nonexistent. Nobody cared
> Exploits techniques very simple (BoF, Race conditions, etc).
> Security was solved by infrastructure technology and perimeter security.
> First major focus of hackers on WebApp & Database Vulnerabilities (e.g. SQLi & XSS)
Let’s look at WHY this is happening
SAST, a 10-year-old technology, usually run after the code has already been written
RASP, an agent you install in production that tries to analyse data flows and stop the attack. Even if your code is bad, they claim they can stop everything.
DAST, tools you run against a QA version of your software. They blindly poke holes in whatever they can access and try to determine whether it's a problem.
Bug bounties: let's take pentesting but multiply it by 1000 resources. Surely one of them will find something useful.
The results of your test depend heavily on the skills of who you hire. The customer usually has no clue which skills are required to perform a proper test, and the pentester usually does not have all of them.
Tell story about our platform being tested by 2 firms with credible reputations (CREST, etc.). They found some small stuff (outdated libraries) but nothing major. Two weeks later, one of our own staff (not certified) found a flaw in our assessment engine which would allow anyone to pass an exam.
Customers don't understand the criteria required to select a good security tester and usually pick on a combination of price and reputation. Price pressure forces good companies to stay competitive: they down-scope the test, leaving not enough time to go in depth and understand the business behind the application. They focus on getting low-hanging fruit into a paper document.
The business wants to go all-in on agile and do 10 code drops per day
You’re on your own. Trying to prioritise what’s important and what can be skipped
That’s usually what happens with you if you work too long in AppSec. You bleed :)
No budget
Even if you found budget, not enough people in infosec
Security experts don’t know the code – hard time scaling
New technology/hotness – cannot keep up
There are so many tools, for so many stacks, for so many problems.
GuardRails for Github commits
RASP - Imperva bought Prevoty (WAFng, WAF++)
Exciting RIPsTech -> automatic code patch generation in specific framework
Veracode -> analyse path of execution and advise where to patch
According to NIST (National Institute of Standards and Technology, US Dept. of Commerce), there are 125 frequently occurring vulnerabilities.
No single developer on the team can master how to tackle all of these vulnerabilities (think juniors).
Developers work on different bits of the code, so security knowledge and best practices need to be shared (hard, as classroom-style training and wikis aren't effective: time consuming, not a priority).
New developers join and need to be trained on best practices; developers leave (is their knowledge preserved?).
Once a vulnerability is fixed, the 'how it was fixed' isn't typically shared, or if shared it is buried in wikis, Confluence, etc.
There is a big chance the vulnerability will be reintroduced: the fix has to be researched again, or even worse, a different fix is used (e.g. a new library).
All of the above cause security bugs to be present and hard to eradicate.
We as an industry need to stop focussing on negatives
Old standard – gatekeepers, say no
Don’t tell them what NOT to do, provide guidance on what to do. Make sure secure foundations can be built
Create a memorable brand so people remember.
I am from Appsec so I know we are not branding experts. Leverage security awareness teams, they are great at internal marketing.
Forget to tell people why they should care
Need a teachable moment – a failed project delivery or, worse, a breach (better if it's your competitor's)
Talk about how long it takes to fix a bug for a developer, unknowns for project delivery, competitors taking the lead to business folks
Promote on internal social media – news reports, lessons learnt, fun events
Takes a while to create a vibrant community
Everyone needs to be security aware but not everyone needs to be an expert.
Identify those interested in security and help them gain deeper knowledge. Not a hacking expert, still focussed on secure coding
Reward for being a champion! Send them to conferences, pay for more training, help make decisions on tooling
Currently negative experience for developers when they find bugs
- Increase work for themselves or their peers, not a positive experience