
Blending Automated and Manual Testing

DevOps puts an intense focus on automation – taking humans out of the loop whenever possible to allow frequent, incremental updates to production systems. However, thorough application testing has multiple components – many of them can be automated, but manual testing is also required. This is inconvenient and not "DevOps-y," but it is an unavoidable requirement in the real world. In addition, managing these multiple sources of application vulnerability intelligence often requires manual interaction: clearing false positives, de-duplicating repeated results, and making decisions about triage and remediation.
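The de-duplication step described above can be sketched as follows. This is a minimal illustration, not ThreadFix's actual algorithm: the field names and the matching key (CWE, URL path, parameter) are assumptions chosen for the example.

```python
# Hypothetical sketch: collapsing findings from multiple sources that
# describe the same underlying vulnerability. The matching key and field
# names are illustrative assumptions, not ThreadFix's real logic.
from collections import defaultdict

def deduplicate(findings):
    """Merge raw findings that share a (CWE, path, parameter) key.

    Returns one record per unique vulnerability, listing every tool or
    tester that reported it.
    """
    merged = defaultdict(lambda: {"sources": set()})
    for f in findings:
        key = (f["cwe"], f["path"], f.get("parameter"))
        record = merged[key]
        record.update(cwe=f["cwe"], path=f["path"],
                      parameter=f.get("parameter"))
        record["sources"].add(f["source"])
    return list(merged.values())

raw = [
    {"cwe": 79, "path": "/search", "parameter": "q", "source": "static"},
    {"cwe": 79, "path": "/search", "parameter": "q", "source": "dynamic"},
    {"cwe": 89, "path": "/login", "parameter": "user", "source": "pentest"},
]
unique = deduplicate(raw)
# Two unique vulnerabilities remain; the XSS was reported by both tools.
```

Keeping the set of reporting sources on each merged record is what makes per-source metrics (discussed later) possible after de-duplication.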

Axway has rolled out an application security program that incorporates automated static and dynamic testing, attack surface analysis, and component analysis, as well as inputs from third parties, including manual penetration testing, automated and manual dynamic testing, automated and manual static testing, and test results from vendors providing test data on their products. Automation has allowed Axway to increase the frequency of web application testing, reducing the cycle time in the application vulnerability "OODA loop." Moving beyond the identification of vulnerabilities, Axway has deployed ThreadFix to automatically aggregate the results of the automated testing and de-duplicate findings. Third-party penetration testers also find vulnerabilities and report them in reasonably structured CSV files, which Axway converts and incorporates into the aggregated vulnerability model in ThreadFix. Centralizing this pipeline allows for metric tracking, both for the application security program as a whole and on a per-vulnerability-source basis. This automation and consolidation now covers 50% of Axway's application vulnerability review process, with plans to extend further.
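A minimal sketch of the manual-result conversion described above, assuming an illustrative CSV layout; the column names are hypothetical, not Axway's actual pen-test format.

```python
# Hypothetical sketch: loading manual pen-test results from CSV into the
# same structure used for automated scan findings. Column names are
# illustrative assumptions.
import csv
import io

CSV_DATA = """severity,cwe,path,parameter,description
High,89,/login,user,SQL injection in login form
Medium,79,/search,q,Reflected XSS in search results
"""

def parse_manual_findings(text, source="pentest"):
    """Convert CSV rows into finding dicts, tagged with their source."""
    reader = csv.DictReader(io.StringIO(text))
    findings = []
    for row in reader:
        findings.append({
            "severity": row["severity"],
            "cwe": int(row["cwe"]),
            "path": row["path"],
            "parameter": row["parameter"] or None,
            "description": row["description"],
            "source": source,  # keeps manual results distinguishable
        })
    return findings

findings = parse_manual_findings(CSV_DATA)
```

Tagging each converted finding with its source lets the aggregated model track manual and automated results together while still reporting on them separately.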

This presentation walks through Axway's construction of its application security testing pipeline and the decisions made along the way to maximize the use of automation while accommodating the reality of manual testing requirements. It then looks at how this testing regimen and the associated automation have allowed Axway to improve deployment practices and collect metrics on its assurance program. Finally, it covers lessons learned along the way – the good and the bad – and identifies targeted next steps Axway plans to take to increase the depth and frequency of application security testing while dealing with the deployment realities placed on it to remain agile and responsive to business requirements.


  1. Blending Automated and Manual Testing: Making Application Vulnerability Management Pay Dividends
  2. My Background
     • Dan Cornell, founder and CTO of Denim Group
     • Software developer by background (Java, .NET, etc.)
     • OWASP San Antonio
     • @danielcornell
  3. My Background
     • Steve Springett, Application Security Architect for Axway
     • Software developer by background
     • Leader of OWASP Dependency-Track
     • Contributor to OWASP Dependency-Check
     • @stevespringett
  4. Goal: Continuous Security
     • Prerequisites: Standardization, Continuous Integration, Continuous Delivery
     • Complements: Continuous Acceptance
  5. Standardization
     • All projects use the same build system
     • All projects are built the same way
     • Automated onboarding for new projects
     • Per-project build expertise not required
  6. Continuous Integration [diagram: Source Code (SCM) → Continuous Integration Factory → Artifacts, Metrics]
  7. Continuous Delivery [diagram: Artifacts → Continuous Delivery Factory → Deliverables]
  8. Continuous Security [diagram: Source Code (SCM) and Deliverables → Continuous Security Factory → Security Metrics]
  9. Automated Security Metrics
     • Static Analysis Findings
     • Dynamic Analysis Findings
     • Component Analysis Findings
     • Attack Surface Analysis Findings
  10. Continuous Security Pipe [diagram: SCM → Jenkins CI → ThreadFix → Defect Tracker, with false positives flagged]
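The hand-off from Jenkins CI to ThreadFix in this pipeline might look like the following sketch, which only builds the upload request rather than sending it. The `/rest/applications/{id}/upload` endpoint and `apiKey` parameter follow ThreadFix's REST conventions, but treat the exact URL shape and field names here as assumptions.

```python
# Hypothetical sketch: constructing a ThreadFix scan-upload request, as a
# Jenkins job might after a scan completes. Endpoint and parameter names
# are assumptions based on ThreadFix's REST API conventions.
from urllib.parse import urlencode

def build_upload_request(base_url, app_id, api_key, scan_file):
    """Return the URL and multipart form fields for a scan upload."""
    query = urlencode({"apiKey": api_key})
    url = f"{base_url}/rest/applications/{app_id}/upload?{query}"
    # The scan result file is sent as a multipart 'file' field.
    return url, {"file": scan_file}

url, fields = build_upload_request(
    "https://threadfix.example.com", 42, "SECRET-KEY", "zap-results.xml")
```

In practice the returned URL and fields would be passed to an HTTP client (or an equivalent `curl` step in the Jenkins job), and ThreadFix would merge the uploaded scan into the application's existing findings.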
  11. Target Application [diagram]
  12. ThreadFix: Accelerate Software Remediation
     ThreadFix is a software vulnerability aggregation and management system that helps organizations aggregate vulnerability data, generate virtual patches, and interact with software defect tracking systems.
  13. ThreadFix
     • Open Source (MPL) application vulnerability management platform
     • Create a consolidated view of your applications and vulnerabilities
     • Prioritize application risk decisions based on data
     • Translate vulnerabilities to developers in the tools they are already using
  14. ThreadFix Community Edition
     • Main ThreadFix website: general information, downloads
     • ThreadFix GitHub site: code, issue tracking
     • ThreadFix GitHub wiki: project documentation
     • ThreadFix Google Group: community support, general discussion
  15. Vulnerability Aggregation [diagram: automated and manual vulnerability sources feeding a single aggregation point]
  16. Access to Vulnerability Data
     • Tradeoffs:
       – The more places the vulnerability data lives, the more likely a compromise
       – Withholding information from people who need it makes remediation more challenging
  17. Managing All Vulnerability Data
     • Manual activities: Penetration Testing, Code Reviews
     • 3rd-party data sources: Customer-performed Testing, External auditor-performed Results
  18. SSVL and Manual Results
     • SSVL Data Format
     • SSVL Conversion Tool
  19. RESTful API to Vulnerability Data [diagram: custom R&D monitoring dashboard and other custom dashboards consuming the API]
  20. Key Performance Indicators
     • Don't go overboard: use only what is needed
     • Progress and velocity
     • Per-team comparison
     • Min/max/avg time to close, per severity
     • By CWE
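The time-to-close KPI named on this slide can be sketched as a small aggregation over closed findings; the data layout below is illustrative, not a real ThreadFix export.

```python
# Hypothetical sketch: min/max/avg days-to-close per severity, computed
# from finding open/close dates. The record layout is an assumption.
from collections import defaultdict
from datetime import date

def time_to_close_stats(findings):
    """Return {severity: (min_days, max_days, avg_days)} for closed findings."""
    by_sev = defaultdict(list)
    for f in findings:
        days = (f["closed"] - f["opened"]).days
        by_sev[f["severity"]].append(days)
    return {
        sev: (min(d), max(d), sum(d) / len(d))
        for sev, d in by_sev.items()
    }

closed = [
    {"severity": "High", "opened": date(2015, 1, 1), "closed": date(2015, 1, 11)},
    {"severity": "High", "opened": date(2015, 2, 1), "closed": date(2015, 2, 21)},
    {"severity": "Low",  "opened": date(2015, 1, 1), "closed": date(2015, 3, 2)},
]
stats = time_to_close_stats(closed)
# High findings closed in 10-20 days (avg 15); Low in 60 days.
```

The same grouping key could be swapped for team or CWE to produce the per-team and per-CWE comparisons the slide lists.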
  21. Lessons Learned
     • Always automate static analysis
     • Always automate attack surface analysis
     • Always automate component analysis
     • Always automate dynamic analysis
     • Always perform manual dynamic analysis
     • Use native tools & workflow for static analysis
  22. Lessons Learned
     • Provide as much visibility as possible: varying degrees of detail, multiple delivery vehicles
     • Set clear pass/fail criteria for security bars: provide a custom dashboard for status and advance warning
  23. Additional Advice
     • Automation is not better than manual testing: it is faster and more efficient, but both are necessary
     • Don't forget manual assessments: Threat Modeling, Secure Design/Architecture and Code Review, Penetration Testing
  24. Finally: Vulnerabilities in CI/CD/CS Infrastructure
     • Threat Model
     • Secure Architecture Review
     • Patch Management
     • Configuration Management
     • Key Management
     • Always use TLS
  25. Q & A