
Cybersecurity Strategies - time for the next generation


In this talk, presented in June 2016 at KAIST, I argue that it is time for the next generation of cybersecurity strategies. These must have a governance focus, and be based on international laws, declarations and agreements, basic internet rights and public good provisions.




  1. National Cyber Security Strategies: a contradiction in terms?
  2. root@myops:~# whoami • Theoretical chemist and philosopher by training (PhD 1993 and 2012) • Wrote DALTON program code [in FORTRAN] • Played with supercomputers such as Cray Y-MP • First got hacked in 1991 • Worked 15 years as IT Infrastructure architect for various NZ companies • Now lead the IT Security team @UoA by day • Lecture in cyber security at Unitec and UoA • Present at technical cyber security conferences
  3. root@myops:~# whoami > graphic
  4. Security trainwreck: tech pre-conditions Eternal economic disincentives to build better security in: 1. Rapid consumerisation, hence feature-driven development 2. Time- and cost-driven market model (lowering quality) 3. Rapid development cycles and an ‘unstable’ (i.e. rapidly evolving and incompatible between versions) technology stack requiring rapid re-engineering of key components With IoT, to make it worse, these disincentives are meeting: 4. Long expected lifetimes
  5. Examples: Why this is important How secure is your tech? What does that mean exactly? How secure is your data in Google, Facebook, LinkedIn, WhatsApp and others? Does the NZ prime minister really understand the difference between bulk collection (which he admits) and mass surveillance (which he denies)? Can Donald Trump really ‘fix the internet’ by calling Bill Gates for advice on how to ‘close some parts off’?
  6. Put this on twitter: look at the screen
  7. Contents 1. Nations, states, security, cyber security, privacy and the stacks 2. The Snowden effect and the Snowden paradox 3. Why raw ‘freedom’ fails 4. At what level should the cyber [in]security problem be tackled? 5. A social philosophy of cyber security 6. A national cyber security strategy
  8. Nations, states, security, cyber security, privacy and the stacks
  9. What is a ‘nation’? There are many dimensions • Economic • Social • Legal • Historical • Geopolitical • Population
  10. Origin of nationhood Crisis of the seventeenth century • Thirty Years’ War in Germany (1618–1648) • Dutch revolt (1568–1648) • Military revolution Treaty of Westphalia (1648) Social philosophy (Hobbes, Locke)
  11. Westphalian principles of nationhood Westphalian sovereignty • Territorial authority • Religious tolerance • Non-interference in another state’s internal affairs • Equality on the international scene
  12. A new player in nationhood At Webstock 2013 Bruce Sterling defined what he calls ‘The Stacks’: a new type of corporation that uses lock-tight integration of hardware and software to form a branded ecosystem
  13. ‘Stacks’: GOOG, AAPL, FB, MSFT, TWTR • Size: very large, both in terms of employees and users • Vertically integrated global software structures used by millions • Proprietary OSes AND devices AND large server farms AND a loyal user base AND a proprietary revenue stream AND (sometimes) their own money • The internet of old had users; the stacks have livestock • Advertising as a revenue model depends on surveillance • Stacks have social networks and buy IoT / VR companies • Livestock security can be detrimental to revenue models
  14. Are the ‘stacks’ nations? • Google, Apple, Facebook now larger in turnover than the GDP of many small nations • Each has its own ‘cloud’ – i.e. ‘informational territory’ • The military now talk about ‘cyber’ as the ‘fifth domain’ (i.e. theatre of war) after land, water, air and space • You usually sign a EULA and privacy statement – some kind of fundamental human right as a citizen of that ‘state’? • They buy hardware companies at a rapid clip: Nest, Oculus Rift, Kinetic
  15. The Snowden effect and the Snowden paradox
  16. ‘Snowden effect’ The increase in public concern about surveillance, data privacy and information security resulting from the Snowden revelations Some comments • Most security professionals suspected this sort of capability in our agencies all along; we just had no proof • Laws are now being changed around the globe (which is a good thing and a bad thing) • Change in how some of the major cloud providers implement security
  17. Snowden effect An illustration: as a result of this, the most famous Post-it note ever, Google now encrypts its data centre traffic
  18. The Snowden paradox The public professes to be worried about issues of privacy and security, yet signs up en masse for services that 1. Are built upon surveillance as a business model, rather than an incidental feature 2. Have a EULA subject to unilateral change as the basic protection of ‘human rights’ 3. Practice widespread censorship and tax evasion 4. Are predicated on a business model where their users are the product
  19. Why ‘freedom’ fails
  20. What is freedom? Different interpretations in different domains Roosevelt’s four freedoms could be seen as what is required of a nation: 1. Freedom of speech and expression 2. Freedom of [religion] 3. Freedom from want 4. Freedom from fear F. D. Roosevelt, State of the Union Address to Congress, January 6, 1941
  21. The typical ‘NSA spies on us’ talk • The telephone system was designed to be intercepted – this was bad • Then the internet created freedom – this was good • Then the NSA was afraid of the internet ‘going dark’ • And started listening to everything • Then we were freed by Snowden • And encrypted everything • The progress of encrypted communications is now generating a wider political crisis • Last words of the NSA: I’ll be back
  22. It makes for nice quotes “Whatever else, history will record of them that they didn't think long before acting. Presented with a national calamity which also constituted a political opportunity, nothing stood between them and all the mistakes that haste can make for history to repent at leisure. And what they did, of course—in secret, with the assistance of judges chosen by a single man operating in secrecy, and with the connivance of many decent people who believed themselves to be doing the only thing that would save the society—was to unchain the listeners from law.”
  23. But… there is more at stake Surveillance and espionage have always been a legitimate, though somewhat murky, state function • Subject to political control and governance • With generally a separation between intelligence and counter-intelligence With the ‘stacks’ we now get the first ‘states’ whose model is entirely predicated on mass surveillance and monetisation of that data • Without such controls and separations • And with a EULA as your basic human rights
  24. Example: there is more at stake • Tapping the phone system required a warrant, which had to be acquired by a legal process • Bulk collection of data (i.e. actual conversations) will never get a warrant • Bulk collection of metadata doesn’t need one • Our politicians have a poor understanding of the issues • ‘States’ based explicitly on monetisation of surveillance data do not have any policies and controls on the data they hoover up • And these ‘states’ like a monopoly on that data
  25. Raw ‘freedom’ and freedom • Of the ‘four freedoms’ only two are provided by the stacks, and then only in limited form (‘speech and expression’ and ‘religion’) • Some stacks now have censorship • Their international practices mean the stacks are detrimental to the remaining two freedoms • Tax evasion erodes ‘freedom from want’ in many nations • Secret backroom deals (though not called ‘diplomacy’) erode ‘freedom from fear’ in many others
  26. A sense of the problems • States cannot just help themselves indiscriminately to data • Disruptive changes to data environments make legal overhaul inevitable • Politicians are incapable of exerting effective governance and controls on the ‘stacks’, though they can do local deals • The ‘stacks’ should be included in such discussions, but how? • Of the ‘four freedoms’ only two are provided by the stacks (‘speech and expression’ and ‘religion’) • Can philosophers help? Yes, but they haven’t been very helpful so far
  27. At what level should the cyber [in]security problem be tackled?
  28. Where philosophers go wrong… • In philosophy education, a general disconnect between history and philosophy • We see ‘bad behaviour’ as a personal problem, not as a systemic problem (lacking a sociological point of view) Hence • Cyber security is seen as a problem of ‘hackers’ lacking ethical behaviour
  29. Is cyber insecurity an ethical problem? Let’s suppose cyber insecurity is primarily an issue of failing personal ethics. Then we have three very big problems: 1. The principles problem 2. The actor / attribution problem 3. The implementation problem
  30. 1. The principles problem What is ‘ethical’?
  31. What principles? Two unsatisfactory answers: Answer 1: Turn the question around • What sort of principles would make ‘hacking’ wrong? • Are these the right ones? Answer 2: Look at ‘environmental ethics’ • (e.g. Floridi) All binary data has inherent rights
  32. What makes ‘hacking’ ‘wrong’? Why do people hack? Are all these motivations ‘wrong’? • Curiosity • Intellectual property • Defacements and activism • Thieving • Scamming • Spying • Sabotage • The ‘Fifth Domain’ (i.e. ‘war’)
  33. ‘Environmental’ ethics [Floridi] An information ‘environment’ with global principles • entropy ought not to be caused in the infosphere (null law); • entropy ought to be prevented in the infosphere; • entropy ought to be removed from the infosphere; • the flourishing of informational entities as well as of the whole infosphere ought to be promoted by preserving, cultivating and enriching their properties Where ‘entropy’ is information destruction or corruption
  34. A cyber basic set of rights? • Could we define an ‘ethical actor’ – i.e. an entity with duties, obligations and ethical demands? For that we need a basic set of rights • The problem with EULAs • The price we pay for insisting on more privacy
  35. The EULA Is a EULA sufficient protection? One AV company (F-Secure) decided to find out. They let people sign the EULA on the left to get free WiFi. Spot the ‘Herod clause’. Six people signed up. …014/sep/29/londoners-wi-fi-security-herod-clause
  36. The Google ‘opt out’ village (The Onion)
  37. 2. The actor / attribution problem In many / most cases of digital evil, ‘whodunit’ is difficult or fundamentally impossible Secrecy and security requirements are a large barrier to open communication Thomas Rid: attribution “is what states make of it” – i.e. a complex political process of negotiation Example: did NORK hack Sony and SWIFT?
  38. Attribution example 1 Who did this? This is the well-known ‘CryptoLocker’ ransomware Encrypts your files, then asks for a ransom Payment instructions on an onion-routed website Payment is in bitcoin
  39. Attribution example 2 And its ugly cousin ‘Petya’, same idea
  40. Assumptions of the cyber attribution problem Three assumptions (all limited and insufficient) 1. Attribution is one of the most intractable problems of an emerging field, created by the underlying technical architecture and geography of the Internet 2. A binary view on attribution: for any given case, the problem can either be solved, or not be solved 3. Attributive evidence is readily comprehensible; the main challenge is finding the evidence itself, not analysing, enriching, and presenting it Thomas Rid (The Journal of Strategic Studies, 2015, 38(1–2), 4–37)
  41. The Q model
  42. The attribution problem (Bruce Schneier) Is attribution intractable? Not really, but there is a difference between 1. I know you did it 2. I can prove to you that I know you did it 3. I can prove publicly that I know you did it 4. I can prove in a court of law that you did it The scope and size of attribution depends on what you want from it
  43. Attribution is an intelligence problem • Intelligence (or INTEL) is someone’s interpreted view of the world • Intelligence gathering follows an intelligence cycle • Intelligence is information which is analysed, enriched, and presented • Where does that ‘intelligence’ sit on the ladder of Schneier’s four escalating questions? • What does the data look like?
  44. Threat Intel Cycle / Pyramid of Pain Direction → Collection → Collation → Analysis → Reporting
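The cycle on this slide can be sketched as a simple pipeline. Only the five stage names come from the slide; the data, sources and function bodies below are hypothetical placeholders, not a real intelligence tool:

```python
# Illustrative sketch of the intelligence cycle as a pipeline of stages.
# Stage names (Direction, Collection, Collation, Analysis, Reporting) are
# from the slide; everything else is an assumed placeholder.

def direction(requirement):
    """Decide what question to answer and which sources to task."""
    return {"requirement": requirement, "sources": ["netflow", "dns-logs"]}

def collection(tasking):
    """Gather raw observations from each tasked source."""
    return [{"source": s, "data": f"raw events from {s}"} for s in tasking["sources"]]

def collation(raw):
    """Organise the raw take by source (deduplication would happen here)."""
    return {item["source"]: item["data"] for item in raw}

def analysis(collated):
    """Enrich each observation with an analyst's judgement."""
    return [f"{src}: {data} (assessed: relevant)" for src, data in sorted(collated.items())]

def reporting(findings):
    """Present the analysed intelligence back to the requester."""
    return "INTEL REPORT\n" + "\n".join(findings)

report = reporting(analysis(collation(collection(direction("who scanned us?")))))
print(report)
```

The point of the pipeline shape is the slide's own argument: raw data only becomes "intelligence" after it has been analysed, enriched and presented.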
  45. 3. The implementation problem • How do we enforce our cyber security in practice? • Governance models on the internet?
  46. Who is responsible? Roles of the state • Market oversight: regulators, correcting market failures • Law enforcement: police and prosecutors, fighting cybercrime • National security: civil protection authorities, protecting critical infrastructures • National defence: military and intelligence agencies, executing military and intelligence operations
  47. My terrorist, your freedom fighter, and a government as well ]Hacking Team[ was a somewhat odious outfit supplying surveillanceware to dubious governments around the globe They were comprehensively hacked and all their data was stolen
  48. Cyber insecurity is not an ethical problem • No broadly supported principles • No actor • No implementation What about criminals then? • Criminals are unethical because they display criminal behaviour (i.e. stealing, lying, cheating), not because they are hackers
  49. A social philosophy of cyber security
  50. A social philosophy of cyber security More big problems, but ones we can solve more readily • Is a cyber social contract possible? • If so, what would make it up? • Should cyber security be a public good? [i.e. open to all if it is provided for any members of a group] • Should ‘stacks’ be regulated? Note and comment: at the moment nation states can’t even get the stacks to pay their taxes, so good luck with the rest
  51. Social contract theory [general structure] Look at it as a philosopher of science would Go from the ‘atomic’ to the ‘aggregate’ state 1. Postulate a ‘state of nature’ (hypothetical, but not always) 2. Postulate a set of atomic and universal rights and obligations 3. Stipulate the conditions for a contract discussion [discursive model] 4. Stipulate participant trade-offs [game theory or other] 5. Simulate the discussion 6. Formulate the outcome: a ‘just’ arrangement
  52. Social contract theory [historical structure] • Hobbes and the Civil War / Restoration • Locke and the Glorious Revolution • Rawls and the welfare state Contract modelling is influenced by historical conditions
  53. The ‘original hacker’ contract? 1. ‘State of nature’ A large unregulated internet 2. Rights and obligations The four freedoms of open source 3. Rationality model Make it work 4. Trade-off Let’s share everything 5. Simulate Information wants to be free 6. Outcome Privacy is evil, the ‘Circle’
  54. Minimal ‘stacks’ contract 1. ‘State of nature’ A collection of walled gardens 2. Rights and obligations Sign some EULA / privacy statement 3. Rationality model It ‘just works’ 4. Trade-off Convenience for me, data for you 5. Simulate Convenience is noticeable, surveillance invisible (i) I like convenience (ii) I ignore what I can’t see 6. Outcome Welcome to ‘our community’
  55. A (perhaps) desirable contract 1. ‘State of nature’ A large open internet 2. Rights and obligations Rights of basic protocols, distributed trust anchors 3. Rationality model Interoperability 4. Trade-off Privacy and integrity over convenience 5. Simulate Iteration of trust anchors 6. Outcome ‘Aware digital presence’
  56. A brief word on distributed trust anchors • Tracers and tethers • A tether is a verifiable trust anchor • A tracer is its ongoing certification • Blockchain and others These are technical solutions to a political problem: the risk at the moment is that states look at the internet from the starting point of national security and are willing to trade network public health for national security
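The tether/tracer idea can be illustrated with a minimal hash chain. This is a sketch only: the talk names the concepts (a tether as verifiable anchor, a tracer as its ongoing certification) but not an implementation, so the class, field names and example data below are my own assumptions:

```python
import hashlib
import json

def digest(payload: dict) -> str:
    """Canonical SHA-256 digest of a record (sorted keys for stability)."""
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

class Tether:
    """A verifiable trust anchor: a genesis record plus a chain of certifications."""

    def __init__(self, identity: str):
        self.records = [{"seq": 0, "identity": identity, "prev": None}]

    def certify(self, claim: str) -> dict:
        """Tracer step: append a certification linked to the previous record's digest."""
        prev = self.records[-1]
        record = {"seq": prev["seq"] + 1, "claim": claim, "prev": digest(prev)}
        self.records.append(record)
        return record

    def verify(self) -> bool:
        """Re-walk the chain: each record must reference its predecessor's digest."""
        for earlier, later in zip(self.records, self.records[1:]):
            if later["prev"] != digest(earlier):
                return False
        return True

anchor = Tether("example.org")            # hypothetical identity
anchor.certify("key rotated 2016-06-01")  # ongoing certification = the 'tracer'
anchor.certify("audit passed")
print(anchor.verify())                    # True: chain intact
anchor.records[1]["claim"] = "tampered"
print(anchor.verify())                    # False: tampering breaks the chain
```

In a distributed setting the records would also be signed and replicated across independent parties (as in a blockchain), so that no single stack or state could quietly rewrite the history.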
  57. A national cyber security strategy
  58. With all of this • It’s no surprise that national cyber security strategies are a bit of a muddle • In most countries, strategies are now in their second generation • The first generation acknowledged the existence of the problem • The second generation has some sort of remediation / resilience focus • We need a third generation with a governance focus
  59. First generation of strategies These generally recognise the existence of the problem and try to raise awareness Example: New Zealand’s policy from 2011. Its objectives are to • raise the cyber security awareness and understanding of individuals and small businesses; • improve the level of cyber security across government; and • build strategic relationships to improve cyber security for critical national infrastructure and other businesses
  60. Second generation of strategies What usually gets addressed (NZ, UK, NL) is 1. National resilience 2. Crime 3. Diplomatic relations and cooperation 4. Capability What is usually not explicitly addressed is the tension between the nation and the ‘stacks’
  61. Example The Dutch Cyber Security Assessment does a very good job of reporting against a ‘second generation’ cyber security strategy on an ongoing basis It is available in Dutch and English Reports have been produced every year since 2012
  62. Third generation of strategies Without wanting to run ahead: a third generation is needed which has a governance focus, possibly with binding laws, basic internet rights and public good provisions We have a precedent: Grotius’ Mare Liberum (1609) covered governance of the ‘second domain’ in an era of rapid naval expansion in Western Europe For us as philosophers: ‘ethics’ is not going to solve the governance problem; we need an alternative!
  63. Conclusions • We have built a new domain of ‘faulty tech’ • This domain is a social domain, not a personal domain • Exploits of faulty tech are not necessarily an ethical failure • Philosophical thinking on the issue should be based on a social philosophy, not on ethics • Most countries now have second generation strategies • We need a third generation based on governance, fundamental rights and public good provisions, one which includes the large tech companies as states of their own – i.e. as diplomatic efforts
  64. Questions?

Editor’s notes

  • Many insurance companies now offer discounts to customers who agree to wear a fitness-tracking device and whose data shows an active lifestyle. Unfit Bits is basically a way of fooling this.