Thinking Architecturally

Presented by Nate Schutta at the FedEx Cloud-Native Conference in Pittsburgh on July 12th, 2019.

Published in: Software

Thinking Architecturally

  1. 1. Nathaniel T. Schutta @ntschutta ntschutta.io Thinking Architecturally
  2. 2. https://content.pivotal.io/ebooks/thinking-architecturally
  3. 3. Architecting is hard…
  4. 4. Many competing agendas.
  5. 5. Technology changes.
  6. 6. Constantly.
  7. 7. Feature not a bug.
  8. 8. Keeps things interesting…
  9. 9. We want to avoid legacy platforms.
  10. 10. But we can’t change things every few months.
  11. 11. “Our app has 4 different UI frameworks…”
  12. 12. Developers kept chasing the new hotness.
  13. 13. How do we avoid that?
  14. 14. How do we evaluate new technology?
  15. 15. I have no idea what language/framework/platform is “next”.
  16. 16. No one does.
  17. 17. But I can guarantee you this much:
  18. 18. It will be different than what we use today.
  19. 19. Five years from now we will be using something that isn’t invented yet.
  20. 20. Chasing the new thing.
  21. 21. Technology changes.
  22. 22. Constantly.
  23. 23. Tempting to always chase the “new hotness”.
  24. 24. Bleeding edge.
  25. 25. It’s fun!
  26. 26. Part of being in this industry.
  27. 27. Our understanding constantly evolves.
  28. 28. Let’s be honest…
  29. 29. Developers have opinions!
  30. 30. Often *very* strong opinions.
  31. 31. Maybe we fear old things?
  32. 32. Predictable hype cycle.
  33. 33. https://mobile.twitter.com/cote/status/963481741171265537
  34. 34. How do we know where not to use a technology?
  35. 35. Trial and error.
  36. 36. Developers tend to get bored quickly.
  37. 37. Learning keeps it fresh.
  38. 38. But we have to deliver business value.
  39. 39. Can’t do that if we’re always experimenting.
  40. 40. Have to commit at some point.
  41. 41. Develop some expertise.
  42. 42. Bleeding edge… means you will bleed!
  43. 43. https://mobile.twitter.com/joeerl/status/930774515512201216
  44. 44. Pioneers…the ones with arrows in their backs.
  45. 45. What is your strategy?
  46. 46. How do we avoid dead platforms?
  47. 47. Without constantly changing direction?
  48. 48. Strategy.
  49. 49. Hope is not a strategy!
  50. 50. But it is what rebellions are built on.
  51. 51. We need to be deliberate.
  52. 52. There are a lot of bits out there...
  53. 53. New languages, techniques, approaches.
  54. 54. How do you keep up?
  55. 55. Blogs? Books? Twitter? Podcasts? Conferences?
  56. 56. Develop a routine.
  57. 57. Block out Friday afternoon. Tuesday over lunch. Whatever fits.
  58. 58. Consider “morning coffee”.
  59. 59. Take 15-30 minutes in the morning to peruse the tech news.
  60. 60. Before the day gets away from you…
  61. 61. Attention is precious.
  62. 62. — Seth Godin “Attention is a bit like real estate, in that they're not making any more of it. Unlike real estate, though, it keeps going up in value.” http://sethgodin.typepad.com/seths_blog/2011/07/paying-attention-to-the-attention-economy.html
  63. 63. Don’t waste it.
  64. 64. Be selective.
  65. 65. Can’t read it all.
  66. 66. http://www.npr.org/blogs/monkeysee/2011/04/21/135508305/the-sad-beautiful-fact-that-were-all-going-to-miss-almost-everything In fact, you’ll miss almost everything.
  67. 67. We cannot adopt every new thing.
  68. 68. How do we know where to invest our time?
  69. 69. Hacker’s Radar? http://www.paulgraham.com/javacover.html
  70. 70. “I have a hunch that [Java] won't be a very successful language.”
  71. 71. Never written a line of Java, glanced at some books.
  72. 72. Need more than just a hunch.
  73. 73. “Judging Covers” can be a useful filter.
  74. 74. But beware bias.
  75. 75. Where is the community?
  76. 76. Are you skating to where the puck *was*?
  77. 77. Technology Radar. https://www.thoughtworks.com/radar
  78. 78. Remember Google’s 20% time?
  79. 79. Fallen out of favor in some circles…
  80. 80. Innovation Fridays.
  81. 81. Could you carve out Friday afternoons?
  82. 82. How about Tuesday Tech Talks?
  83. 83. Architectural Briefings. https://github.com/stuarthalloway/presentations/wiki/Architectural-Briefings
  84. 84. One person does some research, presents to the team.
  85. 85. And no, you don’t need to be an architect to present!
  86. 86. Why should we use X?
  87. 87. What do you need to know to answer the “why”?
  88. 88. What do you need to know in order to use X?
  89. 89. Keep it short - 45 minutes.
  90. 90. Not a how to.
  91. 91. Beyond the initial documentation.
  92. 92. These are participatory events!
  93. 93. Attendees should be taking notes.
  94. 94. Asking questions.
  95. 95. Using their own experiences.
  96. 96. Do you agree? Why or why not?
  97. 97. By the way, you are up next week…
  98. 98. Pass the briefing filter?
  99. 99. Hands on time.
  100. 100. Workshop it.
  101. 101. Couple of hours.
  102. 102. A few exercises.
  103. 103. Focus on how to, simple setup.
  104. 104. Pass the hands on filter?
  105. 105. Time to trial it in the organization.
  106. 106. Real project work that is a good fit.
  107. 107. Probably not a “bet the company” project though!
  108. 108. The new hotness is not our only concern though.
  109. 109. Need to stay current on the things we are using day in day out.
  110. 110. Oops.
  111. 111. Don’t think you’re a target?
  112. 112. — Justin Smith “At high velocity, the three Rs starve attacks of the resources they need to grow. It’s a complete 180-degree change from the traditional careful aversion to change to mitigate risk. Go fast to stay safer — in other words, speed reduces risk.”
  113. 113. What is your patching strategy?
  114. 114. What version of X are you on?
  115. 115. Some organizations have a policy of N or N-1.
  116. 116. Do they measure it? Do they enforce it?
  117. 117. What needs to change in your culture to stay at N?
  118. 118. What hurts more? Changing your patching strategy?
  119. 119. Or being on the receiving end of the latest “largest hack ever”?
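As a rough illustration of what “measuring” an N or N-1 policy could look like, here is a minimal sketch in Java; the library names, versions, and hard-coded version data are invented placeholders standing in for whatever dependency report or internal registry an organization actually has.

      import java.util.Map;

      // Hypothetical N-1 check: flag anything more than one major version
      // behind the newest available release. In a real pipeline the data
      // would come from a dependency report or an internal registry.
      public class PatchPolicyCheck {

          static final Map<String, Integer> CURRENT_MAJOR = Map.of(
                  "library-x", 2,     // invented versions, for illustration only
                  "framework-y", 8);

          static final Map<String, Integer> LATEST_MAJOR = Map.of(
                  "library-x", 3,
                  "framework-y", 10);

          public static void main(String[] args) {
              CURRENT_MAJOR.forEach((artifact, current) -> {
                  int behind = LATEST_MAJOR.get(artifact) - current;
                  if (behind > 1) {
                      System.out.printf("%s violates N-1: %d major versions behind%n",
                              artifact, behind);
                  }
              });
          }
      }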
  120. 120. Pros and Cons.
  121. 121. Every technical choice involves tradeoffs.
  122. 122. — Susan J. Fowler, Production-Ready Microservices “When we find ourselves presented with technology that promises to offer us drastic improvements, we need to look for the trade-offs.”
  123. 123. Essence of design.
  124. 124. To paraphrase Harry Truman…
  125. 125. Give me a one-handed technologist.
  126. 126. Should we use React or Angular?
  127. 127. Should we refactor to microservices?
  128. 128. Should we be on prem or public cloud?
  129. 129. https://twitter.com/KentBeck/status/596007846887628801
  130. 130. In many cases?
  131. 131. && ! ||
  132. 132. Balancing those opposing forces is the art of architecture.
  133. 133. No tech is perfect, don’t pretend it is.
  134. 134. Acknowledge the negatives.
  135. 135. What do you like about it?
  136. 136. What don’t you like about it?
  137. 137. What would you add?
  138. 138. What would you remove?
  139. 139. King of Java for a day...
  140. 140. https://mobile.twitter.com/kelseyhightower/status/963428093292457984
  141. 141. How does it stack up to alternatives?
  142. 142. The spreadsheet approach.
  143. 143. Options across the top.
  144. 144. Criteria down the left.
  145. 145. Criteria can be weighted.
  146. 146. Harvey balls. http://en.wikipedia.org/wiki/Harvey_Balls
  147. 147. ○ ◔ ◑ ◕ ● (the five Harvey ball states)
  148. 148. How closely does it map to the criteria?
  149. 149. Very effective...
  150. 150. [Table: Angular vs. React scored with Harvey balls on Documentation, Community, Committer diversity, Codebase, Testability, Update history, and Maturity.]
  151. 151. [Table, continued: Stability, Extensibility, Support, Training, Hiring, Corporate fit (marked “?” for both), and Usage.]
  152. 152. What criteria should you use?
  153. 153. How should they be weighted?
  154. 154. Up to you.
  155. 155. You can tip the scales…
  156. 156. Usually backfires.
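A small sketch of the spreadsheet approach as code, in case you want the scoring to be repeatable; every criterion, weight, and score below is a placeholder rather than a real evaluation of any framework, and, as the slides warn, tipping the weights to get the answer you want usually backfires.

      import java.util.LinkedHashMap;
      import java.util.Map;

      // Options across the top, criteria down the left, criteria weighted.
      // Scores run 0-4, loosely matching the five Harvey ball states.
      public class TechScorecard {

          record Criterion(String name, double weight) {}

          public static void main(String[] args) {
              Criterion[] criteria = {
                      new Criterion("Documentation", 3),
                      new Criterion("Community", 2),
                      new Criterion("Testability", 3),
                      new Criterion("Hiring", 1)
              };

              // Placeholder scores for two anonymous options.
              Map<String, int[]> options = new LinkedHashMap<>();
              options.put("Option A", new int[]{3, 4, 3, 2});
              options.put("Option B", new int[]{4, 3, 3, 3});

              options.forEach((name, scores) -> {
                  double total = 0;
                  for (int i = 0; i < criteria.length; i++) {
                      total += criteria[i].weight() * scores[i];
                  }
                  System.out.printf("%s: %.1f%n", name, total);
              });
          }
      }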
  157. 157. Establish principles.
  158. 158. We can’t be everywhere…
  159. 159. We can’t be involved with every decision.
  160. 160. We can establish principles.
  161. 161. Guard rails.
  162. 162. Guide posts.
  163. 163. North stars.
  164. 164. Create the environment within which our projects can thrive.
  165. 165. But how do we know if projects are following our principles?
  166. 166. Fitness functions.
  167. 167. We’re all familiar with the second law of thermodynamics…
  168. 168. Otherwise known as a teenager’s bedroom.
  169. 169. The universe really wants to be disordered.
  170. 170. Software is not immune from this!
  171. 171. We go through the thoughtful effort to establish an architecture…
  172. 172. How do we maintain it?
  173. 173. We can’t spend every minute of every day on every project.
  174. 174. How do we ensure teams continue to make good decisions?
  175. 175. We cannot predict the future.
  176. 176. That’s not entirely true.
  177. 177. One constant - change.
  178. 178. Architecture is often defined as the decisions that are hard to change.
  179. 179. Or the decisions we wish we got right.
  180. 180. But we *know* things will change!
  181. 181. Isn’t this approach anti agile?
  182. 182. Contributing factor to the “we’re agile, we don’t have architects” theory.
  183. 183. You definitely have people making architectural decisions!
  184. 184. Sure hope they are making good ones…
  185. 185. You’ll know in a year or two.
  186. 186. “Our app has 4 different UI frameworks…”
  187. 187. 🤔
  188. 188. What do we do about that?
  189. 189. Maybe we should change our assumptions.
  190. 190. https://mobile.twitter.com/martinfowler/status/949323421619548161
  191. 191. What if our architectures expected to change?
  192. 192. http://evolutionaryarchitecture.com
  193. 193. — Building Evolutionary Architectures “An evolutionary architecture supports guided, incremental change across multiple dimensions.”
  194. 194. Some architectures are more evolvable than others…
  195. 195. http://evolutionaryarchitecture.com
  196. 196. Components are deployed, features are enabled via toggles.
  197. 197. Allows us to change incrementally.
  198. 198. Also perform hypothesis driven development!
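A toy illustration of the previous point in Java, assuming a hypothetical toggles.properties file of flags: the component is deployed once, and which code path runs is decided by configuration, which is also what makes hypothesis-driven experiments possible.

      import java.io.FileInputStream;
      import java.io.IOException;
      import java.util.Properties;

      // Minimal feature-toggle lookup: features ship dark and are switched on
      // via configuration, so deploying a component and releasing a feature
      // become separate events.
      public class FeatureToggles {

          private final Properties flags = new Properties();

          public FeatureToggles(String path) throws IOException {
              try (FileInputStream in = new FileInputStream(path)) {
                  flags.load(in);
              }
          }

          public boolean isEnabled(String feature) {
              return Boolean.parseBoolean(flags.getProperty(feature, "false"));
          }

          public static void main(String[] args) throws IOException {
              FeatureToggles toggles = new FeatureToggles("toggles.properties"); // hypothetical file
              if (toggles.isEnabled("new-checkout-flow")) {                      // hypothetical flag
                  System.out.println("New path enabled for this group.");
              } else {
                  System.out.println("Existing path for the control group.");
              }
          }
      }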
  199. 199. But how do we ensure the architecture still meets our needs?
  200. 200. How do we know if a solution violates part of the architecture?
  201. 201. Fitness functions!
  202. 202. Concept comes from evolutionary computing.
  203. 203. Is this mutation a success?
  204. 204. Are we closer to or further from our goal?
  205. 205. For architecture, it is all about protecting the ilities.
  206. 206. And balancing the tradeoffs.
  207. 207. We want to capture and preserve the key architectural characteristics.
  208. 208. First, we need to identify those key measures for project success.
  209. 209. Service Level Indicators if you will.
  210. 210. What can we measure?
  211. 211. Sometimes we let what we can measure dictate too much…
  212. 212. Just because we can measure it doesn’t mean it matters!
  213. 213. Lines of code anyone?
  214. 214. Once we have our metrics, we can set some goals.
  215. 215. Service Level Objectives.
  216. 216. SLO !== SLA!
  217. 217. Now we can create a fitness function!
  218. 218. Basically, a set of tests we execute to validate our architecture.
  219. 219. How close does this particular design get us to our objectives?
  220. 220. Ideally, all automated. But we may need some manual verifications.
  221. 221. For example…
  222. 222. All service calls must respond within 100 ms.
  223. 223. Cyclomatic complexity shall not exceed X.
  224. 224. Hard failure of an application will spin up a new instance.
  225. 225. https://github.com/Netflix/SimianArmy
  226. 226. Chaos Engineering. https://medium.com/production-ready/chaos-monkey-for-fun-and-profit-87e2f343db31
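The first example above, every service call answering within 100 ms, is the kind of atomic fitness function that can run straight from the test suite; the endpoint URL here is an assumption, and a real check would measure many calls rather than a single request.

      import java.net.URI;
      import java.net.http.HttpClient;
      import java.net.http.HttpRequest;
      import java.net.http.HttpResponse;
      import java.time.Duration;

      // Atomic fitness function: fail loudly if a service call misses the
      // agreed response-time objective.
      public class ResponseTimeFitnessFunction {

          private static final Duration OBJECTIVE = Duration.ofMillis(100);

          public static void main(String[] args) throws Exception {
              HttpClient client = HttpClient.newHttpClient();
              HttpRequest request = HttpRequest.newBuilder()
                      .uri(URI.create("http://localhost:8080/orders/health")) // hypothetical endpoint
                      .build();

              long start = System.nanoTime();
              HttpResponse<Void> response =
                      client.send(request, HttpResponse.BodyHandlers.discarding());
              Duration elapsed = Duration.ofNanos(System.nanoTime() - start);

              if (response.statusCode() != 200 || elapsed.compareTo(OBJECTIVE) > 0) {
                  throw new AssertionError("Fitness function violated: "
                          + elapsed.toMillis() + " ms (objective 100 ms)");
              }
              System.out.println("Within objective: " + elapsed.toMillis() + " ms");
          }
      }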
  227. 227. Fitness functions remind us what is important in our architecture.
  228. 228. Informs our thinking about tradeoffs.
  229. 229. Different categories of fitness functions.
  230. 230. Atomic vs. Holistic.
  231. 231. Some characteristics must be tested in isolation…others cannot.
  232. 232. Holistic fitness functions test combined features.
  233. 233. We can’t test every possible combination!
  234. 234. Must be selective, driven by the value of the architectural characteristic.
  235. 235. Triggered vs. Continual.
  236. 236. Must consider frequency of execution.
  237. 237. Fitness functions can be triggered by something - checkin, QA pass…
  238. 238. Continual tests are just that.
  239. 239. Monitoring Driven Development! http://benjiweber.co.uk/blog/2015/03/02/monitoring-check-smells/
  240. 240. Static vs. Dynamic.
  241. 241. Static tests have a fixed result - they either pass or they fail.
  242. 242. Nearly any test based on a metric.
  243. 243. Other fitness functions have a shifting definition of success.
  244. 244. Generally defined within a range of acceptable outcomes.
  245. 245. Automated vs. Manual.
  246. 246. Automation is good!
  247. 247. Ideally most of our fitness functions will live in our deployment pipeline.
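One concrete way to put a structural fitness function into the pipeline, assuming a Java codebase, is a library such as ArchUnit, where a rule is just another JUnit test; the package names below are placeholders.

      import com.tngtech.archunit.core.domain.JavaClasses;
      import com.tngtech.archunit.core.importer.ClassFileImporter;
      import com.tngtech.archunit.lang.ArchRule;
      import org.junit.jupiter.api.Test;

      import static com.tngtech.archunit.lang.syntax.ArchRuleDefinition.noClasses;

      // Structural fitness function: runs on every build, so a violation of the
      // layering principle fails the pipeline instead of surfacing a year later.
      class LayeringFitnessFunctionTest {

          @Test
          void servicesDoNotDependOnTheWebLayer() {
              JavaClasses classes =
                      new ClassFileImporter().importPackages("com.example.app"); // placeholder package

              ArchRule rule = noClasses().that().resideInAPackage("..service..")
                      .should().dependOnClassesThat().resideInAPackage("..web..");

              rule.check(classes);
          }
      }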
  248. 248. Not everything is amenable to automation though…
  249. 249. Legal.
  250. 250. Existing projects.
  251. 251. Temporal fitness functions.
  252. 252. Essentially a reminder.
  253. 253. Check for an upgrade of library X.
  254. 254. Break upon upgrade tests.
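A temporal fitness function can be as blunt as a test that starts failing after an agreed review date, nagging the team to revisit a pinned dependency; the date and library name below are invented for the sketch.

      import java.time.LocalDate;

      // Temporal fitness function: a deliberate alarm clock in the build.
      // Once the review date passes, the build breaks until someone checks
      // for an upgrade of the pinned library.
      public class TemporalFitnessFunction {

          private static final LocalDate REVIEW_BY = LocalDate.of(2020, 1, 31); // invented date
          private static final String PINNED = "legacy-parser 1.4";             // invented library

          public static void main(String[] args) {
              if (LocalDate.now().isAfter(REVIEW_BY)) {
                  throw new AssertionError("Time to check for an upgrade of " + PINNED);
              }
              System.out.println(PINNED + " is still inside its review window.");
          }
      }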
  255. 255. Clearly we want to identify fitness functions as early as we can.
  256. 256. The discussion about the tradeoffs is invaluable to our understanding.
  257. 257. Help us prioritize features.
  258. 258. May lead us to break a system up to isolate certain features.
  259. 259. We can’t know everything up front.
  260. 260. Fitness functions will emerge as the system changes.
  261. 261. But we should strive to identify as many as we can up front.
  262. 262. We can also classify fitness functions.
  263. 263. Key - critical decisions.
  264. 264. Relevant - considered but unlikely to influence the architecture.
  265. 265. Not Relevant - won’t impact our decisions.
  266. 266. Can still be very useful to identify the non relevant dimensions!
  267. 267. Keep fitness functions visible!
  268. 268. Need to review the fitness functions.
  269. 269. Are they still relevant?
  270. 270. Are there new dimensions we need to track?
  271. 271. Are there better ways of measuring/testing our current fitness functions?
  272. 272. Aim for at least an annual review.
  273. 273. Architecting is hard…
  274. 274. We have a lot to juggle!
  275. 275. Important that we think strategically.
  276. 276. We can’t afford Resume Driven Design.
  277. 277. Good luck!
  278. 278. Questions?
  279. 279. Nathaniel T. Schutta @ntschutta ntschutta.io Thanks! I’m a Software Architect, Now What? with Nate Schutta Modeling for Software Architects with Nate Schutta Presentation Patterns with Neal Ford & Nate Schutta
