Caching for Microservices Architectures: Session I

13,216 views

In this 60-minute webinar, we will cover the key areas of consideration for data layer decisions in a microservices architecture, and how a caching layer satisfies these requirements. You’ll walk away from this webinar with a better understanding of the following concepts:

- How microservices are easy to scale up and down, so both the service layer and the data layer need to support this elasticity.
- Why microservices simplify and accelerate the software delivery lifecycle by splitting up effort into smaller isolated pieces that autonomous teams can work on independently. Event-driven systems promote autonomy.
- Where microservices can be distributed across availability zones and data centers to address performance and availability requirements. Similarly, the data layer should support this distribution of workload.
- How microservices can be part of an evolution that includes your legacy applications. Similarly, the data layer must accommodate this graceful on-ramp to microservices.

Presenter: Jagdish Mirani is a Product Marketing Manager in charge of Pivotal’s in-memory products

Published in: Technology

Caching for Microservices Architectures: Session I

  1. Caching for Microservices Architectures: Introducing Pivotal Cloud Cache (Jagdish Mirani)
  2. Why Microservices Architectures Need a Cache: Autonomy, Availability, Cost Reduction, Performance
  3. Why Microservices Architectures Need a Cache: Autonomy, Availability, Cost Reduction, Performance
  4. Performance Drivers for Modern Applications: the ability to handle a large number of concurrent requests
▪  More users of new mobile and web applications
▪  Users expect real-time response, even during peak usage
▪  Increasing number of requests from other applications
▪  New use cases from new data sources, ex: IoT and streaming data
▪  Scaling the application logic results in the need for scaling the data access layer
  5. Caches Provide Blazing Fast Performance
▪  Memory is orders of magnitude faster than disk
▪  Caches can present a structural view of data optimized for performance
▪  Maximizing cache hits
-  Preloading cache (cache warming)
-  Expiration and eviction
•  Application driven
•  Time based
•  Notifications and events
(Diagram: microservice instance → cache → database)
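The look-aside flow in the diagram above, written out as a minimal Java sketch. The Region handle stands in for a PCC/Geode region acting as the cache; ProductRepository and Product are hypothetical names introduced for illustration. A declarative Spring version of the same pattern appears with slide 23.

```java
import org.apache.geode.cache.Region;

public class ProductLookup {

  private final Region<String, Product> cache;   // PCC/Geode region used as the cache
  private final ProductRepository repository;    // hypothetical database access object

  public ProductLookup(Region<String, Product> cache, ProductRepository repository) {
    this.cache = cache;
    this.repository = repository;
  }

  // Look-aside read: try the cache first, fall back to the database on a miss,
  // then populate the cache so the next read is a hit.
  public Product findProduct(String sku) {
    Product product = cache.get(sku);
    if (product == null) {                  // cache miss
      product = repository.findBySku(sku);  // slow path: backing store
      if (product != null) {
        cache.put(sku, product);            // warm the cache for subsequent reads
      }
    }
    return product;
  }
}
```

Cache warming, expiration, and eviction from the bullets above are not shown here; they would typically be configured on the region itself (time based or event driven) rather than coded in the application.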
  6. Microservices Need Performance and Scalability: externalizing state is a requirement for microservice instances to scale
▪  Externalize microservices state for performance and scalability of the business logic
-  Store application state information in cache for fast retrieval
-  Adheres to 12-factor principles
▪  Dynamically change the number of application instances without losing state information
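As a rough illustration of externalized state (not taken from the deck), the sketch below keeps a cart-style state object in a server-side region so any instance of the microservice can read what another instance wrote. The locator address, the cart region name, and the CartState class are assumptions made for the example; with PCC the connection details would come from the bound service’s credentials.

```java
import org.apache.geode.cache.Region;
import org.apache.geode.cache.client.ClientCache;
import org.apache.geode.cache.client.ClientCacheFactory;
import org.apache.geode.cache.client.ClientRegionShortcut;

public class CartStateStore {

  private final Region<String, CartState> carts;   // CartState is a hypothetical serializable class

  public CartStateStore() {
    // Connect to the cache cluster; host and port are placeholders.
    ClientCache cache = new ClientCacheFactory()
        .addPoolLocator("locator.example.com", 10334)
        .create();
    // PROXY keeps no local copy: every get/put goes to the servers,
    // so all app instances observe the same state.
    this.carts = cache
        .<String, CartState>createClientRegionFactory(ClientRegionShortcut.PROXY)
        .create("cart");
  }

  public void save(String sessionId, CartState state) {
    carts.put(sessionId, state);    // state survives this instance being scaled away
  }

  public CartState load(String sessionId) {
    return carts.get(sessionId);    // any other instance can pick the state back up
  }
}
```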
  7. Microservices Need Performance and Scalability: microservices with large, frequently accessed data sets need a cache layer
Performance and scalability of data:
▪  Add servers to a shared cluster
▪  Reduces the pressure to scale rigid backing stores
▪  Enables availability and resilience
  8. Why Microservices Architectures Need a Cache: Autonomy, Availability, Cost Reduction, Performance
  9. Team Autonomy Equates to Velocity: fosters an agile, dynamic application culture
▪  Separate development and release cycles
-  Evolve each microservice independently
-  Independent development, test, production cycles
-  Continuous integration, continuous delivery
▪  Independent technology decisions, including data layer
-  Polyglot persistence
-  Independent data model decisions
▪  Changes should be non-breaking for other teams and microservices
  10. Autonomy in the Context of the Data Layer: extreme ends of the data sharing continuum present challenges
(Diagram: a continuum from Shared Database, which offers no autonomy and couples development and runtime, to Database Per Service, which is autonomous but brings distributed workload and data management challenges)
  11. Data APIs Present Autonomous Views of Data
●  Define a Data API that projects a data model to match the needs of the consuming microservices
●  Data API is the access point to a microservice that’s primarily responsible for accessing data
●  Data API provides a contract for accessing data
●  Teams create data caches optimized for their microservices
●  Allows more flexibility for (and isolation from) changes to backing stores
Caches provide data for each autonomous view
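One way to picture such a Data API is as a thin HTTP facade whose projection is served from the cache. The sketch below is illustrative only; CustomerView, CustomerViewService, and the /customers path are hypothetical names, not anything prescribed by the deck.

```java
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

// The Data API is the contract: consumers see only the projected model
// (CustomerView), never the backing store's schema.
@RestController
@RequestMapping("/customers")
public class CustomerDataApi {

  private final CustomerViewService views;   // hypothetical service backed by the team's cache

  public CustomerDataApi(CustomerViewService views) {
    this.views = views;
  }

  // A projection tailored to the consuming microservices; the cache behind
  // CustomerViewService isolates them from changes to the backing stores.
  @GetMapping("/{id}")
  public CustomerView getCustomer(@PathVariable String id) {
    return views.findById(id);
  }
}
```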
  12. Versioned APIs Facilitate Change Management: Data APIs evolve in support of the evolution of the microservice(s)
▪  Analogous to the notion of versioned microservices
▪  Parallel deployment of versions (V1, V2) creates the possibility of a managed evolution
▪  Allows for data transformations within the microservice as an alternative to changing the backing store(s)
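Continuing the hypothetical CustomerView example, a sketch of how two API versions might be deployed in parallel: V1 keeps serving the original projection, while V2 transforms the cached data inside the microservice instead of requiring a change to the backing store. CustomerViewV2 and its from(...) mapping are invented for illustration.

```java
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/v1/customers")
class CustomerApiV1 {

  private final CustomerViewService views;   // hypothetical, cache-backed

  CustomerApiV1(CustomerViewService views) {
    this.views = views;
  }

  @GetMapping("/{id}")
  CustomerView get(@PathVariable String id) {
    return views.findById(id);   // original projection, unchanged for existing consumers
  }
}

@RestController
@RequestMapping("/v2/customers")
class CustomerApiV2 {

  private final CustomerViewService views;

  CustomerApiV2(CustomerViewService views) {
    this.views = views;
  }

  @GetMapping("/{id}")
  CustomerViewV2 get(@PathVariable String id) {
    // Transform inside the microservice rather than changing the backing store,
    // so V1 and V2 consumers can migrate on their own schedules.
    return CustomerViewV2.from(views.findById(id));
  }
}
```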
  13. Caching Can Present an Autonomous View of Data
●  Provides a surface area to:
○  implement access control
○  implement throttling
○  perform logging
○  enforce other policies
Teams create data caches optimized for their microservices
  14. Caching Can Present an Autonomous View of Data
●  Data APIs project a bounded context
○  Each bounded context has a single, unified model
○  Relationships between models are explicitly defined
○  Teams are typically given full responsibility over one or more bounded contexts
  15. Why Microservices Architectures Need a Cache: Autonomy, Availability, Cost Reduction, Performance
  16. Large Number of ‘Moving Parts’: several points of failure
▪  Single request can touch several components: servers, distinct clusters, microservice instances
▪  Availability zones can fail
▪  Regions can become unstable
▪  Network is unreliable
▪  Cloud native architecture components are ephemeral by design
-  Instances added and removed dynamically
Enterprise readiness requires the ability to tolerate failures
  17. Highly Available Caching Layer Offers Protection
●  Cache serves as the ‘primary’ data store for the application
●  High availability: copy data for failure protection
●  Immune to lapses in backing store availability
●  Backing stores kept up-to-date through the cache
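The last bullet (backing stores kept up-to-date through the cache) is commonly implemented as write-behind in the GemFire/Geode technology underlying PCC: an AsyncEventListener receives batches of cache updates and pushes them to the database. The sketch below shows only the listener; attaching it to a region via an async event queue is server-side configuration, OrderRepository and Order are hypothetical, and whether this option is exposed in a given PCC plan varies (slide 23 lists inline caching as a future option).

```java
import java.util.List;
import org.apache.geode.cache.asyncqueue.AsyncEvent;
import org.apache.geode.cache.asyncqueue.AsyncEventListener;

// Write-behind listener: the cache stays the application's 'primary' store,
// while batches of updates flow asynchronously to the backing database.
public class BackingStoreWriter implements AsyncEventListener {

  private final OrderRepository repository = new OrderRepository();   // hypothetical DAO

  @Override
  public boolean processEvents(List<AsyncEvent> events) {
    for (AsyncEvent event : events) {
      // Each event carries the key and value of a cache create/update.
      repository.upsert((String) event.getKey(), (Order) event.getDeserializedValue());
    }
    return true;   // true = batch handled; false = the batch is retried
  }

  @Override
  public void close() {
    // release database resources if needed
  }
}
```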
  18. Why Microservices Architectures Need a Cache: Autonomy, Availability, Cost Reduction, Performance
  19. Legacy Application Infrastructures: expensive and brittle
▪  High startup costs
▪  Steep pricing curve for adding capacity
-  Mainframe MIPS pricing
-  Legacy RDBMS data stores are expensive to scale
▪  Complex deployments
▪  Easily disrupted
▪  Points of failure
▪  Scalability bottlenecks
  20. Legacy Modernization is Key to Success
(Diagram: ROI funds transformation; existing workloads are replatformed to run on PCF, modernized, or migrated, while new initiatives are built cloud native for PCF)
  21. Legacy Systems: Part of a Cloud Native Evolution
Create microservices around the edges of the legacy system:
●  A caching layer mediates between the old and the new
●  Optionally, re-platform the legacy application
●  Optionally, reduce reliance on the legacy application over time
(Diagram: microservices alongside legacy middleware, a legacy application, and a monolithic application)
  22. Introducing Pivotal Cloud Cache
  23. Prepackaged for Simple Consumption
●  Plans (use cases) based on caching patterns
●  Look-aside pattern supported out of the box
○  Cache is controlled and managed by the application
○  Good for saving application state, microservices architectures, reducing load on legacy systems, etc.
○  Perfect match for the Spring Framework @Cacheable annotation
●  Other caching patterns and options to come (WAN replication, session state caching, inline caching pattern, etc.)
(Diagram: look-aside cache; the app instance checks the cache and falls back to the database)
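As a concrete illustration of the look-aside plan and the @Cacheable annotation named above, here is a minimal Spring sketch. ProductService, ProductRepository, and the productCatalog cache name are hypothetical; the CacheManager backing the annotation (for example, one wired to the bound PCC service instance) is configured separately and omitted here.

```java
import org.springframework.cache.annotation.CacheEvict;
import org.springframework.cache.annotation.Cacheable;
import org.springframework.cache.annotation.EnableCaching;
import org.springframework.context.annotation.Configuration;
import org.springframework.stereotype.Service;

@Configuration
@EnableCaching   // turns on Spring's caching abstraction
class CacheConfig {
}

@Service
class ProductService {

  private final ProductRepository repository;   // hypothetical JPA/JDBC repository

  ProductService(ProductRepository repository) {
    this.repository = repository;
  }

  // Look-aside read: Spring checks the "productCatalog" cache first and only
  // invokes the method body (the database call) on a cache miss.
  @Cacheable("productCatalog")
  public Product findProduct(String sku) {
    return repository.findBySku(sku);
  }

  // Keep the cache consistent when the backing store changes.
  @CacheEvict(value = "productCatalog", key = "#product.sku")
  public void updateProduct(Product product) {
    repository.save(product);
  }
}
```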
  24. Pivotal Cloud Cache: Extending the Pivotal Cloud Foundry Platform for Microservices Architectures
•  Easy accessibility through Marketplace
•  Instant provisioning
•  Bind to apps through easy-to-use interface
•  Lifecycle management
•  Common access control and audit trails across services
(Diagram: the Services Marketplace, with Pivotal Cloud Cache alongside MySQL, RabbitMQ, New Relic, Single Sign-On, Config Server, Service Directory, Circuit Breaker, Signal Sciences, Crunchy PostgreSQL, Dynatrace, and more)
  25. Easily Provisioned for Developer Self Service
Operators create and register service plans with the Services Marketplace
(Diagram: from the OpsMan tile, operators deploy the PCC broker, create service plans that define VMs, memory, CPU, and disk size, set quotas such as max cluster size and max number of clusters, and register the plans with the Marketplace)
  26. Pivotal Cloud Cache Performance: in-memory, and horizontally scalable for parallel execution
Grow cluster dynamically with no interruption of service or data loss
Data is sharded or replicated across servers
  27. Partitioning (aka Sharding): this is how an in-memory cache can horizontally scale
Take advantage of the memory and network bandwidth of all members of the cluster
  28. Partitioning and Co-location: replicated regions
Replicated regions model many-to-many relationships
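To make the partitioned, co-located, and replicated region types from slides 26-28 concrete, here is a minimal server-side sketch in the Apache Geode API that underlies PCC. In a PCC service instance these regions would normally be created by an operator (for example with gfsh) rather than in application code, and the region names and the Customer/Order/Product classes are assumptions for the example.

```java
import org.apache.geode.cache.Cache;
import org.apache.geode.cache.CacheFactory;
import org.apache.geode.cache.PartitionAttributesFactory;
import org.apache.geode.cache.Region;
import org.apache.geode.cache.RegionShortcut;

public class RegionSetup {
  public static void main(String[] args) {
    Cache cache = new CacheFactory().create();

    // Partitioned (sharded) region with one redundant copy: entries are spread
    // across all servers, using the memory and bandwidth of the whole cluster.
    Region<String, Customer> customers = cache
        .<String, Customer>createRegionFactory(RegionShortcut.PARTITION_REDUNDANT)
        .create("customers");

    // Co-located partitioned region: a customer's orders live on the same
    // server as that customer, keeping related lookups local.
    Region<String, Order> orders = cache
        .<String, Order>createRegionFactory(RegionShortcut.PARTITION)
        .setPartitionAttributes(new PartitionAttributesFactory<String, Order>()
            .setRedundantCopies(1)
            .setColocatedWith("customers")
            .create())
        .create("orders");

    // Replicated region: a full copy on every server; a good fit for smaller
    // reference data and for modeling many-to-many relationships.
    Region<String, Product> products = cache
        .<String, Product>createRegionFactory(RegionShortcut.REPLICATE)
        .create("products");
  }
}
```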
  29. High Availability: Spanning Servers and Availability Zones
Stretched cluster across availability zones
Replication for high availability of data in cache
Pivotal Cloud Foundry resurrects lost VMs
  30. Powerful Eventing System: publish/subscribe and Continuous Query built right in
(Diagram: a Put(key, value) on the cache automatically notifies subscribers)
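A sketch of the continuous-query half of this eventing model, using the Apache Geode client API that PCC is built on. The locator address, the /orders region, and the o.total predicate are placeholders; the point is that after execute(), every Put(key, value) on the servers that affects a matching entry is pushed to the listener without any polling.

```java
import org.apache.geode.cache.client.ClientCache;
import org.apache.geode.cache.client.ClientCacheFactory;
import org.apache.geode.cache.query.CqAttributesFactory;
import org.apache.geode.cache.query.CqEvent;
import org.apache.geode.cache.query.CqListener;
import org.apache.geode.cache.query.CqQuery;
import org.apache.geode.cache.query.QueryService;

public class OrderWatcher {
  public static void main(String[] args) throws Exception {
    // Subscriptions must be enabled on the client pool so the servers can push events.
    ClientCache cache = new ClientCacheFactory()
        .addPoolLocator("locator.example.com", 10334)   // placeholder locator address
        .setPoolSubscriptionEnabled(true)
        .create();

    QueryService queryService = cache.getQueryService();

    CqAttributesFactory cqAttributes = new CqAttributesFactory();
    cqAttributes.addCqListener(new CqListener() {
      @Override
      public void onEvent(CqEvent event) {
        // Fired whenever a put on /orders makes an entry match (or stop matching) the query.
        System.out.println("Order changed: " + event.getKey() + " -> " + event.getNewValue());
      }

      @Override
      public void onError(CqEvent event) {
        System.err.println("CQ error for key " + event.getKey());
      }

      @Override
      public void close() {
      }
    });

    // Continuous query: the servers keep evaluating this OQL statement
    // and push matching changes to this client.
    CqQuery bigOrders = queryService.newCq("bigOrders",
        "SELECT * FROM /orders o WHERE o.total > 1000", cqAttributes.create());
    bigOrders.execute();
  }
}
```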
  31. Integrated Security: Pre-configured Authentication and Authorization
●  Role-based, configurable authorization for administrative activities
●  Pre-defined, pre-configured roles
●  Consistent mechanism for authenticating and authorizing actions
●  Every administrative function can require authorization
●  Every data access can require authorization
●  Some users can read/write data
●  Others can start/stop servers
●  Still others can configure the cluster
(Diagram: users are assigned to groups and roles; these mappings determine each user’s experience and permissions)
  32. Summary: Speed up your apps on Pivotal Cloud Foundry
●  PCC can overcome the performance, elasticity, and scaling challenges of microservices architectures
●  Using PCC with data APIs can increase autonomy between teams
●  PCC has rock-solid availability and failure recovery
●  PCC provides an evolutionary approach to adopting microservices that can extend the life of legacy systems
●  The combination of PCC and PCF makes it possible to get started quickly and easily adjust cache capacity as needed
