Learn how to bake automation and predictability into your application’s lifecycle with AWS OpsWorks, and see one of our latest collaborations with AWS CodePipeline, which streamlines CI/CD processes.
2. About me:
Chris Munns - munns@amazon.com
• Business Development Manager – DevOps
• New Yorker
• Previously:
• AWS Solutions Architect 2011-2014
• Lead of Infrastructure/DevOps @hingeapp
• Formerly on operations teams @Etsy and @Meetup
• A short time at a hedge fund, Xerox, and others
• Rochester Institute of Technology: Applied Networking and Systems Administration ’05
• Internet infrastructure geek
3. Why do you need OpsWorks?
Model and group your applications
Manage the life-cycle of your instances
Control Access Management
Monitor the health of your resources
Analyze logging information
Mitigate operational problems
4. Configure your instances using AWS OpsWorks
Uses Chef to configure the software on the instances
Chef provides a Ruby DSL abstraction for common OS operations
Associates pre-defined scripts (i.e. Chef cookbooks) with your instances
Applies cookbook configuration changes using life-cycle events
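To make this concrete, here is a minimal sketch of what such a recipe can look like when attached to the Setup life-cycle event. The package name, template, and paths are placeholders, not from a real OpsWorks cookbook:

```ruby
# Minimal sketch of a recipe for the Setup lifecycle event.
# Package name, template, and paths are placeholders.
package "nginx"

template "/etc/nginx/nginx.conf" do
  source "nginx.conf.erb"
  owner "root"
  group "root"
  mode "0644"
  notifies :restart, "service[nginx]"
end

service "nginx" do
  action [:enable, :start]
end
```

OpsWorks would run a recipe like this on the instance when the Setup event fires, and Chef's resources keep the run idempotent.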
29. OpsWorks Access Management
Provide IAM users full SSH / RDP and sudo / admin privileges
Provide limited access on a group level (i.e. an OpsWorks stack)
30. SSH / RDP session management
AWS OpsWorks grants SSH / RDP access to IAM users
31. Temporary RDP session management
AWS OpsWorks grants temporary RDP access to IAM users
33. Monitor your instances using AWS OpsWorks
14 free one-minute metrics (CPU, memory, load, process count, etc.)
Aggregation on the group level (OpsWorks stack, layer)
CloudWatch optimized dashboards (contextual dashboards)
37. AWS CodePipeline
Continuous delivery service for fast and reliable application updates
Model and visualize your software release process
Builds, tests, and deploys your code every time there is a code change
Integrates with third-party tools and AWS
45. Update your Chef cookbooks and deploy your applications
Streamline your CI/CD processes with AWS OpsWorks & AWS CodePipeline
46. How do I get started with OpsWorks?
Grab some community cookbooks
https://supermarket.chef.io/
Learn more
https://aws.amazon.com/opsworks/
Get started
https://aws.amazon.com/opsworks/
https://aws.amazon.com/codepipeline/
Let's visualize the life cycle idea.
So at some point an app server is needed. You define the server layer in OpsWorks and create an instance.
Launch first instance
We start the instance on EC2 for you, install the agent, get the cookbooks you want on the machine, and install all the things described in the recipes to make it a working app server.
As soon as this is done, we upload a log to S3 and inform OpsWorks.
Setup triggers configure event
OpsWorks understands that there was a change to the Stack and triggers a configure event on all available instances.
The event is sent to the one instance we have, which runs the Chef recipes associated with the configure event.
In this case this is probably a no-op.
Deploy the static App
So let's deploy a static app. The app was defined in OpsWorks, which stores where the source code lives.
The deploy event is triggered and the deploy recipes are run.
They will download the source code and restart the app server if needed.
Voilà, the app is running.
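A deploy recipe along these lines might look like the sketch below. The attribute layout (node[:deploy], :deploy_to, :scm) follows the stack-configuration JSON that OpsWorks passes to Chef, but treat the names and service handling as illustrative:

```ruby
# Sketch of a Deploy lifecycle recipe (attribute names illustrative).
node[:deploy].each do |app_name, deploy|
  app_root = "#{deploy[:deploy_to]}/current"

  # Check out the app source that was defined in OpsWorks.
  git app_root do
    repository deploy[:scm][:repository]
    revision deploy[:scm][:revision] || "HEAD"
    action :sync
    notifies :restart, "service[app_server]", :delayed
  end
end

# Restart only if the checkout actually changed something.
service "app_server" do
  supports restart: true
  action :nothing
end
```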
Add a database instance
But actually we probably want a database to store things in our app.
So we add a database layer, add an instance, and launch it.
OpsWorks again provisions the instance for you and sets it up.
Reconfigure Stack
Again logs are shipped and the configure event is triggered on all instances in the Stack as there was a change.
This time the configure event is interesting as it actually changes the configuration.
The app server will find out about the database server and update its configuration in a way that the app can actually use the database.
The database, on the other hand, might update the access control list of instance IPs that are allowed to access the database, adding the IP of the one app server.
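To make this concrete, the runnable sketch below simulates how a configure recipe could pull the database instances' private IPs out of the stack-configuration attributes that OpsWorks passes to Chef. The attribute layout and the layer short name ("db-master") are assumptions, not a definitive schema:

```ruby
require "json"

# Simulated stack-configuration attributes (normally provided by OpsWorks
# as Chef node attributes; layout and layer name are illustrative).
stack_config = JSON.parse(<<~JSON)
  {
    "opsworks": {
      "layers": {
        "db-master": {
          "instances": {
            "db1": { "private_ip": "10.0.0.12" }
          }
        }
      }
    }
  }
JSON

# A configure recipe would render these IPs into the app's database
# configuration and restart the app server only if the list changed.
db_ips = stack_config
  .dig("opsworks", "layers", "db-master", "instances")
  .values
  .map { |i| i["private_ip"] }

puts db_ips.inspect  # => ["10.0.0.12"]
```

Because the configure event re-runs on every stack change, deriving the configuration from these attributes each time is what keeps every instance's view of the stack current.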
Deploy and migrate database
We now trigger the deploy again to roll out our new version of the app.
The deploy event again executes layer-specific recipes from your cookbook.
So the app code won't land on the database server, as this is usually not what you want.
But the deploy recipes on the database might run a database migration, while the recipes running on the app server check out the source code and restart the app server.
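A database-layer deploy recipe could gate the migration on the instance's layer membership, roughly like this sketch (the layer name, paths, migration command, and the node[:opsworks][:instance][:layers] attribute are illustrative assumptions):

```ruby
# Sketch: run a schema migration only on instances in the database layer.
# Layer name, paths, and migration command are placeholders.
execute "migrate database schema" do
  command "bundle exec rake db:migrate"
  cwd "/srv/app/current"
  environment "RAILS_ENV" => "production"
  only_if { node[:opsworks][:instance][:layers].include?("db-master") }
end
```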
Add more instances
The app gets more visitors, so you decide you need more app servers.
Same game.
Add a server and start it. It will run the setup event and get your server configured and your app deployed.
After that again the configure event is triggered.
This time the new app server configures its database connection; the old one sees a no-op, as its database connection already exists and nothing changed; and the database reconfigures its access control list to allow the new app server to access the database.
As you can see with the right recipes you can model all potential changes to your infrastructure.
Execute recipes – any time
You can also run any group of recipes on any set of servers at any time.
This makes sense if you want to, for example, trigger a data backup, request logs, or change the list of users that have access to the servers.
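Triggering such an ad-hoc run goes through the same CreateDeployment API that deploys use, with an execute_recipes command. The sketch below only builds the command payload; the recipe name "backup::default" is a placeholder:

```ruby
require "json"

# The command document for an on-demand run of arbitrary recipes via the
# OpsWorks CreateDeployment API. "backup::default" is a placeholder recipe.
command = {
  "Name" => "execute_recipes",
  "Args" => { "recipes" => ["backup::default"] }
}

puts JSON.generate(command)
```

You would pass this command, together with the stack ID and an optional list of target instances, to CreateDeployment.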
Stop instance
It is later that day and the number of visitors is lower. You or our auto scaling mechanism terminates the instance.
The shutdown event is triggered and the instance ships its latest logs.
Configure Stack
Again as this is a change to the Stack a configure event is fired.
I guess by now you get the idea.
The app server doesn't care: the database is still online, so nothing relevant changed for it.
The database, on the other hand, will remove the IP address from the ACL, as the instance is no longer online.
Let’s take a look at an example pipeline. I’ve created a simple three-stage pipeline to talk through my example.
Source actions are special actions. They continuously poll the source providers, such as GitHub and S3, to detect changes. Once a change is detected, a new pipeline run is created and begins executing. The source actions retrieve a copy of the source and place it into a customer-owned S3 bucket.
Once the source action is completed, the Source stage is marked as successful and we transition to the Build stage.
In the Build stage we have one action: Jenkins. Jenkins was integrated into CodePipeline as a custom action and has the same lifecycle as all custom actions. (Talk through the interaction.)
Once the build action is completed, the Build stage is marked as successful and we transition to the Deploy stage.
The Deploy stage contains one action, an AWS Elastic Beanstalk deployment action. The Beanstalk action retrieves the build artifact from the customer’s S3 bucket and deploys it to the Elastic Beanstalk web container.
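The three stages above map onto a pipeline definition roughly like the following sketch, built here as a plain Ruby hash mirroring CodePipeline's structure. The pipeline name, artifact bucket, and provider details are illustrative assumptions, not a definitive configuration:

```ruby
# Illustrative three-stage pipeline structure (Source -> Build -> Deploy).
# Names, the artifact bucket, and provider details are placeholders.
pipeline = {
  "name" => "demo-pipeline",
  "artifactStore" => { "type" => "S3", "location" => "my-artifact-bucket" },
  "stages" => [
    { "name" => "Source",
      "actions" => [{
        "name" => "FetchSource",
        "actionTypeId" => { "category" => "Source", "owner" => "ThirdParty",
                            "provider" => "GitHub", "version" => "1" },
        "outputArtifacts" => [{ "name" => "SourceOutput" }] }] },
    { "name" => "Build",
      "actions" => [{
        "name" => "JenkinsBuild",
        "actionTypeId" => { "category" => "Build", "owner" => "Custom",
                            "provider" => "Jenkins", "version" => "1" },
        "inputArtifacts"  => [{ "name" => "SourceOutput" }],
        "outputArtifacts" => [{ "name" => "BuildOutput" }] }] },
    { "name" => "Deploy",
      "actions" => [{
        "name" => "DeployToBeanstalk",
        "actionTypeId" => { "category" => "Deploy", "owner" => "AWS",
                            "provider" => "ElasticBeanstalk", "version" => "1" },
        "inputArtifacts" => [{ "name" => "BuildOutput" }] }] }
  ]
}

puts pipeline["stages"].map { |s| s["name"] }.join(" -> ")
# => Source -> Build -> Deploy
```

Note how each stage's input artifact names match the previous stage's output artifacts; that is how CodePipeline hands the customer-owned S3 artifacts from one action to the next.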