In this new presentation, we will cover advanced Terraform topics (full-on DevOps). We will compare the deployment of Terraform using Azure DevOps, GitHub/GitHub Actions, and Terraform Cloud. We wrap everything up with some key takeaway learning resources in your Terraform learning adventure.
NOTE: A recording of this presentation is available here: https://www.youtube.com/watch?v=fJ8_ZbOIdto&t=5574s
3. Microsoft’s investments in Terraform
• Microsoft and HashiCorp teams working together
• Terraform AzureRM Provider updates
• Latest release v2.18.0 (July 10, 2020)
• 23 features added (new data sources, resources)
• 27 enhancements
• 6 bug fixes
• 4x releases/updates published in June alone!
• Terraform Module Registry
• https://registry.terraform.io/browse/modules?provider=azurerm
5. Terraform v0.13 highlights
Support for count, for_each, and depends_on on modules
New required_providers syntax
Custom variable validation
terraform login command connects a CLI user to the Terraform Cloud app
variable "image_id" {
  type        = string
  description = "The id of the machine image (AMI) to use for the server."

  validation {
    condition     = length(var.image_id) > 4 && substr(var.image_id, 0, 4) == "ami-"
    error_message = "The image_id value must be a valid AMI id, starting with \"ami-\"."
  }
}

terraform {
  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      version = "2.0.0"
    }
  }
}
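Terraform v0.13 also allows count and for_each on module blocks, which previously worked only on resources. A minimal sketch of for_each on a module (the module name, source path, and VNet names below are hypothetical):

```hcl
# for_each on a module block (new in Terraform v0.13);
# the "spokes" module, its source path, and the VNet names are placeholders.
module "spokes" {
  source   = "./modules/spoke-vnet"
  for_each = toset(["spoke1", "spoke2"])

  vnet_name = each.key
}
```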
8. Hot off the press!
• Announcing the Azure DevOps Provider for Terraform
• https://cloudblogs.microsoft.com/opensource/2020/06/18/announcing-hashicorp-terraform-azure-devops-provider-release/
9. Configuration
Using Azure DevOps Repo vs GitHub
Yes, you can use GitHub, but I was going for the “full” Azure DevOps (ADO) experience
Integrated with Azure Key Vault (for SPN credentials)
Via Variable Groups in the Pipeline’s Library
Multiple pipelines created
Deploy pipelines (hub/spoke/VNet peering)
Cleanup pipelines (hub/spoke/VNet peering)
16. Configuration
• Using GitHub repo
• Leverage GitHub Secrets (for SPN credentials, SAS keys, API tokens)
• Multiple workflows (aka pipelines) created
• Deploy workflows (hub/spoke/VNet peering)
• Cleanup workflows (hub/spoke/VNet peering)
17. GitHub Actions (aka pipelines)
• A JavaScript action that sets up Terraform CLI in your GitHub Actions workflow by:
• Downloading a specific version of Terraform CLI and adding it to the PATH
• Configuring the Terraform CLI configuration file with a Terraform Cloud/Enterprise hostname and API token
• Installing a wrapper script to wrap subsequent calls of the terraform binary and expose its STDOUT, STDERR, and exit code
20. Workflow Jobs
jobs:
  terraform:
    name: 'Terraform'
    runs-on: ubuntu-latest

    # Use the Bash shell regardless of whether the GitHub Actions runner is
    # ubuntu-latest, macos-latest, or windows-latest
    defaults:
      run:
        shell: bash
        working-directory: ./Terraform/Networking/Deployments/Network-Deployment/Hub-Deploy
21. Workflow Steps
# Install the latest version of Terraform CLI and configure the Terraform CLI
# configuration file with a Terraform Cloud user API token
- name: Setup Terraform
  uses: hashicorp/setup-terraform@v1
  with:
    # terraform_version: 0.12.25  # Use this to set the specific version of Terraform to target.
    cli_config_credentials_token: ${{ secrets.TF_API_TOKEN }}

# Initialize a new or existing Terraform working directory by creating initial
# files, loading any remote state, downloading modules, etc.
- name: Terraform Init
  run: terraform init

# Generates an execution plan for Terraform
- name: Terraform Plan
  run: terraform plan -var-file='Hub.tfvars' -out HubDeploy.plan

# - name: Terraform Apply
#   if: github.ref == 'refs/heads/master' && github.event_name == 'push'
#   run: terraform apply -auto-approve
22. What you can’t do
• Use modules with a relative path!
• Known issue #23333
• Specifically when using Terraform Cloud as the remote backend
• Trigger another Action/Workflow after a workflow is completed (ie. chaining)
• Manually trigger an Action/Workflow
• Not apparent you can use an alternative backend (ie. Azure Storage) when using the built-in Terraform GitHub Action
module "vnets-SharedServices" {
  source = "../../../Hub/"
  …
}
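For reference, pointing Terraform at Azure Storage as an alternative remote backend would normally look like the following; the resource group, storage account, container, and key names here are placeholders:

```hcl
# Hypothetical Azure Storage remote backend config; all names are placeholders.
terraform {
  backend "azurerm" {
    resource_group_name  = "rg-terraform-state"
    storage_account_name = "stterraformstate"
    container_name       = "tfstate"
    key                  = "hub.terraform.tfstate"
  }
}
```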
25. Configuration
• Using GitHub repo
• Leverage Terraform variables (for SPN credentials)
• Multiple workspaces created (1 workspace = 1 state)
• Deploy workspace (hub/spoke/VNet peering)
• Note: “cleanup” workspaces not required, as the destruction and deletion process is built into the existing one
26. TF Cloud Workspaces (aka pipelines)
• How Terraform Cloud organizes infrastructure
• Terraform Cloud manages infrastructure collections with workspaces instead of directories
• Contains configuration, state data, variables, etc.
• Functions like a completely separate working directory
• Each workspace retains backups of its previous state files
• Retains a record of all run activity
• Summaries, logs, a reference to the changes that caused the run, and user comments
28. Workspace Variables
• Terraform vs Environment variables
• terraform.tfvars did not work for me
• Had to use *.auto.tfvars
terraform plan -var="X" -var-file="Y.tfvars" -out="Z.plan"
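As an illustration, a custom-named variables file is only picked up automatically when it follows the *.auto.tfvars convention; the file name, variable names, and values below are hypothetical:

```hcl
# Hub.auto.tfvars — loaded automatically because of the *.auto.tfvars suffix;
# the variable names and values here are placeholders.
location      = "canadacentral"
hub_vnet_name = "vnet-hub"
address_space = ["10.0.0.0/16"]
```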
29. Workspace Runs
• Terraform Cloud always performs Terraform runs in the context of a workspace
• The workspace serves the same role that a persistent working directory serves when running Terraform locally:
• It provides the configuration, state, and variables for the run
30. Run Triggers
• Allow runs to queue automatically in this workspace on successful apply of runs in any of the source workspaces
31. Points to remember
• You can’t have a custom named .tfvars file, unless you use the *.auto.tfvars naming
• Workspace ‘working directory’ controls the root terraform init location, with no option/method to traverse directories
• Triggering a delete/destroy will trigger other chained/linked workspaces (ie. delete Hub will trigger deploy Spoke)
34. Bonus! TFLint
• A part of the GitHub Super Linter
• One linter to rule them all
• Used to validate against issues
• Focused on possible errors, best practices, etc.
• Support for all providers
• Rules that warn against provider-specific issues
• AWS = 700+ rules
• Azure = 279 rules (Experimental support)
• GCP = WIP
35. Resources
• Adin’s personal curated list of Terraform resources
• Advanced Tips & Tricks to Optimize your Terraform Code
• Terraform: How to Rename (Instead of Deleting) a Resource
• The Ultimate Terraform Workflow: Setup Terraform (and Remote State) with GitHub Actions
• Automating infrastructure deployments in the Cloud with Terraform and Azure Pipelines
• Deploying Terraform Infrastructure using Azure DevOps Pipelines Step by Step
Don’t forget about these Visual Studio Code (VS Code) extensions:
Azure Terraform (by Microsoft)
Terraform (by Mikael Olenfalk) – now owned by HashiCorp!
36. More resources
• Misadventures with Terraform
• Azure DevOps Lab - Terraform using GitHub Actions
• Terraform GitHub Actions
• Getting Started with Terraform Cloud
• How to deploy production-grade infrastructure in a fraction of the time using Gruntwork with Terraform Cloud and Terraform Enterprise
• Using Modules from the Terraform Cloud Private Module Registry
38. This is me
Adin Ermie
• Cloud Solution Architect – Azure Apps & Infra @ Microsoft
• Azure Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS)
• Cloud Management & Security
• Azure Monitor, Azure Security Center (ASC) / Azure Sentinel
• Cloud Governance
• Azure Policy, Blueprints, Management Groups, and Azure Cost Management (ACM)
• Business Continuity and Disaster Recovery (BCDR)
• Azure Site Recovery (ASR) / Azure Migrate, and Azure Backup
• Infrastructure-as-Code (IaC)
• Azure Resource Manager (ARM), and Terraform
• 5x MVP - Cloud and Datacenter Management (CDM)
• 1x HCA – HashiCorp Ambassador
Adin.Ermie@outlook.com
@AdinErmie
https://AdinErmie.com
linkedin.com/in/adinermie
https://github.com/AErmie
Editor’s notes
There are 2 types of Triggers:
Continuous integration (CI), and
Pull request (PR)
Continuous integration (CI) triggers cause a pipeline to run whenever you push an update to the specified branches or you push specified tags.
You can reference a branch (ie. master), use wildcards (ie. releases/*), use exclude (ie. releases/old), tags (on branches)
Note: You cannot use variables in triggers, as variables are evaluated at runtime (after the trigger has fired).
Note: If you specify an exclude clause without an include clause, then it is equivalent to specifying * in the include clause.
Note: When you specify paths, you must explicitly specify branches to trigger on. You can't trigger a pipeline with only a path filter; you must also have a branch filter, and the changed files that match the path filter must be from a branch that matches the branch filter.
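Putting the notes above together, an include/exclude CI trigger in an Azure Pipelines YAML file could look like the following sketch; the branch, tag, and path names are illustrative, not from the actual pipelines:

```yaml
# Illustrative Azure Pipelines CI trigger; branch, tag, and path names are hypothetical.
# Note the branch filter accompanying the path filter, per the note above.
trigger:
  branches:
    include:
      - master
      - releases/*
    exclude:
      - releases/old
  tags:
    include:
      - v2.*
  paths:
    include:
      - Terraform/*
```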
The ‘Terraform-BuildVariables’ variable group comes from the Pipeline > Library > Variable Groups (which is integrated with Azure Key Vault)
User-defined variables
System variables
Environment variables
System and user-defined variables also get injected as environment variables for your platform. When variables are turned into environment variables, variable names become uppercase, and periods turn into underscores.
Note that if you do not include the ‘inputs’ ‘terraformVersion’ it will NOT install the latest version, but rather, version 0.12.3!
Notice that we’re passing through the command-line the backend config for using Azure Storage as the remote State store
On the terraform plan command, you can augment it by including a ‘var-file’ reference, and output the plan file
Tasks are versioned, and you must specify the major version of the task used in your pipeline
In YAML, you specify the major version using @ in the task name (ie. TerraformInstaller@0)
I want to kick-off the Spoke pipeline after the Hub pipeline has completed
Notice the ‘trigger’ is set to ‘none’, and we have a ‘resources’ ‘pipelines’ code block
pipeline: BLAH specifies the name of the pipeline resource
source: BLAH specifies the name of the triggering pipeline
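A sketch of what the Spoke pipeline’s YAML might contain to chain off the Hub pipeline; the pipeline resource name and source pipeline name are placeholders, not the actual names used in the demo:

```yaml
# Hypothetical chained-pipeline config for the Spoke pipeline;
# the resource name, source pipeline name, and branch are placeholders.
trigger: none

resources:
  pipelines:
    - pipeline: hubDeploy    # name of the pipeline resource
      source: Hub-Deploy     # name of the triggering pipeline
      trigger:
        branches:
          include:
            - master
```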
To date, there are 28 “terraform” GitHub Actions
There is one official HashiCorp – Setup Terraform action
Workflows are custom automated processes that you can set up in your repository to build, test, package, release, or deploy any code project on GitHub.
With GitHub Actions you can build end-to-end continuous integration (CI) and continuous deployment (CD) capabilities directly in your repository. GitHub Actions powers GitHub's built-in continuous integration service.
The name of the GitHub event that triggers the workflow.
You can provide a single event string, array of events, array of event types, or an event configuration map that schedules a workflow or restricts the execution of a workflow to specific files, tags, or branch changes.
You can configure a workflow to start once:
An event on GitHub occurs, such as when someone pushes a commit to a repository or when an issue or pull request is created.
A scheduled event begins.
An external event occurs.
To trigger a workflow after an event happens on GitHub, add on: and an event value after the workflow name.
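As a sketch, triggering a workflow on pushes to master (plus pull requests) might look like the following; the workflow name and branch are illustrative:

```yaml
# Illustrative GitHub Actions trigger; the workflow name and branch are placeholders.
name: 'Terraform'

on:
  push:
    branches:
      - master
  pull_request:
```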
Encrypted secrets
Environment variables
GitHub sets default environment variables that are available to every step in a workflow run.
Environment variables are case-sensitive.
A workflow run is made up of one or more jobs. Jobs run in parallel by default. To run jobs sequentially, you can define dependencies on other jobs using the jobs.<job_id>.needs keyword.
Note the ‘working-directory’ and how the path is set (it does not use the double-dot-slash ..\, but rather a single)
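A minimal sketch of sequential jobs using the needs keyword; the job names and step commands here are hypothetical:

```yaml
# Hypothetical jobs that run sequentially; "plan" must finish before "apply" starts.
jobs:
  plan:
    runs-on: ubuntu-latest
    steps:
      - run: echo "terraform plan would run here"
  apply:
    needs: plan    # jobs.<job_id>.needs forces sequential execution
    runs-on: ubuntu-latest
    steps:
      - run: echo "terraform apply would run here"
```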
Note: There is an error when terraform plan tries to use “var-file” and “out”
This may be due to the state pointing to Terraform Cloud vs an Azure Storage Account
This means you cannot use “-out” to produce a .plan file as an artifact
This also means you cannot pass in a “-var-file”, it looks for “*.auto.tfvars” instead
A job contains a sequence of tasks called steps.
Not all steps run actions, but all actions run as a step.
Because steps run in their own process, changes to environment variables are not preserved between steps
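To carry a value across steps, GitHub Actions provides the GITHUB_ENV file mechanism (the current documented approach; at the time of this talk the older set-env command was still in use). A sketch, with a hypothetical variable name:

```yaml
# Hypothetical example of persisting an environment variable across steps
# by appending to the $GITHUB_ENV file.
steps:
  - name: Set variable
    run: echo "PLAN_FILE=HubDeploy.plan" >> $GITHUB_ENV
  - name: Use variable
    run: echo "Plan file is $PLAN_FILE"
```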
Trigger an action upon completion of another action: https://github.community/t/trigger-an-action-upon-completion-of-another-action/17642
Triggering a new workflow from another workflow: https://github.community/t/triggering-a-new-workflow-from-another-workflow/16250
At first, I thought I should use the Environment Variables for Subscription ID, Client ID/Secret, and Tenant ID.
But apparently this is not the case, as no value is passed from any key in the ‘Environment Variables’
In short, if you want to use it as part of a terraform command-line (ie. terraform plan -var="X" -var-file="Y.tfvars" -out="Z.plan") then you need to use the Terraform Variables
Terraform Cloud workspaces can set values for two kinds of variables:
Terraform input variables, which define the parameters of a Terraform configuration.
Shell environment variables, which many providers can use for credentials and other data.
Terraform Cloud passes variables to Terraform by writing a terraform.tfvars file and passing the -var-file=terraform.tfvars option to the Terraform command.
Terraform runs managed by Terraform Cloud are called remote operations.
Remote runs can be initiated by webhooks from your VCS provider, by UI controls within Terraform Cloud, by API calls, or by Terraform CLI.
In a workspace linked to a VCS repo, runs start automatically when you merge or commit changes to version control.
A workspace is linked to one branch of its repository, and ignores changes to other branches. Workspaces can also ignore some changes within their branch: if a Terraform working directory is configured, Terraform Cloud assumes that only some of the content in the repository is relevant to Terraform, and ignores changes outside of that content.
Note that a successful APPLY needs to happen in the source workspace first before it triggers the next one
Note the auto-apply warning!
This means you cannot actually successfully “fully” deploy an entire environment in an automated way; human interaction is required!
You can connect your workspace to up to 20 source workspaces.