2. Agenda
• What is Jenkins Pipeline?
• Getting Started
• Using SCM with Pipeline
• Pipeline Examples & Best Practices
• Shared Libraries
• Vars & Steps & Helper Functions
• Developing and Testing Pipeline scripts
• Advanced Reporting
• Q & A
3. What is Jenkins Pipeline?
• A new way to design/create/write Jenkins Jobs
• No more Job Config UI
• All Code
• Replaces Job Config UI Build steps with code
• Uses a Groovy CPS library to compile the code
• Allows state to be saved to disk as the program runs, so jobs can survive Jenkins restarts
• Groovy is the syntax of the pipeline script
• Not efficient due to the nature of the CPS compiler
• Doesn’t support all the fancy syntax operations of the Groovy language
4. Example of a FreeStyle Job:
• Copies Artifacts from another Job
• Executes some Shell Command
• Injects some properties from a Shell step into the Build
• Triggers another job
• Conditional statements as well
• Maybe a checkout & Build
These Build Steps are how you design/configure your Jenkins Jobs
5. Creating a Pipeline Job
The Job Config just no longer has all those build steps you can add; there is simply a place for your code
6. Source Control for Pipeline
• Pipeline scripts can be inline
• Best practice is to put them in source control, which also forces code review…
• Can be in source control with the product the job is for
• Can be in a separate source control project for just Pipeline code
• This is the option we took: our shared library, external resource files, and the pipeline scripts themselves are all in one git repo
• Source control forces the pipeline scripts to be executed in the Groovy Sandbox
7. What is a Groovy sandbox?
• The sandbox checkbox appears all over the Jenkins interface
• Without it, a Groovy pipeline script runs in the master Jenkins JVM space without restriction, and has access to everything
• Recommended to always use the Groovy sandbox
• Rogue scripts can and will take down your Jenkins master
• Some methods may require Admin Approval
• Admin Approval can also tell you if the function an engineer wants to use is dangerous, and you can deny it
• Admin Approvals can be a pain in the current design… but worth it!
9. Examples of how to write Pipeline scripts
• Copy Artifacts
• Execute Shell
• Conditional
• Archive Artifacts
• File I/O
• Test Results
• Trigger Other Jobs
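A hedged sketch of how those steps can look in pipeline syntax; the job names, shell commands, and file patterns are placeholders, and the `copyArtifacts`, `junit`, and `build` steps assume the corresponding plugins are installed:

```groovy
node {
    stage('Copy Artifacts') {
        // Copy Artifact plugin step; 'upstream-job' is a placeholder name
        copyArtifacts projectName: 'upstream-job', selector: lastSuccessful()
    }
    stage('Build') {
        sh 'make all'                       // Execute Shell
        if (env.BRANCH_NAME == 'master') {  // Conditional
            archiveArtifacts artifacts: 'build/**/*.zip'
        }
    }
    stage('Test') {
        sh 'make test'
        junit 'reports/**/*.xml'            // Test Results
    }
    stage('Trigger') {
        build job: 'downstream-job', wait: false  // Trigger Other Jobs
    }
}
```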
10. Basics of Pipeline Scripts:
• Node Blocks
• Determine where to run this part of the job… what Jenkins Agent/Node to use an executor on
• Stage Blocks
• Organize segments of your Pipeline script
• Allows for easy code readability and for the Dashboard of the Job when the job is running
• Can easily see what part of the Pipeline script is being executed
• Checkpoint Statements
• Almost like “saving your work”: identifies a good place in the script to save, so that if you wanted to restart the pipeline at a later time you could resume from this point
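A minimal skeleton tying those blocks together (the agent label and script names are placeholders; note that `checkpoint` is a CloudBees Jenkins Enterprise step, not available in open-source Jenkins):

```groovy
// node picks the agent; stages organize the flow and drive the dashboard view
node('linux') {            // 'linux' is a placeholder agent label
    stage('Checkout') {
        checkout scm
    }
    stage('Build') {
        sh './build.sh'
    }
    // mark a good place to resume from if the pipeline is restarted later
    checkpoint 'after-build'
    stage('Deploy') {
        sh './deploy.sh'
    }
}
```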
13. Error Handling in Pipeline Scripts
• error(“Job Failed”)
• Stops your pipeline script from executing as if it was aborted
• Try/Catch Blocks
• Let you catch errors and do your own reporting
• Allow you to handle when someone “Aborts the Build”, gracefully clean up, and report
• Notify people
• Email and Slack are both supported in pipeline
• Groovy Postbuild plugin
• Gives you access to manager.buildFailure() – marks the build red
• Gives you access to manager.buildUnstable() – marks the build yellow
• Allows you to control the result
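A hedged sketch of the try/catch pattern; the shell script and mail recipient are placeholders:

```groovy
node {
    try {
        stage('Build') {
            sh './build.sh'
        }
    } catch (e) {
        // catches failures and aborts alike: clean up, set the result, report
        currentBuild.result = 'FAILURE'
        mail to: 'team@example.com',   // placeholder address
             subject: "Build failed: ${env.JOB_NAME} #${env.BUILD_NUMBER}",
             body: "See ${env.BUILD_URL}"
        throw e   // re-throw so the build still stops
    }
}
```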
14. Snippet Generator
For plugins installed:
• Helps you understand and learn the syntax to invoke them in your Pipeline
• You provide how you would invoke via the Job Config UI (old way to define Jenkins Jobs) and it shows you the pipeline equivalent!
15. Global Variables available to your scripts
These are recognized in your scripts and have methods/variables available to you
• env
• env.NODE_NAME
• env.WORKSPACE
• env.BUILD_URL
• env.JOB_URL
• params
• The build’s parameters, if it’s a Parameterized build
• currentBuild
• Next slide shows examples
• scm
• manager
• Groovy PostBuild Plugin available to Pipeline Scripts
• Shared Library Vars
• Ones you create, will discuss later when we look at Shared Libraries
• Other Plugins you may have installed that contribute here
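A few illustrative one-liners using these globals (the `TARGET_ENV` parameter name is made up for the example):

```groovy
// Reading the built-in globals inside a pipeline script
echo "Running on ${env.NODE_NAME} in ${env.WORKSPACE}"
echo "Build page: ${env.BUILD_URL}"

// Parameters of a Parameterized build (parameter name is a placeholder)
echo "Deploy target: ${params.TARGET_ENV}"

// currentBuild both exposes and lets you set the build state
echo "Result so far: ${currentBuild.currentResult}"
currentBuild.description = 'Started by pipeline demo'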
17. One pipeline script for multiple jobs?
• Found I had a lot of jobs that were similar
• As I developed the pipeline scripts, there was a lot in common
• Too much in common for shared libraries to really apply, as it would be an entire function of the job
• Prepared environment = perfect solution
18. How to leverage the Prepared Environment?
• The items in the prepared environment come in via the env variable
• Can reference them and create what you need
• Then all your jobs can reference the same pipeline script
• Makes it easy to maintain and modify
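An illustrative sketch, assuming variable names like `PRODUCT_NAME` and `BUILD_TARGET` were injected by each job’s prepared environment (both names are made up for the example):

```groovy
// One shared pipeline script; each job injects its own values
// via its prepared environment and the script reads them from env
def product = env.PRODUCT_NAME ?: 'default-product'
def target  = env.BUILD_TARGET ?: 'all'

node {
    stage("Build ${product}") {
        sh "make ${target}"
    }
}
```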
19. Shared Libraries
• Used to store helper functions that may be common or used across pipeline
scripts
• Used to store global variables used throughout your pipeline projects
• Used to create your own “steps” for pipeline scripts to leverage
• Helper functions go in the /src folder and include: package com.ibm.shared;
• Steps and Variables for direct pipeline access go in the /vars folder
• Import and allow shared library access in a pipeline script with an @Library annotation at the top
• Access these shared libraries by instantiating the class
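A hedged sketch of that import step; the library name and helper class are placeholders, though the `com.ibm.shared` package comes from the slide above:

```groovy
// Load the shared library configured in Jenkins (name is a placeholder)
@Library('my-shared-library') _

// Helper classes under /src are imported and instantiated like ordinary Groovy
import com.ibm.shared.SomeHelper   // hypothetical class name

def helper = new SomeHelper()
helper.doSomething()
```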
20. /vars/jobinfo.groovy
• Note: lowercase name of the groovy file
• In pipeline scripts you can access these things by:
• jobinfo.PIPELINE_CONFIG.jobURL
• JenkinsJobRef is a class in our shared library that has jobURL and remotePath instance variables to allow access to these items
• We wanted one place to reference these hardcoded strings in the event they were to change
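A hedged guess at what /vars/jobinfo.groovy might contain, given the description above; the URL and path are placeholders, and `JenkinsJobRef` is the class from the shared library’s /src tree:

```groovy
// /vars/jobinfo.groovy (illustrative sketch)
// Accessed from pipeline scripts as jobinfo.PIPELINE_CONFIG.jobURL
import com.ibm.shared.JenkinsJobRef

PIPELINE_CONFIG = new JenkinsJobRef(
    'https://jenkins.example.com/job/pipeline-config/',  // jobURL (placeholder)
    '/some/remote/path'                                  // remotePath (placeholder)
)
```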
21. /vars/nodescript.groovy
• We step out and execute JavaScript resource files with Node
• Created a “Step” helper to be used in pipeline
• Call from pipeline within a nodescript {} block
• Where you can set variables to determine what to do
22. /vars/nodescript.groovy
• Allows the pipeline script to overwrite variables if they set them in their block
• Checks out the resources project
• Copies down the javascript files
• Executes them and returns the result
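The nodescript step described above follows a common shared-library pattern; a hedged sketch (the resources repo URL and default script name are placeholders):

```groovy
// /vars/nodescript.groovy (illustrative)
def call(Closure body) {
    // let the caller's block set variables: nodescript { script = 'report.js' }
    def config = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = config
    body()

    def script = config.script ?: 'main.js'   // default the caller may override

    // check out the resources project, then run the JavaScript file with Node
    git url: 'https://example.com/resources.git'   // placeholder repo
    return sh(script: "node ${script}", returnStdout: true).trim()
}
```

Called from a pipeline as `def out = nodescript { script = 'report.js' }`.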
23. Pipeline Tips
• Pipeline is not efficient
• Use Strict typing even though Groovy doesn’t necessarily require it
• Not meant to do heavy parsing, or to hold complex logic
• You may be tempted to write it this way because the Groovy programming language allows it
• Declarative Pipeline
• A newer type of pipeline script that helps limit this
• The pipeline script runs in the master space; node blocks step out the “steps” to your Jenkins Agent executor, but the main pipeline script uses your master’s CPU/memory
• Heavy logic can impact the master, so it is recommended to step it out to other processes: shell, node, external groovy scripts
• You will notice for loops and if blocks in Pipeline are not efficient and can take a long time to execute because of the CPS compiler
• If you must have these, you can annotate those functions with @NonCPS, and they will be compiled normally
• CODE REVIEW!!!!
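A small example of the @NonCPS escape hatch; note that such methods must not call pipeline steps:

```groovy
// Heavy loops run slowly under the CPS interpreter; @NonCPS compiles the
// method as plain Groovy. Do not call pipeline steps (sh, echo, ...) inside it.
@NonCPS
def sumOfSquares(int n) {
    def total = 0
    for (int i = 1; i <= n; i++) {
        total += i * i
    }
    return total
}

node {
    echo "Sum: ${sumOfSquares(1000)}"
}
```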
24. What doesn’t work?
• Nested Stages/Parallel blocks of stages
• Blue Ocean I think is looking to solve this
• Fancy Groovy programming
• Stick to the basics, go simple
• Shared Libraries cannot use Pipeline steps
• If you see a crazy exception and stack trace, you likely have something unsupported in your script
• Google is useful in this case, but carefully go through your pipeline script
• Pipeline scripts don’t automatically combine (&&) the build result with the results of things you do; unless there is a fatal error, you need to control the result yourself
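Controlling the result explicitly can look like this sketch (`run-tests.sh` is a placeholder):

```groovy
node {
    stage('Test') {
        // returnStatus keeps a non-zero exit code from failing the build;
        // you then set currentBuild.result yourself based on what happened
        def rc = sh(script: './run-tests.sh', returnStatus: true)
        if (rc != 0) {
            currentBuild.result = 'UNSTABLE'   // yellow, not red
        }
    }
}
```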
25. Test and Develop Pipeline Scripts
• Created a JobsUnderDevelopment Folder
• Engineers have their own sub folder within that space
• Keeps all Dev Jobs separate
• Engineers define their Folder Config to point to the SCM branch to load the Shared Libraries
• Then any jobs executing in that Folder will use the dev stream of the shared libraries
• Pipeline scripts of production jobs can be copied into these folders as well and pointed to the SCM branch to load the development pipeline code
• Allows multiple engineers to work and test off their SCM branch before delivering pipeline script changes
26. Advanced Reporting
• Customize the build pages of your jobs
• Does require Admin Approval of various functions to do this
• We modify the build description of our jobs first when they run
27. • Shared Library Helper function
• setDescriptionWithTestResult
• Lots of my pipelines use this, so I made it a shared library function
• Call it from the pipeline script
• Returns a String summary
• Can be used in the notification email
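A hedged sketch of what such a helper could look like; the name matches the slide, but the body is illustrative, and the `rawBuild` access is the kind of call that needs the Admin Approval mentioned earlier:

```groovy
// /vars/setDescriptionWithTestResult.groovy (illustrative)
def call() {
    // rawBuild access requires admin script approval
    def test = currentBuild.rawBuild.getAction(hudson.tasks.junit.TestResultAction.class)
    def summary = test ? "Tests: ${test.totalCount - test.failCount} passed, ${test.failCount} failed"
                       : 'No test results'
    currentBuild.description = summary
    return summary   // can also be dropped into a notification email
}
```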
30. Sort Items on the build page
• Our parallel blocks write out what is going on, so you know where the build is throughout its run
• But at the end of the job we want to clean that up