This is a short story about how the Jenkins Job DSL made our and our customers' lives a whole lot easier.
Any software development company that takes pride in software craftsmanship has some sort of build automation tool in place. In most cases this is Jenkins CI. Figure 1 shows the typical lifecycle of a Jenkins job. When a new software project is started, new build, test and deploy jobs are created. Then developers start coding and commit their code to a version control system (DVCS). This triggers the execution of the build job. After a while the developers might decide to switch from Java 7 to Java 8, so they need to migrate their code and update the CI job to build the code with JDK 8 instead of JDK 7. After a couple of years in production the project becomes obsolete and gets deleted or archived in the DVCS. Build, test and deploy jobs have to be deleted in order to keep the CI infrastructure clean.
Jenkins is a great automation tool. It is easy to get started and once you have your first jobs automated, you do not want to go back. After a while you will have several Jenkins instances and slaves running and your Continuous Integration (CI) infrastructure might look like the following diagram.
You are running at least one Jenkins master with a dozen or more slaves and between 100 and 1000 different jobs. Each job was either created manually, copied from a template, or created via the REST API or the client JAR. There are various ways, and they all work just fine.
You typically have jobs for building, testing and releasing software. It does not matter which programming language you prefer. With Jenkins only the sky seems to be the limit. There are more than 1000 plugins for integrating Jenkins with typical developer tools.
And then things start to change. Here is just a short list of things that I came across over the last couple of years:
- Java version upgrades
  - Jobs need to use a different JDK for compiling the source code
  - Since all build artifacts (WAR/EAR files) are automatically deployed to application servers, these servers need to be upgraded as well to run with the correct Java version
  - Generated server and app configurations need to be upgraded
- Build tools get replaced
  - Ant jobs need to be migrated to Maven jobs
  - Deployment and test scripts and jobs need to be updated as well
- Version control systems change
  - After a migration from CVS or Subversion to Git, all jobs need to be updated to pull code from the new server
- Application server upgrades
  - Automated deployment scripts need to deploy to new servers
- Job parameters change
  - Configuration templates need to be upgraded
- Projects get refactored and renamed in Git
  - Job names and SCM settings need to be updated
- Projects want to use different branches for CI
  - The default branch for the CI job needs to be updated
- Framework versions change
  - Spring Framework v4 and Hibernate v5 require JDK 6+
  - Hadoop requires JDK 7+
  - Your company framework might require JDK 7 for newer versions
- New projects
  - New build and release jobs need to be created
- Projects are deprecated
  - Jobs of deprecated projects need to be cleaned up on all Jenkins servers
Now comes the fun part. Imagine you have to take care of 140 Java projects. About 1000 different builds run on your Continuous Integration (CI) infrastructure each day and all of these projects are migrating at different times from Ant to Maven, from Java 6 to Java 7, from Tomcat 6 to Tomcat 7 and so on.
On top of that, we have a full copy of the CI infrastructure for testing purposes (http://jenkins-test, http://sonar-test, http://artifactory-test, http://git-test, …). We use these test instances to try out new tool versions, test data migrations and to validate infrastructure changes. In order for the test environment to be useful, we need to keep those jobs up-to-date as well.
At the beginning we updated jobs manually whenever projects made changes. We made heavy use of the Jenkins CLI to update multiple jobs at once with custom Groovy scripts, and we wrote Bash scripts that use the Jenkins client JAR to create jobs from XML templates. Over time these Groovy and shell scripts got more and more complicated, and it became hard to keep all of them up-to-date.
Tip: The Scriptler plugin is a great help for managing a large Jenkins infrastructure.
Around that time I came across two articles by Dennis Schulte and Daniel Reuter that talk about the usage and advantages of the Jenkins Job DSL. If you are looking for a good introduction to the Jenkins Job DSL, I suggest you read the following blog entries first:
- Continuous Delivery for Microservices with Jenkins and the Job DSL Plugin
- Generated Jenkins Jobs and automatic Branch Merging for Feature Branches
Getting started
Here is a simple example of the Job DSL. The Job DSL is a Domain Specific Language written in Groovy. After installing the Job DSL Plugin you have to create a Seed Job. Next, you write a Groovy script that makes use of the Job DSL. You can find the full API in the Job DSL API Viewer. The result might look like the following code snippet:
https://gist.github.com/marcelbirkner/ee5c36d960da7c29e913
After executing the Seed Job, the plugin will automatically generate a Jenkins Job that checks a GitHub Repository every 15 minutes for incoming changes and executes mvn clean install, including a couple of MAVEN_OPTS and Properties. That is all.
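If you are curious what such a seed script roughly looks like, here is a minimal sketch. The job name, repository URL, MAVEN_OPTS and properties are placeholders for illustration, not the exact values from the gist above:

```groovy
// Minimal Job DSL sketch: a Maven build job that polls a GitHub repository
// every 15 minutes. Job name, repository URL, MAVEN_OPTS and properties are
// placeholders, not the exact values used in the article.
job('example-project-build') {
    scm {
        git {
            remote {
                url('https://github.com/example-org/example-project.git') // hypothetical repository
            }
            branch('master')
        }
    }
    triggers {
        scm('H/15 * * * *') // poll the repository every 15 minutes
    }
    steps {
        maven {
            goals('clean install')
            mavenOpts('-Xmx1024m')           // example MAVEN_OPTS
            property('skipTests', 'false')   // example property
        }
    }
}
```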
The DSL allows you to create different types of jobs, views, folders, queues and many other useful things, and it is under active development. Another great advantage is that you can make use of the full programming power of Groovy when writing seed scripts. You can do HTTP requests, parse JSON and XML, read information from databases and do pretty much anything you would do with plain old Java.
Here are a couple of tests we did before getting started, to verify we were able to access already existing data in different types of systems.
Access GitLab REST API with Groovy
GitLab provides a nice REST API to access all repositories and project details. The REST API is protected with a private token, which can be found in the user profile through the GitLab UI. The Groovy JsonSlurper makes it extremely easy to parse the returned JSON and transform the result into an object.
https://gist.github.com/marcelbirkner/37e4b5c1715d0613e087
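As a rough idea of what this looks like, here is a sketch that assumes a v3-style GitLab API endpoint; the host, token and printed fields are placeholders:

```groovy
import groovy.json.JsonSlurper

// Sketch: read all projects from the GitLab REST API and parse the JSON response.
// The GitLab host and the private token are placeholders.
def gitlabUrl    = 'https://git.example.com'   // hypothetical GitLab host
def privateToken = 'abc123'                    // taken from the user profile in the GitLab UI

def json     = new URL("${gitlabUrl}/api/v3/projects?private_token=${privateToken}&per_page=100").text
def projects = new JsonSlurper().parseText(json)

projects.each { project ->
    println "${project.name} -> ${project.ssh_url_to_repo}"
}
```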
Access Oracle Database with Groovy
In our case the customer is using a configuration management tool that stores the application data in an Oracle database. All projects rely on the same naming convention, and the Git repository names equal the application names in the configuration database. Therefore we are able to retrieve all application details like build tool (Ant/Maven/Gradle), Java Version (6/7/8), Framework Version, Deployment Server (WebSphere/Tomcat/Batch/None) and many other details. To access the Oracle database all we had to do was configure and use the groovy.sql.Sql class.
https://gist.github.com/marcelbirkner/180f1f438ef455642451
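The following sketch shows the general idea; the connection string, credentials, table and column names are made up for illustration and will differ in a real configuration management database:

```groovy
import groovy.sql.Sql

// Sketch: query project details from the configuration management database.
// Requires the Oracle JDBC driver on the classpath; connection settings,
// table and column names are hypothetical.
def sql = Sql.newInstance(
        'jdbc:oracle:thin:@dbhost.example.com:1521:CMDB',   // hypothetical connection string
        'cmdb_user', 'secret',
        'oracle.jdbc.OracleDriver')

sql.eachRow('SELECT app_name, build_tool, java_version, deploy_target FROM applications') { row ->
    println "${row.app_name}: ${row.build_tool}, JDK ${row.java_version}, deploys to ${row.deploy_target}"
}

sql.close()
```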
Access MySQL Database with Groovy
In case you are using MySQL and want to store some statistics about the CI jobs you create, you can use the following snippet.
https://gist.github.com/marcelbirkner/a599f965b48a3af619f6
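Here is a sketch of what that could look like; the database, table and credentials are hypothetical:

```groovy
import groovy.sql.Sql

// Sketch: store simple statistics about generated CI jobs in MySQL.
// Requires the MySQL JDBC driver on the classpath; schema and credentials are placeholders.
def sql = Sql.newInstance(
        'jdbc:mysql://dbhost.example.com:3306/ci_stats',   // hypothetical database
        'ci_user', 'secret',
        'com.mysql.jdbc.Driver')

sql.execute(
        'INSERT INTO job_statistics (job_name, created_at) VALUES (?, NOW())',
        ['example-project-build'])

sql.close()
```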
Planning phase
Now that we had a good idea of what is possible, we started planning our next steps. Figure 3 shows the final process we implemented. The Jenkins Seed Job runs every night, updates all existing jobs and creates new jobs if required. You can run the seed job more often, since it does not interfere with running jobs. For us, running it once a night was all we needed.
Step 1: First the Groovy Seed Job Script gets a list of all projects by accessing the GitLab REST API.
Step 2: After that we iterate over all projects.
Step 3: For each project we check whether the configuration management database contains matching details.
– if details are found, we continue with step 4
– else we skip the project (e.g. it might not be a Java project)
Step 4: Depending on the project details we create Ant/Maven build Jobs, Tomcat/WebSphere deploy jobs and other jobs that might be required.
Step 5: After all jobs have been created, we create a set of views that make managing the jobs easier. For projects that have a build, deploy and acceptance test job, we additionally create build pipeline views. A rough sketch of the whole flow is shown below.
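To illustrate how these steps fit together in a single seed script, here is a rough skeleton. The GitLab URL, token, database query and naming conventions are placeholders and do not reflect the customer's actual setup:

```groovy
import groovy.json.JsonSlurper
import groovy.sql.Sql

// Step 1: get all projects from the GitLab REST API (host and token are placeholders)
def token    = 'abc123'
def projects = new JsonSlurper().parseText(
        new URL("https://git.example.com/api/v3/projects?private_token=${token}&per_page=100").text)

def cmdb = Sql.newInstance('jdbc:oracle:thin:@dbhost.example.com:1521:CMDB',
        'cmdb_user', 'secret', 'oracle.jdbc.OracleDriver')

// Step 2: iterate over all projects
projects.each { project ->
    // Step 3: look up matching details in the configuration management database (hypothetical schema)
    def details = cmdb.firstRow('SELECT build_tool FROM applications WHERE app_name = ?', [project.name])
    if (details == null) {
        return   // skip, e.g. not a Java project
    }

    // Step 4: create the matching jobs, depending on the project details
    if (details.build_tool == 'maven') {
        job("${project.name}-build") {
            // SCM, triggers and Maven steps as in the getting-started example
        }
    }
    // ... Ant build jobs, Tomcat/WebSphere deploy jobs, acceptance test jobs, etc.
}
cmdb.close()

// Step 5: views that make managing the generated jobs easier
listView('generated-build-jobs') {
    jobs {
        regex(/.*-build/)
    }
    columns {
        status()
        name()
        lastSuccess()
        lastFailure()
    }
}
```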
Status Quo
Here are a couple of numbers from my current project:
- > 140 Java projects
- > 700 automatically generated build, test, deploy, release jobs
- about 1000 builds / day
- everything runs on Xen virtual CentOS servers
The seed job only needs 2-3 minutes to create or update all jobs. Our previous solution using the Jenkins CLI took between 15 and 30 minutes for the same number of projects. At the bottom of this article you will find the full CI Seed Script.
Docker Example
If you want to play around with the Job DSL yourself, you can easily get started using the following GitHub repository: https://github.com/marcelbirkner/docker-jenkins-job-dsl
https://gist.github.com/marcelbirkner/100d01a4187ee45c3443
The repository contains all required Jenkins plugins as well as a couple of Job DSL seed jobs that create several build and test jobs and custom views.
Update: Getting started using the image from Docker Hub
You can get the latest version of this demo from Docker Hub: https://hub.docker.com/r/mbirkner/docker-jenkins-job-dsl/
https://gist.github.com/marcelbirkner/5e8303f30547de12027e
Conclusion
Using the Job DSL reduced our workload more than we expected. The overall complexity decreased since we no longer have to keep various shell scripts, Groovy scripts and Jenkins job templates in sync. There is a single Groovy seed script that contains everything. Additional information is taken from existing systems like the configuration management database and GitLab. All jobs are up-to-date in the morning. The test environment is identical to production. Jobs of deprecated projects are automatically cleaned up. Developers can migrate their projects whenever they find time. Teams do not have to rely on the CI infrastructure team to get their projects up and running.
If you have a zoo of different programming languages and each project is handled differently, this plugin might not be of much use to you. For us the Job DSL plugin was a great win, since our projects follow a set of conventions that help us automate the software development and release process as much as possible.
References
- Jenkins Job DSL – https://wiki.jenkins-ci.org/display/JENKINS/Job+DSL+Plugin
- Job DSL API Viewer – https://jenkinsci.github.io/job-dsl-plugin
- Jenkins CLI – https://wiki.jenkins-ci.org/display/JENKINS/Jenkins+CLI
- GitLab REST API – https://github.com/gitlabhq/gitlabhq/tree/master/doc/api
- Dennis Schulte – https://blog.codecentric.de/en/2015/01/continuous-delivery-microservices-jenkins-job-dsl-plugin/
- Daniel Reuter – https://blog.codecentric.de/en/2015/04/generated-jenkins-jobs-and-automatic-branch-merging-for-feature-branches/
- Configuration Slicing Plugin – https://wiki.jenkins-ci.org/display/JENKINS/Configuration+Slicing+Plugin
- Scriptler Plugin – https://wiki.jenkins-ci.org/display/JENKINS/Scriptler+Plugin
Full CI Seed Script example
https://gist.github.com/marcelbirkner/9bc906b24348f31e03b2