Continuous Delivery for Maven project using Jenkins Pipeline and AWS EC2


The main advantage of setting up continuous delivery is the ability to deploy faster and to make every commit a release candidate, ready to be deployed to production.
If you are still relying on manual or scripted builds and tests to prepare your packages, then you are reading the right post. Here we present the steps you need to build a true and efficient pipeline.
Implementing CD can be challenging, especially in a large company, and it is hard to get people out of their comfort zone (the existing development and release environment). But once you see all the benefits (reliable releases, improved productivity, and more), it is worth it and will certainly pay off in other ways.
There are many tools that can help you reach this goal; for this example I have selected the ones you will hear about most often, and which in my opinion are the best:
  • Jenkins: Continuous Integration Server
  • Maven: Build Tool
  • Git: Version Control System
  • SonarQube: Code Analyzer
  • Artifactory: Archive Repository
  • Ansible: Configuration Management Tool


As you can see, for the purpose of this course I have set up Ansible, Jenkins, SonarQube and Artifactory locally, then created a GitHub repository and pushed a simple Spring Boot web application.
Now that everything is set up, let's get started…

General Configuration

The first thing to do is to configure Jenkins with both SonarQube and Artifactory.
Go to “Manage Jenkins > Manage Plugins” and install the “Artifactory Plugin” and “SonarQube Scanner for Jenkins”.
Once done, go to “Manage Jenkins > Configure System” to configure the SonarQube and Artifactory URLs and access credentials (see screenshots below).
[Screenshot: SonarQube server configuration in Jenkins]
Note that starting from SonarQube version 5.3 it is no longer possible to set a login and password; you need to specify an authentication token instead (you can generate one from your profile page in the SonarQube instance).
[Screenshot: Artifactory server configuration in Jenkins]

Pipeline stages

The next step is to create our famous Jenkins job which, whenever a developer pushes new changes, will:
  1. Check out the source code from the GitHub repository
  2. Create the package using the Maven build tool and deploy it to Artifactory
  3. Run the SonarQube analysis
  4. Initialize an EC2 instance from a public AMI
  5. Deploy the package and start the service
At the end, the user will have access to the SonarQube and Artifactory dashboards as well as the URL of the deployed service.
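At a high level, the steps above map to a declarative Jenkinsfile like the following. This is only a minimal sketch: the repository URL and stage names are placeholders, and the stage bodies are stubs that the next sections fill in.

```groovy
pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps {
                // Triggered by new pushes (e.g. SCM polling or a webhook)
                git url: 'https://github.com/YOUR_USER/stack-tech.git'
            }
        }
        stage('Build & Deploy to Artifactory') {
            steps { echo 'mvn clean install + publish build info to Artifactory' }
        }
        stage('SonarQube Analysis') {
            steps { echo 'mvn sonar:sonar inside withSonarQubeEnv' }
        }
        stage('Provision EC2 & Deploy') {
            steps { echo 'ansible-playbook playbook.yml' }
        }
    }
}
```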
To do so, go to Jenkins and create a new “Pipeline” job.
[Screenshot: creating a new “Pipeline” job]
In the configuration page, go to the “Pipeline” section, choose “Pipeline script from SCM” and paste the stack-tech GitHub repository URL.

[Screenshot: “Pipeline script from SCM” configuration]

Click on “Save”.

But before we run the job, let's go through the Jenkinsfile, since that is where all the magic happens:

[Screenshot: the Jenkinsfile]

In the “Initialize” stage we set the M2_HOME environment variable, which will be used during the build, and the JDK version used for compilation.
“Maven 3.3.9” and “jdk8” have already been set up in the Jenkins Global Tool Configuration.
[Screenshot: Jenkins Global Tool Configuration]
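A minimal sketch of such an “Initialize” stage, assuming the tool names “Maven 3.3.9” and “jdk8” configured above (the actual Jenkinsfile in the repository may differ slightly):

```groovy
stage('Initialize') {
    steps {
        script {
            // Resolve the installations declared in Global Tool Configuration
            env.M2_HOME = tool 'Maven 3.3.9'
            env.JAVA_HOME = tool 'jdk8'
            env.PATH = "${env.M2_HOME}/bin:${env.JAVA_HOME}/bin:${env.PATH}"
        }
    }
}
```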
Since the Artifactory plugin does not provide declarative pipeline syntax, we used its scripted syntax instead.
We started by creating an Artifactory server object, then ran a Maven install command to create the package and published the result to Artifactory.
Note that even if you have set a repository section in your pom.xml, Maven will not take it into account; it will use “libs-release” and “libs-snapshot” only to download the dependencies required for the build.
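In scripted form, the Artifactory plugin's pipeline DSL looks roughly like this. The server ID 'artifactory' and the repository names are assumptions based on common Artifactory defaults; adjust them to match your “Configure System” settings.

```groovy
script {
    // 'artifactory' is the server ID configured in "Configure System"
    def server = Artifactory.server 'artifactory'
    def rtMaven = Artifactory.newMavenBuild()
    rtMaven.tool = 'Maven 3.3.9'
    // Where built artifacts get published...
    rtMaven.deployer releaseRepo: 'libs-release-local', snapshotRepo: 'libs-snapshot-local', server: server
    // ...and where build dependencies are resolved from
    rtMaven.resolver releaseRepo: 'libs-release', snapshotRepo: 'libs-snapshot', server: server
    // Run the build and publish the captured build info to Artifactory
    def buildInfo = rtMaven.run pom: 'pom.xml', goals: 'clean install'
    server.publishBuildInfo buildInfo
}
```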
Luckily for us, the SonarQube plugin provides pipeline functionality to ease this stage.
[Screenshot: the SonarQube analysis stage]
All we had to do was use “withSonarQubeEnv” to inject the SonarQube server environment into the shell where the Maven command runs.
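A sketch of that stage, assuming the SonarQube installation was named “SonarQube” in “Configure System”:

```groovy
stage('SonarQube Analysis') {
    steps {
        // The wrapper injects the server URL and auth token configured
        // under the installation named 'SonarQube'
        withSonarQubeEnv('SonarQube') {
            sh "${env.M2_HOME}/bin/mvn sonar:sonar"
        }
    }
}
```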
Then comes the interesting part: spinning up a new EC2 instance and deploying the jar inside it.
We will use Ansible for that. Ansible is a configuration management tool that helps provision and configure new environments from code, which makes rollbacks and updates easy.
Prerequisites:
  • The AWS modules require “boto” to be installed
  • The Ansible and AnsiColor Jenkins plugins

[Screenshot: the Ansible deployment stage]

In this stage we simply invoke the ansible-playbook command to run the playbook.yml script. AnsiColor is used only to add some color to the console output.
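Using the Ansible and AnsiColor plugins, the stage could be sketched as follows (the stage name is a placeholder):

```groovy
stage('Provision EC2 & Deploy') {
    steps {
        // AnsiColor only colorizes the console output
        ansiColor('xterm') {
            ansiblePlaybook colorized: true, playbook: 'playbook.yml'
        }
    }
}
```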

Now let's take a brief look at the playbook.yml.
The script is separated into two plays: “Create an instance” and “Configure instance”.
  • In the first one we use the ec2 module to create a new EC2 instance, add it to a group and wait for it to be up and running.
  • The configuration play then upgrades OpenJDK to version 8, copies the stack-tech.jar file and executes the script that starts the service.
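The two plays could be sketched like this. The AMI ID, key name, security group and file paths are placeholders, and the exact task list in the repository's playbook.yml may differ:

```yaml
- name: Create an instance
  hosts: localhost
  connection: local
  gather_facts: no
  tasks:
    - name: Launch a new EC2 instance from a public AMI
      ec2:
        key_name: my-key
        instance_type: t2.micro
        image: ami-xxxxxxxx
        group: my-security-group
        region: us-east-1
        wait: yes
      register: ec2
    - name: Add the new instance to an in-memory group
      add_host:
        hostname: "{{ item.public_ip }}"
        groupname: launched
      with_items: "{{ ec2.instances }}"
    - name: Wait for SSH to come up
      wait_for:
        host: "{{ item.public_ip }}"
        port: 22
        delay: 30
      with_items: "{{ ec2.instances }}"

- name: Configure instance
  hosts: launched
  become: yes
  remote_user: ec2-user
  tasks:
    - name: Install OpenJDK 8
      yum:
        name: java-1.8.0-openjdk
        state: present
    - name: Copy the application jar
      copy:
        src: target/stack-tech.jar
        dest: /home/ec2-user/stack-tech.jar
    - name: Start the service
      shell: nohup java -jar /home/ec2-user/stack-tech.jar > app.log 2>&1 &
```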
Please note that in order to request an EC2 instance you need to:
  • Get the Access Key and Secret Access Key of your AWS account.
    I have set them as environment variables inside Jenkins and added a third property which skips SSH host key checking on the first login.
    [Screenshot: AWS credentials set as Jenkins environment variables]
  • Prepare values for the required properties:
    • key_name: Download the .pem file for the region where you are launching your instance (us-east-1 in my case), then run: ssh-add path/to/pem/file. This saves you from adding “-i YOUR_PEM_FILE” every time you ssh, and at the Jenkins stage you won't need to add the .pem file to your Git repo and declare it in the playbook.yml via the ansible_ssh_private_key_file property.
    • security_group: For the sake of the demo I created a security group which opens ports 22, 80 and 8080, but in production you should be more careful with security groups.
      [Screenshot: security group inbound rules]
    • instance_type: I chose the t2.micro instance type since it costs almost nothing.
Now let's run the Jenkins job: go to Jenkins, open the job dashboard and click “Build Now”.
[Screenshot: running the Jenkins job]
At the end of the build, as expected, the package has been deployed and started successfully.
You can check by running curl against “http://XX.XX.XX.XX:8080/api”; you should get the following result:
Hello From the other side!


My idea behind this blog is to focus on the best practices and tools used to automate deployment, letting developers focus on writing readable and maintainable code. More posts will follow to improve this project, each focusing on a new feature.
I know there were a lot of requirements to get a successful build; I will later update the prerequisites step and make it easier by using Docker to deploy the SonarQube, Artifactory and Jenkins tools all at once.
