Build a Jenkins Pipeline to Upload Files to AWS S3

Noah Hsu
8 min read · Jun 20, 2021


In this post, I'll share my practice of doing the following job with a Jenkins pipeline:

  1. Check out code from a Git repository (from a specific branch)
  2. Build the project with Maven
  3. Build images with Docker
  4. Zip the images
  5. Upload the zipped images to an AWS S3 bucket
  6. Tag and push the tag to the Git repository

None of these steps is hard or complex, but it is annoying to do them manually every time I want to deliver an image to a client's environment, which puts a lot of limits on us (email size, network connection, and so on). We can only ask the IT staff to download the zipped image from the AWS S3 bucket onto a VM inside the client environment. So our work is to put the zipped image files in the AWS S3 bucket and send the IT staff the download URLs. So, let's start!

Component Versions

Here are the versions of Jenkins and the plugins we need.

  • Jenkins: 2.277.1
  • Pipeline: AWS Steps: 1.43
  • Docker Pipeline: 1.23
  • Git: 4.7.0
  • Office 365 Connector:
  • Pipeline: 2.6
  • Workspace Cleanup Plugin: 0.39

Credentials

Before we start writing our pipeline script, we should create the credentials we need rather than writing the account/password in the script. That gives us some benefits such as reuse, central management, and so on. We can refer to this article https://www.cyberark.com/resources/threat-research-blog/configuring-and-securing-credentials-in-jenkins to learn more about why we should use credentials.

Here we build the credentials for Git and AWS.

First, in the bottom part of the System Management/Credentials page, click the down triangle beside "Jenkins System" and click "Add credentials".

Credential for Git
Credential for AWS

Input the information and click "OK". Then we'll get the ID of each credential on the "Credentials" page.
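If you want to double-check that an ID works before writing the full pipeline, you can bind it in a throwaway pipeline first. The following is only a sketch; GIT_CREDENTIAL_ID is a placeholder for whatever ID your Credentials page shows.

pipeline {
    agent any
    stages {
        stage('check credential') {
            steps {
                // Bind the Git username/password credential; Jenkins masks the secret in the console log.
                withCredentials([usernamePassword(credentialsId: 'GIT_CREDENTIAL_ID',
                                                  usernameVariable: 'GIT_USERNAME',
                                                  passwordVariable: 'GIT_PASSWORD')]) {
                    sh 'echo "Git credential bound for user: $GIT_USERNAME"'
                }
            }
        }
    }
}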

Build The Pipeline Job

  1. Create a new job (Pipeline).
  2. Set up a parameterized build.
    Here, we added two parameters to this job (an equivalent declarative parameters block is sketched below):
    1. BRANCH_NAME: the branch we'll check out and build
    2. TAG_NAME: the tag we'll push to the target branch (at the HEAD of BRANCH_NAME) after the job finishes successfully
  3. Select the pipeline script source (we chose to write the script in the web editor, but you can also use a Jenkinsfile from SCM in another repository).
Step 2: build parameter for BRANCH_NAME
Step 3: pipeline script source selection
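As an alternative to configuring the parameters in the job UI, the same two parameters can be declared directly in a declarative Jenkinsfile with a parameters block placed at the top level of the pipeline block. This is only an equivalent sketch; the default value for BRANCH_NAME here is made up.

parameters {
    // Same two parameters as configured in the job above
    string(name: 'BRANCH_NAME', defaultValue: 'develop', description: 'The branch to check out and build')
    string(name: 'TAG_NAME', defaultValue: '', description: 'The tag to push to the HEAD of BRANCH_NAME after a successful run')
}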

Writing Pipeline Script

In this part, we'll walk through the whole script and explain its usage.

1. Definition

First, we define the constants we need, including the credential IDs we got in the previous step. Then we set up the Maven version we want to use for building the project (which consists of several Spring Boot applications).

pipeline {
    agent any
    environment {
        git_credential = "GIT_CREDENTIAL_ID"
        aws_credential = "AWS_CREDENTIAL_ID"
        repo_url = "MyRepositoryUrl"
        base_imagename = "base-image"
        api_imagename = "my-api"
        auth_imagename = "my-auth"
        bucket = "MyBucketName"
        region = "MyAwsRegion"
        webHook_url = "myWebHookURL"
        api_res_url = "https://${bucket}.s3.${region}.amazonaws.com/${TAG_NAME}/${api_imagename}-${TAG_NAME}.tar.gz"
        auth_res_url = "https://${bucket}.s3.${region}.amazonaws.com/${TAG_NAME}/${auth_imagename}-${TAG_NAME}.tar.gz"
        notify_text = "image upload to s3 <br>${api_imagename}: <${api_res_url}><br> ${auth_imagename}: <${auth_res_url}><br>tag by ${TAG_NAME}"
    }

    tools {
        // Install the Maven version and add it to the path.
        maven "maven-3.3.9"
    }

2. Git Checkout

We use the "git" plugin, and there are many arguments we can set. Here we set the following three:

  1. url: the target repository we clone from
  2. credentialsId: the credential ID (containing the account/password) we use to access the repository
  3. branch: the target branch we'll check out

After we clone and check out the target repository and branch, we use the shell to obtain the current commit ID (the plugin doesn't provide this function, so we can only use the git command via the shell).

stage('checkout') {
    steps {
        script {
            git branch: "${BRANCH_NAME}",
                credentialsId: "${git_credential}",
                url: "http://${repo_url}"
            // the git plugin has no step for this, so we shell out
            commitId = sh(script: 'git rev-parse --short HEAD', returnStdout: true).trim()
        }
    }
}

3. Maven build

There is nothing special to explain in this snippet; it's just a normal Maven build command.

stage("Maven build"){
steps {
// Run Maven on a Unix agent.
sh "mvn clean install ."
}
}

4. Image build

In this stage, we move into each project's folder because there are three images to build in total:

  1. Build a base-image from openjdk:8-jdk-alpine, with the time zone set to Asia/Taipei, to be our base image.
  2. Build an api-image from the base-image.
  3. Build an auth-image from the base-image.
  • cd xxx_folder will NOT work in a Jenkins pipeline script; we should use dir("xxx_folder") instead.
  • docker.build is a Docker Pipeline plugin command; it provides many functions, please check its documentation.
stage("Image build"){
steps {
script {
sh "docker images"
dir("${base_imagename}"){
sh "ls"
docker.build "${base_imagename}"
}
dir("${api_imagename}"){
sh "ls"
docker.build "${api_imagename}"
}
dir("${auth_imagename}"){
sh "ls"
docker.build "${auth_imagename}"
}
sh "docker images"
}
}
}

By adding sh "docker images", we can see all the images before and after docker.build.
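For reference, docker.build also accepts a second argument that is appended to the docker build command line, so we could tag the image with the release tag and pass extra flags in one call. This is just a sketch of that variant, not part of the pipeline above:

dir("${api_imagename}") {
    // First argument: image name (with an optional tag);
    // second argument: extra "docker build" arguments plus the build context.
    def apiImage = docker.build("${api_imagename}:${TAG_NAME}", "--pull .")
}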

5. Zip images

There is nothing special to explain in this snippet. We just make a folder and save the images we built in the previous stage (skipping the base image, because we don't need to upload it). Note that docker save produces a plain tar archive, so we pipe it through gzip to match the .tar.gz extension.

stage("Zip"){
steps{
sh "mkdir ${TAG_NAME}"
dir("${TAG_NAME}"){
sh "docker save ${api_imagename}:latest > ${api_imagename}-${TAG_NAME}.tar.gz"
sh "docker save ${auth_imagename}:latest > ${auth_imagename}-${TAG_NAME}.tar.gz"
}
}
}
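For completeness, on the client VM, after the IT staff download the archive from the S3 URL, the image can be restored with docker load (it accepts gzipped archives directly). The file name below is just an example following the naming convention above, assuming TAG_NAME was 1.0.0:

# On the client VM, after downloading the archive from S3
docker load < my-api-1.0.0.tar.gz   # restores the my-api:latest image
docker images | grep my-api         # confirm the image is available locally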

6. Upload To AWS S3

We can refer to the documentation of Pipeline: AWS Steps (https://github.com/jenkinsci/pipeline-aws-plugin). The primary functions are explained there, while some arguments of the methods are not, so here are my practice and experience.

In the job steps:

  1. withAWS: roughly equivalent to logging in. The arguments below are used to authenticate.
    1. region: the AWS region
    2. credentials: the credential we created earlier
  2. s3Upload: uploads a file/folder to AWS S3.
    1. file: the file/folder name (in the Jenkins workspace) you want to upload
    2. bucket: the bucket name in AWS S3 you want to upload to
    3. path: the folder path in the bucket you want to upload to (if it doesn't exist, a new folder will be created)

In the post step (which runs after the steps are done, no matter whether they succeed or fail):

  1. If successful: send a notification through the Teams webhook, containing the commit ID and the S3 resource URLs.
  2. If failed: send a notification through the Teams webhook with a link to the build.

The webhook plugin's office365ConnectorSend also provides some arguments. Here are the ones I used:

  1. message: the main message (content), which is compatible with HTML tags.
  2. status: the status string.
  3. webhookUrl: use Teams to set up a webhook for Jenkins, and you'll get a URL; copy and paste it here.
stage("Upload"){
steps{
withAWS(region:"${region}", credentials:"${aws_credential}){
s3Upload(file:"${TAG_NAME}", bucket:"${bucket}", path:"${TAG_NAME}/")
}
}
post {
success{
office365ConnectorSend message: "${notify_text}<br>commit id: ${commitId}", status:"Success Upload", webhookUrl:"${webHook_url}"
}
failure{
office365ConnectorSend message: "Fail build,<br> see (<${env.BUILD_URL}>)", status:"Fail Upload", webhookUrl:"${webHook_url}"
}
}
}
Success notification in Teams (I hid the important info)
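One thing to keep in mind from my experience: the download URLs in notify_text only work if the uploaded objects are actually readable by the IT staff. If your bucket policy doesn't already grant that, s3Upload accepts an optional acl argument; a hedged sketch, assuming public read access is acceptable for this bucket:

withAWS(region: "${region}", credentials: "${aws_credential}") {
    // acl is optional; 'PublicRead' makes the objects downloadable via the plain HTTPS URL
    s3Upload(file: "${TAG_NAME}", bucket: "${bucket}", path: "${TAG_NAME}/", acl: 'PublicRead')
}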

7. Push Tag

After uploading the images to AWS S3, we want to tag the target branch so that we can remember which folder in the S3 bucket was built from which commit of which branch.

The Git plugin doesn't provide a tag function, so we do it through shell commands. In the first stage, we used the Git plugin to clone and check out with a credential, but that credential only exists inside the Git plugin, not in the shell. If we use git push directly, this error occurs:

(gnome-ssh-askpass:9353): Gtk-WARNING **: cannot open display:
error: unable to read askpass response from ‘/usr/libexec/openssh/gnome-ssh-askpass’
fatal: could not read Username for ‘<myip>’: No such device or address

So we wrap git tag and git push in withCredentials and bind the username/password to env.GIT_USERNAME/env.GIT_PASSWORD.

stage('Push Tag') {
    steps {
        script {
            datetime = new Date().format("yyyy-MM-dd HH:mm:ss")
            withCredentials([[$class: 'UsernamePasswordMultiBinding', credentialsId: "${git_credential}", usernameVariable: 'GIT_USERNAME', passwordVariable: 'GIT_PASSWORD']]) {
                sh("git tag -a ${TAG_NAME} -m '${datetime}'")
                sh("git push http://${env.GIT_USERNAME}:${env.GIT_PASSWORD}@${repo_url} --tags")
            }
        }
    }
}

8. CleanUp

Finally, after all the stages in the job, we declare a post section to clean up the workspace and Docker.

There is still an open issue where cleanWs() does not delete the @tmp/@script/@… directories, so we add a workaround suggested by Viktor Be on Stack Overflow to clear them (https://stackoverflow.com/questions/58588794/what-are-the-tmp-folders-in-a-jenkins-workspace-and-how-to-clean-them-up).

post {
    always {
        cleanWs()
        dir("${env.WORKSPACE}@tmp") {
            deleteDir()
        }
        dir("${env.WORKSPACE}@script") {
            deleteDir()
        }
        dir("${env.WORKSPACE}@script@tmp") {
            deleteDir()
        }
        sh "docker rmi ${base_imagename}"
        sh "docker rmi ${api_imagename}"
        sh "docker rmi ${auth_imagename}"
    }
}

That wraps up the explanation of my Jenkins practice using Git, Maven, Docker, AWS S3, and Teams notifications.

The Stage view of the pipeline job

Here is the final script:

pipeline {
    agent any
    environment {
        git_credential = "GIT_CREDENTIAL_ID"
        aws_credential = "AWS_CREDENTIAL_ID"
        repo_url = "MyRepositoryUrl"
        base_imagename = "base-image"
        api_imagename = "my-api"
        auth_imagename = "my-auth"
        bucket = "MyBucketName"
        region = "MyAwsRegion"
        webHook_url = "myWebHookURL"
        api_res_url = "https://${bucket}.s3.${region}.amazonaws.com/${TAG_NAME}/${api_imagename}-${TAG_NAME}.tar.gz"
        auth_res_url = "https://${bucket}.s3.${region}.amazonaws.com/${TAG_NAME}/${auth_imagename}-${TAG_NAME}.tar.gz"
        notify_text = "image upload to s3 <br>${api_imagename}: <${api_res_url}><br> ${auth_imagename}: <${auth_res_url}><br>tag by ${TAG_NAME}"
    }

    tools {
        // Install the Maven version and add it to the path.
        maven "maven-3.3.9"
    }
    stages {
        stage('checkout') {
            steps {
                script {
                    git branch: "${BRANCH_NAME}",
                        credentialsId: "${git_credential}",
                        url: "http://${repo_url}"
                    commitId = sh(script: 'git rev-parse --short HEAD', returnStdout: true).trim()
                }
            }
        }
        stage("Maven build") {
            steps {
                // Run Maven on a Unix agent.
                sh "mvn clean install -DskipTests=true -Dmaven.test.failure.ignore=true"
            }
        }
        stage("Image build") {
            steps {
                script {
                    sh "docker images"
                    dir("${base_imagename}") {
                        docker.build "${base_imagename}"
                    }
                    dir("${api_imagename}") {
                        docker.build "${api_imagename}"
                    }
                    dir("${auth_imagename}") {
                        docker.build "${auth_imagename}"
                    }
                    sh "docker images"
                }
            }
        }
        stage("Zip") {
            steps {
                sh "mkdir ${TAG_NAME}"
                dir("${TAG_NAME}") {
                    sh "docker save ${api_imagename}:latest | gzip > ${api_imagename}-${TAG_NAME}.tar.gz"
                    sh "docker save ${auth_imagename}:latest | gzip > ${auth_imagename}-${TAG_NAME}.tar.gz"
                }
            }
        }
        stage("Upload") {
            steps {
                withAWS(region: "${region}", credentials: "${aws_credential}") {
                    s3Upload(file: "${TAG_NAME}", bucket: "${bucket}", path: "${TAG_NAME}/")
                }
            }
            post {
                success {
                    office365ConnectorSend message: "${notify_text}<br>commit id: ${commitId}", status: "Success Upload", webhookUrl: "${webHook_url}"
                    sh "ls"
                }
                failure {
                    office365ConnectorSend message: "Fail build,<br> see (<${env.BUILD_URL}>)", status: "Fail Upload", webhookUrl: "${webHook_url}"
                }
            }
        }
        stage('Push Tag') {
            steps {
                script {
                    datetime = new Date().format("yyyy-MM-dd HH:mm:ss")
                    withCredentials([[$class: 'UsernamePasswordMultiBinding', credentialsId: "${git_credential}", usernameVariable: 'GIT_USERNAME', passwordVariable: 'GIT_PASSWORD']]) {
                        sh("git tag -a ${TAG_NAME} -m '${datetime}'")
                        sh("git push http://${env.GIT_USERNAME}:${env.GIT_PASSWORD}@${repo_url} --tags")
                    }
                }
            }
        }
    }
    post {
        always {
            cleanWs()
            dir("${env.WORKSPACE}@tmp") {
                deleteDir()
            }
            dir("${env.WORKSPACE}@script") {
                deleteDir()
            }
            dir("${env.WORKSPACE}@script@tmp") {
                deleteDir()
            }
            sh "docker rmi ${base_imagename}"
            sh "docker rmi ${api_imagename}"
            sh "docker rmi ${auth_imagename}"
        }
    }
}

If you enjoyed this article, please follow me here on Medium for more stories about CI/CD process. Thanks for reading.

Written by Noah Hsu

Java ServerSide Engr🚀, Focusing on Spring, Toggle system, Kafka, Event Sourcing, and CI/CD. Support my work with a 🍺. https://www.buymeacoffee.com/swbhcjhtyvv
