Jenkins Pipeline: Examples, Best Practices & Use Cases

Master Jenkins Pipeline with practical examples, proven best practices, and insightful use cases to enhance your CI/CD automation and development workflow.


Jenkins Pipeline is a powerful tool for automating the stages of your CI/CD workflow, from code integration to deployment. Its flexibility and scalability have made it a staple for DevOps teams looking to streamline their processes and ensure faster, more reliable software delivery.

This article explores practical examples, best practices, and use cases to demonstrate how Jenkins Pipeline can optimize development and automation processes.

What is a Jenkins Pipeline?

Jenkins Pipeline is a suite of plugins in Jenkins that allows you to define and automate your software delivery process as code. It defines a series of automated steps that need to be executed to build, test, and deploy an application.

In essence, a Jenkins Pipeline acts as a roadmap for the software development lifecycle, helping development teams automate repetitive tasks and focus on code quality and innovation. It provides a unified interface for defining and controlling the flow of the CI/CD process, often integrating with other tools and systems for testing, deployment, and monitoring.

Two Primary Styles of Jenkins Pipeline

Jenkins Pipeline provides two basic approaches for defining automation workflows:

  • Declarative
  • Scripted

Each style offers a different way to create pipelines, letting users select the method that best fits their requirements and level of experience in defining continuous integration and continuous delivery processes.

Declarative

Declarative Pipeline offers a streamlined and structured syntax. It employs preset sections and directives, making it simpler for newcomers to understand and build pipeline code.

Declarative syntax has a stricter structure, akin to a configuration file, with well-defined blocks like ‘pipeline’, ‘stages’, and ‘steps’. This approach is intended to be more legible and manageable, with a reduced reliance on complicated Groovy programming concepts.
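For illustration, here is a minimal Declarative Pipeline skeleton showing those three blocks (the echo steps are placeholders to be replaced with real build and test commands):

```groovy
pipeline {
    agent any // Run on any available Jenkins agent

    stages {
        stage('Build') {
            steps {
                echo 'Building...' // Replace with real build commands, e.g. sh 'mvn clean package'
            }
        }
        stage('Test') {
            steps {
                echo 'Testing...' // Replace with real test commands
            }
        }
    }
}
```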

Scripted

Scripted Pipeline is written entirely in Groovy, allowing users to use programming constructs like loops, conditionals, and custom logic. Unlike Declarative Pipeline’s structured syntax, Scripted Pipeline gives full control over the pipeline’s flow and behavior. This approach is well-suited for advanced users who need to define highly customized CI/CD workflows.
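As a sketch, a Scripted Pipeline wraps stages in a node block and can use ordinary Groovy constructs such as loops and conditionals (the test and deploy scripts below are hypothetical placeholders):

```groovy
node { // Scripted Pipeline runs inside a node block
    stage('Checkout') {
        checkout scm // Check out the branch that triggered the build
    }
    // Plain Groovy: loop over a list of test suites
    def suites = ['unit', 'integration']
    for (suite in suites) {
        stage("Test: ${suite}") {
            sh "./run_tests.sh ${suite}" // Hypothetical test script
        }
    }
    // Conditional logic with an ordinary if statement
    if (env.BRANCH_NAME == 'main') {
        stage('Deploy') {
            sh './deploy.sh' // Hypothetical deploy script
        }
    }
}
```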

Examples of Jenkins Pipelines

The Jenkins Pipeline examples below demonstrate how Jenkins Pipeline improves collaboration, streamlines workflows, and accelerates product delivery:

1. CI/CD Pipeline for a Java Application

A Jenkins pipeline for a Java application usually includes stages for code checkout, Maven build, unit test execution, and SonarQube code quality analysis. It may also include steps for packaging the program as a JAR or WAR file, publishing artifacts to a repository such as Nexus, and deploying to a test or production environment. This pipeline enforces code quality through automated testing and improves deployment efficiency.

For instance, this Jenkins pipeline automates CI/CD for a Java application using Maven. It pulls code, builds, runs tests, performs code quality checks, uploads artifacts to Nexus, and deploys to a test server.

pipeline {
    agent any

    environment {
        MAVEN_HOME = '/usr/share/maven' // Path to Maven installation
        SONARQUBE_SERVER = 'SonarQube'  // SonarQube server configured in Jenkins
        NEXUS_REPO = 'http://nexus.example.com/repository/maven-releases/' // Nexus repository URL
    }

    stages {
        stage('Checkout Code') { // Clones the repository
            steps {
                git 'https://github.com/your-repo/java-app.git'
            }
        }

        stage('Build with Maven') { // Builds the project and creates JAR/WAR
            steps {
                sh 'mvn clean package'
            }
        }

        stage('Run Unit Tests') { // Executes unit tests
            steps {
                sh 'mvn test'
            }
        }

        stage('Code Analysis with SonarQube') { // Runs static code analysis
            steps {
                withSonarQubeEnv(SONARQUBE_SERVER) {
                    sh 'mvn sonar:sonar'
                }
            }
        }

        stage('Publish to Nexus') { // Uploads artifact to Nexus repository
            steps {
                sh 'mvn deploy -DaltDeploymentRepository=nexus-releases::default::${NEXUS_REPO}'
            }
        }

        stage('Deploy to Test Server') { // Deploys JAR to a remote server
            steps {
                sh 'scp target/*.jar user@your-server:/opt/app/' // Copies JAR
                sh 'ssh user@your-server "systemctl restart java-app"' // Restarts app
            }
        }
    }

    post {
        success {
            echo 'Build and deployment successful!'
        }
        failure {
            echo 'Build failed!'
        }
    }
}

2. Dockerized Application Deployment

This pipeline focuses on containerizing an application with Docker. It contains stages for building the application, creating a Docker image, performing container security scans with tools like Trivy, and publishing the image to a container registry like Docker Hub or Amazon ECR.

The pipeline then deploys the containerized application to a target environment, such as a Kubernetes cluster or a cloud platform that supports Docker deployments.

For example, this pipeline automates CI/CD for a containerized application. It builds the app, creates a Docker image, scans for security issues, pushes the image to a registry, and deploys it to Kubernetes.

pipeline {
    agent any

    environment {
        IMAGE_NAME = 'your-docker-image'
        IMAGE_TAG = 'latest'
        DOCKER_REGISTRY = 'your-dockerhub-username'
        REGISTRY_URL = 'https://index.docker.io/v1/' // Change for ECR/GCR
    }

    stages {
        stage('Checkout Code') { // Clones repository
            steps {
                git 'https://github.com/your-repo/docker-app.git'
            }
        }

        stage('Build Application') { // Builds app (modify for your stack)
            steps {
                sh './build.sh' // Replace with actual build commands
            }
        }

        stage('Build Docker Image') { // Creates Docker image
            steps {
                sh "docker build -t ${DOCKER_REGISTRY}/${IMAGE_NAME}:${IMAGE_TAG} ."
            }
        }

        stage('Security Scan with Trivy') { // Scans image for vulnerabilities
            steps {
                sh "trivy image ${DOCKER_REGISTRY}/${IMAGE_NAME}:${IMAGE_TAG}"
            }
        }

        stage('Push Image to Registry') { // Publishes image to Docker Hub/ECR
            steps {
                withDockerRegistry([credentialsId: 'docker-credentials', url: REGISTRY_URL]) {
                    sh "docker push ${DOCKER_REGISTRY}/${IMAGE_NAME}:${IMAGE_TAG}"
                }
            }
        }

        stage('Deploy to Kubernetes') { // Deploys container to Kubernetes
            steps {
                sh 'kubectl apply -f k8s/deployment.yaml' // Assumes a valid YAML config
            }
        }
    }

    post {
        success {
            echo 'Deployment successful!'
        }
        failure {
            echo 'Build failed!'
        }
    }
}

3. Parallel Testing Pipeline

A parallel testing pipeline accelerates the testing process by executing numerous test suites concurrently. It often divides the test suite into smaller sections and runs them on multiple Jenkins agents or nodes.

This pipeline could comprise phases for setting up the test environment, dividing tests, running them in parallel, and aggregating results. It’s especially beneficial for large projects with complex test suites.

For example, this pipeline speeds up testing by running unit, integration, and UI tests in parallel on multiple Jenkins agents.

pipeline {
    agent any

    stages {
        stage('Checkout Code') { // Clones repository
            steps {
                git 'https://github.com/your-repo/test-app.git'
            }
        }

        stage('Setup Test Environment') { // Installs dependencies
            steps {
                sh 'pip install -r requirements.txt' // Example for Python; modify as needed
            }
        }

        stage('Run Tests in Parallel') { // Executes tests concurrently
            parallel {
                stage('Unit Tests') {
                    agent { label 'test-node-1' } // Runs on specific agent
                    steps {
                        sh 'pytest tests/unit' // Modify for your test framework
                    }
                }

                stage('Integration Tests') {
                    agent { label 'test-node-2' }
                    steps {
                        sh 'pytest tests/integration'
                    }
                }

                stage('UI Tests') {
                    agent { label 'test-node-3' }
                    steps {
                        sh 'pytest tests/ui'
                    }
                }
            }
        }

        stage('Aggregate Test Results') { // Collects and publishes test reports
            steps {
                junit '**/test-reports/*.xml' // Adjust pattern for your test reports
            }
        }
    }

    post {
        success {
            echo 'All tests passed successfully!'
        }
        failure {
            echo 'Some tests failed!'
        }
    }
}

As Jenkins pipelines evolve to manage increasingly complex workflows, such as parallel testing across multiple environments or handling environment-specific deployment scenarios, BrowserStack empowers you to scale your testing strategy effortlessly.

By integrating seamlessly into your CI/CD pipeline, BrowserStack enables you to automate cross-browser and cross-device testing at scale, ensuring your application works flawlessly across diverse platforms and configurations.

This integration accelerates the software development cycle, reduces the risk of browser-specific issues, and maintains high-quality delivery, all while freeing your team from the complexities of managing a physical test infrastructure.

4. Blue-Green Deployment Pipeline

This pipeline uses a blue-green deployment technique to reduce downtime and risk during updates. It entails establishing a duplicate environment (green) alongside the current production environment (blue).

The pipeline contains stages for building and testing the new version, deploying it to the green environment, running smoke tests, and finally switching traffic from blue to green. It also has a rollback option in case of errors.

For example, the Blue-Green deployment pipeline below automates the deployment of new application versions by creating two parallel environments (blue and green). It deploys the new version to the green environment, runs smoke tests, and switches traffic to green, ensuring minimal downtime. If issues arise, it rolls back to the blue environment.

pipeline {
    agent any

    environment {
        BLUE_ENV = 'blue'
        GREEN_ENV = 'green'
        DEPLOYMENT_SERVER = 'your-server-ip-or-url'
    }

    stages {
        stage('Checkout Code') { // Clones the repository
            steps {
                git 'https://github.com/your-repo/app.git'
            }
        }

        stage('Build & Test New Version') { // Builds and tests the new version
            steps {
                sh './build.sh' // Modify with your build commands
                sh './run_tests.sh' // Run unit/integration tests
            }
        }

        stage('Deploy to Green Environment') { // Deploys the new version to Green
            steps {
                sh "scp target/app.war user@${DEPLOYMENT_SERVER}:/opt/${GREEN_ENV}/app.war" // Copy new version to green environment
                sh "ssh user@${DEPLOYMENT_SERVER} 'systemctl restart app-green'" // Restart green environment
            }
        }

        stage('Smoke Test on Green Environment') { // Validates the green environment
            steps {
                sh 'curl -f http://${DEPLOYMENT_SERVER}/health' // Replace with appropriate smoke test
            }
        }

        stage('Switch Traffic to Green') { // Switches live traffic to the green environment
            steps {
                sh "ssh user@${DEPLOYMENT_SERVER} 'update-alternatives --set app /opt/${GREEN_ENV}/app.war'" // Update traffic routing to green
            }
        }

        stage('Clean up Blue Environment') { // Optionally clean up blue environment
            steps {
                sh "ssh user@${DEPLOYMENT_SERVER} 'systemctl stop app-blue'"
            }
        }
    }

    post {
        success {
            echo 'Blue-Green deployment completed successfully!'
        }
        failure {
            echo 'Deployment failed. Rolling back...'
            // Rollback to blue environment
            sh "ssh user@${DEPLOYMENT_SERVER} 'update-alternatives --set app /opt/${BLUE_ENV}/app.war'"
            sh "ssh user@${DEPLOYMENT_SERVER} 'systemctl restart app-blue'"
            echo 'Rolled back to Blue environment.'
        }
    }
}

5. Infrastructure as Code (IaC) Deployment with Terraform

An IaC pipeline built with Terraform automates the process of provisioning and managing infrastructure. It includes steps for checking out Terraform configurations from version control, initializing Terraform, planning modifications, and applying them to create or modify infrastructure resources.

The pipeline may also include steps for automating infrastructure testing, validating Terraform code, and securely managing state files.

For example, this Terraform-based pipeline automates infrastructure management using Infrastructure as Code (IaC). It initializes Terraform, validates the configuration, plans and applies changes to provision infrastructure, runs automated tests, and optionally cleans up resources.

pipeline {
    agent any

    environment {
        TF_VAR_region = 'us-east-1' // Example region, modify as necessary
        TF_BUCKET = 'your-tf-state-bucket' // S3 bucket for storing Terraform state
        TF_CREDENTIALS = 'aws-credentials' // AWS credentials for Terraform
        TERRAFORM_VERSION = '1.1.5' // Specify desired Terraform version
    }

    stages {
        stage('Checkout Terraform Code') { // Clones the Terraform repository
            steps {
                git 'https://github.com/your-repo/terraform-configs.git'
            }
        }

        stage('Terraform Init') { // Initializes Terraform environment and configures backend
            steps {
                withCredentials([[$class: 'AmazonWebServicesCredentialsBinding', credentialsId: TF_CREDENTIALS]]) {
                    sh "terraform init -backend-config='bucket=${TF_BUCKET}'"
                }
            }
        }

        stage('Terraform Validate') { // Validates Terraform code for syntax and correctness before planning
            steps {
                sh 'terraform validate'
            }
        }

        stage('Terraform Plan') { // Previews changes that Terraform will apply
            steps {
                withCredentials([[$class: 'AmazonWebServicesCredentialsBinding', credentialsId: TF_CREDENTIALS]]) {
                    sh "terraform plan -out=tfplan"
                }
            }
        }

        stage('Terraform Apply') { // Applies Terraform plan to provision infrastructure
            steps {
                withCredentials([[$class: 'AmazonWebServicesCredentialsBinding', credentialsId: TF_CREDENTIALS]]) {
                    sh "terraform apply -auto-approve tfplan"
                }
            }
        }

        stage('Terraform Test') { // Runs automated tests to ensure infrastructure works as expected
            steps {
                sh 'terraform test' // Modify with your testing framework
            }
        }

        stage('Terraform Clean-up') { // Optionally remove resources (useful for ephemeral environments)
            steps {
                withCredentials([[$class: 'AmazonWebServicesCredentialsBinding', credentialsId: TF_CREDENTIALS]]) {
                    sh 'terraform destroy -auto-approve'
                }
            }
        }
    }

    post {
        success {
            echo 'Infrastructure provisioned successfully!'
        }
        failure {
            echo 'Terraform execution failed!'
            // Rollback or clean up resources if necessary
            withCredentials([[$class: 'AmazonWebServicesCredentialsBinding', credentialsId: TF_CREDENTIALS]]) {
                sh 'terraform destroy -auto-approve' // Optional rollback step
            }
        }
    }
}

Best Practices for Jenkins Pipelines

To ensure reliable, maintainable, and efficient pipelines, consider the following best practices:

  • Use Declarative Pipelines: Choose declarative syntax over scripted pipelines for improved readability and maintainability. This approach allows you to define your CI/CD processes in a more systematic and intuitive manner.
  • Leverage Jenkins Shared Libraries: Use shared libraries to centralize reusable pipeline logic across multiple projects. This reduces code duplication, enforces consistency, and makes pipelines easier to update and maintain at scale.
  • Optimize Pipeline Performance: Speed up execution by parallelizing independent stages, caching build dependencies, and distributing workloads across multiple Jenkins agents. These techniques help reduce build times and avoid pipeline bottlenecks.
  • Follow “Pipeline as Code” Practice: Store your Jenkinsfile in version control alongside your application code. This allows for versioning, code review processes, and simpler administration of pipeline changes.
  • Implement Robust Error Handling: Use retry mechanisms, notifications, and fail-safe procedures to improve pipeline dependability and simplify debugging when problems arise.
  • Integrate with BrowserStack for Scalable Testing: Enhance your Jenkins pipelines by integrating with BrowserStack’s cloud-based testing infrastructure. Run automated tests across a wide range of real browsers and devices at scale, reduce test execution time with parallelization, and ensure your applications work seamlessly across platforms without maintaining in-house test environments.
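To illustrate the error-handling practice above, a stage can combine Jenkins’ built-in timeout and retry steps with a post block for notifications. This is a minimal sketch; the deploy script and mail recipient are placeholders:

```groovy
pipeline {
    agent any

    stages {
        stage('Deploy') {
            steps {
                timeout(time: 10, unit: 'MINUTES') { // Fail the stage if it hangs
                    retry(3) { // Retry transient failures up to 3 times
                        sh './deploy.sh' // Hypothetical deploy script
                    }
                }
            }
        }
    }

    post {
        failure {
            mail to: 'team@example.com', // Placeholder recipient
                 subject: "Pipeline failed: ${env.JOB_NAME} #${env.BUILD_NUMBER}",
                 body: "See ${env.BUILD_URL} for details."
        }
    }
}
```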

Conclusion

Jenkins Pipeline is a versatile solution for automating software delivery that supports various use cases, including Java CI/CD and Terraform-based infrastructure management. Adopting recommended techniques, such as declarative pipelines and shared libraries, enables organizations to improve processes and software quality.

Integrating BrowserStack Automate into Jenkins Pipelines dramatically improves testing capabilities by allowing cross-browser testing on a wide range of real devices and browsers. This ensures thorough test coverage, faster feedback loops, and a better user experience across all platforms.

When testing across several browsers or devices, issues may occur that are difficult to trace. BrowserStack Test Reporting & Analytics allows users to track tests across several environments, providing real-time insights and improved troubleshooting.
