
Optimizing Jenkins Builds


Here are three important lessons we’ve learned at GermaniumHQ in optimizing build speeds.

1. Use Docker Caching

In Docker, every line in a Dockerfile produces a layer that is cached on build; a layer is reused as long as Docker can detect that none of the files it depends on have changed since the previous build. You can take advantage of this by ordering the Dockerfile so that the rarely changing steps come first, something like this:

FROM ubuntu:..

COPY install-software.sh /install-software.sh
RUN sh /install-software.sh

COPY /requirements.txt /requirements.txt
RUN pip install -r requirements.txt

RUN ... # the actual build steps

With this ordering, on new builds the time to provision a new container decreases dramatically:

Docker Caching

Only a few seconds for a new build!
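To benefit from this caching directly in Jenkins, the build agent itself can be provisioned from the Dockerfile. A minimal declarative pipeline sketch (the stage name and the mypy invocation are illustrative, not from our actual setup):

```groovy
// Sketch: Jenkins builds the image from the Dockerfile in the repository
// root and runs all stages inside a container from it. As long as
// install-software.sh and requirements.txt are unchanged, the expensive
// layers come straight out of the Docker cache.
pipeline {
    agent { dockerfile true }
    stages {
        stage('Check') {
            steps {
                sh 'mypy /src'
            }
        }
    }
}
```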

2. Parallelize Everything

In Jenkins it is very easy to run things in parallel.

Instead of:

sh """
    cd /src
    mypy .
    flake8 .
"""

A simple:

def parallelChecks = [:]

parallelChecks."mypy" = {
    sh """
        cd /src
        mypy .
    """
}

parallelChecks."flake8" = {
    sh """
        cd /src
        flake8 .
    """
}

parallel(parallelChecks)

This reduces the time significantly, since the checks are usually waiting on IO. In our case it went from 20 seconds down to 12.

Reading the sources takes milliseconds, but the static analysis takes a lot of time, and you’ll usually have some cores sitting idle.

Not only that, but the parallel branches also show up as nicely labeled steps in Blue Ocean, so you can immediately pinpoint what failed!
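If one failing check should abort the rest, the scripted `parallel` step also accepts a `failFast` entry in the same map. A sketch building on the checks above:

```groovy
// Sketch: with failFast set, the first failing branch aborts the
// remaining parallel branches instead of letting them run to completion.
def parallelChecks = [:]

parallelChecks."mypy" = {
    sh "cd /src && mypy ."
}

parallelChecks."flake8" = {
    sh "cd /src && flake8 ."
}

parallelChecks.failFast = true

parallel(parallelChecks)
```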

3. Get Better Hardware

Remember the picture at the first point, where a cold provisioning took 20 minutes? The reason is that a lot happens there: both Firefox and Chrome get installed, a bunch of binary drivers get downloaded, and so on.

This is how it looks after we switched the hard disk to an SSD:

Docker on SSD

Wow!

From 20 minutes down to 5. And in that build even the base images had to be downloaded, while in the previous 20-minute build the base Ubuntu image was already cached. Just much better raw performance.

Switching the hard disk to an SSD did not help the CPU-bound jobs nearly as much: tests still take time, and so does static analysis.

Conclusions

  1. Use docker caching,

  2. Use parallelism for everything,

  3. Use better hardware.