Java docker file. Build and test or just run jar file?

I’m thinking of this in the context of a CI pipeline, where I first build and test my app, and the end result should be a Docker image.

I’m wondering whether it’s more common to build on the build server, using the build environment there, and then run the tests, perhaps driven by a build script. Finally, the resulting jar file would be added to the Docker image using COPY, and the entrypoint would run java -jar on that jar. That keeps the Dockerfile very small, with building and testing happening outside the container.

A bit like this:

FROM openjdk:8-jre-alpine
COPY build/libs/*.jar .
CMD java ${JAVA_OPTS} -jar *.jar

Or should I add all the source code to the container, build it there, run the tests inside the container, and then have the entrypoint (as before) run the jar file that was produced? In other words, keep everything in the Dockerfile, perhaps with some cleanup afterwards to remove the source code.

This doesn’t really have to be about Java, I guess; the same question applies to other languages.

Optimizing the container build

Historically, you were forced to run Docker twice in order to create an image that did not contain the source code (or the software used to create the binary). For example, see:

  • How to build a docker container for a java app

Now, Docker supports a new multistage build capability:

  • https://docs.docker.com/engine/userguide/eng-image/multistage-build/

This enables Docker to build using an image containing the build tools, but output an image containing only the runtime dependencies. The following example demonstrates the concept; note how the jar is copied from the target directory of the first build stage:

FROM maven:3.3-jdk-8-onbuild AS build

FROM java:8
COPY --from=build /usr/src/app/target/demo-1.0-SNAPSHOT.jar /opt/demo.jar
CMD ["java","-jar","/opt/demo.jar"]

The resulting image does not contain Maven, only Java and the built jar.
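If you’d rather not depend on the ONBUILD triggers of the maven onbuild image, the build steps can be made explicit in the first stage. A minimal sketch, assuming a standard Maven project layout and an artifact named demo-1.0-SNAPSHOT.jar (both illustrative):

```dockerfile
# Build stage: full JDK plus Maven, source code, and build tools
FROM maven:3.3-jdk-8 AS build
WORKDIR /usr/src/app
COPY pom.xml .
COPY src ./src
RUN mvn package

# Runtime stage: JRE only -- no Maven, no source, no build tools
FROM openjdk:8-jre-alpine
COPY --from=build /usr/src/app/target/demo-1.0-SNAPSHOT.jar /opt/demo.jar
CMD ["java","-jar","/opt/demo.jar"]
```

Everything in the first stage is discarded from the final image; only what you COPY across survives.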

Testing

Assuming we’re not talking about unit tests (which can be run locally), an integration test requires that the code first be deployed. The answer in this case depends heavily on how you deploy your containerized Java application.

For example, if you’re using Kubernetes or OpenShift, one option is to use the Fabric8 plugin to deploy the code before running your test phase in Maven.
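As a sketch of what that can look like, the fabric8-maven-plugin can be bound to the Maven lifecycle so the image is built and deployed before the integration-test phase runs (the version and execution binding here are illustrative, not prescriptive):

```xml
<plugin>
  <groupId>io.fabric8</groupId>
  <artifactId>fabric8-maven-plugin</artifactId>
  <version>3.5.41</version>
  <executions>
    <execution>
      <id>fmp</id>
      <!-- run before failsafe executes integration tests -->
      <phase>pre-integration-test</phase>
      <goals>
        <goal>resource</goal>
        <goal>build</goal>
        <goal>deploy</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```

With a binding along these lines, mvn verify would build the image, deploy it to the cluster, and then run the integration tests against the deployed application.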

You build and test the app in a so-called build image, which has the JDK and all the tools you need for that purpose. When you are done and happy, you extract the jar/war as an artifact into your CI/CD pipeline. Then, when you consider it production ready, you build a production image and put the artifact inside. That image contains only a JRE/Tomcat (whatever you need for production), no dev tools, no compile tools, nothing: as small and sleek and simple as possible.

So you basically always have at least 2 images per app: one for building it and one for running it in production. Mixing the two is very bad practice and will lead to issues sooner or later.
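The two-image workflow above can be sketched roughly as follows; the image names, file paths, Dockerfile names, and the docker cp approach are all illustrative, and most CI systems offer equivalent primitives of their own:

```shell
# 1. Build and test inside the build image (JDK + build tools)
docker build -f Dockerfile.build -t myapp-build .
docker run --name myapp-build-run myapp-build mvn verify

# 2. Extract the jar from the stopped container as a pipeline artifact
docker cp myapp-build-run:/usr/src/app/target/myapp.jar ./myapp.jar
docker rm myapp-build-run

# 3. Build the slim production image (JRE only) around the artifact
docker build -f Dockerfile.prod -t myapp:latest .
```

Multi-stage builds (shown in the other answer) achieve much the same separation within a single Dockerfile; the explicit two-image pipeline is useful when you want the jar itself archived as a CI artifact between the stages.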

Building on the host is even worse, since you don’t get a clean environment that way, which is more or less one of the key gains of Docker, and you cannot easily reproduce the build locally.