Build Pipeline
A build pipeline transforms raw source code into a deployable, versioned artifact — compiling, bundling, containerizing, and signing the output so downstream delivery systems can consume it reliably.
How the pipeline works
The build pipeline is typically a stage within a broader CI Pipeline, but it can also run independently for languages or runtimes that require heavy compilation steps. It begins by fetching source code at a specific commit SHA to ensure the build is reproducible.
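A reproducible fetch can be sketched as below. The repository here is a local stand-in created just for the demo; in a real pipeline the URL and commit SHA are supplied by the CI system.

```shell
#!/bin/sh
set -e

# Stand-in "remote" repository for the demo; a real pipeline would use
# the project's actual repository URL.
tmp=$(mktemp -d)
git init -q "$tmp/origin"
cd "$tmp/origin"
git -c user.email=ci@example.com -c user.name=ci \
    commit -q --allow-empty -m "initial commit"
sha=$(git rev-parse HEAD)   # in CI, this SHA arrives from the trigger event

# CI-style checkout: fetch from the remote, then detach onto the exact
# pinned commit rather than a branch tip, so the build is reproducible.
git init -q "$tmp/workspace"
cd "$tmp/workspace"
git fetch -q "$tmp/origin"
git checkout -q --detach "$sha"
echo "building at $(git rev-parse HEAD)"
```

Detaching onto the SHA (rather than checking out a branch) means a re-run of the same pipeline builds byte-identical input even if the branch has since moved.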
The first phase resolves and installs dependencies from a lock file, caching the dependency tree when possible to reduce build time. Next, the compiler or bundler runs: for a JVM application this means compiling Java or Kotlin source to bytecode; for a Node.js app, running a bundler like Vite or webpack; for a Go service, producing a statically linked binary.
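One common way to implement the caching step is to derive the cache key from a hash of the lock file, so the dependency cache is reused only while dependencies are unchanged. The file name, stand-in content, and key format below are assumptions for illustration:

```shell
#!/bin/sh
# Hypothetical cache keying: hash the lock file (go.sum here; it could
# equally be package-lock.json or a Gradle lockfile) into a cache key.
lockfile="go.sum"
printf 'example.com/dep v1.0.0 h1:abc=\n' > "$lockfile"   # stand-in content for the demo

# Same lock file -> same key -> cache hit; any dependency change -> new key.
cache_key="deps-$(sha256sum "$lockfile" | cut -c1-12)"
echo "$cache_key"
```

With the cache warm, the compile step runs; for a Go service that is typically something like `CGO_ENABLED=0 go build -o app ./cmd/app`, which yields the statically linked binary mentioned above.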
With a compiled output in hand, the pipeline packages the result. For containerized workloads, this means building a Docker image using a multi-stage Dockerfile that keeps the final image minimal. The image is tagged with both the commit SHA and a semantic version label derived from the branch or tag, ensuring every artifact is uniquely traceable back to its source.
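A multi-stage Dockerfile along these lines keeps the final image minimal; the Go toolchain, module layout, binary name, and base images below are assumptions, not a prescribed setup:

```dockerfile
# Build stage: full toolchain, hypothetical Go service layout
FROM golang:1.22 AS build
WORKDIR /src
COPY go.mod go.sum ./
RUN go mod download              # cached while the lock files are unchanged
COPY . .
RUN CGO_ENABLED=0 go build -o /out/app ./cmd/app

# Final stage: minimal runtime image containing only the static binary
FROM gcr.io/distroless/static
COPY --from=build /out/app /app
ENTRYPOINT ["/app"]
```

The dual tagging described above would then look something like `docker build -t registry.example.com/app:$COMMIT_SHA -t registry.example.com/app:1.4.2 .` (registry host and version are hypothetical), giving one immutable SHA tag and one human-readable version tag for the same image.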
Before publishing, the artifact or image passes through a vulnerability scan. If critical CVEs are found, the pipeline fails and the artifact is not published. Once clean, the image is pushed to a container registry or the binary is uploaded to an artifact store (see Artifact Storage Pipeline), ready for the CD Pipeline to deploy.
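The scan-then-publish gate reduces to "push only if the scanner exits cleanly." A minimal sketch, using a generic function so the gating logic can be demonstrated without a real scanner or registry (the commands passed in are stand-ins):

```shell
#!/bin/sh
# Hypothetical publish gate: run the scan command; push only on success.
# With a real scanner this might be e.g.
#   trivy image --severity CRITICAL --exit-code 1 "$IMAGE"
# which exits non-zero when critical CVEs are found.
publish_if_clean() {
  scan_cmd="$1"
  push_cmd="$2"
  if $scan_cmd; then
    $push_cmd && echo "published"
  else
    echo "blocked: critical CVEs found" >&2
    return 1
  fi
}

# Simulate a failing scan: the publish step is never reached.
publish_if_clean "false" "echo push" || echo "artifact not published"
```

Making the scanner's exit code the gate (rather than parsing its report) keeps the pipeline definition simple and fails closed: any scan error, not just a CVE finding, blocks the publish.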