Build Pipeline flowchart diagram

A build pipeline transforms raw source code into a deployable, versioned artifact — compiling, bundling, containerizing, and signing the output so downstream delivery systems can consume it reliably.

How the pipeline works

The build pipeline is typically a stage within a broader CI Pipeline, but it can also run independently for languages or runtimes that require heavy compilation steps. It begins by fetching source code at a specific commit SHA to ensure the build is reproducible.
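As a minimal sketch of the reproducible-checkout step (using a throwaway local repository so the example is self-contained — a real pipeline would fetch from its VCS remote):

```shell
# Sketch: pin the working tree to an exact commit SHA.
# A real pipeline would `git fetch` from its remote; here we create a
# throwaway repo so the example runs anywhere git is installed.
set -eu
repo="$(mktemp -d)"
cd "$repo"
git init -q .
git -c user.email=ci@example.com -c user.name=ci \
    commit -q --allow-empty -m "initial commit"
COMMIT_SHA="$(git rev-parse HEAD)"

# Detached checkout at the pinned SHA — every build of this SHA sees
# exactly the same tree, which is what makes the build reproducible.
git checkout -q "$COMMIT_SHA"
echo "building at ${COMMIT_SHA}"
```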

The first phase resolves and installs dependencies from a lock file, caching the dependency tree when possible to reduce build time. Next, the compiler or bundler runs: for a JVM application this means compiling Java or Kotlin bytecode; for a Node.js app, running a bundler like Vite or webpack; for a Go service, producing a statically linked binary.
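The cache lookup usually keys on a content hash of the lock file, so any dependency change invalidates the cache automatically. A minimal sketch of that check (the lock file contents and cache path here are hypothetical stand-ins):

```shell
set -eu
workdir="$(mktemp -d)"
cd "$workdir"
printf '{"lockfileVersion": 3}\n' > package-lock.json   # stand-in lock file

# Key the cache on the lock file's hash: same lock file => same cache key.
cache_key="deps-$(sha256sum package-lock.json | cut -c1-16)"
cache_dir="${workdir}/cache/${cache_key}"

if [ -d "$cache_dir" ]; then
  echo "cache hit: ${cache_key}"    # restore the dependency tree from cache
else
  echo "cache miss: ${cache_key}"   # a real pipeline would run `npm ci` here
  mkdir -p "$cache_dir"
fi
```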

With a compiled output in hand, the pipeline packages the result. For containerized workloads, this means building a Docker image using a multi-stage Dockerfile that keeps the final image minimal. The image is tagged with both the commit SHA and a semantic version label derived from the branch or tag, ensuring every artifact is uniquely traceable back to its source.
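The dual-tagging convention can be sketched in shell; the registry path, SHA, and version below are hypothetical placeholders, and the docker commands are shown as comments since they need a running daemon:

```shell
set -eu
IMAGE="registry.example.com/team/service"  # hypothetical registry path
COMMIT_SHA="9f8a7b6"                       # short SHA from the checkout step
VERSION="1.4.2"                            # semver derived from the git tag

sha_tag="${IMAGE}:${COMMIT_SHA}"
ver_tag="${IMAGE}:v${VERSION}"

# In a real pipeline these tags feed the actual image build, e.g.:
#   docker build -t "$sha_tag" .
#   docker tag "$sha_tag" "$ver_tag"
# The SHA tag traces the artifact back to its exact source; the version
# tag is the human-friendly handle downstream systems deploy by.
echo "tags: ${sha_tag} ${ver_tag}"
```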

Before publishing, the artifact or image passes through a vulnerability scan. If critical CVEs are found, the pipeline fails and the artifact is not published. Once clean, the image is pushed to a container registry or the binary is uploaded to an artifact store (see Artifact Storage Pipeline), ready for the CD Pipeline to deploy.
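The scan gate reduces to an exit-code check: fail the build whenever the scanner reports criticals. A sketch with a stubbed scanner (a real pipeline would invoke an actual tool here, e.g. something like `trivy image --severity CRITICAL --exit-code 1`):

```shell
set -eu

# Stub scanner: returns the number of critical CVEs it "found".
# Replace with a real scanner invocation in an actual pipeline.
scan_image() {
  echo 0    # pretend the image is clean
}

criticals="$(scan_image)"
if [ "$criticals" -gt 0 ]; then
  echo "FAIL: ${criticals} critical CVEs — artifact will not be published" >&2
  exit 1
fi
echo "scan clean — pushing artifact to registry"
```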


Frequently asked questions

What does a build pipeline do?
A build pipeline transforms source code into a deployable artifact — compiling code, running a bundler or container build, tagging the output with a version, and publishing it to a registry. It is typically a stage within a broader CI pipeline.

How does a build pipeline work?
The pipeline fetches source at a fixed commit SHA, resolves dependencies from a lock file, compiles or bundles the code, packages it (often as a Docker image using a multi-stage Dockerfile), runs a vulnerability scan, and pushes the clean artifact to a registry.

Who needs a build pipeline?
Any service that produces a deployable artifact needs a build pipeline. If your compilation or containerization steps take more than a minute or require specific toolchains, isolating them in a dedicated build pipeline improves cache efficiency and parallelism.

What are common build pipeline mistakes?
Common mistakes include not pinning base image versions (causing non-reproducible builds), skipping the vulnerability scan to save time (shipping known CVEs), and not using multi-stage Dockerfiles (producing oversized images that increase attack surface).
```mermaid
flowchart TD
    Source[Fetch source at commit SHA] --> InstallDeps[Install dependencies from lock file]
    InstallDeps --> CacheCheck{Dependency cache hit?}
    CacheCheck -->|Yes| SkipInstall[Use cached dependencies]
    CacheCheck -->|No| Download[Download and cache dependencies]
    SkipInstall --> Compile[Compile or bundle source code]
    Download --> Compile
    Compile --> Package[Package into deployable artifact]
    Package --> ContainerBuild[Build Docker image]
    ContainerBuild --> TagImage[Tag image with SHA and version]
    TagImage --> VulnScan[Run vulnerability scan on image]
    VulnScan --> VulnGate{Critical CVEs found?}
    VulnGate -->|Yes| FailBuild[Fail build and notify team]
    VulnGate -->|No| SignArtifact[Sign artifact]
    SignArtifact --> Publish[Push to container registry]
    Publish --> Done[Artifact ready for deployment]
```