diagram.mmd — flowchart
CDN Request Flow (flowchart)

A CDN (Content Delivery Network) is a globally distributed network of edge servers that cache and serve content from locations geographically close to end users, reducing latency, offloading origin servers, and improving availability.

CDNs like Cloudflare, Fastly, and AWS CloudFront operate hundreds of Points of Presence (PoPs) worldwide. When a user requests a CDN-backed resource, DNS resolution returns the IP of the nearest edge node (often via Anycast routing) rather than the origin server's IP.

Cache Hit Path: If the edge node has a valid cached copy of the requested resource (determined by URL, headers, and cache policy), it serves the response immediately without contacting the origin. This is the "fast path" — latency is dominated by the user-to-edge RTT, typically 5–20ms for well-distributed CDNs.
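The hit/miss outcome is usually visible to clients in debug response headers. The sketch below classifies a response from common headers; note that the header names vary by provider (X-Cache, CF-Cache-Status, X-Served-By), so the names used here are illustrative, not universal.

```python
def classify_cdn_response(headers: dict) -> str:
    """Classify a CDN response as HIT or MISS from common debug headers.

    Header names differ across CDNs; "X-Cache" and "Age" below are
    illustrative assumptions, not a universal standard.
    """
    cache_status = headers.get("X-Cache", "").upper()
    if "HIT" in cache_status:
        return "HIT"
    if "MISS" in cache_status:
        return "MISS"
    # Fallback: a nonzero Age header means the object sat in a cache
    # for that many seconds before being served.
    if int(headers.get("Age", "0")) > 0:
        return "HIT"
    return "MISS"

print(classify_cdn_response({"X-Cache": "HIT from edge-fra1", "Age": "120"}))  # HIT
print(classify_cdn_response({"Age": "0"}))  # MISS
```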

Cache Miss / Origin Fetch: On a cache miss (first request for a resource, or after TTL expiry), the edge node forwards the request to the origin server (your application server or object storage). The response is cached at the edge according to Cache-Control headers, then returned to the client.
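The miss-then-cache behavior can be modeled as a tiny TTL cache keyed by URL. This is a toy sketch of the idea, not a real CDN implementation: it honors `max-age` and skips storage for `private` or `no-store` responses.

```python
import re
import time

class EdgeCache:
    """Toy model of an edge cache honoring Cache-Control: max-age
    (a sketch of the concept, not a real CDN implementation)."""

    def __init__(self):
        self._store = {}  # url -> (body, expires_at)

    def get(self, url):
        entry = self._store.get(url)
        if entry and entry[1] > time.monotonic():
            return entry[0]   # cache hit: entry still within its TTL
        return None           # miss, or expired entry

    def put(self, url, body, cache_control: str):
        # Only cache responses explicitly allowed by the origin's headers.
        m = re.search(r"max-age=(\d+)", cache_control)
        if m and "no-store" not in cache_control and "private" not in cache_control:
            self._store[url] = (body, time.monotonic() + int(m.group(1)))

cache = EdgeCache()
cache.put("/app.js", b"bundle", "public, max-age=300")
print(cache.get("/app.js"))   # b'bundle' (hit)
cache.put("/me", b"secret", "private, no-store")
print(cache.get("/me"))       # None (never cached)
```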

Cache Invalidation: CDNs allow programmatic cache purging via API. This is necessary for deployments where content changes but cache TTLs haven't expired (e.g., after deploying new JavaScript bundles). Some CDNs support surrogate keys (cache tags) for selective bulk invalidation.
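Surrogate-key purging can be sketched as an index from tag to cached URLs: each object is stored with its tags, and purging one tag evicts every object that carries it. The class below is an illustration of the mechanism, not any particular CDN's API.

```python
from collections import defaultdict

class TaggedCache:
    """Sketch of surrogate-key (cache tag) invalidation: purging one tag
    evicts every cached object that carries it. Illustrative only."""

    def __init__(self):
        self._objects = {}               # url -> body
        self._by_tag = defaultdict(set)  # tag -> set of urls

    def put(self, url, body, tags):
        self._objects[url] = body
        for tag in tags:
            self._by_tag[tag].add(url)

    def get(self, url):
        return self._objects.get(url)

    def purge_tag(self, tag):
        # Evict every object tagged with `tag` in one operation.
        for url in self._by_tag.pop(tag, set()):
            self._objects.pop(url, None)

cache = TaggedCache()
cache.put("/products/1", "widget", tags=["product-1", "catalog"])
cache.put("/products/2", "gadget", tags=["product-2", "catalog"])
cache.purge_tag("catalog")        # e.g. after a bulk catalog deploy
print(cache.get("/products/1"))   # None (both entries evicted)
```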

TLS Termination: CDN edge nodes terminate TLS connections, meaning your origin receives plain HTTP (or HTTP over a private network). This offloads TLS computation from your origin and enables CDN visibility into request contents for security filtering.

CDNs work closely with load balancers at the origin tier and reverse proxies for request routing.


Frequently asked questions

What is a CDN and how does it work?
A CDN (Content Delivery Network) is a globally distributed network of edge servers that cache and serve content from locations near end users. When a request arrives, DNS resolves to the nearest edge node (often via anycast). If the edge has a cached copy within its TTL, it responds instantly. On a cache miss, the edge fetches the resource from the origin server, caches it, then returns it to the client.
What is the difference between a cache hit and a cache miss?
A cache hit means the edge node already holds a valid cached copy of the requested resource and serves it directly, with latency dominated by the client-to-edge round trip (typically 5–20ms). A cache miss means the edge must fetch the resource from the origin server, adding the edge-to-origin round trip to total response time. Cache policies are controlled via `Cache-Control` headers.
When should I use a CDN?
Use a CDN for any content served to geographically distributed users — static assets (JS, CSS, images), video streams, API responses with predictable caching, and DDoS mitigation. CDNs are less suited for highly dynamic, user-specific responses that cannot be cached without per-user keys.
What are common CDN caching mistakes?
The most common mistakes are setting cache TTLs too long (stale content after deploys), setting them too short (defeating the purpose), failing to include `Vary` headers for content negotiation, and not purging the cache after deployments. Sending cookies with every request can also prevent caching entirely if the CDN treats cookied requests as uncacheable by default.
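The `Vary` pitfall comes down to how the cache key is built: the key must include the request header values the response varies on, or different variants collide on one cache entry. A minimal sketch of key construction (function and behavior are illustrative, real CDNs also normalize header values):

```python
def cache_key(url: str, request_headers: dict, vary: str) -> tuple:
    """Build a cache key from the URL plus the request header values
    named by the response's Vary header. Illustrative sketch only;
    real CDNs also normalize values (e.g. Accept-Encoding variants)."""
    varied = tuple(
        (name.strip().lower(), request_headers.get(name.strip().lower(), ""))
        for name in vary.split(",") if name.strip()
    )
    return (url, varied)

# Without Vary, gzip and brotli clients would share one cache entry.
k1 = cache_key("/app.js", {"accept-encoding": "gzip"}, "Accept-Encoding")
k2 = cache_key("/app.js", {"accept-encoding": "br"}, "Accept-Encoding")
print(k1 != k2)  # True: each encoding gets its own cached variant
```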
How does a CDN differ from a reverse proxy?
A CDN is a globally distributed network of PoPs that primarily optimizes geographic latency and offloads origin traffic at massive scale. A reverse proxy is typically a single-region intermediary that handles TLS termination, routing, and caching for a specific application. CDN edges often implement reverse proxy behavior, but reverse proxies deployed on a single server lack the geographic distribution that defines CDNs.
mermaid
flowchart LR
    User([User Browser]) --> DNS[DNS resolves to\nnearest CDN PoP]
    DNS --> Edge[CDN Edge Node\nnearby PoP]
    Edge --> CacheCheck{Resource in\nedge cache?\nTTL valid?}
    CacheCheck -->|Cache HIT| ServeCache[Serve from edge cache\nAdd Age header]
    ServeCache --> UserResp([Response to user\nlow latency])
    CacheCheck -->|Cache MISS| FetchOrigin[Forward request\nto origin server]
    FetchOrigin --> Origin[Origin Server\napp / storage]
    Origin --> OriginResp[Origin response\n+ Cache-Control headers]
    OriginResp --> CacheStore{Cacheable?\nCache-Control: public}
    CacheStore -->|Yes| StoreEdge[Store in edge cache\nwith TTL]
    StoreEdge --> UserResp2([Response to user])
    CacheStore -->|No: private/no-store| PassThrough([Pass through\ndo not cache])
    Edge --> Purge[Cache purge API\nor surrogate key tag]
    Purge --> Invalidate[Invalidate cached\nresources on deploy]