diagram.mmd
Cache-Aside Pattern flowchart diagram

The cache-aside pattern (also called lazy loading) is a caching strategy where the application code is responsible for loading data into the cache on demand — the cache never proactively populates itself.

This diagram shows the two code paths: cache hit and cache miss. On every read, the application first checks the cache (typically Redis or Memcached) for the requested key. If the key exists and has not expired (cache hit), the cached value is returned directly to the client without touching the database. This is the fast path — a cache hit takes microseconds versus milliseconds for a database query.

On a cache miss, the application queries the database for the data, stores the result in the cache with a configured TTL, and then returns the data to the client. Subsequent reads for the same key will hit the cache until the TTL expires or the key is explicitly invalidated.
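The read path above can be sketched in Python. Plain dicts stand in for the cache (e.g. Redis) and the database, and `query_db`, the key names, and the TTL value are illustrative, not a specific library's API:

```python
import time

CACHE = {}          # key -> (value, expires_at); stand-in for Redis/Memcached
DATABASE = {"user:1": {"name": "Ada"}}
TTL_SECONDS = 300

def query_db(key):
    # Stand-in for a real database query.
    return DATABASE.get(key)

def get(key):
    entry = CACHE.get(key)
    if entry is not None:
        value, expires_at = entry
        if time.time() < expires_at:
            return value          # cache hit: fast path, no database access
        del CACHE[key]            # entry expired: treat as a miss
    value = query_db(key)         # cache miss: fall through to the database
    if value is not None:
        CACHE[key] = (value, time.time() + TTL_SECONDS)
    return value
```

The first `get("user:1")` misses and populates the cache; every later read of the same key is served from the cache until the TTL elapses.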

Cache-aside is the most commonly used caching pattern because it gives the application full control over what data enters the cache, when it expires, and how invalidation is handled. The application only caches data that has actually been requested, avoiding pre-population of rarely accessed data.

The main risk is thundering herd: if a popular cached key expires and many requests arrive simultaneously, all of them will miss the cache, all query the database, and all attempt to write the same result back to the cache. This can be mitigated with probabilistic early expiration, mutex locks on cache writes, or staggered TTLs.

Cache-aside also leaves a window of inconsistency: between a database write and a cache invalidation (or TTL expiry), readers may get stale data from the cache. Compare this to Write Through Cache, which keeps cache and database in sync on every write, and Write Back Cache, which prioritizes write performance at the cost of durability guarantees.
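The write-then-invalidate sequence on the write path can be sketched in the same style, with dicts again standing in for the real database and cache:

```python
CACHE = {"user:1": {"name": "Ada"}}
DATABASE = {"user:1": {"name": "Ada"}}

def write(key, value):
    DATABASE[key] = value         # 1. persist the new value first
    CACHE.pop(key, None)          # 2. invalidate (not update) the cached copy

# The next read of the key misses and reloads the fresh value from the
# database, closing the staleness window for that key.
```

Invalidating rather than updating the cache avoids writing a value that may immediately be overwritten by a concurrent writer; the cost is one extra cache miss on the next read.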


Frequently asked questions

What is cache-aside (lazy loading)?
Cache-aside (lazy loading) is a caching strategy where the application is responsible for loading data into the cache on demand. The cache never pre-populates itself — data only enters the cache after the first cache miss causes the application to fetch it from the database.

How does a cache-aside read work?
On every read, the application checks the cache first. A cache hit returns the value immediately without touching the database. A cache miss causes the application to query the database, write the result into the cache with a TTL, and return the data. All subsequent reads serve from cache until the TTL expires or the key is explicitly invalidated.

When should I use cache-aside instead of write-through?
Use cache-aside when your read patterns are unpredictable and you want to avoid populating the cache with data that may never be read again. It is the most versatile caching pattern and works well for general-purpose read acceleration. Prefer write-through when data is frequently read immediately after being written and you need a consistent cache at all times.

What are common mistakes when implementing cache-aside?
The most common mistake is neglecting thundering herd protection: if a popular key expires and thousands of concurrent requests miss simultaneously, all of them query the database at once. Mitigate with mutex locks, probabilistic early expiration, or staggered TTLs. Another mistake is using cache-aside for mutable data without an explicit invalidation strategy, leaving stale values in the cache indefinitely.

How does cache-aside differ from read-through caching?
In cache-aside the application code explicitly checks and populates the cache. In read-through caching, the cache itself is responsible for fetching from the database on a miss — the application only ever talks to the cache. Read-through simplifies application code but requires a cache that supports a loader function (e.g., Ehcache, Caffeine with a loader). Cache-aside is more portable and works with any key-value store.
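The read-through shape can be sketched in Python as a cache object that owns its loader, so the application only ever calls `get`. The class and loader below are an illustration of the pattern, not any particular library's API:

```python
class ReadThroughCache:
    """Cache that knows how to load missing keys itself (read-through)."""

    def __init__(self, loader):
        self._loader = loader     # the cache, not the app, fetches on a miss
        self._store = {}

    def get(self, key):
        if key not in self._store:
            self._store[key] = self._loader(key)
        return self._store[key]

# The application hands the loader over once, then only talks to the cache.
cache = ReadThroughCache(loader=lambda key: f"db-value-for-{key}")
```

Compare this with the cache-aside sketches above, where the check-miss-load-store logic lives in application code next to every read.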
```mermaid
flowchart TD
    App[Application] --> CheckCache{Cache Hit?}
    CheckCache -->|Yes| ReturnCached[Return cached data]
    CheckCache -->|No| QueryDB[Query database]
    QueryDB --> StoreCache[Store result in cache\nwith TTL]
    StoreCache --> ReturnFresh[Return fresh data]
    ReturnCached --> Client[Client receives data]
    ReturnFresh --> Client
    WriteApp[Application write] --> UpdateDB[(Update database)]
    UpdateDB --> InvalidateKey[Invalidate cache key]
    InvalidateKey --> NextRead[Next read will miss\nand reload from DB]
```