CDN and Caching: Make Your Site Fast Without Effort
Production Ready, Part 25 of 30
A developer I know launched a SaaS tool earlier this year. Traffic was modest -- a few hundred users a day -- but his hosting bill was already creeping up, and users in Europe were complaining the dashboard felt sluggish. He spent a weekend adding database indexes, rewriting queries, optimizing React components. None of it moved the needle.
Then he spent 20 minutes enabling proper cache headers and configuring Vercel's edge network correctly. Time to First Byte dropped from 480ms to 38ms for European users. His server costs dropped 60% the following month. The site felt instant.
He had not changed a single line of application logic.
That is what caching does. It is the highest-leverage performance optimization available to most web developers, and most of you are leaving it entirely on the table.
What a CDN Actually Is
A CDN (Content Delivery Network) is a network of servers distributed around the globe. When a user in Sydney requests your site, they are not waiting for a response from your server in Virginia. Instead, a copy of your content is served from a node in Sydney -- a physical distance of roughly 50 kilometers instead of 15,000.
The speed of light is not negotiable. Reducing physical distance is.
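To see why distance dominates, here is a back-of-envelope calculation (my numbers, not from the benchmarks below): light in optical fiber travels at roughly two-thirds of its vacuum speed, about 200 km per millisecond.

```typescript
// Light in optical fiber covers roughly 200 km per millisecond (~2/3 of c).
const FIBER_KM_PER_MS = 200;

// Minimum round-trip time for a given one-way distance,
// ignoring routing, TLS handshakes, and all server processing.
function minRttMs(distanceKm: number): number {
  return (2 * distanceKm) / FIBER_KM_PER_MS;
}

console.log(minRttMs(15000)); // Sydney -> Virginia origin: a 150ms floor per round trip
console.log(minRttMs(50));    // Sydney -> local CDN PoP: 0.5ms
```

Every round trip to a distant origin pays that floor again, which is why no amount of query tuning on the origin can beat a nearby cached copy.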
Vercel's edge network currently spans over 100 Points of Presence (PoPs) worldwide. According to real-world benchmarks, adding a CDN layer can cut Time to First Byte from over 130ms down to under 40ms, and reduce total page load times by 50-70% for users far from your origin server.
The key word is "copy." For the CDN to serve a cached copy instead of hitting your origin server, you have to tell it what to cache and for how long.
The Four Caching Layers in a Next.js App
Understanding these layers separately will save you hours of debugging.
1. Browser Cache
The browser stores responses locally on the user's machine. Controlled entirely by Cache-Control response headers.
Cache-Control: public, max-age=31536000, immutable
This tells the browser: "Keep this for one year. Do not check the server again." Use this for versioned static assets like JS bundles and images that have content-hashed filenames.
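Next.js already applies this header automatically to its own hashed bundles under /_next/static. If you serve hashed files from a custom folder, a headers() entry along these lines works -- the /assets path here is an assumption for illustration:

```javascript
// next.config.js -- long-lived browser caching for content-hashed files
// in a hypothetical /assets folder (only safe if filenames change on every content change)
module.exports = {
  async headers() {
    return [
      {
        source: '/assets/:path*',
        headers: [
          {
            key: 'Cache-Control',
            value: 'public, max-age=31536000, immutable',
          },
        ],
      },
    ];
  },
};
```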
2. CDN / Edge Cache
Vercel's edge layer sits between the user and your server. It obeys the s-maxage directive in your Cache-Control header:
Cache-Control: public, s-maxage=3600, stale-while-revalidate=86400
s-maxage=3600 means the CDN caches the response for one hour. stale-while-revalidate=86400 means after that hour, the CDN can continue serving the stale copy while it fetches a fresh one in the background. Users never see a slow response during revalidation.
In next.config.js you can set default headers for all routes:
// next.config.js
module.exports = {
  async headers() {
    return [
      {
        source: '/blog/:slug*',
        headers: [
          {
            key: 'Cache-Control',
            value: 'public, s-maxage=3600, stale-while-revalidate=86400',
          },
        ],
      },
    ];
  },
};
3. Next.js Data Cache
Next.js maintains its own server-side cache for fetch() calls. The behavior changed significantly in Next.js 15: requests are no longer cached by default. You now opt in explicitly.
// Cached indefinitely (or until manually revalidated)
const posts = await fetch('https://api.example.com/posts', {
  cache: 'force-cache',
});

// Revalidated every 60 seconds (ISR-style)
const feed = await fetch('https://api.example.com/posts', {
  next: { revalidate: 60 },
});

// Never cached -- use for real-time or user-specific data
const cart = await fetch('https://api.example.com/cart', {
  cache: 'no-store',
});
Next.js 16 goes further with the "use cache" directive, which lets you mark entire components, pages, or individual functions as cacheable with compiler-generated cache keys. The framework's direction is explicit caching: you decide what is cached, not the framework.
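A sketch of what that looks like, based on the experimental API as currently documented -- treat the details as subject to change:

```typescript
// Mark a whole data function as cacheable with the 'use cache' directive.
// The compiler derives the cache key from the function's inputs.
// Requires the experimental flag enabled in next.config.js.
export async function getPosts() {
  'use cache';
  const res = await fetch('https://api.example.com/posts');
  return res.json();
}
```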
4. Full Route Cache
When Next.js pre-renders a page at build time (static rendering), that rendered HTML is stored in the full route cache. Vercel serves it from the edge with no server execution at all -- the fastest possible response.
// page.tsx -- force static rendering
export const dynamic = 'force-static';

export default async function BlogPost({
  params,
}: {
  params: Promise<{ slug: string }>;
}) {
  const { slug } = await params; // params is a Promise in Next.js 15+
  const post = await getPost(slug);
  return <article>{post.content}</article>;
}
Any route without dynamic functions (cookies, headers, search params at request time) will be statically rendered by default.
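Conversely, touching a request-time API anywhere in the route opts it into dynamic rendering. A minimal sketch (the cookie name is hypothetical):

```typescript
// app/dashboard/page.tsx -- reading cookies makes this route dynamic,
// so it is rendered per request and never stored in the full route cache.
import { cookies } from 'next/headers';

export default async function Dashboard() {
  const cookieStore = await cookies(); // cookies() returns a Promise in Next.js 15+
  const theme = cookieStore.get('theme')?.value ?? 'light';
  return <main data-theme={theme}>Dashboard</main>;
}
```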
When NOT to Cache
Aggressive caching breaks things fast. Never cache:
- Authenticated pages -- a logged-in user's dashboard must never be served from a shared CDN cache. Use Cache-Control: private, no-store for anything behind auth.
- Personalized content -- user-specific recommendations, cart data, notification counts.
- Real-time data -- live scores, stock prices, anything where stale data is worse than a slow response.
- Form submission responses -- POST requests are not cached by CDN layers by design, but make sure your API routes that return user-specific data carry no-store headers.
// API route for authenticated user data
export async function GET(request: Request) {
  const session = await getSession(request);
  const userData = await getUserData(session.userId);

  return Response.json(userData, {
    headers: {
      'Cache-Control': 'private, no-store',
    },
  });
}
Cache Invalidation: The Hard Part
Phil Karlton's famous observation holds: "There are only two hard things in computer science: cache invalidation and naming things."
The problem is straightforward. You cache your product listing for one hour. A customer updates a price. Everyone sees the old price for up to an hour.
Vercel's answer in 2025 was tag-based invalidation at the edge. You tag cached responses, then purge by tag. According to Vercel's Ship 2025 announcements, cache tag expiration now propagates across their entire edge network in under 300 milliseconds.
// Tag a fetch with revalidation tags
const products = await fetch('https://api.example.com/products', {
  next: {
    revalidate: 3600,
    tags: ['products'],
  },
});

// In an API route or server action, purge when data changes
import { revalidateTag } from 'next/cache';

export async function updateProduct(id: string, data: ProductData) {
  await db.products.update(id, data);
  revalidateTag('products'); // Purges all responses tagged 'products' globally
}
This is the architecture you want for content-driven apps: cache aggressively, invalidate precisely.
Images: Free Performance from next/image
Every image rendered through Next.js's <Image> component is automatically:
- Resized to the requested dimensions
- Converted to WebP or AVIF (smaller formats)
- Served through Vercel's image cache
- Lazy-loaded by default
import Image from 'next/image';

export function ProductCard({ product }) {
  return (
    <Image
      src={product.imageUrl}
      alt={product.name}
      width={400}
      height={300}
      priority={false} // true for above-the-fold images
    />
  );
}
The only thing you need to configure is the allowed remote domains in next.config.js:
module.exports = {
  images: {
    remotePatterns: [
      {
        protocol: 'https',
        hostname: 'your-image-cdn.com',
      },
    ],
  },
};
Vercel's image optimization costs are billed per transformation, but each unique transformed image is cached indefinitely afterward.
What to Measure
You cannot optimize what you do not measure. Track these three metrics in Vercel Analytics or your monitoring tool of choice:
- TTFB (Time to First Byte): Should be under 200ms. If it is over 500ms, you have a caching or server problem.
- Cache Hit Rate: Found in Vercel Analytics under the "Edge Network" tab. A well-configured site should see 80-95% cache hits for static and semi-static content.
- Core Web Vitals: LCP (Largest Contentful Paint) is directly correlated with caching. A cached page almost always has a better LCP score.
Check your headers in the terminal to verify caching is working:
curl -I https://yoursite.com/blog/some-post | grep -i cache
# Look for: x-vercel-cache: HIT
# HIT means the edge served it. MISS means it hit your server.
A response with x-vercel-cache: MISS every time means your cache headers are wrong or not present. Fix the headers first, then measure again.
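To audit several routes at once, a small shell helper works -- the paths and domain below are placeholders for your own:

```shell
# Pull the x-vercel-cache value out of a raw header dump
# (tr strips the trailing CR that curl leaves on header lines).
cache_status() {
  grep -i '^x-vercel-cache' | awk '{print $2}' | tr -d '\r'
}

# Against a live site:
#   for path in / /blog/some-post /pricing; do
#     echo "$path -> $(curl -sI "https://yoursite.com$path" | cache_status)"
#   done

# Sanity check on a sample header block:
printf 'HTTP/2 200\r\nx-vercel-cache: HIT\r\n' | cache_status
```

Remember to request each route twice: the first hit after a deploy is always a MISS while the edge populates its cache.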
Action Checklist
- Add Cache-Control: public, s-maxage=3600, stale-while-revalidate=86400 headers to all public, non-personalized routes
- Add Cache-Control: private, no-store to all authenticated and user-specific API routes
- Audit every fetch() call in server components -- decide deliberately: force-cache, revalidate, or no-store
- Switch to export const dynamic = 'force-static' on any page that does not need request-time data
- Add next: { tags: [...] } to cacheable fetches and wire up revalidateTag() to your mutation logic
- Use <Image> from next/image for every image in your app
- Run curl -I against your key routes and confirm x-vercel-cache: HIT appears after first request
- Check Vercel Analytics for cache hit rate and TTFB on your most-visited routes
Ask The Guild
What has been your hardest cache invalidation problem? Did you ever accidentally cache something you should not have -- or discover a page had been serving stale data for days without anyone noticing? Share your war story (or your current caching config) in the community forum. The more specific, the better.