Next.js 16: Caching Finally Makes Sense
Next.js 16 is here, and the biggest change is how caching works. They reversed it completely.
Caching used to be implicit. The framework decided what to cache based on patterns it detected. Sometimes it worked, sometimes you'd spend hours figuring out why stale data kept appearing. Fun times.
Now it's explicit and nothing caches unless you ask it to. What a concept.
What "use cache" actually does
The new directive marks what should be cached. Pages, components, functions, whatever you wrap with "use cache" gets cached. Everything else runs fresh on every request.
"use cache"; export default async function HomePage() { const data = await fetchData(); return <div>{data}</div>; }
That page caches, and if you remove the directive it doesn't. The compiler generates cache keys automatically, so you never write them by hand.
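The directive works the same way on plain functions. Here's a minimal sketch, assuming a hypothetical getPost() helper and API endpoint; the arguments you pass become part of the generated key, so each slug is cached independently:

// lib/posts.ts (hypothetical helper)
export async function getPost(slug: string) {
  "use cache";
  // slug becomes part of the auto-generated cache key,
  // so each post is cached separately
  const res = await fetch(`https://example.com/api/posts/${slug}`);
  return res.json();
}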
This completes Partial Pre-Rendering (PPR), the model they introduced in 2023. Static parts cache while dynamic parts don't, and the boundary is explicit.
The caching APIs changed
revalidateTag() now requires a second argument, a cache profile. Before, you'd call revalidateTag('posts') and it would invalidate immediately. Now you specify behaviour:
import { revalidateTag } from 'next/cache';

revalidateTag('posts', 'hours'); // Stale-while-revalidate for an hour
revalidateTag('posts', 'max');   // Serve stale while fetching fresh
There's a new updateTag() for Server Actions. It gives you read-your-writes semantics. When a user updates something, the UI reflects the change immediately instead of waiting for revalidation. The cache expires and fresh data loads in the same request.
import { updateTag } from 'next/cache';

async function updatePost(formData: FormData) {
  'use server';
  await savePost(formData);
  updateTag('posts'); // Immediate refresh
}
And there's refresh(), which only refetches uncached data. It doesn't touch anything marked with "use cache". This matters when you want to reload dynamic parts without wiping static content.
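A quick sketch of how that might look in a Server Action, assuming refresh() is imported from next/cache like the other APIs; markAsRead() is just a placeholder for your own mutation:

'use server';

import { refresh } from 'next/cache';

// Placeholder for your own data layer
async function markAsRead(id: string): Promise<void> {
  // ...
}

export async function markNotificationRead(id: string) {
  await markAsRead(id);
  refresh(); // re-runs only the uncached work; "use cache" output stays put
}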
Turbopack is now default
Turbopack shipped as stable. It's the default bundler for new projects. Fast Refresh is up to 10× faster. Production builds are 2–5× faster.
I've been using Turbopack exclusively since it became available. I don't develop without it anymore. The speed difference is noticeable on every save: Fast Refresh is instant, and builds don't make me wait. Once you feel how fast development can be, webpack feels like driving with the handbrake on. No wonder they made it the default.
If you've got custom webpack config, you can still use it with --webpack. But Turbopack handles most cases now. Over 50% of Next.js 15.3+ dev sessions already use it.
There's also File System Caching in beta. It speeds up cold starts on large apps by caching compiled output between sessions.
proxy.ts replaces middleware.ts
They renamed middleware.ts to proxy.ts. The logic is identical, but the name makes the network boundary clearer. You're proxying requests, not just injecting middleware.
I've never liked the word middleware. It's not that I didn't know what it was, but it's always been a word that never made sense straight away. What does "middle" even refer to? Middle of what? The developer's confusion? Proxy is direct and tells you exactly what it does. Imagine that, naming things after what they actually do.
// proxy.ts
import { NextRequest, NextResponse } from 'next/server';

export function proxy(request: NextRequest) {
  // Same logic as before
  return NextResponse.next();
}
Middleware still exists for Edge runtime cases, but it's deprecated. Future versions will remove it entirely.
What breaks
Node.js 18 support is gone. You need 20.9 or higher.
params and searchParams are now async. Instead of destructuring them directly, you await them:
// Before
export default function Page({ params }: { params: { slug: string } }) {
  return <div>{params.slug}</div>;
}

// Now
export default async function Page({ params }: { params: Promise<{ slug: string }> }) {
  const { slug } = await params;
  return <div>{slug}</div>;
}
The experimental.ppr flag is gone. Cache Components replace it. If you were testing PPR, you'll need to migrate to "use cache".
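The config change itself is small. A rough sketch of the before and after, assuming the Next.js 16 opt-in is the cacheComponents option (double-check the release notes for the exact key):

// next.config.ts
import type { NextConfig } from 'next';

const nextConfig: NextConfig = {
  // Before (15.x): experimental: { ppr: 'incremental' }
  // Now: opt in to Cache Components instead
  cacheComponents: true,
};

export default nextConfig;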
Why this matters
Implicit caching felt clever until it didn't. You'd deploy, see stale data, then spend three hours digging through docs trying to figure out which layer was caching what. Because nothing says "developer experience" like a mystery debugging session at 2am.
Explicit caching removes the mystery. If it caches, you asked for it, and if it doesn't, you didn't. The behaviour is in the code, not hidden in framework defaults.
This makes reasoning about performance straightforward because static content caches, dynamic content doesn't, and the boundary is visible in your code. Systems become easier to maintain when you can see exactly what's happening.
I'm loving that Vercel is paying attention to how people actually use the framework. This change came from real developer frustration, not theoretical improvements.
When this actually matters
Not every app needs this. My personal site is mostly static content, blog posts that don't change often, portfolio pieces that update rarely. Static generation at build time works fine. Upgrading just for component-level caching would add migration work without solving a real problem.
For WonderBook, ISR handles the explore page well enough. Story listings stay fresh, the page loads fast. I could split the page shell from the feed with "use cache", but ISR already does what I need. The dashboard is fully dynamic because it shows user-specific data, and that's straightforward with Next.js 15.
FounderBase is different. The workflow builder and AI reasoning need to stay dynamic, but help docs, templates, and onboarding flows don't. Right now I'd have to choose between making entire pages static or dynamic. With component-level caching, I could cache the docs and keep the AI features fresh without awkward page-level compromises.
The updateTag() API is where I'd actually use this. When someone saves a workflow or updates settings in FounderBase, the UI could reflect changes immediately instead of waiting for revalidation. That matters for forms and interactive features where stale data feels broken.
What makes component-level caching powerful
The granularity is what makes this model work. Before, you chose caching at the page level. The entire page was static, or it used ISR, or it was fully dynamic. If you had a dashboard with a mix of static UI and dynamic data, you either over-cached and served stale information or under-cached and recomputed everything. Pick your poison.
Now you can wrap individual components with "use cache" and the framework handles the rest. A navigation component caches once while a user profile widget stays dynamic. A settings panel caches while a real-time notification feed doesn't. You're not forced into all-or-nothing decisions about entire routes.
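As a rough sketch of what that split can look like, here's a page with a hypothetical DocsSidebar that caches and a NotificationFeed that stays dynamic behind a Suspense boundary (component names and endpoints are made up):

import { Suspense } from 'react';

// Cached: the directive sits at the top of the component body
async function DocsSidebar() {
  'use cache';
  const docs: { slug: string; title: string }[] = await fetch(
    'https://example.com/api/docs'
  ).then((res) => res.json());
  return (
    <nav>
      {docs.map((doc) => (
        <a key={doc.slug} href={`/docs/${doc.slug}`}>{doc.title}</a>
      ))}
    </nav>
  );
}

// Dynamic: no directive, so it runs fresh on every request
async function NotificationFeed({ userId }: { userId: string }) {
  const items: { id: string; text: string }[] = await fetch(
    `https://example.com/api/notifications/${userId}`
  ).then((res) => res.json());
  return (
    <ul>
      {items.map((item) => (
        <li key={item.id}>{item.text}</li>
      ))}
    </ul>
  );
}

export default async function DashboardPage() {
  return (
    <main>
      <DocsSidebar />
      <Suspense fallback={<p>Loading notifications…</p>}>
        <NotificationFeed userId="demo-user" />
      </Suspense>
    </main>
  );
}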
This matters when building apps where some data changes constantly and other parts don't change for weeks. You don't have to pick between speed and freshness anymore; you get both, split exactly where you need them. The code shows which parts cache and which don't, so six months from now you'll know exactly why a component behaves the way it does.