How do you cache tRPC server responses with the Next.js App Router?

What is the correct way to define caching for tRPC endpoints with [email protected] and the Next.js App Router? Cache-Control headers are overwritten by Next.js (https://nextjs.org/docs/app/api-reference/next-config-js/headers#cache-control), which means the tRPC caching docs (https://trpc.io/docs/server/caching) don't apply here. I have verified this with a very simple tRPC route that returns new Date().toJSON(). The current setup in create-t3-app appears to result in 'no-cache' for these endpoints: Next.js will not respect Cache-Control headers set via responseMeta in the fetchRequestHandler, or via context.
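For reference, a minimal sketch of the setup being described. The import paths and the createTRPCContext call follow create-t3-app conventions, the `now` procedure is just an illustrative test route, and the responseMeta header shape is the one shown in the tRPC v10 docs, so it may differ slightly on the next tag:

// Sketch of a create-t3-app style src/app/api/trpc/[trpc]/route.ts.
// The test procedure in the router is simply:
//   now: publicProcedure.query(() => new Date().toJSON()),
import { fetchRequestHandler } from '@trpc/server/adapters/fetch'
import { appRouter } from '~/server/api/root'
import { createTRPCContext } from '~/server/api/trpc'

const handler = (req: Request) =>
  fetchRequestHandler({
    endpoint: '/api/trpc',
    req,
    router: appRouter,
    createContext: () => createTRPCContext({ headers: req.headers }),
    // Setting Cache-Control here, as the tRPC caching docs suggest, is what
    // the App Router does not appear to respect.
    responseMeta: () => ({
      headers: {
        'cache-control': 'public, s-maxage=60, stale-while-revalidate=300',
      },
    }),
  })

export { handler as GET, handler as POST }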
psihoza (5mo ago)
I've been banging my head against this for days. You saved me some work on testing Cache-Control headers. My worry is mainly caching database queries to cut costs down: I have more than a few that change rarely but are queried by many different users daily. But once I mutate the data, I want to invalidate the cache right away.

My second option was to wrap the db query in unstable_cache inside the tRPC router endpoints, then call revalidateTag in the mutation endpoints (sketched at the end of this post). This seems to work OK, but it looks ugly and I'm not sure I want to go that route and spend days refactoring all my endpoints. Also, this caches only the db-query part of the API response, and ideally I want to cache the complete response.

The last option is the beta of Vercel's Data Cache (https://vercel.com/docs/infrastructure/data-cache), but it extends fetch and I don't see how to use it with tRPC. A bit frustrated, and close to dumping tRPC altogether; there are so many more options for a regular fetch.

So far, here is what I did. I created a small wrapper for caching db queries:
import { unstable_cache } from 'next/cache'

// Wraps a db query in unstable_cache so repeated calls within the revalidation
// window are served from Next.js' data cache.
// Note: `query` is only awaited on a cache miss. This works because Drizzle's
// query builder is a lazy thenable that hits the db only when awaited; an
// already-running Promise would execute the query even on a cache hit.
export const cachedQuery = async <T>(
  query: Promise<T>,
  keys: string[],
  revalidation: { tags: string[]; revalidate: number },
  name?: string
) => {
  let cacheHit = true

  const cached = unstable_cache(
    async () => {
      // Only runs on a cache miss.
      cacheHit = false
      console.log(`>>>>>>> DB QUERY '${name ?? revalidation.tags[0] ?? 'unknown'}' will run... `)
      return await query
    },
    keys,
    revalidation
  )

  const queryResult = await cached()

  if (cacheHit)
    console.log(`<><><><><> DB QUERY '${name ?? revalidation.tags[0] ?? 'unknown'}' cache hit!`)

  return queryResult
}
... so my route executes db queries like this:
const keys = ['user', 'getAllUsers']
const revalidation = {
  tags: ['user.getAllUsers'],
  revalidate: 60 * 60 * 24,
}

return cachedQuery(
  ctx.db
    .select({
      id: users.id,
      name: users.name,
    })
    .from(users),
  keys,
  revalidation
)
This way it's relatively easy to change routes and quickly test whether caching works, and future changes to the unstable_cache API should be easier to absorb. ... now testing.
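For the invalidation side mentioned above, the plan is to call revalidateTag from the mutation endpoints with the same tag used in the cachedQuery call. A rough sketch under the same assumptions (create-t3-app style imports, a Drizzle `users` table; the `updateUser` procedure is just a made-up example):

import { z } from 'zod'
import { eq } from 'drizzle-orm'
import { revalidateTag } from 'next/cache'
import { createTRPCRouter, protectedProcedure } from '~/server/api/trpc'
import { users } from '~/server/db/schema'

// Hypothetical mutation: after writing to the db, drop every unstable_cache
// entry tagged 'user.getAllUsers' so the next getAllUsers call re-runs the query.
export const userRouter = createTRPCRouter({
  updateUser: protectedProcedure
    .input(z.object({ id: z.string(), name: z.string() }))
    .mutation(async ({ ctx, input }) => {
      await ctx.db
        .update(users)
        .set({ name: input.name })
        .where(eq(users.id, input.id))

      // Same tag as used in the cachedQuery revalidation options above.
      revalidateTag('user.getAllUsers')
    }),
})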