Is it possible to purge the cache used for asset retention
Is it possible to purge the cache used for asset retention (or at least adjust the TTL)? I've noticed that some users can get an old version of a .json file I import into my Next.js project, which is not ideal for the site I'm running. Thanks! https://developers.cloudflare.com/pages/configuration/serving-pages/#asset-retention
The cache for asset retention should only apply to assets present in one deployment but not in the next; it should not apply to files which were changed. Changed files should be served immediately or near-immediately.
Are you using a pages.dev or a custom domain? And if a custom domain, have you tried purging the cache for the zone? The zone may have its own cache running "in front" of the Pages cache.
Strange. I am using a custom domain from Namecheap. Maybe I'll just host the JSON file as a static asset and do a fetch call for it in my code to ensure the newest version.
The custom domain is not using Cloudflare nameservers?
Sorry, yes it is. I did purge the cache after changing the JSON file, but it seems like there was still a user that got the outdated version of the file.
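For reference, a zone purge can also be scripted. A minimal sketch against the Cloudflare purge_cache endpoint, where ZONE_ID and API_TOKEN are placeholders and the token needs cache-purge permission:

```ts
// Minimal sketch: purge the whole zone's cache via the Cloudflare API.
// ZONE_ID and API_TOKEN are placeholders for your own values.
const ZONE_ID = process.env.CF_ZONE_ID!;
const API_TOKEN = process.env.CF_API_TOKEN!;

async function purgeZoneCache(): Promise<void> {
  const res = await fetch(
    `https://api.cloudflare.com/client/v4/zones/${ZONE_ID}/purge_cache`,
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${API_TOKEN}`,
        "Content-Type": "application/json",
      },
      // Passing { files: ["https://example.com/data.json"] } instead would
      // purge just that URL rather than everything.
      body: JSON.stringify({ purge_everything: true }),
    }
  );
  const json = await res.json();
  if (!json.success) {
    throw new Error(`Purge failed: ${JSON.stringify(json.errors)}`);
  }
}
```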
Maybe the user had the outdated file cached in their browser? I'm not sure how else this could have happened.
It's possible there was browser cache involved? Would depend on the way the JSON is fetched, I suppose.
I believe it was included in the webpack bundle, since I essentially "imported" it into my page.tsx file. https://stackoverflow.com/a/68930283
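Roughly what that import pattern looks like, as a minimal sketch (the file name and data shape here are made up). Since webpack resolves the import at build time, the data ships inside the compiled bundle:

```tsx
// Minimal sketch of importing JSON directly (file name is hypothetical).
// Webpack inlines the JSON into the compiled bundle at build time, so
// serving a stale bundle means serving stale data.
import cards from "../data/cards.json";

export default function Page() {
  return <pre>{JSON.stringify(cards, null, 2)}</pre>;
}
```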
But yeah, if browser cache is involved then changing it to be a fetch probably wouldn't help. I've seen this happen a handful of times now.
Normally with webpack and such I would expect the filenames to all be content-hashed, so one file changing without another wouldn't be an issue (and asset retention keeps the old ones around for a time).
Not so sure how this interacts with Next.js in particular though.
Ohhh, fascinating, so asset retention may be kicking in because future webpack builds produce different filenames?
Would need data from a concrete example to say with any certainty what's happening 😅
That's the thing, I have no idea how to replicate it 😠 I did just add a cache rule to disable caching for my Pages domain, as I heard caching can cause some issues with cache invalidation. I guess I'll just have to wait and see if the issue comes up again.
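For reference, the dashboard cache rule corresponds roughly to a zone ruleset in the http_request_cache_settings phase. A sketch of setting it via the Rulesets API, assuming that API shape (ZONE_ID, API_TOKEN, and the hostname are placeholders); note the PUT replaces any existing rules in that phase:

```ts
// Minimal sketch: create a zone-level Cache Rule that bypasses caching.
// ZONE_ID, API_TOKEN, and the hostname below are placeholders.
const ZONE_ID = process.env.CF_ZONE_ID!;
const API_TOKEN = process.env.CF_API_TOKEN!;

async function bypassCacheForHost(): Promise<void> {
  const res = await fetch(
    `https://api.cloudflare.com/client/v4/zones/${ZONE_ID}/rulesets/phases/http_request_cache_settings/entrypoint`,
    {
      method: "PUT", // replaces all rules in this phase for the zone
      headers: {
        Authorization: `Bearer ${API_TOKEN}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        rules: [
          {
            action: "set_cache_settings",
            expression: '(http.host eq "example.com")',
            description: "Bypass cache for the Pages domain",
            // cache: false marks matching requests as not cacheable.
            action_parameters: { cache: false },
          },
        ],
      }),
    }
  );
  const json = await res.json();
  if (!json.success) {
    throw new Error(`Rule update failed: ${JSON.stringify(json.errors)}`);
  }
}
```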
But changing the page.tsx file to fetch the JSON instead of importing it doesn't seem to have fixed anything? Still had one user get stale JSON data today, which is very strange considering that data is close to a week old at this point.
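A plain fetch can still be answered from the browser's HTTP cache if the response headers allow it. A minimal sketch of a cache-bypassing client-side fetch (the URL and data handling are placeholders):

```tsx
"use client"; // client component so the fetch happens in the browser

import { useEffect, useState } from "react";

export default function Page() {
  const [data, setData] = useState<unknown>(null);

  useEffect(() => {
    // cache: "no-store" tells the browser not to read from or write to
    // its HTTP cache for this request.
    fetch("/data.json", { cache: "no-store" })
      .then((res) => res.json())
      .then(setData)
      .catch(console.error);
  }, []);

  return <pre>{JSON.stringify(data, null, 2)}</pre>;
}
```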
Disabled the cache rule real quick to check, and here are the headers of the data in question:
Seems like it was being cached (no "no-cache" present), and I'm not sure if the max-age there is accurate, but if so that would come out to 7 days (604,800 seconds) 🤔
The great news is that enabling that cache rule now has that file coming back with Cache-Control: no-cache
so seems promising 🤞
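One way to double-check the header from a script instead of waiting on user reports, as a minimal sketch (the URL is a placeholder; run it as an ES module since it uses top-level await):

```ts
// Minimal sketch: confirm the JSON now comes back with Cache-Control: no-cache.
const res = await fetch("https://example.com/data.json", { cache: "no-store" });
console.log(res.status, res.headers.get("cache-control")); // expect "no-cache"
```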