Cache is not being utilized efficiently

Koviczech (OP) · 3mo ago
Hello, I would like to ask for the opinion of those more experienced: what would you recommend regarding our CDN strategy? We run an e-commerce site with approximately 55,000 products and a few thousand other pages (landing pages, content pages), so in total around 65,000 pages per domain. We currently have four domains (CZ, SK, HU, DE). The site is a single-page application, rendered for both customers and bots via SSR, and the most visited pages are PDPs and PLPs. In the initial phase, we enabled the Business plan on the production HU domain, which has the lowest traffic of the four, to troubleshoot initial issues. Unfortunately, we have not been able to improve the user experience for customers, or the related Google metrics, through the CDN. What we are trying to cache:
• HTML pages (SSR content)
• API endpoint responses
• static content (media, JS, CSS, etc.)
The biggest issue I see with Cloudflare is that we are caching very little HTML content. Of the 1.76M total daily hits, only 20k are HTML hits. I started analyzing why we have so few HTML hits.
○ Probably because of insufficient traffic, considering the 2-minute TTL of these pages?
If we could increase the TTL of these pages, would it lead to more efficient cache usage and thus a higher hit ratio on HTML? Yes, but only maybe. What is our current maximum capacity for storing content? From the documentation, I found 512 MB, which at first glance seems quite small for large e-commerce sites. If we do simple math, for our SPA with SSR we cache the following for a single customer visit:
• loading the first page of a new visit (SSR render): around 1 MB
• loading static files and API endpoint responses for the rest of the visit (CSR): around 1 MB per page
Based on GA, which tells me the average visit involves 7 pages, one visit (a customer browsing the site) needs around 8 MB of cache. Simple math shows we can ideally cache only 64 different visits (512/8), and here we might be hitting the limit of cache usage. If our traffic is spread widely across all PDPs, each visit and browsing session will be quite different. In that case, the cache for those 64 visits will definitely not be enough; entries will be constantly replaced, making it impossible to use the cache effectively. So the question is whether increasing the TTL will help at all, given that the content will be constantly evicted. Also, how does Cloudflare replace existing entries when memory is full? Is it classic FIFO, or does it prioritize more frequently visited pages in some way?
I would also like to ask whether anyone has faced a similar issue, or runs an e-commerce site under similar conditions. What is your recommendation for cache settings: what should we cache, and how? And how do you handle your test environment in such cases? Do you have caching enabled there as well? Because if we need a larger cache, the test environment might be taking up memory unnecessarily.
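Before changing any settings, one quick sanity check is to request a handful of pages twice and watch the CF-Cache-Status response header. A minimal sketch in Python; the URLs are hypothetical placeholders, not real paths:

```python
# Probe a sample of pages twice and record the CF-Cache-Status header,
# to see whether HTML ever turns from MISS into HIT within its TTL.
# The URLs below are placeholders; substitute real PDP/PLP addresses.
import time
import urllib.request

SAMPLE_URLS = [
    "https://example.hu/product/12345",   # hypothetical PDP
    "https://example.hu/category/shoes",  # hypothetical PLP
]

def cache_status(url: str) -> str:
    req = urllib.request.Request(url, method="GET")
    with urllib.request.urlopen(req) as resp:
        # Dynamic/BYPASS = not cacheable; MISS/EXPIRED = cacheable but cold;
        # HIT/UPDATING = served from Cloudflare's cache.
        return resp.headers.get("CF-Cache-Status", "<absent>")

for url in SAMPLE_URLS:
    first = cache_status(url)
    time.sleep(2)              # stay well inside the 2-minute TTL
    second = cache_status(url)
    print(f"{url}: {first} -> {second}")
```

If the second request still says Dynamic or BYPASS, nothing is making the HTML cacheable in the first place; a MISS that never turns into a HIT points at TTL or eviction pressure instead.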
Chaika · 3mo ago
> The biggest issue I see with Cloudflare is that we are caching very little HTML content. Of the 1.76M total daily hits, only 20k are HTML hits. I started analyzing why we have so few HTML hits.
Worth noting you need a Cache Rule to cache HTML at all; no matter what headers you send, CF just won't cache HTML without one. (A CF-Cache-Status response header of Dynamic/Bypass means the response isn't cacheable; Miss/Updating/Hit means it is.) I'm guessing you have at least one, since you're getting any HTML hits at all (or a Page Rule forcing caching).
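For reference, Cache Rules can also be managed through the Rulesets API instead of the dashboard. A rough sketch, assuming a placeholder zone ID and token; the payload shape should be verified against Cloudflare's current API docs before relying on it:

```python
# Rough sketch: set the Cache Rules entrypoint ruleset so that HTML under
# /product/ is eligible for caching with a 120 s edge TTL.
# ZONE_ID and API_TOKEN are placeholders; check the field names against
# Cloudflare's Rulesets API documentation.
import json
import urllib.request

ZONE_ID = "your-zone-id"      # placeholder
API_TOKEN = "your-api-token"  # placeholder

payload = {
    "rules": [
        {
            "action": "set_cache_settings",
            "expression": 'starts_with(http.request.uri.path, "/product/")',
            "description": "Cache SSR product HTML",
            "action_parameters": {
                "cache": True,
                "edge_ttl": {"mode": "override_origin", "default": 120},
            },
        }
    ]
}

req = urllib.request.Request(
    f"https://api.cloudflare.com/client/v4/zones/{ZONE_ID}"
    "/rulesets/phases/http_request_cache_settings/entrypoint",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": f"Bearer {API_TOKEN}",
        "Content-Type": "application/json",
    },
    method="PUT",
)
with urllib.request.urlopen(req) as resp:
    print(resp.status)
```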
> If we could increase the TTL of these pages, would it lead to more efficient cache usage and thus a higher hit ratio on HTML?
By some amount, for sure; it just depends on what works for you. Tiered Caching is also an interesting option if you mainly want to reduce load on the origin, or if the origin is slow.
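To put a rough number on "by some amount": in a simple steady-state model, each distinct URL costs one MISS per TTL window and everything beyond that is a HIT. A back-of-the-envelope sketch, where the HTML request volume and the uniform traffic split are assumptions, not measured data:

```python
# Toy steady-state model of HTML hit ratio vs. TTL: each distinct URL costs
# one MISS per TTL window; every further request in that window is a HIT.
# Assumes uniform traffic across pages -- an illustrative guess, not real data.
DAILY_HTML_REQUESTS = 200_000   # assumed HTML request volume (not stated above)
DISTINCT_PAGES = 65_000         # pages on one domain, from the post

def expected_hit_ratio(ttl_seconds: float) -> float:
    windows_per_day = 86_400 / ttl_seconds
    requests_per_page_per_window = DAILY_HTML_REQUESTS / (DISTINCT_PAGES * windows_per_day)
    if requests_per_page_per_window <= 1:
        return 0.0   # at most one request per window: effectively all misses
    return 1 - 1 / requests_per_page_per_window

for ttl in (120, 3_600, 86_400):
    print(f"TTL {ttl:>6}s -> expected hit ratio {expected_hit_ratio(ttl):.1%}")
```

Real traffic is skewed toward popular PDPs, so actual numbers will be better than this uniform model, but the shape is the point: at a 2-minute TTL spread over tens of thousands of URLs, HTML hits are inevitably rare, and the TTL has to grow toward hours before the picture changes.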
> Yes, but only maybe. What is our current maximum capacity for storing content? From the documentation, I found 512 MB, which at first glance seems quite small for large e-commerce sites. If we do simple math, for our SPA with SSR we cache the following for a single customer visit:
There's no maximum per site. You're talking about the per-file cache limit. CF will evict infrequently accessed assets quickly though; cache eviction is based on access frequency. The rest of your message is built on that cache limit, but yeah, it applies per cached asset, not to your entire site.
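On the FIFO question from the original post: a toy simulation shows why recency/frequency-aware eviction keeps hot pages resident even while the long tail churns. This is only an illustrative LRU model, not Cloudflare's actual eviction algorithm:

```python
# Toy model: LRU eviction over a tiny cache, mixing a few "hot" pages with
# a long tail. Illustrative only; not Cloudflare's real eviction policy.
from collections import OrderedDict
import random

random.seed(0)
CACHE_SLOTS = 100
cache: "OrderedDict[int, None]" = OrderedDict()
hits = misses = 0

for _ in range(100_000):
    # 30% of traffic goes to 20 hot pages, the rest to a 10,000-page tail.
    page = random.randrange(20) if random.random() < 0.3 else 20 + random.randrange(10_000)
    if page in cache:
        cache.move_to_end(page)       # refresh recency on a hit
        hits += 1
    else:
        misses += 1
        cache[page] = None
        if len(cache) > CACHE_SLOTS:
            cache.popitem(last=False)  # evict the least recently used entry

print(f"hit ratio: {hits / (hits + misses):.2%}")
# Hot pages stay resident under LRU; under plain FIFO the tail would push
# them out regardless of how often they are requested.
```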