Migrating from Vercel, vercel.json alternative
Hello everyone,
I am migrating my projects from Vercel but can't find a way to set up my project the way I used to on Vercel, using a vercel.json that contains:
{
  "headers": [
    {
      "source": "/assets/babylon/ext.babylon",
      "headers": [
        {
          "key": "Accept-Encoding",
          "value": "br"
        },
        {
          "key": "content-encoding",
          "value": "br"
        }
      ]
    }
  ],
  "cleanUrls": true
}
I have some “.babylon” files pre-compressed with Brotli and just need the browser to use them.
Any idea how to achieve this?
Why are you migrating?
Vercel caps the amount of data your deployments send or receive at 100 GB; Cloudflare Pages doesn't have that limit.
Pages still isn't meant to be used as a file locker and has a 25 MB limit per file, just worth mentioning. If you just want something to store and deliver large files of up to 5 TB in size, look at Cloudflare R2
For setting custom headers Pages has a _headers file: https://developers.cloudflare.com/pages/platform/headers/
Cloudflare has docs on how it handles compression with the CDN here: https://developers.cloudflare.com/speed/optimization/content/brotli/content-compression/
Pages is a bit special; I haven't played around too much with how it handles things, but the important bit of that doc is here:
"If you do not want a particular response from your origin to be encoded with GZIP/Brotli when delivered to website visitors, you can disable this by including a cache-control: no-transform HTTP header in the response from your origin web server."
Would try that.
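In Pages terms that would be a couple of lines in the _headers file, roughly like this (just a sketch, assuming the files sit under /assets/babylon/ as in your vercel.json):
/assets/babylon/*.babylon
  Content-Encoding: br
  Cache-Control: no-transform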
Thank you for the heads-up, but I am using Cloudflare Pages just for static sites, which lately have a lot of traffic.
I have tried _headers with the following content, but I can't see it reflected in the response headers:
/assets/babylon/*.babylon
Content-Encoding: br
/assets/babylon/ext.babylon
Content-Encoding: br
Your _headers file is in your build output folder?
You can also see if it got picked up by the build if you go to the deployment details and Headers tab
Magic Link: https://dash.cloudflare.com/?to=/:account/pages/view/:pages-project/:pages-deployment/headers
Thanks for the support. The _headers.txt is where the index.html file is. It's a simple HTML, CSS and JS site, so I guess the root directory is where it is supposed to be, but Cloudflare isn't picking it up.
There's supposed to be no extension on it: not _headers.txt, it's supposed to be just _headers.
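So the build output would look something like this (index.html and the asset path are from your messages; the rest of the layout is just illustrative):
(build output root)
  index.html
  _headers
  assets/
    babylon/
      ext.babylon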
I changed that and now it works!
But now the file is 12 MB, when the pre-compressed one in my project is 1.2 MB. Why is that?
That's a pretty big jump; it's possible Cloudflare is decompressing it, though. You could try setting cache-control: no-transform to get it to leave it alone. What's the URL?
I tried cache-control: no-transform with no luck.
The url is https://dctest.logicaexp.com/assets/babylon/ext.babylon
(Screenshots comparing the response headers: Vercel vs. CF Pages)
Hmm, that's interesting. CF does a bit of magical decompression at the CDN level, and it looks like Pages does some interesting build-time compression as well. Not stuff I've messed with much before.
The one in the Git repo it builds from is 1.2 MB?
I was expecting no-transform to leave it alone, but it looks like another way is just not setting the Content-Encoding, so Cloudflare doesn't know it's compressed and won't mess with it.
Depends what you're targeting, though: whether you want to use it in a browser and have the browser handle decompression, or you're using it in some other context where you can always expect it to be compressed.
Yes, actually it's 1.5 MB.
It's an asset for a WebGL app, so I need to send it compressed to the client and have the client decompress it.
It's the first time I've used Pages, and it's quite weird to me that it isn't as straightforward as Vercel about this.
Pages is special, you essentially have two layers of handling
User -> Their browser decompression/handling -> Custom Domain -> Cloudflare decompression/handling -> Pages.dev -> Pages decompression/handling
Lots of hungry layers which love to eat cache.
It seems when you set it via the _headers file, the layer above says "Hey, this is compressed, I gotchu" and uncompresses it, but if we use a Transform Rule instead to set the headers on the Custom Domain, it works properly.
ex: https://uiguiguigfg.chaika.dev/ext.babylon should be properly served as you would expect
Simple transform Rule (Rules -> Transform Rules -> Response Headers)
Magic Link to new: https://dash.cloudflare.com/?to=/:account/:zone/rules/transform-rules/modify-response-header/new
Might be an easier way, was just what I found.
Also with that approach if the client doesn't send Accept-Encoding: br, it'll just fallback to decompressed
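The rule can be something roughly like this (a sketch using the path from your project; the expression uses standard Rules language fields and functions):
Expression: starts_with(http.request.uri.path, "/assets/babylon/") and ends_with(http.request.uri.path, ".babylon")
Action: Set static response header content-encoding with value br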
Thank you for the reply and the explanation, but I see that I can only have 10 active Transform Rules (on the free plan), and I might need way more than that later.
But once again thank you for trying to help me
I might have to stick with vercel then
Sure no problem, was fun to look into. There's a lot of magic within actual CDNs to serve the optimally compressed version for each visitor
If purely the number of files was your concern, you can use the "is in" operator and list hundreds, or use "starts with" and include everything in a directory; see the sketch below.
But yeah, not the best user experience, kind of confusing.
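For example, a single rule expression using "is in" could look like this (paths are illustrative):
http.request.uri.path in {"/assets/babylon/ext.babylon" "/assets/babylon/other.babylon"}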
Hello Chaika, I tried this approach, but I can't get it to show the content-encoding response header; the content-disposition one does appear.
Any idea why "content-encoding" won't show up? I have tried many combinations of doing this, with and without _headers, with no luck.
Also tried enabling and disabling Brotli in the CF settings.
nice discussion, thanks everyone
How are you testing it? Make sure you're sending request headers including br, e.g. Accept-Encoding: br, else Cloudflare will think it's a client without Brotli support and "helpfully" decompress it.
For me, I don't see either header on the URL you gave above. I'd guess the transform rule isn't matching.
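A quick way to check from the command line is something like this (using the URL you posted earlier):
curl -sI -H "Accept-Encoding: br" https://dctest.logicaexp.com/assets/babylon/ext.babylon
and then look for a content-encoding: br line in the response headers.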