Chang
Cloudflare Developers
•Created by Chang on 12/24/2024 in #workers-help
losing host header in wrangler
I have Caddy proxying to wrangler for a Worker. Locally I am struggling to get the subdomain. I know I can pass the host while launching wrangler, but that won't scale for integration tests.
And when I add this in Caddy, hoping it could be a decent workaround, I am shocked to see the following behavior:
request_header X-Original-Host customer3.mydomain.app
request_header X-Original-Host1 customer3.whatever
request_header X-Custom-Header customer3.mydomain.app
The second header is preserved, but the first one comes in as "mydomain.app". Surprisingly, even the third one comes in as "mydomain.app". It seems like a real bug to me, where some code is doing simple string matching against the domain irrespective of which header it is.
It's really blocking me from adding logic based on the subdomain and running the same code in production and locally. My local integration suite also has to test this flow properly, since I use the subdomain to run some custom logic.
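For context, this is the kind of Worker-side logic the forwarded header is meant to feed. A minimal sketch, assuming the proxy sets X-Original-Host as in the Caddy snippet above and that falling back to the request's own hostname is acceptable in production; "customer3" and "mydomain.app" are just this post's example values, not anything Cloudflare-specific:

```ts
// Minimal sketch: read the subdomain from the forwarded header if present,
// otherwise from the request's own hostname.
export default {
  async fetch(request: Request): Promise<Response> {
    // Prefer the header set by the local proxy; fall back to the real host in production.
    const host =
      request.headers.get("X-Original-Host") ?? new URL(request.url).hostname;
    const suffix = ".mydomain.app";
    const subdomain = host.endsWith(suffix)
      ? host.slice(0, -suffix.length)
      : null;
    if (subdomain === "customer3") {
      // Subdomain-specific logic would go here.
      return new Response(`Hello, ${subdomain}`);
    }
    return new Response("No customer subdomain found", { status: 404 });
  },
};
```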
3 replies
Cloudflare Developers
•Created by Chang on 12/24/2024 in #workers-help
worker static assets
I am using a Worker with static assets and it works fine in production, but locally I want to proxy any non-matching paths to my app on port 8080.
I am trying to match example.com/login* --> that works.
Now example.com/appname -> this doesn't proxy to my local app server running on 8080.
I am unable to figure out which setting to use in wrangler.toml.
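One way this can be handled in the Worker itself rather than purely in config; a minimal sketch, assuming wrangler.toml defines an assets binding named ASSETS and that the Worker code runs for requests that should be able to fall through (the localhost:8080 fallback is only intended for local dev):

```ts
// Minimal sketch: try the static assets binding first, and proxy anything it
// cannot serve to the local app server on port 8080.
interface Env {
  ASSETS: { fetch(request: Request): Promise<Response> };
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    // Ask the static assets binding for the path first.
    const assetResponse = await env.ASSETS.fetch(request);
    if (assetResponse.status !== 404) {
      return assetResponse;
    }
    // No matching asset: forward the request to the local app server instead.
    const url = new URL(request.url);
    url.protocol = "http:";
    url.host = "localhost:8080";
    return fetch(new Request(url.toString(), request));
  },
};
```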
7 replies
Cloudflare Developers
•Created by Chang on 11/12/2024 in #workers-help
Workers concurrency
Maybe a dumb question, but I have been trying to learn about concurrent requests to Workers. When the documentation says there is only one instance of the Worker on an edge server, how will it handle, say, 100 concurrent requests to the same edge server? How are memory limits handled? Will it spawn more workers in the same edge location, or will it reject the requests if the instance is already beyond 128 MB? Are cancelled requests retried? I am a bit lost trying to understand this topic, even after a lot of research in the Cloudflare forums.
How do we plan for concurrency? GCP Cloud Run offers it as a configurable parameter for each container instance.
1 reply
Cloudflare Developers
•Created by Chang on 2/17/2024 in #workers-help
workers site caching and DDoS
I am kind of new to CF. I have a workers site that renders static HTML for any subdomain using the route *.mydomain.com. It works fine as expected.
Now I am unable to figure out how to enforce caching on the generated HTML so that the Worker won't be hit every time until the cached HTML is purged explicitly or CF removes the cached page.
I am more worried about getting DDoS-ed. I am implementing a simple static site builder using a CF Workers Site as the origin and Cloudflare for SaaS for custom domains.
I understand I can upgrade to Business/Enterprise and get better DDoS options, but I am wondering how small companies deal with an unexpectedly large Workers bill.
Edit: Small typo
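A minimal sketch of one option at the Worker level, using the Workers Cache API; renderHtml() is a hypothetical stand-in for this site's real HTML generation. Note that this only avoids re-running the render logic on repeat requests to the same URL; the Worker itself is still invoked on the route, so it is not a full answer to the DDoS concern:

```ts
// Minimal sketch: cache the generated HTML per URL and serve it from the
// Cloudflare cache until it expires or is purged.
async function renderHtml(hostname: string): Promise<string> {
  // Placeholder for the real static-site rendering logic.
  return `<html><body>Site for ${hostname}</body></html>`;
}

export default {
  async fetch(request: Request, _env: unknown, ctx: ExecutionContext): Promise<Response> {
    const cache = caches.default;
    // Serve a previously generated page without re-rendering it.
    const cached = await cache.match(request);
    if (cached) {
      return cached;
    }
    const hostname = new URL(request.url).hostname;
    const response = new Response(await renderHtml(hostname), {
      headers: {
        "Content-Type": "text/html; charset=utf-8",
        // How long the page may be kept before re-rendering (adjust as needed).
        "Cache-Control": "public, max-age=86400",
      },
    });
    // Store a copy in the cache without delaying the response.
    ctx.waitUntil(cache.put(request, response.clone()));
    return response;
  },
};
```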
3 replies