Too many subrequests with Cloudflare Pages

To my knowledge, with the new Standard usage model, there is no longer a limit on the number of subrequests. So why am I getting a "Too many subrequests" error for some of my Pages Functions?
Chaika · 5d ago
There's still a limit, you're looking at it right there...?
Lucas Willems · 5d ago
Ok, my bad, I misread the documentation. For Standard, there is a limit of 1,000 subrequests per request on the paid plan. Yes, indeed, I was going to say that.
Chaika · 5d ago
All Standard did is mash together Bundled and Unbound into one tier with a max of 30s of billable CPU time (instead of the 50ms Bundled got, and instead of Unbound billing on duration), and most of the other limits were inherited from Unbound.
Lucas Willems · 5d ago
This thread can be marked as solved 👍 (I can't find how to do it myself)
Chaika · 5d ago
Ah ok, are you saying you don't think you're hitting it then?
Lucas Willems · 5d ago
I'm hitting it in production. However, when I test locally with wrangler, I don't hit any such limit. Do you have an idea why?
Chaika · 5d ago
It's not enforced in local wrangler; most of the limits aren't, including CPU time.
Lucas Willems · 5d ago
The weird thing is that I'm pretty sure I make fewer than 50 subrequests. When I log them, I only see 36. But on Cloudflare I get the "Too many subrequests" error. How could I debug further, or understand why I'm hitting this limit?
Chaika · 5d ago
May be worth mentioning there is also a limit of 6 concurrent subrequests per request. Are you on Standard, or the free plan? Could you be doing too many concurrently per request?
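As an illustrative sketch (not from the thread), one way to stay under that concurrency cap is to issue subrequests in batches and await each batch before starting the next; the `fetchInBatches` helper and the `urls` list are hypothetical:

```ts
// Hypothetical helper: fetch a list of URLs while keeping at most
// `limit` subrequests open at once (6 being the documented limit on
// simultaneous open connections per request).
async function fetchInBatches(urls: string[], limit = 6): Promise<Response[]> {
  const results: Response[] = [];
  for (let i = 0; i < urls.length; i += limit) {
    const batch = urls.slice(i, i + limit);
    // Wait for the whole batch to finish before opening the next set
    // of connections.
    results.push(...(await Promise.all(batch.map((url) => fetch(url)))));
  }
  return results;
}
```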
Lucas Willems · 5d ago
I am on the free plan. And yes, I'm doing a lot of requests in parallel. Would Cloudflare throttle them or just fail?
Chaika · 5d ago
Docs are here: https://developers.cloudflare.com/workers/platform/limits/#simultaneous-open-connections the key part is this:
If the system detects that a Worker is deadlocked on open connections — for example, if the Worker has pending connection attempts but has no in-progress reads or writes on the connections that it already has open — then the least-recently-used open connection will be canceled to unblock the Worker. If the Worker later attempts to use a canceled connection, an exception will be thrown. These exceptions should rarely occur in practice, though, since it is uncommon for a Worker to open a connection that it does not have an immediate use for.
e.g. if you're awaiting a ton of them at once, it detects the deadlock and starts killing connections.
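To make the quoted behaviour concrete, a hypothetical sketch of the pattern it describes; the Pages Function and URL list are made up for illustration:

```ts
// Hypothetical Pages Function, purely to illustrate the two patterns.
export async function onRequest(): Promise<Response> {
  const urls: string[] = [/* ...subrequest targets... */];

  // Pattern the docs warn about: open every connection up front and
  // read the bodies later. With more than 6 URLs there are pending
  // fetches while earlier responses sit open with no in-progress
  // reads, so the least-recently-used connection can be cancelled and
  // a later res.text() on it will throw.
  const responses = await Promise.all(urls.map((url) => fetch(url)));
  const risky = await Promise.all(responses.map((res) => res.text()));

  // Safer: read each body as soon as its response arrives, so an open
  // connection always has a read in progress (and cap concurrency as
  // sketched above).
  const safer = await Promise.all(
    urls.map(async (url) => (await fetch(url)).text()),
  );

  return new Response(JSON.stringify({ risky: risky.length, safer: safer.length }));
}
```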
Lucas Willems · 5d ago
Okay, but I should NOT get a "Too many subrequests" error then, no?
Chaika · 5d ago
I don't believe so, not from just too many concurrent requests, yeah. You'd get other errors from the requests failing.
Lucas Willems · 5d ago
Do you have an idea how I could debug further and understand why I get the "Too many subrequests" error?
Chaika · 5d ago
There are third-party libraries like otel which could potentially help, but they do the same thing you could do yourself, which is log each request and see. It may also be worth noting that the Workers fetch will automatically follow redirects, which count towards your subrequest limit as well. Make sure you're using https:// and not being redirected; you could also change the redirect behavior to manual if you think that could be the case: https://developers.cloudflare.com/workers/runtime-apis/request/#properties
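A hypothetical sketch combining both suggestions: a small wrapper that logs and counts each subrequest, and sets `redirect: "manual"` so a redirect shows up as a 3xx response instead of being followed silently. The `loggedFetch` name and counter are made up for illustration:

```ts
// Hypothetical wrapper around fetch that counts and logs every
// subrequest so the total can be compared against the plan's limit.
let subrequestCount = 0;

async function loggedFetch(url: string, init?: RequestInit): Promise<Response> {
  subrequestCount += 1;
  console.log(`subrequest #${subrequestCount}: ${url}`);

  // redirect: "manual" returns the 3xx response instead of following
  // it; each followed redirect would count as another subrequest.
  const res = await fetch(url, { ...init, redirect: "manual" });

  if (res.status >= 300 && res.status < 400) {
    console.log(`  redirect to ${res.headers.get("Location")}`);
  }
  return res;
}
```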
Lucas Willems · 5d ago
Thanks! I'm investigating, and it seems related to some dependencies that I've upgraded.