Implementing complex rate-limiting
I have an upstream service I want to hammer with as many requests as I can. But, unsurprisingly, it has rate limiting in place. I recently implemented this with Durable Objects (DOs), blocking on each request to serialize concurrency per DO, but this is incredibly expensive (10 days ≈ $120) as the DO is permanently active while making requests.
Is there a smarter way to implement stateful rate limiting? I'm already using Queues to make the requests, but I'm not sure if there's a way to carry state as part of Queues.
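For context, the DO pattern I mean looks roughly like this (a minimal sketch with hypothetical names, not my actual code): every job funnels through one object instance, which stays active, and billed, for as long as it is draining work.

```typescript
// Sketch of the Durable Object approach described above (hypothetical names).
// One DO instance serializes all upstream calls, so it stays active
// (and accrues duration charges) the entire time it has work queued.

interface QueuedJob {
  url: string;
}

class RateLimiterDO {
  private queue: QueuedJob[] = [];
  private draining = false;

  // Serialize: at most one in-flight upstream request at a time.
  async enqueue(job: QueuedJob, send: (url: string) => Promise<void>) {
    this.queue.push(job);
    if (this.draining) return; // an earlier call is already draining the queue
    this.draining = true;
    while (this.queue.length > 0) {
      const next = this.queue.shift()!;
      await send(next.url); // blocks until the upstream call finishes
      // Spacing between calls to stay under the upstream limit (placeholder value).
      await new Promise((resolve) => setTimeout(resolve, 100));
    }
    this.draining = false;
  }
}
```

The serialization is what makes it correct, and also what makes it expensive: the DO never idles while there is work.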
5 Replies
Could you use cron-triggered Workers?
For example, if the API enforces rate limiting on a per-minute basis (as some do), or with a "bucket" of requests that replenishes over time, you could set a Worker on a 1-minute Cron Trigger, perform as many requests as you can (until you get a 429 Too Many Requests or similar), then exit and wait until the next cron fires the Worker.
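Something like this sketch (the URL, request cap, and stopping condition are all placeholders; the cron schedule itself would live in `wrangler.toml` as `crons = ["* * * * *"]`):

```typescript
// Sketch of a cron-triggered Worker that drains requests until rate-limited.
// The fetcher is injected so the loop is easy to test; in the Worker it is
// just the global fetch.

type Fetcher = (url: string) => Promise<{ status: number }>;

// Fire requests until the upstream answers 429 (or a hard safety cap is hit),
// then stop and let the next cron invocation pick up where this one left off.
async function drainUntilLimited(
  fetchImpl: Fetcher,
  url: string,
  maxRequests = 1000,
): Promise<number> {
  let sent = 0;
  while (sent < maxRequests) {
    const res = await fetchImpl(url);
    if (res.status === 429) break; // rate-limited: wait for the next cron fire
    sent++;
  }
  return sent;
}

// Worker entry point, invoked by the Cron Trigger once per minute.
export default {
  async scheduled(
    _event: unknown,
    _env: unknown,
    ctx: { waitUntil(p: Promise<unknown>): void },
  ) {
    // Hypothetical upstream endpoint.
    ctx.waitUntil(
      drainUntilLimited(fetch as unknown as Fetcher, "https://upstream.example.com/api"),
    );
  },
};
```

The nice property is that the Worker only runs while it is actually sending, instead of paying for a DO that sits active around the clock.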
That's pretty smart - I guess I'd have to implement the queueing functionality some other way, but it might work
Let me know how it goes! 🙂
Maybe use Queues directly.
https://developers.cloudflare.com/queues/learning/batching-retries/
I don't think that's gonna work because there's no way to use state there
and manual dequeue isn't available yet
and the approach that's possible today, piping the queue through a DO, is very expensive; that's what I tried