No: batch timeout is on the producer side. It does not determine how the consumer scales. Your volume/message rate is very low, and (right now) we prefer to avoid scaling too quickly on the consumer side.
Question about queues I guess I should have put here? https://discord.com/channels/595317990191398933/1171516366650617866
Question about the Queues beta. Given that the public beta has been out for almost a year now, is there a rough timeline, or a sense of how far along it is toward full production? We would love to use Queues for logging tasks but don't yet feel comfortable using a beta product.
We want to get Queues into GA by the end of Q1 next year. There's architectural work happening behind the scenes to get per-queue throughput into the thousands.
Hello, what is the quickest/fastest way to purge a queue? We are in a situation where our queue has accumulated so much backlog that our consumer can no longer cope with it.
We would like to purge the whole queue first, as a way to mitigate the issue, before we dig into the underlying cause and work on a long-term fix.
Is it a case of deploying a new version of our consumer that just acks the batches with, say, `ackAll()`? Or is there an API that can be invoked from outside a Worker?
I've seen that there is a way to list and ack messages in the dashboard, but that won't help here as it limits the batch size to 100.
There's not a "purge all" method - re-deploying a consumer that acks every batch and returns immediately is the fastest way right now. Or deleting the queue and rebinding to a new queue.
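A minimal sketch of that re-deploy approach, assuming the consumer is already bound to the backlogged queue (this is illustrative, not an official purge mechanism):

```ts
// Temporary "purge" consumer: acknowledge every batch without processing it.
// Deploy this in place of the real consumer until the backlog drains, then roll back.
export default {
  async queue(batch: MessageBatch<unknown>): Promise<void> {
    // ackAll() marks every message in the batch as successfully processed,
    // so nothing is retried.
    batch.ackAll();
  },
};
```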
According to the JavaScript API documentation for Queues, inside the 'export default' block the queue consumer is a separate queue() handler. Is there a way to poll a queue inside a running fetch()? Or is it possible to be notified inside a running fetch() handler about changes to a queue? I'm looking for a way to implement communication between two running Worker instances.
Not without using something like Durable Objects, where each Worker can read/write state for the other.
Would it be possible to implement such a function with KV?
It's not ideal, because KV is eventually consistent and reads are cached for up to 60 seconds, so polling doesn't really work.
If you polled and there was nothing, it could keep reading as nothing for up to 60 seconds, even after your queue consumer wrote to KV.
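To illustrate the Durable Objects suggestion above: a minimal sketch where one Worker writes a value through a Durable Object and another reads it back with strong consistency. The `COORDINATOR` binding and `Coordinator` class names are hypothetical.

```ts
// Durable Object that holds shared state both Workers can reach.
export class Coordinator {
  constructor(private state: DurableObjectState) {}

  async fetch(request: Request): Promise<Response> {
    if (request.method === "PUT") {
      // Writer side: store the latest value.
      await this.state.storage.put("latest", await request.text());
      return new Response("stored");
    }
    // Reader side: strongly consistent read, no 60-second cache like KV.
    const latest = await this.state.storage.get<string>("latest");
    return new Response(latest ?? "empty");
  }
}

// Either Worker (a queue consumer or a fetch handler) reaches the same object
// instance by deriving its ID from a shared name.
export default {
  async fetch(request: Request, env: { COORDINATOR: DurableObjectNamespace }): Promise<Response> {
    const stub = env.COORDINATOR.get(env.COORDINATOR.idFromName("shared"));
    return stub.fetch(new Request("https://coordinator/latest", request));
  },
};
```

Because Durable Object storage is strongly consistent, a read from the second Worker sees the write immediately, which is what the KV approach above can't guarantee.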
Are Queues right for me?
I'm looking to send many events to a queue, wait 30 seconds, and then send a single array with all accumulated events to a destination.
When I removed the forEach over batch.messages, so that it wouldn't process each event individually, it failed.
Can you expand? You control batching via your queue settings (https://developers.cloudflare.com/queues/learning/batching-retries/) - the messages in the consumer are always accessed via `batch.messages`, whether it's a batch of 1 or 100.
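As an illustration of the "accumulate events, then send one array" use case above: a sketch of a consumer that forwards the whole batch in a single request. The destination URL is a placeholder, and the batch limits are assumed to be set on the consumer in wrangler.toml.

```ts
// Consumer that treats the delivered batch as one unit.
// Assumes wrangler.toml sets, for example:
//   [[queues.consumers]]
//   queue = "events"
//   max_batch_size = 100      # up to 100 messages per batch
//   max_batch_timeout = 30    # or whatever arrives within 30 seconds
export default {
  async queue(batch: MessageBatch<unknown>): Promise<void> {
    // Collect every message body into a single array...
    const events = batch.messages.map((msg) => msg.body);

    // ...and send them to the destination in one request (placeholder URL).
    const res = await fetch("https://example.com/ingest", {
      method: "POST",
      headers: { "content-type": "application/json" },
      body: JSON.stringify(events),
    });

    // Throwing here marks the whole batch as failed and triggers a retry.
    if (!res.ok) {
      throw new Error(`Destination responded with ${res.status}`);
    }
    // Returning normally acknowledges the whole batch.
  },
};
```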
Thank you so much for replying. Essentially what's happened is that the consumer did send the batch of events as a single array to the destination, but it thought it had failed. So it would retry (sending the same batch over and over) until it reached the retry limit.
It would "fail" if the `queue()` handler throws an exception; otherwise, returning without an error is considered success.
Alternatively, you can explicitly ack or retry messages: https://developers.cloudflare.com/queues/learning/batching-retries/#explicit-acknowledgement
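A sketch of that explicit variant of the consumer above: instead of relying on thrown exceptions, it calls ackAll()/retryAll() itself (destination URL again a placeholder).

```ts
// Explicit acknowledgement: decide success/failure yourself instead of
// relying on whether the queue() handler throws.
export default {
  async queue(batch: MessageBatch<unknown>): Promise<void> {
    try {
      const res = await fetch("https://example.com/ingest", {
        method: "POST",
        headers: { "content-type": "application/json" },
        body: JSON.stringify(batch.messages.map((msg) => msg.body)),
      });
      if (!res.ok) throw new Error(`HTTP ${res.status}`);

      // The whole batch was delivered in one request, so ack it all at once.
      batch.ackAll();
    } catch (err) {
      // Something genuinely failed: put every message back for another attempt.
      console.error("Delivery failed, retrying batch:", err);
      batch.retryAll();
    }
  },
};
```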
The batch was definitely delivered (as a single batch) because I'm not iterating over batch.messages. So would I still need to explicitly ack every individual message even though I'm sending them as a single batch?
You can `ackAll()` as described there too 🙂
Ahh, got it. It wasn't working in dev. Thank you so much for the help.