I have a queue with a batch size of 50 and a timeout of 10s. A few questions:
1- If I do .send, will the msg be processed as soon as it arrives, or will it wait for 10s before processing?
2- If a batch of 50 msgs is being processed by the consumer and it takes longer than 10s, what happens next if there are more messages waiting to be processed?
2.1- Will a new batch be delivered to the consumer? If so, will it make a new instance of the consumer, or will the same instance of the consumer process it?
2.2- Will the new batch wait until the processing of the previous batch is complete?
3- There is something related to concurrency, but it seems like it is never triggered. How can I verify if it is being triggered? For now it seems like all processing is sequential, which becomes a bottleneck very quickly if multiple users are using it.
4 Replies
Pranshu Maheshwari•4w ago
1- If I do .send will msg be processed as soon as it arrives or it will wait for 10s before processing?
The message will be sent from your producer to the Queue immediately, but your consumer only receives messages once a batch is ready. A batch is delivered whenever 50 messages have accumulated in the Queue, or when 10 seconds have passed since the last batch, whichever comes first (docs linked here: https://developers.cloudflare.com/queues/configuration/batching-retries/)
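As a concrete example, those two knobs (50 messages / 10 seconds) map onto the consumer settings in `wrangler.toml`. A minimal sketch, assuming your queue is called `my-queue` (queue and binding names here are placeholders):

```toml
# Producer binding: lets the Worker call env.MY_QUEUE.send(...)
[[queues.producers]]
queue = "my-queue"       # placeholder queue name
binding = "MY_QUEUE"

# Consumer: deliver up to 50 messages per batch, or whatever has
# accumulated after 10 seconds, whichever comes first
[[queues.consumers]]
queue = "my-queue"
max_batch_size = 50
max_batch_timeout = 10   # seconds
```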
2- If a batch of 50 msgs is being processed by the consumer and it takes longer than 10s, what happens next if there are more messages waiting to be processed?
The batch logic doesn't matter as such here. The best way to think about it is: batches determine the logic for how often, and how many, messages are sent to your consumer. If it takes your consumers longer than 10s to process the batch, that's ok! If there are more messages in the backlog than your consumer can keep up with, your consumer workers will autoscale (up to a max of 250) to keep up with the backlog. Consumer workers operate in parallel - so up to 250 batches of messages can be processed at once.
2.1- Will a new batch be delivered to the consumer? If so, will it make a new instance of the consumer, or will the same instance of the consumer process it?
Yes, new batches will be delivered. It'll be a new consumer worker invocation per batch.
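A minimal sketch of what "one invocation per batch" looks like from the consumer's side. The types below are hand-rolled stand-ins for the Workers runtime ones so the snippet can run standalone; in a real Worker this would be the `queue()` handler on the default export:

```typescript
// Stand-ins for the Workers runtime types (so this runs outside the runtime).
type Message = { id: string; body: unknown; ack: () => void };
type MessageBatch = { queue: string; messages: Message[] };

// Each delivered batch gets its own invocation of this handler;
// it never sees other batches, and batches run in parallel.
export async function handleBatch(batch: MessageBatch): Promise<number> {
  let processed = 0;
  for (const msg of batch.messages) {
    // ... per-message work goes here ...
    msg.ack(); // explicit ack so the message is not retried
    processed++;
  }
  return processed;
}
```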
2.2 Will new batch wait till the processing of previous batch is complete?
Nope. Your concurrent consumers should operate in parallel.
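A toy illustration of that parallelism (plain Node timers, not the Workers API): three simulated batches each taking ~100ms finish in roughly the time of one batch, because none of them waits for the previous one.

```typescript
// Simulate processing one batch: ~100ms of fake work.
async function processBatch(id: number): Promise<number> {
  await new Promise((resolve) => setTimeout(resolve, 100));
  return id;
}

// Start all three "batches" together; wall time stays ~100ms, not ~300ms.
export async function processAllBatches(): Promise<{ ids: number[]; ms: number }> {
  const start = Date.now();
  const ids = await Promise.all([1, 2, 3].map(processBatch));
  return { ids, ms: Date.now() - start };
}
```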
Pranshu Maheshwari•4w ago
3- There is something related to concurrency. But it seems like it is never triggered. How can I verify if it is being triggered?
Concurrency should get triggered automatically. The docs here list out why it might not be happening (https://developers.cloudflare.com/queues/configuration/consumer-concurrency/#why-are-my-consumers-not-autoscaling). To verify it is happening, you can use the dashboard or our GraphQL API. On the dashboard, navigate to the Queue in question; you should see a "concurrency" metric in the metrics section for the Queue. If you want to do it via GraphQL, here's a sample query: https://developers.cloudflare.com/queues/observability/metrics/#get-average-consumer-concurrency-by-hour
Pranshu Maheshwari•4w ago
Could you send me over your Queue ID and account ID? Before I overwhelm you with more information, let me double check that everything is working as expected 🙂
Saqib (Langbase)OP•4w ago
This is super helpful. I have been able to debug and streamline it. Thank you so much @Pranshu Maheshwari