Hi @Pranshu Maheshwari. How are you doing? The push-based approach should work, but it doesn't 😦 My current workaround is to have 10 producers and 10 consumers and distribute between them. Here is the config:
[[queues.producers]]
binding = "WEBHOOK_QUEUE_1"
queue = "webhook-1"

[[queues.producers]]
binding = "WEBHOOK_QUEUE_2"
queue = "webhook-2"

[[queues.producers]]
binding = "WEBHOOK_QUEUE_3"
queue = "webhook-N"

[[queues.consumers]]
queue = "webhook-1"
max_batch_size = 10
max_batch_timeout = 2
max_retries = 3
retry_delay = 30
dead_letter_queue = "webhook-dlq"

[[queues.consumers]]
queue = "webhook-2"
max_batch_size = 10
max_batch_timeout = 2
max_retries = 3
retry_delay = 30
dead_letter_queue = "webhook-dlq"

[[queues.consumers]]
queue = "webhook-N"
max_batch_size = 10
max_batch_timeout = 2
max_retries = 3
retry_delay = 30
dead_letter_queue = "webhook-dlq"
Where N continues with the same producer and consumer configs, for both, up to 10. If you have any tips ...
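For reference, a minimal sketch of what the producer side of this workaround might look like in a Worker, assuming the WEBHOOK_QUEUE_1..WEBHOOK_QUEUE_10 bindings from the config above; the random pick and the request handling are assumptions, not code from this thread:

```ts
// Sketch only: spread incoming webhooks across the N producer bindings.
// Binding names follow the wrangler config above (WEBHOOK_QUEUE_1..10).
interface Env {
  [binding: string]: Queue<unknown>;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const payload = await request.json();

    // Pick one of the 10 queues at random to spread load roughly evenly.
    const n = Math.floor(Math.random() * 10) + 1;
    await env[`WEBHOOK_QUEUE_${n}`].send(payload);

    return new Response("queued", { status: 202 });
  },
};
```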
Pranshu Maheshwari•3mo ago
Could you send me the config you used previously that wasn't working, as well as the code you used in your consumer? Feel free to DM me if you'd prefer. Finally, could you share your account ID and Queue ID? Both are safe to share here.
Emerson Macedo•3mo ago
I sent it in the initial message. The consumers do not scale; the concurrency config from the docs is useless. Account ID: ef862e42c5cf2d39a50def7dc2ff3534, Queue ID: f0f24e8d582d4c0383b8d4e29ecc90a8
Pranshu Maheshwari•3mo ago
thanks, digging into this
Emerson Macedo•3mo ago
Thanks. As I mentioned, I started with 1 queue and tested with and without the max_concurrency config, but it does not spawn any new consumers. It keeps waiting for the whole batch to complete.
Pranshu Maheshwari•3mo ago
Gotcha. I think what's going on is:
- We scale up consumer concurrency by looking at the backlog of messages
- We run the check to scale up a consumer at the end of each batch
- If your batches process quickly, this works just fine
- But if you have a single batch which takes much longer (I noticed that this happens occasionally in your account), this doesn't work as well

I'll add an explanation of this to the docs, thanks for highlighting the issue. We're going to address this in the future, but until then, a couple of ideas to unblock you:
1. Making batch sizes a little smaller might help in your case; I noticed that most of your batches finish very quickly, but occasionally some take much longer to process.
2. A timeout on your consumer could help too; if a single message is taking too long to process, you could add it to the DLQ or a secondary queue (see the sketch below).
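A rough sketch of idea 2, using explicit acks so one slow message doesn't hold up the whole batch. The 10-second budget, the SLOW_QUEUE binding, and processWebhook are placeholders, not anything confirmed in this thread:

```ts
// Sketch: give each message a time budget and re-route slow ones to a
// secondary queue instead of blocking the batch.
interface Env {
  SLOW_QUEUE: Queue<unknown>; // assumed binding to a secondary queue
}

// Placeholder for the real webhook-processing logic.
async function processWebhook(payload: unknown): Promise<void> {
  // ... call the downstream service here
}

export default {
  async queue(batch: MessageBatch<unknown>, env: Env): Promise<void> {
    for (const msg of batch.messages) {
      try {
        const timedOut = await Promise.race([
          processWebhook(msg.body).then(() => false),
          new Promise<boolean>((resolve) => setTimeout(() => resolve(true), 10_000)),
        ]);
        if (timedOut) {
          // Hand the slow message to a secondary queue and ack it here,
          // so the batch finishes quickly and the scale-up check can run.
          await env.SLOW_QUEUE.send(msg.body);
        }
        msg.ack();
      } catch {
        msg.retry(); // let the normal retry / DLQ path handle hard failures
      }
    }
  },
};
```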
Emerson Macedo•3mo ago
Thanks, @Pranshu Maheshwari. But I have a concern: in my opinion, there should be a way to define concurrency up front based on my needs. Today I created lots of consumers and producers and I'm distributing across them (e.g. I created 10 producers and 10 consumers).
Pranshu Maheshwari•3mo ago
I.e. you want a min_concurrency setting? Why not just let the consumers autoscale? If our queues autoscaled while your batches are being processed, would you still need min_concurrency?