Hello. What would the average queue backlog be on a healthy queue? Ours right now keeps sitting at around 760, with a delayed backlog of around 530.
18 Replies
sathoro
sathoro•2mo ago
if you need more concurrency, create your own queue system with Durable Objects so it can be scaled infinitely
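Something along these lines is a starting point (rough, untested sketch; the QueueShard class, the QUEUE_SHARD binding, and the shard count of 10 are all placeholders):

import { DurableObject } from "cloudflare:workers";

export interface Env {
  QUEUE_SHARD: DurableObjectNamespace<QueueShard>;
}

// One Durable Object = one queue shard. Alarms run one at a time per object,
// so each shard drains its own backlog serially.
export class QueueShard extends DurableObject {
  async enqueue(msg: unknown): Promise<void> {
    const key = `msg:${Date.now()}:${crypto.randomUUID()}`;
    await this.ctx.storage.put(key, msg);
    // Schedule (or reschedule) a drain shortly after the write.
    await this.ctx.storage.setAlarm(Date.now() + 1000);
  }

  async alarm(): Promise<void> {
    const batch = await this.ctx.storage.list({ prefix: "msg:", limit: 100 });
    for (const [key, msg] of batch) {
      // ...process msg here...
      await this.ctx.storage.delete(key);
    }
    // If anything is left in the buffer, come back for another pass.
    const remaining = await this.ctx.storage.list({ prefix: "msg:", limit: 1 });
    if (remaining.size > 0) {
      await this.ctx.storage.setAlarm(Date.now() + 1000);
    }
  }
}

export default {
  async fetch(req: Request, env: Env): Promise<Response> {
    // Fan out across shards; each shard is an independent unit of concurrency,
    // so more shards = more parallelism.
    const shard = Math.floor(Math.random() * 10);
    const id = env.QUEUE_SHARD.idFromName(`shard-${shard}`);
    await env.QUEUE_SHARD.get(id).enqueue(await req.json());
    return new Response("queued");
  },
} satisfies ExportedHandler<Env>;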
Pranshu Maheshwari
Pranshu Maheshwari•2mo ago
A quick feature update to share: you can now customize the retention period for a queue https://developers.cloudflare.com/changelog/2025-02-14-customize-queue-retention-period/
achesui
achesui•2mo ago
Hello, is there a way to bind a queue Event Notification to a local R2 bucket, for development?
scotto
scotto•2mo ago
Setting this in my queues consumer config:
[[queues.consumers]]
queue = "gopersonal-llm-ranking-queue"
max_batch_size = 100
max_batch_timeout = 180
max_concurrency = 1
and getting the error:
✘ [ERROR] A request to the Cloudflare API (/accounts/920b1a6e159cf77dab28969103a4765b/queues/c30e85e6726c455b880498715d8a0b4c/consumers/5bad5c90b2644f498177c45a164e96ac) failed.
Queue consumer (type worker) has invalid settings: maximum wait time must be between 0 and 60000 ms. [code: 100127]
If you think this is a bug, please open an issue at: https://github.com/cloudflare/workers-sdk/issues/new/choose
If we set max_batch_timeout to 60 it works. Does that mean the error message is in milliseconds and the actual limit is 60 seconds?
John Spurlock
John Spurlock•2mo ago
yep you got it, the max batch wait time is 60 seconds [1], the backend REST API takes milliseconds [2], so wrangler multiplies the config value by 1000 [3], making the error message confusing : )
[1] https://developers.cloudflare.com/queues/platform/limits/
[2] https://developers.cloudflare.com/api/resources/queues/subresources/consumers/methods/create/
[3] https://github.com/cloudflare/workers-sdk/blob/0322d085f634c1a0a12a59b4db293088d0cadb62/packages/wrangler/src/deploy/deploy.ts#L1236
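so for the config above, the largest value that will pass validation is 60 (seconds), e.g.:

[[queues.consumers]]
queue = "gopersonal-llm-ranking-queue"
max_batch_size = 100
# specified in seconds in wrangler.toml, capped at 60 (wrangler sends 60000 ms to the API)
max_batch_timeout = 60
max_concurrency = 1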
scotto
scotto•2mo ago
thanks, any plans to increase those limits?
John Spurlock
John Spurlock•2mo ago
someone orange would have to speak to that - personally I would love to see larger values allowed there, and also the batch size limits (100 msg / 256 KB) lifted a bit as well
HowlOftheSun
HowlOftheSun•2mo ago
What would cause a queue to send the same msg multiple times? Is there a timeout I might have set wrong? Errors cause retries... 🤦 which cause more errors 🤦 🤦
ajgeiss0702
ajgeiss0702•2mo ago
Did the queues sendBatch API change? I've been running a worker in production for around 7 months now, and it has been working fine. The past few days (first occurrence was February 21st, 23:40 UTC, and the previous successful invocation was 23:10 UTC), I've suddenly been getting an error: sendBatch() requires at least one message. I've added an additional check in my code to make sure the batch isn't empty to fix the issue, but I missed 2 days of data (that I cannot recover) before I noticed the issue. Is this a recent change in the queues API? I would appreciate it if you could ping me when you reply 🙂
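For reference, the check amounts to a guard like this (sketch only; sendEvents and the QUEUE binding name are placeholders):

interface Env {
  QUEUE: Queue;
}

async function sendEvents(env: Env, messages: MessageSendRequest[]): Promise<void> {
  // sendBatch() now rejects empty batches, so skip the call when there is nothing to send.
  if (messages.length === 0) return;
  // Stay under the 100-messages-per-sendBatch limit.
  for (let i = 0; i < messages.length; i += 100) {
    await env.QUEUE.sendBatch(messages.slice(i, i + 100));
  }
}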
adeola13.
adeola13.•2mo ago
Hello! How does one deploy multiple queue handlers (for different queues) in one single worker deployment project in wrangler? So in my main index.ts I would like to have multiple queue consumer handlers for different queues.
import producerHandler from './producer';
import consumerHandler from './consumer';
import { Env, TrackingEvent } from './types';

export default {
  fetch: producerHandler.fetch,
  queue: consumerHandler.queue,
} satisfies ExportedHandler<Env, TrackingEvent>;
James
James•2mo ago
You can only have one consumer function in your worker code, but you can get the name of the queue on MessageBatch: https://developers.cloudflare.com/queues/configuration/javascript-apis/#messagebatch
This way you can effectively handle multiple different queues and just change the code you execute depending on the queue name.
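Something like this (sketch; the queue names and handler imports are made up, the pattern is just switching on batch.queue):

import { handleTracking, handleRanking } from "./handlers"; // hypothetical per-queue handlers
import { Env } from "./types";

export default {
  async queue(batch: MessageBatch, env: Env): Promise<void> {
    // batch.queue is the name of the queue this batch was delivered from.
    switch (batch.queue) {
      case "tracking-events":
        await handleTracking(batch, env);
        break;
      case "llm-ranking":
        await handleRanking(batch, env);
        break;
      default:
        console.warn(`No handler for queue ${batch.queue}`);
    }
  },
} satisfies ExportedHandler<Env>;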
adeola13.
adeola13.•2mo ago
Ah I see, very helpful! So then I can just bind multiple queue consumers to this worker, and my handler would only receive messages sent to the queues whose consumers are bound to it, is that right?
James
James•2mo ago
Correct, yep!
LordSilver
LordSilver•2mo ago
I remembered Queues was made free, damn, I just had to test the TF provider...
cloudkite
cloudkite•2mo ago
Has anyone successfully gotten Browser Rendering to work with Queues? I can't even launch a browser without it locking up the worker: https://github.com/cloudflare/puppeteer/issues/93
GitHub
[Bug]: Using the sessions API causes hanging queue · Issue #93 · cloudflare/puppeteer
ajgeiss0702
ajgeiss0702•2mo ago
When I had Browser Rendering working with Queues, I was running into all sorts of issues. I would recommend writing a Durable Object to handle the browser rendering, and calling that from the queue.
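Roughly this shape (untested sketch; the Renderer class and the BROWSER / RENDERER binding names are placeholders):

import puppeteer from "@cloudflare/puppeteer";
import { DurableObject } from "cloudflare:workers";

interface Env {
  BROWSER: Fetcher; // Browser Rendering binding
  RENDERER: DurableObjectNamespace<Renderer>;
}

// The Durable Object owns the browser session, so the queue consumer never
// touches puppeteer directly.
export class Renderer extends DurableObject<Env> {
  async render(url: string): Promise<string> {
    const browser = await puppeteer.launch(this.env.BROWSER);
    try {
      const page = await browser.newPage();
      await page.goto(url);
      return await page.content();
    } finally {
      await browser.close();
    }
  }
}

export default {
  async queue(batch: MessageBatch<{ url: string }>, env: Env): Promise<void> {
    const stub = env.RENDERER.get(env.RENDERER.idFromName("renderer"));
    for (const msg of batch.messages) {
      await stub.render(msg.body.url);
      msg.ack();
    }
  },
} satisfies ExportedHandler<Env, { url: string }>;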
cloudkite
cloudkite•5w ago
Not ideal as then you have to pay for wall time
