RunPod
•Created by sahir on 2/25/2025 in #⚡|serverless
queue delay times
Cool, yes, I'm happy with that rate; yesterday was more like 90/10, that's why I mentioned it.
27 replies
right now I'm getting around 50/50 flashboots
yes it is on the container
seems to be fixed now somehow
Hey there, any updates on this? Is it just the model being too big? @PRB @flash-singh thanks!
But what's weird is that cold start sometimes is extremely quick, like under 5 seconds
I'm not using network volumes, the model is flux-dev (24gb)
I can give you an example test request if you like
It's mostly A100s for me
Same here, almost 2 minutes cold start every time
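A quick way to quantify the warm/cold split reported above ("around 50/50 flashboots", "almost 2 minutes cold start") is to collect the queue delay of each request and count how many fall under a warm-start threshold. This is a minimal sketch: the delay values would come from the `delayTime` field (in milliseconds) that RunPod's serverless job status responses include, and the 5-second threshold and sample numbers below are assumptions for illustration, not measurements.

```python
# Sketch: estimate how often FlashBoot (a warm start) kicked in,
# given a list of per-request queue delays in milliseconds.
# In practice the delays would be read from each job's status
# payload ("delayTime", per RunPod's docs); values here are made up.

def flashboot_rate(delays_ms, warm_threshold_ms=5_000):
    """Fraction of requests whose queue delay looks like a warm start."""
    if not delays_ms:
        return 0.0
    warm = sum(1 for d in delays_ms if d < warm_threshold_ms)
    return warm / len(delays_ms)

# Hypothetical sample: three fast starts, three ~2-minute cold starts,
# i.e. the "around 50/50" split described in the thread.
observed = [900, 1_200, 2_500, 118_000, 121_000, 119_500]
print(flashboot_rate(observed))  # 0.5
```

Tracking this ratio over a day of traffic makes it easy to see whether a "90/10" day versus a "50/50" day is real drift or just a small sample.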
RunPod
•Created by Kays on 1/16/2025 in #⚡|serverless
Workers wrongfully reported as "idle"
I do all my pushes to "latest"; I guess I should use version names then
8 replies
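The switch from `latest` to version names mentioned above can be sketched as follows. The image name is a hypothetical placeholder, and deriving the version from a git SHA (or a timestamp outside a repo) is one common convention, not the only one: the point is that each build gets an immutable tag, so workers can never confuse a stale cached `latest` with a new build.

```shell
# Tag each build with a unique version instead of reusing "latest".
IMAGE=myuser/my-worker                     # assumption: your registry path
VERSION=$(git rev-parse --short HEAD 2>/dev/null || date +%Y%m%d%H%M)
TAG="$IMAGE:$VERSION"

# Build and push under the immutable tag (commands echoed here,
# not executed, since this is a sketch):
echo docker build -t "$TAG" .
echo docker push "$TAG"
```

The endpoint is then pointed at the exact `$TAG`, so "ready" can only mean that specific image finished pulling.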
Yes, but the problem is that it was still building when it was reported as ready for predictions, so prediction time was billed while it was building
The logs don't work either; should I just wait?