RunPod
•Created by Ammar Ahmed on 1/2/2025 in #⚡|serverless
Failed to load docker package.
I don't think v2 is the main issue here because other images and workers are working fine
13 replies
First it gave me "denied" with the new registry key; then I made it public, but it's stuck here.
5 minutes
2025-01-02T09:41:15Z image pull: ghcr.io/ammarft-ai/img-remover:1.6: pending
Ah yes, working now. Thanks.
RunPod
•Created by Ammar Ahmed on 1/1/2025 in #⚡|serverless
Cannot send request to one endpoint
It's a simple runsync serverless template. I've been using it for a while; this is the very first time this type of issue occurred.
8 replies
It's back to normal; I guess it was some bug. Thanks though.
it was working fine 5 minutes ago 🙂
RunPod
•Created by Ammar Ahmed on 10/4/2024 in #⚡|serverless
How can I make a single worker handle multiple requests concurrently before starting the next worker
This is the pool, which is loaded into memory. Every time a request hits the server, it gets a model from this pool.
38 replies
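The model pool described in this thread can be sketched roughly as follows. This is a hedged reconstruction, not the author's actual code: `MAX_CONCURRENCY`, `load_model`, and `process` are placeholder names I introduced. The idea is to pre-load one model instance per concurrent slot so requests borrow a ready model instead of paying the load cost each time.

```python
import queue

# Hedged sketch of the model pool described above (names are mine, not
# the thread author's): pre-load MAX_CONCURRENCY model instances and
# have each request borrow one instead of loading its own copy.
MAX_CONCURRENCY = 4

def load_model():
    # Placeholder for the real, expensive model load (e.g. torch.load).
    return {"weights": "loaded"}

def process(model, job_input):
    # Placeholder for the actual inference call.
    return {"output": job_input}

# Fill the pool once at worker start-up.
model_pool = queue.Queue(maxsize=MAX_CONCURRENCY)
for _ in range(MAX_CONCURRENCY):
    model_pool.put(load_model())

def run_inference(job_input):
    model = model_pool.get()        # blocks if all models are in use
    try:
        return process(model, job_input)
    finally:
        model_pool.put(model)       # always return the model to the pool
```

`queue.Queue` is thread-safe, so concurrent requests can check models out and back in without extra locking; a `get()` simply blocks when all models are busy.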
Fixed it: I created a model pool that keeps a number of models loaded according to the max concurrency. It reduced the time to below 10 seconds 😀
Also, processing a single request is fast, but when multiple requests are being processed concurrently, processing is very slow.
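One common cause of this symptom, sketched here as an assumption rather than a diagnosis of the author's code: if the handler is `async` but the model call itself is blocking, concurrent jobs serialize on the event loop. Offloading the blocking call to a thread lets requests actually overlap (`blocking_inference` is a placeholder name):

```python
import asyncio

# Hedged sketch: an async handler whose blocking model call is moved
# off the event loop so other jobs can make progress concurrently.

def blocking_inference(job_input):
    # Placeholder for a CPU/GPU-bound model call.
    return {"result": job_input}

async def handler(job):
    # asyncio.to_thread (Python 3.9+) runs the blocking call in a
    # worker thread instead of stalling the event loop.
    return await asyncio.to_thread(blocking_inference, job["input"])
```

Note that true parallelism still depends on the workload: threads help when the model call releases the GIL (most GPU inference does), but pure-Python CPU work will still contend.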
Yes, I have FlashBoot enabled.
It seems like concurrent requests are taking too long to get processed together.
It's taking time to load the model into memory
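The standard way to avoid paying the model-load cost on every request is to load at module import rather than inside the handler, so a warm worker reuses the already-loaded model across jobs. A minimal sketch, with `load_model` as a stand-in for the real load:

```python
import time

# Sketch: load the model once at module import, not inside the handler,
# so warm (FlashBoot) workers reuse it across requests.

def load_model():
    # Placeholder for the expensive load (weights from disk, GPU transfer).
    time.sleep(0.01)
    return {"weights": "loaded"}

MODEL = load_model()  # paid once per worker process, not per request

def handler(job):
    # Reuse the module-level model; no per-request load time.
    return {"status": "ok", "model_ready": MODEL is not None}
```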
okay
ohh okay. Thanks
yes
Yes, I found it in the docs and was figuring out how to implement it in Python. Will it go with the input in handler.py?
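The feature being discussed is RunPod's `concurrency_modifier` option, which is passed alongside the handler in handler.py rather than with the job input. A hedged sketch of how it wires up (check the current RunPod SDK docs for the exact signature; the policy below is just one example):

```python
# Sketch of a concurrent handler setup in handler.py.

async def handler(job):
    # Process job["input"] asynchronously; placeholder body.
    return {"echo": job["input"]}

def concurrency_modifier(current_concurrency):
    # Called by the SDK to decide how many jobs this worker takes at
    # once. A fixed cap is the simplest policy; it could also adapt
    # to GPU memory or queue depth.
    max_concurrency = 4
    return min(current_concurrency + 1, max_concurrency)

# In handler.py this would be wired up as (requires the runpod package):
# import runpod
# runpod.serverless.start({
#     "handler": handler,
#     "concurrency_modifier": concurrency_modifier,
# })
```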
Essentially, I want to modify how the queue behaves so that premium requests jump ahead of regular ones in the processing order. Can I modify the queue behavior or set some priority rules for incoming requests in RunPod?
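As far as I know, the endpoint queue itself is FIFO and does not expose priority settings, so prioritization has to happen outside it. Two common workarounds: run a separate endpoint for premium traffic, or hold jobs in a client-side dispatcher that always submits premium jobs first. A sketch of the latter (all names are mine, purely illustrative):

```python
import heapq
import itertools

# Hedged workaround sketch: a client-side dispatcher that keeps pending
# jobs in a heap so premium jobs are always dequeued before regular
# ones. The popped payload would then be POSTed to the endpoint's
# /run route as usual.

_counter = itertools.count()  # tie-breaker preserves FIFO order per tier

class PriorityDispatcher:
    def __init__(self):
        self._heap = []

    def submit(self, payload, premium=False):
        # Lower number = higher priority; premium jobs sort first.
        priority = 0 if premium else 1
        heapq.heappush(self._heap, (priority, next(_counter), payload))

    def next_job(self):
        # Pop the highest-priority pending job (premium first, then FIFO).
        if not self._heap:
            return None
        return heapq.heappop(self._heap)[2]
```

A dedicated premium endpoint is usually simpler to operate, since it also isolates premium capacity; the dispatcher approach keeps a single endpoint but adds a component you have to run yourself.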