SyedAliii
RunPod
Created by SyedAliii on 9/20/2024 in #⚡|serverless
Issue with Multiple instances of ComfyUI running simultaneously on Serverless
@gnarley_farley. I was facing the same issue: if you use a network volume, put everything in it, and then load from the network volume inside the Docker image, it is extremely slow. If you put everything directly into the Docker image it is very fast, but the image becomes very large. I don't see a solution to that right now, though there are techniques to reduce the Docker image size.
Why not create a pod and run your ComfyUI experiments there? Download the models with wget into a network volume so you can reuse them later.
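(For reference, roughly what that looks like on a pod; a minimal sketch assuming the network volume is mounted at /workspace, ComfyUI installed at /ComfyUI, and a placeholder model URL:)

```bash
# On the pod: download checkpoints onto the network volume so they persist
# across pods and can be reused later. Assumes the volume is mounted at
# /workspace; the URL below is a placeholder.
mkdir -p /workspace/models/checkpoints
wget -c -O /workspace/models/checkpoints/model.safetensors \
  "https://example.com/path/to/model.safetensors"

# Then point ComfyUI's checkpoint folder at the volume (path is an assumption).
rm -rf /ComfyUI/models/checkpoints
ln -s /workspace/models/checkpoints /ComfyUI/models/checkpoints
```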
You can also check other flags like --highvram, etc. Run python main.py --help to list them.
When you run python main.py in the ComfyUI directory, launch it as python main.py --gpu-only.
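(As a quick reference, the commands mentioned above; the flag names match current ComfyUI, but verify with --help on your install since they can change:)

```bash
# From the ComfyUI directory: keep models and intermediate data on the GPU.
python main.py --gpu-only

# List all available flags (--highvram, --lowvram, --cpu, ...).
python main.py --help
```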
Yes, I have seen that too. Using better GPUs does not make much difference. But I have noticed an improvement when I run ComfyUI with --gpu-only: for example, a job that takes 18 seconds otherwise takes 13-14 seconds with the flag.
@gnarley_farley. Hello, thank you so much. Your point totally makes sense. Two queries, though: 1- Even though async performs better, even if I use sync, they say the limit is 2000 requests per 10 seconds, and I believe I haven't even crossed 100. 2- With async, what time interval do you leave between requests to check the job status? It depends on the task being performed, but what would you suggest?
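(For context, the async pattern in question looks roughly like this; a sketch against the standard RunPod serverless /run and /status routes, requiring curl and jq, where ENDPOINT_ID, RUNPOD_API_KEY, the input payload, and the 2-second poll interval are all placeholders to adjust:)

```bash
# Submit a job asynchronously and capture its id.
JOB_ID=$(curl -s -X POST "https://api.runpod.ai/v2/${ENDPOINT_ID}/run" \
  -H "Authorization: Bearer ${RUNPOD_API_KEY}" \
  -H "Content-Type: application/json" \
  -d '{"input": {"prompt": "example"}}' | jq -r '.id')

# Poll the status endpoint until the job leaves the queued/running states.
# The 2-second interval is only an example; tune it to the typical job length.
while true; do
  STATUS=$(curl -s "https://api.runpod.ai/v2/${ENDPOINT_ID}/status/${JOB_ID}" \
    -H "Authorization: Bearer ${RUNPOD_API_KEY}" | jq -r '.status')
  echo "status: ${STATUS}"
  [ "${STATUS}" != "IN_QUEUE" ] && [ "${STATUS}" != "IN_PROGRESS" ] && break
  sleep 2
done
```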
@Encyrption I tried blocking the EU and US-OR regions but the issue persists. However, I have noticed that the error rate is lower today: if I send 10 requests, sometimes all 10 complete and sometimes 2-3 of them fail. So the issue appears to be internal to RunPod. Thank you for taking the time to look at this.
Yes, each worker is a separate GPU.
Can you please explain this in more detail?
But on a single worker everything works fine; the issue only appears when multiple workers receive requests. I don't think I can simulate this behavior on my local machine.
Do you get these updates about which region is causing issues from their other channel? If so, please let me know.
I have, and I am able to run the ComfyUI GUI.
By the way, you might say there is some internal bug specific to my endpoint, but I tested on a separate test endpoint and the issue remains the same.
No
No specific region; I have selected the global region, because no network volume is attached, so there is no region restriction.
I have custom nodes and models, but I believe these are unrelated to the issue.
Yes, I can see everything in the logs. The server retries, times out, and then sends back a failure response. There are no unusual errors in the logs; I can see it is trying to reach the server API.
Yes
I am running some other endpoints; that's why I need those.