smoke
RunPod
Created by smoke on 6/28/2024 in #⚡|serverless
Kohya-ss on serverless
Hi there, I was wondering if anyone has got Kohya-ss set up successfully via a serverless endpoint on RunPod.
5 replies
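For context on what a serverless worker for this would look like: RunPod serverless workers expose a handler function that receives a job dict with an `input` key. A minimal sketch is below; the actual Kohya invocation (paths, script arguments) is hypothetical and would normally be a subprocess call inside the handler.

```python
def handler(job):
    """RunPod-style serverless handler: receives a job dict with an 'input' key."""
    job_input = job["input"]
    # In a real worker you would shell out to Kohya's training script here
    # (e.g. via subprocess) and return its results; this sketch just echoes
    # the input back so the contract is visible.
    return {"received": job_input}

# In the actual worker image you register the handler with the SDK:
# import runpod
# runpod.serverless.start({"handler": handler})
```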
RunPod
Created by smoke on 3/1/2024 in #⚡|serverless
Docker image cache
Hi there, I am quite new to RunPod so I could be wrong, but my Docker image is quite large, and before my serverless endpoint actually runs, it sits in the 'Initializing' state for quite a long time. Is there a way to cache this image across endpoints, or does this already happen? This is the first request I am making, so it might already be cached for this endpoint, but I'm not sure. I'd appreciate any help! I am not using a network volume/storage, so maybe that's also why.
123 replies
RunPod
Created by smoke on 2/19/2024 in #⚡|serverless
Estimated time comparison - Comfy UI
Hi everyone, I've been looking at the various GPU options for serverless and I am trying to work out roughly how much faster or slower each GPU would be, and whether this can even be estimated. There is obviously no exact formula, but I am wondering if anyone has had similar experiences. In my case, it takes around 355 seconds to run my workflow on my local machine (RTX 3080 Ti). Is there a rough estimate of how long this would take on the serverless GPUs? I assume it will be faster. I will compare all the GPUs myself soon anyway, but I was just wondering.
7 replies
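The estimation the question describes is just scaling the measured local runtime by a relative-speed factor. A back-of-envelope sketch, where the speedup factors are hypothetical placeholders (not benchmarks) to be replaced with measured numbers for the actual workflow:

```python
# Scale a measured local runtime by per-GPU relative-speed factors.
local_seconds = 355  # measured on the local 3080 Ti

# HYPOTHETICAL factors for illustration only; real speedups depend on the
# workflow (model size, batch size, VRAM pressure) and must be measured.
hypothetical_speedup = {
    "RTX A5000": 1.2,
    "A100": 2.5,
}

estimates = {
    gpu: local_seconds / factor
    for gpu, factor in hypothetical_speedup.items()
}
for gpu, seconds in estimates.items():
    print(f"{gpu}: ~{seconds:.0f} s")
```

This is only a first-order estimate; cold starts and I/O are not proportional to GPU speed, so real end-to-end times will deviate.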