RunPod•2w ago
houmie

How to download models for Stable Diffusion XL on serverless?

1) I created a new network storage of 26 GB for various models I'm interested in trying.
2) I created a Stable Diffusion XL endpoint on serverless, but couldn't attach the network storage.
3) After the deployment succeeded, I clicked on edit endpoint and attached that network storage to it. So far so good, I believe. But how exactly do I download various SDXL models into my network storage so that I can use them via Postman? Many thanks
17 Replies
Madiator2011•2w ago
The quick-deploy SDXL endpoint is based on diffusers and has the model baked in.
houmie•2w ago
So if I wanted to use DreamShaper XL, could I do that with this endpoint? Or do I need to clone https://github.com/runpod-workers/worker-sdxl, add DreamShaper XL to it, push it to Docker Hub and then pull it as a serverless template?
Madiator2011•2w ago
Should work if the model is in diffusers format.
Madiator2011•2w ago
GitHub
worker-sdxl/src/rp_handler.py at 10b177bff0ec746b48cf9a4e4c682797ad...
RunPod worker for Stable Diffusion XL. Contribute to runpod-workers/worker-sdxl development by creating an account on GitHub.
GitHub
worker-sdxl/builder/cache_models.py at 10b177bff0ec746b48cf9a4e4c68...
RunPod worker for Stable Diffusion XL. Contribute to runpod-workers/worker-sdxl development by creating an account on GitHub.
houmie•2w ago
Ah, so it's currently using the base model stable-diffusion-xl-base-1.0? So do I have to clone https://github.com/runpod-workers/worker-sdxl and change the two files manually from stabilityai/stable-diffusion-xl-base-1.0 to stablediffusionapi/dreamshaper-xl? Is there no environment variable I could inject instead?
Madiator2011•2w ago
Nope, no env variable.
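For illustration, a minimal sketch of what that model swap might look like, assuming DreamShaper XL is available in diffusers format under stablediffusionapi/dreamshaper-xl; the actual contents of builder/cache_models.py and src/rp_handler.py in the worker repo may differ:

```python
import torch
from diffusers import StableDiffusionXLPipeline

# Was: "stabilityai/stable-diffusion-xl-base-1.0"; swapped for a diffusers-format
# DreamShaper XL repo (assumed repo id, check the actual name on Hugging Face).
MODEL_ID = "stablediffusionapi/dreamshaper-xl"

def load_pipeline() -> StableDiffusionXLPipeline:
    # Downloading here at Docker build time is what "bakes" the weights into the image.
    return StableDiffusionXLPipeline.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.float16,
    )

if __name__ == "__main__":
    load_pipeline()
```

The same model ID would need to be changed in both files so the handler loads the weights that were cached at build time.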
Madiator2011•2w ago
If you would like something more flexible, you can use https://github.com/ashleykleynhans/runpod-worker-a1111
GitHub
GitHub - ashleykleynhans/runpod-worker-a1111: RunPod Serverless Wor...
RunPod Serverless Worker for the Automatic1111 Stable Diffusion API - ashleykleynhans/runpod-worker-a1111
houmie•2w ago
Ahh nice. But this repo is based on classic SD, not SDXL, correct? In that case, for SDXL I will try to clone it and change the files myself. Then I need to add the model to my Docker image and push it to Docker Hub, correct? Then in RunPod I would create a template based on the Docker Hub image and build a new serverless endpoint? Does my plan make sense so far? 🙂 And will the model that the Docker image downloads be added to the attached network storage? I have a feeling that because there is no environment variable passed in, the model ends up in local storage instead of network storage. I hope I'm wrong, because that would take a very long time every time I post to the endpoint.
Madiator2011•2w ago
a1111 supports SDXL, you just need to get the safetensors file.
For runpod-workers/worker-sdxl you would need to edit the lines I pointed to, rebuild the image and push it to Docker Hub.
runpod-worker-a1111 supports network storage.
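For reference, a rough sketch of calling a deployed serverless endpoint from Python rather than Postman; the endpoint ID is a placeholder, and the exact "input" schema depends on the worker (this assumes the simple prompt-style input of worker-sdxl):

```python
import os
import requests

ENDPOINT_ID = "your-endpoint-id"          # placeholder: copy from the serverless endpoint page
API_KEY = os.environ["RUNPOD_API_KEY"]    # your RunPod API key

# /runsync blocks until the job finishes; /run plus /status is the async alternative.
url = f"https://api.runpod.ai/v2/{ENDPOINT_ID}/runsync"
payload = {"input": {"prompt": "an astronaut riding a horse, photorealistic"}}

resp = requests.post(
    url,
    json=payload,
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=300,
)
resp.raise_for_status()
print(resp.json())  # worker output, e.g. a base64-encoded image or an image URL
```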
houmie•2w ago
Thanks, yes, I'm making progress with runpod-worker-a1111. Is there a way to check from the dashboard how much space is left on the network storage?
digigoblin•2w ago
Not unless you attach it to a pod
houmie•2w ago
OK, after I attach it to a pod, how would I check that? df -h? I don't think that works for network storage because it shows everything.
digigoblin•2w ago
No, df -h will show you the space on the entire network storage, not just what is assigned to you. Check usage in your pod in the RunPod web console. It has a percentage indicator, which is also a bit crappy; it would be nice if it showed the actual space used when you hover over it or something.
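As an aside, a small sketch for getting an actual number from inside the pod by summing your own files on the volume; /workspace is assumed to be the network volume's mount point:

```python
import os

# df -h reports the whole shared storage backend, so to see what *your* files
# use, walk the network volume's mount point instead (assumed to be /workspace).
def dir_size_gib(path: str) -> float:
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            file_path = os.path.join(root, name)
            if not os.path.islink(file_path):
                total += os.path.getsize(file_path)
    return total / 1024**3

if __name__ == "__main__":
    print(f"Files on the volume: {dir_size_gib('/workspace'):.2f} GiB")
```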
houmie•2w ago
Ah yeah. It says 95%. Yeah, it would be good if it gave us an actual number instead of guesswork.
digigoblin•2w ago
Well, you can do the math, no need for guesswork, but it's still an unnecessary waste of time for something that could have better UX.
houmie•2w ago
Sorry, I worded it badly. Of course I could work out 5% of 50 GB. What I meant is that having the actual number would be more accurate and convenient.
digigoblin•2w ago
Yeah, definitely, I agree it could do with improvement.