How to deploy flux.schnell to serverless?
Title says it all
Would be nice to have a guide on how to set up Flux on a serverless endpoint
also, I'm planning to train some LoRAs and store them for future use
should I save them in a network volume?
it would be a plus to have some guidance on how I can accomplish this as well
6 Replies
serverless is basically a runner for Python code
you can see an example here:
https://github.com/runpod-workers/worker-stable_diffusion_v2
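at its core a worker is just a small Python script that registers a handler function with the RunPod SDK. A minimal sketch of what that looks like (the handler body and input fields here are placeholders, not the actual worker code):

```python
# minimal RunPod serverless worker sketch (illustrative only)
import runpod

def handler(job):
    # job["input"] is whatever JSON you send to the endpoint
    prompt = job["input"].get("prompt", "")
    # ... run your model here and return a JSON-serializable result
    return {"echo": prompt}

runpod.serverless.start({"handler": handler})
```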
In this file:
https://github.com/runpod-workers/worker-stable_diffusion_v2/blob/main/builder/model_fetcher.py
change the code a bit using the example from here:
https://huggingface.co/black-forest-labs/FLUX.1-schnell
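roughly, that means swapping the Stable Diffusion download in model_fetcher.py for the FLUX.1-schnell pipeline shown on the model card, so the weights get baked into the image at build time. A hedged sketch, assuming a recent diffusers version that ships FluxPipeline and that you keep the worker's build-time download approach:

```python
# builder/model_fetcher.py (sketch): download FLUX.1-schnell at Docker build time
# so the serverless worker doesn't have to pull the weights on every cold start.
import torch
from diffusers import FluxPipeline

def fetch_pretrained_model():
    # Caches the model under the default Hugging Face cache inside the image.
    return FluxPipeline.from_pretrained(
        "black-forest-labs/FLUX.1-schnell",
        torch_dtype=torch.bfloat16,
    )

if __name__ == "__main__":
    fetch_pretrained_model()
```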
and I think that's all
or also modify rp_handler.py (see the sketch below)
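for rp_handler.py, the idea is to load the pipeline once at container start and run inference inside the handler. A rough sketch under my own assumptions (the input fields, the /runpod-volume LoRA path, and the base64 return format are not part of the original worker):

```python
# rp_handler.py (sketch): serve FLUX.1-schnell on RunPod serverless
import base64
import io

import runpod
import torch
from diffusers import FluxPipeline

# Load once, outside the handler, so warm requests skip model loading.
pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-schnell",
    torch_dtype=torch.bfloat16,
).to("cuda")

# Optional: load a LoRA you trained and stored on a network volume
# (network volumes mount at /runpod-volume on serverless workers).
# pipe.load_lora_weights("/runpod-volume/loras/my_lora.safetensors")  # hypothetical path

def handler(job):
    inp = job["input"]
    image = pipe(
        inp.get("prompt", ""),
        num_inference_steps=inp.get("num_inference_steps", 4),  # schnell works with ~4 steps
        guidance_scale=0.0,  # schnell is distilled, so guidance is disabled
        max_sequence_length=256,
    ).images[0]
    buf = io.BytesIO()
    image.save(buf, format="PNG")
    return {"image_base64": base64.b64encode(buf.getvalue()).decode()}

runpod.serverless.start({"handler": handler})
```

on the LoRA question: if you plan to keep training new LoRAs, storing them on a network volume and loading them at runtime (like the commented line above) is a reasonable pattern, since baking them into the image would force a rebuild for every new LoRA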
thank you very much
I will try this
Hi, it would be nice if you could share your experience @Roberto. I am looking for a solution as well.