RunPod · 2mo ago
cyberzen

Flux.1 Dev Inference

I would like to run inference (txt2img) with Flux.1 Dev in fp8 and fp16. Is there a 'Quick Deploy' template planned?
1 Reply
Encyrption · 2mo ago
I suggest you look at: https://github.com/blib-la/runpod-worker-comfy. It has prebuilt Docker images available for flux.1.dev, flux.1.schnell, sd3, and sdxl. You would just need to create a serverless template using one of those images.
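Once an endpoint is deployed from that image, calling it is just a POST to the RunPod serverless API. Here is a rough sketch, assuming the worker's usual {"input": {"workflow": ...}} convention; the endpoint ID, the RUNPOD_API_KEY variable, and workflow_api.json (a txt2img workflow exported from ComfyUI in API format, e.g. one built around flux.1.dev) are all placeholders for your own setup:

```python
# Minimal sketch: send a ComfyUI API-format workflow to a deployed
# runpod-worker-comfy serverless endpoint via /runsync.
import json
import os
import requests

ENDPOINT_ID = "your-endpoint-id"            # placeholder: your serverless endpoint ID
API_KEY = os.environ["RUNPOD_API_KEY"]      # RunPod API key from the console

# Workflow exported from ComfyUI with "Save (API Format)".
with open("workflow_api.json") as f:
    workflow = json.load(f)

resp = requests.post(
    f"https://api.runpod.ai/v2/{ENDPOINT_ID}/runsync",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={"input": {"workflow": workflow}},
    timeout=600,
)
resp.raise_for_status()
print(resp.json())  # the worker returns the generated image (base64 or an S3 URL, depending on config)
```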