I am trying to follow the Massed Compute FLUX LoRA tutorial. I downloaded the FLUX models and added them to the StableSwarmUI folders correctly, but now I get an error that says "ERROR: Self-Start ComfyUI-0 on port 7823 failed."
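A "self-start failed" error from SwarmUI often just means the backend process died on launch or something else already holds the port. A minimal sketch for the second case, assuming nothing beyond the standard library (`port_in_use` is a hypothetical helper, not part of SwarmUI):

```python
# Quick check: is anything already listening on the port SwarmUI tried to use
# for its ComfyUI backend (7823 in the error above)?
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    # connect_ex returns 0 if a connection succeeds, i.e. something is listening
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(1.0)
        return s.connect_ex((host, port)) == 0

if __name__ == "__main__":
    print(port_in_use(7823))
```

If the port is free, the real cause is usually in the backend's own log (SwarmUI shows the ComfyUI console output in its server logs tab).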
what does it do? should i enable it?
in the supir upscaler
great
if not using shared vram
it is its speed 😄
start as tiled
and fp8
not much worth playing with
we already have 2 configs you can test them
@Dr. Furkan Gözükara using SUPIR and getting these errors, any suggestions? decoderF
is not supported because:
max(query.shape[-1] != value.shape[-1]) > 128
xFormers wasn't build with CUDA support
attn_bias type is <class 'NoneType'>
operator wasn't built - see
python -m xformers.info for more info
[email protected] is not supported because:
max(query.shape[-1] != value.shape[-1]) > 256
xFormers wasn't build with CUDA support
operator wasn't built - see
python -m xformers.info for more info
cutlassF is not supported because:
xFormers wasn't build with CUDA support
operator wasn't built - see
python -m xformers.info for more info
smallkF is not supported because:
max(query.shape[-1] != value.shape[-1]) > 32
xFormers wasn't build with CUDA support
dtype=torch.bfloat16 (supported: {torch.float32})
operator wasn't built - see
python -m xformers.info for more info
unsupported embed per head: 512
did you do a fresh install of the latest version?
this is an xformers incompatibility
my installer shouldn't have this issue
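The "operator wasn't built" lines above usually mean the installed xformers wheel doesn't match the installed torch/CUDA build. A minimal diagnostic sketch, assuming only that torch and xformers may or may not be present (`diagnose` is a hypothetical helper; the authoritative check is still `python -m xformers.info`):

```python
# Report whether torch sees a CUDA device and whether xformers imports cleanly.
# If torch reports CUDA but xformers was installed from a CPU-only wheel, its
# CUDA operators will be missing, producing exactly the errors pasted above.
import importlib.util

def diagnose() -> str:
    report = []
    if importlib.util.find_spec("torch") is None:
        report.append("torch: not installed")
    else:
        import torch
        report.append(f"torch {torch.__version__}, CUDA available: {torch.cuda.is_available()}")
    if importlib.util.find_spec("xformers") is None:
        report.append("xformers: not installed (attention will fall back or fail)")
    else:
        import xformers
        report.append(f"xformers {xformers.__version__}")
    return "\n".join(report)

if __name__ == "__main__":
    print(diagnose())
```

The usual fix is reinstalling xformers with a wheel built for your exact torch and CUDA versions, or removing it and letting ComfyUI use PyTorch's built-in attention.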
i'm using arch on my local system
xformers only for nvidia gpus afaik
how do I add realistic scars? do I need to create a LoRA for that?
you can search for better prompting too
but a LoRA may work better