Hello, just joined this Discord server. Not sure if this is old news, but during LoRA training my captions were not found due to a wrong file extension. I was following this tutorial: https://www.youtube.com/watch?v=AY6DMBCIZ3A
[YouTube embed — SECourses: First Ever SDXL Training With Kohya LoRA - Stable Diffusion XL Training]
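If anyone else hits the captions-not-found problem: the trainer only pairs a caption file with its image when the file extension matches the caption extension it is configured to look for, so a batch rename is usually enough. A minimal sketch, assuming the captions were saved as .caption and the trainer expects .txt (the folder path is a placeholder, swap the extensions for your setup):

    from pathlib import Path

    dataset_dir = Path("path/to/dataset")  # hypothetical path, point this at your image folder

    # Rename every .caption file to .txt so each image gets paired with its caption
    for cap in dataset_dir.glob("*.caption"):
        cap.rename(cap.with_suffix(".txt"))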
Furkan Gözükara SECourses
Hello, welcome!
doggo 🐕 — 11mo ago
Guys, a question: which checkpoint should you use for hyperrealism?
SOPRO — 11mo ago
Is my LoRA realistic?
[image attachment]
Furkan Gözükara SECourses
Looks really good. I use SDXL 1.0, it really works well.
doggo 🐕 — 11mo ago
thank YOU!
mikemenders — 11mo ago
Yes, I use Furkan's reg images, that's the best collection!
NextGen5.0 — 11mo ago
Any idea why my character doesn't have the same style as this guy's? https://www.youtube.com/watch?v=2mUEeeoA6G8 I'm using the same settings and checkpoint.
[YouTube embed — Sebastian Torres: Ai Animation in Stable Diffusion]
[image attachment]
NextGen5.0 — 11mo ago
[image attachment]
Kallamamran — 11mo ago
@Dr. Furkan Gözükara Why 1536x1536 pixel images in the datasets? Training above 1024 is not a good idea, right?!
Alexm — 11mo ago
It doesn't matter because of bucketing, they will get resized automatically to fit.

Just wanted to ask if some people on here use 4090s for DreamBooth training in Kohya and what their experience is. I noticed that on RunPod a 4090 is a lot slower than a 3090 Ti, and I'm trying to figure out if this is an issue on RunPod or with the 4090 overall. I've seen some issues posted on the Kohya GitHub about 4090s being slow, but I'm not sure if this is still the case.
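A minimal sketch of what that bucketing step does, assuming kohya-style aspect-ratio buckets (illustrative only, not the trainer's actual code): each image is assigned to the bucket whose aspect ratio is closest and then downscaled to that bucket's resolution, so a 1536x1536 source just ends up at the training size.

    # Illustrative aspect-ratio bucketing sketch (assumed behavior, not kohya's code)
    buckets = [(1024, 1024), (1152, 896), (896, 1152), (1216, 832), (832, 1216)]

    def pick_bucket(img_w, img_h):
        # Choose the bucket whose aspect ratio is closest to the image's
        ar = img_w / img_h
        return min(buckets, key=lambda b: abs(b[0] / b[1] - ar))

    # A 1536x1536 source image lands in the 1024x1024 bucket and is simply downscaled
    print(pick_bucket(1536, 1536))  # -> (1024, 1024)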
banfi6885 — 11mo ago
Hello Furkan, I have watched this video a couple of times and followed the instructions.

I have installed torch using the following command:
pip3 install torch==2.1.0 torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118

and xformers using the following command:
pip install xformers==0.0.23

I also edited webui-user.bat to use xformers:
set COMMANDLINE_ARGS=--xformers

With pip install xformers==0.0.23, I get the following:

Installing collected packages: torch, xformers
Attempting uninstall: torch
Found existing installation: torch 2.1.0+cu118
Uninstalling torch-2.1.0+cu118:
Successfully uninstalled torch-2.1.0+cu118
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
torchvision 0.16.0+cu118 requires torch==2.1.0+cu118, but you have torch 2.1.1 which is incompatible.
torchaudio 2.1.0+cu118 requires torch==2.1.0+cu118, but you have torch 2.1.1 which is incompatible.
Successfully installed torch-2.1.1 xformers-0.0.23

And when I execute webui-user.bat, I get the following:

RuntimeError: Torch is not able to use GPU; add --skip-torch-cuda-test to COMMANDLINE_ARGS variable to disable this check

Is there something I am overlooking? Thank you
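Not an official fix, but that error pattern usually means the plain-PyPI xformers 0.0.23 install pulled in a CPU-only torch 2.1.1 and replaced the CUDA build, which is why the webui then can't see the GPU. One possible way to get matching CUDA builds back, assuming CUDA 11.8 and that these exact versions are available on the PyTorch cu118 index, is to reinstall everything from that index in one go:

pip install --force-reinstall torch==2.1.1 torchvision==0.16.1 torchaudio==2.1.1 xformers==0.0.23 --index-url https://download.pytorch.org/whl/cu118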
Alexm — 11mo ago
Is it possible to train DreamBooth on Kohya with 24 GB of VRAM? No matter what I try, it keeps running OOM with 8bit Adam.
allen — 11mo ago
@Dr. Furkan Gözükara Hey there - it seems there is an issue with downloading the RunPod installer for your caption scripts.
allen — 11mo ago
[image attachment]
allen — 11mo ago
https://www.patreon.com/posts/sota-very-best-90744385
Are you using the RunPod Kohya template by any chance? I had never noticed until watching the good dr's video that if you don't kill Auto1111, 25-30% of the VRAM gets sucked up before you even start.
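A quick way to confirm and reclaim that VRAM on the pod, assuming the Auto1111 webui is the process holding it (the PID below is whatever nvidia-smi reports, not a fixed value):

nvidia-smi       (lists the processes currently holding GPU memory)
kill <PID>       (stop the webui process shown there before launching Kohya)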