DeluxeWalrus
RunPod
Created by DeluxeWalrus on 1/22/2024 in #⛅|pods
Running 2x H100 80gb. Does this mean my cap is now 160gb of vram?
Actually, that was probably a stupid question, since I don't even know what that is, haha
10 replies
Does that have inpainting?
Okay, thank you for explaining. Sorry, I'm pretty new to Stable Diffusion. Is my only option really a lower-res output?
My pod has two GPUs. Is there a reason the program isn't utilizing both?
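Most Stable Diffusion web UIs load the whole model onto a single GPU by default, which is why the second card in the pod sits idle. A common workaround, sketched here under the assumption that you launch one UI process per card, is to mask devices with `CUDA_VISIBLE_DEVICES` so each process sees only its own GPU:

```python
import os

# Hedged sketch: pin this process to the second physical GPU. This must be set
# before any CUDA-using library (e.g. PyTorch) initializes in the process.
# Inside the process, the masked card then appears as device index 0.
os.environ["CUDA_VISIBLE_DEVICES"] = "1"
print(os.environ["CUDA_VISIBLE_DEVICES"])  # prints "1"
```

Launching a second instance with `CUDA_VISIBLE_DEVICES="0"` on a different port would then use the first card, so both GPUs serve requests in parallel even though neither process spans both.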
OutOfMemoryError: CUDA out of memory. Tried to allocate 151.88 GiB (GPU 0; 79.11 GiB total capacity; 55.44 GiB already allocated; 22.45 GiB free; 55.54 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
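The traceback answers the thread's original question: a single CUDA allocation must fit on one device, so two 80 GB cards do not behave like one 160 GB pool. A small arithmetic check using the figures from the traceback above (the allocator setting at the end is the traceback's own suggestion; 512 MiB is an illustrative value, not a tuned recommendation):

```python
import os

# Figures copied from the OOM traceback above.
per_gpu_gib = 79.11      # usable capacity of GPU 0
requested_gib = 151.88   # size of the failed allocation
total_gib = 2 * per_gpu_gib  # ~158 GiB across both H100s

# A single tensor must live on ONE device, so pooled VRAM does not help here:
print(requested_gib <= per_gpu_gib)  # False: the allocation cannot fit
print(requested_gib <= total_gib)    # True, but irrelevant for one allocation

# The traceback also suggests tuning the caching allocator. This environment
# variable must be set before PyTorch initializes CUDA to take effect.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:512"
```

Note that `max_split_size_mb` only mitigates fragmentation (reserved memory far exceeding allocated memory); it cannot make a 151.88 GiB request fit on a 79.11 GiB card, so reducing the output resolution or tiling the generation is still required.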