ComfyUI: GPU and VRAM at 0%

Hi. I'm running an RTX 4090 pod with the ComfyUI template by ai-dock to run the flux[dev] model. However, the pod shows 0% GPU usage and 0% VRAM usage. In contrast, about 30 GB of RAM is taken up. The model also runs slower than I expected (although I have no point of comparison). Is this likely a bug in RunPod's resource monitoring, or is there something wrong with my pod or pod template?
1 Reply
Marcus · 5mo ago
There is a bug in the monitoring UI; use `nvidia-smi` inside the pod to check actual GPU usage instead.
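For anyone finding this later, a quick sketch of how you might check from a terminal in the pod (assumes the NVIDIA driver and `nvidia-smi` are available, which they should be on a GPU pod; the exact utilization numbers will depend on your workload):

```shell
# One-off snapshot of GPU state (utilization, VRAM, temperature, processes)
nvidia-smi

# Poll GPU and VRAM utilization every second in a compact CSV form
nvidia-smi --query-gpu=utilization.gpu,memory.used,memory.total --format=csv -l 1
```

If `nvidia-smi` shows real GPU and VRAM usage while the dashboard still reads 0%, that confirms the display bug rather than a problem with the pod itself.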