RunPod · 3mo ago
Hello

A100 GPU VRAM being used

I have a pod running, but one of my assigned GPUs has its VRAM taken up and I can't clear it, even after restarting the pod or calling torch.cuda.empty_cache().
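For context, torch.cuda.empty_cache() only releases memory that PyTorch's allocator has cached but no longer references; VRAM held by live tensors in the current process, or by another process entirely, stays allocated. Below is a minimal sketch (assuming PyTorch is installed in the pod and GPU 0 is the affected device) of checking the per-process numbers before and after clearing the cache:
```python
# Minimal sketch, assuming PyTorch is available in the pod and GPU 0 is the
# affected device. empty_cache() only returns memory that the caching
# allocator holds but that no live tensor references.
import gc
import torch

def report_vram(device: int = 0) -> None:
    # memory_allocated: bytes currently held by live tensors in this process
    # memory_reserved:  bytes held by PyTorch's caching allocator overall
    allocated = torch.cuda.memory_allocated(device) / 1024**3
    reserved = torch.cuda.memory_reserved(device) / 1024**3
    print(f"GPU {device}: {allocated:.2f} GiB allocated, {reserved:.2f} GiB reserved")

report_vram(0)

# Drop any remaining Python references to large tensors first (e.g. `del model`),
# then collect garbage, then ask the allocator to release its cache.
gc.collect()
torch.cuda.empty_cache()

report_vram(0)
```
If nvidia-smi on the pod still shows high usage while this process reports close to zero, the memory is most likely held by a different process (for example a notebook kernel or worker that never exited), which a cache flush inside Python cannot release.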
3 Replies
Poddy · 3mo ago
@Hello
Escalated To Zendesk
The thread has been escalated to Zendesk!
nerdylive · 3mo ago
What is your pod ID? And which template do you use: a custom template, or one of the official RunPod templates?
ege0189 · 3mo ago
I have the same issue
