RunPod5mo ago
Hello

A100 GPU vram being used

I have a pod running, but one of my assigned GPUs has its VRAM taken up and I can't clear it, even after restarting the pod or calling torch.cuda.empty_cache().
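For context: torch.cuda.empty_cache() only releases memory held by PyTorch's caching allocator in the current process; VRAM held by another (possibly orphaned) process on the GPU is unaffected, which is a common reason memory appears "stuck". A first diagnostic step is usually to check per-GPU memory with nvidia-smi. Below is a minimal sketch (the helper name and the sample output are illustrative) that parses the CSV output of `nvidia-smi --query-gpu=index,memory.used,memory.total --format=csv,noheader,nounits` to flag GPUs with significant memory in use:

```python
def gpus_with_used_vram(nvidia_smi_csv: str, threshold_mib: int = 100):
    """Parse nvidia-smi CSV output (index, memory.used, memory.total in MiB)
    and return the indices of GPUs using more than threshold_mib of VRAM."""
    busy = []
    for line in nvidia_smi_csv.strip().splitlines():
        index, used, total = (field.strip() for field in line.split(","))
        if int(used) > threshold_mib:
            busy.append(int(index))
    return busy


# Example with sample output; on a pod you would instead run:
#   nvidia-smi --query-gpu=index,memory.used,memory.total --format=csv,noheader,nounits
sample = """0, 39500, 40960
1, 3, 40960"""
print(gpus_with_used_vram(sample))  # GPU 0 still holds ~39.5 GiB
```

If a GPU shows memory in use but no corresponding process in `nvidia-smi` visible from your container, the holder is likely outside your namespace and only RunPod support can reset it, which is why support asks for the pod ID.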
5 Replies
Poddy
Poddy5mo ago
@Hello
Escalated To Zendesk
The thread has been escalated to Zendesk!
Jason
Jason5mo ago
What is your pod ID? Which template do you use: a custom template, or one of RunPod's official templates?
ege0189
ege01895mo ago
I have the same issue
GENGHIS
GENGHIS2mo ago
Same issue. I don't want to change pods; I have tons of data on here.
Jason
Jason2mo ago
Open a ticket and report your pod.
