CUDA not recognized
Hi, I'd like to run a PyTorch model on a GPU, but PyTorch doesn't recognize the GPU even though I'm using a template that includes CUDA.
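For anyone hitting the same thing, a quick way to see what PyTorch itself detects (a minimal sketch, assuming `torch` is installed in the pod; the actual output depends on your environment):

```python
import torch

print("torch version:", torch.__version__)
print("built against CUDA:", torch.version.cuda)       # CUDA version the wheel was compiled for
print("cuda available:", torch.cuda.is_available())    # False usually means a driver/runtime mismatch
if torch.cuda.is_available():
    print("device:", torch.cuda.get_device_name(0))
```

If `torch.cuda.is_available()` is False while `nvidia-smi` works, the driver on the host is visible but the PyTorch CUDA runtime can't use it.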
6 Replies
I'm getting the following:
The template I'm using is runpod/pytorch:2.1.1-py3.10-cuda12.1.1-devel-ubuntu22.04
Could you run nvidia-smi in the console?
What is the pod id?
fho00ms04wxzxw
I have the same problem with NVIDIA 4090 pods and cuda 12.1 / torch 2.1
pods with other GPUs and the same CUDA / torch versions work well