RunPod · 5mo ago
InnerSun

Official Template not running correct version of CUDA

Hello! I'm trying to run a pod using the official templates:
runpod/pytorch:2.1.0-py3.10-cuda11.8.0-devel-ubuntu22.04
runpod/pytorch:2.0.1-py3.10-cuda11.8.0-devel-ubuntu22.04
Unless I've completely misunderstood the notation, the image should run with CUDA 11.8.0, right? I've tried with Secure Cloud RTX 4090 and Secure Cloud RTX Ada 6000. All of them start with:
2024-08-17T12:23:11.943448391Z ==========
2024-08-17T12:23:11.943453191Z == CUDA ==
2024-08-17T12:23:11.943456021Z ==========
2024-08-17T12:23:11.959357698Z
2024-08-17T12:23:11.959372989Z CUDA Version 11.8.0
However, I noticed when running nvidia-smi that the CUDA version is incorrect:
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 535.129.03             Driver Version: 535.129.03   CUDA Version: 12.2     |
|-----------------------------------------+----------------------+----------------------+
| GPU  Name                 Persistence-M | Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp   Perf          Pwr:Usage/Cap |         Memory-Usage | GPU-Util  Compute M. |
|                                         |                      |               MIG M. |
|=========================================+======================+======================|
|   0  NVIDIA GeForce RTX 4090        On  | 00000000:01:00.0 Off |                  Off |
|  0%   29C    P8              17W / 450W |      3MiB / 24564MiB |      0%      Default |
|                                         |                      |                  N/A |
+-----------------------------------------+----------------------+----------------------+
Any idea what's wrong, and is there a workaround?
Solution · Madiator2011 · 5mo ago
@InnerSun nvidia-smi shows the maximum CUDA version supported by the host driver.
InnerSun (OP) · 5mo ago
Damn, you're right:
$ nvcc --version
nvcc: NVIDIA (R) Cuda compiler driver
Copyright (c) 2005-2022 NVIDIA Corporation
Built on Wed_Sep_21_10:33:58_PDT_2022
Cuda compilation tools, release 11.8, V11.8.89
Build cuda_11.8.r11.8/compiler.31833905_0
Sorry, so my issue might be something else.
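For anyone hitting the same confusion: the two numbers answer different questions. nvidia-smi reports the host driver and the highest CUDA version that driver supports, while nvcc (and PyTorch itself) report the toolkit/runtime baked into the image. Below is a minimal sketch of a version check from inside the container, assuming torch is installed as in the official runpod/pytorch images (the script name is just illustrative):

# check_cuda_versions.py - compare driver-reported vs. in-container CUDA versions
import subprocess
import torch

# CUDA runtime this PyTorch build was compiled against (e.g. "11.8")
print("torch.version.cuda:", torch.version.cuda)

# Confirms the host driver can actually run this runtime
print("cuda available    :", torch.cuda.is_available())

# nvidia-smi reports the host driver and the *maximum* CUDA version it
# supports (12.2 here), not the toolkit installed inside the container
out = subprocess.run(
    ["nvidia-smi", "--query-gpu=driver_version", "--format=csv,noheader"],
    capture_output=True, text=True,
)
print("host driver       :", out.stdout.strip())

As long as the driver's supported CUDA version is at least as new as the image's toolkit (12.2 >= 11.8 here), the mismatch is expected and harmless.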