tzar_impersonator
RRunPod
Created by tzar_impersonator on 1/7/2025 in #⚡|serverless
no compatible serverless GPUs found while following tutorial steps
hi, i'm trying to run orca-mini on serverless by following this tutorial [https://docs.runpod.io/tutorials/serverless/cpu/run-ollama-inference]. whenever the download finishes, i get the log messages below and then the ckpt download restarts.
2025-01-07 22:02:53.719[1vt59v6j5ku3yh][info][GIN] 2025/01/07 - 22:02:45 | 200 | 4.060412ms | 127.0.0.1 | HEAD "/"
2025-01-07 22:02:53.719[1vt59v6j5ku3yh][info]time=2025-01-07T22:02:45.001Z level=INFO source=types.go:105 msg="inference compute" id=0 library=cpu compute="" driver=0.0 name="" total="4.4 GiB" available="4.2 GiB"
2025-01-07 22:02:53.719[1vt59v6j5ku3yh][info]time=2025-01-07T22:02:45.001Z level=INFO source=gpu.go:346 msg="no compatible GPUs were discovered"
2025-01-07 22:02:53.719[1vt59v6j5ku3yh][info]time=2025-01-07T22:02:44.975Z level=INFO source=gpu.go:205 msg="looking for compatible GPUs"
2025-01-07 22:02:53.719[1vt59v6j5ku3yh][info]time=2025-01-07T22:02:44.975Z level=INFO source=payload.go:44 msg="Dynamic LLM libraries [cpu cpu_avx cpu_avx2 cuda_v11 rocm_v60102]"
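for reference, here's roughly how i'm calling the endpoint once the worker is up, as a sketch. the endpoint ID, API key, and the exact `input` schema are placeholders / my reading of the tutorial's worker, not confirmed:

```python
import json

# RunPod serverless base URL for synchronous requests.
API_BASE = "https://api.runpod.ai/v2"


def build_request(endpoint_id: str, api_key: str, prompt: str):
    """Build the URL, headers, and JSON body for a /runsync call.

    The {"input": {"prompt": ...}} shape is an assumption based on the
    tutorial's ollama worker; adjust it to match the worker's handler.
    """
    url = f"{API_BASE}/{endpoint_id}/runsync"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"input": {"prompt": prompt}})
    return url, headers, body


if __name__ == "__main__":
    # Placeholder credentials; replace with a real endpoint ID and API key.
    url, headers, body = build_request(
        "MY_ENDPOINT_ID", "MY_API_KEY", "Why is the sky blue?"
    )
    print(url)
    print(body)
```

the "no compatible GPUs were discovered" lines above should be harmless on a CPU endpoint (ollama falls back to its cpu libraries), so the download restarting is the part i can't explain.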
9 replies