PavelDonchenko
Created by PavelDonchenko on 7/9/2024 in #⛅|pods
Ollama stopped using GPU
I installed Ollama on a pod with a 3090 as usual, following this tutorial: https://docs.runpod.io/tutorials/pods/run-ollama#step-4-interact-with-ollama-via-http-api. But now everything runs very slowly, and "GPU Memory Used" always shows zero. What could be the reason?
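For anyone debugging the same symptom, a quick first check is whether the container can see the GPU at all and whether Ollama has offloaded the model to it. This is a hedged diagnostic sketch, not an official fix; it assumes the standard `nvidia-smi` tool and the Ollama CLI are available on the pod:

```shell
# Check whether the NVIDIA driver/runtime is exposed inside the container.
# If this fails, Ollama cannot use the GPU no matter how it is configured.
command -v nvidia-smi >/dev/null && nvidia-smi || echo "nvidia-smi not found: GPU runtime not exposed to this container"

# Ask Ollama where the loaded model is running.
# "100% GPU" in the PROCESSOR column means offload is working;
# "100% CPU" matches the symptom here (slow inference, zero GPU memory used).
command -v ollama >/dev/null && ollama ps || echo "ollama not found in PATH"
```

If `nvidia-smi` works but `ollama ps` reports CPU, restarting the Ollama server so it re-detects the GPU is a common next step.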