Ollama stopped using GPU
I installed Ollama on a 3090 pod as usual, following this tutorial: https://docs.runpod.io/tutorials/pods/run-ollama#step-4-interact-with-ollama-via-http-api. But now everything runs very slowly, and GPU Memory Used stays at zero. What could be the reason?
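One way to narrow this down is to ask Ollama itself whether the model is actually in VRAM. A minimal sketch, assuming the default port 11434 and an Ollama version recent enough to expose the `/api/ps` endpoint (the one behind `ollama ps`):

```python
import json
import urllib.request

# Query the local Ollama server for loaded models and how much of each
# is resident in VRAM. Assumes the default port 11434 and that this
# Ollama build exposes /api/ps.
with urllib.request.urlopen("http://localhost:11434/api/ps") as resp:
    data = json.load(resp)

for model in data.get("models", []):
    size = model.get("size", 0)
    vram = model.get("size_vram", 0)
    pct = 100 * vram / size if size else 0
    print(f"{model['name']}: {vram}/{size} bytes in VRAM ({pct:.0f}%)")
```

If `size_vram` comes back as 0, Ollama has fallen back to CPU inference, which would explain both the slowdown and the zero GPU memory reading.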
19 Replies
Yeah, that's the annoying thing about third-party tools: one update can break a guide.
From what I tested, the official Docker image works fine.
Libraries that don't work with the driver, or code in the application you're using, can cause that, yeah.
Yes, there's an Ollama template that uses the official Docker image, and it works fine.
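If you want to double-check that the Docker setup is actually using the GPU (the image's documented run line is `docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama`), you can time a short generation through the API. A rough sketch, assuming the default port and a model you've already pulled; "llama3" here is just an example name:

```python
import json
import urllib.request

# Send a short prompt and read Ollama's timing fields from the
# non-streaming /api/generate response.
req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps({
        "model": "llama3",  # example name, substitute your own model
        "prompt": "Say hello in one sentence.",
        "stream": False,
    }).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    result = json.load(resp)

# eval_count tokens over eval_duration nanoseconds -> tokens/sec.
tokens_per_sec = result["eval_count"] / (result["eval_duration"] / 1e9)
print(f"{tokens_per_sec:.1f} tokens/sec")
```

Single-digit tokens/sec on a 3090 usually means CPU-only inference; a GPU-backed run should be an order of magnitude faster.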
I am having the same issue - it's completely broken my workflow
I've burned about 14 hours trying to get this to work, so it broke sometime in the last 24 hours, and there's still no resolution.
Report it from the website
As an email with a 1 to 2 day response?
Given the premium RunPod charges, I'll pass on this kind of service and move to a competitor today. This is crazy.
Thanks for the suggestion though.
Alright no problem if that's what you want
I unfortunately bought a bunch of credits. Sunk cost.
You can't just break your top template and not fix it as a priority.
That's amateur service at best.
Try asking for a refund if you want.
Will do, thank you.
Seems like the RunPod team is busy working on something else.
The serverless worker I'm going to use has been out of date for a while now.
Haha, their Submit a Request button and the RunPod Support Bot are broken.
Huge red flag
Huh
The support bot has been offline for some months now, I guess.
But the submit-request button should be working.
Try to check if any firewall is blocking your requests
I tried multiple browsers and two different networks, same outcome.
It just does nothing
Let me check
If I can, I'll submit it for you too.
You're too kind, thank you. Give it a try - I'm curious if it works for you!
Hey, maybe I could just submit it with your subject, email, and description, if you want?
Send it to me in my DMs and I'll try to do that.
Hey guys, I'm running into the same problem. Have you found a solution, or are you using the Docker template instead? @nerdylive @PavelDonchenko
@jcmohed Give my template a try: #Better Ollama - CUDA12