Sir Falk
Pod's connection is less stable than the Tower of Babel
I'm trying to run Ollama in a container on a RunPod pod, and I keep running into connection errors over and over again. I've tried different pods, Secure Cloud vs. Community, and different GPUs, but I keep getting timeouts like this:
It occasionally works, but I need a stable connection to a GPU to make real progress on my project. Last weekend I probably spent $15 without getting anywhere, just trying to fix the connection.
I use the proxy on port 11434 to reach Ollama, but when the timeout occurs I can't even reach it from the terminal using
ollama run my-model
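For reference, this is roughly how I've been checking whether Ollama is reachable at all; <pod-id> is a placeholder for my actual pod ID, and I'm assuming the standard RunPod proxy URL pattern here:

# from my own machine, going through the RunPod HTTP proxy
curl https://<pod-id>-11434.proxy.runpod.net/api/tags

# from the pod's own terminal, bypassing the proxy entirely
curl http://localhost:11434/api/tags

When the timeout hits, the proxy request hangs, and even the local check from inside the pod's terminal fails.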
Please help!