TcCat
ollama won't pull manifest - weird error.
On RunPod I've tried the various ollama templates, and I've also installed ollama manually on a basic template.
I can run ollama serve fine, but in every case, when I run ollama run <model>, I get the following error:
Error: pull model manifest: Get "https://registry.ollama.ai/v2/library/mistral-large/manifests/latest": dial tcp: lookup registry.ollama.ai on 127.0.0.11:53: read udp 127.0.0.1:59647->127.0.0.11:53: i/o timeout
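For reference, this is roughly the sequence I'm running (a sketch; the model name is just the one from the error above, and serve stays running in a second shell):

# shell 1: start the ollama server
ollama serve

# shell 2: pull and run the model (this is where the manifest pull fails)
ollama run mistral-large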
I'm wondering if this is a new issue, maybe related to proxies, but I have no idea what to try next.
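For what it's worth, this is the kind of check I can run from inside the pod to see whether it's a DNS or a proxy problem (a sketch; it assumes nslookup and curl are installed, which may not be true on a minimal image):

# which resolver the container is using (127.0.0.11 is Docker's embedded DNS)
cat /etc/resolv.conf

# does the registry hostname resolve at all?
nslookup registry.ollama.ai

# can the registry be reached over HTTPS directly?
curl -sv https://registry.ollama.ai/v2/ -o /dev/null

# is a proxy configured in the environment?
env | grep -i proxy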