acamp
Ollama on RunPod
I have not been using streamed responses.
After a bit of exploring, the issue seems to lie with the Ollama version. The download link (presented in the article) installs Ollama 0.4.1, but when I used an older release (0.1.32) the issue disappeared. The problem is that Ollama 0.1.32 does not support llama3.1 onwards. Would anyone happen to know how I could install a specific version of Ollama?
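For anyone who hits the same thing: the official install script supports pinning a release through the OLLAMA_VERSION environment variable (this is documented in the Ollama FAQ). A minimal sketch, assuming a Linux pod with curl available; note that llama3.1 support only arrived around the 0.3.0 release, so a 0.3.x version would be the target rather than 0.1.32:

```bash
# Install a specific Ollama release by setting OLLAMA_VERSION,
# which the official install.sh honors (per the Ollama FAQ).
# Swap in whichever version you need, e.g. a 0.3.x release for llama3.1.
curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION=0.3.0 sh
```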
14 replies
Ollama on RunPod
@Papa Madiator and @justin [Not Staff] Thank you both for the assistance and resources! Would you happen to know if it's possible to set up llama3 on open-webui and make inferences to it through an API? I was not able to find specific instructions on how to set up an LLM on open-webui.
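A sketch of the usual pattern, assuming Open WebUI is backed by an Ollama instance on the same pod (Ollama's default port is 11434, and localhost below stands in for however the pod is exposed on RunPod, e.g. a proxy URL or SSH tunnel):

```bash
# Pull llama3 into the Ollama backend that Open WebUI sits on top of.
ollama pull llama3

# Query the model directly over Ollama's HTTP API (default port 11434).
# "localhost" is a placeholder for however you reach the pod.
curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
```

Open WebUI also exposes its own OpenAI-compatible endpoint (/api/chat/completions, authenticated with an API key from its settings), if you would rather go through the UI layer than hit Ollama directly.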
34 replies
Ollama on RunPod
I was just looking into that. Did you have any resources that may be helpful? I was referring to this link: https://hub.docker.com/r/ollama/ollama#!, but I was wondering if there was an approach more suited to RunPod.
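For reference, the setup that Docker Hub page describes boils down to the sketch below. One caveat specific to RunPod (an assumption about the setup here): a pod is itself a container, so rather than running Docker inside it you would typically set ollama/ollama as the pod's template image and let RunPod start it for you.

```bash
# The standard ollama/ollama setup from Docker Hub: start the server
# with GPU access and persist models in a named volume...
docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama

# ...then run a model inside the container.
docker exec -it ollama ollama run llama3
```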
34 replies