acamp
RunPod
Created by acamp on 11/12/2024 in #⛅|pods
Ollama on Runpod
@baldy Thanks for all the help. I was able to resolve the issue by using an older Docker image (0.3.14 instead of 0.4.1).
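For anyone who lands here later: the fix amounts to pinning the image tag instead of taking :latest. A minimal sketch of the equivalent plain-Docker invocation, assuming the ollama/ollama image on Docker Hub (on RunPod itself, entering ollama/ollama:0.3.14 as the pod's Container Image should achieve the same pin):

```bash
# Pull the pinned release rather than :latest (which resolved to 0.4.1 here)
docker pull ollama/ollama:0.3.14

# Run with GPU access, exposing Ollama's default port 11434
docker run -d --gpus=all -v ollama:/root/.ollama \
  -p 11434:11434 --name ollama ollama/ollama:0.3.14
```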
Thank you! I'll definitely give this a shot.
I have not been using streamed responses. After a bit of exploring, I think the issue lies with the Ollama version. The download link (presented in the article) installs Ollama version 0.4.1; however, when I used an older Ollama variant (0.1.32) the issue disappeared. The problem is that Ollama 0.1.32 does not support llama3.1 onwards. Would anyone happen to know how I could install a specific version of Ollama?
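For later readers: Ollama's Linux install script supports pinning a release through an environment variable, which should let you pick a version new enough for llama3.1 but older than 0.4.1 (the exact version below is illustrative; check the GitHub releases page for valid tags):

```bash
# OLLAMA_VERSION pins the release the install script downloads
curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION=0.3.14 sh
```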
14 replies
RunPod
Created by acamp on 4/24/2024 in #⛅|pods
Ollama on RunPod
@Papa Madiator and @justin [Not Staff] Thank you both for the assistance and resources! Would you happen to know if it's possible to set up llama3 on open-webui and make inferences to it using an API? I was not able to find specific instructions on how to set up an LLM on open-webui.
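In case it helps someone searching: Open WebUI is largely a frontend over Ollama, so one route is to call Ollama's own HTTP API on the pod directly. A minimal sketch, assuming Ollama listens on the pod's exposed port 11434 and using RunPod's proxy URL scheme (the pod ID below is a placeholder):

```bash
# Pull the model once, then query it over Ollama's REST API
curl https://<POD_ID>-11434.proxy.runpod.net/api/pull \
  -d '{"name": "llama3"}'

curl https://<POD_ID>-11434.proxy.runpod.net/api/generate \
  -d '{"model": "llama3", "prompt": "Why is the sky blue?", "stream": false}'
```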
@justin [Not Staff] Hey Justin, I noticed that you were able to provide some valuable advice to other users regarding Ollama on RunPod, so I was hoping to reach out to you regarding this thread, which I have yet to debug.
Yes, but in this case I think it's trying to run the command "serve" along with "run" and "gemma" as the arguments.
It looks like the Container Start Command can only take one command.
It returns the following error: Error: accepts 0 arg(s), received 2
Thank you for all the support so far, but are there any other fixes I could implement to get this to work?
It looks like I just have to run two commands - "serve" and "run gemma", after which I should be able to make inferences with gemma, but I'm not sure how to implement that.
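One workaround that may apply here, with a caveat: the stock ollama/ollama image uses the ollama binary itself as its entrypoint, so anything typed into Container Start Command lands as arguments to ollama. Given a template or image whose entrypoint is a shell (or with the entrypoint overridden), both steps can be wrapped in a single command:

```bash
# Start the API server in the background, give it a moment to bind
# port 11434, then run the model in the foreground
bash -c "ollama serve & sleep 5 && ollama run gemma"
```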
Tried this, and the error seems to be the same.
It seems to be returning the same error.
If I just have "gemma", the error message is: Error: unknown command "gemma" for "ollama"
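That matches Ollama's CLI grammar: a model name is only valid as an argument to a subcommand such as run or pull, never on its own:

```bash
ollama run gemma   # "run" is the subcommand; "gemma" is its model argument
ollama gemma       # fails: the CLI has no "gemma" subcommand
```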
Thanks for the link. I went ahead and spun up a pod with the ollama/ollama container image. After the pod starts, would you know how to make inferences with a model (e.g., gemma)?
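For anyone following along, once the container is up the quickest check is the CLI from the pod's web terminal (model name per Ollama's model library):

```bash
ollama pull gemma   # fetch the model weights once
ollama run gemma    # interactive prompt; doubles as a smoke test
```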
I was just looking into that. Do you have any resources that may be helpful? I was referring to this link: https://hub.docker.com/r/ollama/ollama#!, but I was wondering if there was an approach more suited to RunPod.
34 replies