How can I deploy Mixtral using Ollama as a service?
Hi everyone!
I want to deploy the Mixtral 8x7B model using Ollama on RunPod, but I can't install it as a service using the RunPod desktop template.
Please help me!
Solution:
Answered in #general.
Run the install script, then start `ollama serve` in one terminal and `ollama run [model name]` in a second terminal.
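A minimal sketch of those steps as shell commands, assuming a Linux pod with `curl` available (the model tag `mixtral:8x7b` is Ollama's published name for this model; adjust if you want a different quantization):

```shell
# Install Ollama using the official install script
curl -fsSL https://ollama.com/install.sh | sh

# Terminal 1: start the Ollama server (listens on localhost:11434 by default)
ollama serve

# Terminal 2: pull and run Mixtral interactively
ollama run mixtral:8x7b
```

Since container-based pods typically don't run systemd, you generally can't register `ollama serve` as a systemd service there; a common workaround is to keep it alive in the background with `tmux` or `nohup ollama serve &`.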