How can I deploy Mixtral using Ollama as a service?

Hi everyone! I want to deploy the Mixtral 8x7B model using Ollama on RunPod, but I can't install it as a service using the RunPod desktop template. Please help!
Solution:
Answered in #general: run the install script, then `ollama serve` in one terminal and `ollama run [model name]` in a new terminal.
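The steps from the answer can be sketched as a single shell session on a Linux pod (the install URL and the `mixtral:8x7b` tag follow Ollama's published docs; backgrounding the server stands in for "one terminal", and the port and sleep duration are assumptions):

```shell
# 1. Install Ollama via its official install script (skip if already present).
if ! command -v ollama >/dev/null 2>&1; then
  curl -fsSL https://ollama.com/install.sh | sh
fi

# 2. Start the Ollama server in the background ("one terminal").
ollama serve &
SERVER_PID=$!
sleep 3   # give the server a moment to start listening (default port 11434)

# 3. Pull and run the model interactively ("a new terminal").
ollama run mixtral:8x7b

# 4. Stop the background server when done.
kill "$SERVER_PID"
```

Note that Mixtral 8x7B is large; the pod needs enough GPU memory (or RAM, for CPU inference) to hold the quantized weights, so check your RunPod instance size before pulling.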
justin · 10mo ago