RunPod · 9mo ago
Volko

Download Mixtral from HuggingFace

How can I download this model in my pod ?
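A common way to pull a model from Hugging Face onto a pod is the `huggingface_hub` CLI. A minimal sketch; the exact repo ID and target directory are assumptions (the thread never names them):

```shell
# Install the Hugging Face CLI (assumes pip is available on the pod)
pip install -U "huggingface_hub[cli]"

# Download the whole model repo into a local folder;
# the repo ID below is an assumed example
huggingface-cli download mistralai/Mixtral-8x7B-Instruct-v0.1 \
  --local-dir /workspace/mixtral
```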
8 Replies
digigoblin · 9mo ago
If you use a template with oobabooga installed, it can download it for you.
Volko (OP) · 9mo ago
Yeah, but I want to run it with ollama via a Modelfile
justin · 9mo ago
You can run ollama normally: run its Linux installation script, then start the ollama server and it will work fine. Use a PyTorch template as a starting point to launch a pod, then just use the terminal through the Jupyter notebook server.
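The steps above can be sketched as follows. The install URL is Ollama's official Linux script; the model name at the end is an assumed example:

```shell
# Inside the pod's terminal (e.g. via the Jupyter server on a PyTorch template)
curl -fsSL https://ollama.com/install.sh | sh

# Start the Ollama server in the background
ollama serve &

# Sanity check: pull and run a model from the Ollama library
ollama run mixtral
```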
Volko (OP) · 9mo ago
But what do I need to write in the Modelfile? Normally it's FROM XXX, but here there are multiple files
justin · 9mo ago
No clue, I don't use custom models, but I'm sure there are YT videos / docs on it
justin · 9mo ago
https://youtu.be/0ou51l-MLCo?si=OiedA2tChtvd5PDG just giving a complete shot in the dark - I haven't watched this
Matt Williams
YouTube
Adding Custom Models to Ollama
I bet you have always wanted to have an emoji model. Even if you haven't, this video will show you how to make your own Ollama models. Here is the docker command I mentioned. docker run --rm -v .:/model ollama/quantize -q q4_0 /model
justin · 9mo ago
but I see other similar videos
Volko (OP) · 9mo ago
The issue is that every video only shows a single-GGUF model, not multi-file models.
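For the multi-file case, one approach (not discussed in the thread): if the shards are split GGUF parts, llama.cpp's `gguf-split` tool can merge them into one file, and the Modelfile's `FROM` then points at that single file. A sketch with hypothetical filenames; if the repo ships raw sharded safetensors instead of GGUF, it would first need converting with llama.cpp's conversion script:

```shell
# Merge split GGUF shards into one file with llama.cpp's gguf-split tool
# (shard names here are hypothetical examples)
./llama-gguf-split --merge \
  mixtral-8x7b.Q4_K_M-00001-of-00003.gguf \
  mixtral-8x7b.Q4_K_M.gguf

# Point the Modelfile at the single merged file
cat > Modelfile <<'EOF'
FROM ./mixtral-8x7b.Q4_K_M.gguf
EOF

# Register the model with Ollama under a local name
ollama create mixtral-local -f Modelfile
```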