aurelium
RunPod
•Created by NERDDISCO on 7/25/2024 in #⚡|serverless
Llama 3.1 via Ollama
That works, thanks!
19 replies
I keep getting JSON decoding errors trying to run queries on it...
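A JSON decoding error at the client usually means the endpoint replied with a non-JSON body (an error page, a plain-text message, or an empty response from a cold worker) rather than a malformed model output. A minimal sketch of checking the raw body before decoding — the function and messages here are illustrative, not part of RunPod's or Ollama's API:

```python
import json


def parse_endpoint_response(body: str) -> dict:
    """Decode a response body, surfacing the raw text when it is not
    valid JSON instead of failing with a bare JSONDecodeError."""
    try:
        return json.loads(body)
    except json.JSONDecodeError as exc:
        # Show the start of the actual body -- this is usually enough to
        # see whether the endpoint returned an error page or nothing.
        raise ValueError(
            f"Endpoint returned non-JSON ({exc}): {body[:200]!r}"
        ) from exc


# Well-formed response decodes normally:
result = parse_endpoint_response('{"output": {"text": "hi"}}')
```

Logging the first couple hundred characters of the raw body like this typically reveals the real failure (auth error, timeout, worker not ready) that the decode error was masking.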
When you say "In the Container Start Command field, specify the Ollama supported model", do you mean literally just pasting the ollama model ID into that field?
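Assuming the template wraps Ollama as the guide quoted above suggests, the field is usually expected to hold just the model tag as it appears in the Ollama library, not a full command. For example (tags below are real Ollama library tags; whether the field accepts a bare tag depends on the specific template):

```shell
# Value for the Container Start Command field, per the quoted guide:
llama3.1
# or a specific variant tag from the Ollama library:
llama3.1:8b
```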