RunPod
•Created by Armyk on 5/30/2024 in #⚡|serverless
GGUF in serverless vLLM
CPU inference isn't good enough, but thank you.
58 replies
RunPod
•Created by Armyk on 5/30/2024 in #⚡|serverless
GGUF in serverless vLLM
I found this thread. I will probably need to configure it myself. Thank you for your help.
58 replies
RunPod
•Created by bigslam on 3/24/2024 in #⚡|serverless
How to run OLLAMA on Runpod Serverless?
Any news on this? Did you manage to run Ollama in serverless? I need to run a GGUF model.
32 replies
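One way the question above is usually approached: wrap a locally running Ollama server in a RunPod serverless handler that forwards the prompt and returns the completion. The sketch below is a minimal, hypothetical example — it assumes the worker image starts Ollama on its default port (11434) and that the GGUF-backed model has already been pulled; the model name and input shape are illustrative, not RunPod's or Ollama's required schema.

```python
import json
import urllib.request

# Default Ollama HTTP endpoint inside the worker container (assumption:
# the container entrypoint launches `ollama serve` before the handler starts).
OLLAMA_URL = "http://127.0.0.1:11434/api/generate"


def build_payload(job_input):
    """Translate a RunPod job input dict into an Ollama /api/generate body."""
    return {
        "model": job_input.get("model", "llama3"),  # illustrative default model
        "prompt": job_input["prompt"],
        "stream": False,  # ask Ollama for a single JSON response, not a stream
    }


def handler(job):
    """Serverless entry point: forward the prompt to Ollama, return the text."""
    payload = build_payload(job["input"])
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]


def start_worker():
    """Register the handler with RunPod's Python SDK (called in the worker image)."""
    import runpod  # provided by the `runpod` package in the worker image

    runpod.serverless.start({"handler": handler})
```

The handler itself is model-agnostic: swapping in a different GGUF model is just a matter of pulling it into the image and passing its name in the job input.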
RunPod
•Created by Armyk on 5/30/2024 in #⚡|serverless
GGUF in serverless vLLM
Where can I browse community templates for serverless? There has to be someone who has already done this.
58 replies
RunPod
•Created by Armyk on 5/30/2024 in #⚡|serverless
GGUF in serverless vLLM
Such a shame that it doesn't. Can I run Ollama in serverless?
58 replies