RunPod • Created by bigslam on 3/24/2024 in #⚡|serverless
How to run OLLAMA on Runpod Serverless?
It's easier to run Ollama on a GPU pod, but I'm trying to save time and want a serverless implementation.
32 replies
I want to run quantized LLMs @justin, e.g. GGUF.
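One common pattern for this (a sketch, not an official RunPod recipe) is to build a worker image that starts `ollama serve` in the background and exposes a RunPod serverless handler that forwards each job's prompt to Ollama's local HTTP API. Everything below is illustrative: the model name is a placeholder, and you'd need a GGUF-quantized model pulled into the image ahead of time.

```python
# Hypothetical sketch of a RunPod serverless handler that proxies to a
# local Ollama server. Assumes the container already runs `ollama serve`
# and has pulled the model (model name below is a placeholder).
import json
import urllib.request

OLLAMA_URL = "http://127.0.0.1:11434/api/generate"  # Ollama's default port


def build_payload(model, prompt):
    """Build the JSON body for Ollama's /api/generate endpoint."""
    # stream=False asks Ollama for a single JSON reply instead of chunks.
    return {"model": model, "prompt": prompt, "stream": False}


def handler(job):
    """RunPod serverless entrypoint: job["input"] carries model + prompt."""
    inp = job["input"]
    body = json.dumps(
        build_payload(inp.get("model", "llama2"), inp["prompt"])
    ).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        # Ollama returns the generated text under the "response" key.
        return json.loads(resp.read())["response"]


def main():
    # Register the handler with RunPod's serverless runtime; call this
    # as the container's entrypoint on the worker.
    import runpod

    runpod.serverless.start({"handler": handler})
```

The trade-off versus a GPU pod is cold-start time: the worker must load the GGUF model into Ollama on first request, so keeping a small quantization or an active worker helps.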