serverless deployment
I want to deploy my LLM on a serverless endpoint. How can I do that?
Solution:
GitHub - runpod-workers/worker-vllm: The RunPod worker template for serving our large language model endpoints. Powered by vLLM.
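Once a worker-vllm endpoint is deployed, it is invoked over RunPod's serverless HTTP API. As a minimal sketch (the endpoint ID and API key are placeholders, and the `prompt`/`sampling_params` input shape is assumed from worker-vllm's conventions, not stated in this thread), a synchronous request can be built like this:

```python
import json
import urllib.request

def build_runsync_request(endpoint_id: str, api_key: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for a RunPod serverless endpoint's /runsync route.

    The input schema ({"input": {"prompt": ..., "sampling_params": ...}})
    is an assumption based on worker-vllm's expected payload.
    """
    url = f"https://api.runpod.ai/v2/{endpoint_id}/runsync"
    payload = {"input": {"prompt": prompt, "sampling_params": {"max_tokens": 128}}}
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",  # RunPod API key
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Placeholders only -- substitute your own endpoint ID and API key.
req = build_runsync_request("YOUR_ENDPOINT_ID", "YOUR_API_KEY", "Hello!")
print(req.full_url)
```

Sending the request with `urllib.request.urlopen(req)` (or any HTTP client) returns the model's completion once the worker finishes; RunPod also exposes an asynchronous `/run` route for longer jobs.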
3 Replies
Please ask in #🎤|general. This channel is for support with serverless problems, not for basic questions that the community can help with.
This channel is fine; they asked a question about serverless in the serverless channel. It doesn't matter if they are new to the platform and asking basic questions.