RunPod•5mo ago
Data_Warrior

serverless deployment

I want to deploy my LLM on a serverless endpoint. How can I do that?
3 Replies
Solution
ashleyk•5mo ago
GitHub - runpod-workers/worker-vllm: The RunPod worker template for serving our large language model endpoints. Powered by vLLM.
https://github.com/runpod-workers/worker-vllm
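For reference, here is a minimal sketch of what calling a deployed worker-vllm endpoint over RunPod's serverless REST API could look like. The endpoint ID placeholder, the environment variable name, and the exact input keys ("prompt", "sampling_params") are assumptions; check the worker-vllm README for the payload your worker version actually expects.
```python
# Minimal sketch: query a deployed worker-vllm serverless endpoint.
# ENDPOINT_ID is a hypothetical placeholder; the input schema below
# ("prompt", "sampling_params") is assumed from the worker-vllm docs.
import os
import requests

ENDPOINT_ID = "your-endpoint-id"          # replace with your endpoint's ID
API_KEY = os.environ["RUNPOD_API_KEY"]    # your RunPod API key

url = f"https://api.runpod.ai/v2/{ENDPOINT_ID}/runsync"
headers = {"Authorization": f"Bearer {API_KEY}"}
payload = {
    "input": {
        "prompt": "Explain serverless inference in one sentence.",
        "sampling_params": {"max_tokens": 128, "temperature": 0.7},
    }
}

# runsync blocks until the job finishes and returns the result in one call.
resp = requests.post(url, json=payload, headers=headers, timeout=120)
resp.raise_for_status()
print(resp.json())
```
Note that `runsync` waits for the job to complete; for longer generations the asynchronous `run` plus `status` pattern may be a better fit.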
ashleyk•5mo ago
Please ask in #🎤|general. This channel is for support with serverless problems, not for basic questions that the community can help with.
haris•5mo ago
This channel is fine. They asked a question about serverless in the serverless channel; it doesn't matter that they're new to the platform and asking basic questions.