Created by K on 3/31/2025 in #⚡|serverless
Unable to deploy my LLM serverless with the vLLM template
I am trying to deploy a serverless LLM with the vLLM template, but I cannot get it to work. Is there something wrong with my configuration?
Ideally, I want to deploy the model I trained, but even deploying "meta-llama/Llama-3.1-8B-Instruct" as shown in the tutorials didn't work.