RunPod · 3d ago

Unable to deploy my serverless LLM with the vLLM template

I am trying to deploy a serverless LLM with the vLLM template, but I cannot get it to work. Is there something wrong with my configuration?
Ideally, I want to deploy the model I trained, but even deploying "meta-llama/Llama-3.1-8B-Instruct" as shown in the tutorials didn't work.
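For anyone hitting the same wall: once the endpoint does show as running, a quick way to check whether the worker responds at all is the generic serverless `runsync` route. A minimal sketch, assuming the standard RunPod serverless HTTP API, hypothetical placeholders for the endpoint ID and API key, and that the vLLM worker accepts a plain `prompt` field in its input payload:

```python
import requests

ENDPOINT_ID = "your-endpoint-id"  # hypothetical placeholder
API_KEY = "your-runpod-api-key"   # hypothetical placeholder

# Synchronous run: blocks until the worker returns a result (or times out).
resp = requests.post(
    f"https://api.runpod.ai/v2/{ENDPOINT_ID}/runsync",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={"input": {"prompt": "Say hello."}},
    timeout=300,
)
resp.raise_for_status()
print(resp.json())
```

If this errors or hangs before a worker ever starts, the problem is in the endpoint configuration rather than the request itself.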
K (OP) · 3d ago
I have been granted access to the gated models and am using my access token.
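One sanity check worth doing, independent of RunPod: confirm the token itself can download from the gated repo. A sketch, assuming the `huggingface_hub` package and the same model ID from the question; the token value is a placeholder. If this raises a 401/403, the token (or a still-pending access grant) is the problem, and the same token passed to the endpoint (typically via the `HF_TOKEN` environment variable, if the template follows the usual worker-vllm convention) will fail the same way:

```python
from huggingface_hub import snapshot_download

# Fetch only the config file: enough to prove the token grants gated access.
# A 401/403 here means the token or the access grant is the problem.
snapshot_download(
    "meta-llama/Llama-3.1-8B-Instruct",
    token="hf_...",                   # placeholder: your Hugging Face token
    allow_patterns=["config.json"],
)
print("Token can access the gated repo.")
```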