Created by richterscale9 on 5/13/2024 in #⚡|serverless
Serverless vLLM doesn't work and gives no error message
I've spent a few hours trying to deploy a serverless vLLM endpoint following the instructions at https://docs.runpod.io/serverless/workers/vllm/get-started.
The endpoint doesn't work, and there's no error message or any other indication of what's wrong.
Every request I send just sits "in queue", and the status never changes.
The logs show an initialization message and some warnings, but no errors, and the requests themselves never appear in the logs.
The endpoint id is o13ejihy2p9hi8.
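For reference, this is roughly how I'm submitting requests, using the standard serverless /run and /status endpoints (API key and prompt here are just placeholders):

```python
import time
import requests

API_KEY = "YOUR_RUNPOD_API_KEY"  # placeholder, not my real key
ENDPOINT_ID = "o13ejihy2p9hi8"
BASE_URL = f"https://api.runpod.ai/v2/{ENDPOINT_ID}"
HEADERS = {"Authorization": f"Bearer {API_KEY}"}

# Submit an async job to the vLLM worker (input format per the get-started guide).
resp = requests.post(
    f"{BASE_URL}/run",
    headers=HEADERS,
    json={"input": {"prompt": "Hello, world!"}},
)
job = resp.json()
print(job)  # returns something like {"id": "...", "status": "IN_QUEUE"}

# Poll the job status; in my case it stays IN_QUEUE forever.
while True:
    status = requests.get(f"{BASE_URL}/status/{job['id']}", headers=HEADERS).json()
    print(status["status"])
    if status["status"] in ("COMPLETED", "FAILED", "CANCELLED", "TIMED_OUT"):
        break
    time.sleep(5)
```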