RunPod
•Created by vlad000ss on 12/1/2024 in #⚡|serverless
Custom vLLM OpenAI compatible API
It didn't work... The problem is that when I send a request to /openai/v1, the endpoint is invoked but the request is not processed. I suspect this is because my vLLM process is listening only on the /v1 endpoint. Did you run into this problem? I'm using my own custom vLLM image, not the RunPod one.
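One way to work around the mismatch described above is to strip the /openai prefix inside the custom worker before forwarding the request to the local vLLM server. This is only a minimal sketch; the helper name and the assumption that the worker sees the full /openai/v1/... path are hypothetical, not taken from RunPod's actual payload format.

```python
def to_vllm_path(request_path: str) -> str:
    """Map a RunPod-style /openai/v1/... path to the /v1/... path
    that a stock vLLM OpenAI-compatible server actually serves.
    (Hypothetical helper; adjust to the payload your worker receives.)"""
    prefix = "/openai"
    if request_path.startswith(prefix):
        return request_path[len(prefix):]
    return request_path

# Example: the proxy path becomes the path vLLM listens on.
print(to_vllm_path("/openai/v1/chat/completions"))  # → /v1/chat/completions
```

The custom handler would then forward the rewritten path to the vLLM process listening on /v1.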
25 replies
https://docs.runpod.io/serverless/workers/vllm/openai-compatibility#initialize-your-project
For anyone who runs into the same issue I did.
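The linked docs describe pointing an OpenAI-compatible client at the endpoint's /openai/v1 base URL. A minimal sketch of building that URL, assuming the api.runpod.ai/v2/<endpoint_id> URL scheme from the docs; the endpoint ID and API key below are placeholders:

```python
endpoint_id = "YOUR_ENDPOINT_ID"  # placeholder, not a real endpoint
api_key = "YOUR_RUNPOD_API_KEY"   # placeholder

# RunPod exposes the OpenAI-compatible routes under /openai/v1,
# even though vLLM itself serves them under /v1.
base_url = f"https://api.runpod.ai/v2/{endpoint_id}/openai/v1"

print(base_url)
```

An OpenAI SDK client configured with this `base_url` (and the RunPod API key as the bearer token) should then reach the worker's chat/completions routes.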