•Created by jd24 on 4/23/2024 in #⚡|serverless
How does the vLLM serverless worker support the OpenAI API contract?
I wonder how a serverless worker can implement a custom API contract, given that every request must be a POST and the payload is forced to be JSON with a mandatory `input` field.
I understand that the vLLM worker (https://github.com/runpod-workers/worker-vllm) solved this and implements OpenAI API endpoints, but I don't get how it got around these limitations.
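My current guess is that nothing actually bypasses the contract: the endpoint still receives a POST with an `input` field, and the OpenAI-style request is simply nested inside that field and unpacked by the handler. Something like this sketch (the field names `openai_route` and `openai_input` are my assumptions about the wrapping convention, not verified worker-vllm internals):

```python
# Hypothetical sketch of wrapping an OpenAI-style request inside
# RunPod's mandatory {"input": ...} job envelope.
# NOTE: "openai_route" and "openai_input" are assumed names for
# illustration, not confirmed fields of worker-vllm.

def wrap_openai_request(route: str, openai_payload: dict) -> dict:
    """Package an OpenAI-style request as a serverless job payload
    that satisfies the POST + mandatory "input" field contract."""
    return {
        "input": {
            "openai_route": route,          # e.g. "/v1/chat/completions"
            "openai_input": openai_payload,  # untouched OpenAI request body
        }
    }

# The handler on the worker side would then dispatch on the route
# and hand the inner payload to the vLLM engine.
job = wrap_openai_request(
    "/v1/chat/completions",
    {
        "model": "mistralai/Mistral-7B-Instruct-v0.2",
        "messages": [{"role": "user", "content": "Hello!"}],
    },
)
```

Is that roughly how it works, with a thin proxy layer translating OpenAI paths into this kind of envelope?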