Netrve
RunPod
Created by Netrve on 7/24/2024 in #⚡|serverless
How does the vLLM template provide an OAI route?
Hi, the vLLM template provides an additional OpenAI-compatible (OAI) route. I'm currently building my own serverless template for exl2 and wondered how this is achieved, since I don't see any description in the documentation on how to set it up, and reading the source doesn't provide much more insight. If I check for job.get("openai_route"), is that handled automatically, or how would I go about adding it to the handler (or elsewhere)?
8 replies
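The dispatch the question describes can be sketched as a plain handler function. This is a minimal sketch under assumptions, not RunPod's confirmed behavior: it assumes the platform forwards requests hitting the worker's OpenAI-style endpoint as ordinary jobs whose input carries an "openai_route" key (the request path) and an "openai_input" key (the parsed request body), and that the handler itself is responsible for branching on them. The model name and stubbed inference are hypothetical placeholders.

```python
# Sketch of a serverless handler that branches on an "openai_route" key.
# Assumptions (not confirmed by RunPod docs): OpenAI-style requests arrive
# as normal jobs with "openai_route" (path) and "openai_input" (body) in
# the job input; jobs without "openai_route" use the native format.

def handler(job):
    job_input = job.get("input", {})
    route = job_input.get("openai_route")

    if route == "/v1/models":
        # Hypothetical model listing for an exl2 backend.
        return {
            "object": "list",
            "data": [{"id": "my-exl2-model", "object": "model"}],
        }

    if route == "/v1/chat/completions":
        body = job_input.get("openai_input", {})
        messages = body.get("messages", [])
        # ... run exl2 inference on `messages` here; stubbed for the sketch ...
        return {
            "object": "chat.completion",
            "choices": [
                {"index": 0, "message": {"role": "assistant", "content": "stub"}}
            ],
        }

    # No "openai_route": fall back to the plain (non-OpenAI) job format.
    return {"output": "handled native job"}

# In a real worker this would be registered with the RunPod SDK:
# import runpod
# runpod.serverless.start({"handler": handler})
```

The key point of the sketch is that nothing is handled automatically inside the handler itself: the OpenAI compatibility layer is just a convention on the job input that the handler must recognize and answer in the OpenAI response shape.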
RunPod
Created by Netrve on 3/3/2024 in #⛅|pods
Trying to create a Spot GPU instance leads to 400 response error
No description
9 replies