RunPod · 6mo ago
Netrve

How does the vLLM template provide an OAI route?

Hi, so the vLLM template provides an additional OpenAI-compatible route. I'm currently looking into making my own serverless template for exl2, and I wondered how this was achieved: I don't see any description in the documentation of how to set it up, and reading the source doesn't provide much more insight either. If I check for `job.get("openai_route")`, is that handled automatically by the platform, or how would I go about adding it to the handler (or elsewhere)?
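For context, a handler that branches on such a field might look like the minimal sketch below. The `openai_route` and `openai_input` field names, and the dispatch shape, are assumptions inferred from the question, not confirmed platform behavior:

```python
# Hypothetical sketch of a serverless handler that dispatches on an
# "openai_route" field in the job input. Field names are assumptions.

def handle_openai(route: str, payload: dict) -> dict:
    # Translate the OpenAI-compatible request into a backend-specific call.
    if route.endswith("/chat/completions"):
        prompt = payload["messages"][-1]["content"]
        # Stand-in response; a real worker would call the exl2 backend here.
        return {"choices": [{"message": {"role": "assistant",
                                         "content": f"echo: {prompt}"}}]}
    return {"error": f"unsupported route: {route}"}

def handler(job: dict) -> dict:
    inp = job.get("input", {})
    route = inp.get("openai_route")
    if route is not None:
        return handle_openai(route, inp.get("openai_input", {}))
    # Fall back to the plain /run-style input.
    return {"output": f"plain job with prompt: {inp.get('prompt')}"}

# In a real worker this would be handed to the SDK, e.g.:
# import runpod
# runpod.serverless.start({"handler": handler})
```

Whether the platform actually injects those keys (or something else) is exactly the open question here.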
4 Replies
Henky!! · 5mo ago
+1, making a template for KCPP is pointless without this.
NERDDISCO · 5mo ago
@Henky!! @Netrve I was also looking into this, and I'm also not 100% sure how this works in detail. It looks like a combination of code and some things happening on the platform itself. But I will make sure to find this out and then let you both know, okay?
Henky!! · 5mo ago
Alright, because the actual worker part I need would be insanely simplistic. We all need the same thing: a 1:1 mapping of the webserver. No matter if it's an exl2 webserver, a KCPP webserver, or an Ollama webserver, it would work best that way.
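A 1:1 mapping like that could be as simple as forwarding the route and request body to the local webserver unchanged. A sketch of that pass-through, where the `Echo` server is only a local stand-in for a real exl2/KCPP/Ollama backend:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib import request as urlreq

def proxy(base_url: str, route: str, payload: dict) -> dict:
    # Forward the request body unmodified to the local webserver
    # and return its JSON response.
    req = urlreq.Request(
        base_url + route,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urlreq.urlopen(req) as resp:
        return json.loads(resp.read())

# Tiny stand-in backend, just to demonstrate the pass-through locally.
class Echo(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(json.dumps(
            {"path": self.path, "echo": json.loads(body)}).encode())

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), Echo)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_address[1]}"
result = proxy(base, "/v1/completions", {"prompt": "hi"})
server.shutdown()
```

Because nothing backend-specific happens in `proxy`, the same worker shell would serve any OpenAI-compatible webserver.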
NERDDISCO · 5mo ago
I totally agree! It makes everything so much nicer to have an OpenAI-compatible API.