RunPod • #⚡|serverless
OpenAI Serverless Endpoint Docs
Created by 3WaD on 10/11/2024 • 47 replies
Didn't test with longer-running jobs, but good to know it doesn't matter even if it shows as sync.
So even though it shows the request ID as "sync-93450349539", it still processes it async?
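For context, a minimal sketch of the async /run + /status flow next to the blocking /runsync call on a serverless endpoint; the endpoint ID, API key, and payload are placeholders, not taken from this thread:

```python
# Minimal sketch, not from the thread: endpoint ID, API key and payload
# are placeholders. Shows the async /run + /status flow next to /runsync.
import time
import requests

BASE = "https://api.runpod.ai/v2/<ENDPOINT_ID>"
HEADERS = {"Authorization": "Bearer <RUNPOD_API_KEY>"}

# Async path: /run returns a job ID right away, then you poll /status.
job = requests.post(f"{BASE}/run",
                    json={"input": {"prompt": "Hello"}},
                    headers=HEADERS).json()
while True:
    status = requests.get(f"{BASE}/status/{job['id']}", headers=HEADERS).json()
    if status["status"] in ("COMPLETED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

# Sync path: /runsync holds the connection open and returns the result.
result = requests.post(f"{BASE}/runsync",
                       json={"input": {"prompt": "Hello"}},
                       headers=HEADERS).json()
```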
Yes, of course we have tried it. We're asking because it doesn't work the way it should.
I don't understand how there's no documentation about this at all lol
Also, when I do that it seems to default to a sync request (the ID will be like sync-xxxxx) even though I use AsyncOpenAI.
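For reference, a minimal sketch of the client setup being described, assuming the /openai/v1 base path exposed by the vLLM worker template; the endpoint ID, API key, and model name are placeholders:

```python
# Sketch only: endpoint ID, API key and model name are placeholders.
# AsyncOpenAI is pointed at the endpoint's OpenAI-compatible base path.
import asyncio
from openai import AsyncOpenAI

client = AsyncOpenAI(
    base_url="https://api.runpod.ai/v2/<ENDPOINT_ID>/openai/v1",
    api_key="<RUNPOD_API_KEY>",
)

async def main():
    # Even through AsyncOpenAI, the job can still show up with a sync-... ID,
    # which is what the message above is describing.
    resp = await client.chat.completions.create(
        model="<MODEL_NAME>",
        messages=[{"role": "user", "content": "Hello"}],
    )
    print(resp.choices[0].message.content)

asyncio.run(main())
```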
You're referring us to the default vLLM template usage again; we were asking how to do this when you build a custom Docker image with the model baked in.
But then we can't use the OpenAI client.chat.completions.create, right? Since it doesn't wrap the request in the {"input": {...}} format.
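A quick sketch of the mismatch being described here; the field names inside input are only an example of what a custom handler might expect, not anything from the docs:

```python
# Sketch contrasting the two request bodies being discussed; values are
# placeholders. The OpenAI client posts the first shape to its base_url,
# while a plain /run request to the endpoint needs the wrapped shape.

# What client.chat.completions.create sends:
openai_style_body = {
    "model": "<MODEL_NAME>",
    "messages": [{"role": "user", "content": "Hello"}],
}

# What a plain /run request to the endpoint has to look like:
runpod_style_body = {
    "input": {
        "messages": [{"role": "user", "content": "Hello"}],
        # ...whatever the custom handler expects
    }
}
```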
But then that kinda defeats the point. I think the template vLLM worker automatically adds those to the request or something.
It treats it like an OpenAI request if I send a normal request with use_openai_format, openai_route, and openai_input.
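For reference, a sketch of sending such a "normal" request by hand, with the keys mentioned above inside the usual input wrapper; the endpoint ID, route, and body are placeholders based on this discussion, not taken from official docs:

```python
# Sketch based on the keys mentioned in this thread (openai_route,
# openai_input); endpoint ID, API key and body are placeholders.
# "use_openai_format" was also mentioned above; where exactly each key
# belongs is part of what this thread is trying to pin down.
import requests

payload = {
    "input": {
        "openai_route": "/v1/chat/completions",
        "openai_input": {
            "model": "<MODEL_NAME>",
            "messages": [{"role": "user", "content": "Hello"}],
        },
    }
}

resp = requests.post(
    "https://api.runpod.ai/v2/<ENDPOINT_ID>/run",
    json=payload,
    headers={"Authorization": "Bearer <RUNPOD_API_KEY>"},
)
print(resp.json())
```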
It received mine, but it still treats it like a normal request.
When I send a request to it, it still seems to treat it as a 'normal' request. Maybe I need to pass in openai_route myself?
It's still not clear whether custom images with the model baked in can use the OpenAI route or not.
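One way a custom, model-baked-in image could mirror that behaviour is to branch on those keys in its own handler. This is purely a hypothetical sketch using the key names from this thread, not the official worker implementation:

```python
# Hypothetical handler sketch for a custom image. The openai_route /
# openai_input key names come from this thread; run_openai_style and
# run_normal are stand-in functions, not real library calls.
import runpod

def run_openai_style(route, body):
    # Placeholder: call the model with an OpenAI-shaped body.
    return {"route": route, "echo": body}

def run_normal(job_input):
    # Placeholder: handle the plain {"input": {...}} style request.
    return {"echo": job_input}

def handler(job):
    job_input = job["input"]
    if "openai_route" in job_input:
        return run_openai_style(job_input["openai_route"],
                                job_input.get("openai_input", {}))
    return run_normal(job_input)

runpod.serverless.start({"handler": handler})
```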
Did you figure this out?