vLLM + OpenWebUI

Hi guys, has anyone used vLLM as an endpoint in OpenWebUI? I have created a serverless pod, but it won't let me connect from OpenWebUI (running locally). Does anyone know if I have to configure the external port, and how that would be done?
nerdylive · 3w ago
Connect? How? It's best to use the OpenAI API.
DEVIL_EGOX (OP) · 3w ago
It's because, for data confidentiality reasons, I want to use my own endpoint. I assumed that vLLM uses the same configuration as the OpenAI API, which is why I chose this option on Runpod.
nerdylive · 3w ago
Yes, use the OpenAI-compatible API exposed by your endpoint, i.e. RunPod's OpenAI API. Check the RunPod docs for vLLM endpoints for the details.
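As a quick way to verify the endpoint works before wiring it into OpenWebUI, here is a minimal sketch using the openai Python client against a RunPod serverless vLLM endpoint's OpenAI-compatible route. The base-URL pattern, ENDPOINT_ID placeholder, and model name are assumptions for illustration; confirm the exact values in the RunPod docs for vLLM endpoints.

```python
# Minimal sketch (not official docs): query a RunPod serverless vLLM endpoint
# through its OpenAI-compatible route using the openai Python client.
# ENDPOINT_ID, the model name, and the base-URL pattern are placeholders/assumptions;
# check the RunPod vLLM endpoint docs for the exact values for your deployment.
from openai import OpenAI

client = OpenAI(
    # Assumed base-URL pattern for RunPod's OpenAI-compatible route on a
    # serverless vLLM worker; replace ENDPOINT_ID with your endpoint's ID.
    base_url="https://api.runpod.ai/v2/ENDPOINT_ID/openai/v1",
    api_key="YOUR_RUNPOD_API_KEY",  # your RunPod API key, not an OpenAI key
)

response = client.chat.completions.create(
    model="YOUR_MODEL_NAME",  # the model the vLLM worker was deployed with
    messages=[{"role": "user", "content": "Hello from an OpenWebUI connectivity test"}],
)
print(response.choices[0].message.content)
```

If that works, the same two values go into OpenWebUI: add an OpenAI-compatible connection with that base URL and your RunPod API key. Since the serverless endpoint is reached over HTTPS through RunPod's API gateway, you shouldn't need to open or configure any extra external port yourself.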