vLLM + OpenWebUI
Hi guys, has anyone used vLLM as an endpoint in OpenWebUI? I've created a serverless pod, but it won't let me connect from OpenWebUI (running locally). Does anyone know whether I need to configure the external port, and if so, how?
Connect how?
It's best to use the OpenAI API.
For data confidentiality reasons I want to use my own endpoint. I assumed vLLM exposes the same API as OpenAI, which is why I chose that option on RunPod.
Yes, use the OpenAI-compatible API that your endpoint exposes.
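Right, vLLM's server speaks the OpenAI API, so any OpenAI-compatible client can talk to it. A minimal sanity check with the official `openai` Python client, assuming the base URL, key, and model below are placeholders you swap for your own deployment:

```python
# Smoke-test an OpenAI-compatible vLLM endpoint before wiring it into OpenWebUI.
# base_url and api_key are placeholders for your own deployment.
from openai import OpenAI

client = OpenAI(
    base_url="https://your-endpoint.example.com/v1",  # vLLM's OpenAI-compatible route
    api_key="sk-placeholder",  # vLLM ignores the key unless --api-key was set
)

# Ask the server what it is serving, then run a tiny chat completion.
model_id = client.models.list().data[0].id
print("Serving:", model_id)

reply = client.chat.completions.create(
    model=model_id,
    messages=[{"role": "user", "content": "ping"}],
    max_tokens=8,
)
print(reply.choices[0].message.content)
```

If this returns a reply, the endpoint itself is fine and the issue is on the OpenWebUI connection side.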
Use RunPod's OpenAI-compatible API; check the RunPod docs for vLLM endpoints.
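For reference, here's roughly how I'd check a RunPod serverless vLLM worker through its OpenAI-compatible route. The URL shape and the `/openai/v1` path are assumptions from memory of the RunPod docs, so confirm them there. You don't open an external port yourself: requests go through api.runpod.ai with your RunPod API key, and that same base URL and key are what you enter in OpenWebUI as an OpenAI API connection (or via the OPENAI_API_BASE_URL / OPENAI_API_KEY environment variables).

```python
# Reachability check for a RunPod serverless vLLM endpoint.
# ENDPOINT_ID is a hypothetical placeholder; the /openai/v1 path is an assumption
# to verify against the RunPod vLLM worker docs.
import os
import requests

ENDPOINT_ID = "your-endpoint-id"
RUNPOD_API_KEY = os.environ["RUNPOD_API_KEY"]  # your RunPod API key

base_url = f"https://api.runpod.ai/v2/{ENDPOINT_ID}/openai/v1"
resp = requests.get(
    f"{base_url}/models",
    headers={"Authorization": f"Bearer {RUNPOD_API_KEY}"},
    timeout=60,
)
resp.raise_for_status()
print(resp.json())  # lists the served model(s); use base_url + key in OpenWebUI
```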