RunPod
•Created by DEVIL_EGOX on 12/5/2024 in #⚡|serverless
vLLM + Open WebUI
It's because, for data confidentiality reasons, I want to use my own endpoint. I assumed that vLLM exposes the same API as the OpenAI API, which is why I chose this option on RunPod.
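A minimal sketch of what "same API as OpenAI" means in practice: a self-hosted vLLM server (or RunPod's serverless vLLM worker) accepts the same `/v1/chat/completions` requests the OpenAI API does, so any OpenAI-style client can be pointed at your own base URL. The endpoint URL, API key, and model name below are hypothetical placeholders, not values from this thread.

```python
import json
import urllib.request

def build_chat_request(base_url: str, api_key: str, model: str,
                       messages: list) -> urllib.request.Request:
    """Build an OpenAI-compatible chat-completions request for a
    self-hosted (e.g. vLLM) endpoint. Only the base URL differs from
    calling the official OpenAI API."""
    body = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return urllib.request.Request(
        url=base_url.rstrip("/") + "/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

# Hypothetical endpoint and credentials, for illustration only.
req = build_chat_request(
    "https://my-endpoint.example.com/v1",
    "sk-example",
    "my-model",
    [{"role": "user", "content": "Hello"}],
)
# Sending it with urllib.request.urlopen(req) would return the usual
# OpenAI-style JSON completion from your own server.
```

Open WebUI can be wired to the same endpoint by setting its OpenAI API base URL and key to your own values, which is the usual way to keep prompts and completions on infrastructure you control.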
6 replies