DEVIL_EGOX
RunPod
•Created by DEVIL_EGOX on 2/4/2025 in #⚡|serverless
Deployed deepseek-ai/DeepSeek-R1-Distill-Llama-8B on Serverless
It answers better now, but it's mixing languages in its answers. Even the GPU
2 replies
RunPod
•Created by DEVIL_EGOX on 12/5/2024 in #⚡|serverless
vllm +openwebui
I solved it, it was only the api key that was missing.
44 replies
RunPod
•Created by DEVIL_EGOX on 12/5/2024 in #⚡|serverless
vllm +openwebui
thank you very much
44 replies
RunPod
•Created by DEVIL_EGOX on 12/5/2024 in #⚡|serverless
vllm +openwebui
If I don't pass the API key directly, should I declare it in the environment variables configuration? Would it be something like this: (API_KEY = XXXXXX)?
44 replies
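The question above asks whether the API key should go into the endpoint's environment variables. As a minimal sketch (the `API_KEY` variable name is the asker's own guess, not a confirmed RunPod setting), the key would typically be read from the environment and sent as a Bearer token on each OpenAI-style request:

```python
import os
import urllib.request

# Hypothetical env var name, taken from the question above; RunPod's actual
# expected variable may differ.
api_key = os.environ.get("API_KEY", "XXXXXX")

def build_request(base_url: str, api_key: str) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-style chat request with the
    API key passed as a Bearer token in the Authorization header."""
    return urllib.request.Request(
        base_url + "chat/completions",
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_request("https://api.runpod.ai/v2/a2auhmx8h7iu3x/openai/v1/", api_key)
print(req.full_url)
```

If the key is missing or wrong, an OpenAI-compatible endpoint would normally reject the request with 401, which matches the "it was only the api key that was missing" resolution above.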
RunPod
•Created by DEVIL_EGOX on 12/5/2024 in #⚡|serverless
vllm +openwebui
@nerdylive Maybe I'm misconfiguring the endpoint.
44 replies
RunPod
•Created by DEVIL_EGOX on 12/5/2024 in #⚡|serverless
vllm +openwebui
@nerdylive Any suggestions, please?
44 replies
RunPod
•Created by DEVIL_EGOX on 12/5/2024 in #⚡|serverless
vllm +openwebui
Hi guys, again: I've tried using the address as mentioned (https://api.runpod.ai/v2/a2auhmx8h7iu3x/openai/v1/) but I still can't connect. Help me, please 🥲
44 replies
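The URL in the message above follows the `https://api.runpod.ai/v2/{endpoint_id}/openai/v1/` pattern for RunPod's OpenAI-compatible serverless route. A tiny sketch, assuming that pattern, for assembling the base URL from an endpoint ID (useful for checking the ID wasn't mistyped):

```python
# Minimal sketch: build the OpenAI-compatible base URL for a RunPod
# serverless endpoint, assuming the /v2/{endpoint_id}/openai/v1/ route
# pattern seen in the message above.
def openai_base_url(endpoint_id: str) -> str:
    return f"https://api.runpod.ai/v2/{endpoint_id}/openai/v1/"

print(openai_base_url("a2auhmx8h7iu3x"))
```

This is the value a client like Open WebUI would take as its "OpenAI API base URL", paired with the API key from the earlier messages.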
RunPod
•Created by DEVIL_EGOX on 12/5/2024 in #⚡|serverless
vllm +openwebui
It's because, for data confidentiality reasons, I want to use my own endpoint. I assumed that vLLM uses the same configuration as the OpenAI API, which is why I chose this option on Runpod.
44 replies