RunPod•4mo ago
peasoup

Open-WebUI 404 Error

When using the Better Ollama CUDA 12 template and following the instructions found here: blog.runpod.io/run-llama-3-1-405b-with-ollama-a-step-by-step-guide, I get an error when posting a query through open-webui: Ollama: 404, message='Not Found', url='https://<snip>-11434.proxy.runpod.net/api/chat'. Interestingly enough, swapping the open-webui localhost URL for the URL above works fine with cURL in network diagnostics. I wanted to replicate the issue on a less expensive server, but can no longer find the template.
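For context, the kind of cURL check described above would look roughly like this; the pod ID is redacted in the original post, so `<pod-id>` is a placeholder, and the model tag is assumed from the linked Llama 3.1 405B guide:

```bash
# Hypothetical reproduction of the cURL check from the question.
# <pod-id> stands in for the redacted RunPod pod ID; the model tag is an assumption.
curl https://<pod-id>-11434.proxy.runpod.net/api/chat \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama3.1:405b",
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": false
  }'
```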
4 Replies
Madiator2011 (Work)•4mo ago
Just use https://<snip>-11434.proxy.runpod.net, no need for /api/chat
peasoup (OP)•4mo ago
Thank you! I'm not sure how to change that parameter, though, since everything is at its defaults.
Madiator2011 (Work)•4mo ago
In the web UI you can set OLLAMA_URL or something like that. You can also set it in the admin panel of Open WebUI.
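A rough sketch of what that could look like when launching Open WebUI with Docker; the variable name OLLAMA_BASE_URL and the image tag come from Open WebUI's own setup docs rather than this thread, and `<pod-id>` is a placeholder for the redacted pod ID:

```bash
# Point Open WebUI at the RunPod proxy URL instead of localhost.
# Assumes the env var is OLLAMA_BASE_URL and the official Open WebUI image is used.
docker run -d -p 3000:8080 \
  -e OLLAMA_BASE_URL=https://<pod-id>-11434.proxy.runpod.net \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```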
peasoup (OP)•4mo ago
Don't know what I did to get it to work, aside from shutting the machine down and restarting the Docker container, but it's working now. Thank you for your help 🙂