Open-WebUI 404 Error
When using the Better Ollama CUDA 12 template and following the instructions found here: blog.runpod.io/run-llama-3-1-405b-with-ollama-a-step-by-step-guide, I get an error when posting a query through open-webui: Ollama: 404, message='Not Found', url='https://<snip>-11434.proxy.runpod.net/api/chat'
Interestingly enough, replacing the open-webui localhost URL with the URL above and querying it directly with cURL (as a network diagnostic) works fine.
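For anyone hitting the same thing, a quick way to check whether the pod's Ollama endpoint itself is reachable is a pair of cURL probes; this is a sketch with a placeholder pod ID (the <pod-id> and the model name are assumptions, substitute your own):

```shell
# Placeholder: replace <pod-id> with your actual RunPod pod ID
BASE_URL="https://<pod-id>-11434.proxy.runpod.net"

# 1) Root endpoint: a healthy Ollama instance replies "Ollama is running"
curl -s --max-time 10 "$BASE_URL" || echo "root check failed"

# 2) The chat endpoint Open WebUI calls under the hood (model name is an example)
curl -s --max-time 30 "$BASE_URL/api/chat" \
  -d '{"model": "llama3.1", "messages": [{"role": "user", "content": "hi"}], "stream": false}' \
  || echo "chat check failed"
```

If the root check succeeds but Open WebUI still returns a 404, the problem is usually the URL configured inside Open WebUI rather than the Ollama server itself.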
I wanted to replicate the issue on a less expensive server, but I can no longer find the template.
Just use
https://<snip>-11434.proxy.runpod.net
no need for /api/chat
Thank you! I'm not sure how to change that parameter, though, as everything is at its defaults.
In Open WebUI you can set the OLLAMA_BASE_URL environment variable.
You can also set it in the admin panel of Open WebUI.
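If you're launching Open WebUI yourself, the environment-variable route looks roughly like this; a sketch assuming the official Open WebUI image and a placeholder pod ID (note the value is the base URL only, with no /api/chat suffix):

```shell
# Hedged sketch: point Open WebUI at the pod's Ollama proxy.
# <pod-id> is a placeholder; image/ports follow the Open WebUI docs.
docker run -d -p 3000:8080 \
  -e OLLAMA_BASE_URL="https://<pod-id>-11434.proxy.runpod.net" \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

Open WebUI appends the API paths (like /api/chat) itself, which is why including them in the configured URL produces the 404.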
I don't know what I did to get it to work, aside from shutting the machine down and restarting the Docker container, but it's working now. Thank you for your help 🙂