Open-WebUI 404 Error
When using the Better Ollama CUDA 12 template and following the instructions at blog.runpod.io/run-llama-3-1-405b-with-ollama-a-step-by-step-guide, I get an error whenever I post a query through open-webui: Ollama: 404, message='Not Found', url='https://<snip>-11434.proxy.runpod.net/api/chat'
Interestingly enough, as a network diagnostic I swapped the localhost URL open-webui was using for the proxy URL above and hit it directly with cURL, and it responds fine (roughly the command below).
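For reference, this is approximately the diagnostic request I ran against the pod's Ollama proxy. The hostname is snipped and the model tag is just whatever the blog's steps pulled, so treat both as placeholders:

    curl https://<snip>-11434.proxy.runpod.net/api/chat \
      -H "Content-Type: application/json" \
      -d '{
        "model": "llama3.1:405b",
        "messages": [{ "role": "user", "content": "Hello" }]
      }'

That returns a normal streamed chat response, so Ollama itself seems reachable at that URL; only the request coming out of open-webui ends in the 404.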
I wanted to replicate the issue on a less expensive server, but I can no longer find the template.

