Podman: OpenWebUI cannot connect to Ollama

I followed this guide: https://inhumanloop.com/tutorial/ai/2025/01/17/corp-friendly-llm.html

OpenWebUI itself is up and working, but it fails to connect to Ollama.
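For reference, I started the container roughly like this (paraphrased from memory; the image tag, port mapping, and volume name are my assumptions, not copied verbatim from the guide):

    podman run -d -p 3000:8080 \
      -v open-webui:/app/backend/data \
      --name open-webui \
      ghcr.io/open-webui/open-webui:main

The OpenWebUI log then shows this connection error: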
2025-08-29 16:35:17.019 | ERROR    | open_webui.routers.ollama:send_get_request:106 - Connection error: Cannot connect to host localhost:11434 ssl:default [Multiple exceptions: [Errno 111] Connect call failed ('::1', 11434, 0, 0), [Errno 111] Connect call failed ('127.0.0.1', 11434)]
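My suspicion: inside a Podman container, localhost refers to the container itself (and on macOS the container actually runs inside the podman machine VM), so nothing is listening on 11434 there. Here is a sketch of the workaround I am considering, assuming Ollama runs natively on the Mac and that Podman's host.containers.internal alias reaches the host:

    # Point OpenWebUI at the host instead of the container's own loopback
    podman run -d -p 3000:8080 \
      -e OLLAMA_BASE_URL=http://host.containers.internal:11434 \
      -v open-webui:/app/backend/data \
      --name open-webui \
      ghcr.io/open-webui/open-webui:main

    # Ollama binds to 127.0.0.1 by default; my assumption is it needs to
    # listen on all interfaces for the podman machine VM to reach it:
    OLLAMA_HOST=0.0.0.0 ollama serve

Is this the right approach, or is there a Podman-specific networking step I am missing?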
For context, the guide (from the "A Human in the Loop" blog) describes the setup as: "While setting up local LLMs has become increasingly popular, many of us face restrictions on corporate laptops that prevent using Docker. Here's how I successfully set up OpenWebUI using Podman on my IBM-issued MacBook, creating a secure and IT-compliant local AI environment."