Podman: Open WebUI cannot connect to Ollama
I followed this guide: https://inhumanloop.com/tutorial/ai/2025/01/17/corp-friendly-llm.html
Open WebUI itself is up and running, but it fails to connect to Ollama.
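For reference, this is roughly the kind of command the guide describes for starting Open WebUI under Podman. The container name, host port, and OLLAMA_BASE_URL value below are illustrative assumptions, not necessarily my exact invocation:

```
# Start Open WebUI in Podman, pointing it at an Ollama server running on the host.
# host.containers.internal is Podman's hostname for the host machine;
# 11434 is Ollama's default port. Adjust both if your setup differs.
podman run -d \
  --name open-webui \
  -p 3000:8080 \
  -e OLLAMA_BASE_URL=http://host.containers.internal:11434 \
  ghcr.io/open-webui/open-webui:main
```

With a setup like this, the UI loads fine, but the connection to Ollama fails and no models show up.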
From the guide ("A Human in the Loop"):

"While setting up local LLMs has become increasingly popular, many of us face restrictions on corporate laptops that prevent using Docker. Here's how I successfully set up OpenWebUI using Podman on my IBM-issued MacBook, creating a secure and IT-compliant local AI environment."
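To narrow things down, these are the kinds of checks I have been running. They assume Ollama's default port of 11434 and the container name from the sketch above ("open-webui"), and that the image's bundled Python is on PATH:

```
# 1. On the host: confirm Ollama is running and listening on its default port.
curl http://localhost:11434/api/version

# 2. From inside the container: check whether the host is reachable via
#    Podman's host alias. Uses the image's Python since curl may not be
#    installed inside the container.
podman exec open-webui python3 -c \
  "import urllib.request; print(urllib.request.urlopen('http://host.containers.internal:11434/api/version').read())"
```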