RunPod Deploy Streamlit App
Hello folks, I have a Streamlit app and I want to use it from anywhere, like a website. I use the Whisper model for STT and the Gemma 3 LLM locally with Ollama. I use these models through LangChain and have a web UI built with Streamlit. This app is a prototype. What I want to do is serve this app and show it to some people. How do I do that? What should I use? I can be more specific if you need more detail.
11 Replies
I think I should dockerize the app and serve it
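For reference, a minimal sketch of what that Dockerfile might look like. This assumes the entry point is `app.py` and dependencies are in `requirements.txt` (both hypothetical names), and it only containerizes the Streamlit app itself; Ollama would need to run as a separate service (or be installed into the image) and be reachable from the app:

```dockerfile
# Sketch only; app.py and requirements.txt are assumed names.
FROM python:3.11-slim

WORKDIR /app

# Install Python dependencies first so this layer caches across code changes
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# Streamlit's default port
EXPOSE 8501

# Bind to 0.0.0.0 so the app is reachable from outside the container
CMD ["streamlit", "run", "app.py", "--server.address=0.0.0.0", "--server.port=8501"]
```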
yes, you can do that in Pods, or host the app in a CPU pod and then use serverless GPUs
this is simpler and a good option
Thanks bro, this is my first time, I'll do it 🙂
this should help
@Jason @riverfog7 hey folks, sorry for the disturbance. I made a Docker container and deployed it to a pod, and I can access it over HTTP, but the Streamlit UI is stuck at loading. It works locally but not on RunPod. Where should I look?

No need to apologize; btw, you pinged people.
I'm not sure why that is; maybe start with your browser's console and network tab and see what's failing.
Then after that, also check the logs of your Streamlit app in RunPod if needed.
Yes, sorry for that
I think it was because of CORS and XSRF protection. When I add these flags, it starts:
--server.enableCORS=false --server.enableXsrfProtection=false
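Those same settings can also live in the app's Streamlit config file, so they don't have to be passed on every run. A sketch using standard Streamlit `server` config keys (disabling XSRF protection is fine for a prototype, but worth reconsidering for anything public):

```toml
# .streamlit/config.toml
[server]
enableCORS = false
enableXsrfProtection = false
headless = true
port = 8501
address = "0.0.0.0"
```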
ohh i see
it works now?
Building the latest version, will update soon
You know, if you want a quick test rather than building a new image for every change, you can use the Jupyter or web terminal on RunPod directly and install things there quickly.
But it's temporary: files you add will be lost on pod stop/termination
*by using the PyTorch template