Is it possible to run OpenWebUI on a pod?
I asked the AI bot and it said yes, but Google, Claude, and the document the bot referenced don't actually specify that it can run on a pod. I need more RAM for my RAG use case, and it would be much cheaper and more efficient to host it all on RunPod, but I don't know if I can. I'd need to use OpenWebUI; I did a personal test and it worked, but now I need to upload 13k documents to it as well, and my PC isn't beefy enough for that.
@Jakob You can probably zip your files and scp them over. How many GB is it? Or, however you got the documents, have your pod download them directly from the source.
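Roughly something like this (just a sketch; the host, port, key, and paths below are placeholders, so take the real SSH connection details from your pod's connect info):
```bash
# On your local machine: bundle the scraped documents into a single archive
tar czf docs.tar.gz ./documents

# Copy it to the pod over SSH; replace the port, key, and host with your pod's values
scp -P <pod-ssh-port> -i ~/.ssh/id_ed25519 docs.tar.gz root@<pod-host>:/workspace/
```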
I got the documents with a custom Python scraper script. I was thinking about running it on the pod directly, but the whole set is still under 1 GB. The issue is the sheer number of files: when I attempted to upload them to OpenWebUI's knowledge base it kept bottlenecking my PC, and by my estimate the full upload would take 5+ hours. I imagine it would be significantly quicker on better hardware.
If I can use RunPod to prove the idea and then invest in better personal hardware, it would be worth it. I mostly just want to use it as a proof of concept.
I think making a tarball and uploading it will be significantly quicker
Assuming you have access to a shell for the server
1 GB should upload in a matter of minutes
And try to avoid untarring onto a network volume
It's really slow
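Something like this on the pod itself (a rough sketch; which mount is the network volume and which is the container's local disk depends on your pod setup, so check with `df -h` first):
```bash
# On the pod: extract to container-local disk instead of the network volume
mkdir -p /root/docs
tar xzf /workspace/docs.tar.gz -C /root/docs
```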
What ^ said. I think your best bet is to tarball/zip it, copy it to your pod, and unzip it there.