How to run Ollama on RunPod Serverless?

As the title suggests, I'm trying to find a way to deploy Ollama on RunPod as a serverless application. Thank you
Solution
Ollama lets you override where models get downloaded (via the OLLAMA_MODELS environment variable). So you create a network volume and attach it to your serverless endpoint; on serverless workers it gets mounted at /runpod-volume. Point the model directory there so downloaded models persist across workers.

Then have a background script start the Ollama server when the worker boots, and from there you can do whatever you want. Overall it's a bit of a pain.
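The flow above could be sketched as a worker start script like the following. This is only a sketch: the script name, the `handler.py` hand-off, and the `llama3` model are illustrative assumptions, not part of the original answer. `OLLAMA_MODELS` and the default API port 11434 are standard Ollama settings, and `/runpod-volume` is the serverless mount point mentioned above.

```shell
#!/bin/sh
# start.sh - hypothetical worker startup script for Ollama on RunPod Serverless.

# Store models on the network volume (mounted at /runpod-volume on
# serverless workers) so they persist across workers.
export OLLAMA_MODELS=/runpod-volume/ollama/models
mkdir -p "$OLLAMA_MODELS"

# Start the Ollama server in the background.
ollama serve &

# Wait until the Ollama API responds on its default port before continuing.
until curl -sf http://127.0.0.1:11434/api/tags > /dev/null; do
  sleep 1
done

# Pull a model if it is not already on the volume (model name is an example).
ollama pull llama3

# Hand off to the serverless handler (placeholder command; your entrypoint
# will differ depending on how you wrote your RunPod handler).
exec python3 handler.py
```

The key design point is that the network volume only pays the model-download cost once; subsequent cold starts reuse the files under `/runpod-volume`.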