How to override the ollama/ollama image to run a model at startup
Hi, I'm trying to run pods using the Ollama template (ollama/ollama) and want to override the default start command so that the model I want is already being served when the pod is created.
I tried putting `./bin/ollama serve && ollama run llama3.1:8b` into the "container start command" field, but it doesn't work. Is there any way to do this? Thanks!
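For reference, below is roughly what I'm entering, plus a workaround I'm considering. My guess is that `ollama serve` stays in the foreground, so the command after `&&` never runs; the second variant assumes the field accepts a shell wrapper and that a short sleep is enough for the server to come up, which I haven't confirmed.

```sh
# Current "container start command" override (doesn't behave as expected):
./bin/ollama serve && ollama run llama3.1:8b

# Possible workaround (a sketch, not tested): start the server in the
# background, give it a few seconds, load the model, then keep the
# container alive by waiting on the server process.
bash -c "./bin/ollama serve & sleep 5; ollama run llama3.1:8b; wait"
```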