Ollama stopped using GPU

I installed Ollama on a 3090 pod as usual, following this tutorial: https://docs.runpod.io/tutorials/pods/run-ollama#step-4-interact-with-ollama-via-http-api. But now everything runs very slowly, and GPU Memory Used stays at zero. What could be the reason?
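Before digging further, it may help to confirm whether the Ollama process inside the pod can see the GPU at all. A minimal diagnostic sketch (assumptions: `nvidia-smi` ships with the pod image, and the Ollama build is recent enough to support `ollama ps`):

```shell
# Check driver visibility from inside the container: memory.used stuck at
# 0 MiB while a model is loaded suggests Ollama fell back to CPU inference.
if command -v nvidia-smi >/dev/null 2>&1; then
  nvidia-smi --query-gpu=name,memory.used --format=csv
else
  echo "nvidia-smi not found: the container cannot see the GPU driver"
fi

# Recent Ollama builds report where each loaded model runs; a PROCESSOR
# column reading "100% CPU" confirms the CPU fallback.
{ command -v ollama >/dev/null 2>&1 && ollama ps; } || true
```

If `nvidia-smi` works on the host but not inside the container, the pod was likely started without GPU passthrough.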
19 Replies
Madiator2011 (Work)
Yeah, that's the annoying thing about third-party tools: one update can make a guide stop working. From what I tested, their official Docker image works fine.
Unknown User · 17mo ago
Message Not Public
PavelDonchenko (OP) · 17mo ago
Yes, there is an Ollama template that uses the Docker image, and it works fine.
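For anyone landing here later: the official-image route mentioned above looks roughly like this (a sketch based on the `ollama/ollama` Docker Hub instructions; the model name is just an example, and it assumes the host's Docker runtime exposes NVIDIA GPUs via `--gpus`):

```shell
# Start the official Ollama image with all GPUs exposed to the container,
# persisting downloaded models in a named volume and publishing the API port
docker run -d --gpus=all \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  --name ollama \
  ollama/ollama

# Pull and chat with a model inside the running container
docker exec -it ollama ollama run llama3
```

With the container up, the HTTP API from the tutorial is reachable on port 11434 as before.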
HaciricaH · 17mo ago
I'm having the same issue - it's completely broken my workflow. I've burned about 14 hours trying to get this to work, so this happened sometime in the last 24 hours, and there's still no resolution.
Unknown User · 17mo ago
Message Not Public
HaciricaH · 17mo ago
An email with a 1 to 2 day response? Given the premium RunPod charges, I'll pass on this kind of service and move to a competitor today. This is crazy. Thanks for the suggestion though.
Unknown User · 17mo ago
Message Not Public
HaciricaH · 17mo ago
Unfortunately, I've already bought a bunch of credits. Sunk cost. You can't just break your top template and not fix it as a priority. That's amateur service at best.
Unknown User · 17mo ago
Message Not Public
HaciricaH · 17mo ago
Will do, thank you.
Unknown User · 17mo ago
Message Not Public
HaciricaH · 17mo ago
Haha - their Submit a Request button and the RunPod Support Bot are broken. Huge red flag.
Unknown User · 17mo ago
Message Not Public
HaciricaH · 17mo ago
I tried multiple browsers and two different networks, same outcome. It just does nothing.
Unknown User · 17mo ago
Message Not Public
HaciricaH · 17mo ago
You're too kind, thank you. Give it a try - I'm curious if it works for you!
Unknown User · 17mo ago
Message Not Public
jcmohed · 17mo ago
Hey guys, I'm running into the same problem. Have you found a solution, or are you using the Docker template instead? @nerdylive @PavelDonchenko
Madiator2011 · 17mo ago
@jcmohed Give my template a try: #Better Ollama - CUDA12