Ollama stopped using GPU
I installed Ollama on a pod as usual on a 3090, following this tutorial: https://docs.runpod.io/tutorials/pods/run-ollama#step-4-interact-with-ollama-via-http-api. But now everything runs very slowly, and GPU Memory Used stays at zero. What could be the reason?
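For anyone hitting the same thing, a quick way to confirm whether Ollama is actually using the GPU (assuming a standard pod terminal; these are stock NVIDIA/Ollama commands, not RunPod-specific):

```bash
# Confirm the driver can see the 3090 at all; "GPU Memory Used" should
# climb here while a model is loaded
nvidia-smi

# Ask Ollama where it placed the loaded model; the PROCESSOR column
# should read "100% GPU" - "100% CPU" means it fell back to CPU inference
ollama ps
```

If `nvidia-smi` fails or `ollama ps` reports CPU, the install has likely lost access to CUDA.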
19 Replies
Yeah, that's the annoying thing about third-party tools: one update can break a guide.
From what I tested, the official Docker image works fine.
Unknown User•17mo ago
Message Not Public
Yes, there is an Ollama template that uses the Docker image, and it works fine
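If you'd rather run it yourself than go through a RunPod template, the official image is started roughly like this (based on the Ollama Docker instructions; assumes Docker and the NVIDIA Container Toolkit are available, and `llama3` is just an example model):

```bash
# Run the official image with GPU access and the API exposed on 11434
docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 \
  --name ollama ollama/ollama

# Pull and chat with a model inside the container
docker exec -it ollama ollama run llama3
```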
I am having the same issue - it's completely broken my workflow
I've burned about 14 hours trying to get this to work, so this happened sometime in the last 24 hours, with no resolution
Unknown User•17mo ago
Message Not Public
As in an email with a 1-2 day response time?
Given the premium RunPod charges, I'll pass on this kind of service and move to a competitor today. This is crazy.
Thanks for the suggestion though.
Unknown User•17mo ago
Message Not Public
I unfortunately bought a bunch of credits. Sunk cost.
You can't just break your top template and not fix it as a priority.
That's amateur service at best.
Unknown User•17mo ago
Message Not Public
Will do, thank you.
Unknown User•17mo ago
Message Not Public
Haha - their Submit a Request button and the RunPod Support Bot are both broken
Huge red flag
Unknown User•17mo ago
Message Not Public
I tried multiple browsers and two different networks, same outcome
It just does nothing
Unknown User•17mo ago
Message Not Public
You're too kind, thank you. Give it a try - I'm curious if it works for you!
Unknown User•17mo ago
Message Not Public
Hey guys, I am running into the same problem. Have you found a solution, or are you using the Docker template instead? @nerdylive @PavelDonchenko
@jcmohed Give my template a try: #Better Ollama - CUDA12