Ollama stopped using GPU - Runpod
Runpod • 2y ago • 33 replies
PavelDonchenko
Ollama stopped using GPU

I installed Ollama on a pod as usual on a 3090, following this tutorial: https://docs.runpod.io/tutorials/pods/run-ollama#step-4-interact-with-ollama-via-http-api. But now everything runs very slowly, and GPU Memory Used is always at zero. What could be the reason?
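When Ollama falls back to CPU, it usually means the server did not detect a usable CUDA runtime at startup. A quick way to confirm what the pod actually sees (a sketch, assuming the standard Ollama CLI and NVIDIA tooling that a RunPod GPU Pod image provides; the model name and log path below are illustrative assumptions):

```shell
# 1. Confirm the driver and the 3090 are visible inside the pod
nvidia-smi

# 2. Load a model and check where it is running; the PROCESSOR column
#    of `ollama ps` reads "100% GPU" when CUDA offload is working
ollama run llama3 "hi" >/dev/null 2>&1
ollama ps

# 3. Look for GPU/CUDA detection messages in the server output
#    (log location depends on how `ollama serve` was started; this
#    path is only an example)
grep -i "gpu\|cuda" /var/log/ollama.log 2>/dev/null || true
```

If `nvidia-smi` shows the GPU but `ollama ps` reports CPU, restarting `ollama serve` so it re-detects the GPU (or reinstalling with the official install script) is a common next step.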
Set up Ollama on your GPU Pod | RunPod Documentation
Learn how to set up Ollama, a powerful language model, on a GPU Pod using RunPod, and interact with it through HTTP API requests, allowing you to harness the power of GPU acceleration for your AI projects.
Similar Threads
Stoped Pod Price · Runpod / ⛅|pods · 2y ago
Pod using CPU instead of GPU · Runpod / ⛅|pods · 3y ago
Ollama API · Runpod / ⛅|pods · 3y ago
No CUDA GPU available after not using GPU for a while · Runpod / ⛅|pods · 2y ago