Out of memory errors on 48gb gpu which didn't happen before
Runpod • 2y ago • 1 reply
Jidovenok
Some requests fail due to OOM, but the endpoint uses a 48 GB GPU and is definitely capable of processing these requests.
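For reference, a minimal sketch of one way to confirm what the worker actually sees at request time, assuming a PyTorch-based serverless handler; the handler name and log format are illustrative, not taken from the thread:

```python
import torch

def handler(job):
    # Log free vs. total memory on the GPU this worker was given,
    # to confirm the job is really landing on a 48 GB card.
    free, total = torch.cuda.mem_get_info()  # bytes on the current device
    print(f"GPU memory: {free / 1e9:.1f} GB free / {total / 1e9:.1f} GB total")

    # ... run the actual inference for `job` here ...

    # Optionally release cached allocator blocks between requests so a
    # previous job's allocations don't push the next one over the limit.
    torch.cuda.empty_cache()
    return {"free_gb": round(free / 1e9, 1), "total_gb": round(total / 1e9, 1)}
```

If the logged total is well under 48 GB, or the free amount shrinks across requests, that would point to the worker landing on a smaller card or to memory not being released between jobs.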
Similar Threads
CUDA out of memory (80GB GPU) • Runpod / ⚡|serverless • 2y ago
out of memory error • Runpod / ⚡|serverless • 2y ago
GPU memory issue • Runpod / ⚡|serverless • 2y ago
The memory is too small; 31GB of RAM is prone to "Out of Memory" errors • Runpod / ⚡|serverless • 3w ago