Run LLM Model on Runpod Serverless - Runpod
Runpod • 2y ago • 49 replies

TumbleWeed
Hi there,

I have an LLM model built into a Docker image, and the resulting image is 40GB+. I'm wondering: can I mount the model as a volume instead of adding it to the Docker image?

Thanks!
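Yes, that pattern generally works: keep the model weights on a network volume and have the worker read them at startup, so the image stays small. Below is a minimal sketch of the idea. It assumes a Runpod serverless network volume mounted at `/runpod-volume` (the mount point Runpod uses for serverless workers) and a hypothetical model directory name `my-llm`; adjust both to your setup.

```python
import os

def resolve_model_path(volume_root="/runpod-volume",
                       image_root="/models",
                       model_dir="my-llm"):
    """Prefer the model on the mounted network volume; fall back to a
    copy baked into the image. Both directory names are illustrative."""
    volume_path = os.path.join(volume_root, model_dir)
    if os.path.isdir(volume_path):
        return volume_path  # weights live on the attached volume
    return os.path.join(image_root, model_dir)  # fallback: inside the image

# At worker startup, load from whichever location exists, e.g.:
#   model = AutoModelForCausalLM.from_pretrained(resolve_model_path())
print(resolve_model_path())
```

With this layout the Docker image only needs the runtime and your handler code; swapping or updating the 40GB of weights then means updating the volume, not rebuilding and re-pushing the image.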