how to host 20gb models + fastapi code on serverless - Runpod
Runpod • 2y ago • 28 replies
RK
how to host 20gb models + fastapi code on serverless
I have 20 GB of model files and FastAPI pipeline code that performs preprocessing, inference, and training. How can I use Runpod serverless for this?
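For context, the shape of a Runpod serverless worker is a handler function registered with the `runpod` Python SDK. Below is a minimal sketch, not the thread's answer: it assumes the ~20 GB weights are either baked into the worker's Docker image or mounted on a network volume at `MODEL_PATH`, and `load_model`, `preprocess`, and `infer` are hypothetical stand-ins for the existing FastAPI pipeline code, not Runpod APIs.

```python
# Minimal Runpod serverless handler sketch.
# Assumptions: MODEL_PATH points at the ~20 GB weights (network volume or
# baked into the image); load_model / preprocess / infer are hypothetical
# stand-ins for the existing FastAPI pipeline functions.
import runpod

MODEL_PATH = "/runpod-volume/model"  # assumed mount point of the weights


def load_model(path: str):
    """Hypothetical loader: read the large weights once per worker."""
    # Loading at module level means the cost is paid once at cold start,
    # not on every request.
    return {"path": path}  # stand-in for the real model object


def preprocess(payload: dict) -> dict:
    """Hypothetical stand-in for the FastAPI preprocessing step."""
    return payload


def infer(model, features: dict) -> dict:
    """Hypothetical stand-in for the FastAPI inference step."""
    return {"echo": features, "model_path": model["path"]}


MODEL = load_model(MODEL_PATH)


def handler(job: dict) -> dict:
    # Runpod passes the JSON body sent to the endpoint as job["input"].
    features = preprocess(job["input"])
    return infer(MODEL, features)


# Register the handler with the Runpod serverless runtime.
runpod.serverless.start({"handler": handler})
```

With weights this large, the usual choices are baking them into the worker image or keeping them on a network volume attached to the endpoint, so workers don't re-download 20 GB on every cold start. Serverless maps naturally onto the preprocessing + inference path; long-running training jobs are usually a better fit for a regular GPU pod.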
Similar Threads
YOLO models on serverless? • Runpod / ⚡|serverless • 6mo ago
How to download models for Stable Diffusion XL on serverless? • Runpod / ⚡|serverless • 2y ago
FastAPI RunPod serverless request format • Runpod / ⚡|serverless • 2y ago
SDXL Serverless Worker: How to Cache LoRA models • Runpod / ⚡|serverless • 2y ago