Runpod • 16mo ago • 1 reply
Kaneda
When will multiple H100 GPUs on a single node be available? 10 × 48 GB GPUs cannot host all the model weights. Is RunPod planning to upgrade their platform?
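As a back-of-envelope check on whether a model's weights fit in a given pool of GPUs, the arithmetic can be sketched like this. The parameter counts, dtype width, and overhead factor below are illustrative assumptions, not RunPod figures:

```python
def weight_memory_gib(n_params: float, bytes_per_param: int = 2) -> float:
    """Raw memory needed just to hold the model weights, in GiB.
    bytes_per_param=2 assumes fp16/bf16 storage."""
    return n_params * bytes_per_param / 2**30

def fits_on_gpus(n_params: float, n_gpus: int, gib_per_gpu: float,
                 bytes_per_param: int = 2, overhead: float = 1.2) -> bool:
    """Rough feasibility check: do the sharded weights, padded by a
    fudge factor for activations and KV cache, fit in the aggregate
    GPU memory? Ignores framework buffers and fragmentation."""
    needed = weight_memory_gib(n_params, bytes_per_param) * overhead
    return needed <= n_gpus * gib_per_gpu

# e.g. a hypothetical 175B-parameter model in fp16 across 10 x 48 GiB GPUs:
print(fits_on_gpus(175e9, n_gpus=10, gib_per_gpu=48))
```

Real deployments also need headroom for the runtime and inter-GPU communication, so a marginal "fits" result is best treated as a no.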
© 2026 Hedgehog Software, LLC
Similar Threads
- Multiple containers on a single GPU instance? • Runpod / ⛅|pods • 2y ago
- Multiple models in a single serverless endpoint? • Runpod / ⚡|serverless • 11mo ago
- US-KS-2 H100s Slow Network • Runpod / ⛅|pods • 6mo ago
- How come there are significantly less pods available across multiple regions? • Runpod / ⛅|pods • 3mo ago