H100 NVL - Runpod
Runpod • 17mo ago • 5 replies
baldo
H100 NVL
If I've understood the docs correctly, H100 NVL is not available on serverless. Are there any plans to bring it to serverless? The extra 14GB of VRAM over the other GPUs is pretty useful for 70(ish)B parameter LLMs.
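For rough context, here is an illustrative back-of-the-envelope estimate (assumed numbers, 1 GB = 10⁹ bytes, activation and KV-cache memory ignored) of why 94 GB on the NVL versus 80 GB on the standard H100 matters for a ~70B model:

```python
# Illustrative, back-of-the-envelope VRAM estimate for serving a ~70B model.
# All numbers below are assumptions for rough comparison, not exact figures.

PARAMS = 70e9          # ~70 billion parameters
H100_SXM_GB = 80       # standard H100 (80 GB)
H100_NVL_GB = 94       # H100 NVL (94 GB)

def weights_gb(bytes_per_param: float) -> float:
    """Approximate weight memory in GB (1 GB = 1e9 bytes)."""
    return PARAMS * bytes_per_param / 1e9

for label, bpp in [("fp16", 2.0), ("int8/fp8", 1.0), ("int4", 0.5)]:
    w = weights_gb(bpp)
    print(f"{label:>8}: weights ~{w:.0f} GB, "
          f"headroom on 80 GB: {H100_SXM_GB - w:+.0f} GB, "
          f"on 94 GB: {H100_NVL_GB - w:+.0f} GB")
```

At 8-bit precision the weights alone come to roughly 70 GB, so the extra 14 GB on the NVL roughly doubles what is left over for KV cache and activations on a single card.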
Similar Threads
H100 Replicate VS RunPod
Runpod / ⚡|serverless • 9mo ago
Unexpected Charges on serverless h100 80gb
Runpod / ⚡|serverless • 12mo ago
🚨 All 30 H100 workers are throttled
Runpod / ⚡|serverless • 16mo ago
EU-RO-1 region serverless H100 GPU not available
Runpod / ⚡|serverless • 13mo ago