Runpod • 2y ago

RTX A6000 vCPU issue

ashley
Where do you see 48? It's 8 for both Secure Cloud and Community Cloud. You were probably looking at VRAM and not vCPU.
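For anyone hitting the same mix-up, a quick sanity check from inside the pod is to compare the vCPU count against the VRAM reported by the driver. A minimal sketch, assuming a Linux pod image with Python and nvidia-smi available:

```python
import os
import subprocess

# vCPUs actually available to this pod. CPU affinity is usually a better
# signal than os.cpu_count(), which can report the host's cores in a container.
vcpus = len(os.sched_getaffinity(0))
print(f"vCPUs: {vcpus}")

# Total VRAM per GPU as reported by the driver (the 48 GB figure on an RTX A6000).
vram = subprocess.run(
    ["nvidia-smi", "--query-gpu=name,memory.total", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout.strip()
print(f"GPU / VRAM: {vram}")
```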
Similar Threads

Docker issues on RTX A6000 ADA gpu pod. (Runpod / ⛅|pods, 2y ago)
Error while using vLLm in RTX A6000 (Runpod / ⛅|pods, 2y ago)
Compatibility of RTX A6000 for Multi-GPU Training (Runpod / ⛅|pods, 2y ago)
GPU is not available on 1 x RTX A6000 (Runpod / ⛅|pods, 7mo ago)