How to make a cross-CUDA-compatible Docker image?
I want to use the same Docker template on different GPUs, but it turns out different GPUs have different CUDA drivers, and I get crashes in my venv with torch 2.7.0+cu128.
I wonder what's the best practice?
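A minimal diagnostic sketch (assuming torch is installed in the venv) for checking what a given pod actually exposes: the CUDA version the wheel was built against and the GPU's compute capability, which is usually enough to tell a driver mismatch apart from an arch mismatch.

```python
# Quick check of what the container sees on this pod.
import torch

print("torch:", torch.__version__)                # e.g. 2.7.0+cu128
print("built against CUDA:", torch.version.cuda)  # toolkit version the wheel targets

if torch.cuda.is_available():
    major, minor = torch.cuda.get_device_capability(0)
    print("GPU:", torch.cuda.get_device_name(0))
    print(f"compute capability: sm_{major}{minor}")
else:
    print("CUDA not available - driver too old for this wheel, or no GPU visible")
```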
4 Replies
Actually, it turned out the CUDA driver version isn't the problem; some libraries compile differently depending on compute capability, so I had to create different venvs for different compute capabilities.
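A rough sketch of that per-compute-capability setup, as a launcher that queries the GPU's compute capability via nvidia-smi and execs the matching venv's Python. The `/venvs/sm_XX` layout and `main.py` entrypoint are made-up examples, and the `compute_cap` query field needs a reasonably recent driver.

```python
# Launcher: pick the venv built for this GPU's compute capability, then exec into it.
import os
import subprocess
import sys

def gpu_compute_cap() -> str:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=compute_cap", "--format=csv,noheader"],
        text=True,
    )
    major, minor = out.strip().splitlines()[0].split(".")
    return f"sm_{major}{minor}"          # e.g. "sm_86", "sm_90", "sm_120"

def main() -> None:
    cap = gpu_compute_cap()
    venv_python = f"/venvs/{cap}/bin/python"   # hypothetical layout: one venv per arch
    if not os.path.exists(venv_python):
        sys.exit(f"no venv prepared for {cap}")
    # Re-exec the real workload under the venv whose packages were built for this arch.
    os.execv(venv_python, [venv_python, "main.py"])

if __name__ == "__main__":
    main()
```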
If you can, compile PTX.
KoboldCpp is PTX-based, with our optimized binary targeting CUDA 12.1; it runs on anything on RunPod.
And with our oldpc binary we can even go back to CUDA 11.5+.
PTX then still lets that run on the 5000 series, despite those not having native CUDA 12.1 support.
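A hedged sketch of what "PTX-based" buys you at build time: if the binary embeds PTX (`code=compute_XX`) alongside native SASS (`code=sm_XX`), the driver can JIT-compile the PTX for GPUs newer than anything the CUDA 12.1 toolkit knew about, such as the 5000 series. The arch numbers and file names below are illustrative only, not KoboldCpp's actual build flags.

```python
# Illustrative nvcc invocation that embeds both SASS and PTX for forward compatibility.
import subprocess

def build_forward_compatible(src: str = "kernels.cu", out: str = "kernels.o") -> None:
    subprocess.run(
        [
            "nvcc", "-c", src, "-o", out,
            # Native machine code (SASS) for the archs you actually test on...
            "-gencode", "arch=compute_75,code=sm_75",
            "-gencode", "arch=compute_86,code=sm_86",
            # ...plus PTX for the newest arch, so the driver can JIT it on
            # GPUs released after this toolkit version.
            "-gencode", "arch=compute_86,code=compute_86",
        ],
        check=True,
    )

if __name__ == "__main__":
    build_forward_compatible()
```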