
1xGPU FP32 vs FP16

FP32 (float) (Non Tensor)
    A100 SXM4 80G 19.49 TFLOPS
    A100 80G      19.50 TFLOPS
    3090          35.58 TFLOPS
    A40           37.42 TFLOPS
    A6000         38.71 TFLOPS
    H100          51.22 TFLOPS 
    H100 NVL      60.00 TFLOPS
    H100 SXM5     66.91 TFLOPS 
    H200          66.91 TFLOPS 
    4090          82.58 TFLOPS 
    L40           90.52 TFLOPS
    A6000 Ada     91.06 TFLOPS
    L40S          91.61 TFLOPS 
    5090          104.8 TFLOPS
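
These non-tensor FP32 peaks follow the usual rule of CUDA core count x 2 FLOPs per clock (one FMA) x boost clock. As a rough check, here is a minimal sketch; the RTX 4090 core count and boost clock used in the example are reference-spec assumptions, not values taken from this page:

```python
# Sketch: reconstructing a non-tensor FP32 peak from core count and clock.
def peak_fp32_tflops(cuda_cores: int, boost_clock_ghz: float) -> float:
    # cores * 2 FLOPs/clock (FMA) * clock in GHz gives GFLOPS; /1000 -> TFLOPS
    return cuda_cores * 2 * boost_clock_ghz / 1000

# Example (assumed reference specs): RTX 4090, 16384 CUDA cores, ~2.52 GHz boost
print(peak_fp32_tflops(16384, 2.52))  # ~82.6, matching the 82.58 TFLOPS entry
```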


FP16 (half) (Non Tensor)
    3090          35.58 TFLOPS
    A40           37.42 TFLOPS
    A6000         38.71 TFLOPS
    A100 80G      77.97 TFLOPS
    A100 SXM4 80G 77.97 TFLOPS
    4090          82.58 TFLOPS
    L40           90.52 TFLOPS
    A6000 Ada     91.06 TFLOPS
    L40S          91.61 TFLOPS 
    5090          104.8 TFLOPS
    H100          204.9 TFLOPS
    H100 NVL      248.3 TFLOPS
    H100 SXM5     267.6 TFLOPS
    H200          267.6 TFLOPS
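
To compare achieved throughput against the peaks above, a minimal PyTorch matmul benchmark sketch follows (an assumed setup, not part of this page). Note that cuBLAS routes FP16 matmuls through tensor cores, so the measured FP16 number will track the tensor-core rate rather than the non-tensor figure listed here; TF32 is disabled so the FP32 run stays on the non-tensor path.

```python
# Sketch: measured matmul TFLOPS in FP32 and FP16 on the current CUDA device.
import time
import torch

def matmul_tflops(dtype, n=8192, iters=20):
    a = torch.randn(n, n, device="cuda", dtype=dtype)
    b = torch.randn(n, n, device="cuda", dtype=dtype)
    # Warm up so kernel selection is not included in the timing.
    for _ in range(3):
        torch.matmul(a, b)
    torch.cuda.synchronize()
    start = time.perf_counter()
    for _ in range(iters):
        torch.matmul(a, b)
    torch.cuda.synchronize()
    elapsed = time.perf_counter() - start
    flops = 2 * n**3 * iters          # 2*N^3 FLOPs per N x N matmul
    return flops / elapsed / 1e12     # TFLOPS

if __name__ == "__main__":
    # Keep FP32 off the TF32 tensor-core path so it corresponds to the
    # "Non Tensor" FP32 column above.
    torch.backends.cuda.matmul.allow_tf32 = False
    print(f"FP32: {matmul_tflops(torch.float32):.1f} TFLOPS")
    print(f"FP16: {matmul_tflops(torch.float16):.1f} TFLOPS")
```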