So the version a LoRA is trained on doesn’t matter?

When --fp8_base is specified, a FLUX.1 model file saved in fp8 (float8_e4m3fn) can be loaded directly. Also, in flux_minimal_inference.py, it can be loaded by specifying fp8 (float8_e4m3fn) via --flux_dtype.
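As a rough sketch of what that looks like on the command line (the model filename and other paths here are placeholders, not real files; only --fp8_base and --flux_dtype come from the text above):

```shell
# Training side: --fp8_base lets an fp8 (float8_e4m3fn) FLUX.1 checkpoint
# be loaded directly as the base model. "flux1-dev-fp8.safetensors" is a
# hypothetical filename for illustration.
python flux_train_network.py \
  --pretrained_model_name_or_path flux1-dev-fp8.safetensors \
  --fp8_base

# Inference side: flux_minimal_inference.py accepts the dtype explicitly.
python flux_minimal_inference.py \
  --ckpt_path flux1-dev-fp8.safetensors \
  --flux_dtype fp8
```
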