Hi @Furkan Gözükara SECourses, can you create another training preset that uses 24GB VRAM to train an fp16 LoRA? I trained using Rank_3_18950MB, but that LoRA is fp8. Compared with the fp16 training in your Rank 1 or Rank 2 presets, the fp8 LoRA's prompt understanding is reduced, and so far the fp8 results don't satisfy my needs.