Today I tried training a FLUX LoRA with 30 images on an A6000, and at 9 s/it it's giving me an ETA of 11 and a half hours. On a 4090 at 5.11 s/it (with the same number of images) it gave me 6 and a half hours, lol. Why is that? Oh, and I used the 24GB_GPU_Quality_Tier_2_23100MB_9.8_Second_IT_and_T5_Attention config in both cases. Is it because of "--blocks_to_swap 12"?
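For what it's worth, a quick sanity check suggests both runs are doing the same number of steps, and the whole difference is the per-iteration speed. This is a minimal sketch, assuming a step count of roughly 4600 (inferred from the reported times, not read from the config):

```python
# Back-of-the-envelope: total training time scales linearly with s/it
# when the step count is fixed. ~4600 steps is an assumption inferred
# from the two reported ETAs, not a value from the config file.

steps = 4600  # assumed: both 11.5 h / 9 s/it and 6.5 h / 5.11 s/it imply ~this

for gpu, sec_per_it in [("A6000", 9.0), ("4090", 5.11)]:
    hours = steps * sec_per_it / 3600
    print(f"{gpu}: {hours:.1f} h")
# A6000: 11.5 h
# 4090: 6.5 h
```

So the gap comes from the s/it difference itself, not from the two runs training for different lengths.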