Hey guys. I'm currently training a Flux style finetune in Kohya with a dataset of around 1,800 images. When I left max epochs at 0 (the default), it was automatically set to 6. That seemed strange to me; I've never used a dataset this large before. I'm using the "Batch_Size_7_48GB_GPU_46250MB_29.1_second_it_Tier_1" preset provided by Dr. Furkan. Should I tweak another variable, like the learning rate, for a dataset this big? I set max epochs to 10, and it estimates about 20 hours of training on an A6000 GPU. I figured I'd ask before committing to a 20-hour run only for it to end up being trash.
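For what it's worth, here's a quick sanity check on the time estimate using only the numbers in the preset name. This assumes 1 repeat per image and no gradient accumulation (both assumptions on my part, not something I pulled from the preset):

```python
import math

# Assumptions: num_repeats = 1, gradient accumulation = 1,
# so one optimizer step per batch of 7 images.
images = 1800
batch_size = 7        # from the preset name
sec_per_it = 29.1     # from the preset name
epochs = 10

steps_per_epoch = math.ceil(images / batch_size)
total_steps = steps_per_epoch * epochs
hours = total_steps * sec_per_it / 3600

print(steps_per_epoch, total_steps, round(hours, 1))  # → 258 2580 20.9
```

So ~2,580 steps at 29.1 s/it works out to roughly 21 hours, which matches what Kohya is reporting; the estimate itself isn't off, the question is just whether 10 epochs at the preset's learning rate is sensible for a dataset this size.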
