I used 24GB_GPU_23150MB_10.2_second_it_Tier_1.json for my first 200 epochs, which has an LR of 4e-06. Does it make sense to lower it as I fine-tune further, to avoid overfitting?
Do you look at the loss or at the samples to see if you're about to overfit?
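On the loss-watching part, one common heuristic is to track validation loss rather than training loss: overfitting typically shows up as validation loss flattening or rising while training loss keeps falling. A minimal sketch of that check (the `overfit_epoch` helper and the loss numbers are purely illustrative, not from any particular trainer):

```python
def overfit_epoch(val_losses, patience=3):
    """Return the first epoch index at which validation loss has not
    improved for `patience` consecutive epochs, or None if it kept improving."""
    best = float("inf")
    stale = 0
    for epoch, v in enumerate(val_losses):
        if v < best - 1e-6:   # new best val loss: reset the counter
            best = v
            stale = 0
        else:                 # no improvement this epoch
            stale += 1
            if stale >= patience:
                return epoch
    return None

# Illustrative run: val loss bottoms out at epoch 3, then drifts up.
val = [2.2, 1.9, 1.7, 1.6, 1.65, 1.7, 1.8, 1.9]
print(overfit_epoch(val))  # -> 6 (three stale epochs after the minimum)
```

If that epoch fires well before your planned stopping point, that's usually the signal to lower the LR or stop, rather than relying on eyeballing samples alone.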