Another question: how exactly does batch size affect the learning rate? I know you divide the total sample count by the batch size to get the number of optimizer steps. So for a run of (12 instance images x 80 repeats x 10 epochs) = 9600 samples total, at batch size 8 that's 1200 steps. I use batch 8 to speed up the process, but does this degrade the effective learning rate or the final loss? And I guess 1200 steps is on the low side; should I rather be between 1500-3000 steps after the batch divide?
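To make the arithmetic concrete, here is a small sketch of the step count from the numbers above, plus the common "linear scaling" heuristic for adjusting the learning rate when the batch size grows. The heuristic and the base LR value are assumptions for illustration, not a guarantee about loss:

```python
# Step arithmetic for the training run described above.
instance_images = 12
repeats = 80
epochs = 10
batch_size = 8

# Total samples seen over the whole run (often called "steps" at batch 1).
total_samples = instance_images * repeats * epochs   # 9600

# Actual optimizer steps: each weight update consumes one full batch.
optimizer_steps = total_samples // batch_size        # 1200

# Linear scaling heuristic: when batch size goes up by k, multiply the
# learning rate by k to keep the per-sample update magnitude comparable.
# base_lr here is a hypothetical value tuned at batch size 1.
base_lr = 1e-4
scaled_lr = base_lr * batch_size                     # 8e-4 under this rule

print(optimizer_steps)  # 1200
print(scaled_lr)
```

The key point the sketch illustrates: a larger batch does not change how many samples the model sees (9600 either way), only how many gradient updates it gets (1200 instead of 9600), which is why people often compensate by raising the learning rate rather than treating the two runs as equivalent.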

