Hey, quick question: I'm trying to train a LoRA on Flux. I have a 3060 with 12 GB, so I'm using your tier 2 config and following the YouTube tutorial. I disabled all my startup programs, and Task Manager shows only about 300-400 MB of GPU memory in use when I start training. I've reduced the resolution to 512x512, increased block swaps, and even tried the 10 GB config, but every time I get a CUDA out-of-memory error and the training fails.
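
For reference, this is roughly how I'm sanity-checking what PyTorch itself sees as free VRAM right before training, since I know Task Manager's numbers don't always match what CUDA can actually allocate (just a quick sketch using `torch.cuda.mem_get_info`, not anything from your config):

```python
# Quick sanity check: ask the CUDA driver how much VRAM is actually free,
# since Task Manager's readout can differ from what PyTorch can allocate.
import torch

free_bytes, total_bytes = torch.cuda.mem_get_info(0)  # device 0 = the 3060
print(f"free:  {free_bytes / 1024**3:.2f} GiB")
print(f"total: {total_bytes / 1024**3:.2f} GiB")
```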