Hmm, okay, I'll make sure I'm not using it. I was also thinking of resuming, since this is on a slow GPU. But I'm confident I've been testing with clean configs; I only change the model and training folders.
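For reference, a minimal sketch of how such a run can be launched so that only the model path and training folder change between tests, assuming the kohya-ss sd-scripts entry point and its documented `--save_state`/`--resume` options (all paths below are hypothetical placeholders):

```python
# Minimal sketch: launch a kohya-ss sd-scripts LoRA run where only the
# model and training-folder paths change between tests. Paths are
# hypothetical placeholders; flags follow sd-scripts' documented options.
import subprocess

MODEL_PATH = "/models/base_model.safetensors"  # the only per-test change
TRAIN_DIR = "/datasets/my_subject"             # the only per-test change

subprocess.run([
    "accelerate", "launch", "train_network.py",
    "--pretrained_model_name_or_path", MODEL_PATH,
    "--train_data_dir", TRAIN_DIR,
    "--output_dir", "/output/lora_test",
    "--save_state",  # persist optimizer/step state so the run can be resumed
    # "--resume", "/output/lora_test/last-state",  # hypothetical state dir; use to resume
], check=True)
```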
That's why I'm trying to get my 2nd GPU running: just let training run on it all the time and free up the primary PC. Plus, I want to actually generate images and use all the cool AI stuff, but you can't do that while training either.
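If it helps, the standard way to pin a training process to the second card is the CUDA_VISIBLE_DEVICES environment variable, so GPU 0 stays free for generation; a minimal sketch (the launch command is a placeholder for whatever trainer you run):

```python
# Minimal sketch: pin training to the second GPU (index 1) via
# CUDA_VISIBLE_DEVICES so GPU 0 stays free for image generation.
# The launch command is a placeholder for whatever trainer you run.
import os
import subprocess

env = os.environ.copy()
env["CUDA_VISIBLE_DEVICES"] = "1"  # the training process sees only GPU 1

subprocess.run(
    ["accelerate", "launch", "train_network.py"],  # plus your usual flags
    env=env,
    check=True,
)
```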
I have a Gigabyte Eagle RTX 3090, not overclocked. Are you running Rank_3_T5_XXL_23500MB_11_35_Second_IT.json to train the LoRA? I had posted 8.94 s/it with Doc's V9, but unfortunately after some time the training process failed due to lack of VRAM. Now I'm testing with the original Kohya version and it reached 10.5 s/it, but it is more stable in VRAM usage.
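For a sense of what that speed difference costs over a whole run, a quick back-of-the-envelope (the 3000-step run length is an assumed example, not from the thread):

```python
# Back-of-the-envelope: wall-clock cost of 8.94 s/it vs 10.5 s/it.
# The 3000-step run length is an assumed example.
STEPS = 3000

for label, sec_per_it in [("Doc's V9", 8.94), ("Kohya original", 10.5)]:
    hours = STEPS * sec_per_it / 3600
    print(f"{label}: {sec_per_it} s/it -> {hours:.2f} h for {STEPS} steps")
```

At 3000 steps that works out to roughly 7.5 h versus 8.8 h, so the original Kohya version is about 17% slower but may be worth it if it avoids the VRAM crash.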
@Zet, do you happen to know what it means to be overtrained/overcooked? I see this term all the time in reference to too many epochs/repeats, but what does it actually mean? Is the 200th epoch usually overtrained?
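In the usual sense, overtrained/overcooked means the model has started memorizing the training set instead of generalizing, which for a LoRA shows up as rigid outputs, baked-in artifacts, and poor response to new prompts; there is no fixed epoch count at which it happens. The standard way to spot it is to track a held-out validation loss (or compare sample images) per epoch and treat the point where it stops improving as the cutoff; a minimal sketch with made-up numbers:

```python
# Minimal sketch of the usual overtraining check: track a held-out
# validation loss per epoch and flag where it stops improving.
# The loss values are made up purely for illustration.

def best_epoch(val_losses, patience=3):
    """Return the 1-based epoch with the lowest validation loss once it has
    failed to improve for `patience` epochs, else None (still improving)."""
    best_idx = min(range(len(val_losses)), key=val_losses.__getitem__)
    if len(val_losses) - 1 - best_idx >= patience:
        return best_idx + 1
    return None

val_losses = [0.42, 0.35, 0.30, 0.28, 0.27, 0.29, 0.31, 0.34]  # illustrative
print(f"Best epoch: {best_epoch(val_losses)}; later epochs are likely overcooked.")
```

So the 200th epoch isn't automatically overtrained; it is overtrained only once quality metrics (or sample images) degrade past the best checkpoint.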