Hello everyone. I am Dr. Furkan Gözükara, PhD Computer Engineer. SECourses is a YouTube channel dedicated to the following topics: Tech, AI, News, Science, Robotics, Singularity, ComfyUI, SwarmUI, ML, Artificial Intelligence, Humanoid Robots, Wan 2.2, FLUX, Krea, Qwen Image, VLMs, Stable Diffusion
I have a Gigabyte Eagle RTX 3090, not overclocked. Are you running Rank_3_T5_XXL_23500MB_11_35_Second_IT.json to train the LoRA? I had posted 8.94 s/it with Doc's V9, but unfortunately after some time the training process failed due to lack of VRAM. Now I'm testing the original Kohya version and it reached 10.5 s/it, but it is more stable in VRAM usage.
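If you want to see whether VRAM usage slowly creeps up over a long run before it finally fails, a minimal sketch like this can log usage next to the training (assuming an NVIDIA GPU with nvidia-smi on the PATH; the function name and polling interval are just illustrative):

```python
# Sketch: poll nvidia-smi and print used/total VRAM so a slow leak is
# visible in the log long before an out-of-memory crash.
import subprocess
import time

def log_vram(interval_s: float = 30.0) -> None:
    query = ["nvidia-smi",
             "--query-gpu=memory.used,memory.total",
             "--format=csv,noheader,nounits"]
    while True:
        used, total = subprocess.check_output(query, text=True).strip().split(", ")
        print(f"VRAM: {used} / {total} MiB")
        time.sleep(interval_s)

if __name__ == "__main__":
    log_vram()
```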
@Zet do you happen to know what it means for a model to be overtrained/overcooked? I see this term all the time in reference to too many epochs/repeats, but what does it actually mean? Is the 200th epoch usually overtrained?
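For intuition: "overtrained" usually means the model has started memorizing the training images, so training loss keeps falling while results on anything outside the training set get worse. A minimal sketch of how you could catch that turning point (a generic loop, not Kohya's actual code; run_epoch, eval_val_loss, and save_ckpt are hypothetical callbacks that return/handle per-epoch losses and checkpoints):

```python
# Sketch: keep the checkpoint with the lowest validation loss; the epoch
# where validation loss starts rising again is where overtraining begins.
def pick_best_epoch(run_epoch, eval_val_loss, save_ckpt, epochs: int = 200) -> int:
    best_val, best_epoch = float("inf"), -1
    for epoch in range(epochs):
        train_loss = run_epoch()      # one full pass over the training data
        val_loss = eval_val_loss()    # loss on held-out images, not used in training
        if val_loss < best_val:
            best_val, best_epoch = val_loss, epoch
            save_ckpt(epoch)          # keep the checkpoint before divergence
        print(f"epoch {epoch}: train={train_loss:.4f} val={val_loss:.4f}")
    return best_epoch
```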
I see, so it starts to drift back toward something more like the early epochs. What I noticed with DreamBooth is that it starts with some "person" in memory that is similar to some of your training photos, then it fine-tunes that person into you epoch over epoch, until it's pretty much you at around epoch 120.
I've had runs where, for some reason, it started with a really old guy with wrinkles and everything; then, as it got closer to me, it slowly de-aged him but still left remnants of the original old guy. Another time it chose another picture and started with a chubby guy, and left remnants of a bigger guy. It's all random, but I think you can get better results depending on which person it starts with.
I noticed that the speed gain comes from better GPU utilization: GPU utilization stays at 100% constantly. It's great, but be careful to keep your equipment well cooled. In my case the VRAM temperature went up to 94 °C after 7 hours of training.
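For long runs it can be worth leaving a small temperature watcher running (again assuming nvidia-smi is available; note that temperature.gpu reports the core temperature, since the memory-junction temperature is not exposed through nvidia-smi on most consumer cards, so you may need a tool like HWiNFO for the actual VRAM reading):

```python
# Sketch: warn when the GPU core temperature crosses a chosen limit.
import subprocess
import time

def watch_gpu_temp(limit_c: int = 90, interval_s: float = 10.0) -> None:
    query = ["nvidia-smi",
             "--query-gpu=temperature.gpu",
             "--format=csv,noheader,nounits"]
    while True:
        temp = int(subprocess.check_output(query, text=True).strip())
        warning = "  <-- too hot, improve cooling!" if temp >= limit_c else ""
        print(f"GPU core temp: {temp} C{warning}")
        time.sleep(interval_s)

if __name__ == "__main__":
    watch_gpu_temp()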