Hello everyone. I am Dr. Furkan Gözükara, PhD Computer Engineer. SECourses is a YouTube channel dedicated to the following topics: Tech, AI, News, Science, Robotics, Singularity, ComfyUI, SwarmUI, ML, Artificial Intelligence, Humanoid Robots, Wan 2.2, FLUX, Krea, Qwen Image, VLMs, Stable Diffusion
The worst thing about running the kohya FLUX training update: "Haha you fewl, you installed torch 2.5. I will now downgrade you to 2.4 and complain about dependencies needing 2.5." And then the installer runs off like a 1920s villain and ties a damsel to the train tracks
@Dr. Furkan Gözükara One question: I'm trying to train FLUX with a big dataset of 1,500 images, and with the suggested learning rate it's too slow to converge and get any results. I'm now trying a 4e-4 LR and getting results much faster, and so far it looks good. Do you think that's too aggressive an LR? Maybe I can start with this high LR and then finish the training with a lower LR to learn the finer details, without having to wait 15 days of training
Well, a bigger LR will certainly learn faster, but it's like slicing the cake with a bigger knife: the fine, detailed cuts become impossible
You're right, but for now I'm doing everything on my local computer. Do you think finishing the training with a lower LR can improve the final results? I mean starting with a high learning rate and, once it's on track, reducing the LR. Do you think lowering the LR at the end can be beneficial?
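The high-then-low idea being discussed can be sketched as a simple two-phase schedule. This is just an illustrative sketch, not a tested recommendation: the 4e-4 value comes from the comment above, but the low LR (1e-4) and the 70% switch point are hypothetical, and in practice you would wire this into your trainer's scheduler rather than use a bare function.

```python
def two_phase_lr(step, total_steps, high_lr=4e-4, low_lr=1e-4, switch_frac=0.7):
    """Hypothetical two-phase schedule: run most of training at an
    aggressive LR for fast convergence, then drop to a lower LR so
    the model can settle into finer details.

    high_lr: the aggressive rate from the comment (4e-4)
    low_lr, switch_frac: illustrative values, not recommendations
    """
    if step < total_steps * switch_frac:
        return high_lr   # phase 1: converge quickly
    return low_lr        # phase 2: refine details at a gentler rate

# Example: in a 1000-step run, steps 0-699 use 4e-4, steps 700+ use 1e-4
```

In PyTorch-based trainers the same shape can usually be expressed with a built-in scheduler such as `LambdaLR` or `MultiStepLR`, so you rarely need to hand-roll it.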
Just a quick FYI on FLUX training. Using installer v30 on an i9-13900 with 64 GB RAM and a 4090, with the Best Config at Rank 4 quoted as 4.85 secs/it, I'm getting 3.60 secs/it. So even quicker than expected
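As a quick sanity check on those numbers, here is the back-of-envelope arithmetic: the step count used below is hypothetical (just to show the calculation), while the two secs/it figures come from the comment above.

```python
quoted = 4.85    # secs/it quoted for the Best Config at Rank 4
measured = 3.60  # secs/it actually observed on the 4090

# Relative speedup over the quoted figure
speedup = quoted / measured  # roughly 1.35x faster than expected

def total_hours(steps, sec_per_it):
    """Estimated wall-clock training time for a given step count."""
    return steps * sec_per_it / 3600

# Hypothetical 3000-step run: 3000 * 3.60 s = 10800 s = 3 hours
```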