Hello everyone. I am Dr. Furkan GözĂŒkara, PhD Computer Engineer. SECourses is a YouTube channel dedicated to the following topics: Tech, AI, News, Science, Robotics, Singularity, ComfyUI, SwarmUI, ML, Artificial Intelligence, Humanoid Robots, Wan 2.2, FLUX, Krea, Qwen Image, VLMs, Stable Diffusion
@Furkan GözĂŒkara SECourses One question for my understanding: why would the 16GB config be slower than the 12GB config? Isn't more VRAM better? I am comparing the configs and the only difference is blocks to swap. Shouldn't training slow down if the 12GB config has more blocks to swap?
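The tradeoff behind this question can be sketched with a toy cost model. This is not any trainer's actual code; the function name and all timing numbers below are illustrative assumptions. The idea: each block swapped to system RAM must be transferred over PCIe every step, so a lower-VRAM config that swaps more blocks pays more transfer latency per step even though its peak VRAM usage is lower.

```python
# Toy model (hypothetical, not a real trainer API): per-step cost of
# block swapping. All numbers are illustrative assumptions.

def step_time_ms(compute_ms: float, blocks_to_swap: int,
                 transfer_ms_per_block: float) -> float:
    """More blocks offloaded to system RAM means more PCIe transfers
    per training step, which adds latency while lowering peak VRAM."""
    return compute_ms + blocks_to_swap * transfer_ms_per_block

# Hypothetical comparison: a 12GB config swapping 20 blocks vs a
# 16GB config swapping 10 blocks, at an assumed 15 ms per transfer.
t_12gb = step_time_ms(1000.0, 20, 15.0)  # 1300.0 ms per step
t_16gb = step_time_ms(1000.0, 10, 15.0)  # 1150.0 ms per step
assert t_12gb > t_16gb  # more swapping -> slower steps
```

So more VRAM is "better" in the sense that it lets the config swap fewer blocks, and fewer swaps means faster steps.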
L40S and A100 PCIe are my first choices. The L40S is very versatile and has good speed. I use the A100 PCIe when I don't want to lose time searching for the parameter limits and running into Out Of Memory problems.
@Furkan GözĂŒkara SECourses Can we train FP8 models as well using the same configs, or do I need to select FP8 in the training parameters? At the moment I can train a LoRA on FP8 using your FP8 config.