Hi @Furkan Gözükara SECourses, I cannot get either SDPA or xformers to work with kohya_ss on my 5090. I have used both your install method and also another method recommended by bmaltais himself 2 weeks ago (https://github.com/bmaltais/kohya_ss/issues/3096#issuecomment-2730229439).

The installation of PyTorch + xformers succeeds, and when I check the .venv, xformers is indeed installed.
However, during training the CrossAttention setting is simply ignored, whether I set it to xformers or to SDPA.
CrossAttention is never mentioned in the training logs, and I can tell from the VRAM usage that it is not being enabled.
I have all the requirements on my PC (Python 3.10, CUDA 12.8, etc.).
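In case it helps anyone hitting the same thing: one common cause of a "silently ignored" backend is xformers resolving from a different interpreter than the one training actually runs with. A minimal stdlib-only sanity check (a sketch, to be run with the .venv's own python; `module_available` is just a helper name I made up):

```python
import importlib.util
import sys

def module_available(name: str) -> bool:
    """Check whether a module resolves from the current interpreter,
    without importing it (so no CUDA init side effects)."""
    return importlib.util.find_spec(name) is not None

# Show which interpreter is running, then check the relevant packages.
print("interpreter:", sys.executable)
for mod in ("torch", "xformers"):
    status = "installed" if module_available(mod) else "MISSING"
    print(f"{mod}: {status}")
```

If `xformers: MISSING` is printed here but `pip list` inside the .venv shows it, the training process is almost certainly launching a different interpreter.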

Do you have any advice? Are the cross attention settings working in your kohya_ss install on your 5090, xformers in particular?
Thank you in advance.