Trying to use SUPIR, and at the 2nd stage I'm getting:
[Tiled VAE]: split to 4x2 = 8 tiles. Optimal tile size 480x448, original tile size 512x512
[Tiled VAE]: Executing Encoder Task Queue: 100%|████████████████████████████████████| 728/728 [00:01<00:00, 405.80it/s]
[Tiled VAE]: Done in 2.045s, max VRAM alloc 9757.875 MB
P:\SUPIR_v7\SUPIR\venv\lib\site-packages\torch\nn\functional.py:5476: UserWarning: 1Torch was not compiled with flash attention. (Triggered internally at ..\aten\src\ATen\native\transformers\cuda\sdp_utils.cpp:263.)
attn_output = scaled_dot_product_attention(q, k, v, attn_mask, dropout_p, is_causal)
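For reference, a minimal diagnostic sketch (not from the original report, assuming standard PyTorch 2.x APIs) to check which scaled_dot_product_attention backends the installed torch build actually exposes; the warning above means the flash-attention backend is unavailable and SDPA falls back to another kernel:

    # diagnostic sketch: report which SDPA backends this PyTorch build enables
    import torch

    print("torch version:", torch.__version__)
    print("CUDA available:", torch.cuda.is_available())
    # these flags correspond to the flash / memory-efficient / math SDPA kernels
    print("flash SDP enabled:", torch.backends.cuda.flash_sdp_enabled())
    print("mem-efficient SDP enabled:", torch.backends.cuda.mem_efficient_sdp_enabled())
    print("math SDP enabled:", torch.backends.cuda.math_sdp_enabled())

If flash SDP reports enabled but the warning still appears, the wheel itself was built without flash attention, so PyTorch silently uses one of the other backends; the output should still be correct, just potentially slower.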