
The dataset is just 20 images, like last time.
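The commands below reference a dataset config at dataset/dataset_qwen_test.toml. As a rough sketch (not the exact file used here), a minimal Musubi Tuner dataset config for a small image set might look like this; the directory paths are placeholders for your own layout:

[general]
resolution = [1024, 1024]
caption_extension = ".txt"
batch_size = 1
enable_bucket = true
bucket_no_upscale = false

[[datasets]]
# placeholder paths: point these at the folder with the 20 images (plus .txt captions)
# and at a cache folder for the latent / text-encoder outputs
image_directory = "dataset/images"
cache_directory = "dataset/cache"
num_repeats = 1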

Commands in Musubi Tuner to start:

python3 src/musubi_tuner/qwen_image_cache_latents.py --dataset_config dataset/dataset_qwen_test.toml --vae models/qwen_image/vae/vae/diffusion_pytorch_model.safetensors



python3 src/musubi_tuner/qwen_image_cache_text_encoder_outputs.py --dataset_config dataset/dataset_qwen_test.toml --text_encoder models/qwen_image/text_encoders/split_files/text_encoders/qwen_2.5_vl_7b.safetensors




PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True accelerate launch --num_cpu_threads_per_process 1 src/musubi_tuner/qwen_image_train_network.py \
--dataset_config dataset/dataset_qwen_test.toml \
--dit models/qwen_image/diffusion_models/split_files/diffusion_models/qwen_image_bf16.safetensors \
--text_encoder models/qwen_image/text_encoders/split_files/text_encoders/qwen_2.5_vl_7b.safetensors \
--vae models/qwen_image/vae/vae/diffusion_pytorch_model.safetensors \
--output_dir output_qwen_advanced \
--output_name Qwen-LoRA-Arek-Advanced \
--mixed_precision bf16 \
--sdpa \
--optimizer_type adamw8bit \
--learning_rate 2e-4 \
--lr_scheduler cosine_with_restarts \
--lr_warmup_steps 300 \
--lr_scheduler_num_cycles 3 \
--network_module networks.lora_qwen_image \
--network_dim 32 \
--network_alpha 16 \
--network_args "loraplus_lr_ratio=4" \
--max_train_steps 3000 \
--save_every_n_steps 250 \
--gradient_checkpointing \
--timestep_sampling qinglong_qwen \
--dynamo_backend INDUCTOR \
--fp8_base \
--fp8_scaled \
--blocks_to_swap 20