Okay, so I'll reuse the same config I used for fine-tuning, but I'm guessing that since it's LoRA training it uses less memory?
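
For context, here is a minimal sketch (assuming the Hugging Face PEFT library; the model name and LoRA hyperparameters are illustrative, not taken from the original config) of why a LoRA run usually needs less memory than full fine-tuning: the base weights are frozen, so gradients and optimizer states are only kept for the small adapter matrices.

```python
# Minimal LoRA setup sketch -- model name and hyperparameters are assumptions.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Load the frozen base model (illustrative checkpoint).
base = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")

lora_cfg = LoraConfig(
    r=16,                                  # adapter rank (assumed)
    lora_alpha=32,                         # scaling factor (assumed)
    target_modules=["q_proj", "v_proj"],   # attention projections (assumed)
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)

# Wrap the model: only the LoRA adapter parameters are trainable,
# which is what keeps gradient and optimizer memory small.
model = get_peft_model(base, lora_cfg)
model.print_trainable_parameters()  # typically well under 1% of total params
```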