For LoRA it's easy to use a model in low precision, because the values you are actually changing (the adapter weights) are still kept in higher precision. But for a full finetune, is he using less than 16-bit?
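
A minimal sketch of the LoRA side of that point, assuming the Hugging Face transformers/peft/bitsandbytes stack and a placeholder model name (none of which come from the original question): the frozen base model is loaded in 4-bit, while the trainable LoRA matrices stay in regular floating point.

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

# Base model weights are quantized to 4-bit; compute still happens in bf16.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

base = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf",  # placeholder model, not from the original post
    quantization_config=bnb_config,
)

# LoRA adapters attached to the attention projections.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],
)
model = get_peft_model(base, lora_config)

# The parameters that receive gradients are the LoRA matrices,
# which remain ordinary higher-precision tensors.
for name, p in model.named_parameters():
    if p.requires_grad:
        print(name, p.dtype)  # e.g. ...lora_A... torch.float32
```

In a full finetune there are no separate adapter weights, so whatever precision the model weights are stored in is also the precision being updated, which is why the question about going below 16-bit matters there.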