What's the practical effect of training a finetune rather than a lora?

At the moment I have about 15 loras running on top of the dev checkpoint. If I train a finetune instead, does that mean it can only incorporate 1 lora, and the file size gets bloated? Or am I just confused about how this works? Can I merge all my loras (they are all character loras) into 1 finetune?
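
To make that last part concrete, here's a rough sketch of what I imagine "merging loras into one finetune" would look like. This is just my guess at the workflow, assuming the diffusers library and that "dev" means FLUX.1-dev; the lora paths and adapter names below are made up:

```python
import torch
from diffusers import FluxPipeline

# Load the base dev checkpoint (assuming FLUX.1-dev here).
pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16
)

# Hypothetical character loras -- replace with the real files.
character_loras = {
    "alice": "loras/alice_character.safetensors",
    "bob": "loras/bob_character.safetensors",
}

# Load each lora under its own adapter name.
for name, path in character_loras.items():
    pipe.load_lora_weights(path, adapter_name=name)

# Activate all of them, then fold their deltas into the base weights.
names = list(character_loras)
pipe.set_adapters(names, adapter_weights=[1.0] * len(names))
pipe.fuse_lora(adapter_names=names, lora_scale=1.0)
pipe.unload_lora_weights()  # drop the separate lora modules, keep the fused weights

# Save the merged checkpoint -- the loras are now baked into the
# existing weight matrices rather than stored as separate files.
pipe.save_pretrained("flux-dev-merged-characters")
```

Is this roughly what a "finetune with loras merged in" amounts to, or is an actual finetune something different under the hood?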