Best Mixtral/LLaMA2 LLM for code writing and inference on 24–48 GB of VRAM?