Can't run a 70B Llama 3.1 model on 2 A100 80 GB GPUs - Runpod