Runpod · 13mo ago
Nickbkl

Running Llama 3.3 70B using vLLM and a 160 GB network volume

Hi, I want to check whether 160 GB is enough for Llama 3.3 70B, and whether I could get away with a smaller network volume.
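
For context, here is a rough back-of-the-envelope sketch of the disk space the weights alone would need. This assumes an unquantized bf16/fp16 checkpoint at roughly 2 bytes per parameter; the exact safetensors footprint of a given Hugging Face repo may differ, and quantized variants (AWQ/GPTQ) would be far smaller.

```python
# Rough estimate of disk space for Llama 3.3 70B weights (assumption: bf16/fp16).
PARAMS = 70e9          # ~70 billion parameters
BYTES_PER_PARAM = 2    # bf16 / fp16 precision

weights_gb = PARAMS * BYTES_PER_PARAM / 1e9
print(f"Approximate weight size: {weights_gb:.0f} GB")  # ~140 GB
```

By that estimate the full-precision weights are around 140 GB, so a 160 GB volume would hold them but with limited headroom; a quantized checkpoint could fit on a noticeably smaller volume.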