Runpod · 2y ago · 125 replies

Run Mixtral 8x22B Instruct on vLLM worker

Hello everybody, is it possible to run Mixtral 8x22B on the vLLM worker? I tried running it with the default configuration on 48 GB GPUs (A6000, A40), but it's taking too long. What are the requirements for running Mixtral 8x22B successfully? This is the model I'm trying to run: https://huggingface.co/mistralai/Mixtral-8x22B-Instruct-v0.1
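For context on why a single 48 GB card struggles: Mixtral-8x22B-Instruct has roughly 141B total parameters, so the fp16/bf16 weights alone take on the order of 280 GB of VRAM before any KV cache or activations. A rough back-of-envelope sketch (the parameter count and the 1.2x overhead factor are approximations, not exact figures):

```python
import math

def min_gpus(params_b: float, bytes_per_param: int, gpu_gb: int,
             overhead: float = 1.2) -> int:
    """Rough minimum GPU count: weight memory times an overhead
    factor (KV cache, activations) divided by per-GPU memory."""
    weights_gb = params_b * bytes_per_param  # billions of params * bytes/param -> GB
    return math.ceil(weights_gb * overhead / gpu_gb)

# Mixtral 8x22B: ~141B total params, fp16 = 2 bytes/param
print(min_gpus(141, 2, 48))  # 48 GB cards (A6000/A40) -> 8
print(min_gpus(141, 2, 80))  # 80 GB cards (A100/H100) -> 5
```

So in fp16 you would need something like 8x 48 GB or 4-5x 80 GB GPUs with vLLM's tensor parallelism (the `--tensor-parallel-size` flag, or the equivalent environment variable on the RunPod vLLM worker), or a quantized variant (AWQ/GPTQ) to fit in less memory.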