Recipe for Llama 4 Scout on vLLM
I am trying to follow the Llama 4 Scout recipe from vLLM and deploy it on RunPod Serverless.
Even with 2 x H100s or a single B200, I could not get the model to deploy.
Has anyone managed to deploy it?
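For context, the kind of launch command the recipe revolves around looks roughly like this (a minimal sketch; the exact flags and values here are my assumptions, not copied from the recipe):

```shell
# Sketch of a vLLM OpenAI-compatible server launch for Llama 4 Scout.
# Assumptions: 2 GPUs, a reduced context window to fit memory; tune for your hardware.
vllm serve meta-llama/Llama-4-Scout-17B-16E-Instruct \
  --tensor-parallel-size 2 \
  --max-model-len 8192
```

On RunPod Serverless this command is not run directly; the worker image wraps vLLM, so the equivalent settings have to be passed through environment variables, which is where I am stuck.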
3 Replies
GitHub
[Request] Runpod Serverless Version of the Llama4-Scout Deployment ...
I was going through the excellent Llama 4 Scout on vLLM recipe. Would it be possible to provide a version of this recipe adapted for Runpod vLLM Serverless? In particular, guidance on the required GP...
Thank you for making an issue! I've changed the assignee to the right individual.