Runpod · 2y ago · 3 replies

vllm
Carlos
Any plans to update the vLLM worker image? I would like to test Phi 3 and Llama 3.1, but both are unsupported by the current image. (serverless)
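In the meantime, one way to sanity-check the models is to load them under a newer vLLM release directly, outside the worker image. This is only a sketch, not an official Runpod answer: it assumes a recent vLLM install, a local CUDA GPU, and (for Llama 3.1, which is gated) a Hugging Face token with access to the meta-llama repo.

```python
# Minimal check that a recent vLLM release can load and serve these models.
# Assumptions: recent vllm installed, a CUDA GPU with enough VRAM, and a
# Hugging Face token with access to the gated meta-llama repository.
from vllm import LLM, SamplingParams

# Swap in "microsoft/Phi-3-mini-4k-instruct" to try Phi 3 instead.
MODEL_ID = "meta-llama/Meta-Llama-3.1-8B-Instruct"

llm = LLM(model=MODEL_ID)  # downloads weights on first run
params = SamplingParams(temperature=0.7, max_tokens=64)

outputs = llm.generate(["Say hello in one sentence."], params)
for out in outputs:
    print(out.outputs[0].text)
```

If this works locally on a given vLLM version, a custom worker image pinned to that version should be able to serve the same models once the official image catches up.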