Runpod · 16mo ago
aurelium

Is the vLLM worker updated for LLaMA3.1 yet?

If not, is anyone aware of a good serverless container that does support it?
2 Replies
tim · 15mo ago
It is updated, you can use it!
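For anyone landing here later, a minimal sketch of pointing the serverless vLLM worker at a Llama 3.1 model. `MODEL_NAME` and `HF_TOKEN` are the environment variables the worker-vllm image reads; the endpoint ID, API key, and the exact model ID shown are placeholders/assumptions, so check them against your own account and the worker's README.

```shell
# Sketch, not an official guide: configure RunPod's worker-vllm for Llama 3.1.
# Set these as environment variables on the serverless endpoint's template:
#   MODEL_NAME  - Hugging Face model ID the worker should load
#   HF_TOKEN    - needed because the Llama 3.1 repos are gated
MODEL_NAME=meta-llama/Meta-Llama-3.1-8B-Instruct
HF_TOKEN=<your-huggingface-token>

# Then call the endpoint. YOUR_ENDPOINT_ID and RUNPOD_API_KEY are placeholders.
curl -s https://api.runpod.ai/v2/YOUR_ENDPOINT_ID/run \
  -H "Authorization: Bearer $RUNPOD_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"input": {"prompt": "Hello", "sampling_params": {"max_tokens": 32}}}'
```

The `/run` call is asynchronous and returns a job ID; poll `/status/<job_id>` (or use `/runsync` for small requests) to get the generation back.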