Runpod · 8mo ago
Yobin

Can you now run Gemma 3 in the vLLM container?

In serverless, it seems I'm getting an error. Any help on this?
14 Replies
Unknown User · 8mo ago (message not public)
Yobin (OP) · 8mo ago
I deleted it, but it seems that because Gemma 3 is a new model, the Transformers library is still too outdated for it, AFAIK?
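For context, one quick way to see whether the installed Transformers release recognizes the Gemma 3 architecture is a config-only load. This is a minimal sketch, not from the thread: "google/gemma-3-4b-it" is used as an example model ID, and the gated Gemma repos may require a Hugging Face login first.

```python
# Minimal check that the local transformers release knows Gemma 3.
# Assumption: "google/gemma-3-4b-it" is just an example model ID, and
# accessing the gated Gemma repo may require `huggingface-cli login`.
import transformers

print("transformers version:", transformers.__version__)

from transformers import AutoConfig

# On releases that predate Gemma 3, this raises a ValueError saying the
# "gemma3" model type is not recognized -- the usual symptom with
# brand-new models.
config = AutoConfig.from_pretrained("google/gemma-3-4b-it")
print("model type:", config.model_type)
```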
Unknown User · 8mo ago (message not public)
Yobin (OP) · 8mo ago
I used the preset vLLM; Llama 3.2b worked, but the new Gemma 3 didn't.
Dj · 8mo ago
vLLM needs to publish an update first, unfortunately. You can use vLLM directly from the main branch, but that's not super easy if you're using our vLLM template, IIRC.
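Running vLLM from the main branch and smoke-testing the model locally might look like the sketch below. This is an assumption-laden example, not the RunPod template: the model ID and sampling settings are placeholders, and building vLLM from source can take a while.

```python
# Sketch: smoke-test Gemma 3 with vLLM installed from the main branch
# (outside the serverless template), e.g. after:
#   pip install git+https://github.com/vllm-project/vllm.git
# Assumption: "google/gemma-3-4b-it" as an example model ID; the gated
# repo may need a Hugging Face token.
from vllm import LLM, SamplingParams

llm = LLM(model="google/gemma-3-4b-it")
params = SamplingParams(temperature=0.7, max_tokens=64)

outputs = llm.generate(["Say hello in one sentence."], params)
print(outputs[0].outputs[0].text)
```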
Unknown User · 8mo ago (message not public)
Bj9000 · 8mo ago
Looks like vLLM v0.8.0 added Gemma 3 support. Will the serverless vLLM be updated soon?
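Once a vLLM >= 0.8.0 server is up, it exposes the standard OpenAI-compatible route, so a request like the following should work. A sketch only: it assumes a local server started with `vllm serve google/gemma-3-4b-it` on the default port, with no API key configured.

```python
# Sketch: query a vLLM >= 0.8.0 OpenAI-compatible server, e.g. started with
#   vllm serve google/gemma-3-4b-it
# Assumptions: server on localhost:8000, default settings, no API key.
import requests

resp = requests.post(
    "http://localhost:8000/v1/chat/completions",
    json={
        "model": "google/gemma-3-4b-it",
        "messages": [{"role": "user", "content": "Hello, Gemma 3!"}],
        "max_tokens": 64,
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```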
Unknown User · 8mo ago (message not public)
Aizen · 7mo ago
Hi, I have the same issue. Have you resolved it? If so, please help me out with it too.
Yobin (OP) · 7mo ago
I used Ollama.
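The Ollama route mentioned here can be driven from Python via Ollama's local HTTP API. A minimal sketch, assuming the Ollama daemon is running on its default port and the model was pulled first with `ollama pull gemma3:4b`:

```python
# Sketch: Gemma 3 via Ollama's local HTTP API.
# Assumes the daemon is on its default port and the model was pulled
# beforehand with: ollama pull gemma3:4b
import requests

resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "gemma3:4b",
        "messages": [{"role": "user", "content": "Hello!"}],
        "stream": False,
    },
    timeout=300,
)
resp.raise_for_status()
print(resp.json()["message"]["content"])
```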
Aizen · 7mo ago
Okay
Unknown User · 7mo ago (message not public)
Javier · 7mo ago
I deployed an endpoint to try to call gemma3:4b, but nothing happens when I call it. Has anybody managed to get it working?
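With serverless endpoints, "nothing happening" is often just a cold start still queueing, so polling the job status makes the state visible. A hedged sketch using RunPod's documented `/run` and `/status` routes: the endpoint ID and API key are placeholders, and the `{"prompt": ...}` input shape is an assumption that depends on the worker image.

```python
# Sketch: call a RunPod serverless endpoint and watch the job status,
# since a silent call is often a cold start still in the queue.
# Assumptions: RUNPOD_ENDPOINT_ID / RUNPOD_API_KEY env vars, and a
# {"prompt": ...} input shape (the real schema depends on the worker).
import os
import time
import requests

ENDPOINT_ID = os.environ["RUNPOD_ENDPOINT_ID"]
API_KEY = os.environ["RUNPOD_API_KEY"]
headers = {"Authorization": f"Bearer {API_KEY}"}

# Submit the job asynchronously.
run = requests.post(
    f"https://api.runpod.ai/v2/{ENDPOINT_ID}/run",
    headers=headers,
    json={"input": {"prompt": "Hello from Gemma 3!"}},
    timeout=30,
).json()

# Poll so a long cold start shows up as IN_QUEUE instead of silence.
while True:
    status = requests.get(
        f"https://api.runpod.ai/v2/{ENDPOINT_ID}/status/{run['id']}",
        headers=headers,
        timeout=30,
    ).json()
    print("status:", status["status"])
    if status["status"] in ("COMPLETED", "FAILED"):
        break
    time.sleep(5)

print(status.get("output") or status.get("error"))
```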
Unknown User · 7mo ago (message not public)
