Can you now run gemma 3 in the vllm container?
In serverless, it seems I'm getting an error. Any help on this?
14 Replies
I deleted it, but it seems that because Gemma 3 is a new model, the Transformers version is relatively outdated, AFAIK?
I used the vLLM preset; Llama 3.2 worked, but the new Gemma 3 didn't.
vLLM needs to publish an update first, unfortunately.
You can use vLLM directly from the main branch, but that's not super easy if you're using our vLLM template, IIRC.
Looks like vLLM v0.8.0 added Gemma 3 support. Will the serverless vLLM be updated soon?
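Before deploying, you can sanity-check that the installed vLLM is new enough for Gemma 3. A minimal sketch, assuming the v0.8.0 threshold mentioned above is correct (the helper names here are illustrative, not part of any library):

```python
from importlib.metadata import PackageNotFoundError, version


def supports_gemma3(vllm_version: str) -> bool:
    """True if this vLLM version is >= 0.8.0, reportedly the first
    release with Gemma 3 support (per the message above)."""
    # Compare numerically, not lexically, so "0.10.0" sorts after "0.8.0".
    parts = []
    for piece in vllm_version.lstrip("v").split(".")[:3]:
        digits = "".join(ch for ch in piece if ch.isdigit())
        parts.append(int(digits) if digits else 0)
    return tuple(parts) >= (0, 8, 0)


def installed_vllm_supports_gemma3() -> bool:
    # Reads the installed vLLM version, if any, without importing vllm.
    try:
        return supports_gemma3(version("vllm"))
    except PackageNotFoundError:
        return False
```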
Hi, I have the same issue. Have you resolved it? If so, please help me out with it too.
I used Ollama
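For anyone taking the Ollama route instead: Ollama serves a local HTTP API on port 11434, and a generate call for `gemma3:4b` looks roughly like this. A sketch assuming a default local install; the model must already be pulled with `ollama pull gemma3:4b`:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # default local Ollama port


def build_generate_payload(prompt: str, model: str = "gemma3:4b") -> bytes:
    # Ollama's /api/generate takes a model tag and a prompt;
    # stream=False returns a single JSON object instead of a chunk stream.
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()


def ask_ollama(prompt: str, model: str = "gemma3:4b") -> str:
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_generate_payload(prompt, model),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```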
Okay
I deployed an endpoint to try to call gemma3:4b, but nothing happens when I call it. Has anybody managed?
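When a serverless endpoint returns nothing, it's worth checking that the request matches what the worker expects: RunPod serverless jobs are wrapped in an `{"input": ...}` envelope, and a synchronous call goes to `/runsync`. A rough sketch; the endpoint ID and API key are placeholders, and the `"prompt"` field inside `input` is an assumption, since the inner schema depends entirely on your worker's handler:

```python
import json
import urllib.request


def build_runsync_request(
    endpoint_id: str, api_key: str, prompt: str
) -> urllib.request.Request:
    # RunPod serverless wraps job arguments in an {"input": ...} envelope.
    # The fields inside "input" are whatever YOUR handler reads; "prompt"
    # here is only an illustrative assumption.
    url = f"https://api.runpod.ai/v2/{endpoint_id}/runsync"
    body = json.dumps({"input": {"prompt": prompt}}).encode()
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
```

If `/runsync` times out for long generations, the asynchronous `/run` plus `/status/{job_id}` pattern is the usual alternative.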