VLLM repo not available in serverless - Runpod
Runpod • 3mo ago • 11 replies
volodymyrhotsiy
Hello! I am not able to select vLLM as a repo for serverless. It was there yesterday and the day before, though it did not work, probably because of the outage. Right now it is not listed in the available repos at all. How can I fix this?
Similar Threads
GGUF in serverless vLLM • Runpod / ⚡|serverless • 2y ago
Serverless VLLM batching • Runpod / ⚡|serverless • 9mo ago
Serverless vllm - lora • Runpod / ⚡|serverless • 17mo ago
vLLM Serverless error • Runpod / ⚡|serverless • 2y ago