serverless deployment
Runpod • 2y ago • 5 replies
Data_Warrior
I want to deploy my LLM on a serverless endpoint. How can I do that?
Solution
https://github.com/runpod-workers/worker-vllm
GitHub - runpod-workers/worker-vllm: The RunPod worker template for serving our large language model endpoints. Powered by vLLM.
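Once a worker-vllm image is deployed as a RunPod serverless endpoint, you query it over RunPod's HTTP API (`/run` for async jobs, `/runsync` for blocking calls). A minimal sketch of building such a request, assuming a hypothetical endpoint ID and API key; the request body's `input` fields (`prompt`, `sampling_params`) follow the schema described in the worker-vllm README, so check the repo for the exact parameters your version supports:

```python
import json
import os

# Hypothetical placeholders -- substitute your own endpoint ID and API key.
ENDPOINT_ID = os.environ.get("RUNPOD_ENDPOINT_ID", "your-endpoint-id")
API_KEY = os.environ.get("RUNPOD_API_KEY", "your-api-key")

# RunPod serverless endpoints expose /run (async) and /runsync (blocking).
url = f"https://api.runpod.ai/v2/{ENDPOINT_ID}/runsync"

# worker-vllm expects the request body under an "input" key; the exact
# fields (prompt, sampling_params, ...) are defined in the repo README.
payload = {
    "input": {
        "prompt": "Explain serverless inference in one sentence.",
        "sampling_params": {"max_tokens": 64, "temperature": 0.7},
    }
}

headers = {
    "Authorization": f"Bearer {API_KEY}",
    "Content-Type": "application/json",
}

body = json.dumps(payload)
print(url)

# To actually send the request (requires valid credentials):
# import urllib.request
# req = urllib.request.Request(url, data=body.encode(), headers=headers)
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp))
```

The send itself is left commented out so the sketch runs without credentials; swap in the `urllib.request` block (or `requests`, if installed) once the endpoint exists.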