Serverless vllm - lora - Runpod
Runpod • 17mo ago • 11 replies
Sven
Is there a way to set the LoRA modules (for the vLLM docker container: --lora-modules lora_adapter1=abc/efg) in the Template, or do I need to use the "standard" vLLM container for it?
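For context, `--lora-modules` is a flag of vLLM's OpenAI-compatible server and takes `name=path` pairs; it also requires `--enable-lora`. A minimal sketch of the "standard" vLLM container invocation the question refers to (the model name and adapter path are placeholders, not real artifacts):

```shell
# Sketch: running vLLM's OpenAI-compatible server with a LoRA adapter attached.
# Placeholder model and adapter path; substitute your own.
docker run --gpus all -p 8000:8000 \
  vllm/vllm-openai:latest \
  --model meta-llama/Llama-2-7b-hf \
  --enable-lora \
  --lora-modules lora_adapter1=/path/to/adapter
```

Once the server is up, the adapter can be selected per request by passing its name (`lora_adapter1` here) as the `model` field of an OpenAI API request.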
Similar Threads
LoRA path in vLLM serverless template
Runpod / ⚡|serverless • 15mo ago
Lora modules with basic vLLM serverless
Runpod / ⚡|serverless • 2y ago
Can I use LoRA in vLLM serverless with OpenAI API?
Runpod / ⚡|serverless • 3mo ago
Serverless VLLM batching
Runpod / ⚡|serverless • 9mo ago