LoRA modules with basic vLLM serverless - Runpod
Runpod • 2y ago • 2 replies
ArtyomKosakyan
LoRA modules with basic vLLM serverless
Is it possible to use LoRA modules with the default vLLM endpoint? If not, how can it be done quickly?
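For context, vLLM itself does support serving LoRA adapters alongside a base model. A minimal sketch of launching its OpenAI-compatible server with an adapter attached is below; the base model name, adapter name, and adapter path are placeholders, and the Runpod serverless vLLM worker may expose the equivalent settings through environment variables rather than CLI flags:

```shell
# Sketch: vLLM OpenAI-compatible server with LoRA enabled.
# "meta-llama/Llama-2-7b-hf", "my_adapter", and the adapter path
# are placeholder values, not a Runpod-specific configuration.
vllm serve meta-llama/Llama-2-7b-hf \
  --enable-lora \
  --lora-modules my_adapter=/path/to/lora_adapter

# Requests can then target the adapter by name, e.g.:
# curl http://localhost:8000/v1/completions \
#   -H "Content-Type: application/json" \
#   -d '{"model": "my_adapter", "prompt": "Hello", "max_tokens": 16}'
```

Whether the stock Runpod vLLM endpoint passes these options through is exactly what the question is asking; if it does not, a custom worker image that launches vLLM with the flags above is one route.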
Similar Threads
Serverless vllm - lora • Runpod / ⚡|serverless • 17mo ago
LoRA path in vLLM serverless template • Runpod / ⚡|serverless • 15mo ago
Can I use LoRA in vLLM serverless with OpenAI API? • Runpod / ⚡|serverless • 3mo ago
Serverless VLLM batching • Runpod / ⚡|serverless • 9mo ago