Lora modules with basic vLLM serverless - Runpod
Runpod • 2y ago • 2 replies
ArtyomKosakyan
Is it possible to use LoRA modules with the default vLLM endpoint? If not, how can it be done quickly?
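For reference, vLLM's own OpenAI-compatible server does support LoRA adapters via the `--enable-lora` and `--lora-modules` flags; whether Runpod's default vLLM serverless template exposes these is not confirmed in this thread, so one quick option is running vLLM with these flags in a custom worker image. A minimal sketch (model and adapter paths are placeholders, not from the thread):

```shell
# Launch vLLM's OpenAI-compatible server with a LoRA adapter registered
# under the name "my-lora" (base model and adapter path are hypothetical).
python -m vllm.entrypoints.openai.api_server \
    --model meta-llama/Llama-2-7b-hf \
    --enable-lora \
    --lora-modules my-lora=/path/to/lora_adapter

# Requests then select the adapter by passing its registered name
# as the "model" field of a standard OpenAI-style completion call:
curl http://localhost:8000/v1/completions \
    -H "Content-Type: application/json" \
    -d '{"model": "my-lora", "prompt": "Hello", "max_tokens": 32}'
```

Multiple `name=path` pairs can be passed to `--lora-modules`, and the base model remains queryable under its own name alongside the adapters.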