Can I use LoRA in vLLM serverless with OpenAI API? - Runpod
Runpod · 5mo ago · 20 replies
deseculavalutent
Can I use LoRA in vLLM serverless with the OpenAI API?
I need both LoRA and Structured Outputs, but it seems like LoRA is only supported by the Runpod API, while Structured Outputs are only (poorly) supported by the OpenAI API. Is that right?
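In principle the two can be combined through vLLM's OpenAI-compatible route: the LoRA adapter is selected by passing its registered name in the `model` field, and structured output is requested via vLLM's `guided_json` extension in `extra_body`. A minimal sketch of such a request, assuming a Runpod serverless vLLM endpoint with a LoRA adapter already registered; the endpoint ID, API key, adapter name, and schema below are hypothetical placeholders:

```python
# Request construction for a Runpod serverless vLLM endpoint (OpenAI-compatible route).
# The endpoint ID, API key, and adapter name are hypothetical placeholders.

# JSON Schema to constrain generation (vLLM guided decoding).
person_schema = {
    "type": "object",
    "properties": {"name": {"type": "string"}, "age": {"type": "integer"}},
    "required": ["name", "age"],
}

request_kwargs = dict(
    model="my-lora-adapter",  # LoRA adapter, selected via the `model` field
    messages=[{"role": "user", "content": "Describe a person as JSON."}],
    extra_body={"guided_json": person_schema},  # vLLM extension for structured output
)

# With the `openai` client installed, the call would look like:
#   from openai import OpenAI
#   client = OpenAI(
#       base_url="https://api.runpod.ai/v2/<ENDPOINT_ID>/openai/v1",
#       api_key="<RUNPOD_API_KEY>",
#   )
#   resp = client.chat.completions.create(**request_kwargs)
#   print(resp.choices[0].message.content)
```

Whether this works end to end depends on the Runpod vLLM worker exposing both the LoRA module registration and the guided-decoding extra parameters on its OpenAI route, which is exactly what the question is asking about.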