Can I use LoRA in vLLM serverless with OpenAI API? - Runpod
Runpod • 3mo ago • 20 replies
deseculavalutent
Can I use LoRA in vLLM serverless with OpenAI API?
I need both LoRA and Structured Outputs, but it seems like LoRA is only supported by the Runpod API, while Structured Outputs are only (poorly) supported by the OpenAI API?
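For reference, vLLM's OpenAI-compatible server can route a request to a LoRA adapter by passing the adapter's registered name in the model field, and it accepts structured-output constraints through its guided_json extension via extra_body. Below is a minimal sketch assuming a Runpod serverless vLLM worker launched with LoRA enabled and an adapter registered under the placeholder name my-lora-adapter; the endpoint ID and API key are placeholders too.

```python
# Minimal sketch: LoRA + structured output over the OpenAI-compatible route.
# Assumes the vLLM worker was started with LoRA enabled and the adapter
# registered (adapter name, endpoint ID, and API key below are placeholders).
from openai import OpenAI

client = OpenAI(
    base_url="https://api.runpod.ai/v2/<ENDPOINT_ID>/openai/v1",  # Runpod's OpenAI-compatible route
    api_key="<RUNPOD_API_KEY>",
)

# JSON schema the completion must conform to (vLLM guided decoding).
schema = {
    "type": "object",
    "properties": {"answer": {"type": "string"}},
    "required": ["answer"],
}

resp = client.chat.completions.create(
    model="my-lora-adapter",  # registered LoRA adapter name (placeholder)
    messages=[{"role": "user", "content": "Summarize LoRA in one sentence."}],
    extra_body={"guided_json": schema},  # vLLM extension for structured output
)
print(resp.choices[0].message.content)
```

Whether this works on a given serverless deployment depends on the worker image exposing vLLM's LoRA options; if the image only wires LoRA through Runpod's native handler, the OpenAI-compatible route won't see the adapter, which appears to be the tension the question describes.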
Similar Threads
Serverless vllm - lora • Runpod / ⚡|serverless • 17mo ago
Lora modules with basic vLLM serverless • Runpod / ⚡|serverless • 2y ago
LoRA path in vLLM serverless template • Runpod / ⚡|serverless • 15mo ago
Custom vLLM OpenAI compatible API • Runpod / ⚡|serverless • 15mo ago