Can I use LoRA in vLLM serverless with OpenAI API?

I need both LoRA and Structured Outputs, but it seems like LoRA is only supported by the Runpod API, while Structured Outputs are only (poorly) supported by the OpenAI API? 😐
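For concreteness, here is roughly the request I'd want to send through the OpenAI-compatible endpoint — selecting the LoRA adapter by passing its served name as `model`, and asking for structured output via `response_format`. The adapter name and everything else here are placeholders, not values from any working setup:

```python
import json

# Hypothetical OpenAI-style chat completion payload. "my-lora-adapter"
# stands in for the served name of a LoRA adapter; whether the serverless
# endpoint actually routes this to the adapter is exactly my question.
payload = {
    "model": "my-lora-adapter",  # placeholder LoRA adapter name
    "messages": [
        {"role": "user", "content": "Summarize this as JSON."}
    ],
    # OpenAI-style structured output request
    "response_format": {"type": "json_object"},
}

print(json.dumps(payload, indent=2))
```

If both features worked through the same endpoint, a single payload like this would be all I need.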