Runpod · 13mo ago
vlad000ss

Custom vLLM OpenAI compatible API

Hello,
I'm running an OpenAI-compatible server using vLLM.
For a Runpod SERVERLESS endpoint you cannot choose which path POST requests are routed to; it's /run or /runsync by default. My question is: how do I either change the Runpod configuration of this endpoint to /v1 (the OpenAI endpoint), or how do I run the vLLM Docker image so that it is compatible with Runpod?
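For context, Runpod serverless endpoints accept POSTs only at /run and /runsync, and the request body must be wrapped in an `"input"` envelope rather than sent as a bare OpenAI-style payload. A minimal sketch of that wrapping (the model name is a placeholder, and the exact fields the worker expects inside `"input"` depend on the worker image):

```python
import json

def wrap_openai_request(openai_payload: dict) -> dict:
    """Wrap an OpenAI chat-completions body in Runpod's input envelope,
    so it can be POSTed to /runsync instead of /v1/chat/completions."""
    return {"input": openai_payload}

# A typical OpenAI-style chat request body (model name is hypothetical).
openai_payload = {
    "model": "my-model",
    "messages": [{"role": "user", "content": "Hello"}],
}

runpod_body = wrap_openai_request(openai_payload)
print(json.dumps(runpod_body))
```

The resulting JSON would then be sent to `https://api.runpod.ai/v2/<endpoint_id>/runsync` with the endpoint's API key in the Authorization header; `<endpoint_id>` is your own endpoint's ID.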