Runpod · 13mo ago
will_t

New vllm Serverless interface issue

Hi guys, I logged in early to run my vllm-worker, which had been working perfectly before, but I noticed that the serverless interface has changed. There's no OpenAI-compatible URL anymore, and my code was also hitting internal server errors. I'd appreciate it if you could share fixes for this issue. I'm not sure whether this page has been updated for the new interface: https://docs.runpod.io/serverless/workers/vllm/openai-compatibility
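For reference, here is the request shape my code was using against the worker's OpenAI-compatible route, per the linked docs. This is a minimal stdlib-only sketch; the endpoint ID and API key are placeholders, and `build_chat_request` is just a helper name I made up for illustration:

```python
import json
from urllib import request

ENDPOINT_ID = "abc123"            # placeholder: your serverless endpoint ID
API_KEY = "YOUR_RUNPOD_API_KEY"   # placeholder: your Runpod API key

# OpenAI-compatible base URL described in the docs page linked above
BASE_URL = f"https://api.runpod.ai/v2/{ENDPOINT_ID}/openai/v1"

def build_chat_request(model, messages):
    """Build (but don't send) a chat-completions request for the worker."""
    body = json.dumps({"model": model, "messages": messages}).encode()
    return request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("my-model", [{"role": "user", "content": "hi"}])
# send with: request.urlopen(req)
```

If the new interface changed the route, requests built like this would start returning server errors even though nothing changed client-side.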