OpenAI-compatible endpoint for custom serverless Docker image
How can I get an OpenAI-compatible endpoint for my custom Docker image in RunPod serverless?
I am trying to create a llama.cpp Docker image.
When a request comes in to your endpoint's OpenAI-compatible URL, the handler receives the request path (e.g. `/abc`) in the job input; you would use this to tell the handler to do the logic for `/v1/chat/completions`, `/v1/models`, etc. If there's no `stream: true` in your OpenAI request, then you just return the OpenAI completions/chat completions/etc. object as a dict in the output. If `stream: true` is set, then this will be an SSE stream, for which you would yield your output, but instead of yielding the dict directly, you would put it in the SSE stream chunk string format, which is something like `f"data: {your json output as string}\n\n"`.
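Here's a minimal sketch of such a handler, assuming the platform delivers the request path as `job["input"]["openai_route"]` and the parsed request body as `job["input"]["openai_input"]` (the keys RunPod's vLLM worker uses), and assuming a GGUF model baked into the image at `/models/model.gguf`; the `"my-model"` id is just a placeholder:

```python
# Minimal sketch of a RunPod serverless handler serving an
# OpenAI-compatible API on top of llama.cpp (via llama-cpp-python).
# ASSUMPTIONS: the route arrives as job["input"]["openai_route"] and
# the OpenAI request body as job["input"]["openai_input"]; the model
# path and "my-model" id are placeholders for your image.
import json

import runpod
from llama_cpp import Llama

llm = Llama(model_path="/models/model.gguf")  # model baked into the image

def handler(job):
    route = job["input"].get("openai_route", "")
    body = job["input"].get("openai_input", {}) or {}

    if route == "/v1/models":
        # Non-streaming route: yield one OpenAI-shaped dict as the output.
        yield {"object": "list", "data": [{"id": "my-model", "object": "model"}]}
    elif route == "/v1/chat/completions":
        if body.get("stream"):
            # stream: true -> SSE: wrap each chunk dict in the
            # "data: {json}\n\n" chunk string format, then terminate
            # with the standard OpenAI "[DONE]" sentinel.
            for chunk in llm.create_chat_completion(
                messages=body["messages"], stream=True
            ):
                yield f"data: {json.dumps(chunk)}\n\n"
            yield "data: [DONE]\n\n"
        else:
            # No stream: true -> return the completion object as a dict;
            # create_chat_completion already returns an OpenAI-shaped dict.
            yield llm.create_chat_completion(messages=body["messages"])
    else:
        yield {"error": f"unsupported route: {route}"}

# return_aggregate_stream lets non-streaming clients still collect
# the generator's yields as the job output.
runpod.serverless.start({"handler": handler, "return_aggregate_stream": True})
```

With something like this deployed, an OpenAI client should be able to point its `base_url` at `https://api.runpod.ai/v2/<endpoint_id>/openai/v1` and call chat completions as usual, with or without `stream: true`.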