vLLM + OpenWebUI
Hi guys, has anyone used vLLM as an endpoint in OpenWebUI? I have created a serverless pod, but it doesn't let me connect from OpenWebUI (running locally). Does anyone know if I have to configure the external port, and how that would look?
27 Replies
Unknown User•11mo ago
Message Not Public
Sign In & Join Server To View
It's because, for data confidentiality reasons, I want to use my own endpoint. I assumed that vLLM uses the same configuration as the OpenAI API, which is why I chose this option on Runpod.
@DEVIL_EGOX did you ever get this working?
@Ryan Not yet🥲
Dang, it's something I really want to be able to do too
Did you ever get it working?
like this yeah?
I haven't been able to get it to connect

right... I guess I left out the last part: https://api.runpod.ai/v2/{RUNPOD_ENDPOINT_ID}/openai/v1
I got it working.
The only problem is that every time I reload or change pages in my OpenWebUI site, it spins up a worker, because the endpoint gets triggered when OpenWebUI looks for available models.
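For context, the worker wakes on every page load because OpenWebUI queries the OpenAI-compatible `/models` route to discover available models, and on a serverless endpoint any request spins up a worker. A minimal sketch of the URL involved, using a placeholder endpoint ID (an assumption, not a real ID):

```python
# Sketch: build the OpenAI-compatible base URL for a RunPod serverless
# endpoint. "YOUR_ENDPOINT_ID" is a placeholder, not a real endpoint.
def runpod_openai_base_url(endpoint_id: str) -> str:
    return f"https://api.runpod.ai/v2/{endpoint_id}/openai/v1"

# OpenWebUI effectively calls GET <base_url>/models on page load,
# which is enough to wake a serverless worker:
models_url = runpod_openai_base_url("YOUR_ENDPOINT_ID") + "/models"
print(models_url)
# → https://api.runpod.ai/v2/YOUR_ENDPOINT_ID/openai/v1/models
```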
Actually, it seems like it's not a big issue; it's in the running status for only milliseconds.
Actually, it may be an issue when the GPU I'm trying to use is unavailable: if OpenWebUI doesn't get a response, the site won't load for about a minute, until the request times out.
Guys, I am facing an issue while using RunPod.
RUNPOD_CHATBOT_URL = "https://api.runpod.ai/v2/vllm-runpod-endpoint-id/openai/v1"
Should the vllm- prefix be hard-coded, since the endpoint ID does not include it anymore?
from openai import OpenAI

client = OpenAI(
    api_key=RUNPOD_API_KEY,       # your RunPod API key
    base_url=RUNPOD_CHATBOT_URL,  # the URL defined above
)
response = client.chat.completions.create(
    model=model_name,
    messages=[{"role": "user", "content": "What is the capital of Germany"}],
    temperature=0,
    top_p=0.8,
    max_tokens=2000,
)
err
@Aung Nanda Oo your connection URL in OpenWebUI should be set to this:
https://api.runpod.ai/v2/YourServerlessEndpointIDhere/openai/v1
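Before wiring that URL into OpenWebUI, it can help to verify that the endpoint responds at all, which also sidesteps the minute-long page hang mentioned above when the GPU is unavailable. A standard-library-only sketch, with placeholder credentials (both the endpoint ID and the API key are assumptions you must replace):

```python
# Sketch of a pre-flight check for the connection URL, using only the
# Python standard library. The endpoint ID and API key are placeholders.
import urllib.request
import urllib.error

def auth_headers(api_key: str) -> dict:
    """Bearer-token header that the OpenAI-compatible route expects."""
    return {"Authorization": f"Bearer {api_key}"}

def check_endpoint(base_url: str, api_key: str, timeout: float = 10.0) -> bool:
    """True if GET {base_url}/models answers HTTP 200 within the timeout."""
    req = urllib.request.Request(f"{base_url}/models", headers=auth_headers(api_key))
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```

The short timeout is deliberate: OpenWebUI itself waits much longer when probing for models, which is why the page stalls when the endpoint is down.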
Thanks I got it!
Hi guys, again, I have tried to use the address as mentioned (https://api.runpod.ai/v2/a2auhmx8h7iu3x/openai/v1/) but I still can't connect. Help me, please 🥲
@nerdylive Any suggestions, please


Use this configuration in the endpoint

@nerdylive Maybe I am misconfiguring the endpoint.
If I don't put the API key, should I declare it in the environment variables configuration? Would it be something like this: API_KEY = XXXXXX?
😬
thank you very much
I solved it. It was only the API key that was missing.