© 2026 Hedgehog Software, LLC
vllm + openwebui - Runpod
Runpod • 16mo ago • 43 replies
Devil_egox
vllm + openwebui
Hi guys, has anyone used vLLM as an endpoint in OpenWebUI? I created a serverless pod, but it won't let me connect from OpenWebUI (running locally). Does anyone know whether I need to configure the external port, and if so, how?
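For anyone hitting the same wall: Runpod serverless vLLM workers expose an OpenAI-compatible API over HTTPS, so OpenWebUI can usually be pointed at the endpoint as an OpenAI connection rather than by opening an external port. A minimal sketch, assuming the `https://api.runpod.ai/v2/<endpoint_id>/openai/v1` URL scheme from Runpod's serverless docs; the endpoint ID and API key below are placeholders, not real values:

```python
# Sketch: build the OpenAI-compatible base URL for a Runpod serverless
# vLLM endpoint and print the two values OpenWebUI asks for.
# ENDPOINT_ID and RUNPOD_API_KEY are hypothetical placeholders --
# substitute your own from the Runpod console.
ENDPOINT_ID = "your_endpoint_id"
RUNPOD_API_KEY = "your_runpod_api_key"

# Runpod serverless vLLM workers serve an OpenAI-compatible API under
# /openai/v1 (assumption based on Runpod's serverless URL scheme).
base_url = f"https://api.runpod.ai/v2/{ENDPOINT_ID}/openai/v1"

# In OpenWebUI, add an OpenAI API connection (Settings -> Connections)
# with these values. No external port configuration should be needed,
# since the serverless endpoint is reached over HTTPS, not a raw port.
print("API Base URL:", base_url)
print("API Key:", RUNPOD_API_KEY)
```

If you run OpenWebUI via Docker, the same values can reportedly be passed as the `OPENAI_API_BASE_URL` and `OPENAI_API_KEY` environment variables instead of through the settings UI.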
Similar Threads
qwen2.5 vllm openwebui • Runpod / ⚡|serverless • 17mo ago
vllm • Runpod / ⚡|serverless • 2y ago
GGUF vllm • Runpod / ⚡|serverless • 2y ago
Vllm docker • Runpod / ⚡|serverless • 2y ago