vllm +openwebui
Runpod • 15mo ago • 43 replies
Devil_egox
Hi guys, has anyone used vLLM as an endpoint in OpenWebUI? I have created a serverless pod, but it won't let me connect from OpenWebUI (running locally). Does anyone know if I have to configure the external port, and how?
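
For context, OpenWebUI can talk to any OpenAI-compatible endpoint, and Runpod's serverless vLLM worker exposes one. Below is a minimal sketch for testing the connection from Python before wiring it into OpenWebUI; it assumes the endpoint runs Runpod's serverless vLLM worker (which serves OpenAI-compatible routes under /openai/v1), and ENDPOINT_ID, RUNPOD_API_KEY, and MODEL_NAME are placeholders, not real values:

```python
# Minimal sketch: verify a Runpod serverless vLLM endpoint responds
# before configuring it in OpenWebUI. Assumes the endpoint runs
# Runpod's serverless vLLM worker, which exposes OpenAI-compatible
# routes; ENDPOINT_ID, RUNPOD_API_KEY, and MODEL_NAME are placeholders.
from openai import OpenAI

client = OpenAI(
    # Serverless endpoints are reached through Runpod's API gateway,
    # not through a port exposed directly on the pod.
    base_url="https://api.runpod.ai/v2/ENDPOINT_ID/openai/v1",
    api_key="RUNPOD_API_KEY",  # a Runpod API key, not an OpenAI key
)

resp = client.chat.completions.create(
    model="MODEL_NAME",  # the model the vLLM worker was launched with
    messages=[{"role": "user", "content": "Hello"}],
)
print(resp.choices[0].message.content)
```

If this works, the same base URL and API key should go into OpenWebUI's OpenAI API connection settings; since requests are routed through the gateway, no external port should need to be opened on the pod.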
Similar Threads
qwen2.5 vllm openwebui
Runpod / ⚡|serverless • 16mo ago

vllm
Runpod / ⚡|serverless • 2y ago

GGUF vllm
Runpod / ⚡|serverless • 2y ago

Vllm docker
Runpod / ⚡|serverless • 2y ago