Runpod • 13mo ago • Devil_egox
vllm + openwebui
Hi guys, has anyone used vLLM as an endpoint in OpenWebUI? I have created a serverless pod, but it does not let me connect from OpenWebUI (loaded locally). Does anyone know if I have to configure the external port, and how?
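No answer was posted in the thread, but as a sketch: assuming the serverless endpoint was created with RunPod's vLLM worker (which exposes an OpenAI-compatible route at `https://api.runpod.ai/v2/<ENDPOINT_ID>/openai/v1`), Open WebUI can usually be pointed at it through its OpenAI-API environment variables rather than by opening an external port. `YOUR_ENDPOINT_ID` and `YOUR_RUNPOD_API_KEY` below are placeholders:

```shell
# Hedged sketch: run Open WebUI locally and point it at a RunPod
# serverless vLLM endpoint via the OpenAI-compatible route.
# Replace YOUR_ENDPOINT_ID and YOUR_RUNPOD_API_KEY with your own values.
docker run -d -p 3000:8080 \
  -e OPENAI_API_BASE_URL="https://api.runpod.ai/v2/YOUR_ENDPOINT_ID/openai/v1" \
  -e OPENAI_API_KEY="YOUR_RUNPOD_API_KEY" \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

The same values can also be entered in the Open WebUI admin settings under Connections (OpenAI API base URL and key) instead of via environment variables.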