Multiple models in a single serverless endpoint?
Runpod • 12mo ago • 2 replies

Weej
Hi everyone, I was wondering if this is possible, since the environment variable seems to suggest that it's something that's supported, as does the fact that you have to mention the model name when posting a request.
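For context, a minimal sketch of the kind of request being described, assuming the endpoint runs Runpod's OpenAI-compatible vLLM worker; the endpoint ID, API key, and model name below are placeholders, and whether one worker can actually serve several models is exactly the open question here.

```python
import os
import requests

# Placeholders -- substitute your own endpoint ID and API key.
ENDPOINT_ID = "your_endpoint_id"
API_KEY = os.environ["RUNPOD_API_KEY"]

# OpenAI-compatible route exposed by Runpod's vLLM serverless worker
# (assumption: the endpoint was created from that worker image).
url = f"https://api.runpod.ai/v2/{ENDPOINT_ID}/openai/v1/chat/completions"

payload = {
    # The request schema requires a model name, which is what suggests
    # multi-model support -- in practice the worker is typically started
    # with a single MODEL_NAME environment variable.
    "model": "mistralai/Mistral-7B-Instruct-v0.2",
    "messages": [{"role": "user", "content": "Hello!"}],
}

response = requests.post(
    url,
    json=payload,
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=120,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```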