How can I deploy Mixtral using Ollama as a service?
Runpod • 2y ago • 3 replies
Asked by šš»š²š®š·ššµš
Hi everyone!
I want to deploy the Mixtral 8x7B model using Ollama on Runpod, but I can't install it as a service using the Runpod desktop template. Please help!
Solution (answered in #general):
Run the install script, then ollama serve in one terminal, and ollama run [model name] in a new terminal.
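On a Runpod GPU pod that sequence looks roughly like the following. This is a sketch, not a verified recipe: it assumes curl is available in the pod, uses the official Ollama install script, and picks the mixtral:8x7b tag from the Ollama model library purely as an illustration; substitute whatever model name you actually want.

    # Install Ollama (official install script)
    curl -fsSL https://ollama.com/install.sh | sh

    # Terminal 1: start the Ollama server in the foreground
    ollama serve

    # Terminal 2: pull and run Mixtral 8x7B (illustrative tag; use your own model name)
    ollama run mixtral:8x7b

Runpod pods are typically containers without systemd, which is likely why installing Ollama as a system service fails; running ollama serve in its own terminal (or under tmux or nohup so it survives a disconnect) plays the same role.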