Swarm monitoring

I have a Dokploy Swarm setup with 3 nodes. I started a stack for testing purposes and constrained it to node 2, which works great, but I cannot monitor it through the Dokploy GUI. I can see the logs when I click the swarm/native toggle, though. Am I missing some setup step?
(image attachments)
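For reference, pinning a stack to a single node is normally done with a Swarm placement constraint in the compose file. This is only a sketch; the service, image, and node hostname are placeholders, not the OP's actual setup:

```yaml
# Sketch of a test stack pinned to one node; names are placeholders.
# Deploy with: docker stack deploy -c compose.yml test
version: "3.8"
services:
  web:
    image: nginx:alpine
    deploy:
      replicas: 1
      placement:
        constraints:
          - node.hostname == node2   # or match a node label instead
```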
11 Replies
Henrik
Henrik3mo ago
I don't think that is supported, but out of curiosity, do you run 3 manager nodes?
deck0405
deck0405OP3mo ago
1 manager and 2 workers
deck0405
deck0405OP3mo ago
(image attachment)
deck0405
deck0405OP3mo ago
Should I be running 3 managers right now? I plan on expanding the cluster in the future to 4, then to 6 nodes.
Henrik
Henrik3mo ago
I wouldn't scale to three managers right now, since Dokploy's Postgres database isn't constrained to only live on the first node. That means, in case of a crash on the first manager, another manager would take over Dokploy but wouldn't have the data migrated over.
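If the goal is to keep the database on the node that actually holds its data, one hedged option is to add a placement constraint to the existing service. The service name dokploy-postgres and the hostname manager1 below are assumptions, so verify them first:

```bash
# Confirm the actual service and node names first; the ones below are assumptions.
docker service ls
docker node ls

# Pin the database service to the node that currently holds its data volume.
docker service update --constraint-add 'node.hostname == manager1' dokploy-postgres
```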
deck0405
deck0405OP3mo ago
Ah, good to know, thank you. Is there any way of monitoring the worker nodes through Dokploy? Do you by chance know what happens if I also add them as "Remote Servers"?
Henrik
Henrik3mo ago
I'm waiting for someone who has tried it to explain how they deal with that problem here. That will not be part of the swarm; think of it as two separate systems you're able to control from one UI.
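Until there is GUI support for this, the state of tasks on worker nodes can still be checked from the manager with the standard Swarm CLI; the stack and service names below are placeholders:

```bash
# Run these on the manager node; <stack> and <service> are placeholders.
docker node ls                        # node status and availability
docker stack ps <stack>               # where each task of the stack is scheduled
docker service logs <stack>_<service> # logs for a service, regardless of node
```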
deck0405
deck0405OP3mo ago
Got it, thanks. In regard to the 3 managers: I think I would attempt to use some sort of network or distributed storage, but I am not sure whether conflicting writes could become an issue...
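As a rough sketch of that idea (not Dokploy-specific, with placeholder addresses and names), the Postgres volume could be backed by NFS so any manager can mount the same data:

```yaml
# Sketch only: a Postgres service whose data lives on an NFS export,
# so any manager that schedules the task can mount the same directory.
# The NFS server address and export path are placeholders.
version: "3.8"
services:
  postgres:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: change-me
    volumes:
      - pg-data:/var/lib/postgresql/data
    deploy:
      replicas: 1            # only one writer at a time
      placement:
        constraints:
          - node.role == manager

volumes:
  pg-data:
    driver: local
    driver_opts:
      type: nfs
      o: "addr=10.0.0.10,rw,nfsvers=4"
      device: ":/exports/pg-data"
```

Conflicting writes should not be an issue as long as the service keeps replicas: 1, since only one Postgres task would touch the export at a time; shared storage alone does not make concurrent writers safe.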
Henrik
Henrik3mo ago
It could work, since it's constrained such that there can only exist one Postgres DB for Dokploy. Maybe I'll try out that route myself.
deck0405
deck0405OP3mo ago
Good luck!
Henrik
Henrik3mo ago
Thanks. Currently I am running one manager and a lot of worker nodes, but that is not a sustainable way of running a cluster.
