How to host Twenty CRM on a Debian VPS using Docker Compose and an external PostgreSQL
I am trying to deploy Twenty CRM on a Debian VPS using docker compose.
I have a few issues/questions:
* External PostgreSQL
* Backup
* Firewall issues
External PostgreSQL
I plan to use an external PostgreSQL server running on the docker host. For two reasons:
1. I plan to run other docker apps on this host, and many apps seem to require PostgreSQL, so I figured it is easier to manage and more resource efficient to run a single PostgreSQL instance.
2. I believe it is easier to do a central backup of the single PostgreSQL instance than having to figure out how to extract a pg_dump from a db running inside some docker container. (and then having to set this up per application)
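For reference, the kind of central backup I have in mind is a single nightly pg_dumpall of the host-level instance, e.g. from a cron entry like this (schedule and path are just placeholders; it runs as the postgres system user so peer auth works without a password):

```shell
# /etc/cron.d/postgres-backup -- nightly logical backup of ALL databases
# on the single host-level PostgreSQL instance (example schedule/path)
15 3 * * * postgres pg_dumpall --clean | gzip > /var/backups/postgres/all-$(date +\%F).sql.gz
```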
It is unclear to me what the correct way is to deploy Twenty with an external Postgres. I am following "Option 2: Manual steps" here: https://twenty.com/developers/section/self-hosting/docker-compose
This means that I have the docker-compose.yml as downloaded and a .env file that looks like this:
I can bring this up using
docker compose up -d
However, it does not use an external postgres.
I have already installed the external postgres on the host. I have created a twentycrm database owned by the db user twentycrm. So of course, I have tried setting the variables:
Where x.x.x.x is my server's public IP.
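In case it is relevant: my understanding is that a standard PostgreSQL connection URL carries the database name as the path component, so the values I am trying to express have this shape (user, password, and host below are placeholders matching my setup):

```shell
# libpq-style connection URI (placeholders: SECRET, x.x.x.x)
# postgres://USER:PASSWORD@HOST:PORT/DBNAME
postgres://twentycrm:SECRET@x.x.x.x:5432/twentycrm
```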
(I have made sure the server is listening on '*' and that the user is allowed to authenticate in pg_hba.conf.)
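For completeness, this is the sort of change I mean (the CIDR is illustrative; Docker's default bridge is usually 172.17.0.0/16, but yours may differ):

```shell
# postgresql.conf
listen_addresses = '*'

# pg_hba.conf -- allow the twentycrm user to reach the twentycrm database
# from Docker's bridge network, using password (SCRAM) auth
# TYPE  DATABASE    USER        ADDRESS          METHOD
host    twentycrm   twentycrm   172.17.0.0/16    scram-sha-256
```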
But alas, this does not work.
I think I also need to specify the actual database name somewhere?
I have also tried using the magic Docker DNS name: host.docker.internal
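As I understand it, host.docker.internal does not resolve on Linux by default; you have to map it yourself in the compose file via the host-gateway special value, along these lines (the service name "server" is my assumption about the downloaded docker-compose.yml):

```shell
# docker-compose.yml fragment (YAML)
# services:
#   server:                # service name assumed; use whatever the file defines
#     extra_hosts:
#       - "host.docker.internal:host-gateway"
```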
Will follow up with the other two issues in comments below (running out of text limit)