How to host Twenty CRM on a Debian VPS using Docker Compose and an external PostgreSQL
I am trying to deploy Twenty CRM on a Debian VPS using docker compose.
I have a few issues/questions:
* External PostgreSQL
* Backup
* Firewall issues
External PostgreSQL
I plan to use an external PostgreSQL server running on the docker host, for two reasons:
1. I plan to run other docker apps on this host, and many apps seem to require PostgreSQL, so I figured it is easier to manage and more resource efficient to run a single PostgreSQL instance.
2. I believe it is easier to do a central backup of a single PostgreSQL instance than having to figure out how to extract a pg_dump from a db running inside some docker container (and then having to set this up per application).
It is unclear to me what the correct way is to deploy twenty with an external postgres. I am following "Option 2: Manual steps" here: https://twenty.com/developers/section/self-hosting/docker-compose
This means that I have the docker-compose.yml as downloaded and an .env file that looks like this:
I can bring this up using:
```bash
docker compose up -d
```
However, it does not use an external postgres.
I have already installed the external postgres on the host. I have created a `twentycrm` database owned by the db user `twentycrm`.
So of course, I have tried setting the variables:
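Something like this (a sketch; password redacted):
```env
# .env additions (sketch): point twenty at the external postgres
PG_DATABASE_URL=postgres://twentycrm:<password>@x.x.x.x:5432
```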
Where x.x.x.x is my server's public IP.
(I have made sure the server is listening on '*' and that the user is allowed to authenticate in pg_hba.conf.)
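For reference, that means roughly this on the postgres side (a sketch; the client subnet is a guess for docker's default bridge):
```conf
# postgresql.conf
listen_addresses = '*'

# pg_hba.conf: allow the twentycrm user in over the network
# (guess: containers show up from docker's default bridge subnet)
host  twentycrm  twentycrm  172.17.0.0/16  scram-sha-256
```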
But alas, this does not work.
I think I also need to specify the actual database name somewhere?
I have also tried using the magic docker DNS name: host.docker.internal
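(I later read that on Linux this name only resolves inside a container if it is mapped explicitly. Would something like this in the compose file be needed? A sketch; the service name is an assumption:)
```yaml
services:
  server:   # assumption: twenty's main service name in docker-compose.yml
    extra_hosts:
      # maps host.docker.internal to the docker host's gateway IP on Linux
      - "host.docker.internal:host-gateway"
```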
Will follow up with the other two issues in comments below (running out of text limit).
Continuation of postgres issue:
When I eventually succeed in making twenty use my external postgresql server, I guess I will have to tell it not to start its own postgres container (and perhaps more needs to be disabled - I see mention of Patroni in the docker logs).
How do I disable/remove such a container?
Do I just edit the docker-compose.yml file directly? I suspect this is not the way to go, as I fear it will be overwritten the next time I update/upgrade twenty.
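(One idea: a docker-compose.override.yml that parks the bundled service in a profile that is never activated, so the base file stays untouched. A sketch; the service name is an assumption:)
```yaml
# docker-compose.override.yml (sketch): services with a profile are skipped
# unless that profile is enabled, so this disables the bundled postgres
services:
  db:   # assumption: the bundled postgres service name
    profiles: ["disabled"]
```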
Firewall issues
I have an external nginx running as ingress on the docker host.
While this works, with nginx as reverse proxy for twenty and Let's Encrypt certs set up using certbot, I notice that the direct docker port for twenty (tcp/3000) is reachable from the outside. This is not supposed to happen.
I have a firewall script that loads on bootup and sets the following rules (roughly the sketch after this list):
* INPUT policy DROP
* Allow already established TCP connections
* Allow loopback
* Allow ICMP Ping
* Allow whitelisted ssh connections
* Allow tcp port 80 and 443
* Allow a few udp ports
* Drop the rest
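In script form, roughly (placeholders for the whitelisted IP and udp ports):
```bash
#!/bin/sh
# boot-time firewall: sketch of the rules listed above
iptables -P INPUT DROP
iptables -A INPUT -m conntrack --ctstate ESTABLISHED,RELATED -j ACCEPT
iptables -A INPUT -i lo -j ACCEPT
iptables -A INPUT -p icmp --icmp-type echo-request -j ACCEPT
iptables -A INPUT -p tcp --dport 22 -s <whitelisted-ip> -j ACCEPT
iptables -A INPUT -p tcp -m multiport --dports 80,443 -j ACCEPT
iptables -A INPUT -p udp -m multiport --dports <udp-ports> -j ACCEPT
# everything else falls through to the DROP policy
```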
But then it seems docker has added a whole bunch of rules to the FORWARD and OUTPUT chains, as well as a few DOCKER-specific chains. I guess this is what causes the issue: as far as I understand, docker DNATs published ports before routing, so that traffic traverses FORWARD rather than INPUT, and my INPUT policy never sees it.
(full iptables -L -v -n output here: https://paste.yt/p26954.html )
How do I go about hardening twenty for a production run on the internet, so that requests not going through nginx/ingress are blocked?
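(Would a drop rule in the DOCKER-USER chain be the way to do it? A sketch; eth0 as the public interface is an assumption:)
```bash
# DOCKER-USER is evaluated before docker's own forwarding rules, so this
# blocks outside access to the published port; nginx still reaches the app
# via loopback (assumption: eth0 is the public interface)
iptables -I DOCKER-USER -i eth0 -p tcp --dport 3000 -j DROP
```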
Backup issues:
Am I right in thinking that the only things I need to back up from twenty are the postgresql database plus the docker-compose.yml and .env files?
Will I then be able to re-create everything on another host if need be?
If so, what are the steps to get everything up and running again on a different host while loading the backup?
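(Something like this is what I have in mind; names and paths are placeholders:)
```bash
# sketch: back up the external database plus the compose/config files
pg_dump -Fc -h 127.0.0.1 -U twentycrm twentycrm > twentycrm-$(date +%F).dump
tar czf twenty-config.tgz docker-compose.yml .env

# on the new host, after installing postgres and recreating the role and db:
#   pg_restore -h 127.0.0.1 -U twentycrm -d twentycrm twentycrm-YYYY-MM-DD.dump
#   docker compose up -d
```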
Tuning
Another thing I have been wondering:
The docs call for a VPS with at least 2 GB of RAM. To me, this seems like a huge amount. Is all of this really consumed by twenty? Or can it be tuned to consume less? I mean, there will be one, max two, concurrent users on this app.
@charles could you have a look? 🙏
Hi @EbenezerIbiza, Charles from the core team.
Answering the questions:
- PostgreSQL external hosting: it was not possible until very recently, as our postgres database required some custom extensions (that's why we were providing a container with these extensions installed). We have recently migrated our production to AWS RDS, so you are fine hosting your postgres separately
- Removing the postgres container: yes, remove the postgres container from the docker-compose.yml and docker compose up again, and you should be good to go
- Production network setup: my recommendation for a production environment is to have an nginx server in front of it, where SSL traffic terminates (handle the SSL certificate there too). Then you'll need to proxy your traffic from nginx:443 to your container on port 3000 (rough sketch below). That being said, twenty also supports end-to-end SSL encryption in case you want it
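A minimal sketch of that proxy block (domain and cert paths are placeholders):
```nginx
server {
    listen 443 ssl;
    server_name crm.example.com;   # placeholder domain
    ssl_certificate     /etc/letsencrypt/live/crm.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/crm.example.com/privkey.pem;

    location / {
        proxy_pass http://127.0.0.1:3000;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto $scheme;
        # keep websockets working
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```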
Hello @charles
Thank you for your response! And thank you so much for making this awesome software!
1: For using external postgresql, does this mean that I then only need to supply the postgres url in .env and the remaining postgres vars should not be set at all?
2: How to migrate from the internal postgres to the external one?
Would the following work?
First, upgrade twentycrm with the internal docker postgres; then do a pg_dump inside the db container.
Then load the pg_dump on the external postgres.
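(Concretely, something like this; the service, user, and db names are guesses:)
```bash
# sketch: dump from the db container, then load into the external postgres
docker compose exec db pg_dump -U postgres -Fc <internal-db-name> > twenty.dump
pg_restore -h 127.0.0.1 -U twentycrm -d twentycrm --no-owner twenty.dump
```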
Would this work?
Actually, I guess I might be missing a way to get the users/roles out of the internal postgres. How would I do that?
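(Maybe pg_dumpall can pull those out? A guess:)
```bash
# guess: dump only roles/tablespaces from the container's postgres
docker compose exec db pg_dumpall -U postgres --globals-only > globals.sql
```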
3: When this is all working, I just remove the db container from the docker-compose.yml?
I hope pg_dump is enough.

As it generates a full file including the schema and migrations, you should be fine with it.

@EbenezerIbiza you are correct on all your points
what's important is the PG_DATABASE_URL in the .env