Hi @Hioko, we recommend making a logical backup with pg_dump at this point, or using any other automated tool that supports Postgres dump backups.
@Quentin G do you have any other recommendation?
If you don't mind, can you elaborate with some steps or link me a tutorial/article for that? I'm new to databases & pg_dump in general, and I'm not a coder.
Asking like a 5-year-old: I created a GitHub repo that stores backups of my local markdown notes. Since they're mostly plain text, I can read them just by opening the repo (no need to restore anything). Is an equivalent kind of backup possible for a tabular database (e.g. Twenty/NocoDB tables)?
This link is quite complete on how to back up and restore a Postgres database: https://www.netguru.com/blog/how-to-dump-and-restore-postgresql-database
Using pg_dump will make a complete snapshot of your database. I would store it locally on my computer, on a dedicated server in the cloud, or in an S3 (or S3-like) bucket.
Pushing it to GitHub could work as long as the files are not too big, but if you are looking for something with incremental changes, that is another (and more complex) story.
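To make it concrete, here is roughly what that looks like when Postgres runs in Docker. This is only a sketch: the container name, user and database name below are placeholders, so adjust them to whatever your docker-compose.yml actually defines. The plain-text (-Fp) variant gives you a readable .sql file, which is the closest equivalent to your markdown repo.

```bash
# Sketch only: container name, user and database are placeholders;
# adjust them to whatever your docker-compose.yml actually defines.

# Compressed (custom-format) dump, best for restoring with pg_restore later
docker exec -t my-postgres-container pg_dump -U postgres -Fc mydb > backup_$(date +%F).dump

# Plain-text SQL dump: human readable and diff-friendly, so it can live in a git repo
docker exec -t my-postgres-container pg_dump -U postgres -Fp mydb > backup_$(date +%F).sql

# Restore the compressed dump into the same (or a fresh) database
docker exec -i my-postgres-container pg_restore -U postgres -d mydb --clean < backup_2024-01-01.dump
```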
thanks @charles
I'm sorry, I completely forgot about it 🙏
Thanks for giving a recommendation. Ideally you would want to run periodic backups.
With a quick lookup I found this repo, which has an S3 integration too: https://github.com/kartoza/docker-pg-backup
This could easily be integrated into the docker-compose setup.
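Roughly, it would be an extra service next to your existing db service. Treat this as a sketch: the image tag and environment variable names below are from memory and may not be exact, so check the kartoza/docker-pg-backup README for the real ones before using it.

```yaml
# Sketch only: service/host names, tag and env vars are assumptions;
# verify them against the kartoza/docker-pg-backup README.
services:
  dbbackups:
    image: kartoza/pg-backup:16      # pick the tag matching your Postgres major version
    volumes:
      - ./backups:/backups           # dumps end up in a local ./backups folder
    environment:
      - POSTGRES_HOST=db             # name of your existing Postgres service
      - POSTGRES_PORT=5432
      - POSTGRES_USER=postgres
      - POSTGRES_PASS=postgres
      - DUMPPREFIX=backup
      - "CRON_SCHEDULE=0 2 * * *"    # every night at 02:00
    restart: unless-stopped
    depends_on:
      - db
```

The dumps then land in ./backups on the host, and from there you can copy or sync them wherever you want them to live.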
I don't have S3 unfortunately & am looking for an alternative (free) solution like Google Drive etc., as my use case is just limited personal data.