PostgreSQL S3 backups: Allow Uploads to a Bucket with Object Lock Enabled
Hello everyone,
When I tried to use the PostgreSQL S3 backups template (https://github.com/railwayapp-templates/postgres-s3-backups)
to upload backups to a bucket that has Object Lock enabled
(a very important security feature), I got the following error message:
Error while running backup: InvalidRequest: Missing required header for this request: Content-MD5
at throwDefaultError (/app/node_modules/@smithy/smithy-client/dist-cjs/index.js:838:20)
It seems that the Content-MD5 header is required
when Object Lock is active. See https://github.com/aws/aws-sdk-php/issues/1694
How should I calculate the Content-MD5 value here? And from your point of view, does it make sense to build this feature directly into the template,
since Object Lock is a standard security measure for backups?
Thank you in advance for your time
Project ID: N/A
so if I understand this correctly, essentially the backup service just needs to provide an md5 hash of the backup file when uploading the file to S3?
Hey Brody,
Yes, it looks like it. Also, the MD5 hash apparently has to be base64-encoded afterwards
that part is definitely easy enough
MD5 hashing a potentially large backup file could end up being quite resource-intensive, so if it is implemented it would need to be off by default for buckets that don't require it. At least that's my thinking
Yes, it can be disabled by default.
Apparently you only need it if Object Lock is enabled for the bucket.
I'd be happy to take a crack at this, but I don't know if Cloudflare R2 has Object Lock? I don't have access to an AWS account to test with S3
I believe Wasabi and Backblaze have implemented object locking, if you have an account with either
I do have an account with Backblaze!
that is nice (:
Because I am trying to use the template with Backblaze and got the error
I'll see if I can get Node.js to MD5 hash a file efficiently. If Node has to load the entire file into memory, I don't think it would be worth implementing
thank you brody!
no problem, I shall report back
got Node to MD5 hash a large (20 GB) file without loading it into memory, and it's not actually too slow. I'll work on integrating that into the backup service when I'm back at the computer and then do some testing; if that goes well, I would like to ask you to test as well
that sounds great! Thank you very much!
I'm happy to test when you give me the ping
will do
hey @Obstkompost, am I supposed to do something special to enable object locking besides just enabling it? I have enabled it, but I don't get any errors running a backup without MD5 hashing enabled
Maybe you need to specify how long the lock should be
ah good idea, I'll try that now
just trying to reproduce your error before i add in the md5 hashing code
That is a good idea
that was it!
Nice, the first step is done (:
indeed! next step, provide an incorrect md5 header
good idea, let's see if Backblaze can detect an incorrect MD5 header.
My guess is that there is a good chance Backblaze can't detect that
gotta cover all my bases if I'm doing a PR on a repo many people use!
awesome
great, good job on their part (and yours for checking it)
Right, this template really is used by many people
and with the correct hash, it works!
Niice!
Apparently I am the first person who wants to use Object Lock😃
maybe someone had even forked it to add it but never pr'd it back
could you fork my branch so that you can swap your backup's source with your fork of my branch?
https://github.com/brody192/postgres-s3-backups/tree/support-object-lock
yeah, one moment
oh right, you would need to set a service variable
SUPPORT_OBJECT_LOCK
to true
since MD5 hashing a potentially large file isn't exactly free, I left it as false by default
yeah, upload was a success!
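The opt-in gate might look roughly like this (a hypothetical helper, not necessarily the branch's actual code; only the `SUPPORT_OBJECT_LOCK` variable name comes from the thread):

```javascript
// Sketch of the opt-in gate: attach Content-MD5 only when the service
// variable SUPPORT_OBJECT_LOCK is explicitly set to "true", since
// hashing a large dump isn't free. Off by default.
function buildUploadParams(bucket, key, body, md5Base64) {
  const params = { Bucket: bucket, Key: key, Body: body };
  if (process.env.SUPPORT_OBJECT_LOCK === "true") {
    params.ContentMD5 = md5Base64; // base64-encoded MD5 digest of the body
  }
  return params;
}
```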
awesome! is there anything else I should do before I submit a PR?
currently I can't think of anything😄
awesome, then I will go ahead and submit a PR
Thank you very much, that was a solution in less than 24 hours. Dream support!
happy to help!
I bought you some coffees. Have a nice drink👍
thank you very much, i appreciate that!