Compression after upload
Coming from Google Photos, I know that they heavily compress all of the files.
Now with Immich, all of my files get uploaded at their true size, which may result in huge storage usage in the long run.
Does Immich allow for a workflow where I can compress photos?
The only workflow that comes to my mind is to remove said photos (selectable via the filesystem, as they are sorted by date), compress them on disk, and re-upload them via the CLI.
This would of course only work when all of them have their EXIF data set correctly, so that they end up in their old location (which sadly is not the case for photos received via Signal or Telegram, because Immich doesn't set their EXIF tags after uploading).
Is there an easier solution?
That being said, an option in the upload mechanism (UI, CLI, and mobile app) to allow for compression (it could even be off by default) would be really cool and shouldn't be hard to implement (just run a compression tool on the uploaded image, server-side, before actually adding it to the Immich instance).
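For reference, the per-file compression step itself could look something like this. It's just a rough sketch using Pillow (the paths and quality value are placeholders), and note that Pillow drops EXIF on save unless you pass it back in explicitly:

```python
# Recompress all JPEGs under a directory in place (rough sketch, assumes Pillow).
from pathlib import Path
from PIL import Image

SOURCE_DIR = Path("/path/to/photos")  # placeholder
QUALITY = 80                          # placeholder, tune for size vs. quality

for jpg in SOURCE_DIR.rglob("*.jpg"):
    with Image.open(jpg) as img:
        img.load()                    # read fully before overwriting the same file
        exif = img.info.get("exif")   # keep the original EXIF block if present
        img.save(jpg, "JPEG", quality=QUALITY, optimize=True,
                 **({"exif": exif} if exif else {}))
```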
This isn't something supported by Immich. At some point in the future we would like to build a plugin system that would allow for this, but that's still a long way out
But my outlined workflow should work, and is currently the only option?
(I'm just new to self-hosting bigger sets of data, and I'm scared that my 4 TB NAS will fill up faster than I anticipate.)
Basically to manually compress them? Yeah, that should work. You might even be able to write a script against the API.
How big is your photos collection currently?
~30,000 files
Ah, good input, I'll see what I can do with that.
My biggest concern with that is actually the missing EXIF data, which would hinder re-uploading to the old position.
I think you're probably still quite a way from filling up 4TB then
In my mind, I could write a script which gets my assets from a time range (I'll see how I can figure that one out), checks from the API response whether they have EXIF data, then downloads the image, deletes it from the server, compresses it, and re-uploads it.
AFAIK this is all supported by the API.
I might write some POC script or application for that when I have the time (currently working on my bachelor thesis, so I don't know when that will be) and/or the need for it, and I'd come back to you guys in case you're interested, because I have the feeling that there are quite a lot of people who would love to be able to compress older images.
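Something like the sketch below, maybe. It's only a rough outline against the HTTP API using requests; the endpoint paths, form fields, and response field names are my assumptions and should be checked against the Swagger docs of your own instance before running anything destructive:

```python
# Rough sketch: list -> download -> compress -> delete -> re-upload, one asset at a time.
# Endpoint paths and field names are assumptions - verify them against your instance's
# API docs first, and keep local copies, since this deletes before re-uploading.
import io
import requests
from PIL import Image

BASE = "https://immich.example.com/api"   # placeholder
HEADERS = {"x-api-key": "YOUR_API_KEY"}   # placeholder

def recompress(data: bytes, quality: int = 80) -> bytes:
    """Re-encode a JPEG in memory, keeping the EXIF block if there is one."""
    with Image.open(io.BytesIO(data)) as img:
        exif = img.info.get("exif")
        out = io.BytesIO()
        img.save(out, "JPEG", quality=quality, optimize=True,
                 **({"exif": exif} if exif else {}))
        return out.getvalue()

assets = requests.get(f"{BASE}/asset", headers=HEADERS).json()  # assumed endpoint

for asset in assets:
    # Only touch images that report EXIF data, so they land back at the same date
    # (field names assumed).
    if asset.get("type") != "IMAGE" or not asset.get("exifInfo"):
        continue
    asset_id = asset["id"]

    original = requests.get(f"{BASE}/asset/download/{asset_id}",  # assumed endpoint
                            headers=HEADERS).content
    smaller = recompress(original)

    requests.delete(f"{BASE}/asset", headers=HEADERS,             # assumed endpoint/body
                    json={"ids": [asset_id]})

    requests.post(                                                 # assumed endpoint/fields
        f"{BASE}/asset/upload",
        headers=HEADERS,
        files={"assetData": (asset["originalFileName"], smaller, "image/jpeg")},
        data={
            "deviceAssetId": asset_id,
            "deviceId": "compress-script",
            "fileCreatedAt": asset["fileCreatedAt"],
            "fileModifiedAt": asset["fileModifiedAt"],
            "isFavorite": "false",
        },
    )
```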
Hmm, but if I understand your CLI upload correctly (https://github.com/immich-app/CLI/blob/6f13b6cd464eb5af2b2446d43d77600278458b0e/bin/index.ts#L362), you don't seem to care at all about the EXIF data when uploading the image.
Any EXIF extraction and such is done on the server.
So you do the date extraction only after the file has been uploaded?
and you probably
Sorry, I couldn't find the location in the code where it's done, that's why I'm asking.
Because then I could download an image and use the upload date as the EXIF date if it has a missing EXIF entry, so I don't have to worry when re-uploading the image.
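If you go that route, stamping a fallback date into files that have none could look roughly like this (a sketch using piexif; the upload date would come from the asset's API response):

```python
# Rough sketch: write a fallback DateTimeOriginal into a JPEG that has no date.
from datetime import datetime
import piexif

def set_fallback_date(path: str, upload_date: datetime) -> None:
    exif_dict = piexif.load(path)
    if not exif_dict["Exif"].get(piexif.ExifIFD.DateTimeOriginal):
        stamp = upload_date.strftime("%Y:%m:%d %H:%M:%S").encode()
        exif_dict["Exif"][piexif.ExifIFD.DateTimeOriginal] = stamp
        piexif.insert(piexif.dump(exif_dict), path)
```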
Thanks a lot!
Is there a nice way to get the asset UUID from the UI?
The only way I see is to query /assets which would return every single one
So if I understood it correctly: if no EXIF data is available, then no metadata is stored and the image is shown in the UI timeline at the date the file was uploaded. Yeah, that helps a lot and makes a lot of sense.
I'm starting to really like Immich's approach compared to other tools.
Especially the API.
Nothing in the UI I believe, but IIRC it's in the URL when viewing an asset
sadly not
Ope, it used to be but not anymore
You can definitely find it if you look at the network tab in the browser devtools, but that's a bit tedious
The API allows filtering for 'afterDate', but not something like 'beforeDate'.
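A workaround until then is to fetch the list and filter client-side. A rough sketch, assuming the asset listing endpoint and a fileCreatedAt field on each asset (both assumptions on my part):

```python
# Filter an already-fetched asset list by an upper date bound client-side,
# since the API apparently only supports an 'afterDate' style filter.
from datetime import datetime, timezone
import requests

BASE = "https://immich.example.com/api"   # placeholder
HEADERS = {"x-api-key": "YOUR_API_KEY"}   # placeholder
BEFORE = datetime(2022, 1, 1, tzinfo=timezone.utc)

assets = requests.get(f"{BASE}/asset", headers=HEADERS).json()  # assumed endpoint
older = [
    a for a in assets
    if datetime.fromisoformat(a["fileCreatedAt"].replace("Z", "+00:00")) < BEFORE
]
print(f"{len(older)} assets created before {BEFORE:%Y-%m-%d}")
```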
If you have a Google Takeout archive, the JSON files should have more or less relevant dates. Then you can try using immich-go to import images with the date found in the related JSON file.
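For reference, the Takeout sidecars usually carry the taken date under photoTakenTime.timestamp. immich-go reads this for you during import, but stamping it into the file yourself would look roughly like this (a sketch with piexif; field names should be verified against your own archive):

```python
# Rough sketch: pull the taken date out of a Google Takeout sidecar JSON and
# stamp it into the matching JPEG as DateTimeOriginal.
import json
from datetime import datetime, timezone
import piexif

def apply_takeout_date(jpg_path: str) -> None:
    with open(jpg_path + ".json") as f:        # e.g. IMG_1234.jpg.json
        meta = json.load(f)
    ts = int(meta["photoTakenTime"]["timestamp"])  # field names as typically exported
    stamp = datetime.fromtimestamp(ts, tz=timezone.utc).strftime("%Y:%m:%d %H:%M:%S")
    exif_dict = piexif.load(jpg_path)
    exif_dict["Exif"][piexif.ExifIFD.DateTimeOriginal] = stamp.encode()
    piexif.insert(piexif.dump(exif_dict), jpg_path)
```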