Is storing a large file in a Node public folder possible?
I'd have an API that loads the file from a Google bucket and stores it in the public folder. I'd like to do this because I need these big images all the time and they never change, but I can't upload them to GitHub. I'd like to read them with something like fs.readFile(...) for quicker read times.
I tried just using them from the bucket but each read takes like 4 hours, so it's not really feasible.
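Roughly what I had in mind, for context (just a sketch; the bucket name is a placeholder, and I'm assuming the @google-cloud/storage client with a public folder that already exists):
```js
const { Storage } = require("@google-cloud/storage");
const path = require("path");

const storage = new Storage();

// Download the big image once at startup and keep it on local disk,
// so later reads are plain fs reads instead of network round-trips.
async function cacheImageLocally() {
  const destination = path.join(__dirname, "public", "StarrySky.png");
  await storage
    .bucket("my-image-bucket") // placeholder bucket name
    .file("StarrySky.png")
    .download({ destination });
  return destination;
}
```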
how large are these images?
1 GB
it's just one file at the moment, but might be more in the future
how often would the image change?
never
I just need to have it in a cache so I can read it locally, I guess
you could upload it to a volume and read it from the volume?
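something roughly like this once the file is on the volume (a sketch, assuming you're on Railway, which injects the mount path as an env var):
```js
const fs = require("fs/promises");
const path = require("path");

// Railway injects the mount point via RAILWAY_VOLUME_MOUNT_PATH
const imagePath = path.join(
  process.env.RAILWAY_VOLUME_MOUNT_PATH,
  "StarrySky.png"
);

async function readPoster() {
  // plain local-disk read, no network round-trip to the bucket
  return fs.readFile(imagePath);
}
```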
uhh, would that mean faster read time?
than from google, yeah for sure
alright I'll try it out, thanks
I didn't know about those
want a walk through it?
I'll try it myself, if I get stuck I'll let you know ^^
thanks for your help, I appreciate you
alright!
Hey, so just a quick question about this. I managed to store the image to the volume, and I'm trying to access it inside puppeteer (I'm using it to generate pictures).
I'm trying to access it like this
${process.env.RAILWAY_VOLUME_MOUNT_PATH}/StarrySky.png
I'm sure it exists because I can list all the files at that mount path, but when I reference it in an image tag it doesn't load at all.
Should I be loading the file into memory and passing it as a base64 string for this use case?
yeah that works, sorry for pinging
I assume you need to load the file into memory to do some kind of processing on it, right? otherwise you don't want 1 GB sitting around in memory because that's expensive
I just need to load it into puppeteer to screenshot it, then it gets thrown away again lol
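ended up doing roughly this, in case it helps someone (a simplified sketch; assumes the simple case where the page is just the image):
```js
const fs = require("fs/promises");
const path = require("path");
const puppeteer = require("puppeteer");

async function screenshotPoster() {
  const imagePath = path.join(
    process.env.RAILWAY_VOLUME_MOUNT_PATH,
    "StarrySky.png"
  );

  // Read the file and inline it as a base64 data URI so the page
  // doesn't need filesystem access. Note: base64 inflates a 1 GB
  // file by ~33% in memory, so this is only held briefly.
  const base64 = await fs.readFile(imagePath, { encoding: "base64" });

  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.setContent(
    `<img src="data:image/png;base64,${base64}">`,
    { waitUntil: "networkidle0" } // wait until the image has rendered
  );
  const shot = await page.screenshot({ fullPage: true });
  await browser.close();

  // the buffer gets used once, then everything is garbage-collected
  return shot;
}
```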
I'm super curious, why is the image 1gb?
we're supporting big posters with dimensions around 70x100 cm, and prints need 300 dpi so that they look sharp, which then results in file resolutions of around 10k x 12k px (100 cm ≈ 39.4 in, and 39.4 in × 300 dpi ≈ 11,800 px along the long edge)
okay that makes sense