503s when a Worker is called many times at once
I have a Worker that fetches images, resizes them, stores the original image in the cache, then returns the resized image. When I load a few it works no problem, but I have a page that is loading 100 of them and I'm getting 503s from the Worker. If I then copy and paste an individual link into a new tab, it works no problem. How can I fix this?
5 Replies
I see in the metrics that I'm exceeding memory limits (I'm assuming because memory is pooled across however many requests are running at the same time). I assumed it was allocated on a per-invocation basis.
Memory is per isolate and a given isolate will handle concurrent requests
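Since concurrent requests share one isolate's memory budget, one possible mitigation (a minimal sketch, not a Workers API — `Semaphore` and the queueing logic here are illustrative) is to cap how many resizes run at once inside the isolate, so a burst of 100 requests doesn't hold 100 decoded images in memory simultaneously:

```javascript
// Illustrative concurrency cap for heavy work inside a single isolate.
// Not a Cloudflare API — just a plain async semaphore sketch.
class Semaphore {
  constructor(max) {
    this.max = max;       // max tasks allowed to run at once
    this.active = 0;      // tasks currently running
    this.waiters = [];    // resolvers for queued tasks
  }

  async acquire() {
    if (this.active < this.max) {
      this.active++;
      return;
    }
    // Wait in line until a running task releases a slot.
    await new Promise((resolve) => this.waiters.push(resolve));
    this.active++;
  }

  release() {
    this.active--;
    const next = this.waiters.shift();
    if (next) next(); // wake the next queued task, if any
  }

  // Run `task` once a slot is free; always release the slot afterwards.
  async run(task) {
    await this.acquire();
    try {
      return await task();
    } finally {
      this.release();
    }
  }
}
```

Wrapping the heavy section as `sem.run(() => resizeImage(req))` would serialize the memory-hungry part while still serving cached responses at full speed. Whether this keeps a given workload under the isolate's memory limit depends on image sizes, so it's a mitigation to try, not a guarantee.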
any ideas of how to proceed?
I think I'm just going to dissect the image resizing code, deploy it on something like Fly, have it resize and store the output in R2, then stream from there
That's probably the best approach; there's no good workaround for Workers hitting memory constraints other than using less memory per request and hoping there aren't a lot of concurrent requests
haha that would be a tricky one, I'm loading a browse page for a marketplace and it's loading all of the images.
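The offload approach discussed above could look roughly like this sketch. Everything here is a stand-in: `MemoryBucket` only mimics the put/get shape of an object store, and `resize` is a placeholder for real image processing (e.g. sharp or ImageMagick) running on the separate resize host, not on the Worker:

```javascript
// Stand-in for an object store (e.g. R2); only the put/get shape is modeled.
class MemoryBucket {
  constructor() {
    this.objects = new Map();
  }
  async put(key, value) {
    this.objects.set(key, value);
  }
  async get(key) {
    return this.objects.has(key) ? this.objects.get(key) : null;
  }
}

// Placeholder "resize" — real code would run an image library on the
// dedicated resize service, keeping the heavy work off the Worker.
function resize(bytes, width) {
  return `resized(${bytes})@${width}`;
}

// Serve a resized image: stream from storage if it exists,
// otherwise resize once, store, and return the result.
async function serveResized(bucket, key, source, width) {
  const cacheKey = `${key}@${width}`;
  const hit = await bucket.get(cacheKey);
  if (hit !== null) return hit; // already resized: stream from storage
  const resized = resize(source, width);
  await bucket.put(cacheKey, resized);
  return resized;
}
```

The design point is that each image is resized at most once per size; after that, the browse page with 100 images is just 100 reads from storage, with no per-request memory spike.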