Worker Killed Issue
I'm getting the attached error when I send a request to my app from a React frontend, but I wasn't getting it when I hit the same endpoint from a Python file using requests. Why might I be getting a memory error in this case, and what can I do to fix it?
Project ID:
N/A
Not sure how to access project ID
you're on the trial plan, and trial plan users get 512 MB of RAM. if you need more you'll want to upgrade to Hobby
the bot told you how
it said ctrl cmd k, I tried that with the project open in Railway and nothing happened
it was ctrl or cmd depending on your platform
either way, it's not needed for this case
Is there a way for me to check how much ram my app is actually using? I don't understand why the exact same request is working from a local file but not from a React app when I'm sending them to the same endpoint
visit the metrics tab
hm, ok I can see maybe it's gotten a little high over the last hour but I don't see it getting close to 512mb
Would it not show the memory reaching 512 if it got that high, because the worker was killed?
there may have been a brief spike up to 512 MB before Railway captured a memory metric
pretty sure Railway records memory usage every minute, so it's totally possible for your app to have tried to use more than 512 MB of memory and crashed before Railway recorded the usage
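if you want more granular visibility than the once-a-minute metrics, you could also log memory from inside the app itself. here's a rough sketch, assuming a Flask app and that you add psutil as a dependency (adjust for whatever framework you're actually running):

```python
import logging
import os

import psutil  # assumed extra dependency: add psutil to requirements.txt
from flask import Flask, request

app = Flask(__name__)
logging.basicConfig(level=logging.INFO)
process = psutil.Process(os.getpid())

@app.after_request
def log_memory(response):
    # RSS is the resident memory of this worker process, in bytes
    rss_mb = process.memory_info().rss / (1024 * 1024)
    logging.info("%s %s -> %.1f MB RSS", request.method, request.path, rss_mb)
    return response
```

that way a spike shows up in your deploy logs even if it happens between Railway's metric samples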
ahh i see i see
okay thank you that's helpful
what are the memory limits of the paid tiers?
8gb on hobby
what about the team plan?
32gb
okay thank you
is that the highest?
you can do custom plans on enterprise
so enterprise could go higher than 32 gig?
for sure
ok very helpful, thanks a lot!
no prob
let me know if you still get killed after the upgrade!
keep in mind you will need to redeploy your service to utilize the newly added resources
ahh okay helpful
still trying to fully understand why we're getting killed -- any idea why a local request to the endpoint with the same data as a request from a frontend application would use a different amount of memory?
no clue, just python things? lol
lol
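one thing that might help narrow it down: the browser request can differ from your Python script in ways that matter for memory, e.g. a different Content-Type, a multipart body instead of JSON, or a much bigger payload. a quick sketch to compare the two, again assuming Flask (the attribute names are Flask's; adapt for your framework):

```python
import logging

from flask import Flask, request

app = Flask(__name__)
logging.basicConfig(level=logging.INFO)

@app.before_request
def log_request_shape():
    # log how each incoming request looks so you can compare the
    # React-originated request against the one from the Python script
    logging.info(
        "%s %s content-type=%s content-length=%s",
        request.method,
        request.path,
        request.content_type,
        request.content_length,
    )
```

if both requests log the same shape, the difference is probably in how your handler processes them rather than in the requests themselves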
Will concurrent requests to a single endpoint work? And where would the logs for each individual request appear if the app were handling two separate requests at the same time?
(answered in your new thread)