Railway•11mo ago
King Jahad

High Memory Usage by service

Hey! I ran a standalone Celery app, both worker and beat, using a Dockerfile. The beat is configured to run every 5 minutes. I am getting constant 900MB memory usage. I am using python:3.12.1-slim as the base image and supervisor to run both processes. Is this normal? Project ID: 26d20592-d5d6-4fd1-bfb3-086415c85b58
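For reference, the setup being described might look roughly like the sketch below. This is a minimal sketch only: the thread confirms the tasks.app module path, the 5-minute beat interval, and (later in the thread) a Redis broker, while the REDIS_URL variable, the task name, and the task body are assumptions.

# tasks.py (hypothetical layout; the thread only confirms the "tasks.app" module path)
import os
from celery import Celery

app = Celery("tasks", broker=os.environ.get("REDIS_URL", "redis://localhost:6379/0"))

# Beat schedule: fire the task every 5 minutes, as described above
app.conf.beat_schedule = {
    "print-every-5-min": {
        "task": "tasks.ping",  # hypothetical task name
        "schedule": 300.0,     # seconds
    },
}

@app.task(name="tasks.ping")
def ping():
    # Placeholder body; the OP's task "just prints something"
    print("beat signal received")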
Solution:
celery -A tasks.app worker --concurrency=1 -l INFO
21 Replies
Percy
Percy•11mo ago
Project ID: 26d20592-d5d6-4fd1-bfb3-086415c85b58
Brody
Brody•11mo ago
900MB to run two things doesn't seem too crazy tbh, though I would highly recommend running your two apps as separate Railway services instead of under one Railway service with supervisor (not that it would save you memory, it's just the recommended way)
King Jahad
King JahadOP•11mo ago
Can I do that using the same GitHub repo?
Brody
Brody•11mo ago
yes of course
King Jahad
King JahadOP•11mo ago
And will Celery be able to detect its workers from a different service? Any reference?
Brody
Brody•11mo ago
Deploy from the same repo to two Railway services: in one service set the start command to run your app, in the other set the start command to run Celery.
I don't think it needs to detect anything, or cares; it just needs all the applicable environment variables
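Concretely, the two start commands would look something like the pair below (the worker command is the one that appears later in this thread; the beat variant is an assumption based on standard Celery CLI usage):

# service 1: the beat scheduler
celery -A tasks.app beat -l INFO

# service 2: the worker
celery -A tasks.app worker -l INFO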
King Jahad
King JahadOP•11mo ago
Okay, I'll try it. So I need to configure beat and the workers in separate services.
Brody
Brody•11mo ago
Yeah. Isn't communication done through a Redis database?
King Jahad
King JahadOP•11mo ago
Yes it is, sometimes I forget things. But the main point was that 900MB still seems high for a program that just prints something when beat sends it a signal.
Splitting them worked, although the worker's memory still goes to 700MB
Brody
Brody•11mo ago
how many workers does your worker service spawn?
King Jahad
King JahadOP•11mo ago
1
Brody
Brody•11mo ago
are you absolutely positive?
King Jahad
King JahadOP•11mo ago
celery -A tasks.app worker -l INFO
I am not so sure anymore, you are making me nervous kek
Brody
Brody•11mo ago
Seems like you may be letting Celery decide how many worker processes it wants to spawn. Look into the flag that lets you specify the exact number of worker processes.
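The flag being hinted at here is --concurrency (short form -c), which sets the exact number of prefork child processes the worker spawns; it is the flag the OP arrives at in the solution below:

celery -A tasks.app worker --concurrency=1 -l INFO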
King Jahad
King JahadOP•11mo ago
I am trying with a concurrency of 1. Let's see.
It worked! The default concurrency of a worker turned out to be 10 here (Celery defaults the pool size to the host's CPU core count rather than a min/max range). Memory usage is now 70-140MB combined for beat and worker.
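That drop lines up with the arithmetic: each prefork child is a full Python process, so 10 children at roughly 70MB apiece accounts for the ~700MB the worker was using before, while a single child brings it down to about a tenth of that.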
Brody
Brody•11mo ago
awesome!
King Jahad
King JahadOP•11mo ago
Thanks for the direction! I thought this had something to do with the Dockerfile loading something big into memory.
Brody
Brody•11mo ago
just a simple flag 🙂
King Jahad
King JahadOP•11mo ago
Yeah, you saved my wallet. salute
Brody
Brody•11mo ago
mind saying your final start command?
Solution
King Jahad
King Jahad•11mo ago
celery -A tasks.app worker --concurrency=1 -l INFO