Railway · 4mo ago
Skye

Celery worker container dies while Starting Pool

Project ID: 1639f59c-5c44-42c3-9a46-6b7566be82f3

Running a Django application with Postgres + Redis... setup has been fine so far, a few silly misconfigurations on my part, but I'm a little stumped on this one since there's very little data to suggest what might be going wrong. I'm attempting to deploy the Celery worker as a separate service based off the same repo that holds the Django project. I looked briefly over one of the templates and didn't see anything drastically different from my configuration, or perhaps I'm just not experienced enough to identify it. Here's some of the logs prior to failure:

celery@dce1163c1a98 v5.4.0 (opalescent)
Linux-6.1.0-9-cloud-amd64-x86_64-with-glibc2.39 2024-07-26 15:14:01

[config]
.> app:         akashi:0x7fd6d473ef50
.> transport:   redis://default:@redis.railway.internal:6379/0
.> results:     redis://default:@redis.railway.internal:6379/0
.> concurrency: 32 (prefork)
.> task events: ON

[queues]
.> celery  exchange=celery(direct)  key=celery

[tasks]
...

[2024-07-26 15:14:01,600: DEBUG/MainProcess] | Worker: Starting Hub
[2024-07-26 15:14:01,600: DEBUG/MainProcess] ^-- substep ok
[2024-07-26 15:14:01,600: DEBUG/MainProcess] | Worker: Starting Pool

container event
container died
9 Replies
Percy · 4mo ago
Project ID: 1639f59c-5c44-42c3-9a46-6b7566be82f3
Brody · 4mo ago
What start command are you using for Celery?
Skye (OP) · 4mo ago
celery -A akashi worker --loglevel=DEBUG -E
Brody · 4mo ago
how many jobs do you think you could have running at the same time?
Skye (OP) · 4mo ago
Not a lot... there are only 4 defined tasks. When I run it locally I use it with Celery Beat for cron tasks. AFAIK, it shouldn't be executing anything at all. I also added logger.info statements at the beginning of all shared tasks and none of those appear in the logs.
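For reference, a minimal sketch of what one of those shared tasks might look like; the task name and module path are illustrative assumptions, not code from the actual project:

# akashi/tasks.py -- illustrative sketch only
import logging

from celery import shared_task

logger = logging.getLogger(__name__)

@shared_task
def example_task():
    # Log at the start of the task so any execution shows up in the worker logs
    logger.info("example_task started")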
Solution
Brody · 4mo ago
You are likely running out of memory on the trial plan and thus your app is crashing. Try this as the start command instead:
celery -A akashi worker --loglevel=DEBUG -E --concurrency 1
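The same cap can also be set in the Celery app configuration via worker_concurrency instead of on the command line. A minimal sketch, assuming a standard Django + Celery layout with the app module at akashi/celery.py (inferred from the -A akashi flag, not confirmed in the thread):

# akashi/celery.py -- sketch under the assumptions above
import os

from celery import Celery

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "akashi.settings")

app = Celery("akashi")
app.config_from_object("django.conf:settings", namespace="CELERY")

# Cap the prefork pool at a single worker process so the container
# stays within the trial plan's memory limit (equivalent to --concurrency 1).
app.conf.worker_concurrency = 1

app.autodiscover_tasks()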
Skye (OP) · 4mo ago
Ok, will do.
Yup! That did it, thanks very much. GPT suggested that possibility, but I foolishly ignored it because I didn't realize resources were limited like that on the trial.
Brody · 4mo ago
yep, 500MB of RAM
Skye (OP) · 4mo ago
Ah, ok! Good to know.