How do you add a worker to a Procfile?
It seems that Railway only accepts "web", so how can I add a "worker" to a Procfile?
Project ID:
N/A
show me the procfile you wish to run on railway
web: python manage.py migrate && gunicorn linguoai.wsgi
worker: celery -A linguoai worker --loglevel=info
I've tried this but it didn't work:
celery -A linguoai worker --loglevel=info && python manage.py collectstatic && gunicorn linguoai.wsgi --bind 0.0.0.0:$POR
yeah, that wouldn't be possible, you would need to run the worker in a separate service
that service would only run the celery command
how can you set it up in a separate service?
I'm trying to run the worker with Redis
yeah, that's where this all falls apart, because i have never done it myself
aka, i don't know
ah ok
no worries
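For reference, a minimal sketch of the separate-service approach mentioned above: a second Railway service on the same repo, with its own start command and the Redis connection passed in through an environment variable. The REDIS_URL name and the settings line are assumptions, not something confirmed in this thread.

# start command for the second railway service (worker only, no web process)
celery -A linguoai worker --loglevel=info

# hypothetical broker wiring in settings.py, assuming railway injects REDIS_URL:
# CELERY_BROKER_URL = os.environ["REDIS_URL"]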
DEV Community
Django, Celery, and Redis on Railway
Introduction Using Django Rest Framework, Celery, and Redis I created a personalized quiz...
I was following this tutorial and she seemed pretty confident but it didn't work
let me look
yeah i see what they did to make it work
you could do the same, but it's not the most ideal way
how? I couldn't get it to work
you copied the procfile command wrong
after the celery command you have two ampersands (&&), they only have one (&)
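The distinction matters here: a single & puts the command before it in the background, while && waits for the previous command to finish successfully, and a celery worker never exits on its own. A minimal shell sketch:

# with &&, the shell waits for celery to exit successfully, and it never does,
# so gunicorn is never reached:
celery -A linguoai worker --loglevel=info && gunicorn linguoai.wsgi

# with a single &, celery is backgrounded and the shell moves straight on:
celery -A linguoai worker --loglevel=info & gunicorn linguoai.wsgi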
should it be POR or PORT?
I was using PORT
I think I've tried it with one & already but I'll have another go
lol it would be
PORT
but you don't need --bind 0.0.0.0:$PORT
since that's already the default bind
celery -A linguoai worker --loglevel=info & python manage.py collectstatic && gunicorn linguoai.wsgi
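That default is gunicorn's own behavior: when a PORT environment variable is set, as it is on Railway, gunicorn binds to 0.0.0.0:$PORT without being told to.

# these two are equivalent whenever PORT is set in the environment:
gunicorn linguoai.wsgi --bind 0.0.0.0:$PORT
gunicorn linguoai.wsgi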
so this then?
well you need
web:
but yeah
web: celery -A linguoai worker --loglevel=info & python manage.py collectstatic && gunicorn linguoai.wsgi
?
I tried it prefixed by web but it was erroring out
yep, also please enclose any single line code in single backticks
well you do have the log downloader
web: celery -A linguoai worker --loglevel=info & python manage.py collectstatic && gunicorn linguoai.wsgi
yep, that should start celery in the background and then run collectstatic and gunicorn in the foreground
okay, I'll give it a go
but here's why that's not ideal: railway won't be able to restart your service if celery were to crash, since you've run it as a background task
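A sketch of that failure mode, with a sleep/exit standing in for a crashing worker:

# stand-in for a celery worker that crashes after a while
(sleep 5; exit 1) &
# the container's exit status follows the foreground process only,
# so railway never notices the background crash and never restarts anything
gunicorn linguoai.wsgi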
What's the best solution?
something that would run those commands in parallel and exit if any command fails, so that railway would be able to restart the service if any of the commands crash
a program like this https://nicolas-van.github.io/multirun/
i have used it in the past in a dockerfile and it's the most ideal way to run multiple commands at once on railway, but that was in a dockerfile, and you are using nixpacks
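For completeness, a rough sketch of what the dockerfile version of that looks like; the multirun release URL and version number here are assumptions, so check the project's releases page:

# Dockerfile sketch: fetch the multirun binary (version and URL assumed)
RUN curl -L https://github.com/nicolas-van/multirun/releases/download/1.1.3/multirun-x86_64-linux-gnu-1.1.3.tar.gz | tar -xz -C /usr/local/bin

# multirun starts both commands, forwards signals, and exits if either one dies,
# which lets railway restart the whole service on any crash
CMD ["multirun", "celery -A linguoai worker --loglevel=info", "gunicorn linguoai.wsgi"]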
Okay, I'll take a look
Btw, that procfile command worked
so I guess this is solved
Thanks for your help again!
like i said, it's not exactly applicable for you, unless you wanna be my guinea pig?
I'm just happy to get Celery working as I couldn't get the APIs to respond when I was running it locally
fair enough, happy i could help