Spawning RQ Worker during deploy.
I'm running into the problem of not knowing how to launch the workers automatically on deployment. Right now I manually run 'railway run rq worker' each time. Any suggestions?
I added the following to my Procfile: 'worker: rq worker -u $REDIS_URL name', which did not work. Next I tried 'release: rq worker -u $REDIS_URL', which started a worker during the build, but then it stayed in the build phase and never reached deployment, presumably because a release command has to exit before the deployment can continue and 'rq worker' runs forever.
How do I spawn a worker or run the 'rq worker' command during deployment? I'm using Python (Flask) and Redis.
4 Replies
Project ID:
ec80508e-d5a8-4f2f-8222-d14d07b6307e
It seems that in order to keep the worker active after manually starting one with the 'rq worker' command, I need to keep the terminal window open; as soon as I close it, the worker shuts down.
You'll need to spawn another service in your canvas
Or, if you only want it to run beforehand, set it in your build or run command
I see. I added an empty service (which I think is what you mean), and when I run 'railway run rq worker' it asks which service to use; I select the new empty service dedicated to the workers, and that works. I also found the setting where I can give it a start command. However, since this service is empty I don't get the option to deploy it, so there's no way to run the start command.
Any suggestions or documentation references?
Alright, I managed to get it working; here's how. I spawned a new service and created a new GitHub repo. The repo needs the following: a requirements.txt with redis and rq included, a main.py file, and a Procfile with 'web: python main.py'. In the service settings under Deployment I set the start command to 'rq worker'. This launches the worker right after deploy.
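For reference, a minimal sketch of that worker repo's layout. The file names come from the post above; the contents of main.py are not shown in the thread, and the exact start command flags are an assumption:

```
# Worker service repo (sketch)
requirements.txt    # must list: redis, rq (plus anything the task functions need)
main.py             # entry point referenced by the Procfile
Procfile            # contains: web: python main.py
# Railway service settings -> Deploy -> Start Command: rq worker -u $REDIS_URL
```

Note that the start command overrides the Procfile, so with this setup the service runs 'rq worker' rather than main.py.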
However, the function I pass to the queue does not work, since it lives in a different service...
Installing all the dependencies and copying the function over to the new service solved the problem. Essentially, going from a monolith to adding a micro-service fixed it.
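A rough sketch of why copying the function over is necessary: RQ stores each job as a dotted import path plus its arguments, and the worker process re-imports that path before calling it, so the function's module has to be importable inside the worker's own service. A stdlib-only illustration of that lookup (`resolve` is a hypothetical helper for illustration, not RQ's actual API):

```python
# Simplified model of how an RQ worker finds a job's function:
# the job carries a dotted path like "tasks.send_email", and the worker
# imports that module and fetches the attribute before calling it.
import importlib

def resolve(dotted_path):
    """Hypothetical helper: import 'module.attr' the way a worker must."""
    module_name, _, attr = dotted_path.rpartition(".")
    # Raises ModuleNotFoundError if the module isn't installed/copied
    # into this service -- the failure seen when the web app and the
    # worker lived in different repos.
    module = importlib.import_module(module_name)
    return getattr(module, attr)

# Works here because json is importable in this process:
print(resolve("json.dumps")({"ok": True}))  # -> {"ok": true}
```

So it's not enough for the web service to know the function; the worker service needs the same module (and its dependencies) on its own import path.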