Best Practice for App with Parallel Tasks
Hi all, I have a FastAPI Python app that has to be available at all times, BUT it also has to run tasks in the background. To do this, one could use Celery, but that doesn't quite work here: you'd need to be able to define separate workers, for example with a Procfile like on Heroku, but on Railway we define different services for this use case. The thing is, the main app and the background tasks share a lot of code (the data models, for example), so I really don't want to create another app. Could I just deploy the same app again with a different start command?
What we do have are Replicas... but we can't define a custom start command for each one... that would be so cool!
Does anyone know of a solution I could use for this use case right now?
Project ID:
N/A
You're welcome to kick off your app with a different start command, but I'm a bit confused by what you're trying to do here
You can have multiple workers on Railway, same as launching the app locally, just adjust your start command
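For example, a start command along these lines (the module path `app.main:app` and the worker count are assumptions, not from the thread) runs several Uvicorn workers inside a single Railway service:

```shell
# Hypothetical start command: one Railway service, multiple Uvicorn workers
uvicorn app.main:app --host 0.0.0.0 --port $PORT --workers 4
```

Note this scales request handling within one service; it does not separate the web app from scheduled background work.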
Well I've got the backend part which just responds to calls, and the Celery part which runs certain tasks daily… but I want them to run in separate instances.
I see! So should I just create a shell script with both start commands?
No, it sounds to me like having multiple services is the way to go
However that does depend on how those services communicate, if they do at all. Do you use a db?
I see. The thing is, both parts share the same code…
Hmm I see, so best thing I can do is just deploy the same app again with a different start command?
Worth a try
Thx!
No prob, let me know if you have any issues