fastText model download error when deploying

We can build the app correctly, but when we deploy and try to download the fastText pre-trained model, the following error occurs:

[2023-11-20 03:05:06 +0000] [1] [CRITICAL] WORKER TIMEOUT (pid:2404)
[2023-11-20 03:05:07 +0000] [2467] [INFO] Booting worker with pid: 2467
Solution:
Have another Python file that only downloads the model, like a download_model.py file, and then change your start command to something like: python download_model.py && <your current start command>
17 Replies
Percy · 13mo ago
Project ID: 5a2cb014-5557-4588-976f-dab9ac88315a
blazeszombi (OP) · 13mo ago
5a2cb014-5557-4588-976f-dab9ac88315a
pandas · 13mo ago
Timeout, lol. Your worker times out before the download finishes. Increase the worker's timeout.
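For reference, if you do take the timeout route: gunicorn reads a gunicorn.conf.py file as plain Python. A minimal sketch (the 300-second value is an arbitrary example, not a recommendation):

```python
# gunicorn.conf.py -- gunicorn evaluates this file as plain Python.
# Workers that stay busy longer than `timeout` seconds without
# heartbeating are killed and rebooted, which produces the
# WORKER TIMEOUT line seen in the log above.
timeout = 300  # default is 30; 300 here is an arbitrary example

# One worker keeps the example simple; with more workers, an
# in-worker download would run once per worker.
workers = 1
```

As the rest of the thread points out, raising the timeout only hides the underlying issue.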
blazeszombi (OP) · 13mo ago
Hi! How can I do that?
pandas · 13mo ago
Brody can help. You might need a custom nginx config; no idea how it works on Railway.
Brody · 13mo ago
Don't give up that easily.
pandas · 13mo ago
Lol
Brody · 13mo ago
Well, the real question is: why is the worker downloading the model? Sure, we could increase the timeout, but that would only be a band-aid fix.
blazeszombi (OP) · 13mo ago
Mmm, how would you approach the problem? We need to download the model each time we deploy the app, so that we have the model stored locally to make predictions.
Solution
Brody · 13mo ago
Have another Python file that only downloads the model, like a download_model.py file, and then change your start command to something like: python download_model.py && <your current start command>
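A minimal sketch of such a download_model.py, using only the standard library. The MODEL_URL/MODEL_PATH environment-variable names and the example URL in the comment are assumptions for this sketch, not Railway or fastText conventions:

```python
# download_model.py -- runs once in the start command, before the web
# workers boot, so gunicorn's worker timeout never applies to the
# download:
#     python download_model.py && <your current start command>
import os
import urllib.request


def download_if_missing(url: str, dest: str) -> str:
    """Fetch `url` to `dest` unless `dest` already exists, so restarts
    on a persistent volume skip the download entirely."""
    if not os.path.exists(dest):
        urllib.request.urlretrieve(url, dest)
    return dest


if __name__ == "__main__":
    # e.g. MODEL_URL=https://dl.fbaipublicfiles.com/fasttext/vectors-crawl/cc.en.300.bin.gz
    # (URL shown only as an example; use whatever model your app loads)
    url = os.environ.get("MODEL_URL", "")
    if url:
        download_if_missing(url, os.environ.get("MODEL_PATH", os.path.basename(url)))
```

Because the script exits before gunicorn starts, the download runs exactly once per deploy regardless of the worker count.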
blazeszombi (OP) · 13mo ago
What's the difference between doing it at startup versus while the app is running? Isn't the worker the same? Or am I missing something?
Brody · 13mo ago
A worker is a gunicorn web worker; it handles incoming HTTP requests. Workers should not be downloading a model.
pandas · 13mo ago
Using a worker is fine, but you need to be more sophisticated about streaming large files. Stream in chunks.
Brody · 13mo ago
There are plenty of reasons why not, but here's one: maybe you want to increase the worker count in the future to handle more traffic. If you add 4 workers, you will be downloading the model 4 times when you only need to download it once. pandas, the app is downloading the model, not the client.
pandas · 13mo ago
That's why you have to verify chunks and check which ones are already downloaded. Totally viable to do, just a lot harder than adding a simple download script before the app starts.
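A sketch of the chunked, resumable approach pandas is describing, again standard library only. It assumes the file server honors HTTP Range requests; the chunk size and function name are illustrative:

```python
# Chunked, resumable download: read fixed-size chunks and resume an
# interrupted transfer from the partial file's current size. If the
# server ignores the Range header, the partial file is discarded and
# the download restarts from scratch.
import os
import urllib.request

CHUNK_SIZE = 1024 * 1024  # 1 MiB per read


def stream_download(url: str, dest: str) -> None:
    start = os.path.getsize(dest) if os.path.exists(dest) else 0
    req = urllib.request.Request(url, headers={"Range": f"bytes={start}-"})
    with urllib.request.urlopen(req) as resp:
        # 206 Partial Content means the server honored the Range header
        # and we can append; anything else means we received the whole
        # file and must overwrite the partial copy.
        mode = "ab" if start and getattr(resp, "status", None) == 206 else "wb"
        with open(dest, mode) as out:
            while True:
                chunk = resp.read(CHUNK_SIZE)
                if not chunk:
                    break
                out.write(chunk)
```

Note that the per-chunk verification pandas mentions (e.g. checksums) is still missing here, which is exactly why the simple pre-start download script is the easier option.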
Brody · 13mo ago
A worker should not be doing this kind of setup.
blazeszombi (OP) · 13mo ago
Ok, I think I understand now. I'll try these options. Thank you both very much for your time!