Deploy fasttext model downloading error

We can build the app correctly, but when we deploy and the app tries to download the FastText pre-trained model, the following error occurs:

[2023-11-20 03:05:06 +0000] [1] [CRITICAL] WORKER TIMEOUT (pid:2404)
[2023-11-20 03:05:07 +0000] [2467] [INFO] Booting worker with pid: 2467
Percy
Percy10mo ago
Project ID: 5a2cb014-5557-4588-976f-dab9ac88315a
blazeszombi
blazeszombi10mo ago
5a2cb014-5557-4588-976f-dab9ac88315a
pandas
pandas10mo ago
Timeout, lol. Your worker times out before finishing the download. Increase the worker's timeout.
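If you do want to raise the timeout, Gunicorn's --timeout flag (in seconds, default 30) is the usual knob. A sketch of what the start command could look like; the app module name app:app is a placeholder:

```shell
# Give workers up to 5 minutes before they are killed and restarted.
# This is only a stopgap: a slow download can exceed any fixed limit.
gunicorn --timeout 300 app:app
```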
blazeszombi
blazeszombi10mo ago
Hi! How can I do that?
pandas
pandas10mo ago
Brody can help. You might need a custom nginx config; no idea how it works on Railway.
Brody
Brody10mo ago
don't give up that easy
pandas
pandas10mo ago
Lol
Brody
Brody10mo ago
Well, the real question is: why is the worker downloading the model? Sure, we could increase the timeout, but that would only be a band-aid fix.
blazeszombi
blazeszombi10mo ago
Mmm, how would you approach the problem? We need to download the model each time we deploy the app, so we have the model stored locally to do predictions.
Solution
Brody
Brody10mo ago
Have another Python file that only downloads the model, like a download_model.py file, and then change your start command to something like this: python download_model.py && <your current start command>
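A minimal sketch of what that download_model.py could look like. The URL and file path are placeholders, not the real FastText model location; the point is that the download runs once, before Gunicorn boots, so no web worker ever blocks on it, and a redeploy with the file already present skips the download entirely:

```python
# download_model.py -- pre-start download step (hypothetical sketch).
import os
import urllib.request

MODEL_URL = "https://example.com/cc.en.300.bin"  # placeholder URL
MODEL_PATH = "model.bin"                         # placeholder path

def ensure_model(path: str = MODEL_PATH, url: str = MODEL_URL) -> str:
    """Download the model only if it is not already on disk."""
    if not os.path.exists(path):
        print(f"Downloading model to {path} ...")
        urllib.request.urlretrieve(url, path)
    else:
        print(f"Model already present at {path}, skipping download.")
    return path

if __name__ == "__main__":
    # Only hit the network when a real URL is configured via the
    # environment, so dry-running this file is side-effect free.
    if os.environ.get("MODEL_URL"):
        ensure_model(MODEL_PATH, os.environ["MODEL_URL"])
```

The start command then becomes python download_model.py && <your current start command>, exactly as above.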
blazeszombi
blazeszombi10mo ago
What's the difference between doing it at the start versus when the app is running? Isn't the worker the same? Or am I missing something?
Brody
Brody10mo ago
A worker is a Gunicorn web worker; it handles the incoming HTTP requests. Workers should not be downloading a model.
pandas
pandas10mo ago
Using a worker is fine, but you need to be more sophisticated about streaming large files. Stream in chunks.
Brody
Brody10mo ago
Plenty of reasons why not, but here's one: maybe you want to increase the worker count in the future to handle more traffic. If you add 4 workers, you will be downloading the model 4 times when you only need to download it once. pandas, the app is downloading the model, not the client.
pandas
pandas10mo ago
That's why you have to verify chunks and check which ones are already downloaded. Totally viable to do, just a lot harder than adding a simple download script before the app starts.