RunPod · 5mo ago
AMooMoo

Question about Network Volumes

Hi! Just two quick questions about network volumes working in tandem with serverless endpoints:
- Does a network volume reduce the cold start time or availability of the serverless GPUs? I want to store AI models inside a network volume and access them from a serverless GPU endpoint, and I'd like to know what issues I might run into, since cold starts and availability are pretty important to me.
- What's the file path to a network volume if I want to set up my serverless container to use files stored in it?
Thanks all!
8 Replies
Encyrption · 5mo ago
1) Actually, it has been found that using a network volume increases the cold start time of serverless endpoints, and it hurts response times even with FlashBoot. In almost every case you are better off storing models, etc., directly in container storage. A network volume is the cheapest storage on RunPod, but IMHO it is not worth using.
2) If you attach a network volume to your endpoint, it will be mounted at /runpod-volume on your serverless worker when it runs.
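A minimal sketch of what reading a model from that mount point can look like in a handler. The /runpod-volume path comes from the reply above; the model filename, loader, and directory layout are placeholders for illustration, not a confirmed setup:

```python
# Sketch of a RunPod serverless handler that loads a model from an attached
# network volume. Model path/filename below are hypothetical examples.
import os
import runpod

VOLUME_PATH = "/runpod-volume"  # network volume mount point on serverless workers
MODEL_PATH = os.path.join(VOLUME_PATH, "models", "my-model.safetensors")  # placeholder

model = None  # kept at module level so warm (FlashBoot) requests reuse it


def load_model():
    # Replace with your framework's real loading call (torch, transformers, etc.).
    with open(MODEL_PATH, "rb") as f:
        return f.read()


def handler(job):
    global model
    if model is None:
        model = load_model()  # only happens on a cold start
    prompt = job["input"].get("prompt", "")
    # ... run inference with `model` here ...
    return {"output": f"processed: {prompt}"}


runpod.serverless.start({"handler": handler})
```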
nerdylive · 5mo ago
It depends. If your container image is bulky enough because of the model files, etc., it may be beneficial to move the model files to a network volume.
blabbercrab · 5mo ago
What about if it's 24GB total, @nerdylive?
nerdylive · 5mo ago
I'd say more than 15GB is already bulky. Try to keep it below 10GB or so.
AMooMoo (OP) · 5mo ago
One more question: my container storage setting is at 20GB, but my container is 25GB. I've had no problems, but this technically doesn't make sense lol. Does anyone want to explain?
nerdylive · 5mo ago
Yes, it's different. The container disk setting is excluded from your container image size; it's for files that get added at runtime.
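If you want to see the difference at runtime, a quick sketch like this (paths assumed from the discussion above, adjust to your setup) prints the free space on the container disk versus the network volume:

```python
# Check available space on the container disk vs. the network volume at runtime.
# "/" and "/runpod-volume" are assumed mount points; adjust if yours differ.
import shutil

for label, path in [("container disk", "/"), ("network volume", "/runpod-volume")]:
    try:
        usage = shutil.disk_usage(path)
        print(f"{label}: {usage.free / 1e9:.1f} GB free of {usage.total / 1e9:.1f} GB")
    except FileNotFoundError:
        print(f"{label}: {path} not mounted")
```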
AMooMoo (OP) · 5mo ago
makes sense