How to use a volume with serverless endpoints?
Hello All, We have multiple serverless endpoints that download the model and generate inference. Is there a way to mount a common volume across all of the serverless endpoint workers? We don't want to download the model every time an endpoint boots up.
It would be nice if you could please share a concrete example.
5 Replies
Yes
Use network storage.
A network volume can be mounted on Pods, so the files on it are accessible across them.
In my case it is not a Pod but a serverless endpoint. Also, is there sample code for a similar use case?
Save and use files in /workspace on Pods,
and in /runpod-volume on serverless workers.
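A minimal sketch of the idea above: when a network volume is attached to a serverless endpoint, it appears at /runpod-volume, so the handler can cache the model there and skip the download on later cold starts. The helper name `model_dir`, the fallback path `/tmp/models`, and the model name are illustrative assumptions, not RunPod APIs.

```python
import os

# On RunPod Serverless an attached network volume is mounted at
# /runpod-volume; on Pods the same volume appears under /workspace.
VOLUME_ROOT = "/runpod-volume"


def model_dir(model_name: str) -> str:
    """Resolve a cache directory for a model (hypothetical helper).

    Uses the network volume when it is mounted, so every worker of
    every endpoint shares one copy of the weights; falls back to a
    local scratch path when no volume is attached.
    """
    root = VOLUME_ROOT if os.path.isdir(VOLUME_ROOT) else "/tmp/models"
    path = os.path.join(root, model_name)
    os.makedirs(path, exist_ok=True)
    return path


# Inside the serverless handler you would then download the weights
# into this directory only if it is still empty, and load from it
# otherwise -- the volume persists between worker boots.
print(model_dir("my-llm"))
```

With libraries such as `transformers`, the same effect is usually achieved by passing this directory as the cache/download location so the first worker populates the volume and later workers load from it.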
Manage Endpoints | RunPod Documentation
Learn to create, edit, and manage Serverless Endpoints, including adding network volumes and setting GPU prioritization, with step-by-step guides and tutorials.