RunPod · 7d ago
zeeb0t

Mounting network storage at runtime - serverless

I am running my own Docker container, and at the moment I'm using the RunPod interface to select network storage, which is then presented at /runpod-volume. This is OK; however, what I'm hoping to do instead is mount the volume at runtime, programmatically. Is this in any way possible through libraries or the API? Basically, I would want to list the available volumes, and where a volume exists within the same region as the container/worker, mount it. I want to do this because I plan to create a volume in every region; by not selecting a volume in the serverless create interface and instead mounting at runtime, the endpoint could in theory use ANY available GPU in all regions while still having access to that region's volume. If not, I need to create a serverless cluster in every region, and then I may be routing requests to a cluster that has no available GPU at that point in time. That is far from ideal.
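As of this thread there is no documented way for a serverless worker to mount a volume at runtime, but the "list volumes, then pick the one in my region" half of the idea can be sketched. The endpoint URL and the `dataCenterId`/`id` field names below are assumptions; check the current RunPod API docs before relying on them.

```python
import json
import os
import urllib.request

# Assumed REST endpoint -- verify against the current RunPod API docs.
RUNPOD_API_URL = "https://rest.runpod.io/v1/networkvolumes"


def pick_volume_for_region(volumes, region):
    """Return the id of the first volume located in `region`, or None."""
    for vol in volumes:
        if vol.get("dataCenterId") == region:
            return vol.get("id")
    return None


def list_volumes(api_key):
    """Fetch all network volumes for the account (hypothetical call)."""
    req = urllib.request.Request(
        RUNPOD_API_URL,
        headers={"Authorization": f"Bearer {api_key}"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)
```

Even with the matching volume id in hand, the worker would still need platform support to attach it; this only covers the region-matching logic.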
5 Replies
Sven · 4d ago
I have a similar issue: I want to store my LLM models on the network drives, but then I am locked to one region. It would be nice to be able to add a range of network drives (one for each region) in the serverless GUI, so multiple locations can be selected for a single endpoint.
nerdylive · 4d ago
Hmm, see #🧐|feedback. This is coming soon; you might be interested: https://discord.com/channels/912829806415085598/1276217713190244404
yhlong00000 · 3d ago
The easiest way to deploy globally is to build all your files into the Docker image, so you don't need a network volume.
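That suggestion can be sketched as a Dockerfile: download the weights once at build time so every worker in every region already has them. The base image tag, model URL, and paths below are placeholders, not RunPod-confirmed values.

```dockerfile
# Bake model weights into the image so no network volume is needed at
# runtime. Base image tag, URL, and paths are placeholder assumptions.
FROM runpod/base:0.6.2-cuda12.1.0

# Fetch the model during the build; it then ships inside the image,
# at the cost of a larger image and slower pulls.
RUN mkdir -p /models && \
    wget -q -O /models/model.safetensors \
        https://example.com/path/to/model.safetensors

COPY handler.py /handler.py
CMD ["python3", "-u", "/handler.py"]
```

The trade-off raised later in the thread applies: for large models this inflates the image, so pulls are slow unless the image is cached on the host.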
nerdylive · 3d ago
Ah yeah, I had a few problems with this. I used a RunPod CPU instance, which seems to be fine, but the push seems to fail (I guess because a layer is too big); it keeps re-pushing the same layer to Docker Hub.
zeeb0t (OP) · 3d ago
Works if the model is small; otherwise it takes an age to download the image, and it's rarely cached. Is there any plan to allow network storage to host our Docker images? Or a persistent cache otherwise? It's something I'd happily pay to have.