RunPod · 7mo ago
Orazur

Fast loading of a large Docker image

Hello, I am trying to use a large Docker image (>20 GB) to start a pod. Is there a way to cache it on a network volume and then start a pod from it, so the pod starts quickly? (I already uploaded my image as a .tar to a volume, but couldn't find how to start a pod from it.) Or is there a better solution? Thank you!
2 Replies
justin · 7mo ago
Move any large assets to a network volume. I recommend doing this in a region that has both CPU Pods and GPU Pods available, in case you ever need to spin up a CPU Pod with the network volume attached to move data around. This should drastically decrease your Docker image size and let you spin up containers much faster. You don't send your image as a .tar; rather, you move out the things that make your image so large, and use your Docker image more as the actual runtime processing unit.
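A minimal sketch of what this looks like at runtime, assuming the network volume is mounted at /workspace (the usual mount path on RunPod pods; adjust if yours differs) and that the large files (e.g. model weights) were already copied there, for example via a CPU Pod. The file name and the VOLUME_ROOT environment variable are hypothetical, just for illustration:

```python
# Sketch: keep the Docker image small (code + dependencies only) and
# load large assets from the attached network volume at runtime,
# instead of baking them into the image layers.
import os
from pathlib import Path

# Assumption: the network volume is mounted at /workspace; override via env var.
VOLUME_ROOT = Path(os.environ.get("VOLUME_ROOT", "/workspace"))
WEIGHTS_PATH = VOLUME_ROOT / "models" / "my-model.bin"  # hypothetical asset path

def load_weights(path: Path) -> bytes:
    """Read the large asset from the network volume rather than the image."""
    if not path.exists():
        raise FileNotFoundError(
            f"Expected weights at {path}; copy them to the network volume first."
        )
    return path.read_bytes()

if __name__ == "__main__":
    weights = load_weights(WEIGHTS_PATH)
    print(f"Loaded {len(weights)} bytes from {WEIGHTS_PATH}")
```

The design point is that the image only ships the code and its dependencies, so pulls stay fast, while the multi-gigabyte assets live on the volume and are available as soon as the pod mounts it.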
Orazur (OP) · 7mo ago
Thank you very much, I'm going to try that! I tested it and it's working well, thanks again!