RunPod
Created by ob3lx. on 7/28/2024 in #⚡|serverless
Getting timeout with network volume
I want to deploy the Llama 3.1 70B model as a serverless endpoint, but the cold start takes too long (1-3 minutes). For that reason I tried using a network volume, but now the model fails to download on the first run: I keep getting a timeout after waiting 6-7 minutes. In short, the model cannot be downloaded from the Hugging Face servers and stored on the network volume. I am using vLLM. Thanks for your help.
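To clarify the workaround I'm attempting, here is a minimal sketch of pre-downloading the weights onto the network volume from a temporary pod instead of inside the serverless worker. It assumes the volume is mounted at /workspace, a Hugging Face token is available in HF_TOKEN, and the repo id and target path are just placeholders.

```python
# Sketch: pre-populate the network volume from a temporary pod.
# Assumptions: volume mounted at /workspace, gated-repo token in HF_TOKEN.
import os
from huggingface_hub import snapshot_download

MODEL_REPO = "meta-llama/Meta-Llama-3.1-70B-Instruct"  # placeholder repo id
TARGET_DIR = "/workspace/models/llama-3.1-70b"         # path on the network volume

# Download all model files onto the network volume so the serverless
# worker can load them locally instead of pulling from Hugging Face.
snapshot_download(
    repo_id=MODEL_REPO,
    local_dir=TARGET_DIR,
    token=os.environ.get("HF_TOKEN"),
)
print("Model files stored at", TARGET_DIR)
```

The serverless vLLM worker would then point its model path at that directory on the volume rather than at the Hugging Face repo.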
2 replies