Slow download speed

Complete RunPod newbie here. I'm using the Flux.1-Dev-ComfyUI by Camenduru template, but its download speed is very slow, even though the pod shows a connection speed of 14836 Mbps. Is there something I'm missing? Is this normal? It takes nearly 10 minutes to finish downloading. (I know Flux.1 Dev is 32 GB and it's hosted on Hugging Face.)
Encyrption · 2mo ago
What region are you using for your pod?
AlexDraconian · 2mo ago
EU-SE-1. (I use an A40, so there are only a few options.) Same for CA-MTL-1.
Encyrption · 2mo ago
I previously created a tool to monitor the bandwidth of RunPod regions. You might want to run something similar. I have attached an example of a 5-minute sample I took in EU-RO-1. Here is the source: https://github.com/drvpn/runpod_serverless_speedtest_worker and here is an image: https://hub.docker.com/r/drvpn/runpod_serverless_speedtest_worker
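If you deploy it as a serverless endpoint, calling it is just the usual RunPod serverless API call. A minimal sketch (the endpoint ID, API key, and the empty input payload are placeholders, not anything specific to this worker):
```bash
# Invoke a deployed RunPod serverless endpoint synchronously.
# ENDPOINT_ID and RUNPOD_API_KEY are placeholders; the empty "input"
# assumes the worker's default behavior.
curl -s -X POST "https://api.runpod.ai/v2/${ENDPOINT_ID}/runsync" \
  -H "Authorization: Bearer ${RUNPOD_API_KEY}" \
  -H "Content-Type: application/json" \
  -d '{"input": {}}'
```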
AlexDraconian · 2mo ago
Okay, but I have no idea how to run this... yet. I get 34 MB/s. On my PC, however, I get 50 MB/s (that's the maximum bandwidth for my current ISP). Is this normal for RunPod? (I also got this speed yesterday.)
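For what it's worth, a rough way to check raw throughput from inside a pod without the worker is a timed curl; FILE_URL below is a placeholder for any large publicly downloadable file:
```bash
# Quick-and-dirty throughput check: download a large file to /dev/null
# and let curl report the average rate. FILE_URL is a placeholder for any
# big public file (the Flux checkpoint itself is gated on Hugging Face).
curl -L -o /dev/null -w 'average download speed: %{speed_download} bytes/s\n' "$FILE_URL"
```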
Encyrption · 2mo ago
34MB/s is much lower than anything I have ever recorded. Where are the files you are downloading hosted?
AlexDraconian · 2mo ago
GitHub: flux-runpod/Dockerfile at main · camenduru/flux-runpod
AlexDraconian · 2mo ago
Maybe it was 40~45 MB/s; that was a rough estimate. Either way, it clearly wasn't over 50 MB/s.
Encyrption · 2mo ago
Don't know... the only other thing I can think of to test would be moving the file to another host and seeing if the speed improves.
AlexDraconian · 2mo ago
Today it's showing only ~10 MB/s download speed for the same template. What's happening? Is it just high demand on bandwidth?
Solution
AlexDraconian · 2mo ago
Okay, here's what I found. I guess the Docker RUN command downloads much slower (~8x) than just running aria2c directly. So I created a network volume, deployed an empty (no models) ComfyUI image based on ghcr.io/ai-dock/comfyui:latest, mounted the network volume to /workspace/storage/stable_diffusion/models, and manually downloaded the models there. After the first download to the network volume, I don't have to download the 32 GB model again. You can't use a network volume for the model with the camenduru/flux-runpod template, since its Docker RUN forces the download anyway.
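Roughly, the manual download step looks like this; the unet subfolder and the exact Hugging Face URL are illustrative (FLUX.1-dev is a gated repo, so an HF token is needed), and the path is the network volume mount point from the ai-dock image:
```bash
# Assumes the network volume is mounted at /workspace/storage/stable_diffusion/models.
# The unet/ subfolder and the HF URL below are illustrative; adjust to your setup.
mkdir -p /workspace/storage/stable_diffusion/models/unet

# aria2c with multiple parallel connections; pass an HF token since FLUX.1-dev is gated.
aria2c -x 16 -s 16 \
  --header="Authorization: Bearer ${HF_TOKEN}" \
  -d /workspace/storage/stable_diffusion/models/unet \
  -o flux1-dev.safetensors \
  "https://huggingface.co/black-forest-labs/FLUX.1-dev/resolve/main/flux1-dev.safetensors"
```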