RunPod•6mo ago
andrewwww

Fooocus loads the SDXL model too slowly

The time it takes to load the model has gone from 8 seconds to 30 seconds now. Does this have anything to do with me using Network Volume?
34 Replies
nerdylive
nerdylive•6mo ago
Yes, 8 seconds where? It might be the network volume
andrewwww
andrewwwwOP•6mo ago
8 seconds is on other GPU cloud platforms, like AutoDL
digigoblin
digigoblin•6mo ago
Network volumes are very slow, don't use them if speed matters
andrewwww
andrewwwwOP•6mo ago
I have 100 GB of models to load. Where can I find enough storage space without using a Network Volume?
Madiator2011 (Work)
Madiator2011 (Work)•6mo ago
Bake it into the Docker image or copy it from an S3 bucket
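The second option can be sketched as a startup step that pulls models onto the local disk before serving. This is a minimal sketch, assuming the AWS CLI is installed in the image and credentials are configured; the bucket URI and destination directory below are placeholders, not RunPod or Fooocus requirements:

```python
import shutil
import subprocess
from pathlib import Path

def fetch_model(s3_uri: str, dest_dir: str) -> Path:
    """Copy one model from S3 to the local (container) disk at startup.

    Skips the download if the file is already present, so warm
    restarts don't pay the transfer cost again.
    """
    dest = Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    target = dest / Path(s3_uri).name
    if target.exists():  # already copied on a previous start
        return target
    if shutil.which("aws") is None:
        raise RuntimeError("aws CLI not found; bake it into the image")
    subprocess.run(["aws", "s3", "cp", s3_uri, str(target)], check=True)
    return target

# Hypothetical bucket and path -- adjust to your setup:
# fetch_model("s3://my-models-bucket/sdxl_base.safetensors",
#             "/workspace/models/checkpoints")
```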
andrewwww
andrewwwwOP•6mo ago
Where is the directory for the Container Disk?
Madiator2011 (Work)
Madiator2011 (Work)•6mo ago
You don't want to save to container storage; you want to save to the volume. But note that if you stop the pod, you can end up in 0 GPU mode
andrewwww
andrewwwwOP•6mo ago
I want to copy the model to the Container Disk and then load the model when I start a new machine.
digigoblin
digigoblin•6mo ago
You can do that, but why? Regular persistent storage is fine too; it's usually only network volumes that are slow. Container storage is mounted at / and includes all the OS stuff too
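A quick way to check which disk a folder actually sits on is to compare its device ID with the root filesystem. This is a small sketch; the assumption (not stated in the thread) is that a network volume shows up as a separate device from `/`:

```python
import os
import shutil

def describe_storage(path: str) -> dict:
    """Report free space and whether `path` is on the same device as /.

    Container storage is the root filesystem, so a path on a different
    device is likely a mounted (network) volume.
    """
    same_dev = os.stat(path).st_dev == os.stat("/").st_dev
    usage = shutil.disk_usage(path)
    return {
        "same_device_as_root": same_dev,
        "free_gb": round(usage.free / 1024**3, 1),
    }

print(describe_storage("/"))
```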
andrewwww
andrewwwwOP•6mo ago
I provide image generation services through the API. Each image may need to switch models, so the speed of loading models is very important to me.
digigoblin
digigoblin•6mo ago
Why do you want to use pods then? Surely serverless is better for that?
andrewwww
andrewwwwOP•6mo ago
I haven't delved into serverless yet, but I use a custom version of the Fooocus code. Can Fooocus be customized on serverless?
digigoblin
digigoblin•6mo ago
Anything is possible but sounds like a lot of work.
andrewwww
andrewwwwOP•6mo ago
I didn't find a Fooocus serverless template; is there one now?
digigoblin
digigoblin•6mo ago
No
andrewwww
andrewwwwOP•6mo ago
How do I report a Pod that is performing very poorly?
nerdylive
nerdylive•6mo ago
The Contact button on the website, in the left side menu
andrewwww
andrewwwwOP•6mo ago
Loading the SDXL model from the Container Disk with Fooocus still takes 30 seconds. Why? Normally it should take 5-10 seconds to complete.
nerdylive
nerdylive•6mo ago
I don't know... possible reasons are the network storage, or the pod. What are you looking for?
andrewwww
andrewwwwOP•6mo ago
I should have gotten rid of the Network Volume and moved all models to the Container Disk
nerdylive
nerdylive•6mo ago
Maybe... if you think it's best for your use. But if you're storing it on the container disk, network storage won't affect the loading speeds
andrewwww
andrewwwwOP•6mo ago
Yes, so it should have nothing to do with the Network Volume. I switched between 3 pods and loading the model is still slow. I don't know how to get normal model loading speed on an RTX 4090
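To compare pods objectively, a rough read-throughput check like the one below can separate network-volume speeds from local disk speeds. A sketch only: re-reading a just-written file can be served from the page cache, so treat the number as an optimistic upper bound:

```python
import os
import time

def read_throughput_mb_s(path: str, size_mb: int = 64) -> float:
    """Write a scratch file of size_mb, then time a sequential read.

    Rough gauge only; run it on both the container disk and the
    network volume mount to compare the two.
    """
    chunk = os.urandom(1024 * 1024)  # 1 MB of incompressible data
    with open(path, "wb") as f:
        for _ in range(size_mb):
            f.write(chunk)
        f.flush()
        os.fsync(f.fileno())  # make sure the write actually hit disk
    start = time.perf_counter()
    with open(path, "rb") as f:
        while f.read(8 * 1024 * 1024):  # read back in 8 MB chunks
            pass
    elapsed = time.perf_counter() - start
    os.remove(path)  # clean up the scratch file
    return size_mb / elapsed

# e.g. read_throughput_mb_s("/workspace/bench.bin") vs
#      read_throughput_mb_s("/root/bench.bin")
```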
digigoblin
digigoblin•6mo ago
Use community cloud instead of secure cloud. Secure cloud seems to use the same disk as network storage, even if you don't attach a network volume to your pod.
andrewwww
andrewwwwOP•6mo ago
However, community cloud can't use a Network Volume to copy resources. How can I start new services efficiently?
nerdylive
nerdylive•6mo ago
Yeah, just stop the pod when you're not using it. It will still charge you for stopped pods (disk price)
andrewwww
andrewwwwOP•6mo ago
Would switching to community cloud solve the slow model loading issue?
nerdylive
nerdylive•6mo ago
C'mon, try it out for yourself
andrewwww
andrewwwwOP•6mo ago
Hope this doesn't waste my money and time😂
nerdylive
nerdylive•6mo ago
Haha hope so too 😂
andrewwww
andrewwwwOP•6mo ago
Yes, confirmed: the community cloud loads models very quickly. Why can't the secure cloud, which uses Network Volume storage, offer a Container Disk with normal performance?
nerdylive
nerdylive•6mo ago
Well, I don't know exactly, but I guess secure cloud has more usage, and somehow it's slower
Madiator2011 (Work)
Madiator2011 (Work)•6mo ago
One host has multiple pods, so if several pods share the same host machine and someone runs intensive I/O tasks, it might affect speeds. Though that's just my speculation
digigoblin
digigoblin•6mo ago
Sounds very plausible
nerdylive
nerdylive•6mo ago
Yeah, or the bandwidth capacity is low