RunPod•8mo ago
andrewwww

Fooocus loads the sdxl model too slowly

The time it takes to load the model has gone from 8 seconds to 30 seconds now. Does this have anything to do with me using Network Volume?
34 Replies
nerdylive
nerdylive•8mo ago
Yes. 8 seconds where? It might be the network volume
andrewwww
andrewwwwOP•8mo ago
8 seconds is on other GPU cloud platforms, like AutoDL
digigoblin
digigoblin•8mo ago
Network volumes are very slow, don't use them if speed matters
andrewwww
andrewwwwOP•8mo ago
I have 100 GB of models to load. Where can I find enough storage space without using a Network Volume?
Madiator2011 (Work)
Madiator2011 (Work)•8mo ago
Bake them into the Docker image, or copy them from an S3 bucket
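The "bake into the image" route can be sketched as a Dockerfile fragment. This is a sketch only; the base image name and the `/workspace/Fooocus/models` path are placeholders, not confirmed RunPod or Fooocus specifics:

```dockerfile
# Sketch: bake SDXL checkpoints into the image so they live on local
# container disk instead of a network volume.
FROM your-fooocus-base:latest            # placeholder base image

# Copy checkpoints from the build context into Fooocus's model folder
# (adjust the destination to wherever your Fooocus install looks for models)
COPY models/checkpoints/ /workspace/Fooocus/models/checkpoints/
```

The S3 alternative would instead pull the models at pod startup (e.g. with `aws s3 sync s3://your-bucket/models /workspace/Fooocus/models` in a start script, assuming the AWS CLI and credentials are set up), trading image size for startup time.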
andrewwww
andrewwwwOP•8mo ago
Where is the Container Disk mounted?
Madiator2011 (Work)
Madiator2011 (Work)•8mo ago
You don't want to save to container storage; you want to save to the volume. But note: if you stop the pod, you can end up in 0-GPU mode
andrewwww
andrewwwwOP•8mo ago
I want to copy the model to the Container Disk and then load it when I start a new machine.
digigoblin
digigoblin•8mo ago
You can do that, but why? Regular persistent storage is fine too; it's usually only network volumes that are slow. Container storage is mounted at / and includes all the OS stuff too
andrewwww
andrewwwwOP•8mo ago
I provide image generation services through an API. Each image may need a different model, so model loading speed is very important to me.
digigoblin
digigoblin•8mo ago
Why do you want to use pods then? Surely serverless is better for that?
andrewwww
andrewwwwOP•8mo ago
I haven't delved into serverless yet, but I use a custom version of the Fooocus code. Can Fooocus be customized on serverless?
digigoblin
digigoblin•8mo ago
Anything is possible but sounds like a lot of work.
andrewwww
andrewwwwOP•8mo ago
I didn't find a Fooocus serverless template; is there one now?
digigoblin
digigoblin•8mo ago
No
andrewwww
andrewwwwOP•8mo ago
How do I report a Pod that is performing very poorly?
nerdylive
nerdylive•8mo ago
The Contact button on the website, in the left-side menu
andrewwww
andrewwwwOP•8mo ago
Using Fooocus to load the SDXL model from the Container Disk still takes 30 seconds. Why? Normally it should take 5-10 seconds.
nerdylive
nerdylive•8mo ago
I don't know... possible reasons are the network storage, or the pod itself. What are you looking for?
andrewwww
andrewwwwOP•8mo ago
I should have gotten rid of the Network Volume and moved all models to the Container Disk
nerdylive
nerdylive•8mo ago
Maybe, if you think it's best for your use case. But if you're storing it on the container disk, network storage won't affect the loading speeds
andrewwww
andrewwwwOP•8mo ago
Yes, so it should have nothing to do with the Network Volume. I changed 3 Pods and loading the model is still slow. I don't know how to get normal model loading speed on an RTX 4090
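One way to check whether the pod's disk itself is the bottleneck, rather than Fooocus, is a quick read-throughput test. A minimal sketch (the file name and size are arbitrary; OS page caching can inflate the number on a re-read, and real model loads also include deserialization):

```python
# Rough disk-throughput check: write a test file, then time a sequential read.
import os
import time

TEST_FILE = "disk_speed_test.bin"  # hypothetical path on the disk to test
SIZE_MB = 64                       # small sample; scale up for a realistic test

# Write SIZE_MB of random data (random so it can't be trivially compressed)
with open(TEST_FILE, "wb") as f:
    chunk = os.urandom(1024 * 1024)
    for _ in range(SIZE_MB):
        f.write(chunk)

# Time a sequential read in 16 MB chunks
start = time.perf_counter()
with open(TEST_FILE, "rb") as f:
    while f.read(16 * 1024 * 1024):
        pass
elapsed = time.perf_counter() - start

print(f"Read {SIZE_MB} MB in {elapsed:.2f}s ({SIZE_MB / elapsed:.0f} MB/s)")
os.remove(TEST_FILE)
```

If this reports tens of MB/s on the slow pods but hundreds on a fast one, the disk (or contention on the host) is the likely culprit rather than anything in Fooocus.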
digigoblin
digigoblin•8mo ago
Use community cloud instead of secure cloud. Secure cloud seems to use the same disk as network storage even if you don't attach network storage to your pod.
andrewwww
andrewwwwOP•8mo ago
However, community cloud cannot use Network Volumes to copy resources. How do I start new services efficiently?
nerdylive
nerdylive•8mo ago
Yeah. Just stop the pod when you're not using it. It will still charge you for stopped pods (disk price)
andrewwww
andrewwwwOP•8mo ago
Would switching to community cloud solve the slow model loading issue?
nerdylive
nerdylive•8mo ago
C'mon, try it out for yourself
andrewwww
andrewwwwOP•8mo ago
Hope this doesn't waste my money and time😂
nerdylive
nerdylive•8mo ago
Haha hope so too 😂
andrewwww
andrewwwwOP•8mo ago
Yes, confirmed: community cloud loads models very quickly. Why can't secure cloud, which uses Network Volumes, offer a Container Disk with normal performance?
nerdylive
nerdylive•8mo ago
Well, I don't know exactly, but I guess secure cloud has more usage, and somehow it's slower
Madiator2011 (Work)
Madiator2011 (Work)•8mo ago
One host has multiple pods, so if multiple pods share the same host machine and someone runs intensive I/O tasks, it might affect speeds. Though that's just my speculation
digigoblin
digigoblin•8mo ago
Sounds very plausible
nerdylive
nerdylive•8mo ago
Yeah, or the bandwidth capacity is low