What is the best way to upload a 7GB model to my network drive?
Please advise a solution that won't break halfway through. I'd like to upload it to my SD model folder on my workspace. Thanks
Where are you transferring the model from?
runpodctl, sftp, rsync
s3 client, git, git lfs
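For the "won't break halfway" part, rsync is a good fit because it can resume a partial transfer. A rough sketch of what that could look like, driven from Python (the pod host, SSH port, filename, and /workspace path below are placeholders — take the real values from your pod's SSH connect details):
```python
# Rough sketch: resumable upload of a large file to a pod over SSH using rsync.
# Assumes rsync and ssh are installed locally and the network volume is mounted at /workspace.
import subprocess

POD_USER_HOST = "root@<pod-ip>"                      # placeholder from the pod's SSH command
POD_SSH_PORT = "22"                                   # placeholder; RunPod usually maps a custom port
LOCAL_MODEL = "model.safetensors"                     # the 7GB file on your machine
REMOTE_DIR = "/workspace/stable-diffusion/models/"    # assumed destination on the network volume

# --partial keeps partially transferred data, so rerunning the command resumes
# instead of restarting the 7GB upload; --progress shows transfer status.
subprocess.run(
    [
        "rsync", "-avh", "--partial", "--progress",
        "-e", f"ssh -p {POD_SSH_PORT}",
        LOCAL_MODEL,
        f"{POD_USER_HOST}:{REMOTE_DIR}",
    ],
    check=True,
)
```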
I don't recommend git lfs; it kept failing when I was trying to download models today. huggingface-cli (pip install huggingface_hub) is MUCH more reliable
Oof
The huggingface_hub Python module is amazing; it provides the CLI tool as well as letting you download weights from Python code, and I have never had an issue with it
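For example, a minimal sketch of pulling a model straight onto the network volume with it, assuming the weights are hosted on the Hugging Face Hub and the volume is mounted at /workspace (the repo id, filename, and folder below are just illustrative):
```python
# Minimal sketch: download a model file directly to the network volume with huggingface_hub.
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="some-org/some-sd-model",                  # example repo id, replace with the real one
    filename="model.safetensors",                       # example filename, replace with the real one
    local_dir="/workspace/stable-diffusion/models",     # assumed network-volume folder
)
print(path)
# If the connection drops, rerunning this call will generally resume the partial
# download rather than starting the whole file again.
```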
Yep hf hub is great
Thanks guys! But is there any other way that doesn't require a technical background? Or are these processes fairly easy to learn? If so, are there any links I can learn from?
They're easy to learn, but you do need some basic technical knowledge to use RunPod
I think that if you don't have any technical knowledge at all, it's probably going to be very daunting
At the very least, CLI / terminal / knowing how to code in Python even a bit is helpful
I knew nothing about Docker and just a tad bit of ssh / scp, and I just ChatGPT'd my way through everything for half a month to a month, until I understood it and got everything working.
Inherently, with super large ML models, I think you'll need to learn some level of technical tooling
Claude 3.5 is much better than ChatGPT
And it's MUCH faster too
Just a bit of an ugly UI compared with ChatGPT
Load this template into a Pod, connect it to your network volume, connect to the web interface, and log in as admin/admin. You'll have a file-explorer-type experience.
https://runpod.io/console/deploy?template=lkpjizsb08&ref=a57rehc6
Great! Thanks for the info guys! Really appreciate it!