RunPod · 5mo ago
Volko

Will 2 GPUs fine-tune twice as fast as 1 GPU on axolotl?

nerdylive · 5mo ago
Hmm, does it support multi-GPU fine-tuning? Software like accelerate does.
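For context, axolotl training is normally launched through accelerate, which is what provides the multi-GPU data parallelism being discussed here. A minimal sketch of a 2-GPU launch (the config path is just a placeholder for your own axolotl YAML, and exact flags depend on your accelerate setup):

```shell
# Sketch: launch axolotl training across 2 GPUs via accelerate.
# --num_processes 2 spawns one worker per GPU (DDP data parallelism);
# the YAML path is a placeholder, not a file shipped with this thread.
accelerate launch --multi_gpu --num_processes 2 \
    -m axolotl.cli.train my_config.yml
```

With plain data parallelism each GPU processes a different slice of every batch, which is why near-2× throughput on 2 GPUs (as reported below) is plausible for a model that fits in a single GPU's VRAM.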
nerdylive · 5mo ago
GitHub: axolotl/FAQS.md at main · OpenAccess-AI-Collective/axolotl
Volko · 5mo ago
Yes, it does? Strange, because yesterday I got no answer, so I tried it myself on RunPod, and 2× A4000 trains twice as fast as 1× A4000. I trained an OpenLLaMA 3B: 10 h on 1 A4000 and 5 h on 2 A4000s.
nerdylive · 5mo ago
Oh really, same inputs, same specs, same everything except the 2 GPUs? I'm unable to explain that then hahah
digigoblin · 5mo ago
The FAQ is outdated. It says "Can you train StableLM with this? Yes, but only with a single GPU atm. Multi GPU support is coming soon! Just waiting on this PR", but that PR was already merged a year ago.
nerdylive · 5mo ago
There you go hahah, thanks. So it supports multi-GPU now?
digigoblin · 5mo ago
According to the FAQ it should once the PR is merged, and the PR is merged, so apparently yes, but I honestly don't know. It seems it does, going by what @Volko observed.
nerdylive · 5mo ago
ah ye
Volko · 5mo ago
The only difference is that I couldn't rent any 2× A4000, so I rented 2× A4000 Ada (~7% better performance).
Solution
Volko · 5mo ago
It seems
Volko · 5mo ago
Oh, and the Ada ones have 20 GB VRAM, 50 GB RAM, and 9 vCPUs each, while the non-Ada have 16 GB VRAM, 23 GB RAM, and 6 vCPUs each. But the training is almost exclusively on the GPU, right? And it was a small model, so no issues with VRAM. If I remember well, I saw 99% utilization on both GPUs in the RunPod dashboard.
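As a quick way to confirm the utilization figures mentioned above from inside the pod (assuming an NVIDIA driver is present), `nvidia-smi` can be polled directly rather than relying on the dashboard:

```shell
# Sketch: poll per-GPU utilization and memory use every 5 seconds.
# With data-parallel training on 2 GPUs, both rows should show
# similarly high utilization if the work is actually being split.
nvidia-smi --query-gpu=index,utilization.gpu,memory.used \
    --format=csv -l 5
```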
nerdylive · 5mo ago
Ah yeah, then multi-GPU works.
Volko · 5mo ago
Yeah, it seems so.