RunPod · 5mo ago
KAI

More RAM for endpoints?

Just curious whether we are able to manually assign more RAM to our endpoints. I want to use the 4090 for its high inference performance, but the RAM is only 24GB, which could be a bit low for the video combine process.
7 Replies
nerdylive · 5mo ago
The RAM is ~50GB-ish, if I'm not wrong.
Encyrption · 5mo ago
The 4090 has 24GB of VRAM (video RAM) built onto the physical card. More VRAM cannot be added to it.
nerdylive · 5mo ago
Yup, you can't add VRAM. If you want, you can add more GPUs for more VRAM.
KAI (OP) · 5mo ago
Yep, I'm asking about RAM, the total system memory, not the card lol... I ran the `free -m` command in the console and it shows around 25,000 MB of RAM, which is about 24GB. nvm
KAI (OP) · 5mo ago
KAI (OP) · 5mo ago
I found out that it is actually 250GB.
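(For anyone hitting the same confusion: `free -m` prints mebibytes, so a ~250GB machine shows roughly a six-digit number like 256,000 in the total column, which is easy to misread as 25,000. A minimal sketch, assuming a Linux container with `/proc/meminfo` available, that prints the total directly in GiB:)

```shell
#!/bin/sh
# Read total system RAM from /proc/meminfo (reported in kB)
total_kb=$(awk '/^MemTotal:/ {print $2}' /proc/meminfo)
# Convert kB -> MiB -> GiB
total_gb=$((total_kb / 1024 / 1024))
echo "Total RAM: ${total_gb} GiB"
```

(Or just run `free -g` to get the same columns in gibibytes instead of mebibytes.)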
nerdylive · 5mo ago
Ya...