RunPod · 3mo ago
Kaneda

When will multiple H100 GPUs on a single node be available?

10 x 48 GB GPUs cannot host all of the model weights. Is RunPod planning to upgrade its platform?
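For context, a rough back-of-the-envelope check shows why ten 48 GB cards can fall short for a very large model. This is a minimal sketch; the 405B parameter count is a hypothetical example, since the thread does not name the model, and it counts weights only (KV cache, activations, and framework overhead need extra headroom):

```python
# Back-of-the-envelope VRAM check for model weights only.
# The 405e9 parameter count is a hypothetical example.

def weights_gb(num_params: float, bytes_per_param: int = 2) -> float:
    """Memory footprint of the weights alone, in GB (fp16/bf16 = 2 bytes/param)."""
    return num_params * bytes_per_param / 1e9

num_params = 405e9       # hypothetical example model
per_gpu_vram_gb = 48     # 48 GB cards, as in the question
num_gpus = 10

needed = weights_gb(num_params)
available = per_gpu_vram_gb * num_gpus
print(f"weights need ~{needed:.0f} GB, cluster offers {available} GB "
      f"-> {'fits' if needed <= available else 'does not fit'}")
```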
1 Reply
yhlong00000 · 3mo ago
You can get an 8×H100 pod. If you need it for serverless, create the endpoint first and let me know the ID, and I can set it up for you.
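For reference, here is a minimal sketch of requesting an 8×H100 pod through the runpod Python SDK. The gpu_type_id string and container image below are assumptions and should be checked against the values returned by runpod.get_gpus() and RunPod's template list:

```python
import runpod

runpod.api_key = "YOUR_RUNPOD_API_KEY"  # placeholder

# Assumed gpu_type_id; verify the exact string via runpod.get_gpus().
pod = runpod.create_pod(
    name="8xh100-pod",
    image_name="runpod/pytorch:2.1.0-py3.10-cuda11.8.0-devel-ubuntu22.04",  # example image
    gpu_type_id="NVIDIA H100 80GB HBM3",
    gpu_count=8,
    volume_in_gb=500,          # persistent volume for model weights
    container_disk_in_gb=100,
)
print(pod["id"])
```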