When will multiple H100 GPUs on a single node be available?
10 x 48 GB GPUs cannot host all the model weights. Is RunPod planning to upgrade its platform?
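For reference, a rough way to sanity-check whether a model's weights fit across a set of GPUs (a minimal sketch; the parameter count and precision below are hypothetical placeholders, and it ignores activations, KV cache, and framework overhead):

```python
def weights_memory_gb(num_params_billion: float, bytes_per_param: int) -> float:
    """GB needed just to hold the model weights (no activations or KV cache)."""
    return num_params_billion * 1e9 * bytes_per_param / 1e9

# Hypothetical example: a 300B-parameter model in fp16 (2 bytes per parameter)
needed = weights_memory_gb(300, 2)  # ~600 GB of weights

for label, total_gb in [("10 x 48 GB", 10 * 48), ("8 x H100 80 GB", 8 * 80)]:
    print(f"{label}: {total_gb} GB total, fits weights: {needed <= total_gb}")
```

In this illustrative case the weights exceed the 480 GB available on ten 48 GB cards but fit within the 640 GB of an 8 x H100 80 GB node.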
You can have 8 x H100 for a Pod. If you need it for Serverless, create the endpoint first and let me know the ID, and I can set it up for you.