CUDA profiling

Hey guys, how can I profile kernels on serverless GPUs? I have a CUDA kernel; how can I measure its performance on serverless GPUs like RunPod's?
Dj · 2w ago
Serverless workers are pods deployed from your template; it's the same hardware in the same datacenters. Only a small amount of room on each node is dedicated to serverless processing.
자베르 (OP) · 6d ago
Aha, so I can use NVIDIA Nsight Compute on them?
Dj · 6d ago
I think so? I believe there's some benchmarking/profiling tool in that domain that requires privileges we don't give our pods, because they're containerized. I can look into it a little more in a moment.

It's Nsight I was thinking of. Nsight Compute reads the GPU's hardware performance counters, which needs elevated privileges inside a container, so it won't work unless you buy out the whole node and ask us to give you permission. :frowning3:
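Since hardware-counter profilers like Nsight Compute are blocked in unprivileged containers, one fallback that needs no special permissions is timing the kernel with CUDA events. A minimal sketch, assuming a placeholder vector-add kernel and problem size (substitute your own kernel and launch configuration):

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Placeholder kernel for illustration -- substitute your own.
__global__ void vecAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    float *a, *b, *c;
    cudaMalloc(&a, n * sizeof(float));
    cudaMalloc(&b, n * sizeof(float));
    cudaMalloc(&c, n * sizeof(float));

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    // Warm up once so one-time launch overhead isn't measured.
    vecAdd<<<(n + 255) / 256, 256>>>(a, b, c, n);

    // Events are recorded on the GPU timeline, bracketing the launch.
    cudaEventRecord(start);
    vecAdd<<<(n + 255) / 256, 256>>>(a, b, c, n);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);

    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);  // elapsed time in milliseconds
    printf("kernel time: %.3f ms\n", ms);

    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    cudaFree(a);
    cudaFree(b);
    cudaFree(c);
    return 0;
}
```

Because the events record on the GPU timeline, this measures kernel execution without CPU-side launch noise; run the timed launch in a loop and average for stable numbers. Note this gives you wall-clock kernel time only, not the occupancy/memory-throughput counters Nsight Compute would report.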