CUDA profiling

Hey guys, how can I profile kernels on serverless GPUs? I have a CUDA kernel; how can I measure its performance on serverless GPUs like RunPod's?
13 Replies
Dj
Dj2mo ago
Serverless workers are pods deployed with your template, it's the same hardware in the same datacenters - only a small amount of room on each node is dedicated to serverless processing.
자베르
자베르OP2mo ago
Aha, so I can use NVIDIA Nsight Compute on them?
Dj
Dj2mo ago
I think so? I believe there's some type of benchmarking/profiling tool in that domain that requires privileges we don't give our pods, because they're containerized. I can look into it a little more in a moment here. It's Nsight I was thinking of; that won't work unless you buy out the whole node and ask us to give you permission. :frowning3:
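For context, the privileges in question are GPU performance counters: Nsight Compute's CLI (`ncu`) reads hardware counters that are restricted to admin users by default, and inside an unprivileged container it fails with `ERR_NVGPUCTRPERM`. A minimal sketch of what you'd run on a machine where you do have that access (a bare-metal rental or a real VM); `my_app` is a placeholder for your own binary:

```shell
# Profile with the full metric set and save a report file.
# Fails with ERR_NVGPUCTRPERM if counter access is restricted
# (e.g. inside an unprivileged container).
ncu --set full -o report ./my_app

# Open the saved report in the Nsight Compute GUI.
ncu-ui report.ncu-rep
```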
자베르
자베르OP4w ago
Can you offer it soon? I'm the founder of a startup and we could sign a collaboration for it. Win-win situation :D
Jason
Jason4w ago
You can rent out a whole bare-metal machine, did you mean that? If so, you can go to the RunPod website right now and request one, or request to buy a whole machine for serverless.
riverfog7
riverfog74w ago
Or test on another service (it should be a real VM, not Docker-based) with the same type of GPU, and then deploy to RunPod.
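If full profiler access isn't an option, coarse kernel timing still works inside a container, since CUDA events need no special privileges. A minimal sketch, with a placeholder `vecAdd` kernel standing in for your own:

```cuda
// Timing a kernel with CUDA events: no profiler privileges needed,
// so this runs fine inside a containerized pod. vecAdd and the
// problem size are placeholders for your own kernel.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void vecAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    float *a, *b, *c;
    cudaMalloc(&a, n * sizeof(float));
    cudaMalloc(&b, n * sizeof(float));
    cudaMalloc(&c, n * sizeof(float));

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    // Warm-up launch so the timed run excludes one-time setup cost.
    vecAdd<<<(n + 255) / 256, 256>>>(a, b, c, n);

    cudaEventRecord(start);
    vecAdd<<<(n + 255) / 256, 256>>>(a, b, c, n);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);

    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);
    printf("kernel time: %.3f ms\n", ms);

    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

This only gives wall-clock kernel time, not the per-metric breakdown Nsight Compute provides, but it's enough to compare launch configurations on the exact GPU type you'd deploy to.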
자베르
자베르OP3w ago
Nice ideas, but I've never done that before. Do you guys have any documentation explaining that?
Jason
Jason3w ago
Which one?
riverfog7
riverfog73w ago
Renting bare metal will cost a fortune though: https://www.runpod.io/console/bare-metal
Jason
Jason3w ago
Yep
riverfog7
riverfog73w ago
5000 bucks minimum, for a month. If you are a startup in Korea, look at this: https://aihub.or.kr/devsport/aicomputingsport/list.do?currMenu=121&topMenu=101
자베르
자베르OP3w ago
Ohhh, sadly. Thanks brother, you're the best!