•Created by Jaunty on 3/7/2025 in #⚡|serverless
Worker other than Python
As I understand it, only a Python library is provided for serverless workers?
If I don't use Python in my Docker image (a C# console app that uses the CUDA library), can I just run an HTTP server in my worker application, listen on the port set in the "RUNPOD_REALTIME_PORT" environment variable (as far as I understand from the RunPod GitHub), and register some HTTP routes for receiving job inputs, cancelling, etc.? If so, where can I find the list of routes my HTTP server needs to implement in order to act as a worker?
2 replies
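A minimal sketch of the approach described in the question, in Python with only the standard library: an HTTP server that binds to the port from RUNPOD_REALTIME_PORT. The /job and /cancel routes are hypothetical placeholders; the real route list is exactly what the question asks for and is not confirmed here.

```python
import json
import os
from http.server import BaseHTTPRequestHandler, HTTPServer


class WorkerHandler(BaseHTTPRequestHandler):
    """Toy worker endpoint; route names below are illustrative only."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", "0"))
        payload = self.rfile.read(length)
        if self.path == "/job":        # placeholder: receive a job input
            reply = {"status": "accepted", "input": payload.decode() or None}
        elif self.path == "/cancel":   # placeholder: cancel a running job
            reply = {"status": "cancelled"}
        else:
            self.send_response(404)
            self.end_headers()
            return
        data = json.dumps(reply).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)


def run_worker() -> HTTPServer:
    """Bind to the port RunPod advertises via RUNPOD_REALTIME_PORT (default 8080)."""
    port = int(os.environ.get("RUNPOD_REALTIME_PORT", "8080"))
    return HTTPServer(("0.0.0.0", port), WorkerHandler)


# run_worker().serve_forever()  # blocks; call from your app's entry point
```

The same shape translates directly to a C# console app (e.g. an HttpListener reading the same environment variable); the open question remains which routes and payload schema the RunPod platform expects.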
Pod with multiple GPUs (RTX 4090)
When I start runpod/pytorch:2.4.0-py3.11-cuda12.4.1-devel-ubuntu22.04 with multiple GPUs (RTX 4090), a simple .cu file that just calls cudaGetDeviceCount fails with error 999 (unknown error). Running the same code on an instance with one GPU works fine, and all GPUs are visible in nvidia-smi. Am I missing something obvious in the pod setup?
2 replies