Is it possible to run a WebRTC server on a pod?
I don't think this is possible for two reasons:
1. Cloudflare's 100-second proxy timeout means that long-running connections like video chat are not possible.
2. Ports can only be exposed individually, but WebRTC typically requires a whole range of UDP ports to be exposed.
I would like to run a pod that does realtime video processing with WebRTC but I think it may be impossible. I wanted to check before I switched to another cloud GPU provider. I have a small example app that works for me on another cloud provider here for reference: https://github.com/kylemcdonald/webrtc-bot
Does this require a GPU? If not, why not just host it on $10/month VM?
This example does not require a GPU (and I have successfully installed it on a Digital Ocean droplet), but my application requires a GPU. I made this example in order to test the principle before I try to deploy the app with all its dependencies.
If you use TCP ports instead of HTTP ports, then Cloudflare shouldn't be involved. RunPod uses Cloudflare to proxy its HTTP ports, but TCP ports are a direct mapping that doesn't go through a proxy.
webrtc typically runs over udp, not tcp or http. but if udp is exempt from the proxying, then i would have expected it to "just work" 🤔 i will do some more research on what ports and protocols are supported. thank you!
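If UDP does turn out to be blocked, one common workaround is to relay the media through a TURN server hosted elsewhere, requesting TCP transport. Below is a hypothetical sketch of the ICE configuration a signaling page could hand to the browser's `RTCPeerConnection`; the TURN host and credentials are placeholders, not details from this thread.

```python
# Hypothetical ICE configuration to serialize to the browser client.
# turn.example.com, "user", and "secret" are placeholders for your own
# externally hosted TURN server and its credentials.
ICE_CONFIG = {
    "iceServers": [
        {
            # "?transport=tcp" asks the TURN server for a TCP connection
            # from the client, avoiding reliance on client-side UDP.
            "urls": ["turn:turn.example.com:3478?transport=tcp"],
            "username": "user",
            "credential": "secret",
        }
    ],
    # "relay" forbids direct peer-to-peer candidates, so all media is
    # forced through the TURN server above.
    "iceTransportPolicy": "relay",
}
```

In the browser this would be used as `new RTCPeerConnection(iceConfig)` after fetching it from the server.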
It is not possible right now, because of the UDP requirement.
It is possible. You don't need to expose UDP ports on the pod for the WebRTC session; you only need to send back the server's SDP answer after the offer SDP is received from a client. Just host your TURN server somewhere else if you're going to run your own.
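The signaling step described above can be sketched with nothing but the Python standard library. This is a minimal, hypothetical endpoint: the `/offer` route and `make_answer` stub are illustrative (not taken from the linked repo), and in a real app the stub would be replaced by your WebRTC stack (e.g. aiortc) generating the actual answer SDP.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def make_answer(offer_sdp: str) -> str:
    # Stub: a real app would feed the offer to its WebRTC stack
    # (e.g. aiortc) and return the generated answer SDP.
    return "v=0\r\n"

class SignalingHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/offer":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        offer = json.loads(self.rfile.read(length))
        answer = {"type": "answer", "sdp": make_answer(offer["sdp"])}
        body = json.dumps(answer).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the example quiet

def run(port: int = 8080) -> None:
    # Expose only this one HTTP port on the pod; the media itself is
    # relayed elsewhere (e.g. by an external TURN server), so no UDP
    # port range is needed here.
    HTTPServer(("0.0.0.0", port), SignalingHandler).serve_forever()
```

The point of the sketch is that the pod only ever answers one HTTP request per session; everything latency-sensitive happens outside it.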