Wiring up a JavaScript frontend using membrane-webrtc-js
Sorry if this is obvious. I'm looking through the webrtc_videoroom example in membrane_rtc_engine (link below), and it's not clear to me how audio playback for remote endpoints is managed. Does membrane-webrtc-js take care of that automatically? I see addVideoElement, but that just seems to add an HTMLVideoElement to the DOM without actually connecting it to anything from the endpoint's tracks.
https://github.com/jellyfish-dev/membrane_rtc_engine/blob/master/examples/webrtc_videoroom/assets/src/room.ts
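Roughly, the helper looks like this to me (a paraphrase of the example's shape, not its exact code; the container id is a placeholder I made up):

```ts
// Sketch of addVideoElement as I read it: it only creates the element and
// appends it to the DOM -- no MediaStream is attached at this point.
// ("videos-grid" is a placeholder container id, not the example's actual one.)
function addVideoElement(peerId: string): void {
  const video = document.createElement("video");
  video.id = peerId;        // looked up later when a stream arrives
  video.autoplay = true;    // start playing as soon as a source is set
  video.playsInline = true; // needed for inline playback on iOS Safari
  document.getElementById("videos-grid")?.appendChild(video);
}
```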
Oh, I see it now: attachStream is called when the trackReady event fires.
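So the wiring looks roughly like this (a sketch assuming the callback-style client config used in the example; the callback and field names follow my reading of it and may differ between library versions, and the import path is an assumption):

```ts
import { MembraneWebRTC } from "@membraneframework/membrane-webrtc-js";

// Sketch of the wiring: the library invokes onTrackReady once a remote
// track's stream is playable, and the app-level attachStream helper
// connects that stream to the video element addVideoElement created.
const webrtc = new MembraneWebRTC({
  callbacks: {
    onTrackReady: ({ stream, peer }) => {
      if (stream) attachStream(stream, peer.id);
    },
  },
});

function attachStream(stream: MediaStream, peerId: string): void {
  const video = document.getElementById(peerId) as HTMLVideoElement;
  // This one assignment covers audio too: an HTMLVideoElement plays the
  // audio tracks of its srcObject, so no separate <audio> element is
  // needed for remote audio playback.
  video.srcObject = stream;
}
```

That also answers my original question about audio: nothing magical happens in the library itself; setting srcObject on the video element is what makes both the video and audio tracks of the remote stream play.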