WebRTC stream not working

I've hit an issue trying to get an MPEG-TS stream displaying in the browser via WebRTC. Everything seems to work: I can add the WebRTC endpoint for the stream to the RTC Engine, the web client successfully connects, negotiates tracks, and handles the offer and SDP answer, and I can add the transceiver for the stream to the RTCPeerConnection and get the MediaStream. But when I set the MediaStream as the video element's srcObject, I get a blank screen with a spinner and see a continuous stream of "Didn't receive keyframe for variant:" messages on the server side:
[debug] Responding to STUN request [UDP, session q02, anonymous, client 100.120.41.48:50473]
[debug] No callback function specified for 'stun_query' hook [UDP, session q02, anonymous, client 100.120.41.48:50473]
[debug] <0.2875.0>/{:endpoint, "6adebe85-148e-4fc7-b06d-3246d0b345bd"}/:endpoint_bin/:ice Sending Binding Indication with params: [magic: 554869826, transaction_id: 27660277917344572907996316102]
[debug] <0.2875.0>/{:endpoint, "6adebe85-148e-4fc7-b06d-3246d0b345bd"}/:endpoint_bin/:ice Received Binding Request with params: [priority: 1853825279, magic: 554869826, trid: 27653466564182733022482168431, username: "JBcs:+aQz", use_candidate: true, ice_controlled: false, ice_controlling: true]
[debug] <0.2875.0>/{:endpoint, "6adebe85-148e-4fc7-b06d-3246d0b345bd"}/:endpoint_bin/:ice Sending Binding Success with params: [magic: 554869826, transaction_id: 27653466564182733022482168431, username: "JBcs:+aQz"]
[debug] <0.2875.0>/{:endpoint, "ed395134-d9c5-4f8e-b4a5-5274c0319487"}/{:track_sender, "07DB486FABD21734"} Didn't receive keyframe for variant: high in 500. Retrying.
...
Any clues as to what I'm missing would be gratefully received. For context, I followed the RTC Engine file example and have a main pipeline taking the video output from the MPEG-TS demuxer and delivering it to a WebRTC endpoint bin.
20 Replies
Radosław · 9mo ago
The first thing I would check is which H264 profile your MPEG-TS file uses. Our implementation and most browsers support only the baseline profile.
Al · 9mo ago
@Radosław, thanks, it looks like that may be the problem. ffprobe shows it's the Main profile:
[STREAM]
index=0
codec_name=h264
codec_long_name=H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10
profile=Main
codec_type=video
...
Radosław · 9mo ago
OK, so try transcoding this file to the baseline profile and check whether it works in that scenario. If it does, you have two options: a) guarantee that your MPEG-TS files will always contain H264 in the baseline profile, or b) add transcoding to your pipeline after the MPEG-TS demuxer (you can use this plugin for that: https://github.com/membraneframework/membrane_h264_ffmpeg_plugin), but remember that this will significantly increase CPU usage.
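For (b), here is a rough sketch of what the transcoding step could look like, assuming the Decoder and Encoder elements from membrane_h264_ffmpeg_plugin (check the option names against the plugin version you use):
# decode the Main-profile H264 coming out of the demuxer...
get_child(:demuxer)
|> via_out(Pad.ref(:output, {:stream_id, sid}))
|> child(:video_parser, %Membrane.H264.Parser{output_alignment: :au})
|> child(:decoder, Membrane.H264.FFmpeg.Decoder)
# ...and re-encode it as baseline before handing it to the WebRTC endpoint
|> child(:encoder, %Membrane.H264.FFmpeg.Encoder{profile: :baseline})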
Al · 9mo ago
Transcoding with ffmpeg, I now get this from ffprobe:
[STREAM]
index=0
codec_name=h264
codec_long_name=H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10
profile=Constrained Baseline
codec_type=video
codec_tag_string=[27][0][0][0]
codec_tag=0x001b
...
I still get the "Didn't receive keyframe for variant: high in 500. Retrying." messages.
Al · 9mo ago
Running ffprobe on the test fixture video.h264 file that's part of the file source endpoint test (https://github.com/jellyfish-dev/membrane_rtc_engine/tree/master/file/test/fixtures) shows that its profile is High:
[STREAM]
index=0
codec_name=h264
codec_long_name=H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10
profile=High
codec_type=video
codec_tag_string=[0][0][0][0]
codec_tag=0x0000
...
So I wonder whether the profile is really the issue.
Al · 9mo ago
I guess I must be doing something dumb with my pipeline. When I try to use Membrane.Debug.Filter to inspect the output from the demuxer, I get nothing:
get_child(:demuxer)
|> via_out(Pad.ref(:output, {:stream_id, sid}))
|> child(:filter, %Membrane.Debug.Filter{
handle_buffer: &IO.inspect(&1, label: "buffer"),
handle_stream_format: &IO.inspect(&1, label: "stream format")
})
|> child(:video_parser, %Membrane.H264.Parser{
generate_best_effort_timestamps: %{framerate: {30, 1}},
output_alignment: :au
})
|> via_in(:input, toilet_capacity: 1_000)
|> child(:realtimer, Membrane.Realtimer)
|> via_in(Pad.ref(:input, :main_stream), options: [media_track: track])
|> get_child(:webrtc)
This worked fine when the output was to HLS:
get_child(:demuxer)
|> via_out(Pad.ref(:output, {:stream_id, sid}))
|> child(:video_nal_parser, %Membrane.H264.Parser{
generate_best_effort_timestamps: %{framerate: {30, 1}},
output_alignment: :au
})
|> via_in(Pad.ref(:input, :video),
options: [encoding: :H264, segment_duration: Membrane.Time.seconds(4)]
)
|> child(:video_out,
%Membrane.HTTPAdaptiveStream.SinkBin{
manifest_module: Membrane.HTTPAdaptiveStream.HLS,
mode: :vod,
target_window_duration: :infinity,
persist?: true,
storage: %Membrane.HTTPAdaptiveStream.Storages.FileStorage{directory: state.hls_path}
},
get_if_exists: true)
Radosław · 9mo ago
OK, another thing that could be a problem is the output alignment; if I remember correctly, WebRTC requires :nalu format instead of :au.
Al · 8mo ago
I'm using :nalu alignment:
|> child(:video_parser, %Membrane.H264.Parser{
generate_best_effort_timestamps: %{framerate: {30, 1}},
output_alignment: :nalu
})
So I don't think it's that. I can only think that my pipeline isn't getting hooked up end to end, since the Membrane.Debug.Filter I've added on the video output of the demuxer isn't showing any data, so I guess there's nothing generating demand (at least that's what my limited understanding of Membrane suggests!).
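One way to sanity-check that would be to terminate the demuxer output with a sink, since a sink always generates demand. A minimal sketch, assuming Membrane.Debug.Sink from membrane_core:
# if buffers print here, the demuxer is producing data and the problem is
# further downstream; if not, nothing is demanding buffers from the demuxer
get_child(:demuxer)
|> via_out(Pad.ref(:output, {:stream_id, sid}))
|> child(:debug_sink, %Membrane.Debug.Sink{
handle_buffer: &IO.inspect(&1, label: "buffer")
})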
Radosław · 8mo ago
OK, I assumed that based on the pipeline you provided. Also, looking once again at this pipeline, I want to ask why you link it inside membrane_webrtc_plugin / WebRTCEndpoint instead of using FileEndpoint: https://github.com/jellyfish-dev/membrane_rtc_engine/blob/dff42897293d88e0d70c34658473ae7ca31eee6b/file/lib/file_source_endpoint.ex#L66-L82 There, in after_source_transformation, you could specify all the required elements that you need.
Radosław · 8mo ago
And it could work better, as maybe your current approach with only the WebRTCEndpoint or EndpointBin lacks some messages required by one of these elements.
Al · 8mo ago
Sorry, I should have been clearer: yes, my original H264.Parser was configured with :au alignment, but following your comment I changed it to :nalu and it had no impact. The application is aimed at interfacing to a UAV camera, with the video payload delivered as an MPEG-TS stream over UDP. I've been using ex_nvr as the model, in particular the main pipeline https://github.com/evercam/ex_nvr/blob/master/apps/ex_nvr/lib/ex_nvr/pipelines/main.ex, the WebRTC bin element https://github.com/evercam/ex_nvr/blob/master/apps/ex_nvr/lib/ex_nvr/pipeline/output/web_rtc.ex and the associated stream endpoint https://github.com/evercam/ex_nvr/blob/master/apps/ex_nvr/lib/ex_nvr/pipeline/output/webrtc/stream_endpoint.ex. It does seem quite complicated. Perhaps, as you suggest, it might be easier to instantiate a room GenServer and add a single stream endpoint.
Al · 8mo ago
Thanks for your help btw, very much appreciated!
Radosław · 8mo ago
OK, so maybe we should clarify something: which protocols do you want to use? The example from evercam is pretty complex. If I understand correctly, your input is a file with an MPEG-TS stream and you would like to send it to the browser through WebRTC. In that case, IMO the best option for you would be to use membrane_rtc_engine, where the source would be a FileEndpoint (https://github.com/jellyfish-dev/membrane_rtc_engine/tree/master/file) and the second endpoint added to the engine would be a WebRTCEndpoint (https://github.com/jellyfish-dev/membrane_rtc_engine/tree/master/webrtc). If your input is an RTSP stream instead, I think you could simply swap the FileEndpoint for the RTSPEndpoint (https://github.com/jellyfish-dev/membrane_rtc_engine/tree/master/rtsp).
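Roughly, the room setup would then look like this (rtc_engine_options, file_endpoint and webrtc_endpoint are placeholders; the options the endpoint structs require depend on the package versions you use):
{:ok, engine} = Membrane.RTC.Engine.start(rtc_engine_options, [])
Engine.register(engine, self())

# add one media source plus one WebRTC endpoint per connected browser peer;
# file_endpoint / webrtc_endpoint stand for structs built from the packages above
:ok = Engine.add_endpoint(engine, file_endpoint, id: file_endpoint_id)
:ok = Engine.add_endpoint(engine, webrtc_endpoint, id: peer_id)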
Al · 8mo ago
OK, I've decided to simplify everything for my own benefit. I've cloned the videoroom example that comes with the RTC Engine. I'd now like to add a file endpoint when the room is instantiated, so that when I join a room in the browser I get a video feed from the file endpoint. My possibly dumb question: where do I add the file endpoint? I've tried adding it at the end of the room GenServer init:
...
{:ok, pid} = Membrane.RTC.Engine.start(rtc_engine_options, [])
Engine.register(pid, self())
Process.monitor(pid)

file_peer_id = UUID.uuid4()
file_endpoint = create_file_endpoint(pid, file_peer_id)
:ok = Engine.add_endpoint(pid, file_endpoint, id: file_peer_id)
...
with create_file_endpoint defined as:
defp create_file_endpoint(rtc_engine, peer_id) do
IO.puts("Adding File #{inspect peer_id}")
video_track_config = %Isr.Pipeline.TrackConfig{
type: :video,
encoding: :H264,
clock_rate: 90_000,
fmtp: %ExSDP.Attribute.FMTP{
pt: 96
},
opts: [framerate: {60, 1}]
}

%Isr.Pipeline.Endpoint.File{
rtc_engine: rtc_engine,
file_path: "./test/fixtures/video.h264",
track_config: video_track_config,
payload_type: 96,
autoplay: true
}
end
But I clearly still misunderstand the WebRTC process, because I then get an error on the client side when handling the tracksAdded mediaEvent: there isn't an entry in the this.idToEndpoint map for the new endpointId, so it can't find the endpoint.
Radosław · 8mo ago
Here I modified our simple example so that video from a file is streamed to the browser: https://github.com/jellyfish-dev/membrane_rtc_engine/tree/al_example/examples/webrtc_videoroom
Radosław · 8mo ago
Check whether this helps you, and whether you can adjust it to your project.
Al · 8mo ago
That's great, thank you. I'd almost got there; I think this will be a great help. The next step is to try to modify it to take a UDP stream rather than a file. I'll let you know how I get on 🙂

Hi @Radosław, I've only just managed to get some time to look at this, and I'm still struggling to get it to work with a .ts file. I've stripped it back to a simple variation on the SDL file player example, and it all works if I play your example file through it. When I try the .ts file, everything works up to the point where I get the :mpeg_ts_pmt child notification; then it stops. I think the issue is that the output pad of the demuxer is flow_control: :manual, so I'm missing something to trigger the flow once the video output is hooked up to the demuxer in the pipeline. This worked fine with the http_adaptive_stream plugin, but I guess that's because the SinkBin triggered the flow. Is my intuition here correct?
Radosław · 8mo ago
Hi Al, I pushed a new commit to this branch (https://github.com/jellyfish-dev/membrane_rtc_engine/tree/al_example/examples/webrtc_videoroom). Now the endpoint reads from an MPEG-TS file, but there are some glitches/some frames are dropped in the stream. I am not sure why that is; check whether it works on your file, maybe it will. There are a lot of "Not all buffers have been processed" logs, which come from Membrane.MPEG.TS.Demuxer, and I don't know why. You could ask the creators of the plugin what these logs mean. Maybe @varsill will have some idea why the frame delivery is unstable.
Al · 8mo ago
Hi @Radosław, once again thank you! Your working version helped me figure out why my approach wasn't working: I'd cloned the demuxer repo back in December and tried to upgrade it to core 1.0, not entirely successfully it would appear! I should have checked back with the original and picked up the uplift to core 1.0.