Confusion on usage of MP4.Demuxer.ISOM
Hi, I'm trying to write a simple pipeline which takes in an mp4 source and streams it out via the RTMP sink. I have some general confusion about how to properly wire up the MP4.Demuxer.ISOM.
1. How do I decode/parse the output pads' stream format downstream from the source? I haven't seen any examples or demos of transforming the MP4.Payload.{AAC, AVC1} formats.
2. How do I properly handle dynamically attaching the output pads for each track to downstream elements? If I want to handle the :new_track message to attach to some sink with pads in the :always availability mode (such as the RTMP sink), I can't temporarily attach that track to some grouping of elements which ends at the sink. For example, if I get the :new_track notification for an AAC track, I can't attach just the :audio pad of the RTMP sink, because when handling that callback there is no video pad to attach.
3. Is it better to handle the track determination statically? I.e., should I ffprobe the mp4 file and parse the available tracks beforehand?
4. Does the demuxer only handle mp4 with embedded H264/AAC? Would a container with H264/MP3 fail to demux?
Thanks!
1. Can't recall, we could have better docs & some examples for that @shuntrho?
2. In this case it would be best to spawn the children up to the MP4 demuxer, wait until both audio and video arrive, then spawn the RTMP sink and link both pads to it in a single action
3. I would only do that if necessary for some reason. The more tools, the more problems
4. We only tried H264 and AAC, but the demuxer seems to be codec-agnostic, so maybe MP3 will work, though it may require some tweaking
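To illustrate point 2 above: a minimal sketch of a pipeline that collects :new_track notifications from the demuxer and only spawns and links the RTMP sink once both tracks are known. Module names, options, and the exact notification shape are assumptions based on membrane_mp4_plugin and membrane_rtmp_plugin and may differ between versions; parsers between the demuxer and sink are omitted for brevity.

```elixir
defmodule DemuxToRtmp do
  use Membrane.Pipeline

  @impl true
  def handle_init(_ctx, opts) do
    # Spawn only the children up to the demuxer; the sink comes later
    spec =
      child(:source, %Membrane.File.Source{location: opts[:input]})
      |> child(:demuxer, Membrane.MP4.Demuxer.ISOM)

    {[spec: spec], %{rtmp_url: opts[:rtmp_url], tracks: %{}}}
  end

  @impl true
  def handle_child_notification({:new_track, id, format}, :demuxer, _ctx, state) do
    state = put_in(state.tracks[id], format)

    if map_size(state.tracks) == 2 do
      # Both tracks have arrived: spawn the sink and link both
      # demuxer pads to it in a single spec action
      links =
        for {id, format} <- state.tracks do
          get_child(:demuxer)
          |> via_out(Pad.ref(:output, id))
          # sink_pad_for/1 is a hypothetical helper returning :audio or
          # :video depending on the track's format
          |> via_in(sink_pad_for(format))
          |> get_child(:rtmp_sink)
        end

      spec = [child(:rtmp_sink, %Membrane.RTMP.Sink{rtmp_url: state.rtmp_url}) | links]
      {[spec: spec], state}
    else
      # Still waiting for the other track
      {[], state}
    end
  end
end
```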
Awesome, thanks @mat_hek -- an example of a pipeline of elements downstream of the demuxer would be awesome. I haven't been able to figure it out, and right now I have a very hacky solution where I ffprobe to get stream indices and then shell out to ffmpeg to manually demux the h264/aac -- not great.
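For reference, the ffprobe step described above can be done from Elixir without a shell script. This is only a sketch: it assumes the Jason library is available for JSON decoding, and uses standard ffprobe flags to list each stream's index, type, and codec.

```elixir
# Probe the container's tracks ahead of time by shelling out to ffprobe
# and decoding its JSON report
{json, 0} =
  System.cmd("ffprobe", [
    "-v", "error",
    "-show_entries", "stream=index,codec_type,codec_name",
    "-of", "json",
    "input.mp4"
  ])

%{"streams" => streams} = Jason.decode!(json)

for %{"index" => i, "codec_type" => type, "codec_name" => codec} <- streams do
  IO.puts("track #{i}: #{type}/#{codec}")
end
```

As noted above, though, this is best avoided unless the dynamic :new_track handling turns out to be insufficient.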
For 2, yes, that makes sense -- I've seen a similar pattern in some Membrane examples, but in the opposite direction (waiting for EOS from all elements before terminating the pipeline).
In my initial testing I thought it would be a nice abstraction to do the demuxing and any decoding/parsing in a bin; however, I noticed that the bin doesn't get the :new_track child notification. When I converted that simple bin into a pipeline, suddenly I was getting the :new_track notification -- any ideas on why that might be happening? Having to hoist the demuxer into a pipeline isn't a problem, I just don't understand why the bin isn't handling the child notification.
Not sure why the notification is not received -- possibly it's not sent at all? The debug logs may tell; you may need to enable verbose logging. If you have a repo reproducing that, I can try to help
Hm, apologies -- I created a minimal example repo and could not reproduce. I tried again in my application repo and couldn't reproduce either... odd, sorry for the confusion.
If you could point me in the direction of how to parse the mp4 payloads, that would be much appreciated. Thanks for all the awesome work you and the team have put out!!
Hah, understood, been there. Looking at the RTMP sink example https://github.com/membraneframework/membrane_rtmp_plugin/blob/master/examples/sink.exs it seems that it uses the same format as the MP4 demuxer outputs for H264 and AAC, so I would try just plugging it in directly (MP4 output -> RTMP input). We'll try to find some time next week to sort that out and provide some examples.
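Following that suggestion, the direct wiring could look roughly like this. A sketch only: the track ids `1` and `2` are placeholders (in practice they come from :new_track notifications), and the direct demuxer-to-sink links assume the formats are compatible, as speculated above.

```elixir
# Hypothetical static spec plugging MP4 demuxer outputs straight
# into the RTMP sink's static :video and :audio pads
spec = [
  child(:source, %Membrane.File.Source{location: "input.mp4"})
  |> child(:demuxer, Membrane.MP4.Demuxer.ISOM),

  child(:rtmp_sink, %Membrane.RTMP.Sink{rtmp_url: rtmp_url}),

  # Placeholder track ids; real ids arrive via :new_track
  get_child(:demuxer)
  |> via_out(Pad.ref(:output, 1))
  |> via_in(:video)
  |> get_child(:rtmp_sink),

  get_child(:demuxer)
  |> via_out(Pad.ref(:output, 2))
  |> via_in(:audio)
  |> get_child(:rtmp_sink)
]
```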
Awesome, thanks Mat. If it helps, here's a pipeline I've tried after incorporating some of your other tips; however, it's not working! Looking forward to seeing what you all come up with
https://gist.github.com/nickdichev-firework/ffdc3ca3bd6ce9efa6207dec952c5c47
Hi, it seems that we are missing some information when sending stream_format from the demuxer. The correct in_encapsulation for the AAC.Parser should be :none instead of :ADTS (there is no ADTS in mp4, so we don't expect to receive it from the demuxer). However, the information which in other cases is contained in ADTS, such as sample rate, samples per frame, or frames per buffer, should be sent in the stream_format, which is not happening and should be added to the demuxer. If I have time, I will work on it some time this week.
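For the audio branch, the fix described above would translate to configuring the parser roughly like this. Treat it as a sketch: option names follow membrane_aac_plugin at the time of writing, and `audio_track_id` is a placeholder for the id received in the :new_track notification.

```elixir
get_child(:demuxer)
|> via_out(Pad.ref(:output, audio_track_id))
|> child(:audio_parser, %Membrane.AAC.Parser{
  # MP4 carries raw AAC frames, not ADTS, so the parser should not
  # expect ADTS encapsulation on input
  in_encapsulation: :none
})
|> via_in(:audio)
|> get_child(:rtmp_sink)
```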
Hi @.wol, yes, I think it would be nice to eventually transition our pipelines to use the mp4 demuxer; however, this shouldn't be a blocker for the v1 release we have coming up.
Sure!