Modifying pipeline after it has been started

I would like to create/remove additional children in the pipeline after it has been started and is :playing. I was able to create children after the pipeline started, but I was wondering if it's fundamentally wrong to do so. Let's say I have this simple Membrane pipeline that reads from a file, passes the data to a tee, and has a sink that writes to a file.
defmodule SimplePipeline do
  use Membrane.Pipeline

  @impl true
  def handle_init(_context, _options) do
    spec = [
      child(:input, %MyFileSource{location: "input.txt"})
      |> child(:tee, Membrane.Tee.Parallel)
      |> child(:output, %MyFileSink{location: "output.txt"})
    ]

    {[spec: spec], %{}}
  end
end
Then I start this pipeline and it begins reading from the file. While it is still reading, is it fundamentally wrong to invoke a new spec action to add additional children?
  # in the same SimplePipeline module
  @impl true
  def handle_info(:add_new_sink, _ctx, state) do
    spec = [
      get_child(:tee)
      |> child(:new_output, %MyFileSink{location: "output_2.txt"})
    ]

    {[spec: spec], state}
  end
(I do that with send(pipeline_pid, :add_new_sink).) I tested this and it worked fine. I also looked at the source code of how Membrane.Core.Pipeline initializes its children and I don't see anything obvious that would break. I just want to make sure that this won't impact performance or break the benefits that Membrane gives (e.g. shared memory buffers). Follow-up question: does Tee use shmex to share the buffer across multiple elements, or does each element get a duplicate copy of the same buffer? Thanks!
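For reference, a minimal sketch of starting the pipeline and sending that message; it assumes Membrane 1.x, where Membrane.Pipeline.start_link/2 returns both the supervisor pid and the pipeline pid:

  # Start the pipeline (on Membrane 1.x this returns {:ok, supervisor_pid, pipeline_pid})
  {:ok, _supervisor_pid, pipeline_pid} = Membrane.Pipeline.start_link(SimplePipeline)

  # Deliver the message handled by the handle_info(:add_new_sink, ...) clause above
  send(pipeline_pid, :add_new_sink)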
varsill (6mo ago)
Hello! It's completely fine to spawn children on request, for instance in response to handle_info. Many plugins follow a similar scenario: for example, you can spawn the MP4 demuxer, wait until it sends a new_tracks notification to the pipeline, and then add a new spec that handles the tracks resolved from the MP4 container. It shouldn't have any negative impact on performance.
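A rough sketch of that pattern, assuming Membrane.MP4.Demuxer.ISOM named :demuxer, a {:new_tracks, [{track_id, format}]} notification shape, and a hypothetical MyFileSink like the one above (check the plugin docs for the exact notification format in your version; Membrane.Pad may need to be aliased/required at the top of the module):

  @impl true
  def handle_child_notification({:new_tracks, tracks}, :demuxer, _ctx, state) do
    # For every track resolved from the MP4 container, link a new sink
    # to the matching dynamic output pad of the demuxer.
    spec =
      for {track_id, _format} <- tracks do
        get_child(:demuxer)
        |> via_out(Pad.ref(:output, track_id))
        |> child({:track_output, track_id}, %MyFileSink{location: "track_#{track_id}.raw"})
      end

    {[spec: spec], state}
  end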
tintin (OP, 6mo ago)
awesome, thanks for the response!
varsill (6mo ago)
Concerning Tee, each element will get a copy of the same buffer; a buffer's payload is just subject to the regular Erlang binary handling mechanism.
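In practice that means only the Membrane.Buffer struct itself is copied when a buffer is delivered to each element; a large payload binary follows the usual BEAM rules, where binaries larger than 64 bytes are reference-counted and shared between processes on the same node rather than deep-copied. A small illustration (the names here are only for the example, not a Tee-specific API):

  # Binaries larger than 64 bytes are stored off-heap as reference-counted ("refc") binaries
  payload = :binary.copy(<<0>>, 1_000_000)
  buffer = %Membrane.Buffer{payload: payload}

  # Sending `buffer` to two sink processes copies the struct, but both copies
  # point at the same reference-counted payload, so the 1 MB of data itself
  # is not duplicated.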