Software Mansion

Testing a Membrane Bin used in a WebRTC Engine Endpoint

I'm trying to test a Membrane bin that I use in a WebRTC engine endpoint, though I'm having trouble setting my test up properly. I was trying to approach it this way: 1. Start my Membrane.Testing.Pipeline with a simple spec, just the conversation bin 2. Send a {:new_tracks} event to the bin, simulating what the Membrane.WebRTC.Engine does when a new bin is added as an endpoint...
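A minimal sketch of that approach, assuming membrane_core's testing helpers (`MyApp.ConversationBin`, the `:conversation_bin` child name, and the `tracks` variable are placeholders, and the `{:new_tracks, ...}` message shape is taken from the question, so verify it against your engine version):

```elixir
import Membrane.ChildrenSpec

# 1. Start a testing pipeline whose spec contains only the bin under test.
pipeline =
  Membrane.Testing.Pipeline.start_link_supervised!(
    spec: [child(:conversation_bin, MyApp.ConversationBin)]
  )

# 2. Simulate the engine by notifying the bin directly.
Membrane.Testing.Pipeline.notify_child(pipeline, :conversation_bin, {:new_tracks, tracks})
```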

WebRTC - Add Track to running PeerConnection

I have started out with the Nexus WebRTC example app (https://github.com/elixir-webrtc/apps/tree/master/nexus). Instead of automatically starting the stream, I am now running createPeerConnection() and joinChannel() without starting any local streams. I have then added a button that when pressed, should start the local webcam stream and broadcast it to other peers. On button press I execute:...

High WebRTC CPU consumption

Hey! We are performing some benchmarks with an ExWebRTC-based pipeline: on a 4-CPU droplet (with dedicated cores), 10 incoming streams consume 100% of CPU. Is that expected behaviour? Our pipeline essentially consumes H264 video and transcodes the AAC to Opus, nothing else. The second question is: how do we trace the CPU consumption of Membrane elements? I was trying https://hexdocs.pm/membrane_core/Membrane.Pipeline.html#module-visualizing-the-supervision-tree, but I don't see the pipeline at all in :observer. The Live dashboard, on the other hand, lists the processes, and I can sort by "Number of Reductions", but there every element shows up as just Membrane.Core.Element, which makes it practically impossible to distinguish which process causes the most CPU consumption....

Continuous RTMP stream without constant output or changing output

Hi there, I am wondering if it's possible to use Membrane to do the following: - We have cameras that are streaming RTMP continuously to the server....

Testing a filter with flow control :auto?

I'm trying to test a Membrane filter where the flow control is set to :auto. I'm using Membrane.Testing.Source and passing in a custom generator function. However, it appears to be called only once. Am I setting this testing pipeline up incorrectly?
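One hedged guess at the cause: `Membrane.Testing.Source`'s generator is only invoked again when it returns a `:redemand` action, so a generator that omits it runs exactly once. A sketch under that assumption (the payloads and the counter-based state are placeholders):

```elixir
generator = fn state, size ->
  buffers = Enum.map(1..size, fn _ -> %Membrane.Buffer{payload: <<0>>} end)

  if state > 0 do
    # :redemand is what causes the generator to be called again.
    {[buffer: {:output, buffers}, redemand: :output], state - 1}
  else
    {[end_of_stream: :output], state}
  end
end

source = %Membrane.Testing.Source{output: {10, generator}}
```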

RTSP push approach with Membrane.RTSP.Server

We are trying to add RTSP to our media server using Membrane. The main thing is that we need to implement a push approach: an Elixir TCP server starts listening for incoming RTSP connections from cameras, and then pushes the incoming RTSP video stream to the clients. We used Membrane.RTSP.Server with a handler that handles the ANNOUNCE, DESCRIBE and RECORD steps accordingly. On the RECORD step, we pass socket control to the pipeline pid: ```elixir Enum.each(tracks, fn {_, track} -> options = [...

Get Video from RTSP and stream by RTMP

Hi guys, I'm building a real-time video stream over RTMP, but my camera's video source is an RTSP link. I used Membrane.RTSP.Source but got stuck with many errors. Please help me, thank you.

How to split a raw audio buffer with 2 channels per frame into two different buffers

Hey team! I'm trying to process an FLV stream with the AAC codec for the audio. The audio has 2 channels that I would like to treat separately; is there a way to split my pipeline in order to handle the 2 channels differently? Here is an overview of the pipeline: ``` child(:source, %Membrane.RTMP.Source{ socket: socket...
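Deinterleaving itself is simple binary surgery once the audio is decoded to raw PCM; a sketch in plain Elixir for s16le stereo (the module and function names are made up, and inside Membrane this logic would live in a custom `Membrane.Filter` callback):

```elixir
defmodule StereoSplit do
  @doc "Splits interleaved s16le stereo PCM into {left, right} mono binaries."
  def split(<<>>), do: {<<>>, <<>>}

  def split(<<l::binary-size(2), r::binary-size(2), rest::binary>>) do
    {left, right} = split(rest)
    {l <> left, r <> right}
  end
end

# Two frames: (L=1, R=2) then (L=3, R=4), 16-bit little-endian samples.
interleaved = <<1::little-16, 2::little-16, 3::little-16, 4::little-16>>
{left, right} = StereoSplit.split(interleaved)
# left  == <<1::little-16, 3::little-16>>
# right == <<2::little-16, 4::little-16>>
```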

Logs are overrun with `Sending stream format through pad` messages. Am I doing something wrong?

I have a simple pipeline for processing text. I am using flow_control :auto for all elements in the pipeline besides the source, which has flow_control set to :push. All elements (with the exception of the source) log thousands of messages of the form ``` app-1 | [debug] <0.3808.0>/:some_name Sending stream format through pad :output...
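Those messages are logged at `:debug` level, so the usual fix is plain Elixir `Logger` configuration rather than anything Membrane-specific; a config fragment (raising the level hides them at runtime, while compile-time purging removes the calls entirely, though it only affects code compiled after the setting is in place):

```elixir
# config/config.exs
config :logger, level: :info

# Optional: strip debug-level Logger calls at compile time.
config :logger, compile_time_purge_matching: [
  [level_lower_than: :info]
]
```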

How to send control events upstream/downstream?

Hi 👋 I am new to Elixir and even newer to Membrane 🙂 I am trying to determine how to send events to elements in the pipeline, which one or more other elements (upstream or downstream) could handle. I came across https://hexdocs.pm/membrane_core/Membrane.Event.html , which links to https://hexdocs.pm/membrane_core/Membrane.Element.Action.html#t:event/0. Am I correct in understanding that control events should also be sent via pads (maybe by creating custom pads that are not input/output but something like signal or control)?...
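For what it's worth, custom events travel through the regular input/output pads rather than dedicated control pads; a hedged sketch (the `SpeechStarted` struct and its field are made-up examples):

```elixir
defmodule MyApp.SpeechStarted do
  @derive Membrane.EventProtocol
  defstruct [:timestamp]
end

# Inside an element callback, send the event through a pad as an action:
#   {[event: {:output, %MyApp.SpeechStarted{timestamp: ts}}], state}
#
# Sending through :output propagates downstream; sending through :input
# propagates upstream. Other elements receive it in handle_event/4:
#   def handle_event(:input, %MyApp.SpeechStarted{} = event, _ctx, state), do: ...
```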

Guidance on turning a low fps stream of jpegs into a video

Hi folks! I'm new to Membrane (about 3 hours in by now) and just looking for some pointers on where to go. So far I have a source receiving TCP packets from a 3D printer, and a filter parsing those into JPEG images. I'd essentially like to end up with a low-fps video feed from these. Any pointers?

membrane_rtc_engine/membrane_rtc_engine_ex_webrtc error

Hi, while trying to use membrane_rtc_engine with the package membrane_rtc_engine_ex_webrtc, I'm seeing that there's a dependency mismatch. E.g., the example at the bottom of this page (https://github.com/fishjam-cloud/membrane_rtc_engine/tree/master?tab=readme-ov-file#repository-structure): ```...

Demuxing Safari MP4

Hi! I'm trying to use a MediaRecorder to record audio/mp4 on Safari, and then handle it using Membrane. Membrane.MP4.Demuxer.ISOM gives me an error: ``` Error parsing MP4 box: moof / traf / tfhd...

SDL plugin fails to initialize

Hi. I am trying to play a UDP stream via the SDL sink, but it fails to initialize. I am on Arch Linux and using Hyprland (Wayland), which may be the cause of the problem. I have attached the error and pipeline....

HTTP adaptive stream continuous segments

Hey there! I have a question regarding the https://github.com/membraneframework/membrane_http_adaptive_stream_plugin library. Is there a way to configure the starting number for the segment, partial_segment or header? The use case is that we want to keep the segment, header and partial-segment counters continuous after restarting the stream. Is that even possible? Any arguments against such an approach? ...

Sections of files

I have a bunch of mp4 files sitting on disk and I'd like clients to be able to request arbitrary segments of them (e.g., starting 200 seconds in until 250s). I'm a beginner in any kind of digital video, is this even slightly viable with membrane somehow?

Pipeline Error: Pipeline Failed to Terminate within Timeout (5000ms)

This is a bit of a head-scratcher for me. I'm in the process of writing a new element for my pipeline that uses the Silero VAD module for speech detection (rather than the built-in WebRTC Engine VAD extension). I've got it working, but I'm hitting a weird bug. Now when my engine terminates (a peer leaves), I'm getting this error: ** (Membrane.PipelineError) Pipeline #PID<0.1499.0> hasn't terminated within given timeout (5000 ms). The only thing that's changed is my new element in the pipeline (it's set up as a Membrane.Filter). If I remove the element from the pipeline, then the error goes away. 

burn in caption to mp4

Thanks first of all for building the whole family of products. To learn Membrane & friends, I want to build a simple Livebook-based tool that starts with a video and its WebVTTs (multilingual), generates the image sequence (with Image or Typst), then burns the image-sequence captions into the video. Is Membrane, Live Compositor, or Boombox better suited for this purpose?...

stream RTMP to VLC Network stream

Hello, currently we are using membrane_rtmp_plugin to receive an RTMP stream as a source (with the help of Membrane.RTMPServer and Membrane.RTMP.SourceBin). All is fine; we migrated successfully to version 0.26.0, which simplifies the pipeline a lot. We also did a POC of streaming RTMP to a streaming service (YouTube), and everything is working as expected. I am curious, is there any way to stream RTMP to VLC Player (this is probably called the pull approach)? I mean File -> Open Network -> Specify URL (eg. r...

Fly.io + UDP

I've got a membrane_webrtc server setup where someone can "call" an LLM and talk with them (audio only, no video). It largely works, though my users are reporting random disconnects. The console errors match the attached image. I'm a little thrown, since the URL in that message specifies UDP as the transport. I deployed to fly.io and explicitly did not open up the UDP ports in my fly.toml, so I'm wondering why the app is failing with a UDP timeout. Am I incorrect in assuming that I can force all traffic over TCP by just not opening UDP up? Should I also figure out UDP? On UDP, I read through this: https://github.com/fishjam-dev/fishjam-docs/blob/main/docs/deploying/fly_io.md But it's Fishjam-specific, and it doesn't line up neatly with my app, which is based on the old membrane video room repo (https://github.com/membraneframework-labs/membrane_videoroom). Where does fly-global-services get specified in that case? I'm not explicitly setting a TURN_LISTEN_IP. I traced through things; I think it could be here in turn_ip (and then the turn_mock_ip is my external IPv4 address)...