Software Mansion

RTSP push approach with Membrane.RTSP.Server

We are trying to add RTSP support to our media server using Membrane. The main requirement is the push approach: an Elixir TCP server listens for incoming RTSP connections from cameras, then pushes the incoming RTSP video stream to clients. We used Membrane.RTSP.Server with a handler that handles the ANNOUNCE, DESCRIBE, and RECORD steps accordingly. On the RECORD step, we pass socket control to the pipeline pid: ```elixir Enum.each(tracks, fn {_, track} -> options = [...
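A sketch of the socket-handoff step itself, independent of the Membrane.RTSP.Server handler API (module and function names here are illustrative, not from the thread): once RECORD is accepted, ownership of the TCP socket has to be transferred to the pipeline process, since `:gen_tcp.controlling_process/2` only works when called by the current socket owner.

```elixir
defmodule SocketHandoff do
  @moduledoc """
  Illustrative helper: hand an open :gen_tcp socket over to the pipeline
  process, so subsequent socket messages (the interleaved RTP data) are
  delivered to that process instead of the RTSP server.
  """

  # `socket` is an open :gen_tcp socket owned by the calling process;
  # `pipeline_pid` is the pid of the pipeline that will read the stream.
  def hand_over(socket, pipeline_pid) do
    # Pause delivery while ownership changes hands.
    :ok = :inet.setopts(socket, active: false)
    # Must be called by the current owner; returns :ok on success.
    :gen_tcp.controlling_process(socket, pipeline_pid)
  end
end
```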

Get Video from RTSP and stream by RTMP

Hi guys, I'm building a realtime video stream over RTMP, but my camera's video source is an RTSP link. I used Membrane.RTSP.Source but got stuck and hit many errors. Please help me, thank you.

How to split a Raw Audio Buffer with 2 channels within a frame into two different buffers

Hey team! I'm trying to process an FLV stream with the AAC codec for audio, and the audio has 2 channels that I would like to treat separately. Is there a way to split my pipeline in order to handle the 2 channels differently? Here is an overview of the pipeline: ``` child(:source, %Membrane.RTMP.Source{ socket: socket...
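The per-buffer transformation a custom Membrane.Filter would need can be sketched in plain Elixir (not from the thread; this assumes decoded s16le interleaved stereo PCM, i.e. alternating 2-byte left/right samples):

```elixir
defmodule ChannelSplit do
  @moduledoc """
  Splits an interleaved 16-bit little-endian stereo payload into two mono
  binaries {left, right}. A custom filter could apply this per buffer and
  forward each channel to a separate output pad.
  """

  # Assumes the payload contains only complete stereo frames (length is a
  # multiple of 4 bytes); incomplete frames would raise a match error.
  def split(data), do: do_split(data, <<>>, <<>>)

  defp do_split(<<l::binary-size(2), r::binary-size(2), rest::binary>>, la, ra),
    do: do_split(rest, la <> l, ra <> r)

  defp do_split(<<>>, la, ra), do: {la, ra}
end
```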

Logs are overrun with `Sending stream format through pad` messages. Am I doing something wrong?

I have a simple pipeline for processing text. I am using flow_control :auto for all elements in the pipeline besides the source, which has flow_control set to :push. All elements (with the exception of the source) have 1000s of messages of type ``` app-1 | [debug] <0.3808.0>/:some_name Sending stream format through pad :output...
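Those messages are emitted at the `:debug` level, so assuming per-buffer debug output isn't needed, raising the Logger level in config should silence them. A minimal sketch:

```elixir
# config/config.exs — drop :debug messages; Membrane logs
# "Sending stream format through pad" at the :debug level.
import Config

config :logger, level: :info
```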

How to send control events upstream/downstream?

Hi 👋 I am new to Elixir and even newer to Membrane 🙂 I am trying to determine how to send events to elements in the pipeline, which one or more other elements (upstream or downstream) could handle. I came across https://hexdocs.pm/membrane_core/Membrane.Event.html , which links to https://hexdocs.pm/membrane_core/Membrane.Element.Action.html#t:event/0. Am I correct in understanding that control events should also be sent via pads (maybe by creating custom pads that are not input/output but something like signal or control?...
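A non-runnable sketch of how a custom event might look (this is an illustration against the membrane_core API, not from the thread; `MyApp.ControlEvent` and the `:command` field are made-up names). Events travel through existing pads in both directions, so a separate control pad is generally not needed:

```elixir
defmodule MyApp.ControlEvent do
  # Deriving Membrane.EventProtocol lets this struct be sent through pads.
  @derive Membrane.EventProtocol
  defstruct [:command]
end

# Inside an element, an event can be emitted on a pad via the :event action,
# e.g. from handle_parent_notification/3:
#
#   def handle_parent_notification({:control, cmd}, _ctx, state) do
#     {[event: {:output, %MyApp.ControlEvent{command: cmd}}], state}
#   end
#
# Elements along the way receive it in their handle_event/4 callback.
```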

Guidance on turning a low fps stream of jpegs into a video

Hi folks! I'm new to Membrane (about 3 hours in by now) and just looking for some pointers on where to go, so far I have a source receiving tcp packets from a 3d printer, and a filter parsing those into jpeg images. I'd essentially like to end up with a low fps video feed from these. Any pointers?

membrane_rtc_engine/membrane_rtc_engine_ex_webrtc error

Hi, while trying to use membrane_rtc_engine with the package membrane_rtc_engine_ex_webrtc I'm seeing a dependency mismatch. I.e., the example at the bottom of this page (https://github.com/fishjam-cloud/membrane_rtc_engine/tree/master?tab=readme-ov-file#repository-structure): ```...

Demuxing Safari MP4

Hi! I'm trying to use a MediaRecorder to record audio/mp4 on Safari, and then handle it using Membrane. Membrane.MP4.Demuxer.ISOM gives me an error: ``` Error parsing MP4 box: moof / traf / tfhd...

SDL plugin fails to initialize

Hi. I am trying to play a UDP stream via the SDL sink but it fails to initialize. I am on Arch Linux using Hyprland (Wayland), which may be the cause of the problem. I have attached the error and pipeline....
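If Wayland is indeed the culprit, a first debugging step (not from the thread) is forcing SDL's video backend via the standard `SDL_VIDEODRIVER` environment variable; this assumes your SDL build includes the corresponding backend, and the script name is hypothetical:

```shell
# Try SDL's native Wayland backend first:
SDL_VIDEODRIVER=wayland mix run my_pipeline.exs

# If that fails, fall back to X11 via XWayland:
SDL_VIDEODRIVER=x11 mix run my_pipeline.exs
```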

HTTP adaptive stream continuous segments

Hey there! I have a question regarding https://github.com/membraneframework/membrane_http_adaptive_stream_plugin library. Is there a way to configure the starting number for the segment, partial_segment or header? The use case is - we want to keep the segments, headers and partial segments counter continuous after restarting the stream. Is that even possible? Any points against such an approach? ...

Sections of files

I have a bunch of mp4 files sitting on disk, and I'd like clients to be able to request arbitrary segments of them (e.g., starting 200 seconds in until 250s). I'm a beginner in any kind of digital video; is this even slightly viable with Membrane somehow?
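For context on what such a pipeline has to do, the equivalent standalone operation with ffmpeg looks like this (file names are illustrative; with `-c copy` the cut snaps to keyframes rather than the exact timestamps):

```shell
# Seek to 200s, copy 50s of stream without re-encoding.
ffmpeg -ss 200 -i input.mp4 -t 50 -c copy section.mp4
```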

Pipeline Error: Pipeline Failed to Terminate within Timeout (5000ms)

This is a bit of a head scratcher for me. I'm in the process of writing a new element for my pipeline that uses the Silero VAD module for speech detection (rather than the built-in WebRTC Engine VAD extension). I've got it working, but I'm hitting a weird bug. Now when my engine terminates (a peer leaves), I'm getting this error: ** (Membrane.PipelineError) Pipeline #PID<0.1499.0> hasn't terminated within given timeout (5000 ms). The only thing that's changed is my new element in the pipeline (it's set up as a Membrane.Filter). If I remove the element from the pipeline, then the error goes away. ...
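One common cause (a guess, not from the thread): the new filter owns a long-running resource (e.g. the process hosting the Silero VAD model) that blocks shutdown. A non-runnable sketch of cleaning up in the element's termination callback; `state.vad_pid` is an assumed field name:

```elixir
# Sketch against the Membrane.Filter behaviour; requires membrane_core.
@impl true
def handle_terminate_request(_ctx, state) do
  # Stop the external VAD process so the element can exit promptly;
  # a lingering linked process can make the pipeline exceed its
  # termination timeout (the 5000 ms in the error above).
  if state.vad_pid, do: GenServer.stop(state.vad_pid)
  {[terminate: :normal], state}
end
```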

burn in caption to mp4

Thanks first of all for building the whole family of products. To learn Membrane & friends, I want to build a simple Livebook-based tool that starts with a video and its WebVTTs (multilingual), generates the image sequence (with Image or Typst), then burns the image-sequence captions into the video. Is Membrane, Live Compositor, or Boombox better suited for this purpose?...

stream RTMP to VLC Network stream

Hello, currently we are using membrane_rtmp_plugin to receive an RTMP stream as a source (with the help of Membrane.RTMPServer and Membrane.RTMP.SourceBin). All is fine; we migrated successfully to version 0.26.0, which simplifies the pipeline a lot. We also did a POC of streaming RTMP to a streaming service (YouTube) and everything is working as expected. I am curious, is there any way to stream RTMP to VLC Player (probably this is called the pull approach)? I mean File -> Open Network -> Specify URL (eg. r...

Fly.io + UDP

I've got a membrane_webrtc server set up where someone can "call" an LLM and talk with it (audio only, no video). It largely works, though my users are reporting random disconnects. The console errors match the attached image. I'm a little thrown since the URL in that message specifies UDP as the transport. I deployed to Fly.io and explicitly did not open up the UDP ports in my fly.toml, so I'm wondering why the app is failing with a UDP timeout. Am I incorrect in assuming that I can force all traffic over TCP by just not opening up UDP? Should I also figure out UDP? On UDP, I read through this: https://github.com/fishjam-dev/fishjam-docs/blob/main/docs/deploying/fly_io.md But it's Fishjam-specific, and it doesn't line up neatly with my app, which is based on the old membrane video room repo (https://github.com/membraneframework-labs/membrane_videoroom). Where does fly-global-services get specified in that case? I'm not explicitly setting a TURN_LISTEN_IP. I traced through things; I think it could be here in turn_ip (and then the turn_mock_ip is my external IPv4 address)...

WebRTC engine and Erlang clustering / load balancing

We are currently experiencing a problem where, when we deploy our video room to production, which is a two-node cluster (also with a load balancer in Fly.io in front), the call drops for some people when another person joins, basically at random. We know it's related to that because if we scale down to just one instance the calls work as expected. Any ideas what could be causing this?

WebRTC TURN TCP/TLS configuration issue

Hey! I'm loving the framework - using it in production with great success. We've got some users who (I think) are having trouble making the TURN connection to our server using UDP - typically these users are on restrictive corporate VPNs. My thought is that setting up our TCP/TLS TURN properly might help. I've set it up similar to this: https://github.com/fishjam-dev/membrane_rtc_engine/blob/eb8f97d254f5925139cdc1e76df0ecd0ac4977e9/examples/webrtc_to_hls/lib/webrtc_to_hls/stream.ex#L150...

Syncing two streams from HLS source

Hi, I have two streams (coming from an unmuxed HLS stream): a video-only stream and an audio stream. The screenshot shows the beginning of my pipeline. From the kino_membrane graphs, it looks like they are producing buffers at very different rates (screenshots are of the output pads). demuxer2 is the audio stream, which seems to produce buffers at a much lower rate. ...

Lowest-latency H264 UDP video stream possible

Hello! To give some context, I am trying to replicate parts of what https://openhd.gitbook.io/open-hd is doing, using Elixir, Nerves, and Membrane. ...

Using Google meet as a source

Hi, I'm new to Membrane and have a question about its capabilities. I have a Google Meet URL that I can access using a headless browser, bot, or similar method. Is it possible to use this as a data source for Membrane? Does Membrane support reading from such dynamic sources? #googlemeet...