Describe the bug
I'm having problems trying to push multiple WHIP streams into OvenMediaEngine from gstreamer using the whipclientsink element. Sending a single WHIP audio/video stream into OME works as expected from both gstreamer and OBS, but trying to send multiple WHIP streams from gstreamer usually causes all streams from that host to constantly disconnect and reconnect.
To Reproduce
Steps to reproduce the behavior:
Set up a basic Server.xml with audio and video bypass streams
Install gstreamer 1.24.10 and gst-plugins-rs containing rswebrtc 0.12.11
Run the gst-launch commands listed below a few times
Follow the OME logs to possibly see repeating errors like:
E [SPICE-u10002:31305] ICE | ice_session.cpp:277 | ICE session : 2 | UseCandidate() | Invalid state: Connected
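For reference, the audio/video bypass setup in the first step corresponds roughly to an OutputProfile like this in Server.xml (a minimal sketch; the profile name is illustrative and the surrounding application config is omitted):

```xml
<!-- Minimal sketch of the bypass OutputProfile in Server.xml.
     The profile name is illustrative; the rest of the
     application config is omitted. -->
<OutputProfiles>
  <OutputProfile>
    <Name>bypass</Name>
    <OutputStreamName>${OriginStreamName}</OutputStreamName>
    <Encodes>
      <Video>
        <Bypass>true</Bypass>
      </Video>
      <Audio>
        <Bypass>true</Bypass>
      </Audio>
    </Encodes>
  </OutputProfile>
</OutputProfiles>
```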
Expected behavior
Using the example gstreamer pipeline launch commands, the streams connect and are ingested roughly one time out of every four launches. If I send only two streams (1080p and 720p, for example), it works about half of the time, and with a single stream (1080p, 720p or 360p) it seems to work every time. Sending a WHIP stream in from OBS works every time, both when the gstreamer streams are in a working state and when they are broken. I'd expect all streams to work the same way every time.
Server:
OS: Ubuntu server 24.04.1 LTS on AWS EC2 c5.xlarge
OvenMediaEngine Version: v0.16.8 and v0.17.3
Branch: master, I think; I downloaded the sources from the releases page
Player:
Not really applicable; this is an ingest problem.
Additional context
Our use case is our existing gstreamer-based streaming software stack, where we'd like to keep our current workflow for encoding the video/audio in multiple qualities and use OME to push these track pairs out via muxed ABR WebRTC, LL-HLS and HLS playback delivery methods.
We have tested sending these video/audio quality streams into OME using SRT with output via a static mux config file, and that works well. The issue is that OME's WHIP ingest is the only ingest method that supports the Opus audio codec required for WebRTC playback delivery without transcoding. Ingesting other protocols and transcoding video/audio in OME works, but we'd like to avoid developing a second transcoding layer in our workflow, both to keep system complexity lower and to get better overall video quality.
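For context, the static mux config mentioned above is a .mux file along these lines (a simplified sketch based on my reading of the OME Multiplex Channel docs; the stream, track and file names here are illustrative, not our exact config):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- app/abr.mux: combines the per-quality ingest streams into one
     output stream for ABR playback. All names are illustrative. -->
<Multiplex>
  <OutputStream>
    <Name>abr</Name>
  </OutputStream>
  <SourceStreams>
    <SourceStream>
      <Name>1080p</Name>
      <Url>stream://default/app/1080p</Url>
      <TrackMap>
        <Track>
          <SourceTrackName>bypass_video</SourceTrackName>
          <NewTrackName>video_1080</NewTrackName>
        </Track>
        <Track>
          <SourceTrackName>bypass_audio</SourceTrackName>
          <NewTrackName>audio_1080</NewTrackName>
        </Track>
      </TrackMap>
    </SourceStream>
    <!-- 720p and 360p SourceStream entries follow the same pattern -->
  </SourceStreams>
</Multiplex>
```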
Our OME test server runs in AWS on a c5.xlarge EC2 instance where system load is around 0.2 and memory usage is low, both when the streams work and when they are in a broken state. The server configuration is tuned to match the core count, and network traffic is well under the expected limits of the EC2 instance type. The test server is located in the AWS eu-central-1 region, and the sending test clients are in Finland behind a 500/250 Mbps fiber connection. Based on this, I'm assuming the issue is not related to server performance.
I have tested this with OME versions 0.16.8 and 0.17.3, which I compiled myself on Ubuntu server 24.04.1 LTS. The issue is present on both and behaves the same way. For sending I'm using gstreamer 1.24.10 on Ubuntu Desktop 20.04.1 and OBS 31.0.1 on macOS 14.6.1.
Sending multiple streams in parallel from gstreamer into a MediaMTX media distribution server works without issues, and sending single streams into a VDO.Ninja WHIP ingest room also works without issues.
Encoding Opus audio before passing it into the gstreamer whipclientsink, or passing it in raw and letting the sink set up the Opus encoder, doesn't seem to make any difference in behavior.
The sending PC has an Nvidia graphics card, which whipclientsink sets up as the video encoder. The whipclientsink element technically supports passing in H264-encoded video, but there is some issue with its SDP negotiation with OME, so I haven't put much time into investigating and testing it. The example launch commands and our software stack prototypes use gstreamer's internal raw video format.
whipclientsink supports adaptive video bitrate based on estimated upload bandwidth, but forcing a static bitrate doesn't seem to make any difference in behavior, so I have left it enabled by default.
Sending WHIP in over TCP using whipclientsink works but doesn't seem to make any difference in behavior, so I have left it at the default of UDP.
I have seen two instances of OvenMediaEngine hitting OOM after the WHIP streams were disconnected from the sending side. I managed to capture a log from the second OOM event.
I have tested this both with a static mux config file and without one; it doesn't seem to make any difference in behavior.
I'm running out of things to tweak and test. This seems like a complex issue, so I'd like to hear what I should investigate next.
I've never used gstreamer before so I'm not familiar with it, but I'll give it a try.
It looks like you're trying to send 3 streams with gstreamer and then mux them to provide ABR. Since WHIP + Simulcast was released a few days ago, you might want to try that.
gstreamer is a toolbox for building various audio/video streaming systems. It's very useful for building things that aren't possible with ffmpeg and for testing different streaming situations. The learning curve is very steep, but it's a very good toolset to know.
I'll test Simulcast and see if it would fit our needs. Looks promising.
The two OOM cases occurred over the 100-200 test runs I've done in the last few weeks, so it's not too common. I'll keep an eye on it.
I compiled the master branch head and ran some test streams. The broken state on 3-stream connections shows up as usual, but maybe a tiny bit less often; it's hard to say. There is a noticeable reduction in timestamp jitter errors in the log, with no changes to the test setup compared to 0.17.3.
Building the static mux output from three WHIP streams works but takes a while, probably related to the automatic keyframe interval used in whipclientsink. There is currently no setting to change this without recompiling the whole module; I'll probably create a feature request for it in their repo at some point.
From initial investigation, it looks like adding simulcast support to gstreamer would require building a new output element. The current whipclientsink assumes a single video track and a single audio track, and is built around the idea of the client adjusting the target bitrate of the only video encoder with an adaptive bitrate algorithm. Some or most of the code could be reused, but it sounds like a pretty complex project.
Single WHIP stream command:
gst-launch-1.0 whipclientsink signaller::whip-endpoint="https://server:3334/app/1080p?direction=whip" name="whip1080p" video-caps="video/x-h264" audio-caps="audio/x-opus" max-bitrate=6000000 videotestsrc ! video/x-raw,width=1920,height=1080,framerate=50/1 ! queue ! whip1080p. audiotestsrc ! audio/x-raw,channels=2,rate=48000 ! opusenc ! queue ! whip1080p.
Three parallel WHIP streams command:
Logs
2025-01-29-ome-3-udp-ok.txt
2025-01-29-ome-udp-3-broken.txt
2025-01-29-ome-tcp-3-broken.txt
2025-01-29-ome-disconnect-oom.txt