Software Mansion / Membrane
h264 encoder problems
hi guys, I'm using the h264 encoder plugin for video encoding and sending the output via RTP to a client.
Sometimes video playback on the client speeds up or slows down.
How can I debug such cases, and what could be the reason for such lagging video? Network issues?
The input for the encoder is coming from a webrtc source.
spscream•69d ago
I tried to put a RealTimer before the SessionBin, but it doesn't help
I have the following pipeline part for it
full logic of my pipeline is the following:
webrtc subscription for video from janus gateway -> vp8 decode -> h264 encode -> rtp output to sip client
I implemented a VP8 encoder/decoder plugin for Membrane (https://github.com/spscream/membrane_vp8_ffmpeg_plugin, still WIP), using membrane_h264_ffmpeg_plugin as an example; it handles VP8 decoding and encoding for my case.
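For reference, the pipeline shape described above could be sketched roughly like this. This is a hypothetical sketch, not the author's actual code: the Janus source and VP8 decoder module names are assumptions, and the RTP/SIP leg via `Membrane.RTP.SessionBin` is omitted for brevity.

```elixir
defmodule MyApp.TranscodePipeline do
  use Membrane.Pipeline

  @impl true
  def handle_init(_ctx, opts) do
    spec =
      # WebRTC subscription from the Janus gateway (custom source, name assumed)
      child(:janus_source, %MyApp.JanusSource{subscription: opts.subscription})
      # VP8 decoding via the author's membrane_vp8_ffmpeg_plugin (module name assumed)
      |> child(:vp8_decoder, MyApp.VP8.FFmpeg.Decoder)
      # Re-encode to H264 for the SIP client
      |> child(:h264_encoder, Membrane.H264.FFmpeg.Encoder)
      # ... RTP packetization and Membrane.RTP.SessionBin wiring omitted here

    {[spec: spec], %{}}
  end
end
```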
Could the reason be non-monotonically growing PTS in the incoming buffers?
I have a subscriber connection to Janus which calls "switch" on Janus on "talking" events, to switch the video to the new speaker who is talking right now.
I added a sink to debug buffers, and it shows that PTS isn't monotonically increasing; in the log, diff is the previous buffer's PTS minus the current PTS
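A debug sink like the one described could be sketched as below, assuming current Membrane element callbacks (`handle_buffer/4`). With diff computed as previous PTS minus current PTS, a positive diff flags a PTS that went backwards:

```elixir
defmodule MyApp.PtsDebugSink do
  use Membrane.Sink
  require Membrane.Logger

  def_input_pad :input, accepted_format: _any, flow_control: :auto

  @impl true
  def handle_init(_ctx, _opts), do: {[], %{prev_pts: nil}}

  @impl true
  def handle_buffer(:input, buffer, _ctx, state) do
    # diff = prev PTS - current PTS; positive means PTS went backwards
    if state.prev_pts && state.prev_pts > buffer.pts do
      Membrane.Logger.warning("PTS not monotonic, diff: #{state.prev_pts - buffer.pts}")
    end

    {[], %{state | prev_pts: buffer.pts}}
  end
end
```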
also I see that the first video stream arrives with a full VP8 frame per packet, but after the switch it arrives split; timestamps on the incoming RTP from the subscriber stream:
maybe different frame rates in the incoming videos could be the reason...
I set the zerolatency tune for the h264 encoder and the framerate to {0, 1} for the vp8 decoder, and now it looks good
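In spec form, the fix described above might look like the fragment below. The exact option names (`tune` on the H264 encoder, `framerate` on the VP8 decoder) are assumed from the message, and the VP8 decoder module name is hypothetical:

```elixir
# framerate {0, 1} lets the decoder pass timestamps through instead of
# regenerating them from a fixed rate; :zerolatency disables B-frames
child(:vp8_decoder, %MyApp.VP8.FFmpeg.Decoder{framerate: {0, 1}})
|> child(:h264_encoder, %Membrane.H264.FFmpeg.Encoder{tune: :zerolatency})
```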
problem is solved, thanks 😄
Qizot•67d ago
zerolatency mode basically removes B-frames, which shouldn't be used for RTP/WebRTC stuff anyway (B-frames can have PTS going back in time)
mat_hek•64d ago
Hi @spscream , some insights that may be helpful
- If PTS are not monotonic, they're either broken or you have B-frames in the stream. As @Qizot said, B-frames shouldn't be used for real-time streaming, as they introduce latency and, because of that, are sometimes not handled properly by endpoints. Normally, you set the H264 encoder profile to `baseline` or `constrained baseline` for live streaming. These profiles don't include B-frames.
- Don't rely on framerate, as it may change at any moment. Rely on timestamps
- Generally, media servers don't apply pacing (like RealTimer); they just forward media, and pacing is the player's job
- Congrats on creating the VP8 plugin, though it seems it's not public, as your link doesn't work 😉
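Following the profile advice above, a B-frame-free encoder configuration could be sketched like this, assuming membrane_h264_ffmpeg_plugin exposes `profile` and `tune` options (hedged: check the plugin's docs for the exact atoms it accepts):

```elixir
child(:h264_encoder, %Membrane.H264.FFmpeg.Encoder{
  # baseline / constrained baseline profiles contain no B-frames
  profile: :baseline,
  # zerolatency additionally disables x264 lookahead and frame delay
  tune: :zerolatency
})
```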
spscream•64d ago
thank you for the insights, I made the repo public
I should also mention that I'm using swscale directly on the decoder output, in the C code.
I have a special case with stream switching on the Janus subscription: the frame size can differ between published videos, so I had to handle it in the decoder.
I tried to use the swscale plugin in the pipeline with no success, and for my case it was faster to implement it in the decoder. In the future I'll move this logic to a separate Membrane element, but I have deadlines now and this works for me: the decoder gives me a constant frame format on its output.
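Moving the scaling into a separate element, as planned above, could start from a filter skeleton like this. Everything here is a hypothetical sketch: the module, option names, and the native scaling call are placeholders, and stream-format renegotiation for the new dimensions is omitted:

```elixir
defmodule MyApp.Scaler do
  use Membrane.Filter

  def_input_pad :input, accepted_format: Membrane.RawVideo
  def_output_pad :output, accepted_format: Membrane.RawVideo

  def_options target_width: [spec: pos_integer()],
              target_height: [spec: pos_integer()]

  @impl true
  def handle_init(_ctx, opts), do: {[], Map.from_struct(opts)}

  @impl true
  def handle_buffer(:input, buffer, _ctx, state) do
    # Placeholder for the swscale NIF call, e.g.:
    # scaled = MyApp.Scaler.Native.scale(buffer.payload, state.target_width, state.target_height)
    scaled = buffer.payload
    {[buffer: {:output, %{buffer | payload: scaled}}], state}
  end
end
```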