Pipeline for muxing two msr files (audio and video) into a single FLV file

I have the following pipeline, which takes two msr files (recorded to disk using the RecordingEntrypoint from rtc_engine), and I need to create a single video+audio file from them. I'm trying FLV at the moment, but I'm not tied to a specific format; I just want something that popular tools can read and manipulate. My problem is that the resulting FLV file only plays audio. Here's the pipeline:
spec = [
  # Part of the pipeline which saves to the FLV file
  # child(:muxer, Membrane.MP4.Muxer.ISOM)
  child(:muxer, Membrane.FLV.Muxer)
  |> child(:sink, %Membrane.File.Sink{location: output_file}),

  # Part of the pipeline which reads from file and prepares video for muxing
  child(:source, %Membrane.File.Source{location: video_track_file})
  |> child(:deserializer_video, Membrane.Stream.Deserializer)
  |> child(:rtp_video, %Membrane.RTP.DepayloaderBin{
    depayloader: Membrane.RTP.H264.Depayloader,
    clock_rate: 90_000
  })
  |> child(:parser_video, %Membrane.H264.Parser{
    generate_best_effort_timestamps: %{framerate: {0, 1}},
    output_stream_structure: :avc1
  })
  |> via_in(Pad.ref(:video, 0))
  |> get_child(:muxer),

  # Part of the pipeline which reads from file and prepares audio for muxing
  child(:source_audio, %Membrane.File.Source{location: audio_track_file})
  |> child(:deserializer_audio, Membrane.Stream.Deserializer)
  |> child(:rtp_audio, %Membrane.RTP.DepayloaderBin{
    depayloader: Membrane.RTP.Opus.Depayloader,
    clock_rate: 48_000
  })
  |> child(:opus_decoder, Membrane.Opus.Decoder)
  |> child(:aac_encoder, Membrane.AAC.FDK.Encoder)
  |> child(:aac_parser, %Membrane.AAC.Parser{
    out_encapsulation: :none,
    output_config: :audio_specific_config
  })
  |> via_in(Pad.ref(:audio, 0))
  |> get_child(:muxer)
]
What am I doing wrong? Help appreciated!
noozo · 67d ago
Thanks to @Radosław, the problem seems to be a difference between the encoding on the WebRTC side and the encoding I'm using for muxing. Follow-up question: how can I change the WebRTC side to encode H264 instead of VP8 (which is how the recording plugin is storing those msr files)?
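[Editor's note: an alternative to changing the WebRTC encoder is to transcode in the muxing pipeline itself. The sketch below replaces the H264 depayloader in the video branch with a VP8 depayloader, decodes, and re-encodes to H264 before the FLV muxer. The module names Membrane.RTP.VP8.Depayloader, Membrane.VP8.Decoder, and Membrane.H264.FFmpeg.Encoder are assumptions based on the membrane_rtp_vp8_plugin, membrane_vpx_plugin, and membrane_h264_ffmpeg_plugin packages; verify them against the versions in your mix.lock.]

# Hypothetical video branch: depayload the recorded VP8, decode it,
# re-encode to H264, then parse for the FLV muxer. Module names are
# assumptions -- check your installed Membrane plugin versions.
child(:source, %Membrane.File.Source{location: video_track_file})
|> child(:deserializer_video, Membrane.Stream.Deserializer)
|> child(:rtp_video, %Membrane.RTP.DepayloaderBin{
  depayloader: Membrane.RTP.VP8.Depayloader,
  clock_rate: 90_000
})
|> child(:vp8_decoder, Membrane.VP8.Decoder)
|> child(:h264_encoder, Membrane.H264.FFmpeg.Encoder)
|> child(:parser_video, %Membrane.H264.Parser{
  output_stream_structure: :avc1
})
|> via_in(Pad.ref(:video, 0))
|> get_child(:muxer)

This trades CPU at mux time for leaving the WebRTC negotiation untouched, which can be simpler than forcing H264 on all connecting browsers.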
Radosław · 67d ago
GitHub: membrane_videoroom/lib/videoroom/room.ex at d3b0c49968d625afcee9777...
noozo · 67d ago
ty @Radosław, you've been a life saver