SMM
Software Mansion / Membrane

TonyLikeSocks
Wiring up JavaScript FE Using membrane-webrtc-js
Sorry if this is obvious. I'm looking through the example in membrane_rtc_engine (link below), and it's not clear to me how audio playback for remote endpoints is managed. Does membrane-webrtc-js take care of that magically? I see
addVideoElement
-- but that just seems to add an HTMLVideoElement; it doesn't actually connect it to anything from the endpoint / tracks.
https://github.com/jellyfish-dev/membrane_rtc_engine/blob/master/examples/webrtc_videoroom/assets/src/room.ts
TonyLikeSocks • 36d ago
Oh, I see it now --
attachStream
is called when the
trackReady
event fires.
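For anyone else landing here: the flow above can be sketched as below. This is a hedged sketch of the wiring, not the actual library API -- `addVideoElement` and `attachStream` are helpers from the example's own UI code, and the `VideoLike` stub stands in for a real `HTMLVideoElement` so the sketch runs outside a browser. The key point is that audio needs no separate handling: once a `MediaStream` containing both audio and video tracks is assigned to the element's `srcObject`, the element plays both.

```typescript
// Stand-in for HTMLVideoElement; in the browser, srcObject would hold a MediaStream.
interface VideoLike {
  srcObject: unknown | null;
}

const elements = new Map<string, VideoLike>();

// Step 1: when a peer appears, create/register a placeholder element.
// At this point nothing is connected to any track yet.
function addVideoElement(peerId: string, el: VideoLike): void {
  elements.set(peerId, el);
}

// Step 2: when the engine fires its trackReady event, attach the remote
// stream. Audio tracks on the same stream play through this element too.
function attachStream(peerId: string, stream: unknown): void {
  const el = elements.get(peerId);
  if (el) el.srcObject = stream;
}

// Simulated flow: peer joins first, then trackReady delivers its stream.
const el: VideoLike = { srcObject: null };
addVideoElement("peer-1", el);
attachStream("peer-1", { id: "remote-stream" }); // stand-in for a MediaStream
```

So `addVideoElement` alone really does connect nothing; the connection only happens in the `trackReady` handler.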