RealtimeOutgoing A/V sources should observe their sources only when they have a sink
author youenn@apple.com <youenn@apple.com@268f45cc-cd09-0410-ab3c-d52691b4dbfc>
Mon, 12 Nov 2018 19:49:13 +0000 (19:49 +0000)
committer youenn@apple.com <youenn@apple.com@268f45cc-cd09-0410-ab3c-d52691b4dbfc>
Mon, 12 Nov 2018 19:49:13 +0000 (19:49 +0000)
https://bugs.webkit.org/show_bug.cgi?id=191490

Reviewed by Eric Carlson.

Source/WebCore:

Observe the source that generates the media based on the presence of sinks:
- Do not observe at creation time.
- Start observing when the first sink is added.
- Stop observing when the last sink is removed.
Apply this principle to both outgoing audio and video sources.
Add locks around the sinks to ensure thread safety.
Make the sinks a HashSet, which is more robust than a Vector.
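
For reference, a minimal standalone sketch of the sink-driven observation pattern described above; it is an illustration only, not the WebKit code: std::recursive_mutex and std::set stand in for WTF's RecursiveLock and HashSet, and Source/Sink are placeholder types.

    #include <mutex>
    #include <set>

    struct Source {
        void addObserver(void* observer) { (void)observer; }
        void removeObserver(void* observer) { (void)observer; }
    };
    struct Sink { };

    class OutgoingSource {
    public:
        void addSink(Sink* sink)
        {
            {
                std::lock_guard<std::recursive_mutex> locker(m_sinksLock);
                // Only the transition from zero sinks to one triggers observation.
                if (!m_sinks.insert(sink).second || m_sinks.size() != 1)
                    return;
            }
            m_source.addObserver(this);
        }

        void removeSink(Sink* sink)
        {
            {
                std::lock_guard<std::recursive_mutex> locker(m_sinksLock);
                // Only the removal of the last sink stops observation.
                if (!m_sinks.erase(sink) || !m_sinks.empty())
                    return;
            }
            m_source.removeObserver(this);
        }

    private:
        Source m_source;
        std::recursive_mutex m_sinksLock;
        std::set<Sink*> m_sinks;
    };

In the actual patch, AddSink()/AddOrUpdateSink() additionally bounce the observeSource() call to the main thread via callOnMainThread(), since libwebrtc may invoke them from non-main threads.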

Do some refactoring to better isolate generic outgoing sources from Cocoa/GTK implementations.

Covered by existing tests and updated webrtc/remove-track.html.

* platform/mediastream/RealtimeOutgoingAudioSource.cpp:
(WebCore::RealtimeOutgoingAudioSource::~RealtimeOutgoingAudioSource):
(WebCore::RealtimeOutgoingAudioSource::stop):
(WebCore::RealtimeOutgoingAudioSource::AddSink):
(WebCore::RealtimeOutgoingAudioSource::RemoveSink):
(WebCore::RealtimeOutgoingAudioSource::sendAudioFrames):
* platform/mediastream/RealtimeOutgoingAudioSource.h:
* platform/mediastream/RealtimeOutgoingVideoSource.cpp:
(WebCore::RealtimeOutgoingVideoSource::RealtimeOutgoingVideoSource):
(WebCore::RealtimeOutgoingVideoSource::~RealtimeOutgoingVideoSource):
(WebCore::RealtimeOutgoingVideoSource::observeSource):
(WebCore::RealtimeOutgoingVideoSource::setSource):
(WebCore::RealtimeOutgoingVideoSource::stop):
(WebCore::RealtimeOutgoingVideoSource::AddOrUpdateSink):
(WebCore::RealtimeOutgoingVideoSource::RemoveSink):
* platform/mediastream/RealtimeOutgoingVideoSource.h:
(WebCore::RealtimeOutgoingVideoSource::isSilenced const):
* platform/mediastream/gstreamer/RealtimeOutgoingAudioSourceLibWebRTC.cpp:
(WebCore::RealtimeOutgoingAudioSourceLibWebRTC::pullAudioData):
* platform/mediastream/mac/RealtimeOutgoingAudioSourceCocoa.cpp:
(WebCore::RealtimeOutgoingAudioSourceCocoa::RealtimeOutgoingAudioSourceCocoa):
(WebCore::RealtimeOutgoingAudioSourceCocoa::audioSamplesAvailable):
(WebCore::RealtimeOutgoingAudioSourceCocoa::pullAudioData):
* platform/mediastream/mac/RealtimeOutgoingAudioSourceCocoa.h:
* platform/mediastream/mac/RealtimeOutgoingVideoSourceCocoa.cpp:
(WebCore::RealtimeOutgoingVideoSourceCocoa::sampleBufferUpdated):

LayoutTests:

* webrtc/remove-track-expected.txt:
* webrtc/remove-track.html:
Add tests and fix some flakiness issues in existing tests in the file.

git-svn-id: https://svn.webkit.org/repository/webkit/trunk@238102 268f45cc-cd09-0410-ab3c-d52691b4dbfc

12 files changed:
LayoutTests/ChangeLog
LayoutTests/webrtc/remove-track-expected.txt
LayoutTests/webrtc/remove-track.html
Source/WebCore/ChangeLog
Source/WebCore/platform/mediastream/RealtimeOutgoingAudioSource.cpp
Source/WebCore/platform/mediastream/RealtimeOutgoingAudioSource.h
Source/WebCore/platform/mediastream/RealtimeOutgoingVideoSource.cpp
Source/WebCore/platform/mediastream/RealtimeOutgoingVideoSource.h
Source/WebCore/platform/mediastream/gstreamer/RealtimeOutgoingAudioSourceLibWebRTC.cpp
Source/WebCore/platform/mediastream/gstreamer/RealtimeOutgoingVideoSourceLibWebRTC.cpp
Source/WebCore/platform/mediastream/mac/RealtimeOutgoingAudioSourceCocoa.cpp
Source/WebCore/platform/mediastream/mac/RealtimeOutgoingVideoSourceCocoa.cpp

index 609c74b..0dbb8a0 100644 (file)
@@ -1,5 +1,16 @@
 2018-11-12  Youenn Fablet  <youenn@apple.com>
 
+        RealtimeOutgoing A/V sources should observe their sources only when they have a sink
+        https://bugs.webkit.org/show_bug.cgi?id=191490
+
+        Reviewed by Eric Carlson.
+
+        * webrtc/remove-track-expected.txt:
+        * webrtc/remove-track.html:
+        Add tests and fix some flakiness issues in existing tests in the file.
+
+2018-11-12  Youenn Fablet  <youenn@apple.com>
+
         Support setting stream ids when adding a transceiver
         https://bugs.webkit.org/show_bug.cgi?id=191307
 
index de6c3c2..ea37644 100644 (file)
@@ -2,4 +2,6 @@
 PASS Setup audio video exchange 
 PASS Remove video track 
 PASS Remove audio track 
+PASS Add/remove audio tracks 
+PASS Add/remove video tracks 
 
index 827f522..14de539 100644 (file)
@@ -41,7 +41,7 @@ async function renegotiate()
 {
     let d = await firstConnection.createOffer();
     await firstConnection.setLocalDescription(d);
-    await secondConnection.setRemoteDescription(firstConnection.localDescription);
+    await secondConnection.setRemoteDescription(d);
     d = await secondConnection.createAnswer();
     await secondConnection.setLocalDescription(d);
 }
@@ -51,10 +51,10 @@ promise_test((test) => {
         remoteVideoTrack.onmute = resolve;
         setTimeout(() => reject("Test timed out"), 5000);
     });
-     
+
     firstConnection.removeTrack(firstConnection.getSenders()[0]);
-    renegotiate();
-    return promise;
+    const promise2 = renegotiate();
+    return Promise.all([promise, promise2]);
 }, "Remove video track");
 
 promise_test((test) => {
@@ -62,11 +62,34 @@ promise_test((test) => {
         remoteAudioTrack.onmute = resolve;
         setTimeout(() => reject("Test timed out"), 5000);
     });
-     
+
     firstConnection.removeTrack(firstConnection.getSenders()[1]);
-    renegotiate();
-    return promise;
+    const promise2 = renegotiate();
+    return Promise.all([promise, promise2]);
 }, "Remove audio track");
+
+promise_test(async t => {
+    const stream = await navigator.mediaDevices.getUserMedia({audio: true});
+    const track = stream.getTracks()[0];
+    let pc = new RTCPeerConnection();
+    for (let i = 0; i < 100; i++) {
+        let sender = pc.addTrack(track, stream);
+        pc.removeTrack(sender);
+    }
+    pc.close();
+}, 'Add/remove audio tracks');
+
+promise_test(async t => {
+    const stream = await navigator.mediaDevices.getUserMedia({video: true});
+    const track = stream.getTracks()[0];
+    let pc = new RTCPeerConnection();
+    for (let i = 0; i < 100; i++) {
+        let sender = pc.addTrack(track, stream);
+        pc.removeTrack(sender);
+    }
+    pc.close();
+}, 'Add/remove video tracks');
+
         </script>
     </body>
 </html>
index 79a7d30..d9878f4 100644 (file)
@@ -1,5 +1,51 @@
 2018-11-12  Youenn Fablet  <youenn@apple.com>
 
+        RealtimeOutgoing A/V sources should observe their sources only when they have a sink
+        https://bugs.webkit.org/show_bug.cgi?id=191490
+
+        Reviewed by Eric Carlson.
+
+        Observe the source that generates the media based on the presence of sinks:
+        - Do not observe at creation time.
+        - Start observing when the first sink is added.
+        - Stop observing when the last sink is removed.
+        Apply this principle to both outgoing audio and video sources.
+        Add locks around the sinks to ensure thread safety.
+        Make the sinks a HashSet, which is more robust than a Vector.
+
+        Do some refactoring to better isolate generic outgoing sources from Cocoa/GTK implementations.
+
+        Covered by existing tests and updated webrtc/remove-track.html.
+
+        * platform/mediastream/RealtimeOutgoingAudioSource.cpp:
+        (WebCore::RealtimeOutgoingAudioSource::~RealtimeOutgoingAudioSource):
+        (WebCore::RealtimeOutgoingAudioSource::stop):
+        (WebCore::RealtimeOutgoingAudioSource::AddSink):
+        (WebCore::RealtimeOutgoingAudioSource::RemoveSink):
+        (WebCore::RealtimeOutgoingAudioSource::sendAudioFrames):
+        * platform/mediastream/RealtimeOutgoingAudioSource.h:
+        * platform/mediastream/RealtimeOutgoingVideoSource.cpp:
+        (WebCore::RealtimeOutgoingVideoSource::RealtimeOutgoingVideoSource):
+        (WebCore::RealtimeOutgoingVideoSource::~RealtimeOutgoingVideoSource):
+        (WebCore::RealtimeOutgoingVideoSource::observeSource):
+        (WebCore::RealtimeOutgoingVideoSource::setSource):
+        (WebCore::RealtimeOutgoingVideoSource::stop):
+        (WebCore::RealtimeOutgoingVideoSource::AddOrUpdateSink):
+        (WebCore::RealtimeOutgoingVideoSource::RemoveSink):
+        * platform/mediastream/RealtimeOutgoingVideoSource.h:
+        (WebCore::RealtimeOutgoingVideoSource::isSilenced const):
+        * platform/mediastream/gstreamer/RealtimeOutgoingAudioSourceLibWebRTC.cpp:
+        (WebCore::RealtimeOutgoingAudioSourceLibWebRTC::pullAudioData):
+        * platform/mediastream/mac/RealtimeOutgoingAudioSourceCocoa.cpp:
+        (WebCore::RealtimeOutgoingAudioSourceCocoa::RealtimeOutgoingAudioSourceCocoa):
+        (WebCore::RealtimeOutgoingAudioSourceCocoa::audioSamplesAvailable):
+        (WebCore::RealtimeOutgoingAudioSourceCocoa::pullAudioData):
+        * platform/mediastream/mac/RealtimeOutgoingAudioSourceCocoa.h:
+        * platform/mediastream/mac/RealtimeOutgoingVideoSourceCocoa.cpp:
+        (WebCore::RealtimeOutgoingVideoSourceCocoa::sampleBufferUpdated):
+
+2018-11-12  Youenn Fablet  <youenn@apple.com>
+
         Support setting stream ids when adding a transceiver
         https://bugs.webkit.org/show_bug.cgi?id=191307
 
index 5ef7444..0d07428 100644 (file)
@@ -41,6 +41,12 @@ RealtimeOutgoingAudioSource::RealtimeOutgoingAudioSource(Ref<MediaStreamTrackPri
 {
 }
 
+RealtimeOutgoingAudioSource::~RealtimeOutgoingAudioSource()
+{
+    ASSERT(m_sinks.isEmpty());
+    stop();
+}
+
 void RealtimeOutgoingAudioSource::observeSource()
 {
     m_audioSource->addObserver(*this);
@@ -54,11 +60,15 @@ void RealtimeOutgoingAudioSource::unobserveSource()
 
 bool RealtimeOutgoingAudioSource::setSource(Ref<MediaStreamTrackPrivate>&& newSource)
 {
-    m_audioSource->removeObserver(*this);
+    auto locker = holdLock(m_sinksLock);
+    bool hasSinks = !m_sinks.isEmpty();
+
+    if (hasSinks)
+        unobserveSource();
     m_audioSource = WTFMove(newSource);
-    m_audioSource->addObserver(*this);
+    if (hasSinks)
+        observeSource();
 
-    initializeConverter();
     return true;
 }
 
@@ -68,12 +78,6 @@ void RealtimeOutgoingAudioSource::initializeConverter()
     m_enabled = m_audioSource->enabled();
 }
 
-void RealtimeOutgoingAudioSource::stop()
-{
-    ASSERT(isMainThread());
-    m_audioSource->removeObserver(*this);
-}
-
 void RealtimeOutgoingAudioSource::sourceMutedChanged()
 {
     m_muted = m_audioSource->muted();
@@ -84,6 +88,38 @@ void RealtimeOutgoingAudioSource::sourceEnabledChanged()
     m_enabled = m_audioSource->enabled();
 }
 
+void RealtimeOutgoingAudioSource::AddSink(webrtc::AudioTrackSinkInterface* sink)
+{
+    {
+    auto locker = holdLock(m_sinksLock);
+    if (!m_sinks.add(sink) || m_sinks.size() != 1)
+        return;
+    }
+
+    callOnMainThread([protectedThis = makeRef(*this)]() {
+        protectedThis->observeSource();
+    });
+}
+
+void RealtimeOutgoingAudioSource::RemoveSink(webrtc::AudioTrackSinkInterface* sink)
+{
+    {
+    auto locker = holdLock(m_sinksLock);
+    if (!m_sinks.remove(sink) || !m_sinks.isEmpty())
+        return;
+    }
+
+    unobserveSource();
+}
+
+void RealtimeOutgoingAudioSource::sendAudioFrames(const void* audioData, int bitsPerSample, int sampleRate, size_t numberOfChannels, size_t numberOfFrames)
+{
+    auto locker = holdLock(m_sinksLock);
+    for (auto sink : m_sinks)
+        sink->OnData(audioData, bitsPerSample, sampleRate, numberOfChannels, numberOfFrames);
+}
+
+
 } // namespace WebCore
 
 #endif // USE(LIBWEBRTC)
index 694bdae..b03bf21 100644 (file)
@@ -54,9 +54,9 @@ class RealtimeOutgoingAudioSource : public ThreadSafeRefCounted<RealtimeOutgoing
 public:
     static Ref<RealtimeOutgoingAudioSource> create(Ref<MediaStreamTrackPrivate>&& audioSource);
 
-    ~RealtimeOutgoingAudioSource() { stop(); }
+    ~RealtimeOutgoingAudioSource();
 
-    void stop();
+    void stop() { unobserveSource(); }
 
     bool setSource(Ref<MediaStreamTrackPrivate>&&);
     MediaStreamTrackPrivate& source() const { return m_audioSource.get(); }
@@ -64,18 +64,18 @@ public:
 protected:
     explicit RealtimeOutgoingAudioSource(Ref<MediaStreamTrackPrivate>&&);
 
-    void observeSource();
     void unobserveSource();
 
     virtual void pullAudioData() { }
 
     bool isSilenced() const { return m_muted || !m_enabled; }
 
-    Vector<webrtc::AudioTrackSinkInterface*> m_sinks;
+    void sendAudioFrames(const void* audioData, int bitsPerSample, int sampleRate, size_t numberOfChannels, size_t numberOfFrames);
 
 private:
-    virtual void AddSink(webrtc::AudioTrackSinkInterface* sink) { m_sinks.append(sink); }
-    virtual void RemoveSink(webrtc::AudioTrackSinkInterface* sink) { m_sinks.removeFirst(sink); }
+    // webrtc::AudioSourceInterface API
+    void AddSink(webrtc::AudioTrackSinkInterface*) final;
+    void RemoveSink(webrtc::AudioTrackSinkInterface*) final;
 
     void AddRef() const final { ref(); }
     rtc::RefCountReleaseStatus Release() const final
@@ -90,6 +90,8 @@ private:
     void RegisterObserver(webrtc::ObserverInterface*) final { }
     void UnregisterObserver(webrtc::ObserverInterface*) final { }
 
+    void observeSource();
+
     void sourceMutedChanged();
     void sourceEnabledChanged();
     virtual void audioSamplesAvailable(const MediaTime&, const PlatformAudioData&, const AudioStreamDescription&, size_t) { };
@@ -110,6 +112,9 @@ private:
     Ref<MediaStreamTrackPrivate> m_audioSource;
     bool m_muted { false };
     bool m_enabled { true };
+
+    mutable RecursiveLock m_sinksLock;
+    HashSet<webrtc::AudioTrackSinkInterface*> m_sinks;
 };
 
 } // namespace WebCore
index 9daef79..e8d3216 100644 (file)
@@ -48,20 +48,38 @@ RealtimeOutgoingVideoSource::RealtimeOutgoingVideoSource(Ref<MediaStreamTrackPri
     : m_videoSource(WTFMove(videoSource))
     , m_blackFrameTimer(*this, &RealtimeOutgoingVideoSource::sendOneBlackFrame)
 {
+}
+
+RealtimeOutgoingVideoSource::~RealtimeOutgoingVideoSource()
+{
+    ASSERT(m_sinks.isEmpty());
+    stop();
+}
+
+void RealtimeOutgoingVideoSource::observeSource()
+{
     m_videoSource->addObserver(*this);
     initializeFromSource();
 }
 
+void RealtimeOutgoingVideoSource::unobserveSource()
+{
+    m_videoSource->removeObserver(*this);
+}
+
 bool RealtimeOutgoingVideoSource::setSource(Ref<MediaStreamTrackPrivate>&& newSource)
 {
     if (!m_initialSettings)
         m_initialSettings = m_videoSource->source().settings();
 
-    m_videoSource->removeObserver(*this);
-    m_videoSource = WTFMove(newSource);
-    m_videoSource->addObserver(*this);
+    auto locker = holdLock(m_sinksLock);
+    bool hasSinks = !m_sinks.isEmpty();
 
-    initializeFromSource();
+    if (hasSinks)
+        unobserveSource();
+    m_videoSource = WTFMove(newSource);
+    if (hasSinks)
+        observeSource();
 
     return true;
 }
@@ -69,9 +87,8 @@ bool RealtimeOutgoingVideoSource::setSource(Ref<MediaStreamTrackPrivate>&& newSo
 void RealtimeOutgoingVideoSource::stop()
 {
     ASSERT(isMainThread());
-    m_videoSource->removeObserver(*this);
+    unobserveSource();
     m_blackFrameTimer.stop();
-    m_isStopped = true;
 }
 
 void RealtimeOutgoingVideoSource::updateBlackFramesSending()
@@ -122,20 +139,27 @@ void RealtimeOutgoingVideoSource::AddOrUpdateSink(rtc::VideoSinkInterface<webrtc
     if (sinkWants.rotation_applied)
         m_shouldApplyRotation = true;
 
-    if (!m_sinks.contains(sink))
-        m_sinks.append(sink);
+    {
+    auto locker = holdLock(m_sinksLock);
+    if (!m_sinks.add(sink) || m_sinks.size() != 1)
+        return;
+    }
 
     callOnMainThread([protectedThis = makeRef(*this)]() {
-        protectedThis->sendBlackFramesIfNeeded();
+        protectedThis->observeSource();
     });
 }
 
 void RealtimeOutgoingVideoSource::RemoveSink(rtc::VideoSinkInterface<webrtc::VideoFrame>* sink)
 {
-    m_sinks.removeFirst(sink);
+    {
+    auto locker = holdLock(m_sinksLock);
 
-    if (m_sinks.size())
+    if (!m_sinks.remove(sink) || m_sinks.size())
         return;
+    }
+
+    unobserveSource();
 
     callOnMainThread([protectedThis = makeRef(*this)]() {
         if (protectedThis->m_blackFrameTimer.isActive())
@@ -148,9 +172,6 @@ void RealtimeOutgoingVideoSource::sendBlackFramesIfNeeded()
     if (m_blackFrameTimer.isActive())
         return;
 
-    if (!m_sinks.size())
-        return;
-
     if (!m_muted && m_enabled)
         return;
 
@@ -183,6 +204,8 @@ void RealtimeOutgoingVideoSource::sendFrame(rtc::scoped_refptr<webrtc::VideoFram
 {
     MonotonicTime timestamp = MonotonicTime::now();
     webrtc::VideoFrame frame(buffer, m_shouldApplyRotation ? webrtc::kVideoRotation_0 : m_currentRotation, static_cast<int64_t>(timestamp.secondsSinceEpoch().microseconds()));
+
+    auto locker = holdLock(m_sinksLock);
     for (auto* sink : m_sinks)
         sink->OnFrame(frame);
 }
index 4250599..a1b1e98 100644 (file)
@@ -49,7 +49,7 @@ namespace WebCore {
 class RealtimeOutgoingVideoSource : public ThreadSafeRefCounted<RealtimeOutgoingVideoSource, WTF::DestructionThread::Main>, public webrtc::VideoTrackSourceInterface, private MediaStreamTrackPrivate::Observer {
 public:
     static Ref<RealtimeOutgoingVideoSource> create(Ref<MediaStreamTrackPrivate>&& videoSource);
-    ~RealtimeOutgoingVideoSource() { stop(); }
+    ~RealtimeOutgoingVideoSource();
 
     void stop();
     bool setSource(Ref<MediaStreamTrackPrivate>&&);
@@ -68,25 +68,22 @@ protected:
     explicit RealtimeOutgoingVideoSource(Ref<MediaStreamTrackPrivate>&&);
 
     void sendFrame(rtc::scoped_refptr<webrtc::VideoFrameBuffer>&&);
+    bool isSilenced() const { return m_muted || !m_enabled; }
 
-    Vector<rtc::VideoSinkInterface<webrtc::VideoFrame>*> m_sinks;
-    webrtc::I420BufferPool m_bufferPool;
+    virtual rtc::scoped_refptr<webrtc::VideoFrameBuffer> createBlackFrame(size_t width, size_t height) = 0;
 
-    bool m_enabled { true };
-    bool m_muted { false };
-    uint32_t m_width { 0 };
-    uint32_t m_height { 0 };
     bool m_shouldApplyRotation { false };
     webrtc::VideoRotation m_currentRotation { webrtc::kVideoRotation_0 };
 
-    virtual rtc::scoped_refptr<webrtc::VideoFrameBuffer> createBlackFrame(size_t width, size_t height) = 0;
-
 private:
     void sendBlackFramesIfNeeded();
     void sendOneBlackFrame();
     void initializeFromSource();
     void updateBlackFramesSending();
 
+    void observeSource();
+    void unobserveSource();
+
     // Notifier API
     void RegisterObserver(webrtc::ObserverInterface*) final { }
     void UnregisterObserver(webrtc::ObserverInterface*) final { }
@@ -116,9 +113,16 @@ private:
 
     Ref<MediaStreamTrackPrivate> m_videoSource;
     std::optional<RealtimeMediaSourceSettings> m_initialSettings;
-    bool m_isStopped { false };
     Timer m_blackFrameTimer;
     rtc::scoped_refptr<webrtc::VideoFrameBuffer> m_blackFrame;
+
+    mutable RecursiveLock m_sinksLock;
+    HashSet<rtc::VideoSinkInterface<webrtc::VideoFrame>*> m_sinks;
+
+    bool m_enabled { true };
+    bool m_muted { false };
+    uint32_t m_width { 0 };
+    uint32_t m_height { 0 };
 };
 
 } // namespace WebCore
index 060fc10..2177032 100644 (file)
@@ -34,7 +34,6 @@ RealtimeOutgoingAudioSourceLibWebRTC::RealtimeOutgoingAudioSourceLibWebRTC(Ref<M
 {
     m_adapter = adoptGRef(gst_adapter_new()),
     m_sampleConverter = nullptr;
-    observeSource();
 }
 
 RealtimeOutgoingAudioSourceLibWebRTC::~RealtimeOutgoingAudioSourceLibWebRTC()
@@ -130,13 +129,11 @@ void RealtimeOutgoingAudioSourceLibWebRTC::pullAudioData()
     gpointer in[1] = { inmap.data };
     gpointer out[1] = { outmap.data };
     if (gst_audio_converter_samples(m_sampleConverter, static_cast<GstAudioConverterFlags>(0), in, inChunkSampleCount, out, outChunkSampleCount)) {
-        for (auto sink : m_sinks) {
-            sink->OnData(outmap.data,
-                LibWebRTCAudioFormat::sampleSize,
-                static_cast<int>(m_outputStreamDescription->sampleRate()),
-                static_cast<int>(m_outputStreamDescription->numberOfChannels()),
-                outChunkSampleCount);
-        }
+        sendAudioFrames(outmap.data,
+            LibWebRTCAudioFormat::sampleSize,
+            static_cast<int>(m_outputStreamDescription->sampleRate()),
+            static_cast<int>(m_outputStreamDescription->numberOfChannels()),
+            outChunkSampleCount);
     } else
         GST_ERROR("Could not convert samples.");
 
index f178eed..9e396d9 100644 (file)
@@ -52,10 +52,7 @@ RealtimeOutgoingVideoSourceLibWebRTC::RealtimeOutgoingVideoSourceLibWebRTC(Ref<M
 
 void RealtimeOutgoingVideoSourceLibWebRTC::sampleBufferUpdated(MediaStreamTrackPrivate&, MediaSample& sample)
 {
-    if (!m_sinks.size())
-        return;
-
-    if (m_muted || !m_enabled)
+    if (isSilenced())
         return;
 
     switch (sample.videoRotation()) {
index 2fbe067..22e49e9 100644 (file)
@@ -47,7 +47,6 @@ RealtimeOutgoingAudioSourceCocoa::RealtimeOutgoingAudioSourceCocoa(Ref<MediaStre
     : RealtimeOutgoingAudioSource(WTFMove(audioSource))
     , m_sampleConverter(AudioSampleDataSource::create(LibWebRTCAudioFormat::sampleRate * 2))
 {
-    observeSource();
 }
 
 RealtimeOutgoingAudioSourceCocoa::~RealtimeOutgoingAudioSourceCocoa()
@@ -141,8 +140,7 @@ void RealtimeOutgoingAudioSourceCocoa::pullAudioData()
 
     m_sampleConverter->pullAvalaibleSamplesAsChunks(bufferList, chunkSampleCount, m_readCount, [this, chunkSampleCount] {
         m_readCount += chunkSampleCount;
-        for (auto sink : m_sinks)
-            sink->OnData(m_audioBuffer.data(), LibWebRTCAudioFormat::sampleSize, m_outputStreamDescription.sampleRate(), m_outputStreamDescription.numberOfChannels(), chunkSampleCount);
+        sendAudioFrames(m_audioBuffer.data(), LibWebRTCAudioFormat::sampleSize, m_outputStreamDescription.sampleRate(), m_outputStreamDescription.numberOfChannels(), chunkSampleCount);
     });
 }
 
index 4419bae..7981e71 100644 (file)
@@ -63,10 +63,7 @@ RealtimeOutgoingVideoSourceCocoa::RealtimeOutgoingVideoSourceCocoa(Ref<MediaStre
 
 void RealtimeOutgoingVideoSourceCocoa::sampleBufferUpdated(MediaStreamTrackPrivate&, MediaSample& sample)
 {
-    if (!m_sinks.size())
-        return;
-
-    if (m_muted || !m_enabled)
+    if (isSilenced())
         return;
 
 #if !RELEASE_LOG_DISABLED