Improve WebRTC track enabled support
author     commit-queue@webkit.org <commit-queue@webkit.org@268f45cc-cd09-0410-ab3c-d52691b4dbfc>
           Thu, 16 Mar 2017 16:09:50 +0000 (16:09 +0000)
committer  commit-queue@webkit.org <commit-queue@webkit.org@268f45cc-cd09-0410-ab3c-d52691b4dbfc>
           Thu, 16 Mar 2017 16:09:50 +0000 (16:09 +0000)
https://bugs.webkit.org/show_bug.cgi?id=169727

Patch by Youenn Fablet <youenn@apple.com> on 2017-03-16
Reviewed by Alex Christensen.

Source/WebCore:

Tests: webrtc/peer-connection-audio-mute2.html
       webrtc/peer-connection-remote-audio-mute.html
       webrtc/video-remote-mute.html

Making sure muted/disabled sources produce silence/black frames.
For outgoing audio/video sources, this should be done by the actual audio/video providers.
We keep this filtering here until we are sure they implement it.
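
To make the byte-level intent concrete, here is an illustrative sketch (not WebKit code; raw arrays stand in for the real AudioBufferList/CVPixelBuffer types): silence in linear PCM is an all-zero buffer, and black in planar 4:2:0 Y'CbCr is a zeroed luma plane with both chroma planes held at the neutral value 128.

    #include <cstddef>
    #include <cstdint>
    #include <cstring>

    // Digital silence: every PCM sample is 0.0f.
    void fillSilence(float* samples, size_t sampleCount)
    {
        std::memset(samples, 0, sampleCount * sizeof(float));
    }

    // Black in planar Y'CbCr: luma 0, chroma planes at the neutral 128.
    void fillBlack420(uint8_t* luma, size_t lumaBytes, uint8_t* chroma, size_t chromaBytes)
    {
        std::memset(luma, 0, lumaBytes);
        std::memset(chroma, 128, chromaBytes);
    }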

* platform/audio/mac/AudioSampleDataSource.mm:
(WebCore::AudioSampleDataSource::pullAvalaibleSamplesAsChunks): Ensuring disabled audio tracks send silence.
Used for outgoing WebRTC tracks.
* platform/mediastream/mac/MockRealtimeAudioSourceMac.mm:
(WebCore::MockRealtimeAudioSourceMac::render): Ditto.
* platform/mediastream/mac/RealtimeIncomingAudioSource.cpp:
(WebCore::RealtimeIncomingAudioSource::OnData): Ditto.
* platform/mediastream/mac/RealtimeIncomingVideoSource.cpp:
(WebCore::RealtimeIncomingVideoSource::pixelBufferFromVideoFrame): Generating black frames if muted.
(WebCore::RealtimeIncomingVideoSource::OnFrame):
* platform/mediastream/mac/RealtimeIncomingVideoSource.h:
* platform/mediastream/mac/RealtimeOutgoingVideoSource.cpp:
(WebCore::RealtimeOutgoingVideoSource::videoSampleAvailable): Ensuring we quit after sending a black frame.

LayoutTests:

* TestExpectations:
* webrtc/audio-peer-connection-webaudio.html:
* webrtc/peer-connection-audio-mute-expected.txt:
* webrtc/peer-connection-audio-mute.html:
* webrtc/peer-connection-audio-mute2-expected.txt: Added.
* webrtc/peer-connection-audio-mute2.html: Added.
* webrtc/peer-connection-remote-audio-mute-expected.txt: Added.
* webrtc/peer-connection-remote-audio-mute.html: Added.
* webrtc/video-mute-expected.txt:
* webrtc/video-mute.html:
* webrtc/video-remote-mute-expected.txt: Added.
* webrtc/video-remote-mute.html: Added.

git-svn-id: https://svn.webkit.org/repository/webkit/trunk@214044 268f45cc-cd09-0410-ab3c-d52691b4dbfc

22 files changed:
LayoutTests/ChangeLog
LayoutTests/TestExpectations
LayoutTests/webrtc/audio-peer-connection-webaudio.html
LayoutTests/webrtc/peer-connection-audio-mute-expected.txt
LayoutTests/webrtc/peer-connection-audio-mute.html
LayoutTests/webrtc/peer-connection-audio-mute2-expected.txt [new file with mode: 0644]
LayoutTests/webrtc/peer-connection-audio-mute2.html [new file with mode: 0644]
LayoutTests/webrtc/peer-connection-remote-audio-mute-expected.txt [new file with mode: 0644]
LayoutTests/webrtc/peer-connection-remote-audio-mute.html [new file with mode: 0644]
LayoutTests/webrtc/peer-connection-remote-audio-mute2-expected.txt [new file with mode: 0644]
LayoutTests/webrtc/peer-connection-remote-audio-mute2.html [new file with mode: 0644]
LayoutTests/webrtc/video-mute-expected.txt
LayoutTests/webrtc/video-mute.html
LayoutTests/webrtc/video-remote-mute-expected.txt [new file with mode: 0644]
LayoutTests/webrtc/video-remote-mute.html [new file with mode: 0644]
Source/WebCore/ChangeLog
Source/WebCore/platform/audio/mac/AudioSampleDataSource.mm
Source/WebCore/platform/mediastream/mac/MockRealtimeAudioSourceMac.mm
Source/WebCore/platform/mediastream/mac/RealtimeIncomingAudioSource.cpp
Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.cpp
Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.h
Source/WebCore/platform/mediastream/mac/RealtimeOutgoingVideoSource.cpp

diff --git a/LayoutTests/ChangeLog b/LayoutTests/ChangeLog
index 7ee5896..59d5692 100644
@@ -1,3 +1,23 @@
+2017-03-16  Youenn Fablet  <youenn@apple.com>
+
+        Improve WebRTC track enabled support
+        https://bugs.webkit.org/show_bug.cgi?id=169727
+
+        Reviewed by Alex Christensen.
+
+        * TestExpectations:
+        * webrtc/audio-peer-connection-webaudio.html:
+        * webrtc/peer-connection-audio-mute-expected.txt:
+        * webrtc/peer-connection-audio-mute.html:
+        * webrtc/peer-connection-audio-mute2-expected.txt: Added.
+        * webrtc/peer-connection-audio-mute2.html: Added.
+        * webrtc/peer-connection-remote-audio-mute-expected.txt: Added.
+        * webrtc/peer-connection-remote-audio-mute.html: Added.
+        * webrtc/video-mute-expected.txt:
+        * webrtc/video-mute.html:
+        * webrtc/video-remote-mute-expected.txt: Added.
+        * webrtc/video-remote-mute.html: Added.
+
 2017-03-16  Manuel Rego Casasnovas  <rego@igalia.com>
 
         [css-grid] Crash on debug removing a positioned child
diff --git a/LayoutTests/TestExpectations b/LayoutTests/TestExpectations
index 25bd29b..b044c20 100644
@@ -711,7 +711,10 @@ media/session [ Skip ]
 # GTK enables some of these tests on their TestExpectations file.
 [ Release ] webrtc [ Skip ]
 
-[ Debug ] webrtc/audio-peer-connection-webaudio.html [ Failure ]
+[ Debug ] webrtc/peer-connection-audio-mute.html [ Pass Failure ]
+[ Debug ] webrtc/peer-connection-audio-mute2.html [ Pass Failure ]
+[ Debug ] webrtc/peer-connection-remote-audio-mute.html [ Pass Failure ]
+[ Debug ] webrtc/peer-connection-remote-audio-mute2.html [ Pass Failure ]
 fast/mediastream/getUserMedia-webaudio.html [ Skip ]
 fast/mediastream/RTCPeerConnection-AddRemoveStream.html [ Skip ]
 fast/mediastream/RTCPeerConnection-closed-state.html [ Skip ]
diff --git a/LayoutTests/webrtc/audio-peer-connection-webaudio.html b/LayoutTests/webrtc/audio-peer-connection-webaudio.html
index 5ffa19a..994bf75 100644
@@ -11,7 +11,7 @@
         if (window.testRunner)
             testRunner.setUserMediaPermission(true);
 
-       return navigator.mediaDevices.getUserMedia({audio: true}).then((stream) => {
+        return navigator.mediaDevices.getUserMedia({audio: true}).then((stream) => {
             if (window.internals)
                 internals.useMockRTCPeerConnectionFactory("TwoRealPeerConnections");
             return new Promise((resolve, reject) => {
                     secondConnection.onaddstream = (streamEvent) => { resolve(streamEvent.stream); };
                 });
                 setTimeout(() => reject("Test timed out"), 5000);
-            }).then((stream) => {
-                return analyseAudio(stream, 1000);
-            }).then((results) => {
-                assert_true(results.heardHum, "heard hum");
-                assert_true(results.heardBip, "heard bip");
-                assert_true(results.heardBop, "heard bop");
             });
-         });
+        }).then((remoteStream) => {
+            return analyseAudio(remoteStream, 1000);
+        }).then((results) => {
+            assert_true(results.heardHum, "heard hum");
+            assert_true(results.heardBip, "heard bip");
+            assert_true(results.heardBop, "heard bop");
+        });
     }, "Basic audio playback through a peer connection");
     </script>
 </head>
diff --git a/LayoutTests/webrtc/peer-connection-audio-mute-expected.txt b/LayoutTests/webrtc/peer-connection-audio-mute-expected.txt
index d39af1e..e76677e 100644
@@ -1,3 +1,3 @@
 
-FAIL Muting and unmuting an audio track assert_true: heard hum expected true got false
+PASS Muting a local audio track and making sure the remote track is silent 
 
diff --git a/LayoutTests/webrtc/peer-connection-audio-mute.html b/LayoutTests/webrtc/peer-connection-audio-mute.html
index 2796e8b..6b8c09e 100644
         if (window.testRunner)
             testRunner.setUserMediaPermission(true);
 
-        return navigator.mediaDevices.getUserMedia({audio: true}).then((stream) => {
+        var localTrack;
+        return navigator.mediaDevices.getUserMedia({audio: true}).then((localStream) => {
             if (window.internals)
                 internals.useMockRTCPeerConnectionFactory("TwoRealPeerConnections");
 
-            var stream;
+            localTrack = localStream.getAudioTracks()[0];
+            var remoteStream;
             return new Promise((resolve, reject) => {
                 createConnections((firstConnection) => {
-                    firstConnection.addStream(stream);
+                    firstConnection.addStream(localStream);
                 }, (secondConnection) => {
                     secondConnection.onaddstream = (streamEvent) => {
-                        stream = streamEvent.stream;
+                        remoteStream = streamEvent.stream;
                         resolve();
                     };
                 });
             }).then(() => {
                 return waitFor(500);
             }).then(() => {
-                return analyseAudio(stream, 500).then((results) => {
-                    assert_true(results.heardHum, "heard hum");
-                    assert_true(results.heardBip, "heard bip");
-                    assert_true(results.heardBop, "heard bop");
+                return analyseAudio(remoteStream, 500).then((results) => {
+                    assert_true(results.heardHum, "heard hum from remote enabled track");
                 });
             }).then(() => {
-                stream.getAudioTracks().forEach((track) => {
-                    track.enabled = false;
-                });
-                return waitFor(500);
-            }).then(() => {
-                return analyseAudio(stream, 500).then((results) => {
-                    assert_false(results.heardHum, "heard hum");
-                    assert_false(results.heardBip, "heard bip");
-                    assert_false(results.heardBop, "heard bop");
-                });
-            }).then(() => {
-                stream.getAudioTracks().forEach((track) => {
-                    track.enabled = true;
-                });
+                localTrack.enabled = false;
                 return waitFor(500);
             }).then(() => {
-                return analyseAudio(stream, 500).then((results) => {
-                    assert_true(results.heardHum, "heard hum");
-                    assert_true(results.heardBip, "heard bip");
-                    assert_true(results.heardBop, "heard bop");
+                return analyseAudio(remoteStream, 500).then((results) => {
+                    assert_false(results.heardHum, "not heard hum from remote disabled track");
                 });
             });
         });
-    }, "Muting and unmuting an audio track");
+    }, "Muting a local audio track and making sure the remote track is silent");
     </script>
 </body>
 </html>
diff --git a/LayoutTests/webrtc/peer-connection-audio-mute2-expected.txt b/LayoutTests/webrtc/peer-connection-audio-mute2-expected.txt
new file mode 100644
index 0000000..a70a03c
--- /dev/null
+++ b/LayoutTests/webrtc/peer-connection-audio-mute2-expected.txt
@@ -0,0 +1,3 @@
+
+PASS Muting and unmuting a local audio track 
+
diff --git a/LayoutTests/webrtc/peer-connection-audio-mute2.html b/LayoutTests/webrtc/peer-connection-audio-mute2.html
new file mode 100644
index 0000000..7b6270f
--- /dev/null
+++ b/LayoutTests/webrtc/peer-connection-audio-mute2.html
@@ -0,0 +1,57 @@
+<!DOCTYPE html>
+<html>
+<head>
+    <meta charset="utf-8">
+    <title>Testing local audio capture playback causes "playing" event to fire</title>
+    <script src="../resources/testharness.js"></script>
+    <script src="../resources/testharnessreport.js"></script>
+</head>
+<body>
+    <script src ="routines.js"></script>
+    <script>
+    promise_test((test) => {
+        if (window.testRunner)
+            testRunner.setUserMediaPermission(true);
+
+        var localTrack;
+        return navigator.mediaDevices.getUserMedia({audio: true}).then((localStream) => {
+            if (window.internals)
+                internals.useMockRTCPeerConnectionFactory("TwoRealPeerConnections");
+
+            localTrack = localStream.getAudioTracks()[0];
+            var remoteStream;
+            return new Promise((resolve, reject) => {
+                createConnections((firstConnection) => {
+                    firstConnection.addStream(localStream);
+                }, (secondConnection) => {
+                    secondConnection.onaddstream = (streamEvent) => {
+                        remoteStream = streamEvent.stream;
+                        resolve();
+                    };
+                });
+            }).then(() => {
+                return waitFor(500);
+            }).then(() => {
+                return analyseAudio(remoteStream, 500).then((results) => {
+                    assert_true(results.heardHum, "heard hum from remote enabled track");
+                });
+            }).then(() => {
+                localTrack.enabled = false;
+                return waitFor(500);
+            }).then(() => {
+                return analyseAudio(remoteStream, 500).then((results) => {
+                    assert_false(results.heardHum, "not heard hum from remote disabled track");
+                });
+            }).then(() => {
+                localTrack.enabled = true;
+                return waitFor(500);
+            }).then(() => {
+                return analyseAudio(remoteStream, 500).then((results) => {
+                    assert_true(results.heardHum, "heard hum from remote reenabled track");
+                });
+            });
+        });
+    }, "Muting and unmuting a local audio track");
+    </script>
+</body>
+</html>
diff --git a/LayoutTests/webrtc/peer-connection-remote-audio-mute-expected.txt b/LayoutTests/webrtc/peer-connection-remote-audio-mute-expected.txt
new file mode 100644
index 0000000..072c5ac
--- /dev/null
+++ b/LayoutTests/webrtc/peer-connection-remote-audio-mute-expected.txt
@@ -0,0 +1,3 @@
+
+PASS Muting an incoming audio track 
+
diff --git a/LayoutTests/webrtc/peer-connection-remote-audio-mute.html b/LayoutTests/webrtc/peer-connection-remote-audio-mute.html
new file mode 100644
index 0000000..34b42e5
--- /dev/null
+++ b/LayoutTests/webrtc/peer-connection-remote-audio-mute.html
@@ -0,0 +1,47 @@
+<!DOCTYPE html>
+<html>
+<head>
+    <meta charset="utf-8">
+    <title>Testing local audio capture playback causes "playing" event to fire</title>
+    <script src="../resources/testharness.js"></script>
+    <script src="../resources/testharnessreport.js"></script>
+</head>
+<body>
+    <script src ="routines.js"></script>
+    <script>
+    promise_test((test) => {
+        if (window.testRunner)
+            testRunner.setUserMediaPermission(true);
+
+        return navigator.mediaDevices.getUserMedia({audio: true}).then((localStream) => {
+            if (window.internals)
+                internals.useMockRTCPeerConnectionFactory("TwoRealPeerConnections");
+
+            var remoteTrack;
+            var remoteStream;
+            return new Promise((resolve, reject) => {
+                createConnections((firstConnection) => {
+                    firstConnection.addStream(localStream);
+                }, (secondConnection) => {
+                    secondConnection.onaddstream = (streamEvent) => {
+                        remoteStream = streamEvent.stream;
+                        remoteTrack = remoteStream.getAudioTracks()[0];
+                        resolve();
+                    };
+                });
+            }).then(() => {
+                return analyseAudio(remoteStream, 500).then((results) => {
+                    assert_true(results.heardHum, "heard hum from remote enabled track");
+                });
+            }).then(() => {
+                remoteTrack.enabled = false;
+            }).then(() => {
+                return analyseAudio(remoteStream, 500).then((results) => {
+                    assert_false(results.heardHum, "not heard hum from remote disabled track");
+                });
+            });
+        });
+    }, "Muting an incoming audio track");
+    </script>
+</body>
+</html>
diff --git a/LayoutTests/webrtc/peer-connection-remote-audio-mute2-expected.txt b/LayoutTests/webrtc/peer-connection-remote-audio-mute2-expected.txt
new file mode 100644
index 0000000..42911ae
--- /dev/null
+++ b/LayoutTests/webrtc/peer-connection-remote-audio-mute2-expected.txt
@@ -0,0 +1,3 @@
+
+PASS Muting and unmuting an incoming audio track 
+
diff --git a/LayoutTests/webrtc/peer-connection-remote-audio-mute2.html b/LayoutTests/webrtc/peer-connection-remote-audio-mute2.html
new file mode 100644
index 0000000..83eb56b
--- /dev/null
+++ b/LayoutTests/webrtc/peer-connection-remote-audio-mute2.html
@@ -0,0 +1,53 @@
+<!DOCTYPE html>
+<html>
+<head>
+    <meta charset="utf-8">
+    <title>Testing local audio capture playback causes "playing" event to fire</title>
+    <script src="../resources/testharness.js"></script>
+    <script src="../resources/testharnessreport.js"></script>
+</head>
+<body>
+    <script src ="routines.js"></script>
+    <script>
+    promise_test((test) => {
+        if (window.testRunner)
+            testRunner.setUserMediaPermission(true);
+
+        return navigator.mediaDevices.getUserMedia({audio: true}).then((localStream) => {
+            if (window.internals)
+                internals.useMockRTCPeerConnectionFactory("TwoRealPeerConnections");
+
+            var remoteTrack;
+            var remoteStream;
+            return new Promise((resolve, reject) => {
+                createConnections((firstConnection) => {
+                    firstConnection.addStream(localStream);
+                }, (secondConnection) => {
+                    secondConnection.onaddstream = (streamEvent) => {
+                        remoteStream = streamEvent.stream;
+                        remoteTrack = remoteStream.getAudioTracks()[0];
+                        resolve();
+                    };
+                });
+            }).then(() => {
+                return analyseAudio(remoteStream, 500).then((results) => {
+                    assert_true(results.heardHum, "heard hum from remote enabled track");
+                });
+            }).then(() => {
+                remoteTrack.enabled = false;
+            }).then(() => {
+                return analyseAudio(remoteStream, 500).then((results) => {
+                    assert_false(results.heardHum, "not heard hum from remote disabled track");
+                });
+            }).then(() => {
+                remoteTrack.enabled = true;
+            }).then(() => {
+                return analyseAudio(remoteStream, 500).then((results) => {
+                    assert_true(results.heardHum, "heard hum from remote reenabled track");
+                });
+            });
+        });
+    }, "Muting and unmuting an incoming audio track");
+    </script>
+</body>
+</html>
diff --git a/LayoutTests/webrtc/video-mute-expected.txt b/LayoutTests/webrtc/video-mute-expected.txt
index 2d261a5..9fe1422 100644
@@ -1,4 +1,4 @@
 
 
-PASS Video muted/unmuted track 
+PASS Outgoing muted/unmuted video track 
 
diff --git a/LayoutTests/webrtc/video-mute.html b/LayoutTests/webrtc/video-mute.html
index 1bafaba..5957bb4 100644
@@ -21,10 +21,11 @@ function isVideoBlack()
     canvas.height = video.videoHeight;
     canvas.getContext('2d').drawImage(video, 0, 0, canvas.width, canvas.height);
 
-    imageData = canvas.getContext('2d').getImageData(10, 325, 250, 1);
+    imageData = canvas.getContext('2d').getImageData(0, 0, canvas.width, canvas.height);
     data = imageData.data;
     for (var cptr = 0; cptr < canvas.width * canvas.height; ++cptr) {
-        if (data[4 * cptr] || data[4 * cptr + 1] || data[4 * cptr + 2])
+        // Approximatively black pixels.
+        if (data[4 * cptr] > 10 || data[4 * cptr + 1] > 10 || data[4 * cptr + 2] > 10)
             return false;
     }
     return true;
@@ -35,35 +36,36 @@ promise_test((test) => {
     if (window.testRunner)
         testRunner.setUserMediaPermission(true);
 
-    return navigator.mediaDevices.getUserMedia({ video: true}).then((stream) => {
+    return navigator.mediaDevices.getUserMedia({ video: true}).then((localStream) => {
         return new Promise((resolve, reject) => {
             if (window.internals)
                 internals.useMockRTCPeerConnectionFactory("TwoRealPeerConnections");
 
+            track = localStream.getVideoTracks()[0];
+
             createConnections((firstConnection) => {
-                firstConnection.addStream(stream);
+                firstConnection.addStream(localStream);
             }, (secondConnection) => {
                 secondConnection.onaddstream = (streamEvent) => { resolve(streamEvent.stream); };
             });
             setTimeout(() => reject("Test timed out"), 5000);
         });
-    }).then((stream) => {
-        video.srcObject = stream;
-        track = stream.getVideoTracks()[0];
+    }).then((remoteStream) => {
+        video.srcObject = remoteStream;
         return video.play();
     }).then(() => {
-         assert_false(isVideoBlack());
+         assert_false(isVideoBlack(), "track is enabled, video is not black");
     }).then(() => {
         track.enabled = false;
         return waitFor(500);
     }).then(() => {
-        assert_true(isVideoBlack());
+        assert_true(isVideoBlack(), "track is disabled, video is black");
         track.enabled = true;
         return waitFor(500);
     }).then(() => {
-        assert_false(isVideoBlack());
+        assert_false(isVideoBlack(), "track is reenabled, video is not black");
     });
-}, "Video muted/unmuted track");
+}, "Outgoing muted/unmuted video track");
         </script>
     </body>
 </html>
diff --git a/LayoutTests/webrtc/video-remote-mute-expected.txt b/LayoutTests/webrtc/video-remote-mute-expected.txt
new file mode 100644
index 0000000..7d91abc
--- /dev/null
+++ b/LayoutTests/webrtc/video-remote-mute-expected.txt
@@ -0,0 +1,4 @@
+
+
+PASS Incoming muted/unmuted video track 
+
diff --git a/LayoutTests/webrtc/video-remote-mute.html b/LayoutTests/webrtc/video-remote-mute.html
new file mode 100644
index 0000000..06bc745
--- /dev/null
+++ b/LayoutTests/webrtc/video-remote-mute.html
@@ -0,0 +1,69 @@
+<!doctype html>
+<html>
+    <head>
+        <meta charset="utf-8">
+        <title>Testing basic video exchange from offerer to receiver</title>
+        <script src="../resources/testharness.js"></script>
+        <script src="../resources/testharnessreport.js"></script>
+    </head>
+    <body>
+        <video id="video" autoplay=""></video>
+        <canvas id="canvas" width="640" height="480"></canvas>
+        <script src ="routines.js"></script>
+        <script>
+video = document.getElementById("video");
+canvas = document.getElementById("canvas");
+// FIXME: We should use tracks
+
+function isVideoBlack()
+{
+    canvas.width = video.videoWidth;
+    canvas.height = video.videoHeight;
+    canvas.getContext('2d').drawImage(video, 0, 0, canvas.width, canvas.height);
+
+    imageData = canvas.getContext('2d').getImageData(10, 325, 250, 1);
+    data = imageData.data;
+    for (var cptr = 0; cptr < canvas.width * canvas.height; ++cptr) {
+        if (data[4 * cptr] || data[4 * cptr + 1] || data[4 * cptr + 2])
+            return false;
+    }
+    return true;
+}
+
+var track;
+promise_test((test) => {
+    if (window.testRunner)
+        testRunner.setUserMediaPermission(true);
+
+    return navigator.mediaDevices.getUserMedia({ video: true}).then((localStream) => {
+        return new Promise((resolve, reject) => {
+            if (window.internals)
+                internals.useMockRTCPeerConnectionFactory("TwoRealPeerConnections");
+
+            createConnections((firstConnection) => {
+                firstConnection.addStream(localStream);
+            }, (secondConnection) => {
+                secondConnection.onaddstream = (streamEvent) => { resolve(streamEvent.stream); };
+            });
+            setTimeout(() => reject("Test timed out"), 5000);
+        });
+    }).then((remoteStream) => {
+        video.srcObject = remoteStream;
+        track = remoteStream.getVideoTracks()[0];
+        return video.play();
+    }).then(() => {
+         assert_false(isVideoBlack());
+    }).then(() => {
+        track.enabled = false;
+        return waitFor(500);
+    }).then(() => {
+        assert_true(isVideoBlack());
+        track.enabled = true;
+        return waitFor(500);
+    }).then(() => {
+        assert_false(isVideoBlack());
+    });
+}, "Incoming muted/unmuted video track");
+        </script>
+    </body>
+</html>
diff --git a/Source/WebCore/ChangeLog b/Source/WebCore/ChangeLog
index cac7ad5..457d8fc 100644
@@ -1,5 +1,34 @@
 2017-03-16  Youenn Fablet  <youenn@apple.com>
 
+        Improve WebRTC track enabled support
+        https://bugs.webkit.org/show_bug.cgi?id=169727
+
+        Reviewed by Alex Christensen.
+
+        Tests: webrtc/peer-connection-audio-mute2.html
+               webrtc/peer-connection-remote-audio-mute.html
+               webrtc/video-remote-mute.html
+
+        Making sure muted/disabled sources produce silence/black frames.
+        For outgoing audio/video sources, this should be done by the actual a/v providers.
+        We keep this filtering here until we are sure they implement that.
+
+        * platform/audio/mac/AudioSampleDataSource.mm:
+        (WebCore::AudioSampleDataSource::pullAvalaibleSamplesAsChunks): Ensuring disabled audio tracks send silence.
+        Used for outgoing webrtc tracks.
+        * platform/mediastream/mac/MockRealtimeAudioSourceMac.mm:
+        (WebCore::MockRealtimeAudioSourceMac::render): Ditto.
+        * platform/mediastream/mac/RealtimeIncomingAudioSource.cpp:
+        (WebCore::RealtimeIncomingAudioSource::OnData): Ditto.
+        * platform/mediastream/mac/RealtimeIncomingVideoSource.cpp:
+        (WebCore::RealtimeIncomingVideoSource::pixelBufferFromVideoFrame): Generating black frames if muted.
+        (WebCore::RealtimeIncomingVideoSource::OnFrame):
+        * platform/mediastream/mac/RealtimeIncomingVideoSource.h:
+        * platform/mediastream/mac/RealtimeOutgoingVideoSource.cpp:
+        (WebCore::RealtimeOutgoingVideoSource::videoSampleAvailable): Ensuring we quit after sending black frame.
+
+2017-03-16  Youenn Fablet  <youenn@apple.com>
+
         LibWebRTC outgoing source should be thread safe refcounted
         https://bugs.webkit.org/show_bug.cgi?id=169726
 
diff --git a/Source/WebCore/platform/audio/mac/AudioSampleDataSource.mm b/Source/WebCore/platform/audio/mac/AudioSampleDataSource.mm
index cb82c0f..da3b09d 100644
@@ -311,6 +311,16 @@ bool AudioSampleDataSource::pullAvalaibleSamplesAsChunks(AudioBufferList& buffer
         timeStamp = startFrame;
 
     startFrame = timeStamp;
+
+    if (m_muted) {
+        AudioSampleBufferList::zeroABL(buffer, sampleCountPerChunk * m_outputDescription->bytesPerFrame());
+        while (endFrame - startFrame >= sampleCountPerChunk) {
+            consumeFilledBuffer();
+            startFrame += sampleCountPerChunk;
+        }
+        return true;
+    }
+
     while (endFrame - startFrame >= sampleCountPerChunk) {
         if (m_ringBuffer->fetch(&buffer, sampleCountPerChunk, startFrame, CARingBuffer::Copy))
             return false;
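
AudioSampleBufferList::zeroABL() above is a WebKit-internal helper; a minimal stand-in over a plain CoreAudio AudioBufferList might look like the sketch below (header and helper name are assumptions). Note that the muted path still calls consumeFilledBuffer() and advances startFrame, so the ring buffer's read pointer keeps pace with capture and playback resumes cleanly on unmute.

    #include <CoreAudio/CoreAudioTypes.h>

    #include <algorithm>
    #include <cstring>

    // Zero every buffer in the list so the pulled chunk decodes as silence;
    // byteCount mirrors the sampleCountPerChunk * bytesPerFrame computation above.
    static void zeroAudioBufferList(AudioBufferList& list, size_t byteCount)
    {
        for (UInt32 i = 0; i < list.mNumberBuffers; ++i) {
            auto& audioBuffer = list.mBuffers[i];
            std::memset(audioBuffer.mData, 0,
                std::min<size_t>(byteCount, audioBuffer.mDataByteSize));
        }
    }
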
diff --git a/Source/WebCore/platform/mediastream/mac/MockRealtimeAudioSourceMac.mm b/Source/WebCore/platform/mediastream/mac/MockRealtimeAudioSourceMac.mm
index 2fc9e35..6bcaf5c 100644
@@ -147,9 +147,6 @@ void MockRealtimeAudioSourceMac::reconfigure()
 
 void MockRealtimeAudioSourceMac::render(double delta)
 {
-    if (m_muted || !m_enabled)
-        return;
-
     if (!m_audioBufferList)
         reconfigure();
 
@@ -162,8 +159,11 @@ void MockRealtimeAudioSourceMac::render(double delta)
         uint32_t bipBopCount = std::min(frameCount, bipBopRemain);
         for (auto& audioBuffer : m_audioBufferList->buffers()) {
             audioBuffer.mDataByteSize = frameCount * m_streamFormat.mBytesPerFrame;
-            memcpy(audioBuffer.mData, &m_bipBopBuffer[bipBopStart], sizeof(Float32) * bipBopCount);
-            addHum(HumVolume, HumFrequency, m_sampleRate, m_samplesRendered, static_cast<float*>(audioBuffer.mData), bipBopCount);
+            if (!m_muted && m_enabled) {
+                memcpy(audioBuffer.mData, &m_bipBopBuffer[bipBopStart], sizeof(Float32) * bipBopCount);
+                addHum(HumVolume, HumFrequency, m_sampleRate, m_samplesRendered, static_cast<float*>(audioBuffer.mData), bipBopCount);
+            } else
+                memset(audioBuffer.mData, 0, sizeof(Float32) * bipBopCount);
         }
         emitSampleBuffers(bipBopCount);
         m_samplesRendered += bipBopCount;
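
The mock-source change applies the same idea on the generator side: the old code returned early when muted, which stalled m_samplesRendered and stopped buffers from flowing; the new code keeps emitting buffers at the normal cadence and only blanks their payload. A reduced sketch of that shape (names hypothetical):

    #include <cstdint>
    #include <cstring>

    void writeBipBopAndHum(float* data, uint32_t frameCount); // hypothetical tone generator

    // Keep producing correctly sized chunks while muted; only the contents change.
    void renderChunk(float* data, uint32_t frameCount, bool muted, bool enabled)
    {
        if (muted || !enabled) {
            std::memset(data, 0, frameCount * sizeof(float)); // silence, same timing
            return;
        }
        writeBipBopAndHum(data, frameCount);
    }
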
diff --git a/Source/WebCore/platform/mediastream/mac/RealtimeIncomingAudioSource.cpp b/Source/WebCore/platform/mediastream/mac/RealtimeIncomingAudioSource.cpp
index 02581bd..2ff8aa9 100644
@@ -95,12 +95,16 @@ void RealtimeIncomingAudioSource::OnData(const void* audioData, int bitsPerSampl
             m_audioSourceProvider->prepare(&m_streamFormat);
     }
 
+    // FIXME: We should not need to do the extra memory allocation and copy.
+    // Instead, we should be able to directly pass audioData pointer.
     WebAudioBufferList audioBufferList { CAAudioStreamDescription(m_streamFormat), WTF::safeCast<uint32_t>(numberOfFrames) };
     audioBufferList.buffer(0)->mDataByteSize = numberOfChannels * numberOfFrames * bitsPerSample / 8;
     audioBufferList.buffer(0)->mNumberChannels = numberOfChannels;
-    // FIXME: We should not need to do the extra memory allocation and copy.
-    // Instead, we should be able to directly pass audioData pointer.
-    memcpy(audioBufferList.buffer(0)->mData, audioData, audioBufferList.buffer(0)->mDataByteSize);
+
+    if (muted() || !enabled())
+        memset(audioBufferList.buffer(0)->mData, 0, audioBufferList.buffer(0)->mDataByteSize);
+    else
+        memcpy(audioBufferList.buffer(0)->mData, audioData, audioBufferList.buffer(0)->mDataByteSize);
 
     audioSamplesAvailable(mediaTime, audioBufferList, CAAudioStreamDescription(m_streamFormat), numberOfFrames);
 }
diff --git a/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.cpp b/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.cpp
index 193016a..92ee19e 100644
@@ -92,13 +92,40 @@ void RealtimeIncomingVideoSource::stopProducingData()
         m_videoTrack->RemoveSink(this);
 }
 
+CVPixelBufferRef RealtimeIncomingVideoSource::pixelBufferFromVideoFrame(const webrtc::VideoFrame& frame)
+{
+    if (muted() || !enabled()) {
+        if (!m_blackFrame || m_blackFrameWidth != frame.width() || m_blackFrameHeight != frame.height()) {
+            CVPixelBufferRef pixelBuffer = nullptr;
+            auto status = CVPixelBufferCreate(kCFAllocatorDefault, frame.width(), frame.height(), kCVPixelFormatType_420YpCbCr8Planar, nullptr, &pixelBuffer);
+            ASSERT_UNUSED(status, status == noErr);
+
+            m_blackFrame = pixelBuffer;
+            m_blackFrameWidth = frame.width();
+            m_blackFrameHeight = frame.height();
+
+            status = CVPixelBufferLockBaseAddress(pixelBuffer, 0);
+            ASSERT(status == noErr);
+            void* data = CVPixelBufferGetBaseAddress(pixelBuffer);
+            size_t yLength = frame.width() * frame.height();
+            memset(data, 0, yLength);
+            memset(static_cast<uint8_t*>(data) + yLength, 128, yLength / 2);
+
+            status = CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
+            ASSERT(!status);
+        }
+        return m_blackFrame.get();
+    }
+    auto buffer = frame.video_frame_buffer();
+    return static_cast<CVPixelBufferRef>(buffer->native_handle());
+}
+
 void RealtimeIncomingVideoSource::OnFrame(const webrtc::VideoFrame& frame)
 {
     if (!m_isProducingData)
         return;
 
-    auto buffer = frame.video_frame_buffer();
-    CVPixelBufferRef pixelBuffer = static_cast<CVPixelBufferRef>(buffer->native_handle());
+    auto pixelBuffer = pixelBufferFromVideoFrame(frame);
 
     // FIXME: Convert timing information from VideoFrame to CMSampleTimingInfo.
     // For the moment, we will pretend that frames should be rendered asap.
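
The committed code fills the planes through CVPixelBufferGetBaseAddress() and caches the result in m_blackFrame keyed on the incoming frame's dimensions, so nothing is reallocated while the track stays disabled. For comparison, a hedged sketch using the plane-aware CoreVideo accessors, which avoids assuming the chroma planes sit contiguously after the luma plane:

    #include <CoreVideo/CoreVideo.h>

    #include <cstring>

    // Sketch: build a black 4:2:0 planar pixel buffer (luma 0, chroma 128).
    static CVPixelBufferRef createBlackFrame(size_t width, size_t height)
    {
        CVPixelBufferRef buffer = nullptr;
        if (CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                kCVPixelFormatType_420YpCbCr8Planar, nullptr, &buffer) != kCVReturnSuccess)
            return nullptr;

        CVPixelBufferLockBaseAddress(buffer, 0);
        for (size_t plane = 0; plane < CVPixelBufferGetPlaneCount(buffer); ++plane) {
            size_t planeBytes = CVPixelBufferGetBytesPerRowOfPlane(buffer, plane)
                * CVPixelBufferGetHeightOfPlane(buffer, plane);
            // Plane 0 is luma (0 == black); planes 1 and 2 are Cb/Cr (128 == neutral).
            std::memset(CVPixelBufferGetBaseAddressOfPlane(buffer, plane),
                plane ? 128 : 0, planeBytes);
        }
        CVPixelBufferUnlockBaseAddress(buffer, 0);
        return buffer; // caller releases with CVPixelBufferRelease()
    }
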
diff --git a/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.h b/Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.h
index 46ed691..ee5e82b 100644
@@ -70,6 +70,8 @@ private:
     // rtc::VideoSinkInterface
     void OnFrame(const webrtc::VideoFrame&) final;
 
+    CVPixelBufferRef pixelBufferFromVideoFrame(const webrtc::VideoFrame&);
+
     RefPtr<Image> m_currentImage;
     RealtimeMediaSourceSettings m_currentSettings;
     RealtimeMediaSourceSupportedConstraints m_supportedConstraints;
@@ -79,6 +81,9 @@ private:
     rtc::scoped_refptr<webrtc::VideoTrackInterface> m_videoTrack;
     RetainPtr<CMSampleBufferRef> m_buffer;
     PixelBufferConformerCV m_conformer;
+    RetainPtr<CVPixelBufferRef> m_blackFrame;
+    int m_blackFrameWidth { 0 };
+    int m_blackFrameHeight { 0 };
 };
 
 } // namespace WebCore
diff --git a/Source/WebCore/platform/mediastream/mac/RealtimeOutgoingVideoSource.cpp b/Source/WebCore/platform/mediastream/mac/RealtimeOutgoingVideoSource.cpp
index 169b417..8ac57f5 100644
@@ -92,6 +92,7 @@ void RealtimeOutgoingVideoSource::videoSampleAvailable(MediaSample& sample)
         auto blackBuffer = m_bufferPool.CreateBuffer(settings.width(), settings.height());
         blackBuffer->SetToBlack();
         sendFrame(WTFMove(blackBuffer));
+        return;
     }
 
     ASSERT(sample.platformSample().type == PlatformSample::CMSampleBufferType);
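
The added return is the substantive fix in this last hunk: videoSampleAvailable() otherwise falls through to converting and sending the captured sample, so a disabled track would send a black frame immediately followed by the real frame, defeating the mute. A toy, self-contained demonstration of the fallthrough pattern:

    #include <cstdio>

    void sendFrame(const char* what) { std::printf("sent %s\n", what); }

    // Without the early return, both sends below would run for a disabled track.
    void videoSampleAvailable(bool enabled)
    {
        if (!enabled) {
            sendFrame("black frame");
            return; // the statement this patch adds
        }
        sendFrame("captured frame");
    }

    int main()
    {
        videoSampleAvailable(false); // prints only "sent black frame"
    }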