[MediaStream] Move paintCurrentFrameInContext from RealtimeMediaSources to MediaPlayer
author    eric.carlson@apple.com <eric.carlson@apple.com@268f45cc-cd09-0410-ab3c-d52691b4dbfc>
Tue, 14 Mar 2017 00:30:48 +0000 (00:30 +0000)
committer eric.carlson@apple.com <eric.carlson@apple.com@268f45cc-cd09-0410-ab3c-d52691b4dbfc>
Tue, 14 Mar 2017 00:30:48 +0000 (00:30 +0000)
https://bugs.webkit.org/show_bug.cgi?id=169474
<rdar://problem/30976747>

Reviewed by Youenn Fablet.

Source/WebCore:

Every video capture source has extremely similar code to render the current frame to
a graphics context. Because the media player gets every video sample buffer, have it
hang onto the most recent frame so it can implement paintCurrentFrameInContext directly.
Fix an existing race condition that occasionally caused the readyState to advance to
"have enough data" before a video was ready to paint by defining a MediaStreamTrackPrivate
readyState and observing that.
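The caching scheme described above — keep the most recent sample, convert it to a paintable image lazily, and invalidate the cached image when a newer sample arrives — can be sketched as follows. This is a minimal illustration with stand-in types, not WebCore's actual MediaSample/CGImage classes:

```cpp
#include <cassert>
#include <memory>

// Hypothetical stand-ins for WebKit's sample and native-image types.
struct MediaSample { int frameId; };
struct NativeImage { int frameId; };

// Mirrors the idea behind CurrentFramePainter: hang onto the newest
// sample, convert on demand, and reuse the converted image for
// repeated paints until a fresher sample invalidates it.
class CurrentFramePainter {
public:
    void setSample(std::shared_ptr<MediaSample> sample)
    {
        m_sample = std::move(sample);
        m_image.reset(); // a newer frame makes the cached image stale
    }

    // Lazily convert the current sample; repeated calls reuse the cache.
    const NativeImage* image()
    {
        if (!m_sample)
            return nullptr;
        if (!m_image) {
            m_image = std::make_unique<NativeImage>();
            m_image->frameId = m_sample->frameId;
        }
        return m_image.get();
    }

    void reset()
    {
        m_sample.reset();
        m_image.reset();
    }

private:
    std::shared_ptr<MediaSample> m_sample;
    std::unique_ptr<NativeImage> m_image;
};
```

In the real patch the conversion step goes through a PixelBufferConformerCV; the sketch only shows the invalidate-on-new-sample discipline.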

No new tests, covered by existing tests. These changes uncovered a bug in
fast/mediastream/MediaStream-video-element-video-tracks-disabled-then-enabled.html, which
was updated.

* Modules/mediastream/CanvasCaptureMediaStreamTrack.cpp:
(WebCore::CanvasCaptureMediaStreamTrack::Source::captureCanvas):
(WebCore::CanvasCaptureMediaStreamTrack::Source::paintCurrentFrameInContext): Deleted.
(WebCore::CanvasCaptureMediaStreamTrack::Source::currentFrameImage): Deleted.
* Modules/mediastream/CanvasCaptureMediaStreamTrack.h:

* platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h:
* platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm:
(-[WebAVSampleBufferStatusChangeListener observeValueForKeyPath:ofObject:change:context:]):
Drive-by change - don't pass status to the parent callback; it is a property of the layer.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::isAvailable): Drive-by cleanup - we don't
use AVSampleBufferRenderSynchronizer so don't fail if it isn't available.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSample): Hang onto new frame,
invalidate cached image, update readyState.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::layerStatusDidChange): No more "updatePausedImage".
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::ensureLayer): Drive-by cleanup - Add an early
return if there is no need for a layer.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyLayer): renderingModeChanged -> updateRenderingMode.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::currentDisplayMode): Minor cleanup.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateDisplayMode): Renamed from renderingModeChanged,
add a bool return to signal when the mode changes.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::play): No more m_haveEverPlayed. Update display
mode immediately.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::pause): No more paused image.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::currentReadyState): Only return HaveNothing, HaveMetadata,
or HaveEnoughData. Don't return HaveEnoughData until all enabled tracks are providing data and never
drop back to HaveMetadata.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateRenderingMode): Renamed from renderingModeChanged.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::characteristicsChanged): Update intrinsic
size directly.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::sampleBufferUpdated): No more m_hasReceivedMedia.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::readyStateChanged): Ditto.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::checkSelectedVideoTrack): Reset imagePainter
when active video track changes.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateCurrentFrameImage): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::paintCurrentFrameInContext): Paint current
frame image.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::CurrentFramePainter::reset): New.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::shouldEnqueueVideoSampleBuffer): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updatePausedImage): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateIntrinsicSize): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::renderingModeChanged): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::audioSamplesAvailable): Deleted.

* platform/mediastream/MediaStreamPrivate.cpp:
(WebCore::MediaStreamPrivate::paintCurrentFrameInContext): Deleted.
(WebCore::MediaStreamPrivate::currentFrameImage): Deleted.
* platform/mediastream/MediaStreamPrivate.h:

* platform/mediastream/MediaStreamTrackPrivate.cpp:
(WebCore::MediaStreamTrackPrivate::MediaStreamTrackPrivate):
(WebCore::MediaStreamTrackPrivate::endTrack): Update readyState.
(WebCore::MediaStreamTrackPrivate::clone): Clone readyState.
(WebCore::MediaStreamTrackPrivate::sourceStopped): Update readyState.
(WebCore::MediaStreamTrackPrivate::videoSampleAvailable): Ditto.
(WebCore::MediaStreamTrackPrivate::audioSamplesAvailable): Ditto.
(WebCore::MediaStreamTrackPrivate::updateReadyState): New, update readyState and notify observers.
(WebCore::MediaStreamTrackPrivate::paintCurrentFrameInContext): Deleted.
* platform/mediastream/MediaStreamTrackPrivate.h:

* platform/mediastream/RealtimeMediaSource.h:
(WebCore::RealtimeMediaSource::currentFrameImage): Deleted.
(WebCore::RealtimeMediaSource::paintCurrentFrameInContext): Deleted.

* platform/mediastream/mac/AVMediaCaptureSource.mm:
(-[WebCoreAVMediaCaptureSourceObserver disconnect]): Drive-by fix - clear m_callback
after calling removeNotificationObservers.
(-[WebCoreAVMediaCaptureSourceObserver removeNotificationObservers]): Drive-by fix - remove
the correct listener.
(-[WebCoreAVMediaCaptureSourceObserver endSessionInterrupted:]):

* platform/mediastream/mac/AVVideoCaptureSource.h:
* platform/mediastream/mac/AVVideoCaptureSource.mm:
(WebCore::AVVideoCaptureSource::currentFrameImage): Deleted.
(WebCore::AVVideoCaptureSource::currentFrameCGImage): Deleted.
(WebCore::AVVideoCaptureSource::paintCurrentFrameInContext): Deleted.

* platform/mediastream/mac/RealtimeIncomingVideoSource.cpp:
(WebCore::drawImage): Deleted.
(WebCore::RealtimeIncomingVideoSource::currentFrameImage): Deleted.
(WebCore::RealtimeIncomingVideoSource::paintCurrentFrameInContext): Deleted.
* platform/mediastream/mac/RealtimeIncomingVideoSource.h:

* platform/mock/MockRealtimeVideoSource.cpp:
(WebCore::MockRealtimeVideoSource::paintCurrentFrameInContext): Deleted.
(WebCore::MockRealtimeVideoSource::currentFrameImage): Deleted.
* platform/mock/MockRealtimeVideoSource.h:
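The readyState policy noted for currentReadyState() above (report only HaveNothing, HaveMetadata, or HaveEnoughData; advance to HaveEnoughData only once every enabled track is providing data; never regress once reached) can be sketched like this. Names and types are simplified stand-ins, not WebCore's:

```cpp
#include <cassert>
#include <vector>

// Simplified model of the ready-state ladder described above.
enum ReadyState { HaveNothing, HaveMetadata, HaveEnoughData };

struct Track { bool enabled; bool providingData; };

// Advance to HaveEnoughData only when all enabled tracks are providing
// data, and never drop back below a previously reported state.
ReadyState currentReadyState(const std::vector<Track>& tracks, ReadyState previous)
{
    ReadyState state = HaveNothing;
    if (!tracks.empty()) {
        bool allEnabledProviding = true;
        for (const auto& track : tracks) {
            if (track.enabled && !track.providingData)
                allEnabledProviding = false;
        }
        state = allEnabledProviding ? HaveEnoughData : HaveMetadata;
    }
    // Sticky: once HaveEnoughData is reported, don't regress to HaveMetadata.
    return state > previous ? state : previous;
}
```

The sticky comparison against the previous state is what fixes the race the commit message describes: a track that momentarily stops providing data no longer yanks the player back to HaveMetadata.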

LayoutTests:

* fast/mediastream/MediaStream-video-element-video-tracks-disabled-then-enabled-expected.txt:
* fast/mediastream/MediaStream-video-element-video-tracks-disabled-then-enabled.html: Fix
bug uncovered by patch.
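The layout test's attempt() helper polls a pixel check a bounded number of times before declaring failure. That poll-until-true pattern, sketched with hypothetical names in C++ (the actual helper is JavaScript and sleeps ~50ms between tries):

```cpp
#include <cassert>
#include <functional>

// Retry a predicate up to triesLeft times; report whether it ever
// succeeded. A real harness would wait between attempts, as the
// layout test does with setTimeout(..., 50).
bool attempt(int triesLeft, const std::function<bool()>& check)
{
    for (; triesLeft > 0; --triesLeft) {
        if (check())
            return true;
    }
    return false;
}
```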

git-svn-id: https://svn.webkit.org/repository/webkit/trunk@213880 268f45cc-cd09-0410-ab3c-d52691b4dbfc

20 files changed:
LayoutTests/ChangeLog
LayoutTests/fast/mediastream/MediaStream-video-element-video-tracks-disabled-then-enabled-expected.txt
LayoutTests/fast/mediastream/MediaStream-video-element-video-tracks-disabled-then-enabled.html
Source/WebCore/ChangeLog
Source/WebCore/Modules/mediastream/CanvasCaptureMediaStreamTrack.cpp
Source/WebCore/Modules/mediastream/CanvasCaptureMediaStreamTrack.h
Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h
Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm
Source/WebCore/platform/mediastream/MediaStreamPrivate.cpp
Source/WebCore/platform/mediastream/MediaStreamPrivate.h
Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.cpp
Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.h
Source/WebCore/platform/mediastream/RealtimeMediaSource.h
Source/WebCore/platform/mediastream/mac/AVMediaCaptureSource.mm
Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.h
Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm
Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.cpp
Source/WebCore/platform/mediastream/mac/RealtimeIncomingVideoSource.h
Source/WebCore/platform/mock/MockRealtimeVideoSource.cpp
Source/WebCore/platform/mock/MockRealtimeVideoSource.h

index bb07496..c4f2a19 100644 (file)
@@ -1,3 +1,15 @@
+2017-03-13  Eric Carlson  <eric.carlson@apple.com>
+
+        [MediaStream] Move paintCurrentFrameInContext from RealtimeMediaSources to MediaPlayer
+        https://bugs.webkit.org/show_bug.cgi?id=169474
+        <rdar://problem/30976747>
+
+        Reviewed by Youenn Fablet.
+
+        * fast/mediastream/MediaStream-video-element-video-tracks-disabled-then-enabled-expected.txt:
+        * fast/mediastream/MediaStream-video-element-video-tracks-disabled-then-enabled.html: Fix 
+        bug uncovered by patch.
+
 2017-03-13  Ryan Haddad  <ryanhaddad@apple.com>
 
         Skip WebGPU tests on ios-simulator.
index c9f6b06..0287ca4 100644 (file)
@@ -7,25 +7,26 @@ PASS mediaDevices.getUserMedia generated a stream successfully.
 video.src = window.URL.createObjectURL(mediaStream)
 
  === beginning round of pixel tests ===
-PASS pixel was black
+PASS pixel was black.
 
  === all video tracks disabled ===
 PASS pixel was black.
 
- === video track reenabled ===
-PASS pixel was white.
+ === video track reenabled, should NOT render current frame ===
+PASS pixel was black.
 
  ===== play video =====
 video.play()
 
  === beginning round of pixel tests ===
-PASS pixel was black
+PASS pixel was white.
 
  === all video tracks disabled ===
 PASS pixel was black.
 
- === video track reenabled ===
+ === video track reenabled, should render current frame ===
 PASS pixel was white.
+
 PASS successfullyParsed is true
 
 TEST COMPLETE
index 14f4ee6..9e21d37 100644 (file)
         return pixel[0] === 255 && pixel[1] === 255 && pixel[2] === 255 && pixel[3] === 255;
     }
 
-    function attempt(numberOfTries, call, callback, successMessage)
+    function canvasShouldBeBlack()
+    {
+        return !(mediaStream.getVideoTracks()[0].enabled && havePlayed);
+    }
+    
+    function attempt(numberOfTries, call, callback)
     {
         if (numberOfTries <= 0) {
             testFailed('Pixel check did not succeed after multiple tries.');
 
         let attemptSucceeded = call();
         if (attemptSucceeded) {
-            testPassed(successMessage);
+            testPassed(canvasShouldBeBlack() ? 'pixel was black.' : 'pixel was white.');
             callback();
 
             return;
         }
         
-        setTimeout(() => { attempt(--numberOfTries, call, callback, successMessage); }, 50);
+        setTimeout(() => { attempt(--numberOfTries, call, callback); }, 50);
     }
 
     function repeatWithVideoPlayingAndFinishTest()
         if (video.paused) {
             debug('<br> ===== play video =====');
             evalAndLog('video.play()');
+            havePlayed = true;
             beginTestRound();
-        } else
+        } else {
+            debug('');
             video.pause();
             finishJSTest();
+        }
     }
 
     function reenableTrack()
     {
         mediaStream.getVideoTracks()[0].enabled = true;
-        debug('<br> === video track reenabled ===');
+        debug(`<br> === video track reenabled, should${havePlayed ? "" : " NOT"} render current frame ===`);
 
         // The video is not guaranteed to render non-black frames before the canvas is drawn to and the pixels are checked.
         // A timeout is used to ensure that the pixel check is done after the video renders non-black frames.
-        attempt(10, checkPixels, repeatWithVideoPlayingAndFinishTest, 'pixel was white.');
+        attempt(10, checkPixels, repeatWithVideoPlayingAndFinishTest);
     }
 
     function checkPixels()
     {
         context.clearRect(0, 0, canvas.width, canvas.height);
         buffer = context.getImageData(30, 242, 1, 1).data;
-        if(!isPixelTransparent(buffer)) {
+        if (!isPixelTransparent(buffer))
             testFailed('pixel was not transparent after clearing canvas.');
-        }
 
         context.drawImage(video, 0, 0, canvas.width, canvas.height);
         buffer = context.getImageData(30, 242, 1, 1).data;
 
-        if (mediaStream.getVideoTracks()[0].enabled && havePlayed)
+        if (!canvasShouldBeBlack())
             return isPixelWhite(buffer);
         else
             return isPixelBlack(buffer);
         
         // The video is not guaranteed to render black frames before the canvas is drawn to and the pixels are checked.
         // A timeout is used to ensure that the pixel check is done after the video renders black frames.
-        attempt(10, checkPixels, reenableTrack, 'pixel was black.');
+        attempt(10, checkPixels, reenableTrack);
     }
 
     function beginTestRound()
     {
         debug('<br> === beginning round of pixel tests ===');
-        attempt(1, checkPixels, disableAllTracks, 'pixel was black');
+        attempt(10, checkPixels, disableAllTracks);
     }
 
     function canplay()
index c257d44..2e4a877 100644 (file)
@@ -1,3 +1,112 @@
+2017-03-13  Eric Carlson  <eric.carlson@apple.com>
+
+        [MediaStream] Move paintCurrentFrameInContext from RealtimeMediaSources to MediaPlayer
+        https://bugs.webkit.org/show_bug.cgi?id=169474
+        <rdar://problem/30976747>
+
+        Reviewed by Youenn Fablet.
+
+        Every video capture source has extremely similar code to render the current frame to
+        a graphics context. Because the media player gets every video sample buffer, have it
+        hang onto the most recent frame so it can implement paintCurrentFrameInContext directly.
+        Fix an existing race condition that occasionally caused the readyState to advance to 
+        "have enough data" before a video was ready to paint by defining a MediaStreamTrackPrivate
+        readyState and observing that.
+
+        No new tests, covered by existing tests. These changes uncovered a bug in 
+        fast/mediastream/MediaStream-video-element-video-tracks-disabled-then-enabled.html, which 
+        was updated.
+
+        * Modules/mediastream/CanvasCaptureMediaStreamTrack.cpp:
+        (WebCore::CanvasCaptureMediaStreamTrack::Source::captureCanvas):
+        (WebCore::CanvasCaptureMediaStreamTrack::Source::paintCurrentFrameInContext): Deleted.
+        (WebCore::CanvasCaptureMediaStreamTrack::Source::currentFrameImage): Deleted.
+        * Modules/mediastream/CanvasCaptureMediaStreamTrack.h:
+
+        * platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h:
+        * platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm:
+        (-[WebAVSampleBufferStatusChangeListener observeValueForKeyPath:ofObject:change:context:]):
+        Drive-by change - don't pass status to parent callback, it is a property of the layer.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::isAvailable): Drive-by cleanup - we don't
+        use AVSampleBufferRenderSynchronizer so don't fail if it isn't available.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSample): Hang onto new frame,
+        invalidate cached image, update readyState.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::layerStatusDidChange): No more "updatePausedImage".
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::ensureLayer): Drive-by cleanup - Add an early
+         return if there is no need for a layer.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyLayer): renderingModeChanged -> updateRenderingMode.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::currentDisplayMode): Minor cleanup.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateDisplayMode): Renamed from renderingModeChanged,
+        add a bool return to signal when the mode changes.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::play): No more m_haveEverPlayed. Update display
+        mode immediately.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::pause): No more paused image.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::currentReadyState): Only return HaveNothing, HaveMetadata,
+        or HaveEnoughData. Don't return HaveEnoughData until all enabled tracks are providing data and never
+        drop back to HaveMetadata.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateRenderingMode): Renamed from renderingModeChanged.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::characteristicsChanged): Update intrinsic
+        size directly.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::sampleBufferUpdated): No more m_hasReceivedMedia.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::readyStateChanged): Ditto.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::checkSelectedVideoTrack): Reset imagePainter
+        when active video track changes.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateCurrentFrameImage): Deleted.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::paintCurrentFrameInContext): Paint current
+        frame image.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::CurrentFramePainter::reset): New.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::shouldEnqueueVideoSampleBuffer): Deleted.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updatePausedImage): Deleted.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateIntrinsicSize): Deleted.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::renderingModeChanged): Deleted.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::audioSamplesAvailable): Deleted.
+
+        * platform/mediastream/MediaStreamPrivate.cpp:
+        (WebCore::MediaStreamPrivate::paintCurrentFrameInContext): Deleted.
+        (WebCore::MediaStreamPrivate::currentFrameImage): Deleted.
+        * platform/mediastream/MediaStreamPrivate.h:
+
+        * platform/mediastream/MediaStreamTrackPrivate.cpp:
+        (WebCore::MediaStreamTrackPrivate::MediaStreamTrackPrivate):
+        (WebCore::MediaStreamTrackPrivate::endTrack): Update readyState.
+        (WebCore::MediaStreamTrackPrivate::clone): Clone readyState.
+        (WebCore::MediaStreamTrackPrivate::sourceStopped): Update readyState.
+        (WebCore::MediaStreamTrackPrivate::videoSampleAvailable): Ditto.
+        (WebCore::MediaStreamTrackPrivate::audioSamplesAvailable): Ditto.
+        (WebCore::MediaStreamTrackPrivate::updateReadyState): New, update readyState and notify observers.
+        (WebCore::MediaStreamTrackPrivate::paintCurrentFrameInContext): Deleted.
+        * platform/mediastream/MediaStreamTrackPrivate.h:
+
+        * platform/mediastream/MediaStreamTrackPrivate.cpp:
+        (WebCore::MediaStreamTrackPrivate::paintCurrentFrameInContext): Deleted.
+        * platform/mediastream/RealtimeMediaSource.h:
+        (WebCore::RealtimeMediaSource::currentFrameImage): Deleted.
+        (WebCore::RealtimeMediaSource::paintCurrentFrameInContext): Deleted.
+
+        * platform/mediastream/mac/AVMediaCaptureSource.mm:
+        (-[WebCoreAVMediaCaptureSourceObserver disconnect]): Drive-by fix - clear m_callback
+        after calling removeNotificationObservers.
+        (-[WebCoreAVMediaCaptureSourceObserver removeNotificationObservers]): Drive-by fix - remove 
+        the correct listener.
+        (-[WebCoreAVMediaCaptureSourceObserver endSessionInterrupted:]):
+
+        * platform/mediastream/mac/AVVideoCaptureSource.h:
+        * platform/mediastream/mac/AVVideoCaptureSource.mm:
+        (WebCore::AVVideoCaptureSource::currentFrameImage): Deleted.
+        (WebCore::AVVideoCaptureSource::currentFrameCGImage): Deleted.
+        (WebCore::AVVideoCaptureSource::paintCurrentFrameInContext): Deleted.
+
+        * platform/mediastream/mac/RealtimeIncomingVideoSource.cpp:
+        (WebCore::drawImage): Deleted.
+        (WebCore::RealtimeIncomingVideoSource::currentFrameImage): Deleted.
+        (WebCore::RealtimeIncomingVideoSource::paintCurrentFrameInContext): Deleted.
+        * platform/mediastream/mac/RealtimeIncomingVideoSource.h:
+
+        * platform/mock/MockRealtimeVideoSource.cpp:
+        (WebCore::MockRealtimeVideoSource::paintCurrentFrameInContext): Deleted.
+        (WebCore::MockRealtimeVideoSource::currentFrameImage): Deleted.
+        * platform/mock/MockRealtimeVideoSource.h:
+
 2017-03-13  Carlos Alberto Lopez Perez  <clopez@igalia.com>
 
         [GTK][SOUP] Fix build after r213877
index eb43bb0..f3b809c 100644 (file)
@@ -142,9 +142,6 @@ void CanvasCaptureMediaStreamTrack::Source::captureCanvas()
     if (!m_canvas->originClean())
         return;
 
-    // FIXME: This is probably not efficient.
-    m_currentImage = m_canvas->copiedImage();
-
     auto sample = m_canvas->toMediaSample();
     if (!sample)
         return;
@@ -152,31 +149,6 @@ void CanvasCaptureMediaStreamTrack::Source::captureCanvas()
     videoSampleAvailable(*sample);
 }
 
-void CanvasCaptureMediaStreamTrack::Source::paintCurrentFrameInContext(GraphicsContext& context, const FloatRect& rect)
-{
-    if (!m_canvas)
-        return;
-
-    if (context.paintingDisabled())
-        return;
-
-    auto image = currentFrameImage();
-    if (!image)
-        return;
-
-    FloatRect fullRect(0, 0, m_canvas->width(), m_canvas->height());
-
-    GraphicsContextStateSaver stateSaver(context);
-    context.setImageInterpolationQuality(InterpolationLow);
-    IntRect paintRect(IntPoint(0, 0), IntSize(rect.width(), rect.height()));
-    context.drawImage(*image, rect);
-}
-
-RefPtr<Image> CanvasCaptureMediaStreamTrack::Source::currentFrameImage()
-{
-    return m_currentImage;
-}
-
 }
 
 #endif // ENABLE(MEDIA_STREAM)
index 619e864..a3824b6 100644 (file)
@@ -64,8 +64,6 @@ private:
         bool isProducingData() const { return m_isProducingData; }
         RefPtr<RealtimeMediaSourceCapabilities> capabilities() const final { return nullptr; }
         const RealtimeMediaSourceSettings& settings() const final { return m_settings; }
-        RefPtr<Image> currentFrameImage() final;
-        void paintCurrentFrameInContext(GraphicsContext&, const FloatRect&) final;
         bool applySize(const IntSize&) final { return true; }
 
         void captureCanvas();
index c669e72..1683ed3 100644 (file)
@@ -49,6 +49,7 @@ class AudioTrackPrivateMediaStreamCocoa;
 class AVVideoCaptureSource;
 class Clock;
 class MediaSourcePrivateClient;
+class PixelBufferConformerCV;
 class VideoTrackPrivateMediaStream;
 
 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
@@ -77,7 +78,7 @@ public:
     void ensureLayer();
     void destroyLayer();
 
-    void layerStatusDidChange(AVSampleBufferDisplayLayer*, NSNumber*);
+    void layerStatusDidChange(AVSampleBufferDisplayLayer*);
 
 private:
     // MediaPlayerPrivateInterface
@@ -135,7 +136,6 @@ private:
     MediaTime calculateTimelineOffset(const MediaSample&, double);
     
     void enqueueVideoSample(MediaStreamTrackPrivate&, MediaSample&);
-    bool shouldEnqueueVideoSampleBuffer() const;
     void flushAndRemoveVideoSampleBuffers();
     void requestNotificationWhenReadyForVideoData();
 
@@ -161,9 +161,8 @@ private:
     MediaPlayer::ReadyState currentReadyState();
     void updateReadyState();
 
-    void updateIntrinsicSize(const FloatSize&);
     void updateTracks();
-    void renderingModeChanged();
+    void updateRenderingMode();
     void checkSelectedVideoTrack();
 
     void scheduleDeferredTask(Function<void ()>&&);
@@ -175,8 +174,8 @@ private:
         LivePreview,
     };
     DisplayMode currentDisplayMode() const;
-    void updateDisplayMode();
-    void updatePausedImage();
+    bool updateDisplayMode();
+    void updateCurrentFrameImage();
 
     // MediaStreamPrivate::Observer
     void activeStatusChanged() override;
@@ -190,7 +189,7 @@ private:
     void trackSettingsChanged(MediaStreamTrackPrivate&) override { };
     void trackEnabledChanged(MediaStreamTrackPrivate&) override { };
     void sampleBufferUpdated(MediaStreamTrackPrivate&, MediaSample&) override;
-    void audioSamplesAvailable(MediaStreamTrackPrivate&) override;
+    void readyStateChanged(MediaStreamTrackPrivate&) override;
 
 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
     void setVideoFullscreenLayer(PlatformLayer*, std::function<void()> completionHandler) override;
@@ -212,7 +211,16 @@ private:
     std::unique_ptr<Clock> m_clock;
 
     MediaTime m_pausedTime;
-    RetainPtr<CGImageRef> m_pausedImage;
+
+    struct CurrentFramePainter {
+        CurrentFramePainter() = default;
+        void reset();
+
+        RetainPtr<CGImageRef> cgImage;
+        RefPtr<MediaSample> mediaSample;
+        std::unique_ptr<PixelBufferConformerCV> pixelBufferConformer;
+    };
+    CurrentFramePainter m_imagePainter;
 
     HashMap<String, RefPtr<AudioTrackPrivateMediaStreamCocoa>> m_audioTrackMap;
     HashMap<String, RefPtr<VideoTrackPrivateMediaStream>> m_videoTrackMap;
@@ -220,17 +228,16 @@ private:
 
     MediaPlayer::NetworkState m_networkState { MediaPlayer::Empty };
     MediaPlayer::ReadyState m_readyState { MediaPlayer::HaveNothing };
+    MediaPlayer::ReadyState m_previousReadyState { MediaPlayer::HaveNothing };
     FloatSize m_intrinsicSize;
     float m_volume { 1 };
     DisplayMode m_displayMode { None };
     bool m_playing { false };
     bool m_muted { false };
-    bool m_haveEverPlayed { false };
     bool m_ended { false };
     bool m_hasEverEnqueuedVideoFrame { false };
-    bool m_hasReceivedMedia { false };
-    bool m_isFrameDisplayed { false };
     bool m_pendingSelectedTrackCheck { false };
+    bool m_shouldDisplayFirstVideoFrame { false };
 
 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
     std::unique_ptr<VideoFullscreenLayerManager> m_videoFullscreenLayerManager;
index 99c0a2c..9210d90 100644 (file)
 #import "AVFoundationSPI.h"
 #import "AudioTrackPrivateMediaStreamCocoa.h"
 #import "Clock.h"
-#import "CoreMediaSoftLink.h"
-#import "GraphicsContext.h"
+#import "GraphicsContextCG.h"
 #import "Logging.h"
 #import "MediaStreamPrivate.h"
 #import "MediaTimeAVFoundation.h"
+#import "PixelBufferConformerCV.h"
 #import "VideoTrackPrivateMediaStream.h"
 #import <AVFoundation/AVSampleBufferDisplayLayer.h>
 #import <QuartzCore/CALayer.h>
 
 #pragma mark - Soft Linking
 
+#import "CoreMediaSoftLink.h"
+#import "CoreVideoSoftLink.h"
+
 SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation)
 
 SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVSampleBufferDisplayLayer)
-SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVSampleBufferRenderSynchronizer)
-
-SOFT_LINK_CONSTANT(AVFoundation, AVAudioTimePitchAlgorithmSpectral, NSString*)
-SOFT_LINK_CONSTANT(AVFoundation, AVAudioTimePitchAlgorithmVarispeed, NSString*)
 
 #define AVAudioTimePitchAlgorithmSpectral getAVAudioTimePitchAlgorithmSpectral()
 #define AVAudioTimePitchAlgorithmVarispeed getAVAudioTimePitchAlgorithmVarispeed()
@@ -134,11 +133,11 @@ using namespace WebCore;
         ASSERT(_layers.contains(layer.get()));
         ASSERT([keyPath isEqualToString:@"status"]);
 
-        callOnMainThread([protectedSelf = WTFMove(protectedSelf), layer = WTFMove(layer), status = WTFMove(status)] {
+        callOnMainThread([protectedSelf = WTFMove(protectedSelf), layer = WTFMove(layer)] {
             if (!protectedSelf->_parent)
                 return;
 
-            protectedSelf->_parent->layerStatusDidChange(layer.get(), status.get());
+            protectedSelf->_parent->layerStatusDidChange(layer.get());
         });
 
     } else
@@ -201,11 +200,6 @@ bool MediaPlayerPrivateMediaStreamAVFObjC::isAvailable()
     if (!AVFoundationLibrary() || !isCoreMediaFrameworkAvailable() || !getAVSampleBufferDisplayLayerClass())
         return false;
 
-#if PLATFORM(MAC)
-    if (!getAVSampleBufferRenderSynchronizerClass())
-        return false;
-#endif
-
     return true;
 }
 
@@ -281,9 +275,14 @@ void MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSample(MediaStreamTrackPr
     if (&track != m_mediaStreamPrivate->activeVideoTrack())
         return;
 
-    m_hasReceivedMedia = true;
-    updateReadyState();
-    if (m_displayMode != LivePreview || (m_displayMode == PausedImage && m_isFrameDisplayed))
+    if (!m_imagePainter.mediaSample || m_displayMode != PausedImage) {
+        m_imagePainter.mediaSample = &sample;
+        m_imagePainter.cgImage = nullptr;
+        if (m_readyState < MediaPlayer::ReadyState::HaveEnoughData)
+            updateReadyState();
+    }
+
+    if (m_displayMode != LivePreview || (m_displayMode == PausedImage && m_imagePainter.mediaSample))
         return;
 
     auto videoTrack = m_videoTrackMap.get(track.id());
@@ -306,11 +305,8 @@ void MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSample(MediaStreamTrackPr
         [m_sampleBufferDisplayLayer enqueueSampleBuffer:sample.platformSample().sample.cmSampleBuffer];
     }
 
-    m_isFrameDisplayed = true;
     if (!m_hasEverEnqueuedVideoFrame) {
         m_hasEverEnqueuedVideoFrame = true;
-        if (m_displayMode == PausedImage)
-            updatePausedImage();
         m_player->firstVideoFrameAvailable();
     }
 }
@@ -338,11 +334,12 @@ AudioSourceProvider* MediaPlayerPrivateMediaStreamAVFObjC::audioSourceProvider()
     return nullptr;
 }
 
-void MediaPlayerPrivateMediaStreamAVFObjC::layerStatusDidChange(AVSampleBufferDisplayLayer* layer, NSNumber* status)
+void MediaPlayerPrivateMediaStreamAVFObjC::layerStatusDidChange(AVSampleBufferDisplayLayer* layer)
 {
-    if (status.integerValue != AVQueuedSampleBufferRenderingStatusRendering)
-        return;
+    LOG(Media, "MediaPlayerPrivateMediaStreamAVFObjC::layerStatusDidChange(%p) - status = %d", this, (int)layer.status);
 
+    if (layer.status != AVQueuedSampleBufferRenderingStatusRendering)
+        return;
     if (!m_sampleBufferDisplayLayer || !m_activeVideoTrack || layer != m_sampleBufferDisplayLayer)
         return;
 
@@ -357,21 +354,9 @@ void MediaPlayerPrivateMediaStreamAVFObjC::flushRenderers()
         [m_sampleBufferDisplayLayer flush];
 }
 
-bool MediaPlayerPrivateMediaStreamAVFObjC::shouldEnqueueVideoSampleBuffer() const
-{
-    if (m_displayMode == LivePreview)
-        return true;
-
-    if (m_displayMode == PausedImage && !m_isFrameDisplayed)
-        return true;
-
-    return false;
-}
-
 void MediaPlayerPrivateMediaStreamAVFObjC::flushAndRemoveVideoSampleBuffers()
 {
     [m_sampleBufferDisplayLayer flushAndRemoveImage];
-    m_isFrameDisplayed = false;
 }
 
 void MediaPlayerPrivateMediaStreamAVFObjC::ensureLayer()
@@ -379,14 +364,22 @@ void MediaPlayerPrivateMediaStreamAVFObjC::ensureLayer()
     if (m_sampleBufferDisplayLayer)
         return;
 
+    if (!m_mediaStreamPrivate || !m_mediaStreamPrivate->activeVideoTrack() || !m_mediaStreamPrivate->activeVideoTrack()->enabled())
+        return;
+
     m_sampleBufferDisplayLayer = adoptNS([allocAVSampleBufferDisplayLayerInstance() init]);
+    if (!m_sampleBufferDisplayLayer) {
+        LOG_ERROR("MediaPlayerPrivateMediaStreamAVFObjC::ensureLayers: +[AVSampleBufferDisplayLayer alloc] failed.");
+        return;
+    }
+
 #ifndef NDEBUG
     [m_sampleBufferDisplayLayer setName:@"MediaPlayerPrivateMediaStreamAVFObjC AVSampleBufferDisplayLayer"];
 #endif
     m_sampleBufferDisplayLayer.get().backgroundColor = cachedCGColor(Color::black);
     [m_statusChangeListener beginObservingLayer:m_sampleBufferDisplayLayer.get()];
 
-    renderingModeChanged();
+    updateRenderingMode();
     
 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
     m_videoFullscreenLayerManager->setVideoLayer(m_sampleBufferDisplayLayer.get(), snappedIntRect(m_player->client().mediaPlayerContentBoxRect()).size());
@@ -405,7 +398,7 @@ void MediaPlayerPrivateMediaStreamAVFObjC::destroyLayer()
         [m_sampleBufferDisplayLayer flush];
     }
 
-    renderingModeChanged();
+    updateRenderingMode();
     
 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
     m_videoFullscreenLayerManager->didDestroyVideoLayer();
@@ -480,8 +473,10 @@ MediaPlayerPrivateMediaStreamAVFObjC::DisplayMode MediaPlayerPrivateMediaStreamA
     if (m_ended || m_intrinsicSize.isEmpty() || !metaDataAvailable() || !m_sampleBufferDisplayLayer)
         return None;
 
-    if (m_mediaStreamPrivate->activeVideoTrack() && !m_mediaStreamPrivate->activeVideoTrack()->enabled())
-        return PaintItBlack;
+    if (auto* track = m_mediaStreamPrivate->activeVideoTrack()) {
+        if (!m_shouldDisplayFirstVideoFrame || !track->enabled() || track->muted())
+            return PaintItBlack;
+    }
 
     if (m_playing) {
         if (!m_mediaStreamPrivate->isProducingData())
@@ -492,30 +487,19 @@ MediaPlayerPrivateMediaStreamAVFObjC::DisplayMode MediaPlayerPrivateMediaStreamA
     return PausedImage;
 }
 
-void MediaPlayerPrivateMediaStreamAVFObjC::updateDisplayMode()
+bool MediaPlayerPrivateMediaStreamAVFObjC::updateDisplayMode()
 {
     DisplayMode displayMode = currentDisplayMode();
 
     if (displayMode == m_displayMode)
-        return;
+        return false;
+
     m_displayMode = displayMode;
 
     if (m_displayMode < PausedImage && m_sampleBufferDisplayLayer)
         flushAndRemoveVideoSampleBuffers();
-}
-
-void MediaPlayerPrivateMediaStreamAVFObjC::updatePausedImage()
-{
-    if (m_displayMode < PausedImage)
-        return;
 
-    RefPtr<Image> image = m_mediaStreamPrivate->currentFrameImage();
-    ASSERT(image);
-    if (!image)
-        return;
-
-    m_pausedImage = image->nativeImage();
-    ASSERT(m_pausedImage);
+    return true;
 }
 
 void MediaPlayerPrivateMediaStreamAVFObjC::play()
@@ -532,9 +516,10 @@ void MediaPlayerPrivateMediaStreamAVFObjC::play()
     for (const auto& track : m_audioTrackMap.values())
         track->play();
 
-    m_haveEverPlayed = true;
+    m_shouldDisplayFirstVideoFrame = true;
+    updateDisplayMode();
+
     scheduleDeferredTask([this] {
-        updateDisplayMode();
         updateReadyState();
     });
 }
@@ -553,7 +538,6 @@ void MediaPlayerPrivateMediaStreamAVFObjC::pause()
         track->pause();
 
     updateDisplayMode();
-    updatePausedImage();
     flushRenderers();
 }
 
@@ -632,20 +616,26 @@ MediaPlayer::ReadyState MediaPlayerPrivateMediaStreamAVFObjC::readyState() const
 
 MediaPlayer::ReadyState MediaPlayerPrivateMediaStreamAVFObjC::currentReadyState()
 {
-    if (!m_mediaStreamPrivate)
+    if (!m_mediaStreamPrivate || !m_mediaStreamPrivate->active() || !m_mediaStreamPrivate->tracks().size())
         return MediaPlayer::ReadyState::HaveNothing;
 
-    // https://w3c.github.io/mediacapture-main/ Change 8. from July 4, 2013.
-    // FIXME: Only update readyState to HAVE_ENOUGH_DATA when all active tracks have sent a sample buffer.
-    if (m_mediaStreamPrivate->active() && m_hasReceivedMedia)
-        return MediaPlayer::ReadyState::HaveEnoughData;
+    bool allTracksAreLive = true;
+    for (auto& track : m_mediaStreamPrivate->tracks()) {
+        if (!track->enabled() || track->readyState() != MediaStreamTrackPrivate::ReadyState::Live) {
+            allTracksAreLive = false;
+            break;
+        }
 
-    updateDisplayMode();
+        if (track == m_mediaStreamPrivate->activeVideoTrack() && !m_imagePainter.mediaSample) {
+            allTracksAreLive = false;
+            break;
+        }
+    }
 
-    if (m_displayMode == PausedImage)
-        return MediaPlayer::ReadyState::HaveCurrentData;
+    if (!allTracksAreLive && m_previousReadyState == MediaPlayer::ReadyState::HaveNothing)
+        return MediaPlayer::ReadyState::HaveMetadata;
 
-    return MediaPlayer::ReadyState::HaveMetadata;
+    return MediaPlayer::ReadyState::HaveEnoughData;
 }
 
 void MediaPlayerPrivateMediaStreamAVFObjC::updateReadyState()
@@ -674,17 +664,11 @@ void MediaPlayerPrivateMediaStreamAVFObjC::activeStatusChanged()
     });
 }
 
-void MediaPlayerPrivateMediaStreamAVFObjC::updateIntrinsicSize(const FloatSize& size)
+void MediaPlayerPrivateMediaStreamAVFObjC::updateRenderingMode()
 {
-    if (size == m_intrinsicSize)
+    if (!updateDisplayMode())
         return;
 
-    m_intrinsicSize = size;
-}
-
-void MediaPlayerPrivateMediaStreamAVFObjC::renderingModeChanged()
-{
-    updateDisplayMode();
     scheduleDeferredTask([this] {
         if (m_player)
             m_player->client().mediaPlayerRenderingModeChanged(m_player);
@@ -698,7 +682,7 @@ void MediaPlayerPrivateMediaStreamAVFObjC::characteristicsChanged()
 
     FloatSize intrinsicSize = m_mediaStreamPrivate->intrinsicSize();
     if (intrinsicSize.height() != m_intrinsicSize.height() || intrinsicSize.width() != m_intrinsicSize.width()) {
-        updateIntrinsicSize(intrinsicSize);
+        m_intrinsicSize = intrinsicSize;
         sizeChanged = true;
     }
 
@@ -734,12 +718,7 @@ void MediaPlayerPrivateMediaStreamAVFObjC::sampleBufferUpdated(MediaStreamTrackP
     ASSERT(mediaSample.platformSample().type == PlatformSample::CMSampleBufferType);
     ASSERT(m_mediaStreamPrivate);
 
-    if (!m_hasReceivedMedia) {
-        m_hasReceivedMedia = true;
-        updateReadyState();
-    }
-
-    if (!m_playing || streamTime().toDouble() < 0)
+    if (streamTime().toDouble() < 0)
         return;
 
     switch (track.type()) {
@@ -755,12 +734,8 @@ void MediaPlayerPrivateMediaStreamAVFObjC::sampleBufferUpdated(MediaStreamTrackP
     }
 }
 
-void MediaPlayerPrivateMediaStreamAVFObjC::audioSamplesAvailable(MediaStreamTrackPrivate&)
+void MediaPlayerPrivateMediaStreamAVFObjC::readyStateChanged(MediaStreamTrackPrivate&)
 {
-    if (m_hasReceivedMedia)
-        return;
-    m_hasReceivedMedia = true;
-
     scheduleDeferredTask([this] {
         updateReadyState();
     });
@@ -834,6 +809,7 @@ void MediaPlayerPrivateMediaStreamAVFObjC::checkSelectedVideoTrack()
 
     m_pendingSelectedTrackCheck = true;
     scheduleDeferredTask([this] {
+        auto oldVideoTrack = m_activeVideoTrack;
         bool hideVideoLayer = true;
         m_activeVideoTrack = nullptr;
         if (m_mediaStreamPrivate->activeVideoTrack()) {
@@ -847,9 +823,12 @@ void MediaPlayerPrivateMediaStreamAVFObjC::checkSelectedVideoTrack()
             }
         }
 
+        if (oldVideoTrack != m_activeVideoTrack)
+            m_imagePainter.reset();
         ensureLayer();
         m_sampleBufferDisplayLayer.get().hidden = hideVideoLayer;
         m_pendingSelectedTrackCheck = false;
+        updateDisplayMode();
     });
 }
 
@@ -914,25 +893,40 @@ void MediaPlayerPrivateMediaStreamAVFObjC::paint(GraphicsContext& context, const
     paintCurrentFrameInContext(context, rect);
 }
 
-void MediaPlayerPrivateMediaStreamAVFObjC::paintCurrentFrameInContext(GraphicsContext& context, const FloatRect& rect)
+void MediaPlayerPrivateMediaStreamAVFObjC::updateCurrentFrameImage()
+{
+    if (m_imagePainter.cgImage || !m_imagePainter.mediaSample)
+        return;
+
+    if (!m_imagePainter.pixelBufferConformer)
+        m_imagePainter.pixelBufferConformer = std::make_unique<PixelBufferConformerCV>((CFDictionaryRef)@{ (NSString *)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA) });
+
+    ASSERT(m_imagePainter.pixelBufferConformer);
+    if (!m_imagePainter.pixelBufferConformer)
+        return;
+
+    auto pixelBuffer = static_cast<CVPixelBufferRef>(CMSampleBufferGetImageBuffer(m_imagePainter.mediaSample->platformSample().sample.cmSampleBuffer));
+    m_imagePainter.cgImage = m_imagePainter.pixelBufferConformer->createImageFromPixelBuffer(pixelBuffer);
+}
+
+void MediaPlayerPrivateMediaStreamAVFObjC::paintCurrentFrameInContext(GraphicsContext& context, const FloatRect& destRect)
 {
     if (m_displayMode == None || !metaDataAvailable() || context.paintingDisabled())
         return;
 
-    if (m_displayMode == LivePreview)
-        m_mediaStreamPrivate->paintCurrentFrameInContext(context, rect);
-    else {
-        GraphicsContextStateSaver stateSaver(context);
-        context.translate(rect.x(), rect.y() + rect.height());
-        context.scale(FloatSize(1, -1));
-        IntRect paintRect(IntPoint(0, 0), IntSize(rect.width(), rect.height()));
-        context.setImageInterpolationQuality(InterpolationLow);
-
-        if (m_displayMode == PausedImage && m_pausedImage)
-            CGContextDrawImage(context.platformContext(), CGRectMake(0, 0, paintRect.width(), paintRect.height()), m_pausedImage.get());
-        else
-            context.fillRect(paintRect, Color::black);
+    GraphicsContextStateSaver stateSaver(context);
+
+    if (m_displayMode != PaintItBlack && m_imagePainter.mediaSample)
+        updateCurrentFrameImage();
+
+    if (m_displayMode == PaintItBlack || !m_imagePainter.cgImage || !m_imagePainter.mediaSample) {
+        context.fillRect(IntRect(IntPoint(), IntSize(destRect.width(), destRect.height())), Color::black);
+        return;
     }
+
+    auto image = m_imagePainter.cgImage.get();
+    FloatRect imageRect(0, 0, CGImageGetWidth(image), CGImageGetHeight(image));
+    context.drawNativeImage(image, imageRect.size(), destRect, imageRect);
 }
 
 void MediaPlayerPrivateMediaStreamAVFObjC::acceleratedRenderingStateChanged()
@@ -959,6 +953,7 @@ void MediaPlayerPrivateMediaStreamAVFObjC::setReadyState(MediaPlayer::ReadyState
     if (m_readyState == readyState)
         return;
 
+    m_previousReadyState = m_readyState;
     m_readyState = readyState;
     characteristicsChanged();
 
@@ -985,6 +980,13 @@ void MediaPlayerPrivateMediaStreamAVFObjC::scheduleDeferredTask(Function<void ()
     });
 }
 
+void MediaPlayerPrivateMediaStreamAVFObjC::CurrentFramePainter::reset()
+{
+    cgImage = nullptr;
+    mediaSample = nullptr;
+    pixelBufferConformer = nullptr;
+}
+
 }
 
 #endif
index 21c0769..4206eed 100644
@@ -239,30 +239,6 @@ FloatSize MediaStreamPrivate::intrinsicSize() const
     return size;
 }
 
-void MediaStreamPrivate::paintCurrentFrameInContext(GraphicsContext& context, const FloatRect& rect)
-{
-    if (context.paintingDisabled())
-        return;
-
-    if (active() && m_activeVideoTrack)
-        m_activeVideoTrack->paintCurrentFrameInContext(context, rect);
-    else {
-        GraphicsContextStateSaver stateSaver(context);
-        context.translate(rect.x(), rect.y() + rect.height());
-        context.scale(FloatSize(1, -1));
-        IntRect paintRect(IntPoint(0, 0), IntSize(rect.width(), rect.height()));
-        context.fillRect(paintRect, Color::black);
-    }
-}
-
-RefPtr<Image> MediaStreamPrivate::currentFrameImage()
-{
-    if (!active() || !m_activeVideoTrack)
-        return nullptr;
-
-    return m_activeVideoTrack->source().currentFrameImage();
-}
-
 void MediaStreamPrivate::updateActiveVideoTrack()
 {
     m_activeVideoTrack = nullptr;
index a4596ff..111a524 100644
@@ -93,9 +93,6 @@ public:
     void stopProducingData();
     bool isProducingData() const;
 
-    RefPtr<Image> currentFrameImage();
-    void paintCurrentFrameInContext(GraphicsContext&, const FloatRect&);
-
     bool hasVideo() const;
     bool hasAudio() const;
     bool muted() const;
index 9936002..707bd65 100644
@@ -49,8 +49,6 @@ Ref<MediaStreamTrackPrivate> MediaStreamTrackPrivate::create(Ref<RealtimeMediaSo
 MediaStreamTrackPrivate::MediaStreamTrackPrivate(Ref<RealtimeMediaSource>&& source, String&& id)
     : m_source(WTFMove(source))
     , m_id(WTFMove(id))
-    , m_isEnabled(true)
-    , m_isEnded(false)
 {
     m_source->addObserver(*this);
 }
@@ -115,6 +113,7 @@ void MediaStreamTrackPrivate::endTrack()
     // only track using the source and it does stop, we will only call each observer's
     // trackEnded method once.
     m_isEnded = true;
+    updateReadyState();
 
     m_source->requestStop(this);
 
@@ -127,6 +126,7 @@ Ref<MediaStreamTrackPrivate> MediaStreamTrackPrivate::clone()
     auto clonedMediaStreamTrackPrivate = create(m_source.copyRef());
     clonedMediaStreamTrackPrivate->m_isEnabled = this->m_isEnabled;
     clonedMediaStreamTrackPrivate->m_isEnded = this->m_isEnded;
+    clonedMediaStreamTrackPrivate->updateReadyState();
 
     return clonedMediaStreamTrackPrivate;
 }
@@ -146,21 +146,6 @@ RefPtr<RealtimeMediaSourceCapabilities> MediaStreamTrackPrivate::capabilities()
     return m_source->capabilities();
 }
 
-void MediaStreamTrackPrivate::paintCurrentFrameInContext(GraphicsContext& context, const FloatRect& rect)
-{
-    if (context.paintingDisabled() || m_source->type() != RealtimeMediaSource::Type::Video || ended())
-        return;
-
-    if (!m_source->muted())
-        m_source->paintCurrentFrameInContext(context, rect);
-    else {
-        GraphicsContextStateSaver stateSaver(context);
-        context.translate(rect.x(), rect.y() + rect.height());
-        IntRect paintRect(IntPoint(0, 0), IntSize(rect.width(), rect.height()));
-        context.fillRect(paintRect, Color::black);
-    }
-}
-
 void MediaStreamTrackPrivate::applyConstraints(const MediaConstraints& constraints, RealtimeMediaSource::SuccessHandler successHandler, RealtimeMediaSource::FailureHandler failureHandler)
 {
     m_source->applyConstraints(constraints, successHandler, failureHandler);
@@ -177,6 +162,7 @@ void MediaStreamTrackPrivate::sourceStopped()
         return;
 
     m_isEnded = true;
+    updateReadyState();
 
     for (auto& observer : m_observers)
         observer->trackEnded(*this);
@@ -208,6 +194,11 @@ bool MediaStreamTrackPrivate::preventSourceFromStopping()
 
 void MediaStreamTrackPrivate::videoSampleAvailable(MediaSample& mediaSample)
 {
+    if (!m_haveProducedData) {
+        m_haveProducedData = true;
+        updateReadyState();
+    }
+
     mediaSample.setTrackID(id());
     for (auto& observer : m_observers)
         observer->sampleBufferUpdated(*this, mediaSample);
@@ -215,10 +206,33 @@ void MediaStreamTrackPrivate::videoSampleAvailable(MediaSample& mediaSample)
 
 void MediaStreamTrackPrivate::audioSamplesAvailable(const MediaTime&, const PlatformAudioData&, const AudioStreamDescription&, size_t)
 {
+    if (!m_haveProducedData) {
+        m_haveProducedData = true;
+        updateReadyState();
+    }
+
     for (auto& observer : m_observers)
         observer->audioSamplesAvailable(*this);
 }
 
+
+void MediaStreamTrackPrivate::updateReadyState()
+{
+    ReadyState state = ReadyState::None;
+
+    if (m_isEnded)
+        state = ReadyState::Ended;
+    else if (m_haveProducedData)
+        state = ReadyState::Live;
+
+    if (state == m_readyState)
+        return;
+
+    m_readyState = state;
+    for (auto& observer : m_observers)
+        observer->readyStateChanged(*this);
+}
+
 } // namespace WebCore
 
 #endif // ENABLE(MEDIA_STREAM)
index cf9b591..3eee035 100644
@@ -49,6 +49,7 @@ public:
         virtual void trackEnabledChanged(MediaStreamTrackPrivate&) = 0;
         virtual void sampleBufferUpdated(MediaStreamTrackPrivate&, MediaSample&) { };
         virtual void audioSamplesAvailable(MediaStreamTrackPrivate&) { };
+        virtual void readyStateChanged(MediaStreamTrackPrivate&) { };
     };
 
     static Ref<MediaStreamTrackPrivate> create(Ref<RealtimeMediaSource>&&);
@@ -93,6 +94,9 @@ public:
 
     void paintCurrentFrameInContext(GraphicsContext&, const FloatRect&);
 
+    enum class ReadyState { None, Live, Ended };
+    ReadyState readyState() const { return m_readyState; }
+
 private:
     MediaStreamTrackPrivate(Ref<RealtimeMediaSource>&&, String&& id);
 
@@ -105,12 +109,16 @@ private:
     void videoSampleAvailable(MediaSample&) final;
     void audioSamplesAvailable(const MediaTime&, const PlatformAudioData&, const AudioStreamDescription&, size_t) final;
 
+    void updateReadyState();
+
     Vector<Observer*> m_observers;
     Ref<RealtimeMediaSource> m_source;
 
     String m_id;
-    bool m_isEnabled;
-    bool m_isEnded;
+    ReadyState m_readyState { ReadyState::None };
+    bool m_isEnabled { true };
+    bool m_isEnded { false };
+    bool m_haveProducedData { false };
 };
 
 typedef Vector<RefPtr<MediaStreamTrackPrivate>> MediaStreamTrackPrivateVector;
index 29d3d83..1cfcae0 100644
@@ -141,9 +141,6 @@ public:
 
     virtual AudioSourceProvider* audioSourceProvider() { return nullptr; }
 
-    virtual RefPtr<Image> currentFrameImage() { return nullptr; }
-    virtual void paintCurrentFrameInContext(GraphicsContext&, const FloatRect&) { }
-
     void setWidth(int);
     void setHeight(int);
     const IntSize& size() const { return m_size; }
index 575d8f5..de2f8cb 100644
@@ -347,8 +347,8 @@ NSArray<NSString*>* sessionKVOProperties()
 - (void)disconnect
 {
     [NSObject cancelPreviousPerformRequestsWithTarget:self];
-    m_callback = 0;
     [self removeNotificationObservers];
+    m_callback = nullptr;
 }
 
 - (void)addNotificationObservers
@@ -368,8 +368,7 @@ NSArray<NSString*>* sessionKVOProperties()
 - (void)removeNotificationObservers
 {
 #if PLATFORM(IOS)
-    ASSERT(m_callback);
-    [[NSNotificationCenter defaultCenter] removeObserver:m_callback->session()];
+    [[NSNotificationCenter defaultCenter] removeObserver:self];
 #endif
 }
 
@@ -426,7 +425,7 @@ NSArray<NSString*>* sessionKVOProperties()
 
 - (void)endSessionInterrupted:(NSNotification*)notification
 {
-    LOG(Media, "WebCoreAVMediaCaptureSourceObserver::endSessionInterrupted(%p) ", self);
+    LOG(Media, "WebCoreAVMediaCaptureSourceObserver::endSessionInterrupted(%p)", self);
 
     if (m_callback)
         m_callback->captureSessionEndInterruption(notification);
index 2431fb7..802c3c1 100644
@@ -77,11 +77,6 @@ private:
     void captureOutputDidOutputSampleBufferFromConnection(AVCaptureOutput*, CMSampleBufferRef, AVCaptureConnection*) final;
     void processNewFrame(RetainPtr<CMSampleBufferRef>);
 
-    void paintCurrentFrameInContext(GraphicsContext&, const FloatRect&) final;
-
-    RetainPtr<CGImageRef> currentFrameCGImage();
-    RefPtr<Image> currentFrameImage() final;
-
     RetainPtr<NSString> m_pendingPreset;
     RetainPtr<CMSampleBufferRef> m_buffer;
     RetainPtr<CGImageRef> m_lastImage;
index 97f5d0f..aa40467 100644
@@ -103,7 +103,11 @@ using namespace WebCore;
 
 namespace WebCore {
 
+#if PLATFORM(MAC)
 const OSType videoCaptureFormat = kCVPixelFormatType_420YpCbCr8Planar;
+#else
+const OSType videoCaptureFormat = kCVPixelFormatType_420YpCbCr8BiPlanarFullRange;
+#endif
 
 RefPtr<AVMediaCaptureSource> AVVideoCaptureSource::create(AVCaptureDeviceTypedef* device, const AtomicString& id, const MediaConstraints* constraints, String& invalidConstraint)
 {
@@ -436,58 +440,6 @@ void AVVideoCaptureSource::captureOutputDidOutputSampleBufferFromConnection(AVCa
     });
 }
 
-RefPtr<Image> AVVideoCaptureSource::currentFrameImage()
-{
-    if (!currentFrameCGImage())
-        return nullptr;
-
-    FloatRect imageRect(0, 0, m_width, m_height);
-    std::unique_ptr<ImageBuffer> imageBuffer = ImageBuffer::create(imageRect.size(), Unaccelerated);
-
-    if (!imageBuffer)
-        return nullptr;
-
-    paintCurrentFrameInContext(imageBuffer->context(), imageRect);
-
-    return ImageBuffer::sinkIntoImage(WTFMove(imageBuffer));
-}
-
-RetainPtr<CGImageRef> AVVideoCaptureSource::currentFrameCGImage()
-{
-    if (m_lastImage)
-        return m_lastImage;
-
-    if (!m_buffer)
-        return nullptr;
-
-    CVPixelBufferRef pixelBuffer = static_cast<CVPixelBufferRef>(CMSampleBufferGetImageBuffer(m_buffer.get()));
-    ASSERT(CVPixelBufferGetPixelFormatType(pixelBuffer) == videoCaptureFormat);
-
-    if (!m_pixelBufferConformer)
-        m_pixelBufferConformer = std::make_unique<PixelBufferConformerCV>((CFDictionaryRef)@{ (NSString *)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA) });
-
-    ASSERT(m_pixelBufferConformer);
-    if (!m_pixelBufferConformer)
-        return nullptr;
-
-    m_lastImage = m_pixelBufferConformer->createImageFromPixelBuffer(pixelBuffer);
-
-    return m_lastImage;
-}
-
-void AVVideoCaptureSource::paintCurrentFrameInContext(GraphicsContext& context, const FloatRect& rect)
-{
-    if (context.paintingDisabled() || !currentFrameCGImage())
-        return;
-
-    GraphicsContextStateSaver stateSaver(context);
-    context.translate(rect.x(), rect.y() + rect.height());
-    context.scale(FloatSize(1, -1));
-    context.setImageInterpolationQuality(InterpolationLow);
-    IntRect paintRect(IntPoint(0, 0), IntSize(rect.width(), rect.height()));
-    CGContextDrawImage(context.platformContext(), CGRectMake(0, 0, paintRect.width(), paintRect.height()), m_lastImage.get());
-}
-
 NSString* AVVideoCaptureSource::bestSessionPresetForVideoDimensions(std::optional<int> width, std::optional<int> height) const
 {
     if (!width && !height)
index 33cf64c..193016a 100644
@@ -150,52 +150,6 @@ void RealtimeIncomingVideoSource::processNewSample(CMSampleBufferRef sample, uns
     videoSampleAvailable(MediaSampleAVFObjC::create(sample));
 }
 
-static inline void drawImage(ImageBuffer& imageBuffer, CGImageRef image, const FloatRect& rect)
-{
-    auto& context = imageBuffer.context();
-    GraphicsContextStateSaver stateSaver(context);
-    context.translate(rect.x(), rect.y() + rect.height());
-    context.scale(FloatSize(1, -1));
-    context.setImageInterpolationQuality(InterpolationLow);
-    IntRect paintRect(IntPoint(0, 0), IntSize(rect.width(), rect.height()));
-    CGContextDrawImage(context.platformContext(), CGRectMake(0, 0, paintRect.width(), paintRect.height()), image);
-}
-
-RefPtr<Image> RealtimeIncomingVideoSource::currentFrameImage()
-{
-    if (!m_buffer)
-        return nullptr;
-
-    FloatRect rect(0, 0, m_currentSettings.width(), m_currentSettings.height());
-    auto imageBuffer = ImageBuffer::create(rect.size(), Unaccelerated);
-
-    auto pixelBuffer = static_cast<CVPixelBufferRef>(CMSampleBufferGetImageBuffer(m_buffer.get()));
-    drawImage(*imageBuffer, m_conformer.createImageFromPixelBuffer(pixelBuffer).get(), rect);
-
-    return ImageBuffer::sinkIntoImage(WTFMove(imageBuffer));
-}
-
-void RealtimeIncomingVideoSource::paintCurrentFrameInContext(GraphicsContext& context, const FloatRect& rect)
-{
-    if (context.paintingDisabled())
-        return;
-
-    if (!m_buffer)
-        return;
-
-    // FIXME: Can we optimize here the painting?
-    FloatRect fullRect(0, 0, m_currentSettings.width(), m_currentSettings.height());
-    auto imageBuffer = ImageBuffer::create(fullRect.size(), Unaccelerated);
-
-    auto pixelBuffer = static_cast<CVPixelBufferRef>(CMSampleBufferGetImageBuffer(m_buffer.get()));
-    drawImage(*imageBuffer, m_conformer.createImageFromPixelBuffer(pixelBuffer).get(), fullRect);
-
-    GraphicsContextStateSaver stateSaver(context);
-    context.setImageInterpolationQuality(InterpolationLow);
-    IntRect paintRect(IntPoint(0, 0), IntSize(rect.width(), rect.height()));
-    context.drawImage(*imageBuffer->copyImage(DontCopyBackingStore), rect);
-}
-
 RefPtr<RealtimeMediaSourceCapabilities> RealtimeIncomingVideoSource::capabilities() const
 {
     return m_capabilities;
index 6859ddc..46ed691 100644
@@ -63,9 +63,6 @@ private:
     RealtimeMediaSourceSupportedConstraints& supportedConstraints();
 
     void processNewSample(CMSampleBufferRef, unsigned, unsigned);
-    RefPtr<Image> currentFrameImage() final;
-
-    void paintCurrentFrameInContext(GraphicsContext&, const FloatRect&) final;
 
     bool isProducingData() const final { return m_isProducingData && m_buffer; }
     bool applySize(const IntSize&) final { return true; }
index 56c607a..ab7f217 100644
@@ -357,26 +357,6 @@ ImageBuffer* MockRealtimeVideoSource::imageBuffer() const
     return m_imageBuffer.get();
 }
 
-void MockRealtimeVideoSource::paintCurrentFrameInContext(GraphicsContext& context, const FloatRect& rect)
-{
-    if (context.paintingDisabled() || !imageBuffer())
-        return;
-
-    GraphicsContextStateSaver stateSaver(context);
-    context.setImageInterpolationQuality(InterpolationLow);
-    IntRect paintRect(IntPoint(0, 0), IntSize(rect.width(), rect.height()));
-
-    context.drawImage(*m_imageBuffer->copyImage(DontCopyBackingStore), rect);
-}
-
-RefPtr<Image> MockRealtimeVideoSource::currentFrameImage()
-{
-    if (!imageBuffer())
-        return nullptr;
-
-    return m_imageBuffer->copyImage(DontCopyBackingStore);
-}
-
 } // namespace WebCore
 
 #endif // ENABLE(MEDIA_STREAM)
index 78d8f8f..81e13cb 100644
@@ -76,9 +76,6 @@ private:
     bool applyFacingMode(RealtimeMediaSourceSettings::VideoFacingMode) override { return true; }
     bool applyAspectRatio(double) override { return true; }
 
-    RefPtr<Image> currentFrameImage() override;
-    void paintCurrentFrameInContext(GraphicsContext&, const FloatRect&) override;
-
     void generateFrame();
 
     float m_baseFontSize { 0 };
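The readyState bookkeeping this patch adds to MediaStreamTrackPrivate is a small three-state machine: None until the source produces its first sample, then Live, and Ended once the track or its source stops, with observers notified only on actual transitions. A minimal standalone sketch of that state machine (hypothetical `TrackState` class mirroring the patch's logic, not the real WebCore type):

```cpp
#include <cassert>
#include <functional>
#include <utility>
#include <vector>

// Hypothetical, simplified mirror of the ready-state logic added to
// MediaStreamTrackPrivate in this patch; not the actual WebCore class.
enum class ReadyState { None, Live, Ended };

class TrackState {
public:
    using Observer = std::function<void(ReadyState)>;

    void addObserver(Observer observer) { m_observers.push_back(std::move(observer)); }

    // Mirrors videoSampleAvailable()/audioSamplesAvailable(): only the
    // first sample moves the track from None to Live.
    void sampleAvailable()
    {
        if (!m_haveProducedData) {
            m_haveProducedData = true;
            updateReadyState();
        }
    }

    // Mirrors endTrack()/sourceStopped().
    void end()
    {
        m_isEnded = true;
        updateReadyState();
    }

    ReadyState readyState() const { return m_readyState; }

private:
    void updateReadyState()
    {
        ReadyState state = ReadyState::None;
        if (m_isEnded)
            state = ReadyState::Ended;
        else if (m_haveProducedData)
            state = ReadyState::Live;

        if (state == m_readyState)
            return; // notify observers only on an actual transition

        m_readyState = state;
        for (auto& observer : m_observers)
            observer(m_readyState);
    }

    std::vector<Observer> m_observers;
    ReadyState m_readyState { ReadyState::None };
    bool m_isEnded { false };
    bool m_haveProducedData { false };
};
```

This is why the player's currentReadyState() can now wait for every track to report Live (and for a video frame to be cached) before advancing to HaveEnoughData, closing the race the ChangeLog describes.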