[MSE][Mac] Support painting MSE video-element to canvas
author jer.noble@apple.com <jer.noble@apple.com@268f45cc-cd09-0410-ab3c-d52691b4dbfc>
Sat, 20 May 2017 16:55:01 +0000 (16:55 +0000)
committer jer.noble@apple.com <jer.noble@apple.com@268f45cc-cd09-0410-ab3c-d52691b4dbfc>
Sat, 20 May 2017 16:55:01 +0000 (16:55 +0000)
https://bugs.webkit.org/show_bug.cgi?id=125157
<rdar://problem/23062016>

Reviewed by Eric Carlson.

Source/WebCore:

Test: media/media-source/media-source-paint-to-canvas.html

In order to have access to decoded video data for painting, decode the encoded samples manually
instead of adding them to the AVSampleBufferDisplayLayer. To facilitate doing so, add a new
utility class WebCoreDecompressionSession, which can decode samples and store them.

For the purposes of this patch, to avoid double-decoding of video data and to avoid severe complication
of our sample delivery pipeline, we will only support painting of decoded video samples when the video is
not displayed in the DOM.
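
The resulting division of labor can be sketched as follows. This is an illustrative stand-in, not WebKit's actual implementation: `DisplayLayer`, `DecompressionSession`, and `Player` here are hypothetical placeholders for AVSampleBufferDisplayLayer, WebCoreDecompressionSession, and MediaPlayerPrivateMediaSourceAVFObjC.

```cpp
#include <cassert>
#include <memory>

// Hypothetical stand-ins for AVSampleBufferDisplayLayer and
// WebCoreDecompressionSession; names and structure are illustrative only.
struct DisplayLayer {};
struct DecompressionSession {};

struct Player {
    bool renderedInDOM { false };
    std::unique_ptr<DisplayLayer> layer;
    std::unique_ptr<DecompressionSession> decompressionSession;

    // Mirrors the idea behind acceleratedRenderingStateChanged(): exactly one
    // sink exists at a time -- a display layer when the element is rendered in
    // the DOM, otherwise a decompression session so that canvas painting has
    // access to decoded pixels.
    void acceleratedRenderingStateChanged()
    {
        if (renderedInDOM) {
            decompressionSession.reset();
            if (!layer)
                layer = std::make_unique<DisplayLayer>();
        } else {
            layer.reset();
            if (!decompressionSession)
                decompressionSession = std::make_unique<DecompressionSession>();
        }
    }
};
```

This is why double-decoding is avoided: an encoded sample is only ever routed to one of the two sinks.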

* Modules/mediasource/MediaSource.cpp:
(WebCore::MediaSource::seekToTime): Always send waitForSeekCompleted() to give the private player a chance to delay seek completion.
* Modules/mediasource/SourceBuffer.cpp:
(WebCore::SourceBuffer::sourceBufferPrivateReenqueSamples): Added.
* Modules/mediasource/SourceBuffer.h:
* WebCore.xcodeproj/project.pbxproj:
* platform/cf/CoreMediaSoftLink.cpp: Added new soft link macros.
* platform/cf/CoreMediaSoftLink.h: Ditto.
* platform/cocoa/CoreVideoSoftLink.cpp: Ditto.
* platform/cocoa/CoreVideoSoftLink.h: Ditto.
* platform/graphics/SourceBufferPrivateClient.h:
* platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.h:
(WebCore::MediaPlayerPrivateMediaSourceAVFObjC::sampleBufferDisplayLayer): Simple accessor.
(WebCore::MediaPlayerPrivateMediaSourceAVFObjC::decompressionSession): Ditto.
* platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.mm:
(WebCore::MediaPlayerPrivateMediaSourceAVFObjC::MediaPlayerPrivateMediaSourceAVFObjC):
(WebCore::MediaPlayerPrivateMediaSourceAVFObjC::load): Update whether we should be displaying in a layer or a decompression session.
(WebCore::MediaPlayerPrivateMediaSourceAVFObjC::setVisible): Ditto.
(WebCore::MediaPlayerPrivateMediaSourceAVFObjC::waitForSeekCompleted): m_seeking is now an enum.
(WebCore::MediaPlayerPrivateMediaSourceAVFObjC::seeking): Ditto.
(WebCore::MediaPlayerPrivateMediaSourceAVFObjC::seekCompleted): Ditto. If waiting for a video frame, delay completing seek.
(WebCore::MediaPlayerPrivateMediaSourceAVFObjC::nativeImageForCurrentTime): Call updateLastImage() and return result.
(WebCore::MediaPlayerPrivateMediaSourceAVFObjC::updateLastImage): Fetch the image for the current time.
(WebCore::MediaPlayerPrivateMediaSourceAVFObjC::paint): Pass through to paintCurrentFrameInContext.
(WebCore::MediaPlayerPrivateMediaSourceAVFObjC::paintCurrentFrameInContext): Get a native image, and render it.
(WebCore::MediaPlayerPrivateMediaSourceAVFObjC::acceleratedRenderingStateChanged): Create or destroy a layer or decompression session as appropriate.
(WebCore::MediaPlayerPrivateMediaSourceAVFObjC::ensureLayer): Creates a layer.
(WebCore::MediaPlayerPrivateMediaSourceAVFObjC::destroyLayer): Destroys a layer.
(WebCore::MediaPlayerPrivateMediaSourceAVFObjC::ensureDecompressionSession): Creates a decompression session.
(WebCore::MediaPlayerPrivateMediaSourceAVFObjC::destroyDecompressionSession): Destroys a decompression session.
(WebCore::MediaPlayerPrivateMediaSourceAVFObjC::setHasAvailableVideoFrame): If seek completion delayed, complete now. Ditto for ready state change.
(WebCore::MediaPlayerPrivateMediaSourceAVFObjC::setReadyState): If waiting for a video frame, delay ready state change.
(WebCore::MediaPlayerPrivateMediaSourceAVFObjC::addDisplayLayer): Deleted.
(WebCore::MediaPlayerPrivateMediaSourceAVFObjC::removeDisplayLayer): Deleted.
* platform/graphics/avfoundation/objc/MediaSourcePrivateAVFObjC.h:
* platform/graphics/avfoundation/objc/MediaSourcePrivateAVFObjC.mm:
(WebCore::MediaSourcePrivateAVFObjC::hasVideo): Promote to a class function.
(WebCore::MediaSourcePrivateAVFObjC::hasSelectedVideo): Return whether any of the active source buffers have video and are selected.
(WebCore::MediaSourcePrivateAVFObjC::hasSelectedVideoChanged): Call setSourceBufferWithSelectedVideo().
(WebCore::MediaSourcePrivateAVFObjC::setVideoLayer): Set (or clear) the layer on the selected buffer.
(WebCore::MediaSourcePrivateAVFObjC::setDecompressionSession): Ditto for decompression session.
(WebCore::MediaSourcePrivateAVFObjC::setSourceBufferWithSelectedVideo): Remove the layer and decompression session from the unselected buffer and add the decompression session or layer to the newly selected buffer.
(WebCore::MediaSourcePrivateAVFObjCHasVideo): Deleted.
* platform/graphics/avfoundation/objc/SourceBufferPrivateAVFObjC.h:
* platform/graphics/avfoundation/objc/SourceBufferPrivateAVFObjC.mm:
(WebCore::SourceBufferPrivateAVFObjC::destroyRenderers): Clear the videoLayer and decompressionSession.
(WebCore::SourceBufferPrivateAVFObjC::hasSelectedVideo): Return whether the buffer has a selected video track.
(WebCore::SourceBufferPrivateAVFObjC::trackDidChangeEnabled): The media player now manages the video layer and decompression session lifetimes.
(WebCore::SourceBufferPrivateAVFObjC::flush): Flush the decompression session, if it exists.
(WebCore::SourceBufferPrivateAVFObjC::enqueueSample): Enqueue to the decompression session, if it exists.
(WebCore::SourceBufferPrivateAVFObjC::isReadyForMoreSamples): Ask the decompression session, if it exists.
(WebCore::SourceBufferPrivateAVFObjC::didBecomeReadyForMoreSamples): Tell the decompression session to stop requesting data, if it exists.
(WebCore::SourceBufferPrivateAVFObjC::notifyClientWhenReadyForMoreSamples): Request media data from the decompression session, if it exists.
(WebCore::SourceBufferPrivateAVFObjC::setVideoLayer): Added.
(WebCore::SourceBufferPrivateAVFObjC::setDecompressionSession): Added.
* platform/graphics/cocoa/WebCoreDecompressionSession.h: Added.
(WebCore::WebCoreDecompressionSession::create):
(WebCore::WebCoreDecompressionSession::isInvalidated):
(WebCore::WebCoreDecompressionSession::createWeakPtr):
* platform/graphics/cocoa/WebCoreDecompressionSession.mm: Added.
(WebCore::WebCoreDecompressionSession::WebCoreDecompressionSession): Register for media data requests.
(WebCore::WebCoreDecompressionSession::invalidate): Unregister for same.
(WebCore::WebCoreDecompressionSession::maybeBecomeReadyForMoreMediaDataCallback): Pass to maybeBecomeReadyForMoreMediaData.
(WebCore::WebCoreDecompressionSession::maybeBecomeReadyForMoreMediaData): Check in-flight decodes, and decoded frame counts.
(WebCore::WebCoreDecompressionSession::enqueueSample): Pass the sample to be decoded on a background queue.
(WebCore::WebCoreDecompressionSession::decodeSample): Decode the sample.
(WebCore::WebCoreDecompressionSession::decompressionOutputCallback): Call handleDecompressionOutput.
(WebCore::WebCoreDecompressionSession::handleDecompressionOutput): Pass decoded sample to be enqueued on the main thread.
(WebCore::WebCoreDecompressionSession::getFirstVideoFrame):
(WebCore::WebCoreDecompressionSession::enqueueDecodedSample): Enqueue the frame (if it's a displayed frame).
(WebCore::WebCoreDecompressionSession::isReadyForMoreMediaData): Return whether we've hit our high water sample count.
(WebCore::WebCoreDecompressionSession::requestMediaDataWhenReady):
(WebCore::WebCoreDecompressionSession::stopRequestingMediaData): Unset the same.
(WebCore::WebCoreDecompressionSession::notifyWhenHasAvailableVideoFrame): Set a callback to notify when a decoded frame has been enqueued.
(WebCore::WebCoreDecompressionSession::imageForTime): Successively dequeue images until reaching one at or beyond the requested time.
(WebCore::WebCoreDecompressionSession::flush): Synchronously empty the producer and consumer queues.
(WebCore::WebCoreDecompressionSession::getDecodeTime): Utility method.
(WebCore::WebCoreDecompressionSession::getPresentationTime): Ditto.
(WebCore::WebCoreDecompressionSession::getDuration): Ditto.
(WebCore::WebCoreDecompressionSession::compareBuffers): Ditto.
* platform/cocoa/VideoToolboxSoftLink.cpp: Added.
* platform/cocoa/VideoToolboxSoftLink.h: Added.
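
The imageForTime() strategy described above can be sketched with plain C++ containers. This is a simplified illustration, not WebKit's code: `Frame` and the double-based timestamps stand in for the CMSampleBuffer/MediaTime machinery the real WebCoreDecompressionSession uses.

```cpp
#include <cassert>
#include <deque>
#include <optional>

// Illustrative stand-in for a decoded video frame; the real implementation
// holds CMSampleBuffers keyed by MediaTime.
struct Frame {
    double presentationTime { 0 };
    int imageId { 0 }; // stands in for the decoded pixel buffer
};

class DecodedFrameQueue {
public:
    void enqueue(Frame frame) { m_frames.push_back(frame); }

    // Successively dequeue frames whose presentation time precedes the
    // requested time; the frame left at the front (at or beyond the requested
    // time) is the one to paint.
    std::optional<Frame> imageForTime(double time)
    {
        while (!m_frames.empty() && m_frames.front().presentationTime < time)
            m_frames.pop_front();
        if (m_frames.empty())
            return std::nullopt;
        return m_frames.front();
    }

private:
    std::deque<Frame> m_frames;
};
```

Because stale frames are dropped as a side effect of lookup, repeated paints during playback keep the queue short without a separate pruning pass.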

LayoutTests:

* media/media-source/content/test-fragmented.mp4: Add an 'edts' atom to move the presentation time for the
    first sample to 0:00.
* media/media-source/content/test-fragmented-manifest.json:
* media/media-source/media-source-paint-to-canvas-expected.txt: Added.
* media/media-source/media-source-paint-to-canvas.html: Added.

git-svn-id: https://svn.webkit.org/repository/webkit/trunk@217185 268f45cc-cd09-0410-ab3c-d52691b4dbfc

25 files changed:
LayoutTests/ChangeLog
LayoutTests/media/media-source/content/test-fragmented-manifest.json
LayoutTests/media/media-source/content/test-fragmented.mp4
LayoutTests/media/media-source/media-source-paint-to-canvas-expected.txt [new file with mode: 0644]
LayoutTests/media/media-source/media-source-paint-to-canvas.html [new file with mode: 0644]
Source/WebCore/ChangeLog
Source/WebCore/Modules/mediasource/MediaSource.cpp
Source/WebCore/Modules/mediasource/SourceBuffer.cpp
Source/WebCore/Modules/mediasource/SourceBuffer.h
Source/WebCore/WebCore.xcodeproj/project.pbxproj
Source/WebCore/platform/cf/CoreMediaSoftLink.cpp
Source/WebCore/platform/cf/CoreMediaSoftLink.h
Source/WebCore/platform/cocoa/CoreVideoSoftLink.cpp
Source/WebCore/platform/cocoa/CoreVideoSoftLink.h
Source/WebCore/platform/cocoa/VideoToolboxSoftLink.cpp [new file with mode: 0644]
Source/WebCore/platform/cocoa/VideoToolboxSoftLink.h [new file with mode: 0644]
Source/WebCore/platform/graphics/SourceBufferPrivateClient.h
Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.h
Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.mm
Source/WebCore/platform/graphics/avfoundation/objc/MediaSourcePrivateAVFObjC.h
Source/WebCore/platform/graphics/avfoundation/objc/MediaSourcePrivateAVFObjC.mm
Source/WebCore/platform/graphics/avfoundation/objc/SourceBufferPrivateAVFObjC.h
Source/WebCore/platform/graphics/avfoundation/objc/SourceBufferPrivateAVFObjC.mm
Source/WebCore/platform/graphics/cocoa/WebCoreDecompressionSession.h [new file with mode: 0644]
Source/WebCore/platform/graphics/cocoa/WebCoreDecompressionSession.mm [new file with mode: 0644]

index a43ec2a..16a459a 100644 (file)
@@ -1,3 +1,17 @@
+2017-05-20  Jer Noble  <jer.noble@apple.com>
+
+        [MSE][Mac] Support painting MSE video-element to canvas
+        https://bugs.webkit.org/show_bug.cgi?id=125157
+        <rdar://problem/23062016>
+
+        Reviewed by Eric Carlson.
+
+        * media/media-source/content/test-fragmented.mp4: Add an 'edts' atom to move the presentation time for the
+            first sample to 0:00.
+        * media/media-source/content/test-fragmented-manifest.json: 
+        * media/media-source/media-source-paint-to-canvas-expected.txt: Added.
+        * media/media-source/media-source-paint-to-canvas.html: Added.
+
 2017-05-19  Chris Dumez  <cdumez@apple.com>
 
         Consider not exposing webkitURL in workers
 
         * media/media-source/content/test-fragmented.mp4:
 
-2017-04-11  Jer Noble  <jer.noble@apple.com>
-
-        [MSE][Mac] Support painting MSE video-element to canvas
-        https://bugs.webkit.org/show_bug.cgi?id=125157
-        <rdar://problem/23062016>
-
-        Reviewed by Eric Carlson.
-
-        * media/media-source/content/test-fragmented.mp4: Add a 'edts' atom to move the presentation time for the
-            first sample to 0:00.
-        * media/media-source/media-source-paint-to-canvas-expected.txt: Added.
-        * media/media-source/media-source-paint-to-canvas.html: Added.
-
 2017-05-19  Zan Dobersek  <zdobersek@igalia.com>
 
         Unreviewed GTK+ gardening.
index f4b858a..a53cd9e 100644 (file)
@@ -1,18 +1,18 @@
 {
     "url": "content/test-fragmented.mp4",
     "type": "video/mp4; codecs=\"mp4a.40.2,avc1.4d281e\"",
-    "init": { "offset": 0, "size": 1231 },
-    "duration": 10.327753,
+    "init": { "offset": 0, "size": 1259 },
+    "duration": 10,
     "media": [
-        { "offset": 1231,   "size": 67526, "timecode": 0.000000, "duration": 1.041668 },
-        { "offset": 68757,  "size": 72683, "timecode": 1.016916, "duration": 1.024752 },
-        { "offset": 141440, "size": 78499, "timecode": 2.015374, "duration": 1.026294 },
-        { "offset": 219939, "size": 77358, "timecode": 3.013832, "duration": 1.027835 },
-        { "offset": 297297, "size": 80748, "timecode": 4.012290, "duration": 1.029377 },
-        { "offset": 378045, "size": 78038, "timecode": 5.010748, "duration": 1.030919 },
-        { "offset": 456083, "size": 82223, "timecode": 6.009206, "duration": 1.032461 },
-        { "offset": 538306, "size": 78331, "timecode": 7.007664, "duration": 1.034003 },
-        { "offset": 616637, "size": 80736, "timecode": 8.006122, "duration": 1.035545 },
-        { "offset": 697373, "size": 77752, "timecode": 9.004580, "duration": 1.044899 }
+        { "offset": 1259,   "size": 67526, "timestamp": 0, "duration": 1 },
+        { "offset": 68785,  "size": 72683, "timestamp": 1, "duration": 1 },
+        { "offset": 141468, "size": 78499, "timestamp": 2, "duration": 1 },
+        { "offset": 219967, "size": 77358, "timestamp": 3, "duration": 1 },
+        { "offset": 297325, "size": 80748, "timestamp": 4, "duration": 1 },
+        { "offset": 378073, "size": 78038, "timestamp": 5, "duration": 1 },
+        { "offset": 456111, "size": 82223, "timestamp": 6, "duration": 1 },
+        { "offset": 538334, "size": 78331, "timestamp": 7, "duration": 1 },
+        { "offset": 616665, "size": 80736, "timestamp": 8, "duration": 1 },
+        { "offset": 697401, "size": 77752, "timestamp": 9, "duration": 1 }
     ]
 }
index 6ce467b..520aab1 100644 (file)
Binary files a/LayoutTests/media/media-source/content/test-fragmented.mp4 and b/LayoutTests/media/media-source/content/test-fragmented.mp4 differ
diff --git a/LayoutTests/media/media-source/media-source-paint-to-canvas-expected.txt b/LayoutTests/media/media-source/media-source-paint-to-canvas-expected.txt
new file mode 100644 (file)
index 0000000..f572191
--- /dev/null
@@ -0,0 +1,14 @@
+EVENT(sourceopen)
+EVENT(update)
+EVENT(canplay)
+RUN(video.currentTime += 1.001 / 24)
+EVENT(seeked)
+EXPECTED (canvas.getContext("2d").getImageData(250, 130, 1, 1).data[0] != '0') OK
+RUN(video.currentTime += 1.001 / 24)
+EVENT(seeked)
+EXPECTED (canvas.getContext("2d").getImageData(250, 130, 1, 1).data[0] != '0') OK
+RUN(video.currentTime += 1.001 / 24)
+EVENT(seeked)
+EXPECTED (canvas.getContext("2d").getImageData(250, 130, 1, 1).data[0] != '0') OK
+END OF TEST
+
diff --git a/LayoutTests/media/media-source/media-source-paint-to-canvas.html b/LayoutTests/media/media-source/media-source-paint-to-canvas.html
new file mode 100644 (file)
index 0000000..048abdd
--- /dev/null
@@ -0,0 +1,79 @@
+<!DOCTYPE html>
+<html>
+<head>
+    <title>media-source-paint-to-canvas</title>
+    <script src="media-source-loader.js"></script>
+    <script src="../video-test.js"></script>
+    <script>
+
+    var canvas;
+    var loader;
+    var source;
+    var sourceBuffer;
+
+    function runTest() {
+        mediaElement = video = document.createElement('video');
+
+        loader = new MediaSourceLoader('content/test-fragmented-manifest.json');
+        loader.onload = mediaDataLoaded;
+        loader.onerror = mediaDataLoadingFailed;
+    }
+
+    function mediaDataLoadingFailed() {
+        failTest('Media data loading failed');
+    }
+
+    function mediaDataLoaded() {
+        source = new MediaSource();
+        waitForEvent('sourceopen', sourceOpen, false, false, source);
+        video.src = URL.createObjectURL(source);
+    }
+
+    function sourceOpen() {
+        source.duration = loader.duration();
+        sourceBuffer = source.addSourceBuffer(loader.type());
+        waitForEventOn(sourceBuffer, 'update', sourceInitialized, false, true);
+        sourceBuffer.appendBuffer(loader.initSegment());
+    }
+
+    function sourceInitialized() {
+        waitForEvent('canplay', canPlay, false, true);
+        sourceBuffer.appendBuffer(loader.mediaSegment(0));
+    }
+
+    function paint() {
+        canvas = document.createElement('canvas');
+        canvas.width = video.videoWidth / 2;
+        canvas.height = video.videoHeight / 2;
+        canvas.getContext('2d').drawImage(video, 0, 0, canvas.width, canvas.height);
+        document.getElementById('canvases').appendChild(canvas);
+        testExpected('canvas.getContext("2d").getImageData(250, 130, 1, 1).data[0]', '0', '!=');
+    }
+
+    function canPlay() {
+        waitForEvent('seeked', seeked1, false, true);
+        run('video.currentTime += 1.001 / 24');
+    }
+
+    function seeked1() {
+        paint();
+        waitForEvent('seeked', seeked2, false, true);
+        run('video.currentTime += 1.001 / 24');
+    }
+
+    function seeked2() {
+        paint();
+        waitForEvent('seeked', seeked3, false, true);
+        run('video.currentTime += 1.001 / 24');
+    }
+
+    function seeked3() {
+        paint();
+        endTest();
+    }
+    </script>
+</head>
+<body onload="runTest()">
+    <div id="canvases"></div>
+</body>
+</html>
\ No newline at end of file
index 9573325..580bc61 100644 (file)
@@ -1,3 +1,106 @@
+2017-05-20  Jer Noble  <jer.noble@apple.com>
+
+        [MSE][Mac] Support painting MSE video-element to canvas
+        https://bugs.webkit.org/show_bug.cgi?id=125157
+        <rdar://problem/23062016>
+
+        Reviewed by Eric Carlson.
+
+        Test: media/media-source/media-source-paint-to-canvas.html
+
+        In order to have access to decoded video data for painting, decode the encoded samples manually
+        instead of adding them to the AVSampleBufferDisplayLayer. To facilitate doing so, add a new
+        utility class WebCoreDecompressionSession, which can decode samples and store them.
+
+        For the purposes of this patch, to avoid double-decoding of video data and to avoid severe complication
+        of our sample delivery pipeline, we will only support painting of decoded video samples when the video is
+        not displayed in the DOM.
+
+        * Modules/mediasource/MediaSource.cpp:
+        (WebCore::MediaSource::seekToTime): Always send waitForSeekCompleted() to give the private player a chance to delay seek completion.
+        * Modules/mediasource/SourceBuffer.cpp:
+        (WebCore::SourceBuffer::sourceBufferPrivateReenqueSamples): Added.
+        * Modules/mediasource/SourceBuffer.h:
+        * WebCore.xcodeproj/project.pbxproj:
+        * platform/cf/CoreMediaSoftLink.cpp: Added new soft link macros.
+        * platform/cf/CoreMediaSoftLink.h: Ditto.
+        * platform/cocoa/CoreVideoSoftLink.cpp: Ditto.
+        * platform/cocoa/CoreVideoSoftLink.h: Ditto.
+        * platform/graphics/SourceBufferPrivateClient.h:
+        * platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.h:
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::sampleBufferDisplayLayer): Simple accessor.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::decompressionSession): Ditto.
+        * platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.mm:
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::MediaPlayerPrivateMediaSourceAVFObjC):
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::load): Update whether we should be displaying in a layer or a decompression session.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::setVisible): Ditto.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::waitForSeekCompleted): m_seeking is now an enum.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::seeking): Ditto.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::seekCompleted): Ditto. If waiting for a video frame, delay completing seek.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::nativeImageForCurrentTime): Call updateLastImage() and return result.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::updateLastImage): Fetch the image for the current time.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::paint): Pass through to paintCurrentFrameInContext.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::paintCurrentFrameInContext): Get a native image, and render it.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::acceleratedRenderingStateChanged): Create or destroy a layer or decompression session as appropriate.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::ensureLayer): Creates a layer.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::destroyLayer): Destroys a layer.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::ensureDecompressionSession): Creates a decompression session.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::destroyDecompressionSession): Destroys a decompression session.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::setHasAvailableVideoFrame): If seek completion delayed, complete now. Ditto for ready state change.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::setReadyState): If waiting for a video frame, delay ready state change.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::addDisplayLayer): Deleted.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::removeDisplayLayer): Deleted.
+        * platform/graphics/avfoundation/objc/MediaSourcePrivateAVFObjC.h:
+        * platform/graphics/avfoundation/objc/MediaSourcePrivateAVFObjC.mm:
+        (WebCore::MediaSourcePrivateAVFObjC::hasVideo): Promote to a class function.
+        (WebCore::MediaSourcePrivateAVFObjC::hasSelectedVideo): Return whether any of the active source buffers have video and are selected.
+        (WebCore::MediaSourcePrivateAVFObjC::hasSelectedVideoChanged): Call setSourceBufferWithSelectedVideo().
+        (WebCore::MediaSourcePrivateAVFObjC::setVideoLayer): Set (or clear) the layer on the selected buffer.
+        (WebCore::MediaSourcePrivateAVFObjC::setDecompressionSession): Ditto for decompression session.
+        (WebCore::MediaSourcePrivateAVFObjC::setSourceBufferWithSelectedVideo): Remove the layer and decompression session from the
+            unselected buffer and add the decompression session or layer to the newly selected buffer.
+        (WebCore::MediaSourcePrivateAVFObjCHasVideo): Deleted.
+        * platform/graphics/avfoundation/objc/SourceBufferPrivateAVFObjC.h:
+        * platform/graphics/avfoundation/objc/SourceBufferPrivateAVFObjC.mm:
+        (WebCore::SourceBufferPrivateAVFObjC::destroyRenderers): Clear the videoLayer and decompressionSession.
+        (WebCore::SourceBufferPrivateAVFObjC::hasSelectedVideo): Return whether the buffer has a selected video track.
+        (WebCore::SourceBufferPrivateAVFObjC::trackDidChangeEnabled): The media player now manages the video layer and decompression session lifetimes.
+        (WebCore::SourceBufferPrivateAVFObjC::flush): Flush the decompression session, if it exists.
+        (WebCore::SourceBufferPrivateAVFObjC::enqueueSample): Enqueue to the decompression session, if it exists.
+        (WebCore::SourceBufferPrivateAVFObjC::isReadyForMoreSamples): Ask the decompression session, if it exists.
+        (WebCore::SourceBufferPrivateAVFObjC::didBecomeReadyForMoreSamples): Tell the decompression session to stop requesting data, if it exists.
+        (WebCore::SourceBufferPrivateAVFObjC::notifyClientWhenReadyForMoreSamples): Request media data from the decompression session, if it exists.
+        (WebCore::SourceBufferPrivateAVFObjC::setVideoLayer): Added.
+        (WebCore::SourceBufferPrivateAVFObjC::setDecompressionSession): Added.
+        * platform/graphics/cocoa/WebCoreDecompressionSession.h: Added.
+        (WebCore::WebCoreDecompressionSession::create):
+        (WebCore::WebCoreDecompressionSession::isInvalidated):
+        (WebCore::WebCoreDecompressionSession::createWeakPtr):
+        * platform/graphics/cocoa/WebCoreDecompressionSession.mm: Added.
+        (WebCore::WebCoreDecompressionSession::WebCoreDecompressionSession): Register for media data requests.
+        (WebCore::WebCoreDecompressionSession::invalidate): Unregister for same.
+        (WebCore::WebCoreDecompressionSession::maybeBecomeReadyForMoreMediaDataCallback): Pass to maybeBecomeReadyForMoreMediaData.
+        (WebCore::WebCoreDecompressionSession::maybeBecomeReadyForMoreMediaData): Check in-flight decodes, and decoded frame counts.
+        (WebCore::WebCoreDecompressionSession::enqueueSample): Pass the sample to be decoded on a background queue.
+        (WebCore::WebCoreDecompressionSession::decodeSample): Decode the sample.
+        (WebCore::WebCoreDecompressionSession::decompressionOutputCallback): Call handleDecompressionOutput.
+        (WebCore::WebCoreDecompressionSession::handleDecompressionOutput): Pass decoded sample to be enqueued on the main thread.
+        (WebCore::WebCoreDecompressionSession::getFirstVideoFrame):
+        (WebCore::WebCoreDecompressionSession::enqueueDecodedSample): Enqueue the frame (if it's a displayed frame).
+        (WebCore::WebCoreDecompressionSession::isReadyForMoreMediaData): Return whether we've hit our high water sample count.
+        (WebCore::WebCoreDecompressionSession::requestMediaDataWhenReady):
+        (WebCore::WebCoreDecompressionSession::stopRequestingMediaData): Unset the same.
+        (WebCore::WebCoreDecompressionSession::notifyWhenHasAvailableVideoFrame): Set a callback to notify when a decoded frame has been enqueued.
+        (WebCore::WebCoreDecompressionSession::imageForTime): Successively dequeue images until reaching one at or beyond the requested time.
+        (WebCore::WebCoreDecompressionSession::flush): Synchronously empty the producer and consumer queues.
+        (WebCore::WebCoreDecompressionSession::getDecodeTime): Utility method.
+        (WebCore::WebCoreDecompressionSession::getPresentationTime): Ditto.
+        (WebCore::WebCoreDecompressionSession::getDuration): Ditto.
+        (WebCore::WebCoreDecompressionSession::compareBuffers): Ditto.
+        * platform/cocoa/VideoToolboxSoftLink.cpp: Added.
+        * platform/cocoa/VideoToolboxSoftLink.h: Added.
+
 2017-05-19  Joseph Pecoraro  <pecoraro@apple.com>
 
         WebAVStreamDataParserListener String leak
index f9f913e..6ac8d02 100644 (file)
@@ -233,6 +233,7 @@ void MediaSource::seekToTime(const MediaTime& time)
     // ↳ Otherwise
     // Continue
 
+    m_private->waitForSeekCompleted();
     completeSeek();
 }
 
index d2b7c99..086fe6a 100644 (file)
@@ -1778,8 +1778,26 @@ void SourceBuffer::textTrackKindChanged(TextTrack& track)
         m_source->mediaElement()->textTrackKindChanged(track);
 }
 
+void SourceBuffer::sourceBufferPrivateReenqueSamples(const AtomicString& trackID)
+{
+    if (isRemoved())
+        return;
+
+    LOG(MediaSource, "SourceBuffer::sourceBufferPrivateReenqueSamples(%p)", this);
+    auto it = m_trackBufferMap.find(trackID);
+    if (it == m_trackBufferMap.end())
+        return;
+
+    auto& trackBuffer = it->value;
+    trackBuffer.needsReenqueueing = true;
+    reenqueueMediaForTime(trackBuffer, trackID, m_source->currentTime());
+}
+
 void SourceBuffer::sourceBufferPrivateDidBecomeReadyForMoreSamples(const AtomicString& trackID)
 {
+    if (isRemoved())
+        return;
+
     LOG(MediaSource, "SourceBuffer::sourceBufferPrivateDidBecomeReadyForMoreSamples(%p)", this);
     auto it = m_trackBufferMap.find(trackID);
     if (it == m_trackBufferMap.end())
@@ -1930,6 +1948,9 @@ void SourceBuffer::updateBufferedFromTrackBuffers()
     for (auto& trackBuffer : m_trackBufferMap.values()) {
         // 4.1 Let track ranges equal the track buffer ranges for the current track buffer.
         PlatformTimeRanges trackRanges = trackBuffer.buffered;
+        if (!trackRanges.length())
+            continue;
+
         // 4.2 If readyState is "ended", then set the end time on the last range in track ranges to highest end time.
         if (m_source->isEnded())
             trackRanges.add(trackRanges.maximumBufferedTime(), highestEndTime);
index 5a142d4..6cca87a 100644 (file)
@@ -129,6 +129,7 @@ private:
     void sourceBufferPrivateDidReceiveSample(MediaSample&) final;
     bool sourceBufferPrivateHasAudio() const final;
     bool sourceBufferPrivateHasVideo() const final;
+    void sourceBufferPrivateReenqueSamples(const AtomicString& trackID) final;
     void sourceBufferPrivateDidBecomeReadyForMoreSamples(const AtomicString& trackID) final;
     MediaTime sourceBufferPrivateFastSeekTimeForMediaTime(const MediaTime&, const MediaTime& negativeThreshold, const MediaTime& positiveThreshold) final;
     void sourceBufferPrivateAppendComplete(AppendResult) final;
index 041948f..bc25b68 100644 (file)
                CD5596921475B678001D0BD0 /* AudioFileReaderIOS.h in Headers */ = {isa = PBXBuildFile; fileRef = CD5596901475B678001D0BD0 /* AudioFileReaderIOS.h */; };
                CD5896E11CD2B15100B3BCC8 /* WebPlaybackControlsManager.mm in Sources */ = {isa = PBXBuildFile; fileRef = CD5896DF1CD2B15100B3BCC8 /* WebPlaybackControlsManager.mm */; };
                CD5896E21CD2B15100B3BCC8 /* WebPlaybackControlsManager.h in Headers */ = {isa = PBXBuildFile; fileRef = CD5896E01CD2B15100B3BCC8 /* WebPlaybackControlsManager.h */; settings = {ATTRIBUTES = (Private, ); }; };
+               CD5D27771E8318E000D80A3D /* WebCoreDecompressionSession.mm in Sources */ = {isa = PBXBuildFile; fileRef = CD5D27751E8318E000D80A3D /* WebCoreDecompressionSession.mm */; };
+               CD5D27781E8318E000D80A3D /* WebCoreDecompressionSession.h in Headers */ = {isa = PBXBuildFile; fileRef = CD5D27761E8318E000D80A3D /* WebCoreDecompressionSession.h */; };
                CD5E5B5F1A15CE54000C609E /* PageConfiguration.h in Headers */ = {isa = PBXBuildFile; fileRef = CD5E5B5E1A15CE54000C609E /* PageConfiguration.h */; settings = {ATTRIBUTES = (Private, ); }; };
                CD5E5B611A15F156000C609E /* PageConfiguration.cpp in Sources */ = {isa = PBXBuildFile; fileRef = CD5E5B601A15F156000C609E /* PageConfiguration.cpp */; };
                CD60C0C6193E87C7003C656B /* MediaTimeQTKit.mm in Sources */ = {isa = PBXBuildFile; fileRef = CD60C0C4193E87C7003C656B /* MediaTimeQTKit.mm */; };
                CDC8B5AA18047FF10016E685 /* SourceBufferPrivateAVFObjC.mm in Sources */ = {isa = PBXBuildFile; fileRef = CDC8B5A818047FF10016E685 /* SourceBufferPrivateAVFObjC.mm */; };
                CDC8B5AB18047FF10016E685 /* SourceBufferPrivateAVFObjC.h in Headers */ = {isa = PBXBuildFile; fileRef = CDC8B5A918047FF10016E685 /* SourceBufferPrivateAVFObjC.h */; };
                CDC8B5AD1804AE5D0016E685 /* SourceBufferPrivateClient.h in Headers */ = {isa = PBXBuildFile; fileRef = CDC8B5AC1804AE5D0016E685 /* SourceBufferPrivateClient.h */; };
+               CDC939A71E9BDFB100BB768D /* VideoToolboxSoftLink.cpp in Sources */ = {isa = PBXBuildFile; fileRef = CDC939A51E9BDFB100BB768D /* VideoToolboxSoftLink.cpp */; };
+               CDC939A81E9BDFB100BB768D /* VideoToolboxSoftLink.h in Headers */ = {isa = PBXBuildFile; fileRef = CDC939A61E9BDFB100BB768D /* VideoToolboxSoftLink.h */; };
                CDC979F41C498C0900DB50D4 /* WebCoreNSErrorExtras.mm in Sources */ = {isa = PBXBuildFile; fileRef = CDC979F21C498C0900DB50D4 /* WebCoreNSErrorExtras.mm */; };
                CDC979F51C498C0900DB50D4 /* WebCoreNSErrorExtras.h in Headers */ = {isa = PBXBuildFile; fileRef = CDC979F31C498C0900DB50D4 /* WebCoreNSErrorExtras.h */; };
                CDCA98EB18B2C8EB00C12FF9 /* LegacyCDMPrivateMediaPlayer.cpp in Sources */ = {isa = PBXBuildFile; fileRef = CDCA98EA18B2C8EB00C12FF9 /* LegacyCDMPrivateMediaPlayer.cpp */; };
                CD5596901475B678001D0BD0 /* AudioFileReaderIOS.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = AudioFileReaderIOS.h; sourceTree = "<group>"; };
                CD5896DF1CD2B15100B3BCC8 /* WebPlaybackControlsManager.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = WebPlaybackControlsManager.mm; sourceTree = "<group>"; };
                CD5896E01CD2B15100B3BCC8 /* WebPlaybackControlsManager.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = WebPlaybackControlsManager.h; sourceTree = "<group>"; };
+               CD5D27751E8318E000D80A3D /* WebCoreDecompressionSession.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = WebCoreDecompressionSession.mm; sourceTree = "<group>"; };
+               CD5D27761E8318E000D80A3D /* WebCoreDecompressionSession.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = WebCoreDecompressionSession.h; sourceTree = "<group>"; };
                CD5E5B5E1A15CE54000C609E /* PageConfiguration.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = PageConfiguration.h; sourceTree = "<group>"; };
                CD5E5B601A15F156000C609E /* PageConfiguration.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = PageConfiguration.cpp; sourceTree = "<group>"; };
                CD60C0C4193E87C7003C656B /* MediaTimeQTKit.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = MediaTimeQTKit.mm; sourceTree = "<group>"; };
                CDC8B5A818047FF10016E685 /* SourceBufferPrivateAVFObjC.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = SourceBufferPrivateAVFObjC.mm; sourceTree = "<group>"; };
                CDC8B5A918047FF10016E685 /* SourceBufferPrivateAVFObjC.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = SourceBufferPrivateAVFObjC.h; sourceTree = "<group>"; };
                CDC8B5AC1804AE5D0016E685 /* SourceBufferPrivateClient.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = SourceBufferPrivateClient.h; sourceTree = "<group>"; };
+               CDC939A51E9BDFB100BB768D /* VideoToolboxSoftLink.cpp */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.cpp.cpp; path = VideoToolboxSoftLink.cpp; sourceTree = "<group>"; };
+               CDC939A61E9BDFB100BB768D /* VideoToolboxSoftLink.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = VideoToolboxSoftLink.h; sourceTree = "<group>"; };
                CDC979F21C498C0900DB50D4 /* WebCoreNSErrorExtras.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = WebCoreNSErrorExtras.mm; sourceTree = "<group>"; };
                CDC979F31C498C0900DB50D4 /* WebCoreNSErrorExtras.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = WebCoreNSErrorExtras.h; sourceTree = "<group>"; };
                CDCA98E918B2C8D000C12FF9 /* LegacyCDMPrivateMediaPlayer.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = LegacyCDMPrivateMediaPlayer.h; sourceTree = "<group>"; };
                                52D5A1A41C57488900DE34A3 /* WebVideoFullscreenModel.h */,
                                52D5A1A51C57488900DE34A3 /* WebVideoFullscreenModelVideoElement.h */,
                                52D5A1A61C57488900DE34A3 /* WebVideoFullscreenModelVideoElement.mm */,
+                               CDC939A51E9BDFB100BB768D /* VideoToolboxSoftLink.cpp */,
+                               CDC939A61E9BDFB100BB768D /* VideoToolboxSoftLink.h */,
                        );
                        path = cocoa;
                        sourceTree = "<group>";
                                2D3EF4451917915C00034184 /* WebActionDisablingCALayerDelegate.mm */,
                                2D3EF4461917915C00034184 /* WebCoreCALayerExtras.h */,
                                2D3EF4471917915C00034184 /* WebCoreCALayerExtras.mm */,
+                               CD5D27751E8318E000D80A3D /* WebCoreDecompressionSession.mm */,
+                               CD5D27761E8318E000D80A3D /* WebCoreDecompressionSession.h */,
                                316BDB8A1E6E153000DE0D5A /* WebGPULayer.h */,
                                316BDB891E6E153000DE0D5A /* WebGPULayer.mm */,
                        );
                                E4AE7C1A17D232350009FB31 /* ElementAncestorIterator.h in Headers */,
                                E440AA961C68420800A265CC /* ElementAndTextDescendantIterator.h in Headers */,
                                E46A2B1E17CA76B1000DBCD8 /* ElementChildIterator.h in Headers */,
+                               CD5D27781E8318E000D80A3D /* WebCoreDecompressionSession.h in Headers */,
                                B5B7A17117C10AC000E4AA0A /* ElementData.h in Headers */,
                                93D437A11D57B3F400AB85EA /* ElementDescendantIterator.h in Headers */,
                                E4AE7C1617D1BB950009FB31 /* ElementIterator.h in Headers */,
                                517139061BF64DEC000D5F01 /* MemoryObjectStoreCursor.h in Headers */,
                                413E00791DB0E4F2002341D2 /* MemoryRelease.h in Headers */,
                                93309DFA099E64920056E581 /* MergeIdenticalElementsCommand.h in Headers */,
+                               CDC939A81E9BDFB100BB768D /* VideoToolboxSoftLink.h in Headers */,
                                E1ADECCE0E76AD8B004A1A5E /* MessageChannel.h in Headers */,
                                75793E840D0CE0B3007FC0AC /* MessageEvent.h in Headers */,
                                E1ADECBF0E76ACF1004A1A5E /* MessagePort.h in Headers */,
                                FD31602D12B0267600C1A359 /* DelayNode.cpp in Sources */,
                                FD31603012B0267600C1A359 /* DelayProcessor.cpp in Sources */,
                                93309DDE099E64920056E581 /* DeleteFromTextNodeCommand.cpp in Sources */,
+                               CDC939A71E9BDFB100BB768D /* VideoToolboxSoftLink.cpp in Sources */,
                                93309DE0099E64920056E581 /* DeleteSelectionCommand.cpp in Sources */,
                                9479493C1E045CF300018D85 /* DeprecatedCSSOMPrimitiveValue.cpp in Sources */,
                                947949241E0308A400018D85 /* DeprecatedCSSOMValue.cpp in Sources */,
                                7C39C3761DDBB90D00FEFB29 /* SVGStringListValues.cpp in Sources */,
                                B2227AB70D00BF220071B782 /* SVGStyleElement.cpp in Sources */,
                                B2227ABA0D00BF220071B782 /* SVGSVGElement.cpp in Sources */,
+                               CD5D27771E8318E000D80A3D /* WebCoreDecompressionSession.mm in Sources */,
                                B2227ABD0D00BF220071B782 /* SVGSwitchElement.cpp in Sources */,
                                B2227AC00D00BF220071B782 /* SVGSymbolElement.cpp in Sources */,
                                B2227AC40D00BF220071B782 /* SVGTests.cpp in Sources */,
diff --git a/Source/WebCore/platform/cf/CoreMediaSoftLink.cpp b/Source/WebCore/platform/cf/CoreMediaSoftLink.cpp
index 9083261..cbf71e1 100644
 #include "CoreMediaSPI.h"
 #include "SoftLinking.h"
 
+#if PLATFORM(COCOA)
+#include <CoreVideo/CoreVideo.h>
+#endif
+
 SOFT_LINK_FRAMEWORK_FOR_SOURCE(WebCore, CoreMedia)
 
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMBlockBufferCopyDataBytes, OSStatus, (CMBlockBufferRef theSourceBuffer, size_t offsetToData, size_t dataLength, void* destination), (theSourceBuffer, offsetToData, dataLength, destination))
@@ -69,6 +73,7 @@ SOFT_LINK_CONSTANT_FOR_SOURCE(WebCore, CoreMedia, kCMTextVerticalLayout_LeftToRi
 SOFT_LINK_CONSTANT_FOR_SOURCE(WebCore, CoreMedia, kCMTextVerticalLayout_RightToLeft, CFStringRef)
 SOFT_LINK_CONSTANT_FOR_SOURCE(WebCore, CoreMedia, kCMTimeInvalid, CMTime)
 SOFT_LINK_CONSTANT_FOR_SOURCE(WebCore, CoreMedia, kCMTimeZero, CMTime)
+SOFT_LINK_CONSTANT_FOR_SOURCE(WebCore, CoreMedia, kCMTimePositiveInfinity, CMTime)
 
 #if PLATFORM(COCOA)
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMFormatDescriptionGetMediaSubType, FourCharCode, (CMFormatDescriptionRef desc), (desc))
@@ -97,10 +102,24 @@ SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimebaseGetTime, CMTime, (CM
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimebaseSetRate, OSStatus, (CMTimebaseRef timebase, Float64 rate), (timebase, rate))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimebaseSetTime, OSStatus, (CMTimebaseRef timebase, CMTime time), (timebase, time))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimebaseGetEffectiveRate, Float64, (CMTimebaseRef timebase), (timebase))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimebaseAddTimerDispatchSource, OSStatus, (CMTimebaseRef timebase, dispatch_source_t timerSource), (timebase, timerSource))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimebaseRemoveTimerDispatchSource, OSStatus, (CMTimebaseRef timebase, dispatch_source_t timerSource), (timebase, timerSource))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimebaseSetTimerDispatchSourceNextFireTime, OSStatus, (CMTimebaseRef timebase, dispatch_source_t timerSource, CMTime fireTime, uint32_t flags), (timebase, timerSource, fireTime, flags))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimebaseSetTimerDispatchSourceToFireImmediately, OSStatus, (CMTimebaseRef timebase, dispatch_source_t timerSource), (timebase, timerSource))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimeCopyAsDictionary, CFDictionaryRef, (CMTime time, CFAllocatorRef allocator), (time, allocator))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMVideoFormatDescriptionCreateForImageBuffer, OSStatus, (CFAllocatorRef allocator, CVImageBufferRef imageBuffer, CMVideoFormatDescriptionRef* outDesc), (allocator, imageBuffer, outDesc))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMVideoFormatDescriptionGetDimensions, CMVideoDimensions, (CMVideoFormatDescriptionRef videoDesc), (videoDesc))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMVideoFormatDescriptionGetPresentationDimensions, CGSize, (CMVideoFormatDescriptionRef videoDesc, Boolean usePixelAspectRatio, Boolean useCleanAperture), (videoDesc, usePixelAspectRatio, useCleanAperture))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMBufferQueueReset, OSStatus, (CMBufferQueueRef queue), (queue))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMBufferQueueCreate, OSStatus, (CFAllocatorRef allocator, CMItemCount capacity, const CMBufferCallbacks* callbacks, CMBufferQueueRef* queueOut), (allocator, capacity, callbacks, queueOut))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMBufferQueueGetHead, CMBufferRef, (CMBufferQueueRef queue), (queue))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMBufferQueueDequeueAndRetain, CMBufferRef, (CMBufferQueueRef queue), (queue))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMBufferQueueEnqueue, OSStatus, (CMBufferQueueRef queue, CMBufferRef buffer), (queue, buffer))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMBufferQueueIsEmpty, Boolean, (CMBufferQueueRef queue), (queue))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMBufferQueueGetBufferCount, CMItemCount, (CMBufferQueueRef queue), (queue))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMBufferQueueGetFirstPresentationTimeStamp, CMTime, (CMBufferQueueRef queue), (queue))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMBufferQueueGetEndPresentationTimeStamp, CMTime, (CMBufferQueueRef queue), (queue))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMBufferQueueInstallTriggerWithIntegerThreshold, OSStatus, (CMBufferQueueRef queue, CMBufferQueueTriggerCallback triggerCallback, void* triggerRefcon, CMBufferQueueTriggerCondition triggerCondition, CMItemCount triggerThreshold, CMBufferQueueTriggerToken* triggerTokenOut), (queue, triggerCallback, triggerRefcon, triggerCondition, triggerThreshold, triggerTokenOut))
 
 SOFT_LINK_CONSTANT_FOR_SOURCE(WebCore, CoreMedia, kCMSampleAttachmentKey_DoNotDisplay, CFStringRef)
 SOFT_LINK_CONSTANT_FOR_SOURCE(WebCore, CoreMedia, kCMSampleAttachmentKey_NotSync, CFStringRef)
diff --git a/Source/WebCore/platform/cf/CoreMediaSoftLink.h b/Source/WebCore/platform/cf/CoreMediaSoftLink.h
index ec3269f..ce44fae 100644
@@ -108,6 +108,8 @@ SOFT_LINK_CONSTANT_FOR_HEADER(WebCore, CoreMedia, kCMTimeInvalid, CMTime)
 #define kCMTimeInvalid get_CoreMedia_kCMTimeInvalid()
 SOFT_LINK_CONSTANT_FOR_HEADER(WebCore, CoreMedia, kCMTimeZero, CMTime)
 #define kCMTimeZero get_CoreMedia_kCMTimeZero()
+SOFT_LINK_CONSTANT_FOR_HEADER(WebCore, CoreMedia, kCMTimePositiveInfinity, CMTime)
+#define kCMTimePositiveInfinity get_CoreMedia_kCMTimePositiveInfinity()
 
 #if PLATFORM(COCOA)
 
@@ -161,6 +163,14 @@ SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimebaseSetTime, OSStatus, (
 #define CMTimebaseSetTime softLink_CoreMedia_CMTimebaseSetTime
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimebaseGetEffectiveRate, Float64, (CMTimebaseRef timebase), (timebase))
 #define CMTimebaseGetEffectiveRate softLink_CoreMedia_CMTimebaseGetEffectiveRate
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimebaseAddTimerDispatchSource, OSStatus, (CMTimebaseRef timebase, dispatch_source_t timerSource), (timebase, timerSource))
+#define CMTimebaseAddTimerDispatchSource softLink_CoreMedia_CMTimebaseAddTimerDispatchSource
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimebaseRemoveTimerDispatchSource, OSStatus, (CMTimebaseRef timebase, dispatch_source_t timerSource), (timebase, timerSource))
+#define CMTimebaseRemoveTimerDispatchSource softLink_CoreMedia_CMTimebaseRemoveTimerDispatchSource
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimebaseSetTimerDispatchSourceNextFireTime, OSStatus, (CMTimebaseRef timebase, dispatch_source_t timerSource, CMTime fireTime, uint32_t flags), (timebase, timerSource, fireTime, flags))
+#define CMTimebaseSetTimerDispatchSourceNextFireTime softLink_CoreMedia_CMTimebaseSetTimerDispatchSourceNextFireTime
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimebaseSetTimerDispatchSourceToFireImmediately, OSStatus, (CMTimebaseRef timebase, dispatch_source_t timerSource), (timebase, timerSource))
+#define CMTimebaseSetTimerDispatchSourceToFireImmediately softLink_CoreMedia_CMTimebaseSetTimerDispatchSourceToFireImmediately
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimeCopyAsDictionary, CFDictionaryRef, (CMTime time, CFAllocatorRef allocator), (time, allocator))
 #define CMTimeCopyAsDictionary softLink_CoreMedia_CMTimeCopyAsDictionary
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMVideoFormatDescriptionCreateForImageBuffer, OSStatus, (CFAllocatorRef allocator, CVImageBufferRef imageBuffer, CMVideoFormatDescriptionRef *outDesc), (allocator, imageBuffer, outDesc))
@@ -169,6 +179,26 @@ SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMVideoFormatDescriptionGetDim
 #define CMVideoFormatDescriptionGetDimensions softLink_CoreMedia_CMVideoFormatDescriptionGetDimensions
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMVideoFormatDescriptionGetPresentationDimensions, CGSize, (CMVideoFormatDescriptionRef videoDesc, Boolean usePixelAspectRatio, Boolean useCleanAperture), (videoDesc, usePixelAspectRatio, useCleanAperture))
 #define CMVideoFormatDescriptionGetPresentationDimensions softLink_CoreMedia_CMVideoFormatDescriptionGetPresentationDimensions
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMBufferQueueCreate, OSStatus, (CFAllocatorRef allocator, CMItemCount capacity, const CMBufferCallbacks* callbacks, CMBufferQueueRef* queueOut), (allocator, capacity, callbacks, queueOut))
+#define CMBufferQueueCreate softLink_CoreMedia_CMBufferQueueCreate
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMBufferQueueReset, OSStatus, (CMBufferQueueRef queue), (queue))
+#define CMBufferQueueReset softLink_CoreMedia_CMBufferQueueReset
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMBufferQueueGetHead, CMBufferRef, (CMBufferQueueRef queue), (queue))
+#define CMBufferQueueGetHead softLink_CoreMedia_CMBufferQueueGetHead
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMBufferQueueDequeueAndRetain, CMBufferRef, (CMBufferQueueRef queue), (queue))
+#define CMBufferQueueDequeueAndRetain softLink_CoreMedia_CMBufferQueueDequeueAndRetain
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMBufferQueueEnqueue, OSStatus, (CMBufferQueueRef queue, CMBufferRef buffer), (queue, buffer))
+#define CMBufferQueueEnqueue softLink_CoreMedia_CMBufferQueueEnqueue
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMBufferQueueIsEmpty, Boolean, (CMBufferQueueRef queue), (queue))
+#define CMBufferQueueIsEmpty softLink_CoreMedia_CMBufferQueueIsEmpty
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMBufferQueueGetBufferCount, CMItemCount, (CMBufferQueueRef queue), (queue))
+#define CMBufferQueueGetBufferCount softLink_CoreMedia_CMBufferQueueGetBufferCount
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMBufferQueueGetFirstPresentationTimeStamp, CMTime, (CMBufferQueueRef queue), (queue))
+#define CMBufferQueueGetFirstPresentationTimeStamp softLink_CoreMedia_CMBufferQueueGetFirstPresentationTimeStamp
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMBufferQueueGetEndPresentationTimeStamp, CMTime, (CMBufferQueueRef queue), (queue))
+#define CMBufferQueueGetEndPresentationTimeStamp softLink_CoreMedia_CMBufferQueueGetEndPresentationTimeStamp
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMBufferQueueInstallTriggerWithIntegerThreshold, OSStatus, (CMBufferQueueRef queue, CMBufferQueueTriggerCallback triggerCallback, void* triggerRefcon, CMBufferQueueTriggerCondition triggerCondition, CMItemCount triggerThreshold, CMBufferQueueTriggerToken* triggerTokenOut), (queue, triggerCallback, triggerRefcon, triggerCondition, triggerThreshold, triggerTokenOut))
+#define CMBufferQueueInstallTriggerWithIntegerThreshold softLink_CoreMedia_CMBufferQueueInstallTriggerWithIntegerThreshold
 
 SOFT_LINK_CONSTANT_FOR_HEADER(WebCore, CoreMedia, kCMSampleAttachmentKey_DoNotDisplay, CFStringRef)
 #define kCMSampleAttachmentKey_DoNotDisplay get_CoreMedia_kCMSampleAttachmentKey_DoNotDisplay()
diff --git a/Source/WebCore/platform/cocoa/CoreVideoSoftLink.cpp b/Source/WebCore/platform/cocoa/CoreVideoSoftLink.cpp
index 102e63a..10c43c3 100644
@@ -30,6 +30,7 @@
 
 SOFT_LINK_FRAMEWORK_FOR_SOURCE(WebCore, CoreVideo)
 
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreVideo, CVPixelBufferGetTypeID, CFTypeID, (), ())
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreVideo, CVPixelBufferGetWidth, size_t, (CVPixelBufferRef pixelBuffer), (pixelBuffer))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreVideo, CVPixelBufferGetHeight, size_t, (CVPixelBufferRef pixelBuffer), (pixelBuffer))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreVideo, CVPixelBufferGetBaseAddress, void*, (CVPixelBufferRef pixelBuffer), (pixelBuffer))
diff --git a/Source/WebCore/platform/cocoa/CoreVideoSoftLink.h b/Source/WebCore/platform/cocoa/CoreVideoSoftLink.h
index 84a46ba..537612d 100644
@@ -31,6 +31,8 @@
 
 SOFT_LINK_FRAMEWORK_FOR_HEADER(WebCore, CoreVideo)
 
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreVideo, CVPixelBufferGetTypeID, CFTypeID, (), ())
+#define CVPixelBufferGetTypeID softLink_CoreVideo_CVPixelBufferGetTypeID
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreVideo, CVPixelBufferGetWidth, size_t, (CVPixelBufferRef pixelBuffer), (pixelBuffer))
 #define CVPixelBufferGetWidth softLink_CoreVideo_CVPixelBufferGetWidth
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreVideo, CVPixelBufferGetHeight, size_t, (CVPixelBufferRef pixelBuffer), (pixelBuffer))
diff --git a/Source/WebCore/platform/cocoa/VideoToolboxSoftLink.cpp b/Source/WebCore/platform/cocoa/VideoToolboxSoftLink.cpp
new file mode 100644
index 0000000..2c2b68d
--- /dev/null
+++ b/Source/WebCore/platform/cocoa/VideoToolboxSoftLink.cpp
@@ -0,0 +1,42 @@
+/*
+ * Copyright (C) 2017 Apple Inc. All rights reserved.
+ *
+ * Redistribution and use in source and binary forms, with or without
+ * modification, are permitted provided that the following conditions
+ * are met:
+ * 1. Redistributions of source code must retain the above copyright
+ *    notice, this list of conditions and the following disclaimer.
+ * 2. Redistributions in binary form must reproduce the above copyright
+ *    notice, this list of conditions and the following disclaimer in the
+ *    documentation and/or other materials provided with the distribution.
+ *
+ * THIS SOFTWARE IS PROVIDED BY APPLE INC. AND ITS CONTRIBUTORS ``AS IS''
+ * AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
+ * THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
+ * PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL APPLE INC. OR ITS CONTRIBUTORS
+ * BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
+ * CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
+ * SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
+ * INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
+ * CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
+ * ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF
+ * THE POSSIBILITY OF SUCH DAMAGE.
+ */
+
+#include "config.h"
+
+#include "SoftLinking.h"
+#include <VideoToolbox/VideoToolbox.h>
+
+SOFT_LINK_FRAMEWORK_FOR_SOURCE(WebCore, VideoToolbox)
+
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, VideoToolbox, VTDecompressionSessionCreate, OSStatus, (CFAllocatorRef allocator, CMVideoFormatDescriptionRef videoFormatDescription, CFDictionaryRef videoDecoderSpecification, CFDictionaryRef destinationImageBufferAttributes, const VTDecompressionOutputCallbackRecord* outputCallback, VTDecompressionSessionRef* decompressionSessionOut), (allocator, videoFormatDescription, videoDecoderSpecification, destinationImageBufferAttributes, outputCallback, decompressionSessionOut))
+#define VTDecompressionSessionCreate softLink_VideoToolbox_VTDecompressionSessionCreate
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, VideoToolbox, VTDecompressionSessionCanAcceptFormatDescription, Boolean, (VTDecompressionSessionRef session, CMFormatDescriptionRef newFormatDesc), (session, newFormatDesc))
+#define VTDecompressionSessionCanAcceptFormatDescription softLink_VideoToolbox_VTDecompressionSessionCanAcceptFormatDescription
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, VideoToolbox, VTDecompressionSessionWaitForAsynchronousFrames, OSStatus, (VTDecompressionSessionRef session), (session))
+#define VTDecompressionSessionWaitForAsynchronousFrames softLink_VideoToolbox_VTDecompressionSessionWaitForAsynchronousFrames
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, VideoToolbox, VTDecompressionSessionDecodeFrame, OSStatus, (VTDecompressionSessionRef session, CMSampleBufferRef sampleBuffer, VTDecodeFrameFlags decodeFlags, void* sourceFrameRefCon, VTDecodeInfoFlags* infoFlagsOut), (session, sampleBuffer, decodeFlags, sourceFrameRefCon, infoFlagsOut))
+#define VTDecompressionSessionDecodeFrame softLink_VideoToolbox_VTDecompressionSessionDecodeFrame
+SOFT_LINK_CONSTANT_FOR_SOURCE(WebCore, VideoToolbox, kVTVideoDecoderSpecification_EnableHardwareAcceleratedVideoDecoder, CFStringRef)
+#define kVTVideoDecoderSpecification_EnableHardwareAcceleratedVideoDecoder get_VideoToolbox_kVTVideoDecoderSpecification_EnableHardwareAcceleratedVideoDecoder()
diff --git a/Source/WebCore/platform/cocoa/VideoToolboxSoftLink.h b/Source/WebCore/platform/cocoa/VideoToolboxSoftLink.h
new file mode 100644
index 0000000..fb0098e
--- /dev/null
+++ b/Source/WebCore/platform/cocoa/VideoToolboxSoftLink.h
@@ -0,0 +1,42 @@
+/*
+ * Copyright (C) 2017 Apple Inc. All rights reserved.
+ *
+ * Redistribution and use in source and binary forms, with or without
+ * modification, are permitted provided that the following conditions
+ * are met:
+ * 1. Redistributions of source code must retain the above copyright
+ *    notice, this list of conditions and the following disclaimer.
+ * 2. Redistributions in binary form must reproduce the above copyright
+ *    notice, this list of conditions and the following disclaimer in the
+ *    documentation and/or other materials provided with the distribution.
+ *
+ * THIS SOFTWARE IS PROVIDED BY APPLE INC. AND ITS CONTRIBUTORS ``AS IS''
+ * AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
+ * THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
+ * PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL APPLE INC. OR ITS CONTRIBUTORS
+ * BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
+ * CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
+ * SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
+ * INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
+ * CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
+ * ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF
+ * THE POSSIBILITY OF SUCH DAMAGE.
+ */
+
+#pragma once
+
+#include "SoftLinking.h"
+#include <VideoToolbox/VideoToolbox.h>
+
+SOFT_LINK_FRAMEWORK_FOR_HEADER(WebCore, VideoToolbox)
+
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, VideoToolbox, VTDecompressionSessionCreate, OSStatus, (CFAllocatorRef allocator, CMVideoFormatDescriptionRef videoFormatDescription, CFDictionaryRef videoDecoderSpecification, CFDictionaryRef destinationImageBufferAttributes, const VTDecompressionOutputCallbackRecord* outputCallback, VTDecompressionSessionRef* decompressionSessionOut), (allocator, videoFormatDescription, videoDecoderSpecification, destinationImageBufferAttributes, outputCallback, decompressionSessionOut))
+#define VTDecompressionSessionCreate softLink_VideoToolbox_VTDecompressionSessionCreate
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, VideoToolbox, VTDecompressionSessionCanAcceptFormatDescription, Boolean, (VTDecompressionSessionRef session, CMFormatDescriptionRef newFormatDesc), (session, newFormatDesc))
+#define VTDecompressionSessionCanAcceptFormatDescription softLink_VideoToolbox_VTDecompressionSessionCanAcceptFormatDescription
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, VideoToolbox, VTDecompressionSessionWaitForAsynchronousFrames, OSStatus, (VTDecompressionSessionRef session), (session))
+#define VTDecompressionSessionWaitForAsynchronousFrames softLink_VideoToolbox_VTDecompressionSessionWaitForAsynchronousFrames
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, VideoToolbox, VTDecompressionSessionDecodeFrame, OSStatus, (VTDecompressionSessionRef session, CMSampleBufferRef sampleBuffer, VTDecodeFrameFlags decodeFlags, void* sourceFrameRefCon, VTDecodeInfoFlags* infoFlagsOut), (session, sampleBuffer, decodeFlags, sourceFrameRefCon, infoFlagsOut))
+#define VTDecompressionSessionDecodeFrame softLink_VideoToolbox_VTDecompressionSessionDecodeFrame
+SOFT_LINK_CONSTANT_FOR_HEADER(WebCore, VideoToolbox, kVTVideoDecoderSpecification_EnableHardwareAcceleratedVideoDecoder, CFStringRef)
+#define kVTVideoDecoderSpecification_EnableHardwareAcceleratedVideoDecoder get_VideoToolbox_kVTVideoDecoderSpecification_EnableHardwareAcceleratedVideoDecoder()
diff --git a/Source/WebCore/platform/graphics/SourceBufferPrivateClient.h b/Source/WebCore/platform/graphics/SourceBufferPrivateClient.h
index 18087d5..dd3a97e 100644
@@ -68,6 +68,7 @@ public:
     virtual bool sourceBufferPrivateHasAudio() const = 0;
     virtual bool sourceBufferPrivateHasVideo() const = 0;
 
+    virtual void sourceBufferPrivateReenqueSamples(const AtomicString& trackID) = 0;
     virtual void sourceBufferPrivateDidBecomeReadyForMoreSamples(const AtomicString& trackID) = 0;
 
     virtual MediaTime sourceBufferPrivateFastSeekTimeForMediaTime(const MediaTime& time, const MediaTime&, const MediaTime&) { return time; }
diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.h b/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.h
index 2f626c9..38fef0a 100644
@@ -41,12 +41,18 @@ OBJC_CLASS AVSampleBufferRenderSynchronizer;
 OBJC_CLASS AVStreamSession;
 
 typedef struct OpaqueCMTimebase* CMTimebaseRef;
+typedef struct __CVBuffer *CVPixelBufferRef;
+typedef struct __CVBuffer *CVOpenGLTextureRef;
 
 namespace WebCore {
 
 class CDMSessionMediaSourceAVFObjC;
-class PlatformClockCM;
 class MediaSourcePrivateAVFObjC;
+class PixelBufferConformerCV;
+class PlatformClockCM;
+class TextureCacheCV;
+class VideoTextureCopierCV;
+class WebCoreDecompressionSession;
 
 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
 class VideoFullscreenLayerManager;
@@ -64,9 +70,6 @@ public:
     static void getSupportedTypes(HashSet<String, ASCIICaseInsensitiveHash>& types);
     static MediaPlayer::SupportsType supportsType(const MediaEngineSupportParameters&);
 
-    void addDisplayLayer(AVSampleBufferDisplayLayer*);
-    void removeDisplayLayer(AVSampleBufferDisplayLayer*);
-
     void addAudioRenderer(AVSampleBufferAudioRenderer*);
     void removeAudioRenderer(AVSampleBufferAudioRenderer*);
 
@@ -91,6 +94,10 @@ public:
     void flushPendingSizeChanges();
     void characteristicsChanged();
 
+    MediaTime currentMediaTime() const override;
+    AVSampleBufferDisplayLayer* sampleBufferDisplayLayer() const { return m_sampleBufferDisplayLayer.get(); }
+    WebCoreDecompressionSession* decompressionSession() const { return m_decompressionSession.get(); }
+
 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
     void setVideoFullscreenLayer(PlatformLayer*, std::function<void()> completionHandler) override;
     void setVideoFullscreenFrame(FloatRect) override;
@@ -150,7 +157,6 @@ private:
     void setVisible(bool) override;
 
     MediaTime durationMediaTime() const override;
-    MediaTime currentMediaTime() const override;
     MediaTime startTime() const override;
     MediaTime initialTime() const override;
 
@@ -169,9 +175,13 @@ private:
 
     void setSize(const IntSize&) override;
 
+    NativeImagePtr nativeImageForCurrentTime() override;
+    bool updateLastPixelBuffer();
+    bool updateLastImage();
     void paint(GraphicsContext&, const FloatRect&) override;
     void paintCurrentFrameInContext(GraphicsContext&, const FloatRect&) override;
-
+    bool copyVideoTextureToPlatformTexture(GraphicsContext3D*, Platform3DObject, GC3Denum target, GC3Dint level, GC3Denum internalFormat, GC3Denum format, GC3Denum type, bool premultiplyAlpha, bool flipY) override;
+
     bool hasAvailableVideoFrame() const override;
 
     bool supportsAcceleratedRendering() const override;
@@ -179,6 +189,12 @@ private:
     void acceleratedRenderingStateChanged() override;
     void notifyActiveSourceBuffersChanged() override;
 
+    // NOTE: Because the only way for MSE to receive data is through an ArrayBuffer provided by
+    // JavaScript running in the page, the video will, by necessity, always be CORS-correct and
+    // in the page's origin.
+    bool hasSingleSecurityOrigin() const override { return true; }
+    bool didPassCORSAccessCheck() const override { return true; }
+
     MediaPlayer::MovieLoadType movieLoadType() const override;
 
     void prepareForRendering() override;
@@ -203,6 +219,9 @@ private:
 
     void ensureLayer();
     void destroyLayer();
+    void ensureDecompressionSession();
+    void destroyDecompressionSession();
+
     bool shouldBePlaying() const;
 
     friend class MediaSourcePrivateAVFObjC;
@@ -235,21 +254,35 @@ private:
     RetainPtr<id> m_timeJumpedObserver;
     RetainPtr<id> m_durationObserver;
     RetainPtr<AVStreamSession> m_streamSession;
+    RetainPtr<CVPixelBufferRef> m_lastPixelBuffer;
+    RetainPtr<CGImageRef> m_lastImage;
+    std::unique_ptr<PixelBufferConformerCV> m_rgbConformer;
+    RefPtr<WebCoreDecompressionSession> m_decompressionSession;
     Deque<RetainPtr<id>> m_sizeChangeObservers;
     Timer m_seekTimer;
     CDMSessionMediaSourceAVFObjC* m_session;
     MediaPlayer::NetworkState m_networkState;
     MediaPlayer::ReadyState m_readyState;
+    bool m_readyStateIsWaitingForAvailableFrame { false };
     MediaTime m_lastSeekTime;
     FloatSize m_naturalSize;
     double m_rate;
     bool m_playing;
     bool m_seeking;
-    bool m_seekCompleted;
+    enum SeekState {
+        Seeking,
+        WaitingForAvailableFame,
+        SeekCompleted,
+    };
+    SeekState m_seekCompleted { SeekCompleted };
     mutable bool m_loadingProgressed;
-    bool m_hasAvailableVideoFrame;
+    bool m_hasBeenAskedToPaintGL { false };
+    bool m_hasAvailableVideoFrame { false };
     bool m_allRenderersHaveAvailableSamples { false };
     RetainPtr<PlatformLayer> m_textTrackRepresentationLayer;
+    std::unique_ptr<TextureCacheCV> m_textureCache;
+    std::unique_ptr<VideoTextureCopierCV> m_videoTextureCopier;
+    RetainPtr<CVOpenGLTextureRef> m_lastTexture;
 #if ENABLE(WIRELESS_PLAYBACK_TARGET)
     RefPtr<MediaPlaybackTarget> m_playbackTarget;
     bool m_shouldPlayToTarget { false };
index a4c34c9..2fe8ddd 100644
 #import "CDMSessionAVStreamSession.h"
 #import "CDMSessionMediaSourceAVFObjC.h"
 #import "FileSystem.h"
+#import "GraphicsContextCG.h"
 #import "Logging.h"
 #import "MediaSourcePrivateAVFObjC.h"
 #import "MediaSourcePrivateClient.h"
 #import "MediaTimeAVFoundation.h"
+#import "PixelBufferConformerCV.h"
 #import "PlatformClockCM.h"
 #import "TextTrackRepresentation.h"
+#import "TextureCacheCV.h"
+#import "VideoTextureCopierCV.h"
+#import "WebCoreDecompressionSession.h"
 #import "WebCoreSystemInterface.h"
 #import <AVFoundation/AVAsset.h>
 #import <AVFoundation/AVTime.h>
@@ -121,7 +126,6 @@ MediaPlayerPrivateMediaSourceAVFObjC::MediaPlayerPrivateMediaSourceAVFObjC(Media
     , m_rate(1)
     , m_playing(0)
     , m_seeking(false)
-    , m_seekCompleted(true)
     , m_loadingProgressed(false)
 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
     , m_videoFullscreenLayerManager(VideoFullscreenLayerManager::create())
@@ -148,7 +152,7 @@ MediaPlayerPrivateMediaSourceAVFObjC::MediaPlayerPrivateMediaSourceAVFObjC(Media
 
             if (shouldBePlaying())
                 [m_synchronizer setRate:m_rate];
-            if (!seeking())
+            if (!seeking() && m_seekCompleted == SeekCompleted)
                 m_player->timeChanged();
         }
 
@@ -241,6 +245,10 @@ void MediaPlayerPrivateMediaSourceAVFObjC::load(const String& url, MediaSourcePr
     UNUSED_PARAM(url);
 
     m_mediaSourcePrivate = MediaSourcePrivateAVFObjC::create(this, client);
+    m_mediaSourcePrivate->setVideoLayer(m_sampleBufferDisplayLayer.get());
+    m_mediaSourcePrivate->setDecompressionSession(m_decompressionSession.get());
+
+    acceleratedRenderingStateChanged();
 }
 
 #if ENABLE(MEDIA_STREAM)
@@ -354,7 +362,7 @@ bool MediaPlayerPrivateMediaSourceAVFObjC::hasAudio() const
 
 void MediaPlayerPrivateMediaSourceAVFObjC::setVisible(bool)
 {
-    // No-op.
+    acceleratedRenderingStateChanged();
 }
 
 MediaTime MediaPlayerPrivateMediaSourceAVFObjC::durationMediaTime() const
@@ -437,15 +445,19 @@ void MediaPlayerPrivateMediaSourceAVFObjC::waitForSeekCompleted()
     if (!m_seeking)
         return;
     LOG(MediaSource, "MediaPlayerPrivateMediaSourceAVFObjC::waitForSeekCompleted(%p)", this);
-    m_seekCompleted = false;
+    m_seekCompleted = Seeking;
 }
 
 void MediaPlayerPrivateMediaSourceAVFObjC::seekCompleted()
 {
-    if (m_seekCompleted)
+    if (m_seekCompleted == SeekCompleted)
         return;
+    if (hasVideo() && !m_hasAvailableVideoFrame) {
+        m_seekCompleted = WaitingForAvailableFame;
+        return;
+    }
     LOG(MediaSource, "MediaPlayerPrivateMediaSourceAVFObjC::seekCompleted(%p)", this);
-    m_seekCompleted = true;
+    m_seekCompleted = SeekCompleted;
     if (shouldBePlaying())
         [m_synchronizer setRate:m_rate];
     if (!m_seeking)
@@ -454,7 +466,7 @@ void MediaPlayerPrivateMediaSourceAVFObjC::seekCompleted()
 
 bool MediaPlayerPrivateMediaSourceAVFObjC::seeking() const
 {
-    return m_seeking || !m_seekCompleted;
+    return m_seeking || m_seekCompleted != SeekCompleted;
 }
 
 void MediaPlayerPrivateMediaSourceAVFObjC::setRateDouble(double rate)
@@ -514,14 +526,96 @@ void MediaPlayerPrivateMediaSourceAVFObjC::setSize(const IntSize&)
     // No-op.
 }
 
-void MediaPlayerPrivateMediaSourceAVFObjC::paint(GraphicsContext&, const FloatRect&)
+NativeImagePtr MediaPlayerPrivateMediaSourceAVFObjC::nativeImageForCurrentTime()
+{
+    updateLastImage();
+    return m_lastImage.get();
+}
+
+bool MediaPlayerPrivateMediaSourceAVFObjC::updateLastPixelBuffer()
+{
+    if (m_sampleBufferDisplayLayer || !m_decompressionSession)
+        return false;
+
+    auto flags = !m_lastPixelBuffer ? WebCoreDecompressionSession::AllowLater : WebCoreDecompressionSession::ExactTime;
+    auto newPixelBuffer = m_decompressionSession->imageForTime(currentMediaTime(), flags);
+    if (!newPixelBuffer)
+        return false;
+
+    m_lastPixelBuffer = newPixelBuffer;
+    return true;
+}
+
+bool MediaPlayerPrivateMediaSourceAVFObjC::updateLastImage()
+{
+    if (!updateLastPixelBuffer())
+        return false;
+
+    ASSERT(m_lastPixelBuffer);
+
+    if (!m_rgbConformer) {
+        NSDictionary *attributes = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA) };
+        m_rgbConformer = std::make_unique<PixelBufferConformerCV>((CFDictionaryRef)attributes);
+    }
+
+    m_lastImage = m_rgbConformer->createImageFromPixelBuffer(m_lastPixelBuffer.get());
+    return true;
+}
+
+void MediaPlayerPrivateMediaSourceAVFObjC::paint(GraphicsContext& context, const FloatRect& rect)
+{
+    paintCurrentFrameInContext(context, rect);
+}
+
+void MediaPlayerPrivateMediaSourceAVFObjC::paintCurrentFrameInContext(GraphicsContext& context, const FloatRect& outputRect)
 {
-    // FIXME(125157): Implement painting.
+    if (context.paintingDisabled())
+        return;
+
+    auto image = nativeImageForCurrentTime();
+    if (!image)
+        return;
+
+    GraphicsContextStateSaver stateSaver(context);
+    FloatRect imageRect(0, 0, CGImageGetWidth(image.get()), CGImageGetHeight(image.get()));
+    context.drawNativeImage(image, imageRect.size(), outputRect, imageRect);
 }
 
-void MediaPlayerPrivateMediaSourceAVFObjC::paintCurrentFrameInContext(GraphicsContext&, const FloatRect&)
+bool MediaPlayerPrivateMediaSourceAVFObjC::copyVideoTextureToPlatformTexture(GraphicsContext3D* context, Platform3DObject outputTexture, GC3Denum outputTarget, GC3Dint level, GC3Denum internalFormat, GC3Denum format, GC3Denum type, bool premultiplyAlpha, bool flipY)
 {
-    // FIXME(125157): Implement painting.
+    if (flipY || premultiplyAlpha)
+        return false;
+
+    // We have been asked to paint into a WebGL canvas, so take that as a signal to create
+    // a decompression session, even if that means the native video cannot also be displayed
+    // in the page.
+    if (!m_hasBeenAskedToPaintGL) {
+        m_hasBeenAskedToPaintGL = true;
+        acceleratedRenderingStateChanged();
+    }
+
+    ASSERT(context);
+
+    if (updateLastPixelBuffer()) {
+        if (!m_lastPixelBuffer)
+            return false;
+
+        if (!m_textureCache) {
+            m_textureCache = TextureCacheCV::create(*context);
+            if (!m_textureCache)
+                return false;
+        }
+
+        m_lastTexture = m_textureCache->textureFromImage(m_lastPixelBuffer.get(), outputTarget, level, internalFormat, format, type);
+    }
+
+    size_t width = CVPixelBufferGetWidth(m_lastPixelBuffer.get());
+    size_t height = CVPixelBufferGetHeight(m_lastPixelBuffer.get());
+
+    if (!m_videoTextureCopier)
+        m_videoTextureCopier = std::make_unique<VideoTextureCopierCV>(*context);
+
+    return m_videoTextureCopier->copyVideoTextureToPlatformTexture(m_lastTexture.get(), width, height, outputTexture, outputTarget, level, internalFormat, format, type, premultiplyAlpha, flipY);
 }
 
 bool MediaPlayerPrivateMediaSourceAVFObjC::hasAvailableVideoFrame() const
@@ -536,10 +630,13 @@ bool MediaPlayerPrivateMediaSourceAVFObjC::supportsAcceleratedRendering() const
 
 void MediaPlayerPrivateMediaSourceAVFObjC::acceleratedRenderingStateChanged()
 {
-    if (m_player->client().mediaPlayerRenderingCanBeAccelerated(m_player))
+    if (!m_hasBeenAskedToPaintGL && m_player->visible() && m_player->client().mediaPlayerRenderingCanBeAccelerated(m_player)) {
+        destroyDecompressionSession();
         ensureLayer();
-    else
+    } else {
         destroyLayer();
+        ensureDecompressionSession();
+    }
 }
 
 void MediaPlayerPrivateMediaSourceAVFObjC::notifyActiveSourceBuffersChanged()
@@ -608,10 +705,12 @@ void MediaPlayerPrivateMediaSourceAVFObjC::ensureLayer()
 #endif
 
     [m_synchronizer addRenderer:m_sampleBufferDisplayLayer.get()];
-    
-#if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
+    if (m_mediaSourcePrivate)
+        m_mediaSourcePrivate->setVideoLayer(m_sampleBufferDisplayLayer.get());
+#if PLATFORM(IOS) || (PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE))
     m_videoFullscreenLayerManager->setVideoLayer(m_sampleBufferDisplayLayer.get(), snappedIntRect(m_player->client().mediaPlayerContentBoxRect()).size());
 #endif
+    m_player->client().mediaPlayerRenderingModeChanged(m_player);
 }
 
 void MediaPlayerPrivateMediaSourceAVFObjC::destroyLayer()
@@ -623,7 +722,42 @@ void MediaPlayerPrivateMediaSourceAVFObjC::destroyLayer()
     [m_synchronizer removeRenderer:m_sampleBufferDisplayLayer.get() atTime:currentTime withCompletionHandler:^(BOOL){
         // No-op.
     }];
+
+    if (m_mediaSourcePrivate)
+        m_mediaSourcePrivate->setVideoLayer(nullptr);
+#if PLATFORM(IOS) || (PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE))
+    m_videoFullscreenLayerManager->didDestroyVideoLayer();
+#endif
     m_sampleBufferDisplayLayer = nullptr;
+    setHasAvailableVideoFrame(false);
+    m_player->client().mediaPlayerRenderingModeChanged(m_player);
+}
+
+void MediaPlayerPrivateMediaSourceAVFObjC::ensureDecompressionSession()
+{
+    if (m_decompressionSession)
+        return;
+
+    m_decompressionSession = WebCoreDecompressionSession::create();
+    m_decompressionSession->setTimebase([m_synchronizer timebase]);
+
+    if (m_mediaSourcePrivate)
+        m_mediaSourcePrivate->setDecompressionSession(m_decompressionSession.get());
+
+    m_player->client().mediaPlayerRenderingModeChanged(m_player);
+}
+
+void MediaPlayerPrivateMediaSourceAVFObjC::destroyDecompressionSession()
+{
+    if (!m_decompressionSession)
+        return;
+
+    if (m_mediaSourcePrivate)
+        m_mediaSourcePrivate->setDecompressionSession(nullptr);
+
+    m_decompressionSession->invalidate();
+    m_decompressionSession = nullptr;
+    setHasAvailableVideoFrame(false);
 }
 
 bool MediaPlayerPrivateMediaSourceAVFObjC::shouldBePlaying() const
@@ -637,6 +771,18 @@ void MediaPlayerPrivateMediaSourceAVFObjC::setHasAvailableVideoFrame(bool flag)
         return;
     m_hasAvailableVideoFrame = flag;
     updateAllRenderersHaveAvailableSamples();
+
+    if (!m_hasAvailableVideoFrame)
+        return;
+
+    m_player->firstVideoFrameAvailable();
+    if (m_seekCompleted == WaitingForAvailableFame)
+        seekCompleted();
+
+    if (m_readyStateIsWaitingForAvailableFrame) {
+        m_readyStateIsWaitingForAvailableFrame = false;
+        m_player->readyStateChanged();
+    }
 }
 
 void MediaPlayerPrivateMediaSourceAVFObjC::setHasAvailableAudioSample(AVSampleBufferAudioRenderer* renderer, bool flag)
@@ -657,7 +803,7 @@ void MediaPlayerPrivateMediaSourceAVFObjC::updateAllRenderersHaveAvailableSample
     bool allRenderersHaveAvailableSamples = true;
 
     do {
-        if (m_sampleBufferDisplayLayer && !m_hasAvailableVideoFrame) {
+        if (hasVideo() && !m_hasAvailableVideoFrame) {
             allRenderersHaveAvailableSamples = false;
             break;
         }
@@ -814,6 +960,11 @@ void MediaPlayerPrivateMediaSourceAVFObjC::setReadyState(MediaPlayer::ReadyState
     else
         [m_synchronizer setRate:0];
 
+    if (m_readyState >= MediaPlayerEnums::HaveCurrentData && hasVideo() && !m_hasAvailableVideoFrame) {
+        m_readyStateIsWaitingForAvailableFrame = true;
+        return;
+    }
+
     m_player->readyStateChanged();
 }
 
@@ -826,42 +977,6 @@ void MediaPlayerPrivateMediaSourceAVFObjC::setNetworkState(MediaPlayer::NetworkS
     m_player->networkStateChanged();
 }
 
-void MediaPlayerPrivateMediaSourceAVFObjC::addDisplayLayer(AVSampleBufferDisplayLayer* displayLayer)
-{
-    ASSERT(displayLayer);
-    if (displayLayer == m_sampleBufferDisplayLayer)
-        return;
-
-    m_sampleBufferDisplayLayer = displayLayer;
-    [m_synchronizer addRenderer:m_sampleBufferDisplayLayer.get()];
-    m_player->client().mediaPlayerRenderingModeChanged(m_player);
-
-    // FIXME: move this somewhere appropriate:
-    m_player->firstVideoFrameAvailable();
-
-#if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
-    m_videoFullscreenLayerManager->setVideoLayer(m_sampleBufferDisplayLayer.get(), snappedIntRect(m_player->client().mediaPlayerContentBoxRect()).size());
-#endif
-}
-
-void MediaPlayerPrivateMediaSourceAVFObjC::removeDisplayLayer(AVSampleBufferDisplayLayer* displayLayer)
-{
-    if (displayLayer != m_sampleBufferDisplayLayer)
-        return;
-
-    CMTime currentTime = CMTimebaseGetTime([m_synchronizer timebase]);
-    [m_synchronizer removeRenderer:m_sampleBufferDisplayLayer.get() atTime:currentTime withCompletionHandler:^(BOOL){
-        // No-op.
-    }];
-
-    m_sampleBufferDisplayLayer = nullptr;
-    m_player->client().mediaPlayerRenderingModeChanged(m_player);
-
-#if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
-    m_videoFullscreenLayerManager->didDestroyVideoLayer();
-#endif
-}
-
 void MediaPlayerPrivateMediaSourceAVFObjC::addAudioRenderer(AVSampleBufferAudioRenderer* audioRenderer)
 {
     if (m_sampleBufferAudioRendererMap.contains(audioRenderer))
@@ -894,6 +1009,7 @@ void MediaPlayerPrivateMediaSourceAVFObjC::removeAudioRenderer(AVSampleBufferAud
 
 void MediaPlayerPrivateMediaSourceAVFObjC::characteristicsChanged()
 {
+    updateAllRenderersHaveAvailableSamples();
     m_player->characteristicChanged();
 }
 
index 56020e1..348abb0 100644
@@ -35,6 +35,7 @@
 #include <wtf/Vector.h>
 
 OBJC_CLASS AVAsset;
+OBJC_CLASS AVSampleBufferDisplayLayer;
 OBJC_CLASS AVStreamDataParser;
 OBJC_CLASS NSError;
 OBJC_CLASS NSObject;
@@ -47,6 +48,7 @@ class MediaPlayerPrivateMediaSourceAVFObjC;
 class MediaSourcePrivateClient;
 class SourceBufferPrivateAVFObjC;
 class TimeRanges;
+class WebCoreDecompressionSession;
 
 class MediaSourcePrivateAVFObjC final : public MediaSourcePrivate {
 public:
@@ -71,12 +73,17 @@ public:
 
     bool hasAudio() const;
     bool hasVideo() const;
+    bool hasSelectedVideo() const;
 
     void willSeek();
     void seekToTime(const MediaTime&);
     MediaTime fastSeekTimeForMediaTime(const MediaTime&, const MediaTime& negativeThreshold, const MediaTime& positiveThreshold);
     FloatSize naturalSize() const;
 
+    void hasSelectedVideoChanged(SourceBufferPrivateAVFObjC&);
+    void setVideoLayer(AVSampleBufferDisplayLayer*);
+    void setDecompressionSession(WebCoreDecompressionSession*);
+
 private:
     MediaSourcePrivateAVFObjC(MediaPlayerPrivateMediaSourceAVFObjC*, MediaSourcePrivateClient*);
 
@@ -88,6 +95,8 @@ private:
     void monitorSourceBuffers();
     void removeSourceBuffer(SourceBufferPrivate*);
 
+    void setSourceBufferWithSelectedVideo(SourceBufferPrivateAVFObjC*);
+
     friend class SourceBufferPrivateAVFObjC;
 
     MediaPlayerPrivateMediaSourceAVFObjC* m_player;
@@ -95,6 +104,7 @@ private:
     Vector<RefPtr<SourceBufferPrivateAVFObjC>> m_sourceBuffers;
     Vector<SourceBufferPrivateAVFObjC*> m_activeSourceBuffers;
     Deque<SourceBufferPrivateAVFObjC*> m_sourceBuffersNeedingSessions;
+    SourceBufferPrivateAVFObjC* m_sourceBufferWithSelectedVideo { nullptr };
     bool m_isEnded;
 };
 
index ae0e76b..5b209bf 100644
@@ -174,14 +174,18 @@ bool MediaSourcePrivateAVFObjC::hasAudio() const
     return std::any_of(m_activeSourceBuffers.begin(), m_activeSourceBuffers.end(), MediaSourcePrivateAVFObjCHasAudio);
 }
 
-static bool MediaSourcePrivateAVFObjCHasVideo(SourceBufferPrivateAVFObjC* sourceBuffer)
+bool MediaSourcePrivateAVFObjC::hasVideo() const
 {
-    return sourceBuffer->hasVideo();
+    return std::any_of(m_activeSourceBuffers.begin(), m_activeSourceBuffers.end(), [] (SourceBufferPrivateAVFObjC* sourceBuffer) {
+        return sourceBuffer->hasVideo();
+    });
 }
 
-bool MediaSourcePrivateAVFObjC::hasVideo() const
+bool MediaSourcePrivateAVFObjC::hasSelectedVideo() const
 {
-    return std::any_of(m_activeSourceBuffers.begin(), m_activeSourceBuffers.end(), MediaSourcePrivateAVFObjCHasVideo);
+    return std::any_of(m_activeSourceBuffers.begin(), m_activeSourceBuffers.end(), [] (SourceBufferPrivateAVFObjC* sourceBuffer) {
+        return sourceBuffer->hasSelectedVideo();
+    });
 }
 
 void MediaSourcePrivateAVFObjC::willSeek()
@@ -218,6 +222,42 @@ FloatSize MediaSourcePrivateAVFObjC::naturalSize() const
     return result;
 }
 
+void MediaSourcePrivateAVFObjC::hasSelectedVideoChanged(SourceBufferPrivateAVFObjC& sourceBuffer)
+{
+    bool hasSelectedVideo = sourceBuffer.hasSelectedVideo();
+    if (m_sourceBufferWithSelectedVideo == &sourceBuffer && !hasSelectedVideo)
+        setSourceBufferWithSelectedVideo(nullptr);
+    else if (m_sourceBufferWithSelectedVideo != &sourceBuffer && hasSelectedVideo)
+        setSourceBufferWithSelectedVideo(&sourceBuffer);
+}
+
+void MediaSourcePrivateAVFObjC::setVideoLayer(AVSampleBufferDisplayLayer* layer)
+{
+    if (m_sourceBufferWithSelectedVideo)
+        m_sourceBufferWithSelectedVideo->setVideoLayer(layer);
+}
+
+void MediaSourcePrivateAVFObjC::setDecompressionSession(WebCoreDecompressionSession* decompressionSession)
+{
+    if (m_sourceBufferWithSelectedVideo)
+        m_sourceBufferWithSelectedVideo->setDecompressionSession(decompressionSession);
+}
+
+void MediaSourcePrivateAVFObjC::setSourceBufferWithSelectedVideo(SourceBufferPrivateAVFObjC* sourceBuffer)
+{
+    if (m_sourceBufferWithSelectedVideo) {
+        m_sourceBufferWithSelectedVideo->setVideoLayer(nullptr);
+        m_sourceBufferWithSelectedVideo->setDecompressionSession(nullptr);
+    }
+
+    m_sourceBufferWithSelectedVideo = sourceBuffer;
+
+    if (m_sourceBufferWithSelectedVideo) {
+        m_sourceBufferWithSelectedVideo->setVideoLayer(m_player->sampleBufferDisplayLayer());
+        m_sourceBufferWithSelectedVideo->setDecompressionSession(m_player->decompressionSession());
+    }
+}
+
 }
 
 #endif // ENABLE(MEDIA_SOURCE) && USE(AVFOUNDATION)
index 3e5bc81..a0a017e 100644
@@ -62,6 +62,7 @@ class AudioTrackPrivate;
 class VideoTrackPrivate;
 class AudioTrackPrivateMediaSourceAVFObjC;
 class VideoTrackPrivateMediaSourceAVFObjC;
+class WebCoreDecompressionSession;
 
 class SourceBufferPrivateAVFObjCErrorClient {
 public:
@@ -88,6 +89,7 @@ public:
     bool processCodedFrame(int trackID, CMSampleBufferRef, const String& mediaType);
 
     bool hasVideo() const;
+    bool hasSelectedVideo() const;
     bool hasAudio() const;
 
     void trackDidChangeEnabled(VideoTrackPrivateMediaSourceAVFObjC*);
@@ -109,6 +111,9 @@ public:
     void layerDidReceiveError(AVSampleBufferDisplayLayer *, NSError *);
     void rendererDidReceiveError(AVSampleBufferAudioRenderer *, NSError *);
 
+    void setVideoLayer(AVSampleBufferDisplayLayer*);
+    void setDecompressionSession(WebCoreDecompressionSession*);
+
 private:
     explicit SourceBufferPrivateAVFObjC(MediaSourcePrivateAVFObjC*);
 
@@ -152,6 +157,7 @@ private:
     RetainPtr<NSError> m_hdcpError;
     OSObjectPtr<dispatch_semaphore_t> m_hasSessionSemaphore;
     OSObjectPtr<dispatch_group_t> m_isAppendingGroup;
+    RefPtr<WebCoreDecompressionSession> m_decompressionSession;
 
     MediaSourcePrivateAVFObjC* m_mediaSource;
     SourceBufferPrivateClient* m_client { nullptr };
index 0bde7b4..55d26f6 100644
 #if ENABLE(MEDIA_SOURCE) && USE(AVFOUNDATION)
 
 #import "AVFoundationSPI.h"
+#import "AudioTrackPrivateMediaSourceAVFObjC.h"
 #import "CDMSessionAVContentKeySession.h"
 #import "CDMSessionMediaSourceAVFObjC.h"
+#import "InbandTextTrackPrivateAVFObjC.h"
 #import "Logging.h"
 #import "MediaDescription.h"
 #import "MediaPlayerPrivateMediaSourceAVFObjC.h"
 #import "SoftLinking.h"
 #import "SourceBufferPrivateClient.h"
 #import "TimeRanges.h"
-#import "AudioTrackPrivateMediaSourceAVFObjC.h"
 #import "VideoTrackPrivateMediaSourceAVFObjC.h"
-#import "InbandTextTrackPrivateAVFObjC.h"
+#import "WebCoreDecompressionSession.h"
 #import <AVFoundation/AVAssetTrack.h>
 #import <QuartzCore/CALayer.h>
+#import <map>
 #import <objc/runtime.h>
 #import <runtime/TypedArrayInlines.h>
-#import <wtf/text/AtomicString.h>
-#import <wtf/text/CString.h>
 #import <wtf/BlockObjCExceptions.h>
 #import <wtf/HashCountedSet.h>
 #import <wtf/MainThread.h>
 #import <wtf/WeakPtr.h>
-#import <map>
+#import <wtf/text/AtomicString.h>
+#import <wtf/text/CString.h>
 
 #pragma mark - Soft Linking
 
@@ -649,14 +650,11 @@ void SourceBufferPrivateAVFObjC::destroyParser()
 
 void SourceBufferPrivateAVFObjC::destroyRenderers()
 {
-    if (m_displayLayer) {
-        if (m_mediaSource)
-            m_mediaSource->player()->removeDisplayLayer(m_displayLayer.get());
-        [m_displayLayer flush];
-        [m_displayLayer stopRequestingMediaData];
-        [m_errorListener stopObservingLayer:m_displayLayer.get()];
-        m_displayLayer = nullptr;
-    }
+    if (m_displayLayer)
+        setVideoLayer(nullptr);
+
+    if (m_decompressionSession)
+        setDecompressionSession(nullptr);
 
     for (auto& renderer : m_audioRenderers.values()) {
         if (m_mediaSource)
@@ -694,6 +692,11 @@ bool SourceBufferPrivateAVFObjC::hasVideo() const
     return m_client && m_client->sourceBufferPrivateHasVideo();
 }
 
+bool SourceBufferPrivateAVFObjC::hasSelectedVideo() const
+{
+    return m_enabledVideoTrackID != -1;
+}
+
 bool SourceBufferPrivateAVFObjC::hasAudio() const
 {
     return m_client && m_client->sourceBufferPrivateHasAudio();
@@ -705,24 +708,21 @@ void SourceBufferPrivateAVFObjC::trackDidChangeEnabled(VideoTrackPrivateMediaSou
     if (!track->selected() && m_enabledVideoTrackID == trackID) {
         m_enabledVideoTrackID = -1;
         [m_parser setShouldProvideMediaData:NO forTrackID:trackID];
-        if (m_mediaSource)
-            m_mediaSource->player()->removeDisplayLayer(m_displayLayer.get());
+
+        if (m_decompressionSession)
+            m_decompressionSession->stopRequestingMediaData();
     } else if (track->selected()) {
         m_enabledVideoTrackID = trackID;
         [m_parser setShouldProvideMediaData:YES forTrackID:trackID];
-        if (!m_displayLayer) {
-            m_displayLayer = adoptNS([allocAVSampleBufferDisplayLayerInstance() init]);
-#ifndef NDEBUG
-            [m_displayLayer setName:@"SourceBufferPrivateAVFObjC AVSampleBufferDisplayLayer"];
-#endif
-            [m_displayLayer requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:^{
+
+        if (m_decompressionSession) {
+            m_decompressionSession->requestMediaDataWhenReady([this, trackID] {
                 didBecomeReadyForMoreSamples(trackID);
-            }];
-            [m_errorListener beginObservingLayer:m_displayLayer.get()];
+            });
         }
-        if (m_mediaSource)
-            m_mediaSource->player()->addDisplayLayer(m_displayLayer.get());
     }
+
+    m_mediaSource->hasSelectedVideoChanged(*this);
 }
 
 void SourceBufferPrivateAVFObjC::trackDidChangeEnabled(AudioTrackPrivateMediaSourceAVFObjC* track)
@@ -791,6 +791,14 @@ void SourceBufferPrivateAVFObjC::flush()
     if (m_displayLayer)
         [m_displayLayer flushAndRemoveImage];
 
+    if (m_decompressionSession) {
+        m_decompressionSession->flush();
+        m_decompressionSession->notifyWhenHasAvailableVideoFrame([weakThis = createWeakPtr()] {
+            if (weakThis && weakThis->m_mediaSource)
+                weakThis->m_mediaSource->player()->setHasAvailableVideoFrame(true);
+        });
+    }
+
     for (auto& renderer : m_audioRenderers.values())
         [renderer flush];
 }
@@ -850,9 +858,16 @@ void SourceBufferPrivateAVFObjC::flush(const AtomicString& trackIDString)
     int trackID = trackIDString.toInt();
     LOG(MediaSource, "SourceBufferPrivateAVFObjC::flush(%p) - trackId: %d", this, trackID);
 
-    if (trackID == m_enabledVideoTrackID)
+    if (trackID == m_enabledVideoTrackID) {
         flush(m_displayLayer.get());
-    else if (m_audioRenderers.contains(trackID))
+        if (m_decompressionSession) {
+            m_decompressionSession->flush();
+            m_decompressionSession->notifyWhenHasAvailableVideoFrame([weakThis = createWeakPtr()] {
+                if (weakThis && weakThis->m_mediaSource)
+                    weakThis->m_mediaSource->player()->setHasAvailableVideoFrame(true);
+            });
+        }
+    } else if (m_audioRenderers.contains(trackID))
         flush(m_audioRenderers.get(trackID).get());
 }
 
@@ -902,9 +917,14 @@ void SourceBufferPrivateAVFObjC::enqueueSample(Ref<MediaSample>&& sample, const
             }
         }
 
-        [m_displayLayer enqueueSampleBuffer:platformSample.sample.cmSampleBuffer];
-        if (m_mediaSource)
-            m_mediaSource->player()->setHasAvailableVideoFrame(!sample->isNonDisplaying());
+        if (m_decompressionSession)
+            m_decompressionSession->enqueueSample(platformSample.sample.cmSampleBuffer);
+
+        if (m_displayLayer) {
+            [m_displayLayer enqueueSampleBuffer:platformSample.sample.cmSampleBuffer];
+            if (m_mediaSource)
+                m_mediaSource->player()->setHasAvailableVideoFrame(!sample->isNonDisplaying());
+        }
     } else {
         auto renderer = m_audioRenderers.get(trackID);
         [renderer enqueueSampleBuffer:platformSample.sample.cmSampleBuffer];
@@ -917,8 +937,9 @@ bool SourceBufferPrivateAVFObjC::isReadyForMoreSamples(const AtomicString& track
 {
     int trackID = trackIDString.toInt();
     if (trackID == m_enabledVideoTrackID)
-        return [m_displayLayer isReadyForMoreMediaData];
-    else if (m_audioRenderers.contains(trackID))
+        return !m_decompressionSession || m_decompressionSession->isReadyForMoreMediaData();
+
+    if (m_audioRenderers.contains(trackID))
         return [m_audioRenderers.get(trackID) isReadyForMoreMediaData];
 
     return false;
@@ -959,9 +980,12 @@ FloatSize SourceBufferPrivateAVFObjC::naturalSize()
 
 void SourceBufferPrivateAVFObjC::didBecomeReadyForMoreSamples(int trackID)
 {
-    if (trackID == m_enabledVideoTrackID)
+    LOG(Media, "SourceBufferPrivateAVFObjC::didBecomeReadyForMoreSamples(%p) - track(%d)", this, trackID);
+    if (trackID == m_enabledVideoTrackID) {
+        if (m_decompressionSession)
+            m_decompressionSession->stopRequestingMediaData();
         [m_displayLayer stopRequestingMediaData];
-    else if (m_audioRenderers.contains(trackID))
+    } else if (m_audioRenderers.contains(trackID))
         [m_audioRenderers.get(trackID) stopRequestingMediaData];
     else {
         ASSERT_NOT_REACHED();
@@ -976,16 +1000,76 @@ void SourceBufferPrivateAVFObjC::notifyClientWhenReadyForMoreSamples(const Atomi
 {
     int trackID = trackIDString.toInt();
     if (trackID == m_enabledVideoTrackID) {
-        [m_displayLayer requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:^{
-            didBecomeReadyForMoreSamples(trackID);
-        }];
+        if (m_decompressionSession) {
+            m_decompressionSession->requestMediaDataWhenReady([this, trackID] {
+                didBecomeReadyForMoreSamples(trackID);
+            });
+        }
+        if (m_displayLayer) {
+            [m_displayLayer requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:^ {
+                didBecomeReadyForMoreSamples(trackID);
+            }];
+        }
     } else if (m_audioRenderers.contains(trackID)) {
-        [m_audioRenderers.get(trackID) requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:^{
+        [m_audioRenderers.get(trackID) requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:^ {
             didBecomeReadyForMoreSamples(trackID);
         }];
     }
 }
 
+void SourceBufferPrivateAVFObjC::setVideoLayer(AVSampleBufferDisplayLayer* layer)
+{
+    if (layer == m_displayLayer)
+        return;
+
+    ASSERT(!layer || !m_decompressionSession || hasSelectedVideo());
+
+    if (m_displayLayer) {
+        [m_displayLayer flush];
+        [m_displayLayer stopRequestingMediaData];
+        [m_errorListener stopObservingLayer:m_displayLayer.get()];
+    }
+
+    m_displayLayer = layer;
+
+    if (m_displayLayer) {
+        [m_displayLayer requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:^ {
+            didBecomeReadyForMoreSamples(m_enabledVideoTrackID);
+        }];
+        [m_errorListener beginObservingLayer:m_displayLayer.get()];
+        if (m_client)
+            m_client->sourceBufferPrivateReenqueSamples(AtomicString::number(m_enabledVideoTrackID));
+    }
+}
+
+void SourceBufferPrivateAVFObjC::setDecompressionSession(WebCoreDecompressionSession* decompressionSession)
+{
+    if (m_decompressionSession == decompressionSession)
+        return;
+
+    if (m_decompressionSession) {
+        m_decompressionSession->stopRequestingMediaData();
+        m_decompressionSession->invalidate();
+    }
+
+    m_decompressionSession = decompressionSession;
+
+    if (!m_decompressionSession)
+        return;
+
+    WeakPtr<SourceBufferPrivateAVFObjC> weakThis = createWeakPtr();
+    m_decompressionSession->requestMediaDataWhenReady([weakThis] {
+        if (weakThis)
+            weakThis->didBecomeReadyForMoreSamples(weakThis->m_enabledVideoTrackID);
+    });
+    m_decompressionSession->notifyWhenHasAvailableVideoFrame([weakThis = createWeakPtr()] {
+        if (weakThis && weakThis->m_mediaSource)
+            weakThis->m_mediaSource->player()->setHasAvailableVideoFrame(true);
+    });
+    if (m_client)
+        m_client->sourceBufferPrivateReenqueSamples(AtomicString::number(m_enabledVideoTrackID));
+}
+
 }
 
 #endif // ENABLE(MEDIA_SOURCE) && USE(AVFOUNDATION)
diff --git a/Source/WebCore/platform/graphics/cocoa/WebCoreDecompressionSession.h b/Source/WebCore/platform/graphics/cocoa/WebCoreDecompressionSession.h
new file mode 100644
index 0000000..dcc1d18
--- /dev/null
@@ -0,0 +1,115 @@
+/*
+ * Copyright (C) 2017 Apple Inc. All rights reserved.
+ *
+ * Redistribution and use in source and binary forms, with or without
+ * modification, are permitted provided that the following conditions
+ * are met:
+ * 1. Redistributions of source code must retain the above copyright
+ *    notice, this list of conditions and the following disclaimer.
+ * 2. Redistributions in binary form must reproduce the above copyright
+ *    notice, this list of conditions and the following disclaimer in the
+ *    documentation and/or other materials provided with the distribution.
+ *
+ * THIS SOFTWARE IS PROVIDED BY APPLE INC. AND ITS CONTRIBUTORS ``AS IS''
+ * AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
+ * THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
+ * PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL APPLE INC. OR ITS CONTRIBUTORS
+ * BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
+ * CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
+ * SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
+ * INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
+ * CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
+ * ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF
+ * THE POSSIBILITY OF SUCH DAMAGE.
+ */
+
+#pragma once
+
+#if USE(VIDEOTOOLBOX)
+
+#include <CoreMedia/CMTime.h>
+#include <wtf/Lock.h>
+#include <wtf/MediaTime.h>
+#include <wtf/OSObjectPtr.h>
+#include <wtf/Ref.h>
+#include <wtf/RetainPtr.h>
+#include <wtf/ThreadSafeRefCounted.h>
+#include <wtf/WeakPtr.h>
+
+typedef CFTypeRef CMBufferRef;
+typedef struct opaqueCMBufferQueue *CMBufferQueueRef;
+typedef struct opaqueCMBufferQueueTriggerToken *CMBufferQueueTriggerToken;
+typedef struct opaqueCMSampleBuffer *CMSampleBufferRef;
+typedef struct OpaqueCMTimebase* CMTimebaseRef;
+typedef signed long CMItemCount;
+typedef struct __CVBuffer *CVPixelBufferRef;
+typedef struct __CVBuffer *CVImageBufferRef;
+typedef UInt32 VTDecodeInfoFlags;
+typedef struct OpaqueVTDecompressionSession*  VTDecompressionSessionRef;
+
+namespace WebCore {
+
+class WebCoreDecompressionSession : public ThreadSafeRefCounted<WebCoreDecompressionSession> {
+public:
+    static Ref<WebCoreDecompressionSession> create() { return adoptRef(*new WebCoreDecompressionSession()); }
+
+    void invalidate();
+    bool isInvalidated() const { return m_invalidated; }
+
+    void enqueueSample(CMSampleBufferRef, bool displaying = true);
+    bool isReadyForMoreMediaData() const;
+    void requestMediaDataWhenReady(std::function<void()>);
+    void stopRequestingMediaData();
+    void notifyWhenHasAvailableVideoFrame(std::function<void()>);
+
+    void setTimebase(CMTimebaseRef);
+    CMTimebaseRef timebase() const { return m_timebase.get(); }
+
+    enum ImageForTimeFlags { ExactTime, AllowEarlier, AllowLater };
+    RetainPtr<CVPixelBufferRef> imageForTime(const MediaTime&, ImageForTimeFlags = ExactTime);
+    void flush();
+
+private:
+    WebCoreDecompressionSession();
+
+    void decodeSample(CMSampleBufferRef, bool displaying);
+    void enqueueDecodedSample(CMSampleBufferRef, bool displaying);
+    WeakPtr<WebCoreDecompressionSession> createWeakPtr() { return m_weakFactory.createWeakPtr(); }
+    void handleDecompressionOutput(bool displaying, OSStatus, VTDecodeInfoFlags, CVImageBufferRef, CMTime presentationTimeStamp, CMTime presentationDuration);
+    RetainPtr<CVPixelBufferRef> getFirstVideoFrame();
+    void resetAutomaticDequeueTimer();
+    void automaticDequeue();
+
+    static void decompressionOutputCallback(void* decompressionOutputRefCon, void* sourceFrameRefCon, OSStatus, VTDecodeInfoFlags, CVImageBufferRef, CMTime presentationTimeStamp, CMTime presentationDuration);
+    static CMTime getDecodeTime(CMBufferRef, void* refcon);
+    static CMTime getPresentationTime(CMBufferRef, void* refcon);
+    static CMTime getDuration(CMBufferRef, void* refcon);
+    static CFComparisonResult compareBuffers(CMBufferRef buf1, CMBufferRef buf2, void* refcon);
+    static void maybeBecomeReadyForMoreMediaDataCallback(void* refcon, CMBufferQueueTriggerToken);
+    void maybeBecomeReadyForMoreMediaData();
+
+    static const CMItemCount kMaximumCapacity = 120;
+    static const CMItemCount kHighWaterMark = 60;
+    static const CMItemCount kLowWaterMark = 15;
+
+    RetainPtr<VTDecompressionSessionRef> m_decompressionSession;
+    RetainPtr<CMBufferQueueRef> m_producerQueue;
+    RetainPtr<CMBufferQueueRef> m_consumerQueue;
+    RetainPtr<CMTimebaseRef> m_timebase;
+    OSObjectPtr<dispatch_queue_t> m_decompressionQueue;
+    OSObjectPtr<dispatch_queue_t> m_enqueingQueue;
+    OSObjectPtr<dispatch_semaphore_t> m_hasAvailableImageSemaphore;
+    OSObjectPtr<dispatch_source_t> m_timerSource;
+    std::function<void()> m_notificationCallback;
+    std::function<void()> m_hasAvailableFrameCallback;
+    WeakPtrFactory<WebCoreDecompressionSession> m_weakFactory;
+    CMBufferQueueTriggerToken m_didBecomeReadyTrigger { nullptr };
+
+    bool m_invalidated { false };
+    int m_framesBeingDecoded { 0 };
+};
+
+}
+
+#endif
diff --git a/Source/WebCore/platform/graphics/cocoa/WebCoreDecompressionSession.mm b/Source/WebCore/platform/graphics/cocoa/WebCoreDecompressionSession.mm
new file mode 100644 (file)
index 0000000..46a75bf
--- /dev/null
@@ -0,0 +1,435 @@
+/*
+ * Copyright (C) 2017 Apple Inc. All rights reserved.
+ *
+ * Redistribution and use in source and binary forms, with or without
+ * modification, are permitted provided that the following conditions
+ * are met:
+ * 1. Redistributions of source code must retain the above copyright
+ *    notice, this list of conditions and the following disclaimer.
+ * 2. Redistributions in binary form must reproduce the above copyright
+ *    notice, this list of conditions and the following disclaimer in the
+ *    documentation and/or other materials provided with the distribution.
+ *
+ * THIS SOFTWARE IS PROVIDED BY APPLE INC. AND ITS CONTRIBUTORS ``AS IS''
+ * AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
+ * THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
+ * PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL APPLE INC. OR ITS CONTRIBUTORS
+ * BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
+ * CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
+ * SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
+ * INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
+ * CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
+ * ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF
+ * THE POSSIBILITY OF SUCH DAMAGE.
+ */
+
+#import "config.h"
+#import "WebCoreDecompressionSession.h"
+
+#if USE(VIDEOTOOLBOX)
+
+#import "Logging.h"
+#import "MediaTimeAVFoundation.h"
+#import "PixelBufferConformerCV.h"
+#import <CoreMedia/CMBufferQueue.h>
+#import <CoreMedia/CMFormatDescription.h>
+#import <wtf/MainThread.h>
+#import <wtf/MediaTime.h>
+#import <wtf/StringPrintStream.h>
+#import <wtf/Vector.h>
+
+#import "CoreMediaSoftLink.h"
+#import "CoreVideoSoftLink.h"
+#import "VideoToolboxSoftLink.h"
+
+namespace WebCore {
+
+WebCoreDecompressionSession::WebCoreDecompressionSession()
+    : m_decompressionQueue(adoptOSObject(dispatch_queue_create("WebCoreDecompressionSession Decompression Queue", DISPATCH_QUEUE_SERIAL)))
+    , m_enqueingQueue(adoptOSObject(dispatch_queue_create("WebCoreDecompressionSession Enqueueing Queue", DISPATCH_QUEUE_SERIAL)))
+    , m_hasAvailableImageSemaphore(adoptOSObject(dispatch_semaphore_create(0)))
+    , m_weakFactory(this)
+{
+}
+
+void WebCoreDecompressionSession::invalidate()
+{
+    m_invalidated = true;
+    m_notificationCallback = nullptr;
+    m_hasAvailableFrameCallback = nullptr;
+    setTimebase(nullptr);
+    if (m_timerSource)
+        dispatch_source_cancel(m_timerSource.get());
+}
+
+void WebCoreDecompressionSession::setTimebase(CMTimebaseRef timebase)
+{
+    if (m_timebase == timebase)
+        return;
+
+    if (m_timebase)
+        CMTimebaseRemoveTimerDispatchSource(m_timebase.get(), m_timerSource.get());
+
+    m_timebase = timebase;
+
+    if (m_timebase) {
+        if (!m_timerSource) {
+            m_timerSource = adoptOSObject(dispatch_source_create(DISPATCH_SOURCE_TYPE_TIMER, 0, 0, dispatch_get_main_queue()));
+            dispatch_source_set_event_handler(m_timerSource.get(), [this] {
+                automaticDequeue();
+            });
+#if (PLATFORM(MAC) && __MAC_OS_X_VERSION_MIN_REQUIRED >= 101200) || (PLATFORM(IOS) && __IPHONE_OS_VERSION_MIN_REQUIRED >= 100000)
+            dispatch_activate(m_timerSource.get());
+#endif
+        }
+        CMTimebaseAddTimerDispatchSource(m_timebase.get(), m_timerSource.get());
+    }
+}
+
+void WebCoreDecompressionSession::maybeBecomeReadyForMoreMediaDataCallback(void* refcon, CMBufferQueueTriggerToken)
+{
+    WebCoreDecompressionSession* session = static_cast<WebCoreDecompressionSession*>(refcon);
+    session->maybeBecomeReadyForMoreMediaData();
+}
+
+void WebCoreDecompressionSession::maybeBecomeReadyForMoreMediaData()
+{
+    LOG(Media, "WebCoreDecompressionSession::maybeBecomeReadyForMoreMediaData(%p) - isReadyForMoreMediaData(%d), hasCallback(%d)", this, isReadyForMoreMediaData(), !!m_notificationCallback);
+    if (!isReadyForMoreMediaData() || !m_notificationCallback)
+        return;
+
+    if (isMainThread()) {
+        m_notificationCallback();
+        return;
+    }
+
+    RefPtr<WebCoreDecompressionSession> strongThis { this };
+    dispatch_async(dispatch_get_main_queue(), [strongThis] {
+        if (strongThis->m_notificationCallback)
+            strongThis->m_notificationCallback();
+    });
+}
+
+void WebCoreDecompressionSession::enqueueSample(CMSampleBufferRef sampleBuffer, bool displaying)
+{
+    CMItemCount itemCount = 0;
+    if (noErr != CMSampleBufferGetSampleTimingInfoArray(sampleBuffer, 0, nullptr, &itemCount))
+        return;
+
+    Vector<CMSampleTimingInfo> timingInfoArray;
+    timingInfoArray.grow(itemCount);
+    if (noErr != CMSampleBufferGetSampleTimingInfoArray(sampleBuffer, itemCount, timingInfoArray.data(), nullptr))
+        return;
+
+    if (!m_decompressionQueue)
+        m_decompressionQueue = adoptOSObject(dispatch_queue_create("WebCoreDecompressionSession Decompression Queue", DISPATCH_QUEUE_SERIAL));
+
+    if (!m_producerQueue) {
+        CMBufferQueueRef outQueue { nullptr };
+        CMBufferCallbacks callbacks {
+            0,
+            nullptr,
+            &getDecodeTime,
+            &getPresentationTime,
+            &getDuration,
+            nullptr,
+            &compareBuffers,
+            nullptr,
+            nullptr,
+        };
+        CMBufferQueueCreate(kCFAllocatorDefault, kMaximumCapacity, &callbacks, &outQueue);
+        m_producerQueue = adoptCF(outQueue);
+
+        CMBufferQueueInstallTriggerWithIntegerThreshold(m_producerQueue.get(), maybeBecomeReadyForMoreMediaDataCallback, this, kCMBufferQueueTrigger_WhenBufferCountBecomesLessThan, kLowWaterMark, &m_didBecomeReadyTrigger);
+    }
+
+    if (!m_consumerQueue) {
+        CMBufferQueueRef outQueue { nullptr };
+        CMBufferCallbacks callbacks {
+            0,
+            nullptr,
+            &getDecodeTime,
+            &getPresentationTime,
+            &getDuration,
+            nullptr,
+            &compareBuffers,
+            nullptr,
+            nullptr,
+        };
+        CMBufferQueueCreate(kCFAllocatorDefault, kMaximumCapacity, &callbacks, &outQueue);
+        m_consumerQueue = adoptCF(outQueue);
+    }
+
+    ++m_framesBeingDecoded;
+
+    LOG(Media, "WebCoreDecompressionSession::enqueueSample(%p) - framesBeingDecoded(%d)", this, m_framesBeingDecoded);
+
+    dispatch_async(m_decompressionQueue.get(), [strongThis = makeRefPtr(*this), strongBuffer = retainPtr(sampleBuffer), displaying] {
+        strongThis->decodeSample(strongBuffer.get(), displaying);
+    });
+}
+
+void WebCoreDecompressionSession::decodeSample(CMSampleBufferRef sample, bool displaying)
+{
+    if (isInvalidated())
+        return;
+
+    CMVideoFormatDescriptionRef videoFormatDescription = CMSampleBufferGetFormatDescription(sample);
+    if (m_decompressionSession && !VTDecompressionSessionCanAcceptFormatDescription(m_decompressionSession.get(), videoFormatDescription)) {
+        VTDecompressionSessionWaitForAsynchronousFrames(m_decompressionSession.get());
+        m_decompressionSession = nullptr;
+    }
+
+    if (!m_decompressionSession) {
+        NSDictionary* videoDecoderSpecification = @{ (NSString *)kVTVideoDecoderSpecification_EnableHardwareAcceleratedVideoDecoder: @YES };
+#if PLATFORM(IOS)
+        NSDictionary* attributes = @{(NSString *)kCVPixelBufferIOSurfaceOpenGLESFBOCompatibilityKey: @YES};
+#else
+        NSDictionary* attributes = @{(NSString *)kCVPixelBufferIOSurfaceOpenGLFBOCompatibilityKey: @YES};
+#endif
+        VTDecompressionSessionRef decompressionSessionOut = nullptr;
+        VTDecompressionOutputCallbackRecord callback {
+            &decompressionOutputCallback,
+            this,
+        };
+        if (noErr == VTDecompressionSessionCreate(kCFAllocatorDefault, videoFormatDescription, (CFDictionaryRef)videoDecoderSpecification, (CFDictionaryRef)attributes, &callback, &decompressionSessionOut))
+            m_decompressionSession = adoptCF(decompressionSessionOut);
+    }
+
+    if (!m_decompressionSession)
+        return;
+
+    VTDecodeInfoFlags flags { kVTDecodeFrame_EnableTemporalProcessing };
+    if (!displaying)
+        flags |= kVTDecodeFrame_DoNotOutputFrame;
+
+    VTDecompressionSessionDecodeFrame(m_decompressionSession.get(), sample, flags, reinterpret_cast<void*>(displaying), nullptr);
+}
+
+void WebCoreDecompressionSession::decompressionOutputCallback(void* decompressionOutputRefCon, void* sourceFrameRefCon, OSStatus status, VTDecodeInfoFlags infoFlags, CVImageBufferRef imageBuffer, CMTime presentationTimeStamp, CMTime presentationDuration)
+{
+    WebCoreDecompressionSession* session = static_cast<WebCoreDecompressionSession*>(decompressionOutputRefCon);
+    bool displaying = !!sourceFrameRefCon;
+    session->handleDecompressionOutput(displaying, status, infoFlags, imageBuffer, presentationTimeStamp, presentationDuration);
+}
+
+void WebCoreDecompressionSession::handleDecompressionOutput(bool displaying, OSStatus status, VTDecodeInfoFlags infoFlags, CVImageBufferRef rawImageBuffer, CMTime presentationTimeStamp, CMTime presentationDuration)
+{
+    if (status != noErr || !rawImageBuffer) {
+        // No image was produced (decode error, or the frame was decoded with
+        // kVTDecodeFrame_DoNotOutputFrame); still balance the in-flight frame count.
+        RefPtr<WebCoreDecompressionSession> strongThis { this };
+        dispatch_async(m_enqueingQueue.get(), [strongThis] {
+            strongThis->enqueueDecodedSample(nullptr, false);
+        });
+        return;
+    }
+
+    CMVideoFormatDescriptionRef rawImageBufferDescription = nullptr;
+    if (noErr != CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, rawImageBuffer, &rawImageBufferDescription))
+        return;
+    RetainPtr<CMVideoFormatDescriptionRef> imageBufferDescription = adoptCF(rawImageBufferDescription);
+
+    CMSampleTimingInfo imageBufferTiming {
+        presentationDuration,
+        presentationTimeStamp,
+        presentationTimeStamp,
+    };
+
+    CMSampleBufferRef rawImageSampleBuffer = nullptr;
+    if (noErr != CMSampleBufferCreateReadyWithImageBuffer(kCFAllocatorDefault, rawImageBuffer, imageBufferDescription.get(), &imageBufferTiming, &rawImageSampleBuffer))
+        return;
+    RefPtr<WebCoreDecompressionSession> strongThis { this };
+    RetainPtr<CMSampleBufferRef> imageSampleBuffer = adoptCF(rawImageSampleBuffer);
+    dispatch_async(m_enqueingQueue.get(), [strongThis, status, imageSampleBuffer, infoFlags, displaying] {
+        UNUSED_PARAM(infoFlags);
+        strongThis->enqueueDecodedSample(imageSampleBuffer.get(), displaying);
+    });
+}
+
+RetainPtr<CVPixelBufferRef> WebCoreDecompressionSession::getFirstVideoFrame()
+{
+    if (!m_producerQueue || CMBufferQueueIsEmpty(m_producerQueue.get()))
+        return nullptr;
+
+    RetainPtr<CMSampleBufferRef> currentSample = adoptCF((CMSampleBufferRef)CMBufferQueueDequeueAndRetain(m_producerQueue.get()));
+    RetainPtr<CVPixelBufferRef> imageBuffer = (CVPixelBufferRef)CMSampleBufferGetImageBuffer(currentSample.get());
+    ASSERT(CFGetTypeID(imageBuffer.get()) == CVPixelBufferGetTypeID());
+    return imageBuffer;
+}
+
+void WebCoreDecompressionSession::automaticDequeue()
+{
+    if (!m_timebase || !m_producerQueue)
+        return;
+
+    auto time = toMediaTime(CMTimebaseGetTime(m_timebase.get()));
+    LOG(Media, "WebCoreDecompressionSession::automaticDequeue(%p) - purging all samples before time(%s)", this, toString(time).utf8().data());
+
+    while (CMSampleBufferRef firstSample = (CMSampleBufferRef)CMBufferQueueGetHead(m_producerQueue.get())) {
+        MediaTime presentationTimestamp = toMediaTime(CMSampleBufferGetPresentationTimeStamp(firstSample));
+        MediaTime duration = toMediaTime(CMSampleBufferGetDuration(firstSample));
+        MediaTime presentationEndTimestamp = presentationTimestamp + duration;
+        if (time > presentationEndTimestamp) {
+            CFRelease(CMBufferQueueDequeueAndRetain(m_producerQueue.get()));
+            continue;
+        }
+
+#if !LOG_DISABLED
+        auto begin = toMediaTime(CMBufferQueueGetFirstPresentationTimeStamp(m_producerQueue.get()));
+        auto end = toMediaTime(CMBufferQueueGetEndPresentationTimeStamp(m_producerQueue.get()));
+        LOG(Media, "WebCoreDecompressionSession::automaticDequeue(%p) - queue(%s -> %s)", this, toString(begin).utf8().data(), toString(end).utf8().data());
+#endif
+        CMTimebaseSetTimerDispatchSourceNextFireTime(m_timebase.get(), m_timerSource.get(), toCMTime(presentationEndTimestamp), 0);
+        return;
+    }
+
+    LOG(Media, "WebCoreDecompressionSession::automaticDequeue(%p) - time(%s), queue empty", this, toString(time).utf8().data());
+    CMTimebaseSetTimerDispatchSourceNextFireTime(m_timebase.get(), m_timerSource.get(), kCMTimePositiveInfinity, 0);
+}
+
+void WebCoreDecompressionSession::enqueueDecodedSample(CMSampleBufferRef sample, bool displaying)
+{
+    if (isInvalidated())
+        return;
+
+    --m_framesBeingDecoded;
+
+    if (!displaying) {
+        maybeBecomeReadyForMoreMediaData();
+        return;
+    }
+
+    CMBufferQueueEnqueue(m_producerQueue.get(), sample);
+
+#if !LOG_DISABLED
+    auto begin = toMediaTime(CMBufferQueueGetFirstPresentationTimeStamp(m_producerQueue.get()));
+    auto end = toMediaTime(CMBufferQueueGetEndPresentationTimeStamp(m_producerQueue.get()));
+    auto presentationTime = toMediaTime(CMSampleBufferGetPresentationTimeStamp(sample));
+    LOG(Media, "WebCoreDecompressionSession::enqueueDecodedSample(%p) - presentationTime(%s), framesBeingDecoded(%d), producerQueue(%s -> %s)", this, toString(presentationTime).utf8().data(), m_framesBeingDecoded, toString(begin).utf8().data(), toString(end).utf8().data());
+#endif
+
+    if (m_timebase)
+        CMTimebaseSetTimerDispatchSourceToFireImmediately(m_timebase.get(), m_timerSource.get());
+
+    if (m_hasAvailableFrameCallback) {
+        std::function<void()> callback { m_hasAvailableFrameCallback };
+        m_hasAvailableFrameCallback = nullptr;
+        RefPtr<WebCoreDecompressionSession> strongThis { this };
+        dispatch_async(dispatch_get_main_queue(), [strongThis, callback] {
+            callback();
+        });
+    }
+}
+
+bool WebCoreDecompressionSession::isReadyForMoreMediaData() const
+{
+    CMItemCount producerCount = m_producerQueue ? CMBufferQueueGetBufferCount(m_producerQueue.get()) : 0;
+    return m_framesBeingDecoded + producerCount <= kHighWaterMark;
+}
+
+void WebCoreDecompressionSession::requestMediaDataWhenReady(std::function<void()> notificationCallback)
+{
+    LOG(Media, "WebCoreDecompressionSession::requestMediaDataWhenReady(%p), hasNotificationCallback(%d)", this, !!notificationCallback);
+    m_notificationCallback = notificationCallback;
+
+    if (notificationCallback && isReadyForMoreMediaData()) {
+        RefPtr<WebCoreDecompressionSession> strongThis { this };
+        dispatch_async(dispatch_get_main_queue(), [strongThis] {
+            if (strongThis->m_notificationCallback)
+                strongThis->m_notificationCallback();
+        });
+    }
+}
+
+void WebCoreDecompressionSession::stopRequestingMediaData()
+{
+    LOG(Media, "WebCoreDecompressionSession::stopRequestingMediaData(%p)", this);
+    m_notificationCallback = nullptr;
+}
+
+void WebCoreDecompressionSession::notifyWhenHasAvailableVideoFrame(std::function<void()> callback)
+{
+    if (callback && m_producerQueue && !CMBufferQueueIsEmpty(m_producerQueue.get())) {
+        dispatch_async(dispatch_get_main_queue(), [callback] {
+            callback();
+        });
+        return;
+    }
+    m_hasAvailableFrameCallback = callback;
+}
+
+RetainPtr<CVPixelBufferRef> WebCoreDecompressionSession::imageForTime(const MediaTime& time, ImageForTimeFlags flags)
+{
+    if (!m_producerQueue || CMBufferQueueIsEmpty(m_producerQueue.get())) {
+        LOG(Media, "WebCoreDecompressionSession::imageForTime(%p) - time(%s), queue empty", this, toString(time).utf8().data());
+        return nullptr;
+    }
+
+    bool allowEarlier = flags == WebCoreDecompressionSession::AllowEarlier;
+    bool allowLater = flags == WebCoreDecompressionSession::AllowLater;
+
+    MediaTime startTime = toMediaTime(CMBufferQueueGetFirstPresentationTimeStamp(m_producerQueue.get()));
+    MediaTime endTime = toMediaTime(CMBufferQueueGetEndPresentationTimeStamp(m_producerQueue.get()));
+    if (!allowLater && time < startTime) {
+        LOG(Media, "WebCoreDecompressionSession::imageForTime(%p) - time(%s) too early for queue(%s -> %s)", this, toString(time).utf8().data(), toString(startTime).utf8().data(), toString(endTime).utf8().data());
+        return nullptr;
+    }
+
+    while (CMSampleBufferRef firstSample = (CMSampleBufferRef)CMBufferQueueGetHead(m_producerQueue.get())) {
+        MediaTime presentationTimestamp = toMediaTime(CMSampleBufferGetPresentationTimeStamp(firstSample));
+        MediaTime duration = toMediaTime(CMSampleBufferGetDuration(firstSample));
+        MediaTime presentationEndTimestamp = presentationTimestamp + duration;
+        if (!allowLater && presentationTimestamp > time)
+            return nullptr;
+        if (!allowEarlier && presentationEndTimestamp < time) {
+            CFRelease(CMBufferQueueDequeueAndRetain(m_producerQueue.get()));
+            continue;
+        }
+
+        RetainPtr<CMSampleBufferRef> currentSample = adoptCF((CMSampleBufferRef)CMBufferQueueDequeueAndRetain(m_producerQueue.get()));
+        RetainPtr<CVPixelBufferRef> imageBuffer = (CVPixelBufferRef)CMSampleBufferGetImageBuffer(currentSample.get());
+        ASSERT(CFGetTypeID(imageBuffer.get()) == CVPixelBufferGetTypeID());
+
+        if (m_timebase)
+            CMTimebaseSetTimerDispatchSourceToFireImmediately(m_timebase.get(), m_timerSource.get());
+
+        LOG(Media, "WebCoreDecompressionSession::imageForTime(%p) - found sample for time(%s) in queue(%s -> %s)", this, toString(time).utf8().data(), toString(startTime).utf8().data(), toString(endTime).utf8().data());
+        return imageBuffer;
+    }
+
+    if (m_timebase)
+        CMTimebaseSetTimerDispatchSourceToFireImmediately(m_timebase.get(), m_timerSource.get());
+
+    LOG(Media, "WebCoreDecompressionSession::imageForTime(%p) - no matching sample for time(%s) in queue(%s -> %s)", this, toString(time).utf8().data(), toString(startTime).utf8().data(), toString(endTime).utf8().data());
+    return nullptr;
+}
+
+void WebCoreDecompressionSession::flush()
+{
+    dispatch_sync(m_decompressionQueue.get(), [strongThis = RefPtr<WebCoreDecompressionSession>(this)] {
+        CMBufferQueueReset(strongThis->m_producerQueue.get());
+        dispatch_sync(strongThis->m_enqueingQueue.get(), [strongThis] {
+            CMBufferQueueReset(strongThis->m_consumerQueue.get());
+        });
+    });
+}
+
+CMTime WebCoreDecompressionSession::getDecodeTime(CMBufferRef buf, void*)
+{
+    ASSERT(CFGetTypeID(buf) == CMSampleBufferGetTypeID());
+    CMSampleBufferRef sample = (CMSampleBufferRef)(buf);
+    return CMSampleBufferGetDecodeTimeStamp(sample);
+}
+
+CMTime WebCoreDecompressionSession::getPresentationTime(CMBufferRef buf, void*)
+{
+    ASSERT(CFGetTypeID(buf) == CMSampleBufferGetTypeID());
+    CMSampleBufferRef sample = (CMSampleBufferRef)(buf);
+    return CMSampleBufferGetPresentationTimeStamp(sample);
+}
+
+CMTime WebCoreDecompressionSession::getDuration(CMBufferRef buf, void*)
+{
+    ASSERT(CFGetTypeID(buf) == CMSampleBufferGetTypeID());
+    CMSampleBufferRef sample = (CMSampleBufferRef)(buf);
+    return CMSampleBufferGetDuration(sample);
+}
+
+CFComparisonResult WebCoreDecompressionSession::compareBuffers(CMBufferRef buf1, CMBufferRef buf2, void* refcon)
+{
+    return (CFComparisonResult)CMTimeCompare(getPresentationTime(buf1, refcon), getPresentationTime(buf2, refcon));
+}
+
+}
+
+#endif