Introduce an abstract SampleBufferDisplayLayer
author youenn@apple.com <youenn@apple.com@268f45cc-cd09-0410-ab3c-d52691b4dbfc>
Wed, 15 Jan 2020 14:24:12 +0000 (14:24 +0000)
committer youenn@apple.com <youenn@apple.com@268f45cc-cd09-0410-ab3c-d52691b4dbfc>
Wed, 15 Jan 2020 14:24:12 +0000 (14:24 +0000)
https://bugs.webkit.org/show_bug.cgi?id=206066

Reviewed by Eric Carlson.

Move the use of display layers in MediaPlayerPrivateMediaStreamAVFObjC into a new class, LocalSampleBufferDisplayLayer,
which implements an interface named SampleBufferDisplayLayer.
A future patch will implement this interface by IPC to the GPUProcess.
Both the layers and the handling of the sample queue move to LocalSampleBufferDisplayLayer.

Unlike before, we no longer call enqueueVideoSample again when a sample was queued for later use in the display layer.
Instead, we render the sample directly, which should not change much since this is a realtime track and, in the future, the buffer will live in the GPUProcess anyway.
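
For context, here is a simplified sketch of the new abstraction and of how the player now drives it; it is trimmed from the full declarations added below in SampleBufferDisplayLayer.h and LocalSampleBufferDisplayLayer.h. Note that LocalSampleBufferDisplayLayer also owns the pending sample queue, so once the underlying AVSampleBufferDisplayLayer becomes ready it re-enqueues pending samples itself instead of going back through the player's enqueueVideoSample.

    // Abstract display layer: the player only talks to this interface, so a
    // GPUProcess-backed implementation can later be substituted for the local one.
    class SampleBufferDisplayLayer {
    public:
        class Client : public CanMakeWeakPtr<Client> {
        public:
            virtual ~Client() = default;
            virtual void sampleBufferDisplayLayerStatusDidChange(SampleBufferDisplayLayer&) = 0;
            virtual void sampleBufferDisplayLayerBoundsDidChange(SampleBufferDisplayLayer&) = 0;
            virtual WTF::MediaTime streamTime() const = 0;
        };

        virtual ~SampleBufferDisplayLayer() = default;
        virtual void enqueueSample(MediaSample&) = 0;
        virtual void clearEnqueuedSamples() = 0;
        virtual PlatformLayer* rootLayer() = 0;
        // Bounds, transform, flush and display-mode methods are elided here.
    };

    // MediaPlayerPrivateMediaStreamAVFObjC implements the Client interface and
    // holds the layer through the abstraction:
    m_sampleBufferDisplayLayer = LocalSampleBufferDisplayLayer::create(*this, hideRootLayer(), size);
    if (m_sampleBufferDisplayLayer)
        m_sampleBufferDisplayLayer->enqueueSample(sample);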

* SourcesCocoa.txt:
* WebCore.xcodeproj/project.pbxproj:
* platform/graphics/avfoundation/SampleBufferDisplayLayer.h: Added.
(WebCore::SampleBufferDisplayLayer::SampleBufferDisplayLayer):
* platform/graphics/avfoundation/objc/LocalSampleBufferDisplayLayer.h: Added.
* platform/graphics/avfoundation/objc/LocalSampleBufferDisplayLayer.mm: Added.
(-[WebAVSampleBufferStatusChangeListener initWithParent:]):
(-[WebAVSampleBufferStatusChangeListener dealloc]):
(-[WebAVSampleBufferStatusChangeListener invalidate]):
(-[WebAVSampleBufferStatusChangeListener beginObservingLayers]):
(-[WebAVSampleBufferStatusChangeListener stopObservingLayers]):
(-[WebAVSampleBufferStatusChangeListener observeValueForKeyPath:ofObject:change:context:]):
(WebCore::runWithoutAnimations):
(WebCore::LocalSampleBufferDisplayLayer::LocalSampleBufferDisplayLayer):
(WebCore::LocalSampleBufferDisplayLayer::~LocalSampleBufferDisplayLayer):
(WebCore::LocalSampleBufferDisplayLayer::layerStatusDidChange):
(WebCore::LocalSampleBufferDisplayLayer::layerErrorDidChange):
(WebCore::LocalSampleBufferDisplayLayer::rootLayerBoundsDidChange):
(WebCore::LocalSampleBufferDisplayLayer::displayLayer):
(WebCore::LocalSampleBufferDisplayLayer::rootLayer):
(WebCore::LocalSampleBufferDisplayLayer::didFail const):
(WebCore::LocalSampleBufferDisplayLayer::updateDisplayMode):
(WebCore::LocalSampleBufferDisplayLayer::bounds const):
(WebCore::LocalSampleBufferDisplayLayer::updateAffineTransform):
(WebCore::LocalSampleBufferDisplayLayer::updateBoundsAndPosition):
(WebCore::LocalSampleBufferDisplayLayer::ensureLayers):
(WebCore::LocalSampleBufferDisplayLayer::flush):
(WebCore::LocalSampleBufferDisplayLayer::flushAndRemoveImage):
(WebCore::LocalSampleBufferDisplayLayer::enqueueSample):
(WebCore::LocalSampleBufferDisplayLayer::removeOldSamplesFromPendingQueue):
(WebCore::LocalSampleBufferDisplayLayer::addSampleToPendingQueue):
(WebCore::LocalSampleBufferDisplayLayer::clearEnqueuedSamples):
(WebCore::LocalSampleBufferDisplayLayer::requestNotificationWhenReadyForVideoData):
* platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h:
* platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm:
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::MediaPlayerPrivateMediaStreamAVFObjC):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::~MediaPlayerPrivateMediaStreamAVFObjC):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::enqueueCorrectedVideoSample):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::sampleBufferDisplayLayerStatusDidChange):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::applicationDidBecomeActive):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::flushRenderers):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::ensureLayers):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyLayers):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::platformLayer const):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::displayLayer):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateDisplayMode):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::checkSelectedVideoTrack):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::setBufferingPolicy):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateDisplayLayer):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::sampleBufferDisplayLayerBoundsDidChange):

git-svn-id: https://svn.webkit.org/repository/webkit/trunk@254564 268f45cc-cd09-0410-ab3c-d52691b4dbfc

Source/WebCore/ChangeLog
Source/WebCore/SourcesCocoa.txt
Source/WebCore/WebCore.xcodeproj/project.pbxproj
Source/WebCore/platform/graphics/avfoundation/SampleBufferDisplayLayer.h [new file with mode: 0644]
Source/WebCore/platform/graphics/avfoundation/objc/LocalSampleBufferDisplayLayer.h [new file with mode: 0644]
Source/WebCore/platform/graphics/avfoundation/objc/LocalSampleBufferDisplayLayer.mm [new file with mode: 0644]
Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h
Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm

diff --git a/Source/WebCore/ChangeLog b/Source/WebCore/ChangeLog
index 79ad674..10baa59 100644
@@ -1,5 +1,71 @@
 2020-01-15  youenn fablet  <youenn@apple.com>
 
+        Introduce an abstract SampleBufferDisplayLayer
+        https://bugs.webkit.org/show_bug.cgi?id=206066
+
+        Reviewed by Eric Carlson.
+
+        Move the use of display layers in MediaPlayerPrivateMediaStreamAVFObjC into a new class, LocalSampleBufferDisplayLayer,
+        which implements an interface named SampleBufferDisplayLayer.
+        A future patch will implement this interface by IPC to the GPUProcess.
+        Both the layers and the handling of the sample queue move to LocalSampleBufferDisplayLayer.
+
+        Unlike before, we no longer call enqueueVideoSample again when a sample was queued for later use in the display layer.
+        Instead, we render the sample directly, which should not change much since this is a realtime track and, in the future, the buffer will live in the GPUProcess anyway.
+
+        * SourcesCocoa.txt:
+        * WebCore.xcodeproj/project.pbxproj:
+        * platform/graphics/avfoundation/SampleBufferDisplayLayer.h: Added.
+        (WebCore::SampleBufferDisplayLayer::SampleBufferDisplayLayer):
+        * platform/graphics/avfoundation/objc/LocalSampleBufferDisplayLayer.h: Added.
+        * platform/graphics/avfoundation/objc/LocalSampleBufferDisplayLayer.mm: Added.
+        (-[WebAVSampleBufferStatusChangeListener initWithParent:]):
+        (-[WebAVSampleBufferStatusChangeListener dealloc]):
+        (-[WebAVSampleBufferStatusChangeListener invalidate]):
+        (-[WebAVSampleBufferStatusChangeListener beginObservingLayers]):
+        (-[WebAVSampleBufferStatusChangeListener stopObservingLayers]):
+        (-[WebAVSampleBufferStatusChangeListener observeValueForKeyPath:ofObject:change:context:]):
+        (WebCore::runWithoutAnimations):
+        (WebCore::LocalSampleBufferDisplayLayer::LocalSampleBufferDisplayLayer):
+        (WebCore::LocalSampleBufferDisplayLayer::~LocalSampleBufferDisplayLayer):
+        (WebCore::LocalSampleBufferDisplayLayer::layerStatusDidChange):
+        (WebCore::LocalSampleBufferDisplayLayer::layerErrorDidChange):
+        (WebCore::LocalSampleBufferDisplayLayer::rootLayerBoundsDidChange):
+        (WebCore::LocalSampleBufferDisplayLayer::displayLayer):
+        (WebCore::LocalSampleBufferDisplayLayer::rootLayer):
+        (WebCore::LocalSampleBufferDisplayLayer::didFail const):
+        (WebCore::LocalSampleBufferDisplayLayer::updateDisplayMode):
+        (WebCore::LocalSampleBufferDisplayLayer::bounds const):
+        (WebCore::LocalSampleBufferDisplayLayer::updateAffineTransform):
+        (WebCore::LocalSampleBufferDisplayLayer::updateBoundsAndPosition):
+        (WebCore::LocalSampleBufferDisplayLayer::ensureLayers):
+        (WebCore::LocalSampleBufferDisplayLayer::flush):
+        (WebCore::LocalSampleBufferDisplayLayer::flushAndRemoveImage):
+        (WebCore::LocalSampleBufferDisplayLayer::enqueueSample):
+        (WebCore::LocalSampleBufferDisplayLayer::removeOldSamplesFromPendingQueue):
+        (WebCore::LocalSampleBufferDisplayLayer::addSampleToPendingQueue):
+        (WebCore::LocalSampleBufferDisplayLayer::clearEnqueuedSamples):
+        (WebCore::LocalSampleBufferDisplayLayer::requestNotificationWhenReadyForVideoData):
+        * platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h:
+        * platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm:
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::MediaPlayerPrivateMediaStreamAVFObjC):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::~MediaPlayerPrivateMediaStreamAVFObjC):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::enqueueCorrectedVideoSample):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::sampleBufferDisplayLayerStatusDidChange):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::applicationDidBecomeActive):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::flushRenderers):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::ensureLayers):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyLayers):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::platformLayer const):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::displayLayer):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateDisplayMode):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::checkSelectedVideoTrack):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::setBufferingPolicy):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateDisplayLayer):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::sampleBufferDisplayLayerBoundsDidChange):
+
+2020-01-15  youenn fablet  <youenn@apple.com>
+
         Add support for MediaStream audio track rendering in GPUProcess
         https://bugs.webkit.org/show_bug.cgi?id=206175
 
diff --git a/Source/WebCore/SourcesCocoa.txt b/Source/WebCore/SourcesCocoa.txt
index e6f817a..8906f57 100644
@@ -253,6 +253,7 @@ platform/graphics/avfoundation/objc/CDMSessionMediaSourceAVFObjC.mm @no-unify
 platform/graphics/avfoundation/objc/ImageDecoderAVFObjC.mm @no-unify
 platform/graphics/avfoundation/objc/InbandTextTrackPrivateAVFObjC.mm @no-unify
 platform/graphics/avfoundation/objc/InbandTextTrackPrivateLegacyAVFObjC.mm @no-unify
+platform/graphics/avfoundation/objc/LocalSampleBufferDisplayLayer.mm @no-unify
 platform/graphics/avfoundation/objc/MediaPlaybackTargetPickerMac.mm @no-unify
 platform/graphics/avfoundation/objc/MediaPlayerPrivateAVFoundationObjC.mm @no-unify
 platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.mm @no-unify
diff --git a/Source/WebCore/WebCore.xcodeproj/project.pbxproj b/Source/WebCore/WebCore.xcodeproj/project.pbxproj
index fc1b0fc..cc35a58 100644
                413C2C341BC29A8F0075204C /* JSDOMConstructor.h in Headers */ = {isa = PBXBuildFile; fileRef = 413C2C331BC29A7B0075204C /* JSDOMConstructor.h */; };
                413CCD4A20DE034F0065A21A /* MockMediaDevice.h in Headers */ = {isa = PBXBuildFile; fileRef = 413CCD4820DE013C0065A21A /* MockMediaDevice.h */; settings = {ATTRIBUTES = (Private, ); }; };
                413E00791DB0E4F2002341D2 /* MemoryRelease.h in Headers */ = {isa = PBXBuildFile; fileRef = 413E00781DB0E4DE002341D2 /* MemoryRelease.h */; settings = {ATTRIBUTES = (Private, ); }; };
+               414598C223C8D177002B9CC8 /* LocalSampleBufferDisplayLayer.mm in Sources */ = {isa = PBXBuildFile; fileRef = 414598C123C8AD79002B9CC8 /* LocalSampleBufferDisplayLayer.mm */; };
                414B82051D6DF0E50077EBE3 /* StructuredClone.h in Headers */ = {isa = PBXBuildFile; fileRef = 414B82031D6DF0D90077EBE3 /* StructuredClone.h */; };
                414DEDE71F9FE91E0047C40D /* EmptyFrameLoaderClient.h in Headers */ = {isa = PBXBuildFile; fileRef = 414DEDE51F9FE9150047C40D /* EmptyFrameLoaderClient.h */; settings = {ATTRIBUTES = (Private, ); }; };
                415071581685067300C3C7B3 /* SelectorFilter.h in Headers */ = {isa = PBXBuildFile; fileRef = 415071561685067300C3C7B3 /* SelectorFilter.h */; };
                41F062140F5F192600A07EAC /* InspectorDatabaseResource.h in Headers */ = {isa = PBXBuildFile; fileRef = 41F062120F5F192600A07EAC /* InspectorDatabaseResource.h */; };
                41F1D21F0EF35C2A00DA8753 /* ScriptCachedFrameData.h in Headers */ = {isa = PBXBuildFile; fileRef = 41F1D21D0EF35C2A00DA8753 /* ScriptCachedFrameData.h */; settings = {ATTRIBUTES = (Private, ); }; };
                41FABD2D1F4DFE4A006A6C97 /* DOMCacheEngine.h in Headers */ = {isa = PBXBuildFile; fileRef = 41FABD2B1F4DFE42006A6C97 /* DOMCacheEngine.h */; settings = {ATTRIBUTES = (Private, ); }; };
+               41FCD6B923CE015500C62567 /* SampleBufferDisplayLayer.h in Headers */ = {isa = PBXBuildFile; fileRef = 414598BE23C8AAB8002B9CC8 /* SampleBufferDisplayLayer.h */; settings = {ATTRIBUTES = (Private, ); }; };
+               41FCD6BB23CE027700C62567 /* LocalSampleBufferDisplayLayer.h in Headers */ = {isa = PBXBuildFile; fileRef = 414598C023C8AD78002B9CC8 /* LocalSampleBufferDisplayLayer.h */; settings = {ATTRIBUTES = (Private, ); }; };
                427DA71D13735DFA007C57FB /* JSServiceWorkerInternals.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 427DA71B13735DFA007C57FB /* JSServiceWorkerInternals.cpp */; };
                427DA71E13735DFA007C57FB /* JSServiceWorkerInternals.h in Headers */ = {isa = PBXBuildFile; fileRef = 427DA71C13735DFA007C57FB /* JSServiceWorkerInternals.h */; };
                43107BE218CC19DE00CC18E8 /* SelectorPseudoTypeMap.h in Headers */ = {isa = PBXBuildFile; fileRef = 43107BE118CC19DE00CC18E8 /* SelectorPseudoTypeMap.h */; };
                413E00781DB0E4DE002341D2 /* MemoryRelease.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = MemoryRelease.h; sourceTree = "<group>"; };
                413E007B1DB0E707002341D2 /* MemoryReleaseCocoa.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = MemoryReleaseCocoa.mm; sourceTree = "<group>"; };
                413FC4CD1FD1DD8C00541C4B /* ServiceWorkerClientQueryOptions.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = ServiceWorkerClientQueryOptions.h; sourceTree = "<group>"; };
+               414598BE23C8AAB8002B9CC8 /* SampleBufferDisplayLayer.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = SampleBufferDisplayLayer.h; sourceTree = "<group>"; };
+               414598C023C8AD78002B9CC8 /* LocalSampleBufferDisplayLayer.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = LocalSampleBufferDisplayLayer.h; sourceTree = "<group>"; };
+               414598C123C8AD79002B9CC8 /* LocalSampleBufferDisplayLayer.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = LocalSampleBufferDisplayLayer.mm; sourceTree = "<group>"; };
                4147E2B21C88337F00A7E715 /* FetchBodyOwner.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = FetchBodyOwner.h; sourceTree = "<group>"; };
                4147E2B31C89912600A7E715 /* FetchBodyOwner.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = FetchBodyOwner.cpp; sourceTree = "<group>"; };
                4147E2B41C89912600A7E715 /* FetchLoader.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = FetchLoader.cpp; sourceTree = "<group>"; };
                                076F0D0A12B8192700C26AA4 /* MediaPlayerPrivateAVFoundation.h */,
                                CDBEAEAB19D92B6C00BEBA88 /* MediaSelectionGroupAVFObjC.h */,
                                CDBEAEAA19D92B6C00BEBA88 /* MediaSelectionGroupAVFObjC.mm */,
+                               414598BE23C8AAB8002B9CC8 /* SampleBufferDisplayLayer.h */,
                                CD336F6317FA0A4D00DDDCD0 /* VideoTrackPrivateAVF.h */,
                                0709D7901AE5557E004E42F8 /* WebMediaSessionManagerMac.cpp */,
                                0709D7911AE5557E004E42F8 /* WebMediaSessionManagerMac.h */,
                                07AA6B6A166D019500D45671 /* InbandTextTrackPrivateAVFObjC.mm */,
                                07367DDD172CA67F00D861B9 /* InbandTextTrackPrivateLegacyAVFObjC.h */,
                                07367DDE172CA67F00D861B9 /* InbandTextTrackPrivateLegacyAVFObjC.mm */,
+                               414598C023C8AD78002B9CC8 /* LocalSampleBufferDisplayLayer.h */,
+                               414598C123C8AD79002B9CC8 /* LocalSampleBufferDisplayLayer.mm */,
                                078E43DB1ABB6F6F001C2FA6 /* MediaPlaybackTargetPickerMac.h */,
                                078E43DC1ABB6F6F001C2FA6 /* MediaPlaybackTargetPickerMac.mm */,
                                DF9AFD7013FC31D80015FEB7 /* MediaPlayerPrivateAVFoundationObjC.h */,
                                7633A72613D8B33A008501B6 /* LocaleToScriptMapping.h in Headers */,
                                A516E8B7136E04DB0076C3C0 /* LocalizedDateCache.h in Headers */,
                                935207BE09BD410A00F2038D /* LocalizedStrings.h in Headers */,
+                               41FCD6BB23CE027700C62567 /* LocalSampleBufferDisplayLayer.h in Headers */,
                                BCE1C41B0D982980003B02F2 /* Location.h in Headers */,
                                6B4D412D23983F88002494C2 /* LoggedInStatus.h in Headers */,
                                A8239E0109B3CF8A00B60641 /* Logging.h in Headers */,
                                293EAE1F1356B2FE0067ACF9 /* RuntimeApplicationChecks.h in Headers */,
                                7C52229E1E1DAE47002CB8F7 /* RuntimeEnabledFeatures.h in Headers */,
                                CE7A6C28208537E200FA2B46 /* SameSiteInfo.h in Headers */,
+                               41FCD6B923CE015500C62567 /* SampleBufferDisplayLayer.h in Headers */,
                                CDD7089718359F6F002B3DC6 /* SampleMap.h in Headers */,
                                49E911CB0EF86D47009D0CAF /* ScaleTransformOperation.h in Headers */,
                                5DFE8F570D16477C0076E937 /* ScheduledAction.h in Headers */,
                                41D28D0D2139E05800F4206F /* LibWebRTCStatsCollector.cpp in Sources */,
                                4186BD3E213EE3400001826F /* LibWebRTCUtils.cpp in Sources */,
                                9759E93E14EF1CF80026A2DD /* LoadableTextTrack.cpp in Sources */,
+                               414598C223C8D177002B9CC8 /* LocalSampleBufferDisplayLayer.mm in Sources */,
                                FABE72FE1059C21100D999DD /* MathMLNames.cpp in Sources */,
                                2D9BF7051DBFBB24007A7D99 /* MediaEncryptedEvent.cpp in Sources */,
                                2D9BF7471DBFDC49007A7D99 /* MediaKeyMessageEvent.cpp in Sources */,
diff --git a/Source/WebCore/platform/graphics/avfoundation/SampleBufferDisplayLayer.h b/Source/WebCore/platform/graphics/avfoundation/SampleBufferDisplayLayer.h
new file mode 100644
index 0000000..4ff57d7
--- /dev/null
@@ -0,0 +1,77 @@
+/*
+ * Copyright (C) 2020 Apple Inc. All rights reserved.
+ *
+ * Redistribution and use in source and binary forms, with or without
+ * modification, are permitted provided that the following conditions
+ * are met:
+ * 1. Redistributions of source code must retain the above copyright
+ *    notice, this list of conditions and the following disclaimer.
+ * 2. Redistributions in binary form must reproduce the above copyright
+ *    notice, this list of conditions and the following disclaimer in the
+ *    documentation and/or other materials provided with the distribution.
+ *
+ * THIS SOFTWARE IS PROVIDED BY APPLE INC. ``AS IS'' AND ANY
+ * EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
+ * IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
+ * PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL APPLE INC. OR
+ * CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
+ * EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
+ * PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
+ * PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY
+ * OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+ * (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+ * OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+ */
+
+#pragma once
+
+#include "PlatformLayer.h"
+#include <wtf/WeakPtr.h>
+
+namespace WTF {
+class MediaTime;
+}
+
+namespace WebCore {
+class IntSize;
+class MediaSample;
+
+class SampleBufferDisplayLayer {
+public:
+    class Client : public CanMakeWeakPtr<Client> {
+    public:
+        virtual ~Client() = default;
+        virtual void sampleBufferDisplayLayerStatusDidChange(SampleBufferDisplayLayer&) = 0;
+        virtual void sampleBufferDisplayLayerBoundsDidChange(SampleBufferDisplayLayer&) = 0;
+        virtual WTF::MediaTime streamTime() const = 0;
+    };
+
+    explicit SampleBufferDisplayLayer(Client&);
+    virtual ~SampleBufferDisplayLayer() = default;
+
+    virtual bool didFail() const = 0;
+
+    virtual void updateDisplayMode(bool hideDisplayLayer, bool hideRootLayer) = 0;
+
+    virtual CGRect bounds() const = 0;
+    virtual void updateAffineTransform(CGAffineTransform) = 0;
+    virtual void updateBoundsAndPosition(CGRect, CGPoint) = 0;
+
+    virtual void flush() = 0;
+    virtual void flushAndRemoveImage() = 0;
+
+    virtual void enqueueSample(MediaSample&) = 0;
+    virtual void clearEnqueuedSamples() = 0;
+
+    virtual PlatformLayer* rootLayer() = 0;
+
+protected:
+    WeakPtr<Client> m_client;
+};
+
+inline SampleBufferDisplayLayer::SampleBufferDisplayLayer(Client& client)
+    : m_client(makeWeakPtr(client))
+{
+}
+
+}
diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/LocalSampleBufferDisplayLayer.h b/Source/WebCore/platform/graphics/avfoundation/objc/LocalSampleBufferDisplayLayer.h
new file mode 100644
index 0000000..29cdea4
--- /dev/null
@@ -0,0 +1,89 @@
+/*
+ * Copyright (C) 2020 Apple Inc. All rights reserved.
+ *
+ * Redistribution and use in source and binary forms, with or without
+ * modification, are permitted provided that the following conditions
+ * are met:
+ * 1. Redistributions of source code must retain the above copyright
+ *    notice, this list of conditions and the following disclaimer.
+ * 2. Redistributions in binary form must reproduce the above copyright
+ *    notice, this list of conditions and the following disclaimer in the
+ *    documentation and/or other materials provided with the distribution.
+ *
+ * THIS SOFTWARE IS PROVIDED BY APPLE INC. ``AS IS'' AND ANY
+ * EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
+ * IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
+ * PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL APPLE INC. OR
+ * CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
+ * EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
+ * PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
+ * PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY
+ * OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+ * (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+ * OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+ */
+
+#pragma once
+
+#if ENABLE(MEDIA_STREAM) && USE(AVFOUNDATION)
+
+#include "SampleBufferDisplayLayer.h"
+#include <wtf/Deque.h>
+#include <wtf/Forward.h>
+#include <wtf/RetainPtr.h>
+
+OBJC_CLASS AVSampleBufferDisplayLayer;
+OBJC_CLASS WebAVSampleBufferStatusChangeListener;
+
+namespace WebCore {
+
+class WEBCORE_EXPORT LocalSampleBufferDisplayLayer final : public SampleBufferDisplayLayer, public CanMakeWeakPtr<LocalSampleBufferDisplayLayer> {
+    WTF_MAKE_FAST_ALLOCATED;
+public:
+    static std::unique_ptr<SampleBufferDisplayLayer> create(Client&, bool hideRootLayer, IntSize);
+
+    LocalSampleBufferDisplayLayer(RetainPtr<AVSampleBufferDisplayLayer>&&, Client&, bool hideRootLayer, IntSize);
+    ~LocalSampleBufferDisplayLayer();
+
+    // API used by WebAVSampleBufferStatusChangeListener
+    void layerStatusDidChange();
+    void layerErrorDidChange();
+    void rootLayerBoundsDidChange();
+
+    PlatformLayer* displayLayer();
+
+    PlatformLayer* rootLayer() final;
+
+private:
+    bool didFail() const final;
+
+    void updateDisplayMode(bool hideDisplayLayer, bool hideRootLayer) final;
+
+    CGRect bounds() const final;
+    void updateAffineTransform(CGAffineTransform) final;
+    void updateBoundsAndPosition(CGRect, CGPoint) final;
+
+    void flush() final;
+    void flushAndRemoveImage() final;
+
+    void enqueueSample(MediaSample&) final;
+    void clearEnqueuedSamples() final;
+
+    void ensureLayers();
+
+    void removeOldSamplesFromPendingQueue();
+    void addSampleToPendingQueue(MediaSample&);
+    void requestNotificationWhenReadyForVideoData();
+
+private:
+    RetainPtr<WebAVSampleBufferStatusChangeListener> m_statusChangeListener;
+    RetainPtr<AVSampleBufferDisplayLayer> m_sampleBufferDisplayLayer;
+    RetainPtr<PlatformLayer> m_rootLayer;
+
+    using PendingSampleQueue = Deque<Ref<MediaSample>>;
+    PendingSampleQueue m_pendingVideoSampleQueue;
+};
+    
+}
+
+#endif // ENABLE(MEDIA_STREAM) && USE(AVFOUNDATION)
diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/LocalSampleBufferDisplayLayer.mm b/Source/WebCore/platform/graphics/avfoundation/objc/LocalSampleBufferDisplayLayer.mm
new file mode 100644
index 0000000..f0c9cc7
--- /dev/null
@@ -0,0 +1,368 @@
+/*
+ * Copyright (C) 2020 Apple Inc. All rights reserved.
+ *
+ * Redistribution and use in source and binary forms, with or without
+ * modification, are permitted provided that the following conditions
+ * are met:
+ * 1. Redistributions of source code must retain the above copyright
+ *    notice, this list of conditions and the following disclaimer.
+ * 2. Redistributions in binary form must reproduce the above copyright
+ *    notice, this list of conditions and the following disclaimer in the
+ *    documentation and/or other materials provided with the distribution.
+ *
+ * THIS SOFTWARE IS PROVIDED BY APPLE INC. ``AS IS'' AND ANY
+ * EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
+ * IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
+ * PURPOSE ARE DISCLAIMED.  IN NO EVENT SHALL APPLE INC. OR
+ * CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
+ * EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
+ * PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
+ * PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY
+ * OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+ * (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+ * OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+ */
+
+#import "config.h"
+#import "LocalSampleBufferDisplayLayer.h"
+
+#if ENABLE(MEDIA_STREAM) && USE(AVFOUNDATION)
+
+#import "Color.h"
+#import "IntSize.h"
+#import "MediaSample.h"
+
+#import <AVFoundation/AVSampleBufferDisplayLayer.h>
+#import <QuartzCore/CALayer.h>
+#import <QuartzCore/CATransaction.h>
+
+#import <wtf/MainThread.h>
+
+#import <pal/cocoa/AVFoundationSoftLink.h>
+
+using namespace WebCore;
+
+@interface WebAVSampleBufferStatusChangeListener : NSObject {
+    LocalSampleBufferDisplayLayer* _parent;
+}
+
+- (id)initWithParent:(LocalSampleBufferDisplayLayer*)callback;
+- (void)invalidate;
+- (void)beginObservingLayers;
+- (void)stopObservingLayers;
+@end
+
+@implementation WebAVSampleBufferStatusChangeListener
+
+- (id)initWithParent:(LocalSampleBufferDisplayLayer*)parent
+{
+    if (!(self = [super init]))
+        return nil;
+
+    _parent = parent;
+
+    return self;
+}
+
+- (void)dealloc
+{
+    [self invalidate];
+    [super dealloc];
+}
+
+- (void)invalidate
+{
+    [self stopObservingLayers];
+    _parent = nullptr;
+}
+
+- (void)beginObservingLayers
+{
+    ASSERT(_parent);
+    ASSERT(_parent->displayLayer());
+    ASSERT(_parent->rootLayer());
+
+    [_parent->displayLayer() addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nil];
+    [_parent->displayLayer() addObserver:self forKeyPath:@"error" options:NSKeyValueObservingOptionNew context:nil];
+    [_parent->rootLayer() addObserver:self forKeyPath:@"bounds" options:NSKeyValueObservingOptionNew context:nil];
+}
+
+- (void)stopObservingLayers
+{
+    if (!_parent)
+        return;
+
+    if (_parent->displayLayer()) {
+        [_parent->displayLayer() removeObserver:self forKeyPath:@"status"];
+        [_parent->displayLayer() removeObserver:self forKeyPath:@"error"];
+    }
+    if (_parent->rootLayer())
+        [_parent->rootLayer() removeObserver:self forKeyPath:@"bounds"];
+}
+
+- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
+{
+    UNUSED_PARAM(context);
+    UNUSED_PARAM(keyPath);
+    ASSERT(_parent);
+
+    if (!_parent)
+        return;
+
+    if ([object isKindOfClass:PAL::getAVSampleBufferDisplayLayerClass()]) {
+        ASSERT(object == _parent->displayLayer());
+
+        if ([keyPath isEqualToString:@"status"]) {
+            callOnMainThread([protectedSelf = RetainPtr<WebAVSampleBufferStatusChangeListener>(self)] {
+                if (!protectedSelf->_parent)
+                    return;
+
+                protectedSelf->_parent->layerStatusDidChange();
+            });
+            return;
+        }
+
+        if ([keyPath isEqualToString:@"error"]) {
+            callOnMainThread([protectedSelf = RetainPtr<WebAVSampleBufferStatusChangeListener>(self)] {
+                if (!protectedSelf->_parent)
+                    return;
+
+                protectedSelf->_parent->layerErrorDidChange();
+            });
+            return;
+        }
+    }
+
+    if ([[change valueForKey:NSKeyValueChangeNotificationIsPriorKey] boolValue])
+        return;
+
+    if ((CALayer *)object == _parent->rootLayer()) {
+        if ([keyPath isEqualToString:@"bounds"]) {
+            if (!_parent)
+                return;
+
+            if (isMainThread()) {
+                _parent->rootLayerBoundsDidChange();
+                return;
+            }
+
+            callOnMainThread([protectedSelf = RetainPtr<WebAVSampleBufferStatusChangeListener>(self)] {
+                if (!protectedSelf->_parent)
+                    return;
+
+                protectedSelf->_parent->rootLayerBoundsDidChange();
+            });
+        }
+    }
+
+}
+@end
+
+namespace WebCore {
+
+static void runWithoutAnimations(const WTF::Function<void()>& function)
+{
+    [CATransaction begin];
+    [CATransaction setAnimationDuration:0];
+    [CATransaction setDisableActions:YES];
+    function();
+    [CATransaction commit];
+}
+
+std::unique_ptr<SampleBufferDisplayLayer> LocalSampleBufferDisplayLayer::create(Client& client, bool hideRootLayer, IntSize size)
+{
+    auto sampleBufferDisplayLayer = adoptNS([PAL::allocAVSampleBufferDisplayLayerInstance() init]);
+    if (!sampleBufferDisplayLayer)
+        return nullptr;
+
+    return makeUnique<LocalSampleBufferDisplayLayer>(WTFMove(sampleBufferDisplayLayer), client, hideRootLayer, size);
+}
+
+LocalSampleBufferDisplayLayer::LocalSampleBufferDisplayLayer(RetainPtr<AVSampleBufferDisplayLayer>&& sampleBufferDisplayLayer, Client& client, bool hideRootLayer, IntSize size)
+    : SampleBufferDisplayLayer(client)
+    , m_statusChangeListener(adoptNS([[WebAVSampleBufferStatusChangeListener alloc] initWithParent:this]))
+    , m_sampleBufferDisplayLayer(WTFMove(sampleBufferDisplayLayer))
+{
+    m_sampleBufferDisplayLayer.get().backgroundColor = cachedCGColor(Color::black);
+    m_sampleBufferDisplayLayer.get().anchorPoint = { .5, .5 };
+    m_sampleBufferDisplayLayer.get().needsDisplayOnBoundsChange = YES;
+    m_sampleBufferDisplayLayer.get().videoGravity = AVLayerVideoGravityResizeAspectFill;
+
+    m_rootLayer = adoptNS([[CALayer alloc] init]);
+    m_rootLayer.get().hidden = hideRootLayer;
+
+    m_rootLayer.get().backgroundColor = cachedCGColor(Color::black);
+    m_rootLayer.get().needsDisplayOnBoundsChange = YES;
+
+    m_rootLayer.get().bounds = CGRectMake(0, 0, size.width(), size.height());
+
+    [m_statusChangeListener beginObservingLayers];
+
+    [m_rootLayer addSublayer:m_sampleBufferDisplayLayer.get()];
+
+#ifndef NDEBUG
+    [m_sampleBufferDisplayLayer setName:@"LocalSampleBufferDisplayLayer AVSampleBufferDisplayLayer"];
+    [m_rootLayer setName:@"LocalSampleBufferDisplayLayer AVSampleBufferDisplayLayer parent"];
+#endif
+}
+
+LocalSampleBufferDisplayLayer::~LocalSampleBufferDisplayLayer()
+{
+    [m_statusChangeListener stopObservingLayers];
+
+    m_pendingVideoSampleQueue.clear();
+
+    [m_sampleBufferDisplayLayer stopRequestingMediaData];
+    [m_sampleBufferDisplayLayer flush];
+    m_sampleBufferDisplayLayer = nullptr;
+
+    m_rootLayer = nullptr;
+}
+
+void LocalSampleBufferDisplayLayer::layerStatusDidChange()
+{
+    ASSERT(isMainThread());
+    if (m_sampleBufferDisplayLayer.get().status != AVQueuedSampleBufferRenderingStatusRendering)
+        return;
+    if (!m_client)
+        return;
+    m_client->sampleBufferDisplayLayerStatusDidChange(*this);
+}
+
+void LocalSampleBufferDisplayLayer::layerErrorDidChange()
+{
+    ASSERT(isMainThread());
+    // FIXME: Log error.
+}
+
+void LocalSampleBufferDisplayLayer::rootLayerBoundsDidChange()
+{
+    ASSERT(isMainThread());
+    if (!m_client)
+        return;
+    m_client->sampleBufferDisplayLayerBoundsDidChange(*this);
+}
+
+PlatformLayer* LocalSampleBufferDisplayLayer::displayLayer()
+{
+    return m_sampleBufferDisplayLayer.get();
+}
+
+PlatformLayer* LocalSampleBufferDisplayLayer::rootLayer()
+{
+    return m_rootLayer.get();
+}
+
+bool LocalSampleBufferDisplayLayer::didFail() const
+{
+    return [m_sampleBufferDisplayLayer status] == AVQueuedSampleBufferRenderingStatusFailed;
+}
+
+void LocalSampleBufferDisplayLayer::updateDisplayMode(bool hideDisplayLayer, bool hideRootLayer)
+{
+    if (m_rootLayer.get().hidden == hideRootLayer && m_sampleBufferDisplayLayer.get().hidden == hideDisplayLayer)
+        return;
+
+    runWithoutAnimations([&] {
+        m_sampleBufferDisplayLayer.get().hidden = hideDisplayLayer;
+        m_rootLayer.get().hidden = hideRootLayer;
+    });
+}
+
+CGRect LocalSampleBufferDisplayLayer::bounds() const
+{
+    return m_rootLayer.get().bounds;
+}
+
+void LocalSampleBufferDisplayLayer::updateAffineTransform(CGAffineTransform transform)
+{
+    runWithoutAnimations([&] {
+        m_sampleBufferDisplayLayer.get().affineTransform = transform;
+    });
+}
+
+void LocalSampleBufferDisplayLayer::updateBoundsAndPosition(CGRect videoBounds, CGPoint position)
+{
+    runWithoutAnimations([&] {
+        m_sampleBufferDisplayLayer.get().bounds = videoBounds;
+        m_sampleBufferDisplayLayer.get().position = position;
+    });
+}
+
+void LocalSampleBufferDisplayLayer::flush()
+{
+    [m_sampleBufferDisplayLayer flush];
+}
+
+void LocalSampleBufferDisplayLayer::flushAndRemoveImage()
+{
+    [m_sampleBufferDisplayLayer flushAndRemoveImage];
+}
+
+void LocalSampleBufferDisplayLayer::enqueueSample(MediaSample& sample)
+{
+    if (![m_sampleBufferDisplayLayer isReadyForMoreMediaData]) {
+        addSampleToPendingQueue(sample);
+        requestNotificationWhenReadyForVideoData();
+        return;
+    }
+
+    [m_sampleBufferDisplayLayer enqueueSampleBuffer:sample.platformSample().sample.cmSampleBuffer];
+}
+
+void LocalSampleBufferDisplayLayer::removeOldSamplesFromPendingQueue()
+{
+    if (m_pendingVideoSampleQueue.isEmpty() || !m_client)
+        return;
+
+    auto decodeTime = m_pendingVideoSampleQueue.first()->decodeTime();
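+    // If the decode time is invalid or negative, the samples cannot be compared against the stream time below; in that case simply cap the pending queue at 5 samples.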
+    if (!decodeTime.isValid() || decodeTime < MediaTime::zeroTime()) {
+        while (m_pendingVideoSampleQueue.size() > 5)
+            m_pendingVideoSampleQueue.removeFirst();
+
+        return;
+    }
+
+    MediaTime now = m_client->streamTime();
+    while (!m_pendingVideoSampleQueue.isEmpty()) {
+        if (m_pendingVideoSampleQueue.first()->decodeTime() > now)
+            break;
+        m_pendingVideoSampleQueue.removeFirst();
+    }
+}
+
+void LocalSampleBufferDisplayLayer::addSampleToPendingQueue(MediaSample& sample)
+{
+    removeOldSamplesFromPendingQueue();
+    m_pendingVideoSampleQueue.append(sample);
+}
+
+void LocalSampleBufferDisplayLayer::clearEnqueuedSamples()
+{
+    m_pendingVideoSampleQueue.clear();
+}
+
+void LocalSampleBufferDisplayLayer::requestNotificationWhenReadyForVideoData()
+{
+    auto weakThis = makeWeakPtr(*this);
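+    // Once the layer can accept more data, drain the pending queue on the main queue; if the layer fills up again mid-drain, re-arm the notification and stop.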
+    [m_sampleBufferDisplayLayer requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:^{
+        if (!weakThis)
+            return;
+
+        [m_sampleBufferDisplayLayer stopRequestingMediaData];
+
+        while (!m_pendingVideoSampleQueue.isEmpty()) {
+            if (![m_sampleBufferDisplayLayer isReadyForMoreMediaData]) {
+                requestNotificationWhenReadyForVideoData();
+                return;
+            }
+
+            auto sample = m_pendingVideoSampleQueue.takeFirst();
+            enqueueSample(sample.get());
+        }
+    }];
+}
+
+}
+
+#endif
diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h b/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h
index 0340a8d..d9af27a 100644
 #include "MediaPlayerPrivate.h"
 #include "MediaSample.h"
 #include "MediaStreamPrivate.h"
+#include "SampleBufferDisplayLayer.h"
 #include <wtf/Deque.h>
 #include <wtf/Forward.h>
 #include <wtf/LoggerHelper.h>
-#include <wtf/WeakPtr.h>
 
 OBJC_CLASS AVSampleBufferDisplayLayer;
 OBJC_CLASS WebAVSampleBufferStatusChangeListener;
@@ -51,7 +51,7 @@ class PixelBufferConformerCV;
 class VideoFullscreenLayerManagerObjC;
 class VideoTrackPrivateMediaStream;
 
-class MediaPlayerPrivateMediaStreamAVFObjC final : public CanMakeWeakPtr<MediaPlayerPrivateMediaStreamAVFObjC>, public MediaPlayerPrivateInterface, private MediaStreamPrivate::Observer, private MediaStreamTrackPrivate::Observer
+class MediaPlayerPrivateMediaStreamAVFObjC final : public MediaPlayerPrivateInterface, private MediaStreamPrivate::Observer, private MediaStreamTrackPrivate::Observer, public SampleBufferDisplayLayer::Client
 #if !RELEASE_LOG_DISABLED
     , private LoggerHelper
 #endif
@@ -75,13 +75,6 @@ public:
     void ensureLayers();
     void destroyLayers();
 
-    void layerStatusDidChange(AVSampleBufferDisplayLayer*);
-    void layerErrorDidChange(AVSampleBufferDisplayLayer*);
-    void backgroundLayerBoundsChanged();
-
-    PlatformLayer* displayLayer();
-    PlatformLayer* backgroundLayer();
-
 #if !RELEASE_LOG_DISABLED
     const Logger& logger() const final { return m_logger.get(); }
     const char* logClassName() const override { return "MediaPlayerPrivateMediaStreamAVFObjC"; }
@@ -137,10 +130,6 @@ private:
 
     void flushRenderers();
 
-    using PendingSampleQueue = Deque<Ref<MediaSample>>;
-    void addSampleToPendingQueue(PendingSampleQueue&, MediaSample&);
-    void removeOldSamplesFromPendingQueue(PendingSampleQueue&);
-
     MediaTime calculateTimelineOffset(const MediaSample&, double);
     
     void enqueueVideoSample(MediaStreamTrackPrivate&, MediaSample&);
@@ -212,7 +201,7 @@ private:
     void setVideoFullscreenLayer(PlatformLayer*, WTF::Function<void()>&& completionHandler) override;
     void setVideoFullscreenFrame(FloatRect) override;
 
-    MediaTime streamTime() const;
+    MediaTime streamTime() const final;
 
     AudioSourceProvider* audioSourceProvider() final;
 
@@ -220,14 +209,11 @@ private:
 
     void applicationDidBecomeActive() final;
 
-    bool hideBackgroundLayer() const { return (!m_activeVideoTrack || m_waitingForFirstImage) && m_displayMode != PaintItBlack; }
+    bool hideRootLayer() const { return (!m_activeVideoTrack || m_waitingForFirstImage) && m_displayMode != PaintItBlack; }
 
     MediaPlayer* m_player { nullptr };
     RefPtr<MediaStreamPrivate> m_mediaStreamPrivate;
     RefPtr<MediaStreamTrackPrivate> m_activeVideoTrack;
-    RetainPtr<WebAVSampleBufferStatusChangeListener> m_statusChangeListener;
-    RetainPtr<AVSampleBufferDisplayLayer> m_sampleBufferDisplayLayer;
-    RetainPtr<PlatformLayer> m_backgroundLayer;
     std::unique_ptr<PAL::Clock> m_clock;
 
     MediaTime m_pausedTime;
@@ -244,7 +230,6 @@ private:
 
     HashMap<String, RefPtr<AudioTrackPrivateMediaStream>> m_audioTrackMap;
     HashMap<String, RefPtr<VideoTrackPrivateMediaStream>> m_videoTrackMap;
-    PendingSampleQueue m_pendingVideoSampleQueue;
 
     MediaPlayer::NetworkState m_networkState { MediaPlayer::NetworkState::Empty };
     MediaPlayer::ReadyState m_readyState { MediaPlayer::ReadyState::HaveNothing };
@@ -254,8 +239,13 @@ private:
     PlaybackState m_playbackState { PlaybackState::None };
     MediaSample::VideoRotation m_videoRotation { MediaSample::VideoRotation::None };
     CGAffineTransform m_videoTransform;
+    std::unique_ptr<SampleBufferDisplayLayer> m_sampleBufferDisplayLayer;
     std::unique_ptr<VideoFullscreenLayerManagerObjC> m_videoFullscreenLayerManager;
 
+    // SampleBufferDisplayLayer::Client
+    void sampleBufferDisplayLayerStatusDidChange(SampleBufferDisplayLayer&) final;
+    void sampleBufferDisplayLayerBoundsDidChange(SampleBufferDisplayLayer&) final;
+
 #if !RELEASE_LOG_DISABLED
     Ref<const Logger> m_logger;
     const void* m_logIdentifier;
diff --git a/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm b/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm
index 5fcf9ab..df9e205 100644
@@ -1,5 +1,5 @@
 /*
- * Copyright (C) 2015-2017 Apple Inc. All rights reserved.
+ * Copyright (C) 2015-2020 Apple Inc. All rights reserved.
  *
  * Redistribution and use in source and binary forms, with or without
  * modification, are permitted provided that the following conditions
 #import "AudioTrackPrivateMediaStream.h"
 #import "GraphicsContextCG.h"
 #import "Logging.h"
+#import "LocalSampleBufferDisplayLayer.h"
 #import "MediaStreamPrivate.h"
 #import "PixelBufferConformerCV.h"
 #import "VideoFullscreenLayerManagerObjC.h"
 #import "VideoTrackPrivateMediaStream.h"
-#import <AVFoundation/AVSampleBufferDisplayLayer.h>
 #import <CoreGraphics/CGAffineTransform.h>
-#import <QuartzCore/CALayer.h>
-#import <QuartzCore/CATransaction.h>
#import <objc/runtime.h>
 #import <pal/avfoundation/MediaTimeAVFoundation.h>
 #import <pal/spi/mac/AVFoundationSPI.h>
 #import <pal/system/Clock.h>
-#import <wtf/Deque.h>
-#import <wtf/Function.h>
 #import <wtf/MainThread.h>
 #import <wtf/NeverDestroyed.h>
 
 #import <pal/cf/CoreMediaSoftLink.h>
 #import <pal/cocoa/AVFoundationSoftLink.h>
 
-using namespace WebCore;
-
-@interface WebAVSampleBufferStatusChangeListener : NSObject {
-    MediaPlayerPrivateMediaStreamAVFObjC* _parent;
-}
-
-- (id)initWithParent:(MediaPlayerPrivateMediaStreamAVFObjC*)callback;
-- (void)invalidate;
-- (void)beginObservingLayers;
-- (void)stopObservingLayers;
-@end
-
-@implementation WebAVSampleBufferStatusChangeListener
-
-- (id)initWithParent:(MediaPlayerPrivateMediaStreamAVFObjC*)parent
-{
-    if (!(self = [super init]))
-        return nil;
-
-    _parent = parent;
-
-    return self;
-}
-
-- (void)dealloc
-{
-    [self invalidate];
-    [super dealloc];
-}
-
-- (void)invalidate
-{
-    [self stopObservingLayers];
-    _parent = nullptr;
-}
-
-- (void)beginObservingLayers
-{
-    ASSERT(_parent);
-    ASSERT(_parent->displayLayer());
-    ASSERT(_parent->backgroundLayer());
-
-    [_parent->displayLayer() addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nil];
-    [_parent->displayLayer() addObserver:self forKeyPath:@"error" options:NSKeyValueObservingOptionNew context:nil];
-    [_parent->backgroundLayer() addObserver:self forKeyPath:@"bounds" options:NSKeyValueObservingOptionNew context:nil];
-}
-
-- (void)stopObservingLayers
-{
-    if (!_parent)
-        return;
-
-    if (_parent->displayLayer()) {
-        [_parent->displayLayer() removeObserver:self forKeyPath:@"status"];
-        [_parent->displayLayer() removeObserver:self forKeyPath:@"error"];
-    }
-    if (_parent->backgroundLayer())
-        [_parent->backgroundLayer() removeObserver:self forKeyPath:@"bounds"];
-}
-
-- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
-{
-    UNUSED_PARAM(context);
-    UNUSED_PARAM(keyPath);
-    ASSERT(_parent);
-
-    if (!_parent)
-        return;
-
-    if ([object isKindOfClass:PAL::getAVSampleBufferDisplayLayerClass()]) {
-        RetainPtr<AVSampleBufferDisplayLayer> layer = (AVSampleBufferDisplayLayer *)object;
-        ASSERT(layer.get() == _parent->displayLayer());
-
-        if ([keyPath isEqualToString:@"status"]) {
-            RetainPtr<NSNumber> status = [change valueForKey:NSKeyValueChangeNewKey];
-            callOnMainThread([protectedSelf = RetainPtr<WebAVSampleBufferStatusChangeListener>(self), layer = WTFMove(layer), status = WTFMove(status)] {
-                if (!protectedSelf->_parent)
-                    return;
-
-                protectedSelf->_parent->layerStatusDidChange(layer.get());
-            });
-            return;
-        }
-
-        if ([keyPath isEqualToString:@"error"]) {
-            RetainPtr<NSNumber> status = [change valueForKey:NSKeyValueChangeNewKey];
-            callOnMainThread([protectedSelf = RetainPtr<WebAVSampleBufferStatusChangeListener>(self), layer = WTFMove(layer), status = WTFMove(status)] {
-                if (!protectedSelf->_parent)
-                    return;
-
-                protectedSelf->_parent->layerErrorDidChange(layer.get());
-            });
-            return;
-        }
-    }
-
-    if ([[change valueForKey:NSKeyValueChangeNotificationIsPriorKey] boolValue])
-        return;
-
-    if ((CALayer *)object == _parent->backgroundLayer()) {
-        if ([keyPath isEqualToString:@"bounds"]) {
-            if (!_parent)
-                return;
-
-            if (isMainThread()) {
-                _parent->backgroundLayerBoundsChanged();
-                return;
-            }
-
-            callOnMainThread([protectedSelf = RetainPtr<WebAVSampleBufferStatusChangeListener>(self)] {
-                if (!protectedSelf->_parent)
-                    return;
-
-                protectedSelf->_parent->backgroundLayerBoundsChanged();
-            });
-        }
-    }
-
-}
-@end
-
 namespace WebCore {
 using namespace PAL;
 
@@ -183,7 +58,6 @@ static const double rendererLatency = 0.02;
 
 MediaPlayerPrivateMediaStreamAVFObjC::MediaPlayerPrivateMediaStreamAVFObjC(MediaPlayer* player)
     : m_player(player)
-    , m_statusChangeListener(adoptNS([[WebAVSampleBufferStatusChangeListener alloc] initWithParent:this]))
     , m_clock(PAL::Clock::create())
     , m_videoFullscreenLayerManager(makeUnique<VideoFullscreenLayerManagerObjC>())
 #if !RELEASE_LOG_DISABLED
@@ -198,8 +72,6 @@ MediaPlayerPrivateMediaStreamAVFObjC::~MediaPlayerPrivateMediaStreamAVFObjC()
 {
     INFO_LOG(LOGIDENTIFIER);
 
-    [m_statusChangeListener invalidate];
-
     for (const auto& track : m_audioTrackMap.values())
         track->pause();
 
@@ -269,33 +141,6 @@ MediaPlayer::SupportsType MediaPlayerPrivateMediaStreamAVFObjC::supportsType(con
 #pragma mark -
 #pragma mark AVSampleBuffer Methods
 
-void MediaPlayerPrivateMediaStreamAVFObjC::removeOldSamplesFromPendingQueue(PendingSampleQueue& queue)
-{
-    if (queue.isEmpty())
-        return;
-
-    auto decodeTime = queue.first()->decodeTime();
-    if (!decodeTime.isValid() || decodeTime < MediaTime::zeroTime()) {
-        while (queue.size() > 5)
-            queue.removeFirst();
-
-        return;
-    }
-
-    MediaTime now = streamTime();
-    while (!queue.isEmpty()) {
-        if (queue.first()->decodeTime() > now)
-            break;
-        queue.removeFirst();
-    }
-}
-
-void MediaPlayerPrivateMediaStreamAVFObjC::addSampleToPendingQueue(PendingSampleQueue& queue, MediaSample& sample)
-{
-    removeOldSamplesFromPendingQueue(queue);
-    queue.append(sample);
-}
-
 MediaTime MediaPlayerPrivateMediaStreamAVFObjC::calculateTimelineOffset(const MediaSample& sample, double latency)
 {
     MediaTime sampleTime = sample.outputPresentationTime();
@@ -330,37 +175,19 @@ CGAffineTransform MediaPlayerPrivateMediaStreamAVFObjC::videoTransformationMatri
     return m_videoTransform;
 }
 
-static void runWithoutAnimations(const WTF::Function<void()>& function)
-{
-    [CATransaction begin];
-    [CATransaction setAnimationDuration:0];
-    [CATransaction setDisableActions:YES];
-    function();
-    [CATransaction commit];
-}
-
 void MediaPlayerPrivateMediaStreamAVFObjC::enqueueCorrectedVideoSample(MediaSample& sample)
 {
     if (m_sampleBufferDisplayLayer) {
-        if ([m_sampleBufferDisplayLayer status] == AVQueuedSampleBufferRenderingStatusFailed)
+        if (m_sampleBufferDisplayLayer->didFail())
             return;
 
         if (sample.videoRotation() != m_videoRotation || sample.videoMirrored() != m_videoMirrored) {
             m_videoRotation = sample.videoRotation();
             m_videoMirrored = sample.videoMirrored();
-            runWithoutAnimations([this, &sample] {
-                m_sampleBufferDisplayLayer.get().affineTransform = videoTransformationMatrix(sample, true);
-                updateDisplayLayer();
-            });
+            m_sampleBufferDisplayLayer->updateAffineTransform(videoTransformationMatrix(sample, true));
+            updateDisplayLayer();
         }
-
-        if (![m_sampleBufferDisplayLayer isReadyForMoreMediaData]) {
-            addSampleToPendingQueue(m_pendingVideoSampleQueue, sample);
-            requestNotificationWhenReadyForVideoData();
-            return;
-        }
-
-        [m_sampleBufferDisplayLayer enqueueSampleBuffer:sample.platformSample().sample.cmSampleBuffer];
+        m_sampleBufferDisplayLayer->enqueueSample(sample);
     }
 
     if (!m_hasEverEnqueuedVideoFrame) {
@@ -415,61 +242,26 @@ void MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSample(MediaStreamTrackPr
     }
 }
 
-void MediaPlayerPrivateMediaStreamAVFObjC::requestNotificationWhenReadyForVideoData()
-{
-    auto weakThis = makeWeakPtr(*this);
-    [m_sampleBufferDisplayLayer requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:^ {
-        if (!weakThis)
-            return;
-
-        [m_sampleBufferDisplayLayer stopRequestingMediaData];
-
-        if (!m_activeVideoTrack) {
-            m_pendingVideoSampleQueue.clear();
-            return;
-        }
-
-        while (!m_pendingVideoSampleQueue.isEmpty()) {
-            if (![m_sampleBufferDisplayLayer isReadyForMoreMediaData]) {
-                requestNotificationWhenReadyForVideoData();
-                return;
-            }
-
-            auto sample = m_pendingVideoSampleQueue.takeFirst();
-            enqueueVideoSample(*m_activeVideoTrack.get(), sample.get());
-        }
-    }];
-}
-
 AudioSourceProvider* MediaPlayerPrivateMediaStreamAVFObjC::audioSourceProvider()
 {
     // FIXME: This should return a mix of all audio tracks - https://bugs.webkit.org/show_bug.cgi?id=160305
     return nullptr;
 }
 
-void MediaPlayerPrivateMediaStreamAVFObjC::layerErrorDidChange(AVSampleBufferDisplayLayer* layer)
+void MediaPlayerPrivateMediaStreamAVFObjC::sampleBufferDisplayLayerStatusDidChange(SampleBufferDisplayLayer& layer)
 {
+    ASSERT(&layer == m_sampleBufferDisplayLayer.get());
     UNUSED_PARAM(layer);
-    ERROR_LOG(LOGIDENTIFIER, "error = ", [[layer.error localizedDescription] UTF8String]);
-}
-
-void MediaPlayerPrivateMediaStreamAVFObjC::layerStatusDidChange(AVSampleBufferDisplayLayer* layer)
-{
-    ALWAYS_LOG(LOGIDENTIFIER, "status = ", (int)layer.status);
-
-    if (layer.status != AVQueuedSampleBufferRenderingStatusRendering)
-        return;
-    if (!m_sampleBufferDisplayLayer || !m_activeVideoTrack || layer != m_sampleBufferDisplayLayer)
+    if (!m_activeVideoTrack)
         return;
 
-    auto track = m_videoTrackMap.get(m_activeVideoTrack->id());
-    if (track)
+    if (auto track = m_videoTrackMap.get(m_activeVideoTrack->id()))
         track->setTimelineOffset(MediaTime::invalidTime());
 }
 
 void MediaPlayerPrivateMediaStreamAVFObjC::applicationDidBecomeActive()
 {
-    if (m_sampleBufferDisplayLayer && [m_sampleBufferDisplayLayer status] == AVQueuedSampleBufferRenderingStatusFailed) {
+    if (m_sampleBufferDisplayLayer && m_sampleBufferDisplayLayer->didFail()) {
         flushRenderers();
         if (m_imagePainter.mediaSample)
             enqueueCorrectedVideoSample(*m_imagePainter.mediaSample);
@@ -480,7 +272,7 @@ void MediaPlayerPrivateMediaStreamAVFObjC::applicationDidBecomeActive()
 void MediaPlayerPrivateMediaStreamAVFObjC::flushRenderers()
 {
     if (m_sampleBufferDisplayLayer)
-        [m_sampleBufferDisplayLayer flush];
+        m_sampleBufferDisplayLayer->flush();
 }
 
 void MediaPlayerPrivateMediaStreamAVFObjC::ensureLayers()
@@ -491,51 +283,24 @@ void MediaPlayerPrivateMediaStreamAVFObjC::ensureLayers()
     if (!m_mediaStreamPrivate || !m_mediaStreamPrivate->activeVideoTrack() || !m_mediaStreamPrivate->activeVideoTrack()->enabled())
         return;
 
-    m_sampleBufferDisplayLayer = adoptNS([PAL::allocAVSampleBufferDisplayLayerInstance() init]);
+    auto size = snappedIntRect(m_player->playerContentBoxRect()).size();
+    m_sampleBufferDisplayLayer = LocalSampleBufferDisplayLayer::create(*this, hideRootLayer(), size);
+
     if (!m_sampleBufferDisplayLayer) {
-        ERROR_LOG(LOGIDENTIFIER, "+[AVSampleBufferDisplayLayer alloc] failed.");
+        ERROR_LOG(LOGIDENTIFIER, "Creating the SampleBufferDisplayLayer failed.");
         return;
     }
 
-    m_sampleBufferDisplayLayer.get().backgroundColor = cachedCGColor(Color::black);
-    m_sampleBufferDisplayLayer.get().anchorPoint = { .5, .5 };
-    m_sampleBufferDisplayLayer.get().needsDisplayOnBoundsChange = YES;
-    m_sampleBufferDisplayLayer.get().videoGravity = AVLayerVideoGravityResizeAspectFill;
-
-    m_backgroundLayer = adoptNS([[CALayer alloc] init]);
-    m_backgroundLayer.get().hidden = hideBackgroundLayer();
-
-    m_backgroundLayer.get().backgroundColor = cachedCGColor(Color::black);
-    m_backgroundLayer.get().needsDisplayOnBoundsChange = YES;
-
-    auto size = snappedIntRect(m_player->playerContentBoxRect()).size();
-    m_backgroundLayer.get().bounds = CGRectMake(0, 0, size.width(), size.height());
-
-    [m_statusChangeListener beginObservingLayers];
-
-    [m_backgroundLayer addSublayer:m_sampleBufferDisplayLayer.get()];
-
-#ifndef NDEBUG
-    [m_sampleBufferDisplayLayer setName:@"MediaPlayerPrivateMediaStreamAVFObjC AVSampleBufferDisplayLayer"];
-    [m_backgroundLayer setName:@"MediaPlayerPrivateMediaStreamAVFObjC AVSampleBufferDisplayLayer parent"];
-#endif
-
     updateRenderingMode();
     updateDisplayLayer();
 
-    m_videoFullscreenLayerManager->setVideoLayer(m_backgroundLayer.get(), size);
+    m_videoFullscreenLayerManager->setVideoLayer(m_sampleBufferDisplayLayer->rootLayer(), size);
 }
 
 void MediaPlayerPrivateMediaStreamAVFObjC::destroyLayers()
 {
-    [m_statusChangeListener stopObservingLayers];
-    if (m_sampleBufferDisplayLayer) {
-        m_pendingVideoSampleQueue.clear();
-        [m_sampleBufferDisplayLayer stopRequestingMediaData];
-        [m_sampleBufferDisplayLayer flush];
+    if (m_sampleBufferDisplayLayer)
         m_sampleBufferDisplayLayer = nullptr;
-    }
-    m_backgroundLayer = nullptr;
 
     updateRenderingMode();
     
@@ -602,22 +367,12 @@ void MediaPlayerPrivateMediaStreamAVFObjC::prepareToPlay()
 
 PlatformLayer* MediaPlayerPrivateMediaStreamAVFObjC::platformLayer() const
 {
-    if (!m_backgroundLayer || m_displayMode == None)
+    if (!m_sampleBufferDisplayLayer || !m_sampleBufferDisplayLayer->rootLayer() || m_displayMode == None)
         return nullptr;
 
     return m_videoFullscreenLayerManager->videoInlineLayer();
 }
 
-PlatformLayer* MediaPlayerPrivateMediaStreamAVFObjC::displayLayer()
-{
-    return m_sampleBufferDisplayLayer.get();
-}
-
-PlatformLayer* MediaPlayerPrivateMediaStreamAVFObjC::backgroundLayer()
-{
-    return m_backgroundLayer.get();
-}
-
 MediaPlayerPrivateMediaStreamAVFObjC::DisplayMode MediaPlayerPrivateMediaStreamAVFObjC::currentDisplayMode() const
 {
     if (m_intrinsicSize.isEmpty() || !metaDataAvailable() || !m_sampleBufferDisplayLayer)
@@ -653,18 +408,8 @@ bool MediaPlayerPrivateMediaStreamAVFObjC::updateDisplayMode()
     INFO_LOG(LOGIDENTIFIER, "updated to ", static_cast<int>(displayMode));
     m_displayMode = displayMode;
 
-    auto hidden = m_displayMode < PausedImage;
-    if (m_sampleBufferDisplayLayer && m_sampleBufferDisplayLayer.get().hidden != hidden) {
-        runWithoutAnimations([this, hidden] {
-            m_sampleBufferDisplayLayer.get().hidden = hidden;
-        });
-    }
-    hidden = hideBackgroundLayer();
-    if (m_backgroundLayer && m_backgroundLayer.get().hidden != hidden) {
-        runWithoutAnimations([this, hidden] {
-            m_backgroundLayer.get().hidden = hidden;
-        });
-    }
+    if (m_sampleBufferDisplayLayer)
+        m_sampleBufferDisplayLayer->updateDisplayMode(m_displayMode < PausedImage, hideRootLayer());
 
     return true;
 }
@@ -1023,8 +768,11 @@ void MediaPlayerPrivateMediaStreamAVFObjC::checkSelectedVideoTrack()
                 m_waitingForFirstImage = true;
         }
         ensureLayers();
-        m_sampleBufferDisplayLayer.get().hidden = hideVideoLayer || m_displayMode < PausedImage;
-        m_backgroundLayer.get().hidden = hideBackgroundLayer();
+        if (m_sampleBufferDisplayLayer) {
+            if (!m_activeVideoTrack)
+                m_sampleBufferDisplayLayer->clearEnqueuedSamples();
+            m_sampleBufferDisplayLayer->updateDisplayMode(hideVideoLayer || m_displayMode < PausedImage, hideRootLayer());
+        }
 
         m_pendingSelectedTrackCheck = false;
         updateDisplayMode();
@@ -1173,8 +921,8 @@ void MediaPlayerPrivateMediaStreamAVFObjC::setNetworkState(MediaPlayer::NetworkS
 
 void MediaPlayerPrivateMediaStreamAVFObjC::setBufferingPolicy(MediaPlayer::BufferingPolicy policy)
 {
-    if (policy != MediaPlayer::BufferingPolicy::Default)
-        [m_sampleBufferDisplayLayer flushAndRemoveImage];
+    if (policy != MediaPlayer::BufferingPolicy::Default && m_sampleBufferDisplayLayer)
+        m_sampleBufferDisplayLayer->flushAndRemoveImage();
 }
 
 void MediaPlayerPrivateMediaStreamAVFObjC::scheduleDeferredTask(Function<void ()>&& function)
@@ -1197,23 +945,20 @@ void MediaPlayerPrivateMediaStreamAVFObjC::CurrentFramePainter::reset()
 
 void MediaPlayerPrivateMediaStreamAVFObjC::updateDisplayLayer()
 {
-    if (!m_backgroundLayer || !m_sampleBufferDisplayLayer)
+    if (!m_sampleBufferDisplayLayer)
         return;
 
-    auto backgroundBounds = m_backgroundLayer.get().bounds;
-    auto videoBounds = backgroundBounds;
+    auto bounds = m_sampleBufferDisplayLayer->bounds();
+    auto videoBounds = bounds;
     if (m_videoRotation == MediaSample::VideoRotation::Right || m_videoRotation == MediaSample::VideoRotation::Left)
         std::swap(videoBounds.size.width, videoBounds.size.height);
 
-    m_sampleBufferDisplayLayer.get().bounds = videoBounds;
-    m_sampleBufferDisplayLayer.get().position = { backgroundBounds.size.width / 2, backgroundBounds.size.height / 2};
+    m_sampleBufferDisplayLayer->updateBoundsAndPosition(videoBounds, { bounds.size.width / 2, bounds.size.height / 2});
 }
 
-void MediaPlayerPrivateMediaStreamAVFObjC::backgroundLayerBoundsChanged()
+void MediaPlayerPrivateMediaStreamAVFObjC::sampleBufferDisplayLayerBoundsDidChange(SampleBufferDisplayLayer&)
 {
-    runWithoutAnimations([this] {
-        updateDisplayLayer();
-    });
+    updateDisplayLayer();
 }
 
 #if !RELEASE_LOG_DISABLED