[MediaStream] Compensate for video capture orientation
author     eric.carlson@apple.com <eric.carlson@apple.com@268f45cc-cd09-0410-ab3c-d52691b4dbfc>
           Fri, 17 Mar 2017 21:14:28 +0000 (21:14 +0000)
committer  eric.carlson@apple.com <eric.carlson@apple.com@268f45cc-cd09-0410-ab3c-d52691b4dbfc>
           Fri, 17 Mar 2017 21:14:28 +0000 (21:14 +0000)
https://bugs.webkit.org/show_bug.cgi?id=169313
<rdar://problem/30994785>

Reviewed by Jer Noble.

No new tests, as the mock video source doesn't support rotation. A test will be added when
this is fixed in https://bugs.webkit.org/show_bug.cgi?id=169822.

Add 'orientation' and 'mirrored' attributes to MediaSample.
* platform/MediaSample.h:
(WebCore::MediaSample::videoOrientation):
(WebCore::MediaSample::videoMirrored):
* platform/graphics/avfoundation/MediaSampleAVFObjC.h:
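
For illustration, a consumer of the new accessors might branch like this (hypothetical
usage, not part of this patch):

    void handleVideoSample(MediaSample& sample)
    {
        auto orientation = sample.videoOrientation();
        if (orientation == MediaSample::VideoOrientation::LandscapeRight
            || orientation == MediaSample::VideoOrientation::LandscapeLeft) {
            // The frame is rotated 90/270 degrees relative to portrait, so its
            // displayed width and height are swapped.
        }
        if (sample.videoMirrored()) {
            // The frame is flipped horizontally, e.g. from a front-facing camera.
        }
    }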

A video sample can be rotated and/or mirrored, so the video layer may need to be rotated
and resized for display. We don't want to expose this information to the renderer, so
allocate and return a generic CALayer as the player's platformLayer, and add the video
layer as a sublayer so we can adjust it to display correctly. Add an enum for playback
state, in addition to the display mode enum, so we correctly display a black frame when
video frames are available but playback has not yet started.
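
For reference, the compensation reduces to a rotation plus an optional horizontal flip.
A minimal sketch of the math (the helper name is illustrative; the angle table matches
videoTransformationMatrix() in the patch):

    // Angles indexed by MediaSample::VideoOrientation:
    // Unknown, Portrait, PortraitUpsideDown, LandscapeRight, LandscapeLeft.
    static CGAffineTransform transformForSample(MediaSample::VideoOrientation orientation, bool mirrored)
    {
    #if PLATFORM(MAC)
        static const float sensorAngle[] = { 0, 0, 180, 90, 270 };
    #else
        static const float sensorAngle[] = { 180, 180, 0, 90, 270 };
    #endif
        CGAffineTransform transform = CGAffineTransformMakeRotation(sensorAngle[static_cast<int>(orientation)] * M_PI / 180);
        if (mirrored)
            transform = CGAffineTransformScale(transform, -1, 1); // mirror around the y-axis
        return transform;
    }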

* platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h:
* platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm:
(-[WebAVSampleBufferStatusChangeListener initWithParent:]):
(-[WebAVSampleBufferStatusChangeListener invalidate]):
(-[WebAVSampleBufferStatusChangeListener beginObservingLayers]):
(-[WebAVSampleBufferStatusChangeListener stopObservingLayers]):
(-[WebAVSampleBufferStatusChangeListener observeValueForKeyPath:ofObject:change:context:]):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::videoTransformationMatrix):
(WebCore::runWithoutAnimations):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSample):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::layerErrorDidChange):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::ensureLayers):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyLayers):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::cancelLoad):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::platformLayer):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::displayLayer):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::backgroundLayer):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::currentDisplayMode):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateDisplayMode):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::play):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::pause):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::currentMediaTime):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::activeStatusChanged):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateRenderingMode):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::checkSelectedVideoTrack):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::paintCurrentFrameInContext):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::acceleratedRenderingStateChanged):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::setShouldBufferData):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::backgroundLayerBoundsChanged):
(-[WebAVSampleBufferStatusChangeListener beginObservingLayer:]): Deleted.
(-[WebAVSampleBufferStatusChangeListener stopObservingLayer:]): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::ensureLayer): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyLayer): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::paused): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::shouldBePlaying): Deleted.

* platform/mediastream/mac/AVVideoCaptureSource.h:
* platform/mediastream/mac/AVVideoCaptureSource.mm:
(WebCore::AVVideoCaptureSource::processNewFrame): Add a connection parameter so we can get
the video orientation and mirroring state.
(WebCore::AVVideoCaptureSource::captureOutputDidOutputSampleBufferFromConnection):
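
The two connection properties the capture callback now consults (per the patch below):

    // AVCaptureConnection carries the capture orientation and mirroring state.
    AVCaptureVideoOrientation captureOrientation = [connection videoOrientation];
    BOOL mirrored = [connection isVideoMirrored];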

Pass sample orientation to libwebrtc.
* platform/mediastream/mac/RealtimeOutgoingVideoSource.cpp:
(WebCore::RealtimeOutgoingVideoSource::sendFrame):
(WebCore::RealtimeOutgoingVideoSource::videoSampleAvailable):
* platform/mediastream/mac/RealtimeOutgoingVideoSource.h:
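
The orientation-to-rotation mapping is mechanical; factored into a helper here for
clarity (illustrative only; the patch inlines the switch in videoSampleAvailable()):

    // Map MediaSample::VideoOrientation onto libwebrtc's rotation enum;
    // Unknown is treated as unrotated.
    static webrtc::VideoRotation webRTCVideoRotation(MediaSample::VideoOrientation orientation)
    {
        switch (orientation) {
        case MediaSample::VideoOrientation::PortraitUpsideDown:
            return webrtc::kVideoRotation_180;
        case MediaSample::VideoOrientation::LandscapeRight:
            return webrtc::kVideoRotation_90;
        case MediaSample::VideoOrientation::LandscapeLeft:
            return webrtc::kVideoRotation_270;
        case MediaSample::VideoOrientation::Unknown:
        case MediaSample::VideoOrientation::Portrait:
            return webrtc::kVideoRotation_0;
        }
        return webrtc::kVideoRotation_0; // not reached; placates compilers
    }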

git-svn-id: https://svn.webkit.org/repository/webkit/trunk@214120 268f45cc-cd09-0410-ab3c-d52691b4dbfc

Source/WebCore/ChangeLog
Source/WebCore/platform/MediaSample.h
Source/WebCore/platform/graphics/avfoundation/MediaSampleAVFObjC.h
Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h
Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm
Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.h
Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm
Source/WebCore/platform/mediastream/mac/RealtimeOutgoingVideoSource.cpp
Source/WebCore/platform/mediastream/mac/RealtimeOutgoingVideoSource.h

index 66375c9..f645f25 100644
@@ -1,3 +1,75 @@
+2017-03-17  Eric Carlson  <eric.carlson@apple.com>
+
+        [MediaStream] Compensate for video capture orientation
+        https://bugs.webkit.org/show_bug.cgi?id=169313
+        <rdar://problem/30994785>
+
+        Reviewed by Jer Noble.
+
+        No new tests, as the mock video source doesn't support rotation. A test will be added
+        when this is fixed in https://bugs.webkit.org/show_bug.cgi?id=169822.
+
+        Add 'orientation' and 'mirrored' attributes to MediaSample.
+        * platform/MediaSample.h:
+        (WebCore::MediaSample::videoOrientation):
+        (WebCore::MediaSample::videoMirrored):
+        * platform/graphics/avfoundation/MediaSampleAVFObjC.h:
+
+        A video sample can be rotated and/or mirrored, so the video layer may need to be rotated
+        and resized for display. We don't want to expose this information to the renderer, so
+        allocate and return a generic CALayer as the player's platformLayer, and add the video
+        layer as a sublayer so we can adjust it to display correctly. Add an enum for playback
+        state, in addition to the display mode enum, so we correctly display a black frame when
+        video frames are available but playback has not yet started.
+
+        * platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h:
+        * platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm:
+        (-[WebAVSampleBufferStatusChangeListener initWithParent:]):
+        (-[WebAVSampleBufferStatusChangeListener invalidate]):
+        (-[WebAVSampleBufferStatusChangeListener beginObservingLayers]):
+        (-[WebAVSampleBufferStatusChangeListener stopObservingLayers]):
+        (-[WebAVSampleBufferStatusChangeListener observeValueForKeyPath:ofObject:change:context:]):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::videoTransformationMatrix):
+        (WebCore::runWithoutAnimations):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSample):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::layerErrorDidChange):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::ensureLayers):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyLayers):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::cancelLoad):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::platformLayer):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::displayLayer):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::backgroundLayer):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::currentDisplayMode):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateDisplayMode):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::play):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::pause):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::currentMediaTime):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::activeStatusChanged):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateRenderingMode):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::checkSelectedVideoTrack):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::paintCurrentFrameInContext):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::acceleratedRenderingStateChanged):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::setShouldBufferData):
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::backgroundLayerBoundsChanged):
+        (-[WebAVSampleBufferStatusChangeListener beginObservingLayer:]): Deleted.
+        (-[WebAVSampleBufferStatusChangeListener stopObservingLayer:]): Deleted.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::ensureLayer): Deleted.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyLayer): Deleted.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::paused): Deleted.
+        (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::shouldBePlaying): Deleted.
+
+        * platform/mediastream/mac/AVVideoCaptureSource.h:
+        * platform/mediastream/mac/AVVideoCaptureSource.mm:
+        (WebCore::AVVideoCaptureSource::processNewFrame): Add a connection parameter so we can
+        get the video orientation and mirroring state.
+        (WebCore::AVVideoCaptureSource::captureOutputDidOutputSampleBufferFromConnection):
+
+        Pass sample orientation to libwebrtc.
+        * platform/mediastream/mac/RealtimeOutgoingVideoSource.cpp:
+        (WebCore::RealtimeOutgoingVideoSource::sendFrame):
+        (WebCore::RealtimeOutgoingVideoSource::videoSampleAvailable):
+        * platform/mediastream/mac/RealtimeOutgoingVideoSource.h:
+
 2017-03-17  Zalan Bujtas  <zalan@apple.com>
 
         Fix the flow thread state on the descendants of out of flow positioned replaced elements.
index 6ee990d..e8a7b58 100644
@@ -77,6 +77,16 @@ public:
     virtual SampleFlags flags() const = 0;
     virtual PlatformSample platformSample() = 0;
 
+    enum class VideoOrientation {
+        Unknown,
+        Portrait,
+        PortraitUpsideDown,
+        LandscapeRight,
+        LandscapeLeft,
+    };
+    virtual VideoOrientation videoOrientation() const { return VideoOrientation::Unknown; }
+    virtual bool videoMirrored() const { return false; }
+
     bool isSync() const { return flags() & IsSync; }
     bool isNonDisplaying() const { return flags() & IsNonDisplaying; }
 
index 7bf1bbe..fcdb61e 100644
@@ -37,7 +37,7 @@ class MediaSampleAVFObjC final : public MediaSample {
 public:
     static Ref<MediaSampleAVFObjC> create(CMSampleBufferRef sample, int trackID) { return adoptRef(*new MediaSampleAVFObjC(sample, trackID)); }
     static Ref<MediaSampleAVFObjC> create(CMSampleBufferRef sample, AtomicString trackID) { return adoptRef(*new MediaSampleAVFObjC(sample, trackID)); }
-    static Ref<MediaSampleAVFObjC> create(CMSampleBufferRef sample) { return adoptRef(*new MediaSampleAVFObjC(sample)); }
+    static Ref<MediaSampleAVFObjC> create(CMSampleBufferRef sample, VideoOrientation orientation = VideoOrientation::Unknown, bool mirrored = false) { return adoptRef(*new MediaSampleAVFObjC(sample, orientation, mirrored)); }
     static RefPtr<MediaSampleAVFObjC> createImageSample(Ref<JSC::Uint8ClampedArray>&&, unsigned long width, unsigned long height);
     static RefPtr<MediaSampleAVFObjC> createImageSample(Vector<uint8_t>&&, unsigned long width, unsigned long height);
 
@@ -56,6 +56,13 @@ private:
         , m_id(String::format("%d", trackID))
     {
     }
+    MediaSampleAVFObjC(CMSampleBufferRef sample, VideoOrientation orientation, bool mirrored)
+        : m_sample(sample)
+        , m_orientation(orientation)
+        , m_mirrored(mirrored)
+    {
+    }
+
     virtual ~MediaSampleAVFObjC() { }
 
     MediaTime presentationTime() const override;
@@ -79,8 +86,13 @@ private:
     std::pair<RefPtr<MediaSample>, RefPtr<MediaSample>> divide(const MediaTime& presentationTime) override;
     Ref<MediaSample> createNonDisplayingCopy() const override;
 
+    VideoOrientation videoOrientation() const final { return m_orientation; }
+    bool videoMirrored() const final { return m_mirrored; }
+
     RetainPtr<CMSampleBufferRef> m_sample;
     AtomicString m_id;
+    VideoOrientation m_orientation { VideoOrientation::Unknown };
+    bool m_mirrored { false };
 };
 
 }
index 1683ed3..c973f10 100644
@@ -31,6 +31,7 @@
 #include "MediaPlayerPrivate.h"
 #include "MediaSample.h"
 #include "MediaStreamPrivate.h"
+#include <CoreGraphics/CGAffineTransform.h>
 #include <wtf/Function.h>
 #include <wtf/MediaTime.h>
 #include <wtf/WeakPtr.h>
@@ -75,10 +76,15 @@ public:
 
     WeakPtr<MediaPlayerPrivateMediaStreamAVFObjC> createWeakPtr() { return m_weakPtrFactory.createWeakPtr(); }
 
-    void ensureLayer();
-    void destroyLayer();
+    void ensureLayers();
+    void destroyLayers();
 
     void layerStatusDidChange(AVSampleBufferDisplayLayer*);
+    void layerErrorDidChange(AVSampleBufferDisplayLayer*);
+    void backgroundLayerBoundsChanged();
+
+    PlatformLayer* displayLayer();
+    PlatformLayer* backgroundLayer();
 
 private:
     // MediaPlayerPrivateInterface
@@ -99,7 +105,7 @@ private:
 
     void play() override;
     void pause() override;
-    bool paused() const override;
+    bool paused() const override { return !playing(); }
 
     void setVolume(float) override;
     void setMuted(bool) override;
@@ -156,7 +162,7 @@ private:
 
     bool ended() const override { return m_ended; }
 
-    bool shouldBePlaying() const;
+    void setShouldBufferData(bool) override;
 
     MediaPlayer::ReadyState currentReadyState();
     void updateReadyState();
@@ -177,6 +183,13 @@ private:
     bool updateDisplayMode();
     void updateCurrentFrameImage();
 
+    enum class PlaybackState {
+        None,
+        Playing,
+        Paused,
+    };
+    bool playing() const { return m_playbackState == PlaybackState::Playing; }
+
     // MediaStreamPrivate::Observer
     void activeStatusChanged() override;
     void characteristicsChanged() override;
@@ -200,6 +213,8 @@ private:
 
     AudioSourceProvider* audioSourceProvider() final;
 
+    CGAffineTransform videoTransformationMatrix(MediaSample&);
+
     MediaPlayer* m_player { nullptr };
     WeakPtrFactory<MediaPlayerPrivateMediaStreamAVFObjC> m_weakPtrFactory;
     RefPtr<MediaStreamPrivate> m_mediaStreamPrivate;
@@ -208,6 +223,7 @@ private:
 
     RetainPtr<WebAVSampleBufferStatusChangeListener> m_statusChangeListener;
     RetainPtr<AVSampleBufferDisplayLayer> m_sampleBufferDisplayLayer;
+    RetainPtr<PlatformLayer> m_backgroundLayer;
     std::unique_ptr<Clock> m_clock;
 
     MediaTime m_pausedTime;
@@ -232,12 +248,18 @@ private:
     FloatSize m_intrinsicSize;
     float m_volume { 1 };
     DisplayMode m_displayMode { None };
+    PlaybackState m_playbackState { PlaybackState::None };
+    MediaSample::VideoOrientation m_videoOrientation { MediaSample::VideoOrientation::Unknown };
+    CGAffineTransform m_videoTransform;
+    bool m_videoMirrored { false };
     bool m_playing { false };
     bool m_muted { false };
     bool m_ended { false };
     bool m_hasEverEnqueuedVideoFrame { false };
     bool m_pendingSelectedTrackCheck { false };
     bool m_shouldDisplayFirstVideoFrame { false };
+    bool m_transformIsValid { false };
+    bool m_videoSizeChanged { false };
 
 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
     std::unique_ptr<VideoFullscreenLayerManager> m_videoFullscreenLayerManager;
index 354b919..b32b1a8 100644
@@ -57,20 +57,24 @@ SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation)
 
 SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVSampleBufferDisplayLayer)
 
-#define AVAudioTimePitchAlgorithmSpectral getAVAudioTimePitchAlgorithmSpectral()
-#define AVAudioTimePitchAlgorithmVarispeed getAVAudioTimePitchAlgorithmVarispeed()
+SOFT_LINK_POINTER(AVFoundation, AVLayerVideoGravityResizeAspect, NSString *)
+SOFT_LINK_POINTER(AVFoundation, AVLayerVideoGravityResizeAspectFill, NSString *)
+SOFT_LINK_POINTER(AVFoundation, AVLayerVideoGravityResize, NSString *)
+
+#define AVLayerVideoGravityResizeAspect getAVLayerVideoGravityResizeAspect()
+#define AVLayerVideoGravityResizeAspectFill getAVLayerVideoGravityResizeAspectFill()
+#define AVLayerVideoGravityResize getAVLayerVideoGravityResize()
 
 using namespace WebCore;
 
 @interface WebAVSampleBufferStatusChangeListener : NSObject {
     MediaPlayerPrivateMediaStreamAVFObjC* _parent;
-    Vector<RetainPtr<AVSampleBufferDisplayLayer>> _layers;
 }
 
 - (id)initWithParent:(MediaPlayerPrivateMediaStreamAVFObjC*)callback;
 - (void)invalidate;
-- (void)beginObservingLayer:(AVSampleBufferDisplayLayer *)layer;
-- (void)stopObservingLayer:(AVSampleBufferDisplayLayer *)layer;
+- (void)beginObservingLayers;
+- (void)stopObservingLayers;
 @end
 
 @implementation WebAVSampleBufferStatusChangeListener
@@ -81,6 +85,7 @@ using namespace WebCore;
         return nil;
 
     _parent = parent;
+
     return self;
 }
 
@@ -92,31 +97,35 @@ using namespace WebCore;
 
 - (void)invalidate
 {
-    for (auto& layer : _layers)
-        [layer removeObserver:self forKeyPath:@"status"];
-    _layers.clear();
+    [self stopObservingLayers];
 
     [[NSNotificationCenter defaultCenter] removeObserver:self];
 
     _parent = nullptr;
 }
 
-- (void)beginObservingLayer:(AVSampleBufferDisplayLayer*)layer
+- (void)beginObservingLayers
 {
     ASSERT(_parent);
-    ASSERT(!_layers.contains(layer));
+    ASSERT(_parent->displayLayer());
+    ASSERT(_parent->backgroundLayer());
 
-    _layers.append(layer);
-    [layer addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nullptr];
+    [_parent->displayLayer() addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nil];
+    [_parent->displayLayer() addObserver:self forKeyPath:@"error" options:NSKeyValueObservingOptionNew context:nil];
+    [_parent->backgroundLayer() addObserver:self forKeyPath:@"bounds" options:NSKeyValueObservingOptionNew context:nil];
 }
 
-- (void)stopObservingLayer:(AVSampleBufferDisplayLayer*)layer
+- (void)stopObservingLayers
 {
-    ASSERT(_parent);
-    ASSERT(_layers.contains(layer));
+    if (!_parent)
+        return;
 
-    [layer removeObserver:self forKeyPath:@"status"];
-    _layers.remove(_layers.find(layer));
+    if (_parent->displayLayer()) {
+        [_parent->displayLayer() removeObserver:self forKeyPath:@"status"];
+        [_parent->displayLayer() removeObserver:self forKeyPath:@"error"];
+    }
+    if (_parent->backgroundLayer())
+        [_parent->backgroundLayer() removeObserver:self forKeyPath:@"bounds"];
 }
 
 - (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
@@ -125,23 +134,47 @@ using namespace WebCore;
     UNUSED_PARAM(keyPath);
     ASSERT(_parent);
 
-    RetainPtr<WebAVSampleBufferStatusChangeListener> protectedSelf = self;
     if ([object isKindOfClass:getAVSampleBufferDisplayLayerClass()]) {
         RetainPtr<AVSampleBufferDisplayLayer> layer = (AVSampleBufferDisplayLayer *)object;
-        RetainPtr<NSNumber> status = [change valueForKey:NSKeyValueChangeNewKey];
+        ASSERT(layer.get() == _parent->displayLayer());
 
-        ASSERT(_layers.contains(layer.get()));
-        ASSERT([keyPath isEqualToString:@"status"]);
+        if ([keyPath isEqualToString:@"status"]) {
+            RetainPtr<NSNumber> status = [change valueForKey:NSKeyValueChangeNewKey];
+            callOnMainThread([protectedSelf = RetainPtr<WebAVSampleBufferStatusChangeListener>(self), layer = WTFMove(layer), status = WTFMove(status)] {
+                if (!protectedSelf->_parent)
+                    return;
 
-        callOnMainThread([protectedSelf = WTFMove(protectedSelf), layer = WTFMove(layer)] {
-            if (!protectedSelf->_parent)
-                return;
+                protectedSelf->_parent->layerStatusDidChange(layer.get());
+            });
+            return;
+        }
 
-            protectedSelf->_parent->layerStatusDidChange(layer.get());
-        });
+        if ([keyPath isEqualToString:@"error"]) {
+            RetainPtr<NSNumber> status = [change valueForKey:NSKeyValueChangeNewKey];
+            callOnMainThread([protectedSelf = RetainPtr<WebAVSampleBufferStatusChangeListener>(self), layer = WTFMove(layer), status = WTFMove(status)] {
+                if (!protectedSelf->_parent)
+                    return;
+
+                protectedSelf->_parent->layerErrorDidChange(layer.get());
+            });
+            return;
+        }
+    }
+
+    if ([[change valueForKey:NSKeyValueChangeNotificationIsPriorKey] boolValue])
+        return;
+
+    if ((CALayer *)object == _parent->backgroundLayer()) {
+        if ([keyPath isEqualToString:@"bounds"]) {
+            callOnMainThread([protectedSelf = RetainPtr<WebAVSampleBufferStatusChangeListener>(self)] {
+                if (!protectedSelf->_parent)
+                    return;
+
+                protectedSelf->_parent->backgroundLayerBoundsChanged();
+            });
+        }
+    }
 
-    } else
-        ASSERT_NOT_REACHED();
 }
 @end
 
@@ -177,7 +210,7 @@ MediaPlayerPrivateMediaStreamAVFObjC::~MediaPlayerPrivateMediaStreamAVFObjC()
             track->removeObserver(*this);
     }
 
-    destroyLayer();
+    destroyLayers();
 
     [m_statusChangeListener invalidate];
 
@@ -268,6 +301,46 @@ MediaTime MediaPlayerPrivateMediaStreamAVFObjC::calculateTimelineOffset(const Me
     return timelineOffset;
 }
 
+CGAffineTransform MediaPlayerPrivateMediaStreamAVFObjC::videoTransformationMatrix(MediaSample& sample)
+{
+    if (m_transformIsValid)
+        return m_videoTransform;
+
+    CMSampleBufferRef sampleBuffer = sample.platformSample().sample.cmSampleBuffer;
+    CVPixelBufferRef pixelBuffer = static_cast<CVPixelBufferRef>(CMSampleBufferGetImageBuffer(sampleBuffer));
+    size_t width = CVPixelBufferGetWidth(pixelBuffer);
+    size_t height = CVPixelBufferGetHeight(pixelBuffer);
+    if (!width || !height)
+        return CGAffineTransformIdentity;
+
+    ASSERT(m_videoOrientation >= MediaSample::VideoOrientation::Unknown);
+    ASSERT(m_videoOrientation <= MediaSample::VideoOrientation::LandscapeLeft);
+
+    // Unknown, Portrait, PortraitUpsideDown, LandscapeRight, LandscapeLeft,
+#if PLATFORM(MAC)
+    static float sensorAngle[] = { 0, 0, 180, 90, 270 };
+#else
+    static float sensorAngle[] = { 180, 180, 0, 90, 270 };
+#endif
+    float rotation = sensorAngle[static_cast<int>(m_videoOrientation)];
+    m_videoTransform = CGAffineTransformMakeRotation(rotation * M_PI / 180);
+
+    if (sample.videoMirrored())
+        m_videoTransform = CGAffineTransformScale(m_videoTransform, -1, 1);
+
+    m_transformIsValid = true;
+    return m_videoTransform;
+}
+
+static void runWithoutAnimations(std::function<void()> function)
+{
+    [CATransaction begin];
+    [CATransaction setAnimationDuration:0];
+    [CATransaction setDisableActions:YES];
+    function();
+    [CATransaction commit];
+}
+
 void MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSample(MediaStreamTrackPrivate& track, MediaSample& sample)
 {
     ASSERT(m_videoTrackMap.contains(track.id()));
@@ -296,6 +369,25 @@ void MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSample(MediaStreamTrackPr
     updateSampleTimes(sample, timelineOffset, "MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSample");
 
     if (m_sampleBufferDisplayLayer) {
+        if (sample.videoOrientation() != m_videoOrientation || sample.videoMirrored() != m_videoMirrored) {
+            m_videoOrientation = sample.videoOrientation();
+            m_videoMirrored = sample.videoMirrored();
+            m_transformIsValid = false;
+        }
+
+        if (m_videoSizeChanged || !m_transformIsValid) {
+            runWithoutAnimations([this, &sample] {
+                auto backgroundBounds = m_backgroundLayer.get().bounds;
+                auto videoBounds = backgroundBounds;
+                if (m_videoOrientation == MediaSample::VideoOrientation::LandscapeRight || m_videoOrientation == MediaSample::VideoOrientation::LandscapeLeft)
+                    std::swap(videoBounds.size.width, videoBounds.size.height);
+                m_sampleBufferDisplayLayer.get().bounds = videoBounds;
+                m_sampleBufferDisplayLayer.get().position = { backgroundBounds.size.width / 2, backgroundBounds.size.height / 2 };
+                m_sampleBufferDisplayLayer.get().affineTransform = videoTransformationMatrix(sample);
+                m_videoSizeChanged = false;
+            });
+        }
+
         if (![m_sampleBufferDisplayLayer isReadyForMoreMediaData]) {
             addSampleToPendingQueue(m_pendingVideoSampleQueue, sample);
             requestNotificationWhenReadyForVideoData();
@@ -334,6 +426,12 @@ AudioSourceProvider* MediaPlayerPrivateMediaStreamAVFObjC::audioSourceProvider()
     return nullptr;
 }
 
+void MediaPlayerPrivateMediaStreamAVFObjC::layerErrorDidChange(AVSampleBufferDisplayLayer* layer)
+{
+    UNUSED_PARAM(layer);
+    LOG(Media, "MediaPlayerPrivateMediaStreamAVFObjC::layerErrorDidChange(%p) - error = %s", this, [[layer.error localizedDescription] UTF8String]);
+}
+
 void MediaPlayerPrivateMediaStreamAVFObjC::layerStatusDidChange(AVSampleBufferDisplayLayer* layer)
 {
     LOG(Media, "MediaPlayerPrivateMediaStreamAVFObjC::layerStatusDidChange(%p) - status = %d", this, (int)layer.status);
@@ -359,7 +457,7 @@ void MediaPlayerPrivateMediaStreamAVFObjC::flushAndRemoveVideoSampleBuffers()
     [m_sampleBufferDisplayLayer flushAndRemoveImage];
 }
 
-void MediaPlayerPrivateMediaStreamAVFObjC::ensureLayer()
+void MediaPlayerPrivateMediaStreamAVFObjC::ensureLayers()
 {
     if (m_sampleBufferDisplayLayer)
         return;
@@ -373,30 +471,44 @@ void MediaPlayerPrivateMediaStreamAVFObjC::ensureLayer()
         return;
     }
 
+    m_sampleBufferDisplayLayer.get().backgroundColor = cachedCGColor(Color::black);
+    m_sampleBufferDisplayLayer.get().anchorPoint = { .5, .5 };
+    m_sampleBufferDisplayLayer.get().needsDisplayOnBoundsChange = YES;
+    m_sampleBufferDisplayLayer.get().videoGravity = AVLayerVideoGravityResizeAspectFill;
+
+    m_backgroundLayer = adoptNS([[CALayer alloc] init]);
+    m_backgroundLayer.get().backgroundColor = cachedCGColor(Color::black);
+    m_backgroundLayer.get().needsDisplayOnBoundsChange = YES;
+
+    [m_statusChangeListener beginObservingLayers];
+
+    [m_backgroundLayer addSublayer:m_sampleBufferDisplayLayer.get()];
+
 #ifndef NDEBUG
     [m_sampleBufferDisplayLayer setName:@"MediaPlayerPrivateMediaStreamAVFObjC AVSampleBufferDisplayLayer"];
+    [m_backgroundLayer setName:@"MediaPlayerPrivateMediaStreamAVFObjC AVSampleBufferDisplayLayer parent"];
 #endif
-    m_sampleBufferDisplayLayer.get().backgroundColor = cachedCGColor(Color::black);
-    [m_statusChangeListener beginObservingLayer:m_sampleBufferDisplayLayer.get()];
 
     updateRenderingMode();
     
 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
-    m_videoFullscreenLayerManager->setVideoLayer(m_sampleBufferDisplayLayer.get(), snappedIntRect(m_player->client().mediaPlayerContentBoxRect()).size());
+    m_videoFullscreenLayerManager->setVideoLayer(m_backgroundLayer.get(), snappedIntRect(m_player->client().mediaPlayerContentBoxRect()).size());
 #endif
 }
 
-void MediaPlayerPrivateMediaStreamAVFObjC::destroyLayer()
+void MediaPlayerPrivateMediaStreamAVFObjC::destroyLayers()
 {
     if (!m_sampleBufferDisplayLayer)
         return;
 
+    [m_statusChangeListener stopObservingLayers];
     if (m_sampleBufferDisplayLayer) {
         m_pendingVideoSampleQueue.clear();
-        [m_statusChangeListener stopObservingLayer:m_sampleBufferDisplayLayer.get()];
         [m_sampleBufferDisplayLayer stopRequestingMediaData];
         [m_sampleBufferDisplayLayer flush];
+        m_sampleBufferDisplayLayer = nullptr;
     }
+    m_backgroundLayer = nullptr;
 
     updateRenderingMode();
     
@@ -446,7 +558,7 @@ void MediaPlayerPrivateMediaStreamAVFObjC::load(MediaStreamPrivate& stream)
 void MediaPlayerPrivateMediaStreamAVFObjC::cancelLoad()
 {
     LOG(Media, "MediaPlayerPrivateMediaStreamAVFObjC::cancelLoad(%p)", this);
-    if (m_playing)
+    if (playing())
         pause();
 }
 
@@ -457,15 +569,24 @@ void MediaPlayerPrivateMediaStreamAVFObjC::prepareToPlay()
 
 PlatformLayer* MediaPlayerPrivateMediaStreamAVFObjC::platformLayer() const
 {
-    if (!m_sampleBufferDisplayLayer || m_displayMode == None)
+    if (!m_backgroundLayer || m_displayMode == None)
         return nullptr;
 
 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
     return m_videoFullscreenLayerManager->videoInlineLayer();
 #else
+    return m_backgroundLayer.get();
+#endif
+}
 
+PlatformLayer* MediaPlayerPrivateMediaStreamAVFObjC::displayLayer()
+{
     return m_sampleBufferDisplayLayer.get();
-#endif
+}
+
+PlatformLayer* MediaPlayerPrivateMediaStreamAVFObjC::backgroundLayer()
+{
+    return m_backgroundLayer.get();
 }
 
 MediaPlayerPrivateMediaStreamAVFObjC::DisplayMode MediaPlayerPrivateMediaStreamAVFObjC::currentDisplayMode() const
@@ -474,16 +595,19 @@ MediaPlayerPrivateMediaStreamAVFObjC::DisplayMode MediaPlayerPrivateMediaStreamA
         return None;
 
     if (auto* track = m_mediaStreamPrivate->activeVideoTrack()) {
-        if (!m_shouldDisplayFirstVideoFrame || !track->enabled() || track->muted())
+        if (!track->enabled() || track->muted())
             return PaintItBlack;
     }
 
-    if (m_playing) {
+    if (playing()) {
         if (!m_mediaStreamPrivate->isProducingData())
             return PausedImage;
         return LivePreview;
     }
 
+    if (m_playbackState == PlaybackState::None)
+        return PaintItBlack;
+
     return PausedImage;
 }
 
@@ -493,11 +617,13 @@ bool MediaPlayerPrivateMediaStreamAVFObjC::updateDisplayMode()
 
     if (displayMode == m_displayMode)
         return false;
-
     m_displayMode = displayMode;
 
-    if (m_displayMode < PausedImage && m_sampleBufferDisplayLayer)
-        flushAndRemoveVideoSampleBuffers();
+    if (m_sampleBufferDisplayLayer) {
+        runWithoutAnimations([this] {
+            m_sampleBufferDisplayLayer.get().hidden = m_displayMode < PausedImage;
+        });
+    }
 
     return true;
 }
@@ -506,10 +632,10 @@ void MediaPlayerPrivateMediaStreamAVFObjC::play()
 {
     LOG(Media, "MediaPlayerPrivateMediaStreamAVFObjC::play(%p)", this);
 
-    if (!metaDataAvailable() || m_playing || m_ended)
+    if (!metaDataAvailable() || playing() || m_ended)
         return;
 
-    m_playing = true;
+    m_playbackState = PlaybackState::Playing;
     if (!m_clock->isRunning())
         m_clock->start();
 
@@ -528,11 +654,11 @@ void MediaPlayerPrivateMediaStreamAVFObjC::pause()
 {
     LOG(Media, "MediaPlayerPrivateMediaStreamAVFObjC::pause(%p)", this);
 
-    if (!metaDataAvailable() || !m_playing || m_ended)
+    if (!metaDataAvailable() || !playing() || m_ended)
         return;
 
     m_pausedTime = currentMediaTime();
-    m_playing = false;
+    m_playbackState = PlaybackState::Paused;
 
     for (const auto& track : m_audioTrackMap.values())
         track->pause();
@@ -541,11 +667,6 @@ void MediaPlayerPrivateMediaStreamAVFObjC::pause()
     flushRenderers();
 }
 
-bool MediaPlayerPrivateMediaStreamAVFObjC::paused() const
-{
-    return !m_playing;
-}
-
 void MediaPlayerPrivateMediaStreamAVFObjC::setVolume(float volume)
 {
     LOG(Media, "MediaPlayerPrivateMediaStreamAVFObjC::setVolume(%p)", this);
@@ -593,7 +714,7 @@ MediaTime MediaPlayerPrivateMediaStreamAVFObjC::durationMediaTime() const
 
 MediaTime MediaPlayerPrivateMediaStreamAVFObjC::currentMediaTime() const
 {
-    if (!m_playing)
+    if (paused())
         return m_pausedTime;
 
     return streamTime();
@@ -650,7 +771,7 @@ void MediaPlayerPrivateMediaStreamAVFObjC::activeStatusChanged()
 {
     scheduleDeferredTask([this] {
         bool ended = !m_mediaStreamPrivate->active();
-        if (ended && m_playing)
+        if (ended && playing())
             pause();
 
         updateReadyState();
@@ -670,6 +791,7 @@ void MediaPlayerPrivateMediaStreamAVFObjC::updateRenderingMode()
         return;
 
     scheduleDeferredTask([this] {
+        m_transformIsValid = false;
         if (m_player)
             m_player->client().mediaPlayerRenderingModeChanged(m_player);
     });
@@ -825,7 +947,7 @@ void MediaPlayerPrivateMediaStreamAVFObjC::checkSelectedVideoTrack()
 
         if (oldVideoTrack != m_activeVideoTrack)
             m_imagePainter.reset();
-        ensureLayer();
+        ensureLayers();
         m_sampleBufferDisplayLayer.get().hidden = hideVideoLayer;
         m_pendingSelectedTrackCheck = false;
         updateDisplayMode();
@@ -914,11 +1036,10 @@ void MediaPlayerPrivateMediaStreamAVFObjC::paintCurrentFrameInContext(GraphicsCo
     if (m_displayMode == None || !metaDataAvailable() || context.paintingDisabled())
         return;
 
-    GraphicsContextStateSaver stateSaver(context);
-
     if (m_displayMode != PaintItBlack && m_imagePainter.mediaSample)
         updateCurrentFrameImage();
 
+    GraphicsContextStateSaver stateSaver(context);
     if (m_displayMode == PaintItBlack || !m_imagePainter.cgImage || !m_imagePainter.mediaSample) {
         context.fillRect(IntRect(IntPoint(), IntSize(destRect.width(), destRect.height())), Color::black);
         return;
@@ -926,15 +1047,18 @@ void MediaPlayerPrivateMediaStreamAVFObjC::paintCurrentFrameInContext(GraphicsCo
 
     auto image = m_imagePainter.cgImage.get();
     FloatRect imageRect(0, 0, CGImageGetWidth(image), CGImageGetHeight(image));
-    context.drawNativeImage(image, imageRect.size(), destRect, imageRect);
+    AffineTransform videoTransform = videoTransformationMatrix(*m_imagePainter.mediaSample);
+    FloatRect transformedDestRect = videoTransform.inverse().value_or(AffineTransform()).mapRect(destRect);
+    context.concatCTM(videoTransform);
+    context.drawNativeImage(image, imageRect.size(), transformedDestRect, imageRect);
 }
 
 void MediaPlayerPrivateMediaStreamAVFObjC::acceleratedRenderingStateChanged()
 {
     if (m_player->client().mediaPlayerRenderingCanBeAccelerated(m_player))
-        ensureLayer();
+        ensureLayers();
     else
-        destroyLayer();
+        destroyLayers();
 }
 
 String MediaPlayerPrivateMediaStreamAVFObjC::engineDescription() const
@@ -943,11 +1067,6 @@ String MediaPlayerPrivateMediaStreamAVFObjC::engineDescription() const
     return description;
 }
 
-bool MediaPlayerPrivateMediaStreamAVFObjC::shouldBePlaying() const
-{
-    return m_playing && m_readyState >= MediaPlayer::HaveFutureData;
-}
-
 void MediaPlayerPrivateMediaStreamAVFObjC::setReadyState(MediaPlayer::ReadyState readyState)
 {
     if (m_readyState == readyState)
@@ -969,6 +1088,12 @@ void MediaPlayerPrivateMediaStreamAVFObjC::setNetworkState(MediaPlayer::NetworkS
     m_player->networkStateChanged();
 }
 
+void MediaPlayerPrivateMediaStreamAVFObjC::setShouldBufferData(bool shouldBuffer)
+{
+    if (!shouldBuffer)
+        flushAndRemoveVideoSampleBuffers();
+}
+
 void MediaPlayerPrivateMediaStreamAVFObjC::scheduleDeferredTask(Function<void ()>&& function)
 {
     ASSERT(function);
@@ -987,6 +1112,14 @@ void MediaPlayerPrivateMediaStreamAVFObjC::CurrentFramePainter::reset()
     pixelBufferConformer = nullptr;
 }
 
+void MediaPlayerPrivateMediaStreamAVFObjC::backgroundLayerBoundsChanged()
+{
+    if (!m_backgroundLayer || !m_sampleBufferDisplayLayer)
+        return;
+
+    m_videoSizeChanged = true;
+}
+
 }
 
 #endif
index 1d6d6ef..a821580 100644
@@ -77,7 +77,7 @@ private:
     bool updateFramerate(CMSampleBufferRef);
 
     void captureOutputDidOutputSampleBufferFromConnection(AVCaptureOutputType*, CMSampleBufferRef, AVCaptureConnectionType*) final;
-    void processNewFrame(RetainPtr<CMSampleBufferRef>);
+    void processNewFrame(RetainPtr<CMSampleBufferRef>, RetainPtr<AVCaptureConnectionType>);
 
     RetainPtr<NSString> m_pendingPreset;
     RetainPtr<CMSampleBufferRef> m_buffer;
index 96e216e..4b961cf 100644
@@ -417,7 +417,7 @@ bool AVVideoCaptureSource::updateFramerate(CMSampleBufferRef sampleBuffer)
     return frameRate != m_frameRate;
 }
 
-void AVVideoCaptureSource::processNewFrame(RetainPtr<CMSampleBufferRef> sampleBuffer)
+void AVVideoCaptureSource::processNewFrame(RetainPtr<CMSampleBufferRef> sampleBuffer, RetainPtr<AVCaptureConnectionType> connection)
 {
     // Ignore frames delivered when the session is not running, we want to hang onto the last image
     // delivered before it stopped.
@@ -432,8 +432,27 @@ void AVVideoCaptureSource::processNewFrame(RetainPtr<CMSampleBufferRef> sampleBu
     m_buffer = sampleBuffer;
     m_lastImage = nullptr;
 
+    MediaSample::VideoOrientation orientation = MediaSample::VideoOrientation::Unknown;
+    switch ([connection videoOrientation]) {
+    case AVCaptureVideoOrientationPortrait:
+        orientation = MediaSample::VideoOrientation::Portrait;
+        break;
+    case AVCaptureVideoOrientationPortraitUpsideDown:
+        orientation = MediaSample::VideoOrientation::PortraitUpsideDown;
+        break;
+    case AVCaptureVideoOrientationLandscapeRight:
+        orientation = MediaSample::VideoOrientation::LandscapeRight;
+        break;
+    case AVCaptureVideoOrientationLandscapeLeft:
+        orientation = MediaSample::VideoOrientation::LandscapeLeft;
+        break;
+    }
+
     bool settingsChanged = false;
     CMVideoDimensions dimensions = CMVideoFormatDescriptionGetDimensions(formatDescription);
+    if (orientation == MediaSample::VideoOrientation::LandscapeRight || orientation == MediaSample::VideoOrientation::LandscapeLeft)
+        std::swap(dimensions.width, dimensions.height);
+
     if (dimensions.width != m_width || dimensions.height != m_height) {
         m_width = dimensions.width;
         m_height = dimensions.height;
@@ -443,15 +462,16 @@ void AVVideoCaptureSource::processNewFrame(RetainPtr<CMSampleBufferRef> sampleBu
     if (settingsChanged)
         settingsDidChange();
 
-    videoSampleAvailable(MediaSampleAVFObjC::create(m_buffer.get()));
+    videoSampleAvailable(MediaSampleAVFObjC::create(m_buffer.get(), orientation, [connection isVideoMirrored]));
 }
 
-void AVVideoCaptureSource::captureOutputDidOutputSampleBufferFromConnection(AVCaptureOutputType*, CMSampleBufferRef sampleBuffer, AVCaptureConnectionType*)
+void AVVideoCaptureSource::captureOutputDidOutputSampleBufferFromConnection(AVCaptureOutputType*, CMSampleBufferRef sampleBuffer, AVCaptureConnectionType* captureConnection)
 {
     RetainPtr<CMSampleBufferRef> buffer = sampleBuffer;
+    RetainPtr<AVCaptureConnectionType> connection = captureConnection;
 
-    scheduleDeferredTask([this, buffer] {
-        this->processNewFrame(buffer);
+    scheduleDeferredTask([this, buffer, connection] {
+        this->processNewFrame(buffer, connection);
     });
 }
 
index 8ac57f5..43c5349 100644
@@ -73,9 +73,9 @@ void RealtimeOutgoingVideoSource::RemoveSink(rtc::VideoSinkInterface<webrtc::Vid
     m_sinks.removeFirst(sink);
 }
 
-void RealtimeOutgoingVideoSource::sendFrame(rtc::scoped_refptr<webrtc::VideoFrameBuffer>&& buffer)
+void RealtimeOutgoingVideoSource::sendFrame(rtc::scoped_refptr<webrtc::VideoFrameBuffer>&& buffer, webrtc::VideoRotation rotation)
 {
-    webrtc::VideoFrame frame(buffer, 0, 0, webrtc::kVideoRotation_0);
+    webrtc::VideoFrame frame(buffer, 0, 0, rotation);
     for (auto* sink : m_sinks)
         sink->OnFrame(frame);
 }
@@ -91,7 +91,7 @@ void RealtimeOutgoingVideoSource::videoSampleAvailable(MediaSample& sample)
     if (m_muted || !m_enabled) {
         auto blackBuffer = m_bufferPool.CreateBuffer(settings.width(), settings.height());
         blackBuffer->SetToBlack();
-        sendFrame(WTFMove(blackBuffer));
+        sendFrame(WTFMove(blackBuffer), webrtc::kVideoRotation_0);
         return;
     }
 
@@ -99,8 +99,25 @@ void RealtimeOutgoingVideoSource::videoSampleAvailable(MediaSample& sample)
     auto pixelBuffer = static_cast<CVPixelBufferRef>(CMSampleBufferGetImageBuffer(sample.platformSample().sample.cmSampleBuffer));
     auto pixelFormatType = CVPixelBufferGetPixelFormatType(pixelBuffer);
 
-    if (pixelFormatType == kCVPixelFormatType_420YpCbCr8Planar) {
-        sendFrame(new rtc::RefCountedObject<webrtc::CoreVideoFrameBuffer>(pixelBuffer));
+    webrtc::VideoRotation rotation;
+    switch (sample.videoOrientation()) {
+    case MediaSample::VideoOrientation::Unknown:
+    case MediaSample::VideoOrientation::Portrait:
+        rotation = webrtc::kVideoRotation_0;
+        break;
+    case MediaSample::VideoOrientation::PortraitUpsideDown:
+        rotation = webrtc::kVideoRotation_180;
+        break;
+    case MediaSample::VideoOrientation::LandscapeRight:
+        rotation = webrtc::kVideoRotation_90;
+        break;
+    case MediaSample::VideoOrientation::LandscapeLeft:
+        rotation = webrtc::kVideoRotation_270;
+        break;
+    }
+
+    if (pixelFormatType == kCVPixelFormatType_420YpCbCr8Planar || pixelFormatType == kCVPixelFormatType_420YpCbCr8BiPlanarFullRange) {
+        sendFrame(new rtc::RefCountedObject<webrtc::CoreVideoFrameBuffer>(pixelBuffer), rotation);
         return;
     }
 
@@ -116,7 +133,7 @@ void RealtimeOutgoingVideoSource::videoSampleAvailable(MediaSample& sample)
         webrtc::ConvertToI420(webrtc::kBGRA, source, 0, 0, settings.width(), settings.height(), 0, webrtc::kVideoRotation_0, newBuffer);
     }
     CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
-    sendFrame(WTFMove(newBuffer));
+    sendFrame(WTFMove(newBuffer), rotation);
 }
 
 } // namespace WebCore
index 73e60e9..bb151ec 100644
@@ -50,7 +50,7 @@ public:
 private:
     RealtimeOutgoingVideoSource(Ref<RealtimeMediaSource>&&);
 
-    void sendFrame(rtc::scoped_refptr<webrtc::VideoFrameBuffer>&&);
+    void sendFrame(rtc::scoped_refptr<webrtc::VideoFrameBuffer>&&, webrtc::VideoRotation);
 
     // Notifier API
     void RegisterObserver(webrtc::ObserverInterface*) final { }