[MSE][GStreamer] WebKitMediaSrc rework
author aboya@igalia.com <aboya@igalia.com@268f45cc-cd09-0410-ab3c-d52691b4dbfc>
Wed, 28 Aug 2019 17:04:32 +0000 (17:04 +0000)
committer aboya@igalia.com <aboya@igalia.com@268f45cc-cd09-0410-ab3c-d52691b4dbfc>
Wed, 28 Aug 2019 17:04:32 +0000 (17:04 +0000)
https://bugs.webkit.org/show_bug.cgi?id=199719

Reviewed by Xabier Rodriguez-Calvar.

LayoutTests/imported/w3c:

* web-platform-tests/html/semantics/embedded-content/the-video-element/timeout_on_seek.py: Added.
(parse_range):
(main):
* web-platform-tests/html/semantics/embedded-content/the-video-element/video_timeupdate_on_seek.html: Added.
* web-platform-tests/html/semantics/embedded-content/the-video-element/video_timeupdate_on_seek-expected.txt: Added.
* web-platform-tests/media-source/mediasource-buffered-seek-expected.txt: Added.
* web-platform-tests/media-source/mediasource-buffered-seek.html: Added.

Source/WebCore:

This patch reworks the WebKitMediaSrc element and many of the player
private methods that interacted with it.

Compared with the old WebKitMediaSrc, seek handling in the new element
is massively simplified.

The new WebKitMediaSrc no longer relies on a bin or appsrc, giving it
greater control over its operation. This makes it much easier to
implement features such as seeking before playback and single-stream
flushing.

stream-collection events are emitted from the WebKitMediaSrc to reuse
the track handling in MediaPlayerPrivateGStreamer for playbin3, which
is now used for MSE pipelines.
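The stream-collection flow above can be sketched as follows. This is an
illustrative Python simplification, not the actual WebKit API: the real
code posts a GstStreamCollection message that
MediaPlayerPrivateGStreamer::updateTracks() consumes; the names and data
shapes here are hypothetical.

```python
# Sketch: given a stream collection (pairs of stream id and stream type,
# as a GstStreamCollection conceptually provides), group stream ids by
# track type the way the shared playbin3 track handling does.
AUDIO, VIDEO, TEXT = "audio", "video", "text"

def tracks_from_collection(collection):
    """Group stream ids by track type, as updateTracks() conceptually does."""
    tracks = {AUDIO: [], VIDEO: [], TEXT: []}
    for stream_id, stream_type in collection:
        tracks[stream_type].append(stream_id)
    return tracks
```

With a collection such as `[("a1", "audio"), ("v1", "video")]` this
yields one audio track and one video track, which is why the MSE player
can reuse the playbin3 code path unchanged.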

Additional tests have been added to check some assumptions, and some
bugs that surfaced with the changes have been fixed, but no new
features (such as multi-track support) are implemented in this patch.

One instance of these bugs is `resized` events, which were previously
emitted when frames with different resolutions were appended. This
behavior was wrong and has not been preserved in the rework: resize
events should be emitted when frames are shown, not merely appended.
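The corrected policy can be sketched like this, assuming a tracker for
the last displayed frame size (names are illustrative; the real logic
lives in C++ around triggerRepaint() and
samplesHaveDifferentNaturalSize()):

```python
# Sketch: fire a resize event only when a newly *shown* frame has a
# different natural size than the previously shown one; appending a
# frame with a new resolution must not fire anything by itself.
class ResizeNotifier:
    def __init__(self):
        self._last_shown_size = None
        self.events = []

    def on_frame_shown(self, size):
        if self._last_shown_size is not None and size != self._last_shown_size:
            self.events.append(size)
        self._last_shown_size = size

    def on_frame_appended(self, size):
        pass  # appending alone must not trigger a resize event
```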

There are subtler bugfixes, such as ignoring PTS-less frames in
AppendPipeline::appsinkNewSample(). These frames are problematic for
MSE, yet they were somehow passing through the pipelines. Since the
new WebKitMediaSrc is stricter with assertions, they are now filtered
out.
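The filtering amounts to dropping any sample that lacks a presentation
timestamp before it is enqueued. A minimal Python sketch (the actual
check is C++ in AppendPipeline::appsinkNewSample(); the dict-based
sample shape is an assumption for illustration):

```python
# Sketch: keep only samples that carry a valid presentation timestamp,
# since PTS-less frames cannot be placed on the MSE timeline.
def filter_pts_less(samples):
    return [s for s in samples if s.get("pts") is not None]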

This patch gets rid of !m_mseSeekCompleted assertion failures in tests
and potentially other hard-to-debug bugs in the previous seek
algorithm.

This patch makes the following existing tests pass:

imported/w3c/web-platform-tests/media-source/mediasource-config-change-webm-a-bitrate.html
imported/w3c/web-platform-tests/media-source/mediasource-config-change-webm-v-framesize.html

New test: imported/w3c/web-platform-tests/media-source/mediasource-buffered-seek.html
New test: LayoutTests/imported/w3c/web-platform-tests/html/semantics/embedded-content/the-video-element/video_timeupdate_on_seek.html (non-MSE related)

* Headers.cmake:
* platform/GStreamer.cmake:
* platform/graphics/gstreamer/GRefPtrGStreamer.cpp:
(WTF::adoptGRef):
(WTF::refGPtr<GstMiniObject>):
(WTF::derefGPtr<GstMiniObject>):
* platform/graphics/gstreamer/GRefPtrGStreamer.h:
* platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp:
(WebCore::MediaPlayerPrivateGStreamer::playbackPosition const):
(WebCore::MediaPlayerPrivateGStreamer::paused const):
(WebCore::MediaPlayerPrivateGStreamer::updateTracks):
(WebCore::MediaPlayerPrivateGStreamer::enableTrack):
(WebCore::MediaPlayerPrivateGStreamer::notifyPlayerOfVideo):
(WebCore::MediaPlayerPrivateGStreamer::sourceSetup):
(WebCore::MediaPlayerPrivateGStreamer::handleSyncMessage):
(WebCore::MediaPlayerPrivateGStreamer::createGSTPlayBin):
* platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h:
(WebCore::MediaPlayerPrivateGStreamer::invalidateCachedPosition):
* platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.cpp:
(WebCore::MediaPlayerPrivateGStreamerBase::naturalSize const):
(WebCore::MediaPlayerPrivateGStreamerBase::naturalSizeFromCaps const):
(WebCore::MediaPlayerPrivateGStreamerBase::samplesHaveDifferentNaturalSize const):
(WebCore::MediaPlayerPrivateGStreamerBase::triggerRepaint):
* platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.h:
* platform/graphics/gstreamer/MediaSampleGStreamer.cpp:
(WebCore::MediaSampleGStreamer::MediaSampleGStreamer):
* platform/graphics/gstreamer/mse/AppendPipeline.cpp:
(WebCore::AppendPipeline::appsinkNewSample):
(WebCore::AppendPipeline::connectDemuxerSrcPadToAppsink):
* platform/graphics/gstreamer/mse/AppendPipeline.h:
(WebCore::AppendPipeline::appsinkCaps):
(WebCore::AppendPipeline::streamType):
(WebCore::AppendPipeline::demuxerSrcPadCaps):
* platform/graphics/gstreamer/mse/MediaPlayerPrivateGStreamerMSE.cpp:
(WebCore::MediaPlayerPrivateGStreamerMSE::~MediaPlayerPrivateGStreamerMSE):
(WebCore::MediaPlayerPrivateGStreamerMSE::load):
(WebCore::MediaPlayerPrivateGStreamerMSE::play):
(WebCore::MediaPlayerPrivateGStreamerMSE::pause):
(WebCore::MediaPlayerPrivateGStreamerMSE::seek):
(WebCore::MediaPlayerPrivateGStreamerMSE::seekCompleted):
(WebCore::MediaPlayerPrivateGStreamerMSE::setReadyState):
(WebCore::MediaPlayerPrivateGStreamerMSE::sourceSetup):
(WebCore::MediaPlayerPrivateGStreamerMSE::updateStates):
(WebCore::MediaPlayerPrivateGStreamerMSE::didEnd):
(WebCore::MediaPlayerPrivateGStreamerMSE::unblockDurationChanges):
(WebCore::MediaPlayerPrivateGStreamerMSE::durationChanged):
(WebCore::MediaPlayerPrivateGStreamerMSE::trackDetected):
* platform/graphics/gstreamer/mse/MediaPlayerPrivateGStreamerMSE.h:
* platform/graphics/gstreamer/mse/MediaSourceClientGStreamerMSE.cpp:
(WebCore::MediaSourceClientGStreamerMSE::addSourceBuffer):
(WebCore::MediaSourceClientGStreamerMSE::removedFromMediaSource):
(WebCore::MediaSourceClientGStreamerMSE::flush):
(WebCore::MediaSourceClientGStreamerMSE::enqueueSample):
(WebCore::MediaSourceClientGStreamerMSE::isReadyForMoreSamples):
(WebCore::MediaSourceClientGStreamerMSE::notifyClientWhenReadyForMoreSamples):
(WebCore::MediaSourceClientGStreamerMSE::allSamplesInTrackEnqueued):
* platform/graphics/gstreamer/mse/MediaSourceClientGStreamerMSE.h:
* platform/graphics/gstreamer/mse/MediaSourceGStreamer.cpp:
(WebCore::MediaSourceGStreamer::markEndOfStream):
(WebCore::MediaSourceGStreamer::unmarkEndOfStream):
(WebCore::MediaSourceGStreamer::waitForSeekCompleted):
* platform/graphics/gstreamer/mse/MediaSourceGStreamer.h:
* platform/graphics/gstreamer/mse/PlaybackPipeline.cpp: Removed.
* platform/graphics/gstreamer/mse/PlaybackPipeline.h: Removed.
* platform/graphics/gstreamer/mse/SourceBufferPrivateGStreamer.cpp:
(WebCore::SourceBufferPrivateGStreamer::enqueueSample):
(WebCore::SourceBufferPrivateGStreamer::isReadyForMoreSamples):
(WebCore::SourceBufferPrivateGStreamer::notifyClientWhenReadyForMoreSamples):
* platform/graphics/gstreamer/mse/SourceBufferPrivateGStreamer.h:
* platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamer.cpp:
(WebKitMediaSrcPrivate::streamByName):
(Stream::Stream):
(Stream::StreamingMembers::StreamingMembers):
(Stream::StreamingMembers::durationEnqueued const):
(findPipeline):
(webkit_media_src_class_init):
(webkit_media_src_init):
(webKitMediaSrcFinalize):
(debugProbe):
(collectionPlusStream):
(collectionMinusStream):
(gstStreamType):
(webKitMediaSrcAddStream):
(webKitMediaSrcRemoveStream):
(webKitMediaSrcActivateMode):
(webKitMediaSrcPadLinked):
(webKitMediaSrcStreamNotifyLowWaterLevel):
(webKitMediaSrcLoop):
(webKitMediaSrcEnqueueObject):
(webKitMediaSrcEnqueueSample):
(webKitMediaSrcEnqueueEvent):
(webKitMediaSrcEndOfStream):
(webKitMediaSrcIsReadyForMoreSamples):
(webKitMediaSrcNotifyWhenReadyForMoreSamples):
(webKitMediaSrcChangeState):
(webKitMediaSrcStreamFlushStart):
(webKitMediaSrcStreamFlushStop):
(webKitMediaSrcFlush):
(webKitMediaSrcSeek):
(countStreamsOfType):
(webKitMediaSrcGetProperty):
(webKitMediaSrcUriGetType):
(webKitMediaSrcGetProtocols):
(webKitMediaSrcGetUri):
(webKitMediaSrcSetUri):
(webKitMediaSrcUriHandlerInit):
* platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamer.h:
* platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamerPrivate.h: Removed.

Tools:

Added WebKitMediaSourceGStreamer.cpp to the GStreamer-style coding
whitelist.

* Scripts/webkitpy/style/checker.py:

LayoutTests:

Updated expectations.

* platform/gtk/TestExpectations:
* platform/mac/TestExpectations:
* platform/ios-simulator/TestExpectations:
* platform/mac/imported/w3c/web-platform-tests/html/semantics/embedded-content/the-video-element/video_timeupdate_on_seek-expected.txt: Added.

git-svn-id: https://svn.webkit.org/repository/webkit/trunk@249205 268f45cc-cd09-0410-ab3c-d52691b4dbfc

38 files changed:
LayoutTests/ChangeLog
LayoutTests/imported/w3c/ChangeLog
LayoutTests/imported/w3c/web-platform-tests/html/semantics/embedded-content/the-video-element/timeout_on_seek.py [new file with mode: 0644]
LayoutTests/imported/w3c/web-platform-tests/html/semantics/embedded-content/the-video-element/video_timeupdate_on_seek-expected.txt [new file with mode: 0644]
LayoutTests/imported/w3c/web-platform-tests/html/semantics/embedded-content/the-video-element/video_timeupdate_on_seek.html [new file with mode: 0644]
LayoutTests/imported/w3c/web-platform-tests/media-source/mediasource-buffered-seek-expected.txt [new file with mode: 0644]
LayoutTests/imported/w3c/web-platform-tests/media-source/mediasource-buffered-seek.html [new file with mode: 0644]
LayoutTests/platform/gtk/TestExpectations
LayoutTests/platform/ios-simulator/TestExpectations
LayoutTests/platform/mac/TestExpectations
LayoutTests/platform/mac/imported/w3c/web-platform-tests/html/semantics/embedded-content/the-video-element/video_timeupdate_on_seek-expected.txt [new file with mode: 0644]
Source/WebCore/ChangeLog
Source/WebCore/platform/GStreamer.cmake
Source/WebCore/platform/graphics/gstreamer/GRefPtrGStreamer.cpp
Source/WebCore/platform/graphics/gstreamer/GRefPtrGStreamer.h
Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp
Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h
Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.cpp
Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.h
Source/WebCore/platform/graphics/gstreamer/MediaSampleGStreamer.cpp
Source/WebCore/platform/graphics/gstreamer/mse/AppendPipeline.cpp
Source/WebCore/platform/graphics/gstreamer/mse/AppendPipeline.h
Source/WebCore/platform/graphics/gstreamer/mse/MediaPlayerPrivateGStreamerMSE.cpp
Source/WebCore/platform/graphics/gstreamer/mse/MediaPlayerPrivateGStreamerMSE.h
Source/WebCore/platform/graphics/gstreamer/mse/MediaSourceClientGStreamerMSE.cpp
Source/WebCore/platform/graphics/gstreamer/mse/MediaSourceClientGStreamerMSE.h
Source/WebCore/platform/graphics/gstreamer/mse/MediaSourceGStreamer.cpp
Source/WebCore/platform/graphics/gstreamer/mse/MediaSourceGStreamer.h
Source/WebCore/platform/graphics/gstreamer/mse/PlaybackPipeline.cpp [deleted file]
Source/WebCore/platform/graphics/gstreamer/mse/PlaybackPipeline.h [deleted file]
Source/WebCore/platform/graphics/gstreamer/mse/SourceBufferPrivateGStreamer.cpp
Source/WebCore/platform/graphics/gstreamer/mse/SourceBufferPrivateGStreamer.h
Source/WebCore/platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamer.cpp
Source/WebCore/platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamer.h
Source/WebCore/platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamerPrivate.h [deleted file]
Source/cmake/GStreamerChecks.cmake
Tools/ChangeLog
Tools/Scripts/webkitpy/style/checker.py

index dff8af4..a65c787 100644 (file)
@@ -1,3 +1,17 @@
+2019-08-28  Alicia Boya García  <aboya@igalia.com>
+
+        [MSE][GStreamer] WebKitMediaSrc rework
+        https://bugs.webkit.org/show_bug.cgi?id=199719
+
+        Reviewed by Xabier Rodriguez-Calvar.
+
+        Updated expectations.
+
+        * platform/gtk/TestExpectations:
+        * platform/mac/TestExpectations:
+        * platform/ios-simulator/TestExpectations:
+        * platform/mac/imported/w3c/web-platform-tests/html/semantics/embedded-content/the-video-element/video_timeupdate_on_seek-expected.txt: Added.
+
 2019-08-28  Jer Noble  <jer.noble@apple.com>
 
         Flaky Test: fullscreen/full-screen-request-removed-with-raf.html
index a64081d..7b5f3f6 100644 (file)
@@ -1,3 +1,18 @@
+2019-08-28  Alicia Boya García  <aboya@igalia.com>
+
+        [MSE][GStreamer] WebKitMediaSrc rework
+        https://bugs.webkit.org/show_bug.cgi?id=199719
+
+        Reviewed by Xabier Rodriguez-Calvar.
+
+        * web-platform-tests/html/semantics/embedded-content/the-video-element/timeout_on_seek.py: Added.
+        (parse_range):
+        (main):
+        * web-platform-tests/html/semantics/embedded-content/the-video-element/video_timeupdate_on_seek.html: Added.
+        * web-platform-tests/html/semantics/embedded-content/the-video-element/video_timeupdate_on_seek-expected.txt: Added.
+        * web-platform-tests/media-source/mediasource-buffered-seek-expected.txt: Added.
+        * web-platform-tests/media-source/mediasource-buffered-seek.html: Added.
+
 2019-08-26  Chris Dumez  <cdumez@apple.com>
 
         Change default value of window.open()'s url argument
diff --git a/LayoutTests/imported/w3c/web-platform-tests/html/semantics/embedded-content/the-video-element/timeout_on_seek.py b/LayoutTests/imported/w3c/web-platform-tests/html/semantics/embedded-content/the-video-element/timeout_on_seek.py
new file mode 100644 (file)
index 0000000..a130989
--- /dev/null
@@ -0,0 +1,49 @@
+from __future__ import print_function
+import os
+import re
+from urlparse import parse_qs
+
+def parse_range(header_value, file_size):
+    if header_value is None:
+        # HTTP Range header range end is inclusive
+        return 0, file_size - 1
+
+    match = re.match("bytes=(\d*)-(\d*)", header_value)
+    start = int(match.group(1)) if match.group(1).strip() != "" else 0
+    last = int(match.group(2)) if match.group(2).strip() != "" else file_size - 1
+    return start, last
+
+def main(request, response):
+    file_extension = parse_qs(request.url_parts.query)["extension"][0]
+    with open("media/movie_300." + file_extension, "rb") as f:
+        f.seek(0, os.SEEK_END)
+        file_size = f.tell()
+
+        range_header = request.headers.get("range")
+        req_start, req_last = parse_range(range_header, file_size)
+        f.seek(req_start, os.SEEK_SET)
+
+        response.add_required_headers = False
+        response.writer.write_status(206 if range_header else 200)
+        response.writer.write_header("Accept-Ranges", "bytes")
+        response.writer.write_header("Content-Type", "video/mp4")
+        if range_header:
+            response.writer.write_header("Content-Range", "bytes %d-%d/%d" %
+                    (req_start, req_last, file_size))
+        response.writer.write_header("Content-Length", str(req_last - req_start + 1))
+        response.writer.end_headers()
+
+        gap_start = int(file_size * 0.5)
+        gap_last = int(file_size * 0.95)
+
+        if gap_start < req_start < gap_last:
+            # If the start position is part of the gap, don't send any data
+            return
+
+        if req_start < gap_start:
+            # If the position is before the gap, only send data until the
+            # gap is reached
+            req_last = min(req_last, gap_start)
+
+        size = req_last - req_start + 1
+        response.writer.write(f.read(size))
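The server above simulates a resource with an unreachable byte range, so
a seek into the gap never receives data. Its Range parsing can be
exercised stand-alone; this is a copy for illustration (note that HTTP
Range end offsets are inclusive, and that the simple regex treats a
suffix range like "bytes=-500" as starting at 0, which is acceptable for
this test server):

```python
import re

# Stand-alone copy of the Range parsing used by timeout_on_seek.py.
def parse_range(header_value, file_size):
    if header_value is None:
        # No Range header: serve the whole file (end is inclusive).
        return 0, file_size - 1
    match = re.match(r"bytes=(\d*)-(\d*)", header_value)
    start = int(match.group(1)) if match.group(1).strip() != "" else 0
    last = int(match.group(2)) if match.group(2).strip() != "" else file_size - 1
    return start, last
```

For example, `parse_range(None, 100)` gives `(0, 99)` and
`parse_range("bytes=10-", 100)` gives `(10, 99)`.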
diff --git a/LayoutTests/imported/w3c/web-platform-tests/html/semantics/embedded-content/the-video-element/video_timeupdate_on_seek-expected.txt b/LayoutTests/imported/w3c/web-platform-tests/html/semantics/embedded-content/the-video-element/video_timeupdate_on_seek-expected.txt
new file mode 100644 (file)
index 0000000..aa59401
--- /dev/null
@@ -0,0 +1,4 @@
+
+PASS timeupdate is emitted after a seek before the data is received: mp4. 
+PASS timeupdate is emitted after a seek before the data is received: ogv. 
+
diff --git a/LayoutTests/imported/w3c/web-platform-tests/html/semantics/embedded-content/the-video-element/video_timeupdate_on_seek.html b/LayoutTests/imported/w3c/web-platform-tests/html/semantics/embedded-content/the-video-element/video_timeupdate_on_seek.html
new file mode 100644 (file)
index 0000000..017c71c
--- /dev/null
@@ -0,0 +1,48 @@
+<!DOCTYPE HTML>
+<html>
+<head>
+    <title>HTML5 Media Elements: timeupdate is emitted after a seek before the data is received.</title>
+    <meta content="text/html; charset=UTF-8" http-equiv="Content-Type">
+    <link rel="author" title="Alicia Boya García" href="mailto:aboya@igalia.com"/>
+    <script src="/resources/testharness.js"></script>
+    <script src="/resources/testharnessreport.js"></script>
+</head>
+<body onload="runTests()">
+<script>
+    const seekTime = 60 * 4;
+
+    function testTimeupdateOnSeek(mediaType) {
+        async_test(function (test) {
+            const video = document.createElement("video");
+            video.src = `timeout_on_seek.py?extension=${mediaType}`;
+            video.controls = true;
+            video.defaultMuted = true;
+            document.body.appendChild(video);
+
+            video.addEventListener("canplay", test.step_func(videoCanPlay), {once: true});
+
+            function videoCanPlay() {
+                video.addEventListener("timeupdate", test.step_func(onTimeUpdate));
+                video.play();
+                video.currentTime = seekTime;
+            }
+
+            function onTimeUpdate() {
+                if (Math.abs(video.currentTime - seekTime) <= 1) {
+                    document.body.removeChild(video);
+                    test.done();
+                }
+            }
+        }, `timeupdate is emitted after a seek before the data is received: ${mediaType}.`);
+    }
+
+    function runTests() {
+        const testerVideo = document.createElement("video");
+        if (testerVideo.canPlayType("video/mp4"))
+            testTimeupdateOnSeek("mp4");
+        if (testerVideo.canPlayType("video/ogg"))
+            testTimeupdateOnSeek("ogv");
+    }
+</script>
+</body>
+</html>
diff --git a/LayoutTests/imported/w3c/web-platform-tests/media-source/mediasource-buffered-seek-expected.txt b/LayoutTests/imported/w3c/web-platform-tests/media-source/mediasource-buffered-seek-expected.txt
new file mode 100644 (file)
index 0000000..8b45a97
--- /dev/null
@@ -0,0 +1,3 @@
+
+PASS Test seeking to a buffered location. 
+
diff --git a/LayoutTests/imported/w3c/web-platform-tests/media-source/mediasource-buffered-seek.html b/LayoutTests/imported/w3c/web-platform-tests/media-source/mediasource-buffered-seek.html
new file mode 100644 (file)
index 0000000..17caa38
--- /dev/null
@@ -0,0 +1,58 @@
+<!DOCTYPE html>
+<html>
+    <head>
+        <title>Test MediaSource behavior when a seek is requested to a buffered position.</title>
+        <meta content="text/html; charset=UTF-8" http-equiv="Content-Type">
+        <link rel="author" title="Alicia Boya García" href="mailto:aboya@igalia.com"/>
+        <script src="/resources/testharness.js"></script>
+        <script src="/resources/testharnessreport.js"></script>
+        <script src="mediasource-util.js"></script>
+    </head>
+    <body>
+        <div id="log"></div>
+        <script>
+          mediasource_testafterdataloaded(function(test, mediaElement, mediaSource, segmentInfo, sourceBuffer, mediaData)
+          {
+              mediaElement.play();
+
+              var initSegment = MediaSourceUtil.extractSegmentData(mediaData, segmentInfo.init);
+              var firstSegment = MediaSourceUtil.extractSegmentData(mediaData, segmentInfo.media[0]);
+
+              // Append the initialization segment to trigger a transition to HAVE_METADATA.
+              test.expectEvent(sourceBuffer, 'updateend', 'sourceBuffer');
+              test.expectEvent(mediaElement, 'loadedmetadata', 'Reached HAVE_METADATA');
+              sourceBuffer.appendBuffer(initSegment);
+
+              test.waitForExpectedEvents(function()
+              {
+                  assert_false(mediaElement.seeking, 'mediaElement is not seeking');
+                  assert_equals(mediaElement.readyState, mediaElement.HAVE_METADATA, 'Still in HAVE_METADATA');
+
+                  test.expectEvent(sourceBuffer, 'updateend', 'sourceBuffer');
+                  sourceBuffer.appendBuffer(firstSegment);
+                  test.expectEvent(mediaElement, 'playing', 'mediaElement playing');
+              });
+
+              test.waitForExpectedEvents(function()
+              {
+                  assert_greater_than(mediaElement.readyState, mediaElement.HAVE_METADATA, 'readyState > HAVE_METADATA');
+
+                  // Seek to a position we know is buffered by now.
+                  mediaElement.currentTime = Math.max(segmentInfo.media[0].timev, segmentInfo.media[0].timea) / 2;
+
+                  assert_true(mediaElement.seeking, 'mediaElement is seeking');
+
+                  test.expectEvent(mediaElement, 'seeking', 'mediaElement seeking');
+                  test.expectEvent(mediaElement, 'seeked', 'mediaElement finished seek');
+              });
+
+              test.waitForExpectedEvents(function()
+              {
+                  assert_greater_than(mediaElement.readyState, mediaElement.HAVE_METADATA, 'readyState > HAVE_METADATA');
+                  test.done();
+              });
+
+          }, 'Test seeking to a buffered location.');
+        </script>
+    </body>
+</html>
index 1289ee6..4f75f4f 100644 (file)
@@ -228,7 +228,6 @@ webkit.org/b/167108 imported/w3c/web-platform-tests/media-source/mediasource-avt
 webkit.org/b/167108 imported/w3c/web-platform-tests/media-source/mediasource-buffered.html [ Failure ]
 webkit.org/b/167108 imported/w3c/web-platform-tests/media-source/mediasource-changetype.html [ Failure Crash ]
 webkit.org/b/167108 imported/w3c/web-platform-tests/media-source/mediasource-changetype-play.html [ Failure ]
-webkit.org/b/167108 imported/w3c/web-platform-tests/media-source/mediasource-config-change-webm-v-framesize.html [ Failure Pass ]
 # Crash is webkit.org/b/176020
 webkit.org/b/167108 imported/w3c/web-platform-tests/media-source/mediasource-duration.html [ Failure Crash ]
 # Crash in bug #176019
@@ -246,6 +245,9 @@ webkit.org/b/167108 imported/w3c/web-platform-tests/media-source/mediasource-tra
 # We don't support multiple streams per sourcebuffer nor dynamic type changes (audio/video/text)
 webkit.org/b/165394 media/media-source/media-source-seek-detach-crash.html [ Skip ]
 
+# There is an oggdemux bug that deadlocks WebKit: https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/issues/639
+imported/w3c/web-platform-tests/html/semantics/embedded-content/the-video-element/video_timeupdate_on_seek.html [ Timeout ]
+
 # CSS filters related failures
 webkit.org/b/99026 css3/filters/effect-brightness.html [ Failure ]
 webkit.org/b/99026 css3/filters/effect-brightness-hw.html [ Failure ]
@@ -2459,7 +2461,7 @@ webkit.org/b/168373 media/media-source/only-bcp47-language-tags-accepted-as-vali
 
 webkit.org/b/172284 svg/animations/animated-svg-image-outside-viewport-paused.html [ Timeout ]
 
-webkit.org/b/172816 media/media-source/media-source-paint-to-canvas.html [ Timeout ]
+webkit.org/b/172816 media/media-source/media-source-paint-to-canvas.html [ Failure ]
 
 webkit.org/b/174242 media/media-fullscreen-pause-inline.html [ Skip ]
 
index 55e2167..39a3b8c 100644 (file)
@@ -126,3 +126,5 @@ webkit.org/b/195466 imported/w3c/web-platform-tests/html/semantics/embedded-cont
 webkit.org/b/195466 imported/w3c/web-platform-tests/html/semantics/embedded-content/media-elements/error-codes/error.html [ Pass Failure ]
 
 imported/w3c/web-platform-tests/wasm [ Skip ]
+
+webkit.org/b/200128 imported/w3c/web-platform-tests/html/semantics/embedded-content/the-video-element/video_timeupdate_on_seek.html [ Timeout Pass ]
index e1245dd..7653c03 100644 (file)
@@ -1986,6 +1986,8 @@ webkit.org/b/200258 [ Debug ] imported/w3c/web-platform-tests/wasm/jsapi/interfa
 # rdar://52594556 (Layout test fast/text/international/system-language/han-quotes.html is failing)
 [ Catalina+ ] fast/text/international/system-language/han-quotes.html [ ImageOnlyFailure ]
 
+webkit.org/b/200128 imported/w3c/web-platform-tests/html/semantics/embedded-content/the-video-element/video_timeupdate_on_seek.html [ Timeout Pass ]
+
 # rdar://52557916 (REGRESSION: fast/css/paint-order.html and fast/css/paint-order-shadow.html are failing)
 [ Catalina+ ] fast/css/paint-order.html [ ImageOnlyFailure ]
 [ Catalina+ ] fast/css/paint-order-shadow.html [ ImageOnlyFailure ]
diff --git a/LayoutTests/platform/mac/imported/w3c/web-platform-tests/html/semantics/embedded-content/the-video-element/video_timeupdate_on_seek-expected.txt b/LayoutTests/platform/mac/imported/w3c/web-platform-tests/html/semantics/embedded-content/the-video-element/video_timeupdate_on_seek-expected.txt
new file mode 100644 (file)
index 0000000..b67431a
--- /dev/null
@@ -0,0 +1,3 @@
+
+PASS timeupdate is emitted after a seek before the data is received: mp4. 
+
index d3a727b..849bc0a 100644 (file)
@@ -1,3 +1,161 @@
+2019-08-28  Alicia Boya García  <aboya@igalia.com>
+
+        [MSE][GStreamer] WebKitMediaSrc rework
+        https://bugs.webkit.org/show_bug.cgi?id=199719
+
+        Reviewed by Xabier Rodriguez-Calvar.
+
+        This patch reworks the WebKitMediaSrc element and many of the player
+        private methods that interacted with it.
+
+        Compared with the old WebKitMediaSrc, seek handling in the new
+        element is massively simplified.
+
+        The new WebKitMediaSrc no longer relies on a bin or appsrc, giving
+        it greater control over its operation. This makes it much easier
+        to implement features such as seeking before playback and
+        single-stream flushing.
+
+        stream-collection events are emitted from the WebKitMediaSrc to reuse
+        the track handling in MediaPlayerPrivateGStreamer for playbin3, which
+        is now used for MSE pipelines.
+
+        Additional tests have been added to check some assumptions, and some
+        bugs that surfaced with the changes have been fixed, but no new
+        features (such as multi-track support) are implemented in this patch.
+
+        One instance of these bugs is `resized` events, which were previously
+        emitted when frames with different resolutions were appended. This
+        behavior was wrong and has not been preserved in the rework: resize
+        events should be emitted when frames are shown, not merely
+        appended.
+
+        There are subtler bugfixes, such as ignoring PTS-less frames in
+        AppendPipeline::appsinkNewSample(). These frames are problematic for
+        MSE, yet they were somehow passing through the pipelines. Since the
+        new WebKitMediaSrc is stricter with assertions, they are now filtered out.
+
+        This patch gets rid of !m_mseSeekCompleted assertion failures in
+        tests and potentially other hard-to-debug bugs in the previous seek
+        algorithm.
+
+        This patch makes the following existing tests pass:
+
+        imported/w3c/web-platform-tests/media-source/mediasource-config-change-webm-a-bitrate.html
+        imported/w3c/web-platform-tests/media-source/mediasource-config-change-webm-v-framesize.html
+
+        New test: imported/w3c/web-platform-tests/media-source/mediasource-buffered-seek.html
+        New test: LayoutTests/imported/w3c/web-platform-tests/html/semantics/embedded-content/the-video-element/video_timeupdate_on_seek.html (non-MSE related)
+
+        * Headers.cmake:
+        * platform/GStreamer.cmake:
+        * platform/graphics/gstreamer/GRefPtrGStreamer.cpp:
+        (WTF::adoptGRef):
+        (WTF::refGPtr<GstMiniObject>):
+        (WTF::derefGPtr<GstMiniObject>):
+        * platform/graphics/gstreamer/GRefPtrGStreamer.h:
+        * platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp:
+        (WebCore::MediaPlayerPrivateGStreamer::playbackPosition const):
+        (WebCore::MediaPlayerPrivateGStreamer::paused const):
+        (WebCore::MediaPlayerPrivateGStreamer::updateTracks):
+        (WebCore::MediaPlayerPrivateGStreamer::enableTrack):
+        (WebCore::MediaPlayerPrivateGStreamer::notifyPlayerOfVideo):
+        (WebCore::MediaPlayerPrivateGStreamer::sourceSetup):
+        (WebCore::MediaPlayerPrivateGStreamer::handleSyncMessage):
+        (WebCore::MediaPlayerPrivateGStreamer::createGSTPlayBin):
+        * platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h:
+        (WebCore::MediaPlayerPrivateGStreamer::invalidateCachedPosition):
+        * platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.cpp:
+        (WebCore::MediaPlayerPrivateGStreamerBase::naturalSize const):
+        (WebCore::MediaPlayerPrivateGStreamerBase::naturalSizeFromCaps const):
+        (WebCore::MediaPlayerPrivateGStreamerBase::samplesHaveDifferentNaturalSize const):
+        (WebCore::MediaPlayerPrivateGStreamerBase::triggerRepaint):
+        * platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.h:
+        * platform/graphics/gstreamer/MediaSampleGStreamer.cpp:
+        (WebCore::MediaSampleGStreamer::MediaSampleGStreamer):
+        * platform/graphics/gstreamer/mse/AppendPipeline.cpp:
+        (WebCore::AppendPipeline::appsinkNewSample):
+        (WebCore::AppendPipeline::connectDemuxerSrcPadToAppsink):
+        * platform/graphics/gstreamer/mse/AppendPipeline.h:
+        (WebCore::AppendPipeline::appsinkCaps):
+        (WebCore::AppendPipeline::streamType):
+        (WebCore::AppendPipeline::demuxerSrcPadCaps):
+        * platform/graphics/gstreamer/mse/MediaPlayerPrivateGStreamerMSE.cpp:
+        (WebCore::MediaPlayerPrivateGStreamerMSE::~MediaPlayerPrivateGStreamerMSE):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::load):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::play):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::pause):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::seek):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::seekCompleted):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::setReadyState):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::sourceSetup):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::updateStates):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::didEnd):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::unblockDurationChanges):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::durationChanged):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::trackDetected):
+        * platform/graphics/gstreamer/mse/MediaPlayerPrivateGStreamerMSE.h:
+        * platform/graphics/gstreamer/mse/MediaSourceClientGStreamerMSE.cpp:
+        (WebCore::MediaSourceClientGStreamerMSE::addSourceBuffer):
+        (WebCore::MediaSourceClientGStreamerMSE::removedFromMediaSource):
+        (WebCore::MediaSourceClientGStreamerMSE::flush):
+        (WebCore::MediaSourceClientGStreamerMSE::enqueueSample):
+        (WebCore::MediaSourceClientGStreamerMSE::isReadyForMoreSamples):
+        (WebCore::MediaSourceClientGStreamerMSE::notifyClientWhenReadyForMoreSamples):
+        (WebCore::MediaSourceClientGStreamerMSE::allSamplesInTrackEnqueued):
+        * platform/graphics/gstreamer/mse/MediaSourceClientGStreamerMSE.h:
+        * platform/graphics/gstreamer/mse/MediaSourceGStreamer.cpp:
+        (WebCore::MediaSourceGStreamer::markEndOfStream):
+        (WebCore::MediaSourceGStreamer::unmarkEndOfStream):
+        (WebCore::MediaSourceGStreamer::waitForSeekCompleted):
+        * platform/graphics/gstreamer/mse/MediaSourceGStreamer.h:
+        * platform/graphics/gstreamer/mse/PlaybackPipeline.cpp: Removed.
+        * platform/graphics/gstreamer/mse/PlaybackPipeline.h: Removed.
+        * platform/graphics/gstreamer/mse/SourceBufferPrivateGStreamer.cpp:
+        (WebCore::SourceBufferPrivateGStreamer::enqueueSample):
+        (WebCore::SourceBufferPrivateGStreamer::isReadyForMoreSamples):
+        (WebCore::SourceBufferPrivateGStreamer::notifyClientWhenReadyForMoreSamples):
+        * platform/graphics/gstreamer/mse/SourceBufferPrivateGStreamer.h:
+        * platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamer.cpp:
+        (WebKitMediaSrcPrivate::streamByName):
+        (Stream::Stream):
+        (Stream::StreamingMembers::StreamingMembers):
+        (Stream::StreamingMembers::durationEnqueued const):
+        (findPipeline):
+        (webkit_media_src_class_init):
+        (webkit_media_src_init):
+        (webKitMediaSrcFinalize):
+        (debugProbe):
+        (collectionPlusStream):
+        (collectionMinusStream):
+        (gstStreamType):
+        (webKitMediaSrcAddStream):
+        (webKitMediaSrcRemoveStream):
+        (webKitMediaSrcActivateMode):
+        (webKitMediaSrcPadLinked):
+        (webKitMediaSrcStreamNotifyLowWaterLevel):
+        (webKitMediaSrcLoop):
+        (webKitMediaSrcEnqueueObject):
+        (webKitMediaSrcEnqueueSample):
+        (webKitMediaSrcEnqueueEvent):
+        (webKitMediaSrcEndOfStream):
+        (webKitMediaSrcIsReadyForMoreSamples):
+        (webKitMediaSrcNotifyWhenReadyForMoreSamples):
+        (webKitMediaSrcChangeState):
+        (webKitMediaSrcStreamFlushStart):
+        (webKitMediaSrcStreamFlushStop):
+        (webKitMediaSrcFlush):
+        (webKitMediaSrcSeek):
+        (countStreamsOfType):
+        (webKitMediaSrcGetProperty):
+        (webKitMediaSrcUriGetType):
+        (webKitMediaSrcGetProtocols):
+        (webKitMediaSrcGetUri):
+        (webKitMediaSrcSetUri):
+        (webKitMediaSrcUriHandlerInit):
+        * platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamer.h:
+        * platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamerPrivate.h: Removed.
+
 2019-08-28  Simon Fraser  <simon.fraser@apple.com>
 
         Have RenderSVGBlock compute visual overflow just like everyone else
index f240ec7..f93ce6b 100644 (file)
@@ -32,7 +32,6 @@ if (ENABLE_VIDEO OR ENABLE_WEB_AUDIO)
         platform/graphics/gstreamer/mse/MediaPlayerPrivateGStreamerMSE.cpp
         platform/graphics/gstreamer/mse/MediaSourceClientGStreamerMSE.cpp
         platform/graphics/gstreamer/mse/MediaSourceGStreamer.cpp
-        platform/graphics/gstreamer/mse/PlaybackPipeline.cpp
         platform/graphics/gstreamer/mse/SourceBufferPrivateGStreamer.cpp
         platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamer.cpp
 
index b15f7f3..669a0eb 100644 (file)
 
 namespace WTF {
 
+template<> GRefPtr<GstMiniObject> adoptGRef(GstMiniObject* ptr)
+{
+    return GRefPtr<GstMiniObject>(ptr, GRefPtrAdopt);
+}
+
+template<> GstMiniObject* refGPtr<GstMiniObject>(GstMiniObject* ptr)
+{
+    if (ptr)
+        gst_mini_object_ref(ptr);
+
+    return ptr;
+}
+
+template<> void derefGPtr<GstMiniObject>(GstMiniObject* ptr)
+{
+    if (ptr)
+        gst_mini_object_unref(ptr);
+}
+
 template <> GRefPtr<GstElement> adoptGRef(GstElement* ptr)
 {
     ASSERT(!ptr || !g_object_is_floating(ptr));
index a1965d4..3d04b1b 100644 (file)
@@ -34,6 +34,10 @@ typedef struct _GstGLContext GstGLContext;
 
 namespace WTF {
 
+template<> GRefPtr<GstMiniObject> adoptGRef(GstMiniObject* ptr);
+template<> GstMiniObject* refGPtr<GstMiniObject>(GstMiniObject* ptr);
+template<> void derefGPtr<GstMiniObject>(GstMiniObject* ptr);
+
 template<> GRefPtr<GstElement> adoptGRef(GstElement* ptr);
 template<> GstElement* refGPtr<GstElement>(GstElement* ptr);
 template<> void derefGPtr<GstElement>(GstElement* ptr);
index ab37ee0..efca2f5 100644 (file)
@@ -358,8 +358,16 @@ void MediaPlayerPrivateGStreamer::commitLoad()
 MediaTime MediaPlayerPrivateGStreamer::playbackPosition() const
 {
     GST_TRACE_OBJECT(pipeline(), "isEndReached: %s, seeking: %s, seekTime: %s", boolForPrinting(m_isEndReached), boolForPrinting(m_seeking), m_seekTime.toString().utf8().data());
-    if (m_isEndReached && m_seeking)
-        return m_seekTime;
+    if (m_isEndReached) {
+        // Position queries on a pipeline that is not running return 0. This happens while prerolling after a
+        // seek has not finished yet, and after EOS. In those cases we want to report the seek time or the
+        // duration, respectively.
+        if (m_seeking)
+            return m_seekTime;
+
+        MediaTime duration = durationMediaTime();
+        return duration.isInvalid() ? MediaTime::zeroTime() : duration;
+    }
 
     // This constant should remain lower than HTMLMediaElement's maxTimeupdateEventFrequency.
     static const Seconds positionCacheThreshold = 200_ms;
@@ -661,10 +669,10 @@ bool MediaPlayerPrivateGStreamer::paused() const
         return false;
     }
 
-    GstState state;
-    gst_element_get_state(m_pipeline.get(), &state, nullptr, 0);
+    GstState state, pending;
+    gst_element_get_state(m_pipeline.get(), &state, &pending, 0);
     bool paused = state <= GST_STATE_PAUSED;
-    GST_LOG_OBJECT(pipeline(), "Paused: %s", toString(paused).utf8().data());
+    GST_LOG_OBJECT(pipeline(), "Paused: %s (pending state: %s)", toString(paused).utf8().data(), gst_element_state_get_name(pending));
     return paused;
 }
 
@@ -710,13 +718,11 @@ FloatSize MediaPlayerPrivateGStreamer::naturalSize() const
 #if ENABLE(VIDEO_TRACK)
 #define CREATE_TRACK(type, Type) G_STMT_START {                         \
         m_has##Type = true;                                             \
-        if (!useMediaSource) {                                          \
-            RefPtr<Type##TrackPrivateGStreamer> track = Type##TrackPrivateGStreamer::create(makeWeakPtr(*this), i, stream); \
-            m_##type##Tracks.add(track->id(), track);                   \
-            m_player->add##Type##Track(*track);                         \
-            if (gst_stream_get_stream_flags(stream.get()) & GST_STREAM_FLAG_SELECT) \
-                m_current##Type##StreamId = String(gst_stream_get_stream_id(stream.get())); \
-        }                                                               \
+        RefPtr<Type##TrackPrivateGStreamer> track = Type##TrackPrivateGStreamer::create(makeWeakPtr(*this), i, stream); \
+        m_##type##Tracks.add(track->id(), track);                       \
+        m_player->add##Type##Track(*track);                             \
+        if (gst_stream_get_stream_flags(stream.get()) & GST_STREAM_FLAG_SELECT) \
+            m_current##Type##StreamId = String(gst_stream_get_stream_id(stream.get())); \
     } G_STMT_END
 #else
 #define CREATE_TRACK(type, Type) G_STMT_START { \
@@ -730,6 +736,8 @@ void MediaPlayerPrivateGStreamer::updateTracks()
 
     bool useMediaSource = isMediaSource();
     unsigned length = gst_stream_collection_get_size(m_streamCollection.get());
+    GST_DEBUG_OBJECT(pipeline(), "Inspecting stream collection: %s %" GST_PTR_FORMAT,
+        gst_stream_collection_get_upstream_id(m_streamCollection.get()), m_streamCollection.get());
 
     bool oldHasAudio = m_hasAudio;
     bool oldHasVideo = m_hasVideo;
@@ -769,21 +777,13 @@ void MediaPlayerPrivateGStreamer::updateTracks()
 
 void MediaPlayerPrivateGStreamer::enableTrack(TrackPrivateBaseGStreamer::TrackType trackType, unsigned index)
 {
-    // FIXME: Remove isMediaSource() test below when fixing https://bugs.webkit.org/show_bug.cgi?id=182531.
-    if (isMediaSource()) {
-        GST_FIXME_OBJECT(m_pipeline.get(), "Audio/Video/Text track switching is not yet supported by the MSE backend.");
-        return;
-    }
-
     const char* propertyName;
     const char* trackTypeAsString;
     Vector<String> selectedStreams;
     String selectedStreamId;
 
-    GstStream* stream = nullptr;
-
     if (!m_isLegacyPlaybin) {
-        stream = gst_stream_collection_get_stream(m_streamCollection.get(), index);
+        GstStream* stream = gst_stream_collection_get_stream(m_streamCollection.get(), index);
         if (!stream) {
             GST_WARNING_OBJECT(pipeline(), "No stream to select at index %u", index);
             return;
@@ -864,7 +864,8 @@ void MediaPlayerPrivateGStreamer::notifyPlayerOfVideo()
     if (UNLIKELY(!m_pipeline || !m_source))
         return;
 
-    ASSERT(m_isLegacyPlaybin || isMediaSource());
+    ASSERT(m_isLegacyPlaybin);
+    ASSERT(!isMediaSource());
 
     gint numTracks = 0;
     bool useMediaSource = isMediaSource();
@@ -917,19 +918,6 @@ void MediaPlayerPrivateGStreamer::notifyPlayerOfVideo()
     m_player->client().mediaPlayerEngineUpdated(m_player);
 }
 
-void MediaPlayerPrivateGStreamer::videoSinkCapsChangedCallback(MediaPlayerPrivateGStreamer* player)
-{
-    player->m_notifier->notify(MainThreadNotification::VideoCapsChanged, [player] {
-        player->notifyPlayerOfVideoCaps();
-    });
-}
-
-void MediaPlayerPrivateGStreamer::notifyPlayerOfVideoCaps()
-{
-    m_videoSize = IntSize();
-    m_player->client().mediaPlayerEngineUpdated(m_player);
-}
-
 void MediaPlayerPrivateGStreamer::audioChangedCallback(MediaPlayerPrivateGStreamer* player)
 {
     player->m_notifier->notify(MainThreadNotification::AudioChanged, [player] {
@@ -1848,6 +1836,7 @@ void MediaPlayerPrivateGStreamer::purgeOldDownloadFiles(const char* downloadFile
 
 void MediaPlayerPrivateGStreamer::sourceSetup(GstElement* sourceElement)
 {
+    ASSERT(!isMediaSource());
     GST_DEBUG_OBJECT(pipeline(), "Source element set-up for %s", GST_ELEMENT_NAME(sourceElement));
 
     if (WEBKIT_IS_WEB_SRC(m_source.get()) && GST_OBJECT_PARENT(m_source.get()))
@@ -2098,15 +2087,21 @@ void MediaPlayerPrivateGStreamer::updateStates()
 bool MediaPlayerPrivateGStreamer::handleSyncMessage(GstMessage* message)
 {
     if (GST_MESSAGE_TYPE(message) == GST_MESSAGE_STREAM_COLLECTION && !m_isLegacyPlaybin) {
+        // GStreamer workaround:
+        // Unfortunately, when we have a stream-collection-aware source (like WebKitMediaSrc), parsebin and decodebin3 emit
+        // their own stream-collection messages, but late, and sometimes with duplicated streams. Let's only listen for
+        // stream-collection messages from the source in the MSE case to avoid these issues.
+        if (isMediaSource() && message->src != GST_OBJECT(m_source.get()))
+            return true;
+
         GRefPtr<GstStreamCollection> collection;
         gst_message_parse_stream_collection(message, &collection.outPtr());
+        ASSERT(collection);
+        m_streamCollection.swap(collection);
 
-        if (collection) {
-            m_streamCollection.swap(collection);
-            m_notifier->notify(MainThreadNotification::StreamCollectionChanged, [this] {
-                this->updateTracks();
-            });
-        }
+        m_notifier->notify(MainThreadNotification::StreamCollectionChanged, [this] {
+            this->updateTracks();
+        });
     }
 
     return MediaPlayerPrivateGStreamerBase::handleSyncMessage(message);
@@ -2390,10 +2385,9 @@ void MediaPlayerPrivateGStreamer::createGSTPlayBin(const URL& url, const String&
 {
     const gchar* playbinName = "playbin";
 
-    // MSE doesn't support playbin3. Mediastream requires playbin3. Regular
-    // playback can use playbin3 on-demand with the WEBKIT_GST_USE_PLAYBIN3
-    // environment variable.
-    if ((!isMediaSource() && g_getenv("WEBKIT_GST_USE_PLAYBIN3")) || url.protocolIs("mediastream"))
+    // MSE and MediaStream require playbin3. Regular playback can use playbin3 on-demand with the
+    // WEBKIT_GST_USE_PLAYBIN3 environment variable.
+    if (isMediaSource() || url.protocolIs("mediastream") || g_getenv("WEBKIT_GST_USE_PLAYBIN3"))
         playbinName = "playbin3";
 
     if (m_pipeline) {
@@ -2474,8 +2468,6 @@ void MediaPlayerPrivateGStreamer::createGSTPlayBin(const URL& url, const String&
 
     g_object_set(m_pipeline.get(), "video-sink", createVideoSink(), "audio-sink", createAudioSink(), nullptr);
 
-    configurePlaySink();
-
     if (m_preservesPitch) {
         GstElement* scale = gst_element_factory_make("scaletempo", nullptr);
 
@@ -2495,10 +2487,6 @@ void MediaPlayerPrivateGStreamer::createGSTPlayBin(const URL& url, const String&
         } else
             GST_WARNING("The videoflip element is missing, video rotation support is now disabled. Please check your gst-plugins-good installation.");
     }
-
-    GRefPtr<GstPad> videoSinkPad = adoptGRef(gst_element_get_static_pad(m_videoSink.get(), "sink"));
-    if (videoSinkPad)
-        g_signal_connect_swapped(videoSinkPad.get(), "notify::caps", G_CALLBACK(videoSinkCapsChangedCallback), this);
 }
 
 void MediaPlayerPrivateGStreamer::simulateAudioInterruption()
index 511325e..e041235 100644 (file)
@@ -114,14 +114,12 @@ public:
 
     void loadStateChanged();
     void timeChanged();
-    void didEnd();
     virtual void durationChanged();
     void loadingFailed(MediaPlayer::NetworkState, MediaPlayer::ReadyState = MediaPlayer::HaveNothing, bool forceNotifications = false);
 
     virtual void sourceSetup(GstElement*);
 
     GstElement* audioSink() const override;
-    virtual void configurePlaySink() { }
 
     void simulateAudioInterruption() override;
 
@@ -206,7 +204,7 @@ protected:
     GstState m_requestedState;
     bool m_resetPipeline;
     bool m_seeking;
-    bool m_seekIsPending;
+    bool m_seekIsPending; // Set when the user requests a seek but GStreamer can't handle it yet, so it's deferred until the pipeline is >= PAUSED.
     MediaTime m_seekTime;
     GRefPtr<GstElement> m_source;
     bool m_volumeAndMuteInitialized;
@@ -214,7 +212,6 @@ protected:
     void readyTimerFired();
 
     void notifyPlayerOfVideo();
-    void notifyPlayerOfVideoCaps();
     void notifyPlayerOfAudio();
 
 #if ENABLE(VIDEO_TRACK)
@@ -225,11 +222,13 @@ protected:
     void ensureAudioSourceProvider();
     void setAudioStreamProperties(GObject*);
 
+    virtual void didEnd();
+    void invalidateCachedPosition() { m_lastQueryTime.reset(); }
+
     static void setAudioStreamPropertiesCallback(MediaPlayerPrivateGStreamer*, GObject*);
 
     static void sourceSetupCallback(MediaPlayerPrivateGStreamer*, GstElement*);
     static void videoChangedCallback(MediaPlayerPrivateGStreamer*);
-    static void videoSinkCapsChangedCallback(MediaPlayerPrivateGStreamer*);
     static void audioChangedCallback(MediaPlayerPrivateGStreamer*);
 #if ENABLE(VIDEO_TRACK)
     static void textChangedCallback(MediaPlayerPrivateGStreamer*);
index 35d6ae9..fc3f0be 100644 (file)
@@ -486,6 +486,7 @@ bool MediaPlayerPrivateGStreamerBase::ensureGstGLContext()
 // Returns the size of the video
 FloatSize MediaPlayerPrivateGStreamerBase::naturalSize() const
 {
+    ASSERT(isMainThread());
 #if USE(GSTREAMER_HOLEPUNCH)
      // When using the holepunch we may not be able to get the video frame size, so we can't use
      // it. But we need to report some non-empty naturalSize for the player's GraphicsLayer
@@ -500,6 +501,7 @@ FloatSize MediaPlayerPrivateGStreamerBase::naturalSize() const
         return m_videoSize;
 
     auto sampleLocker = holdLock(m_sampleMutex);
+
     if (!GST_IS_SAMPLE(m_sample.get()))
         return FloatSize();
 
@@ -507,6 +509,14 @@ FloatSize MediaPlayerPrivateGStreamerBase::naturalSize() const
     if (!caps)
         return FloatSize();
 
+    m_videoSize = naturalSizeFromCaps(caps);
+    GST_DEBUG_OBJECT(pipeline(), "Natural size: %.0fx%.0f", m_videoSize.width(), m_videoSize.height());
+    return m_videoSize;
+}
+
+FloatSize MediaPlayerPrivateGStreamerBase::naturalSizeFromCaps(GstCaps* caps) const
+{
+    ASSERT(caps);
 
     // TODO: handle possible clean aperture data. See
     // https://bugzilla.gnome.org/show_bug.cgi?id=596571
@@ -557,9 +567,7 @@ FloatSize MediaPlayerPrivateGStreamerBase::naturalSize() const
         height = static_cast<guint64>(originalSize.height());
     }
 
-    GST_DEBUG_OBJECT(pipeline(), "Natural size: %" G_GUINT64_FORMAT "x%" G_GUINT64_FORMAT, width, height);
-    m_videoSize = FloatSize(static_cast<int>(width), static_cast<int>(height));
-    return m_videoSize;
+    return FloatSize(static_cast<int>(width), static_cast<int>(height));
 }
 
 void MediaPlayerPrivateGStreamerBase::setVolume(float volume)
@@ -613,11 +621,6 @@ MediaPlayer::ReadyState MediaPlayerPrivateGStreamerBase::readyState() const
     return m_readyState;
 }
 
-void MediaPlayerPrivateGStreamerBase::sizeChanged()
-{
-    notImplemented();
-}
-
 void MediaPlayerPrivateGStreamerBase::setMuted(bool mute)
 {
     if (!m_volumeElement)
@@ -749,12 +752,28 @@ void MediaPlayerPrivateGStreamerBase::repaint()
     m_drawCondition.notifyOne();
 }
 
+bool MediaPlayerPrivateGStreamerBase::doSamplesHaveDifferentNaturalSizes(GstSample* sampleA, GstSample* sampleB) const
+{
+    ASSERT(sampleA);
+    ASSERT(sampleB);
+
+    GstCaps* capsA = gst_sample_get_caps(sampleA);
+    GstCaps* capsB = gst_sample_get_caps(sampleB);
+
+    if (LIKELY(capsA == capsB))
+        return false;
+
+    return naturalSizeFromCaps(capsA) != naturalSizeFromCaps(capsB);
+}
+
 void MediaPlayerPrivateGStreamerBase::triggerRepaint(GstSample* sample)
 {
     bool triggerResize;
     {
         auto sampleLocker = holdLock(m_sampleMutex);
-        triggerResize = !m_sample;
+        triggerResize = !m_sample || doSamplesHaveDifferentNaturalSizes(m_sample.get(), sample);
+        if (triggerResize)
+            m_videoSize = FloatSize(); // Force recalculation on the next call to naturalSize().
         m_sample = sample;
     }
 
index e2604c4..8c0a9ec 100644 (file)
@@ -120,7 +120,6 @@ public:
 
     void setVisible(bool) override { }
     void setSize(const IntSize&) override;
-    void sizeChanged();
 
     // Prefer MediaTime based methods over float based.
     float duration() const override { return durationMediaTime().toFloat(); }
@@ -249,7 +248,6 @@ protected:
 
     enum MainThreadNotification {
         VideoChanged = 1 << 0,
-        VideoCapsChanged = 1 << 1,
         AudioChanged = 1 << 2,
         VolumeChanged = 1 << 3,
         MuteChanged = 1 << 4,
@@ -269,10 +267,11 @@ protected:
     MediaPlayer::ReadyState m_readyState;
     mutable MediaPlayer::NetworkState m_networkState;
     IntSize m_size;
+
     mutable Lock m_sampleMutex;
     GRefPtr<GstSample> m_sample;
-
     mutable FloatSize m_videoSize;
+
     bool m_usingFallbackVideoSink { false };
     bool m_renderingCanBeAccelerated { false };
 
@@ -309,6 +308,10 @@ protected:
 
     enum class WebKitGstVideoDecoderPlatform { Video4Linux };
     Optional<WebKitGstVideoDecoderPlatform> m_videoDecoderPlatform;
+
+private:
+    FloatSize naturalSizeFromCaps(GstCaps*) const;
+    bool doSamplesHaveDifferentNaturalSizes(GstSample* sampleA, GstSample* sampleB) const;
 };
 
 }
index f5a7dd6..912aa5f 100644 (file)
@@ -43,7 +43,7 @@ MediaSampleGStreamer::MediaSampleGStreamer(GRefPtr<GstSample>&& sample, const Fl
 
     auto createMediaTime =
         [](GstClockTime time) -> MediaTime {
-            return MediaTime(GST_TIME_AS_USECONDS(time), G_USEC_PER_SEC);
+            return MediaTime(time, GST_SECOND);
         };
 
     if (GST_BUFFER_PTS_IS_VALID(buffer))
index 097b1ce..2255b67 100644 (file)
@@ -452,6 +452,12 @@ void AppendPipeline::appsinkNewSample(GRefPtr<GstSample>&& sample)
         return;
     }
 
+    if (!GST_BUFFER_PTS_IS_VALID(gst_sample_get_buffer(sample.get()))) {
+        // When demuxing Vorbis, matroskademux creates several PTS-less frames with header information. We don't need those.
+        GST_DEBUG("Ignoring sample without PTS: %" GST_PTR_FORMAT, gst_sample_get_buffer(sample.get()));
+        return;
+    }
+
     auto mediaSample = WebCore::MediaSampleGStreamer::create(WTFMove(sample), m_presentationSize, trackId());
 
     GST_TRACE("append: trackId=%s PTS=%s DTS=%s DUR=%s presentationSize=%.0fx%.0f",
@@ -740,6 +746,9 @@ void AppendPipeline::connectDemuxerSrcPadToAppsink(GstPad* demuxerSrcPad)
     // Only one stream per demuxer is supported.
     ASSERT(!gst_pad_is_linked(sinkSinkPad.get()));
 
+    // As it is now, resetParserState() will cause the pads to be disconnected, so they will be re-linked later, when the next initialization segment arrives.
+    bool firstTimeConnectingTrack = m_track == nullptr;
+
     GRefPtr<GstCaps> caps = adoptGRef(gst_pad_get_current_caps(GST_PAD(demuxerSrcPad)));
 
 #ifndef GST_DISABLE_GST_DEBUG
@@ -780,7 +789,7 @@ void AppendPipeline::connectDemuxerSrcPadToAppsink(GstPad* demuxerSrcPad)
     }
 
     m_appsinkCaps = WTFMove(caps);
-    m_playerPrivate->trackDetected(this, m_track, true);
+    m_playerPrivate->trackDetected(this, m_track, firstTimeConnectingTrack);
 }
 
 void AppendPipeline::disconnectDemuxerSrcPadFromAppsinkFromAnyThread(GstPad*)
index 080d2f7..0d5640b 100644 (file)
@@ -52,8 +52,9 @@ public:
     void pushNewBuffer(GRefPtr<GstBuffer>&&);
     void resetParserState();
     Ref<SourceBufferPrivateGStreamer> sourceBufferPrivate() { return m_sourceBufferPrivate.get(); }
-    GstCaps* appsinkCaps() { return m_appsinkCaps.get(); }
+    const GRefPtr<GstCaps>& appsinkCaps() { return m_appsinkCaps; }
     RefPtr<WebCore::TrackPrivateBase> track() { return m_track; }
+    MediaSourceStreamTypeGStreamer streamType() { return m_streamType; }
     MediaPlayerPrivateGStreamerMSE* playerPrivate() { return m_playerPrivate; }
 
 private:
@@ -80,7 +81,6 @@ private:
     GstElement* appsrc() { return m_appsrc.get(); }
     GstElement* appsink() { return m_appsink.get(); }
     GstCaps* demuxerSrcPadCaps() { return m_demuxerSrcPadCaps.get(); }
-    WebCore::MediaSourceStreamTypeGStreamer streamType() { return m_streamType; }
 
     void disconnectDemuxerSrcPadFromAppsinkFromAnyThread(GstPad*);
     void connectDemuxerSrcPadToAppsinkFromStreamingThread(GstPad*);
index e6880d9..8671ce5 100644 (file)
@@ -3,9 +3,9 @@
  * Copyright (C) 2007 Collabora Ltd.  All rights reserved.
  * Copyright (C) 2007 Alp Toker <alp@atoker.com>
  * Copyright (C) 2009 Gustavo Noronha Silva <gns@gnome.org>
- * Copyright (C) 2009, 2010, 2011, 2012, 2013, 2016, 2017 Igalia S.L
+ * Copyright (C) 2009, 2010, 2011, 2012, 2013, 2016, 2017, 2018, 2019 Igalia S.L
  * Copyright (C) 2015 Sebastian Dröge <sebastian@centricular.com>
- * Copyright (C) 2015, 2016, 2017 Metrological Group B.V.
+ * Copyright (C) 2015, 2016, 2017, 2018, 2019 Metrological Group B.V.
  *
  * This library is free software; you can redistribute it and/or
  * modify it under the terms of the GNU Library General Public
@@ -37,7 +37,6 @@
 #include "MediaDescription.h"
 #include "MediaPlayer.h"
 #include "NotImplemented.h"
-#include "PlaybackPipeline.h"
 #include "SourceBufferPrivateGStreamer.h"
 #include "TimeRanges.h"
 #include "VideoTrackPrivateGStreamer.h"
@@ -100,13 +99,7 @@ MediaPlayerPrivateGStreamerMSE::~MediaPlayerPrivateGStreamerMSE()
 #endif
     m_appendPipelinesMap.clear();
 
-    if (m_source) {
-        webKitMediaSrcSetMediaPlayerPrivate(WEBKIT_MEDIA_SRC(m_source.get()), nullptr);
-        g_signal_handlers_disconnect_by_data(m_source.get(), this);
-    }
-
-    if (m_playbackPipeline)
-        m_playbackPipeline->setWebKitMediaSrc(nullptr);
+    m_source.clear();
 }
 
 void MediaPlayerPrivateGStreamerMSE::load(const String& urlString)
@@ -118,9 +111,6 @@ void MediaPlayerPrivateGStreamerMSE::load(const String& urlString)
         return;
     }
 
-    if (!m_playbackPipeline)
-        m_playbackPipeline = PlaybackPipeline::create();
-
     MediaPlayerPrivateGStreamer::load(urlString);
 }
 
@@ -130,10 +120,18 @@ void MediaPlayerPrivateGStreamerMSE::load(const String& url, MediaSourcePrivateC
     load(makeString("mediasource", url));
 }
 
+void MediaPlayerPrivateGStreamerMSE::play()
+{
+    GST_DEBUG_OBJECT(pipeline(), "Play requested");
+    m_paused = false;
+    updateStates();
+}
+
 void MediaPlayerPrivateGStreamerMSE::pause()
 {
+    GST_DEBUG_OBJECT(pipeline(), "Pause requested");
     m_paused = true;
-    MediaPlayerPrivateGStreamer::pause();
+    updateStates();
 }
 
 MediaTime MediaPlayerPrivateGStreamerMSE::durationMediaTime() const
@@ -146,310 +144,48 @@ MediaTime MediaPlayerPrivateGStreamerMSE::durationMediaTime() const
 
 void MediaPlayerPrivateGStreamerMSE::seek(const MediaTime& time)
 {
-    if (UNLIKELY(!m_pipeline || m_errorOccured))
-        return;
-
-    GST_INFO("[Seek] seek attempt to %s secs", toString(time).utf8().data());
-
-    // Avoid useless seeking.
-    MediaTime current = currentMediaTime();
-    if (time == current) {
-        if (!m_seeking)
-            timeChanged();
-        return;
-    }
-
-    if (isLiveStream())
-        return;
-
-    if (m_seeking && m_seekIsPending) {
-        m_seekTime = time;
-        return;
-    }
-
-    GST_DEBUG("Seeking from %s to %s seconds", toString(current).utf8().data(), toString(time).utf8().data());
-
-    MediaTime previousSeekTime = m_seekTime;
     m_seekTime = time;
-
-    if (!doSeek()) {
-        m_seekTime = previousSeekTime;
-        GST_WARNING("Seeking to %s failed", toString(time).utf8().data());
-        return;
-    }
-
-    m_isEndReached = false;
-    GST_DEBUG("m_seeking=%s, m_seekTime=%s", boolForPrinting(m_seeking), toString(m_seekTime).utf8().data());
-}
-
-void MediaPlayerPrivateGStreamerMSE::configurePlaySink()
-{
-    MediaPlayerPrivateGStreamer::configurePlaySink();
-
-    GRefPtr<GstElement> playsink = adoptGRef(gst_bin_get_by_name(GST_BIN(m_pipeline.get()), "playsink"));
-    if (playsink) {
-        // The default value (0) means "send events to all the sinks", instead
-        // of "only to the first that returns true". This is needed for MSE seek.
-        g_object_set(G_OBJECT(playsink.get()), "send-event-mode", 0, nullptr);
-    }
-}
-
-bool MediaPlayerPrivateGStreamerMSE::changePipelineState(GstState newState)
-{
-    if (seeking()) {
-        GST_DEBUG("Rejected state change to %s while seeking",
-            gst_element_state_get_name(newState));
-        return true;
-    }
-
-    return MediaPlayerPrivateGStreamer::changePipelineState(newState);
-}
-
-void MediaPlayerPrivateGStreamerMSE::notifySeekNeedsDataForTime(const MediaTime& seekTime)
-{
-    // Reenqueue samples needed to resume playback in the new position.
-    m_mediaSource->seekToTime(seekTime);
-
-    GST_DEBUG("MSE seek to %s finished", toString(seekTime).utf8().data());
-
-    if (!m_gstSeekCompleted) {
-        m_gstSeekCompleted = true;
-        maybeFinishSeek();
-    }
-}
-
-bool MediaPlayerPrivateGStreamerMSE::doSeek(const MediaTime&, float, GstSeekFlags)
-{
-    // Use doSeek() instead. If anybody is calling this version of doSeek(), something is wrong.
-    ASSERT_NOT_REACHED();
-    return false;
-}
-
-bool MediaPlayerPrivateGStreamerMSE::doSeek()
-{
-    MediaTime seekTime = m_seekTime;
-    double rate = m_player->rate();
-    GstSeekFlags seekType = static_cast<GstSeekFlags>(GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_ACCURATE);
-
-    // Always move to seeking state to report correct 'currentTime' while pending for actual seek to complete.
     m_seeking = true;
 
-    // Check if playback pipeline is ready for seek.
-    GstState state, newState;
-    GstStateChangeReturn getStateResult = gst_element_get_state(m_pipeline.get(), &state, &newState, 0);
-    if (getStateResult == GST_STATE_CHANGE_FAILURE || getStateResult == GST_STATE_CHANGE_NO_PREROLL) {
-        GST_DEBUG("[Seek] cannot seek, current state change is %s", gst_element_state_change_return_get_name(getStateResult));
-        webKitMediaSrcSetReadyForSamples(WEBKIT_MEDIA_SRC(m_source.get()), true);
-        m_seeking = false;
-        return false;
-    }
-    if ((getStateResult == GST_STATE_CHANGE_ASYNC
-        && !(state == GST_STATE_PLAYING && newState == GST_STATE_PAUSED))
-        || state < GST_STATE_PAUSED
-        || m_isEndReached
-        || !m_gstSeekCompleted) {
-        CString reason = "Unknown reason";
-        if (getStateResult == GST_STATE_CHANGE_ASYNC) {
-            reason = makeString("In async change ",
-                gst_element_state_get_name(state), " --> ",
-                gst_element_state_get_name(newState)).utf8();
-        } else if (state < GST_STATE_PAUSED)
-            reason = "State less than PAUSED";
-        else if (m_isEndReached)
-            reason = "End reached";
-        else if (!m_gstSeekCompleted)
-            reason = "Previous seek is not finished yet";
-
-        GST_DEBUG("[Seek] Delaying the seek: %s", reason.data());
-
-        m_seekIsPending = true;
-
-        if (m_isEndReached) {
-            GST_DEBUG("[Seek] reset pipeline");
-            m_resetPipeline = true;
-            m_seeking = false;
-            if (!changePipelineState(GST_STATE_PAUSED))
-                loadingFailed(MediaPlayer::Empty);
-            else
-                m_seeking = true;
-        }
-
-        return m_seeking;
-    }
-
-    // Stop accepting new samples until actual seek is finished.
-    webKitMediaSrcSetReadyForSamples(WEBKIT_MEDIA_SRC(m_source.get()), false);
-
-    // Correct seek time if it helps to fix a small gap.
-    if (!isTimeBuffered(seekTime)) {
-        // Look if a near future time (<0.1 sec.) is buffered and change the seek target time.
-        if (m_mediaSource) {
-            const MediaTime miniGap = MediaTime(1, 10);
-            MediaTime nearest = m_mediaSource->buffered()->nearest(seekTime);
-            if (nearest.isValid() && nearest > seekTime && (nearest - seekTime) <= miniGap && isTimeBuffered(nearest + miniGap)) {
-                GST_DEBUG("[Seek] Changed the seek target time from %s to %s, a near point in the future", toString(seekTime).utf8().data(), toString(nearest).utf8().data());
-                seekTime = nearest;
-            }
-        }
-    }
+    webKitMediaSrcSeek(WEBKIT_MEDIA_SRC(m_source.get()), toGstClockTime(m_seekTime), m_playbackRate);
 
-    // Check if MSE has samples for requested time and defer actual seek if needed.
-    if (!isTimeBuffered(seekTime)) {
-        GST_DEBUG("[Seek] Delaying the seek: MSE is not ready");
-        GstStateChangeReturn setStateResult = gst_element_set_state(m_pipeline.get(), GST_STATE_PAUSED);
-        if (setStateResult == GST_STATE_CHANGE_FAILURE) {
-            GST_DEBUG("[Seek] Cannot seek, failed to pause playback pipeline.");
-            webKitMediaSrcSetReadyForSamples(WEBKIT_MEDIA_SRC(m_source.get()), true);
-            m_seeking = false;
-            return false;
-        }
-        m_readyState = MediaPlayer::HaveMetadata;
-        notifySeekNeedsDataForTime(seekTime);
-        ASSERT(!m_mseSeekCompleted);
-        return true;
-    }
-
-    // Complete previous MSE seek if needed.
-    if (!m_mseSeekCompleted) {
-        m_mediaSource->monitorSourceBuffers();
-        ASSERT(m_mseSeekCompleted);
-        // Note: seekCompleted will recursively call us.
-        return m_seeking;
-    }
-
-    GST_DEBUG("We can seek now");
-
-    MediaTime startTime = seekTime, endTime = MediaTime::invalidTime();
-
-    if (rate < 0) {
-        startTime = MediaTime::zeroTime();
-        endTime = seekTime;
-    }
-
-    if (!rate)
-        rate = 1;
-
-    GST_DEBUG("Actual seek to %s, end time:  %s, rate: %f", toString(startTime).utf8().data(), toString(endTime).utf8().data(), rate);
-
-    // This will call notifySeekNeedsData() after some time to tell that the pipeline is ready for sample enqueuing.
-    webKitMediaSrcPrepareSeek(WEBKIT_MEDIA_SRC(m_source.get()), seekTime);
-
-    m_gstSeekCompleted = false;
-    if (!gst_element_seek(m_pipeline.get(), rate, GST_FORMAT_TIME, seekType, GST_SEEK_TYPE_SET, toGstClockTime(startTime), GST_SEEK_TYPE_SET, toGstClockTime(endTime))) {
-        webKitMediaSrcSetReadyForSamples(WEBKIT_MEDIA_SRC(m_source.get()), true);
-        m_seeking = false;
-        m_gstSeekCompleted = true;
-        GST_DEBUG("doSeek(): gst_element_seek() failed, returning false");
-        return false;
-    }
-
-    // The samples will be enqueued in notifySeekNeedsData().
-    GST_DEBUG("doSeek(): gst_element_seek() succeeded, returning true");
-    return true;
-}
-
-void MediaPlayerPrivateGStreamerMSE::maybeFinishSeek()
-{
-    if (!m_seeking || !m_mseSeekCompleted || !m_gstSeekCompleted)
-        return;
-
-    GstState state, newState;
-    GstStateChangeReturn getStateResult = gst_element_get_state(m_pipeline.get(), &state, &newState, 0);
-
-    if (getStateResult == GST_STATE_CHANGE_ASYNC
-        && !(state == GST_STATE_PLAYING && newState == GST_STATE_PAUSED)) {
-        GST_DEBUG("[Seek] Delaying seek finish");
-        return;
-    }
-
-    if (m_seekIsPending) {
-        GST_DEBUG("[Seek] Committing pending seek to %s", toString(m_seekTime).utf8().data());
-        m_seekIsPending = false;
-        if (!doSeek()) {
-            GST_WARNING("[Seek] Seeking to %s failed", toString(m_seekTime).utf8().data());
-            m_cachedPosition = MediaTime::invalidTime();
-        }
-        return;
-    }
-
-    GST_DEBUG("[Seek] Seeked to %s", toString(m_seekTime).utf8().data());
-
-    webKitMediaSrcSetReadyForSamples(WEBKIT_MEDIA_SRC(m_source.get()), true);
-    m_seeking = false;
-    m_cachedPosition = MediaTime::invalidTime();
-    // The pipeline can still have a pending state. In this case a position query will fail.
-    // Right now we can use m_seekTime as a fallback.
+    invalidateCachedPosition();
     m_canFallBackToLastFinishedSeekPosition = true;
-    timeChanged();
-}
 
-void MediaPlayerPrivateGStreamerMSE::updatePlaybackRate()
-{
-    notImplemented();
+    // Notify MediaSource and have new frames enqueued (when they're available).
+    m_mediaSource->seekToTime(time);
 }
 
-bool MediaPlayerPrivateGStreamerMSE::seeking() const
+void MediaPlayerPrivateGStreamerMSE::reportSeekCompleted()
 {
-    return m_seeking;
+    m_seeking = false;
+    m_player->timeChanged();
 }
 
-// FIXME: MediaPlayerPrivateGStreamer manages the ReadyState on its own. We shouldn't change it manually.
 void MediaPlayerPrivateGStreamerMSE::setReadyState(MediaPlayer::ReadyState readyState)
 {
     if (readyState == m_readyState)
         return;
 
-    if (seeking()) {
-        GST_DEBUG("Skip ready state change(%s -> %s) due to seek\n", dumpReadyState(m_readyState), dumpReadyState(readyState));
-        return;
-    }
-
-    GST_DEBUG("Ready State Changed manually from %u to %u", m_readyState, readyState);
-    MediaPlayer::ReadyState oldReadyState = m_readyState;
+    GST_DEBUG("MediaPlayerPrivateGStreamerMSE::setReadyState(%p): %s -> %s", this, dumpReadyState(m_readyState), dumpReadyState(readyState));
     m_readyState = readyState;
-    GST_DEBUG("m_readyState: %s -> %s", dumpReadyState(oldReadyState), dumpReadyState(m_readyState));
+    updateStates();
 
-    if (oldReadyState < MediaPlayer::HaveCurrentData && m_readyState >= MediaPlayer::HaveCurrentData) {
-        GST_DEBUG("[Seek] Reporting load state changed to trigger seek continuation");
-        loadStateChanged();
-    }
+    // Both readyStateChanged() and timeChanged() check for the "seeked" condition, which requires all three of the following:
+    //   1. HTMLMediaElement.m_seekRequested == true.
+    //   2. Our seeking() method returning false (that is, the seek has completed).
+    //   3. readyState > HaveMetadata.
+    //
+    // We would normally set m_seeking = false in seekCompleted(), but unfortunately by that time playback has
+    // already started, which means the "playing" event would be emitted before "seeked". In order to avoid that
+    // wrong order, we do it here already.
+    if (m_seeking && readyState > MediaPlayer::ReadyState::HaveMetadata)
+        m_seeking = false;
     m_player->readyStateChanged();
 
-    GstState pipelineState;
-    GstStateChangeReturn getStateResult = gst_element_get_state(m_pipeline.get(), &pipelineState, nullptr, 250 * GST_NSECOND);
-    bool isPlaying = (getStateResult == GST_STATE_CHANGE_SUCCESS && pipelineState == GST_STATE_PLAYING);
-
-    if (m_readyState == MediaPlayer::HaveMetadata && oldReadyState > MediaPlayer::HaveMetadata && isPlaying) {
-        GST_TRACE("Changing pipeline to PAUSED...");
-        bool ok = changePipelineState(GST_STATE_PAUSED);
-        GST_TRACE("Changed pipeline to PAUSED: %s", ok ? "Success" : "Error");
-    }
-}
-
-void MediaPlayerPrivateGStreamerMSE::waitForSeekCompleted()
-{
-    if (!m_seeking)
-        return;
-
-    GST_DEBUG("Waiting for MSE seek completed");
-    m_mseSeekCompleted = false;
-}
-
-void MediaPlayerPrivateGStreamerMSE::seekCompleted()
-{
-    if (m_mseSeekCompleted)
-        return;
-
-    GST_DEBUG("MSE seek completed");
-    m_mseSeekCompleted = true;
-
-    doSeek();
-
-    if (!seeking() && m_readyState >= MediaPlayer::HaveFutureData)
-        changePipelineState(GST_STATE_PLAYING);
-
-    if (!seeking())
-        m_player->timeChanged();
+    // The readyState change may be a result of monitorSourceBuffers() finding that currentTime == duration, which
+    // should cause the video to be marked as ended. Let's have the player check that.
+    m_player->timeChanged();
 }
 
 void MediaPlayerPrivateGStreamerMSE::setRate(float)
@@ -465,168 +201,33 @@ std::unique_ptr<PlatformTimeRanges> MediaPlayerPrivateGStreamerMSE::buffered() c
 void MediaPlayerPrivateGStreamerMSE::sourceSetup(GstElement* sourceElement)
 {
     m_source = sourceElement;
-
     ASSERT(WEBKIT_IS_MEDIA_SRC(m_source.get()));
-
-    m_playbackPipeline->setWebKitMediaSrc(WEBKIT_MEDIA_SRC(m_source.get()));
-
     MediaSourceGStreamer::open(*m_mediaSource.get(), *this);
-    g_signal_connect_swapped(m_source.get(), "video-changed", G_CALLBACK(videoChangedCallback), this);
-    g_signal_connect_swapped(m_source.get(), "audio-changed", G_CALLBACK(audioChangedCallback), this);
-    g_signal_connect_swapped(m_source.get(), "text-changed", G_CALLBACK(textChangedCallback), this);
-    webKitMediaSrcSetMediaPlayerPrivate(WEBKIT_MEDIA_SRC(m_source.get()), this);
 }
 
 void MediaPlayerPrivateGStreamerMSE::updateStates()
 {
-    if (UNLIKELY(!m_pipeline || m_errorOccured))
-        return;
-
-    MediaPlayer::NetworkState oldNetworkState = m_networkState;
-    MediaPlayer::ReadyState oldReadyState = m_readyState;
-    GstState state, pending;
-
-    GstStateChangeReturn getStateResult = gst_element_get_state(m_pipeline.get(), &state, &pending, 250 * GST_NSECOND);
-
-    bool shouldUpdatePlaybackState = false;
-    switch (getStateResult) {
-    case GST_STATE_CHANGE_SUCCESS: {
-        GST_DEBUG("State: %s, pending: %s", gst_element_state_get_name(state), gst_element_state_get_name(pending));
-
-        // Do nothing if on EOS and state changed to READY to avoid recreating the player
-        // on HTMLMediaElement and properly generate the video 'ended' event.
-        if (m_isEndReached && state == GST_STATE_READY)
-            break;
-
-        m_resetPipeline = (state <= GST_STATE_READY);
-        if (m_resetPipeline)
-            m_mediaTimeDuration = MediaTime::zeroTime();
-
-        // Update ready and network states.
-        switch (state) {
-        case GST_STATE_NULL:
-            m_readyState = MediaPlayer::HaveNothing;
-            GST_DEBUG("m_readyState=%s", dumpReadyState(m_readyState));
-            m_networkState = MediaPlayer::Empty;
-            break;
-        case GST_STATE_READY:
-            m_readyState = MediaPlayer::HaveMetadata;
-            GST_DEBUG("m_readyState=%s", dumpReadyState(m_readyState));
-            m_networkState = MediaPlayer::Empty;
-            break;
-        case GST_STATE_PAUSED:
-        case GST_STATE_PLAYING:
-            if (seeking()) {
-                m_readyState = MediaPlayer::HaveMetadata;
-                // FIXME: Should we manage NetworkState too?
-                GST_DEBUG("m_readyState=%s", dumpReadyState(m_readyState));
-            } else {
-                if (m_readyState < MediaPlayer::HaveFutureData)
-                    m_readyState = MediaPlayer::HaveFutureData;
-                GST_DEBUG("m_readyState=%s", dumpReadyState(m_readyState));
-                m_networkState = MediaPlayer::Loading;
-            }
-
-            if (m_eosMarked && state == GST_STATE_PLAYING)
-                m_eosPending = true;
-
-            break;
-        default:
-            ASSERT_NOT_REACHED();
-            break;
-        }
-
-        // Sync states where needed.
-        if (state == GST_STATE_PAUSED) {
-            if (!m_volumeAndMuteInitialized) {
-                notifyPlayerOfVolumeChange();
-                notifyPlayerOfMute();
-                m_volumeAndMuteInitialized = true;
-            }
-
-            if (!seeking() && !m_paused && m_playbackRate) {
-                GST_DEBUG("[Buffering] Restarting playback.");
-                changePipelineState(GST_STATE_PLAYING);
-            }
-        } else if (state == GST_STATE_PLAYING) {
-            m_paused = false;
-
-            if (!m_playbackRate) {
-                GST_DEBUG("[Buffering] Pausing stream for buffering.");
-                changePipelineState(GST_STATE_PAUSED);
-            }
-        } else
-            m_paused = true;
-
-        if (m_requestedState == GST_STATE_PAUSED && state == GST_STATE_PAUSED) {
-            shouldUpdatePlaybackState = true;
-            GST_DEBUG("Requested state change to %s was completed", gst_element_state_get_name(state));
-        }
-
-        break;
-    }
-    case GST_STATE_CHANGE_ASYNC:
-        GST_DEBUG("Async: State: %s, pending: %s", gst_element_state_get_name(state), gst_element_state_get_name(pending));
-        // Change in progress.
-        break;
-    case GST_STATE_CHANGE_FAILURE:
-        GST_WARNING("Failure: State: %s, pending: %s", gst_element_state_get_name(state), gst_element_state_get_name(pending));
-        // Change failed.
-        return;
-    case GST_STATE_CHANGE_NO_PREROLL:
-        GST_DEBUG("No preroll: State: %s, pending: %s", gst_element_state_get_name(state), gst_element_state_get_name(pending));
-
-        // Live pipelines go in PAUSED without prerolling.
-        m_isStreaming = true;
-
-        if (state == GST_STATE_READY) {
-            m_readyState = MediaPlayer::HaveNothing;
-            GST_DEBUG("m_readyState=%s", dumpReadyState(m_readyState));
-        } else if (state == GST_STATE_PAUSED) {
-            m_readyState = MediaPlayer::HaveEnoughData;
-            GST_DEBUG("m_readyState=%s", dumpReadyState(m_readyState));
-            m_paused = true;
-        } else if (state == GST_STATE_PLAYING)
-            m_paused = false;
-
-        if (!m_paused && m_playbackRate)
-            changePipelineState(GST_STATE_PLAYING);
-
-        m_networkState = MediaPlayer::Loading;
-        break;
-    default:
-        GST_DEBUG("Else : %d", getStateResult);
-        break;
-    }
-
-    m_requestedState = GST_STATE_VOID_PENDING;
-
-    if (shouldUpdatePlaybackState)
-        m_player->playbackStateChanged();
-
-    if (m_networkState != oldNetworkState) {
-        GST_DEBUG("Network State Changed from %u to %u", oldNetworkState, m_networkState);
-        m_player->networkStateChanged();
-    }
-    if (m_readyState != oldReadyState) {
-        GST_DEBUG("Ready State Changed from %u to %u", oldReadyState, m_readyState);
-        m_player->readyStateChanged();
-    }
-
-    if (getStateResult == GST_STATE_CHANGE_SUCCESS && state >= GST_STATE_PAUSED) {
-        updatePlaybackRate();
-        maybeFinishSeek();
+    bool shouldBePlaying = !m_paused && readyState() >= MediaPlayer::ReadyState::HaveFutureData;
+    GST_DEBUG_OBJECT(pipeline(), "shouldBePlaying = %d, m_isPipelinePlaying = %d", static_cast<int>(shouldBePlaying), static_cast<int>(m_isPipelinePlaying));
+    if (shouldBePlaying && !m_isPipelinePlaying) {
+        if (!changePipelineState(GST_STATE_PLAYING))
+            GST_ERROR_OBJECT(pipeline(), "Setting the pipeline to PLAYING failed");
+        m_isPipelinePlaying = true;
+    } else if (!shouldBePlaying && m_isPipelinePlaying) {
+        if (!changePipelineState(GST_STATE_PAUSED))
+            GST_ERROR_OBJECT(pipeline(), "Setting the pipeline to PAUSED failed");
+        m_isPipelinePlaying = false;
     }
 }
-void MediaPlayerPrivateGStreamerMSE::asyncStateChangeDone()
-{
-    if (UNLIKELY(!m_pipeline || m_errorOccured))
-        return;
 
-    if (m_seeking)
-        maybeFinishSeek();
-    else
-        updateStates();
+void MediaPlayerPrivateGStreamerMSE::didEnd()
+{
+    GST_DEBUG_OBJECT(pipeline(), "EOS received, currentTime=%s duration=%s", currentMediaTime().toString().utf8().data(), durationMediaTime().toString().utf8().data());
+    m_isEndReached = true;
+    invalidateCachedPosition();
+    // HTMLMediaElement will emit ended if currentTime >= duration (which should now be the case).
+    ASSERT(currentMediaTime() == durationMediaTime());
+    m_player->timeChanged();
 }
 
 bool MediaPlayerPrivateGStreamerMSE::isTimeBuffered(const MediaTime &time) const
@@ -641,11 +242,6 @@ void MediaPlayerPrivateGStreamerMSE::setMediaSourceClient(Ref<MediaSourceClientG
     m_mediaSourceClient = client.ptr();
 }
 
-RefPtr<MediaSourceClientGStreamerMSE> MediaPlayerPrivateGStreamerMSE::mediaSourceClient()
-{
-    return m_mediaSourceClient;
-}
-
 void MediaPlayerPrivateGStreamerMSE::blockDurationChanges()
 {
     ASSERT(isMainThread());
@@ -658,7 +254,6 @@ void MediaPlayerPrivateGStreamerMSE::unblockDurationChanges()
     ASSERT(isMainThread());
     if (m_shouldReportDurationWhenUnblocking) {
         m_player->durationChanged();
-        m_playbackPipeline->notifyDurationChanged();
         m_shouldReportDurationWhenUnblocking = false;
     }
 
@@ -681,10 +276,9 @@ void MediaPlayerPrivateGStreamerMSE::durationChanged()
     // Avoid emitting durationchanged in the case where the previous duration was 0 because that case is already handled
     // by the HTMLMediaElement.
     if (m_mediaTimeDuration != previousDuration && m_mediaTimeDuration.isValid() && previousDuration.isValid()) {
-        if (!m_areDurationChangesBlocked) {
+        if (!m_areDurationChangesBlocked)
             m_player->durationChanged();
-            m_playbackPipeline->notifyDurationChanged();
-        } else
+        else
             m_shouldReportDurationWhenUnblocking = true;
         m_mediaSource->durationChanged(m_mediaTimeDuration);
     }
@@ -694,20 +288,18 @@ void MediaPlayerPrivateGStreamerMSE::trackDetected(RefPtr<AppendPipeline> append
 {
     ASSERT(appendPipeline->track() == newTrack);
 
-    GstCaps* caps = appendPipeline->appsinkCaps();
+    GRefPtr<GstCaps> caps = appendPipeline->appsinkCaps();
     ASSERT(caps);
-    GST_DEBUG("track ID: %s, caps: %" GST_PTR_FORMAT, newTrack->id().string().latin1().data(), caps);
+    GST_DEBUG("track ID: %s, caps: %" GST_PTR_FORMAT, newTrack->id().string().latin1().data(), caps.get());
 
-    if (doCapsHaveType(caps, GST_VIDEO_CAPS_TYPE_PREFIX)) {
-        Optional<FloatSize> size = getVideoResolutionFromCaps(caps);
+    if (doCapsHaveType(caps.get(), GST_VIDEO_CAPS_TYPE_PREFIX)) {
+        Optional<FloatSize> size = getVideoResolutionFromCaps(caps.get());
         if (size.hasValue())
             m_videoSize = size.value();
     }
 
     if (firstTrackDetected)
-        m_playbackPipeline->attachTrack(appendPipeline->sourceBufferPrivate(), newTrack, caps);
-    else
-        m_playbackPipeline->reattachTrack(appendPipeline->sourceBufferPrivate(), newTrack, caps);
+        webKitMediaSrcAddStream(WEBKIT_MEDIA_SRC(m_source.get()), newTrack->id(), appendPipeline->streamType(), WTFMove(caps));
 }
 
 void MediaPlayerPrivateGStreamerMSE::getSupportedTypes(HashSet<String, ASCIICaseInsensitiveHash>& types)
@@ -744,34 +336,6 @@ MediaPlayer::SupportsType MediaPlayerPrivateGStreamerMSE::supportsType(const Med
     return finalResult;
 }
 
-void MediaPlayerPrivateGStreamerMSE::markEndOfStream(MediaSourcePrivate::EndOfStreamStatus status)
-{
-    if (status != MediaSourcePrivate::EosNoError)
-        return;
-
-    GST_DEBUG("Marking end of stream");
-    m_eosMarked = true;
-    updateStates();
-}
-
-MediaTime MediaPlayerPrivateGStreamerMSE::currentMediaTime() const
-{
-    MediaTime position = MediaPlayerPrivateGStreamer::currentMediaTime();
-
-    if (m_eosPending && position >= durationMediaTime()) {
-        if (m_networkState != MediaPlayer::Loaded) {
-            m_networkState = MediaPlayer::Loaded;
-            m_player->networkStateChanged();
-        }
-
-        m_eosPending = false;
-        m_isEndReached = true;
-        m_cachedPosition = m_mediaTimeDuration;
-        m_player->timeChanged();
-    }
-    return position;
-}
-
 MediaTime MediaPlayerPrivateGStreamerMSE::maxMediaTimeSeekable() const
 {
     if (UNLIKELY(m_errorOccured))
index 3bcc0fd..fe6179f 100644
@@ -55,13 +55,12 @@ public:
     void updateDownloadBufferingFlag() override { };
 
     bool isLiveStream() const override { return false; }
-    MediaTime currentMediaTime() const override;
 
+    void play() override;
     void pause() override;
-    bool seeking() const override;
     void seek(const MediaTime&) override;
-    void configurePlaySink() override;
-    bool changePipelineState(GstState) override;
+    void reportSeekCompleted();
+    void updatePipelineState(GstState);
 
     void durationChanged() override;
     MediaTime durationMediaTime() const override;
@@ -73,50 +72,38 @@ public:
     void sourceSetup(GstElement*) override;
 
     void setReadyState(MediaPlayer::ReadyState);
-    void waitForSeekCompleted();
-    void seekCompleted();
     MediaSourcePrivateClient* mediaSourcePrivateClient() { return m_mediaSource.get(); }
 
-    void markEndOfStream(MediaSourcePrivate::EndOfStreamStatus);
-
     void trackDetected(RefPtr<AppendPipeline>, RefPtr<WebCore::TrackPrivateBase>, bool firstTrackDetected);
-    void notifySeekNeedsDataForTime(const MediaTime&);
 
     void blockDurationChanges();
     void unblockDurationChanges();
 
+    void asyncStateChangeDone() override { }
+
+protected:
+    void didEnd() override;
+
 private:
     static void getSupportedTypes(HashSet<String, ASCIICaseInsensitiveHash>&);
     static MediaPlayer::SupportsType supportsType(const MediaEngineSupportParameters&);
 
-    // FIXME: Reduce code duplication.
     void updateStates() override;
 
-    bool doSeek(const MediaTime&, float, GstSeekFlags) override;
-    bool doSeek();
-    void maybeFinishSeek();
-    void updatePlaybackRate() override;
-    void asyncStateChangeDone() override;
-
     // FIXME: Implement videoPlaybackQualityMetrics.
     bool isTimeBuffered(const MediaTime&) const;
 
     bool isMediaSource() const override { return true; }
 
     void setMediaSourceClient(Ref<MediaSourceClientGStreamerMSE>);
-    RefPtr<MediaSourceClientGStreamerMSE> mediaSourceClient();
 
     HashMap<RefPtr<SourceBufferPrivateGStreamer>, RefPtr<AppendPipeline>> m_appendPipelinesMap;
-    bool m_eosMarked = false;
-    mutable bool m_eosPending = false;
-    bool m_gstSeekCompleted = true;
     RefPtr<MediaSourcePrivateClient> m_mediaSource;
     RefPtr<MediaSourceClientGStreamerMSE> m_mediaSourceClient;
     MediaTime m_mediaTimeDuration;
-    bool m_mseSeekCompleted = true;
     bool m_areDurationChangesBlocked = false;
     bool m_shouldReportDurationWhenUnblocking = false;
-    RefPtr<PlaybackPipeline> m_playbackPipeline;
+    bool m_isPipelinePlaying = true;
 };
 
 } // namespace WebCore
index fae74f4..e8d9d01 100644
@@ -23,7 +23,6 @@
 
 #include "AppendPipeline.h"
 #include "MediaPlayerPrivateGStreamerMSE.h"
-#include "PlaybackPipeline.h"
 #include "WebKitMediaSourceGStreamer.h"
 #include <gst/gst.h>
 
@@ -60,14 +59,13 @@ MediaSourcePrivate::AddStatus MediaSourceClientGStreamerMSE::addSourceBuffer(Ref
 {
     ASSERT(WTF::isMainThread());
 
-    ASSERT(m_playerPrivate.m_playbackPipeline);
     ASSERT(sourceBufferPrivate);
 
     RefPtr<AppendPipeline> appendPipeline = adoptRef(new AppendPipeline(*this, *sourceBufferPrivate, m_playerPrivate));
     GST_TRACE("Adding SourceBuffer to AppendPipeline: this=%p sourceBuffer=%p appendPipeline=%p", this, sourceBufferPrivate.get(), appendPipeline.get());
     m_playerPrivate.m_appendPipelinesMap.add(sourceBufferPrivate, appendPipeline);
 
-    return m_playerPrivate.m_playbackPipeline->addSourceBuffer(sourceBufferPrivate);
+    return MediaSourcePrivate::Ok;
 }
 
 const MediaTime& MediaSourceClientGStreamerMSE::duration()
@@ -137,25 +135,17 @@ void MediaSourceClientGStreamerMSE::append(RefPtr<SourceBufferPrivateGStreamer>
     appendPipeline->pushNewBuffer(WTFMove(buffer));
 }
 
-void MediaSourceClientGStreamerMSE::markEndOfStream(MediaSourcePrivate::EndOfStreamStatus status)
-{
-    ASSERT(WTF::isMainThread());
-
-    m_playerPrivate.markEndOfStream(status);
-}
-
 void MediaSourceClientGStreamerMSE::removedFromMediaSource(RefPtr<SourceBufferPrivateGStreamer> sourceBufferPrivate)
 {
     ASSERT(WTF::isMainThread());
 
-    ASSERT(m_playerPrivate.m_playbackPipeline);
-
     // Remove the AppendPipeline from the map. This should cause its destruction since there should be no alive
     // references at this point.
     ASSERT(m_playerPrivate.m_appendPipelinesMap.get(sourceBufferPrivate)->hasOneRef());
     m_playerPrivate.m_appendPipelinesMap.remove(sourceBufferPrivate);
 
-    m_playerPrivate.m_playbackPipeline->removeSourceBuffer(sourceBufferPrivate);
+    if (!sourceBufferPrivate->trackId().isNull())
+        webKitMediaSrcRemoveStream(WEBKIT_MEDIA_SRC(m_playerPrivate.m_source.get()), sourceBufferPrivate->trackId());
 }
 
 void MediaSourceClientGStreamerMSE::flush(AtomString trackId)
@@ -164,21 +154,41 @@ void MediaSourceClientGStreamerMSE::flush(AtomString trackId)
 
     // This is only for on-the-fly reenqueues after appends. When seeking, the seek will do its own flush.
     if (!m_playerPrivate.m_seeking)
-        m_playerPrivate.m_playbackPipeline->flush(trackId);
+        webKitMediaSrcFlush(WEBKIT_MEDIA_SRC(m_playerPrivate.m_source.get()), trackId);
 }
 
-void MediaSourceClientGStreamerMSE::enqueueSample(Ref<MediaSample>&& sample)
+void MediaSourceClientGStreamerMSE::enqueueSample(Ref<MediaSample>&& sample, AtomString trackId)
 {
     ASSERT(WTF::isMainThread());
 
-    m_playerPrivate.m_playbackPipeline->enqueueSample(WTFMove(sample));
+    GST_TRACE("enqueuing sample trackId=%s PTS=%f presentationSize=%.0fx%.0f at %" GST_TIME_FORMAT " duration: %" GST_TIME_FORMAT,
+        trackId.string().utf8().data(), sample->presentationTime().toFloat(),
+        sample->presentationSize().width(), sample->presentationSize().height(),
+        GST_TIME_ARGS(WebCore::toGstClockTime(sample->presentationTime())),
+        GST_TIME_ARGS(WebCore::toGstClockTime(sample->duration())));
+
+    GRefPtr<GstSample> gstSample = sample->platformSample().sample.gstSample;
+    ASSERT(gstSample);
+    ASSERT(gst_sample_get_buffer(gstSample.get()));
+
+    webKitMediaSrcEnqueueSample(WEBKIT_MEDIA_SRC(m_playerPrivate.m_source.get()), trackId, WTFMove(gstSample));
+}
+
+bool MediaSourceClientGStreamerMSE::isReadyForMoreSamples(const AtomString& trackId)
+{
+    return webKitMediaSrcIsReadyForMoreSamples(WEBKIT_MEDIA_SRC(m_playerPrivate.m_source.get()), trackId);
+}
+
+void MediaSourceClientGStreamerMSE::notifyClientWhenReadyForMoreSamples(const AtomString& trackId, SourceBufferPrivateClient* sourceBuffer)
+{
+    webKitMediaSrcNotifyWhenReadyForMoreSamples(WEBKIT_MEDIA_SRC(m_playerPrivate.m_source.get()), trackId, sourceBuffer);
 }
 
 void MediaSourceClientGStreamerMSE::allSamplesInTrackEnqueued(const AtomString& trackId)
 {
     ASSERT(WTF::isMainThread());
 
-    m_playerPrivate.m_playbackPipeline->allSamplesInTrackEnqueued(trackId);
+    webKitMediaSrcEndOfStream(WEBKIT_MEDIA_SRC(m_playerPrivate.m_source.get()), trackId);
 }
 
 GRefPtr<WebKitMediaSrc> MediaSourceClientGStreamerMSE::webKitMediaSrc()
index 5008a06..8e52f94 100644
@@ -43,7 +43,6 @@ public:
     // From MediaSourceGStreamer.
     MediaSourcePrivate::AddStatus addSourceBuffer(RefPtr<SourceBufferPrivateGStreamer>, const ContentType&);
     void durationChanged(const MediaTime&);
-    void markEndOfStream(MediaSourcePrivate::EndOfStreamStatus);
 
     // From SourceBufferPrivateGStreamer.
     void abort(RefPtr<SourceBufferPrivateGStreamer>);
@@ -51,9 +50,12 @@ public:
     void append(RefPtr<SourceBufferPrivateGStreamer>, Vector<unsigned char>&&);
     void removedFromMediaSource(RefPtr<SourceBufferPrivateGStreamer>);
     void flush(AtomString);
-    void enqueueSample(Ref<MediaSample>&&);
+    void enqueueSample(Ref<MediaSample>&&, AtomString trackId);
     void allSamplesInTrackEnqueued(const AtomString&);
 
+    bool isReadyForMoreSamples(const AtomString&);
+    void notifyClientWhenReadyForMoreSamples(const AtomString&, SourceBufferPrivateClient*);
+
     const MediaTime& duration();
     GRefPtr<WebKitMediaSrc> webKitMediaSrc();
 
index e62cd56..51adf82 100644
@@ -92,14 +92,18 @@ void MediaSourceGStreamer::durationChanged()
     m_client->durationChanged(m_mediaSource->duration());
 }
 
-void MediaSourceGStreamer::markEndOfStream(EndOfStreamStatus status)
+void MediaSourceGStreamer::markEndOfStream(EndOfStreamStatus)
 {
-    m_client->markEndOfStream(status);
+    // We don't need to do anything in either the AppendPipeline or the playback pipeline. Instead, SourceBuffer knows
+    // best when .endOfStream() has been called and there are no more samples to enqueue, which it signals with a call
+    // to SourceBufferPrivateGStreamer::allSamplesInTrackEnqueued(), where we enqueue an EOS event into WebKitMediaSrc.
+
+    // At this point it would be dangerously early to do that! There may still be samples on their way to WebKitMediaSrc
+    // (e.g. because the high water level has been hit) that would not be shown if we enqueued an EOS now.
 }
 
 void MediaSourceGStreamer::unmarkEndOfStream()
 {
-    notImplemented();
 }
 
 MediaPlayer::ReadyState MediaSourceGStreamer::readyState() const
@@ -114,12 +118,11 @@ void MediaSourceGStreamer::setReadyState(MediaPlayer::ReadyState state)
 
 void MediaSourceGStreamer::waitForSeekCompleted()
 {
-    m_playerPrivate.waitForSeekCompleted();
 }
 
 void MediaSourceGStreamer::seekCompleted()
 {
-    m_playerPrivate.seekCompleted();
+    m_playerPrivate.reportSeekCompleted();
 }
 
 void MediaSourceGStreamer::sourceBufferPrivateDidChangeActiveState(SourceBufferPrivateGStreamer* sourceBufferPrivate, bool isActive)
index c9a09fa..0a44746 100644
@@ -40,8 +40,6 @@
 #include <wtf/Forward.h>
 #include <wtf/HashSet.h>
 
-typedef struct _WebKitMediaSrc WebKitMediaSrc;
-
 namespace WebCore {
 
 class SourceBufferPrivateGStreamer;
diff --git a/Source/WebCore/platform/graphics/gstreamer/mse/PlaybackPipeline.cpp b/Source/WebCore/platform/graphics/gstreamer/mse/PlaybackPipeline.cpp
deleted file mode 100644
index b005d0d..0000000
+++ /dev/null
@@ -1,400 +0,0 @@
-/*
- * Copyright (C) 2014, 2015 Sebastian Dröge <sebastian@centricular.com>
- * Copyright (C) 2016 Metrological Group B.V.
- * Copyright (C) 2016 Igalia S.L
- *
- * This library is free software; you can redistribute it and/or
- * modify it under the terms of the GNU Library General Public
- * License as published by the Free Software Foundation; either
- * version 2 of the License, or (at your option) any later version.
- *
- * This library is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
- * Library General Public License for more details.
- *
- * You should have received a copy of the GNU Library General Public License
- * aint with this library; see the file COPYING.LIB.  If not, write to
- * the Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
- * Boston, MA 02110-1301, USA.
- */
-
-#include "config.h"
-#include "PlaybackPipeline.h"
-
-#if ENABLE(VIDEO) && USE(GSTREAMER) && ENABLE(MEDIA_SOURCE)
-
-#include "AudioTrackPrivateGStreamer.h"
-#include "GStreamerCommon.h"
-#include "MediaSampleGStreamer.h"
-#include "MediaSample.h"
-#include "SourceBufferPrivateGStreamer.h"
-#include "VideoTrackPrivateGStreamer.h"
-
-#include <gst/app/gstappsrc.h>
-#include <gst/gst.h>
-#include <wtf/MainThread.h>
-#include <wtf/RefCounted.h>
-#include <wtf/glib/GRefPtr.h>
-#include <wtf/glib/GUniquePtr.h>
-#include <wtf/text/AtomString.h>
-
-GST_DEBUG_CATEGORY_EXTERN(webkit_mse_debug);
-#define GST_CAT_DEFAULT webkit_mse_debug
-
-static Stream* getStreamByTrackId(WebKitMediaSrc*, AtomString);
-static Stream* getStreamBySourceBufferPrivate(WebKitMediaSrc*, WebCore::SourceBufferPrivateGStreamer*);
-
-static Stream* getStreamByTrackId(WebKitMediaSrc* source, AtomString trackIdString)
-{
-    // WebKitMediaSrc should be locked at this point.
-    for (Stream* stream : source->priv->streams) {
-        if (stream->type != WebCore::Invalid
-            && ((stream->audioTrack && stream->audioTrack->id() == trackIdString)
-                || (stream->videoTrack && stream->videoTrack->id() == trackIdString) ) ) {
-            return stream;
-        }
-    }
-    return nullptr;
-}
-
-static Stream* getStreamBySourceBufferPrivate(WebKitMediaSrc* source, WebCore::SourceBufferPrivateGStreamer* sourceBufferPrivate)
-{
-    for (Stream* stream : source->priv->streams) {
-        if (stream->sourceBuffer == sourceBufferPrivate)
-            return stream;
-    }
-    return nullptr;
-}
-
-// FIXME: Use gst_app_src_push_sample() instead when we switch to the appropriate GStreamer version.
-static GstFlowReturn pushSample(GstAppSrc* appsrc, GstSample* sample)
-{
-    g_return_val_if_fail(GST_IS_SAMPLE(sample), GST_FLOW_ERROR);
-
-    GstCaps* caps = gst_sample_get_caps(sample);
-    if (caps)
-        gst_app_src_set_caps(appsrc, caps);
-    else
-        GST_WARNING_OBJECT(appsrc, "received sample without caps");
-
-    GstBuffer* buffer = gst_sample_get_buffer(sample);
-    if (UNLIKELY(!buffer)) {
-        GST_WARNING_OBJECT(appsrc, "received sample without buffer");
-        return GST_FLOW_OK;
-    }
-
-    // gst_app_src_push_buffer() steals the reference, we need an additional one.
-    return gst_app_src_push_buffer(appsrc, gst_buffer_ref(buffer));
-}
-
-namespace WebCore {
-
-void PlaybackPipeline::setWebKitMediaSrc(WebKitMediaSrc* webKitMediaSrc)
-{
-    GST_DEBUG("webKitMediaSrc=%p", webKitMediaSrc);
-    m_webKitMediaSrc = webKitMediaSrc;
-}
-
-WebKitMediaSrc* PlaybackPipeline::webKitMediaSrc()
-{
-    return m_webKitMediaSrc.get();
-}
-
-MediaSourcePrivate::AddStatus PlaybackPipeline::addSourceBuffer(RefPtr<SourceBufferPrivateGStreamer> sourceBufferPrivate)
-{
-    WebKitMediaSrcPrivate* priv = m_webKitMediaSrc->priv;
-
-    if (priv->allTracksConfigured) {
-        GST_ERROR_OBJECT(m_webKitMediaSrc.get(), "Adding new source buffers after first data not supported yet");
-        return MediaSourcePrivate::NotSupported;
-    }
-
-    GST_DEBUG_OBJECT(m_webKitMediaSrc.get(), "State %d", int(GST_STATE(m_webKitMediaSrc.get())));
-
-    Stream* stream = new Stream{ };
-    stream->parent = m_webKitMediaSrc.get();
-    stream->appsrc = gst_element_factory_make("appsrc", nullptr);
-    stream->appsrcNeedDataFlag = false;
-    stream->sourceBuffer = sourceBufferPrivate.get();
-
-    // No track has been attached yet.
-    stream->type = Invalid;
-    stream->caps = nullptr;
-    stream->audioTrack = nullptr;
-    stream->videoTrack = nullptr;
-    stream->presentationSize = WebCore::FloatSize();
-    stream->lastEnqueuedTime = MediaTime::invalidTime();
-
-    gst_app_src_set_callbacks(GST_APP_SRC(stream->appsrc), &enabledAppsrcCallbacks, stream->parent, nullptr);
-    gst_app_src_set_emit_signals(GST_APP_SRC(stream->appsrc), FALSE);
-    gst_app_src_set_stream_type(GST_APP_SRC(stream->appsrc), GST_APP_STREAM_TYPE_SEEKABLE);
-
-    gst_app_src_set_max_bytes(GST_APP_SRC(stream->appsrc), 2 * WTF::MB);
-    g_object_set(G_OBJECT(stream->appsrc), "block", FALSE, "min-percent", 20, "format", GST_FORMAT_TIME, nullptr);
-
-    GST_OBJECT_LOCK(m_webKitMediaSrc.get());
-    priv->streams.append(stream);
-    GST_OBJECT_UNLOCK(m_webKitMediaSrc.get());
-
-    gst_bin_add(GST_BIN(m_webKitMediaSrc.get()), stream->appsrc);
-    gst_element_sync_state_with_parent(stream->appsrc);
-
-    return MediaSourcePrivate::Ok;
-}
-
-void PlaybackPipeline::removeSourceBuffer(RefPtr<SourceBufferPrivateGStreamer> sourceBufferPrivate)
-{
-    ASSERT(WTF::isMainThread());
-
-    GST_DEBUG_OBJECT(m_webKitMediaSrc.get(), "Element removed from MediaSource");
-    GST_OBJECT_LOCK(m_webKitMediaSrc.get());
-    WebKitMediaSrcPrivate* priv = m_webKitMediaSrc->priv;
-    Stream* stream = getStreamBySourceBufferPrivate(m_webKitMediaSrc.get(), sourceBufferPrivate.get());
-    if (stream)
-        priv->streams.removeFirst(stream);
-    GST_OBJECT_UNLOCK(m_webKitMediaSrc.get());
-
-    if (stream)
-        webKitMediaSrcFreeStream(m_webKitMediaSrc.get(), stream);
-}
-
-void PlaybackPipeline::attachTrack(RefPtr<SourceBufferPrivateGStreamer> sourceBufferPrivate, RefPtr<TrackPrivateBase> trackPrivate, GstCaps* caps)
-{
-    WebKitMediaSrc* webKitMediaSrc = m_webKitMediaSrc.get();
-
-    GST_OBJECT_LOCK(webKitMediaSrc);
-    Stream* stream = getStreamBySourceBufferPrivate(webKitMediaSrc, sourceBufferPrivate.get());
-    GST_OBJECT_UNLOCK(webKitMediaSrc);
-
-    ASSERT(stream);
-
-    GST_OBJECT_LOCK(webKitMediaSrc);
-    unsigned padId = stream->parent->priv->numberOfPads;
-    stream->parent->priv->numberOfPads++;
-    GST_OBJECT_UNLOCK(webKitMediaSrc);
-
-    const char* mediaType = capsMediaType(caps);
-    GST_DEBUG_OBJECT(webKitMediaSrc, "Configured track %s: appsrc=%s, padId=%u, mediaType=%s", trackPrivate->id().string().utf8().data(), GST_ELEMENT_NAME(stream->appsrc), padId, mediaType);
-
-    GST_OBJECT_LOCK(webKitMediaSrc);
-    stream->type = Unknown;
-    GST_OBJECT_UNLOCK(webKitMediaSrc);
-
-    GRefPtr<GstPad> sourcePad = adoptGRef(gst_element_get_static_pad(stream->appsrc, "src"));
-    ASSERT(sourcePad);
-
-    // FIXME: Is padId the best way to identify the Stream? What about trackId?
-    g_object_set_data(G_OBJECT(sourcePad.get()), "padId", GINT_TO_POINTER(padId));
-    webKitMediaSrcLinkSourcePad(sourcePad.get(), caps, stream);
-
-    ASSERT(stream->parent->priv->mediaPlayerPrivate);
-    int signal = -1;
-
-    GST_OBJECT_LOCK(webKitMediaSrc);
-    if (doCapsHaveType(caps, GST_AUDIO_CAPS_TYPE_PREFIX)) {
-        stream->type = Audio;
-        stream->parent->priv->numberOfAudioStreams++;
-        signal = SIGNAL_AUDIO_CHANGED;
-        stream->audioTrack = RefPtr<WebCore::AudioTrackPrivateGStreamer>(static_cast<WebCore::AudioTrackPrivateGStreamer*>(trackPrivate.get()));
-    } else if (doCapsHaveType(caps, GST_VIDEO_CAPS_TYPE_PREFIX)) {
-        stream->type = Video;
-        stream->parent->priv->numberOfVideoStreams++;
-        signal = SIGNAL_VIDEO_CHANGED;
-        stream->videoTrack = RefPtr<WebCore::VideoTrackPrivateGStreamer>(static_cast<WebCore::VideoTrackPrivateGStreamer*>(trackPrivate.get()));
-    } else if (doCapsHaveType(caps, GST_TEXT_CAPS_TYPE_PREFIX)) {
-        stream->type = Text;
-        stream->parent->priv->numberOfTextStreams++;
-        signal = SIGNAL_TEXT_CHANGED;
-
-        // FIXME: Support text tracks.
-    }
-    GST_OBJECT_UNLOCK(webKitMediaSrc);
-
-    if (signal != -1)
-        g_signal_emit(G_OBJECT(stream->parent), webKitMediaSrcSignals[signal], 0, nullptr);
-}
-
-void PlaybackPipeline::reattachTrack(RefPtr<SourceBufferPrivateGStreamer> sourceBufferPrivate, RefPtr<TrackPrivateBase> trackPrivate, GstCaps* caps)
-{
-    GST_DEBUG("Re-attaching track");
-
-    // FIXME: Maybe remove this method. Now the caps change is managed by gst_appsrc_push_sample() in enqueueSample()
-    // and flushAndEnqueueNonDisplayingSamples().
-
-    WebKitMediaSrc* webKitMediaSrc = m_webKitMediaSrc.get();
-
-    GST_OBJECT_LOCK(webKitMediaSrc);
-    Stream* stream = getStreamBySourceBufferPrivate(webKitMediaSrc, sourceBufferPrivate.get());
-    GST_OBJECT_UNLOCK(webKitMediaSrc);
-
-    ASSERT(stream && stream->type != Invalid);
-
-    int signal = -1;
-
-    GST_OBJECT_LOCK(webKitMediaSrc);
-    if (doCapsHaveType(caps, GST_AUDIO_CAPS_TYPE_PREFIX)) {
-        ASSERT(stream->type == Audio);
-        signal = SIGNAL_AUDIO_CHANGED;
-        stream->audioTrack = RefPtr<WebCore::AudioTrackPrivateGStreamer>(static_cast<WebCore::AudioTrackPrivateGStreamer*>(trackPrivate.get()));
-    } else if (doCapsHaveType(caps, GST_VIDEO_CAPS_TYPE_PREFIX)) {
-        ASSERT(stream->type == Video);
-        signal = SIGNAL_VIDEO_CHANGED;
-        stream->videoTrack = RefPtr<WebCore::VideoTrackPrivateGStreamer>(static_cast<WebCore::VideoTrackPrivateGStreamer*>(trackPrivate.get()));
-    } else if (doCapsHaveType(caps, GST_TEXT_CAPS_TYPE_PREFIX)) {
-        ASSERT(stream->type == Text);
-        signal = SIGNAL_TEXT_CHANGED;
-
-        // FIXME: Support text tracks.
-    }
-    GST_OBJECT_UNLOCK(webKitMediaSrc);
-
-    if (signal != -1)
-        g_signal_emit(G_OBJECT(stream->parent), webKitMediaSrcSignals[signal], 0, nullptr);
-}
-
-void PlaybackPipeline::notifyDurationChanged()
-{
-    gst_element_post_message(GST_ELEMENT(m_webKitMediaSrc.get()), gst_message_new_duration_changed(GST_OBJECT(m_webKitMediaSrc.get())));
-    // WebKitMediaSrc will ask MediaPlayerPrivateGStreamerMSE for the new duration later, when somebody asks for it.
-}
-
-void PlaybackPipeline::markEndOfStream(MediaSourcePrivate::EndOfStreamStatus)
-{
-    WebKitMediaSrcPrivate* priv = m_webKitMediaSrc->priv;
-
-    GST_DEBUG_OBJECT(m_webKitMediaSrc.get(), "Have EOS");
-
-    GST_OBJECT_LOCK(m_webKitMediaSrc.get());
-    bool allTracksConfigured = priv->allTracksConfigured;
-    if (!allTracksConfigured)
-        priv->allTracksConfigured = true;
-    GST_OBJECT_UNLOCK(m_webKitMediaSrc.get());
-
-    if (!allTracksConfigured) {
-        gst_element_no_more_pads(GST_ELEMENT(m_webKitMediaSrc.get()));
-        webKitMediaSrcDoAsyncDone(m_webKitMediaSrc.get());
-    }
-
-    Vector<GstAppSrc*> appsrcs;
-
-    GST_OBJECT_LOCK(m_webKitMediaSrc.get());
-    for (Stream* stream : priv->streams) {
-        if (stream->appsrc)
-            appsrcs.append(GST_APP_SRC(stream->appsrc));
-    }
-    GST_OBJECT_UNLOCK(m_webKitMediaSrc.get());
-
-    for (GstAppSrc* appsrc : appsrcs)
-        gst_app_src_end_of_stream(appsrc);
-}
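The deleted markEndOfStream() above copies the appsrc list while holding the object lock and only then calls gst_app_src_end_of_stream() on each element outside the lock. That copy-then-act pattern avoids calling into GStreamer (which may take its own locks) while holding ours. A library-free sketch of the pattern, with an illustrative `Registry` type standing in for the source's stream list:

```cpp
#include <cassert>
#include <mutex>
#include <vector>

// Copy-then-act: snapshot the shared list under the lock, then invoke the
// potentially re-entrant per-item operation with the lock released.
struct Registry {
    std::mutex mutex;
    std::vector<int*> items;

    void signalAll()
    {
        std::vector<int*> snapshot;
        {
            std::lock_guard<std::mutex> lock(mutex);
            snapshot = items; // copy under lock
        }
        for (int* item : snapshot) // act outside the lock
            *item += 1;
    }
};
```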
-
-void PlaybackPipeline::flush(AtomString trackId)
-{
-    ASSERT(WTF::isMainThread());
-
-    GST_DEBUG("flush: trackId=%s", trackId.string().utf8().data());
-
-    GST_OBJECT_LOCK(m_webKitMediaSrc.get());
-    Stream* stream = getStreamByTrackId(m_webKitMediaSrc.get(), trackId);
-
-    if (!stream) {
-        GST_OBJECT_UNLOCK(m_webKitMediaSrc.get());
-        return;
-    }
-
-    stream->lastEnqueuedTime = MediaTime::invalidTime();
-    GstElement* appsrc = stream->appsrc;
-    GST_OBJECT_UNLOCK(m_webKitMediaSrc.get());
-
-    if (!appsrc)
-        return;
-
-    gint64 position = GST_CLOCK_TIME_NONE;
-    GRefPtr<GstQuery> query = adoptGRef(gst_query_new_position(GST_FORMAT_TIME));
-    if (gst_element_query(pipeline(), query.get()))
-        gst_query_parse_position(query.get(), 0, &position);
-
-    GST_TRACE("Position: %" GST_TIME_FORMAT, GST_TIME_ARGS(position));
-
-    if (static_cast<guint64>(position) == GST_CLOCK_TIME_NONE) {
-        GST_DEBUG("Can't determine position, avoiding flush");
-        return;
-    }
-
-    if (!gst_element_send_event(GST_ELEMENT(appsrc), gst_event_new_flush_start())) {
-        GST_WARNING("Failed to send flush-start event for trackId=%s", trackId.string().utf8().data());
-    }
-
-    if (!gst_element_send_event(GST_ELEMENT(appsrc), gst_event_new_flush_stop(false))) {
-        GST_WARNING("Failed to send flush-stop event for trackId=%s", trackId.string().utf8().data());
-    }
-
-    GST_DEBUG("trackId=%s flushed", trackId.string().utf8().data());
-}
-
-void PlaybackPipeline::enqueueSample(Ref<MediaSample>&& mediaSample)
-{
-    ASSERT(WTF::isMainThread());
-
-    AtomString trackId = mediaSample->trackID();
-
-    GST_TRACE("enqueing sample trackId=%s PTS=%f presentationSize=%.0fx%.0f at %" GST_TIME_FORMAT " duration: %" GST_TIME_FORMAT,
-        trackId.string().utf8().data(), mediaSample->presentationTime().toFloat(),
-        mediaSample->presentationSize().width(), mediaSample->presentationSize().height(),
-        GST_TIME_ARGS(WebCore::toGstClockTime(mediaSample->presentationTime())),
-        GST_TIME_ARGS(WebCore::toGstClockTime(mediaSample->duration())));
-
-    // No need to lock to access the Stream here because the only chance of conflict with this read and with the usage
-    // of the sample fields done in this method would be the deletion of the stream. However, that operation can only
-    // happen in the main thread, but we're already there. Therefore there's no conflict and locking would only cause
-    // a performance penalty on the readers working in other threads.
-    Stream* stream = getStreamByTrackId(m_webKitMediaSrc.get(), trackId);
-
-    if (!stream) {
-        GST_WARNING("No stream!");
-        return;
-    }
-
-    if (!stream->sourceBuffer->isReadyForMoreSamples(trackId)) {
-        GST_DEBUG("enqueueSample: skip adding new sample for trackId=%s, SB is not ready yet", trackId.string().utf8().data());
-        return;
-    }
-
-    // This field doesn't change after creation, no need to lock.
-    GstElement* appsrc = stream->appsrc;
-
-    // Only modified by the main thread, no need to lock.
-    MediaTime lastEnqueuedTime = stream->lastEnqueuedTime;
-
-    ASSERT(mediaSample->platformSample().type == PlatformSample::GStreamerSampleType);
-    GRefPtr<GstSample> gstSample = mediaSample->platformSample().sample.gstSample;
-    if (gstSample && gst_sample_get_buffer(gstSample.get())) {
-        GstBuffer* buffer = gst_sample_get_buffer(gstSample.get());
-        lastEnqueuedTime = mediaSample->presentationTime();
-
-        GST_BUFFER_FLAG_UNSET(buffer, GST_BUFFER_FLAG_DECODE_ONLY);
-        pushSample(GST_APP_SRC(appsrc), gstSample.get());
-        // gst_app_src_push_sample() uses transfer-none for gstSample.
-
-        stream->lastEnqueuedTime = lastEnqueuedTime;
-    }
-}
-
-void PlaybackPipeline::allSamplesInTrackEnqueued(const AtomString& trackId)
-{
-    Stream* stream = getStreamByTrackId(m_webKitMediaSrc.get(), trackId);
-    gst_app_src_end_of_stream(GST_APP_SRC(stream->appsrc));
-}
-
-GstElement* PlaybackPipeline::pipeline()
-{
-    if (!m_webKitMediaSrc || !GST_ELEMENT_PARENT(GST_ELEMENT(m_webKitMediaSrc.get())))
-        return nullptr;
-
-    return GST_ELEMENT_PARENT(GST_ELEMENT_PARENT(GST_ELEMENT(m_webKitMediaSrc.get())));
-}
-
-} // namespace WebCore.
-
-#endif // USE(GSTREAMER)
diff --git a/Source/WebCore/platform/graphics/gstreamer/mse/PlaybackPipeline.h b/Source/WebCore/platform/graphics/gstreamer/mse/PlaybackPipeline.h
deleted file mode 100644 (file)
index fc7ca0e..0000000
+++ /dev/null
@@ -1,80 +0,0 @@
-/*
- * Copyright (C) 2016 Metrological Group B.V.
- * Copyright (C) 2016 Igalia S.L
- *
- * This library is free software; you can redistribute it and/or
- * modify it under the terms of the GNU Library General Public
- * License as published by the Free Software Foundation; either
- * version 2 of the License, or (at your option) any later version.
- *
- * This library is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
- * Library General Public License for more details.
- *
- * You should have received a copy of the GNU Library General Public License
- * aint with this library; see the file COPYING.LIB.  If not, write to
- * the Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
- * Boston, MA 02110-1301, USA.
- */
-
-#pragma once
-
-#if ENABLE(VIDEO) && USE(GSTREAMER) && ENABLE(MEDIA_SOURCE)
-
-// PlaybackPipeline is (sort of) a friend class of WebKitMediaSourceGStreamer.
-
-#include "WebKitMediaSourceGStreamer.h"
-#include "WebKitMediaSourceGStreamerPrivate.h"
-
-#include <gst/gst.h>
-#include <wtf/Condition.h>
-#include <wtf/glib/GRefPtr.h>
-
-namespace WTF {
-template<> GRefPtr<WebKitMediaSrc> adoptGRef(WebKitMediaSrc*);
-template<> WebKitMediaSrc* refGPtr<WebKitMediaSrc>(WebKitMediaSrc*);
-template<> void derefGPtr<WebKitMediaSrc>(WebKitMediaSrc*);
-};
-
-namespace WebCore {
-
-class ContentType;
-class SourceBufferPrivateGStreamer;
-class MediaSourceGStreamer;
-
-class PlaybackPipeline: public RefCounted<PlaybackPipeline> {
-public:
-    static Ref<PlaybackPipeline> create()
-    {
-        return adoptRef(*new PlaybackPipeline());
-    }
-
-    virtual ~PlaybackPipeline() = default;
-
-    void setWebKitMediaSrc(WebKitMediaSrc*);
-    WebKitMediaSrc* webKitMediaSrc();
-
-    MediaSourcePrivate::AddStatus addSourceBuffer(RefPtr<SourceBufferPrivateGStreamer>);
-    void removeSourceBuffer(RefPtr<SourceBufferPrivateGStreamer>);
-    void attachTrack(RefPtr<SourceBufferPrivateGStreamer>, RefPtr<TrackPrivateBase>, GstCaps*);
-    void reattachTrack(RefPtr<SourceBufferPrivateGStreamer>, RefPtr<TrackPrivateBase>, GstCaps*);
-    void notifyDurationChanged();
-
-    // From MediaSourceGStreamer.
-    void markEndOfStream(MediaSourcePrivate::EndOfStreamStatus);
-
-    // From SourceBufferPrivateGStreamer.
-    void flush(AtomString);
-    void enqueueSample(Ref<MediaSample>&&);
-    void allSamplesInTrackEnqueued(const AtomString&);
-
-    GstElement* pipeline();
-private:
-    PlaybackPipeline() = default;
-    GRefPtr<WebKitMediaSrc> m_webKitMediaSrc;
-};
-
-} // namespace WebCore.
-
-#endif // USE(GSTREAMER)
index 7c1e799..8761cba 100644 (file)
@@ -46,6 +46,9 @@
 #include "NotImplemented.h"
 #include "WebKitMediaSourceGStreamer.h"
 
+GST_DEBUG_CATEGORY_EXTERN(webkit_mse_debug);
+#define GST_CAT_DEFAULT webkit_mse_debug
+
 namespace WebCore {
 
 Ref<SourceBufferPrivateGStreamer> SourceBufferPrivateGStreamer::create(MediaSourceGStreamer* mediaSource, Ref<MediaSourceClientGStreamerMSE> client, const ContentType& contentType)
@@ -109,11 +112,9 @@ void SourceBufferPrivateGStreamer::flush(const AtomString& trackId)
     m_client->flush(trackId);
 }
 
-void SourceBufferPrivateGStreamer::enqueueSample(Ref<MediaSample>&& sample, const AtomString&)
+void SourceBufferPrivateGStreamer::enqueueSample(Ref<MediaSample>&& sample, const AtomString& trackId)
 {
-    m_notifyWhenReadyForMoreSamples = false;
-
-    m_client->enqueueSample(WTFMove(sample));
+    m_client->enqueueSample(WTFMove(sample), trackId);
 }
 
 void SourceBufferPrivateGStreamer::allSamplesInTrackEnqueued(const AtomString& trackId)
@@ -121,23 +122,12 @@ void SourceBufferPrivateGStreamer::allSamplesInTrackEnqueued(const AtomString& t
     m_client->allSamplesInTrackEnqueued(trackId);
 }
 
-bool SourceBufferPrivateGStreamer::isReadyForMoreSamples(const AtomString&)
-{
-    return m_isReadyForMoreSamples;
-}
-
-void SourceBufferPrivateGStreamer::setReadyForMoreSamples(bool isReady)
-{
-    ASSERT(WTF::isMainThread());
-    m_isReadyForMoreSamples = isReady;
-}
-
-void SourceBufferPrivateGStreamer::notifyReadyForMoreSamples()
+bool SourceBufferPrivateGStreamer::isReadyForMoreSamples(const AtomString& trackId)
 {
     ASSERT(WTF::isMainThread());
-    setReadyForMoreSamples(true);
-    if (m_notifyWhenReadyForMoreSamples)
-        m_sourceBufferPrivateClient->sourceBufferPrivateDidBecomeReadyForMoreSamples(m_trackId);
+    bool isReadyForMoreSamples = m_client->isReadyForMoreSamples(trackId);
+    GST_DEBUG("SourceBufferPrivate(%p) - isReadyForMoreSamples: %d", this, static_cast<int>(isReadyForMoreSamples));
+    return isReadyForMoreSamples;
 }
 
 void SourceBufferPrivateGStreamer::setActive(bool isActive)
@@ -149,8 +139,7 @@ void SourceBufferPrivateGStreamer::setActive(bool isActive)
 void SourceBufferPrivateGStreamer::notifyClientWhenReadyForMoreSamples(const AtomString& trackId)
 {
     ASSERT(WTF::isMainThread());
-    m_notifyWhenReadyForMoreSamples = true;
-    m_trackId = trackId;
+    m_client->notifyClientWhenReadyForMoreSamples(trackId, m_sourceBufferPrivateClient);
 }
 
 void SourceBufferPrivateGStreamer::didReceiveInitializationSegment(const SourceBufferPrivateClient::InitializationSegment& initializationSegment)
index e5fe406..38cea50 100644 (file)
@@ -69,15 +69,13 @@ public:
     void setActive(bool) final;
     void notifyClientWhenReadyForMoreSamples(const AtomString&) final;
 
-    void setReadyForMoreSamples(bool);
-    void notifyReadyForMoreSamples();
-
     void didReceiveInitializationSegment(const SourceBufferPrivateClient::InitializationSegment&);
     void didReceiveSample(MediaSample&);
     void didReceiveAllPendingSamples();
     void appendParsingFailed();
 
     ContentType type() const { return m_type; }
+    AtomString trackId() const { return m_trackId; }
 
 private:
     SourceBufferPrivateGStreamer(MediaSourceGStreamer*, Ref<MediaSourceClientGStreamerMSE>, const ContentType&);
@@ -87,8 +85,6 @@ private:
     ContentType m_type;
     Ref<MediaSourceClientGStreamerMSE> m_client;
     SourceBufferPrivateClient* m_sourceBufferPrivateClient { nullptr };
-    bool m_isReadyForMoreSamples = true;
-    bool m_notifyWhenReadyForMoreSamples = false;
     AtomString m_trackId;
 };
 
index 1ad4424..862305b 100644 (file)
@@ -3,8 +3,8 @@
  *  Copyright (C) 2013 Collabora Ltd.
  *  Copyright (C) 2013 Orange
  *  Copyright (C) 2014, 2015 Sebastian Dröge <sebastian@centricular.com>
- *  Copyright (C) 2015, 2016 Metrological Group B.V.
- *  Copyright (C) 2015, 2016 Igalia, S.L
+ *  Copyright (C) 2015, 2016, 2018, 2019 Metrological Group B.V.
+ *  Copyright (C) 2015, 2016, 2018, 2019 Igalia, S.L
  *
  *  This library is free software; you can redistribute it and/or
  *  modify it under the terms of the GNU Lesser General Public
 #include "config.h"
 #include "WebKitMediaSourceGStreamer.h"
 
-#include "PlaybackPipeline.h"
-
 #if ENABLE(VIDEO) && ENABLE(MEDIA_SOURCE) && USE(GSTREAMER)
 
-#include "AudioTrackPrivateGStreamer.h"
 #include "GStreamerCommon.h"
-#include "MediaDescription.h"
-#include "MediaPlayerPrivateGStreamerMSE.h"
-#include "MediaSample.h"
-#include "MediaSourceGStreamer.h"
-#include "NotImplemented.h"
-#include "SourceBufferPrivateGStreamer.h"
-#include "TimeRanges.h"
 #include "VideoTrackPrivateGStreamer.h"
-#include "WebKitMediaSourceGStreamerPrivate.h"
 
-#include <gst/pbutils/pbutils.h>
-#include <gst/video/video.h>
+#include <gst/gst.h>
 #include <wtf/Condition.h>
+#include <wtf/DataMutex.h>
+#include <wtf/HashMap.h>
 #include <wtf/MainThread.h>
+#include <wtf/MainThreadData.h>
 #include <wtf/RefPtr.h>
+#include <wtf/glib/WTFGType.h>
+#include <wtf/text/AtomString.h>
+#include <wtf/text/AtomStringHash.h>
 #include <wtf/text/CString.h>
 
+using namespace WTF;
+using namespace WebCore;
+
 GST_DEBUG_CATEGORY_STATIC(webkit_media_src_debug);
 #define GST_CAT_DEFAULT webkit_media_src_debug
 
-#define webkit_media_src_parent_class parent_class
-#define WEBKIT_MEDIA_SRC_CATEGORY_INIT GST_DEBUG_CATEGORY_INIT(webkit_media_src_debug, "webkitmediasrc", 0, "websrc element");
-
-static GstStaticPadTemplate srcTemplate = GST_STATIC_PAD_TEMPLATE("src_%u", GST_PAD_SRC,
+static GstStaticPadTemplate srcTemplate = GST_STATIC_PAD_TEMPLATE("src_%s", GST_PAD_SRC,
     GST_PAD_SOMETIMES, GST_STATIC_CAPS_ANY);
 
-static void enabledAppsrcNeedData(GstAppSrc*, guint, gpointer);
-static void enabledAppsrcEnoughData(GstAppSrc*, gpointer);
-static gboolean enabledAppsrcSeekData(GstAppSrc*, guint64, gpointer);
-
-static void disabledAppsrcNeedData(GstAppSrc*, guint, gpointer) { };
-static void disabledAppsrcEnoughData(GstAppSrc*, gpointer) { };
-static gboolean disabledAppsrcSeekData(GstAppSrc*, guint64, gpointer)
-{
-    return FALSE;
+enum {
+    PROP_0,
+    PROP_N_AUDIO,
+    PROP_N_VIDEO,
+    PROP_N_TEXT,
+    PROP_LAST
 };
 
-GstAppSrcCallbacks enabledAppsrcCallbacks = {
-    enabledAppsrcNeedData,
-    enabledAppsrcEnoughData,
-    enabledAppsrcSeekData,
-    { 0 }
-};
+struct Stream;
 
-GstAppSrcCallbacks disabledAppsrcCallbacks = {
-    disabledAppsrcNeedData,
-    disabledAppsrcEnoughData,
-    disabledAppsrcSeekData,
-    { 0 }
-};
+struct WebKitMediaSrcPrivate {
+    HashMap<AtomString, RefPtr<Stream>> streams;
+    Stream* streamByName(const AtomString& name)
+    {
+        Stream* stream = streams.get(name);
+        ASSERT(stream);
+        return stream;
+    }
 
-static Stream* getStreamByAppsrc(WebKitMediaSrc*, GstElement*);
-static void seekNeedsDataMainThread(WebKitMediaSrc*);
-static void notifyReadyForMoreSamplesMainThread(WebKitMediaSrc*, Stream*);
+    // Used for stream-start events, shared by all streams.
+    const unsigned groupId { gst_util_group_id_next() };
 
-static void enabledAppsrcNeedData(GstAppSrc* appsrc, guint, gpointer userData)
-{
-    WebKitMediaSrc* webKitMediaSrc = static_cast<WebKitMediaSrc*>(userData);
-    ASSERT(WEBKIT_IS_MEDIA_SRC(webKitMediaSrc));
+    // Every time a track is added or removed this collection is replaced with an updated one and a STREAM_COLLECTION
+    // message is posted on the bus.
+    GRefPtr<GstStreamCollection> collection { adoptGRef(gst_stream_collection_new("WebKitMediaSrc")) };
 
-    GST_OBJECT_LOCK(webKitMediaSrc);
-    OnSeekDataAction appsrcSeekDataNextAction = webKitMediaSrc->priv->appsrcSeekDataNextAction;
-    Stream* appsrcStream = getStreamByAppsrc(webKitMediaSrc, GST_ELEMENT(appsrc));
-    bool allAppsrcNeedDataAfterSeek = false;
+    // Changed on seeks.
+    GstClockTime startTime { 0 };
+    double rate { 1.0 };
 
-    if (webKitMediaSrc->priv->appsrcSeekDataCount > 0) {
-        if (appsrcStream && !appsrcStream->appsrcNeedDataFlag) {
-            ++webKitMediaSrc->priv->appsrcNeedDataCount;
-            appsrcStream->appsrcNeedDataFlag = true;
-        }
-        int numAppsrcs = webKitMediaSrc->priv->streams.size();
-        if (webKitMediaSrc->priv->appsrcSeekDataCount == numAppsrcs && webKitMediaSrc->priv->appsrcNeedDataCount == numAppsrcs) {
-            GST_DEBUG("All needDatas completed");
-            allAppsrcNeedDataAfterSeek = true;
-            webKitMediaSrc->priv->appsrcSeekDataCount = 0;
-            webKitMediaSrc->priv->appsrcNeedDataCount = 0;
-            webKitMediaSrc->priv->appsrcSeekDataNextAction = Nothing;
-
-            for (Stream* stream : webKitMediaSrc->priv->streams)
-                stream->appsrcNeedDataFlag = false;
-        }
-    }
-    GST_OBJECT_UNLOCK(webKitMediaSrc);
+    // Only used by URI Handler API implementation.
+    GUniquePtr<char> uri;
+};
 
-    if (allAppsrcNeedDataAfterSeek) {
-        GST_DEBUG("All expected appsrcSeekData() and appsrcNeedData() calls performed. Running next action (%d)", static_cast<int>(appsrcSeekDataNextAction));
+static void webKitMediaSrcUriHandlerInit(gpointer, gpointer);
+static void webKitMediaSrcFinalize(GObject*);
+static GstStateChangeReturn webKitMediaSrcChangeState(GstElement*, GstStateChange);
+static gboolean webKitMediaSrcActivateMode(GstPad*, GstObject*, GstPadMode, gboolean activate);
+static void webKitMediaSrcLoop(void*);
+static void webKitMediaSrcStreamFlushStart(const RefPtr<Stream>&);
+static void webKitMediaSrcStreamFlushStop(const RefPtr<Stream>&, bool resetTime);
+static void webKitMediaSrcGetProperty(GObject*, unsigned propId, GValue*, GParamSpec*);
 
-        switch (appsrcSeekDataNextAction) {
-        case MediaSourceSeekToTime:
-            webKitMediaSrc->priv->notifier->notify(WebKitMediaSrcMainThreadNotification::SeekNeedsData, [webKitMediaSrc] {
-                seekNeedsDataMainThread(webKitMediaSrc);
-            });
-            break;
-        case Nothing:
-            break;
-        }
-    } else if (appsrcSeekDataNextAction == Nothing) {
-        LockHolder locker(webKitMediaSrc->priv->streamLock);
+#define webkit_media_src_parent_class parent_class
 
-        GST_OBJECT_LOCK(webKitMediaSrc);
+struct WebKitMediaSrcPadPrivate {
+    RefPtr<Stream> stream;
+};
 
-        // Search again for the Stream, just in case it was removed between the previous lock and this one.
-        appsrcStream = getStreamByAppsrc(webKitMediaSrc, GST_ELEMENT(appsrc));
+struct WebKitMediaSrcPad {
+    GstPad parent;
+    WebKitMediaSrcPadPrivate* priv;
+};
 
-        if (appsrcStream && appsrcStream->type != WebCore::Invalid)
-            webKitMediaSrc->priv->notifier->notify(WebKitMediaSrcMainThreadNotification::ReadyForMoreSamples, [webKitMediaSrc, appsrcStream] {
-                notifyReadyForMoreSamplesMainThread(webKitMediaSrc, appsrcStream);
-            });
+struct WebKitMediaSrcPadClass {
+    GstPadClass parent;
+};
 
-        GST_OBJECT_UNLOCK(webKitMediaSrc);
-    }
-}
+namespace WTF {
 
-static void enabledAppsrcEnoughData(GstAppSrc *appsrc, gpointer userData)
+template<> GRefPtr<WebKitMediaSrcPad> adoptGRef(WebKitMediaSrcPad* ptr)
 {
-    // No need to lock on webKitMediaSrc, we're on the main thread and nobody is going to remove the stream in the meantime.
-    ASSERT(WTF::isMainThread());
+    ASSERT(!ptr || !g_object_is_floating(ptr));
+    return GRefPtr<WebKitMediaSrcPad>(ptr, GRefPtrAdopt);
+}
 
-    WebKitMediaSrc* webKitMediaSrc = static_cast<WebKitMediaSrc*>(userData);
-    ASSERT(WEBKIT_IS_MEDIA_SRC(webKitMediaSrc));
-    Stream* stream = getStreamByAppsrc(webKitMediaSrc, GST_ELEMENT(appsrc));
+template<> WebKitMediaSrcPad* refGPtr<WebKitMediaSrcPad>(WebKitMediaSrcPad* ptr)
+{
+    if (ptr)
+        gst_object_ref_sink(GST_OBJECT(ptr));
 
-    // This callback might have been scheduled from a child thread before the stream was removed.
-    // Then, the removal code might have run, and later this callback.
-    // This check solves the race condition.
-    if (!stream || stream->type == WebCore::Invalid)
-        return;
+    return ptr;
+}
 
-    stream->sourceBuffer->setReadyForMoreSamples(false);
+template<> void derefGPtr<WebKitMediaSrcPad>(WebKitMediaSrcPad* ptr)
+{
+    if (ptr)
+        gst_object_unref(ptr);
 }
 
-static gboolean enabledAppsrcSeekData(GstAppSrc*, guint64, gpointer userData)
+} // namespace WTF
+
+static GType webkit_media_src_pad_get_type();
+WEBKIT_DEFINE_TYPE(WebKitMediaSrcPad, webkit_media_src_pad, GST_TYPE_PAD);
+#define WEBKIT_TYPE_MEDIA_SRC_PAD (webkit_media_src_pad_get_type())
+#define WEBKIT_MEDIA_SRC_PAD(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj), WEBKIT_TYPE_MEDIA_SRC_PAD, WebKitMediaSrcPad))
+
+static void webkit_media_src_pad_class_init(WebKitMediaSrcPadClass*)
 {
-    ASSERT(WTF::isMainThread());
+}
 
-    WebKitMediaSrc* webKitMediaSrc = static_cast<WebKitMediaSrc*>(userData);
+G_DEFINE_TYPE_WITH_CODE(WebKitMediaSrc, webkit_media_src, GST_TYPE_ELEMENT,
+    G_IMPLEMENT_INTERFACE(GST_TYPE_URI_HANDLER, webKitMediaSrcUriHandlerInit);
+    G_ADD_PRIVATE(WebKitMediaSrc);
+    GST_DEBUG_CATEGORY_INIT(webkit_media_src_debug, "webkitmediasrc", 0, "WebKit MSE source element"));
+
+struct Stream : public ThreadSafeRefCounted<Stream> {
+    Stream(WebKitMediaSrc* source, GRefPtr<GstPad>&& pad, const AtomString& name, WebCore::MediaSourceStreamTypeGStreamer type, GRefPtr<GstCaps>&& initialCaps, GRefPtr<GstStream>&& streamInfo)
+        : source(source)
+        , pad(WTFMove(pad))
+        , name(name)
+        , type(type)
+        , streamInfo(WTFMove(streamInfo))
+        , streamingMembersDataMutex(WTFMove(initialCaps), source->priv->startTime, source->priv->rate, adoptGRef(gst_event_new_stream_collection(source->priv->collection.get())))
+    { }
+
+    WebKitMediaSrc* const source;
+    GRefPtr<GstPad> const pad;
+    AtomString const name;
+    WebCore::MediaSourceStreamTypeGStreamer type;
+    GRefPtr<GstStream> streamInfo;
+
+    // The point of having a queue in WebKitMediaSource is to limit the number of context switches per second.
+    // If we had no queue, the main thread would have to be woken up for every frame. On the other hand, if the
+    // queue had unlimited size WebKit would end up requesting flushes more often than necessary when frames
+    // in the future are re-appended. As a sweet spot between these extremes we choose to allow enqueueing a
+    // few seconds' worth of samples.
+
+    // `isReadyForMoreSamples` follows the classic two-water-level strategy: initially it's true until the
+    // high water level is reached, then it becomes false until the queue drains down to the low water level
+    // and the cycle repeats. This way we avoid stalls and minimize context switches.
+
+    static const uint64_t durationEnqueuedHighWaterLevel = 5 * GST_SECOND;
+    static const uint64_t durationEnqueuedLowWaterLevel = 2 * GST_SECOND;
+
+    struct StreamingMembers {
+        StreamingMembers(GRefPtr<GstCaps>&& initialCaps, GstClockTime startTime, double rate, GRefPtr<GstEvent>&& pendingStreamCollectionEvent)
+            : pendingStreamCollectionEvent(WTFMove(pendingStreamCollectionEvent))
+            , pendingInitialCaps(WTFMove(initialCaps))
+        {
+            gst_segment_init(&segment, GST_FORMAT_TIME);
+            segment.start = segment.time = startTime;
+            segment.rate = rate;
+
+            GstStreamCollection* collection;
+            gst_event_parse_stream_collection(this->pendingStreamCollectionEvent.get(), &collection);
+            ASSERT(collection);
+        }
 
-    ASSERT(WEBKIT_IS_MEDIA_SRC(webKitMediaSrc));
+        bool hasPushedFirstBuffer { false };
+        bool wasStreamStartSent { false };
+        bool doesNeedSegmentEvent { true };
+        GstSegment segment;
+        GRefPtr<GstEvent> pendingStreamCollectionEvent;
+        GRefPtr<GstCaps> pendingInitialCaps;
+        GRefPtr<GstCaps> previousCaps;
+
+        Condition padLinkedOrFlushedCondition;
+        Condition queueChangedOrFlushedCondition;
+        Deque<GRefPtr<GstMiniObject>> queue;
+        bool isFlushing { false };
+        bool doesNeedToNotifyOnLowWaterLevel { false };
+
+        uint64_t durationEnqueued() const
+        {
+            // Find the first and last GstSample in the queue and subtract their DTS.
+            auto frontIter = std::find_if(queue.begin(), queue.end(), [](const GRefPtr<GstMiniObject>& object) {
+                return GST_IS_SAMPLE(object.get());
+            });
 
-    GST_OBJECT_LOCK(webKitMediaSrc);
-    webKitMediaSrc->priv->appsrcSeekDataCount++;
-    GST_OBJECT_UNLOCK(webKitMediaSrc);
+            // If there are no samples in the queue, the total duration of enqueued frames is zero.
+            if (frontIter == queue.end())
+                return 0;
 
-    return TRUE;
-}
+            auto backIter = std::find_if(queue.rbegin(), queue.rend(), [](const GRefPtr<GstMiniObject>& object) {
+                return GST_IS_SAMPLE(object.get());
+            });
+
+            const GstBuffer* front = gst_sample_get_buffer(GST_SAMPLE(frontIter->get()));
+            const GstBuffer* back = gst_sample_get_buffer(GST_SAMPLE(backIter->get()));
+            return GST_BUFFER_DTS_OR_PTS(back) - GST_BUFFER_DTS_OR_PTS(front);
+        }
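The queue above mixes events and samples; durationEnqueued() finds the first and last GstSample and subtracts their timestamps, skipping non-sample entries. A simplified, GStreamer-free sketch of the same scan, using an illustrative tagged `Entry` struct in place of GstMiniObject:

```cpp
#include <algorithm>
#include <cassert>
#include <cstdint>
#include <deque>

// Simplified stand-in for the queue of GstMiniObjects: an entry is either an
// event (no timestamp) or a sample carrying a DTS-or-PTS timestamp.
struct Entry {
    bool isSample;
    uint64_t timestamp; // only meaningful when isSample is true
};

// Mirrors Stream::StreamingMembers::durationEnqueued(): the span between the
// first and last sample in the queue; zero when the queue holds no samples.
uint64_t durationEnqueued(const std::deque<Entry>& queue)
{
    auto isSample = [](const Entry& entry) { return entry.isSample; };
    auto front = std::find_if(queue.begin(), queue.end(), isSample);
    if (front == queue.end())
        return 0;
    auto back = std::find_if(queue.rbegin(), queue.rend(), isSample);
    return back->timestamp - front->timestamp;
}
```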
+    };
+    DataMutex<StreamingMembers> streamingMembersDataMutex;
+
+    struct ReportedStatus {
+        // Set to true when the pad is removed. In the case where a reference to the Stream object is alive because of
+        // a posted task to notify isReadyForMoreSamples, the notification must not be delivered if this flag is true.
+        bool wasRemoved { false };
+
+        bool isReadyForMoreSamples { true };
+        SourceBufferPrivateClient* sourceBufferPrivateToNotify { nullptr };
+    };
+    MainThreadData<ReportedStatus> reportedStatus;
+};
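The two-water-level readiness policy described in the comments above (ready until durationEnqueuedHighWaterLevel is reached, then not ready until the queue drains to durationEnqueuedLowWaterLevel) amounts to a small hysteresis state machine. This is an illustrative model, not the patch's code; the `WaterLevelGate` name and the unit-less thresholds are assumptions made for the sketch:

```cpp
#include <cassert>
#include <cstdint>

// Two-water-level hysteresis, as used for Stream's isReadyForMoreSamples:
// stop accepting at the high mark, resume once drained below the low mark.
class WaterLevelGate {
public:
    WaterLevelGate(uint64_t low, uint64_t high)
        : m_low(low), m_high(high) { }

    // Call with the current enqueued duration; returns readiness.
    bool update(uint64_t enqueuedDuration)
    {
        if (m_ready && enqueuedDuration >= m_high)
            m_ready = false; // high water level reached
        else if (!m_ready && enqueuedDuration <= m_low)
            m_ready = true; // drained down to the low water level
        return m_ready;
    }

private:
    const uint64_t m_low;
    const uint64_t m_high;
    bool m_ready { true };
};
```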
 
-static Stream* getStreamByAppsrc(WebKitMediaSrc* source, GstElement* appsrc)
+static GRefPtr<GstElement> findPipeline(GRefPtr<GstElement> element)
 {
-    for (Stream* stream : source->priv->streams) {
-        if (stream->appsrc == appsrc)
-            return stream;
+    while (true) {
+        GRefPtr<GstElement> parentElement = adoptGRef(GST_ELEMENT(gst_element_get_parent(element.get())));
+        if (!parentElement)
+            return element;
+        element = parentElement;
     }
-    return nullptr;
 }
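findPipeline() above climbs the GstObject parent chain until it reaches the topmost element, which for a playing WebKitMediaSrc is the pipeline. The same walk over a plain parent-linked node, as a GStreamer-free sketch (the `Node` type is illustrative):

```cpp
#include <cassert>

// Minimal parent-linked node standing in for GstElement; findRoot mirrors
// findPipeline(): follow parents until an element with no parent is found.
struct Node {
    Node* parent { nullptr };
};

Node* findRoot(Node* node)
{
    while (node->parent)
        node = node->parent;
    return node;
}
```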
 
-G_DEFINE_TYPE_WITH_CODE(WebKitMediaSrc, webkit_media_src, GST_TYPE_BIN,
-    G_IMPLEMENT_INTERFACE(GST_TYPE_URI_HANDLER, webKitMediaSrcUriHandlerInit);
-    WEBKIT_MEDIA_SRC_CATEGORY_INIT);
-
-guint webKitMediaSrcSignals[LAST_SIGNAL] = { 0 };
-
 static void webkit_media_src_class_init(WebKitMediaSrcClass* klass)
 {
     GObjectClass* oklass = G_OBJECT_CLASS(klass);
     GstElementClass* eklass = GST_ELEMENT_CLASS(klass);
 
     oklass->finalize = webKitMediaSrcFinalize;
-    oklass->set_property = webKitMediaSrcSetProperty;
     oklass->get_property = webKitMediaSrcGetProperty;
 
-    gst_element_class_add_pad_template(eklass, gst_static_pad_template_get(&srcTemplate));
+    gst_element_class_add_static_pad_template_with_gtype(eklass, &srcTemplate, webkit_media_src_pad_get_type());
 
-    gst_element_class_set_static_metadata(eklass, "WebKit Media source element", "Source", "Handles Blob uris", "Stephane Jadaud <sjadaud@sii.fr>, Sebastian Dröge <sebastian@centricular.com>, Enrique Ocaña González <eocanha@igalia.com>");
+    gst_element_class_set_static_metadata(eklass, "WebKit MediaSource source element", "Source/Network", "Feeds samples coming from WebKit MediaSource object", "Igalia <aboya@igalia.com>");
+
+    eklass->change_state = webKitMediaSrcChangeState;
 
-    // Allows setting the uri using the 'location' property, which is used for example by gst_element_make_from_uri().
-    g_object_class_install_property(oklass,
-        PROP_LOCATION,
-        g_param_spec_string("location", "location", "Location to read from", nullptr,
-        GParamFlags(G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)));
     g_object_class_install_property(oklass,
         PROP_N_AUDIO,
         g_param_spec_int("n-audio", "Number Audio", "Total number of audio streams",
@@ -221,428 +272,531 @@ static void webkit_media_src_class_init(WebKitMediaSrcClass* klass)
         PROP_N_TEXT,
         g_param_spec_int("n-text", "Number Text", "Total number of text streams",
         0, G_MAXINT, 0, GParamFlags(G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)));
+}
 
-    webKitMediaSrcSignals[SIGNAL_VIDEO_CHANGED] =
-        g_signal_new("video-changed", G_TYPE_FROM_CLASS(oklass),
-        G_SIGNAL_RUN_LAST,
-        G_STRUCT_OFFSET(WebKitMediaSrcClass, videoChanged), nullptr, nullptr,
-        g_cclosure_marshal_generic, G_TYPE_NONE, 0, G_TYPE_NONE);
-    webKitMediaSrcSignals[SIGNAL_AUDIO_CHANGED] =
-        g_signal_new("audio-changed", G_TYPE_FROM_CLASS(oklass),
-        G_SIGNAL_RUN_LAST,
-        G_STRUCT_OFFSET(WebKitMediaSrcClass, audioChanged), nullptr, nullptr,
-        g_cclosure_marshal_generic, G_TYPE_NONE, 0, G_TYPE_NONE);
-    webKitMediaSrcSignals[SIGNAL_TEXT_CHANGED] =
-        g_signal_new("text-changed", G_TYPE_FROM_CLASS(oklass),
-        G_SIGNAL_RUN_LAST,
-        G_STRUCT_OFFSET(WebKitMediaSrcClass, textChanged), nullptr, nullptr,
-        g_cclosure_marshal_generic, G_TYPE_NONE, 0, G_TYPE_NONE);
-
-    eklass->change_state = webKitMediaSrcChangeState;
+static void webkit_media_src_init(WebKitMediaSrc* source)
+{
+    ASSERT(isMainThread());
 
-    g_type_class_add_private(klass, sizeof(WebKitMediaSrcPrivate));
+    GST_OBJECT_FLAG_SET(source, GST_ELEMENT_FLAG_SOURCE);
+    source->priv = G_TYPE_INSTANCE_GET_PRIVATE((source), WEBKIT_TYPE_MEDIA_SRC, WebKitMediaSrcPrivate);
+    new (source->priv) WebKitMediaSrcPrivate();
 }
 
-static GstFlowReturn webkitMediaSrcChain(GstPad* pad, GstObject* parent, GstBuffer* buffer)
+static void webKitMediaSrcFinalize(GObject* object)
 {
-    GRefPtr<WebKitMediaSrc> self = adoptGRef(WEBKIT_MEDIA_SRC(gst_object_get_parent(parent)));
+    ASSERT(isMainThread());
 
-    return gst_flow_combiner_update_pad_flow(self->priv->flowCombiner.get(), pad, gst_proxy_pad_chain_default(pad, GST_OBJECT(self.get()), buffer));
+    WebKitMediaSrc* source = WEBKIT_MEDIA_SRC(object);
+    source->priv->~WebKitMediaSrcPrivate();
+    GST_CALL_PARENT(G_OBJECT_CLASS, finalize, (object));
 }
 
-static void webkit_media_src_init(WebKitMediaSrc* source)
+static GstPadProbeReturn debugProbe(GstPad* pad, GstPadProbeInfo* info, void*)
 {
-    source->priv = WEBKIT_MEDIA_SRC_GET_PRIVATE(source);
-    new (source->priv) WebKitMediaSrcPrivate();
-    source->priv->seekTime = MediaTime::invalidTime();
-    source->priv->appsrcSeekDataCount = 0;
-    source->priv->appsrcNeedDataCount = 0;
-    source->priv->appsrcSeekDataNextAction = Nothing;
-    source->priv->flowCombiner = GUniquePtr<GstFlowCombiner>(gst_flow_combiner_new());
-    source->priv->notifier = WebCore::MainThreadNotifier<WebKitMediaSrcMainThreadNotification>::create();
+    RefPtr<Stream>& stream = WEBKIT_MEDIA_SRC_PAD(pad)->priv->stream;
+    GST_TRACE_OBJECT(stream->source, "track %s: %" GST_PTR_FORMAT, stream->name.string().utf8().data(), info->data);
+    return GST_PAD_PROBE_OK;
+}
+
+// A GstStreamCollection is immutable once posted: it MUST NOT be modified after that point.
+// Instead, when the set of streams changes, a new collection must be created. The following
+// functions help to create such new collections:
+
+static GRefPtr<GstStreamCollection> copyCollectionAndAddStream(GstStreamCollection* collection, GRefPtr<GstStream>&& stream)
+{
+    GRefPtr<GstStreamCollection> newCollection = adoptGRef(gst_stream_collection_new(collection->upstream_id));
+
+    unsigned n = gst_stream_collection_get_size(collection);
+    for (unsigned i = 0; i < n; i++)
+        gst_stream_collection_add_stream(newCollection.get(), static_cast<GstStream*>(gst_object_ref(gst_stream_collection_get_stream(collection, i))));
+    gst_stream_collection_add_stream(newCollection.get(), stream.leakRef());
 
-    // No need to reset Stream.appsrcNeedDataFlag because there are no Streams at this point yet.
+    return newCollection;
 }
 
-void webKitMediaSrcFinalize(GObject* object)
+static GRefPtr<GstStreamCollection> copyCollectionWithoutStream(GstStreamCollection* collection, const GstStream* stream)
 {
-    ASSERT(WTF::isMainThread());
+    GRefPtr<GstStreamCollection> newCollection = adoptGRef(gst_stream_collection_new(collection->upstream_id));
 
-    WebKitMediaSrc* source = WEBKIT_MEDIA_SRC(object);
-    WebKitMediaSrcPrivate* priv = source->priv;
+    unsigned n = gst_stream_collection_get_size(collection);
+    for (unsigned i = 0; i < n; i++) {
+        GRefPtr<GstStream> oldStream = gst_stream_collection_get_stream(collection, i);
+        if (oldStream.get() != stream)
+            gst_stream_collection_add_stream(newCollection.get(), oldStream.leakRef());
+    }
+
+    return newCollection;
+}
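Because a posted collection can never be mutated, every stream addition or removal builds a fresh collection from the old one. A minimal C++ sketch of that copy-and-modify pattern (a stand-in model, not the real GStreamer API; `StreamCollection` here plays the role of GstStreamCollection):

```cpp
#include <algorithm>
#include <iterator>
#include <memory>
#include <string>
#include <vector>

// Stand-in for GstStreamCollection: treated as immutable once "posted".
using StreamCollection = std::vector<std::string>;

// Mirrors copyCollectionAndAddStream(): copy every existing stream, then append the new one.
std::shared_ptr<const StreamCollection> copyAndAddStream(const StreamCollection& collection, const std::string& stream)
{
    auto newCollection = std::make_shared<StreamCollection>(collection);
    newCollection->push_back(stream);
    return newCollection;
}

// Mirrors copyCollectionWithoutStream(): copy every stream except the one being removed.
std::shared_ptr<const StreamCollection> copyWithoutStream(const StreamCollection& collection, const std::string& stream)
{
    auto newCollection = std::make_shared<StreamCollection>();
    std::copy_if(collection.begin(), collection.end(), std::back_inserter(*newCollection),
        [&](const std::string& s) { return s != stream; });
    return newCollection;
}
```

The original collection is left untouched in both cases, so any consumer still holding a reference to a previously posted collection keeps seeing a consistent snapshot.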
+
+static GstStreamType gstStreamType(WebCore::MediaSourceStreamTypeGStreamer type)
+{
+    switch (type) {
+    case WebCore::MediaSourceStreamTypeGStreamer::Video:
+        return GST_STREAM_TYPE_VIDEO;
+    case WebCore::MediaSourceStreamTypeGStreamer::Audio:
+        return GST_STREAM_TYPE_AUDIO;
+    case WebCore::MediaSourceStreamTypeGStreamer::Text:
+        return GST_STREAM_TYPE_TEXT;
+    default:
+        GST_ERROR("Received unexpected stream type");
+        return GST_STREAM_TYPE_UNKNOWN;
+    }
+}
 
-    Vector<Stream*> oldStreams;
-    source->priv->streams.swap(oldStreams);
+void webKitMediaSrcAddStream(WebKitMediaSrc* source, const AtomString& name, WebCore::MediaSourceStreamTypeGStreamer type, GRefPtr<GstCaps>&& initialCaps)
+{
+    ASSERT(isMainThread());
+    ASSERT(!source->priv->streams.contains(name));
 
-    for (Stream* stream : oldStreams)
-        webKitMediaSrcFreeStream(source, stream);
+    GRefPtr<GstStream> streamInfo = adoptGRef(gst_stream_new(name.string().utf8().data(), initialCaps.get(), gstStreamType(type), GST_STREAM_FLAG_SELECT));
+    source->priv->collection = copyCollectionAndAddStream(source->priv->collection.get(), GRefPtr<GstStream>(streamInfo));
+    gst_element_post_message(GST_ELEMENT(source), gst_message_new_stream_collection(GST_OBJECT(source), source->priv->collection.get()));
 
-    priv->seekTime = MediaTime::invalidTime();
+    GRefPtr<WebKitMediaSrcPad> pad = WEBKIT_MEDIA_SRC_PAD(g_object_new(webkit_media_src_pad_get_type(), "name", makeString("src_", name).utf8().data(), "direction", GST_PAD_SRC, NULL));
+    gst_pad_set_activatemode_function(GST_PAD(pad.get()), webKitMediaSrcActivateMode);
 
-    source->priv->notifier->invalidate();
+    {
+        RefPtr<Stream> stream = adoptRef(new Stream(source, GRefPtr<GstPad>(GST_PAD(pad.get())), name, type, WTFMove(initialCaps), WTFMove(streamInfo)));
+        pad->priv->stream = stream;
+        source->priv->streams.set(name, WTFMove(stream));
+    }
 
-    if (priv->mediaPlayerPrivate)
-        webKitMediaSrcSetMediaPlayerPrivate(source, nullptr);
+    if (gst_debug_category_get_threshold(webkit_media_src_debug) >= GST_LEVEL_TRACE)
+        gst_pad_add_probe(GST_PAD(pad.get()), static_cast<GstPadProbeType>(GST_PAD_PROBE_TYPE_DATA_DOWNSTREAM | GST_PAD_PROBE_TYPE_EVENT_FLUSH), debugProbe, nullptr, nullptr);
 
-    // We used a placement new for construction, the destructor won't be called automatically.
-    priv->~_WebKitMediaSrcPrivate();
+    // Workaround: gst_element_add_pad() should already call gst_pad_set_active() if the element is PAUSED or
+    // PLAYING. Unfortunately, as of GStreamer 1.14.4 it does so with the element lock taken, causing a deadlock
+    // in gst_pad_start_task(), which tries to post a `stream-status` message on the element, an operation that
+    // also requires the element lock. Activating the pad beforehand avoids that code path.
+    GstState state;
+    gst_element_get_state(GST_ELEMENT(source), &state, nullptr, 0);
+    if (state > GST_STATE_READY)
+        gst_pad_set_active(GST_PAD(pad.get()), true);
 
-    GST_CALL_PARENT(G_OBJECT_CLASS, finalize, (object));
+    gst_element_add_pad(GST_ELEMENT(source), GST_PAD(pad.get()));
 }
 
-void webKitMediaSrcSetProperty(GObject* object, guint propId, const GValue* value, GParamSpec* pspec)
+void webKitMediaSrcRemoveStream(WebKitMediaSrc* source, const AtomString& name)
 {
-    WebKitMediaSrc* source = WEBKIT_MEDIA_SRC(object);
+    ASSERT(isMainThread());
+    Stream* stream = source->priv->streamByName(name);
 
-    switch (propId) {
-    case PROP_LOCATION:
-        gst_uri_handler_set_uri(reinterpret_cast<GstURIHandler*>(source), g_value_get_string(value), nullptr);
-        break;
-    default:
-        G_OBJECT_WARN_INVALID_PROPERTY_ID(object, propId, pspec);
-        break;
-    }
+    source->priv->collection = copyCollectionWithoutStream(source->priv->collection.get(), stream->streamInfo.get());
+    gst_element_post_message(GST_ELEMENT(source), gst_message_new_stream_collection(GST_OBJECT(source), source->priv->collection.get()));
+
+    // Flush the source element **and** downstream. We want to stop the streaming thread and for that we need all elements downstream to be idle.
+    webKitMediaSrcStreamFlushStart(stream);
+    webKitMediaSrcStreamFlushStop(stream, false);
+    // Stop the thread now.
+    gst_pad_set_active(stream->pad.get(), false);
+
+    stream->reportedStatus->wasRemoved = true;
+    gst_element_remove_pad(GST_ELEMENT(source), stream->pad.get());
+    source->priv->streams.remove(name);
 }
 
-void webKitMediaSrcGetProperty(GObject* object, guint propId, GValue* value, GParamSpec* pspec)
+static gboolean webKitMediaSrcActivateMode(GstPad* pad, GstObject* source, GstPadMode mode, gboolean active)
 {
-    WebKitMediaSrc* source = WEBKIT_MEDIA_SRC(object);
-    WebKitMediaSrcPrivate* priv = source->priv;
+    if (mode != GST_PAD_MODE_PUSH) {
+        GST_ERROR_OBJECT(source, "Unexpected pad mode in WebKitMediaSrc");
+        return false;
+    }
 
-    GST_OBJECT_LOCK(source);
-    switch (propId) {
-    case PROP_LOCATION:
-        g_value_set_string(value, priv->location.get());
-        break;
-    case PROP_N_AUDIO:
-        g_value_set_int(value, priv->numberOfAudioStreams);
-        break;
-    case PROP_N_VIDEO:
-        g_value_set_int(value, priv->numberOfVideoStreams);
-        break;
-    case PROP_N_TEXT:
-        g_value_set_int(value, priv->numberOfTextStreams);
-        break;
-    default:
-        G_OBJECT_WARN_INVALID_PROPERTY_ID(object, propId, pspec);
-        break;
+    if (active)
+        gst_pad_start_task(pad, webKitMediaSrcLoop, pad, nullptr);
+    else {
+        // Unblock the streaming thread.
+        RefPtr<Stream>& stream = WEBKIT_MEDIA_SRC_PAD(pad)->priv->stream;
+        {
+            DataMutex<Stream::StreamingMembers>::LockedWrapper streamingMembers(stream->streamingMembersDataMutex);
+            streamingMembers->isFlushing = true;
+            streamingMembers->padLinkedOrFlushedCondition.notifyOne();
+            streamingMembers->queueChangedOrFlushedCondition.notifyOne();
+        }
+        // Following the gstbasesrc implementation, this code does not flush downstream.
+        // If there is any possibility of the streaming thread being blocked downstream, the caller MUST flush beforehand.
+        // Otherwise a deadlock would occur, as the next function tries to join the thread.
+        gst_pad_stop_task(pad);
+        {
+            DataMutex<Stream::StreamingMembers>::LockedWrapper streamingMembers(stream->streamingMembersDataMutex);
+            streamingMembers->isFlushing = false;
+        }
     }
-    GST_OBJECT_UNLOCK(source);
+    return true;
 }
 
-void webKitMediaSrcDoAsyncStart(WebKitMediaSrc* source)
+static void webKitMediaSrcPadLinked(GstPad* pad, GstPad*, void*)
 {
-    source->priv->asyncStart = true;
-    GST_BIN_CLASS(parent_class)->handle_message(GST_BIN(source),
-        gst_message_new_async_start(GST_OBJECT(source)));
+    RefPtr<Stream>& stream = WEBKIT_MEDIA_SRC_PAD(pad)->priv->stream;
+    DataMutex<Stream::StreamingMembers>::LockedWrapper streamingMembers(stream->streamingMembersDataMutex);
+    streamingMembers->padLinkedOrFlushedCondition.notifyOne();
 }
 
-void webKitMediaSrcDoAsyncDone(WebKitMediaSrc* source)
+static void webKitMediaSrcStreamNotifyLowWaterLevel(const RefPtr<Stream>& stream)
 {
-    WebKitMediaSrcPrivate* priv = source->priv;
-    if (priv->asyncStart) {
-        GST_BIN_CLASS(parent_class)->handle_message(GST_BIN(source),
-            gst_message_new_async_done(GST_OBJECT(source), GST_CLOCK_TIME_NONE));
-        priv->asyncStart = false;
-    }
+    RunLoop::main().dispatch([stream]() {
+        if (stream->reportedStatus->wasRemoved)
+            return;
+
+        stream->reportedStatus->isReadyForMoreSamples = true;
+        if (stream->reportedStatus->sourceBufferPrivateToNotify) {
+            // We need to clear sourceBufferPrivateToNotify BEFORE calling sourceBufferPrivateDidBecomeReadyForMoreSamples(),
+            // not after, since doing it after would discard any new notification request made by the callback.
+            SourceBufferPrivateClient* sourceBuffer = stream->reportedStatus->sourceBufferPrivateToNotify;
+            stream->reportedStatus->sourceBufferPrivateToNotify = nullptr;
+            sourceBuffer->sourceBufferPrivateDidBecomeReadyForMoreSamples(stream->name);
+        }
+    });
 }
 
-GstStateChangeReturn webKitMediaSrcChangeState(GstElement* element, GstStateChange transition)
+// Called with STREAM_LOCK.
+static void webKitMediaSrcLoop(void* userData)
 {
-    WebKitMediaSrc* source = WEBKIT_MEDIA_SRC(element);
-    WebKitMediaSrcPrivate* priv = source->priv;
+    GstPad* pad = GST_PAD(userData);
+    RefPtr<Stream>& stream = WEBKIT_MEDIA_SRC_PAD(pad)->priv->stream;
 
-    switch (transition) {
-    case GST_STATE_CHANGE_READY_TO_PAUSED:
-        priv->allTracksConfigured = false;
-        webKitMediaSrcDoAsyncStart(source);
-        break;
-    default:
-        break;
+    DataMutex<Stream::StreamingMembers>::LockedWrapper streamingMembers(stream->streamingMembersDataMutex);
+    if (streamingMembers->isFlushing) {
+        gst_pad_pause_task(pad);
+        return;
     }
 
-    GstStateChangeReturn result = GST_ELEMENT_CLASS(parent_class)->change_state(element, transition);
-    if (G_UNLIKELY(result == GST_STATE_CHANGE_FAILURE)) {
-        GST_WARNING_OBJECT(source, "State change failed");
-        webKitMediaSrcDoAsyncDone(source);
-        return result;
-    }
+    // Since the pad can and will be added when the element is in PLAYING state, this task can start running
+    // before the pad is linked. Wait for the pad to be linked to avoid buffers being lost to not-linked errors.
+    GST_OBJECT_LOCK(pad);
+    if (!GST_PAD_IS_LINKED(pad)) {
+        g_signal_connect(pad, "linked", G_CALLBACK(webKitMediaSrcPadLinked), nullptr);
+        GST_OBJECT_UNLOCK(pad);
 
-    switch (transition) {
-    case GST_STATE_CHANGE_READY_TO_PAUSED:
-        result = GST_STATE_CHANGE_ASYNC;
-        break;
-    case GST_STATE_CHANGE_PAUSED_TO_READY:
-        webKitMediaSrcDoAsyncDone(source);
-        priv->allTracksConfigured = false;
-        break;
-    default:
-        break;
+        streamingMembers->padLinkedOrFlushedCondition.wait(streamingMembers.mutex());
+
+        g_signal_handlers_disconnect_by_func(pad, reinterpret_cast<void*>(webKitMediaSrcPadLinked), nullptr);
+        if (streamingMembers->isFlushing)
+            return;
+    } else
+        GST_OBJECT_UNLOCK(pad);
+    ASSERT(gst_pad_is_linked(pad));
+
+    // By keeping the lock we are guaranteed that a flush will not happen while we send essential events.
+    // These events should never block downstream, so the lock should be released in little time in every
+    // case.
+
+    if (streamingMembers->pendingStreamCollectionEvent)
+        gst_pad_push_event(stream->pad.get(), streamingMembers->pendingStreamCollectionEvent.leakRef());
+
+    if (!streamingMembers->wasStreamStartSent) {
+        GUniquePtr<char> streamId(g_strdup_printf("mse/%s", stream->name.string().utf8().data()));
+        GRefPtr<GstEvent> event = adoptGRef(gst_event_new_stream_start(streamId.get()));
+        gst_event_set_group_id(event.get(), stream->source->priv->groupId);
+        gst_event_set_stream(event.get(), stream->streamInfo.get());
+
+        bool wasStreamStartSent = gst_pad_push_event(pad, event.leakRef());
+        streamingMembers->wasStreamStartSent = wasStreamStartSent;
     }
 
-    return result;
-}
+    if (streamingMembers->pendingInitialCaps) {
+        GRefPtr<GstEvent> event = adoptGRef(gst_event_new_caps(streamingMembers->pendingInitialCaps.get()));
 
-gint64 webKitMediaSrcGetSize(WebKitMediaSrc* webKitMediaSrc)
-{
-    gint64 duration = 0;
-    for (Stream* stream : webKitMediaSrc->priv->streams)
-        duration = std::max<gint64>(duration, gst_app_src_get_size(GST_APP_SRC(stream->appsrc)));
-    return duration;
-}
+        gst_pad_push_event(pad, event.leakRef());
 
-gboolean webKitMediaSrcQueryWithParent(GstPad* pad, GstObject* parent, GstQuery* query)
-{
-    WebKitMediaSrc* source = WEBKIT_MEDIA_SRC(GST_ELEMENT(parent));
-    gboolean result = FALSE;
+        streamingMembers->previousCaps = WTFMove(streamingMembers->pendingInitialCaps);
+        ASSERT(!streamingMembers->pendingInitialCaps);
+    }
+
+    streamingMembers->queueChangedOrFlushedCondition.wait(streamingMembers.mutex(), [&]() {
+        return !streamingMembers->queue.isEmpty() || streamingMembers->isFlushing;
+    });
+    if (streamingMembers->isFlushing)
+        return;
+
+    // We wait to get a sample before emitting the first segment. This way, if we get a seek before any
+    // enqueue, we send only one segment. This also ensures that when such a seek is made and the flush is
+    // omitted (see webKitMediaSrcFlush), we still emit the updated, correct segment.
+    if (streamingMembers->doesNeedSegmentEvent) {
+        gst_pad_push_event(pad, gst_event_new_segment(&streamingMembers->segment));
+        streamingMembers->doesNeedSegmentEvent = false;
+    }
 
-    switch (GST_QUERY_TYPE(query)) {
-    case GST_QUERY_DURATION: {
-        GstFormat format;
-        gst_query_parse_duration(query, &format, nullptr);
+    GRefPtr<GstMiniObject> object = streamingMembers->queue.takeFirst();
+    if (GST_IS_SAMPLE(object.get())) {
+        GRefPtr<GstSample> sample = adoptGRef(GST_SAMPLE(object.leakRef()));
 
-        GST_DEBUG_OBJECT(source, "duration query in format %s", gst_format_get_name(format));
-        GST_OBJECT_LOCK(source);
-        switch (format) {
-        case GST_FORMAT_TIME: {
-            if (source->priv && source->priv->mediaPlayerPrivate) {
-                MediaTime duration = source->priv->mediaPlayerPrivate->durationMediaTime();
-                if (duration > MediaTime::zeroTime()) {
-                    gst_query_set_duration(query, format, WebCore::toGstClockTime(duration));
-                    GST_DEBUG_OBJECT(source, "Answering: duration=%" GST_TIME_FORMAT, GST_TIME_ARGS(WebCore::toGstClockTime(duration)));
-                    result = TRUE;
-                }
-            }
-            break;
+        if (!gst_caps_is_equal(gst_sample_get_caps(sample.get()), streamingMembers->previousCaps.get())) {
+            // This sample needs new caps (typically because of a quality change).
+            gst_pad_push_event(stream->pad.get(), gst_event_new_caps(gst_sample_get_caps(sample.get())));
+            streamingMembers->previousCaps = gst_sample_get_caps(sample.get());
         }
-        case GST_FORMAT_BYTES: {
-            if (source->priv) {
-                gint64 duration = webKitMediaSrcGetSize(source);
-                if (duration) {
-                    gst_query_set_duration(query, format, duration);
-                    GST_DEBUG_OBJECT(source, "size: %" G_GINT64_FORMAT, duration);
-                    result = TRUE;
-                }
-            }
-            break;
+
+        if (streamingMembers->doesNeedToNotifyOnLowWaterLevel && streamingMembers->durationEnqueued() <= Stream::durationEnqueuedLowWaterLevel) {
+            streamingMembers->doesNeedToNotifyOnLowWaterLevel = false;
+            webKitMediaSrcStreamNotifyLowWaterLevel(RefPtr<Stream>(stream));
         }
-        default:
-            break;
+
+        GRefPtr<GstBuffer> buffer = gst_sample_get_buffer(sample.get());
+        sample.clear();
+
+        if (!streamingMembers->hasPushedFirstBuffer) {
+            GUniquePtr<char> fileName { g_strdup_printf("playback-pipeline-before-playback-%s", stream->name.string().utf8().data()) };
+            GST_DEBUG_BIN_TO_DOT_FILE_WITH_TS(GST_BIN(findPipeline(GRefPtr<GstElement>(GST_ELEMENT(stream->source))).get()),
+                GST_DEBUG_GRAPH_SHOW_ALL, fileName.get());
+            streamingMembers->hasPushedFirstBuffer = true;
         }
 
-        GST_OBJECT_UNLOCK(source);
-        break;
-    }
-    case GST_QUERY_URI:
-        if (source) {
-            GST_OBJECT_LOCK(source);
-            if (source->priv)
-                gst_query_set_uri(query, source->priv->location.get());
-            GST_OBJECT_UNLOCK(source);
+        // Push the buffer without the streamingMembers lock so that flushes can happen while it travels downstream.
+        streamingMembers.lockHolder().unlockEarly();
+
+        ASSERT(GST_BUFFER_PTS_IS_VALID(buffer.get()));
+        GstFlowReturn ret = gst_pad_push(pad, buffer.leakRef());
+        if (ret != GST_FLOW_OK && ret != GST_FLOW_FLUSHING) {
+            GST_ERROR_OBJECT(pad, "Pushing buffer returned %s", gst_flow_get_name(ret));
+            gst_pad_pause_task(pad);
         }
-        result = TRUE;
-        break;
-    default: {
-        GRefPtr<GstPad> target = adoptGRef(gst_ghost_pad_get_target(GST_GHOST_PAD_CAST(pad)));
-        // Forward the query to the proxy target pad.
-        if (target)
-            result = gst_pad_query(target.get(), query);
-        break;
-    }
-    }
+    } else if (GST_IS_EVENT(object.get())) {
+        // EOS events and other enqueued events are also pushed without the lock so that a flush can interrupt them if necessary.
+        GRefPtr<GstEvent> event = GRefPtr<GstEvent>(GST_EVENT(object.leakRef()));
+
+        streamingMembers.lockHolder().unlockEarly();
+        bool eventHandled = gst_pad_push_event(pad, GRefPtr<GstEvent>(event).leakRef());
+        if (!eventHandled)
+            GST_DEBUG_OBJECT(pad, "Pushed event was not handled: %" GST_PTR_FORMAT, event.get());
+    } else
+        ASSERT_NOT_REACHED();
+}
 
-    return result;
+static void webKitMediaSrcEnqueueObject(WebKitMediaSrc* source, const AtomString& streamName, GRefPtr<GstMiniObject>&& object)
+{
+    ASSERT(isMainThread());
+    ASSERT(object);
+
+    Stream* stream = source->priv->streamByName(streamName);
+    DataMutex<Stream::StreamingMembers>::LockedWrapper streamingMembers(stream->streamingMembersDataMutex);
+    streamingMembers->queue.append(WTFMove(object));
+    if (stream->reportedStatus->isReadyForMoreSamples && streamingMembers->durationEnqueued() > Stream::durationEnqueuedHighWaterLevel) {
+        stream->reportedStatus->isReadyForMoreSamples = false;
+        streamingMembers->doesNeedToNotifyOnLowWaterLevel = true;
+    }
+    streamingMembers->queueChangedOrFlushedCondition.notifyOne();
 }
 
-void webKitMediaSrcUpdatePresentationSize(GstCaps* caps, Stream* stream)
+void webKitMediaSrcEnqueueSample(WebKitMediaSrc* source, const AtomString& streamName, GRefPtr<GstSample>&& sample)
 {
-    GST_OBJECT_LOCK(stream->parent);
-    if (WebCore::doCapsHaveType(caps, GST_VIDEO_CAPS_TYPE_PREFIX)) {
-        Optional<WebCore::FloatSize> size = WebCore::getVideoResolutionFromCaps(caps);
-        if (size.hasValue())
-            stream->presentationSize = size.value();
-        else
-            stream->presentationSize = WebCore::FloatSize();
-    } else
-        stream->presentationSize = WebCore::FloatSize();
+    ASSERT(GST_BUFFER_PTS_IS_VALID(gst_sample_get_buffer(sample.get())));
+    webKitMediaSrcEnqueueObject(source, streamName, adoptGRef(GST_MINI_OBJECT(sample.leakRef())));
+}
 
-    gst_caps_ref(caps);
-    stream->caps = adoptGRef(caps);
-    GST_OBJECT_UNLOCK(stream->parent);
+static void webKitMediaSrcEnqueueEvent(WebKitMediaSrc* source, const AtomString& streamName, GRefPtr<GstEvent>&& event)
+{
+    webKitMediaSrcEnqueueObject(source, streamName, adoptGRef(GST_MINI_OBJECT(event.leakRef())));
 }
 
-void webKitMediaSrcLinkStreamToSrcPad(GstPad* sourcePad, Stream* stream)
+void webKitMediaSrcEndOfStream(WebKitMediaSrc* source, const AtomString& streamName)
 {
-    unsigned padId = static_cast<unsigned>(GPOINTER_TO_INT(g_object_get_data(G_OBJECT(sourcePad), "padId")));
-    GST_DEBUG_OBJECT(stream->parent, "linking stream to src pad (id: %u)", padId);
+    webKitMediaSrcEnqueueEvent(source, streamName, adoptGRef(gst_event_new_eos()));
+}
 
-    GUniquePtr<gchar> padName(g_strdup_printf("src_%u", padId));
-    GstPad* ghostpad = WebCore::webkitGstGhostPadFromStaticTemplate(&srcTemplate, padName.get(), sourcePad);
+bool webKitMediaSrcIsReadyForMoreSamples(WebKitMediaSrc* source, const AtomString& streamName)
+{
+    ASSERT(isMainThread());
+    Stream* stream = source->priv->streamByName(streamName);
+    return stream->reportedStatus->isReadyForMoreSamples;
+}
 
-    auto proxypad = adoptGRef(GST_PAD(gst_proxy_pad_get_internal(GST_PROXY_PAD(ghostpad))));
-    gst_flow_combiner_add_pad(stream->parent->priv->flowCombiner.get(), proxypad.get());
-    gst_pad_set_chain_function(proxypad.get(), static_cast<GstPadChainFunction>(webkitMediaSrcChain));
-    gst_pad_set_query_function(ghostpad, webKitMediaSrcQueryWithParent);
+void webKitMediaSrcNotifyWhenReadyForMoreSamples(WebKitMediaSrc* source, const AtomString& streamName, WebCore::SourceBufferPrivateClient* sourceBufferPrivate)
+{
+    ASSERT(isMainThread());
+    Stream* stream = source->priv->streamByName(streamName);
+    ASSERT(!stream->reportedStatus->isReadyForMoreSamples);
+    stream->reportedStatus->sourceBufferPrivateToNotify = sourceBufferPrivate;
+}
 
-    gst_pad_set_active(ghostpad, TRUE);
-    gst_element_add_pad(GST_ELEMENT(stream->parent), ghostpad);
+static GstStateChangeReturn webKitMediaSrcChangeState(GstElement* element, GstStateChange transition)
+{
+    WebKitMediaSrc* source = WEBKIT_MEDIA_SRC(element);
+    if (transition == GST_STATE_CHANGE_PAUSED_TO_READY) {
+        while (!source->priv->streams.isEmpty())
+            webKitMediaSrcRemoveStream(source, source->priv->streams.begin()->key);
+    }
+    return GST_ELEMENT_CLASS(webkit_media_src_parent_class)->change_state(element, transition);
 }
 
-void webKitMediaSrcLinkSourcePad(GstPad* sourcePad, GstCaps* caps, Stream* stream)
+static void webKitMediaSrcStreamFlushStart(const RefPtr<Stream>& stream)
 {
-    ASSERT(caps && stream->parent);
-    if (!caps || !stream->parent) {
-        GST_ERROR("Unable to link parser");
-        return;
+    ASSERT(isMainThread());
+    {
+        DataMutex<Stream::StreamingMembers>::LockedWrapper streamingMembers(stream->streamingMembersDataMutex);
+
+        streamingMembers->isFlushing = true;
+        streamingMembers->queueChangedOrFlushedCondition.notifyOne();
+        streamingMembers->padLinkedOrFlushedCondition.notifyOne();
     }
 
-    webKitMediaSrcUpdatePresentationSize(caps, stream);
+    gst_pad_push_event(stream->pad.get(), gst_event_new_flush_start());
+}
+
+static void webKitMediaSrcStreamFlushStop(const RefPtr<Stream>& stream, bool resetTime)
+{
+    ASSERT(isMainThread());
+
+    // By taking the stream lock we wait for the streaming thread task to stop, if it hasn't already.
+    GST_PAD_STREAM_LOCK(stream->pad.get());
+    {
+        DataMutex<Stream::StreamingMembers>::LockedWrapper streamingMembers(stream->streamingMembersDataMutex);
 
-    // FIXME: drop webKitMediaSrcLinkStreamToSrcPad() and move its code here.
-    if (!gst_pad_is_linked(sourcePad)) {
-        GST_DEBUG_OBJECT(stream->parent, "pad not linked yet");
-        webKitMediaSrcLinkStreamToSrcPad(sourcePad, stream);
+        streamingMembers->isFlushing = false;
+        streamingMembers->doesNeedSegmentEvent = true;
+        streamingMembers->queue.clear();
+        if (streamingMembers->doesNeedToNotifyOnLowWaterLevel) {
+            streamingMembers->doesNeedToNotifyOnLowWaterLevel = false;
+            webKitMediaSrcStreamNotifyLowWaterLevel(stream);
+        }
     }
 
-    webKitMediaSrcCheckAllTracksConfigured(stream->parent);
+    // Since FLUSH_STOP is a synchronized event, we send it while we still hold the stream lock of the pad.
+    gst_pad_push_event(stream->pad.get(), gst_event_new_flush_stop(resetTime));
+
+    gst_pad_start_task(stream->pad.get(), webKitMediaSrcLoop, stream->pad.get(), nullptr);
+    GST_PAD_STREAM_UNLOCK(stream->pad.get());
 }
 
-void webKitMediaSrcFreeStream(WebKitMediaSrc* source, Stream* stream)
+void webKitMediaSrcFlush(WebKitMediaSrc* source, const AtomString& streamName)
 {
-    if (GST_IS_APP_SRC(stream->appsrc)) {
-        // Don't trigger callbacks from this appsrc to avoid using the stream anymore.
-        gst_app_src_set_callbacks(GST_APP_SRC(stream->appsrc), &disabledAppsrcCallbacks, nullptr, nullptr);
-        gst_app_src_end_of_stream(GST_APP_SRC(stream->appsrc));
+    ASSERT(isMainThread());
+    Stream* stream = source->priv->streamByName(streamName);
+
+    bool hasPushedFirstBuffer;
+    {
+        DataMutex<Stream::StreamingMembers>::LockedWrapper streamingMembers(stream->streamingMembersDataMutex);
+        hasPushedFirstBuffer = streamingMembers->hasPushedFirstBuffer;
     }
 
-    GST_OBJECT_LOCK(source);
-    switch (stream->type) {
-    case WebCore::Audio:
-        source->priv->numberOfAudioStreams--;
-        break;
-    case WebCore::Video:
-        source->priv->numberOfVideoStreams--;
-        break;
-    case WebCore::Text:
-        source->priv->numberOfTextStreams--;
-        break;
-    default:
-        break;
+    if (hasPushedFirstBuffer) {
+        // If no buffer has been pushed there is no need to flush... and flushing at that point could
+        // expose bugs in downstream elements that may not have initialized completely (e.g. decodebin3 not
+        // having linked the chain so far and forgetting to do it after the flush).
+        webKitMediaSrcStreamFlushStart(stream);
     }
-    GST_OBJECT_UNLOCK(source);
 
-    if (stream->type != WebCore::Invalid) {
-        GST_DEBUG("Freeing track-related info on stream %p", stream);
-
-        LockHolder locker(source->priv->streamLock);
-
-        if (stream->caps)
-            stream->caps = nullptr;
-
-        if (stream->audioTrack)
-            stream->audioTrack = nullptr;
-        if (stream->videoTrack)
-            stream->videoTrack = nullptr;
-
-        int signal = -1;
-        switch (stream->type) {
-        case WebCore::Audio:
-            signal = SIGNAL_AUDIO_CHANGED;
-            break;
-        case WebCore::Video:
-            signal = SIGNAL_VIDEO_CHANGED;
-            break;
-        case WebCore::Text:
-            signal = SIGNAL_TEXT_CHANGED;
-            break;
-        default:
-            break;
-        }
-        stream->type = WebCore::Invalid;
+    GstClockTime pipelineStreamTime = GST_CLOCK_TIME_NONE;
+    gst_element_query_position(findPipeline(GRefPtr<GstElement>(GST_ELEMENT(source))).get(), GST_FORMAT_TIME,
+        reinterpret_cast<gint64*>(&pipelineStreamTime));
+    // The query fails (leaving the time invalid) when the pipeline is not yet pre-rolled (e.g. just
+    // after a seek). In that case we don't need to adjust the segment, as running time has not advanced.
+    if (GST_CLOCK_TIME_IS_VALID(pipelineStreamTime)) {
+        DataMutex<Stream::StreamingMembers>::LockedWrapper streamingMembers(stream->streamingMembersDataMutex);
+        // We need to increase the base by the running time accumulated during the previous segment.
 
-        if (signal != -1)
-            g_signal_emit(G_OBJECT(source), webKitMediaSrcSignals[signal], 0, nullptr);
+        GstClockTime pipelineRunningTime = gst_segment_to_running_time(&streamingMembers->segment, GST_FORMAT_TIME, pipelineStreamTime);
+        ASSERT(GST_CLOCK_TIME_IS_VALID(pipelineRunningTime));
+        streamingMembers->segment.base = pipelineRunningTime;
 
-        source->priv->streamCondition.notifyOne();
+        streamingMembers->segment.start = streamingMembers->segment.time = static_cast<GstClockTime>(pipelineStreamTime);
     }
 
-    GST_DEBUG("Releasing stream: %p", stream);
-    delete stream;
+    if (hasPushedFirstBuffer)
+        webKitMediaSrcStreamFlushStop(stream, false);
 }
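The segment adjustment performed in `webKitMediaSrcFlush()` can be sketched without GStreamer. The following is a minimal model, not the real `GstSegment` API: `Segment`, `toRunningTime`, and `flushKeepingRunningTime` are illustrative names, and it assumes a positive playback rate with nanosecond times.

```cpp
#include <cassert>
#include <cstdint>

// Simplified model of a GstSegment for positive playback rates. In GStreamer,
// running time (what the pipeline clock follows) is derived from stream time:
//   running_time = base + (stream_time - start) / rate
struct Segment {
    double rate { 1.0 };
    uint64_t base { 0 };  // running time accumulated in previous segments
    uint64_t start { 0 }; // stream time at which this segment begins
    uint64_t time { 0 };  // media timeline position matching `start`
};

// Mirrors what gst_segment_to_running_time() computes for rate > 0.
uint64_t toRunningTime(const Segment& segment, uint64_t streamTime)
{
    return segment.base + static_cast<uint64_t>((streamTime - segment.start) / segment.rate);
}

// Flush without a time reset: fold the running time reached so far into
// `base` and restart the segment at the current position. Running time
// stays continuous across the flush, so playback does not jump.
void flushKeepingRunningTime(Segment& segment, uint64_t position)
{
    segment.base = toRunningTime(segment, position);
    segment.start = segment.time = position;
}
```

This illustrates why the base must grow on every non-seeking flush: dropping it back to zero would rewind running time and stall or rush the sinks.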
 
-void webKitMediaSrcCheckAllTracksConfigured(WebKitMediaSrc* webKitMediaSrc)
+void webKitMediaSrcSeek(WebKitMediaSrc* source, uint64_t startTime, double rate)
 {
-    bool allTracksConfigured = false;
+    ASSERT(isMainThread());
+    source->priv->startTime = startTime;
+    source->priv->rate = rate;
+
+    for (auto& pair : source->priv->streams) {
+        const RefPtr<Stream>& stream = pair.value;
+        bool hasPushedFirstBuffer;
+        {
+            DataMutex<Stream::StreamingMembers>::LockedWrapper streamingMembers(stream->streamingMembersDataMutex);
+            hasPushedFirstBuffer = streamingMembers->hasPushedFirstBuffer;
+        }
+
+        if (hasPushedFirstBuffer) {
+            // If no buffer has been pushed yet there is no need to flush... and flushing at that point
+            // could expose bugs downstream in elements that have not completely initialized yet (e.g.
+            // decodebin3 not having linked the chain so far and forgetting to do it after the flush).
+            webKitMediaSrcStreamFlushStart(stream);
+        }
 
-    GST_OBJECT_LOCK(webKitMediaSrc);
-    if (!webKitMediaSrc->priv->allTracksConfigured) {
-        allTracksConfigured = true;
-        for (Stream* stream : webKitMediaSrc->priv->streams) {
-            if (stream->type == WebCore::Invalid) {
-                allTracksConfigured = false;
-                break;
-            }
+        {
+            DataMutex<Stream::StreamingMembers>::LockedWrapper streamingMembers(stream->streamingMembersDataMutex);
+            streamingMembers->segment.base = 0;
+            streamingMembers->segment.rate = rate;
+            streamingMembers->segment.start = streamingMembers->segment.time = startTime;
         }
-        if (allTracksConfigured)
-            webKitMediaSrc->priv->allTracksConfigured = true;
+
+        if (hasPushedFirstBuffer)
+            webKitMediaSrcStreamFlushStop(stream, true);
     }
-    GST_OBJECT_UNLOCK(webKitMediaSrc);
+}
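The per-stream segment rewrite in `webKitMediaSrcSeek()` differs from the flush case above in one key way, sketched below. This is an illustrative model, not GStreamer code: `SeekSegment` and `applySeek` are made-up names standing in for the fields the element actually mutates.

```cpp
#include <cassert>
#include <cstdint>

// Toy stand-in for the segment fields rewritten on seek. Unlike a plain
// flush, a seek is followed by a flush-stop that resets time downstream,
// so `base` returns to zero and running time restarts at the seek target.
struct SeekSegment {
    double rate { 1.0 };
    uint64_t base { 0 };
    uint64_t start { 0 };
    uint64_t time { 0 };
};

void applySeek(SeekSegment& segment, uint64_t startTime, double rate)
{
    segment.base = 0; // running time restarts: flush-stop carries reset-time=TRUE
    segment.rate = rate;
    segment.start = segment.time = startTime;
}
```

The `true` passed to `webKitMediaSrcStreamFlushStop()` on the seek path (versus `false` on the plain flush path) is what signals that time reset downstream.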
+
+static int countStreamsOfType(WebKitMediaSrc* source, WebCore::MediaSourceStreamTypeGStreamer type)
+{
+    // Barring pipeline dumps that someone may add during debugging, WebKit only reads these properties (n-video etc.) from the main thread.
+    return std::count_if(source->priv->streams.begin(), source->priv->streams.end(), [type](auto item) {
+        return item.value->type == type;
+    });
+}
+
+static void webKitMediaSrcGetProperty(GObject* object, unsigned propId, GValue* value, GParamSpec* pspec)
+{
+    WebKitMediaSrc* source = WEBKIT_MEDIA_SRC(object);
 
-    if (allTracksConfigured) {
-        GST_DEBUG("All tracks attached. Completing async state change operation.");
-        gst_element_no_more_pads(GST_ELEMENT(webKitMediaSrc));
-        webKitMediaSrcDoAsyncDone(webKitMediaSrc);
+    switch (propId) {
+    case PROP_N_AUDIO:
+        g_value_set_int(value, countStreamsOfType(source, WebCore::MediaSourceStreamTypeGStreamer::Audio));
+        break;
+    case PROP_N_VIDEO:
+        g_value_set_int(value, countStreamsOfType(source, WebCore::MediaSourceStreamTypeGStreamer::Video));
+        break;
+    case PROP_N_TEXT:
+        g_value_set_int(value, countStreamsOfType(source, WebCore::MediaSourceStreamTypeGStreamer::Text));
+        break;
+    default:
+        G_OBJECT_WARN_INVALID_PROPERTY_ID(object, propId, pspec);
     }
 }
 
-// Uri handler interface.
-GstURIType webKitMediaSrcUriGetType(GType)
+// URI handler interface. Its only purpose is to let the element be instantiated by playbin on "mediasourceblob:"
+// URIs. The actual URI does not matter.
+static GstURIType webKitMediaSrcUriGetType(GType)
 {
     return GST_URI_SRC;
 }
 
-const gchar* const* webKitMediaSrcGetProtocols(GType)
+static const gchar* const* webKitMediaSrcGetProtocols(GType)
 {
     static const char* protocols[] = {"mediasourceblob", nullptr };
     return protocols;
 }
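The protocol list above is all uridecodebin needs to route "mediasourceblob:" URIs to this element. The scheme matching it performs can be approximated as follows; `handlesUri` is an illustrative helper, not a GStreamer function.

```cpp
#include <cassert>
#include <cstddef>
#include <string>

// Rough sketch of how a URI handler's protocol list is consulted: the URI
// scheme (everything before the first ':') is compared against each entry
// of the null-terminated protocol array. The rest of the URI is irrelevant
// for element selection, matching the "actual URI does not matter" comment.
bool handlesUri(const char* const* protocols, const std::string& uri)
{
    std::size_t colon = uri.find(':');
    if (colon == std::string::npos)
        return false; // not a valid URI: no scheme separator
    std::string scheme = uri.substr(0, colon);
    for (const char* const* p = protocols; *p; ++p) {
        if (scheme == *p)
            return true;
    }
    return false;
}
```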
 
-gchar* webKitMediaSrcGetUri(GstURIHandler* handler)
+static gchar* webKitMediaSrcGetUri(GstURIHandler* handler)
 {
     WebKitMediaSrc* source = WEBKIT_MEDIA_SRC(handler);
     gchar* result;
 
     GST_OBJECT_LOCK(source);
-    result = g_strdup(source->priv->location.get());
+    result = g_strdup(source->priv->uri.get());
     GST_OBJECT_UNLOCK(source);
     return result;
 }
 
-gboolean webKitMediaSrcSetUri(GstURIHandler* handler, const gchar* uri, GError**)
+static gboolean webKitMediaSrcSetUri(GstURIHandler* handler, const gchar* uri, GError**)
 {
     WebKitMediaSrc* source = WEBKIT_MEDIA_SRC(handler);
 
     if (GST_STATE(source) >= GST_STATE_PAUSED) {
         GST_ERROR_OBJECT(source, "URI can only be set in states < PAUSED");
-        return FALSE;
+        return false;
     }
 
     GST_OBJECT_LOCK(source);
-    WebKitMediaSrcPrivate* priv = source->priv;
-    priv->location = nullptr;
-    if (!uri) {
-        GST_OBJECT_UNLOCK(source);
-        return TRUE;
-    }
-
-    URL url(URL(), uri);
-
-    priv->location = GUniquePtr<gchar>(g_strdup(url.string().utf8().data()));
+    source->priv->uri = GUniquePtr<char>(g_strdup(uri));
     GST_OBJECT_UNLOCK(source);
     return TRUE;
 }
 
-void webKitMediaSrcUriHandlerInit(gpointer gIface, gpointer)
+static void webKitMediaSrcUriHandlerInit(void* gIface, void*)
 {
     GstURIHandlerInterface* iface = (GstURIHandlerInterface *) gIface;
 
@@ -652,83 +806,6 @@ void webKitMediaSrcUriHandlerInit(gpointer gIface, gpointer)
     iface->set_uri = webKitMediaSrcSetUri;
 }
 
-static void seekNeedsDataMainThread(WebKitMediaSrc* source)
-{
-    GST_DEBUG("Buffering needed before seek");
-
-    ASSERT(WTF::isMainThread());
-
-    GST_OBJECT_LOCK(source);
-    MediaTime seekTime = source->priv->seekTime;
-    WebCore::MediaPlayerPrivateGStreamerMSE* mediaPlayerPrivate = source->priv->mediaPlayerPrivate;
-
-    if (!mediaPlayerPrivate) {
-        GST_OBJECT_UNLOCK(source);
-        return;
-    }
-
-    for (Stream* stream : source->priv->streams) {
-        if (stream->type != WebCore::Invalid)
-            stream->sourceBuffer->setReadyForMoreSamples(true);
-    }
-    GST_OBJECT_UNLOCK(source);
-    mediaPlayerPrivate->notifySeekNeedsDataForTime(seekTime);
-}
-
-static void notifyReadyForMoreSamplesMainThread(WebKitMediaSrc* source, Stream* appsrcStream)
-{
-    GST_OBJECT_LOCK(source);
-
-    auto it = std::find(source->priv->streams.begin(), source->priv->streams.end(), appsrcStream);
-    if (it == source->priv->streams.end()) {
-        GST_OBJECT_UNLOCK(source);
-        return;
-    }
-
-    WebCore::MediaPlayerPrivateGStreamerMSE* mediaPlayerPrivate = source->priv->mediaPlayerPrivate;
-    if (mediaPlayerPrivate && !mediaPlayerPrivate->seeking())
-        appsrcStream->sourceBuffer->notifyReadyForMoreSamples();
-
-    GST_OBJECT_UNLOCK(source);
-}
-
-void webKitMediaSrcSetMediaPlayerPrivate(WebKitMediaSrc* source, WebCore::MediaPlayerPrivateGStreamerMSE* mediaPlayerPrivate)
-{
-    GST_OBJECT_LOCK(source);
-
-    // Set to nullptr on MediaPlayerPrivateGStreamer destruction, never a dangling pointer.
-    source->priv->mediaPlayerPrivate = mediaPlayerPrivate;
-    GST_OBJECT_UNLOCK(source);
-}
-
-void webKitMediaSrcSetReadyForSamples(WebKitMediaSrc* source, bool isReady)
-{
-    if (source) {
-        GST_OBJECT_LOCK(source);
-        for (Stream* stream : source->priv->streams)
-            stream->sourceBuffer->setReadyForMoreSamples(isReady);
-        GST_OBJECT_UNLOCK(source);
-    }
-}
-
-void webKitMediaSrcPrepareSeek(WebKitMediaSrc* source, const MediaTime& time)
-{
-    GST_OBJECT_LOCK(source);
-    source->priv->seekTime = time;
-    source->priv->appsrcSeekDataCount = 0;
-    source->priv->appsrcNeedDataCount = 0;
-
-    for (Stream* stream : source->priv->streams) {
-        stream->appsrcNeedDataFlag = false;
-        // Don't allow samples away from the seekTime to be enqueued.
-        stream->lastEnqueuedTime = time;
-    }
-
-    // The pending action will be performed in enabledAppsrcSeekData().
-    source->priv->appsrcSeekDataNextAction = MediaSourceSeekToTime;
-    GST_OBJECT_UNLOCK(source);
-}
-
 namespace WTF {
 template <> GRefPtr<WebKitMediaSrc> adoptGRef(WebKitMediaSrc* ptr)
 {
@@ -749,7 +826,7 @@ template <> void derefGPtr<WebKitMediaSrc>(WebKitMediaSrc* ptr)
     if (ptr)
         gst_object_unref(ptr);
 }
-};
+} // namespace WTF
 
 #endif // USE(GSTREAMER)
 
index 6c45eb6..e0cf11c 100644 (file)
@@ -3,8 +3,8 @@
  *  Copyright (C) 2013 Collabora Ltd.
  *  Copyright (C) 2013 Orange
  *  Copyright (C) 2014, 2015 Sebastian Dröge <sebastian@centricular.com>
- *  Copyright (C) 2015, 2016 Metrological Group B.V.
- *  Copyright (C) 2015, 2016 Igalia, S.L
+ *  Copyright (C) 2015, 2016, 2018, 2019 Metrological Group B.V.
+ *  Copyright (C) 2015, 2016, 2018, 2019 Igalia, S.L
  *
  *  This library is free software; you can redistribute it and/or
  *  modify it under the terms of the GNU Lesser General Public
@@ -39,7 +39,7 @@ class MediaPlayerPrivateGStreamerMSE;
 
 enum MediaSourceStreamTypeGStreamer { Invalid, Unknown, Audio, Video, Text };
 
-}
+} // namespace WebCore
 
 G_BEGIN_DECLS
 
@@ -49,32 +49,38 @@ G_BEGIN_DECLS
 #define WEBKIT_IS_MEDIA_SRC(obj)         (G_TYPE_CHECK_INSTANCE_TYPE ((obj), WEBKIT_TYPE_MEDIA_SRC))
 #define WEBKIT_IS_MEDIA_SRC_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE ((klass), WEBKIT_TYPE_MEDIA_SRC))
 
-typedef struct _WebKitMediaSrc        WebKitMediaSrc;
-typedef struct _WebKitMediaSrcClass   WebKitMediaSrcClass;
-typedef struct _WebKitMediaSrcPrivate WebKitMediaSrcPrivate;
+struct WebKitMediaSrcPrivate;
 
-struct _WebKitMediaSrc {
-    GstBin parent;
+struct WebKitMediaSrc {
+    GstElement parent;
 
     WebKitMediaSrcPrivate *priv;
 };
 
-struct _WebKitMediaSrcClass {
-    GstBinClass parentClass;
-
-    // Notify app that number of audio/video/text streams changed.
-    void (*videoChanged)(WebKitMediaSrc*);
-    void (*audioChanged)(WebKitMediaSrc*);
-    void (*textChanged)(WebKitMediaSrc*);
+struct WebKitMediaSrcClass {
+    GstElementClass parentClass;
 };
 
 GType webkit_media_src_get_type(void);
 
-void webKitMediaSrcSetMediaPlayerPrivate(WebKitMediaSrc*, WebCore::MediaPlayerPrivateGStreamerMSE*);
+void webKitMediaSrcAddStream(WebKitMediaSrc*, const AtomString& name, WebCore::MediaSourceStreamTypeGStreamer, GRefPtr<GstCaps>&& initialCaps);
+void webKitMediaSrcRemoveStream(WebKitMediaSrc*, const AtomString& name);
+
+void webKitMediaSrcEnqueueSample(WebKitMediaSrc*, const AtomString& streamName, GRefPtr<GstSample>&&);
+void webKitMediaSrcEndOfStream(WebKitMediaSrc*, const AtomString& streamName);
 
-void webKitMediaSrcPrepareSeek(WebKitMediaSrc*, const MediaTime&);
-void webKitMediaSrcSetReadyForSamples(WebKitMediaSrc*, bool);
+bool webKitMediaSrcIsReadyForMoreSamples(WebKitMediaSrc*, const AtomString& streamName);
+void webKitMediaSrcNotifyWhenReadyForMoreSamples(WebKitMediaSrc*, const AtomString& streamName, WebCore::SourceBufferPrivateClient*);
+
+void webKitMediaSrcFlush(WebKitMediaSrc*, const AtomString& streamName);
+void webKitMediaSrcSeek(WebKitMediaSrc*, guint64 startTime, double rate);
 
 G_END_DECLS
 
+namespace WTF {
+template<> GRefPtr<WebKitMediaSrc> adoptGRef(WebKitMediaSrc* ptr);
+template<> WebKitMediaSrc* refGPtr<WebKitMediaSrc>(WebKitMediaSrc* ptr);
+template<> void derefGPtr<WebKitMediaSrc>(WebKitMediaSrc* ptr);
+} // namespace WTF
+
 #endif // USE(GSTREAMER)
diff --git a/Source/WebCore/platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamerPrivate.h b/Source/WebCore/platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamerPrivate.h
deleted file mode 100644 (file)
index 413dd3f..0000000
+++ /dev/null
@@ -1,155 +0,0 @@
-/*
- * Copyright (C) 2016 Metrological Group B.V.
- * Copyright (C) 2016 Igalia S.L
- *
- * This library is free software; you can redistribute it and/or
- * modify it under the terms of the GNU Library General Public
- * License as published by the Free Software Foundation; either
- * version 2 of the License, or (at your option) any later version.
- *
- * This library is distributed in the hope that it will be useful,
- * but WITHOUT ANY WARRANTY; without even the implied warranty of
- * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
- * Library General Public License for more details.
- *
- * You should have received a copy of the GNU Library General Public License
- * aint with this library; see the file COPYING.LIB.  If not, write to
- * the Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
- * Boston, MA 02110-1301, USA.
- */
-
-#pragma once
-
-#if ENABLE(VIDEO) && USE(GSTREAMER) && ENABLE(MEDIA_SOURCE)
-
-#include "AudioTrackPrivateGStreamer.h"
-#include "GUniquePtrGStreamer.h"
-#include "MainThreadNotifier.h"
-#include "SourceBufferPrivateGStreamer.h"
-#include "VideoTrackPrivateGStreamer.h"
-#include "WebKitMediaSourceGStreamer.h"
-
-#include <gst/app/gstappsrc.h>
-#include <gst/gst.h>
-#include <wtf/Forward.h>
-#include <wtf/glib/GRefPtr.h>
-
-namespace WebCore {
-
-class MediaPlayerPrivateGStreamerMSE;
-
-};
-
-void webKitMediaSrcUriHandlerInit(gpointer, gpointer);
-
-#define WEBKIT_MEDIA_SRC_GET_PRIVATE(obj) (G_TYPE_INSTANCE_GET_PRIVATE((obj), WEBKIT_TYPE_MEDIA_SRC, WebKitMediaSrcPrivate))
-
-typedef struct _Stream Stream;
-
-struct _Stream {
-    // Fields filled when the Stream is created.
-    WebKitMediaSrc* parent;
-
-    // AppSrc. Never modified after first assignment.
-    GstElement* appsrc;
-
-    // Never modified after first assignment.
-    WebCore::SourceBufferPrivateGStreamer* sourceBuffer;
-
-    // Fields filled when the track is attached.
-    WebCore::MediaSourceStreamTypeGStreamer type;
-    GRefPtr<GstCaps> caps;
-
-    // Only audio, video or nothing at a given time.
-    RefPtr<WebCore::AudioTrackPrivateGStreamer> audioTrack;
-    RefPtr<WebCore::VideoTrackPrivateGStreamer> videoTrack;
-    WebCore::FloatSize presentationSize;
-
-    // This helps WebKitMediaSrcPrivate.appsrcNeedDataCount, ensuring that needDatas are
-    // counted only once per each appsrc.
-    bool appsrcNeedDataFlag;
-
-    // Used to enforce continuity in the appended data and avoid breaking the decoder.
-    // Only used from the main thread.
-    MediaTime lastEnqueuedTime;
-};
-
-enum {
-    PROP_0,
-    PROP_LOCATION,
-    PROP_N_AUDIO,
-    PROP_N_VIDEO,
-    PROP_N_TEXT,
-    PROP_LAST
-};
-
-enum {
-    SIGNAL_VIDEO_CHANGED,
-    SIGNAL_AUDIO_CHANGED,
-    SIGNAL_TEXT_CHANGED,
-    LAST_SIGNAL
-};
-
-enum OnSeekDataAction {
-    Nothing,
-    MediaSourceSeekToTime
-};
-
-enum WebKitMediaSrcMainThreadNotification {
-    ReadyForMoreSamples = 1 << 0,
-    SeekNeedsData = 1 << 1
-};
-
-struct _WebKitMediaSrcPrivate {
-    // Used to coordinate the release of Stream track info.
-    Lock streamLock;
-    Condition streamCondition;
-
-    // Streams are only added/removed in the main thread.
-    Vector<Stream*> streams;
-
-    GUniquePtr<gchar> location;
-    int numberOfAudioStreams;
-    int numberOfVideoStreams;
-    int numberOfTextStreams;
-    bool asyncStart;
-    bool allTracksConfigured;
-    unsigned numberOfPads;
-
-    MediaTime seekTime;
-
-    // On seek, we wait for all the seekDatas, then for all the needDatas, and then run the nextAction.
-    OnSeekDataAction appsrcSeekDataNextAction;
-    int appsrcSeekDataCount;
-    int appsrcNeedDataCount;
-
-    WebCore::MediaPlayerPrivateGStreamerMSE* mediaPlayerPrivate;
-
-    RefPtr<WebCore::MainThreadNotifier<WebKitMediaSrcMainThreadNotification>> notifier;
-    GUniquePtr<GstFlowCombiner> flowCombiner;
-};
-
-extern guint webKitMediaSrcSignals[LAST_SIGNAL];
-extern GstAppSrcCallbacks enabledAppsrcCallbacks;
-extern GstAppSrcCallbacks disabledAppsrcCallbacks;
-
-void webKitMediaSrcUriHandlerInit(gpointer gIface, gpointer ifaceData);
-void webKitMediaSrcFinalize(GObject*);
-void webKitMediaSrcSetProperty(GObject*, guint propertyId, const GValue*, GParamSpec*);
-void webKitMediaSrcGetProperty(GObject*, guint propertyId, GValue*, GParamSpec*);
-void webKitMediaSrcDoAsyncStart(WebKitMediaSrc*);
-void webKitMediaSrcDoAsyncDone(WebKitMediaSrc*);
-GstStateChangeReturn webKitMediaSrcChangeState(GstElement*, GstStateChange);
-gint64 webKitMediaSrcGetSize(WebKitMediaSrc*);
-gboolean webKitMediaSrcQueryWithParent(GstPad*, GstObject*, GstQuery*);
-void webKitMediaSrcUpdatePresentationSize(GstCaps*, Stream*);
-void webKitMediaSrcLinkStreamToSrcPad(GstPad*, Stream*);
-void webKitMediaSrcLinkSourcePad(GstPad*, GstCaps*, Stream*);
-void webKitMediaSrcFreeStream(WebKitMediaSrc*, Stream*);
-void webKitMediaSrcCheckAllTracksConfigured(WebKitMediaSrc*);
-GstURIType webKitMediaSrcUriGetType(GType);
-const gchar* const* webKitMediaSrcGetProtocols(GType);
-gchar* webKitMediaSrcGetUri(GstURIHandler*);
-gboolean webKitMediaSrcSetUri(GstURIHandler*, const gchar*, GError**);
-
-#endif // USE(GSTREAMER)
index 18b12f8..3a7cec7 100644 (file)
@@ -37,8 +37,8 @@ if (ENABLE_VIDEO OR ENABLE_WEB_AUDIO)
     SET_AND_EXPOSE_TO_BUILD(USE_GSTREAMER TRUE)
 endif ()
 
-if (ENABLE_MEDIA_SOURCE AND PC_GSTREAMER_VERSION VERSION_LESS "1.14")
-    message(FATAL_ERROR "GStreamer 1.14 is needed for ENABLE_MEDIA_SOURCE.")
+if (ENABLE_MEDIA_SOURCE AND PC_GSTREAMER_VERSION VERSION_LESS "1.16")
+    message(FATAL_ERROR "GStreamer 1.16 is needed for ENABLE_MEDIA_SOURCE.")
 endif ()
 
 if (ENABLE_MEDIA_STREAM OR ENABLE_WEB_RTC)
index fa20e32..aa12e0a 100644 (file)
@@ -1,3 +1,15 @@
+2019-08-28  Alicia Boya García  <aboya@igalia.com>
+
+        [MSE][GStreamer] WebKitMediaSrc rework
+        https://bugs.webkit.org/show_bug.cgi?id=199719
+
+        Reviewed by Xabier Rodriguez-Calvar.
+
+        Added WebKitMediaSourceGStreamer.cpp to the GStreamer-style coding
+        whitelist.
+
+        * Scripts/webkitpy/style/checker.py:
+
 2019-08-28  Alexey Proskuryakov  <ap@apple.com>
 
         Updating inactive contributors in contributors.json.
index ec86e6b..dc088e9 100644 (file)
@@ -211,6 +211,7 @@ _PATH_RULES_SPECIFIER = [
       # variables and functions containing underscores.
       os.path.join('Source', 'WebCore', 'platform', 'graphics', 'gstreamer', 'VideoSinkGStreamer.cpp'),
       os.path.join('Source', 'WebCore', 'platform', 'graphics', 'gstreamer', 'WebKitWebSourceGStreamer.cpp'),
+      os.path.join('Source', 'WebCore', 'platform', 'graphics', 'gstreamer', 'mse', 'WebKitMediaSourceGStreamer.cpp'),
       os.path.join('Source', 'WebCore', 'platform', 'audio', 'gstreamer', 'WebKitWebAudioSourceGStreamer.cpp'),
       os.path.join('Source', 'WebCore', 'platform', 'mediastream', 'gstreamer', 'GStreamerMediaStreamSource.h'),
       os.path.join('Source', 'WebCore', 'platform', 'mediastream', 'gstreamer', 'GStreamerMediaStreamSource.cpp'),