[MSE][GStreamer] Don't construct segments on PlaybackPipeline::flush
author    aboya@igalia.com <aboya@igalia.com@268f45cc-cd09-0410-ab3c-d52691b4dbfc>
          Fri, 22 Mar 2019 12:22:23 +0000 (12:22 +0000)
committer aboya@igalia.com <aboya@igalia.com@268f45cc-cd09-0410-ab3c-d52691b4dbfc>
          Fri, 22 Mar 2019 12:22:23 +0000 (12:22 +0000)
https://bugs.webkit.org/show_bug.cgi?id=195867

Reviewed by Xabier Rodriguez-Calvar.

LayoutTests/imported/w3c:

These tests check that video and audio are roughly in sync with each
other and with the reported player position during MSE playback.

* web-platform-tests/media-source/mediasource-correct-frames-after-reappend-expected.txt: Added.
* web-platform-tests/media-source/mediasource-correct-frames-after-reappend.html: Added.
* web-platform-tests/media-source/mediasource-correct-frames-expected.txt: Added.
* web-platform-tests/media-source/mediasource-correct-frames.html: Added.
* web-platform-tests/media-source/mp4/test-boxes-audio.mp4: Added.
* web-platform-tests/media-source/mp4/test-boxes-video.mp4: Added.

Source/WebCore:

The previous approach did not really work for flushes on only one
branch, as setting reset-time in FLUSH_STOP affects the running time
of the entire pipeline, causing timing issues in the other branch.

Since it's preferable not to interfere with the other branch if
possible, setting reset-time to FALSE fixes that problem.

Also, it's not necessary to fabricate a segment. Since we are not
seeking, only the base needs to be adjusted, and gstbasesrc already
handles this correctly by default.

This fixes an audio/video synchronization bug in YouTube when some
automatic quality changes occur.

Tests: imported/w3c/web-platform-tests/media-source/mediasource-correct-frames-after-reappend.html
       imported/w3c/web-platform-tests/media-source/mediasource-correct-frames.html

* platform/graphics/gstreamer/mse/PlaybackPipeline.cpp:
(WebCore::PlaybackPipeline::flush):

LayoutTests:

Drawing an MSE video in a canvas seems to be failing on Mac. That
functionality is necessary for the tests introduced with this patch,
so they fail there. Marking them as Skip.

* platform/mac/TestExpectations:

git-svn-id: https://svn.webkit.org/repository/webkit/trunk@243372 268f45cc-cd09-0410-ab3c-d52691b4dbfc

LayoutTests/ChangeLog
LayoutTests/imported/w3c/ChangeLog
LayoutTests/imported/w3c/web-platform-tests/media-source/mediasource-correct-frames-after-reappend-expected.txt [new file with mode: 0644]
LayoutTests/imported/w3c/web-platform-tests/media-source/mediasource-correct-frames-after-reappend.html [new file with mode: 0644]
LayoutTests/imported/w3c/web-platform-tests/media-source/mediasource-correct-frames-expected.txt [new file with mode: 0644]
LayoutTests/imported/w3c/web-platform-tests/media-source/mediasource-correct-frames.html [new file with mode: 0644]
LayoutTests/imported/w3c/web-platform-tests/media-source/mp4/test-boxes-audio.mp4 [new file with mode: 0644]
LayoutTests/imported/w3c/web-platform-tests/media-source/mp4/test-boxes-video.mp4 [new file with mode: 0644]
LayoutTests/platform/mac/TestExpectations
Source/WebCore/ChangeLog
Source/WebCore/platform/graphics/gstreamer/mse/PlaybackPipeline.cpp

index 0a1ac18..fe8a265 100644 (file)
@@ -1,3 +1,16 @@
+2019-03-22  Alicia Boya García  <aboya@igalia.com>
+
+        [MSE][GStreamer] Don't construct segments on PlaybackPipeline::flush
+        https://bugs.webkit.org/show_bug.cgi?id=195867
+
+        Reviewed by Xabier Rodriguez-Calvar.
+
+        Drawing an MSE video in a canvas seems to be failing on Mac. That
+        functionality is necessary for the tests introduced with this patch,
+        so they fail there. Marking them as Skip.
+
+        * platform/mac/TestExpectations:
+
 2019-03-21  Said Abou-Hallawa  <sabouhallawa@apple.com>
 
         Remove the SVG tear off objects for SVGNumber, SVGNumberList and SVGAnimatedNumberList
index 44aab83..1f64c15 100644 (file)
@@ -1,3 +1,20 @@
+2019-03-22  Alicia Boya García  <aboya@igalia.com>
+
+        [MSE][GStreamer] Don't construct segments on PlaybackPipeline::flush
+        https://bugs.webkit.org/show_bug.cgi?id=195867
+
+        Reviewed by Xabier Rodriguez-Calvar.
+
+        These tests check that video and audio are roughly in sync with each
+        other and with the reported player position during MSE playback.
+
+        * web-platform-tests/media-source/mediasource-correct-frames-after-reappend-expected.txt: Added.
+        * web-platform-tests/media-source/mediasource-correct-frames-after-reappend.html: Added.
+        * web-platform-tests/media-source/mediasource-correct-frames-expected.txt: Added.
+        * web-platform-tests/media-source/mediasource-correct-frames.html: Added.
+        * web-platform-tests/media-source/mp4/test-boxes-audio.mp4: Added.
+        * web-platform-tests/media-source/mp4/test-boxes-video.mp4: Added.
+
 2019-03-21  Sihui Liu  <sihui_liu@apple.com>
 
         Fix key path extraction code in IndexedDB to check own property
diff --git a/LayoutTests/imported/w3c/web-platform-tests/media-source/mediasource-correct-frames-after-reappend-expected.txt b/LayoutTests/imported/w3c/web-platform-tests/media-source/mediasource-correct-frames-after-reappend-expected.txt
new file mode 100644 (file)
index 0000000..f734c3e
--- /dev/null
@@ -0,0 +1,3 @@
+
+PASS Test the expected frames are played at the expected times, even in presence of reappends 
+
diff --git a/LayoutTests/imported/w3c/web-platform-tests/media-source/mediasource-correct-frames-after-reappend.html b/LayoutTests/imported/w3c/web-platform-tests/media-source/mediasource-correct-frames-after-reappend.html
new file mode 100644 (file)
index 0000000..5c0f2e1
--- /dev/null
@@ -0,0 +1,162 @@
+<!DOCTYPE html>
+<!-- Copyright © 2019 Igalia. -->
+<html>
+<head>
+    <title>Frame checking test for MSE playback in presence of a reappend.</title>
+    <meta name="timeout" content="long">
+    <meta charset="UTF-8">
+    <link rel="author" title="Alicia Boya García" href="mailto:aboya@igalia.com">
+    <script src="/resources/testharness.js"></script>
+    <script src="/resources/testharnessreport.js"></script>
+    <script src="mediasource-util.js"></script>
+</head>
+<body>
+<div id="log"></div>
+<canvas id="test-canvas"></canvas>
+<script>
+    function waitForEventPromise(element, event) {
+        return new Promise(resolve => {
+            function handler(ev) {
+                element.removeEventListener(event, handler);
+                resolve(ev);
+            }
+            element.addEventListener(event, handler);
+        });
+    }
+
+    function appendBufferPromise(sourceBuffer, data) {
+        sourceBuffer.appendBuffer(data);
+        return waitForEventPromise(sourceBuffer, "update");
+    }
+
+    function waitForPlayerToReachTimePromise(mediaElement, time) {
+        return new Promise(resolve => {
+            function timeupdate() {
+                if (mediaElement.currentTime < time)
+                    return;
+
+                mediaElement.removeEventListener("timeupdate", timeupdate);
+                resolve();
+            }
+            mediaElement.addEventListener("timeupdate", timeupdate);
+        });
+    }
+
+    function readPixel(imageData, x, y) {
+        return {
+            r: imageData.data[4 * (y * imageData.width + x)],
+            g: imageData.data[1 + 4 * (y * imageData.width + x)],
+            b: imageData.data[2 + 4 * (y * imageData.width + x)],
+            a: imageData.data[3 + 4 * (y * imageData.width + x)],
+        };
+    }
+
+    function isPixelLit(pixel) {
+        const threshold = 200; // out of 255
+        return pixel.r >= threshold && pixel.g >= threshold && pixel.b >= threshold;
+    }
+
+    // The test video shows a row of gray boxes. Every box interval (1 second), a new box is lit white and a different
+    // note is played. This test makes sure the right number of lit boxes and the right note are played at the right time.
+    const totalBoxes = 7;
+    const boxInterval = 1; // seconds
+
+    const videoWidth = 320;
+    const videoHeight = 240;
+    const boxesY = 210;
+    const boxSide = 20;
+    const boxMargin = 20;
+    const allBoxesWidth = totalBoxes * boxSide + (totalBoxes - 1) * boxMargin;
+    const boxesX = new Array(totalBoxes).fill(undefined)
+        .map((_, i) => (videoWidth - allBoxesWidth) / 2 + boxSide / 2 + i * (boxSide + boxMargin));
+
+    // Sound starts playing A4 (440 Hz) and goes one chromatic note up with every box lit.
+    // By comparing the player position to both the amount of boxes lit and the note played we can detect A/V
+    // synchronization issues automatically.
+    const noteFrequencies = new Array(1 + totalBoxes).fill(undefined)
+        .map((_, i) => 440 * Math.pow(Math.pow(2, 1 / 12), i));
+
+    // We also check the first second [0, 1) where no boxes are lit, therefore we start counting at -1 to do the check
+    // for zero lit boxes.
+    let boxesLitSoFar = -1;
+
+    mediasource_test(async function (test, mediaElement, mediaSource) {
+        const canvas = document.getElementById("test-canvas");
+        const canvasCtx = canvas.getContext("2d");
+        canvas.width = videoWidth;
+        canvas.height = videoHeight;
+
+        const videoData = await (await fetch("mp4/test-boxes-video.mp4")).arrayBuffer();
+        const audioData = await (await fetch("mp4/test-boxes-audio.mp4")).arrayBuffer();
+
+        const videoSb = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.4d401f"');
+        const audioSb = mediaSource.addSourceBuffer('audio/mp4; codecs="mp4a.40.2"');
+
+        mediaElement.addEventListener('error', test.unreached_func("Unexpected event 'error'"));
+        mediaElement.addEventListener('ended', onEnded);
+        mediaElement.addEventListener('timeupdate', onTimeUpdate);
+
+        await appendBufferPromise(videoSb, videoData);
+        await appendBufferPromise(audioSb, audioData);
+        mediaElement.play();
+
+        const audioCtx = new (window.AudioContext || window.webkitAudioContext)();
+        const source = audioCtx.createMediaElementSource(mediaElement);
+        const analyser = audioCtx.createAnalyser();
+        analyser.fftSize = 8192;
+        source.connect(analyser);
+        analyser.connect(audioCtx.destination);
+
+        const freqDomainArray = new Float32Array(analyser.frequencyBinCount);
+
+        function checkNoteBeingPlayed() {
+            const expectedNoteFrequency = noteFrequencies[boxesLitSoFar];
+
+            analyser.getFloatFrequencyData(freqDomainArray);
+            const maxBin = freqDomainArray.reduce((prev, curValue, i) =>
+                curValue > prev.value ? {index: i, value: curValue} : prev,
+                {index: -1, value: -Infinity});
+            const binFrequencyWidth = audioCtx.sampleRate / analyser.fftSize;
+            const binFreq = maxBin.index * binFrequencyWidth;
+
+            assert_true(Math.abs(expectedNoteFrequency - binFreq) <= binFrequencyWidth,
+                `The note being played matches the expected one (boxes lit: ${boxesLitSoFar}, ${expectedNoteFrequency.toFixed(1)} Hz)` +
+                `, found ~${binFreq.toFixed(1)} Hz`);
+        }
+
+        function countLitBoxesInCurrentVideoFrame() {
+            canvasCtx.drawImage(mediaElement, 0, 0);
+            const imageData = canvasCtx.getImageData(0, 0, videoWidth, videoHeight);
+            const lights = boxesX.map(boxX => isPixelLit(readPixel(imageData, boxX, boxesY)));
+            let litBoxes = 0;
+            for (let i = 0; i < lights.length; i++) {
+                if (lights[i])
+                    litBoxes++;
+            }
+            for (let i = litBoxes; i < lights.length; i++) {
+                assert_false(lights[i], 'After the first non-lit box, all boxes must be non-lit');
+            }
+            return litBoxes;
+        }
+
+        await waitForPlayerToReachTimePromise(mediaElement, 2.5);
+        await appendBufferPromise(audioSb, audioData);
+        mediaSource.endOfStream();
+
+        function onTimeUpdate() {
+            const graceTime = 0.5;
+            if (mediaElement.currentTime >= (1 + boxesLitSoFar) * boxInterval + graceTime && boxesLitSoFar < totalBoxes) {
+                assert_equals(countLitBoxesInCurrentVideoFrame(), boxesLitSoFar + 1, "Num of lit boxes:");
+                boxesLitSoFar++;
+                checkNoteBeingPlayed();
+            }
+        }
+
+        function onEnded() {
+            assert_equals(boxesLitSoFar, totalBoxes, "Boxes lit at video ended event");
+            test.done();
+        }
+    }, "Test the expected frames are played at the expected times, even in presence of reappends");
+</script>
+</body>
+</html>
diff --git a/LayoutTests/imported/w3c/web-platform-tests/media-source/mediasource-correct-frames-expected.txt b/LayoutTests/imported/w3c/web-platform-tests/media-source/mediasource-correct-frames-expected.txt
new file mode 100644 (file)
index 0000000..4723bf2
--- /dev/null
@@ -0,0 +1,3 @@
+
+PASS Test the expected frames are played at the expected times 
+
diff --git a/LayoutTests/imported/w3c/web-platform-tests/media-source/mediasource-correct-frames.html b/LayoutTests/imported/w3c/web-platform-tests/media-source/mediasource-correct-frames.html
new file mode 100644 (file)
index 0000000..4ef3f46
--- /dev/null
@@ -0,0 +1,146 @@
+<!DOCTYPE html>
+<!-- Copyright © 2019 Igalia. -->
+<html>
+<head>
+    <title>Frame checking test for simple MSE playback.</title>
+    <meta name="timeout" content="long">
+    <meta charset="UTF-8">
+    <link rel="author" title="Alicia Boya García" href="mailto:aboya@igalia.com">
+    <script src="/resources/testharness.js"></script>
+    <script src="/resources/testharnessreport.js"></script>
+    <script src="mediasource-util.js"></script>
+</head>
+<body>
+<div id="log"></div>
+<canvas id="test-canvas"></canvas>
+<script>
+    function waitForEventPromise(element, event) {
+        return new Promise(resolve => {
+            function handler(ev) {
+                element.removeEventListener(event, handler);
+                resolve(ev);
+            }
+            element.addEventListener(event, handler);
+        });
+    }
+
+    function appendBufferPromise(sourceBuffer, data) {
+        sourceBuffer.appendBuffer(data);
+        return waitForEventPromise(sourceBuffer, "update");
+    }
+
+    function readPixel(imageData, x, y) {
+        return {
+            r: imageData.data[4 * (y * imageData.width + x)],
+            g: imageData.data[1 + 4 * (y * imageData.width + x)],
+            b: imageData.data[2 + 4 * (y * imageData.width + x)],
+            a: imageData.data[3 + 4 * (y * imageData.width + x)],
+        };
+    }
+
+    function isPixelLit(pixel) {
+        const threshold = 200; // out of 255
+        return pixel.r >= threshold && pixel.g >= threshold && pixel.b >= threshold;
+    }
+
+    // The test video shows a row of gray boxes. Every box interval (1 second), a new box is lit white and a different
+    // note is played. This test makes sure the right number of lit boxes and the right note are played at the right time.
+    const totalBoxes = 7;
+    const boxInterval = 1; // seconds
+
+    const videoWidth = 320;
+    const videoHeight = 240;
+    const boxesY = 210;
+    const boxSide = 20;
+    const boxMargin = 20;
+    const allBoxesWidth = totalBoxes * boxSide + (totalBoxes - 1) * boxMargin;
+    const boxesX = new Array(totalBoxes).fill(undefined)
+        .map((_, i) => (videoWidth - allBoxesWidth) / 2 + boxSide / 2 + i * (boxSide + boxMargin));
+
+    // Sound starts playing A4 (440 Hz) and goes one chromatic note up with every box lit.
+    // By comparing the player position to both the amount of boxes lit and the note played we can detect A/V
+    // synchronization issues automatically.
+    const noteFrequencies = new Array(1 + totalBoxes).fill(undefined)
+        .map((_, i) => 440 * Math.pow(Math.pow(2, 1 / 12), i));
+
+    // We also check the first second [0, 1) where no boxes are lit, therefore we start counting at -1 to do the check
+    // for zero lit boxes.
+    let boxesLitSoFar = -1;
+
+    mediasource_test(async function (test, mediaElement, mediaSource) {
+        const canvas = document.getElementById("test-canvas");
+        const canvasCtx = canvas.getContext("2d");
+        canvas.width = videoWidth;
+        canvas.height = videoHeight;
+
+        const videoData = await (await fetch("mp4/test-boxes-video.mp4")).arrayBuffer();
+        const audioData = await (await fetch("mp4/test-boxes-audio.mp4")).arrayBuffer();
+
+        const videoSb = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.4d401f"');
+        const audioSb = mediaSource.addSourceBuffer('audio/mp4; codecs="mp4a.40.2"');
+
+        mediaElement.addEventListener('error', test.unreached_func("Unexpected event 'error'"));
+        mediaElement.addEventListener('ended', onEnded);
+        mediaElement.addEventListener('timeupdate', onTimeUpdate);
+
+        await appendBufferPromise(videoSb, videoData);
+        await appendBufferPromise(audioSb, audioData);
+        mediaSource.endOfStream();
+        mediaElement.play();
+
+        const audioCtx = new (window.AudioContext || window.webkitAudioContext)();
+        const source = audioCtx.createMediaElementSource(mediaElement);
+        const analyser = audioCtx.createAnalyser();
+        analyser.fftSize = 8192;
+        source.connect(analyser);
+        analyser.connect(audioCtx.destination);
+
+        const freqDomainArray = new Float32Array(analyser.frequencyBinCount);
+
+        function checkNoteBeingPlayed() {
+            const expectedNoteFrequency = noteFrequencies[boxesLitSoFar];
+
+            analyser.getFloatFrequencyData(freqDomainArray);
+            const maxBin = freqDomainArray.reduce((prev, curValue, i) =>
+                curValue > prev.value ? {index: i, value: curValue} : prev,
+                {index: -1, value: -Infinity});
+            const binFrequencyWidth = audioCtx.sampleRate / analyser.fftSize;
+            const binFreq = maxBin.index * binFrequencyWidth;
+
+            assert_true(Math.abs(expectedNoteFrequency - binFreq) <= binFrequencyWidth,
+                `The note being played matches the expected one (boxes lit: ${boxesLitSoFar}, ${expectedNoteFrequency.toFixed(1)} Hz)` +
+                `, found ~${binFreq.toFixed(1)} Hz`);
+        }
+
+        function countLitBoxesInCurrentVideoFrame() {
+            canvasCtx.drawImage(mediaElement, 0, 0);
+            const imageData = canvasCtx.getImageData(0, 0, videoWidth, videoHeight);
+            const lights = boxesX.map(boxX => isPixelLit(readPixel(imageData, boxX, boxesY)));
+            let litBoxes = 0;
+            for (let i = 0; i < lights.length; i++) {
+                if (lights[i])
+                    litBoxes++;
+            }
+            for (let i = litBoxes; i < lights.length; i++) {
+                assert_false(lights[i], 'After the first non-lit box, all boxes must be non-lit');
+            }
+            return litBoxes;
+        }
+
+        function onTimeUpdate() {
+            const graceTime = 0.5;
+            if (mediaElement.currentTime >= (1 + boxesLitSoFar) * boxInterval + graceTime && boxesLitSoFar < totalBoxes) {
+                assert_equals(countLitBoxesInCurrentVideoFrame(), boxesLitSoFar + 1, "Num of lit boxes:");
+                boxesLitSoFar++;
+                checkNoteBeingPlayed();
+            }
+        }
+
+        function onEnded() {
+            assert_equals(boxesLitSoFar, totalBoxes, "Boxes lit at video ended event");
+            test.done();
+        }
+    }, "Test the expected frames are played at the expected times");
+</script>
+</body>
+</html>
diff --git a/LayoutTests/imported/w3c/web-platform-tests/media-source/mp4/test-boxes-audio.mp4 b/LayoutTests/imported/w3c/web-platform-tests/media-source/mp4/test-boxes-audio.mp4
new file mode 100644 (file)
index 0000000..b1cabbf
Binary files /dev/null and b/LayoutTests/imported/w3c/web-platform-tests/media-source/mp4/test-boxes-audio.mp4 differ
diff --git a/LayoutTests/imported/w3c/web-platform-tests/media-source/mp4/test-boxes-video.mp4 b/LayoutTests/imported/w3c/web-platform-tests/media-source/mp4/test-boxes-video.mp4
new file mode 100644 (file)
index 0000000..714c17c
Binary files /dev/null and b/LayoutTests/imported/w3c/web-platform-tests/media-source/mp4/test-boxes-video.mp4 differ
index 1cbe105..29246ea 100644 (file)
@@ -1798,3 +1798,6 @@ webkit.org/b/195466 imported/w3c/web-platform-tests/html/semantics/embedded-cont
 
 # <rdar://problem/45595702>
 webgl/2.0.0/conformance/glsl/bugs/conditional-discard-in-loop.html [ Skip ]
+
+webkit.org/b/153588 imported/w3c/web-platform-tests/media-source/mediasource-correct-frames.html [ Skip ]
+webkit.org/b/153588 imported/w3c/web-platform-tests/media-source/mediasource-correct-frames-after-reappend.html [ Skip ]
index db7cd2f..b6335a7 100644 (file)
@@ -1,3 +1,30 @@
+2019-03-22  Alicia Boya García  <aboya@igalia.com>
+
+        [MSE][GStreamer] Don't construct segments on PlaybackPipeline::flush
+        https://bugs.webkit.org/show_bug.cgi?id=195867
+
+        Reviewed by Xabier Rodriguez-Calvar.
+
+        The previous approach did not really work for flushes on only one
+        branch, as setting reset-time in FLUSH_STOP affects the running time
+        of the entire pipeline, causing timing issues in the other branch.
+
+        Since it's preferable not to interfere with the other branch if
+        possible, setting reset-time to FALSE fixes that problem.
+
+        Also, it's not necessary to fabricate a segment. Since we are not
+        seeking, only the base needs to be adjusted, and gstbasesrc already
+        handles this correctly by default.
+
+        This fixes an audio/video synchronization bug in YouTube when some
+        automatic quality changes occur.
+
+        Tests: imported/w3c/web-platform-tests/media-source/mediasource-correct-frames-after-reappend.html
+               imported/w3c/web-platform-tests/media-source/mediasource-correct-frames.html
+
+        * platform/graphics/gstreamer/mse/PlaybackPipeline.cpp:
+        (WebCore::PlaybackPipeline::flush):
+
 2019-03-22  Frederic Wang  <fwang@igalia.com>
 
         Move implementation of mathsize to a single place
index 334b36f..8ab025a 100644 (file)
@@ -318,46 +318,16 @@ void PlaybackPipeline::flush(AtomicString trackId)
     GST_TRACE("Position: %" GST_TIME_FORMAT, GST_TIME_ARGS(position));
 
     if (static_cast<guint64>(position) == GST_CLOCK_TIME_NONE) {
-        GST_TRACE("Can't determine position, avoiding flush");
+        GST_DEBUG("Can't determine position, avoiding flush");
         return;
     }
 
-    double rate;
-    GstFormat format;
-    gint64 start = GST_CLOCK_TIME_NONE;
-    gint64 stop = GST_CLOCK_TIME_NONE;
-
-    query = adoptGRef(gst_query_new_segment(GST_FORMAT_TIME));
-    if (gst_element_query(pipeline(), query.get()))
-        gst_query_parse_segment(query.get(), &rate, &format, &start, &stop);
-
-    GST_TRACE("segment: [%" GST_TIME_FORMAT ", %" GST_TIME_FORMAT "], rate: %f",
-        GST_TIME_ARGS(start), GST_TIME_ARGS(stop), rate);
-
     if (!gst_element_send_event(GST_ELEMENT(appsrc), gst_event_new_flush_start())) {
         GST_WARNING("Failed to send flush-start event for trackId=%s", trackId.string().utf8().data());
-        return;
     }
 
-    if (!gst_element_send_event(GST_ELEMENT(appsrc), gst_event_new_flush_stop(true))) {
+    if (!gst_element_send_event(GST_ELEMENT(appsrc), gst_event_new_flush_stop(false))) {
         GST_WARNING("Failed to send flush-stop event for trackId=%s", trackId.string().utf8().data());
-        return;
-    }
-
-    if (static_cast<guint64>(position) == GST_CLOCK_TIME_NONE || static_cast<guint64>(start) == GST_CLOCK_TIME_NONE)
-        return;
-
-    GUniquePtr<GstSegment> segment(gst_segment_new());
-    gst_segment_init(segment.get(), GST_FORMAT_TIME);
-    gst_segment_do_seek(segment.get(), rate, GST_FORMAT_TIME, GST_SEEK_FLAG_NONE,
-        GST_SEEK_TYPE_SET, position, GST_SEEK_TYPE_SET, stop, nullptr);
-
-    GST_TRACE("Sending new seamless segment: [%" GST_TIME_FORMAT ", %" GST_TIME_FORMAT "], rate: %f",
-        GST_TIME_ARGS(segment->start), GST_TIME_ARGS(segment->stop), segment->rate);
-
-    if (!gst_base_src_new_seamless_segment(GST_BASE_SRC(appsrc), segment->start, segment->stop, segment->start)) {
-        GST_WARNING("Failed to send seamless segment event for trackId=%s", trackId.string().utf8().data());
-        return;
     }
 
     GST_DEBUG("trackId=%s flushed", trackId.string().utf8().data());