[GStreamer] Refactor media player to use MediaTime consistently
author    eocanha@igalia.com <eocanha@igalia.com@268f45cc-cd09-0410-ab3c-d52691b4dbfc>
Fri, 29 Sep 2017 10:22:12 +0000 (10:22 +0000)
committer eocanha@igalia.com <eocanha@igalia.com@268f45cc-cd09-0410-ab3c-d52691b4dbfc>
Fri, 29 Sep 2017 10:22:12 +0000 (10:22 +0000)
https://bugs.webkit.org/show_bug.cgi?id=174817

Reviewed by Xabier Rodriguez-Calvar.

Make consistent use of the MediaTime class in the GStreamer media
player implementations.

This patch was authored by Charlie Turner <cturner@igalia.com>, with
some minor modifications by Enrique: migration of m_cachedPosition,
use of m_seekTime as MediaTime in the MSE player, and more logging
using toString().

Covered by existing tests.

* platform/graphics/gstreamer/GStreamerUtilities.cpp:
(WebCore::toGstUnsigned64Time): Scales MediaTime to the precision used
by GStreamer and returns a value compatible with GstClockTime.
(WebCore::toGstClockTime): Deleted.
* platform/graphics/gstreamer/GStreamerUtilities.h:
(WebCore::toGstClockTime): Inlined; it now takes a MediaTime and converts
it to GstClockTime.
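
For illustration, a minimal usage sketch of the new conversion helpers (the
wrapper function name and the sample values are only for the example):

    #include "GStreamerUtilities.h"
    #include <wtf/MediaTime.h>

    using namespace WebCore;

    static void demonstrateMediaTimeConversion()
    {
        // Half a second as a rational MediaTime (1/2 s) is rescaled to GST_SECOND
        // by toGstUnsigned64Time(), yielding 500000000 ns.
        GstClockTime half = toGstClockTime(MediaTime(1, 2));

        // An invalid MediaTime maps to GST_CLOCK_TIME_NONE, so callers like
        // doSeek() can pass MediaTime::invalidTime() as an open-ended stop time.
        GstClockTime none = toGstClockTime(MediaTime::invalidTime());

        (void)half;
        (void)none;
    }
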
* platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp:
Several changes to use MediaTime instead of float and to log MediaTime
values by converting them with toString(). A condensed sketch of the new
seek path follows this entry.
(WebCore::MediaPlayerPrivateGStreamer::MediaPlayerPrivateGStreamer):
(WebCore::MediaPlayerPrivateGStreamer::load):
(WebCore::MediaPlayerPrivateGStreamer::playbackPosition const):
(WebCore::MediaPlayerPrivateGStreamer::durationMediaTime const):
(WebCore::MediaPlayerPrivateGStreamer::currentMediaTime const):
(WebCore::MediaPlayerPrivateGStreamer::seek): Uses MediaTime.
(WebCore::MediaPlayerPrivateGStreamer::doSeek):
(WebCore::MediaPlayerPrivateGStreamer::updatePlaybackRate):
(WebCore::MediaPlayerPrivateGStreamer::buffered const):
(WebCore::MediaPlayerPrivateGStreamer::fillTimerFired):
(WebCore::MediaPlayerPrivateGStreamer::maxMediaTimeSeekable const):
(WebCore::MediaPlayerPrivateGStreamer::maxTimeLoaded const):
(WebCore::MediaPlayerPrivateGStreamer::didLoadingProgress const):
(WebCore::MediaPlayerPrivateGStreamer::asyncStateChangeDone):
(WebCore::MediaPlayerPrivateGStreamer::updateStates):
(WebCore::MediaPlayerPrivateGStreamer::didEnd):
(WebCore::MediaPlayerPrivateGStreamer::durationChanged):
(WebCore::MediaPlayerPrivateGStreamer::maxTimeSeekable const): Deleted.
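
In condensed form (error handling, live-stream checks and the overlapping-seek
bookkeeping omitted), the reworked seek path keeps everything in MediaTime and
only drops down to GstClockTime at the GStreamer call:

    void MediaPlayerPrivateGStreamer::seek(const MediaTime& mediaTime)
    {
        // Avoid useless seeking.
        if (mediaTime == currentMediaTime())
            return;

        // Clamp the target to the media duration; no float conversions anywhere.
        MediaTime time = std::min(mediaTime, durationMediaTime());
        if (doSeek(time, m_player->rate(), static_cast<GstSeekFlags>(GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_ACCURATE)))
            m_seekTime = time;
    }

    bool MediaPlayerPrivateGStreamer::doSeek(const MediaTime& position, float rate, GstSeekFlags seekType)
    {
        // Negative-rate handling omitted; only the final GStreamer call converts.
        MediaTime startTime = position, endTime = MediaTime::invalidTime();
        return gst_element_seek(m_pipeline.get(), rate, GST_FORMAT_TIME, seekType,
            GST_SEEK_TYPE_SET, toGstClockTime(startTime), GST_SEEK_TYPE_SET, toGstClockTime(endTime));
    }
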
* platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h: Changed
seek(), playbackPosition(), m_cachedPosition, m_durationAtEOS,
m_seekTime and m_timeOfOverlappingSeek to be of MediaTime type.
* platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.h:
Prefer the methods based on MediaTime over those based on float/double.
(WebCore::MediaPlayerPrivateGStreamerBase::maxTimeLoaded const):
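
The delegation pattern added to the base class keeps the MediaTime-based
methods as the single source of truth; excerpted (slightly trimmed) from the
header changes:

    // Float/double entry points forward to the MediaTime-based methods, which
    // subclasses override; the base class itself just reports zero times.
    float duration() const override { return durationMediaTime().toFloat(); }
    double durationDouble() const override { return durationMediaTime().toDouble(); }
    MediaTime durationMediaTime() const override { return MediaTime::zeroTime(); }
    void seek(float time) override { seek(MediaTime::createWithFloat(time)); }
    void seek(const MediaTime&) override { }
    float maxTimeSeekable() const override { return maxMediaTimeSeekable().toFloat(); }
    MediaTime maxMediaTimeSeekable() const override { return MediaTime::zeroTime(); }
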
* platform/graphics/gstreamer/mse/GStreamerMediaSample.cpp:
(WebCore::GStreamerMediaSample::offsetTimestampsBy): toGstClockTime()
can now take a MediaTime directly.
* platform/graphics/gstreamer/mse/MediaPlayerPrivateGStreamerMSE.cpp:
Several changes to use MediaTime instead of float and to log MediaTime
values by converting them with toString(). A simplified excerpt of the new
seek-target adjustment in doSeek() follows this entry.
(WebCore::MediaPlayerPrivateGStreamerMSE::seek):
(WebCore::MediaPlayerPrivateGStreamerMSE::notifySeekNeedsDataForTime):
(WebCore::MediaPlayerPrivateGStreamerMSE::doSeek):
(WebCore::MediaPlayerPrivateGStreamerMSE::maybeFinishSeek):
(WebCore::MediaPlayerPrivateGStreamerMSE::isTimeBuffered const):
(WebCore::MediaPlayerPrivateGStreamerMSE::durationChanged):
(WebCore::MediaPlayerPrivateGStreamerMSE::currentMediaTime const):
(WebCore::MediaPlayerPrivateGStreamerMSE::maxTimeSeekable const):
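
In simplified form (nesting collapsed), the seek-target adjustment in
MediaPlayerPrivateGStreamerMSE::doSeek() now works entirely in MediaTime,
replacing the 0.1-second float literal with the rational MediaTime(1, 10):

    if (!isTimeBuffered(seekTime) && m_mediaSource) {
        // If a near-future time (< 0.1 s away) is buffered, retarget the seek to it.
        const MediaTime miniGap = MediaTime(1, 10);
        MediaTime nearest = m_mediaSource->buffered()->nearest(seekTime);
        if (nearest.isValid() && nearest > seekTime && (nearest - seekTime) <= miniGap
            && isTimeBuffered(nearest + miniGap))
            seekTime = nearest;
    }
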
* platform/graphics/gstreamer/mse/MediaPlayerPrivateGStreamerMSE.h:
seek() now takes a MediaTime argument.
* platform/graphics/gstreamer/mse/PlaybackPipeline.cpp:
(WebCore::PlaybackPipeline::enqueueSample): Use the MediaTime values from
the MediaSample directly (no float conversion) when logging.
* platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamer.cpp:
(webKitMediaSrcQueryWithParent): Use the MediaTime value returned by
durationMediaTime() when answering duration queries.
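
A simplified excerpt of the GST_FORMAT_TIME branch (null checks omitted),
which now stays in MediaTime until the GStreamer boundary:

    case GST_FORMAT_TIME: {
        MediaTime duration = source->priv->mediaPlayerPrivate->durationMediaTime();
        if (duration > MediaTime::zeroTime()) {
            // Convert to GstClockTime only at the point where GStreamer needs it.
            gst_query_set_duration(query, format, WebCore::toGstClockTime(duration));
            result = TRUE;
        }
        break;
    }
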

git-svn-id: http://svn.webkit.org/repository/webkit/trunk@222649 268f45cc-cd09-0410-ab3c-d52691b4dbfc

12 files changed:
Source/WebCore/ChangeLog
Source/WebCore/platform/graphics/gstreamer/GStreamerUtilities.cpp
Source/WebCore/platform/graphics/gstreamer/GStreamerUtilities.h
Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp
Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h
Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.h
Source/WebCore/platform/graphics/gstreamer/mse/AppendPipeline.cpp
Source/WebCore/platform/graphics/gstreamer/mse/GStreamerMediaSample.cpp
Source/WebCore/platform/graphics/gstreamer/mse/MediaPlayerPrivateGStreamerMSE.cpp
Source/WebCore/platform/graphics/gstreamer/mse/MediaPlayerPrivateGStreamerMSE.h
Source/WebCore/platform/graphics/gstreamer/mse/PlaybackPipeline.cpp
Source/WebCore/platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamer.cpp

index 5b9e1c4..bbff911 100644 (file)
@@ -1,3 +1,77 @@
+2017-09-29  Enrique Ocaña González  <eocanha@igalia.com>
+
+        [GStreamer] Refactor media player to use MediaTime consistently
+        https://bugs.webkit.org/show_bug.cgi?id=174817
+
+        Reviewed by Xabier Rodriguez-Calvar.
+
+        Make consistent use of the MediaTime class in the GStreamer media
+        player implementations.
+
+        This patch was authored by Charlie Turner <cturner@igalia.com>, with
+        some minor modifications by Enrique: migration of m_cachedPosition,
+        use of m_seekTime as MediaTime in the MSE player, and more logging
+        using toString().
+
+        Covered by existing tests.
+
+        * platform/graphics/gstreamer/GStreamerUtilities.cpp:
+        (WebCore::toGstUnsigned64Time): Scales MediaTime to the precision used
+        by GStreamer and returns a value compatible with GstClockTime.
+        (WebCore::toGstClockTime): Deleted.
+        * platform/graphics/gstreamer/GStreamerUtilities.h:
+        (WebCore::toGstClockTime): Inlined; it now takes a MediaTime and converts
+        it to GstClockTime.
+        * platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp:
+        Several changes to use MediaTime instead of float and to log MediaTime
+        values by converting them with toString().
+        (WebCore::MediaPlayerPrivateGStreamer::MediaPlayerPrivateGStreamer):
+        (WebCore::MediaPlayerPrivateGStreamer::load):
+        (WebCore::MediaPlayerPrivateGStreamer::playbackPosition const):
+        (WebCore::MediaPlayerPrivateGStreamer::durationMediaTime const):
+        (WebCore::MediaPlayerPrivateGStreamer::currentMediaTime const):
+        (WebCore::MediaPlayerPrivateGStreamer::seek): Uses MediaTime.
+        (WebCore::MediaPlayerPrivateGStreamer::doSeek):
+        (WebCore::MediaPlayerPrivateGStreamer::updatePlaybackRate):
+        (WebCore::MediaPlayerPrivateGStreamer::buffered const):
+        (WebCore::MediaPlayerPrivateGStreamer::fillTimerFired):
+        (WebCore::MediaPlayerPrivateGStreamer::maxMediaTimeSeekable const):
+        (WebCore::MediaPlayerPrivateGStreamer::maxTimeLoaded const):
+        (WebCore::MediaPlayerPrivateGStreamer::didLoadingProgress const):
+        (WebCore::MediaPlayerPrivateGStreamer::asyncStateChangeDone):
+        (WebCore::MediaPlayerPrivateGStreamer::updateStates):
+        (WebCore::MediaPlayerPrivateGStreamer::didEnd):
+        (WebCore::MediaPlayerPrivateGStreamer::durationChanged):
+        (WebCore::MediaPlayerPrivateGStreamer::maxTimeSeekable const): Deleted.
+        * platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h: Changed
+        seek(), playbackPosition(), m_cachedPosition, m_durationAtEOS,
+        m_seekTime and m_timeOfOverlappingSeek to be of MediaTime type.
+        * platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.h:
+        Prefer the methods based on MediaTime over those based on float/double.
+        (WebCore::MediaPlayerPrivateGStreamerBase::maxTimeLoaded const):
+        * platform/graphics/gstreamer/mse/GStreamerMediaSample.cpp:
+        (WebCore::GStreamerMediaSample::offsetTimestampsBy): toGstClockTime()
+        can now take a MediaTime directly.
+        * platform/graphics/gstreamer/mse/MediaPlayerPrivateGStreamerMSE.cpp:
+        Several changes to use MediaTime instead of float and to log MediaTime
+        values by converting them with toString().
+        (WebCore::MediaPlayerPrivateGStreamerMSE::seek):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::notifySeekNeedsDataForTime):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::doSeek):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::maybeFinishSeek):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::isTimeBuffered const):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::durationChanged):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::currentMediaTime const):
+        (WebCore::MediaPlayerPrivateGStreamerMSE::maxTimeSeekable const):
+        * platform/graphics/gstreamer/mse/MediaPlayerPrivateGStreamerMSE.h:
+        seek() now takes a MediaTime argument.
+        * platform/graphics/gstreamer/mse/PlaybackPipeline.cpp:
+        (WebCore::PlaybackPipeline::enqueueSample): Use the MediaTime values
+        from the MediaSample directly (no float conversion) when logging.
+        * platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamer.cpp:
+        (webKitMediaSrcQueryWithParent): Use the MediaTime value returned by
+        durationMediaTime() when answering duration queries.
+
 2017-09-28  Sam Weinig  <sam@webkit.org>
 
         Re-write Settings generation in python for some reason
index 1854e36..5d2bc25 100644 (file)
@@ -28,7 +28,6 @@
 
 #include <gst/audio/audio-info.h>
 #include <gst/gst.h>
-#include <wtf/MathExtras.h>
 #include <wtf/glib/GUniquePtr.h>
 
 #if ENABLE(VIDEO_TRACK) && USE(GSTREAMER_MPEGTS)
@@ -175,16 +174,15 @@ unsigned getGstPlayFlag(const char* nick)
     return flag->value;
 }
 
-GstClockTime toGstClockTime(float time)
+// Convert a MediaTime in seconds to a GstClockTime. Note that we can get MediaTime objects with a time scale that isn't a GST_SECOND, since they can come to
+// us through the internal testing API, the DOM and internally. It would be nice to assert the format of the incoming time, but all the media APIs assume time
+// is passed around in fractional seconds, so we'll just have to assume the same.
+uint64_t toGstUnsigned64Time(const MediaTime& mediaTime)
 {
-    // Extract the integer part of the time (seconds) and the fractional part (microseconds). Attempt to
-    // round the microseconds so no floating point precision is lost and we can perform an accurate seek.
-    float seconds;
-    float microSeconds = modff(time, &seconds) * 1000000;
-    GTimeVal timeValue;
-    timeValue.tv_sec = static_cast<glong>(seconds);
-    timeValue.tv_usec = static_cast<glong>(floor(microSeconds + 0.5));
-    return GST_TIMEVAL_TO_TIME(timeValue);
+    MediaTime time = mediaTime.toTimeScale(GST_SECOND);
+    if (time.isInvalid())
+        return GST_CLOCK_TIME_NONE;
+    return time.timeValue();
 }
 
 bool gstRegistryHasElementForMediaType(GList* elementFactories, const char* capsString)
index 933bd1d..fef6b83 100644 (file)
@@ -23,6 +23,7 @@
 #include <gst/gst.h>
 #include <gst/video/video-format.h>
 #include <gst/video/video-info.h>
+#include <wtf/MediaTime.h>
 
 namespace WebCore {
 
@@ -61,6 +62,12 @@ void mapGstBuffer(GstBuffer*, uint32_t);
 void unmapGstBuffer(GstBuffer*);
 bool initializeGStreamer();
 unsigned getGstPlayFlag(const char* nick);
-GstClockTime toGstClockTime(float time);
+uint64_t toGstUnsigned64Time(const MediaTime&);
+
+inline GstClockTime toGstClockTime(const MediaTime &mediaTime)
+{
+    return static_cast<GstClockTime>(toGstUnsigned64Time(mediaTime));
+}
+
 bool gstRegistryHasElementForMediaType(GList* elementFactories, const char* capsString);
 }
index 54b95e3..f38e637 100644 (file)
@@ -45,6 +45,7 @@
 #include <wtf/HexNumber.h>
 #include <wtf/MediaTime.h>
 #include <wtf/NeverDestroyed.h>
+#include <wtf/StringPrintStream.h>
 #include <wtf/glib/GUniquePtr.h>
 #include <wtf/glib/RunLoopSourcePriority.h>
 #include <wtf/text/CString.h>
@@ -130,26 +131,26 @@ MediaPlayerPrivateGStreamer::MediaPlayerPrivateGStreamer(MediaPlayer* player)
     , m_errorOccured(false)
     , m_isEndReached(false)
     , m_isStreaming(false)
-    , m_durationAtEOS(0)
+    , m_durationAtEOS(MediaTime::invalidTime())
     , m_paused(true)
     , m_playbackRate(1)
     , m_requestedState(GST_STATE_VOID_PENDING)
     , m_resetPipeline(false)
     , m_seeking(false)
     , m_seekIsPending(false)
-    , m_seekTime(0)
+    , m_seekTime(MediaTime::invalidTime())
     , m_source(nullptr)
     , m_volumeAndMuteInitialized(false)
     , m_mediaLocations(nullptr)
     , m_mediaLocationCurrentIndex(0)
     , m_playbackRatePause(false)
-    , m_timeOfOverlappingSeek(-1)
+    , m_timeOfOverlappingSeek(MediaTime::invalidTime())
     , m_lastPlaybackRate(1)
     , m_fillTimer(*this, &MediaPlayerPrivateGStreamer::fillTimerFired)
-    , m_maxTimeLoaded(0)
+    , m_maxTimeLoaded(MediaTime::zeroTime())
     , m_preload(player->preload())
     , m_delayingLoad(false)
-    , m_maxTimeLoadedAtLastDidLoadingProgress(0)
+    , m_maxTimeLoadedAtLastDidLoadingProgress(MediaTime::zeroTime())
     , m_hasVideo(false)
     , m_hasAudio(false)
     , m_readyTimerHandler(RunLoop::main(), this, &MediaPlayerPrivateGStreamer::readyTimerFired)
@@ -260,7 +261,7 @@ void MediaPlayerPrivateGStreamer::load(const String& urlString)
     m_readyState = MediaPlayer::HaveNothing;
     m_player->readyStateChanged();
     m_volumeAndMuteInitialized = false;
-    m_durationAtEOS = 0;
+    m_durationAtEOS = MediaTime::invalidTime();
 
     if (!m_delayingLoad)
         commitLoad();
@@ -295,7 +296,7 @@ void MediaPlayerPrivateGStreamer::commitLoad()
     updateStates();
 }
 
-double MediaPlayerPrivateGStreamer::playbackPosition() const
+MediaTime MediaPlayerPrivateGStreamer::playbackPosition() const
 {
     if (m_isEndReached) {
         // Position queries on a null pipeline return 0. If we're at
@@ -305,10 +306,8 @@ double MediaPlayerPrivateGStreamer::playbackPosition() const
         if (m_seeking)
             return m_seekTime;
 
-        MediaTime mediaDuration = durationMediaTime();
-        if (mediaDuration)
-            return mediaDuration.toDouble();
-        return 0;
+        MediaTime duration = durationMediaTime();
+        return duration.isInvalid() ? MediaTime::zeroTime() : duration;
     }
 
     // Position is only available if no async state change is going on and the state is either paused or playing.
@@ -320,15 +319,14 @@ double MediaPlayerPrivateGStreamer::playbackPosition() const
 
     GST_DEBUG("Position %" GST_TIME_FORMAT, GST_TIME_ARGS(position));
 
-    double result = 0.0f;
-    if (static_cast<GstClockTime>(position) != GST_CLOCK_TIME_NONE) {
-        GTimeVal timeValue;
-        GST_TIME_TO_TIMEVAL(position, timeValue);
-        result = static_cast<double>(timeValue.tv_sec + (timeValue.tv_usec / 1000000.0));
-    } else if (m_canFallBackToLastFinishedSeekPosition)
-        result = m_seekTime;
+    MediaTime playbackPosition = MediaTime::zeroTime();
+    GstClockTime gstreamerPosition = static_cast<GstClockTime>(position);
+    if (GST_CLOCK_TIME_IS_VALID(gstreamerPosition))
+        playbackPosition = MediaTime(gstreamerPosition, GST_SECOND);
+    else if (m_canFallBackToLastFinishedSeekPosition)
+        playbackPosition = m_seekTime;
 
-    return result;
+    return playbackPosition;
 }
 
 void MediaPlayerPrivateGStreamer::readyTimerFired()
@@ -417,44 +415,36 @@ void MediaPlayerPrivateGStreamer::pause()
 
 MediaTime MediaPlayerPrivateGStreamer::durationMediaTime() const
 {
-    if (!m_pipeline)
-        return { };
-
-    if (m_errorOccured)
-        return { };
+    if (!m_pipeline || m_errorOccured)
+        return MediaTime::invalidTime();
 
-    if (m_durationAtEOS)
-        return MediaTime::createWithDouble(m_durationAtEOS);
+    if (m_durationAtEOS.isValid())
+        return m_durationAtEOS;
 
     // The duration query would fail on a not-prerolled pipeline.
     if (GST_STATE(m_pipeline.get()) < GST_STATE_PAUSED)
-        return { };
+        return MediaTime::invalidTime();
 
-    GstFormat timeFormat = GST_FORMAT_TIME;
     gint64 timeLength = 0;
 
-    bool failure = !gst_element_query_duration(m_pipeline.get(), timeFormat, &timeLength) || static_cast<guint64>(timeLength) == GST_CLOCK_TIME_NONE;
-    if (failure) {
+    if (!gst_element_query_duration(m_pipeline.get(), GST_FORMAT_TIME, &timeLength) || !GST_CLOCK_TIME_IS_VALID(timeLength)) {
         GST_DEBUG("Time duration query failed for %s", m_url.string().utf8().data());
         return MediaTime::positiveInfiniteTime();
     }
 
     GST_DEBUG("Duration: %" GST_TIME_FORMAT, GST_TIME_ARGS(timeLength));
 
-    return MediaTime::createWithDouble(static_cast<double>(timeLength) / GST_SECOND);
+    return MediaTime(timeLength, GST_SECOND);
     // FIXME: handle 3.14.9.5 properly
 }
 
 MediaTime MediaPlayerPrivateGStreamer::currentMediaTime() const
 {
-    if (!m_pipeline)
-        return { };
-
-    if (m_errorOccured)
-        return { };
+    if (!m_pipeline || m_errorOccured)
+        return MediaTime::invalidTime();
 
     if (m_seeking)
-        return MediaTime::createWithFloat(m_seekTime);
+        return m_seekTime;
 
     // Workaround for
     // https://bugzilla.gnome.org/show_bug.cgi?id=639941 In GStreamer
@@ -462,12 +452,12 @@ MediaTime MediaPlayerPrivateGStreamer::currentMediaTime() const
     // negative playback rate. There's no upstream accepted patch for
     // this bug yet, hence this temporary workaround.
     if (m_isEndReached && m_playbackRate < 0)
-        return { };
+        return MediaTime::invalidTime();
 
-    return MediaTime::createWithDouble(playbackPosition());
+    return playbackPosition();
 }
 
-void MediaPlayerPrivateGStreamer::seek(float time)
+void MediaPlayerPrivateGStreamer::seek(const MediaTime& mediaTime)
 {
     if (!m_pipeline)
         return;
@@ -475,17 +465,18 @@ void MediaPlayerPrivateGStreamer::seek(float time)
     if (m_errorOccured)
         return;
 
-    GST_INFO("[Seek] seek attempt to %f secs", time);
+    GST_INFO("[Seek] seek attempt to %s", toString(mediaTime).utf8().data());
 
     // Avoid useless seeking.
-    if (MediaTime::createWithFloat(time) == currentMediaTime())
+    if (mediaTime == currentMediaTime())
         return;
 
+    MediaTime time = std::min(mediaTime, durationMediaTime());
+
     if (isLiveStream())
         return;
 
-    GstClockTime clockTime = toGstClockTime(time);
-    GST_INFO("[Seek] seeking to %" GST_TIME_FORMAT " (%f)", GST_TIME_ARGS(clockTime), time);
+    GST_INFO("[Seek] seeking to %s", toString(time).utf8().data());
 
     if (m_seeking) {
         m_timeOfOverlappingSeek = time;
@@ -511,8 +502,8 @@ void MediaPlayerPrivateGStreamer::seek(float time)
         }
     } else {
         // We can seek now.
-        if (!doSeek(clockTime, m_player->rate(), static_cast<GstSeekFlags>(GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_ACCURATE))) {
-            GST_DEBUG("[Seek] seeking to %f failed", time);
+        if (!doSeek(time, m_player->rate(), static_cast<GstSeekFlags>(GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_ACCURATE))) {
+            GST_DEBUG("[Seek] seeking to %s failed", toString(time).utf8().data());
             return;
         }
     }
@@ -522,24 +513,22 @@ void MediaPlayerPrivateGStreamer::seek(float time)
     m_isEndReached = false;
 }
 
-bool MediaPlayerPrivateGStreamer::doSeek(gint64 position, float rate, GstSeekFlags seekType)
+bool MediaPlayerPrivateGStreamer::doSeek(const MediaTime& position, float rate, GstSeekFlags seekType)
 {
-    gint64 startTime, endTime;
+    // Default values for rate >= 0.
+    MediaTime startTime = position, endTime = MediaTime::invalidTime();
 
     // TODO: Should do more than that, need to notify the media source
     // and probably flush the pipeline at least.
     if (isMediaSource())
         return true;
 
-    if (rate > 0) {
-        startTime = position;
-        endTime = GST_CLOCK_TIME_NONE;
-    } else {
-        startTime = 0;
+    if (rate < 0) {
+        startTime = MediaTime::zeroTime();
         // If we are at beginning of media, start from the end to
         // avoid immediate EOS.
-        if (position < 0)
-            endTime = static_cast<gint64>(durationMediaTime().toDouble() * GST_SECOND);
+        if (position < MediaTime::zeroTime())
+            endTime = durationMediaTime();
         else
             endTime = position;
     }
@@ -548,7 +537,7 @@ bool MediaPlayerPrivateGStreamer::doSeek(gint64 position, float rate, GstSeekFla
         rate = 1.0;
 
     return gst_element_seek(m_pipeline.get(), rate, GST_FORMAT_TIME, seekType,
-        GST_SEEK_TYPE_SET, startTime, GST_SEEK_TYPE_SET, endTime);
+        GST_SEEK_TYPE_SET, toGstClockTime(startTime), GST_SEEK_TYPE_SET, toGstClockTime(endTime));
 }
 
 void MediaPlayerPrivateGStreamer::updatePlaybackRate()
@@ -556,23 +545,14 @@ void MediaPlayerPrivateGStreamer::updatePlaybackRate()
     if (!m_changingRate)
         return;
 
-    float currentPosition = static_cast<float>(playbackPosition() * GST_SECOND);
-    bool mute = false;
-
     GST_INFO("Set Rate to %f", m_playbackRate);
 
-    if (m_playbackRate > 0) {
-        // Mute the sound if the playback rate is too extreme and
-        // audio pitch is not adjusted.
-        mute = (!m_preservesPitch && (m_playbackRate < 0.8 || m_playbackRate > 2));
-    } else {
-        if (currentPosition == 0.0f)
-            currentPosition = -1.0f;
-        mute = true;
-    }
+    // Mute the sound if the playback rate is negative or too extreme and audio pitch is not adjusted.
+    bool mute = m_playbackRate <= 0 || (!m_preservesPitch && (m_playbackRate < 0.8 || m_playbackRate > 2));
 
-    GST_INFO("Need to mute audio?: %d", (int) mute);
-    if (doSeek(currentPosition, m_playbackRate, static_cast<GstSeekFlags>(GST_SEEK_FLAG_FLUSH))) {
+    GST_INFO(mute ? "Need to mute audio" : "Do not need to mute audio");
+
+    if (doSeek(playbackPosition(), m_playbackRate, static_cast<GstSeekFlags>(GST_SEEK_FLAG_FLUSH))) {
         g_object_set(m_pipeline.get(), "mute", mute, nullptr);
         m_lastPlaybackRate = m_playbackRate;
     } else {
@@ -874,8 +854,8 @@ std::unique_ptr<PlatformTimeRanges> MediaPlayerPrivateGStreamer::buffered() cons
     if (m_errorOccured || isLiveStream())
         return timeRanges;
 
-    float mediaDuration(durationMediaTime().toDouble());
-    if (!mediaDuration || std::isinf(mediaDuration))
+    MediaTime mediaDuration = durationMediaTime();
+    if (!mediaDuration || mediaDuration.isPositiveInfinite())
         return timeRanges;
 
     GstQuery* query = gst_query_new_buffering(GST_FORMAT_PERCENT);
@@ -889,15 +869,15 @@ std::unique_ptr<PlatformTimeRanges> MediaPlayerPrivateGStreamer::buffered() cons
     for (guint index = 0; index < numBufferingRanges; index++) {
         gint64 rangeStart = 0, rangeStop = 0;
         if (gst_query_parse_nth_buffering_range(query, index, &rangeStart, &rangeStop))
-            timeRanges->add(MediaTime::createWithDouble((rangeStart * mediaDuration) / GST_FORMAT_PERCENT_MAX),
-                MediaTime::createWithDouble((rangeStop * mediaDuration) / GST_FORMAT_PERCENT_MAX));
+            timeRanges->add(MediaTime(rangeStart * toGstUnsigned64Time(mediaDuration) / GST_FORMAT_PERCENT_MAX, GST_SECOND),
+                MediaTime(rangeStop * toGstUnsigned64Time(mediaDuration) / GST_FORMAT_PERCENT_MAX, GST_SECOND));
     }
 
     // Fallback to the more general maxTimeLoaded() if no range has
     // been found.
     if (!timeRanges->length())
-        if (float loaded = maxTimeLoaded())
-            timeRanges->add(MediaTime::zeroTime(), MediaTime::createWithDouble(loaded));
+        if (MediaTime loaded = maxTimeLoaded())
+            timeRanges->add(MediaTime::zeroTime(), loaded);
 
     gst_query_unref(query);
 
@@ -1257,7 +1237,7 @@ void MediaPlayerPrivateGStreamer::fillTimerFired()
 
     GST_DEBUG("[Buffering] Download buffer filled up to %f%%", fillStatus);
 
-    float mediaDuration = durationMediaTime().toDouble();
+    MediaTime mediaDuration = durationMediaTime();
 
     // Update maxTimeLoaded only if the media duration is
     // available. Otherwise we can't compute it.
@@ -1265,8 +1245,8 @@ void MediaPlayerPrivateGStreamer::fillTimerFired()
         if (fillStatus == 100.0)
             m_maxTimeLoaded = mediaDuration;
         else
-            m_maxTimeLoaded = static_cast<float>((fillStatus * mediaDuration) / 100.0);
-        GST_DEBUG("[Buffering] Updated maxTimeLoaded: %f", m_maxTimeLoaded);
+            m_maxTimeLoaded = MediaTime(fillStatus * static_cast<double>(toGstUnsigned64Time(mediaDuration)) / 100, GST_SECOND);
+        GST_DEBUG("[Buffering] Updated maxTimeLoaded: %s", toString(m_maxTimeLoaded).utf8().data());
     }
 
     m_downloadFinished = fillStatus == 100.0;
@@ -1282,29 +1262,29 @@ void MediaPlayerPrivateGStreamer::fillTimerFired()
     updateStates();
 }
 
-float MediaPlayerPrivateGStreamer::maxTimeSeekable() const
+MediaTime MediaPlayerPrivateGStreamer::maxMediaTimeSeekable() const
 {
     if (m_errorOccured)
-        return 0.0f;
+        return MediaTime::zeroTime();
 
-    float mediaDuration = durationMediaTime().toDouble();
-    GST_DEBUG("maxTimeSeekable, duration: %f", mediaDuration);
+    MediaTime duration = durationMediaTime();
+    GST_DEBUG("maxMediaTimeSeekable, duration: %s", toString(duration).utf8().data());
     // infinite duration means live stream
-    if (std::isinf(mediaDuration))
-        return 0.0f;
+    if (duration.isPositiveInfinite())
+        return MediaTime::zeroTime();
 
-    return mediaDuration;
+    return duration;
 }
 
-float MediaPlayerPrivateGStreamer::maxTimeLoaded() const
+MediaTime MediaPlayerPrivateGStreamer::maxTimeLoaded() const
 {
     if (m_errorOccured)
-        return 0.0f;
+        return MediaTime::zeroTime();
 
-    float loaded = m_maxTimeLoaded;
+    MediaTime loaded = m_maxTimeLoaded;
     if (m_isEndReached)
-        loaded = durationMediaTime().toDouble();
-    GST_DEBUG("maxTimeLoaded: %f", loaded);
+        loaded = durationMediaTime();
+    GST_DEBUG("maxTimeLoaded: %s", toString(loaded).utf8().data());
     return loaded;
 }
 
@@ -1312,7 +1292,7 @@ bool MediaPlayerPrivateGStreamer::didLoadingProgress() const
 {
     if (UNLIKELY(!m_pipeline || !durationMediaTime() || (!isMediaSource() && !totalBytes())))
         return false;
-    float currentMaxTimeLoaded = maxTimeLoaded();
+    MediaTime currentMaxTimeLoaded = maxTimeLoaded();
     bool didLoadingProgress = currentMaxTimeLoaded != m_maxTimeLoadedAtLastDidLoadingProgress;
     m_maxTimeLoadedAtLastDidLoadingProgress = currentMaxTimeLoaded;
     GST_DEBUG("didLoadingProgress: %d", didLoadingProgress);
@@ -1487,14 +1467,14 @@ void MediaPlayerPrivateGStreamer::asyncStateChangeDone()
         if (m_seekIsPending)
             updateStates();
         else {
-            GST_DEBUG("[Seek] seeked to %f", m_seekTime);
+            GST_DEBUG("[Seek] seeked to %s", toString(m_seekTime).utf8().data());
             m_seeking = false;
-            if (m_timeOfOverlappingSeek != m_seekTime && m_timeOfOverlappingSeek != -1) {
+            if (m_timeOfOverlappingSeek != m_seekTime && m_timeOfOverlappingSeek.isValid()) {
                 seek(m_timeOfOverlappingSeek);
-                m_timeOfOverlappingSeek = -1;
+                m_timeOfOverlappingSeek = MediaTime::invalidTime();
                 return;
             }
-            m_timeOfOverlappingSeek = -1;
+            m_timeOfOverlappingSeek = MediaTime::invalidTime();
 
             // The pipeline can still have a pending state. In this case a position query will fail.
             // Right now we can use m_seekTime as a fallback.
@@ -1649,11 +1629,11 @@ void MediaPlayerPrivateGStreamer::updateStates()
     if (getStateResult == GST_STATE_CHANGE_SUCCESS && state >= GST_STATE_PAUSED) {
         updatePlaybackRate();
         if (m_seekIsPending) {
-            GST_DEBUG("[Seek] committing pending seek to %f", m_seekTime);
+            GST_DEBUG("[Seek] committing pending seek to %s", toString(m_seekTime).utf8().data());
             m_seekIsPending = false;
-            m_seeking = doSeek(toGstClockTime(m_seekTime), m_player->rate(), static_cast<GstSeekFlags>(GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_ACCURATE));
+            m_seeking = doSeek(m_seekTime, m_player->rate(), static_cast<GstSeekFlags>(GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_ACCURATE));
             if (!m_seeking)
-                GST_DEBUG("[Seek] seeking to %f failed", m_seekTime);
+                GST_DEBUG("[Seek] seeking to %s failed", toString(m_seekTime).utf8().data());
         }
     }
 }
@@ -1773,7 +1753,7 @@ void MediaPlayerPrivateGStreamer::didEnd()
 
     if (!m_player->client().mediaPlayerIsLooping()) {
         m_paused = true;
-        m_durationAtEOS = durationMediaTime().toDouble();
+        m_durationAtEOS = durationMediaTime();
         changePipelineState(GST_STATE_READY);
         m_downloadFinished = false;
     }
@@ -1781,12 +1761,15 @@ void MediaPlayerPrivateGStreamer::didEnd()
 
 void MediaPlayerPrivateGStreamer::durationChanged()
 {
-    float previousDuration = durationMediaTime().toDouble();
+    MediaTime previousDuration = durationMediaTime();
+
+    // FIXME: Check if this method is still useful, because it's not doing its work at all
+    // since bug #159458 removed a cacheDuration() call here.
 
     // Avoid emiting durationchanged in the case where the previous
     // duration was 0 because that case is already handled by the
     // HTMLMediaElement.
-    if (previousDuration && durationMediaTime().toDouble() != previousDuration)
+    if (previousDuration && durationMediaTime() != previousDuration)
         m_player->durationChanged();
 }
 
index 7479487..a73a924 100644 (file)
@@ -93,7 +93,7 @@ public:
 
     MediaTime durationMediaTime() const override;
     MediaTime currentMediaTime() const override;
-    void seek(float) override;
+    void seek(const MediaTime&) override;
 
     void setRate(float) override;
     double rate() const override;
@@ -103,10 +103,10 @@ public:
     void fillTimerFired();
 
     std::unique_ptr<PlatformTimeRanges> buffered() const override;
-    float maxTimeSeekable() const override;
+    MediaTime maxMediaTimeSeekable() const override;
     bool didLoadingProgress() const override;
     unsigned long long totalBytes() const override;
-    float maxTimeLoaded() const override;
+    MediaTime maxTimeLoaded() const override;
 
     bool hasSingleSecurityOrigin() const override;
 
@@ -143,7 +143,7 @@ private:
 
     GstElement* createAudioSink() override;
 
-    double playbackPosition() const;
+    MediaTime playbackPosition() const;
 
     virtual void updateStates();
     virtual void asyncStateChangeDone();
@@ -162,7 +162,7 @@ private:
     void processTableOfContents(GstMessage*);
     void processTableOfContentsEntry(GstTocEntry*);
 #endif
-    virtual bool doSeek(gint64 position, float rate, GstSeekFlags seekType);
+    virtual bool doSeek(const MediaTime& position, float rate, GstSeekFlags seekType);
     virtual void updatePlaybackRate();
 
     String engineDescription() const override { return "GStreamer"; }
@@ -180,21 +180,21 @@ protected:
 
     bool m_buffering;
     int m_bufferingPercentage;
-    mutable float m_cachedPosition;
+    mutable MediaTime m_cachedPosition;
     bool m_canFallBackToLastFinishedSeekPosition;
     bool m_changingRate;
     bool m_downloadFinished;
     bool m_errorOccured;
     mutable bool m_isEndReached;
     mutable bool m_isStreaming;
-    mutable gdouble m_durationAtEOS;
+    mutable MediaTime m_durationAtEOS;
     bool m_paused;
     float m_playbackRate;
     GstState m_requestedState;
     bool m_resetPipeline;
     bool m_seeking;
     bool m_seekIsPending;
-    float m_seekTime;
+    MediaTime m_seekTime;
     GRefPtr<GstElement> m_source;
     bool m_volumeAndMuteInitialized;
 
@@ -233,13 +233,13 @@ private:
     GstStructure* m_mediaLocations;
     int m_mediaLocationCurrentIndex;
     bool m_playbackRatePause;
-    float m_timeOfOverlappingSeek;
+    MediaTime m_timeOfOverlappingSeek;
     float m_lastPlaybackRate;
     Timer m_fillTimer;
-    float m_maxTimeLoaded;
+    MediaTime m_maxTimeLoaded;
     MediaPlayer::Preload m_preload;
     bool m_delayingLoad;
-    mutable float m_maxTimeLoadedAtLastDidLoadingProgress;
+    mutable MediaTime m_maxTimeLoadedAtLastDidLoadingProgress;
     bool m_hasVideo;
     bool m_hasAudio;
     RunLoop::Timer<MediaPlayerPrivateGStreamer> m_readyTimerHandler;
index a485cf1..81d744a 100644 (file)
@@ -91,10 +91,25 @@ public:
     void setSize(const IntSize&) override;
     void sizeChanged();
 
+    // Prefer MediaTime based methods over float based.
+    float duration() const override { return durationMediaTime().toFloat(); }
+    double durationDouble() const override { return durationMediaTime().toDouble(); }
+    MediaTime durationMediaTime() const override { return MediaTime::zeroTime(); }
+    float currentTime() const override { return currentMediaTime().toFloat(); }
+    double currentTimeDouble() const override { return currentMediaTime().toDouble(); }
+    MediaTime currentMediaTime() const override { return MediaTime::zeroTime(); }
+    void seek(float time) override { seek(MediaTime::createWithFloat(time)); }
+    void seekDouble(double time) override { seek(MediaTime::createWithDouble(time)); }
+    void seek(const MediaTime&) override { }
+    float maxTimeSeekable() const override { return maxMediaTimeSeekable().toFloat(); }
+    MediaTime maxMediaTimeSeekable() const override { return MediaTime::zeroTime(); }
+    double minTimeSeekable() const override { return minMediaTimeSeekable().toFloat(); }
+    MediaTime minMediaTimeSeekable() const override { return MediaTime::zeroTime(); }
+
     void paint(GraphicsContext&, const FloatRect&) override;
 
     bool hasSingleSecurityOrigin() const override { return true; }
-    virtual float maxTimeLoaded() const { return 0.0; }
+    virtual MediaTime maxTimeLoaded() const { return MediaTime::zeroTime(); }
 
     bool supportsFullscreen() const override;
     PlatformMedia platformMedia() const override;
index 1687650..bd64898 100644 (file)
@@ -719,7 +719,7 @@ void AppendPipeline::appsinkNewSample(GstSample* sample)
         // Add a gap sample if a gap is detected before the first sample.
         if (mediaSample->decodeTime() == MediaTime::zeroTime()
             && mediaSample->presentationTime() > MediaTime::zeroTime()
-            && mediaSample->presentationTime() <= MediaTime::createWithDouble(0.1)) {
+            && mediaSample->presentationTime() <= MediaTime(1, 10)) {
             GST_DEBUG("Adding gap offset");
             mediaSample->applyPtsOffset(MediaTime::zeroTime());
         }
index 86d4329..229defa 100644 (file)
@@ -93,8 +93,8 @@ void GStreamerMediaSample::offsetTimestampsBy(const MediaTime& timestampOffset)
     m_dts += timestampOffset;
     GstBuffer* buffer = gst_sample_get_buffer(m_sample.get());
     if (buffer) {
-        GST_BUFFER_PTS(buffer) = toGstClockTime(m_pts.toFloat());
-        GST_BUFFER_DTS(buffer) = toGstClockTime(m_dts.toFloat());
+        GST_BUFFER_PTS(buffer) = toGstClockTime(m_pts);
+        GST_BUFFER_DTS(buffer) = toGstClockTime(m_dts);
     }
 }
 
index fe0c352..04c6e12 100644 (file)
@@ -50,6 +50,7 @@
 #include <wtf/Condition.h>
 #include <wtf/HashSet.h>
 #include <wtf/NeverDestroyed.h>
+#include <wtf/StringPrintStream.h>
 #include <wtf/text/AtomicString.h>
 #include <wtf/text/AtomicStringHash.h>
 
@@ -167,15 +168,15 @@ MediaTime MediaPlayerPrivateGStreamerMSE::durationMediaTime() const
     return m_mediaTimeDuration;
 }
 
-void MediaPlayerPrivateGStreamerMSE::seek(float time)
+void MediaPlayerPrivateGStreamerMSE::seek(const MediaTime& time)
 {
     if (UNLIKELY(!m_pipeline || m_errorOccured))
         return;
 
-    GST_INFO("[Seek] seek attempt to %f secs", time);
+    GST_INFO("[Seek] seek attempt to %s secs", toString(time).utf8().data());
 
     // Avoid useless seeking.
-    float current = currentMediaTime().toFloat();
+    MediaTime current = currentMediaTime();
     if (time == current) {
         if (!m_seeking)
             timeChanged();
@@ -190,19 +191,19 @@ void MediaPlayerPrivateGStreamerMSE::seek(float time)
         return;
     }
 
-    GST_DEBUG("Seeking from %f to %f seconds", current, time);
+    GST_DEBUG("Seeking from %s to %s seconds", toString(current).utf8().data(), toString(time).utf8().data());
 
-    float prevSeekTime = m_seekTime;
+    MediaTime previousSeekTime = m_seekTime;
     m_seekTime = time;
 
     if (!doSeek()) {
-        m_seekTime = prevSeekTime;
-        GST_WARNING("Seeking to %f failed", time);
+        m_seekTime = previousSeekTime;
+        GST_WARNING("Seeking to %s failed", toString(time).utf8().data());
         return;
     }
 
     m_isEndReached = false;
-    GST_DEBUG("m_seeking=%s, m_seekTime=%f", boolForPrinting(m_seeking), m_seekTime);
+    GST_DEBUG("m_seeking=%s, m_seekTime=%s", boolForPrinting(m_seeking), toString(m_seekTime).utf8().data());
 }
 
 void MediaPlayerPrivateGStreamerMSE::configurePlaySink()
@@ -233,7 +234,7 @@ void MediaPlayerPrivateGStreamerMSE::notifySeekNeedsDataForTime(const MediaTime&
     // Reenqueue samples needed to resume playback in the new position.
     m_mediaSource->seekToTime(seekTime);
 
-    GST_DEBUG("MSE seek to %f finished", seekTime.toDouble());
+    GST_DEBUG("MSE seek to %s finished", toString(seekTime).utf8().data());
 
     if (!m_gstSeekCompleted) {
         m_gstSeekCompleted = true;
@@ -241,7 +242,7 @@ void MediaPlayerPrivateGStreamerMSE::notifySeekNeedsDataForTime(const MediaTime&
     }
 }
 
-bool MediaPlayerPrivateGStreamerMSE::doSeek(gint64, float, GstSeekFlags)
+bool MediaPlayerPrivateGStreamerMSE::doSeek(const MediaTime&, float, GstSeekFlags)
 {
     // Use doSeek() instead. If anybody is calling this version of doSeek(), something is wrong.
     ASSERT_NOT_REACHED();
@@ -250,8 +251,7 @@ bool MediaPlayerPrivateGStreamerMSE::doSeek(gint64, float, GstSeekFlags)
 
 bool MediaPlayerPrivateGStreamerMSE::doSeek()
 {
-    GstClockTime position = toGstClockTime(m_seekTime);
-    MediaTime seekTime = MediaTime::createWithDouble(m_seekTime);
+    MediaTime seekTime = m_seekTime;
     double rate = m_player->rate();
     GstSeekFlags seekType = static_cast<GstSeekFlags>(GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_ACCURATE);
 
@@ -308,10 +308,10 @@ bool MediaPlayerPrivateGStreamerMSE::doSeek()
     if (!isTimeBuffered(seekTime)) {
         // Look if a near future time (<0.1 sec.) is buffered and change the seek target time.
         if (m_mediaSource) {
-            const MediaTime miniGap = MediaTime::createWithDouble(0.1);
+            const MediaTime miniGap = MediaTime(1, 10);
             MediaTime nearest = m_mediaSource->buffered()->nearest(seekTime);
             if (nearest.isValid() && nearest > seekTime && (nearest - seekTime) <= miniGap && isTimeBuffered(nearest + miniGap)) {
-                GST_DEBUG("[Seek] Changed the seek target time from %f to %f, a near point in the future", seekTime.toFloat(), nearest.toFloat());
+                GST_DEBUG("[Seek] Changed the seek target time from %s to %s, a near point in the future", toString(seekTime).utf8().data(), toString(nearest).utf8().data());
                 seekTime = nearest;
             }
         }
@@ -343,22 +343,23 @@ bool MediaPlayerPrivateGStreamerMSE::doSeek()
 
     GST_DEBUG("We can seek now");
 
-    gint64 startTime = position, endTime = GST_CLOCK_TIME_NONE;
+    MediaTime startTime = seekTime, endTime = MediaTime::invalidTime();
+
     if (rate < 0) {
-        startTime = 0;
-        endTime = position;
+        startTime = MediaTime::zeroTime();
+        endTime = seekTime;
     }
 
     if (!rate)
         rate = 1;
 
-    GST_DEBUG("Actual seek to %" GST_TIME_FORMAT ", end time:  %" GST_TIME_FORMAT ", rate: %f", GST_TIME_ARGS(startTime), GST_TIME_ARGS(endTime), rate);
+    GST_DEBUG("Actual seek to %s, end time:  %s, rate: %f", toString(startTime).utf8().data(), toString(endTime).utf8().data(), rate);
 
     // This will call notifySeekNeedsData() after some time to tell that the pipeline is ready for sample enqueuing.
     webKitMediaSrcPrepareSeek(WEBKIT_MEDIA_SRC(m_source.get()), seekTime);
 
     m_gstSeekCompleted = false;
-    if (!gst_element_seek(m_pipeline.get(), rate, GST_FORMAT_TIME, seekType, GST_SEEK_TYPE_SET, startTime, GST_SEEK_TYPE_SET, endTime)) {
+    if (!gst_element_seek(m_pipeline.get(), rate, GST_FORMAT_TIME, seekType, GST_SEEK_TYPE_SET, toGstClockTime(startTime), GST_SEEK_TYPE_SET, toGstClockTime(endTime))) {
         webKitMediaSrcSetReadyForSamples(WEBKIT_MEDIA_SRC(m_source.get()), true);
         m_seeking = false;
         m_gstSeekCompleted = true;
@@ -386,20 +387,20 @@ void MediaPlayerPrivateGStreamerMSE::maybeFinishSeek()
     }
 
     if (m_seekIsPending) {
-        GST_DEBUG("[Seek] Committing pending seek to %f", m_seekTime);
+        GST_DEBUG("[Seek] Committing pending seek to %s", toString(m_seekTime).utf8().data());
         m_seekIsPending = false;
         if (!doSeek()) {
-            GST_WARNING("[Seek] Seeking to %f failed", m_seekTime);
-            m_cachedPosition = -1;
+            GST_WARNING("[Seek] Seeking to %s failed", toString(m_seekTime).utf8().data());
+            m_cachedPosition = MediaTime::invalidTime();
         }
         return;
     }
 
-    GST_DEBUG("[Seek] Seeked to %f", m_seekTime);
+    GST_DEBUG("[Seek] Seeked to %s", toString(m_seekTime).utf8().data());
 
     webKitMediaSrcSetReadyForSamples(WEBKIT_MEDIA_SRC(m_source.get()), true);
     m_seeking = false;
-    m_cachedPosition = -1;
+    m_cachedPosition = MediaTime::invalidTime();
     // The pipeline can still have a pending state. In this case a position query will fail.
     // Right now we can use m_seekTime as a fallback.
     m_canFallBackToLastFinishedSeekPosition = true;
@@ -671,7 +672,7 @@ void MediaPlayerPrivateGStreamerMSE::asyncStateChangeDone()
 bool MediaPlayerPrivateGStreamerMSE::isTimeBuffered(const MediaTime &time) const
 {
     bool result = m_mediaSource && m_mediaSource->buffered()->contain(time);
-    GST_DEBUG("Time %f buffered? %s", time.toDouble(), result ? "Yes" : "No");
+    GST_DEBUG("Time %s buffered? %s", toString(time).utf8().data(), boolForPrinting(result));
     return result;
 }
 
@@ -695,7 +696,7 @@ void MediaPlayerPrivateGStreamerMSE::durationChanged()
     MediaTime previousDuration = m_mediaTimeDuration;
     m_mediaTimeDuration = m_mediaSourceClient->duration();
 
-    GST_TRACE("previous=%f, new=%f", previousDuration.toFloat(), m_mediaTimeDuration.toFloat());
+    GST_TRACE("previous=%s, new=%s", toString(previousDuration).utf8().data(), toString(m_mediaTimeDuration).utf8().data());
 
     // Avoid emiting durationchanged in the case where the previous duration was 0 because that case is already handled
     // by the HTMLMediaElement.
@@ -911,25 +912,25 @@ MediaTime MediaPlayerPrivateGStreamerMSE::currentMediaTime() const
 
         m_eosPending = false;
         m_isEndReached = true;
-        m_cachedPosition = m_mediaTimeDuration.toFloat();
-        m_durationAtEOS = m_mediaTimeDuration.toFloat();
+        m_cachedPosition = m_mediaTimeDuration;
+        m_durationAtEOS = m_mediaTimeDuration;
         m_player->timeChanged();
     }
     return position;
 }
 
-float MediaPlayerPrivateGStreamerMSE::maxTimeSeekable() const
+MediaTime MediaPlayerPrivateGStreamerMSE::maxMediaTimeSeekable() const
 {
     if (UNLIKELY(m_errorOccured))
-        return 0;
+        return MediaTime::zeroTime();
 
-    GST_DEBUG("maxTimeSeekable");
-    float result = durationMediaTime().toFloat();
+    GST_DEBUG("maxMediaTimeSeekable");
+    MediaTime result = durationMediaTime();
     // Infinite duration means live stream.
-    if (std::isinf(result)) {
+    if (result.isPositiveInfinite()) {
         MediaTime maxBufferedTime = buffered()->maximumBufferedTime();
         // Return the highest end time reported by the buffered attribute.
-        result = maxBufferedTime.isValid() ? maxBufferedTime.toFloat() : 0;
+        result = maxBufferedTime.isValid() ? maxBufferedTime : MediaTime::zeroTime();
     }
 
     return result;
index 657ba81..98c2f1e 100644 (file)
@@ -60,7 +60,7 @@ public:
 
     void pause() override;
     bool seeking() const override;
-    void seek(float) override;
+    void seek(const MediaTime&) override;
     void configurePlaySink() override;
     bool changePipelineState(GstState) override;
 
@@ -69,7 +69,7 @@ public:
 
     void setRate(float) override;
     std::unique_ptr<PlatformTimeRanges> buffered() const override;
-    float maxTimeSeekable() const override;
+    MediaTime maxMediaTimeSeekable() const override;
 
     void sourceChanged() override;
 
@@ -98,7 +98,7 @@ private:
     // FIXME: Reduce code duplication.
     void updateStates() override;
 
-    bool doSeek(gint64, float, GstSeekFlags) override;
+    bool doSeek(const MediaTime&, float, GstSeekFlags) override;
     bool doSeek();
     void maybeFinishSeek();
     void updatePlaybackRate() override;
index 6049e9d..bd6fb74 100644 (file)
@@ -481,8 +481,8 @@ void PlaybackPipeline::enqueueSample(Ref<MediaSample>&& mediaSample)
     GST_TRACE("enqueing sample trackId=%s PTS=%f presentationSize=%.0fx%.0f at %" GST_TIME_FORMAT " duration: %" GST_TIME_FORMAT,
         trackId.string().utf8().data(), mediaSample->presentationTime().toFloat(),
         mediaSample->presentationSize().width(), mediaSample->presentationSize().height(),
-        GST_TIME_ARGS(WebCore::toGstClockTime(mediaSample->presentationTime().toDouble())),
-        GST_TIME_ARGS(WebCore::toGstClockTime(mediaSample->duration().toDouble())));
+        GST_TIME_ARGS(WebCore::toGstClockTime(mediaSample->presentationTime())),
+        GST_TIME_ARGS(WebCore::toGstClockTime(mediaSample->duration())));
 
     WTF::GMutexLocker<GMutex> locker(*GST_OBJECT_GET_LOCK(m_webKitMediaSrc.get()));
     Stream* stream = getStreamByTrackId(m_webKitMediaSrc.get(), trackId);
index 52ca668..19bd4e5 100644 (file)
@@ -402,8 +402,8 @@ gboolean webKitMediaSrcQueryWithParent(GstPad* pad, GstObject* parent, GstQuery*
         switch (format) {
         case GST_FORMAT_TIME: {
             if (source->priv && source->priv->mediaPlayerPrivate) {
-                float duration = source->priv->mediaPlayerPrivate->durationMediaTime().toFloat();
-                if (duration > 0) {
+                MediaTime duration = source->priv->mediaPlayerPrivate->durationMediaTime();
+                if (duration > MediaTime::zeroTime()) {
                     gst_query_set_duration(query, format, WebCore::toGstClockTime(duration));
                     GST_DEBUG_OBJECT(source, "Answering: duration=%" GST_TIME_FORMAT, GST_TIME_ARGS(WebCore::toGstClockTime(duration)));
                     result = TRUE;