[GL][GStreamer] activate wrapped shared context
author    commit-queue@webkit.org <commit-queue@webkit.org@268f45cc-cd09-0410-ab3c-d52691b4dbfc>
Fri, 9 Aug 2019 09:36:45 +0000 (09:36 +0000)
committer commit-queue@webkit.org <commit-queue@webkit.org@268f45cc-cd09-0410-ab3c-d52691b4dbfc>
Fri, 9 Aug 2019 09:36:45 +0000 (09:36 +0000)
https://bugs.webkit.org/show_bug.cgi?id=196966

Patch by Víctor Manuel Jáquez Leal <vjaquez@igalia.com> on 2019-08-09
Reviewed by Žan Doberšek.

This patch consists of four parts:

1) When the media player is instantiated and is meant to render
textures, it creates a GstGLContext that wraps the application's GL
context. In order to populate the wrapped object with the GL vtable,
the context has to be current. This patch therefore makes WebKit's
shared context current, activates the wrapped GstGLContext and fills
in its GL function table, and deactivates it again afterwards (a
sketch of this flow follows this list).

2) GL textures now use the RGBA format, so the color transformation
is done by GStreamer and no further conversion is required in WebKit.

3) Since no behavior change is needed when the decoder is
imxvpudecoder, its identification and enumeration value are removed.

4) As only RGBA is used, the color conversions in the Cairo
(fallback) rendering path now convert from GStreamer's RGBA format to
Cairo's ARGB32 format, whose memory layout depends on endianness (a
sketch of this conversion also follows this list).

No new tests because there is no behavior change.

* platform/graphics/gstreamer/ImageGStreamerCairo.cpp:
(WebCore::ImageGStreamer::ImageGStreamer): Only convert GStreamer
RGBA to Cairo ARGB32.
* platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp:
(WebCore::MediaPlayerPrivateGStreamer::createGSTPlayBin): Removes
the IMX VPU identification.
* platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.cpp:
(WebCore::MediaPlayerPrivateGStreamerBase::ensureGstGLContext):
Initializes the wrapped GL context.
(WebCore::MediaPlayerPrivateGStreamerBase::updateTextureMapperFlags):
Removes the frame color-conversion flag.
(WebCore::MediaPlayerPrivateGStreamerBase::createVideoSinkGL):
Instead of parsing a string, the GstCaps are built programmatically
and set on the appsink, rather than used in a filtered link (see the
sketch after this list).
* platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.h:
Removes ImxVPU enumeration value.
* platform/graphics/gstreamer/VideoTextureCopierGStreamer.cpp:
Adds NoConvert option to texture copier, setting an identity
matrix.
(WebCore::VideoTextureCopierGStreamer::updateColorConversionMatrix):
* platform/graphics/gstreamer/VideoTextureCopierGStreamer.h: Adds
NoConvert enumeration value.
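
For reference, a minimal sketch of the appsink caps setup described
above for createVideoSinkGL, assuming a bin built from glupload,
glcolorconvert and appsink; the names createRGBAGLSink and
"gl-video-sink" are illustrative, and error handling is elided
compared to the real code:

    #include <gst/gst.h>
    #include <gst/app/gstappsink.h>
    #include <gst/gl/gl.h>

    // Sketch only: glupload ! glcolorconvert ! appsink, with RGBA GLMemory
    // caps set directly on the appsink instead of a filtered link.
    static GstElement* createRGBAGLSink()
    {
        GstElement* bin = gst_bin_new("gl-video-sink");
        GstElement* upload = gst_element_factory_make("glupload", nullptr);
        GstElement* convert = gst_element_factory_make("glcolorconvert", nullptr);
        GstElement* sink = gst_element_factory_make("appsink", nullptr);
        if (!bin || !upload || !convert || !sink)
            return nullptr; // cleanup elided

        // Equivalent to "video/x-raw(memory:GLMemory), format=(string)RGBA",
        // built programmatically rather than parsed from a string.
        GstCaps* caps = gst_caps_new_simple("video/x-raw", "format", G_TYPE_STRING, "RGBA", nullptr);
        gst_caps_set_features(caps, 0, gst_caps_features_new(GST_CAPS_FEATURE_MEMORY_GL_MEMORY, nullptr));
        g_object_set(sink, "caps", caps, nullptr);
        gst_caps_unref(caps);

        gst_bin_add_many(GST_BIN(bin), upload, convert, sink, nullptr);
        gst_element_link_many(upload, convert, sink, nullptr);

        // Expose the uploader's sink pad as the bin's sink pad.
        GstPad* pad = gst_element_get_static_pad(upload, "sink");
        gst_element_add_pad(bin, gst_ghost_pad_new("sink", pad));
        gst_object_unref(pad);
        return bin;
    }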

git-svn-id: https://svn.webkit.org/repository/webkit/trunk@248464 268f45cc-cd09-0410-ab3c-d52691b4dbfc

Source/WebCore/ChangeLog
Source/WebCore/platform/graphics/gstreamer/ImageGStreamerCairo.cpp
Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp
Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.cpp
Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.h
Source/WebCore/platform/graphics/gstreamer/VideoTextureCopierGStreamer.cpp
Source/WebCore/platform/graphics/gstreamer/VideoTextureCopierGStreamer.h

diff --git a/Source/WebCore/ChangeLog b/Source/WebCore/ChangeLog
index 6924d31..b726a58 100644
@@ -1,3 +1,57 @@
+2019-08-09  Víctor Manuel Jáquez Leal  <vjaquez@igalia.com>
+
+        [GL][GStreamer] activate wrapped shared context
+        https://bugs.webkit.org/show_bug.cgi?id=196966
+
+        Reviewed by Žan Doberšek.
+
+        This patch consists of four parts:
+
+        1) When the media player is instantiated and is meant to render
+        textures, it creates a GstGLContext that wraps the application's
+        GL context. In order to populate the wrapped object with the GL
+        vtable, the context has to be current. This patch therefore makes
+        WebKit's shared context current, activates the wrapped GstGLContext
+        and fills in its GL function table, and deactivates it again
+        afterwards.
+
+        2) GL textures now use the RGBA format, so the color
+        transformation is done by GStreamer and no further conversion is
+        required in WebKit.
+
+        3) Since no behavior change is needed when the decoder is
+        imxvpudecoder, its identification and enumeration value are removed.
+
+        4) As only RGBA is used, the color conversions in the Cairo
+        (fallback) rendering path now convert from GStreamer's RGBA
+        format to Cairo's ARGB32 format, whose memory layout depends
+        on endianness.
+
+        No new tests because there is no behavior change.
+
+        * platform/graphics/gstreamer/ImageGStreamerCairo.cpp:
+        (WebCore::ImageGStreamer::ImageGStreamer): Only convert GStreamer
+        RGBA to Cairo ARGB32.
+        * platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp:
+        (WebCore::MediaPlayerPrivateGStreamer::createGSTPlayBin): Removes
+        the IMX VPU identification.
+        * platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.cpp:
+        (WebCore::MediaPlayerPrivateGStreamerBase::ensureGstGLContext):
+        Initializes the wrapped GL context.
+        (WebCore::MediaPlayerPrivateGStreamerBase::updateTextureMapperFlags):
+        Removes the frame color-conversion flag.
+        (WebCore::MediaPlayerPrivateGStreamerBase::createVideoSinkGL):
+        Instead of parsing a string, the GstCaps are built programmatically
+        and set on the appsink, rather than used in a filtered link.
+        * platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.h:
+        Removes ImxVPU enumeration value.
+        * platform/graphics/gstreamer/VideoTextureCopierGStreamer.cpp:
+        Adds NoConvert option to texture copier, setting an identity
+        matrix.
+        (WebCore::VideoTextureCopierGStreamer::updateColorConversionMatrix):
+        * platform/graphics/gstreamer/VideoTextureCopierGStreamer.h: Adds
+        NoConvert enumeration value.
+
 2019-08-09  Ryosuke Niwa  <rniwa@webkit.org>
 
         REGRESSION (iOS 13): united.com web forms do not respond to taps
diff --git a/Source/WebCore/platform/graphics/gstreamer/ImageGStreamerCairo.cpp b/Source/WebCore/platform/graphics/gstreamer/ImageGStreamerCairo.cpp
index c0de8c4..67d0f0b 100644
@@ -57,11 +57,7 @@ ImageGStreamer::ImageGStreamer(GstSample* sample)
 
     RefPtr<cairo_surface_t> surface;
     cairo_format_t cairoFormat;
-#if G_BYTE_ORDER == G_LITTLE_ENDIAN
-    cairoFormat = (GST_VIDEO_FRAME_FORMAT(&m_videoFrame) == GST_VIDEO_FORMAT_BGRA) ? CAIRO_FORMAT_ARGB32 : CAIRO_FORMAT_RGB24;
-#else
-    cairoFormat = (GST_VIDEO_FRAME_FORMAT(&m_videoFrame) == GST_VIDEO_FORMAT_ARGB) ? CAIRO_FORMAT_ARGB32 : CAIRO_FORMAT_RGB24;
-#endif
+    cairoFormat = (GST_VIDEO_FRAME_FORMAT(&m_videoFrame) == GST_VIDEO_FORMAT_RGBA) ? CAIRO_FORMAT_ARGB32 : CAIRO_FORMAT_RGB24;
 
     // GStreamer doesn't use premultiplied alpha, but cairo does. So if the video format has an alpha component
     // we need to premultiply it before passing the data to cairo. This needs to be both using gstreamer-gl and not
@@ -75,20 +71,19 @@ ImageGStreamer::ImageGStreamer(GstSample* sample)
 
         for (int x = 0; x < width; x++) {
             for (int y = 0; y < height; y++) {
-#if G_BYTE_ORDER == G_LITTLE_ENDIAN
-                // Video frames use BGRA in little endian.
                 unsigned short alpha = bufferData[3];
-                surfacePixel[0] = (bufferData[0] * alpha + 128) / 255;
+#if G_BYTE_ORDER == G_LITTLE_ENDIAN
+                // The frame is RGBA; Cairo ARGB32 is stored as BGRA in memory on little endian.
+                surfacePixel[0] = (bufferData[2] * alpha + 128) / 255;
                 surfacePixel[1] = (bufferData[1] * alpha + 128) / 255;
-                surfacePixel[2] = (bufferData[2] * alpha + 128) / 255;
+                surfacePixel[2] = (bufferData[0] * alpha + 128) / 255;
                 surfacePixel[3] = alpha;
 #else
-                // Video frames use ARGB in big endian.
-                unsigned short alpha = bufferData[0];
+                // The frame is RGBA; Cairo ARGB32 is stored as ARGB in memory on big endian.
                 surfacePixel[0] = alpha;
-                surfacePixel[1] = (bufferData[1] * alpha + 128) / 255;
-                surfacePixel[2] = (bufferData[2] * alpha + 128) / 255;
-                surfacePixel[3] = (bufferData[3] * alpha + 128) / 255;
+                surfacePixel[1] = (bufferData[0] * alpha + 128) / 255;
+                surfacePixel[2] = (bufferData[1] * alpha + 128) / 255;
+                surfacePixel[3] = (bufferData[2] * alpha + 128) / 255;
 #endif
                 bufferData += 4;
                 surfacePixel += 4;
diff --git a/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp b/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp
index 4f36433..f4a4ba1 100644
@@ -2435,8 +2435,6 @@ void MediaPlayerPrivateGStreamer::createGSTPlayBin(const URL& url, const String&
         GUniquePtr<char> elementName(gst_element_get_name(element));
         if (g_str_has_prefix(elementName.get(), "v4l2"))
             player->m_videoDecoderPlatform = WebKitGstVideoDecoderPlatform::Video4Linux;
-        else if (g_str_has_prefix(elementName.get(), "imxvpudecoder"))
-            player->m_videoDecoderPlatform = WebKitGstVideoDecoderPlatform::ImxVPU;
 
 #if USE(TEXTURE_MAPPER_GL)
         player->updateTextureMapperFlags();
diff --git a/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.cpp b/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.cpp
index 91e1789..0728eeb 100644
 #endif
 
 #if USE(GSTREAMER_GL)
-#if G_BYTE_ORDER == G_LITTLE_ENDIAN
-#define GST_GL_CAPS_FORMAT "{ BGRx, BGRA }"
-#define TEXTURE_MAPPER_COLOR_CONVERT_FLAG TextureMapperGL::ShouldConvertTextureBGRAToRGBA
-#define TEXTURE_COPIER_COLOR_CONVERT_FLAG VideoTextureCopierGStreamer::ColorConversion::ConvertBGRAToRGBA
-#else
-#define GST_GL_CAPS_FORMAT "{ xRGB, ARGB }"
-#define TEXTURE_MAPPER_COLOR_CONVERT_FLAG TextureMapperGL::ShouldConvertTextureARGBToRGBA
-#define TEXTURE_COPIER_COLOR_CONVERT_FLAG VideoTextureCopierGStreamer::ColorConversion::ConvertARGBToRGBA
-#endif
+#define TEXTURE_COPIER_COLOR_CONVERT_FLAG VideoTextureCopierGStreamer::ColorConversion::NoConvert
 
 #include <gst/app/gstappsink.h>
 
-
 #include "GLContext.h"
 #if USE(GLX)
 #include "GLContextGLX.h"
@@ -475,6 +466,19 @@ bool MediaPlayerPrivateGStreamerBase::ensureGstGLContext()
     else
         m_glContext = gst_gl_context_new_wrapped(m_glDisplay.get(), reinterpret_cast<guintptr>(contextHandle), glPlatform, glAPI);
 
+    // Activate the wrapped GStreamer context and fill it in from WebKit's shared one.
+    auto previousActiveContext = GLContext::current();
+    webkitContext->makeContextCurrent();
+    if (gst_gl_context_activate(m_glContext.get(), TRUE)) {
+        GUniqueOutPtr<GError> error;
+        if (!gst_gl_context_fill_info(m_glContext.get(), &error.outPtr()))
+            GST_WARNING("Failed to fill in GStreamer context: %s", error->message);
+        gst_gl_context_activate(m_glContext.get(), FALSE);
+    } else
+        GST_WARNING("Failed to activate GStreamer context %" GST_PTR_FORMAT, m_glContext.get());
+    if (previousActiveContext)
+        previousActiveContext->makeContextCurrent();
+
     return true;
 }
 #endif // USE(GSTREAMER_GL)
@@ -993,14 +997,6 @@ void MediaPlayerPrivateGStreamerBase::updateTextureMapperFlags()
         m_textureMapperFlags = 0;
         break;
     }
-
-#if USE(GSTREAMER_GL)
-    // When the imxvpudecoder is used, the texture sampling of the
-    // directviv-uploaded texture returns an RGB value, so there's no need to
-    // convert it.
-    if (m_videoDecoderPlatform != WebKitGstVideoDecoderPlatform::ImxVPU)
-        m_textureMapperFlags |= TEXTURE_MAPPER_COLOR_CONVERT_FLAG;
-#endif
 }
 #endif
 
@@ -1065,6 +1061,10 @@ GstElement* MediaPlayerPrivateGStreamerBase::createVideoSinkGL()
     GstElement* colorconvert = gst_element_factory_make("glcolorconvert", nullptr);
     GstElement* appsink = createGLAppSink();
 
+    // glsinkbin is not used because it includes glcolorconvert, which only processes RGBA,
+    // but in the future it would be possible to render YUV formats too:
+    // https://bugs.webkit.org/show_bug.cgi?id=132869
+
     if (!appsink || !upload || !colorconvert) {
         GST_WARNING("Failed to create GstGL elements");
         gst_object_unref(videoSink);
@@ -1082,10 +1082,11 @@ GstElement* MediaPlayerPrivateGStreamerBase::createVideoSinkGL()
 
     gst_bin_add_many(GST_BIN(videoSink), upload, colorconvert, appsink, nullptr);
 
-    GRefPtr<GstCaps> caps = adoptGRef(gst_caps_from_string("video/x-raw(" GST_CAPS_FEATURE_MEMORY_GL_MEMORY "), format = (string) " GST_GL_CAPS_FORMAT));
+    GRefPtr<GstCaps> caps = adoptGRef(gst_caps_new_simple("video/x-raw", "format", G_TYPE_STRING, "RGBA", nullptr));
+    gst_caps_set_features(caps.get(), 0, gst_caps_features_new(GST_CAPS_FEATURE_MEMORY_GL_MEMORY, nullptr));
+    g_object_set(appsink, "caps", caps.get(), nullptr);
 
-    result &= gst_element_link_pads(upload, "src", colorconvert, "sink");
-    result &= gst_element_link_pads_filtered(colorconvert, "src", appsink, "sink", caps.get());
+    result &= gst_element_link_many(upload, colorconvert, appsink, nullptr);
 
     GRefPtr<GstPad> pad = adoptGRef(gst_element_get_static_pad(upload, "sink"));
     gst_element_add_pad(videoSink, gst_ghost_pad_new("sink", pad.get()));
diff --git a/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.h b/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.h
index ed92f78..8079a5f 100644
@@ -307,7 +307,7 @@ protected:
     bool m_waitingForKey { false };
 #endif
 
-    enum class WebKitGstVideoDecoderPlatform { ImxVPU, Video4Linux };
+    enum class WebKitGstVideoDecoderPlatform { Video4Linux };
     Optional<WebKitGstVideoDecoderPlatform> m_videoDecoderPlatform;
 };
 
diff --git a/Source/WebCore/platform/graphics/gstreamer/VideoTextureCopierGStreamer.cpp b/Source/WebCore/platform/graphics/gstreamer/VideoTextureCopierGStreamer.cpp
index 135333e..f519b03 100644
@@ -84,6 +84,9 @@ void VideoTextureCopierGStreamer::updateColorConversionMatrix(ColorConversion co
     case ColorConversion::ConvertARGBToRGBA:
         m_colorConversionMatrix.setMatrix(0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0);
         break;
+    case ColorConversion::NoConvert:
+        m_colorConversionMatrix.makeIdentity();
+        break;
     default:
         RELEASE_ASSERT_NOT_REACHED();
     }
diff --git a/Source/WebCore/platform/graphics/gstreamer/VideoTextureCopierGStreamer.h b/Source/WebCore/platform/graphics/gstreamer/VideoTextureCopierGStreamer.h
index ed1d58f..20fb663 100644
@@ -36,7 +36,8 @@ class VideoTextureCopierGStreamer {
 public:
     enum class ColorConversion {
         ConvertBGRAToRGBA,
-        ConvertARGBToRGBA
+        ConvertARGBToRGBA,
+        NoConvert,
     };
 
     VideoTextureCopierGStreamer(ColorConversion);