author commit-queue@webkit.org <commit-queue@webkit.org@268f45cc-cd09-0410-ab3c-d52691b4dbfc>
Tue, 24 Jul 2018 08:24:35 +0000 (08:24 +0000)
committer commit-queue@webkit.org <commit-queue@webkit.org@268f45cc-cd09-0410-ab3c-d52691b4dbfc>
Tue, 24 Jul 2018 08:24:35 +0000 (08:24 +0000)
[WPE][GTK] Implement PeerConnection API on top of libwebrtc
https://bugs.webkit.org/show_bug.cgi?id=186932

Patch by Thibault Saunier <tsaunier@igalia.com> on 2018-07-24
Reviewed by Philippe Normand.

* Source/cmake/FindGStreamer.cmake: Look for gstreamer-codecparser as it is needed by GStreamerVideoDecoder

Source/ThirdParty/libwebrtc:
[WPE][GTK] Implement PeerConnection API on top of libwebrtc
https://bugs.webkit.org/show_bug.cgi?id=186932

Patch by Thibault Saunier <tsaunier@igalia.com> on 2018-07-24
Reviewed by Philippe Normand.

* CMakeLists.txt: Properly set our build as `WEBRTC_WEBKIT_BUILD`

Source/WebCore:
[WPE][GTK] Implement PeerConnection API on top of libwebrtc
https://bugs.webkit.org/show_bug.cgi?id=186932

Patch by Thibault Saunier <tsaunier@igalia.com> on 2018-07-24
Reviewed by Philippe Normand.

Enabled many webrtc tests.

* platform/GStreamer.cmake: Build new files
* platform/graphics/gstreamer/GStreamerCommon.cpp: Fix minor style issues
* platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp: Add a way to give a precise name to pipelines
and give useful names to pipelines whose sources come from a PeerConnection
* platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h: Ditto.
* platform/mediastream/gstreamer/GStreamerVideoFrameLibWebRTC.cpp: Added. Implement a subclass of webrtc::VideoFrameBuffer
to represent a kNative GStreamer video frame.
* platform/mediastream/gstreamer/GStreamerVideoFrameLibWebRTC.h: Added. Ditto.
* platform/mediastream/gstreamer/RealtimeIncomingAudioSourceLibWebRTC.cpp: Handle incoming audio samples from libwebrtc.
* platform/mediastream/gstreamer/RealtimeIncomingAudioSourceLibWebRTC.h: Ditto.
* platform/mediastream/gstreamer/RealtimeIncomingVideoSourceLibWebRTC.cpp: Handle incoming video frames from libwebrtc.
* platform/mediastream/gstreamer/RealtimeIncomingVideoSourceLibWebRTC.h: Ditto.
* platform/mediastream/gstreamer/RealtimeOutgoingAudioSourceLibWebRTC.cpp: Handle passing locally captured audio samples to libwebrtc.
* platform/mediastream/gstreamer/RealtimeOutgoingAudioSourceLibWebRTC.h: Ditto.
* platform/mediastream/gstreamer/RealtimeOutgoingVideoSourceLibWebRTC.cpp: Handle passing locally captured video frames to libwebrtc.
* platform/mediastream/libwebrtc/GStreamerVideoDecoderFactory.cpp: Added. Implement a video decoder factory and LibWebRTC Video decoders based on GStreamer.
* platform/mediastream/libwebrtc/GStreamerVideoDecoderFactory.h: Added. Ditto.
* platform/mediastream/libwebrtc/GStreamerVideoEncoderFactory.cpp: Added. Implement a video encoder factory and LibWebRTC H264/VP8 Video encoders based on GStreamer.
* platform/mediastream/libwebrtc/GStreamerVideoEncoderFactory.h: Added. Ditto.
* platform/mediastream/libwebrtc/LibWebRTCAudioFormat.h: Add information about the signedness of the LibWebRTC audio format.
* platform/mediastream/libwebrtc/LibWebRTCProviderGlib.cpp: Add support for newly added Encoder/Decoder factories.
* platform/mediastream/libwebrtc/LibWebRTCProviderGlib.h: Ditto.
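The GStreamerVideoFrameLibWebRTC conversion added above wraps the three I420 planes of a libwebrtc frame as one GstMemory each, sized stride times component height. As a hedged illustration (a hypothetical standalone helper, not WebKit code; no stride padding for odd widths is modeled), the plane arithmetic looks like this:

```cpp
#include <cassert>
#include <cstddef>

// I420 layout: a full-resolution Y plane plus quarter-resolution U and V
// planes. GStreamerSampleFromLibWebRTCVideoFrame appends one GstMemory per
// plane; this sketch mirrors the size computation for each of the three.
struct I420Layout {
    std::size_t strides[3];
    std::size_t sizes[3];
    std::size_t total;
};

I420Layout i420Layout(std::size_t width, std::size_t height)
{
    I420Layout layout {};
    std::size_t chromaWidth = (width + 1) / 2;   // U/V are half-width,
    std::size_t chromaHeight = (height + 1) / 2; // half-height planes
    layout.strides[0] = width;
    layout.strides[1] = layout.strides[2] = chromaWidth;
    layout.sizes[0] = width * height;
    layout.sizes[1] = layout.sizes[2] = chromaWidth * chromaHeight;
    layout.total = layout.sizes[0] + layout.sizes[1] + layout.sizes[2];
    return layout;
}
```

For a 640x480 frame this yields 307200 + 2 x 76800 = 460800 bytes, the familiar 1.5 bytes per pixel of I420.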

Source/WebKit:
[WPE][GTK] Implement WebRTC based on libwebrtc
https://bugs.webkit.org/show_bug.cgi?id=186932

Patch by Thibault Saunier <tsaunier@igalia.com> on 2018-07-24
Reviewed by Philippe Normand.

* WebProcess/Network/webrtc/LibWebRTCProvider.h: Use LibWebRTCProviderGlib when building WPE or GTK ports.

LayoutTests:
[WPE][GTK] Implement PeerConnection API on top of libwebrtc
https://bugs.webkit.org/show_bug.cgi?id=186932

Patch by Thibault Saunier <tsaunier@igalia.com> on 2018-07-24
Reviewed by Philippe Normand.

* platform/gtk/TestExpectations: Enable webrtc tests.

git-svn-id: https://svn.webkit.org/repository/webkit/trunk@234138 268f45cc-cd09-0410-ab3c-d52691b4dbfc

32 files changed:
ChangeLog
LayoutTests/ChangeLog
LayoutTests/platform/gtk/TestExpectations
LayoutTests/platform/wpe/TestExpectations
Source/ThirdParty/libwebrtc/CMakeLists.txt
Source/ThirdParty/libwebrtc/ChangeLog
Source/WebCore/ChangeLog
Source/WebCore/platform/GStreamer.cmake
Source/WebCore/platform/graphics/gstreamer/GStreamerCommon.cpp
Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp
Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h
Source/WebCore/platform/mediastream/gstreamer/GStreamerAudioStreamDescription.h
Source/WebCore/platform/mediastream/gstreamer/GStreamerMediaStreamSource.cpp
Source/WebCore/platform/mediastream/gstreamer/GStreamerVideoFrameLibWebRTC.cpp [new file with mode: 0644]
Source/WebCore/platform/mediastream/gstreamer/GStreamerVideoFrameLibWebRTC.h [new file with mode: 0644]
Source/WebCore/platform/mediastream/gstreamer/RealtimeIncomingAudioSourceLibWebRTC.cpp
Source/WebCore/platform/mediastream/gstreamer/RealtimeIncomingAudioSourceLibWebRTC.h
Source/WebCore/platform/mediastream/gstreamer/RealtimeIncomingVideoSourceLibWebRTC.cpp
Source/WebCore/platform/mediastream/gstreamer/RealtimeIncomingVideoSourceLibWebRTC.h
Source/WebCore/platform/mediastream/gstreamer/RealtimeOutgoingAudioSourceLibWebRTC.cpp
Source/WebCore/platform/mediastream/gstreamer/RealtimeOutgoingAudioSourceLibWebRTC.h
Source/WebCore/platform/mediastream/gstreamer/RealtimeOutgoingVideoSourceLibWebRTC.cpp
Source/WebCore/platform/mediastream/libwebrtc/GStreamerVideoDecoderFactory.cpp [new file with mode: 0644]
Source/WebCore/platform/mediastream/libwebrtc/GStreamerVideoDecoderFactory.h [new file with mode: 0644]
Source/WebCore/platform/mediastream/libwebrtc/GStreamerVideoEncoderFactory.cpp [new file with mode: 0644]
Source/WebCore/platform/mediastream/libwebrtc/GStreamerVideoEncoderFactory.h [new file with mode: 0644]
Source/WebCore/platform/mediastream/libwebrtc/LibWebRTCAudioFormat.h
Source/WebCore/platform/mediastream/libwebrtc/LibWebRTCProviderGlib.cpp
Source/WebCore/platform/mediastream/libwebrtc/LibWebRTCProviderGlib.h
Source/WebKit/ChangeLog
Source/WebKit/WebProcess/Network/webrtc/LibWebRTCProvider.h
Source/cmake/FindGStreamer.cmake

index 5755610..6e80c52 100644 (file)
--- a/ChangeLog
+++ b/ChangeLog
@@ -1,3 +1,12 @@
+2018-07-24  Thibault Saunier  <tsaunier@igalia.com>
+
+        [WPE][GTK] Implement PeerConnection API on top of libwebrtc
+        https://bugs.webkit.org/show_bug.cgi?id=186932
+
+        Reviewed by Philippe Normand.
+
+        * Source/cmake/FindGStreamer.cmake: Look for gstreamer-codecparser as it is needed by GStreamerVideoDecoder
+
 2018-07-20  Carlos Garcia Campos  <cgarcia@igalia.com>
 
         Unreviewed. Update OptionsGTK.cmake and NEWS for 2.21.5 release.
index 1dca7ff..55c8c0f 100644 (file)
@@ -1,3 +1,12 @@
+2018-07-24  Thibault Saunier  <tsaunier@igalia.com>
+
+        [WPE][GTK] Implement PeerConnection API on top of libwebrtc
+        https://bugs.webkit.org/show_bug.cgi?id=186932
+
+        Reviewed by Philippe Normand.
+
+        * platform/gtk/TestExpectations: Enable webrtc tests.
+
 2018-07-24  Dirk Schulze  <krit@webkit.org>
 
         [css-masking] Black backdrop on -webkit-clip-path on SVG root
index 1071d95..ff005b3 100644 (file)
@@ -572,7 +572,36 @@ webkit.org/b/99036 fast/shadow-dom/pointerlockelement-in-slot.html [ Skip ]
 webkit.org/b/85211 ietestcenter/css3/flexbox/flexbox-align-stretch-001.htm [ ImageOnlyFailure ]
 webkit.org/b/85212 ietestcenter/css3/flexbox/flexbox-layout-002.htm [ ImageOnlyFailure ]
 
-webrtc [ Skip ]
+webkit.org/b/187064 webrtc/audio-peer-connection-g722.html
+webkit.org/b/187064 webrtc/video-with-receiver.html
+webkit.org/b/187064 webrtc/captureCanvas-webrtc.html
+webkit.org/b/187064 webrtc/video-remote-mute.html
+webkit.org/b/187064 webrtc/video-addTransceiver.html
+webkit.org/b/187064 webrtc/peer-connection-track-end.html
+webkit.org/b/187064 webrtc/video.html
+webkit.org/b/187064 webrtc/multi-video.html
+webkit.org/b/187064 webrtc/video-addTrack.html
+webkit.org/b/187064 webrtc/video-rotation.html
+webkit.org/b/187064 webrtc/video-mute.html
+webkit.org/b/187064 webrtc/video-disabled-black.html
+webkit.org/b/187064 webrtc/video-replace-track.html
+webkit.org/b/187064 webrtc/video-with-data-channel.html
+webkit.org/b/187064 webrtc/video-replace-muted-track.html
+webkit.org/b/187064 webrtc/video-unmute.html
+webkit.org/b/187064 webrtc/datachannel/mdns-ice-candidates.html
+webkit.org/b/187064 webrtc/libwebrtc/descriptionGetters.html
+webkit.org/b/177533 webrtc/video-interruption.html
+
+webkit.org/b/186933 webrtc/peer-connection-remote-audio-mute2.html
+webkit.org/b/186933 webrtc/peer-connection-audio-unmute.html
+webkit.org/b/186933 webrtc/peer-connection-audio-mute2.html
+webkit.org/b/186933 webrtc/peer-connection-audio-mute.html
+webkit.org/b/186933 webrtc/peer-connection-remote-audio-mute.html
+webkit.org/b/186933 webrtc/clone-audio-track.html
+webkit.org/b/186933 webrtc/audio-replace-track.html
+webkit.org/b/186933 webrtc/audio-peer-connection-webaudio.html
+webkit.org/b/186933 webrtc/audio-muted-stats.html
+
 imported/w3c/web-platform-tests/webrtc/ [ Skip ]
 http/tests/webrtc [ Skip ]
 # The MediaStream implementation is also still not completed
index b4581c2..ac0a87c 100644 (file)
@@ -13,7 +13,38 @@ Bug(WPE) printing/ [ Skip ]
 Bug(WPE) scrollingcoordinator/ [ Skip ]
 Bug(WPE) webarchive/ [ Skip ]
 Bug(WPE) webaudio/ [ Skip ]
-Bug(WPE) webrtc [ Skip ]
+
+# The webrtc implementation is not fully completed yet
+webkit.org/b/187064 webrtc/audio-peer-connection-g722.html
+webkit.org/b/187064 webrtc/video-with-receiver.html
+webkit.org/b/187064 webrtc/captureCanvas-webrtc.html
+webkit.org/b/187064 webrtc/video-remote-mute.html
+webkit.org/b/187064 webrtc/video-addTransceiver.html
+webkit.org/b/187064 webrtc/peer-connection-track-end.html
+webkit.org/b/187064 webrtc/video.html
+webkit.org/b/187064 webrtc/multi-video.html
+webkit.org/b/187064 webrtc/video-addTrack.html
+webkit.org/b/187064 webrtc/video-rotation.html
+webkit.org/b/187064 webrtc/video-mute.html
+webkit.org/b/187064 webrtc/video-disabled-black.html
+webkit.org/b/187064 webrtc/video-replace-track.html
+webkit.org/b/187064 webrtc/video-with-data-channel.html
+webkit.org/b/187064 webrtc/video-replace-muted-track.html
+webkit.org/b/187064 webrtc/video-unmute.html
+webkit.org/b/187064 webrtc/datachannel/mdns-ice-candidates.html
+webkit.org/b/187064 webrtc/libwebrtc/descriptionGetters.html
+webkit.org/b/177533 webrtc/video-interruption.html
+
+webkit.org/b/186933 webrtc/peer-connection-remote-audio-mute2.html
+webkit.org/b/186933 webrtc/peer-connection-audio-unmute.html
+webkit.org/b/186933 webrtc/peer-connection-audio-mute2.html
+webkit.org/b/186933 webrtc/peer-connection-audio-mute.html
+webkit.org/b/186933 webrtc/peer-connection-remote-audio-mute.html
+webkit.org/b/186933 webrtc/clone-audio-track.html
+webkit.org/b/186933 webrtc/audio-replace-track.html
+webkit.org/b/186933 webrtc/audio-peer-connection-webaudio.html
+webkit.org/b/186933 webrtc/audio-muted-stats.html
+
 
 # The MediaStream implementation is still not completed
 webkit.org/b/79203 fast/mediastream/mock-media-source-webaudio.html [ Timeout ]
index c4a662f..550cf2a 100644 (file)
@@ -1391,6 +1391,7 @@ add_library(webrtc STATIC ${webrtc_SOURCES})
 target_compile_options(webrtc PRIVATE
     "$<$<COMPILE_LANGUAGE:CXX>:-std=gnu++11>"
     "-UHAVE_CONFIG_H"
+    "-DWEBRTC_WEBKIT_BUILD=1"
     "-w"
 )
 
index 03d88b6..61c731a 100644 (file)
@@ -1,3 +1,12 @@
+2018-07-24  Thibault Saunier  <tsaunier@igalia.com>
+
+        [WPE][GTK] Implement PeerConnection API on top of libwebrtc
+        https://bugs.webkit.org/show_bug.cgi?id=186932
+
+        Reviewed by Philippe Normand.
+
+        * CMakeLists.txt: Properly set our build as `WEBRTC_WEBKIT_BUILD`
+
 2018-07-19  Youenn Fablet  <youenn@apple.com>
 
         PlatformThread::Run does not need to log the fact that it is running
index 317b080..af618da 100644 (file)
@@ -1,3 +1,35 @@
+2018-07-24  Thibault Saunier  <tsaunier@igalia.com>
+
+        [WPE][GTK] Implement PeerConnection API on top of libwebrtc
+        https://bugs.webkit.org/show_bug.cgi?id=186932
+
+        Reviewed by Philippe Normand.
+
+        Enabled many webrtc tests.
+
+        * platform/GStreamer.cmake: Build new files
+        * platform/graphics/gstreamer/GStreamerCommon.cpp: Fix minor style issues
+        * platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp: Add a way to give a precise name to pipelines
+        and give useful names to pipelines whose sources come from a PeerConnection
+        * platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h: Ditto.
+        * platform/mediastream/gstreamer/GStreamerVideoFrameLibWebRTC.cpp: Added. Implement a subclass of webrtc::VideoFrameBuffer
+        to represent a kNative GStreamer video frame.
+        * platform/mediastream/gstreamer/GStreamerVideoFrameLibWebRTC.h: Added. Ditto.
+        * platform/mediastream/gstreamer/RealtimeIncomingAudioSourceLibWebRTC.cpp: Handle incoming audio samples from libwebrtc.
+        * platform/mediastream/gstreamer/RealtimeIncomingAudioSourceLibWebRTC.h: Ditto.
+        * platform/mediastream/gstreamer/RealtimeIncomingVideoSourceLibWebRTC.cpp: Handle incoming video frames from libwebrtc.
+        * platform/mediastream/gstreamer/RealtimeIncomingVideoSourceLibWebRTC.h: Ditto.
+        * platform/mediastream/gstreamer/RealtimeOutgoingAudioSourceLibWebRTC.cpp: Handle passing locally captured audio samples to libwebrtc.
+        * platform/mediastream/gstreamer/RealtimeOutgoingAudioSourceLibWebRTC.h: Ditto.
+        * platform/mediastream/gstreamer/RealtimeOutgoingVideoSourceLibWebRTC.cpp: Handle passing locally captured video frames to libwebrtc.
+        * platform/mediastream/libwebrtc/GStreamerVideoDecoderFactory.cpp: Added. Implement a video decoder factory and LibWebRTC Video decoders based on GStreamer.
+        * platform/mediastream/libwebrtc/GStreamerVideoDecoderFactory.h: Added. Ditto.
+        * platform/mediastream/libwebrtc/GStreamerVideoEncoderFactory.cpp: Added. Implement a video encoder factory and LibWebRTC H264/VP8 Video encoders based on GStreamer.
+        * platform/mediastream/libwebrtc/GStreamerVideoEncoderFactory.h: Added. Ditto.
+        * platform/mediastream/libwebrtc/LibWebRTCAudioFormat.h: Add information about the signedness of the LibWebRTC audio format.
+        * platform/mediastream/libwebrtc/LibWebRTCProviderGlib.cpp: Add support for newly added Encoder/Decoder factories.
+        * platform/mediastream/libwebrtc/LibWebRTCProviderGlib.h: Ditto.
+
 2018-07-24  Dirk Schulze  <krit@webkit.org>
 
         [css-masking] Black backdrop on -webkit-clip-path on SVG root
index 30df892..8ede175 100644 (file)
@@ -33,6 +33,8 @@ if (ENABLE_VIDEO OR ENABLE_WEB_AUDIO)
         platform/graphics/gstreamer/mse/SourceBufferPrivateGStreamer.cpp
         platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamer.cpp
 
+        platform/mediastream/libwebrtc/GStreamerVideoDecoderFactory.cpp
+        platform/mediastream/libwebrtc/GStreamerVideoEncoderFactory.cpp
         platform/mediastream/libwebrtc/LibWebRTCAudioModule.cpp
         platform/mediastream/libwebrtc/LibWebRTCProviderGlib.cpp
 
@@ -43,6 +45,7 @@ if (ENABLE_VIDEO OR ENABLE_WEB_AUDIO)
         platform/mediastream/gstreamer/GStreamerMediaStreamSource.cpp
         platform/mediastream/gstreamer/GStreamerVideoCaptureSource.cpp
         platform/mediastream/gstreamer/GStreamerVideoCapturer.cpp
+        platform/mediastream/gstreamer/GStreamerVideoFrameLibWebRTC.cpp
         platform/mediastream/gstreamer/MockGStreamerAudioCaptureSource.cpp
         platform/mediastream/gstreamer/MockGStreamerVideoCaptureSource.cpp
         platform/mediastream/gstreamer/RealtimeIncomingAudioSourceLibWebRTC.cpp
@@ -57,6 +60,7 @@ if (ENABLE_VIDEO OR ENABLE_WEB_AUDIO)
         ${GSTREAMER_BASE_INCLUDE_DIRS}
         ${GSTREAMER_APP_INCLUDE_DIRS}
         ${GSTREAMER_PBUTILS_INCLUDE_DIRS}
+        ${GSTREAMER_CODECPARSERS_INCLUDE_DIRS}
     )
 
     list(APPEND WebCore_LIBRARIES
@@ -65,6 +69,7 @@ if (ENABLE_VIDEO OR ENABLE_WEB_AUDIO)
         ${GSTREAMER_LIBRARIES}
         ${GSTREAMER_PBUTILS_LIBRARIES}
         ${GSTREAMER_AUDIO_LIBRARIES}
+        ${GSTREAMER_CODECPARSERS_LIBRARIES}
     )
 
     # Avoiding a GLib deprecation warning due to GStreamer API using deprecated classes.
@@ -102,6 +107,15 @@ if (ENABLE_VIDEO)
             platform/graphics/gstreamer/VideoTextureCopierGStreamer.cpp
         )
     endif ()
+
+    if (USE_LIBWEBRTC)
+        list(APPEND WebCore_SYSTEM_INCLUDE_DIRECTORIES
+            ${GSTREAMER_CODECPARSERS_INCLUDE_DIRS}
+        )
+        list(APPEND WebCore_LIBRARIES
+            ${GSTREAMER_CODECPARSERS_LIBRARIES}
+        )
+    endif ()
 endif ()
 
 if (ENABLE_WEB_AUDIO)
index d5475b7..f668e77 100644 (file)
@@ -354,13 +354,13 @@ static void simpleBusMessageCallback(GstBus*, GstMessage* message, GstBin* pipel
     }
 }
 
-void disconnectSimpleBusMessageCallback(GstElement *pipeline)
+void disconnectSimpleBusMessageCallback(GstElement* pipeline)
 {
     GRefPtr<GstBus> bus = adoptGRef(gst_pipeline_get_bus(GST_PIPELINE(pipeline)));
     g_signal_handlers_disconnect_by_func(bus.get(), reinterpret_cast<gpointer>(simpleBusMessageCallback), pipeline);
 }
 
-void connectSimpleBusMessageCallback(GstElement *pipeline)
+void connectSimpleBusMessageCallback(GstElement* pipeline)
 {
     GRefPtr<GstBus> bus = adoptGRef(gst_pipeline_get_bus(GST_PIPELINE(pipeline)));
     gst_bus_add_signal_watch_full(bus.get(), RunLoopSourcePriority::RunLoopDispatcher);
index fe7c230..c623501 100644 (file)
@@ -242,10 +242,11 @@ void MediaPlayerPrivateGStreamer::setPlaybinURL(const URL& url)
 
 void MediaPlayerPrivateGStreamer::load(const String& urlString)
 {
-    loadFull(urlString, nullptr);
+    loadFull(urlString, nullptr, String());
 }
 
-void MediaPlayerPrivateGStreamer::loadFull(const String& urlString, const gchar *playbinName)
+void MediaPlayerPrivateGStreamer::loadFull(const String& urlString, const gchar* playbinName,
+    const String& pipelineName)
 {
     // FIXME: This method is still called even if supportsType() returned
     // IsNotSupported. This would deserve more investigation but meanwhile make
@@ -263,7 +264,7 @@ void MediaPlayerPrivateGStreamer::loadFull(const String& urlString, const gchar
         return;
 
     if (!m_pipeline)
-        createGSTPlayBin(isMediaSource() ? "playbin" : playbinName);
+        createGSTPlayBin(isMediaSource() ? "playbin" : playbinName, pipelineName);
 
     if (m_fillTimer.isActive())
         m_fillTimer.stop();
@@ -305,7 +306,10 @@ void MediaPlayerPrivateGStreamer::load(MediaStreamPrivate& stream)
 {
 #if GST_CHECK_VERSION(1, 10, 0)
     m_streamPrivate = &stream;
-    loadFull(String("mediastream://") + stream.id(), "playbin3");
+    auto pipelineName = String::format("mediastream_%s_%p",
+        (stream.hasCaptureVideoSource() || stream.hasCaptureAudioSource()) ? "Local" : "Remote", this);
+
+    loadFull(String("mediastream://") + stream.id(), "playbin3", pipelineName);
 #if USE(GSTREAMER_GL)
     ensureGLVideoSinkContext();
 #endif
@@ -2468,7 +2472,7 @@ AudioSourceProvider* MediaPlayerPrivateGStreamer::audioSourceProvider()
 }
 #endif
 
-void MediaPlayerPrivateGStreamer::createGSTPlayBin(const gchar* playbinName)
+void MediaPlayerPrivateGStreamer::createGSTPlayBin(const gchar* playbinName, const String& pipelineName)
 {
     if (m_pipeline) {
         if (!playbinName) {
@@ -2503,7 +2507,8 @@ void MediaPlayerPrivateGStreamer::createGSTPlayBin(const gchar* playbinName)
 
     // gst_element_factory_make() returns a floating reference so
     // we should not adopt.
-    setPipeline(gst_element_factory_make(playbinName, String::format("play_%p", this).utf8().data()));
+    setPipeline(gst_element_factory_make(playbinName,
+        pipelineName.isEmpty() ? String::format("play_%p", this).utf8().data() : pipelineName.utf8().data()));
     setStreamVolumeElement(GST_STREAM_VOLUME(m_pipeline.get()));
 
     GST_INFO("Using legacy playbin element: %s", boolForPrinting(m_isLegacyPlaybin));
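The hunks above give mediastream-backed pipelines descriptive names instead of the generic `play_%p`. A hedged re-creation of that naming scheme (a hypothetical standalone function; WebKit itself builds the name with WTF's String::format):

```cpp
#include <cassert>
#include <cstdio>
#include <string>

// Mirrors the "mediastream_%s_%p" pattern from loadFull(): "Local" when the
// stream has a capture source on this machine, "Remote" when its tracks come
// from a PeerConnection, plus the player address to keep names unique.
std::string pipelineName(bool hasLocalCaptureSource, const void* player)
{
    char name[64];
    std::snprintf(name, sizeof(name), "mediastream_%s_%p",
        hasLocalCaptureSource ? "Local" : "Remote", player);
    return name;
}
```

Such names make it much easier to tell pipelines apart in GStreamer debug logs and `GST_DEBUG_BIN_TO_DOT_FILE` dumps.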
index ad67d24..f0ae0b9 100644 (file)
@@ -146,7 +146,7 @@ private:
     virtual void updateStates();
     virtual void asyncStateChangeDone();
 
-    void createGSTPlayBin(const gchar* playbinName);
+    void createGSTPlayBin(const gchar* playbinName, const String& pipelineName);
 
     bool loadNextLocation();
     void mediaLocationChanged(GstMessage*);
@@ -177,7 +177,7 @@ private:
     static void downloadBufferFileCreatedCallback(MediaPlayerPrivateGStreamer*);
 
     void setPlaybinURL(const URL& urlString);
-    void loadFull(const String& url, const gchar *playbinName);
+    void loadFull(const String& url, const gchar* playbinName, const String& pipelineName);
 
 #if GST_CHECK_VERSION(1, 10, 0)
     void updateTracks();
index 1ff59b2..861adef 100644 (file)
@@ -1,7 +1,6 @@
 /*
  * Copyright (C) 2018 Metrological Group B.V.
- * Author: Thibault Saunier <tsaunier@igalia.com>
- * Author: Alejandro G. Castro <alex@igalia.com>
+ * Copyright (C) 2018 Igalia S.L. All rights reserved.
  *
  * This library is free software; you can redistribute it and/or
  * modify it under the terms of the GNU Library General Public
@@ -24,6 +23,7 @@
 #if ENABLE(MEDIA_STREAM) && USE(LIBWEBRTC) && USE(GSTREAMER)
 
 #include "AudioStreamDescription.h"
+#include "GStreamerCommon.h"
 #include <gst/audio/audio.h>
 
 namespace WebCore {
index fb3484a..9536076 100644 (file)
@@ -45,7 +45,7 @@ static void webkitMediaStreamSrcTrackEnded(WebKitMediaStreamSrc* self, MediaStre
 static GstStaticPadTemplate videoSrcTemplate = GST_STATIC_PAD_TEMPLATE("video_src",
     GST_PAD_SRC,
     GST_PAD_SOMETIMES,
-    GST_STATIC_CAPS("video/x-raw"));
+    GST_STATIC_CAPS("video/x-raw;video/x-h264;video/x-vp8"));
 
 static GstStaticPadTemplate audioSrcTemplate = GST_STATIC_PAD_TEMPLATE("audio_src",
     GST_PAD_SRC,
diff --git a/Source/WebCore/platform/mediastream/gstreamer/GStreamerVideoFrameLibWebRTC.cpp b/Source/WebCore/platform/mediastream/gstreamer/GStreamerVideoFrameLibWebRTC.cpp
new file mode 100644 (file)
index 0000000..57c48d2
--- /dev/null
@@ -0,0 +1,127 @@
+/*
+ *  Copyright (C) 2012, 2015, 2016, 2018 Igalia S.L
+ *  Copyright (C) 2015, 2016, 2018 Metrological Group B.V.
+ *
+ *  This library is free software; you can redistribute it and/or
+ *  modify it under the terms of the GNU Lesser General Public
+ *  License as published by the Free Software Foundation; either
+ *  version 2 of the License, or (at your option) any later version.
+ *
+ *  This library is distributed in the hope that it will be useful,
+ *  but WITHOUT ANY WARRANTY; without even the implied warranty of
+ *  MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
+ *  Lesser General Public License for more details.
+ *
+ *  You should have received a copy of the GNU Lesser General Public
+ *  License along with this library; if not, write to the Free Software
+ *  Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA  02110-1301  USA
+ */
+#include "config.h"
+
+#if USE(GSTREAMER) && USE(LIBWEBRTC)
+#include "GStreamerVideoFrameLibWebRTC.h"
+
+namespace WebCore {
+
+const GRefPtr<GstSample> GStreamerSampleFromLibWebRTCVideoFrame(const webrtc::VideoFrame& frame)
+{
+    if (frame.video_frame_buffer()->type() == webrtc::VideoFrameBuffer::Type::kNative) {
+        auto framebuffer = static_cast<GStreamerVideoFrameLibWebRTC*>(frame.video_frame_buffer().get());
+        auto gstsample = framebuffer->getSample();
+
+        GST_LOG("Reusing native GStreamer sample: %p", gstsample.get());
+
+        return gstsample;
+    }
+
+    auto webrtcbuffer = frame.video_frame_buffer().get()->ToI420();
+    // FIXME - Check lifetime of those buffers.
+    const uint8_t* comps[3] = {
+        webrtcbuffer->DataY(),
+        webrtcbuffer->DataU(),
+        webrtcbuffer->DataV()
+    };
+
+    GstVideoInfo info;
+    gst_video_info_set_format(&info, GST_VIDEO_FORMAT_I420, frame.width(), frame.height());
+    auto buffer = adoptGRef(gst_buffer_new());
+    for (gint i = 0; i < 3; i++) {
+        gsize compsize = GST_VIDEO_INFO_COMP_STRIDE(&info, i) * GST_VIDEO_INFO_COMP_HEIGHT(&info, i);
+
+        GstMemory* comp = gst_memory_new_wrapped(
+            static_cast<GstMemoryFlags>(GST_MEMORY_FLAG_PHYSICALLY_CONTIGUOUS | GST_MEMORY_FLAG_READONLY),
+            const_cast<gpointer>(reinterpret_cast<const void*>(comps[i])), compsize, 0, compsize, webrtcbuffer, nullptr);
+        gst_buffer_append_memory(buffer.get(), comp);
+    }
+
+    auto caps = adoptGRef(gst_video_info_to_caps(&info));
+    auto sample = adoptGRef(gst_sample_new(buffer.get(), caps.get(), nullptr, nullptr));
+    return WTFMove(sample);
+}
+
+rtc::scoped_refptr<webrtc::VideoFrameBuffer> GStreamerVideoFrameLibWebRTC::create(GstSample * sample)
+{
+    GstVideoInfo info;
+
+    if (!gst_video_info_from_caps(&info, gst_sample_get_caps(sample)))
+        ASSERT_NOT_REACHED();
+
+    return rtc::scoped_refptr<webrtc::VideoFrameBuffer>(new GStreamerVideoFrameLibWebRTC(sample, info));
+}
+
+std::unique_ptr<webrtc::VideoFrame> LibWebRTCVideoFrameFromGStreamerSample(GstSample* sample, webrtc::VideoRotation rotation)
+{
+    auto frameBuffer(GStreamerVideoFrameLibWebRTC::create(sample));
+
+    auto buffer = gst_sample_get_buffer(sample);
+    return std::unique_ptr<webrtc::VideoFrame>(new webrtc::VideoFrame(frameBuffer, GST_BUFFER_DTS(buffer), GST_BUFFER_PTS(buffer), rotation));
+}
+
+webrtc::VideoFrameBuffer::Type GStreamerVideoFrameLibWebRTC::type() const
+{
+    return Type::kNative;
+}
+
+GRefPtr<GstSample> GStreamerVideoFrameLibWebRTC::getSample()
+{
+    return m_sample.get();
+}
+
+rtc::scoped_refptr<webrtc::I420BufferInterface> GStreamerVideoFrameLibWebRTC::ToI420()
+{
+    GstVideoInfo info;
+    GstVideoFrame frame;
+
+    if (!gst_video_info_from_caps(&info, gst_sample_get_caps(m_sample.get())))
+        ASSERT_NOT_REACHED();
+
+    if (GST_VIDEO_INFO_FORMAT(&info) != GST_VIDEO_FORMAT_I420)
+        return nullptr;
+
+    gst_video_frame_map(&frame, &info, gst_sample_get_buffer(m_sample.get()), GST_MAP_READ);
+
+    auto newBuffer = m_bufferPool.CreateBuffer(GST_VIDEO_FRAME_WIDTH(&frame),
+        GST_VIDEO_FRAME_HEIGHT(&frame));
+
+    ASSERT(newBuffer);
+    if (!newBuffer) {
+        gst_video_frame_unmap(&frame);
+        GST_WARNING("RealtimeOutgoingVideoSourceGStreamer::videoSampleAvailable unable to allocate buffer for conversion to YUV");
+        return nullptr;
+    }
+
+    newBuffer->Copy(
+        GST_VIDEO_FRAME_WIDTH(&frame),
+        GST_VIDEO_FRAME_HEIGHT(&frame),
+        GST_VIDEO_FRAME_COMP_DATA(&frame, 0),
+        GST_VIDEO_FRAME_COMP_STRIDE(&frame, 0),
+        GST_VIDEO_FRAME_COMP_DATA(&frame, 1),
+        GST_VIDEO_FRAME_COMP_STRIDE(&frame, 1),
+        GST_VIDEO_FRAME_COMP_DATA(&frame, 2),
+        GST_VIDEO_FRAME_COMP_STRIDE(&frame, 2));
+    gst_video_frame_unmap(&frame);
+
+    return newBuffer;
+}
+}
+#endif // USE(LIBWEBRTC)
diff --git a/Source/WebCore/platform/mediastream/gstreamer/GStreamerVideoFrameLibWebRTC.h b/Source/WebCore/platform/mediastream/gstreamer/GStreamerVideoFrameLibWebRTC.h
new file mode 100644 (file)
index 0000000..53ada42
--- /dev/null
@@ -0,0 +1,60 @@
+/*
+ * Copyright (C) 2018 Metrological Group B.V.
+ * Copyright (C) 2018 Igalia S.L. All rights reserved.
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public License
+ * along with this library; see the file COPYING.LIB.  If not, write to
+ * the Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#pragma once
+
+#if USE(GSTREAMER) && USE(LIBWEBRTC)
+#include "GStreamerCommon.h"
+#include "LibWebRTCMacros.h"
+#include "webrtc/api/video/i420_buffer.h"
+#include "webrtc/api/video/video_frame.h"
+#include "webrtc/common_video/include/i420_buffer_pool.h"
+#include "webrtc/common_video/include/video_frame_buffer.h"
+#include "webrtc/rtc_base/refcountedobject.h"
+
+namespace WebCore {
+
+const GRefPtr<GstSample> GStreamerSampleFromLibWebRTCVideoFrame(const webrtc::VideoFrame&);
+std::unique_ptr<webrtc::VideoFrame> LibWebRTCVideoFrameFromGStreamerSample(GstSample*, webrtc::VideoRotation);
+
+class GStreamerVideoFrameLibWebRTC : public rtc::RefCountedObject<webrtc::VideoFrameBuffer> {
+public:
+    GStreamerVideoFrameLibWebRTC(GstSample* sample, GstVideoInfo info)
+        : m_sample(adoptGRef(sample))
+        , m_info(info) { }
+
+    static rtc::scoped_refptr<webrtc::VideoFrameBuffer> create(GstSample*);
+
+    GRefPtr<GstSample> getSample();
+    rtc::scoped_refptr<webrtc::I420BufferInterface> ToI420() final;
+
+    int width() const override { return GST_VIDEO_INFO_WIDTH(&m_info); }
+    int height() const override { return GST_VIDEO_INFO_HEIGHT(&m_info); }
+
+private:
+    webrtc::VideoFrameBuffer::Type type() const override;
+
+    GRefPtr<GstSample> m_sample;
+    GstVideoInfo m_info;
+    webrtc::I420BufferPool m_bufferPool;
+};
+}
+
+#endif // USE(GSTREAMER) && USE(LIBWEBRTC)
index 9b9baf9..541829e 100644 (file)
 #if USE(LIBWEBRTC) && USE(GSTREAMER)
 #include "RealtimeIncomingAudioSourceLibWebRTC.h"
 
+#include "LibWebRTCAudioFormat.h"
+#include "gstreamer/GStreamerAudioData.h"
+#include "gstreamer/GStreamerAudioStreamDescription.h"
+
 namespace WebCore {
 
 Ref<RealtimeIncomingAudioSource> RealtimeIncomingAudioSource::create(rtc::scoped_refptr<webrtc::AudioTrackInterface>&& audioTrack, String&& audioTrackId)
@@ -49,10 +53,29 @@ RealtimeIncomingAudioSourceLibWebRTC::RealtimeIncomingAudioSourceLibWebRTC(rtc::
 {
 }
 
-void RealtimeIncomingAudioSourceLibWebRTC::OnData(const void*, int, int, size_t, size_t)
+void RealtimeIncomingAudioSourceLibWebRTC::OnData(const void* audioData, int, int sampleRate, size_t numberOfChannels, size_t numberOfFrames)
 {
-}
+    GstAudioInfo info;
+    GstAudioFormat format = gst_audio_format_build_integer(
+        LibWebRTCAudioFormat::isSigned,
+        LibWebRTCAudioFormat::isBigEndian ? G_BIG_ENDIAN : G_LITTLE_ENDIAN,
+        LibWebRTCAudioFormat::sampleSize,
+        LibWebRTCAudioFormat::sampleSize);
+
+    gst_audio_info_set_format(&info, format, sampleRate, numberOfChannels, NULL);
 
+    auto buffer = adoptGRef(gst_buffer_new_wrapped(
+        g_memdup(audioData, GST_AUDIO_INFO_BPF(&info) * numberOfFrames),
+        GST_AUDIO_INFO_BPF(&info) * numberOfFrames));
+    GRefPtr<GstCaps> caps = adoptGRef(gst_audio_info_to_caps(&info));
+    auto sample = adoptGRef(gst_sample_new(buffer.get(), caps.get(), nullptr, nullptr));
+    auto data(std::unique_ptr<GStreamerAudioData>(new GStreamerAudioData(WTFMove(sample), info)));
+
+    auto mediaTime = MediaTime((m_numberOfFrames * G_USEC_PER_SEC) / sampleRate, G_USEC_PER_SEC);
+    audioSamplesAvailable(mediaTime, *data.get(), GStreamerAudioStreamDescription(info), numberOfFrames);
+
+    m_numberOfFrames += numberOfFrames;
+}
 }
 
 #endif // USE(LIBWEBRTC)
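The OnData() implementation above derives each chunk's presentation time from a running frame counter (`m_numberOfFrames`) converted to microseconds at the stream's sample rate, G_USEC_PER_SEC being GLib's 1000000. A hedged, self-contained sketch of that bookkeeping (illustrative names, not WebKit code):

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>

constexpr std::int64_t kUsecPerSec = 1000000; // G_USEC_PER_SEC

// Models the timestamping in RealtimeIncomingAudioSourceLibWebRTC::OnData:
// the time of a chunk's first frame is framesSeen / sampleRate seconds,
// expressed in microseconds, and the counter advances afterwards.
struct IncomingAudioClock {
    std::uint64_t framesSeen = 0;

    std::int64_t onChunk(std::size_t numberOfFrames, int sampleRate)
    {
        std::int64_t usec = static_cast<std::int64_t>((framesSeen * kUsecPerSec) / sampleRate);
        framesSeen += numberOfFrames;
        return usec;
    }
};
```

At 48 kHz with the typical 10 ms (480-frame) chunks libwebrtc delivers, successive chunks land at 0 µs, 10000 µs, 20000 µs, and so on.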
index 9d4da03..5733c52 100644 (file)
@@ -42,6 +42,8 @@ private:
 
     // webrtc::AudioTrackSinkInterface API
     void OnData(const void* audioData, int bitsPerSample, int sampleRate, size_t numberOfChannels, size_t numberOfFrames) final;
+
+    uint64_t m_numberOfFrames { 0 };
 };
 
 } // namespace WebCore
index 72781f6..b123c75 100644 (file)
 #if USE(LIBWEBRTC) && USE(GSTREAMER)
 #include "RealtimeIncomingVideoSourceLibWebRTC.h"
 
+#include "GStreamerVideoFrameLibWebRTC.h"
+#include "MediaSampleGStreamer.h"
+#include <gst/video/video.h>
+
 namespace WebCore {
 
 Ref<RealtimeIncomingVideoSource> RealtimeIncomingVideoSource::create(rtc::scoped_refptr<webrtc::VideoTrackInterface>&& videoTrack, String&& trackId)
@@ -50,8 +54,16 @@ RealtimeIncomingVideoSourceLibWebRTC::RealtimeIncomingVideoSourceLibWebRTC(rtc::
 {
 }
 
-void RealtimeIncomingVideoSourceLibWebRTC::OnFrame(const webrtc::VideoFrame&)
+void RealtimeIncomingVideoSourceLibWebRTC::OnFrame(const webrtc::VideoFrame& frame)
 {
+    if (!isProducingData())
+        return;
+
+    auto sample = GStreamerSampleFromLibWebRTCVideoFrame(frame);
+    callOnMainThread([protectedThis = makeRef(*this), sample] {
+        protectedThis->videoSampleAvailable(MediaSampleGStreamer::create(sample.get(),
+            WebCore::FloatSize(), String()));
+    });
 }
 
 } // namespace WebCore
index 7cc8db1..c6bbc53 100644 (file)
@@ -30,6 +30,7 @@
 
 #if USE(LIBWEBRTC) && USE(GSTREAMER)
 
+#include "GStreamerCommon.h"
 #include "RealtimeIncomingVideoSource.h"
 
 namespace WebCore {
@@ -40,9 +41,12 @@ public:
 
 private:
     RealtimeIncomingVideoSourceLibWebRTC(rtc::scoped_refptr<webrtc::VideoTrackInterface>&&, String&&);
+    ~RealtimeIncomingVideoSourceLibWebRTC() { }
 
     // rtc::VideoSinkInterface
     void OnFrame(const webrtc::VideoFrame&) final;
+    void setCapsFromSettings();
+    GRefPtr<GstCaps> m_caps;
 };
 
 } // namespace WebCore
index 3d85ea0..1115d06 100644 (file)
 #if USE(LIBWEBRTC) && USE(GSTREAMER)
 #include "RealtimeOutgoingAudioSourceLibWebRTC.h"
 
+#include "LibWebRTCAudioFormat.h"
+#include "LibWebRTCProvider.h"
+#include "NotImplemented.h"
+#include "gstreamer/GStreamerAudioData.h"
+
 namespace WebCore {
 
 RealtimeOutgoingAudioSourceLibWebRTC::RealtimeOutgoingAudioSourceLibWebRTC(Ref<MediaStreamTrackPrivate>&& audioSource)
     : RealtimeOutgoingAudioSource(WTFMove(audioSource))
 {
+    m_adapter = adoptGRef(gst_adapter_new());
+    m_sampleConverter = nullptr;
 }
 
 RealtimeOutgoingAudioSourceLibWebRTC::~RealtimeOutgoingAudioSourceLibWebRTC()
 {
+    if (m_sampleConverter)
+        g_clear_pointer(&m_sampleConverter, gst_audio_converter_free);
 }
 
 Ref<RealtimeOutgoingAudioSource> RealtimeOutgoingAudioSource::create(Ref<MediaStreamTrackPrivate>&& audioSource)
@@ -38,27 +47,116 @@ Ref<RealtimeOutgoingAudioSource> RealtimeOutgoingAudioSource::create(Ref<MediaSt
     return RealtimeOutgoingAudioSourceLibWebRTC::create(WTFMove(audioSource));
 }
 
-void RealtimeOutgoingAudioSourceLibWebRTC::audioSamplesAvailable(const MediaTime&, const PlatformAudioData&, const AudioStreamDescription&,
+static inline std::unique_ptr<GStreamerAudioStreamDescription> libwebrtcAudioFormat(int sampleRate,
+    size_t channelCount)
+{
+    GstAudioFormat format = gst_audio_format_build_integer(
+        LibWebRTCAudioFormat::isSigned,
+        LibWebRTCAudioFormat::isBigEndian ? G_BIG_ENDIAN : G_LITTLE_ENDIAN,
+        LibWebRTCAudioFormat::sampleSize,
+        LibWebRTCAudioFormat::sampleSize);
+
+    GstAudioInfo info;
+
+    size_t libWebRTCChannelCount = channelCount >= 2 ? 2 : channelCount;
+    gst_audio_info_set_format(&info, format, sampleRate, libWebRTCChannelCount, nullptr);
+
+    return std::make_unique<GStreamerAudioStreamDescription>(info);
+}
+
+void RealtimeOutgoingAudioSourceLibWebRTC::audioSamplesAvailable(const MediaTime&,
+    const PlatformAudioData& audioData, const AudioStreamDescription& streamDescription,
     size_t /* sampleCount */)
 {
+    auto& data = static_cast<const GStreamerAudioData&>(audioData);
+    auto& desc = static_cast<const GStreamerAudioStreamDescription&>(streamDescription);
+
+    if (m_sampleConverter && !gst_audio_info_is_equal(m_inputStreamDescription->getInfo(), desc.getInfo())) {
+        GST_ERROR_OBJECT(this, "FIXME - Audio format renegotiation is not possible yet!");
+        g_clear_pointer(&m_sampleConverter, gst_audio_converter_free);
+    }
+
+    if (!m_sampleConverter) {
+        m_inputStreamDescription = std::make_unique<GStreamerAudioStreamDescription>(desc.getInfo());
+        m_outputStreamDescription = libwebrtcAudioFormat(LibWebRTCAudioFormat::sampleRate, streamDescription.numberOfChannels());
+
+        m_sampleConverter = gst_audio_converter_new(GST_AUDIO_CONVERTER_FLAG_IN_WRITABLE,
+            m_inputStreamDescription->getInfo(),
+            m_outputStreamDescription->getInfo(),
+            nullptr);
+    }
+
+    LockHolder locker(m_adapterMutex);
+    auto buffer = gst_sample_get_buffer(data.getSample());
+    gst_adapter_push(m_adapter.get(), gst_buffer_ref(buffer));
+    LibWebRTCProvider::callOnWebRTCSignalingThread([protectedThis = makeRef(*this)] {
+        protectedThis->pullAudioData();
+    });
 }
 
 void RealtimeOutgoingAudioSourceLibWebRTC::pullAudioData()
 {
+    if (!m_inputStreamDescription || !m_outputStreamDescription) {
+        GST_INFO("No stream description set yet.");
+
+        return;
+    }
+
+    size_t outChunkSampleCount = LibWebRTCAudioFormat::chunkSampleCount;
+    size_t outBufferSize = outChunkSampleCount * m_outputStreamDescription->getInfo()->bpf;
+
+    LockHolder locker(m_adapterMutex);
+    size_t inChunkSampleCount = gst_audio_converter_get_in_frames(m_sampleConverter, outChunkSampleCount);
+    size_t inBufferSize = inChunkSampleCount * m_inputStreamDescription->getInfo()->bpf;
+
+    auto available = gst_adapter_available(m_adapter.get());
+    if (inBufferSize > available) {
+        GST_DEBUG("Not enough data: wanted: %zu > %zu available",
+            inBufferSize, available);
+
+        return;
+    }
+
+    auto inbuf = adoptGRef(gst_adapter_take_buffer(m_adapter.get(), inBufferSize));
+    GstMapInfo inmap;
+    gst_buffer_map(inbuf.get(), &inmap, static_cast<GstMapFlags>(GST_MAP_READ));
+
+    GstMapInfo outmap;
+    auto outbuf = adoptGRef(gst_buffer_new_allocate(nullptr, outBufferSize, 0));
+    gst_buffer_map(outbuf.get(), &outmap, static_cast<GstMapFlags>(GST_MAP_WRITE));
+
+    gpointer in[1] = { inmap.data };
+    gpointer out[1] = { outmap.data };
+    if (gst_audio_converter_samples(m_sampleConverter, static_cast<GstAudioConverterFlags>(0), in, inChunkSampleCount, out, outChunkSampleCount)) {
+        for (auto sink : m_sinks) {
+            sink->OnData(outmap.data,
+                LibWebRTCAudioFormat::sampleSize,
+                static_cast<int>(m_outputStreamDescription->sampleRate()),
+                static_cast<int>(m_outputStreamDescription->numberOfChannels()),
+                outChunkSampleCount);
+        }
+    } else
+        GST_ERROR("Could not convert samples.");
+
+    gst_buffer_unmap(inbuf.get(), &inmap);
+    gst_buffer_unmap(outbuf.get(), &outmap);
 }
 
 bool RealtimeOutgoingAudioSourceLibWebRTC::isReachingBufferedAudioDataHighLimit()
 {
+    notImplemented();
     return false;
 }
 
 bool RealtimeOutgoingAudioSourceLibWebRTC::isReachingBufferedAudioDataLowLimit()
 {
+    notImplemented();
     return false;
 }
 
 bool RealtimeOutgoingAudioSourceLibWebRTC::hasBufferedEnoughData()
 {
+    notImplemented();
     return false;
 }
 
index ee56d52..53d15cb 100644 (file)
 
 #if USE(LIBWEBRTC)
 
+#include "GStreamerAudioStreamDescription.h"
+#include "GStreamerCommon.h"
 #include "RealtimeOutgoingAudioSource.h"
 
+#include <gst/audio/audio.h>
+
 namespace WebCore {
 
 class RealtimeOutgoingAudioSourceLibWebRTC final : public RealtimeOutgoingAudioSource {
@@ -43,6 +47,13 @@ private:
     bool hasBufferedEnoughData() final;
 
     void pullAudioData() final;
+
+    GstAudioConverter* m_sampleConverter;
+    std::unique_ptr<GStreamerAudioStreamDescription> m_inputStreamDescription;
+    std::unique_ptr<GStreamerAudioStreamDescription> m_outputStreamDescription;
+
+    Lock m_adapterMutex;
+    GRefPtr<GstAdapter> m_adapter;
 };
 
 } // namespace WebCore
index 81ae89f..9131422 100644 (file)
@@ -30,6 +30,9 @@
 #if USE(LIBWEBRTC) && USE(GSTREAMER)
 #include "RealtimeOutgoingVideoSourceLibWebRTC.h"
 
+#include "GStreamerVideoFrameLibWebRTC.h"
+#include "MediaSampleGStreamer.h"
+
 namespace WebCore {
 
 Ref<RealtimeOutgoingVideoSource> RealtimeOutgoingVideoSource::create(Ref<MediaStreamTrackPrivate>&& videoSource)
@@ -47,8 +50,34 @@ RealtimeOutgoingVideoSourceLibWebRTC::RealtimeOutgoingVideoSourceLibWebRTC(Ref<M
 {
 }
 
-void RealtimeOutgoingVideoSourceLibWebRTC::sampleBufferUpdated(MediaStreamTrackPrivate&, MediaSample&)
+void RealtimeOutgoingVideoSourceLibWebRTC::sampleBufferUpdated(MediaStreamTrackPrivate&, MediaSample& sample)
 {
+    if (!m_sinks.size())
+        return;
+
+    if (m_muted || !m_enabled)
+        return;
+
+    switch (sample.videoRotation()) {
+    case MediaSample::VideoRotation::None:
+        m_currentRotation = webrtc::kVideoRotation_0;
+        break;
+    case MediaSample::VideoRotation::UpsideDown:
+        m_currentRotation = webrtc::kVideoRotation_180;
+        break;
+    case MediaSample::VideoRotation::Right:
+        m_currentRotation = webrtc::kVideoRotation_90;
+        break;
+    case MediaSample::VideoRotation::Left:
+        m_currentRotation = webrtc::kVideoRotation_270;
+        break;
+    }
+
+    ASSERT(sample.platformSample().type == PlatformSample::GStreamerSampleType);
+    auto& mediaSample = static_cast<MediaSampleGStreamer&>(sample);
+    auto frameBuffer(GStreamerVideoFrameLibWebRTC::create(gst_sample_ref(mediaSample.platformSample().sample.gstSample)));
+
+    sendFrame(WTFMove(frameBuffer));
 }
 
 } // namespace WebCore
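The `switch` in `sampleBufferUpdated()` above maps the media sample's rotation onto libwebrtc's 90°-step rotation enum. A minimal sketch of that mapping with hypothetical stand-in enums (the real code uses `MediaSample::VideoRotation` and `webrtc::VideoRotation`):

```cpp
// Illustrative stand-ins for MediaSample::VideoRotation and webrtc::VideoRotation.
enum class SampleRotation { None, UpsideDown, Right, Left };
enum class RtcRotation { Deg0 = 0, Deg90 = 90, Deg180 = 180, Deg270 = 270 };

// "Right" means the frame must be rotated 90° clockwise to display upright,
// matching webrtc::kVideoRotation_90 in the patch above.
RtcRotation toRtcRotation(SampleRotation rotation)
{
    switch (rotation) {
    case SampleRotation::UpsideDown:
        return RtcRotation::Deg180;
    case SampleRotation::Right:
        return RtcRotation::Deg90;
    case SampleRotation::Left:
        return RtcRotation::Deg270;
    case SampleRotation::None:
    default:
        return RtcRotation::Deg0;
    }
}
```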
diff --git a/Source/WebCore/platform/mediastream/libwebrtc/GStreamerVideoDecoderFactory.cpp b/Source/WebCore/platform/mediastream/libwebrtc/GStreamerVideoDecoderFactory.cpp
new file mode 100644 (file)
index 0000000..0ec2f75
--- /dev/null
@@ -0,0 +1,318 @@
+/*
+ * Copyright (C) 2018 Metrological Group B.V.
+ * Copyright (C) 2018 Igalia S.L. All rights reserved.
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public License
+ * along with this library; see the file COPYING.LIB.  If not, write to
+ * the Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#include "config.h"
+
+#if ENABLE(VIDEO) && ENABLE(MEDIA_STREAM) && USE(LIBWEBRTC) && USE(GSTREAMER)
+#include "GStreamerVideoDecoderFactory.h"
+
+#include "GStreamerVideoFrameLibWebRTC.h"
+#include "webrtc/common_video/h264/h264_common.h"
+#include "webrtc/common_video/h264/profile_level_id.h"
+#include "webrtc/media/base/codec.h"
+#include "webrtc/modules/video_coding/codecs/h264/include/h264.h"
+#include "webrtc/modules/video_coding/codecs/vp8/include/vp8.h"
+#include "webrtc/modules/video_coding/include/video_codec_interface.h"
+#include <gst/app/gstappsink.h>
+#include <gst/app/gstappsrc.h>
+#include <gst/video/video.h>
+#include <mutex>
+#include <wtf/glib/RunLoopSourcePriority.h>
+#include <wtf/text/WTFString.h>
+
+GST_DEBUG_CATEGORY(webkit_webrtcdec_debug);
+#define GST_CAT_DEFAULT webkit_webrtcdec_debug
+
+namespace WebCore {
+
+class GStreamerVideoDecoder : public webrtc::VideoDecoder {
+public:
+    GStreamerVideoDecoder()
+        : m_pictureId(0)
+        , m_firstBufferPts(GST_CLOCK_TIME_NONE)
+        , m_firstBufferDts(GST_CLOCK_TIME_NONE)
+    {
+    }
+
+    static void decodebinPadAddedCb(GstElement*,
+        GstPad* srcpad,
+        GstPad* sinkpad)
+    {
+        GST_INFO_OBJECT(srcpad, "connecting pad with %" GST_PTR_FORMAT, sinkpad);
+        if (gst_pad_link(srcpad, sinkpad) != GST_PAD_LINK_OK)
+            ASSERT_NOT_REACHED();
+    }
+
+    GstElement* pipeline()
+    {
+        return m_pipeline.get();
+    }
+
+    GstElement* makeElement(const gchar* factoryName)
+    {
+        GUniquePtr<char> name(g_strdup_printf("%s_dec_%s_%p", Name(), factoryName, this));
+
+        return gst_element_factory_make(factoryName, name.get());
+    }
+
+    int32_t InitDecode(const webrtc::VideoCodec*, int32_t)
+    {
+        m_src = makeElement("appsrc");
+
+        auto capsfilter = CreateFilter();
+        auto decoder = makeElement("decodebin");
+
+        // Make the decoder output "parsed" frames only and let the main decodebin
+        // do the real decoding. This allows us to have optimized decoding/rendering
+        // happening in the main pipeline.
+        g_object_set(decoder, "caps", adoptGRef(gst_caps_from_string(Caps())).get(), nullptr);
+        auto sinkpad = gst_element_get_static_pad(capsfilter, "sink");
+        g_signal_connect(decoder, "pad-added", G_CALLBACK(decodebinPadAddedCb), sinkpad);
+
+        m_pipeline = makeElement("pipeline");
+        connectSimpleBusMessageCallback(m_pipeline.get());
+
+        auto sink = makeElement("appsink");
+        gst_app_sink_set_emit_signals(GST_APP_SINK(sink), true);
+        g_signal_connect(sink, "new-sample", G_CALLBACK(newSampleCallbackTramp), this);
+        // This is a decoder; everything should happen as fast as possible and not
+        // be synced on the clock.
+        g_object_set(sink, "sync", false, nullptr);
+
+        gst_bin_add_many(GST_BIN(pipeline()), m_src, decoder, capsfilter, sink, nullptr);
+        if (!gst_element_link(m_src, decoder)) {
+            GST_ERROR_OBJECT(pipeline(), "Could not link src to decoder.");
+            return WEBRTC_VIDEO_CODEC_ERROR;
+        }
+
+        if (!gst_element_link(capsfilter, sink)) {
+            GST_ERROR_OBJECT(pipeline(), "Could not link capsfilter to sink.");
+            return WEBRTC_VIDEO_CODEC_ERROR;
+        }
+
+        if (gst_element_set_state(pipeline(), GST_STATE_PLAYING) == GST_STATE_CHANGE_FAILURE) {
+            GST_ERROR_OBJECT(pipeline(), "Could not set state to PLAYING.");
+            return WEBRTC_VIDEO_CODEC_ERROR;
+        }
+
+        return WEBRTC_VIDEO_CODEC_OK;
+    }
+
+    int32_t RegisterDecodeCompleteCallback(webrtc::DecodedImageCallback* callback)
+    {
+        m_imageReadyCb = callback;
+
+        return WEBRTC_VIDEO_CODEC_OK;
+    }
+
+    virtual GstElement* CreateFilter()
+    {
+        return makeElement("identity");
+    }
+
+    int32_t Release() final
+    {
+        if (m_pipeline.get()) {
+            GRefPtr<GstBus> bus = adoptGRef(gst_pipeline_get_bus(GST_PIPELINE(m_pipeline.get())));
+            gst_bus_set_sync_handler(bus.get(), nullptr, nullptr, nullptr);
+
+            gst_element_set_state(m_pipeline.get(), GST_STATE_NULL);
+            m_src = nullptr;
+            m_pipeline = nullptr;
+        }
+
+        return WEBRTC_VIDEO_CODEC_OK;
+    }
+
+    int32_t Decode(const webrtc::EncodedImage& inputImage,
+        bool,
+        const webrtc::RTPFragmentationHeader*,
+        const webrtc::CodecSpecificInfo*,
+        int64_t renderTimeMs) override
+    {
+        if (!m_src) {
+            GST_ERROR("No source set, can't decode.");
+
+            return WEBRTC_VIDEO_CODEC_UNINITIALIZED;
+        }
+
+        if (!GST_CLOCK_TIME_IS_VALID(m_firstBufferPts)) {
+            GRefPtr<GstPad> srcpad = adoptGRef(gst_element_get_static_pad(m_src, "src"));
+            m_firstBufferPts = (static_cast<guint64>(renderTimeMs)) * GST_MSECOND;
+            m_firstBufferDts = (static_cast<guint64>(inputImage._timeStamp)) * GST_MSECOND;
+        }
+
+        // FIXME: Use a GstBufferPool.
+        auto buffer = gst_buffer_new_wrapped(g_memdup(inputImage._buffer, inputImage._size),
+            inputImage._size);
+        GST_BUFFER_DTS(buffer) = (static_cast<guint64>(inputImage._timeStamp) * GST_MSECOND) - m_firstBufferDts;
+        GST_BUFFER_PTS(buffer) = (static_cast<guint64>(renderTimeMs) * GST_MSECOND) - m_firstBufferPts;
+        m_dtsPtsMap[GST_BUFFER_PTS(buffer)] = inputImage._timeStamp;
+
+        GST_LOG_OBJECT(pipeline(), "%" G_GINT64_FORMAT " Decoding: %" GST_PTR_FORMAT, renderTimeMs, buffer);
+        switch (gst_app_src_push_sample(GST_APP_SRC(m_src),
+            gst_sample_new(buffer, GetCapsForFrame(inputImage), nullptr, nullptr))) {
+        case GST_FLOW_OK:
+            return WEBRTC_VIDEO_CODEC_OK;
+        case GST_FLOW_FLUSHING:
+            return WEBRTC_VIDEO_CODEC_UNINITIALIZED;
+        default:
+            return WEBRTC_VIDEO_CODEC_ERROR;
+        }
+    }
+
+    GstCaps* GetCapsForFrame(const webrtc::EncodedImage& image)
+    {
+        if (!m_caps) {
+            m_caps = adoptGRef(gst_caps_new_simple(Caps(),
+                "width", G_TYPE_INT, image._encodedWidth,
+                "height", G_TYPE_INT, image._encodedHeight,
+                nullptr));
+        }
+
+        return m_caps.get();
+    }
+
+    void AddDecoderIfSupported(std::vector<webrtc::SdpVideoFormat>& codecList)
+    {
+        if (HasGstDecoder()) {
+            webrtc::SdpVideoFormat format = ConfigureSupportedDecoder();
+
+            codecList.push_back(format);
+        }
+    }
+
+    virtual webrtc::SdpVideoFormat ConfigureSupportedDecoder()
+    {
+        return webrtc::SdpVideoFormat(Name());
+    }
+
+    bool HasGstDecoder()
+    {
+        auto allDecoders = gst_element_factory_list_get_elements(GST_ELEMENT_FACTORY_TYPE_DECODER,
+            GST_RANK_MARGINAL);
+        auto caps = adoptGRef(gst_caps_from_string(Caps()));
+        auto decoders = gst_element_factory_list_filter(allDecoders,
+            caps.get(), GST_PAD_SINK, FALSE);
+        bool hasDecoder = decoders != nullptr;
+
+        gst_plugin_feature_list_free(allDecoders);
+        gst_plugin_feature_list_free(decoders);
+
+        return hasDecoder;
+    }
+
+    GstFlowReturn newSampleCallback(GstElement* sink)
+    {
+        auto sample = adoptGRef(gst_app_sink_pull_sample(GST_APP_SINK(sink)));
+        auto buffer = gst_sample_get_buffer(sample.get());
+
+        // Make sure that frame.timestamp() == the previously input frame's _timeStamp,
+        // as required by the VideoDecoder base class.
+        GST_BUFFER_DTS(buffer) = m_dtsPtsMap[GST_BUFFER_PTS(buffer)];
+        m_dtsPtsMap.erase(GST_BUFFER_PTS(buffer));
+        auto frame(LibWebRTCVideoFrameFromGStreamerSample(sample.get(), webrtc::kVideoRotation_0));
+        GST_BUFFER_DTS(buffer) = GST_CLOCK_TIME_NONE;
+        GST_LOG_OBJECT(pipeline(), "Output decoded frame! %d -> %" GST_PTR_FORMAT,
+            frame->timestamp(), buffer);
+
+        m_imageReadyCb->Decoded(*frame.get(), rtc::Optional<int32_t>(), rtc::Optional<uint8_t>());
+
+        return GST_FLOW_OK;
+    }
+
+    virtual const gchar* Caps() = 0;
+    virtual webrtc::VideoCodecType CodecType() = 0;
+    const char* ImplementationName() const { return "GStreamer"; }
+    virtual const gchar* Name() = 0;
+
+protected:
+    GRefPtr<GstCaps> m_caps;
+    gint m_pictureId;
+
+private:
+    static GstFlowReturn newSampleCallbackTramp(GstElement* sink, GStreamerVideoDecoder* enc)
+    {
+        return enc->newSampleCallback(sink);
+    }
+
+    GRefPtr<GstElement> m_pipeline;
+    GstElement* m_src { nullptr };
+
+    GstVideoInfo m_info;
+    webrtc::DecodedImageCallback* m_imageReadyCb { nullptr };
+
+    std::map<GstClockTime, GstClockTime> m_dtsPtsMap;
+    GstClockTime m_firstBufferPts;
+    GstClockTime m_firstBufferDts;
+};
+
+class H264Decoder : public GStreamerVideoDecoder {
+public:
+    H264Decoder() { }
+    const gchar* Caps() final { return "video/x-h264"; }
+    const gchar* Name() final { return cricket::kH264CodecName; }
+    webrtc::VideoCodecType CodecType() final { return webrtc::kVideoCodecH264; }
+};
+
+class VP8Decoder : public GStreamerVideoDecoder {
+public:
+    VP8Decoder() { }
+    const gchar* Caps() final { return "video/x-vp8"; }
+    const gchar* Name() final { return cricket::kVp8CodecName; }
+    webrtc::VideoCodecType CodecType() final { return webrtc::kVideoCodecVP8; }
+};
+
+std::unique_ptr<webrtc::VideoDecoder> GStreamerVideoDecoderFactory::CreateVideoDecoder(const webrtc::SdpVideoFormat& format)
+{
+    GStreamerVideoDecoder* dec;
+
+    if (format.name == cricket::kH264CodecName)
+        dec = new H264Decoder();
+    else if (format.name == cricket::kVp8CodecName)
+        dec = new VP8Decoder();
+    else {
+        GST_ERROR("Could not create decoder for %s", format.name.c_str());
+
+        return nullptr;
+    }
+
+    return std::unique_ptr<webrtc::VideoDecoder>(dec);
+}
+
+GStreamerVideoDecoderFactory::GStreamerVideoDecoderFactory()
+{
+    static std::once_flag debugRegisteredFlag;
+
+    std::call_once(debugRegisteredFlag, [] {
+        GST_DEBUG_CATEGORY_INIT(webkit_webrtcdec_debug, "webkitlibwebrtcvideodecoder", 0, "WebKit WebRTC video decoder");
+    });
+}
+
+std::vector<webrtc::SdpVideoFormat> GStreamerVideoDecoderFactory::GetSupportedFormats() const
+{
+    std::vector<webrtc::SdpVideoFormat> formats;
+
+    VP8Decoder().AddDecoderIfSupported(formats);
+    H264Decoder().AddDecoderIfSupported(formats);
+
+    return formats;
+}
+} // namespace WebCore
+#endif
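The decoder above keeps a `m_dtsPtsMap` so the RTP timestamp of each encoded image, stashed under the GStreamer PTS assigned in `Decode()`, can be restored on the decoded buffer in `newSampleCallback()`; the `webrtc::VideoDecoder` contract requires the output frame's timestamp to equal the input image's `_timeStamp`. A self-contained sketch of that bookkeeping (names are illustrative, not the patch's actual types):

```cpp
#include <cstddef>
#include <cstdint>
#include <map>

// Mimics the m_dtsPtsMap round-trip: store the RTP timestamp keyed by the
// buffer PTS when pushing an encoded image, take it back (and drop the entry)
// when the corresponding decoded buffer pops out of the appsink.
class TimestampMap {
public:
    void storeOnDecode(uint64_t pts, uint32_t rtpTimestamp) { m_map[pts] = rtpTimestamp; }

    // Returns the original RTP timestamp for a decoded buffer's PTS.
    uint32_t takeOnOutput(uint64_t pts)
    {
        uint32_t rtpTimestamp = m_map[pts];
        m_map.erase(pts);
        return rtpTimestamp;
    }

    std::size_t size() const { return m_map.size(); }

private:
    std::map<uint64_t, uint32_t> m_map;
};
```

This works because the decodebin-based pipeline preserves PTS from input to output, so the PTS is a stable key across the decode.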
diff --git a/Source/WebCore/platform/mediastream/libwebrtc/GStreamerVideoDecoderFactory.h b/Source/WebCore/platform/mediastream/libwebrtc/GStreamerVideoDecoderFactory.h
new file mode 100644 (file)
index 0000000..e4d508e
--- /dev/null
@@ -0,0 +1,43 @@
+/*
+ * Copyright (C) 2018 Metrological Group B.V.
+ * Copyright (C) 2018 Igalia S.L. All rights reserved.
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public License
+ * along with this library; see the file COPYING.LIB.  If not, write to
+ * the Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#pragma once
+
+#if USE(LIBWEBRTC) && USE(GSTREAMER)
+
+#include "LibWebRTCMacros.h"
+#include "api/video_codecs/video_decoder_factory.h"
+#include <gst/gst.h>
+#include <wtf/text/WTFString.h>
+
+namespace WebCore {
+
+class GStreamerVideoDecoderFactory : public webrtc::VideoDecoderFactory {
+public:
+    GStreamerVideoDecoderFactory();
+    static bool newSource(String trackId, GstElement* source);
+
+private:
+    std::unique_ptr<webrtc::VideoDecoder> CreateVideoDecoder(const webrtc::SdpVideoFormat&) final;
+    std::vector<webrtc::SdpVideoFormat> GetSupportedFormats() const final;
+};
+} // namespace WebCore
+
+#endif
diff --git a/Source/WebCore/platform/mediastream/libwebrtc/GStreamerVideoEncoderFactory.cpp b/Source/WebCore/platform/mediastream/libwebrtc/GStreamerVideoEncoderFactory.cpp
new file mode 100644 (file)
index 0000000..784e916
--- /dev/null
@@ -0,0 +1,527 @@
+/*
+ * Copyright (C) 2018 Metrological Group B.V.
+ * Copyright (C) 2018 Igalia S.L. All rights reserved.
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public License
+ * along with this library; see the file COPYING.LIB.  If not, write to
+ * the Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#include "config.h"
+
+#if ENABLE(VIDEO) && ENABLE(MEDIA_STREAM) && USE(LIBWEBRTC) && USE(GSTREAMER)
+#include "GStreamerVideoEncoderFactory.h"
+
+#include "GStreamerVideoFrameLibWebRTC.h"
+#include "webrtc/common_video/h264/h264_common.h"
+#include "webrtc/common_video/h264/profile_level_id.h"
+#include "webrtc/media/base/codec.h"
+#include "webrtc/modules/video_coding/codecs/h264/include/h264.h"
+#include "webrtc/modules/video_coding/codecs/vp8/include/vp8.h"
+#include "webrtc/modules/video_coding/include/video_codec_interface.h"
+
+#include <gst/app/gstappsink.h>
+#include <gst/app/gstappsrc.h>
+#define GST_USE_UNSTABLE_API 1
+#include <gst/codecparsers/gsth264parser.h>
+#undef GST_USE_UNSTABLE_API
+#include <gst/pbutils/encoding-profile.h>
+#include <gst/video/video.h>
+
+#include <mutex>
+
+// Required for unified builds
+#ifdef GST_CAT_DEFAULT
+#undef GST_CAT_DEFAULT
+#endif
+
+GST_DEBUG_CATEGORY(webkit_webrtcenc_debug);
+#define GST_CAT_DEFAULT webkit_webrtcenc_debug
+
+namespace WebCore {
+
+class GStreamerVideoEncoder : public webrtc::VideoEncoder {
+public:
+    GStreamerVideoEncoder(const webrtc::SdpVideoFormat&)
+        : m_pictureId(0)
+        , m_firstFramePts(GST_CLOCK_TIME_NONE)
+        , m_restrictionCaps(adoptGRef(gst_caps_new_empty_simple("video/x-raw")))
+    {
+    }
+    GStreamerVideoEncoder()
+        : m_pictureId(0)
+        , m_firstFramePts(GST_CLOCK_TIME_NONE)
+        , m_restrictionCaps(adoptGRef(gst_caps_new_empty_simple("video/x-raw")))
+    {
+    }
+
+    int SetRates(uint32_t newBitrate, uint32_t frameRate) override
+    {
+        GST_INFO_OBJECT(m_pipeline.get(), "New bitrate: %u - framerate is %u",
+            newBitrate, frameRate);
+
+        auto caps = gst_caps_make_writable(m_restrictionCaps.get());
+        gst_caps_set_simple(caps, "framerate", GST_TYPE_FRACTION, frameRate, 1, nullptr);
+
+        SetRestrictionCaps(caps);
+
+        return WEBRTC_VIDEO_CODEC_OK;
+    }
+
+    GstElement* pipeline()
+    {
+        return m_pipeline.get();
+    }
+
+    GstElement* makeElement(const gchar* factoryName)
+    {
+        auto name = String::format("%s_enc_%s_%p", Name(), factoryName, this);
+        auto elem = gst_element_factory_make(factoryName, name.utf8().data());
+
+        return elem;
+    }
+
+    int32_t InitEncode(const webrtc::VideoCodec* codecSettings, int32_t, size_t)
+    {
+        g_return_val_if_fail(codecSettings, WEBRTC_VIDEO_CODEC_ERR_PARAMETER);
+        g_return_val_if_fail(codecSettings->codecType == CodecType(), WEBRTC_VIDEO_CODEC_ERR_PARAMETER);
+
+        m_pipeline = makeElement("pipeline");
+
+        connectSimpleBusMessageCallback(m_pipeline.get());
+        auto encodebin = CreateEncoder(&m_encoder).leakRef();
+        ASSERT(m_encoder);
+
+        m_src = makeElement("appsrc");
+        g_object_set(m_src, "is-live", true, "format", GST_FORMAT_TIME, nullptr);
+
+        auto capsfilter = CreateFilter();
+        auto sink = makeElement("appsink");
+        gst_app_sink_set_emit_signals(GST_APP_SINK(sink), TRUE);
+        g_signal_connect(sink, "new-sample", G_CALLBACK(newSampleCallbackTramp), this);
+
+        gst_bin_add_many(GST_BIN(m_pipeline.get()), m_src, encodebin, capsfilter, sink, nullptr);
+        if (!gst_element_link_many(m_src, encodebin, capsfilter, sink, nullptr))
+            ASSERT_NOT_REACHED();
+
+        gst_element_set_state(m_pipeline.get(), GST_STATE_PLAYING);
+
+        return WEBRTC_VIDEO_CODEC_OK;
+    }
+
+    bool SupportsNativeHandle() const final
+    {
+        return true;
+    }
+
+    virtual GstElement* CreateFilter()
+    {
+        return makeElement("capsfilter");
+    }
+
+    int32_t RegisterEncodeCompleteCallback(webrtc::EncodedImageCallback* callback) final
+    {
+        m_imageReadyCb = callback;
+
+        return WEBRTC_VIDEO_CODEC_OK;
+    }
+
+    int32_t Release() final
+    {
+        GRefPtr<GstBus> bus = adoptGRef(gst_pipeline_get_bus(GST_PIPELINE(m_pipeline.get())));
+        gst_bus_set_sync_handler(bus.get(), nullptr, nullptr, nullptr);
+
+        gst_element_set_state(m_pipeline.get(), GST_STATE_NULL);
+        m_src = nullptr;
+        m_pipeline = nullptr;
+
+        return WEBRTC_VIDEO_CODEC_OK;
+    }
+
+    int32_t SetChannelParameters(uint32_t, int64_t) final
+    {
+        return WEBRTC_VIDEO_CODEC_OK;
+    }
+
+    int32_t Encode(const webrtc::VideoFrame& frame,
+        const webrtc::CodecSpecificInfo*,
+        const std::vector<webrtc::FrameType>* frameTypes) final
+    {
+        if (!m_imageReadyCb) {
+            GST_INFO_OBJECT(m_pipeline.get(), "No encoded callback set yet!");
+
+            return WEBRTC_VIDEO_CODEC_UNINITIALIZED;
+        }
+
+        if (!m_src) {
+            GST_INFO_OBJECT(m_pipeline.get(), "No source set yet!");
+
+            return WEBRTC_VIDEO_CODEC_UNINITIALIZED;
+        }
+
+        auto sample = GStreamerSampleFromLibWebRTCVideoFrame(frame);
+        auto buffer = gst_sample_get_buffer(sample.get());
+
+        if (!GST_CLOCK_TIME_IS_VALID(m_firstFramePts)) {
+            m_firstFramePts = GST_BUFFER_PTS(buffer);
+            auto pad = adoptGRef(gst_element_get_static_pad(m_src, "src"));
+            gst_pad_set_offset(pad.get(), -m_firstFramePts);
+        }
+
+        for (auto frameType : *frameTypes) {
+            if (frameType == webrtc::kVideoFrameKey) {
+                auto pad = adoptGRef(gst_element_get_static_pad(m_src, "src"));
+                auto forceKeyUnit = gst_video_event_new_downstream_force_key_unit(GST_CLOCK_TIME_NONE,
+                    GST_CLOCK_TIME_NONE, GST_CLOCK_TIME_NONE, FALSE, 1);
+
+                if (!gst_pad_push_event(pad.get(), forceKeyUnit))
+                    GST_WARNING_OBJECT(pipeline(), "Could not send ForceKeyUnit event");
+
+                break;
+            }
+        }
+
+        switch (gst_app_src_push_sample(GST_APP_SRC(m_src), sample.get())) {
+        case GST_FLOW_OK:
+            return WEBRTC_VIDEO_CODEC_OK;
+        case GST_FLOW_FLUSHING:
+            return WEBRTC_VIDEO_CODEC_UNINITIALIZED;
+        default:
+            return WEBRTC_VIDEO_CODEC_ERROR;
+        }
+    }
+
+    GstFlowReturn newSampleCallback(GstElement* sink)
+    {
+        auto sample = adoptGRef(gst_app_sink_pull_sample(GST_APP_SINK(sink)));
+        auto buffer = gst_sample_get_buffer(sample.get());
+        auto caps = gst_sample_get_caps(sample.get());
+
+        webrtc::RTPFragmentationHeader* fragmentationInfo;
+        auto frame = Fragmentize(buffer, &fragmentationInfo);
+        if (!frame._size)
+            return GST_FLOW_OK;
+
+        gst_structure_get(gst_caps_get_structure(caps, 0),
+            "width", G_TYPE_INT, &frame._encodedWidth,
+            "height", G_TYPE_INT, &frame._encodedHeight,
+            nullptr);
+
+        frame._frameType = GST_BUFFER_FLAG_IS_SET(buffer, GST_BUFFER_FLAG_DELTA_UNIT) ? webrtc::kVideoFrameDelta : webrtc::kVideoFrameKey;
+        frame._completeFrame = true;
+        frame.capture_time_ms_ = GST_TIME_AS_MSECONDS(GST_BUFFER_PTS(buffer));
+        frame._timeStamp = GST_TIME_AS_MSECONDS(GST_BUFFER_DTS(buffer));
+        GST_LOG_OBJECT(m_pipeline.get(), "Got buffer TS: %" GST_TIME_FORMAT, GST_TIME_ARGS(GST_BUFFER_PTS(buffer)));
+
+        webrtc::CodecSpecificInfo codecSpecificInfo;
+        PopulateCodecSpecific(&codecSpecificInfo, buffer);
+
+        webrtc::EncodedImageCallback::Result result = m_imageReadyCb->OnEncodedImage(frame, &codecSpecificInfo, fragmentationInfo);
+        m_pictureId = (m_pictureId + 1) & 0x7FFF;
+        if (result.error != webrtc::EncodedImageCallback::Result::OK) {
+            GST_ELEMENT_ERROR(m_pipeline.get(), LIBRARY, FAILED, (nullptr),
+                ("Encode callback failed: %d", result.error));
+
+            return GST_FLOW_ERROR;
+        }
+
+        return GST_FLOW_OK;
+    }
+
+    GRefPtr<GstElement> CreateEncoder(GstElement** encoder)
+    {
+        GstElement* enc = nullptr;
+
+        m_profile = GST_ENCODING_PROFILE(gst_encoding_video_profile_new(
+            adoptGRef(gst_caps_from_string(Caps())).get(),
+            ProfileName(),
+            gst_caps_ref(m_restrictionCaps.get()),
+            1));
+        GRefPtr<GstElement> encodebin = makeElement("encodebin");
+
+        if (!encodebin.get()) {
+            GST_ERROR("No encodebin present... cannot use GStreamer-based encoders");
+            return nullptr;
+        }
+        g_object_set(encodebin.get(), "profile", m_profile.get(), nullptr);
+
+        for (GList* tmp = GST_BIN_CHILDREN(encodebin.get()); tmp; tmp = tmp->next) {
+            GstElement* elem = GST_ELEMENT(tmp->data);
+            GstElementFactory* factory = gst_element_get_factory(elem);
+
+            if (!factory || !gst_element_factory_list_is_type(factory, GST_ELEMENT_FACTORY_TYPE_VIDEO_ENCODER))
+                continue;
+
+            enc = elem;
+            break;
+        }
+
+        if (!enc)
+            return nullptr;
+
+        if (encoder)
+            *encoder = enc;
+
+        return encodebin;
+    }
+
+    void AddCodecIfSupported(std::vector<webrtc::SdpVideoFormat>* supportedFormats)
+    {
+        GstElement* encoder;
+
+        if (CreateEncoder(&encoder).get() != nullptr) {
+            webrtc::SdpVideoFormat format = ConfigureSupportedCodec(encoder);
+
+            supportedFormats->push_back(format);
+        }
+    }
+
+    virtual const gchar* ProfileName()
+    {
+        return nullptr;
+    }
+
+    virtual const gchar* Caps()
+    {
+        return nullptr;
+    }
+
+    virtual webrtc::VideoCodecType CodecType() = 0;
+    virtual webrtc::SdpVideoFormat ConfigureSupportedCodec(GstElement*)
+    {
+        return webrtc::SdpVideoFormat(Name());
+    }
+
+    virtual void PopulateCodecSpecific(webrtc::CodecSpecificInfo*, GstBuffer*) = 0;
+
+    virtual webrtc::EncodedImage Fragmentize(GstBuffer* buffer, webrtc::RTPFragmentationHeader** outFragmentationInfo)
+    {
+        GstMapInfo map;
+
+        gst_buffer_map(buffer, &map, GST_MAP_READ);
+        webrtc::EncodedImage frame(map.data, map.size, map.size);
+        gst_buffer_unmap(buffer, &map);
+
+        // No fragmentation by default.
+        webrtc::RTPFragmentationHeader* fragmentationInfo = new webrtc::RTPFragmentationHeader();
+
+        fragmentationInfo->VerifyAndAllocateFragmentationHeader(1);
+        fragmentationInfo->fragmentationOffset[0] = 0;
+        fragmentationInfo->fragmentationLength[0] = gst_buffer_get_size(buffer);
+        fragmentationInfo->fragmentationPlType[0] = 0;
+        fragmentationInfo->fragmentationTimeDiff[0] = 0;
+
+        *outFragmentationInfo = fragmentationInfo;
+
+        return frame;
+    }
+
+    const char* ImplementationName() const
+    {
+        g_return_val_if_fail(m_encoder, nullptr);
+
+        return GST_OBJECT_NAME(gst_element_get_factory(m_encoder));
+    }
+
+    virtual const gchar* Name() = 0;
+
+    void SetRestrictionCaps(GstCaps* caps)
+    {
+        if (caps && m_profile.get())
+            g_object_set(m_profile.get(), "restriction-caps", caps, nullptr);
+
+        m_restrictionCaps = caps;
+    }
+
+protected:
+    int16_t m_pictureId;
+
+private:
+    static GstFlowReturn newSampleCallbackTramp(GstElement* sink, GStreamerVideoEncoder* enc)
+    {
+        return enc->newSampleCallback(sink);
+    }
+
+    GRefPtr<GstElement> m_pipeline;
+    GstElement* m_src;
+    GstElement* m_encoder;
+
+    webrtc::EncodedImageCallback* m_imageReadyCb;
+    GstClockTime m_firstFramePts;
+    GRefPtr<GstCaps> m_restrictionCaps;
+    GRefPtr<GstEncodingProfile> m_profile;
+};
+
+class H264Encoder : public GStreamerVideoEncoder {
+public:
+    H264Encoder() { }
+
+    H264Encoder(const webrtc::SdpVideoFormat& format)
+        : m_parser(gst_h264_nal_parser_new())
+        , packetizationMode(webrtc::H264PacketizationMode::NonInterleaved)
+    {
+        auto it = format.parameters.find(cricket::kH264FmtpPacketizationMode);
+
+        if (it != format.parameters.end() && it->second == "1")
+            packetizationMode = webrtc::H264PacketizationMode::NonInterleaved;
+    }
+
+    // FIXME: Thread safety!
+    webrtc::EncodedImage Fragmentize(GstBuffer* gstbuffer, webrtc::RTPFragmentationHeader** outFragmentationInfo) final
+    {
+        GstMapInfo map;
+        GstH264NalUnit nalu;
+        auto parserResult = GST_H264_PARSER_OK;
+
+        gsize offset = 0;
+        size_t requiredSize = 0;
+
+        std::vector<GstH264NalUnit> nals;
+        webrtc::EncodedImage encodedImage;
+
+        const uint8_t startCode[4] = { 0, 0, 0, 1 };
+        gst_buffer_map(gstbuffer, &map, GST_MAP_READ);
+        while (parserResult == GST_H264_PARSER_OK) {
+            parserResult = gst_h264_parser_identify_nalu(m_parser, map.data, offset, map.size, &nalu);
+
+            nalu.sc_offset = offset;
+            nalu.offset = offset + sizeof(startCode);
+            if (parserResult != GST_H264_PARSER_OK && parserResult != GST_H264_PARSER_NO_NAL_END)
+                break;
+
+            requiredSize += nalu.size + sizeof(startCode);
+            nals.push_back(nalu);
+            offset = nalu.offset + nalu.size;
+        }
+
+        encodedImage._size = requiredSize;
+        encodedImage._buffer = new uint8_t[encodedImage._size];
+        // Iterate over the NAL units and fill in the fragmentation header.
+        webrtc::RTPFragmentationHeader* fragmentationHeader = new webrtc::RTPFragmentationHeader();
+        fragmentationHeader->VerifyAndAllocateFragmentationHeader(nals.size());
+        size_t fragmentIndex = 0;
+        encodedImage._length = 0;
+        for (std::vector<GstH264NalUnit>::iterator nal = nals.begin(); nal != nals.end(); ++nal, fragmentIndex++) {
+            ASSERT(map.data[nal->sc_offset + 0] == startCode[0]);
+            ASSERT(map.data[nal->sc_offset + 1] == startCode[1]);
+            ASSERT(map.data[nal->sc_offset + 2] == startCode[2]);
+            ASSERT(map.data[nal->sc_offset + 3] == startCode[3]);
+
+            fragmentationHeader->fragmentationOffset[fragmentIndex] = nal->offset;
+            fragmentationHeader->fragmentationLength[fragmentIndex] = nal->size;
+
+            memcpy(encodedImage._buffer + encodedImage._length, &map.data[nal->sc_offset],
+                sizeof(startCode) + nal->size);
+            encodedImage._length += nal->size + sizeof(startCode);
+        }
+
+        *outFragmentationInfo = fragmentationHeader;
+        gst_buffer_unmap(gstbuffer, &map);
+        return encodedImage;
+    }
+
+    GstElement* CreateFilter() final
+    {
+        GstElement* filter = makeElement("capsfilter");
+        auto caps = adoptGRef(gst_caps_new_simple(Caps(),
+            "alignment", G_TYPE_STRING, "au",
+            "stream-format", G_TYPE_STRING, "byte-stream",
+            nullptr));
+        g_object_set(filter, "caps", caps.get(), nullptr);
+
+        return filter;
+    }
+
+    webrtc::SdpVideoFormat ConfigureSupportedCodec(GstElement*) final
+    {
+        // TODO: Create from the encoder src pad caps template.
+        return webrtc::SdpVideoFormat(cricket::kH264CodecName,
+            { { cricket::kH264FmtpProfileLevelId, cricket::kH264ProfileLevelConstrainedBaseline },
+                { cricket::kH264FmtpLevelAsymmetryAllowed, "1" },
+                { cricket::kH264FmtpPacketizationMode, "1" } });
+    }
+
+    const gchar* Caps() final { return "video/x-h264"; }
+    const gchar* Name() final { return cricket::kH264CodecName; }
+    GstH264NalParser* m_parser { nullptr };
+    webrtc::VideoCodecType CodecType() final { return webrtc::kVideoCodecH264; }
+
+    void PopulateCodecSpecific(webrtc::CodecSpecificInfo* codecSpecificInfos, GstBuffer*) final
+    {
+        codecSpecificInfos->codecType = CodecType();
+        codecSpecificInfos->codec_name = ImplementationName();
+        webrtc::CodecSpecificInfoH264* h264Info = &(codecSpecificInfos->codecSpecific.H264);
+        h264Info->packetization_mode = packetizationMode;
+    }
+
+    webrtc::H264PacketizationMode packetizationMode;
+};
+
+class VP8Encoder : public GStreamerVideoEncoder {
+public:
+    VP8Encoder() { }
+    VP8Encoder(const webrtc::SdpVideoFormat&) { }
+    const gchar* Caps() final { return "video/x-vp8"; }
+    const gchar* Name() final { return cricket::kVp8CodecName; }
+    webrtc::VideoCodecType CodecType() final { return webrtc::kVideoCodecVP8; }
+    const gchar* ProfileName() final { return "Profile Realtime"; }
+
+    void PopulateCodecSpecific(webrtc::CodecSpecificInfo* codecSpecificInfos, GstBuffer* buffer) final
+    {
+        codecSpecificInfos->codecType = webrtc::kVideoCodecVP8;
+        codecSpecificInfos->codec_name = ImplementationName();
+        webrtc::CodecSpecificInfoVP8* vp8Info = &(codecSpecificInfos->codecSpecific.VP8);
+        vp8Info->temporalIdx = 0;
+        vp8Info->pictureId = m_pictureId;
+
+        vp8Info->simulcastIdx = 0;
+        vp8Info->keyIdx = webrtc::kNoKeyIdx;
+        vp8Info->nonReference = GST_BUFFER_FLAG_IS_SET(buffer, GST_BUFFER_FLAG_DELTA_UNIT);
+        vp8Info->tl0PicIdx = webrtc::kNoTl0PicIdx;
+    }
+};
+
+std::unique_ptr<webrtc::VideoEncoder> GStreamerVideoEncoderFactory::CreateVideoEncoder(const webrtc::SdpVideoFormat& format)
+{
+    if (format.name == cricket::kVp8CodecName)
+        return std::make_unique<VP8Encoder>(format);
+
+    if (format.name == cricket::kH264CodecName)
+        return std::make_unique<H264Encoder>(format);
+
+    return nullptr;
+}
+
+GStreamerVideoEncoderFactory::GStreamerVideoEncoderFactory()
+{
+    static std::once_flag debugRegisteredFlag;
+
+    std::call_once(debugRegisteredFlag, [] {
+        GST_DEBUG_CATEGORY_INIT(webkit_webrtcenc_debug, "webkitlibwebrtcvideoencoder", 0, "WebKit WebRTC video encoder");
+    });
+}
+
+std::vector<webrtc::SdpVideoFormat> GStreamerVideoEncoderFactory::GetSupportedFormats() const
+{
+    std::vector<webrtc::SdpVideoFormat> supportedCodecs;
+
+    VP8Encoder().AddCodecIfSupported(&supportedCodecs);
+    H264Encoder().AddCodecIfSupported(&supportedCodecs);
+
+    return supportedCodecs;
+}
+
+} // namespace WebCore
+#endif
diff --git a/Source/WebCore/platform/mediastream/libwebrtc/GStreamerVideoEncoderFactory.h b/Source/WebCore/platform/mediastream/libwebrtc/GStreamerVideoEncoderFactory.h
new file mode 100644 (file)
index 0000000..796a20f
--- /dev/null
@@ -0,0 +1,46 @@
+/*
+ * Copyright (C) 2018 Metrological Group B.V.
+ * Copyright (C) 2018 Igalia S.L. All rights reserved.
+ *
+ * This library is free software; you can redistribute it and/or
+ * modify it under the terms of the GNU Library General Public
+ * License as published by the Free Software Foundation; either
+ * version 2 of the License, or (at your option) any later version.
+ *
+ * This library is distributed in the hope that it will be useful,
+ * but WITHOUT ANY WARRANTY; without even the implied warranty of
+ * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
+ * Library General Public License for more details.
+ *
+ * You should have received a copy of the GNU Library General Public License
+ * along with this library; see the file COPYING.LIB.  If not, write to
+ * the Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor,
+ * Boston, MA 02110-1301, USA.
+ */
+
+#pragma once
+
+#if USE(LIBWEBRTC) && USE(GSTREAMER)
+#include "LibWebRTCMacros.h"
+#include "api/video_codecs/video_encoder_factory.h"
+#include <gst/gst.h>
+
+namespace WebCore {
+
+class GStreamerVideoEncoderFactory final : public webrtc::VideoEncoderFactory {
+public:
+    GStreamerVideoEncoderFactory();
+
+private:
+    std::vector<webrtc::SdpVideoFormat> GetSupportedFormats() const override;
+    std::unique_ptr<webrtc::VideoEncoder> CreateVideoEncoder(const webrtc::SdpVideoFormat&) final;
+    CodecInfo QueryVideoEncoder(const webrtc::SdpVideoFormat&) const final
+    {
+        GST_FIXME("Detect whether the encoder is HW accelerated");
+
+        return { false, false };
+    }
+};
+} // namespace WebCore
+
+#endif
index e49107f..c683376 100644 (file)
@@ -35,6 +35,7 @@ static const size_t sampleRate = 48000;
 static const size_t chunkSampleCount = 480;
 static const size_t sampleSize = 16;
 static const size_t sampleByteSize = 2;
+static const bool isSigned = true;
 static const bool isFloat = false;
 static const bool isBigEndian = false;
 static const bool isNonInterleaved = false;
index 3dbd910..db81d14 100644 (file)
@@ -44,4 +44,16 @@ bool LibWebRTCProvider::webRTCAvailable()
     return true;
 }
 
+#if USE(LIBWEBRTC) && USE(GSTREAMER)
+std::unique_ptr<webrtc::VideoDecoderFactory> LibWebRTCProviderGlib::createDecoderFactory()
+{
+    return std::make_unique<GStreamerVideoDecoderFactory>();
+}
+
+std::unique_ptr<webrtc::VideoEncoderFactory> LibWebRTCProviderGlib::createEncoderFactory()
+{
+    return std::make_unique<GStreamerVideoEncoderFactory>();
+}
+#endif // USE(LIBWEBRTC) && USE(GSTREAMER)
+
 } // namespace WebCore
index 4692268..3687bfc 100644 (file)
 
 #pragma once
 
+#if USE(GSTREAMER)
+#include "GStreamerVideoDecoderFactory.h"
+#include "GStreamerVideoEncoderFactory.h"
+#endif
+
 #include "LibWebRTCProvider.h"
 
 #if USE(LIBWEBRTC)
@@ -34,6 +39,11 @@ namespace WebCore {
 class WEBCORE_EXPORT LibWebRTCProviderGlib : public LibWebRTCProvider {
 public:
     LibWebRTCProviderGlib() = default;
+
+#if USE(GSTREAMER)
+    std::unique_ptr<webrtc::VideoEncoderFactory> createEncoderFactory() final;
+    std::unique_ptr<webrtc::VideoDecoderFactory> createDecoderFactory() final;
+#endif
 };
 
 } // namespace WebCore
index d8780d1..20b948b 100644 (file)
@@ -1,3 +1,12 @@
+2018-07-24  Thibault Saunier  <tsaunier@igalia.com>
+
+        [WPE][GTK] Implement WebRTC based on libwebrtc
+        https://bugs.webkit.org/show_bug.cgi?id=186932
+
+        Reviewed by Philippe Normand.
+
+        * WebProcess/Network/webrtc/LibWebRTCProvider.h: Use LibWebRTCProviderGlib when building WPE or GTK ports.
+
 2018-07-23  Stephan Szabo  <stephan.szabo@sony.com>
 
         [WinCairo] Add implementation for setting cursors
index e32ae4f..e708a55 100644 (file)
@@ -27,6 +27,8 @@
 
 #if PLATFORM(COCOA)
 #include <WebCore/LibWebRTCProviderCocoa.h>
+#elif PLATFORM(GTK) || PLATFORM(WPE)
+#include <WebCore/LibWebRTCProviderGlib.h>
 #else
 #include <WebCore/LibWebRTCProvider.h>
 #endif
@@ -37,6 +39,8 @@ namespace WebKit {
 
 #if PLATFORM(COCOA)
 using LibWebRTCProviderBase = WebCore::LibWebRTCProviderCocoa;
+#elif PLATFORM(GTK) || PLATFORM(WPE)
+using LibWebRTCProviderBase = WebCore::LibWebRTCProviderGlib;
 #else
 using LibWebRTCProviderBase = WebCore::LibWebRTCProvider;
 #endif
index 4577817..e0a4ee0 100644 (file)
@@ -24,6 +24,7 @@
 #  gstreamer-pbutils:    GSTREAMER_PBUTILS_INCLUDE_DIRS and GSTREAMER_PBUTILS_LIBRARIES
 #  gstreamer-tag:        GSTREAMER_TAG_INCLUDE_DIRS and GSTREAMER_TAG_LIBRARIES
 #  gstreamer-video:      GSTREAMER_VIDEO_INCLUDE_DIRS and GSTREAMER_VIDEO_LIBRARIES
+#  gstreamer-codecparsers: GSTREAMER_CODECPARSERS_INCLUDE_DIRS and GSTREAMER_CODECPARSERS_LIBRARIES
 #
 # Copyright (C) 2012 Raphael Kubo da Costa <rakuco@webkit.org>
 #
@@ -90,6 +91,7 @@ FIND_GSTREAMER_COMPONENT(GSTREAMER_MPEGTS gstreamer-mpegts-1.0>=1.4.0 gstmpegts-
 FIND_GSTREAMER_COMPONENT(GSTREAMER_PBUTILS gstreamer-pbutils-1.0 gstpbutils-1.0)
 FIND_GSTREAMER_COMPONENT(GSTREAMER_TAG gstreamer-tag-1.0 gsttag-1.0)
 FIND_GSTREAMER_COMPONENT(GSTREAMER_VIDEO gstreamer-video-1.0 gstvideo-1.0)
+FIND_GSTREAMER_COMPONENT(GSTREAMER_CODECPARSERS gstreamer-codecparsers-1.0 gstcodecparsers-1.0)
 
 # ------------------------------------------------
 # 3. Process the COMPONENTS passed to FIND_PACKAGE
@@ -128,4 +130,6 @@ mark_as_advanced(
     GSTREAMER_TAG_LIBRARIES
     GSTREAMER_VIDEO_INCLUDE_DIRS
     GSTREAMER_VIDEO_LIBRARIES
+    GSTREAMER_CODECPARSERS_INCLUDE_DIRS
+    GSTREAMER_CODECPARSERS_LIBRARIES
 )