WebRTC: [GTK] Add MediaEndpointOwr - an OpenWebRTC WebRTC backend
author    adam.bergkvist@ericsson.com <adam.bergkvist@ericsson.com@268f45cc-cd09-0410-ab3c-d52691b4dbfc>
Fri, 21 Oct 2016 10:20:23 +0000 (10:20 +0000)
committer adam.bergkvist@ericsson.com <adam.bergkvist@ericsson.com@268f45cc-cd09-0410-ab3c-d52691b4dbfc>
Fri, 21 Oct 2016 10:20:23 +0000 (10:20 +0000)
https://bugs.webkit.org/show_bug.cgi?id=163327

Reviewed by Philippe Normand.

.:

Add manual WebRTC test. Test features:
- Two RTCPeerConnection instances communicate in a single browser tab.
- Supports setting up bidirectional media with a single SDP dialog, as
  well as one direction at a time.
- Strips vendor prefixes (runs in Chrome and Firefox as well)
- Supports modern as well as legacy APIs (mainly to make the test run
  in Chrome)

* ManualTests/webrtc-one-tab-p2p.html: Added.

Source/WebCore:

Add MediaEndpointOwr, which is a MediaEndpoint implementation (WebRTC backend) based on
OpenWebRTC [1]. The WebRTC backend can be tested with a manual test. Automated testing
is still done with MockMediaEndpoint.

[1] http://www.openwebrtc.org/

Testing: Added manual test (webrtc-one-tab-p2p.html)
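
The backend is wired in through the MediaEndpoint::create factory pointer. For
reference, a minimal sketch of the registration, abridged from the new
MediaEndpointOwr.cpp below (the comment on how the pointer is consumed describes
the pre-existing MediaEndpoint interface and is an assumption, not something this
patch adds):

    static std::unique_ptr<MediaEndpoint> createMediaEndpointOwr(MediaEndpointClient& client)
    {
        return std::unique_ptr<MediaEndpoint>(new MediaEndpointOwr(client));
    }

    // MediaEndpoint::create is a static function pointer on the cross-platform
    // MediaEndpoint interface; assigning it here makes MediaEndpointOwr the
    // backend constructed for RTCPeerConnection.
    CreateMediaEndpoint MediaEndpoint::create = createMediaEndpointOwr;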

* CMakeLists.txt:
* platform/GStreamer.cmake:
* platform/mediastream/openwebrtc/MediaEndpointOwr.cpp: Added.
(WebCore::createMediaEndpointOwr):
(WebCore::MediaEndpointOwr::MediaEndpointOwr):
(WebCore::MediaEndpointOwr::~MediaEndpointOwr):
(WebCore::MediaEndpointOwr::setConfiguration):
(WebCore::cryptoDataCallback):
(WebCore::MediaEndpointOwr::generateDtlsInfo):
(WebCore::MediaEndpointOwr::getDefaultAudioPayloads):
(WebCore::MediaEndpointOwr::getDefaultVideoPayloads):
(WebCore::payloadsContainType):
(WebCore::MediaEndpointOwr::filterPayloads):
(WebCore::MediaEndpointOwr::updateReceiveConfiguration):
(WebCore::findRtxPayload):
(WebCore::MediaEndpointOwr::updateSendConfiguration):
(WebCore::MediaEndpointOwr::addRemoteCandidate):
(WebCore::MediaEndpointOwr::replaceMutedRemoteSourceMid):
(WebCore::MediaEndpointOwr::createMutedRemoteSource):
(WebCore::MediaEndpointOwr::replaceSendSource):
(WebCore::MediaEndpointOwr::stop):
(WebCore::MediaEndpointOwr::transceiverIndexForSession):
(WebCore::MediaEndpointOwr::sessionMid):
(WebCore::MediaEndpointOwr::matchTransceiverByMid):
(WebCore::MediaEndpointOwr::dispatchNewIceCandidate):
(WebCore::MediaEndpointOwr::dispatchGatheringDone):
(WebCore::MediaEndpointOwr::processIceTransportStateChange):
(WebCore::MediaEndpointOwr::dispatchDtlsFingerprint):
(WebCore::MediaEndpointOwr::unmuteRemoteSource):
(WebCore::MediaEndpointOwr::prepareSession):
(WebCore::MediaEndpointOwr::prepareMediaSession):
(WebCore::parseHelperServerUrl):
(WebCore::MediaEndpointOwr::ensureTransportAgentAndTransceivers):
(WebCore::MediaEndpointOwr::internalAddRemoteCandidate):
(WebCore::gotCandidate):
(WebCore::candidateGatheringDone):
(WebCore::iceConnectionStateChange):
(WebCore::gotIncomingSource):
* platform/mediastream/openwebrtc/MediaEndpointOwr.h: Added.
(WebCore::OwrTransceiver::create):
(WebCore::OwrTransceiver::~OwrTransceiver):
(WebCore::OwrTransceiver::mid):
(WebCore::OwrTransceiver::session):
(WebCore::OwrTransceiver::owrIceState):
(WebCore::OwrTransceiver::setOwrIceState):
(WebCore::OwrTransceiver::gotEndOfRemoteCandidates):
(WebCore::OwrTransceiver::markGotEndOfRemoteCandidates):
(WebCore::OwrTransceiver::OwrTransceiver):
* platform/mediastream/openwebrtc/RealtimeMediaSourceOwr.h:
(WebCore::RealtimeMediaSourceOwr::RealtimeMediaSourceOwr):
(WebCore::RealtimeMediaSourceOwr::swapOutShallowSource):
Add support for an initially muted source. This is used for early
creation of remote sources.
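
A minimal sketch of that flow, using the identifiers from this patch (not standalone
code, since it relies on WebCore/OpenWebRTC types; realOwrMediaSource is a placeholder
for the OwrMediaSource* delivered by OpenWebRTC's "on-incoming-source" signal):

    // 1. While a remote description is processed, a "shallow" remote source is
    //    created with no underlying OwrMediaSource; with the RealtimeMediaSourceOwr
    //    constructor change in this patch, such a source starts out muted.
    RefPtr<RealtimeMediaSourceOwr> source = adoptRef(
        new RealtimeMediaSourceOwr(nullptr, "not used", RealtimeMediaSource::Video, "remote video"));

    // 2. Once real media arrives, the incoming source is swapped in, which also
    //    unmutes the WebCore-side source (see MediaEndpointOwr::unmuteRemoteSource).
    if (!source->stopped())
        source->swapOutShallowSource(*realOwrMediaSource);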

git-svn-id: https://svn.webkit.org/repository/webkit/trunk@207665 268f45cc-cd09-0410-ab3c-d52691b4dbfc

ChangeLog
ManualTests/webrtc-one-tab-p2p.html [new file with mode: 0644]
Source/WebCore/CMakeLists.txt
Source/WebCore/ChangeLog
Source/WebCore/platform/GStreamer.cmake
Source/WebCore/platform/mediastream/openwebrtc/MediaEndpointOwr.cpp [new file with mode: 0644]
Source/WebCore/platform/mediastream/openwebrtc/MediaEndpointOwr.h [new file with mode: 0644]
Source/WebCore/platform/mediastream/openwebrtc/RealtimeMediaSourceOwr.h

diff --git a/ChangeLog b/ChangeLog
index 9e1c7e2..88b54ea 100644
--- a/ChangeLog
+++ b/ChangeLog
@@ -1,3 +1,20 @@
+2016-10-21  Adam Bergkvist  <adam.bergkvist@ericsson.com>
+
+        WebRTC: [GTK] Add MediaEndpointOwr - an OpenWebRTC WebRTC backend
+        https://bugs.webkit.org/show_bug.cgi?id=163327
+
+        Reviewed by Philippe Normand.
+
+        Add manual WebRTC test. Test features:
+        - Two RTCPeerConnection instances communicate in a single browser tab.
+        - Supports setting up bidirectional media with a single SDP dialog, as
+          well as one direction at a time.
+        - Strips vendor prefixes (runs in Chrome and Firefox as well)
+        - Supports modern as well as legacy APIs (mainly to make the test run
+          in Chrome)
+
+        * ManualTests/webrtc-one-tab-p2p.html: Added.
+
 2016-10-20  Carlos Garcia Campos  <cgarcia@igalia.com>
 
         [GTK] Configures but fails to link with ENABLE_OPENGL=OFF
diff --git a/ManualTests/webrtc-one-tab-p2p.html b/ManualTests/webrtc-one-tab-p2p.html
new file mode 100644
index 0000000..c6f79fd
--- /dev/null
+++ b/ManualTests/webrtc-one-tab-p2p.html
@@ -0,0 +1,344 @@
+<!doctype html>
+<html>
+<head>
+<title>One tab p2p</title>
+
+<style type="text/css">
+    video { width: 240px; height: 160px; border: black 1px dashed; }
+    input { margin: 2px }
+</style>
+
+<script>
+// Make use of prefixed APIs to run this test in Chrome and Firefox
+self.RTCPeerConnection = self.RTCPeerConnection || self.webkitRTCPeerConnection || self.mozRTCPeerConnection;
+navigator.getUserMedia = navigator.getUserMedia || navigator.webkitGetUserMedia || navigator.mozGetUserMedia;
+
+let legacyCheckBox;
+let closeButton;
+let pcA;
+let pcB;
+let localStream;
+
+const pcNames = {
+    first: "A",
+    second: "B"
+};
+
+// FIXME: We should be able to use an empty configuration (bug: http://webkit.org/b/158936)
+const configuration = { "iceServers": [{ "urls": "stun:mmt-stun.verkstad.net" }] };
+
+document.addEventListener("DOMContentLoaded", function () {
+    legacyCheckBox = document.querySelector("#legacy_check");
+    const audioCheckBox = document.querySelector("#audio_check");
+    const videoCheckBox = document.querySelector("#video_check");
+
+    const startButton = document.querySelector("#start_but");
+    closeButton = document.querySelector("#close_but");
+
+    const testButtons = {
+        "single": document.querySelector("#single_but"),
+        "mediaAtoB": document.querySelector("#media_A_to_B_but"),
+        "mediaBtoA": document.querySelector("#media_B_to_A_but")
+    };
+
+    function setTestButtonsDisabled(isDisabled) {
+        for (let p in testButtons)
+            testButtons[p].disabled = isDisabled;
+    }
+
+    startButton.onclick = function () {
+        navigator.getUserMedia({
+            "audio": audioCheckBox.checked,
+            "video": videoCheckBox.checked
+        }, function (stream) {
+            audioCheckBox.disabled = videoCheckBox.disabled = true;
+            localStream = stream;
+            startButton.disabled = true;
+            setTestButtonsDisabled(false);
+        }, logError);
+    };
+
+    closeButton.onclick = function (evt) {
+        evt.target.disabled = true;
+        console.log("Closing");
+        pcA.close();
+        pcB.close();
+        pcA = null;
+        pcB = null;
+
+        setTestButtonsDisabled(false);
+    }
+
+    testButtons.single.onclick = function (evt) {
+        setTestButtonsDisabled(true);
+        getTestFunction("singleDialog")();
+    }
+
+    testButtons.mediaAtoB.onclick = function (evt) {
+        setTestButtonsDisabled(true);
+        if (!pcA)
+            commonSetup();
+        getTestFunction("addOneWayMedia")(pcA, pcB, testButtons.mediaBtoA);
+    }
+
+    testButtons.mediaBtoA.onclick = function (evt) {
+        setTestButtonsDisabled(true);
+        if (!pcA)
+            commonSetup();
+        getTestFunction("addOneWayMedia")(pcB, pcA, testButtons.mediaAtoB);
+    }
+});
+
+function getTestFunction(name) {
+    const functionName = legacyCheckBox.checked ? name : `${name}Promise`;
+    return self[functionName];
+}
+
+function singleDialog() {
+    commonSetup();
+
+    renderStream(localStream, document.querySelector("#self_viewA"));
+    pcA.addStream(localStream);
+
+    pcA.createOffer(function (offer) {
+        pcA.setLocalDescription(offer, function () {
+            offerToB(pcA.localDescription);
+        }, logError);
+    }, logError);
+
+    function offerToB(offer) {
+        logSignalling(offer, pcA, pcB);
+        pcB.setRemoteDescription(offer, function () {
+            addStoredCandidates(pcB);
+            renderStream(localStream, document.querySelector("#self_viewB"));
+            pcB.addStream(localStream);
+
+            pcB.createAnswer(function (answer) {
+                pcB.setLocalDescription(answer, function () {
+                    answerToA(pcB.localDescription);
+                }, logError);
+            }, logError);
+        }, logError);
+    }
+
+    function answerToA(answer) {
+        logSignalling(answer, pcB, pcA);
+        pcA.setRemoteDescription(answer, function () {
+            console.log("Initiator got answer, O/A dialog completed");
+            addStoredCandidates(pcA);
+            closeButton.disabled = false;
+        }, logError);
+    }
+}
+
+function singleDialogPromise() {
+    commonSetup();
+
+    renderStream(localStream, document.querySelector("#self_viewA"));
+    localStream.getTracks().forEach(track => {
+        pcA.addTrack(track, localStream);
+    });
+
+    pcA.createOffer().then(function (offer) {
+        return pcA.setLocalDescription(offer);
+    })
+    .then(function () {
+        logSignalling(pcA.localDescription, pcA, pcB);
+        return pcB.setRemoteDescription(pcA.localDescription);
+    })
+    .then(function () {
+        addStoredCandidates(pcB);
+        renderStream(localStream, document.querySelector("#self_viewB"));
+        localStream.getTracks().forEach(track => {
+            pcB.addTrack(track, localStream);
+        });
+        return pcB.createAnswer();
+    })
+    .then(function (answer) {
+        return pcB.setLocalDescription(answer);
+    })
+    .then(function () {
+        logSignalling(pcB.localDescription, pcB, pcA);
+        return pcA.setRemoteDescription(pcB.localDescription);
+    })
+    .then(function () {
+        addStoredCandidates(pcA);
+        console.log("Initiator got answer, O/A dialog completed");
+        closeButton.disabled = false;
+    })
+    .catch(logError);
+}
+
+function addOneWayMedia(offeringPc, answeringPc, continueButton) {
+    renderStream(localStream, document.querySelector(`#self_view${offeringPc.name}`));
+    offeringPc.addStream(localStream);
+
+    offeringPc.createOffer(function (offer) {
+        offeringPc.setLocalDescription(offer, function () {
+            offerToAnsweringPc(offeringPc.localDescription);
+        }, logError);
+    }, logError);
+
+    function offerToAnsweringPc(offer) {
+        logSignalling(offer, offeringPc, answeringPc);
+        answeringPc.setRemoteDescription(offer, function () {
+            addStoredCandidates(answeringPc);
+            answeringPc.createAnswer(function (answer) {
+                answeringPc.setLocalDescription(answer, function () {
+                    answerToOfferingPc(answeringPc.localDescription);
+                }, logError);
+            }, logError);
+        }, logError);
+    }
+
+    function answerToOfferingPc(answer) {
+        logSignalling(answer, answeringPc, offeringPc);
+        offeringPc.setRemoteDescription(answer, function () {
+            console.log("Initiator side got answer, single way O/A dialog completed");
+            addStoredCandidates(offeringPc);
+            continueButton.disabled = false;
+            closeButton.disabled = false;
+        }, logError);
+    }
+}
+
+function addOneWayMediaPromise(offeringPc, answeringPc, continueButton) {
+    renderStream(localStream, document.querySelector(`#self_view${offeringPc.name}`));
+    localStream.getTracks().forEach(track => {
+        offeringPc.addTrack(track, localStream);
+    });
+
+    offeringPc.createOffer().then(function (offer) {
+        return offeringPc.setLocalDescription(offer);
+    })
+    .then(function () {
+        logSignalling(offeringPc.localDescription, offeringPc, answeringPc);
+        return answeringPc.setRemoteDescription(offeringPc.localDescription);
+    })
+    .then(function () {
+        addStoredCandidates(answeringPc);
+        return answeringPc.createAnswer();
+    })
+    .then(function (answer) {
+        return answeringPc.setLocalDescription(answer)
+    })
+    .then(function () {
+        logSignalling(answeringPc.localDescription, answeringPc, offeringPc);
+        return offeringPc.setRemoteDescription(answeringPc.localDescription)
+    })
+    .then(function () {
+        console.log("Initiator side got answer, single way O/A dialog completed");
+        addStoredCandidates(offeringPc);
+        continueButton.disabled = false;
+        closeButton.disabled = false;
+    })
+    .catch(logError);
+}
+
+function commonSetup() {
+    pcA = new RTCPeerConnection(configuration);
+    pcB = new RTCPeerConnection(configuration);
+
+    pcA.name = pcNames.first;
+    pcB.name = pcNames.second;
+
+    symmetricSetup(pcA, pcB);
+    symmetricSetup(pcB, pcA);
+}
+
+function addStoredCandidates(pc) {
+    if (!pc.storedCandidates)
+        return;
+
+    pc.storedCandidates.forEach(candidate => {
+        pc.addIceCandidate(candidate).catch(logError);
+    });
+
+    console.log(`Added ${pc.storedCandidates.length} stored candidates (arrived before remote description was set)`);
+    pc.storedCandidates = null;
+}
+
+function symmetricSetup(pc, otherPc) {
+    pc.onicecandidate = function (evt) {
+        if (evt.candidate) {
+            logSignalling(evt.candidate, pc, otherPc);
+            // If the remote description isn't set yet, store the candidate
+            if (!otherPc.remoteDescription) {
+                if (!otherPc.storedCandidates)
+                    otherPc.storedCandidates = [];
+                otherPc.storedCandidates.push(evt.candidate);
+            } else
+                otherPc.addIceCandidate(evt.candidate).catch(logError);
+        }
+    };
+
+    pc.onaddstream = function (evt) {
+        renderStream(evt.stream, document.querySelector(`#remote_view${pc.name}`));
+    };
+}
+
+function renderStream(stream, video) {
+    if (typeof video.srcObject !== "undefined")
+        video.srcObject = stream;
+    else
+        video.src = URL.createObjectURL(stream);
+}
+
+function logSignalling(msg, fromPc, toPc) {
+    const type = msg.candidate ? "Candidate" : msg.type.replace(/^[a-z]/, s => s.toUpperCase());
+    let header = `${type} `;
+    header += fromPc.name == pcNames.first ? `${fromPc.name} -> ${toPc.name}` : `${toPc.name} <- ${fromPc.name}`;
+    console.groupCollapsed(header);
+    console.log(msg.candidate || msg.sdp);
+    console.groupEnd();
+}
+
+function logError(error) {
+    if (error) {
+        if (error.name || error.message)
+            console.error(`logError: ${error.name || "-"}: ${error.message || "-"}`);
+        else
+            console.error(`logError: ${error}`);
+    } else
+        console.error("logError: (no error message)");
+}
+</script>
+
+</head>
+<body>
+<h3>One Tab P2P - Test Different Signaling Schemes</h3>
+<p>Click Start to request user media. The same stream is sent in both directions, so a successful
+bidirectional media setup shows the same output in all four video elements. Open the console to view
+signaling details. Some browsers only allow access to user media via a secure origin (e.g.
+localhost).</p>
+<input type="checkbox" id="legacy_check">Use Legacy APIs (Chrome compatible)<br>
+<input type="checkbox" id="audio_check">Audio<br>
+<input type="checkbox" id="video_check" checked>Video<br>
+
+<input type="button" id="start_but" value="Start">
+<input type="button" id="close_but" value="Close Connections" disabled>
+<br>
+Set up bidirectional media: <input type="button" id="single_but" value="Single SDP dialog" disabled>
+<br>
+Set up media in one direction at a time: <input type="button" id="media_A_to_B_but" value="Media A to B" disabled>
+<input type="button" id="media_B_to_A_but" value="Media B to A" disabled>
+<br>
+
+<table>
+    <tr>
+        <td>Local (A)</td><td>Remote (A)</td>
+    </tr>
+    <tr>
+        <td><video id="self_viewA" autoplay muted></video></td>
+        <td><video id="remote_viewA" autoplay></video></td>
+    </tr>
+    <tr>
+        <td>Local (B)</td><td>Remote (B)</td>
+    </tr>
+    <tr>
+        <td><video id="self_viewB" autoplay muted></video></td>
+        <td><video id="remote_viewB" autoplay></video></td>
+    </tr>
+</table>
+</body>
+</html>
diff --git a/Source/WebCore/CMakeLists.txt b/Source/WebCore/CMakeLists.txt
index 5ce3277..74803a5 100644
--- a/Source/WebCore/CMakeLists.txt
+++ b/Source/WebCore/CMakeLists.txt
@@ -2310,7 +2310,6 @@ set(WebCore_SOURCES
     platform/graphics/transforms/TranslateTransformOperation.cpp
 
     platform/mediastream/MediaConstraints.cpp
-    platform/mediastream/MediaEndpoint.cpp
     platform/mediastream/MediaEndpointConfiguration.cpp
     platform/mediastream/MediaStreamPrivate.cpp
     platform/mediastream/MediaStreamTrackPrivate.cpp
diff --git a/Source/WebCore/ChangeLog b/Source/WebCore/ChangeLog
index 1584807..0d75902 100644
--- a/Source/WebCore/ChangeLog
+++ b/Source/WebCore/ChangeLog
@@ -1,3 +1,72 @@
+2016-10-21  Adam Bergkvist  <adam.bergkvist@ericsson.com>
+
+        WebRTC: [GTK] Add MediaEndpointOwr - an OpenWebRTC WebRTC backend
+        https://bugs.webkit.org/show_bug.cgi?id=163327
+
+        Reviewed by Philippe Normand.
+
+        Add MediaEndpointOwr, which is a MediaEndpoint implementation (WebRTC backend) based on
+        OpenWebRTC [1]. The WebRTC backend can be tested with a manual test. Automated testing
+        is still done with MockMediaEndpoint.
+
+        [1] http://www.openwebrtc.org/
+
+        Testing: Added manual test (webrtc-one-tab-p2p.html)
+
+        * CMakeLists.txt:
+        * platform/GStreamer.cmake:
+        * platform/mediastream/openwebrtc/MediaEndpointOwr.cpp: Added.
+        (WebCore::createMediaEndpointOwr):
+        (WebCore::MediaEndpointOwr::MediaEndpointOwr):
+        (WebCore::MediaEndpointOwr::~MediaEndpointOwr):
+        (WebCore::MediaEndpointOwr::setConfiguration):
+        (WebCore::cryptoDataCallback):
+        (WebCore::MediaEndpointOwr::generateDtlsInfo):
+        (WebCore::MediaEndpointOwr::getDefaultAudioPayloads):
+        (WebCore::MediaEndpointOwr::getDefaultVideoPayloads):
+        (WebCore::payloadsContainType):
+        (WebCore::MediaEndpointOwr::filterPayloads):
+        (WebCore::MediaEndpointOwr::updateReceiveConfiguration):
+        (WebCore::findRtxPayload):
+        (WebCore::MediaEndpointOwr::updateSendConfiguration):
+        (WebCore::MediaEndpointOwr::addRemoteCandidate):
+        (WebCore::MediaEndpointOwr::replaceMutedRemoteSourceMid):
+        (WebCore::MediaEndpointOwr::createMutedRemoteSource):
+        (WebCore::MediaEndpointOwr::replaceSendSource):
+        (WebCore::MediaEndpointOwr::stop):
+        (WebCore::MediaEndpointOwr::transceiverIndexForSession):
+        (WebCore::MediaEndpointOwr::sessionMid):
+        (WebCore::MediaEndpointOwr::matchTransceiverByMid):
+        (WebCore::MediaEndpointOwr::dispatchNewIceCandidate):
+        (WebCore::MediaEndpointOwr::dispatchGatheringDone):
+        (WebCore::MediaEndpointOwr::processIceTransportStateChange):
+        (WebCore::MediaEndpointOwr::dispatchDtlsFingerprint):
+        (WebCore::MediaEndpointOwr::unmuteRemoteSource):
+        (WebCore::MediaEndpointOwr::prepareSession):
+        (WebCore::MediaEndpointOwr::prepareMediaSession):
+        (WebCore::parseHelperServerUrl):
+        (WebCore::MediaEndpointOwr::ensureTransportAgentAndTransceivers):
+        (WebCore::MediaEndpointOwr::internalAddRemoteCandidate):
+        (WebCore::gotCandidate):
+        (WebCore::candidateGatheringDone):
+        (WebCore::iceConnectionStateChange):
+        (WebCore::gotIncomingSource):
+        * platform/mediastream/openwebrtc/MediaEndpointOwr.h: Added.
+        (WebCore::OwrTransceiver::create):
+        (WebCore::OwrTransceiver::~OwrTransceiver):
+        (WebCore::OwrTransceiver::mid):
+        (WebCore::OwrTransceiver::session):
+        (WebCore::OwrTransceiver::owrIceState):
+        (WebCore::OwrTransceiver::setOwrIceState):
+        (WebCore::OwrTransceiver::gotEndOfRemoteCandidates):
+        (WebCore::OwrTransceiver::markGotEndOfRemoteCandidates):
+        (WebCore::OwrTransceiver::OwrTransceiver):
+        * platform/mediastream/openwebrtc/RealtimeMediaSourceOwr.h:
+        (WebCore::RealtimeMediaSourceOwr::RealtimeMediaSourceOwr):
+        (WebCore::RealtimeMediaSourceOwr::swapOutShallowSource):
+        Add support for an initially muted source. This is used for early
+        creation of remote sources.
+
 2016-10-21  Javier Fernandez  <jfernandez@igalia.com>
 
         [css-grid] Content Alignment broken with indefinite sized grid container
diff --git a/Source/WebCore/platform/GStreamer.cmake b/Source/WebCore/platform/GStreamer.cmake
index 031c9f1..37fed6f 100644
--- a/Source/WebCore/platform/GStreamer.cmake
+++ b/Source/WebCore/platform/GStreamer.cmake
@@ -14,6 +14,7 @@ if (ENABLE_MEDIA_STREAM)
     list(APPEND WebCore_SOURCES
         platform/graphics/gstreamer/MediaPlayerPrivateGStreamerOwr.cpp
 
+        platform/mediastream/openwebrtc/MediaEndpointOwr.cpp
         platform/mediastream/openwebrtc/OpenWebRTCUtilities.cpp
         platform/mediastream/openwebrtc/RealtimeMediaSourceCenterOwr.cpp
     )
diff --git a/Source/WebCore/platform/mediastream/openwebrtc/MediaEndpointOwr.cpp b/Source/WebCore/platform/mediastream/openwebrtc/MediaEndpointOwr.cpp
new file mode 100644
index 0000000..39c1164
--- /dev/null
+++ b/Source/WebCore/platform/mediastream/openwebrtc/MediaEndpointOwr.cpp
@@ -0,0 +1,702 @@
+/*
+ * Copyright (C) 2015, 2016 Ericsson AB. All rights reserved.
+ *
+ * Redistribution and use in source and binary forms, with or without
+ * modification, are permitted provided that the following conditions
+ * are met:
+ *
+ * 1. Redistributions of source code must retain the above copyright
+ *    notice, this list of conditions and the following disclaimer.
+ * 2. Redistributions in binary form must reproduce the above copyright
+ *    notice, this list of conditions and the following disclaimer
+ *    in the documentation and/or other materials provided with the
+ *    distribution.
+ * 3. Neither the name of Ericsson nor the names of its contributors
+ *    may be used to endorse or promote products derived from this
+ *    software without specific prior written permission.
+ *
+ * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+ * "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+ * LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+ * A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+ * OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+ * SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+ * LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+ * DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+ * THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+ * (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+ * OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+ */
+
+#include "config.h"
+
+#if ENABLE(WEB_RTC)
+#include "MediaEndpointOwr.h"
+
+#include "MediaEndpointSessionConfiguration.h"
+#include "MediaPayload.h"
+#include "NotImplemented.h"
+#include "OpenWebRTCUtilities.h"
+#include "RealtimeMediaSourceOwr.h"
+#include <owr/owr.h>
+#include <owr/owr_audio_payload.h>
+#include <owr/owr_crypto_utils.h>
+#include <owr/owr_media_session.h>
+#include <owr/owr_transport_agent.h>
+#include <owr/owr_video_payload.h>
+#include <wtf/text/CString.h>
+
+namespace WebCore {
+
+static void gotCandidate(OwrSession*, OwrCandidate*, MediaEndpointOwr*);
+static void candidateGatheringDone(OwrSession*, MediaEndpointOwr*);
+static void iceConnectionStateChange(OwrSession*, GParamSpec*, MediaEndpointOwr*);
+static void gotIncomingSource(OwrMediaSession*, OwrMediaSource*, MediaEndpointOwr*);
+
+static const Vector<String> candidateTypes = { "host", "srflx", "prflx", "relay" };
+static const Vector<String> candidateTcpTypes = { "", "active", "passive", "so" };
+static const Vector<String> codecTypes = { "NONE", "PCMU", "PCMA", "OPUS", "H264", "VP8" };
+
+static const char* helperServerRegEx = "(turn|stun):([\\w\\.\\-]+|\\[[\\w\\:]+\\])(:\\d+)?(\\?.+)?";
+
+static const unsigned short helperServerDefaultPort = 3478;
+static const unsigned short candidateDefaultPort = 9;
+
+static std::unique_ptr<MediaEndpoint> createMediaEndpointOwr(MediaEndpointClient& client)
+{
+    return std::unique_ptr<MediaEndpoint>(new MediaEndpointOwr(client));
+}
+
+CreateMediaEndpoint MediaEndpoint::create = createMediaEndpointOwr;
+
+MediaEndpointOwr::MediaEndpointOwr(MediaEndpointClient& client)
+    : m_transportAgent(nullptr)
+    , m_client(client)
+    , m_numberOfReceivePreparedSessions(0)
+    , m_numberOfSendPreparedSessions(0)
+{
+    initializeOpenWebRTC();
+
+    GRegexCompileFlags compileFlags = G_REGEX_JAVASCRIPT_COMPAT;
+    GRegexMatchFlags matchFlags = static_cast<GRegexMatchFlags>(0);
+    m_helperServerRegEx = g_regex_new(helperServerRegEx, compileFlags, matchFlags, nullptr);
+}
+
+MediaEndpointOwr::~MediaEndpointOwr()
+{
+    stop();
+
+    g_regex_unref(m_helperServerRegEx);
+}
+
+void MediaEndpointOwr::setConfiguration(RefPtr<MediaEndpointConfiguration>&& configuration)
+{
+    m_configuration = configuration;
+}
+
+static void cryptoDataCallback(gchar* privateKey, gchar* certificate, gchar* fingerprint, gchar* fingerprintFunction, gpointer data)
+{
+    MediaEndpointOwr* mediaEndpoint = static_cast<MediaEndpointOwr*>(data);
+    mediaEndpoint->dispatchDtlsFingerprint(g_strdup(privateKey), g_strdup(certificate), String(fingerprint), String(fingerprintFunction));
+}
+
+void MediaEndpointOwr::generateDtlsInfo()
+{
+    owr_crypto_create_crypto_data(cryptoDataCallback, this);
+}
+
+MediaPayloadVector MediaEndpointOwr::getDefaultAudioPayloads()
+{
+    MediaPayloadVector payloads;
+
+    // FIXME: This list should be based on what is available in the platform (bug: http://webkit.org/b/163723)
+    RefPtr<MediaPayload> payload = MediaPayload::create();
+    payload->setType(111);
+    payload->setEncodingName("OPUS");
+    payload->setClockRate(48000);
+    payload->setChannels(2);
+    payloads.append(payload);
+
+    payload = MediaPayload::create();
+    payload->setType(8);
+    payload->setEncodingName("PCMA");
+    payload->setClockRate(8000);
+    payload->setChannels(1);
+    payloads.append(payload);
+
+    payload = MediaPayload::create();
+    payload->setType(0);
+    payload->setEncodingName("PCMU");
+    payload->setClockRate(8000);
+    payload->setChannels(1);
+    payloads.append(payload);
+
+    return payloads;
+}
+
+MediaPayloadVector MediaEndpointOwr::getDefaultVideoPayloads()
+{
+    MediaPayloadVector payloads;
+
+    // FIXME: This list should be based on what is available in the platform (bug: http://webkit.org/b/163723)
+    RefPtr<MediaPayload> payload = MediaPayload::create();
+    payload->setType(103);
+    payload->setEncodingName("H264");
+    payload->setClockRate(90000);
+    payload->setCcmfir(true);
+    payload->setNackpli(true);
+    payload->addParameter("packetizationMode", 1);
+    payloads.append(payload);
+
+    payload = MediaPayload::create();
+    payload->setType(100);
+    payload->setEncodingName("VP8");
+    payload->setClockRate(90000);
+    payload->setCcmfir(true);
+    payload->setNackpli(true);
+    payload->setNack(true);
+    payloads.append(payload);
+
+    payload = MediaPayload::create();
+    payload->setType(120);
+    payload->setEncodingName("RTX");
+    payload->setClockRate(90000);
+    payload->addParameter("apt", 100);
+    payload->addParameter("rtxTime", 200);
+    payloads.append(payload);
+
+    return payloads;
+}
+
+static bool payloadsContainType(MediaPayloadVector payloads, unsigned payloadType)
+{
+    for (auto& payload : payloads) {
+        if (payload->type() == payloadType)
+            return true;
+    }
+    return false;
+}
+
+MediaPayloadVector MediaEndpointOwr::filterPayloads(const MediaPayloadVector& remotePayloads, const MediaPayloadVector& defaultPayloads)
+{
+    MediaPayloadVector filteredPayloads;
+
+    for (auto& remotePayload : remotePayloads) {
+        MediaPayload* defaultPayload = nullptr;
+        for (auto& p : defaultPayloads) {
+            if (p->encodingName() == remotePayload->encodingName().convertToASCIIUppercase()) {
+                defaultPayload = p.get();
+                break;
+            }
+        }
+        if (!defaultPayload)
+            continue;
+
+        if (defaultPayload->parameters().contains("packetizationMode") && remotePayload->parameters().contains("packetizationMode")
+            && (defaultPayload->parameters().get("packetizationMode") != remotePayload->parameters().get("packetizationMode")))
+            continue;
+
+        filteredPayloads.append(remotePayload);
+    }
+
+    MediaPayloadVector filteredAptPayloads;
+
+    for (auto& filteredPayload : filteredPayloads) {
+        if (filteredPayload->parameters().contains("apt") && (!payloadsContainType(filteredPayloads, filteredPayload->parameters().get("apt"))))
+            continue;
+        filteredAptPayloads.append(filteredPayload);
+    }
+
+    return filteredAptPayloads;
+}
+
+MediaEndpoint::UpdateResult MediaEndpointOwr::updateReceiveConfiguration(MediaEndpointSessionConfiguration* configuration, bool isInitiator)
+{
+    Vector<TransceiverConfig> transceiverConfigs;
+    for (unsigned i = m_transceivers.size(); i < configuration->mediaDescriptions().size(); ++i) {
+        TransceiverConfig config;
+        config.type = SessionTypeMedia;
+        config.isDtlsClient = configuration->mediaDescriptions()[i]->dtlsSetup() == "active";
+        config.mid = configuration->mediaDescriptions()[i]->mid();
+        transceiverConfigs.append(WTFMove(config));
+    }
+
+    ensureTransportAgentAndTransceivers(isInitiator, transceiverConfigs);
+
+    // Prepare the new sessions.
+    for (unsigned i = m_numberOfReceivePreparedSessions; i < m_transceivers.size(); ++i) {
+        OwrSession* session = m_transceivers[i]->session();
+        prepareMediaSession(OWR_MEDIA_SESSION(session), configuration->mediaDescriptions()[i].get(), isInitiator);
+        owr_transport_agent_add_session(m_transportAgent, session);
+    }
+
+    m_numberOfReceivePreparedSessions = m_transceivers.size();
+
+    return UpdateResult::Success;
+}
+
+static RefPtr<MediaPayload> findRtxPayload(MediaPayloadVector payloads, unsigned apt)
+{
+    for (auto& payload : payloads) {
+        if (payload->encodingName().convertToASCIIUppercase() == "RTX" && payload->parameters().contains("apt")
+            && (payload->parameters().get("apt") == apt))
+            return payload;
+    }
+    return nullptr;
+}
+
+MediaEndpoint::UpdateResult MediaEndpointOwr::updateSendConfiguration(MediaEndpointSessionConfiguration* configuration, const RealtimeMediaSourceMap& sendSourceMap, bool isInitiator)
+{
+    Vector<TransceiverConfig> transceiverConfigs;
+    for (unsigned i = m_transceivers.size(); i < configuration->mediaDescriptions().size(); ++i) {
+        TransceiverConfig config;
+        config.type = SessionTypeMedia;
+        config.isDtlsClient = configuration->mediaDescriptions()[i]->dtlsSetup() != "active";
+        config.mid = configuration->mediaDescriptions()[i]->mid();
+        transceiverConfigs.append(WTFMove(config));
+    }
+
+    ensureTransportAgentAndTransceivers(isInitiator, transceiverConfigs);
+
+    for (unsigned i = 0; i < m_transceivers.size(); ++i) {
+        OwrSession* session = m_transceivers[i]->session();
+        PeerMediaDescription& mdesc = *configuration->mediaDescriptions()[i];
+
+        if (mdesc.type() == "audio" || mdesc.type() == "video")
+            g_object_set(session, "rtcp-mux", mdesc.rtcpMux(), nullptr);
+
+        if (mdesc.iceCandidates().size()) {
+            for (auto& candidate : mdesc.iceCandidates())
+                internalAddRemoteCandidate(session, *candidate, mdesc.iceUfrag(), mdesc.icePassword());
+        }
+
+        if (i < m_numberOfSendPreparedSessions)
+            continue;
+
+        if (!sendSourceMap.contains(mdesc.mid()))
+            continue;
+
+        MediaPayload* payload = nullptr;
+        for (auto& p : mdesc.payloads()) {
+            if (p->encodingName().convertToASCIIUppercase() != "RTX") {
+                payload = p.get();
+                break;
+            }
+        }
+
+        if (!payload)
+            return UpdateResult::Failed;
+
+        RefPtr<MediaPayload> rtxPayload = findRtxPayload(mdesc.payloads(), payload->type());
+        RealtimeMediaSourceOwr* source = static_cast<RealtimeMediaSourceOwr*>(sendSourceMap.get(mdesc.mid()));
+
+        ASSERT(codecTypes.find(payload->encodingName().convertToASCIIUppercase()) != notFound);
+        OwrCodecType codecType = static_cast<OwrCodecType>(codecTypes.find(payload->encodingName().convertToASCIIUppercase()));
+
+        OwrPayload* sendPayload;
+        if (mdesc.type() == "audio")
+            sendPayload = owr_audio_payload_new(codecType, payload->type(), payload->clockRate(), payload->channels());
+        else {
+            sendPayload = owr_video_payload_new(codecType, payload->type(), payload->clockRate(), payload->ccmfir(), payload->nackpli());
+            g_object_set(sendPayload, "rtx-payload-type", rtxPayload ? rtxPayload->type() : -1,
+                "rtx-time", rtxPayload && rtxPayload->parameters().contains("rtxTime") ? rtxPayload->parameters().get("rtxTime") : 0, nullptr);
+        }
+
+        owr_media_session_set_send_payload(OWR_MEDIA_SESSION(session), sendPayload);
+        owr_media_session_set_send_source(OWR_MEDIA_SESSION(session), source->mediaSource());
+
+        m_numberOfSendPreparedSessions = i + 1;
+    }
+
+    return UpdateResult::Success;
+}
+
+void MediaEndpointOwr::addRemoteCandidate(IceCandidate& candidate, const String& mid, const String& ufrag, const String& password)
+{
+    for (auto& transceiver : m_transceivers) {
+        if (transceiver->mid() == mid) {
+            internalAddRemoteCandidate(transceiver->session(), candidate, ufrag, password);
+            break;
+        }
+    }
+}
+
+void MediaEndpointOwr::replaceMutedRemoteSourceMid(const String& oldMid, const String& newMid)
+{
+    RefPtr<RealtimeMediaSourceOwr> remoteSource = m_mutedRemoteSources.take(oldMid);
+    m_mutedRemoteSources.set(newMid, remoteSource);
+}
+
+Ref<RealtimeMediaSource> MediaEndpointOwr::createMutedRemoteSource(const String& mid, RealtimeMediaSource::Type type)
+{
+    String name;
+    String id("not used");
+
+    switch (type) {
+    case RealtimeMediaSource::Audio: name = "remote audio"; break;
+    case RealtimeMediaSource::Video: name = "remote video"; break;
+    case RealtimeMediaSource::None:
+        ASSERT_NOT_REACHED();
+    }
+
+    RefPtr<RealtimeMediaSourceOwr> source = adoptRef(new RealtimeMediaSourceOwr(nullptr, id, type, name));
+    m_mutedRemoteSources.set(mid, source);
+
+    return *source;
+}
+
+void MediaEndpointOwr::replaceSendSource(RealtimeMediaSource& newSource, const String& mid)
+{
+    UNUSED_PARAM(newSource);
+    UNUSED_PARAM(mid);
+
+    // FIXME: We want to use owr_media_session_set_send_source here, but it doesn't work as intended.
+    // Issue tracked by OpenWebRTC bug: https://github.com/EricssonResearch/openwebrtc/issues/533
+
+    notImplemented();
+}
+
+void MediaEndpointOwr::stop()
+{
+    if (!m_transportAgent)
+        return;
+
+    for (auto& transceiver : m_transceivers)
+        owr_media_session_set_send_source(OWR_MEDIA_SESSION(transceiver->session()), nullptr);
+
+    g_object_unref(m_transportAgent);
+    m_transportAgent = nullptr;
+}
+
+size_t MediaEndpointOwr::transceiverIndexForSession(OwrSession* session) const
+{
+    for (unsigned i = 0; i < m_transceivers.size(); ++i) {
+        if (m_transceivers[i]->session() == session)
+            return i;
+    }
+
+    ASSERT_NOT_REACHED();
+    return notFound;
+}
+
+const String& MediaEndpointOwr::sessionMid(OwrSession* session) const
+{
+    size_t index = transceiverIndexForSession(session);
+    return m_transceivers[index]->mid();
+}
+
+OwrTransceiver* MediaEndpointOwr::matchTransceiverByMid(const String& mid) const
+{
+    for (auto& transceiver : m_transceivers) {
+        if (transceiver->mid() == mid)
+            return transceiver.get();
+    }
+    return nullptr;
+}
+
+void MediaEndpointOwr::dispatchNewIceCandidate(const String& mid, RefPtr<IceCandidate>&& iceCandidate)
+{
+    m_client.gotIceCandidate(mid, WTFMove(iceCandidate));
+}
+
+void MediaEndpointOwr::dispatchGatheringDone(const String& mid)
+{
+    m_client.doneGatheringCandidates(mid);
+}
+
+void MediaEndpointOwr::processIceTransportStateChange(OwrSession* session)
+{
+    OwrIceState owrIceState;
+    g_object_get(session, "ice-connection-state", &owrIceState, nullptr);
+
+    OwrTransceiver& transceiver = *m_transceivers[transceiverIndexForSession(session)];
+    if (owrIceState < transceiver.owrIceState())
+        return;
+
+    transceiver.setOwrIceState(owrIceState);
+
+    // We cannot go to Completed if there may be more remote candidates.
+    if (owrIceState == OWR_ICE_STATE_READY && !transceiver.gotEndOfRemoteCandidates())
+        return;
+
+    MediaEndpoint::IceTransportState transportState;
+    switch (owrIceState) {
+    case OWR_ICE_STATE_CONNECTING:
+        transportState = MediaEndpoint::IceTransportState::Checking;
+        break;
+    case OWR_ICE_STATE_CONNECTED:
+        transportState = MediaEndpoint::IceTransportState::Connected;
+        break;
+    case OWR_ICE_STATE_READY:
+        transportState = MediaEndpoint::IceTransportState::Completed;
+        break;
+    case OWR_ICE_STATE_FAILED:
+        transportState = MediaEndpoint::IceTransportState::Failed;
+        break;
+    default:
+        return;
+    }
+
+    m_client.iceTransportStateChanged(transceiver.mid(), transportState);
+}
+
+void MediaEndpointOwr::dispatchDtlsFingerprint(gchar* privateKey, gchar* certificate, const String& fingerprint, const String& fingerprintFunction)
+{
+    m_dtlsPrivateKey = String(privateKey);
+    m_dtlsCertificate = String(certificate);
+
+    g_free(privateKey);
+    g_free(certificate);
+
+    m_client.gotDtlsFingerprint(fingerprint, fingerprintFunction);
+}
+
+void MediaEndpointOwr::unmuteRemoteSource(const String& mid, OwrMediaSource* realSource)
+{
+    RefPtr<RealtimeMediaSourceOwr> remoteSource = m_mutedRemoteSources.take(mid);
+    if (!remoteSource) {
+        LOG_ERROR("Unable to find muted remote source.");
+        return;
+    }
+
+    if (!remoteSource->stopped())
+        remoteSource->swapOutShallowSource(*realSource);
+}
+
+void MediaEndpointOwr::prepareSession(OwrSession* session, PeerMediaDescription* mediaDescription)
+{
+    g_object_set_data_full(G_OBJECT(session), "ice-ufrag", g_strdup(mediaDescription->iceUfrag().ascii().data()), g_free);
+    g_object_set_data_full(G_OBJECT(session), "ice-password", g_strdup(mediaDescription->icePassword().ascii().data()), g_free);
+
+    g_signal_connect(session, "on-new-candidate", G_CALLBACK(gotCandidate), this);
+    g_signal_connect(session, "on-candidate-gathering-done", G_CALLBACK(candidateGatheringDone), this);
+    g_signal_connect(session, "notify::ice-connection-state", G_CALLBACK(iceConnectionStateChange), this);
+}
+
+void MediaEndpointOwr::prepareMediaSession(OwrMediaSession* mediaSession, PeerMediaDescription* mediaDescription, bool isInitiator)
+{
+    prepareSession(OWR_SESSION(mediaSession), mediaDescription);
+
+    bool useRtcpMux = !isInitiator && mediaDescription->rtcpMux();
+    g_object_set(mediaSession, "rtcp-mux", useRtcpMux, nullptr);
+
+    if (!mediaDescription->cname().isEmpty() && mediaDescription->ssrcs().size()) {
+        g_object_set(mediaSession, "cname", mediaDescription->cname().ascii().data(),
+            "send-ssrc", mediaDescription->ssrcs()[0],
+            nullptr);
+    }
+
+    g_signal_connect(mediaSession, "on-incoming-source", G_CALLBACK(gotIncomingSource), this);
+
+    for (auto& payload : mediaDescription->payloads()) {
+        if (payload->encodingName().convertToASCIIUppercase() == "RTX")
+            continue;
+
+        RefPtr<MediaPayload> rtxPayload = findRtxPayload(mediaDescription->payloads(), payload->type());
+
+        ASSERT(codecTypes.find(payload->encodingName().convertToASCIIUppercase()) != notFound);
+        OwrCodecType codecType = static_cast<OwrCodecType>(codecTypes.find(payload->encodingName().convertToASCIIUppercase()));
+
+        OwrPayload* receivePayload;
+        if (mediaDescription->type() == "audio")
+            receivePayload = owr_audio_payload_new(codecType, payload->type(), payload->clockRate(), payload->channels());
+        else {
+            receivePayload = owr_video_payload_new(codecType, payload->type(), payload->clockRate(), payload->ccmfir(), payload->nackpli());
+            g_object_set(receivePayload, "rtx-payload-type", rtxPayload ? rtxPayload->type() : -1,
+                "rtx-time", rtxPayload && rtxPayload->parameters().contains("rtxTime") ? rtxPayload->parameters().get("rtxTime") : 0, nullptr);
+        }
+
+        owr_media_session_add_receive_payload(mediaSession, receivePayload);
+    }
+}
+
+struct HelperServerUrl {
+    String protocol;
+    String host;
+    unsigned short port;
+    String query;
+};
+
+static void parseHelperServerUrl(GRegex& regex, const URL& url, HelperServerUrl& outUrl)
+{
+    GMatchInfo* matchInfo;
+
+    if (g_regex_match(&regex, url.string().ascii().data(), static_cast<GRegexMatchFlags>(0), &matchInfo)) {
+        gchar** matches = g_match_info_fetch_all(matchInfo);
+        gint matchCount = g_strv_length(matches);
+
+        outUrl.protocol = matches[1];
+        outUrl.host = matches[2][0] == '[' ? String(matches[2] + 1, strlen(matches[2]) - 2) // IPv6
+            : matches[2];
+
+        outUrl.port = 0;
+        if (matchCount >= 4) {
+            String portString = String(matches[3] + 1); // Skip port colon
+            outUrl.port = portString.toUIntStrict();
+        }
+
+        if (matchCount == 5)
+            outUrl.query = String(matches[4] + 1); // Skip question mark
+
+        g_strfreev(matches);
+    }
+
+    g_match_info_free(matchInfo);
+}
+
+void MediaEndpointOwr::ensureTransportAgentAndTransceivers(bool isInitiator, const Vector<TransceiverConfig>& transceiverConfigs)
+{
+    ASSERT(!m_dtlsPrivateKey.isEmpty());
+    ASSERT(!m_dtlsCertificate.isEmpty());
+
+    if (!m_transportAgent) {
+        m_transportAgent = owr_transport_agent_new(false);
+
+        for (auto& server : m_configuration->iceServers()) {
+            for (auto& webkitUrl : server->urls()) {
+                HelperServerUrl url;
+                // WebKit's URL class can't handle ICE helper server urls properly
+                parseHelperServerUrl(*m_helperServerRegEx, webkitUrl, url);
+
+                unsigned short port = url.port ? url.port : helperServerDefaultPort;
+
+                if (url.protocol == "stun") {
+                    owr_transport_agent_add_helper_server(m_transportAgent, OWR_HELPER_SERVER_TYPE_STUN,
+                        url.host.ascii().data(), port, nullptr, nullptr);
+
+                } else if (url.protocol == "turn") {
+                    OwrHelperServerType serverType = url.query == "transport=tcp" ? OWR_HELPER_SERVER_TYPE_TURN_TCP
+                        : OWR_HELPER_SERVER_TYPE_TURN_UDP;
+
+                    owr_transport_agent_add_helper_server(m_transportAgent, serverType,
+                        url.host.ascii().data(), port,
+                        server->username().ascii().data(), server->credential().ascii().data());
+                } else
+                    ASSERT_NOT_REACHED();
+            }
+        }
+    }
+
+    g_object_set(m_transportAgent, "ice-controlling-mode", isInitiator, nullptr);
+
+    for (auto& config : transceiverConfigs) {
+        OwrSession* session = OWR_SESSION(owr_media_session_new(config.isDtlsClient));
+        g_object_set(session, "dtls-certificate", m_dtlsCertificate.utf8().data(),
+            "dtls-key", m_dtlsPrivateKey.utf8().data(),
+            nullptr);
+
+        m_transceivers.append(OwrTransceiver::create(config.mid, session));
+    }
+}
+
+void MediaEndpointOwr::internalAddRemoteCandidate(OwrSession* session, IceCandidate& candidate, const String& ufrag, const String& password)
+{
+    gboolean rtcpMux;
+    g_object_get(session, "rtcp-mux", &rtcpMux, nullptr);
+
+    if (rtcpMux && candidate.componentId() == OWR_COMPONENT_TYPE_RTCP)
+        return;
+
+    ASSERT(candidateTypes.find(candidate.type()) != notFound);
+
+    OwrCandidateType candidateType = static_cast<OwrCandidateType>(candidateTypes.find(candidate.type()));
+    OwrComponentType componentId = static_cast<OwrComponentType>(candidate.componentId());
+    OwrTransportType transportType;
+
+    if (candidate.transport().convertToASCIIUppercase() == "UDP")
+        transportType = OWR_TRANSPORT_TYPE_UDP;
+    else {
+        ASSERT(candidateTcpTypes.find(candidate.tcpType()) != notFound);
+        transportType = static_cast<OwrTransportType>(candidateTcpTypes.find(candidate.tcpType()));
+    }
+
+    OwrCandidate* owrCandidate = owr_candidate_new(candidateType, componentId);
+    g_object_set(owrCandidate, "transport-type", transportType,
+        "address", candidate.address().ascii().data(),
+        "port", candidate.port(),
+        "base-address", candidate.relatedAddress().ascii().data(),
+        "base-port", candidate.relatedPort(),
+        "priority", candidate.priority(),
+        "foundation", candidate.foundation().ascii().data(),
+        "ufrag", ufrag.ascii().data(),
+        "password", password.ascii().data(),
+        nullptr);
+
+    owr_session_add_remote_candidate(session, owrCandidate);
+}
+
+static void gotCandidate(OwrSession* session, OwrCandidate* candidate, MediaEndpointOwr* mediaEndpoint)
+{
+    OwrCandidateType candidateType;
+    gchar* foundation;
+    OwrComponentType componentId;
+    OwrTransportType transportType;
+    gint priority;
+    gchar* address;
+    guint port;
+    gchar* relatedAddress;
+    guint relatedPort;
+
+    g_object_get(candidate, "type", &candidateType,
+        "foundation", &foundation,
+        "component-type", &componentId,
+        "transport-type", &transportType,
+        "priority", &priority,
+        "address", &address,
+        "port", &port,
+        "base-address", &relatedAddress,
+        "base-port", &relatedPort,
+        nullptr);
+
+    ASSERT(candidateType >= 0 && candidateType < candidateTypes.size());
+    ASSERT(transportType >= 0 && transportType < candidateTcpTypes.size());
+
+    RefPtr<IceCandidate> iceCandidate = IceCandidate::create();
+    iceCandidate->setType(candidateTypes[candidateType]);
+    iceCandidate->setFoundation(foundation);
+    iceCandidate->setComponentId(componentId);
+    iceCandidate->setPriority(priority);
+    iceCandidate->setAddress(address);
+    iceCandidate->setPort(port ? port : candidateDefaultPort);
+
+    if (transportType == OWR_TRANSPORT_TYPE_UDP)
+        iceCandidate->setTransport("UDP");
+    else {
+        iceCandidate->setTransport("TCP");
+        iceCandidate->setTcpType(candidateTcpTypes[transportType]);
+    }
+
+    if (candidateType != OWR_CANDIDATE_TYPE_HOST) {
+        iceCandidate->setRelatedAddress(relatedAddress);
+        iceCandidate->setRelatedPort(relatedPort ? relatedPort : candidateDefaultPort);
+    }
+
+    g_object_set(G_OBJECT(candidate), "ufrag", g_object_get_data(G_OBJECT(session), "ice-ufrag"),
+        "password", g_object_get_data(G_OBJECT(session), "ice-password"),
+        nullptr);
+
+    mediaEndpoint->dispatchNewIceCandidate(mediaEndpoint->sessionMid(session), WTFMove(iceCandidate));
+
+    g_free(foundation);
+    g_free(address);
+    g_free(relatedAddress);
+}
+
+static void candidateGatheringDone(OwrSession* session, MediaEndpointOwr* mediaEndpoint)
+{
+    mediaEndpoint->dispatchGatheringDone(mediaEndpoint->sessionMid(session));
+}
+
+static void iceConnectionStateChange(OwrSession* session, GParamSpec*, MediaEndpointOwr* mediaEndpoint)
+{
+    mediaEndpoint->processIceTransportStateChange(session);
+}
+
+static void gotIncomingSource(OwrMediaSession* mediaSession, OwrMediaSource* source, MediaEndpointOwr* mediaEndpoint)
+{
+    mediaEndpoint->unmuteRemoteSource(mediaEndpoint->sessionMid(OWR_SESSION(mediaSession)), source);
+}
+
+} // namespace WebCore
+
+#endif // ENABLE(WEB_RTC)
diff --git a/Source/WebCore/platform/mediastream/openwebrtc/MediaEndpointOwr.h b/Source/WebCore/platform/mediastream/openwebrtc/MediaEndpointOwr.h
new file mode 100644
index 0000000..7d43dc2
--- /dev/null
+++ b/Source/WebCore/platform/mediastream/openwebrtc/MediaEndpointOwr.h
@@ -0,0 +1,146 @@
+/*
+ * Copyright (C) 2015, 2016 Ericsson AB. All rights reserved.
+ *
+ * Redistribution and use in source and binary forms, with or without
+ * modification, are permitted provided that the following conditions
+ * are met:
+ *
+ * 1. Redistributions of source code must retain the above copyright
+ *    notice, this list of conditions and the following disclaimer.
+ * 2. Redistributions in binary form must reproduce the above copyright
+ *    notice, this list of conditions and the following disclaimer
+ *    in the documentation and/or other materials provided with the
+ *    distribution.
+ * 3. Neither the name of Ericsson nor the names of its contributors
+ *    may be used to endorse or promote products derived from this
+ *    software without specific prior written permission.
+ *
+ * THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
+ * "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
+ * LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
+ * A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
+ * OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
+ * SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
+ * LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+ * DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+ * THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+ * (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+ * OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+ */
+
+#pragma once
+
+#if ENABLE(WEB_RTC)
+
+#include "MediaEndpoint.h"
+#include <owr/owr_session.h>
+#include <wtf/RefCounted.h>
+#include <wtf/RefPtr.h>
+
+typedef struct _OwrMediaSession OwrMediaSession;
+typedef struct _OwrMediaSource OwrMediaSource;
+typedef struct _OwrTransportAgent OwrTransportAgent;
+
+namespace WebCore {
+
+class PeerMediaDescription;
+class RealtimeMediaSourceOwr;
+class RTCConfigurationPrivate;
+
+class OwrTransceiver : public RefCounted<OwrTransceiver> {
+public:
+    static Ref<OwrTransceiver> create(const String& mid, OwrSession* session)
+    {
+        return adoptRef(*new OwrTransceiver(mid, session));
+    }
+    virtual ~OwrTransceiver() { }
+
+    const String& mid() const { return m_mid; }
+    OwrSession* session() const { return m_session; }
+
+    OwrIceState owrIceState() const { return m_owrIceState; }
+    void setOwrIceState(OwrIceState state) { m_owrIceState = state; }
+
+    bool gotEndOfRemoteCandidates() const { return m_gotEndOfRemoteCandidates; }
+    void markGotEndOfRemoteCandidates() { m_gotEndOfRemoteCandidates = true; }
+
+private:
+    OwrTransceiver(const String& mid, OwrSession* session)
+        : m_mid(mid)
+        , m_session(session)
+    { }
+
+    String m_mid;
+    OwrSession* m_session;
+
+    OwrIceState m_owrIceState { OWR_ICE_STATE_DISCONNECTED };
+    bool m_gotEndOfRemoteCandidates { false };
+};
+
+class MediaEndpointOwr : public MediaEndpoint {
+public:
+    MediaEndpointOwr(MediaEndpointClient&);
+    ~MediaEndpointOwr();
+
+    void setConfiguration(RefPtr<MediaEndpointConfiguration>&&) override;
+
+    void generateDtlsInfo() override;
+    MediaPayloadVector getDefaultAudioPayloads() override;
+    MediaPayloadVector getDefaultVideoPayloads() override;
+    MediaPayloadVector filterPayloads(const MediaPayloadVector& remotePayloads, const MediaPayloadVector& defaultPayloads) override;
+
+    UpdateResult updateReceiveConfiguration(MediaEndpointSessionConfiguration*, bool isInitiator) override;
+    UpdateResult updateSendConfiguration(MediaEndpointSessionConfiguration*, const RealtimeMediaSourceMap&, bool isInitiator) override;
+
+    void addRemoteCandidate(IceCandidate&, const String& mid, const String& ufrag, const String& password) override;
+
+    Ref<RealtimeMediaSource> createMutedRemoteSource(const String& mid, RealtimeMediaSource::Type) override;
+    void replaceMutedRemoteSourceMid(const String&, const String&) final;
+    void replaceSendSource(RealtimeMediaSource&, const String& mid) override;
+
+    void stop() override;
+
+    size_t transceiverIndexForSession(OwrSession*) const;
+    const String& sessionMid(OwrSession*) const;
+    OwrTransceiver* matchTransceiverByMid(const String& mid) const;
+
+    void dispatchNewIceCandidate(const String& mid, RefPtr<IceCandidate>&&);
+    void dispatchGatheringDone(const String& mid);
+    void processIceTransportStateChange(OwrSession*);
+    void dispatchDtlsFingerprint(gchar* privateKey, gchar* certificate, const String& fingerprint, const String& fingerprintFunction);
+    void unmuteRemoteSource(const String& mid, OwrMediaSource*);
+
+private:
+    enum SessionType { SessionTypeMedia };
+
+    struct TransceiverConfig {
+        SessionType type;
+        bool isDtlsClient;
+        String mid;
+    };
+
+    void prepareSession(OwrSession*, PeerMediaDescription*);
+    void prepareMediaSession(OwrMediaSession*, PeerMediaDescription*, bool isInitiator);
+
+    void ensureTransportAgentAndTransceivers(bool isInitiator, const Vector<TransceiverConfig>&);
+    void internalAddRemoteCandidate(OwrSession*, IceCandidate&, const String& ufrag, const String& password);
+
+    RefPtr<MediaEndpointConfiguration> m_configuration;
+    GRegex* m_helperServerRegEx;
+
+    OwrTransportAgent* m_transportAgent;
+    Vector<RefPtr<OwrTransceiver>> m_transceivers;
+    HashMap<String, RefPtr<RealtimeMediaSourceOwr>> m_mutedRemoteSources;
+
+    MediaEndpointClient& m_client;
+
+    unsigned m_numberOfReceivePreparedSessions;
+    unsigned m_numberOfSendPreparedSessions;
+
+    String m_dtlsPrivateKey;
+    String m_dtlsCertificate;
+};
+
+} // namespace WebCore
+
+#endif // ENABLE(WEB_RTC)
diff --git a/Source/WebCore/platform/mediastream/openwebrtc/RealtimeMediaSourceOwr.h b/Source/WebCore/platform/mediastream/openwebrtc/RealtimeMediaSourceOwr.h
index d24b065..8695c7b 100644
--- a/Source/WebCore/platform/mediastream/openwebrtc/RealtimeMediaSourceOwr.h
+++ b/Source/WebCore/platform/mediastream/openwebrtc/RealtimeMediaSourceOwr.h
@@ -54,6 +54,8 @@ RealtimeMediaSourceOwr(OwrMediaSource* mediaSource, const String& id, RealtimeMe
     : RealtimeMediaSource(id, type, name)
     , m_mediaSource(mediaSource)
     {
+        if (!mediaSource)
+            m_muted = true;
     }
 
 RealtimeMediaSourceOwr(const String& id, RealtimeMediaSource::Type type, const String& name)
@@ -64,6 +66,12 @@ RealtimeMediaSourceOwr(const String& id, RealtimeMediaSource::Type type, const S
 
     virtual ~RealtimeMediaSourceOwr() { }
 
+    void swapOutShallowSource(OwrMediaSource& realSource)
+    {
+        m_mediaSource = &realSource;
+        setMuted(false);
+    }
+
     virtual RefPtr<RealtimeMediaSourceCapabilities> capabilities() { return m_capabilities; }
     virtual const RealtimeMediaSourceSettings& settings() { return m_currentSettings; }