GstWebRTC - PubNub Signaling Examples - x86

<table>
{{GstWebRTC/Head|previous=OpenWebRTC Signaler Examples - x86|next=AppRTC Signaler Examples - x86|metakeywords=GstRrWebRTC Examples,WebRTC Examples,GstRrWebRTC GStreamer pipelines,WebRTC GStreamer pipelines,GstRrWebRTC PubNub Signaler,WebRTC PubNub Signaler,signaling}}
<tr>
<td><div style="clear:both; float:right">__TOC__</div></td>
<td valign=top>
{{Debug Symbol}} Problems running the pipelines shown on this page?<br>Please see our [http://developer.ridgerun.com/wiki/index.php?title=GStreamer_Debugging GStreamer Debugging guide] for help.
</td>
</table>


{{GstWebRTC Page|
This page links to the GstRrWebRTC examples using the PubNub signaler on the x86 platform.
[[GstWebRTC - Audio + Video Examples|Audio + Video]]|
[[GstWebRTC|Home]]|


This page shows a series of pipelines to try out the experimental PubNub support.
== Examples Index ==
# [[GstWebRTC - PubNub Audio Examples - x86|Audio Examples]]
# [[GstWebRTC - PubNub Video Examples - x86|Video Examples]]
# [[GstWebRTC - PubNub Audio + Video Examples - x86|Audio + Video Examples]]
# [[GstWebRTC - PubNub Web Pages - x86|Demo Web Page]]


{{GstWebRTC/Foot|previous=OpenWebRTC Signaler Examples - x86|next=AppRTC Signaler Examples - x86}}
 
== PubNub Demo Pages ==
 
PubNub offers a couple of test web pages to try out the WebRTC signalers.
* https://kevingleason.me/SimpleRTC/minivid.html
* https://www.pubnub.com/developers/demos/webrtc/
 
<pre>
These examples require audio and video in both directions.
</pre>
 
<pre style="background-color:yellow">
There is a known issue where different pipeline parameters are required for each browser. We are working to get this fixed.
</pre>
 
=== SimpleRTC WebPage ===
 
The following figure shows how to establish a call using the SimpleRTC web page at https://kevingleason.me/SimpleRTC/minivid.html
 
[[File:demo-minivid.png|700px|center|Establish a WebRTC call with https://kevingleason.me/SimpleRTC/minivid.html]]
 
# Type a unique name in the top text bar.
# Register that unique name with PubNub.
# Type the pipeline's channel name in the bottom text bar. In the examples below this is the value of $USER_CHANNEL, i.e. 123.
# Start the call.
 
=== PubNub Official Demo ===
 
The following figure shows how to establish a call using PubNub's official WebRTC demo at https://www.pubnub.com/developers/demos/webrtc/
 
[[File:demo-pubnub.png|700px|center|Establish a WebRTC call with https://www.pubnub.com/developers/demos/webrtc/]]
 
# Type the pipeline's channel name in the bottom text bar. In the examples below this is the value of $USER_CHANNEL, i.e. 123.
# Start the call.
 
=== PubNub's WebRTC page ===
 
The GstRrWebRTC repository also provides a PubNub test page, located under tests/examples/signalers/pubnub. In the following example, the page is served on port 8080:
 
<syntaxhighlight lang='bash'>
cd gst-webrtc/tests/examples/signalers/pubnub
python3 -m http.server 8080
</syntaxhighlight>
 
The following shows how to establish a call using the PubNub WebRTC page at http://localhost:8080 (use the port you defined):
 
# Type a unique name in the top text bar.
# Type the pipeline's channel name in the bottom text bar.
# Select the video and audio options: Disable, Send only, Receive only, or Send-Receive (see the note after this list).
# Select the data option: Send-Receive or Disable.
# Select the Register button.
# Start the call, either from the pipeline or by selecting the call button on the web page.
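Note that, as mentioned above, the pipelines on this page both send and receive audio and video, so selecting Send-Receive for audio and for video is the configuration that matches these examples.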
 
== VP8+Opus Send+Receive ==
 
===Pipeline to Webpage===
 
This pipeline encodes video and audio streams to VP8 and Opus, respectively, and sends them to the demo web page. Additionally, it receives the web page's video and audio feeds in the same format.
 
<syntaxhighlight lang=bash>
USER_CHANNEL=123
gst-launch-1.0 webrtcbin rtcp-mux=true start-call=false signaler::user-channel=$USER_CHANNEL name=web \
videotestsrc is-live=true ! vp8enc ! rtpvp8pay ! web.video_sink \
audiotestsrc is-live=true ! opusenc ! rtpopuspay ! web.audio_sink \
web.video_src ! rtpvp8depay ! avdec_vp8 ! autovideosink \
web.audio_src ! rtpopusdepay ! opusdec ! autoaudiosink
</syntaxhighlight>
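As a variation not covered by the original examples, the test sources can be replaced with a real capture device. The following is a minimal sketch that assumes a V4L2 camera at /dev/video0 and uses the default system audio source:

<syntaxhighlight lang=bash>
# Hypothetical variation: capture from a real camera and microphone instead of test sources.
# Assumes a V4L2 camera at /dev/video0; autoaudiosrc picks the default system microphone.
USER_CHANNEL=123
gst-launch-1.0 webrtcbin rtcp-mux=true start-call=false signaler::user-channel=$USER_CHANNEL name=web \
v4l2src device=/dev/video0 ! videoconvert ! vp8enc ! rtpvp8pay ! web.video_sink \
autoaudiosrc ! audioconvert ! audioresample ! opusenc ! rtpopuspay ! web.audio_sink \
web.video_src ! rtpvp8depay ! avdec_vp8 ! autovideosink \
web.audio_src ! rtpopusdepay ! opusdec ! autoaudiosink
</syntaxhighlight>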
 
===Pipeline to Pipeline===
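In the pipeline-to-pipeline case, two GStreamer endpoints call each other through PubNub. Note that the user-channel and peer-channel values are swapped between the two pipelines below, and only the first endpoint sets start-call=true to initiate the call.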
 
==== First Endpoint ====
<syntaxhighlight lang=bash>
USER_CHANNEL=123
PEER_CHANNEL=123peer
gst-launch-1.0 -v webrtcbin rtcp-mux=true start-call=true signaler::user-channel=$USER_CHANNEL signaler::peer-channel=$PEER_CHANNEL name=web \
videotestsrc is-live=true ! vp8enc ! rtpvp8pay ! web.video_sink \
audiotestsrc is-live=true ! opusenc ! rtpopuspay ! web.audio_sink \
web.video_src ! rtpvp8depay ! avdec_vp8 ! autovideosink \
web.audio_src ! rtpopusdepay ! opusdec ! autoaudiosink
</syntaxhighlight>
 
==== Second Endpoint ====
 
<syntaxhighlight lang=bash>
USER_CHANNEL=123peer
PEER_CHANNEL=123
gst-launch-1.0 -v webrtcbin rtcp-mux=true start-call=false signaler::user-channel=$USER_CHANNEL signaler::peer-channel=$PEER_CHANNEL name=web \
videotestsrc is-live=true ! vp8enc ! rtpvp8pay ! web.video_sink \
audiotestsrc is-live=true ! opusenc ! rtpopuspay ! web.audio_sink \
web.video_src ! rtpvp8depay ! avdec_vp8 ! autovideosink \
web.audio_src ! rtpopusdepay ! opusdec ! autoaudiosink
</syntaxhighlight>
 
== H264+Opus Send+Receive ==
 
This pipeline encodes video and audio streams to H.264 and Opus, respectively, and sends them to the demo web page. Additionally, it receives the web page's video and audio feeds in the same format.
 
===Pipeline to Webpage===
 
==== x264 ====
 
<pre style="background-color:yellow">
It seems that browsers do not get along with x264 because of the SEI NAL units it inserts into the stream. As a workaround, we set key-int-max=1 to avoid the SEI insertions.
</pre>
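Setting key-int-max=1 forces x264enc to emit every frame as a keyframe, so expect this workaround to noticeably increase the required bitrate.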
 
<syntaxhighlight lang=bash>
USER_CHANNEL=123
gst-launch-1.0 webrtcbin rtcp-mux=true start-call=false signaler::user-channel=$USER_CHANNEL name=web \
videotestsrc is-live=true ! x264enc aud=false key-int-max=1 tune=zerolatency intra-refresh=true ! "video/x-h264,profile=constrained-baseline,level=(string)3.1" ! \
rtph264pay ! web.video_sink \
audiotestsrc is-live=true ! opusenc ! rtpopuspay ! web.audio_sink \
web.video_src ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink \
web.audio_src ! rtpopusdepay ! opusdec ! autoaudiosink
</syntaxhighlight>
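The caps filter after x264enc restricts the stream to the constrained-baseline profile, which is generally the H.264 profile most widely supported by browsers for WebRTC.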
 
==== OpenH264 ====
 
<syntaxhighlight lang=bash>
USER_CHANNEL=123
gst-launch-1.0 webrtcbin rtcp-mux=true start-call=false signaler::user-channel=$USER_CHANNEL name=web \
videotestsrc is-live=true ! openh264enc ! \
rtph264pay ! web.video_sink \
audiotestsrc is-live=true ! opusenc ! rtpopuspay pt=111 ! web.audio_sink \
web.video_src ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink \
web.audio_src ! rtpopusdepay ! opusdec ! autoaudiosink
</syntaxhighlight>
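The pt=111 property on rtpopuspay sets the RTP payload type to 111, the dynamic payload type browsers commonly use for Opus; depending on the SDP negotiated by the browser, this value may need to be adjusted.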
 
===Pipeline to Pipeline===
 
==== First Endpoint ====
 
<syntaxhighlight lang=bash>
USER_CHANNEL=123
PEER_CHANNEL=123peer
gst-launch-1.0 webrtcbin rtcp-mux=true start-call=true signaler::user-channel=$USER_CHANNEL signaler::peer-channel=$PEER_CHANNEL name=web \
videotestsrc is-live=true ! x264enc aud=false key-int-max=1 tune=zerolatency intra-refresh=true ! "video/x-h264,profile=constrained-baseline,level=(string)3.1" ! rtph264pay ! web.video_sink \
audiotestsrc is-live=true ! opusenc ! rtpopuspay ! web.audio_sink \
web.video_src ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink \
web.audio_src ! rtpopusdepay ! opusdec ! autoaudiosink
</syntaxhighlight>
 
==== Second Endpoint ====
 
<syntaxhighlight lang=bash>
USER_CHANNEL=123peer
PEER_CHANNEL=123
gst-launch-1.0 webrtcbin rtcp-mux=true start-call=false signaler::user-channel=$USER_CHANNEL signaler::peer-channel=$PEER_CHANNEL name=web \
videotestsrc is-live=true ! x264enc aud=false key-int-max=1 tune=zerolatency intra-refresh=true ! "video/x-h264,profile=constrained-baseline,level=(string)3.1" ! rtph264pay ! web.video_sink \
audiotestsrc is-live=true ! opusenc ! rtpopuspay ! web.audio_sink \
web.video_src ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink \
web.audio_src ! rtpopusdepay ! opusdec ! autoaudiosink
</syntaxhighlight>
 
}}
