GstWebRTC - GstWebRTCBin

From RidgeRun Developer Wiki
{{GstWebRTC/Head|previous=Elements|next=GstWebRTCBin Properties|metakeywords=Gstreamer WebRTC Basics,Plugin Overview,WebRTC Basics,Gstreamer WebRTC Plugin Overview,GstRrWebRTCBin element,GstRrWebRTCBin,signaling}}
__TOC__

This page gives an overview of the GstRrWebRTCBin element.
== Architecture ==
RidgeRun's GstRrWebRTCBin can be used as a sender-receiver endpoint, as shown in Figure 1. Any pipeline that uses the GstRrWebRTCBin element becomes a send-receive endpoint, and GstRrWebRTCBin may send and receive audio, video, or both simultaneously.
<br />
[[File:Gstwebrtc-sendreceive-bin.png|800px|thumb|center|Figure 1. Pipeline as a WebRTC send-receive endpoint.]]
<br />


=== Configurations ===
The supported capabilities are determined at runtime based on the pads that were requested from the element. Simply put, if a GstRrWebRTCBin was created with both video and audio pads, it is capable of sending and receiving both media types. Table 1 summarizes the complete set of possible configurations. At the moment, only one pad of each media type can be created per direction.
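As a rough sketch of what such a pipeline might look like on the command line, consider the send-receive video configuration below. Note that this is only an illustration of the pad-request idea: the element name <code>rrwebrtcbin</code>, the pad names <code>video_sink</code>/<code>video_src</code>, and the <code>signaler::server_url</code> property are assumptions and may differ in your release; consult the example pages linked in Table 1 for the exact, supported pipelines.

```shell
# Hypothetical sketch only: element, pad, and property names below are
# assumptions and require RidgeRun's GstWebRTC plugin to be installed.
# Requesting the video pads at creation time makes this endpoint capable
# of both sending and receiving video.
gst-launch-1.0 \
  rrwebrtcbin name=web signaler::server_url=https://webrtc.ridgerun.com:8443 \
  videotestsrc is-live=true ! vp8enc ! web.video_sink \
  web.video_src ! vp8dec ! videoconvert ! autovideosink
```

Because the capabilities are derived from the requested pads, omitting the `web.video_src` branch above would turn the same element into a send-only video endpoint.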
<html>
<center>
<table>
<tr>
     <td>Send/Receive Audio Only</td>
     <td><br /><img src="https://developer.ridgerun.com/wiki/images/thumb/c/c5/Gstwebrtc-sendreceive-audio-bin.png/800px-Gstwebrtc-sendreceive-audio-bin.png" width=600></img><br /></td>
<td><a href=https://developer.ridgerun.com/wiki/index.php?title=GstWebRTC_-_Opus_Examples#Send-Receive_Pipeline>OpenWebRTC Example pipeline</a></td>
</tr>
<tr>
   <td>Send/Receive Video Only</td>
   <td><br /><img src="https://developer.ridgerun.com/wiki/images/thumb/d/dc/Gstwebrtc-sendreceive-bin.png/800px-Gstwebrtc-sendreceive-bin.png" width=600></img><br /></td>
<td><a href=https://developer.ridgerun.com/wiki/index.php?title=GstWebRTC_-_H264_Examples#Send-Receive_Pipeline>OpenWebRTC Example pipeline</a></td>
</tr>
 
<tr>
  <td>Send/Receive Audio and Video</td>
  <td><br /><img src="https://developer.ridgerun.com/wiki/images/thumb/f/f2/Gstwebrtc-sendreceive-audio-video-bin.png/800px-Gstwebrtc-sendreceive-audio-video-bin.png" width=600></img><br /></td>
<td><a href=https://developer.ridgerun.com/wiki/index.php?title=GstWebRTC_-_H264-Opus_Examples#Send-Receive_Pipeline>OpenWebRTC Example pipeline</a></td>
</tr>
<tr>
  <td>Send Video - Receive Audio</td>
  <td><br /><img src="https://developer.ridgerun.com/wiki/images/e/ec/GstWebRTCBinSendVideoReceiveAudio.png" width=600></img><br /></td>
<td><a href=https://developer.ridgerun.com/wiki/index.php?title=GstWebRTC_-_Vp8-Opus_Examples#Send-Receive_Pipeline_3>OpenWebRTC Example pipeline</a></td>
</tr>
<tr>
  <td>Send Audio - Receive Video</td>
  <td><br /><img src="https://developer.ridgerun.com/wiki/images/5/53/GstWebRTCBinsendaudioReceivevideo.png" width=600></img><br /></td>
<td><a href=https://developer.ridgerun.com/wiki/index.php?title=GstWebRTC_-_H264-Opus_Examples#Send-Receive_Pipeline_3>OpenWebRTC Example pipeline</a></td>
</tr>
<tr>
  <td>Send Video and Audio - Receive Audio</td>
  <td><br /><img src="https://developer.ridgerun.com/wiki/images/9/97/GstWebRTC-send-audio-video-receive-audio.png" width=600></img><br /></td>
<td><a href=https://developer.ridgerun.com/wiki/index.php?title=GstWebRTC_-_H264-Opus_Examples#Send-Receive_Pipeline_6>OpenWebRTC Example pipeline</a></td>
</tr>
<tr>
  <td>Send Video and Audio - Receive Video</td>
  <td><br /><img src="https://developer.ridgerun.com/wiki/images/6/60/GstWebRTCbin-send-audio-video-receive-video.png" width=600></img><br /></td>
<td><a href=https://developer.ridgerun.com/wiki/index.php?title=GstWebRTC_-_Vp8-Opus_Examples#Send-Receive_Pipeline_6>OpenWebRTC Example pipeline</a></td>
</tr>
<tr>
  <td>Send Audio - Receive Audio and Video</td>
  <td><br /><img src="https://developer.ridgerun.com/wiki/images/c/c7/GstWebRTCBin-send-audio-receive-audio-and-video.png" width=600></img><br /></td>
<td><a href=https://developer.ridgerun.com/wiki/index.php?title=GstWebRTC_-_H264-Opus_Examples#Send-Receive_Pipeline_5>OpenWebRTC Example pipeline</a></td>
</tr>
<tr>
  <td>Send Video - Receive Audio and Video</td>
  <td><br /><img src="https://developer.ridgerun.com/wiki/images/c/c3/GstWebRTCBin-send-video-receive-audio-and-video.png" width=600></img><br /></td>
<td><a href=https://developer.ridgerun.com/wiki/index.php?title=GstWebRTC_-_Vp8-Opus_Examples#Send-Receive_Pipeline_5>OpenWebRTC Example pipeline</a></td>
</tr>
<caption>Table 1. GstRrWebRTCBin Supported Capabilities</caption>
</table>
</center>
</html>


{{GstWebRTC/Foot|previous=Elements|next=GstWebRTCBin Properties}}

Latest revision as of 03:31, 10 July 2024