GstWebRTC - GstWebRTCBin

This page gives an overview of the GstWebRTCBin element.

Architecture

RidgeRun's GstWebRTCBin can be used as a send-receive endpoint, as shown in Figure 1. A pipeline that uses the GstWebRTCBin element becomes a send-receive endpoint: GstWebRTCBin may send and receive audio, video, or both simultaneously.
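The following C sketch illustrates the endpoint idea using the standard GStreamer API: audio generated in the pipeline is encoded and pushed into the WebRTC bin (the send leg), while the audio received from the remote peer is taken from the bin, decoded and played back (the receive leg). The factory name rrwebrtcbin, the request-pad names audio_sink and audio_src, and the assumption that the bin accepts encoded Opus and handles RTP payloading and signaling internally are illustrative placeholders, not the element's confirmed interface; run gst-inspect-1.0 against your GstWebRTC installation to get the actual names and properties.

#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GstElement *pipeline, *src, *enc, *web, *dec, *sink;
  GstPad *enc_src, *web_sink, *web_src, *dec_sink;
  GMainLoop *loop;

  gst_init (&argc, &argv);

  pipeline = gst_pipeline_new ("webrtc-endpoint");
  src  = gst_element_factory_make ("audiotestsrc",  "src");
  enc  = gst_element_factory_make ("opusenc",       "enc");
  web  = gst_element_factory_make ("rrwebrtcbin",   "web");   /* placeholder factory name */
  dec  = gst_element_factory_make ("opusdec",       "dec");
  sink = gst_element_factory_make ("autoaudiosink", "sink");

  if (!pipeline || !src || !enc || !web || !dec || !sink)
    return -1;

  gst_bin_add_many (GST_BIN (pipeline), src, enc, web, dec, sink, NULL);

  /* Request one audio pad in each direction on the WebRTC bin.
   * The template names are assumptions for this sketch. */
  web_sink = gst_element_get_request_pad (web, "audio_sink");
  web_src  = gst_element_get_request_pad (web, "audio_src");
  if (!web_sink || !web_src)
    return -1;

  /* Send leg: audio source -> Opus encoder -> WebRTC bin. */
  gst_element_link (src, enc);
  enc_src = gst_element_get_static_pad (enc, "src");
  gst_pad_link (enc_src, web_sink);

  /* Receive leg: WebRTC bin -> Opus decoder -> audio sink. */
  dec_sink = gst_element_get_static_pad (dec, "sink");
  gst_pad_link (web_src, dec_sink);
  gst_element_link (dec, sink);

  /* With one audio pad requested in each direction, this endpoint matches
   * the "Send/Receive Audio Only" row of Table 1. Signaling configuration
   * (session/peer properties) is omitted and depends on the element. */
  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  loop = g_main_loop_new (NULL, FALSE);
  g_main_loop_run (loop);

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}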

Configurations

The supported capabilities are determined at runtime, based on the pads that were requested from the element. Simply put, if a GstWebRTCBin was created with both video and audio pads, it will be capable of sending and receiving both media types. Table 1 summarizes the complete set of possible configurations. At the time being, only one pad of each media type can be created per direction.
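As a rough sketch of how the requested pads map to the rows of Table 1, the helper below requests audio and video pads in the send direction but only an audio pad in the receive direction, which would correspond to the "Send Video and Audio - Receive Audio" configuration. The pad template names are assumptions, as is the expectation that a second request for the same media type and direction is refused; consult gst-inspect-1.0 for the real pad templates.

#include <gst/gst.h>

/* Request pads on an already-created WebRTC bin. Which pads are requested
 * decides the capability row of Table 1; the template names used here are
 * assumptions for illustration only. */
static gboolean
configure_send_av_receive_audio (GstElement *web)
{
  GstPad *audio_in  = gst_element_get_request_pad (web, "audio_sink");  /* send audio    */
  GstPad *video_in  = gst_element_get_request_pad (web, "video_sink");  /* send video    */
  GstPad *audio_out = gst_element_get_request_pad (web, "audio_src");   /* receive audio */

  if (!audio_in || !video_in || !audio_out)
    return FALSE;

  /* No "video_src" pad is requested, so this endpoint does not receive video:
   * it corresponds to the "Send Video and Audio - Receive Audio" row. */

  /* Only one pad per media type and direction is supported at the moment, so
   * a second request for the same template is expected to be refused. */
  if (gst_element_get_request_pad (web, "audio_sink") == NULL)
    g_print ("second audio sink pad refused, as expected\n");

  return TRUE;
}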

Send/Receive Audio Only: OpenWebRTC Example pipeline
Send/Receive Video Only: OpenWebRTC Example pipeline
Send/Receive Audio and Video: OpenWebRTC Example pipeline (http://developer.ridgerun.com/wiki/index.php?title=GstWebRTC_-_H264-Opus_Examples#Send-Receive_Pipeline)
Send Video - Receive Audio: OpenWebRTC Example pipeline
Send Audio - Receive Video: OpenWebRTC Example pipeline
Send Video and Audio - Receive Audio: OpenWebRTC Example pipeline
Send Video and Audio - Receive Video: OpenWebRTC Example pipeline
Send Audio - Receive Audio and Video: OpenWebRTC Example pipeline
Send Video - Receive Audio and Video: OpenWebRTC Example pipeline

Table 1. GstWebRTCBin Supported Capabilities


