GstWebRTC - GstWebRTCSrc

This page gives an overview of the GstWebRTCSrc element.

== Architecture ==
RidgeRun's GstWebRTCSrc is used as a standard GStreamer source element. If a pipeline uses the GstWebRTCSrc element, it becomes a receive-only endpoint. The GstWebRTCSrc element may receive audio, video, or both simultaneously. The supported capabilities are determined at runtime based on the pads that were requested from the element: simply put, if a GstWebRTCSrc was created with both a video and an audio pad, it will be capable of receiving both media types. Table 1 summarizes the complete set of possible configurations. Currently, only one pad of each media type can be created.
<html>
<center>
<table>
<caption>Table 1. GstWebRTCSrc Supported Capabilities</caption>
<tr>
   <td>Audio Only</td>
   <td><br /><img src="http://developer.ridgerun.com/wiki/images/8/89/Gstwebrtcsrc-audio.png" width=600></img> Figure 1: Receive Audio<br /></td>
   <td>Example Pipeline</td>
</tr>
<tr>
   <td>Video Only</td>
   <td><br /><img src="http://developer.ridgerun.com/wiki/images/6/69/Gstwebrtcsrc-video.png" width=600></img> Figure 2: Receive Video<br /></td>
   <td>Example Pipeline</td>
</tr>
<tr>
   <td>Audio+Video</td>
   <td><br /><img src="http://developer.ridgerun.com/wiki/images/e/e2/Gstwebrtcsrc-audiovideo.png" width=600></img> Figure 3: Receive Audio and Video<br /></td>
   <td>Example Pipeline</td>
</tr>
</table>
</center>
</html>
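
The ''Example Pipeline'' column above is left as a placeholder in this revision. As a rough, unverified sketch of how the requested pads map to each configuration, the launch lines below assume the element is registered as <code>webrtcsrc</code>, that its audio and video request pads can be addressed as <code>src.audio</code> and <code>src.video</code>, and that <code>decodebin</code> is needed to obtain raw media. The actual element, pad, and property names depend on your GstWebRTC release, so verify them with <code>gst-inspect-1.0</code> before using these lines.

<pre>
# Sketches only -- NOT verified launch lines. The element name (webrtcsrc),
# the pad names (src.audio / src.video) and the use of decodebin are
# assumptions; check `gst-inspect-1.0` on your GstWebRTC release.

# Audio only: only an audio pad is requested, so the endpoint receives audio.
gst-launch-1.0 webrtcsrc name=src \
    src.audio ! queue ! decodebin ! audioconvert ! autoaudiosink

# Video only: only a video pad is requested.
gst-launch-1.0 webrtcsrc name=src \
    src.video ! queue ! decodebin ! videoconvert ! autovideosink

# Audio + Video: one pad of each media type is requested, so the endpoint
# receives both streams simultaneously.
gst-launch-1.0 webrtcsrc name=src \
    src.audio ! queue ! decodebin ! audioconvert ! autoaudiosink \
    src.video ! queue ! decodebin ! videoconvert ! autovideosink
</pre>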

== Element Properties ==