GstWebRTC - GstWebRTCSrc
{{GstWebRTC Page|
[[GstWebRTC - Elements|Plugin Elements]]|
[[GstWebRTC - GstWebRTCSink|GstRrWebRTCSink Element]]|
__TOC__
This page gives an overview of the GstRrWebRTCSrc element.
== Architecture ==
RidgeRun's GstRrWebRTCSrc is used as a standard GStreamer source element. If a pipeline uses the GstRrWebRTCSrc element, it becomes a receive-only endpoint. The GstRrWebRTCSrc element may receive audio, video, or both simultaneously.

=== Configurations ===
The supported capabilities are determined at runtime based on the pads that were requested from the element. Simply put, if a GstRrWebRTCSrc was created with both video and audio pads, it will be capable of receiving both media types. Table 1 summarizes the complete set of possible configurations. Currently, only one pad of each media type can be created.
<br /><br />
<html>
<center>
<table>
<tr>
<td>Audio Only</td>
<td align="center"><img src="https://developer.ridgerun.com/wiki/images/8/89/Gstwebrtcsrc-audio.png" width="600" /><figcaption>Figure 1: Receive Audio</figcaption></td>
<td><a href="https://developer.ridgerun.com/wiki/index.php?title=GstWebRTC_-_Opus_Examples#Receive_pipeline">OpenWebRTC Example Pipeline</a></td>
</tr>
<tr>
<td>Video Only</td>
<td align="center"><img src="https://developer.ridgerun.com/wiki/images/6/69/Gstwebrtcsrc-video.png" width="600" /><figcaption>Figure 2: Receive Video</figcaption></td>
<td><a href="https://developer.ridgerun.com/wiki/index.php?title=GstWebRTC_-_H264_Examples#Receive_Pipeline">OpenWebRTC Example Pipeline</a></td>
</tr>
<tr>
<td>Audio + Video</td>
<td align="center"><img src="https://developer.ridgerun.com/wiki/images/e/e2/Gstwebrtcsrc-audiovideo.png" width="600" /><figcaption>Figure 3: Receive Audio and Video</figcaption></td>
<td><a href="https://developer.ridgerun.com/wiki/index.php?title=GstWebRTC_-_H264-Opus_Examples#Receive_Pipeline">OpenWebRTC Example Pipeline</a></td>
</tr>
<caption>Table 1. GstRrWebRTCSrc Supported Capabilities</caption>
</table>
</center>
</html>
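The configurations in Table 1 correspond to which pads were requested from the element. As a minimal, hypothetical sketch of the Audio + Video case (the element factory name <code>webrtcsrc</code> and the pad names <code>video_src</code>/<code>audio_src</code> are illustrative assumptions, not confirmed API, and signaling-related properties are omitted; consult the linked example pipelines for the exact syntax supported by your release):

```shell
# Hypothetical sketch: factory and pad names are assumptions, and the
# signaling configuration required by the real element is omitted here.
# Requesting both pads makes the element a receiver for both media types;
# requesting only one yields the Audio Only or Video Only configuration.
gst-launch-1.0 webrtcsrc name=src \
    src.video_src ! queue ! avdec_h264 ! videoconvert ! autovideosink \
    src.audio_src ! queue ! opusdec ! audioconvert ! autoaudiosink
```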
|keywords=Gstreamer WebRTC Basics,Plugin Overview,Gstreamer WebRTC Plugin Overview,GstRrWebRTCSrc element,GStreamer WebRTC source element}}
Latest revision as of 03:32, 10 July 2024