{{GstWebRTC Page|
[[GstWebRTC - Elements|Plugin Elements]]|
[[GstWebRTC - GstWebRTCSink | GstRrWebRTCSink Element]]|
This page gives an overview of the GstRrWebRTCSrc element.
== Architecture ==
RidgeRun's GstRrWebRTCSrc is used as a standard GStreamer source element. A pipeline that uses the GstRrWebRTCSrc element becomes a receive-only endpoint. The GstRrWebRTCSrc element may receive audio, video, or both simultaneously.
=== Configurations ===
The supported capabilities are determined at runtime based on the pads that were requested from the element. Simply put, if a GstRrWebRTCSrc was created with both video and audio pads, it will be capable of receiving both media types. Table 1 summarizes the complete set of possible configurations. Currently, only one pad of each media type can be created.
<br /><br />
<td><a href="http://developer.ridgerun.com/wiki/index.php?title=GstWebRTC_-_H264-Opus_Examples#Receive_Pipeline">OpenWebRTC Example Pipeline</a></td>
</tr> | </tr> | ||
<caption>Table 1. GstRrWebRTCSrc Supported Capabilities</caption>
</table> | </table> | ||
</center> | </center> | ||
</html> | </html> | ||
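As a concrete illustration of the audio-plus-video configuration in Table 1, a receive pipeline might look like the following sketch. The element factory name (<code>rrwebrtcsrc</code>), its pad names, and the decoder chain shown here are assumptions for illustration only; consult <code>gst-inspect-1.0</code> on the installed element and the example pipelines linked in Table 1 for the actual names and properties.

```shell
# Hypothetical sketch only: the "rrwebrtcsrc" factory name and the "web."
# pad references are assumed, not taken from this page.
# Requesting both a video and an audio pad makes the element a
# receive-only endpoint capable of receiving both media types.
gst-launch-1.0 rrwebrtcsrc name=web \
    web. ! queue ! h264parse ! avdec_h264 ! videoconvert ! autovideosink \
    web. ! queue ! opusdec ! audioconvert ! autoaudiosink
```

Requesting only one of the two pads instead would yield a video-only or audio-only endpoint, matching the single-media rows of Table 1.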
|keywords=Gstreamer WebRTC Basics,Plugin Overview,Gstreamer WebRTC Plugin Overview,GstRrWebRTCSrc element,GStreamer WebRTC source element}}