GstWebRTC - GstWebRTCBin

This page gives an overview of the GstRrWebRTCBin element.

Architecture

RidgeRun's GstRrWebRTCBin can be used as a send-receive endpoint, as shown in Figure 1: any pipeline that uses the GstRrWebRTCBin element becomes a send-receive endpoint. GstRrWebRTCBin may send and receive audio, video, or both simultaneously.
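Below is a minimal C sketch of such a send-receive endpoint built with gst_parse_launch(). It is illustrative only: the factory name rrwebrtcbin, the pad names video_sink and video_src, and the VP8 encoding/payloading choices are assumptions rather than values taken from this page, and the signaling configuration a real call needs is omitted.

<syntaxhighlight lang=c>
/* Minimal sketch of a send/receive video endpoint.
 * Assumptions: the factory name "rrwebrtcbin", the pad names
 * "video_sink"/"video_src" and the VP8 RTP encoding are illustrative,
 * not taken from this page; signaling setup is omitted. */
#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GstElement *pipeline;
  GstBus *bus;
  GstMessage *msg;
  GError *error = NULL;

  gst_init (&argc, &argv);

  /* Send locally generated video to the remote peer and render the
   * video received from it: a send/receive video configuration. */
  pipeline = gst_parse_launch (
      "rrwebrtcbin name=web "
      "videotestsrc is-live=true ! vp8enc ! rtpvp8pay ! web.video_sink "
      "web.video_src ! rtpvp8depay ! vp8dec ! videoconvert ! autovideosink",
      &error);
  if (error != NULL) {
    g_printerr ("Failed to build pipeline: %s\n", error->message);
    g_clear_error (&error);
    if (pipeline == NULL)
      return -1;
  }

  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* Run until an error or end-of-stream is posted on the bus. */
  bus = gst_element_get_bus (pipeline);
  msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
  if (msg != NULL)
    gst_message_unref (msg);

  gst_object_unref (bus);
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}
</syntaxhighlight>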

Configurations

The supported capabilities are determined at runtime, based on the pads that were requested from the element. Simply put, if a GstRrWebRTCBin was created with video and audio pads, it will be capable of sending and receiving both media types. Table 1 summarizes the complete set of possible configurations, and a short code sketch after the table illustrates how pads are requested. At the moment, only one pad of each media type can be created per direction.

Each of the following configurations has an associated OpenWebRTC example pipeline:

* Send/Receive Audio Only
* Send/Receive Video Only
* Send/Receive Audio and Video
* Send Video - Receive Audio
* Send Audio - Receive Video
* Send Video and Audio - Receive Audio
* Send Video and Audio - Receive Video
* Send Audio - Receive Audio and Video
* Send Video - Receive Audio and Video

Table 1. GstRrWebRTCBin Supported Capabilities
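As a sketch of how these configurations are selected, the fragment below requests one audio sink pad and one video source pad, which would correspond to the Send Audio - Receive Video configuration of Table 1. The factory name rrwebrtcbin and the pad template names audio_sink and video_src are assumptions used only for illustration; gst_element_request_pad_simple() requires GStreamer 1.20 or newer (older releases provide gst_element_get_request_pad() instead).

<syntaxhighlight lang=c>
/* Sketch: request pads to pick a Table 1 configuration.
 * The factory name "rrwebrtcbin" and the pad template names
 * "audio_sink"/"video_src" are assumptions for illustration. */
#include <gst/gst.h>

int
main (int argc, char *argv[])
{
  GstElement *webrtc;
  GstPad *audio_in, *video_out;

  gst_init (&argc, &argv);

  webrtc = gst_element_factory_make ("rrwebrtcbin", "web");
  if (webrtc == NULL) {
    g_printerr ("rrwebrtcbin is not available on this system\n");
    return -1;
  }

  /* Requesting an audio sink pad and a video source pad yields a
   * "Send Audio - Receive Video" endpoint. Only one pad of each media
   * type can be requested per direction. */
  audio_in = gst_element_request_pad_simple (webrtc, "audio_sink");
  video_out = gst_element_request_pad_simple (webrtc, "video_src");

  g_print ("audio sink pad: %s, video src pad: %s\n",
      audio_in != NULL ? GST_PAD_NAME (audio_in) : "(not created)",
      video_out != NULL ? GST_PAD_NAME (video_out) : "(not created)");

  /* A real application would link these pads into its capture and
   * rendering branches before setting the pipeline to PLAYING; here
   * they are simply released again. */
  if (audio_in != NULL) {
    gst_element_release_request_pad (webrtc, audio_in);
    gst_object_unref (audio_in);
  }
  if (video_out != NULL) {
    gst_element_release_request_pad (webrtc, video_out);
    gst_object_unref (video_out);
  }
  gst_object_unref (webrtc);
  return 0;
}
</syntaxhighlight>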

Properties


