GstWebRTC - GstWebRTCBin
Revision as of 23:43, 9 January 2019
This page gives an overview of the GstRrWebRTCBin element.
Architecture
RidgeRun's GstRrWebRTCBin can be used as a send-receive endpoint, as shown in Figure 1: a pipeline that uses the GstRrWebRTCBin element becomes a send-receive endpoint, able to send and receive audio, video, or both simultaneously.
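The send-receive topology above can be sketched as a gst-launch pipeline. This is a minimal illustration only: the element name (rrwebrtcbin), the pad names (audio_sink, video_sink, audio_src, video_src), and the signaler properties are assumptions not confirmed by this page, so consult the linked example pipelines for the exact syntax.

```shell
# Hypothetical send-receive endpoint: sends test audio and video to the
# remote peer and renders whatever the peer sends back. Element, pad,
# and property names below are illustrative assumptions.
gst-launch-1.0 -e \
  rrwebrtcbin name=web signaler::session_id=1234ridgerun \
  videotestsrc ! vp8enc ! web.video_sink \
  audiotestsrc ! opusenc ! web.audio_sink \
  web.video_src ! vp8dec ! videoconvert ! autovideosink \
  web.audio_src ! opusdec ! audioconvert ! autoaudiosink
```

Because the bin exposes both sink (outgoing) and src (incoming) pads for each media, a single element instance handles both directions of the call.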
Configurations
The supported capabilities are determined at runtime from the pads that were requested on the element. Simply put, if a GstRrWebRTCBin was created with video and audio pads, it is capable of sending and receiving both media types. Table 1 summarizes the complete set of possible configurations. At this time, only one pad of each media type can be created in each direction.
Configuration | Example
Send/Receive Audio Only | OpenWebRTC Example pipeline
Send/Receive Video Only | OpenWebRTC Example pipeline
Send/Receive Audio and Video | OpenWebRTC Example pipeline
Send Video - Receive Audio | OpenWebRTC Example pipeline
Send Audio - Receive Video | OpenWebRTC Example pipeline
Send Video and Audio - Receive Audio | OpenWebRTC Example pipeline
Send Video and Audio - Receive Video | OpenWebRTC Example pipeline
Send Audio - Receive Audio and Video | OpenWebRTC Example pipeline
Send Video - Receive Audio and Video | OpenWebRTC Example pipeline
Table 1. GstRrWebRTCBin Supported Capabilities
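As a concrete instance of one row of the table, an audio-only endpoint would request only the audio pads, so the session negotiates audio in both directions and no video at all. As with the earlier sketch, the element, pad, and property names are assumptions; the linked example pipelines show the verified syntax.

```shell
# Hypothetical "Send/Receive Audio Only" configuration: only the audio
# sink and src pads are used, so the endpoint advertises and accepts
# audio only. Names below are illustrative assumptions.
gst-launch-1.0 -e \
  rrwebrtcbin name=web signaler::session_id=1234ridgerun \
  audiotestsrc ! opusenc ! web.audio_sink \
  web.audio_src ! opusdec ! audioconvert ! autoaudiosink
```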