GstWebRTC - GstWebRTCBin
Revision as of 13:24, 11 December 2018
This page gives an overview of the GstWebRTCBin element.
Architecture
RidgeRun's GstWebRTCBin acts as a send-receive endpoint, as shown in Figure 1: any pipeline that uses the GstWebRTCBin element becomes a send-receive endpoint. GstWebRTCBin may send and receive audio, video, or both simultaneously.
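As an illustration of such an endpoint, a send-receive pipeline might look roughly like the following sketch. This is a hypothetical example: the element factory name (`webrtcbin`), the pad names (`audio_sink`, `video_src`, etc.), and the required signaling properties are assumptions here and should be checked against the plugin's reference documentation before use.

```shell
# Hypothetical send-receive audio+video endpoint sketch.
# Element, pad, and property names are assumptions, not confirmed
# by this page -- consult the GstWebRTCBin documentation.
gst-launch-1.0 webrtcbin name=web \
    videotestsrc is-live=true ! vp8enc ! web.video_sink \
    audiotestsrc is-live=true ! opusenc ! web.audio_sink \
    web.video_src ! vp8dec ! videoconvert ! autovideosink \
    web.audio_src ! opusdec ! audioconvert ! autoaudiosink
```

The key idea the sketch shows is that a single GstWebRTCBin instance exposes both sink pads (outgoing media) and source pads (incoming media), so one pipeline handles both directions.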
Configurations
The supported capabilities are determined at runtime based on the pads that were requested from the element. Simply put, if a GstWebRTCBin was created with video and audio pads, it is capable of sending and receiving both media types. Table 1 summarizes the complete set of possible configurations. At the moment, only one pad of each media type can be created per direction.
Table 1. Supported GstWebRTCBin configurations.

Configuration | Example
Send/Receive Audio Only | OpenWebRTC Example pipeline
Send/Receive Video Only | OpenWebRTC Example pipeline
Send/Receive Audio and Video | OpenWebRTC Example pipeline
Send Video - Receive Audio | OpenWebRTC Example pipeline
Send Audio - Receive Video | OpenWebRTC Example pipeline
Send Video and Audio - Receive Audio | OpenWebRTC Example pipeline
Send Video and Audio - Receive Video | OpenWebRTC Example pipeline
Send Audio - Receive Audio and Video | OpenWebRTC Example pipeline
Send Video - Receive Audio and Video | OpenWebRTC Example pipeline
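To make the pad-driven configuration concrete, an audio-only send-receive endpoint could be sketched as follows. As above, this is a hypothetical fragment: the factory name, pad names, and any signaling setup are assumptions that must be verified against the plugin documentation. Because only the audio pads are requested, the element would negotiate the "Send/Receive Audio Only" configuration from Table 1.

```shell
# Hypothetical audio-only endpoint sketch: requesting only audio
# pads limits the session to sending and receiving audio.
# Names are assumptions -- check the plugin's documentation.
gst-launch-1.0 webrtcbin name=web \
    audiotestsrc is-live=true ! opusenc ! web.audio_sink \
    web.audio_src ! opusdec ! audioconvert ! autoaudiosink
```

The asymmetric rows of Table 1 (for example, Send Video - Receive Audio) would follow the same pattern: link an encoder branch only to the sink pad of the media you send, and attach a decoder branch only to the source pad of the media you receive.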