GstWebRTC - GstWebRTCBin
This page gives an overview of the GstRrWebRTCBin element.
Architecture
RidgeRun's GstRrWebRTCBin acts as a send-receive endpoint, as shown in Figure 1: any pipeline that uses the GstRrWebRTCBin element becomes a send-receive endpoint. GstRrWebRTCBin may send and receive audio, video, or both simultaneously.
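As a rough sketch, a video-only send-receive pipeline could look like the following. Note that the element factory name (`rrwebrtcbin`), the VP8/RTP encoding chain, and the pad-linking syntax are assumptions made for illustration, not taken from this page; verify the real names and properties with `gst-inspect-1.0` before use.

```shell
# Hypothetical send/receive video pipeline (illustrative only).
# The factory name "rrwebrtcbin" and its pads are assumptions;
# check the actual element with: gst-inspect-1.0 rrwebrtcbin
gst-launch-1.0 \
  rrwebrtcbin name=web \
  videotestsrc ! vp8enc ! rtpvp8pay ! web. \
  web. ! rtpvp8depay ! vp8dec ! videoconvert ! autovideosink
```

In this sketch the upper branch feeds encoded video into the endpoint for sending, while the lower branch decodes and displays the video received from the remote peer.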
Configurations
The supported capabilities are determined at runtime based on the pads requested from the element. Simply put, if a GstRrWebRTCBin was created with both video and audio pads, it is capable of sending and receiving both media types. Table 1 summarizes the complete set of possible configurations. At the moment, only one pad of each media type can be created per direction.
Configuration | Example
---|---
Send/Receive Audio Only | OpenWebRTC Example pipeline
Send/Receive Video Only | OpenWebRTC Example pipeline
Send/Receive Audio and Video | OpenWebRTC Example pipeline
Send Video - Receive Audio | OpenWebRTC Example pipeline
Send Audio - Receive Video | OpenWebRTC Example pipeline
Send Video and Audio - Receive Audio | OpenWebRTC Example pipeline
Send Video and Audio - Receive Video | OpenWebRTC Example pipeline
Send Audio - Receive Audio and Video | OpenWebRTC Example pipeline
Send Video - Receive Audio and Video | OpenWebRTC Example pipeline

Table 1. Supported GstRrWebRTCBin configurations
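For the "Send/Receive Audio and Video" configuration above, a sketch of the combined pipeline might look like this. As before, the factory name `rrwebrtcbin`, the VP8/Opus encoding chains, and the pad-linking syntax are assumptions for illustration; consult the element's `gst-inspect-1.0` output and the linked example pipelines for the real details.

```shell
# Hypothetical send/receive audio+video pipeline (illustrative only).
# Element, codec, and pad names are assumptions; verify with
# gst-inspect-1.0 before running.
gst-launch-1.0 \
  rrwebrtcbin name=web \
  videotestsrc ! vp8enc ! rtpvp8pay ! web. \
  audiotestsrc ! opusenc ! rtpopuspay ! web. \
  web. ! rtpvp8depay ! vp8dec ! videoconvert ! autovideosink \
  web. ! rtpopusdepay ! opusdec ! audioconvert ! autoaudiosink
```

The two source branches feed the endpoint's audio and video sink pads, and the two sink branches render the remote peer's audio and video, matching the one-pad-per-media-type-per-direction limit described above.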