GstWebRTC - GstWebRTCSrc
Latest revision as of 03:32, 10 July 2024
This page gives an overview of the GstRrWebRTCSrc element.
Architecture
RidgeRun's GstRrWebRTCSrc is used as a standard GStreamer source element. If a pipeline uses the GstRrWebRTCSrc element, it becomes a receive-only endpoint. The GstRrWebRTCSrc element may receive audio, video, or both simultaneously.
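As a rough illustration of the receive-only role described above, a pipeline of the following shape could be used. This is only a sketch: the element factory name (rrwebrtcsrc) is an assumption, and the signaling configuration the element needs is omitted because it is not documented on this page; consult the linked example pipeline pages for the exact, supported syntax.

```shell
# Hedged sketch of a receive-only pipeline. "rrwebrtcsrc" is the assumed
# factory name for GstRrWebRTCSrc; decodebin and autovideosink are generic
# stand-ins for whatever depayloading/rendering your use case requires.
gst-launch-1.0 rrwebrtcsrc ! queue ! decodebin ! autovideosink
```

Because the element is a source, all media flows into the pipeline; nothing is sent back to the remote peer.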
Configurations
The supported capabilities are determined at runtime based on the pads that were requested from the element. Simply put, if a GstRrWebRTCSrc was created with both video and audio pads, it will be capable of receiving both media types. Table 1 summarizes the complete set of possible configurations. Currently, only one pad of each media type can be created.
<table>
<tr>
<td>Audio Only</td>
<td align="center"><img src="https://developer.ridgerun.com/wiki/images/8/89/Gstwebrtcsrc-audio.png" width="600"><figcaption>Figure 1: Receive Audio</figcaption></td>
<td><a href="https://developer.ridgerun.com/wiki/index.php?title=GstWebRTC_-_Opus_Examples#Receive_pipeline">OpenWebRTC Example Pipeline</a></td>
</tr>
<tr>
<td>Video Only</td>
<td align="center"><img src="https://developer.ridgerun.com/wiki/images/6/69/Gstwebrtcsrc-video.png" width="600"><figcaption>Figure 2: Receive Video</figcaption></td>
<td><a href="https://developer.ridgerun.com/wiki/index.php?title=GstWebRTC_-_H264_Examples#Receive_Pipeline">OpenWebRTC Example Pipeline</a></td>
</tr>
<tr>
<td>Audio + Video</td>
<td align="center"><img src="https://developer.ridgerun.com/wiki/images/e/e2/Gstwebrtcsrc-audiovideo.png" width="600"><figcaption>Figure 3: Receive Audio and Video</figcaption></td>
<td><a href="https://developer.ridgerun.com/wiki/index.php?title=GstWebRTC_-_H264-Opus_Examples#Receive_Pipeline">OpenWebRTC Example Pipeline</a></td>
</tr>
<caption>Table 1. GstRrWebRTCSrc Supported Capabilities</caption>
</table>
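The pad-driven configuration described above could be sketched with gst-launch as follows. Everything here is an assumption based on standard GStreamer request-pad behavior, not on this page: the factory name (rrwebrtcsrc) and the idea that each reference to the element's pads requests one media pad should be verified with gst-inspect-1.0 and the linked example pages.

```shell
# Sketch only: requesting both an audio and a video pad would make this
# instance capable of receiving both media types (the Audio + Video row
# of Table 1). Pad and factory names are assumptions; fakesink stands in
# for real depayloading/decoding branches.
gst-launch-1.0 rrwebrtcsrc name=src \
    src. ! queue ! fakesink \
    src. ! queue ! fakesink
```

Creating the element with only one of the two branches would instead yield an audio-only or video-only receiver.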