NVIDIA Jetson AGX Thor - GStreamer Pipelines

From RidgeRun Developer Wiki

GStreamer Pipelines Overview

GStreamer is an open-source pipeline-based framework that enables quick prototyping of multimedia projects. In this section, we provide some example pipelines that capture from USB and MIPI CSI-2 cameras to help you get started.

On Jetson platforms, there are three possible capture paths:

  • Through the ISP: The Image Signal Processor can be used through the libargus API, and among its capabilities, it allows debayering, auto-exposure, and gain control. This path is often used to capture from MIPI CSI-2 cameras that provide Bayer images, because the ISP takes care of the debayering, leaving the CPU and GPU processing power free for other tasks. The element used in GStreamer to capture through this path is nvarguscamerasrc.
  • Bypassing the ISP: If the ISP is not required, it can be bypassed using the GStreamer element v4l2src. This method is useful for USB cameras or to capture raw Bayer images from MIPI CSI-2 cameras.
  • Camera-over-Ethernet (CoE): NVIDIA’s next-generation solution for high-performance, scalable camera integration over Ethernet. It is fully supported in the SIPL framework, which provides a unified API for Holoscan Sensor Bridge cameras.


Documentation
More information about CoE is available in the NVIDIA Jetson Thor Documentation
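The two GStreamer capture paths above can be sketched as follows. The sensor ID, device node, resolution, and caps are assumptions to adjust for your camera; each command is assembled as a string and printed so it can be reviewed before running it on the board:

```shell
# Hedged sketch of the two capture paths. sensor-id=0, /dev/video0, and the
# caps below are assumptions; adjust them for your camera setup.

# Path 1, through the ISP: nvarguscamerasrc debayers a MIPI CSI-2 Bayer sensor.
ISP_CMD="gst-launch-1.0 nvarguscamerasrc sensor-id=0 \
 ! 'video/x-raw(memory:NVMM),width=1920,height=1080,framerate=30/1,format=NV12' \
 ! nv3dsink"

# Path 2, bypassing the ISP: v4l2src reads a USB camera (or raw Bayer) directly.
V4L2_CMD="gst-launch-1.0 v4l2src device=/dev/video0 \
 ! video/x-raw,width=1280,height=720,framerate=30/1 \
 ! nvvidconv ! 'video/x-raw(memory:NVMM),format=NV12' ! nv3dsink"

echo "$ISP_CMD"
echo "$V4L2_CMD"
```

Copy the printed command into a terminal on the Jetson to run it; both sketches end in nv3dsink for local display.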


Jetson AGX Thor Encoders Available

Jetson AGX Thor provides the following encoders:

Table 1: Jetson AGX Thor Encoders

Encoder          | Description                                                              | Use case
nvautogpuh264enc | Encodes H.264 video streams using the NVCODEC API (auto GPU select mode) | Uses NVIDIA GPUs to accelerate H.264 encoding in GStreamer; ideal for real-time streaming, quick transcoding, and low-latency video capture.
nvautogpuh265enc | Encodes H.265 video streams using the NVCODEC API (auto GPU select mode) | Uses NVIDIA GPUs to accelerate H.265 encoding in GStreamer; ideal for fast, high-quality streaming, transcoding, and low-latency capture.
nvh264enc        | Encodes H.264 video streams using the NVCODEC API (CUDA mode)            | Hardware-accelerates H.264 encoding through NVIDIA’s CUDA-enabled NVCODEC API; ideal for fast, high-efficiency video processing directly on NVIDIA GPUs.
nvh265enc        | Encodes H.265 video streams using the NVCODEC API (CUDA mode)            | Uses NVIDIA GPUs to speed up H.265 (HEVC) encoding, providing fast and efficient compression.
nvjpegenc        | Encodes images in JPEG format                                            | Commonly used for saving snapshots or encoding an image to JPEG.
nvv4l2av1enc     | Encodes AV1 video streams via the V4L2 API                               | AV1 output for streaming platforms, using much less bandwidth while keeping good visual quality.
nvv4l2h264enc    | Encodes H.264 video streams via the V4L2 API                             | Real-time video streaming on NVIDIA Jetson devices, where raw camera input is hardware-encoded into H.264.
nvv4l2h265enc    | Encodes H.265 video streams via the V4L2 API                             | Real-time encoding of 4K/8K video into H.265 (HEVC) on Jetson devices, with higher compression efficiency than H.264 for video conferencing, cloud streaming, or surveillance.
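As a quick illustration of the encoders in Table 1, a minimal encode-to-file pipeline using nvv4l2h264enc could look like the following sketch. The resolution, bitrate, and output file name are illustrative values, and the command is assembled as a string and printed so it can be reviewed before running it on the board:

```shell
# Hedged sketch: encode 10 seconds (300 buffers at 30 fps) of test video to an
# H.264 MP4 file using nvv4l2h264enc from Table 1. Resolution, bitrate, and
# output name are illustrative values.
H264_CMD="gst-launch-1.0 videotestsrc num-buffers=300 \
 ! 'video/x-raw,width=1280,height=720,framerate=30/1' \
 ! nvvidconv ! 'video/x-raw(memory:NVMM),format=NV12' \
 ! nvv4l2h264enc bitrate=4000000 ! h264parse ! qtmux \
 ! filesink location=test_h264.mp4"

echo "$H264_CMD"
```

Swapping nvv4l2h264enc for nvv4l2h265enc (with h265parse instead of h264parse) gives the equivalent H.265 sketch.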

Example Pipelines

You will find the following subsections with examples of both capture paths:

  • Capture and Display: Provides examples of pipelines that capture from MIPI CSI-2 or USB cameras and display the video.
  • H264: Provides examples of pipelines that capture from MIPI CSI-2 or USB cameras, encode the video using the H.264 hardware codec, and save the video to a file.
  • H265: Provides examples of pipelines that capture from MIPI CSI-2 or USB cameras, encode the video using the H.265 hardware codec, and save the video to a file.

The pipelines presented in the listed subsections provide some performance metrics. You can find information on how the performance metrics were extracted in the following Performance Measurement subsection. Finally, the Remote Development subsection provides tips for running the pipelines from an SSH connection.

Performance Measurement

The performance measurements are extracted using the specified power profile (operation mode) and with the GStreamer pipeline running in the system.

Operation Mode:

CPU usage:

  • For the CPU percentage utilization, the tegrastats utility was used as follows: sudo tegrastats. This command shows the percentage utilization of the 14 CPU cores on the Jetson AGX Thor devkit. For the measurement, we compute the average over 30 samples of the utilization values for the CPU cores.
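The averaging step can be sketched as a small script. The tegrastats CPU field format shown below ("CPU [20%@2265,...]") is an assumption based on typical tegrastats output and should be verified against your L4T release; the two-line sample log stands in for the 30 real samples:

```shell
# Hedged sketch: average per-sample CPU utilization from tegrastats output.
# SAMPLE_LOG is an illustrative stand-in for lines captured from
# `sudo tegrastats`; the field format is an assumption.
SAMPLE_LOG='CPU [20%@2265,10%@2265,30%@2265,40%@2265]
CPU [10%@2265,20%@2265,30%@2265,40%@2265]'

AVG=$(printf '%s\n' "$SAMPLE_LOG" | awk '{
  # Extract the bracketed CPU list and strip the brackets
  match($0, /\[[^]]*\]/)
  list = substr($0, RSTART + 1, RLENGTH - 2)
  n = split(list, cores, ",")
  sum = 0
  for (i = 1; i <= n; i++) {
    split(cores[i], parts, "%")   # "20%@2265" -> 20
    sum += parts[1]
  }
  printf "sample %d: average CPU = %.1f%%\n", NR, sum / n
}')

echo "$AVG"
```

In practice you would pipe 30 samples of live tegrastats output through the same awk program and then average the per-sample values.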

GPU Usage:

  • The same procedure described for the CPU usage is applied to the GPU percentage. Since the board has 2 GPUs, 30 samples are taken from each, and the average is computed.

FPS:

  • For the FPS, RidgeRun's gst-perf plugin is used. This plugin computes the mean FPS. For the measurement, we compute the average over 30 samples of the mean FPS values. The following example shows a pipeline with the perf element incorporated:
 gst-launch-1.0 -v v4l2src device=/dev/video0 ! perf ! video/x-raw,framerate=30/1,width=1280,height=720 ! nvvidconv ! 'video/x-raw(memory:NVMM), format=NV12' ! nv3dsink

Latency:

  • In the case of the latency metric, RidgeRun's GstShark plugin is used. It is important to note that this is not a glass-to-glass measurement: GstShark reports the time a buffer needs to travel from the source pad of the source element to the source pad of each of the remaining elements in the pipeline. The following pipeline shows an example of running the GstShark interlatency tracer:
 GST_DEBUG="GST_TRACER:7" GST_TRACERS="interlatency" gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw,framerate=30/1,width=1280,height=720 ! nvvidconv ! 'video/x-raw(memory:NVMM), format=NV12' ! nv3dsink
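The interlatency trace can be post-processed from a saved log. The following sketch extracts the reported time from one trace line; the line format used here is an assumption based on typical GstShark output and may differ between versions, so check it against your actual log:

```shell
# Hedged sketch: pull the reported latency out of an interlatency trace line.
# SAMPLE_TRACE is an illustrative line; the exact format is an assumption and
# may vary with the GstShark version.
SAMPLE_TRACE='interlatency, from_pad=(string)v4l2src0_src, to_pad=(string)nv3dsink0_sink, time=(string)0:00:00.004567890;'

LATENCY=$(echo "$SAMPLE_TRACE" | sed -n 's/.*time=(string)\([0-9:.]*\);.*/\1/p')
echo "latency from source to sink: $LATENCY"
```

To capture a full log, redirect stderr of the traced pipeline to a file (2> trace.log) and apply the same sed expression to every interlatency line.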

Remote Development

Often we want to run tests on a Jetson that is in another physical location. In these cases, you need a display connected to the Jetson and the proper value of the DISPLAY environment variable. Before you move to developing remotely (or ask someone at the Jetson's location for help), connect a keyboard to the Jetson, open a terminal, and check the DISPLAY variable by executing the following command:

echo $DISPLAY


This command will print the needed value, usually something like :0. Then you can run the GStreamer Pipeline from an SSH terminal as follows:

 DISPLAY=:0 GST_DEBUG=2 gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw,framerate=30/1,width=1280,height=720 ! nvvidconv ! xvimagesink sync=true


