NVIDIA Jetson AGX Thor - Capture and Display
The NVIDIA Jetson AGX Thor documentation from RidgeRun is presently being developed.
General GStreamer Pipeline structure
In GStreamer, you can capture and display video using the following basic structure as a reference.
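As a minimal sketch of that structure, the pipeline below uses videotestsrc as a stand-in for a camera source, nvvidconv as the conversion stage, and nv3dsink as the display element; any of the capture and display plugins described later can take their place.
# Test source -> hardware converter (into NVMM memory) -> display sink
gst-launch-1.0 videotestsrc ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)NV12' ! nv3dsink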

Depending on your application, you can add more elements to the pipeline, but for this section we will keep it simple.
Capture GStreamer elements
The Jetson AGX Thor supports the following camera types for capture:
- MIPI CSI
- Camera over Ethernet (CoE)
- USB cameras
For each type of camera, GStreamer provides a corresponding plugin. Table 1 summarizes the plugins, the associated camera types, and an example pipeline for each.
Camera connection | Plugin | Example pipeline |
---|---|---|
MIPI CSI | nvarguscamerasrc | gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=(int)3840, height=(int)2160, format=(string)NV12, framerate=(fraction)30/1' ! fakesink silent=false -v |
USB camera (no hardware acceleration) | v4l2src | gst-launch-1.0 v4l2src device="/dev/video0" ! "video/x-raw, width=640, height=480, format=(string)YUY2" ! fakesink silent=false -v |
USB camera (hardware accelerated) | nvv4l2camerasrc | gst-launch-1.0 nvv4l2camerasrc device=/dev/video0 ! 'video/x-raw(memory:NVMM), format=(string)UYVY, width=(int)2592, height=(int)1944, framerate=(fraction)14/1, interlace-mode=progressive' ! fakesink silent=false -v |
Camera over Ethernet (CoE) | nvsiplsrc | gst-launch-1.0 nvsiplsrc json-file=/path/to/sensor_config.json sensor-id=0 bl-output=true ! 'video/x-raw(memory:NVMM), width=(int)2560, height=(int)1984, format=(string)NV12, framerate=(fraction)30/1' ! fakesink silent=false -v |
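If you are unsure which /dev/videoN node to pass in the device property, you can first list the V4L2 devices detected on the system; this assumes the v4l2-utils package (which provides v4l2-ctl) is installed.
# Print each video device and its /dev/video* nodes
v4l2-ctl --list-devices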
Display GStreamer elements
Table 2 lists the available display options with hardware-accelerated plugins.
Display | Plugin |
---|---|
EGL/GLES (X11 and Wayland) | nveglglessink |
EGL/GLES (X11 only) | nv3dsink |
DRM | nvdrmvideosink |
Displaying with EGL/GLES
For EGL/GLES display, you can use either nv3dsink or nveglglessink. nv3dsink generally performs better with NVMM memory than nveglglessink. For example, the following pipeline takes a video stream in NVMM memory from a MIPI CSI camera.
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=(int)3840, height=(int)2160, format=(string)NV12, framerate=(fraction)30/1' ! nvivafilter cuda-process=true customer-lib-name="libnvsample_cudaprocess.so" ! 'video/x-raw(memory:NVMM), format=(string)NV12' ! nv3dsink -e
With nv3dsink, you can adjust the size and position of the output window. For example, you can create a window sized 512×512 positioned at (300, 300) on the screen.
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=(int)3840, height=(int)2160, format=(string)NV12, framerate=(fraction)30/1' ! nvivafilter cuda-process=true customer-lib-name="libnvsample_cudaprocess.so" ! 'video/x-raw(memory:NVMM), format=(string)NV12' ! nv3dsink window-x=300 window-y=300 window-width=512 window-height=512
With nveglglessink, you can also use NVMM memory. Take the following pipeline as an example: it captures the video stream with the USB camera plugin nvv4l2camerasrc.
gst-launch-1.0 nvv4l2camerasrc device=/dev/video3 ! 'video/x-raw(memory:NVMM), format=(string)UYVY, width=(int)1920, height=(int)1080, interlace-mode=progressive, framerate=(fraction)30/1' ! nvvidconv compute-hw=GPU nvbuf-memory-type=nvbuf-mem-cuda-device ! 'video/x-raw(memory:NVMM), width=1280, height=720, format=(string)I420' ! nveglglessink -e
Wayland Backend
Instead of using X11 as the default backend, you can use the Wayland backend. To enable it for the first time:
1. Start Weston.
nvstart-weston.sh
2. Test it with the following pipeline:
gst-launch-1.0 videotestsrc ! nvvidconv ! nveglglessink winsys=wayland
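Once Weston is running, a camera capture pipeline can use the Wayland backend in the same way. The following is a minimal sketch assuming a MIPI CSI camera handled by nvarguscamerasrc; adjust the caps to a mode supported by your sensor.
# MIPI CSI capture -> converter -> EGL/GLES sink on the Wayland backend
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1' ! nvvidconv ! nveglglessink winsys=wayland -e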
Displaying with DRM
For running with DRM, make sure a display is connected, then follow these steps:
1. Stop the Ubuntu display manager
sudo service gdm stop
2. Unset the DISPLAY variable
unset DISPLAY
3. Load the Jetson DRM driver
sudo modprobe nvidia-drm modeset=1
Now you can test the cameras with the nvdrmvideosink plugin. The following is an example:
gst-launch-1.0 v4l2src ! nvvidconv ! "video/x-raw(memory:NVMM), format=(string)YVYU, width=(int)1920, height=(int)1080" ! nvdrmvideosink
Capture and Display GStreamer pipelines
MIPI - Argus
The camera used to evaluate MIPI capture with Argus is the <name>, using the following pipeline.
gst-launch-1.0 nvarguscamerasrc
Table 3 shows the performance information for this pipeline.
Operation Mode | CPU (%) | GPU 1 (%) | GPU 2 (%) | FPS | Latency (ms) |
---|---|---|---|---|---|
0 (max performance) | | | | | |
1 (default mode) | | | | | |
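A complete capture-and-display form of this pipeline might look like the following sketch; the 3840x2160 NV12 caps at 30 fps are assumptions and must match a sensor mode reported by Argus.
# Argus capture in NVMM memory displayed with nv3dsink
gst-launch-1.0 nvarguscamerasrc ! 'video/x-raw(memory:NVMM), width=(int)3840, height=(int)2160, format=(string)NV12, framerate=(fraction)30/1' ! nv3dsink -e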
MIPI - V4L2
The camera used to evaluate MIPI capture with V4L2 is the <name>, using the following pipeline.
gst-launch-1.0 nvv4l2camerasrc
Table 4 shows the performance information for this pipeline.
Operation Mode | CPU (%) | GPU 1 (%) | GPU 2 (%) | FPS | Latency (ms) |
---|---|---|---|---|---|
0 (max performance) | | | | | |
1 (default mode) | | | | | |
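A complete capture-and-display form of this pipeline might look like the following sketch; the device node, UYVY format, and 1080p30 caps are assumptions and must match what your camera exposes.
# V4L2 hardware-accelerated capture converted to NV12 and displayed with nv3dsink
gst-launch-1.0 nvv4l2camerasrc device=/dev/video0 ! 'video/x-raw(memory:NVMM), format=(string)UYVY, width=(int)1920, height=(int)1080, framerate=(fraction)30/1, interlace-mode=progressive' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)NV12' ! nv3dsink -e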
CoE - SIPL
The camera used to evaluate CoE capture with SIPL is the <name>, using the following pipeline.
gst-launch-1.0 nvsiplsrc
Table 5 shows the performance information for this pipeline.
Operation Mode | CPU (%) | GPU 1 (%) | GPU 2 (%) | FPS | Latency (ms) |
---|---|---|---|---|---|
0 (max performance) | | | | | |
1 (default mode) | | | | | |
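A complete capture-and-display form of this pipeline might look like the following sketch; the JSON configuration path and the 2560x1984 NV12 caps are taken from the capture example in Table 1 and must match your CoE sensor, while nvvidconv converts the block-linear buffers for display.
# SIPL capture converted with nvvidconv and displayed with nv3dsink
gst-launch-1.0 nvsiplsrc json-file=/path/to/sensor_config.json sensor-id=0 bl-output=true ! 'video/x-raw(memory:NVMM), width=(int)2560, height=(int)1984, format=(string)NV12, framerate=(fraction)30/1' ! nvvidconv ! nv3dsink -e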
USB - V4L2
The camera used to evaluate USB capture with V4L2 is the <name>, using the following pipeline.
gst-launch-1.0 v4l2src device=/dev/video2 ! video/x-h264, width=1920, height=1080, framerate=30/1 ! h264parse ! video/x-h264, stream-format=avc ! h264parse ! video/x-h264, stream-format=byte-stream ! nvv4l2decoder ! nveglglessink
Table 6 shows the performance information for this pipeline.
Operation Mode | CPU (%) | GPU 1 (%) | GPU 2 (%) | FPS | Latency (ms) |
---|---|---|---|---|---|
0 (max performance) | | | | | |
1 (default mode) | | | | | |
Filter
Jetson AGX Thor provides the nvvidconv filter plugin, which lets you scale the input, change its format, and more, depending on your objective. The following is a brief summary of how to use nvvidconv.
Supported formats
nvvidconv supports the following formats for VIC-based (raw memory) and CUDA-based (CUDA memory) conversion.
Format | Raw memory (input) | Raw memory (output) | CUDA memory (input) | CUDA memory (output) |
---|---|---|---|---|
I420 | ✔ | ✔ | ✔ | ✔ |
UYVY | ✔ | ✔ | ✔ | ✔ |
YUY2 | ✔ | ✔ | ✔ | ✔ |
YVYU | ✔ | ✔ | ✔ | ✔ |
NV12 | ✔ | ✔ | ✔ | ✔ |
NV16 | ✔ | ✔ | ✔ | ✔ |
NV24 | ✔ | ✔ | ✔ | ✔ |
GRAY8 | ✔ | ✔ | ✔ | ✔ |
BGRx | ✔ | ✔ | ✔ | ✔ |
RGBA | ✔ | ✔ | ✔ | ✔ |
Y42B | ✔ | ✔ | ✔ | ✔ |
P010_10LE | × | × | ✔ | ✔ |
I420_10LE | × | × | ✔ | ✔ |
I420_12LE | × | × | ✔ | × |
Y444 | ✔ | ✔ | ✔ | ✔ |
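You can also check the formats accepted by the nvvidconv build on your system directly, since gst-inspect-1.0 prints the element's pad templates (supported caps) and properties.
# Show nvvidconv pad templates and properties
gst-inspect-1.0 nvvidconv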
With this information, you can convert from one format to another, as long as both formats are supported by the type of memory you are using. The following is an example of nvvidconv changing the format with NVMM memory.
gst-launch-1.0 nvarguscamerasrc ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)RGBA' ! fakesink silent=false -v
For raw memory, you can use the following pipeline as a reference.
gst-launch-1.0 nvarguscamerasrc ! nvvidconv ! 'video/x-raw, format=(string)UYVY' ! fakesink silent=false -v
Additionally, the Jetson AGX Thor can perform the conversion with CUDA as follows.
gst-launch-1.0 filesrc location=720x480_30i_MP.mp4 ! qtdemux ! h264parse ! nvv4l2decoder cudadec-memtype=1 ! nvvidconv compute-hw=GPU nvbuf-memory-type=nvbuf-mem-cuda-device ! nveglglessink -e
Cropping and scaling
With nvvidconv, you can change the resolution by adjusting the width and height values as follows.
gst-launch-1.0 videotestsrc ! nvvidconv ! "video/x-raw(memory:NVMM), format=(string)Y444, width=(int)1920, height=(int)1080" ! nvdrmvideosink -v
For cropping with nvvidconv, you define the box of the region you want to keep; that is, the coordinates where the crop starts and ends. The following example displays only the top-left corner of the input as a 400x400 box.
gst-launch-1.0 v4l2src ! nvvidconv top=0 left=0 right=400 bottom=400 ! nv3dsink
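Cropping and scaling can also be combined in a single nvvidconv instance by setting the crop properties and requesting the output resolution in the caps. The following sketch crops the same 400x400 top-left region and scales it up to 1280x720, using videotestsrc as the input.
# Crop the 400x400 top-left corner, then scale the result to 1280x720
gst-launch-1.0 videotestsrc ! nvvidconv top=0 left=0 right=400 bottom=400 ! 'video/x-raw(memory:NVMM), width=(int)1280, height=(int)720, format=(string)NV12' ! nv3dsink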