GStreamer Qt Overlay for Embedded Systems/Examples/i.MX6

<noinclude>
{{GStreamer Qt Overlay for Embedded Systems/Head|next=GstQtOverlay Overview/Architecture}}
</noinclude>


You can run all the examples presented for the PC-based platform without issues on an NVIDIA Jetson. Nevertheless, you can enable NVMM support on Jetson, which can be even faster since it reduces the number of memory copies between userspace and device memory.
Before running the examples below, make sure you have set up the environment correctly.
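A quick way to confirm that the plug-in is visible to GStreamer before trying any pipeline is <code>gst-inspect-1.0</code> (the element name <code>qtoverlay</code> is taken from the examples below):

<syntaxhighlight lang='bash'>
# Prints the element's pads and properties if the plug-in is installed;
# fails with "No such element or plugin" otherwise.
gst-inspect-1.0 qtoverlay
</syntaxhighlight>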
 
Depending on your setup (whether or not you have a display connected), consider configuring the display server by setting the variables presented in [[Gstreamer_QT_Overlay#Running_Without_a_Graphics_Server|Running without a graphics server]].
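As a quick reference, the two common configurations look like the following. This is a sketch based on the environment variables used elsewhere on this page; check the linked page for the exact values that apply to your setup:

<syntaxhighlight lang='bash'>
# With a running X server, point the application at it:
export DISPLAY=:0.0

# Without a graphics server, let Qt render directly through EGLFS:
export QT_QPA_PLATFORM=eglfs
</syntaxhighlight>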
 
==== Display a text overlay ====
 
Take the file '''main.qml''' used in the [[Gstreamer_QT_Overlay#1._Display_a_text_overlay|PC example]].
 
On Jetson, you can use the <code>nvoverlaysink</code> element, which removes a memory copy (from NVMM to userspace) and a <code>videoconvert</code> on the CPU:
 
<syntaxhighlight lang='bash'>
gst-launch-1.0 nvarguscamerasrc ! nvvidconv ! qtoverlay qml=main.qml ! nvoverlaysink sync=false
</syntaxhighlight>
 
To use <code>nvarguscamerasrc</code>, your Jetson must have a camera connected.
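Before debugging an overlay pipeline, it can help to confirm that the camera itself works. The following is a minimal capture-only sketch (it needs Jetson hardware, so it is untested here; <code>num-buffers</code> is a standard basesrc property, so the pipeline stops on its own):

<syntaxhighlight lang='bash'>
# Capture 100 frames from the camera and discard them; this exits
# with an error if nvarguscamerasrc cannot open a sensor.
gst-launch-1.0 nvarguscamerasrc num-buffers=100 ! fakesink
</syntaxhighlight>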
 
{{Ambox
|type=notice
|small=left
|issue='''Note:'''<span style="color:#FF0000"> For JetPack 4.5</span>, you must add a color space conversion before the <code>nvoverlaysink</code> element.
|style=width:unset;
}}
 
<syntaxhighlight lang='bash'>
gst-launch-1.0 nvarguscamerasrc ! nvvidconv ! qtoverlay qml=main.qml ! nvvidconv ! "video/x-raw(memory:NVMM),format=I420" ! nvoverlaysink sync=false
</syntaxhighlight>
 
The <code>nvvidconv</code> elements convert to RGBA and back, as required for Qt compatibility.
 
You can still use NVMM when using user-space elements, like <code>videoconvert</code> or <code>videotestsrc</code>:
 
<syntaxhighlight lang='bash'>
gst-launch-1.0 videotestsrc ! nvvidconv ! 'video/x-raw(memory:NVMM)' ! qtoverlay qml=main.qml ! nvvidconv ! xvimagesink sync=false
</syntaxhighlight>
 
<syntaxhighlight lang='bash'>
gst-launch-1.0 videotestsrc ! nvvidconv ! 'video/x-raw(memory:NVMM)' ! qtoverlay qml=main.qml ! nvoverlaysink sync=false
</syntaxhighlight>
 
==== Animated gif overlay ====
 
Take the file '''animation.qml''' used in the [[Gstreamer_QT_Overlay#2._Animated_gif_overlay|PC example]].
 
Run the example shown above using NVMM:
 
<syntaxhighlight lang='bash'>
gst-launch-1.0 nvarguscamerasrc ! nvvidconv ! qtoverlay qml=animation.qml ! nvoverlaysink
</syntaxhighlight>
 
You can reuse the pipelines shown in the other examples; the only change needed is the <code>qml</code> property of <code>qtoverlay</code>.
 
{{Ambox
|type=notice
|small=left
|issue='''Note:'''<span style="color:#FF0000"> For JetPack 4.5</span>, you must add a color space conversion before the <code>nvoverlaysink</code> element.
|style=width:unset;
}}
 
<syntaxhighlight lang='bash'>
gst-launch-1.0 nvarguscamerasrc ! nvvidconv ! qtoverlay qml=animation.qml ! nvvidconv ! "video/x-raw(memory:NVMM),format=I420" ! nvoverlaysink
</syntaxhighlight>
 
==== NvArgus camera to display ====
 
Take the file '''main.qml''' used in the [[Gstreamer_QT_Overlay#Display_a_text_overlay|PC example]].
 
This example manages the memory throughout the pipeline in NVMM format, never bringing the data to userspace.


<syntaxhighlight lang='bash'>
gst-launch-1.0 nvarguscamerasrc ! nvvidconv ! qtoverlay qml=main.qml ! nvoverlaysink
</syntaxhighlight>

{{Ambox
|type=notice
|small=left
|issue='''Note:'''<span style="color:#FF0000"> For JetPack 4.5</span>, you must add a color space conversion before the <code>nvoverlaysink</code> element.
|style=width:unset;
}}

<syntaxhighlight lang='bash'>
gst-launch-1.0 nvarguscamerasrc ! nvvidconv ! qtoverlay qml=main.qml ! nvvidconv ! "video/x-raw(memory:NVMM),format=I420" ! nvoverlaysink
</syntaxhighlight>
 
If you want to use X for displaying:
 
<syntaxhighlight lang='bash'>
gst-launch-1.0 nvarguscamerasrc ! nvvidconv ! qtoverlay qml=main.qml ! nvvidconv ! nvvidconv ! ximagesink sync=false
</syntaxhighlight>
 
The first <code>nvvidconv</code> converts from any format to RGBA. The two <code>nvvidconv</code> elements at the end convert from RGBA to a format accepted by <code>ximagesink</code> and transfer the buffer from NVMM memory to user-space memory.
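To make the negotiation explicit, the same pipeline can be sketched with caps filters spelled out. This is an untested sketch: the RGBA caps around <code>qtoverlay</code> reflect the Qt requirement described above, and the exact format <code>ximagesink</code> accepts depends on your X visual:

<syntaxhighlight lang='bash'>
gst-launch-1.0 nvarguscamerasrc \
  ! nvvidconv \
  ! 'video/x-raw(memory:NVMM),format=RGBA' \
  ! qtoverlay qml=main.qml \
  ! nvvidconv \
  ! 'video/x-raw(memory:NVMM)' \
  ! nvvidconv \
  ! 'video/x-raw' \
  ! ximagesink sync=false
</syntaxhighlight>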
 
==== V4L2 camera to display ====
 
Take the file '''main.qml''' used in the [[Gstreamer_QT_Overlay#Display_a_text_overlay|PC example]].
 
Except for v4l2src, this example manages the memory in NVMM format, never bringing the data to userspace.
 
<syntaxhighlight lang='bash'>
gst-launch-1.0 v4l2src ! nvvidconv ! qtoverlay qml=main.qml ! nvoverlaysink
</syntaxhighlight>
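If your capture device is not the default, or negotiation fails, the device node and caps can be pinned down explicitly. In this sketch, <code>/dev/video1</code>, the YUY2 format, and the 1280x720 mode are assumptions that depend on your camera:

<syntaxhighlight lang='bash'>
# Hypothetical device node and caps; query yours with: v4l2-ctl --list-formats-ext
gst-launch-1.0 v4l2src device=/dev/video1 \
  ! 'video/x-raw,format=YUY2,width=1280,height=720,framerate=30/1' \
  ! nvvidconv \
  ! 'video/x-raw(memory:NVMM)' \
  ! qtoverlay qml=main.qml ! nvoverlaysink
</syntaxhighlight>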
 
==== Capturing and saving into a file ====
 
Having a display is not needed to use the GstQtOverlay plug-in. However, a display server must still be configured so that the Qt engine has a reference to the available GPU resources, especially for EGL.
 
You can even build pipelines with multiple paths. The following example uses a 4K camera as the capture device:
 
'''With NVMM support''':
 
<syntaxhighlight lang=bash>
gst-launch-1.0 nvarguscamerasrc \
  ! nvvidconv \
  ! qtoverlay qml=main.qml \
  ! nvvidconv \
  ! nvv4l2h264enc  maxperf-enable=1 \
  ! h264parse \
  ! qtmux ! filesink location=test.mp4 -e
</syntaxhighlight>
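Once the recording stops (press Ctrl+C so the <code>-e</code> flag can finalize the MP4), the file can be sanity-checked with <code>gst-discoverer-1.0</code>, which ships with the GStreamer base tools:

<syntaxhighlight lang='bash'>
# Prints the duration, H.264 profile, and resolution of the recording
gst-discoverer-1.0 test.mp4
</syntaxhighlight>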

Some important aspects of the NVMM support, observed from the pipeline shown above:

1. Each <code>qtoverlay</code> element needs a fresh NVMM buffer, since it works ''in-place''. This is achieved by placing an <code>nvvidconv</code> element before it.

2. Storage can become a bottleneck. It is recommended to write the file to a fast storage unit, such as an SSD or a RAM disk.

'''Without NVMM'''

The following pipeline uses <code>qtoverlay</code> in non-NVMM mode. In this case, take into account that the video conversion and the encoding run on the CPU. Besides, a video conversion from RGBA is needed for Qt compatibility.

<syntaxhighlight lang=bash>
gst-launch-1.0 nvarguscamerasrc \
  ! nvvidconv \
  ! qtoverlay qml=main.qml \
  ! videoconvert \
  ! x264enc \
  ! h264parse \
  ! qtmux ! filesink location=test.mp4 -e
</syntaxhighlight>

For an even simpler case:

<syntaxhighlight lang=bash>
gst-launch-1.0 videotestsrc \
  ! qtoverlay qml=main.qml \
  ! videoconvert \
  ! x264enc \
  ! h264parse \
  ! qtmux ! filesink location=test.mp4 -e
</syntaxhighlight>

The pipelines shown above were tested on a Jetson Nano, performing faster than 30 fps.

==== Stream video over the network from an RTSP source ====

Let's assume:

* '''SOURCE_ADDRESS=rtsp://10.251.101.176:5004/test''': origin RTSP video
* '''DEST_ADDRESS=10.251.101.57''': destination computer/board
* '''PORT=5004''': source/destination port

On an NVIDIA Jetson board:

<syntaxhighlight lang='bash'>
gst-launch-1.0 rtspsrc location=$SOURCE_ADDRESS ! rtph264depay ! h264parse ! omxh264dec ! nvvidconv ! video/x-raw ! qtoverlay qml=main.qml ! videoconvert ! video/x-raw,format=I420 ! queue ! jpegenc ! rtpjpegpay ! udpsink host=$DEST_ADDRESS port=$PORT sync=false enable-last-sample=false max-lateness=00000000 -v
</syntaxhighlight>

==== i.MX6 examples ====

The examples below target the i.MX6. First, configure the Qt platform:

<syntaxhighlight lang='bash'>
export QT_EGLFS_IMX6_NO_FB_MULTI_BUFFER=1
export QT_QPA_PLATFORM=eglfs
export DISPLAY=:0.0
</syntaxhighlight>

Now, you are able to run the following examples.

'''Display a text overlay'''

<syntaxhighlight lang='bash'>
gst-launch-1.0 videotestsrc is-live=true ! 'video/x-raw, width=640, height=480, framerate=30/1' ! queue ! imxg2dvideotransform ! \
queue ! qtoverlay qml=/main.qml ! queue ! imxeglvivsink qos=false sync=false enable-last-sample=false
</syntaxhighlight>

'''Saving a video with a text overlay'''

<syntaxhighlight lang='bash'>
gst-launch-1.0 videotestsrc is-live=true ! 'video/x-raw, width=640, height=480, framerate=30/1' ! queue ! imxg2dvideotransform ! \
queue ! qtoverlay qml=/main.qml ! queue ! imxipuvideotransform input-crop=false ! \
capsfilter caps=video/x-raw,width=640,height=480,format=NV12 ! imxvpuenc_h264 bitrate=4000 gop-size=15 idr-interval=15 ! \
capsfilter caps=video/x-h264 ! mpegtsmux alignment=7 ! queue ! filesink location=test.mp4 -e
</syntaxhighlight>

'''Network streaming with a text overlay'''

Send the stream to a host with IP address $IP.

On the i.MX6 (video source):

<syntaxhighlight lang='bash'>
gst-launch-1.0 videotestsrc is-live=true ! 'video/x-raw, width=640, height=480, framerate=30/1' ! queue ! imxg2dvideotransform ! \
queue ! qtoverlay qml=/main.qml ! queue ! imxipuvideotransform input-crop=false ! \
capsfilter caps=video/x-raw,width=640,height=480,format=NV12 ! imxvpuenc_h264 bitrate=4000 gop-size=15 idr-interval=15 ! \
capsfilter caps=video/x-h264 ! mpegtsmux alignment=7 ! queue ! udpsink async=false sync=false host=$IP port=5012
</syntaxhighlight>

On the host (video receiver):

<syntaxhighlight lang='bash'>
gst-launch-1.0 udpsrc address=$IP port=5012 ! queue ! tsdemux ! identity single-segment=true ! queue ! decodebin ! \
queue ! fpsdisplaysink sync=false -v
</syntaxhighlight>


<noinclude>
{{GStreamer Qt Overlay for Embedded Systems/Foot||GstQtOverlay Overview/Architecture}}
</noinclude>