Qualcomm Robotics RB5/RB6 - Encoding GStreamer Pipelines

In this section, we present some GStreamer pipelines to capture from the MIPI interface main camera (IMX577) on the Qualcomm Robotics RB5 development kit and encode the video in H264 and H265[1]. These pipelines apply to the RB6 as well. The encoding is done in two ways: hardware-accelerated encoding using OpenMAX, and software encoding without hardware acceleration[2]. We also measure key performance indicators for each pipeline. Check our GStreamer Pipelines section for more information about how we extracted the performance metrics presented in this section. For latency, we record a running stopwatch with the encoding pipeline and check the time shown in the last frame of the generated video.
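
As a point of reference, the framerate of a capture pipeline can be approximated with the standard fpsdisplaysink element, while the CPU load of the gst-launch-1.0 process can be sampled from another terminal with a tool such as top. This is only an illustrative sketch of one possible measurement method, not necessarily the exact procedure used to generate the tables below:

gst-launch-1.0 -v qtiqmmfsrc camera=0 ! "video/x-raw,format=NV12,width=1920,height=1080,framerate=30/1" \
    ! fpsdisplaysink text-overlay=false video-sink=fakesink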

Hardware Accelerated Encoding

The Qualcomm Robotics RB5/RB6 uses OpenMAX to perform hardware accelerated encoding. Because of the camera architecture, the source element qtiqmmfsrc is able to output already-encoded video thanks to the QMMF server. For this reason, we explore two encoding options: the first sets the caps of the source element directly, so that it calls the QMMF server and delivers encoded video; the second uses the omxh264enc element from the gst-omx plugin, which maps the OpenMAX APIs and states to those of GStreamer. In both cases, we encode in both H264 and H265.
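
Before running the pipelines below, you can verify that the required elements are present in your image with gst-inspect-1.0; if any of these commands reports "No such element or plugin", the corresponding option is not available on your image:

gst-inspect-1.0 qtiqmmfsrc
gst-inspect-1.0 omxh264enc
gst-inspect-1.0 omxh265enc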

Encoding directly from camera

Here we encode the video by setting the caps of the source element qtiqmmfsrc so that it uses the QMMF server, which allows it to stream encoded video directly.

H264

The following pipeline captures from the main camera and sets the caps of the source element to request H264-encoded output. We then parse the stream and save it as an MP4 file at /data/HW_H264_camera.mp4. The capture has a resolution of 1920x1080 at a framerate of 30fps. Table 1 shows the performance metrics for this pipeline.

gst-launch-1.0 -e -v qtiqmmfsrc camera=0 ! "video/x-h264,format=NV12,width=1920,height=1080,framerate=30/1" ! h264parse \
    ! queue ! mp4mux ! filesink location="/data/HW_H264_camera.mp4"


Table 1: Performance of pipeline for H264 HW encoding.
Operation Mode  | CPU (%) | FPS
Max performance | 15.4    | 29.996
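
To quickly confirm that the recording contains a valid H264 stream with the expected resolution and framerate, the resulting file can be inspected with the gst-discoverer-1.0 tool from gst-plugins-base:

gst-discoverer-1.0 /data/HW_H264_camera.mp4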


H265

The following pipeline captures from the main camera and sets the caps of the source element to request H265-encoded output. We then parse the stream and save it as an MP4 file at /data/HW_H265_camera.mp4. The capture has a resolution of 1920x1080 at a framerate of 30fps. Table 2 shows the performance metrics for this pipeline.

gst-launch-1.0 -e -v qtiqmmfsrc camera=0 ! "video/x-h265,format=NV12,profile=main,level=(string)5.2,width=1920,height=1080,framerate=30/1" ! h265parse \
    ! queue ! mp4mux ! filesink location="/data/HW_H265_camera.mp4"


Table 2: Performance of pipeline for H265 HW encoding.
Operation Mode  | CPU (%) | FPS
Max performance | 15.3    | 29.999


Encoding using qtic2venc

In this section we build an encoding pipeline using the qtic2venc element. This plugin is proprietary to QTI and can encode H264, H265 and HEIC. For this example pipeline, we will use H265.

Note: Not all official OS images include this plugin. If it is not included, please check the other sections, Encoding directly from camera or Encoding using OpenMAX plugins, for alternative ways of encoding.
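
You can check whether your image ships the element by querying it with gst-inspect-1.0; if the command reports "No such element or plugin", fall back to one of the sections mentioned in the note above:

gst-inspect-1.0 qtic2venc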

The following pipeline uses the proprietary QTI source qtiqmmfsrc, which outputs a raw stream that is then encoded to H265 using qtic2venc. Finally, we stream the result using rtspsink. If you want to know more about rtspsink, please check out our wiki guide on the GStreamer RtspSink plugin.

gst-launch-1.0 qtiqmmfsrc ! "video/x-raw(memory:GBM),format=NV12,width=1920,height=1080,framerate=30/1" ! queue leaky=2 max-size-buffers=5 \
    ! qtic2venc control-rate=3 target-bitrate=3000000 idr-interval=15 ! "video/x-h265,mapping=/stream1" \
    ! h265parse config-interval=-1 ! rtspsink service=5000

To see this stream, you can use VLC on another device and run the following command, where 192.168.1.6 is the IP address of the board:

vlc rtsp://192.168.1.6:5000/stream1
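
Alternatively, the stream can be received with a GStreamer pipeline on the client machine. The following sketch assumes that the client has the standard rtspsrc, rtph265depay, h265parse and avdec_h265 (gst-libav) elements installed, and that 192.168.1.6 is the IP address of the board:

gst-launch-1.0 rtspsrc location=rtsp://192.168.1.6:5000/stream1 ! rtph265depay ! h265parse \
    ! avdec_h265 ! videoconvert ! autovideosink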

Encoding using OpenMAX plugins

Here we encode video using the OpenMAX plugins, which allow us to use the platform's hardware-accelerated encoders.

Note: For the OpenMAX plugins, you need to use GBM memory and set a target-bitrate, as shown in the example pipelines below.

H264

The following pipeline captures from the main camera and then the OMX plugin encodes the video to H264. We then parse the stream and save it as an MP4 file at /data/HW_H264_OpenMax.mp4. The capture has a resolution of 1920x1080 at a framerate of 30fps. Table 3 shows the performance metrics for this pipeline.

gst-launch-1.0 -e -v qtiqmmfsrc camera=0 ! "video/x-raw(memory:GBM),format=NV12,width=1920,height=1080,framerate=30/1,profile=high,level=(string)5.1" \
    ! omxh264enc target-bitrate=8000000 ! h264parse ! queue ! mp4mux ! filesink location="/data/HW_H264_OpenMax.mp4"


Table 3: Performance of pipeline for H264 HW encoding with OpenMAX.
Operation Mode  | CPU (%) | FPS    | ProcTime (ms)
Max performance | 16.7    | 29.985 | 9.949


H265

The following pipeline captures from the main camera and then the OMX plugin encodes the video to H265. We then parse the stream and save it as an MP4 file at /data/HW_H265_OpenMax.mp4. The capture has a resolution of 1920x1080 at a framerate of 30fps. Table 4 shows the performance metrics for this pipeline.

gst-launch-1.0 -e -v qtiqmmfsrc camera=0 ! "video/x-raw(memory:GBM),level=(string)5.2,format=NV12,width=1920,height=1080,framerate=30/1" \
    ! omxh265enc target-bitrate=8000000 ! h265parse ! queue ! mp4mux ! filesink location="/data/HW_H265_OpenMax.mp4"


Table 4: Performance of pipeline for H265 HW encoding with OpenMAX.
Operation Mode  | CPU (%) | FPS    | ProcTime (ms)
Max performance | 16.4    | 30.005 | 12.787
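
To play back the recorded files on a development host, a generic software-decoding pipeline can be used; the sketch below assumes the host has qtdemux, avdec_h265 (gst-libav) and a working video sink. On the board itself, a hardware decoder element would normally be preferred instead of avdec_h265:

gst-launch-1.0 filesrc location=HW_H265_OpenMax.mp4 ! qtdemux ! h265parse \
    ! avdec_h265 ! videoconvert ! autovideosink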


Software encoding

H264 Composed

Now we show an example pipeline that encodes video in the H264 format using software encoding. To do this, we use the x264enc element from GStreamer. This pipeline captures from both cameras in the Qualcomm Robotics RB5 Development Kit, the IMX577 and the OV9282. We then use the compositor element to place both frames side by side in the output video. Finally, the composed video is encoded to H264 and saved to an MP4 file.

gst-launch-1.0 -v -e qtiqmmfsrc camera=0 name=imx577 ! "video/x-raw,format=NV12,width=640,height=400,framerate=15/1" ! videoconvert ! compositor.sink_0 \
    qtiqmmfsrc camera=1 name=ov9282 ! "video/x-raw,format=NV12,width=640,height=400,framerate=15/1" ! videoconvert ! compositor.sink_1 \
    compositor background=1 name=compositor sink_0::xpos=0 sink_0::ypos=0 sink_1::xpos=640 sink_1::ypos=0 \
    ! "video/x-raw,format=NV12,width=1280,height=400,framerate=15/1" ! videoconvert ! x264enc ! h264parse \
    ! mp4mux ! queue ! filesink location=/data/SW_composed_H264.mp4

Table 5: Performance of pipeline for H264 SW encoding.
Operation Mode  | CPU (%) | FPS
Max performance | 25.2    | 15.008
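
Since x264enc runs entirely on the CPU, its load can be reduced at the cost of compression efficiency by choosing a faster preset. As an illustrative variation (these are not the settings used for Table 5), the x264enc element in the pipeline above could be replaced with:

x264enc speed-preset=ultrafast tune=zerolatency bitrate=4000

Here bitrate is expressed in kbit/s, and tune=zerolatency disables frame lookahead, which lowers both CPU usage and encoding latency.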


Taking Snapshots

Another useful encoding format is JPEG. With a JPEG encoder, we can use the cameras on the Qualcomm Robotics RB5/RB6 to take multiple snapshots. For this, we use the jpegenc element from GStreamer's good plugins. In our pipeline, we still use Qualcomm's qtiqmmfsrc element to capture from the camera. We then encode every captured frame and save each one to its own file.

gst-launch-1.0 -v -e qtiqmmfsrc camera=0 ! "video/x-raw,width=1280,height=800,framerate=30/1" ! videoconvert ! jpegenc ! queue ! multifilesink location="/data/capture%05d.jpg"

Table 6: Performance of pipeline for JPEG encoding.
Operation Mode  | CPU (%) | FPS
Max performance | 16.4    | 29.867
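
Because multifilesink writes one JPEG per frame, disk usage grows quickly at 30fps. If only the most recent snapshots are needed, multifilesink's max-files property keeps a bounded number of files on disk by deleting the oldest ones (the value below is just an example):

gst-launch-1.0 -v -e qtiqmmfsrc camera=0 ! "video/x-raw,width=1280,height=800,framerate=30/1" ! videoconvert ! jpegenc \
    ! queue ! multifilesink location="/data/capture%05d.jpg" max-files=100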


References

  1. Camera Capture/Encode. Retrieved February 17, 2023, from [1]
  2. GStreamer Plugins, omxh264enc. Retrieved February 17, 2023, from [2]

