NVIDIA Advanced GStreamer Pipelines with Gstd
NOTE: Problems running the pipelines shown on this page? Please see our GStreamer Debugging guide for help.
Introduction
This wiki intends to show how to handle different NVIDIA-accelerated pipelines using Gstd along with GstInterpipe. We will use several pipelines to build the following system: two selectable sources (a raw video source and a UDP source), a processing pipeline that crops, scales, encodes, and streams the selected source over multicast UDP, and client pipelines that receive the result.
We will explain this example using the shell interface of Gstd on a Jetson Xavier NX device. However, you can check the other available APIs for GStreamer Daemon, which have a very similar syntax.
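For example, the same commands can be issued programmatically from Python. The following is a minimal sketch, assuming the pygstc bindings distributed with GStreamer Daemon are installed and that Gstd is already running on its default address and port:

from pygstc.gstc import GstdClient

# Connect to the running Gstd instance (default: localhost, port 5000)
client = GstdClient()

# Mirrors: gst-client pipeline_create / pipeline_play / pipeline_delete
client.pipeline_create('example', 'videotestsrc is-live=true ! fakesink')
client.pipeline_play('example')
# ... interact with the pipeline ...
client.pipeline_stop('example')
client.pipeline_delete('example')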
Procedure
First of all, we need to initialize the GStreamer Daemon. To do this open a terminal and run:
gstd
Creating the Sources
Video RAW Source
Here we will use a videotestsrc for compatibility, but you could use either v4l2src or nvarguscamerasrc, depending on your camera. The raw video pipeline will run at 1280x720@60fps so that we can later demonstrate how to decrease the framerate. This will be one of the branches that we can select as the input of the processing pipeline.
WIDTH=1280
HEIGHT=720
FRAMERATE=60
FORMAT="I420"

gst-client pipeline_create test_src videotestsrc is-live=true ! "video/x-raw,width=${WIDTH},height=${HEIGHT},framerate=${FRAMERATE}/1,format=${FORMAT}" ! interpipesink name=test_src sync=true
NOTE: For some use cases it might be useful to set a format of UYVP to simulate the output of an HD-SDI converter. If this is your case, set the format variable to FORMAT="UYVP".
UDP Source
In this example we will generate a UDP stream to use as a source on the same Xavier NX; however, you could modify the udpsrc element properties to listen to any address you need.
Similar to the raw video pipeline, we will use a videotestsrc stream with a different pattern and send it locally through UDP, so that the actual UDP source branch can listen to it.
PORT_H265=8000
WIDTH=1280
HEIGHT=720
FRAMERATE=60

gst-client pipeline_create udp_sink videotestsrc pattern=ball flip=true is-live=true ! "video/x-raw,width=${WIDTH},height=${HEIGHT},framerate=${FRAMERATE}/1" ! nvvidconv ! queue max-size-buffers=3 leaky=downstream ! nvv4l2h265enc bitrate=2000000 iframeinterval=300 vbv-size=33333 insert-sps-pps=true control-rate=constant_bitrate profile=Main num-B-Frames=0 ratecontrol-enable=true preset-level=UltraFastPreset EnableTwopassCBR=false maxperf-enable=true ! h265parse ! mpegtsmux alignment=7 ! queue ! udpsink host=127.0.0.1 port=${PORT_H265}
And now we create the actual UDP source branch that is listening to the locally generated UDP stream.
gst-client pipeline_create udp_src udpsrc name=input_udp port=${PORT_H265} ! tsdemux ! h265parse ! nvv4l2decoder ! nvvidconv ! "video/x-raw" ! interpipesink name=udp_src sync=true
Creating the Processing Pipeline
In this stage, our processing pipeline will be able to choose between the raw video and UDP streams and apply some processing. The processing includes:
- Framerate decrease (60 -> 30)
- ROI cropping (80px from the right)
- Scaling (640x480)
- Encoding (H265)
- Performance monitoring (requires gst-perf)
- UDP streaming with multicast
MULTICAST_IP="224.1.1.1"
PORT="12345"
WIDTH=640
HEIGHT=480
ROI_TOP=0
ROI_BOTTOM=720
ROI_LEFT=0
ROI_RIGHT=1200
FRAMERATE=30

gst-client pipeline_create proc_pipe interpipesrc name=interpipe listen-to=test_src is-live=true format=time ! videorate name=framerate_filter max-rate=${FRAMERATE} drop-only=true ! nvvidconv name=cropper top=${ROI_TOP} bottom=${ROI_BOTTOM} left=${ROI_LEFT} right=${ROI_RIGHT} ! capsfilter name=scale_filter caps="video/x-raw(memory:NVMM),width=${WIDTH},height=${HEIGHT}" ! queue max-size-buffers=3 leaky=downstream ! nvv4l2h265enc name=encoder bitrate=2000000 iframeinterval=300 vbv-size=33333 insert-sps-pps=true control-rate=constant_bitrate profile=Main num-B-Frames=0 ratecontrol-enable=true preset-level=UltraFastPreset EnableTwopassCBR=false maxperf-enable=true ! perf name=perf_monitor ! h265parse ! mpegtsmux alignment=7 ! queue ! udpsink name=output_udp host="${MULTICAST_IP}" port=${PORT} auto-multicast=true sync=false
NOTE: If you set an input format of UYVP to simulate the output of an HD-SDI converter, then you need to add a videoconvert element before the nvvidconv element, so the pipeline would look like this:
gst-client pipeline_create proc_pipe interpipesrc name=interpipe listen-to=test_src is-live=true format=time ! videorate name=framerate_filter max-rate=${FRAMERATE} drop-only=true ! videoconvert ! nvvidconv name=cropper top=${ROI_TOP} bottom=${ROI_BOTTOM} left=${ROI_LEFT} right=${ROI_RIGHT} ! capsfilter name=scale_filter caps="video/x-raw(memory:NVMM),width=${WIDTH},height=${HEIGHT}" ! queue max-size-buffers=3 leaky=downstream ! nvv4l2h265enc name=encoder bitrate=2000000 iframeinterval=300 vbv-size=33333 insert-sps-pps=true control-rate=constant_bitrate profile=Main num-B-Frames=0 ratecontrol-enable=true preset-level=UltraFastPreset EnableTwopassCBR=false maxperf-enable=true ! perf name=perf_monitor ! h265parse ! mpegtsmux alignment=7 ! queue ! udpsink name=output_udp host="${MULTICAST_IP}" port=${PORT} auto-multicast=true sync=false
Interacting with the Pipelines
Basic Controls
- Playing the pipelines
gst-client pipeline_play udp_sink
gst-client pipeline_play udp_src
gst-client pipeline_play test_src
gst-client pipeline_play proc_pipe
- Pausing the pipelines
gst-client pipeline_pause udp_sink
gst-client pipeline_pause udp_src
gst-client pipeline_pause test_src
gst-client pipeline_pause proc_pipe
- Stopping the pipelines
gst-client pipeline_stop udp_sink
gst-client pipeline_stop udp_src
gst-client pipeline_stop test_src
gst-client pipeline_stop proc_pipe
- Deleting the pipelines
gst-client pipeline_delete udp_sink
gst-client pipeline_delete udp_src
gst-client pipeline_delete test_src
gst-client pipeline_delete proc_pipe
Changing between Sources
You can choose to listen to the UDP source with:
gst-client element_set proc_pipe interpipe listen-to udp_src
Or from the raw video source with:
gst-client element_set proc_pipe interpipe listen-to test_src
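The same switch can also be scripted. Here is a minimal sketch that alternates the processing pipeline between both sources every five seconds, assuming the pygstc Python bindings are installed:

import time
from pygstc.gstc import GstdClient

client = GstdClient()

# Toggle the interpipesrc between the two source branches
for source in ['udp_src', 'test_src'] * 3:
    client.element_set('proc_pipe', 'interpipe', 'listen-to', source)
    time.sleep(5)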
Setting the Output Resolution
The output resolution is controlled by the scaler block and we can modify it by setting the scale_filter caps.
NEW_WIDTH=320
NEW_HEIGHT=240

gst-client element_set proc_pipe scale_filter caps "video/x-raw(memory:NVMM),width=${NEW_WIDTH},height=${NEW_HEIGHT}"
Setting the Encoder Input Format
The encoder input format is also controlled by the scaler block, and we can modify it by setting the scale_filter caps. The supported values are NV12, I420, and P010_10LE.
NEW_FORMAT="P010_10LE"

gst-client element_set proc_pipe scale_filter caps "video/x-raw(memory:NVMM),format=${NEW_FORMAT}"
Decreasing the Framerate
The output framerate is controlled by the framerate filter, and we can modify it by setting the maximum rate that it allows through.
gst-client element_set proc_pipe framerate_filter max-rate 30
Modifying UDP Input and Output Settings
To modify the input settings just modify the udp_src pipeline:
NEW_PORT=12346

gst-client element_set udp_src input_udp port ${NEW_PORT}
And for the output modify the proc_pipe pipeline:
NEW_ADDRESS=225.1.1.1
NEW_PORT=12346

gst-client element_set proc_pipe output_udp host ${NEW_ADDRESS}
gst-client element_set proc_pipe output_udp port ${NEW_PORT}
Setting a ROI
You can set a specific ROI with the following command:
NEW_TOP=100
NEW_BOTTOM=600
NEW_LEFT=100
NEW_RIGHT=1200

gst-client element_set proc_pipe cropper top ${NEW_TOP}
gst-client element_set proc_pipe cropper bottom ${NEW_BOTTOM}
gst-client element_set proc_pipe cropper left ${NEW_LEFT}
gst-client element_set proc_pipe cropper right ${NEW_RIGHT}
NOTE: Keep in mind that the output ROI must be greater than 120x68.
Tuning the Encoder Settings
You can modify any of the available encoder settings. You can check the NVIDIA H265 Encoding Configurations guide to see which settings are available using GStreamer.
An example for setting the bitrate would be:
# 10 Mbit/s
gst-client element_set proc_pipe encoder bitrate 10000000

# 1 Mbit/s
gst-client element_set proc_pipe encoder bitrate 1000000
Querying Bitrate and Framerate
You can get the current and average bitrate and framerate by querying the perf element with the following command:
gst-client element_get proc_pipe perf_monitor last-info
And it should give you an output similar to this:
{
  "code" : 0,
  "description" : "Success",
  "response" : {
    "name" : "last-info",
    "value" : "\"perf:\\ perf_monitor\\;\\ timestamp:\\ 1:03:21.318198410\\;\\ bps:\\ 3981424.000\\;\\ mean_bps:\\ 4627982.149\\;\\ fps:\\ 59.995\\;\\ mean_fps:\\ 60.023\"",
    "param" : {
      "description" : "A string containing the performance information posted to the GStreamer bus (timestamp, bps, mean_bps, fps, mean_fps)",
      "type" : "gchararray",
      "access" : "((GstdParamFlags) READ | 224)"
    }
  }
}
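If you are driving Gstd from an application, you can also parse this string to monitor the stream programmatically. Below is a minimal sketch using the pygstc Python bindings (assumed installed; the exact return type of element_get may vary between pygstc releases, so the value is converted to a string before parsing):

import re
from pygstc.gstc import GstdClient

client = GstdClient()

# Same query as: gst-client element_get proc_pipe perf_monitor last-info
info = str(client.element_get('proc_pipe', 'perf_monitor', 'last-info'))

# Pull the numeric fields out of the semicolon-separated perf string
stats = {}
for key in ('bps', 'mean_bps', 'fps', 'mean_fps'):
    match = re.search(rf'{key}:\\?\s*([\d.]+)', info)
    if match:
        stats[key] = float(match.group(1))

print(stats)  # e.g. {'bps': 3981424.0, 'mean_bps': 4627982.149, ...}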
UDP Clients
The clients for the final stream just need to connect to the correct multicast group. This can be done on the client side by running the following pipelines:
x86
ADDRESS=224.1.1.1
PORT=12345

gst-launch-1.0 udpsrc port=${PORT} address=${ADDRESS} ! tsdemux name=demux demux. ! queue ! h265parse ! avdec_h265 ! videoconvert ! autovideosink sync=false
Jetson
ADDRESS=224.1.1.1
PORT=12345

gst-launch-1.0 udpsrc port=${PORT} address=${ADDRESS} ! tsdemux name=demux demux. ! queue ! h265parse ! nvv4l2decoder ! nvvidconv ! xvimagesink sync=false
Adding Metadata
Sometimes it might be required to include metadata in the stream to carry additional information. Our GStreamer In-Band Metadata for MPEG Transport Stream solution makes this task easier.
Sending Metadata through TCP
In this example, we will show how to send/receive the metadata through a TCP socket. First, we create the TCP metadata source with the following pipeline:
META_SOURCE_IP="10.251.101.238"
META_SOURCE_PORT="3001"

gst-client pipeline_create tcp_meta metasrc is-live=true name=meta ! tcpserversink host=${META_SOURCE_IP} port=${META_SOURCE_PORT}
The metasrc element can send any kind of binary metadata, but that requires an application. Instead, we will show how to send string data specifically using Gstd. To do this, just set the metadata property with the text you would like to send:
gst-client element_set tcp_meta meta metadata Hello_TCP
This will send the metadata just once; if we want to send it periodically, we just need to set the period property of the metasrc element. For example, to send it every second:
gst-client element_set tcp_meta meta period 1
Then we play the pipeline:
gst-client pipeline_play tcp_meta
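If the metadata must change over time (a timestamp, for example), an application can keep updating the metadata property while the pipeline runs. Below is a minimal sketch with the pygstc Python bindings (assumed installed); metasrc keeps resending whatever value was set last:

import time
from pygstc.gstc import GstdClient

client = GstdClient()

# Equivalent to repeatedly running:
#   gst-client element_set tcp_meta meta metadata <text>
while True:
    client.element_set('tcp_meta', 'meta', 'metadata',
                       time.strftime('ts_%H:%M:%S'))
    time.sleep(1)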
Modifying the Processing Pipe to Support Metadata
We can add metadata support by slightly modifying the proc_pipe of our last example.
MULTICAST_IP="224.1.1.1"
PORT="12345"
META_SOURCE_IP="10.251.101.238"
META_SOURCE_PORT="3001"
WIDTH=640
HEIGHT=480
ROI_TOP=0
ROI_BOTTOM=720
ROI_LEFT=0
ROI_RIGHT=1200
FRAMERATE=30

gst-client pipeline_create proc_pipe interpipesrc name=interpipe listen-to=test_src is-live=true format=time ! videorate name=framerate_filter max-rate=${FRAMERATE} drop-only=true ! nvvidconv name=cropper top=${ROI_TOP} bottom=${ROI_BOTTOM} left=${ROI_LEFT} right=${ROI_RIGHT} ! capsfilter name=scale_filter caps="video/x-raw(memory:NVMM),width=${WIDTH},height=${HEIGHT}" ! queue max-size-buffers=3 leaky=downstream ! nvv4l2h265enc name=encoder bitrate=2000000 iframeinterval=300 vbv-size=33333 insert-sps-pps=true control-rate=constant_bitrate profile=Main num-B-Frames=0 ratecontrol-enable=true preset-level=UltraFastPreset EnableTwopassCBR=false maxperf-enable=true ! h265parse ! mpegtsmux alignment=7 name=mux ! queue ! udpsink name=output_udp host="${MULTICAST_IP}" port=${PORT} auto-multicast=true sync=false \
tcpclientsrc host=${META_SOURCE_IP} port=${META_SOURCE_PORT} ! queue ! mux.meta_54
Here we are adding the incoming TCP stream to the MPEG-TS multiplexer so that receiver applications can also process the metadata separately. The meta_54 is an identifier for the stream, used on the receiving side to demux the content in case multiple metadata streams are muxed.
Then we play the processing pipeline:
gst-client pipeline_play proc_pipe
Receiving Metadata
In order to receive the metadata we also need to modify the UDP receiver clients to demux each of the incoming streams:
ADDRESS=224.1.1.1
PORT=12345

gst-launch-1.0 udpsrc port=${PORT} address=${ADDRESS} ! tsdemux name=demux ! queue ! h265parse ! avdec_h265 ! queue ! videoconvert ! autovideosink sync=false demux.private_0_0036 ! queue ! 'meta/x-klv' ! metasink -v
Note that in the private_0_0036 identifier, the 0036 corresponds to the hexadecimal representation of the meta_54 identifier we used in the muxer.
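You can verify the conversion quickly, for example in Python:

# meta_54 (decimal 54) maps to the demuxer pad private_0_0036 (hex 0036)
print(f"{54:04x}")  # -> 0036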
In the receiver pipeline, the metasink element lets us inspect the sent metadata without an extra application, since it dumps the contents to the standard output in a way similar to this:
00000000 (0x7fd0f002d590): 48 65 6c 6c 6f 5f 54 43 50 00  Hello_TCP.
00000000 (0x7fd0f002d550): 48 65 6c 6c 6f 5f 54 43 50 00  Hello_TCP.
00000000 (0x7fd0f002d5b0): 48 65 6c 6c 6f 5f 54 43 50 00  Hello_TCP.
00000000 (0x7fd0f002d650): 48 65 6c 6c 6f 5f 54 43 50 00  Hello_TCP.
00000000 (0x7fd0f002d670): 48 65 6c 6c 6f 5f 54 43 50 00  Hello_TCP.
00000000 (0x7fd0f002d690): 48 65 6c 6c 6f 5f 54 43 50 00  Hello_TCP.
00000000 (0x7fd0f002d6b0): 48 65 6c 6c 6f 5f 54 43 50 00  Hello_TCP.
Extracting and Processing Metadata
In case you want to apply some processing rather than just printing to the standard output, you can create an application instead of the gst-launch-1.0 pipeline and use appsink instead of metasink to extract the data. Here is an example using the last pipeline with appsink:
import sys

import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst, GLib
import numpy


def message_handler(bus, msg, loop):
    """Handle gracefully the EOS and errors"""
    if msg.type in [Gst.MessageType.EOS, Gst.MessageType.ERROR]:
        loop.quit()


def on_new_sample(sink, data):
    """Get the KLV data on every buffer the appsink receives"""
    sample = sink.emit("pull-sample")
    buffer = sample.get_buffer()
    size = buffer.get_size()

    # Extract the KLV data into an array and do your processing
    klv_array = numpy.ndarray(
        size, buffer=buffer.extract_dup(0, size), dtype=numpy.uint8)

    print("\nMeta: ", end="")
    for byte in klv_array:
        print(chr(byte), end="")

    return Gst.FlowReturn.OK


def main(args):
    Gst.init(args)
    timeout_seconds = 3

    pipeline = Gst.parse_launch(
        "udpsrc address=224.1.1.1 port=12345 ! tsdemux name=demux ! queue "
        "! h265parse ! avdec_h265 ! queue ! videoconvert ! autovideosink "
        "sync=false demux.private_0_0036 ! queue ! meta/x-klv "
        "! appsink name=sink emit-signals=true")

    sink = pipeline.get_by_name("sink")
    sink.connect("new-sample", on_new_sample, sink)

    # Init the main loop to handle GStreamer bus events
    loop = GLib.MainLoop()

    # Listen to bus messages to handle errors and EOS
    bus = pipeline.get_bus()
    bus.add_signal_watch()
    bus.enable_sync_message_emission()
    bus.connect("message", message_handler, loop)

    print("Playing...\nPress Ctrl+C to exit\n")
    pipeline.set_state(Gst.State.PLAYING)
    pipeline.get_state(timeout_seconds * Gst.SECOND)

    try:
        loop.run()
    except BaseException:
        loop.quit()

    print("\nClosing app...")
    pipeline.set_state(Gst.State.NULL)
    pipeline.get_state(timeout_seconds * Gst.SECOND)


if __name__ == "__main__":
    sys.exit(main(sys.argv))