RidgeRun product plugins used in the NVIDIA GTC 2024 360 VR Demo

Understanding how the demo functions and how to execute it is crucial. Below, you'll find an overview of the process behind creating the demo, highlighting the role of RidgeRun's products in making it possible.

RidgeRun Products in the Demo

To understand what each RidgeRun product contributes, it helps to walk through the processing pipeline that powers the demo. The following sections describe how these products work behind the scenes.

The processing handled by the GStreamer pipelines controlled by the web page follows the flow shown in the next image.

Interpipe structure of the 360 video demo


In summary, the flow consists of the following stages:

1) Capture: the section in charge of capturing the camera feeds and converting the fisheye lens output into a complete 360-degree field of view.

2) Video Processing and AI: the section that adds AI features by processing the video generated by the capture stage.

3) Features: optional outputs such as RTSP or HLS streaming.

4) GstInterpipes: the plugin that interconnects these independent pipelines.

Each of these sections is explained in detail below, but first it is important to understand what GstInterpipe is.

GstInterpipe

GstInterpipe enables easy and dynamic interconnection between the different sections of the pipeline. It allows buffers and events to flow between two or more independent pipelines. GstInterpipe provides two elements (a minimal connection sketch follows the list below):

  • interpipesink: Sink for internal pipeline buffer communication
  • interpipesrc: Source for internal pipeline buffer communication
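
The following gst-launch-1.0 sketch only illustrates how these elements connect two independent pipelines; the channel name demo_channel and the element properties shown are examples, not the demo's actual configuration.

    # Producer pipeline: publishes its buffers on the "demo_channel" node (example name)
    gst-launch-1.0 videotestsrc ! interpipesink name=demo_channel sync=false async=false

    # Consumer pipeline: listens to "demo_channel" and renders the received buffers
    gst-launch-1.0 interpipesrc listen-to=demo_channel is-live=true format=time ! videoconvert ! autovideosink

In this scheme, enabling a feature from the web page roughly amounts to starting a consumer pipeline like the second one, or pointing an interpipesrc at a different node through its listen-to property.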

The web page lets users choose which features to integrate into the processing pipeline by enabling the corresponding interpipesink or interpipesrc in the backend. Users can build a customized workflow for the 360 video demo without writing any pipeline code; everything is configured through the web page setup menu.

Capture

The capture section takes the camera input and processes it into a video with a 360-degree field of view. To achieve this, the pipeline uses GstStitcher (the plugin in charge of stitching two or more images into one) and GstProjector (the plugin in charge of converting the fisheye output into an equirectangular image). The process works as follows.

Structure of the capture pipeline using GstInterpipes

The cameras, fitted with fisheye lenses, capture the scene as follows.

Capture of camera 0
Capture of camera 1

Then, using GstProjector, the captures are converted to equirectangular images.

Equirectangular projection of camera 0
Equirectangular projection of camera 1

Finally, GstStitcher stitches both captures into one, resulting in the complete 360 image.

Stitching of the equirectangular projections
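
As a rough sketch of this structure only, the capture stage can be pictured as two projection branches feeding a stitcher, with the result published on an interpipe node. The element names projector and stitcher below are placeholders for the actual GstProjector and GstStitcher elements, the camera sources and pad names are simplified, and capture_360 is an example channel name reused in the sketches further down; none of this is the demo's literal pipeline.

    # Illustrative shape of the capture pipeline; "projector" and "stitcher" are
    # placeholder names for the GstProjector and GstStitcher elements
    gst-launch-1.0 \
      nvarguscamerasrc sensor-id=0 ! projector ! s.sink_0 \
      nvarguscamerasrc sensor-id=1 ! projector ! s.sink_1 \
      stitcher name=s ! interpipesink name=capture_360 sync=false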

Video Processing and AI

This section focuses on adding DeepStream processing to the output of the capture pipeline.

People / Face Detection

When enabled, this pipeline uses DeepStream to detect people and faces in the output of the capture section. This feature relies on a pre-trained model such as PeopleNet to perform the detection.
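
As a hedged sketch of what such a branch can look like (not the demo's exact pipeline), the capture output is pulled from an interpipe node, batched, run through nvinfer with a detection model configuration, and the resulting boxes are drawn by nvdsosd. The channel names and the configuration file path are illustrative assumptions.

    # Illustrative DeepStream detection branch; channel names and config path are examples
    gst-launch-1.0 interpipesrc listen-to=capture_360 is-live=true format=time ! \
      nvvideoconvert ! "video/x-raw(memory:NVMM)" ! mux.sink_0 \
      nvstreammux name=mux batch-size=1 width=1920 height=1080 ! \
      nvinfer config-file-path=config_infer_peoplenet.txt ! \
      nvvideoconvert ! nvdsosd ! nvvideoconvert ! \
      interpipesink name=processed_video sync=false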

Face Blurring (Anonymization)

In contrast to the face detection feature, this pipeline uses RidgeRun's AI-based object redaction library; in this demo, the redaction is applied to faces as an example.
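
A purely illustrative sketch of the shape of this branch follows; redaction is a hypothetical stand-in for the element provided by the object redaction library, and the channel names are examples.

    # Purely illustrative; "redaction" is a hypothetical placeholder element
    gst-launch-1.0 interpipesrc listen-to=capture_360 is-live=true format=time ! \
      redaction ! interpipesink name=anonymized_video sync=false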

Features

In this last section, the enabled features receive the output of the video processing stage. The available features are:

Snapshot

This pipeline takes a snapshot of the output that the user will see on the MetaQuest and in VLC. It always sends a thumbnail to the web page.
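
A minimal snapshot sketch, assuming the processed video is available on an interpipe node (processed_video is an example name): a single buffer is pulled, encoded as JPEG, and written to a file that the web page can serve as a thumbnail.

    # Illustrative snapshot branch: grab one buffer, encode it as JPEG, save a thumbnail
    gst-launch-1.0 interpipesrc listen-to=processed_video is-live=true format=time num-buffers=1 ! \
      videoconvert ! jpegenc ! filesink location=thumbnail.jpg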

RTSP Stream

This pipeline sends the output over the RTSP protocol. You can view the stream with VLC or any other media player with RTSP support. GstRtspSink makes the transmission of the stream over RTSP possible.
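
A hedged sketch of an RTSP branch built with GstRtspSink, assuming H.264 encoding; the channel name, stream mapping, and service port are illustrative and may not match the demo's configuration.

    # Illustrative RTSP branch using GstRtspSink (mapping and service port are examples)
    gst-launch-1.0 interpipesrc listen-to=processed_video is-live=true format=time ! \
      videoconvert ! x264enc tune=zerolatency ! h264parse ! \
      "video/x-h264, mapping=/stream" ! rtspsink service=554

With values like these, the stream could be opened in VLC at an address of the form rtsp://<device-ip>:554/stream.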

HLS Stream (VR)

This pipeline generates an HLS stream of the processed video.
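
A minimal HLS sketch using the standard hlssink2 element, assuming H.264 encoding; the demo's actual element choices, segment locations, and durations may differ.

    # Illustrative HLS branch: encode to H.264 and segment it into an HLS playlist
    gst-launch-1.0 interpipesrc listen-to=processed_video is-live=true format=time ! \
      videoconvert ! x264enc tune=zerolatency ! h264parse ! \
      hlssink2 playlist-location=playlist.m3u8 location=segment_%05d.ts target-duration=5 max-files=10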


