Demo Description

SmartSeek360 searches for objects specified by the user in a 360-degree video stream. Once an object is found, the system follows it through the video leveraging the PTZ functions and triggers video recording at the same time.

This system is made up of the following Microservices:

NVIDIA Metropolis Microservices:

  1. Video Storage Toolkit (VST): The VST NVIDIA microservice auto-discovers ONVIF-S compliant IP cameras and allows the use of a custom IP stream as a video source. It then allows video to be stored, played back at various speeds, or paused at any frame.
  2. DeepStream AI Microservice: Metropolis Microservices provide a DeepStream application for real-time processing of the camera streams presented by VST. The processing includes object detection and object tracking. The output of the application is metadata based on the Metropolis schema, sent to the Redis message bus using the msgbroker plugin (see the sketch after this list).
  3. REDIS: Redis is a message broker that contains all device metadata. VST depends on Redis streams to publish device status change events. DeepStream depends on Redis streams to publish metadata and receive VST camera status events.
  4. Analytics: This NVIDIA microservice creates metrics and emits alerts based on its configuration.
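
As a rough illustration of how the DeepStream metadata can be consumed, the sketch below reads entries from a Redis stream with redis-py. The stream key ("test"), the payload field name ("metadata"), and the JSON layout are assumptions for this example; the actual values depend on the msgbroker and deployment configuration.

    # Minimal sketch: consume DeepStream detection metadata from a Redis stream.
    # Stream key, field name, and JSON layout are assumptions for illustration.
    import json
    import redis

    r = redis.Redis(host="localhost", port=6379, decode_responses=True)

    last_id = "$"  # only read entries added after this point
    while True:
        # Block for up to 5 seconds waiting for new metadata entries
        entries = r.xread({"test": last_id}, count=10, block=5000)
        for _stream, messages in entries:
            for msg_id, fields in messages:
                last_id = msg_id
                # Each entry is expected to carry a Metropolis-schema JSON payload
                payload = json.loads(fields.get("metadata", "{}"))
                for obj in payload.get("objects", []):
                    print("detection:", obj)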

RidgeRun Microservices:

  1. PTZ Microservice: The PTZ Microservice receives 360-degree equirectangular RTSP video streams and outputs an RTSP stream with the requested PTZ operations applied to the equirectangular video, using RidgeRun's libpanorama.
  2. Detection Microservice: The Detection Microservice detects, in the input stream, the target objects described in a text prompt. The microservice uses the NanoOwl generative AI model, which allows open-vocabulary detection, meaning that the user can provide a list of objects that is not bound to any specific set of classes. For example, it is possible to indicate: search for dogs.
  3. AI Agent: The AI Agent Microservice allows natural communication between the user and the other microservices. This service uses the Hugging Face LLM Trelis/Llama-2-7b-chat-hf-function-calling-v3 to convert text commands into API calls: it processes the LLM result and issues the corresponding API request. For example, it is possible to request: move the camera 30 degrees to the right. A sketch of this dispatch step is shown after the list.

  4. Analytics: This Microservice reads the detection metadata provided by the NVIDIA Redis microservice, moves the camera to the detected object's position using the PTZ Microservice, and starts video recording using the NVIDIA VST microservice. A sketch of this flow is also shown after the list.
  5. BIPS: This Microservice can be used to handle direct memory buffer transfers. It is currently a work in progress and is not included in the demo.
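
To make the AI Agent flow more concrete, the sketch below shows one possible dispatch step: the function call emitted by the LLM is parsed and translated into a request against a PTZ endpoint. The function-call format, endpoint, and parameters are hypothetical assumptions for illustration; they are not the documented APIs of the AI Agent or PTZ microservices.

    # Hypothetical dispatch step: turn an LLM function-call result into an API request.
    # The function name, arguments, and PTZ endpoint below are illustrative only.
    import json
    import requests

    PTZ_URL = "http://localhost:5020"  # assumed PTZ microservice address

    # Example of what the function-calling LLM might return for
    # "move the camera 30 degrees to the right"
    llm_result = '{"name": "move_camera", "arguments": {"direction": "right", "degrees": 30}}'

    call = json.loads(llm_result)
    if call["name"] == "move_camera":
        args = call["arguments"]
        delta = args["degrees"] if args["direction"] == "right" else -args["degrees"]
        # Hypothetical endpoint that shifts the virtual pan angle by a relative amount
        requests.post(f"{PTZ_URL}/pan", json={"delta": delta}, timeout=5)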

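Similarly, the Analytics flow described in item 4 can be pictured as: read a detection, re-center the PTZ view on it, and start a VST recording. The endpoints, ports, and payload fields below are assumptions for illustration rather than the documented PTZ or VST APIs.

    # Rough sketch of the Analytics reaction to a detection: point the virtual PTZ
    # camera at the object and ask VST to start recording.
    # Endpoints, ports, and payload fields are assumed for illustration only.
    import requests

    PTZ_URL = "http://localhost:5020"   # assumed PTZ microservice address
    VST_URL = "http://localhost:30080"  # assumed VST microservice address

    def follow_detection(pan_deg, tilt_deg, sensor_id):
        # Re-center the PTZ view on the detected object (hypothetical endpoint)
        requests.post(f"{PTZ_URL}/position",
                      json={"pan": pan_deg, "tilt": tilt_deg, "zoom": 1.0},
                      timeout=5)
        # Start recording the stream where the object was found (hypothetical endpoint)
        requests.post(f"{VST_URL}/api/record/start",
                      json={"sensorId": sensor_id},
                      timeout=5)

    follow_detection(pan_deg=30.0, tilt_deg=-10.0, sensor_id="camera-360")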