Signals

From RidgeRun Developer Wiki

Overview

Metadata from GstInference can be obtained through GSignals, which makes it available to other programs or processes, for example applications written in Python or C++.

Inference String Signal

The following code shows a simple capture of the signal in GStreamer using Python. It installs a handler function for the signal called "new-inference-string" emitted by the GstInference element. The signal sends a string formatted as JSON, which can be parsed in Python with the json.loads function.

For details about which elements can be accessed in the serialized JSON string, check this section.

import gi
gi.require_version("Gst", "1.0")
gi.require_version("GstVideo", "1.0")
from gi.repository import Gst, GObject, GstVideo
import json

GObject.threads_init() # Only needed for PyGObject < 3.11; a no-op on newer versions
Gst.init(None)

def newPrediction(element, meta):
    # Parse data from string to json object
    data = json.loads(meta)
    print(data)

# Settings
video_dev = "/dev/video0"
arch = "mobilenetv2ssd"
backend = "coral"
model = "/home/coral/models/ssd_mobilenet_v2_coco_quant_postprocess_edgetpu.tflite"
input_layer = "" # Needed by other backends such as Tensorflow
output_layer = "" # Needed by other backends such as Tensorflow

# Pipeline
inf_pipe_str = "v4l2src device=%s ! videoscale ! videoconvert ! \
                video/x-raw,width=640,height=480,format=I420 ! \
                videoconvert ! inferencebin arch=%s backend=%s \
                model-location=%s input-layer=%s output-layer=%s \
                overlay=true name=net ! \
                videoconvert ! autovideosink name=videosink sync=false" % \
                (video_dev, arch, backend, model, input_layer, output_layer)

# Load pipeline from string
inference_pipe = Gst.parse_launch(inf_pipe_str)

if not inference_pipe:
    print("Unable to create pipeline")
    exit(1)

# Search for the arch element inside the inferencebin
net = inference_pipe.get_by_name("arch")

# Connect to the inference string signal
net.connect("new-inference-string", newPrediction)

# Start the pipeline
inference_pipe.set_state(Gst.State.PLAYING)

# Launch the main loop
loop = GObject.MainLoop()
loop.run()
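
The exact fields contained in the serialized string depend on the architecture and model in use, so a convenient first step is to inspect the parsed object before accessing specific keys. The snippet below is a minimal sketch of an alternative handler (the inspectPrediction name is just illustrative) that pretty-prints the prediction and lists its top-level fields; the actual field names are documented in the section linked above.

import json

def inspectPrediction(element, meta):
    # Parse the serialized prediction string into a Python object
    data = json.loads(meta)

    # Pretty-print the full structure for inspection
    print(json.dumps(data, indent=2))

    # List the top-level fields available in this prediction
    if isinstance(data, dict):
        print("Available fields:", list(data.keys()))

# Connect it the same way as the handler above:
# net.connect("new-inference-string", inspectPrediction)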

