GstInference/Example pipelines with hierarchical metadata
<noinclude>
{{GstInference/Head|previous=Example pipelines/IMX8|next=Example Applications|keywords=inference metadata,hierarchical metadata|title=GstInference GStreamer example pipelines with hierarchical metadata}}
</noinclude>
<!-- If you want a custom title for the page, un-comment and edit this line:
{{DISPLAYTITLE:GstInference - <descriptive page name>|noerror}}
-->
__NOTOC__
== Sample pipelines ==
The following section contains a tool for generating simple GStreamer pipelines with one model of a selected architecture, using our hierarchical inference metadata. If you are using an older version, you can check the legacy pipelines section. Please make sure to check the documentation to understand the property usage for each element.

The required elements are:
* Backend
* Model
* Model location
* Labels
* Source
* Sink

The optional elements include:
* inferencefilter
* inferencecrop
* inferenceoverlay

[[File:Inference example.png|1000px|thumb|center|Detection with new metadata]]
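As a reference, a minimal detection pipeline combining the required elements might look like the sketch below. The model file, labels file, layer names, and camera device are assumptions for illustration; adjust them to your own setup and check each element's documentation for the exact property names in your version.

```bash
# Hypothetical detection pipeline: TinyYOLOv2 on the TensorFlow backend.
# graph_tinyyolov2_tensorflow.pb, labels.txt, the layer names, and
# /dev/video0 are placeholders, not files shipped with GstInference.
gst-launch-1.0 \
  v4l2src device=/dev/video0 ! videoconvert ! tee name=t \
  t. ! queue ! net.sink_model \
  t. ! queue ! net.sink_bypass \
  tinyyolov2 name=net backend=tensorflow \
    model-location=graph_tinyyolov2_tensorflow.pb \
    labels=labels.txt \
    backend::input-layer=input/Placeholder \
    backend::output-layer=add_8 \
  net.src_model ! fakesink \
  net.src_bypass ! queue ! inferenceoverlay ! videoconvert ! autovideosink
```

The tee feeds the same frames to the model pad (for inference) and the bypass pad (for display), which is the pattern the generator below follows.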
== Pipeline Generator ==
<br>
{{Ambox
|type=notice
|small=left
|issue=NOTE: The following tool provides unoptimized pipelines. Consider removing unnecessary elements.
|style=width:unset;
}}
<br>
The following tool will provide simple pipelines according to the selected elements. Select a backend (EdgeTPU, TFLite, TensorFlow, TensorRT, ONNXRT, ONNXRT ACL, or ONNXRT OpenVINO) and a model; the generator fills in a matching model location (for example <code>graph_MODEL_tensorflow.pb</code> for TensorFlow, <code>graph_MODEL.onnx</code> for ONNXRT) and the input/output layer names where the backend requires them, plus any selected optional elements such as inferenceoverlay with its thickness, font-scale, and style settings.
<br>
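The optional elements can be appended around the inference element. The fragment below is a sketch of how the generator places them, with assumed property values; <code>filter-class=8</code> and the overlay styling are examples only, not defaults.

```bash
# Hypothetical use of the optional elements around an inference element "net":
# inferencefilter keeps only the metadata for the given class id, and
# inferenceoverlay draws the surviving predictions on the bypass branch.
gst-launch-1.0 \
  ... tinyyolov2 name=net ... \
  net.src_model ! inferencefilter filter-class=8 ! fakesink \
  net.src_bypass ! queue ! inferenceoverlay thickness=2 font-scale=1.5 ! \
  videoconvert ! autovideosink
```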
== Advanced pipelines ==
<br>
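More elaborate graphs can chain two models, using inferencecrop to feed detected regions into a second network. The sketch below is an assumption of how such a cascade could be wired, not a tested pipeline; model files are placeholders and the exact pad wiring may differ between GstInference versions.

```bash
# Hypothetical two-stage pipeline: detection (tinyyolov2) followed by
# classification (inceptionv1) on the regions cropped from each detection.
gst-launch-1.0 \
  v4l2src ! videoconvert ! tee name=t \
  t. ! queue ! net1.sink_model \
  t. ! queue ! net1.sink_bypass \
  tinyyolov2 name=net1 backend=tensorflow \
    model-location=graph_tinyyolov2_tensorflow.pb \
  net1.src_model ! fakesink \
  net1.src_bypass ! inferencecrop ! videoscale ! queue ! net2.sink_model \
  inceptionv1 name=net2 backend=tensorflow \
    model-location=graph_inceptionv1_tensorflow.pb \
  net2.src_model ! fakesink
```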
<noinclude>
{{GstInference/Foot|Example pipelines/IMX8|Example Applications}}
</noinclude>
Latest revision as of 19:38, 27 February 2023
Make sure you also check GstInference's companion project: R2Inference