GstInference/Supported backends/NCSDK

{{DISPLAYTITLE:GstInference and NCSDK backend|noerror}}


The Intel® Movidius™ Neural Compute SDK (Intel® Movidius™ NCSDK) enables deployment of deep neural networks (DNNs) on compatible devices such as the Intel® Movidius™ Neural Compute Stick. The NCSDK includes a set of software tools to compile, profile, and validate DNNs, as well as C/C++ and Python APIs for application development.
To use the NCSDK on GstInference, be sure to run the R2Inference configure with the flag <code>--enable-ncsdk</code> and use the property <code>backend=ncsdk</code> on the GstInference plugins.
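
As a sketch of how the <code>backend=ncsdk</code> property is used in practice, a live-camera pipeline might look like the following. The inference element name (<code>inceptionv1</code>) and the graph location are placeholders for illustration; check the GstInference example pipelines for the exact element and property names in your version:

<syntaxhighlight lang=bash>
# Hypothetical pipeline sketch: element name and graph path are placeholders.
gst-launch-1.0 v4l2src ! videoconvert ! videoscale ! \
  inceptionv1 backend=ncsdk model-location=graph ! \
  videoconvert ! autovideosink
</syntaxhighlight>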


==Installation==


You can install the NCSDK directly on a system running Linux, through a Docker container, on a virtual machine, or in a Python virtual environment. All of the possible installation paths are documented in the [https://movidius.github.io/ncsdk/install.html official installation guide].
Note: the Docker container route is recommended for the NCSDK installation. Other routes may affect your Python environment, because the installer sometimes uninstalls and reinstalls Python and common packages such as NumPy or TensorFlow. The Docker installation is simple and does not affect your environment at all. Use this [https://movidius.github.io/ncsdk/docker.html link] to jump directly to the Docker section of the installation guide.


== Enabling the backend ==


To enable NCSDK as a backend for GstInference you need to install R2Inference with NCSDK support. To do this, use the option <code>--enable-ncsdk</code> during the R2Inference configure:
<syntaxhighlight lang=bash>
./configure --enable-ncsdk
</syntaxhighlight>


==Generating a graph==


The GstInference NCSDK backend uses the same graphs as the NCSDK API. These graphs are specially compiled to run inference on a Neural Compute Stick (NCS). The NCSDK provides a tool (mvNCCompile) to generate NCS graphs from either a TensorFlow frozen model or a Caffe model and weights. For examples of how to generate a graph, please check the [[R2Inference/Supported_backends/NCSDK#Generating_a_model_for_R2I | Generating a model for R2I]] section on the R2Inference wiki.
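
As a brief sketch of the two compilation paths, the invocations below use placeholder file names; refer to the R2Inference wiki section linked above for complete, tested examples:

<syntaxhighlight lang=bash>
# File names are placeholders for illustration.
# From a Caffe model and weights:
mvNCCompile deploy.prototxt -w weights.caffemodel -s 12 -o graph
# From a TensorFlow frozen model (input and output node names must match your model):
mvNCCompile frozen_model.pb -in input -on output -s 12 -o graph
</syntaxhighlight>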


==Properties==
You can find the full documentation of the C API [https://movidius.github.io/ncsdk/ncapi/ncapi2/c_api/readme.html here] and the Python API [https://movidius.github.io/ncsdk/ncapi/ncapi2/py_api/readme.html here]. GstInference uses only the C API, and R2Inference takes care of devices, graphs, models, and FIFOs. Because of this, we will only look at the options that you can change when using the C API through R2Inference.


To learn more about the NCSDK C API options, please check the [[R2Inference/Supported_backends/NCSDK#API| NCSDK API wiki section]] on the R2Inference subwiki.


==Tools==


The NCSDK installation includes some useful tools to analyze, optimize, and compile models, such as mvNCCompile, mvNCProfile, and mvNCCheck. We only mention these tools here; for examples and a more complete description, please check the [[R2Inference/Supported_backends/NCSDK| NCSDK wiki page]] on the R2Inference subwiki.
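
As a hedged sketch of how the profiling and validation tools are typically invoked (the model file names below are placeholders, and a Caffe model is assumed):

<syntaxhighlight lang=bash>
# Placeholders for illustration; both tools ship with the NCSDK.
# mvNCProfile reports per-layer execution time and bandwidth on the NCS:
mvNCProfile deploy.prototxt -w weights.caffemodel -s 12
# mvNCCheck compares NCS inference results against the host framework:
mvNCCheck deploy.prototxt -w weights.caffemodel -s 12
</syntaxhighlight>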