GstInference/Supported backends/NCSDK
Make sure you also check GstInference's companion project: R2Inference
The Intel® Movidius™ Neural Compute SDK (Intel® Movidius™ NCSDK) enables deployment of deep neural networks (DNNs) on compatible devices such as the Intel® Movidius™ Neural Compute Stick. The NCSDK includes a set of software tools to compile, profile, and validate DNNs, as well as C/C++ and Python APIs for application development.
The NCSDK has two general usages:
- Profiling, tuning, and compiling DNN models.
- Prototyping user applications that run accelerated on neural compute device hardware, using the Neural Compute API (NCAPI), as sketched below.
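
As an illustration of the second usage, the following is a minimal sketch of one inference cycle with the NCAPI v2 Python bindings (the API shipped with NCSDK 2.x). The graph file name ('graph') and the 224x224x3 input size are placeholders for illustration; both depend on the model you compiled with mvNCCompile.

import numpy
from mvnc import mvncapi

# Find and open the first attached neural compute device
device_list = mvncapi.enumerate_devices()
device = mvncapi.Device(device_list[0])
device.open()

# Load a graph file compiled with mvNCCompile and allocate it on the
# device together with its input/output FIFO queues
with open('graph', mode='rb') as f:
    graph_buffer = f.read()
graph = mvncapi.Graph('inference_graph')
input_fifo, output_fifo = graph.allocate_with_fifos(device, graph_buffer)

# Queue one inference with a dummy tensor and read the result back
# (replace the zeros with real, preprocessed image data)
input_tensor = numpy.zeros((224, 224, 3), dtype=numpy.float32)
graph.queue_inference_with_fifo_elem(input_fifo, output_fifo, input_tensor, None)
output_tensor, user_obj = output_fifo.read_elem()
print('Output shape:', output_tensor.shape)

# Release device resources
input_fifo.destroy()
output_fifo.destroy()
graph.destroy()
device.close()
device.destroy()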
Installation
You can install the NCSDK directly on a system running Linux, in a Docker container, on a virtual machine, or in a Python virtual environment. All of the possible installation paths are documented in the official installation guide.
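
Once installed, a quick way to confirm that the Python API is usable and a device is detected is a short check such as the following (assuming NCSDK 2.x, whose Python package is named mvnc):

from mvnc import mvncapi

# List the neural compute devices attached to the system; an empty
# list means no device was detected
devices = mvncapi.enumerate_devices()
print('Detected %d neural compute device(s)' % len(devices))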
Tools