ONNX simple sample
Introduction
On this page you will find the steps to install ONNX and ONNX Runtime and run a simple C/C++ example on Linux. This wiki page describes the importance of ONNX models and how to use them. The goal is to provide you with working examples.
Installing ONNX
You can install ONNX from PyPI with the following command:
sudo pip install onnx
You can also build and install ONNX locally from source code:
git clone https://github.com/onnx/onnx.git
cd onnx
git submodule update --init --recursive
python setup.py install
Installing ONNXRuntime
This guide builds the baseline CPU version of ONNX Runtime from source. To build it, use the following commands:
git clone --recursive https://github.com/Microsoft/onnxruntime -b v1.0.0
cd onnxruntime
Before installing onnxruntime, you need to install CMake 3.13 or higher.
sudo -H pip3 install cmake
After installing CMake, run the following command to build onnxruntime:
./build.sh --config RelWithDebInfo --build_shared_lib --parallel
* To use a different backend, please refer to this site to check how to build ONNX Runtime
Finally, install it:
cd build/Linux/RelWithDebInfo
sudo make install
Then, copy the .so file to the general lib path:
cp libonnxruntime.so /usr/lib/x86_64-linux-gnu/
Enabling other execution providers
ONNX Runtime supports multiple execution providers; for a full list, visit: https://github.com/microsoft/onnxruntime/blob/master/BUILD.md
Intel DNNL
./build.sh --config RelWithDebInfo --build_shared_lib --parallel --use_dnnl
cd build/Linux/RelWithDebInfo
sudo make install
sudo cp libonnxruntime.so.1.2.0 /usr/lib/x86_64-linux-gnu/libonnxruntime.so
sudo cp dnnl/install/lib/libmkldnn.so /usr/lib/x86_64-linux-gnu/
Example
This guide covers using ONNX Runtime C/C++ code on Linux; for that reason, only the SqueezeNet example is built.
Build
First, go to the path with the C/C++ code examples.
cd onnxruntime/csharp/test/Microsoft.ML.OnnxRuntime.EndToEndTests.Capi/
After that, build the code:
g++ -o Capi_sample C_Api_Sample.cpp -I $PATHTOONNXRUNTIMESESSION -lonnxruntime -std=c++14
where $PATHTOONNXRUNTIMESESSION is the appropriate path to onnxruntime/include/onnxruntime/core/session.
Run
Finally, just run the code:
./Capi_sample
Running this example, you will get the following output:
Using Onnxruntime C API
Number of inputs = 1
Input 0 : name=data_0
Input 0 : type=1
Input 0 : num_dims=4
Input 0 : dim 0=1
Input 0 : dim 1=3
Input 0 : dim 2=224
Input 0 : dim 3=224
Score for class [0] = 0.000045
Score for class [1] = 0.003846
Score for class [2] = 0.000125
Score for class [3] = 0.001180
Score for class [4] = 0.001317
Done!
Convert DNN models to ONNX
The objective of ONNX is to provide a common language to describe the graphs of neural networks; for that reason, the community provides tools to convert models from different deep learning frameworks to the ONNX protocol buffer format.
Tensorflow to ONNX
This tool can convert TensorFlow models from the saved_model, checkpoint, or frozen graph formats.
Here you can find an example of how to convert a saved_model or frozen graph to ONNX.
To obtain more information and download the tool, refer to this site
Keras to ONNX
This tool can convert Keras models.
To obtain more information and download the tool, refer to this site
Scikit-Learn to ONNX
This tool can convert Scikit-Learn models.
To obtain more information and download the tool, refer to this site
ONNXMLTools
ONNXMLTools provides support to convert models from CoreML, LightGBM, LibSVM, and XGBoost to ONNX.
This project also works as a wrapper for the TensorFlow, Keras, and Scikit-Learn converters.
To obtain more information and download the tool, refer to this site