R2Inference - ONNXRT Examples





Previous: Examples/EdgeTPU Index Next: Examples/ONNXRT ACL




R2Inference provides several examples to highlight its usage with different architectures and frameworks. The models used for these examples can be downloaded from our Model Zoo.

Preparation

To test the ONNXRT example you will need an ONNX-compatible model file for the InceptionV2 architecture. You can obtain this model from our Model Zoo.


Inception v2

This example is located in r2inference/examples/r2i/onnxrt. To run it, use:

./inception -i [JPG input Image] -m [ONNX Model] -s [Model Input Size] -I [Path to preprocessing .so module] -O [Path to postprocessing .so module]

For example, evaluating this image:

Inception example input

Should produce the following output:

./inception -i plane.jpg -m graph_inceptionv2.onnx -s 224 -I ../../../r2i/preprocessing/.libs/libnormalize_inceptionv1.so -O ../../../r2i/postprocessing/.libs/top_sort_postprocessing.so
Loading Model: graph_inceptionv2.onnx
Setting model to engine
Configuring ONNXRT session parameters
Loading image: plane.jpg
Starting engine
Predicting...
Postprocessing...
Highest probability is label 405 (0.703488)
Stopping engine

According to the ImageNet labels, 405 corresponds to an 'airship'.
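
Under the hood, the example uses the R2Inference C++ API with the ONNX Runtime backend. The following is a minimal sketch of those calls, assuming the standard r2i factory interface (IFrameworkFactory, ILoader, IEngine, IFrame, IPrediction); exact signatures may vary between R2Inference versions, error checking is omitted, and the preprocessing and postprocessing normally performed by the loadable .so modules is replaced here by a zero-filled dummy buffer.

#include <r2i/r2i.h>

#include <iostream>
#include <vector>

int main () {
  r2i::RuntimeError error;

  /* Select the ONNX Runtime backend */
  auto factory = r2i::IFrameworkFactory::MakeFactory (r2i::FrameworkCode::ONNXRT,
                                                      error);

  /* Load the InceptionV2 ONNX model */
  auto loader = factory->MakeLoader (error);
  auto model = loader->Load ("graph_inceptionv2.onnx", error);

  /* Create and start the inference engine */
  auto engine = factory->MakeEngine (error);
  engine->SetModel (model);
  engine->Start ();

  /* In the real example this buffer holds the image after the
     preprocessing module has resized and normalized it; here a
     zero-filled 224x224 RGB float buffer stands in for it. */
  std::vector<float> image_data (224 * 224 * 3, 0.0f);
  auto frame = factory->MakeFrame (error);
  frame->Configure (image_data.data (), 224, 224, r2i::ImageFormat::Id::RGB);

  /* Run inference and read one class probability back */
  auto prediction = engine->Predict (frame, error);
  std::cout << "Label 405 probability: " << prediction->At (405, error)
            << std::endl;

  engine->Stop ();

  return 0;
}

The actual example adds argument parsing, checks the error object after every call, and dynamically loads the preprocessing and postprocessing .so modules passed through -I and -O; see the source in r2inference/examples/r2i/onnxrt for the full implementation.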




Previous: Examples/EdgeTPU Index Next: Examples/ONNXRT ACL