Installing R2Inference

From RidgeRun Developer Wiki

Previous: Development/Integrating Artificial Intelligence software stack/Installing TensorFlow Lite for NXP based images Index Next: Development/Integrating Artificial Intelligence software stack/Installing GstInference

Download the meta-ridgerun recipes

Once you have a Yocto environment set up, go to the sources directory and download the meta-ridgerun layer, which contains the r2inference recipe:

cd $HOME/<your image folder>/sources

git clone https://github.com/RidgeRun/meta-ridgerun.git

Find the bblayers.conf at:

cd $HOME/<your image folder>/<your build directory>/conf/

In bblayers.conf, add the following line at the end of the file:

BBLAYERS += "${BSPDIR}/sources/meta-ridgerun"
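The layer-registration step above can be scripted and verified. The sketch below uses a scratch directory standing in for `$HOME/<your image folder>/<your build directory>/conf/`, so substitute your real paths:

```shell
# Append the meta-ridgerun layer to bblayers.conf and verify the entry.
# CONF_DIR is a stand-in for <your build directory>/conf/ -- adjust to your setup.
CONF_DIR=/tmp/yocto-demo/conf
mkdir -p "$CONF_DIR"
echo 'BBLAYERS += "${BSPDIR}/sources/meta-ridgerun"' >> "$CONF_DIR/bblayers.conf"
grep 'meta-ridgerun' "$CONF_DIR/bblayers.conf"
```

In a real, initialized build environment, running `bitbake-layers show-layers` afterwards should list meta-ridgerun among the configured layers.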

Configuring and building r2inference

Go to the r2inference recipe at:

cd $HOME/<your image folder>/sources/meta-ridgerun/recipes-multimedia/r2inference

Now ensure that your recipe looks like the following:

SUMMARY = "RidgeRun's Inference Library"
DESCRIPTION = "A machine learning library for easy inference integration"
HOMEPAGE = "https://developer.ridgerun.com/wiki/index.php?title=R2Inference"
SECTION = "multimedia"
LICENSE = "LGPL2.1"

LIC_FILES_CHKSUM = "file://COPYING;md5=a079f37f0484c6a88e7b23a94d6326c5"

DEPENDS = "tensorflow-lite glib-2.0"

# FIXME: change branch
SRCBRANCH ?= "feature/add-support-tflite-2.4"
# FIXME: change commit
SRCREV = "9bcc2080e34d2b60fbd6b59f2132667422073ba7"
SRC_URI = "git://github.com/RidgeRun/r2inference;protocol=https;branch=${SRCBRANCH}"

# Enabling TensorFlow Lite and NNAPI backends:
EXTRA_OEMESON += " -Denable-nnapi=true -Denable-tflite=true -Denable-tests=disabled -Denable-docs=disabled"

# Add the path where the TensorFlow Lite headers are located
TENSORFLOW_PATH="../recipe-sysroot/usr/include/"
CXXFLAGS="-I${TENSORFLOW_PATH}/tensorflow/lite -I${TENSORFLOW_PATH}/tensorflow/lite/tools/make/downloads/flatbuffers/include"

S = "${WORKDIR}/git"

FILES_${PN} += "${libdir}/libr2inference.so"

inherit meson pkgconfig gettext

RDEPENDS_${PN} = "ninja python3-core python3-modules"
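If you later point SRCREV at a different commit, BitBake will abort the build when the COPYING file has changed, because its md5 no longer matches LIC_FILES_CHKSUM. You can recompute the checksum with md5sum; in a real workflow you would run it inside the cloned r2inference tree, and the sketch below uses a scratch file in place of COPYING:

```shell
# Recompute the license checksum used by LIC_FILES_CHKSUM.
# In practice: run `md5sum COPYING` in the r2inference source tree.
# A scratch file stands in for COPYING here.
printf 'example license text\n' > /tmp/COPYING.example
md5sum /tmp/COPYING.example | awk '{print $1}'
```

Paste the resulting 32-character hash into the `md5=` field of LIC_FILES_CHKSUM.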


In the local.conf file, add the following line so that the r2inference library is built into the image (note the leading space inside the quotes, which the _append operator does not insert automatically):

IMAGE_INSTALL_append = " r2inference"
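As a side note, Yocto releases from Honister (3.4) onward replaced the underscore override syntax with a colon. The recipe above uses the pre-Honister `_` style, so an older release is expected here, but if your BSP is on a newer release the equivalent local.conf line would be:

```
IMAGE_INSTALL:append = " r2inference"
```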

Building the image with the changes in local.conf and bblayers.conf

cd $HOME/<your image folder>

source setup-environment <your build directory>

bitbake imx-image-core
