Installing TensorFlow Lite for NXP-based images
The Getting started with AI on NXP i.MX8M Plus RidgeRun documentation is currently under development.
Installing TensorFlow Lite for i.MX8M Plus
In Gatesgarth there are some issues with how this library is built: TensorFlow Lite is built as a static library, but other plugins in this workflow need TensorFlow Lite as a shared library. NXP solved some of these issues in the Hardknott Yocto version, so that is the version used here.
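You can check directly whether a given build produced a static or a shared library by looking at the file's magic bytes. The following is a minimal, stdlib-only Python sketch; the library path in the usage comment is hypothetical, and nothing here is NXP-specific:

```python
# Distinguish a static archive (e.g. libtensorflow-lite.a) from a shared
# object (e.g. libtensorflow-lite.so) by reading the file's magic bytes.
def library_kind(path: str) -> str:
    with open(path, "rb") as f:
        magic = f.read(8)
    if magic.startswith(b"!<arch>\n"):   # "ar" archive magic -> static library
        return "static archive"
    if magic.startswith(b"\x7fELF"):     # ELF magic -> shared object (or executable)
        return "ELF shared object/executable"
    return "unknown"

# Example usage (path is hypothetical, check your image's /usr/lib):
# print(library_kind("/usr/lib/libtensorflow-lite.so"))
```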
Please go to the local.conf file at:
cd $HOME/<your image folder>/<your build directory>/conf/
cat local.conf
There you can add the following lines:
# -- TensorFlow Lite stuff --
IMAGE_INSTALL_append += " tensorflow-lite"
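To sanity-check that the append above actually made it into your configuration, here is a small illustrative Python helper; it is not part of the Yocto workflow, just a sketch that scans local.conf text for the line:

```python
def has_tflite_append(conf_text: str) -> bool:
    """Return True if the conf text installs tensorflow-lite via IMAGE_INSTALL_append."""
    for line in conf_text.splitlines():
        line = line.strip()
        if line.startswith("IMAGE_INSTALL_append") and "tensorflow-lite" in line:
            return True
    return False

# Simulated local.conf content, matching the lines added above:
conf = '# -- TensorFlow Lite stuff --\nIMAGE_INSTALL_append += " tensorflow-lite"\n'
print(has_tflite_append(conf))  # True
```

On a real build you would read the file first, e.g. `has_tflite_append(open("local.conf").read())`.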
Enable the NNAPI and XNNPACK delegates in the recipe
The TensorFlow Lite recipe is located at:
cd <your image folder>/sources/meta-imx/meta-ml/recipes-libraries/tensorflow-lite
Ensure that line 31 of the recipe looks like the one shown below:
EXTRA_OECMAKE += "-DTFLITE_ENABLE_XNNPACK=on -DTFLITE_ENABLE_RUY=on -DTFLITE_ENABLE_NNAPI=on ${EXTRA_OECMAKE_MX8} -DTFLITE_BUILD_EVALTOOLS=on -DTFLITE_BUILD_SHARED_LIB=on ${S}/tensorflow/lite/"
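That line packs several CMake options together and can be hard to eyeball. As an illustration only, here is a small Python helper that parses the `-D<NAME>=<value>` pairs out of such a string, so you can confirm the shared-library and delegate options are all set:

```python
def cmake_flags(extra_oecmake: str) -> dict:
    """Parse -D<NAME>=<value> pairs out of an EXTRA_OECMAKE-style string."""
    flags = {}
    for tok in extra_oecmake.split():
        if tok.startswith("-D") and "=" in tok:
            name, _, value = tok[2:].partition("=")
            flags[name] = value
    return flags

# The option string from the recipe line above (BitBake variable expansions
# like ${EXTRA_OECMAKE_MX8} and the ${S} source path are skipped):
line = ("-DTFLITE_ENABLE_XNNPACK=on -DTFLITE_ENABLE_RUY=on "
        "-DTFLITE_ENABLE_NNAPI=on -DTFLITE_BUILD_EVALTOOLS=on "
        "-DTFLITE_BUILD_SHARED_LIB=on")
flags = cmake_flags(line)
print(flags["TFLITE_BUILD_SHARED_LIB"])  # on
```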
Building the image with the changes in local.conf
cd $HOME/<your image folder>
source setup-environment <your build directory>
bitbake imx-image-core
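Once bitbake finishes, the image manifest lists every installed package, so you can confirm tensorflow-lite made it into the image. A sketch follows; the manifest content below is simulated, and on a real build the manifest lives under tmp/deploy/images/<machine>/ in your build directory (where <machine> is your MACHINE setting):

```python
def manifest_has_package(manifest_text: str, package: str) -> bool:
    """Return True if a Yocto image manifest lists the given package.

    Manifest lines have the form: <package> <arch> <version>
    """
    for line in manifest_text.splitlines():
        fields = line.split()
        if fields and fields[0] == package:
            return True
    return False

# Simulated manifest content (package versions here are placeholders):
sample = "gstreamer1.0 aarch64 1.18.0\ntensorflow-lite aarch64 2.x.y\n"
print(manifest_has_package(sample, "tensorflow-lite"))  # True
```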
Testing the TensorFlow Lite installation
For this section, please go through the Cross-compiling apps for GStreamer, TensorFlow Lite, and OpenCV guide.