How to Install TensorFlow Lite

This wiki explains how to configure, build, and benchmark a TensorFlow Lite model.
 
Building TensorFlow Lite from Source

Tested Operating Systems

These instructions have been tested on the following distributions (a quick way to check your own release is shown after the list):

  • Ubuntu Xenial 16.04
  • Ubuntu Bionic 18.04
  • elementary OS 5.1 (Hera)
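
If you want to confirm which release you are running before you start, the command below is a quick way to do it (this assumes the lsb-release package is installed, which is the default on Ubuntu):

# Print the distribution description, e.g. "Ubuntu 18.04.6 LTS"
lsb_release -ds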

TensorFlow Lite dependencies

sudo apt install git wget unzip python3 curl build-essential zlib1g-dev python3-numpy python3-six

# Some TF tools assume python is pointing to python3
# This uses the update-alternatives utility to do so. Caution,
# verify you don't have an alternative already configured
# by running "update-alternatives --config python"
#
sudo update-alternatives --install /usr/bin/python python /usr/bin/python3 1
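
After installing the packages it is worth double-checking the result. The two commands below are just a sanity check (not part of the original steps): they confirm that python now resolves to Python 3 and that the NumPy and six modules the build expects are importable.

# Confirm "python" now points to Python 3
python --version

# Confirm the Python modules installed above can be imported
python -c "import numpy, six; print(numpy.__version__, six.__version__)"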

Configuration Variables

TF_VERSION=v2.0.1
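
TF_VERSION is only used as the tag to check out in the next section; v2.0.1 is the stable release these steps were written against. If you want to build a different release, you can list the tags published on the TensorFlow repository without cloning it first (the grep pattern below is just an example filter for 2.x tags):

# List 2.x release tags available upstream (no clone required)
git ls-remote --tags https://github.com/tensorflow/tensorflow.git | grep -o 'refs/tags/v2\.[0-9.]*$'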

TensorFlow Lite Build Steps

git clone https://github.com/tensorflow/tensorflow.git
cd tensorflow

# Move to a stable version
git checkout $TF_VERSION

# Important - download a valid bazel version
BAZEL=`awk -F"'" '/_TF_MAX_BAZEL_VERSION = /{print $2}' configure.py`
mkdir bazel
cd bazel
wget -c https://github.com/bazelbuild/bazel/releases/download/${BAZEL}/bazel-${BAZEL}-installer-linux-x86_64.sh
chmod +x bazel-${BAZEL}-installer-linux-x86_64.sh
sudo ./bazel-${BAZEL}-installer-linux-x86_64.sh
cd ..

# Configure tensorflow - I usually accept all defaults
./configure

# Build the library
cd tensorflow/lite/tools/make
./download_dependencies.sh
./build_lib.sh
cd ../../../..

# The library will be built here
realpath tensorflow/lite/tools/make/gen/linux_x86_64/lib/libtensorflow-lite.a
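
With libtensorflow-lite.a in place you can link it into your own C++ application. The command below is only a sketch: my_app.cc is a hypothetical source file, and the include paths assume the default locations populated by download_dependencies.sh (in particular the bundled FlatBuffers headers), so adjust them if your tree differs.

# Run from the top of the tensorflow checkout; my_app.cc is a placeholder
g++ -std=c++11 my_app.cc \
  -I. \
  -Itensorflow/lite/tools/make/downloads/flatbuffers/include \
  tensorflow/lite/tools/make/gen/linux_x86_64/lib/libtensorflow-lite.a \
  -lpthread -ldl \
  -o my_app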

Benchmarking a TensorFlow Lite Model

Make sure you've run the previous steps already.

# Configure tensorflow (if you haven't already) - I usually accept all defaults
./configure

# Finally build the benchmarking tool
bazel build -c opt tensorflow/lite/tools/benchmark:benchmark_model

# Run the benchmark
bazel-bin/tensorflow/lite/tools/benchmark/benchmark_model \
  --graph=mobilenet_quant_v1_224.tflite \
  --num_threads=4
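
mobilenet_quant_v1_224.tflite is only an example; --graph can point to any .tflite model on disk. The benchmark tool also accepts flags that control how many inferences are timed. The two extra flags below are taken from the benchmark tool's documentation for this generation of TensorFlow, so verify them against your build (for instance by running the binary with --help) if they are rejected:

# A longer measurement: 5 warm-up inferences followed by 100 timed runs
bazel-bin/tensorflow/lite/tools/benchmark/benchmark_model \
  --graph=mobilenet_quant_v1_224.tflite \
  --num_threads=4 \
  --warmup_runs=5 \
  --num_runs=100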

Contact Us



For direct inquiries, please refer to the contact information available on our Contact page. Alternatively, you may complete and submit the form provided at the same link. We will respond to your request at our earliest opportunity.


Links to RidgeRun Resources and RidgeRun Artificial Intelligence Solutions can be found in the footer below.