
R2Inference/Supported backends/TensorFlow-Lite

</pre>


 
===Nvidia Jetson (TX1, TX2, Xavier, Nano)===
You can install the TensorFlow Lite C/C++ API on Jetson devices by following these steps:
*  Build and install TensorFlow Lite

Download the TensorFlow source code:
 
<pre>
git clone https://github.com/tensorflow/tensorflow
cd tensorflow/tensorflow/lite/tools/make
</pre>
 
Download dependencies:
<pre>
./download_dependencies.sh
</pre>


Build:
<pre>
./build_aarch64_lib.sh
</pre>
Copy the static library to the system library path (root privileges are required to write to <code>/usr/lib</code>):
<pre>
sudo cp gen/aarch64_armv8-a/lib/libtensorflow-lite.a /usr/lib/aarch64-linux-gnu/
</pre>
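Once the library is installed, you can sanity-check the setup with a minimal C++ program that loads a model and allocates its tensors. This is a sketch, not part of the official steps: the model path <code>model.tflite</code> is a placeholder for any model file you have, and the include paths in the compile command assume the source tree and dependency layout produced by the <code>make</code> build above.

<pre>
// minimal_tflite.cc -- minimal TensorFlow Lite load/allocate check.
// Compile (adjust paths to your tensorflow checkout):
//   g++ -std=c++11 minimal_tflite.cc \
//       -I/path/to/tensorflow \
//       -I/path/to/tensorflow/tensorflow/lite/tools/make/downloads/flatbuffers/include \
//       -ltensorflow-lite -lpthread -ldl -o minimal_tflite
#include <cstdio>
#include <memory>

#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model.h"

int main() {
  // "model.tflite" is a placeholder; use any .tflite model you have.
  auto model = tflite::FlatBufferModel::BuildFromFile("model.tflite");
  if (!model) {
    std::fprintf(stderr, "Failed to load model\n");
    return 1;
  }

  // Build an interpreter with the built-in op resolver.
  tflite::ops::builtin::BuiltinOpResolver resolver;
  std::unique_ptr<tflite::Interpreter> interpreter;
  tflite::InterpreterBuilder(*model, resolver)(&interpreter);
  if (!interpreter || interpreter->AllocateTensors() != kTfLiteOk) {
    std::fprintf(stderr, "Failed to build interpreter\n");
    return 1;
  }

  std::printf("Model loaded: %zu input tensor(s)\n", interpreter->inputs().size());
  return 0;
}
</pre>

If this compiles and links cleanly against <code>libtensorflow-lite.a</code>, the installation is usable from C++.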
===i.MX 8===
For i.MX8 boards download the prebuilt binary for free from [https://shop.ridgerun.com/search?q=TensorFlow-CPU+Binaries+for+i.MX8 our shop]. Then run the following: