Building the GstInference plugin
This page provides a guide to build and install GstInference.
Dependencies
System Dependencies
GstInference has the following system dependencies:
- pkg-config
- gtk-doc-tools
- libgstreamer1.0-dev
- libgstreamer-plugins-base1.0-dev
- gstreamer1.0-tools
- gstreamer1.0-plugins-good
- gstreamer1.0-libav
- OpenCV 3.3.1 (see note below)
Also, GstInference makes use of the Meson build system.
Note: OpenCV is only needed if you wish to build the overlay elements.
Linux
On Debian-based systems, you can install the dependencies with the following command:
sudo apt-get install -y python3 python3-pip python3-setuptools python3-wheel ninja-build pkg-config gtk-doc-tools libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev gstreamer1.0-tools gstreamer1.0-plugins-good gstreamer1.0-libav libopencv-dev
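To confirm the installed OpenCV development package, you can query its version through pkg-config. This is a suggested check rather than part of the official instructions; on newer distributions the module may be registered as opencv4 instead of opencv:
pkg-config --modversion opencv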
Then, use pip3 to install the latest version of Meson directly from its repository:
sudo -H pip3 install git+https://github.com/mesonbuild/meson.git
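You can verify that both build tools are available by printing their versions:
meson --version
ninja --version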
R2Inference
To build GstInference you first need to install R2Inference.
To install R2Inference, please check the R2Inference Build Instructions.
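For reference, R2Inference uses the same Meson workflow as GstInference; a typical build is sketched below, assuming the RidgeRun/r2inference repository and default options. The exact Meson flags depend on which backends you enable, so treat this as an outline and consult the R2Inference Build Instructions for the authoritative steps:
git clone https://github.com/RidgeRun/r2inference.git
cd r2inference
meson build --prefix /usr
ninja -C build
sudo ninja -C build install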
Build Plugin
Linux
Note: If you are using the Edge TPU backend with a Google Coral, the /usr/local/lib directory is not included in the default linking path, so it needs to be added before building:
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/lib/
export LD_LIBRARY_PATH=/usr/local/lib/x86_64-linux-gnu:$LD_LIBRARY_PATH
Then clone, build, test, and install the plugin:
git clone https://github.com/RidgeRun/gst-inference.git
cd gst-inference
meson build --prefix /usr
ninja -C build
ninja -C build test
sudo ninja -C build install
Note: If you are building GstInference on the Coral Dev Kit, consider using ninja -C build -j 1 if you run into problems related to memory consumption.
Yocto
GstInference is available in RidgeRun's meta-layer; please check our recipes here. Currently, only i.MX8 platforms are supported with Yocto.
First, create a Yocto environment for i.MX8. The dedicated i.MX8 Yocto guide has more information on setting up the environment.
In your Yocto sources folder, run the following command:
git clone https://github.com/RidgeRun/meta-ridgerun.git
Enable RidgeRun's meta-layer in your conf/bblayers.conf file by adding the following line:
${BSPDIR}/sources/meta-ridgerun \
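For context, this line typically goes inside the BBLAYERS variable; assuming a standard conf/bblayers.conf layout (the exact structure varies by BSP), the result looks roughly like this:
BBLAYERS += " \
  ${BSPDIR}/sources/meta-ridgerun \
"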
Enable Prebuilt-TensorFlow, R2Inference, and GstInference in your conf/local.conf:
IMAGE_INSTALL_append = "prebuilt-tensorflow r2inference gst-inference"
Finally, build your desired image. The previous steps added GstInference and its requirements to your Yocto image.
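For example, assuming your BSP provides an image recipe such as core-image-base (substitute the image you normally build):
bitbake core-image-base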
Verify Installation
The plugin installation can be verified by running:
gst-inspect-1.0 inference
In case of any error, please contact support@ridgerun.com and include the output of the following command, modifying the path according to your architecture:
gst-inspect-1.0 /usr/lib/x86_64-linux-gnu/gstreamer-1.0/libgstinference.so
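As an additional sanity check, you can inspect one of the inference elements directly. This assumes the tinyyolov2 element was built, which depends on your GstInference version and configuration:
gst-inspect-1.0 tinyyolov2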