Coral from Google/GstInference/Introduction
Revision as of 17:38, 23 February 2021
Deep Learning (DL) has revolutionized classic computer vision techniques, enabling even more intelligent and autonomous systems. A multimedia framework such as GStreamer eases the software development burden of complex embedded visual Deep Learning applications: the open-source GStreamer audio/video streaming framework separates the complexities of handling streaming video from the inference models that process the individual frames. GstInference is an open-source GStreamer project sponsored by RidgeRun that allows easy integration of deep learning networks into your video streaming application.
GstInference and R2Inference support the Google Coral. To install them, follow these guides:
- Getting R2Inference code
- Building R2Inference: The Google Coral is optimized for TensorFlow Lite. To install TFLite for ARM64, follow this guide.
- Getting GstInference code
- Building GstInference
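As a rough sketch, the installation steps above boil down to cloning and building the two projects in order (R2Inference first, since GstInference depends on it). The repository URLs and build commands below are assumptions based on RidgeRun's public GitHub projects, not verbatim from the guides above; the build system and configure flags may differ per version, so consult the linked guides for authoritative instructions.

```shell
# Sketch only: URLs and flags are assumptions; follow the guides above.
# R2Inference with the TensorFlow Lite backend (required for the Coral):
git clone https://github.com/RidgeRun/r2inference.git
cd r2inference
./autogen.sh --enable-tflite   # flag name is an assumption; check the guide
make
sudo make install
cd ..

# GstInference (depends on an installed R2Inference):
git clone https://github.com/RidgeRun/gst-inference.git
cd gst-inference
./autogen.sh
make
sudo make install
```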
After the installation is complete, you can generate GStreamer pipelines for Coral using different GstInference elements in this tool.
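For illustration, a GstInference detection pipeline on the Coral might look like the following sketch. The element name (`tinyyolov2`), pad names (`sink_model`, `sink_bypass`, `src_bypass`), backend value, and model file are assumptions modeled on GstInference's documented examples, not verbatim from the tool above; use the pipeline generator for working pipelines for your setup.

```shell
# Hypothetical sketch: element, pad, and property names follow GstInference's
# examples but may differ per version; the model file is a placeholder.
gst-launch-1.0 \
  v4l2src device=/dev/video0 ! videoconvert ! tee name=t \
  t. ! queue ! videoscale ! net.sink_model \
  t. ! queue ! net.sink_bypass \
  tinyyolov2 name=net model-location=graph_tinyyolov2.tflite backend=tflite \
  net.src_bypass ! inferenceoverlay ! videoconvert ! autovideosink
```

The `tee` feeds the camera stream to the inference element twice: a scaled copy for the model (`sink_model`) and an untouched copy for display (`sink_bypass`), so the overlay is drawn on full-resolution video.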