Coral from Google/GstInference/Introduction

From RidgeRun Developer Wiki
<noinclude>
{{Coral from Google/Head|next=GstInference/Why_use_GstInference?|previous=GstInference/Introduction|metakeywords=}}
</noinclude>


== GstInference Description ==


Deep Learning (DL) has revolutionized classic computer vision techniques, enabling even more intelligent and autonomous systems. To ease the software development burden of complex embedded visual Deep Learning applications, a multimedia framework such as GStreamer can be used to simplify the task. The open-source GStreamer audio/video streaming framework is a good choice because it separates the complexities of handling streaming video from the inference models that process the individual frames. GstInference is an open-source GStreamer project sponsored by RidgeRun that allows easy integration of deep learning networks into your video streaming application.
<br>
[[File:Coral example.png|thumb|center|600px|Example of GstInference running on the Coral. The model used was TinyYolo V2. Video taken from https://pixabay.com/videos/road-autobahn-motorway-highway-11018/]]
<br>
[[GstInference|GstInference]] and [[R2Inference|R2Inference]] are supported on Coral. To install them, follow these guides:


* [[R2Inference/Getting_started/Getting_the_code|Getting R2Inference code]]
* [[R2Inference/Getting_started/Building_the_library|Building R2Inference]]: Coral is optimized for Tensorflow Lite. For installation, please follow the [[R2Inference/Supported_backends/TensorFlow-Lite#Cross-compile_for_ARM64 | Guide to install TFLite for ARM64]].
* [[GstInference/Getting_started/Getting_the_code|Getting GstInference code]]
* [[GstInference/Getting_started/Building_the_plugin|Building GstInference]]
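As a rough sketch of the steps the guides above walk through, the build of both libraries looks roughly like the following. This assumes a Meson-based build on the target; the repository URLs are RidgeRun's public GitHub mirrors, and the `-Denable-tflite` option name is an assumption — consult the linked guides for the exact configuration, especially when cross-compiling TensorFlow Lite for ARM64.

```shell
# Sketch only: option names and prefixes may differ per release;
# see the Getting_the_code and Building guides for authoritative steps.

# Build and install R2Inference with the TensorFlow Lite backend
git clone https://github.com/RidgeRun/r2inference.git
cd r2inference
meson build -Denable-tflite=true   # assumed option name for the TFLite backend
ninja -C build && sudo ninja -C build install
cd ..

# Build and install the GstInference GStreamer plug-in
git clone https://github.com/RidgeRun/gst-inference.git
cd gst-inference
meson build --prefix /usr
ninja -C build && sudo ninja -C build install
```

After installing, `gst-inspect-1.0 inference` should list the GstInference elements if the plug-in landed in a path GStreamer scans.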


After the installation is completed, you can generate GStreamer pipelines for Coral using the different GstInference elements with this [[GstInference/Example_pipelines_with_hierarchical_metadata#Pipeline_Generator| Pipeline Generator Tool]].
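To give a feel for what such a pipeline looks like, here is a minimal sketch of a TinyYolo V2 detection pipeline like the one in the screenshot above. The model and label file paths are placeholders you must supply, and the exact pad names and properties may vary by GstInference version — use the Pipeline Generator Tool for a pipeline matched to your setup.

```shell
# Placeholders: point these at your TinyYolo V2 TFLite model and labels.
MODEL=graph_tinyyolov2.tflite
LABELS=labels.txt

# The tee feeds one scaled branch to the network's model pad and one
# untouched branch to the bypass pad, so the overlay draws on full-res video.
gst-launch-1.0 v4l2src device=/dev/video0 ! videoconvert ! tee name=t \
  t. ! queue ! videoscale ! net.sink_model \
  t. ! queue ! net.sink_bypass \
  tinyyolov2 name=net model-location=$MODEL labels="$(cat $LABELS)" backend=tflite \
  net.src_bypass ! inferenceoverlay ! videoconvert ! autovideosink
```

The two-branch layout (scaled frames for inference, original frames for display) is the pattern used throughout the GstInference example pipelines.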




<noinclude>
{{Coral from Google/Foot|GstInference/Introduction|GstInference/Why_use_GstInference?}}
</noinclude>

Latest revision as of 19:13, 11 May 2023