Information for "GstInference/Supported backends/Tensorflow-Lite"

Basic information

Display title: GstInference and TensorFlow-Lite backend
Default sort key: GstInference/Supported backends/Tensorflow-Lite
Page length (in bytes): 3,392
Namespace ID: 0
Page ID: 3280
Page content language: en - English
Page content model: wikitext
Indexing by robots: Allowed
Number of redirects to this page: 1
Counted as a content page: Yes
Number of subpages of this page: 0 (0 redirects; 0 non-redirects)

Page protection

Edit: Allow all users (infinite)
Move: Allow all users (infinite)

Edit history

Page creator: Spalli
Date of page creation: 12:24, 19 April 2023
Latest editor: Spalli
Date of latest edit: 13:25, 29 November 2024
Total number of edits: 2
Total number of distinct authors: 1
Recent number of edits (within past 90 days): 0
Recent number of distinct authors: 0

Page properties

Transcluded templates (12)

Templates used on this page:

SEO properties

Page title (title)
This attribute controls the content of the <title> element.
Content: GstInference - Supported backends - Tensorflow-Lite

Title mode (title_mode)
Content: replace

Article description (description)
This attribute controls the content of the description and og:description elements.
Content: GstInference is an open-source project from RidgeRun that provides a framework for integrating deep learning inference into GStreamer.
Information from Extension:WikiSEO