GstInference/Benchmarks
Latest revision as of 20:39, 4 September 2024
GstInference Benchmarks - Introduction
This wiki summarizes a series of benchmarks on different hardware platforms based on the run_benchmark.sh bash script, which can be found in the official GstInference repository. The script launches the following GStreamer pipeline once per model:
#Script to run each model
run_all_models(){
    model_array=(inceptionv1 inceptionv2 inceptionv3 inceptionv4 tinyyolov2 tinyyolov3)
    model_upper_array=(InceptionV1 InceptionV2 InceptionV3 InceptionV4 TinyYoloV2 TinyYoloV3)
    input_array=(input input input input input/Placeholder inputs)
    output_array=(InceptionV1/Logits/Predictions/Reshape_1 Softmax InceptionV3/Predictions/Reshape_1 InceptionV4/Logits/Predictions add_8 output_boxes)

    mkdir -p logs/
    rm -f logs/*

    for ((i=0;i<${#model_array[@]};++i)); do
        echo Perf ${model_array[i]}
        gst-launch-1.0 \
            filesrc location=$VIDEO_PATH num-buffers=600 ! decodebin ! videoconvert ! \
            perf print-arm-load=true name=inputperf ! tee name=t t. ! videoscale ! queue ! net.sink_model t. ! queue ! net.sink_bypass \
            ${model_array[i]} backend=$BACKEND name=net backend::input-layer=${input_array[i]} backend::output-layer=${output_array[i]} \
            model-location="${MODELS_PATH}${model_upper_array[i]}_${INTERNAL_PATH}/graph_${model_array[i]}${EXTENSION}" \
            net.src_bypass ! perf print-arm-load=true name=outputperf ! videoconvert ! fakesink sync=false > logs/${model_array[i]}.log
    done
}
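The script writes one log file per model under logs/. As a rough post-processing sketch, the last reported mean FPS can be pulled out of each log as shown below. Note that the log line format used here is an assumption for illustration; the exact field names printed by the perf element may differ between gst-perf versions, so adjust the pattern to match your output.

```shell
# Summarize the benchmark logs produced by run_all_models.
# ASSUMPTION: each perf line contains a "mean_fps: <number>" field;
# adjust the grep pattern to match your gst-perf version's output.
summarize_fps() {
  for log in logs/*.log; do
    # Take the last mean_fps value reported in the log.
    fps=$(grep -o 'mean_fps: [0-9.]*' "$log" | tail -n 1 | awk '{print $2}')
    echo "$(basename "$log" .log): ${fps:-N/A} fps"
  done
}

# Illustrative sample log so the helper can be exercised without a benchmark run:
mkdir -p logs
printf 'INFO: perf; timestamp: 0:00:10; mean_fps: 41.60\n' > logs/inceptionv1.log

summarize_fps
```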
Test benchmark video
The following video was used to perform the benchmark tests.
To download the video, right-click on it, select 'Save video as', and save it to your computer.
x86
The Desktop PC had the following specifications:
- Intel(R) Core(TM) i7-7700HQ CPU @ 2.80GHz
- 12 GB RAM
- Linux 4.15.0-106-generic x86_64 (Ubuntu 16.04)
- GStreamer 1.8.3
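For reproducibility, the same platform details can be collected on any Linux machine with standard tools. This is a minimal sketch; the GStreamer version is only queried if gst-launch-1.0 is actually installed.

```shell
# Collect the platform details listed above (CPU model, RAM, kernel, GStreamer).
echo "CPU:    $(lscpu | grep 'Model name' | sed 's/.*: *//')"
echo "Memory: $(free -h | awk '/^Mem:/{print $2}')"
echo "Kernel: $(uname -srm)"
# GStreamer version, if present on the system:
command -v gst-launch-1.0 >/dev/null && gst-launch-1.0 --version | head -n 1 \
  || echo "GStreamer: not installed"
```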
FPS Measurements
CPU Load Measurements
Jetson AGX Xavier
The Jetson Xavier power modes used were 2 and 6 (for more information, see NVIDIA's Supported Modes and Power Efficiency documentation).
- View the current power mode:
$ sudo /usr/sbin/nvpmodel -q
- Change the current power mode:
$ sudo /usr/sbin/nvpmodel -m x
Where x is the power mode ID (e.g. 0, 1, 2, 3, 4, 5, or 6).
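Since an invalid mode ID is easy to mistype, a small guard around the mode switch can help. The wrapper below is a hypothetical helper, not part of the benchmark script; it only validates the argument and prints what it would do, with the actual nvpmodel call left as a comment since it requires Jetson hardware and root privileges.

```shell
# Hypothetical wrapper: validate the power mode ID before switching.
set_power_mode() {
  case "$1" in
    [0-6])
      echo "Setting power mode $1"
      # On a real Jetson, you would run here:
      #   sudo /usr/sbin/nvpmodel -m "$1"
      ;;
    *)
      echo "Invalid power mode: $1" >&2
      return 1
      ;;
  esac
}

set_power_mode 2
```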
FPS Measurements
CPU Load Measurements
Jetson TX2
FPS Measurements
CPU Load Measurements
Jetson Nano
FPS Measurements
CPU Load Measurements
Google Coral
The following benchmarks were performed on the Coral Dev Board.
FPS Measurements

Frames per second measured per model and backend on the Coral Dev Board:

Model               TensorFlow Lite (Coral)   TensorFlow Lite EdgeTPU (Coral)
InceptionV1         3.11                      41.60
InceptionV2         2.31                      42.80
InceptionV3         0.90                      15.02
InceptionV4         0                         8.56
MobileNetV2         0                         41.12
MobileNetV2 + SSD   0                         38.64
CPU Load Measurements

CPU load (%) measured per model and backend on the Coral Dev Board:

Model               TensorFlow Lite (Coral)   TensorFlow Lite EdgeTPU (Coral)
InceptionV1         73                        32
InceptionV2         72                        37
InceptionV3         74                        14
InceptionV4         0                         5
MobileNetV2         0                         34
MobileNetV2 + SSD   0                         45