As such, a single C/C++ application may work with a Caffe or TensorFlow model, for example.
This is especially useful for hybrid solutions, where multiple models need to be inferred. R2Inference may be able to execute one model on the DLA and another on the CPU, for instance.
R2Inference is a Coral-compatible project.
[[File:Works with coral v2.png|800px|frameless|center]]
Get started with R2Inference by clicking the button below!