{{#seo:
|title=Tritonserver support for NVIDIA Jetson Platforms
|title_mode=replace
|description={{{description|This RidgeRun developer wiki guide is about Tritonserver: how to run and test the server on NVIDIA JetPack.}}}
}}
{{NVIDIA Pref Partner logo and RR Contact}}
== Introduction to Triton Inference Server ==
If you search the documentation, you will find Docker images for tritonserver on NGC, named <code>nvcr.io/nvidia/tritonserver:<xx.yy>-py3</code>. These images do not work on JetPack because they are built for Windows or x86 Ubuntu hosts, so you need to create your own using the following Dockerfile.
<syntaxhighlight lang="docker">
FROM nvcr.io/nvidia/l4t-ml:r32.6.1-py3
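# The l4t-ml:r32.6.1 base image already bundles CUDA, cuDNN, and TensorRT
# for JetPack 4.6, which Triton needs at runtime.
# Once the Dockerfile is complete, build it on the Jetson board itself so the
# image matches the L4T architecture (the image tag here is only an example):
#   docker build -t tritonserver-jetson:r32.6.1 .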
This is done using the following Kubernetes deployment YAML file:
<syntaxhighlight lang="yaml">
apiVersion: apps/v1
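# Triton's default service ports are 8000 (HTTP), 8001 (gRPC), and
# 8002 (Prometheus metrics); a Deployment for it normally exposes all three
# in the container spec.
# Once the manifest is complete, apply it with: kubectl apply -f <manifest>.yaml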