Inference

The following tutorials will help you learn how to deploy MXNet models for inference applications.

GluonCV Models in a C++ Inference Application
https://gluon-cv.mxnet.io/build/examples_deployment/cpp_inference.html

An example C++ application that runs inference with an exported MXNet GluonCV YOLO model; a sketch of the export step follows below.
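The C++ application loads a symbol JSON file and a parameters file produced from the GluonCV model. A minimal sketch of that export step in Python, assuming the GluonCV model zoo name 'yolo3_darknet53_coco' and the gluoncv.utils.export_block helper:

    import gluoncv as gcv
    from gluoncv.utils import export_block

    # Load a pretrained YOLO detector from the GluonCV model zoo.
    # The exact model name is an assumption; other detection models export the same way.
    net = gcv.model_zoo.get_model('yolo3_darknet53_coco', pretrained=True)

    # Writes yolo3_darknet53_coco-symbol.json and yolo3_darknet53_coco-0000.params.
    # preprocess=True bakes the standard image preprocessing into the graph so the
    # C++ side can feed raw HWC images directly.
    export_block('yolo3_darknet53_coco', net, preprocess=True, layout='HWC')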

Inference with Quantized Models
https://gluon-cv.mxnet.io/build/examples_deployment/int8_inference.html

How to run inference with quantized (int8) GluonCV models on Intel Xeon processors for higher performance; a sketch follows below.
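A minimal sketch of loading a pre-quantized model from the GluonCV model zoo and running a forward pass on CPU. The '_int8' model name is an assumption; check the GluonCV model zoo for the quantized variants actually provided.

    import mxnet as mx
    from gluoncv.model_zoo import get_model

    ctx = mx.cpu()  # int8 kernels target CPUs such as Intel Xeon

    # The model name is an assumption for illustration; see the model zoo
    # for the list of available int8 models.
    net = get_model('resnet50_v1_int8', pretrained=True, ctx=ctx)

    # Dummy preprocessed batch: 1 image, 3 channels, 224x224.
    x = mx.nd.random.uniform(shape=(1, 3, 224, 224), ctx=ctx)
    scores = net(x)
    print(scores.shape)  # (1, 1000) class scores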