
Note

You are reading the documentation for MMClassification 0.x, which will be deprecated at the end of 2022. We recommend you upgrade to MMClassification 1.0 to enjoy fruitful new features and better performance brought by OpenMMLab 2.0. Check the installation tutorial, migration tutorial and changelog for more details.

PyTorch to ONNX (Experimental)

How to convert models from PyTorch to ONNX

Prerequisite

  1. Please refer to the installation guide to install MMClassification.

  2. Install onnx and onnxruntime

pip install onnx onnxruntime==1.5.1

Usage

python tools/deployment/pytorch2onnx.py \
    ${CONFIG_FILE} \
    --checkpoint ${CHECKPOINT_FILE} \
    --output-file ${OUTPUT_FILE} \
    --shape ${IMAGE_SHAPE} \
    --opset-version ${OPSET_VERSION} \
    --dynamic-export \
    --show \
    --simplify \
    --verify

Description of all arguments:

  • config: The path of a model config file.

  • --checkpoint: The path of a model checkpoint file.

  • --output-file: The path of the output ONNX model. If not specified, it will be set to tmp.onnx.

  • --shape: The height and width of the input tensor to the model. If not specified, it will be set to 224 224.

  • --opset-version: The opset version of ONNX. If not specified, it will be set to 11.

  • --dynamic-export: Determines whether to export the ONNX model with dynamic input and output shapes. If not specified, it will be set to False.

  • --show: Determines whether to print the architecture of the exported model. If not specified, it will be set to False.

  • --simplify: Determines whether to simplify the exported ONNX model. If not specified, it will be set to False.

  • --verify: Determines whether to verify the correctness of the exported model. If not specified, it will be set to False.

Example:

python tools/deployment/pytorch2onnx.py \
    configs/resnet/resnet18_8xb16_cifar10.py \
    --checkpoint checkpoints/resnet/resnet18_8xb16_cifar10.pth \
    --output-file checkpoints/resnet/resnet18_8xb16_cifar10.onnx \
    --dynamic-export \
    --show \
    --simplify \
    --verify
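
If you need a fixed input size or a specific opset version instead of a dynamic export, pass --shape and --opset-version explicitly. The command below is only an illustration: the config name is taken from the results table further down, while the checkpoint and output paths are placeholders to adapt to your own setup.

python tools/deployment/pytorch2onnx.py \
    configs/resnet/resnet50_8xb32_in1k.py \
    --checkpoint checkpoints/resnet/resnet50_8xb32_in1k.pth \
    --output-file checkpoints/resnet/resnet50_8xb32_in1k.onnx \
    --shape 224 224 \
    --opset-version 11 \
    --verify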

How to evaluate ONNX models with ONNX Runtime

We provide a tool, tools/deployment/test.py, to evaluate ONNX models with ONNX Runtime or TensorRT.

Prerequisite

  • Install onnx and onnxruntime-gpu

    pip install onnx onnxruntime-gpu
    

Usage

python tools/deployment/test.py \
    ${CONFIG_FILE} \
    ${ONNX_FILE} \
    --backend ${BACKEND} \
    --out ${OUTPUT_FILE} \
    --metrics ${EVALUATION_METRICS} \
    --metric-options ${EVALUATION_OPTIONS} \
    --show \
    --show-dir ${SHOW_DIRECTORY} \
    --cfg-options ${CFG_OPTIONS}

Description of all arguments:

  • config: The path of a model config file.

  • model: The path of an ONNX model file.

  • --backend: The backend for running the model; it should be onnxruntime or tensorrt.

  • --out: The path of the output result file in pickle format.

  • --metrics: Evaluation metrics, which depend on the dataset, e.g., “accuracy”, “precision”, “recall”, “f1_score”, “support” for single-label datasets, and “mAP”, “CP”, “CR”, “CF1”, “OP”, “OR”, “OF1” for multi-label datasets.

  • --show: Determines whether to show classifier outputs. If not specified, it will be set to False.

  • --show-dir: The directory where painted images will be saved.

  • --metric-options: Custom options for evaluation; the key-value pairs in xxx=yyy format will be passed as kwargs to the dataset.evaluate() function.

  • --cfg-options: Override some settings in the used config file; the key-value pairs in xxx=yyy format will be merged into the config.
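
Example (the file names below follow the export example above and are only illustrative; adapt them to your own paths):

python tools/deployment/test.py \
    configs/resnet/resnet18_8xb16_cifar10.py \
    checkpoints/resnet/resnet18_8xb16_cifar10.onnx \
    --backend onnxruntime \
    --out result.pkl \
    --metrics accuracy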

Results and Models

This part uses ImageNet for ONNX Runtime verification. ImageNet has multiple versions; the most commonly used one is ILSVRC 2012.

| Model        | Config                          | Metric    | PyTorch       | ONNXRuntime   | TensorRT-fp32 | TensorRT-fp16 |
|--------------|---------------------------------|-----------|---------------|---------------|---------------|---------------|
| ResNet       | resnet50_8xb32_in1k.py          | Top 1 / 5 | 76.55 / 93.15 | 76.49 / 93.22 | 76.49 / 93.22 | 76.50 / 93.20 |
| ResNeXt      | resnext50-32x4d_8xb32_in1k.py   | Top 1 / 5 | 77.90 / 93.66 | 77.90 / 93.66 | 77.90 / 93.66 | 77.89 / 93.65 |
| SE-ResNet    | seresnet50_8xb32_in1k.py        | Top 1 / 5 | 77.74 / 93.84 | 77.74 / 93.84 | 77.74 / 93.84 | 77.74 / 93.85 |
| ShuffleNetV1 | shufflenet-v1-1x_16xb64_in1k.py | Top 1 / 5 | 68.13 / 87.81 | 68.13 / 87.81 | 68.13 / 87.81 | 68.10 / 87.80 |
| ShuffleNetV2 | shufflenet-v2-1x_16xb64_in1k.py | Top 1 / 5 | 69.55 / 88.92 | 69.55 / 88.92 | 69.55 / 88.92 | 69.55 / 88.92 |
| MobileNetV2  | mobilenet-v2_8xb32_in1k.py      | Top 1 / 5 | 71.86 / 90.42 | 71.86 / 90.42 | 71.86 / 90.42 | 71.88 / 90.40 |

List of supported models exportable to ONNX

The table below lists the models that are guaranteed to be exportable to ONNX and runnable in ONNX Runtime.

| Model        | Config                          | Batch Inference | Dynamic Shape | Note |
|--------------|---------------------------------|-----------------|---------------|------|
| MobileNetV2  | mobilenet-v2_8xb32_in1k.py      | Y               | Y             |      |
| ResNet       | resnet18_8xb16_cifar10.py       | Y               | Y             |      |
| ResNeXt      | resnext50-32x4d_8xb32_in1k.py   | Y               | Y             |      |
| SE-ResNet    | seresnet50_8xb32_in1k.py        | Y               | Y             |      |
| ShuffleNetV1 | shufflenet-v1-1x_16xb64_in1k.py | Y               | Y             |      |
| ShuffleNetV2 | shufflenet-v2-1x_16xb64_in1k.py | Y               | Y             |      |

Notes:

  • All models above are tested with PyTorch==1.6.0

Reminders

  • If you meet any problem with the listed models above, please create an issue and it will be taken care of soon. For models not included in the list, please dig a little deeper, debug the export, and try to solve the issues by yourself.

FAQs

  • None
