ONNXMLTools enables conversion of models to ONNX

Introduction

ONNXMLTools enables you to convert models from different machine learning toolkits into ONNX. Currently the following toolkits are supported: scikit-learn, Core ML, Keras, LightGBM, SparkML, XGBoost, H2O, CatBoost, and LibSVM (see the Dependencies section below).

PyTorch has its own built-in ONNX exporter; check here for details.

Install

You can install the latest release of ONNXMLTools from PyPI:

pip install onnxmltools

or install from source:

pip install git+https://github.com/microsoft/onnxconverter-common
pip install git+https://github.com/onnx/onnxmltools

If you choose to install onnxmltools from its source code, you must set the environment variable ONNX_ML=1 before installing the onnx package.
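
For example, in a POSIX shell a source installation might look like the following (the exact syntax for setting an environment variable depends on your shell):

export ONNX_ML=1
pip install git+https://github.com/microsoft/onnxconverter-common
pip install git+https://github.com/onnx/onnxmltools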

Dependencies

This package relies on ONNX, NumPy, and ProtoBuf. If you are converting a model from scikit-learn, Core ML, Keras, LightGBM, SparkML, XGBoost, H2O, CatBoost or LibSVM, you will need an environment with the respective package installed from the list below:

  1. scikit-learn
  2. CoreMLTools (version 3.1 or lower)
  3. Keras (version 2.0.8 or higher) with the corresponding Tensorflow version
  4. LightGBM
  5. SparkML
  6. XGBoost
  7. libsvm
  8. H2O
  9. CatBoost

ONNXMLTools is tested with Python 3.7+.

Examples

If you want the converted ONNX model to be compatible with a certain ONNX version, please specify the target_opset parameter when invoking the convert function; the Keras conversion example below demonstrates this. You can find the mapping from ONNX Operator Sets (referred to as opsets) to ONNX releases in the versioning documentation.

Keras to ONNX Conversion

Next, we show an example of converting a Keras model into an ONNX model with target_opset=7, which corresponds to ONNX release version 1.2.

import onnxmltools
from keras.layers import Input, Dense, Add
from keras.models import Model

# N: batch size, C: sub-model input dimension, D: final model's input dimension
N, C, D = 2, 3, 3

# Define a sub-model; it will become part of our final model
sub_input1 = Input(shape=(C,))
sub_mapped1 = Dense(D)(sub_input1)
sub_model1 = Model(inputs=sub_input1, outputs=sub_mapped1)

# Define another sub-model; it will also become part of our final model
sub_input2 = Input(shape=(C,))
sub_mapped2 = Dense(D)(sub_input2)
sub_model2 = Model(inputs=sub_input2, outputs=sub_mapped2)

# Define a model built upon the previous two sub-models
input1 = Input(shape=(D,))
input2 = Input(shape=(D,))
mapped1_2 = sub_model1(input1)
mapped2_2 = sub_model2(input2)
sub_sum = Add()([mapped1_2, mapped2_2])
keras_model = Model(inputs=[input1, input2], outputs=sub_sum)

# Convert it! The target_opset parameter is optional.
onnx_model = onnxmltools.convert_keras(keras_model, target_opset=7)
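
As in the other examples, the converted model can then be serialized to disk; the file name here is only illustrative:

# Save as protobuf
onnxmltools.utils.save_model(onnx_model, 'keras_example.onnx')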

CoreML to ONNX Conversion

Here is a simple code snippet to convert a Core ML model into an ONNX model.

import onnxmltools
import coremltools

# Load a Core ML model
coreml_model = coremltools.utils.load_spec('example.mlmodel')

# Convert the Core ML model into ONNX
onnx_model = onnxmltools.convert_coreml(coreml_model, 'Example Model')

# Save as protobuf
onnxmltools.utils.save_model(onnx_model, 'example.onnx')

H2O to ONNX Conversion

Below is a code snippet to convert an H2O MOJO model into an ONNX model. The only prerequisite is to have a MOJO model saved on the local file system.

import onnxmltools

# Convert the H2O MOJO model into ONNX
onnx_model = onnxmltools.convert_h2o('/path/to/h2o/gbm_mojo.zip')

# Save as protobuf
onnxmltools.utils.save_model(onnx_model, 'h2o_gbm.onnx')
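
LightGBM to ONNX Conversion

LightGBM is also among the supported toolkits. The snippet below is a minimal sketch rather than an official example: it assumes the onnxmltools.convert_lightgbm entry point with an initial_types argument, and it trains a throwaway model on synthetic data purely for illustration.

import numpy as np
from lightgbm import LGBMClassifier
import onnxmltools
from onnxmltools.convert.common.data_types import FloatTensorType

# Train a small LightGBM classifier on synthetic data (illustration only)
X = np.random.rand(100, 4).astype(np.float32)
y = (X[:, 0] > 0.5).astype(np.int64)
lgbm_model = LGBMClassifier(n_estimators=10).fit(X, y)

# Declare the name and shape of the model input, then convert
initial_types = [('input', FloatTensorType([None, 4]))]
onnx_model = onnxmltools.convert_lightgbm(lgbm_model, initial_types=initial_types)

# Save as protobuf
onnxmltools.utils.save_model(onnx_model, 'lgbm_example.onnx')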

Testing model converters

onnxmltools converts models into the ONNX format, which can then be used to compute predictions with the backend of your choice.
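
For example, assuming a converted model has been saved as 'example.onnx' (the file name, input name, and input shape below are illustrative), predictions could be computed with onnxruntime:

import numpy as np
import onnxruntime as rt

# Load the converted model and run it on a dummy input
sess = rt.InferenceSession('example.onnx', providers=['CPUExecutionProvider'])
input_name = sess.get_inputs()[0].name
dummy_input = np.random.rand(1, 3).astype(np.float32)
predictions = sess.run(None, {input_name: dummy_input})
print(predictions)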

Checking the operator set version of your converted ONNX model

You can check the operator set of your converted ONNX model using Netron, a viewer for Neural Network models. Alternatively, you could identify your converted model's opset version through the following line of code.

opset_version = onnx_model.opset_import[0].version
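
If the model has already been saved to disk, the same information can be read back with the onnx package; the file name here is illustrative:

import onnx

# Load a previously saved model and print the opset version for each domain
onnx_model = onnx.load('example.onnx')
for opset in onnx_model.opset_import:
    print(opset.domain or 'ai.onnx', opset.version)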

If the result from checking your ONNX model's opset is smaller than the target_opset number you specified in the onnxmltools.convert function, be assured that this is likely intended behavior. The ONNXMLTools converter works by converting each operator to the ONNX format individually and finding the corresponding opset version that it was most recently updated in. Once all of the operators are converted, the resultant ONNX model has the maximal opset version of all of its operators.

To illustrate this concretely, let's consider a model with two operators, Abs and Add. As of December 2018, Abs was most recently updated in opset 6, and Add was most recently updated in opset 7. Therefore, the converted ONNX model's opset will always be 7, even if you request target_opset=8. The converter behavior was defined this way to ensure backwards compatibility.

Documentation for the ONNX Model format and more examples for converting models from different frameworks can be found in the ONNX tutorials repository.

Test all existing converters

All converter unit tests can generate the original model and the converted model so that they can be automatically checked with onnxruntime or onnxruntime-gpu. The unit tests are standard Python unit tests and can be run from the pytest command line, for example:

python -m pytest --ignore .\tests\

Running the tests requires onnxruntime and numpy for most models, pandas for transforms related to text features, and scipy for sparse features. One test also requires keras to exercise a custom operator. In addition, scikit-learn or whichever machine learning library produced the model being converted must be installed.
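
An illustrative way to set up such a test environment (package names are taken from the paragraph above; versions are left unpinned):

pip install pytest onnxruntime numpy pandas scipy scikit-learn keras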

Add a new converter

Once the converter is implemented, a unit test is added to confirm that it works. At the end of the unit test, the function dump_data_and_model or an equivalent function must be called to dump the expected output and the converted model. Once these files are generated, a corresponding test must be added in tests_backend to compute the prediction with the runtime.
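
A rough sketch of such a unit test is shown below; the import location and exact signature of dump_data_and_model are assumptions, as is the choice of a scikit-learn model standing in for the newly converted one:

import unittest
import numpy as np
from sklearn.linear_model import LogisticRegression
import onnxmltools
from onnxmltools.convert.common.data_types import FloatTensorType
from onnxmltools.utils import dump_data_and_model  # assumed import path

class TestMyNewConverter(unittest.TestCase):
    def test_logistic_regression(self):
        # Train a small model on synthetic data (illustration only)
        X = np.random.rand(20, 4).astype(np.float32)
        y = (X[:, 0] > 0.5).astype(np.int64)
        model = LogisticRegression().fit(X, y)
        onnx_model = onnxmltools.convert_sklearn(
            model, 'logreg', initial_types=[('input', FloatTensorType([None, 4]))])
        # Dump the expected outputs and the converted model for the backend tests
        dump_data_and_model(X, model, onnx_model, basename='SklearnLogReg')

if __name__ == '__main__':
    unittest.main()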

License

Apache License v2.0

More Repositories

  1. onnx: Open standard for machine learning interoperability (Python, 17,846 stars)
  2. models: A collection of pre-trained, state-of-the-art models in the ONNX format (Jupyter Notebook, 7,054 stars)
  3. tutorials: Tutorials for creating and using ONNX models (Jupyter Notebook, 3,175 stars)
  4. onnx-tensorrt: ONNX-TensorRT: TensorRT backend for ONNX (C++, 2,922 stars)
  5. tensorflow-onnx: Convert TensorFlow, Keras, Tensorflow.js and Tflite models to ONNX (Jupyter Notebook, 2,303 stars)
  6. onnx-tensorflow: Tensorflow Backend for ONNX (Python, 1,281 stars)
  7. onnx-mlir: Representation and Reference Lowering of ONNX Models in MLIR Compiler Infrastructure (C++, 754 stars)
  8. optimizer: Actively maintained ONNX Optimizer (C++, 579 stars)
  9. sklearn-onnx: Convert scikit-learn models and pipelines to ONNX (Python, 548 stars)
  10. onnx-coreml: ONNX to Core ML Converter (Python, 393 stars)
  11. keras-onnx: Convert tf.keras/Keras models to ONNX (Python, 379 stars)
  12. onnx-caffe2: Caffe2 implementation of Open Neural Network Exchange (ONNX) (Python, 165 stars)
  13. onnx-docker: Dockerfiles and scripts for ONNX container images (Jupyter Notebook, 133 stars)
  14. onnx-mxnet: ONNX model format support for Apache MXNet (Python, 96 stars)
  15. turnkeyml: The AI insights toolchain (Python, 53 stars)
  16. onnx-r: R Interface to Open Neural Network Exchange (ONNX) (R, 44 stars)
  17. backend-scoreboard: Scoreboard for ONNX Backend Compatibility (Python, 24 stars)
  18. onnx.github.io: Code of the official webpage of onnx (HTML, 22 stars)
  19. steering-committee: Notes and artifacts from the ONNX steering committee (Jupyter Notebook, 22 stars)
  20. working-groups: Repository for ONNX working group artifacts (Jupyter Notebook, 20 stars)
  21. sigs: Repository for ONNX SIG artifacts (19 stars)
  22. onnx-xla: XLA integration of Open Neural Network Exchange (ONNX) (C++, 19 stars)
  23. wheel-builder: Utils for building and publishing ONNX wheels (Shell, 7 stars)
  24. onnx-cntk (Python, 6 stars)