  • Stars: 1,281
  • Rank: 36,746 (Top 0.8%)
  • Language: Python
  • License: Other
  • Created: about 7 years ago
  • Updated: 8 months ago

Repository Details

TensorFlow Backend for ONNX

Note: this repo is not actively maintained and will be deprecated. If you are interested in becoming the owner, please contact the ONNX Steering Committee (https://github.com/onnx/steering-committee).

Open Neural Network Exchange (ONNX) is an open standard format for representing machine learning models. ONNX is supported by a community of partners who have implemented it in many frameworks and tools.

TensorFlow Backend for ONNX makes it possible to use ONNX models as input for TensorFlow. The ONNX model is first converted to a TensorFlow model and then delegated for execution on TensorFlow to produce the output.

This is one of two TensorFlow converter projects that serve different purposes in the ONNX community: onnx-tensorflow (this repository) converts ONNX models to TensorFlow, while tensorflow-onnx (tf2onnx) converts TensorFlow models to ONNX.

Converting Models from ONNX to TensorFlow

Use CLI

Command Line Interface Documentation

From ONNX to TensorFlow: onnx-tf convert -i /path/to/input.onnx -o /path/to/output

Convert Programmatically

From ONNX to TensorFlow
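
A minimal sketch of the programmatic path, using onnx_tf.backend.prepare and the export_graph method of the returned representation; "input_path" and "output_path" are placeholders:

import onnx
from onnx_tf.backend import prepare

onnx_model = onnx.load("input_path")  # load the ONNX model ("input_path" is a placeholder)
tf_rep = prepare(onnx_model)  # prepare the TensorFlow backend representation
tf_rep.export_graph("output_path")  # export as a TensorFlow SavedModel ("output_path" is a placeholder)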

Migrating from onnx-tf to tf-onnx

We have joined forces with Microsoft to co-develop the ONNX TensorFlow frontend. Current onnx-tf frontend users should migrate to tf-onnx (https://github.com/onnx/tensorflow-onnx), into which our code has been merged.

ONNX Model Inference with TensorFlow Backend

import onnx
from onnx_tf.backend import prepare

onnx_model = onnx.load("input_path")  # load the ONNX model ("input_path" is a placeholder)
output = prepare(onnx_model).run(input_data)  # run the loaded model on input_data (e.g. a numpy array)
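
For example, a minimal sketch for a single-input model; the model path, input shape, and dtype below are hypothetical and must match the graph inputs of your model:

import numpy as np
import onnx
from onnx_tf.backend import prepare

onnx_model = onnx.load("model.onnx")  # hypothetical model path
input_data = np.random.randn(1, 3, 224, 224).astype(np.float32)  # hypothetical input shape and dtype
output = prepare(onnx_model).run(input_data)  # outputs are returned per graph output
print(output)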

More Tutorials

Running an ONNX model using TensorFlow

Production Installation

ONNX-TF requires ONNX (Open Neural Network Exchange) as an external dependency; for any issues related to ONNX installation, we refer users to the ONNX project repository for documentation and help. Notably, please ensure that protoc is available if you plan to install ONNX via pip.

The specific ONNX release version that the master branch of ONNX-TF supports can be found here. This version requirement is automatically encoded in setup.py, so users need not worry about the ONNX version when installing ONNX-TF.

To install the latest version of ONNX-TF via pip, run pip install onnx-tf.

Because users often have their own preferences for which variant of TensorFlow to install (e.g., a GPU build instead of a CPU build), we do not explicitly require tensorflow in the installation script. It is therefore the user's responsibility to ensure that the proper variant of TensorFlow is available to ONNX-TF. Moreover, we require TensorFlow version == 2.8.0.

Development

Coverage Status

ONNX-TensorFlow Op Coverage Status

API

ONNX-TensorFlow API

Installation

  • Install ONNX master branch from source.
  • Install TensorFlow >= 2.8.0, tensorflow-probability and tensorflow-addons. (Note TensorFlow 1.x is no longer supported)
  • Run git clone https://github.com/onnx/onnx-tensorflow.git && cd onnx-tensorflow.
  • Run pip install -e . to install in editable mode.

Folder Structure

  • onnx_tf: main source code directory.
  • test: test files.

Code Standard

  • Format code
pip install yapf
yapf -rip --style="{based_on_style: google, indent_width: 2}" $FilePath$
  • Install pylint
pip install pylint
wget -O /tmp/pylintrc https://raw.githubusercontent.com/tensorflow/tensorflow/master/tensorflow/tools/ci_build/pylintrc
  • Check format
pylint --rcfile=/tmp/pylintrc myfile.py

Documentation Standard

Google Style Python Docstrings

Testing

Unit Tests

To perform unit tests:

pip install pytest tabulate
python -m unittest discover test

Note: Only the ONNX backend tests found in test_onnx_backend.py require the pytest and tabulate packages.

Testing requires significant hardware resources; nonetheless, we highly recommend that users run the complete test suite before deploying onnx-tf. The complete suite typically takes between 15 and 45 minutes, depending on hardware configuration.

Model Zoo Tests

The tests in test_modelzoo.py verify whether the ONNX Model Zoo models can be successfully validated against the ONNX specification and converted to a TensorFlow representation. Model inferencing on the converted model is not tested currently.

Prerequisites

The model zoo uses Git LFS (Large File Storage) to store ONNX model files. Make sure that Git LFS is installed on your operating system.

Running

By default, the tests assume that the model zoo repository has been cloned into this project directory. The model zoo directory is scanned for ONNX models; for each model found, the test downloads the model, converts it to TensorFlow, generates a test status, and deletes the model. By default, the generated test report is created in the system temporary directory. Run python test/test_modelzoo.py -h for help on command line options.

git clone https://github.com/onnx/models
python test/test_modelzoo.py

Testing all models can take at least an hour to complete, depending on hardware configuration and model download times. If you expect to test some models frequently, we recommend using Git LFS to download those models before running the tests so the large files are cached locally.

Reports

The model zoo tests are run when a code contribution is merged, and the generated test reports are published on the onnx-tensorflow wiki.

More Repositories

  1. onnx: Open standard for machine learning interoperability (Python, 17,846 stars)
  2. models: A collection of pre-trained, state-of-the-art models in the ONNX format (Jupyter Notebook, 7,054 stars)
  3. tutorials: Tutorials for creating and using ONNX models (Jupyter Notebook, 3,175 stars)
  4. onnx-tensorrt: TensorRT backend for ONNX (C++, 2,922 stars)
  5. tensorflow-onnx: Convert TensorFlow, Keras, Tensorflow.js and Tflite models to ONNX (Jupyter Notebook, 2,303 stars)
  6. onnxmltools: ONNXMLTools enables conversion of models to ONNX (Python, 1,002 stars)
  7. onnx-mlir: Representation and Reference Lowering of ONNX Models in MLIR Compiler Infrastructure (C++, 754 stars)
  8. optimizer: Actively maintained ONNX Optimizer (C++, 579 stars)
  9. sklearn-onnx: Convert scikit-learn models and pipelines to ONNX (Python, 548 stars)
  10. onnx-coreml: ONNX to Core ML Converter (Python, 393 stars)
  11. keras-onnx: Convert tf.keras/Keras models to ONNX (Python, 379 stars)
  12. onnx-caffe2: Caffe2 implementation of Open Neural Network Exchange (ONNX) (Python, 165 stars)
  13. onnx-docker: Dockerfiles and scripts for ONNX container images (Jupyter Notebook, 133 stars)
  14. onnx-mxnet: ONNX model format support for Apache MXNet (Python, 96 stars)
  15. turnkeyml: The AI insights toolchain (Python, 53 stars)
  16. onnx-r: R Interface to Open Neural Network Exchange (ONNX) (R, 44 stars)
  17. backend-scoreboard: Scoreboard for ONNX Backend Compatibility (Python, 24 stars)
  18. onnx.github.io: Code of the official webpage of onnx (HTML, 22 stars)
  19. steering-committee: Notes and artifacts from the ONNX steering committee (Jupyter Notebook, 22 stars)
  20. working-groups: Repository for ONNX working group artifacts (Jupyter Notebook, 20 stars)
  21. sigs: Repository for ONNX SIG artifacts (19 stars)
  22. onnx-xla: XLA integration of Open Neural Network Exchange (ONNX) (C++, 19 stars)
  23. wheel-builder: Utils for building and publishing ONNX wheels (Shell, 7 stars)
  24. onnx-cntk (Python, 6 stars)