Commit 92d5624c authored by Michał Karzyński, committed by Scott Cyphers

[ONNX] Add documentation (#2629)

* [ONNX] Add documentation

* Update documentation contributor's instructions
parent 5be2cca9
@@ -16,7 +16,7 @@ workloads on CPU for inference, please refer to the links below.
|----------------------------|----------------------------------------|-----------------------------------
| TensorFlow* 1.12 | [Pip install](https://github.com/NervanaSystems/ngraph-tf#option-1-use-a-pre-built-ngraph-tensorflow-bridge) or [Build from source](https://github.com/NervanaSystems/ngraph-tf#option-2-build-ngraph-bridge-from-source) | 20 [Validated workloads]
| MXNet* 1.3 | [Pip install](https://github.com/NervanaSystems/ngraph-mxnet#Installation) or [Build from source](https://github.com/NervanaSystems/ngraph-mxnet#building-with-ngraph-support)| 18 [Validated workloads]
-| ONNX 1.3 | [Pip install](https://github.com/NervanaSystems/ngraph-onnx#installation) | 14 [Validated workloads]
+| ONNX 1.4 | [Pip install](https://github.com/NervanaSystems/ngraph-onnx#installation) | 17 [Validated workloads]
#### Python wheels for nGraph
.. onnx_integ.rst:
.. ---------------------------------------------------------------------------
.. Copyright 2017 Intel Corporation
.. Licensed under the Apache License, Version 2.0 (the "License");
.. you may not use this file except in compliance with the License.
.. You may obtain a copy of the License at
..
.. http://www.apache.org/licenses/LICENSE-2.0
..
.. Unless required by applicable law or agreed to in writing, software
.. distributed under the License is distributed on an "AS IS" BASIS,
.. WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
.. See the License for the specific language governing permissions and
.. limitations under the License.
.. ---------------------------------------------------------------------------
-ONNX & ONNXIFI
-==============
\ No newline at end of file

ONNX Support
============

nGraph is able to import and execute ONNX models. Models are converted to nGraph's internal representation and wrapped in ``Function`` objects, which can be compiled and executed on any of nGraph's backends.

You can use nGraph's Python API to run an ONNX model directly, or use nGraph as an ONNX backend through the add-on package `nGraph-ONNX <ngraph_onnx_>`_.

.. note:: In order to support ONNX, nGraph must be built with the ``NGRAPH_ONNX_IMPORT_ENABLE`` flag. See `Building nGraph-ONNX <ngraph_onnx_building_>`_ for more information. All nGraph packages published on PyPI are built with ONNX support.
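
If you build nGraph from source instead of installing from PyPI, the flag mentioned above is passed to CMake. A minimal sketch, assuming an out-of-tree build directory (the paths and the ``make`` invocation are illustrative, not the project's canonical build steps):

.. code-block:: console

    $ mkdir build && cd build
    $ cmake .. -DNGRAPH_ONNX_IMPORT_ENABLE=ON
    $ make -j 4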

Installation
------------

To prepare your environment to use nGraph and ONNX, install the Python packages for nGraph, ONNX, and NumPy::

    $ pip install ngraph-core onnx numpy

Importing an ONNX model
-----------------------

You can download models from the `ONNX Model Zoo <onnx_model_zoo_>`_.
For example, ResNet-50::

    $ wget https://s3.amazonaws.com/download.onnx/models/opset_9/resnet50.tar.gz
    $ tar -xzvf resnet50.tar.gz

Use the following Python commands to convert the downloaded model to an nGraph ``Function``:

.. code-block:: python

    # Import ONNX and load an ONNX file from disk
    >>> import onnx
    >>> onnx_protobuf = onnx.load('resnet50/model.onnx')

    # Convert the ONNX model to an nGraph model
    >>> from ngraph.impl.onnx_import import import_onnx_model
    >>> ng_function = import_onnx_model(onnx_protobuf.SerializeToString())

    # The importer returns an nGraph function for the ONNX graph output:
    >>> print(ng_function)
    <Function: 'resnet50' ([1, 1000])>

This creates an nGraph ``Function`` object, which can be used to execute a computation on a chosen backend.

Running a computation
---------------------

You can now create an nGraph ``Runtime`` backend and use it to compile your ``Function`` to a backend-specific ``Computation`` object.
Finally, you can execute your model by calling the created ``Computation`` object with input data.

.. code-block:: python

    # Using an nGraph runtime (CPU backend), create a callable computation object
    >>> import ngraph as ng
    >>> runtime = ng.runtime(backend_name='CPU')
    >>> resnet_on_cpu = runtime.computation(ng_function)
    >>> print(resnet_on_cpu)
    <Computation: resnet50(Parameter_269)>

    # Load an image (or create a mock, as in this example)
    >>> import numpy as np
    >>> picture = np.ones([1, 3, 224, 224], dtype=np.float32)

    # Run computation on the picture:
    >>> resnet_on_cpu(picture)
    [array([[2.16105007e-04, 5.58412226e-04, 9.70510227e-05, 5.76671446e-05,
             7.45318757e-05, 4.80892748e-04, 5.67404088e-04, 9.48728994e-05,
    ...
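
The returned array has shape ``[1, 1000]``, one score per class. As a sketch of typical post-processing (plain NumPy, with made-up scores standing in for the real model output), the highest-scoring classes can be read off with ``argsort``:

.. code-block:: python

    import numpy as np

    # Stand-in for the [1, 1000] score array returned above (values are made up)
    scores = np.zeros((1, 1000), dtype=np.float32)
    scores[0, 42] = 0.9
    scores[0, 7] = 0.5
    scores[0, 500] = 0.1

    # Class indices sorted by score, best first; keep the top five
    top5 = np.argsort(scores[0])[::-1][:5]
    print(top5)  # the three non-zero classes (42, 7, 500) come first

In a real pipeline, the class index would then be mapped to a label via the dataset's label file.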

You can find more information about nGraph and ONNX in the `nGraph-ONNX <ngraph_onnx_>`_ GitHub repository.

.. _ngraph_onnx: https://github.com/NervanaSystems/ngraph-onnx/
.. _ngraph_onnx_building: https://github.com/NervanaSystems/ngraph-onnx/blob/master/BUILDING.md
.. _onnx_model_zoo: https://github.com/onnx/models
@@ -83,8 +83,7 @@ ONNX
====
Additionally, we validated the following workloads are functional through
-`nGraph ONNX importer`_:
+`nGraph ONNX importer`_. ONNX models can be downloaded from the `ONNX Model Zoo`_.
.. csv-table::
:header: "ONNX Workload", "Genre of Deep Learning"
@@ -92,9 +91,11 @@ Additionally, we validated the following workloads are functional through
:escape: ~
ResNet-50, Image recognition
ResNet-50-v2, Image recognition
DenseNet-121, Image recognition
Inception-v1, Image recognition
Inception-v2, Image recognition
Mobilenet, Image recognition
Shufflenet, Image recognition
SqueezeNet, Image recognition
VGG-19, Image recognition
@@ -104,7 +105,8 @@ Additionally, we validated the following workloads are functional through
BVLC AlexNet, Image recognition
BVLC GoogleNet, Image recognition
BVLC CaffeNet, Image recognition
BVLC R-CNN ILSVRC13, Object detection
ArcFace, Face Detection and Recognition
.. important:: Please see Intel's `Optimization Notice`_ for details on disclaimers.
@@ -123,6 +125,7 @@ Additionally, we validated the following workloads are functional through
.. _Optimization Notice: https://software.intel.com/en-us/articles/optimization-notice
.. _nGraph ONNX importer: https://github.com/NervanaSystems/ngraph-onnx/blob/master/README.md
.. _ONNX Model Zoo: https://github.com/onnx/models
.. Notice revision #20110804: Intel's compilers may or may not optimize to the same degree for
non-Intel microprocessors for optimizations that are not unique to Intel microprocessors.
@@ -108,13 +108,13 @@ How to build the documentation
-------------------------------
.. note:: Stuck on how to generate the html? Run these commands; they assume
you start at a command line running within a clone (or a cloned fork) of the
``ngraph`` repo. You do **not** need to run a virtual environment to create
documentation if you don't want; running ``$ make clean`` in the
``doc/sphinx`` folder removes any generated files.
Right now the minimal version of Sphinx needed to build the documentation is
Sphinx v. 1.7.5. This can be installed with :command:`pip3`, either to a virtual
environment, or to your base system if you plan to contribute much core code or
documentation. For C++ API docs that contain inheritance diagrams and collaboration
@@ -127,8 +127,8 @@ To build documentation locally, run:
.. code-block:: console
$ sudo apt-get install python3-sphinx
-$ pip3 install [-I] Sphinx==1.7.5 [--user]
-$ pip3 install [-I] breathe numpy [--user]
+$ pip3 install Sphinx==1.7.5
+$ pip3 install breathe numpy
$ cd doc/sphinx/
$ make html
$ cd build/html
@@ -142,7 +142,7 @@ To build documentation in a python3 virtualenv, run:
$ python3 -m venv py3doc
$ . py3doc/bin/activate
-(py3doc)$ pip install python3-sphinx breathe numpy
+(py3doc)$ pip install Sphinx breathe numpy
(py3doc)$ cd doc/sphinx
(py3doc)$ make html
(py3doc)$ cd build/html