.. import.rst:

###############
Import a model
###############

:ref:`from_onnx`

.. That can be the first page data scientists find when they are simply trying 
.. to run a trained model; they DO NOT need to do a system install of the Intel
.. nGraph++ bridges; they can use our Python APIs to run a trained model.
..  

The Intel nGraph APIs can be used to run inference on a model that has been 
*exported* from a Deep Learning framework. An export produces a file with 
a serialized model that can be loaded and passed to one of the nGraph 
backends.  

.. _from_onnx:

Importing a model from ONNX
============================

The most widely supported :term:`export` format available today is `ONNX`_.
Models that have been serialized to ONNX are easy to identify; they are 
usually named ``<some_model>.onnx`` or ``<some_model>.onnx.pb``. These 
`tutorials from ONNX`_ describe how to turn trained models into an 
``.onnx`` export.

.. important:: If you landed on this page and you already have an ``.onnx``
   or ``.onnx.pb`` formatted file, you should be able to run inference
   without needing to dig into anything from the "Frameworks" sections. You
   will, however, need to have completed the steps outlined in
   our :doc:`../install` guide.

To demonstrate functionality, we'll use an already-serialized ResNet20 model 
trained on the CIFAR10 data set. Remember that this model has already been 
trained and exported from a framework such as Caffe2, PyTorch, or CNTK; we are 
simply going to build an nGraph representation of the model, execute it, and 
produce some outputs.


Installing ``ngraph_onnx``
==========================

To use ONNX models with nGraph, you will also need the companion tool 
``ngraph_onnx``. ``ngraph_onnx`` requires Python 3.4 or higher.
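Before installing, it can save time to confirm the interpreter meets that
minimum. A quick, generic sketch (this check is ordinary Python, not part of
the nGraph tooling):

```python
# Fail fast if the interpreter is older than ngraph_onnx's minimum (3.4).
import sys

if sys.version_info < (3, 4):
    raise RuntimeError(
        "ngraph_onnx requires Python >= 3.4, found %s" % sys.version.split()[0])

print("Python version OK:", ".".join(str(v) for v in sys.version_info[:3]))
```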

This code assumes that you already followed the default instructions from the 
:doc:`../install` guide; ``ngraph_dist`` was installed to ``$HOME/ngraph_dist``
and the ``ngraph`` repo was cloned to ``/opt/libraries/``.

#. First, set the environment variables to where we built the nGraph++ libraries:

   .. code-block:: bash

      export NGRAPH_CPP_BUILD_PATH=$HOME/ngraph_dist
      export LD_LIBRARY_PATH=$HOME/ngraph_dist/lib
      export DYLD_LIBRARY_PATH=$HOME/ngraph_dist/lib  # On MacOS

#. Now add *Protocol Buffers* and Python3 PIP dependencies to your system. ONNX 
   requires Protocol Buffers version 2.6.1 or higher. For example, on Ubuntu:

   .. code-block:: console

      $ sudo apt install protobuf-compiler libprotobuf-dev python3-pip

#. Check out the branch named ``python_binding``: 

   .. code-block:: console

      $ cd /opt/libraries/ngraph
      $ git checkout python_binding
        Switched to branch 'python_binding'

#. Recursively update the submodule and install the Python dependencies: 

   .. code-block:: console

      $ git submodule update --init --recursive
      $ cd python
      $ pip3 install -r requirements.txt
      $ pip3 install .

#. Finally, clone the ``ngraph-onnx`` repo and use :command:`pip` to install the 
   Python dependencies for this tool; if you set up your original nGraph library 
   installation under a ``libraries`` directory as recommended, it's a good 
   idea to clone this repo there, as well.
   
   .. code-block:: console

      $ cd /opt/libraries
      $ git clone git@github.com:NervanaSystems/ngraph-onnx
      $ cd ngraph-onnx
      $ pip3 install -r requirements.txt
      $ pip3 install .
 

Importing a serialized model
============================

With the dependencies added, we can now import a model that has 
been serialized by ONNX, interact locally with the model by running 
Python code, create and load objects, and run inference. 

This section assumes that you have your own ONNX model. Using this 
example model from Microsoft\*'s Deep Learning framework, `CNTK`_, 
we can outline the procedure for running ResNet on a model that has 
been trained on the CIFAR10 data set and serialized with ONNX. 


Enable ONNX and load an ONNX file from disk
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. literalinclude:: ../../../examples/onnx_example.py
   :language: python
   :lines: 17-19

 
Convert an ONNX model to an ngraph model
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. literalinclude:: ../../../examples/onnx_example.py
   :language: python
   :lines: 22-23

 
Using ``ngraph_api``, create a callable computation object
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. literalinclude:: ../../../examples/onnx_example.py
   :language: python
   :lines: 27-29


Load or create an image
~~~~~~~~~~~~~~~~~~~~~~~~

.. literalinclude:: ../../../examples/onnx_example.py
   :language: python
   :lines: 32-33
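The exact preprocessing is defined by the example script above; as a rough
guide, CIFAR10 images are 32x32 RGB, and vision models are commonly fed a
float batch in NCHW layout, i.e. shape ``(1, 3, 32, 32)``. A pure-Python
shape sketch (the NCHW layout here is an assumption, not taken from the
example file):

```python
import random

# Build a random CIFAR10-sized input: batch of 1, 3 channels (RGB),
# 32 rows, 32 columns -- the common NCHW layout for vision models.
batch = [[[[random.random() for _ in range(32)]   # columns
           for _ in range(32)]                    # rows
          for _ in range(3)]                      # channels
         ]                                        # batch of one

print(len(batch), len(batch[0]), len(batch[0][0]), len(batch[0][0][0]))
# 1 3 32 32
```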

Run ResNet inference on the picture
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. literalinclude:: ../../../examples/onnx_example.py
   :language: python
   :lines: 36-37
 

Put it all together
===================

.. literalinclude:: ../../../examples/onnx_example.py
   :language: python
   :lines: 17-37
   :caption: "Demo sample code to run inference with nGraph"


Outputs will vary greatly, depending on your model; for 
demonstration purposes, the output will look something like: 

.. code-block:: python 

   array([[ 1.312082 , -1.6729496,  4.2079577,  1.4012241, -3.5463796,
            2.3433776,  1.7799224, -1.6155214,  0.0777044, -4.2944093]],
         dtype=float32)
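Each of the ten values is a score for one CIFAR10 class, so the prediction is
the index of the largest score. A minimal pure-Python sketch using the scores
shown above (the class-name ordering below is the conventional CIFAR10 one,
and aligning it with the model's output vector is an assumption):

```python
# Conventional CIFAR10 class ordering, assumed index-aligned with the output.
CIFAR10_CLASSES = ["airplane", "automobile", "bird", "cat", "deer",
                   "dog", "frog", "horse", "ship", "truck"]

# The score row from the demo output above.
scores = [1.312082, -1.6729496, 4.2079577, 1.4012241, -3.5463796,
          2.3433776, 1.7799224, -1.6155214, 0.0777044, -4.2944093]

# The predicted class is the index of the highest score.
predicted = CIFAR10_CLASSES[scores.index(max(scores))]
print(predicted)  # bird
```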



.. Importing models from NNVM
   ---------------------------

.. if you work on NNVM you can add this instruction here. 



.. Importing models serialized with XLA
   -------------------------------------

.. if you work on XLA you can add this instruction here.


.. etc, eof 



.. _ONNX: http://onnx.ai
.. _tutorials from ONNX: https://github.com/onnx/tutorials
.. _CNTK: https://www.microsoft.com/en-us/cognitive-toolkit/features/model-gallery/