.. import.rst:

###############
Import a model
###############

:ref:`from_onnx`

nGraph APIs can be used to run inference on a model that has been *exported* 
from a Deep Learning framework. An export produces a file with a serialized 
model that can be loaded and passed to one of the nGraph backends.  


.. _from_onnx:

Importing a model from ONNX
============================

The most-widely supported :term:`export` format available today is `ONNX`_.
Models that have been serialized to ONNX are easy to identify; they are 
usually named ``<some_model>.onnx`` or ``<some_model>.onnx.pb``. These 
`tutorials from ONNX`_ describe how to turn trained models into an 
``.onnx`` export.

.. important:: If you landed on this page and you already have an ``.onnx`` 
   or ``.onnx.pb`` formatted file, you should be able to run the inference 
   without needing to dig into anything from the "Frameworks" sections. You 
   will, however, need to have completed the steps outlined in 
   our :doc:`../buildlb` guide. If you intend to use nGraph for 
   :doc:`distributed-training`, you will need a build that has been compiled 
   with the additional cmake flag: ``-DNGRAPH_DISTRIBUTED_ENABLE=TRUE``.

To demonstrate functionality, we'll use an already-serialized CIFAR10 model 
trained via ResNet20. Remember that this model has already been trained and 
exported from a framework such as Caffe2, PyTorch or CNTK; we are simply going 
to build an nGraph representation of the model, execute it, and produce some 
outputs.


Installing ``ngraph_onnx`` with nGraph from scratch
====================================================

To use ONNX models with nGraph, you will also need the companion tool 
``ngraph_onnx``, which requires Python 3.4 or higher. If nGraph has not 
yet been installed to your system, you can follow these steps to install 
everything at once; if an ``ngraph_dist`` is already installed on your system, 
skip ahead to the next section, :ref:`install_ngonnx`.
   

#. Install prerequisites for the system and install nGraph as ``ngraph_dist``:

   .. code-block:: console

      $ apt update
      $ apt install python3 python3-pip python3-dev python3-venv
      $ apt install build-essential cmake curl clang-3.9 git zlib1g zlib1g-dev libtinfo-dev
      $ git clone https://github.com/NervanaSystems/ngraph.git
      $ cd ngraph && mkdir build
      $ cd build && cmake ../ -DCMAKE_INSTALL_PREFIX=~/ngraph_dist -DNGRAPH_USE_PREBUILT_LLVM=TRUE
      $ make install

#. Build the Python package (binary wheel) for ngraph and set up an env for ONNX;
   be sure to export the ``NGRAPH_CPP_BUILD_PATH`` where the ``ngraph_dist`` was 
   installed. 

   .. code-block:: console

      $ cd ngraph/python
      $ git clone --recursive -b allow-nonconstructible-holders https://github.com/jagerman/pybind11.git
      $ export PYBIND_HEADERS_PATH=$PWD/pybind11
      $ export NGRAPH_CPP_BUILD_PATH=~/ngraph_dist
      $ python3 setup.py bdist_wheel
      $ cd ..
      $ python3 -m venv onnx
      $ cd onnx/
      $ . bin/activate
#. Check for the binary wheel file under ``/ngraph/python/dist/`` and install it 
   with pip.

   .. code-block:: console

      (onnx)$ pip install -U python/dist/ngraph-0.5.0-cp35-cp35m-linux_x86_64.whl    


#. Confirm ngraph is properly installed through a Python interpreter:

   .. code-block:: console

      (onnx)$ python3

   .. code-block:: python

      >>> import ngraph as ng
      >>> ng.abs([[1, 2, 3], [4, 5, 6]])
      <Abs: 'Abs_1' ([2, 3])>

   If you don't see any errors, ngraph should be installed correctly.


.. _install_ngonnx:

Installing ngraph-onnx
-----------------------

Add the dependencies for ONNX:  

.. code-block:: console

   $ apt install protobuf-compiler libprotobuf-dev


Install the ``ngraph-onnx`` companion tool using pip:

.. code-block:: console

   (onnx) $ pip install git+https://github.com/NervanaSystems/ngraph-onnx/

Importing a serialized model
============================

With the dependencies added, we can now import a model that has 
been serialized by ONNX, interact locally with the model by running 
Python code, create and load objects, and run inference. 

This section assumes that you have your own ONNX model. With this 
example model from Microsoft\*'s Deep Learning framework, `CNTK`_,
we can outline the procedure to show how to run ResNet on a model 
that has been trained on the CIFAR10 data set and serialized with 
ONNX. 


(Optional) Localize your export to the virtual environment 
----------------------------------------------------------

For this example, let's say that our serialized file was output under our $HOME 
directory, say at ``~/onnx_conversions/trained_model.onnx``. To make loading this 
file easier, you can run the example below from your venv in that directory.

.. important:: If you invoke your Python interpreter in a directory other than 
   where you output your trained model, you will need to specify the 
   **relative** path to the location of the ``.onnx`` file.


.. code-block:: console 

   (onnx) $ cd ~/onnx_conversions 
   (onnx) $ python3 
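
If you prefer not to start the interpreter from that directory, you can expand the 
path in Python instead. A minimal sketch using only the standard library; the 
``~/onnx_conversions/trained_model.onnx`` location is the example path from above:

```python
import os

# Expand '~' to an absolute path so the ONNX loader does not depend on
# the directory the interpreter was started from.
model_path = os.path.expanduser('~/onnx_conversions/trained_model.onnx')
print(model_path)
```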


Enable ONNX and load an ONNX file from disk
--------------------------------------------

.. literalinclude:: ../../../examples/onnx/onnx_example.py
   :language: python
   :lines: 17-19

 
Convert an ONNX model to an ngraph model
-----------------------------------------

.. literalinclude:: ../../../examples/onnx/onnx_example.py
   :language: python
   :lines: 22-23


The importer returns a list of ngraph models for every ONNX graph 
output:


.. code-block:: python

   print(ng_models)
   [{
       'name': 'Plus5475_Output_0',
       'output': <Add: 'Add_1972' ([1, 10])>,
       'inputs': [<Parameter: 'Parameter_1104' ([1, 3, 32, 32], float)>]
    }]

The ``output`` field contains the ngraph node corresponding to the output node 
in the imported ONNX computational graph. The ``inputs`` list contains all 
input parameters for the computation which generates the output.
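
Since each entry is a plain dictionary, selecting a particular graph output is 
ordinary Python. A minimal sketch, using stand-in strings in place of real 
ngraph nodes; ``pick_output`` is a hypothetical helper, not part of the 
importer's API:

```python
# Stand-ins mirror the importer's structure shown above:
# each entry has 'name', 'output', and 'inputs' keys.
ng_models = [
    {'name': 'Plus5475_Output_0',
     'output': '<Add node>',
     'inputs': ['<Parameter node>']},
]

def pick_output(models, name):
    """Return the model dict whose ONNX graph output matches `name`."""
    for model in models:
        if model['name'] == name:
            return model
    raise KeyError(name)

model = pick_output(ng_models, 'Plus5475_Output_0')
print(model['output'])
```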


Using ngraph_api, create a callable computation object
-------------------------------------------------------

.. literalinclude:: ../../../examples/onnx/onnx_example.py
   :language: python
   :lines: 27-29


Load or create an image
------------------------

.. literalinclude:: ../../../examples/onnx/onnx_example.py
   :language: python
   :lines: 32-33
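
The imported model's input parameter shown earlier has shape ``[1, 3, 32, 32]`` 
(batch, channels, height, width). If you don't have a CIFAR10 picture handy, a 
random placeholder of that shape can exercise the pipeline; a plain-Python 
sketch (real runs would typically build this array with ``numpy`` instead):

```python
import random

# Placeholder CIFAR10-shaped input: batch of 1, 3 channels, 32x32 pixels,
# filled with random values in [0, 1).
picture = [[[[random.random() for _ in range(32)]
             for _ in range(32)]
            for _ in range(3)]]

# Shape check: (1, 3, 32, 32)
print(len(picture), len(picture[0]), len(picture[0][0]), len(picture[0][0][0]))
```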

Run ResNet inference on a picture
---------------------------------

.. literalinclude:: ../../../examples/onnx/onnx_example.py
   :language: python
   :lines: 36-37

Put it all together
===================

.. literalinclude:: ../../../examples/onnx/onnx_example.py
   :language: python
   :lines: 17-37
   :caption: "Demo sample code to run inference with nGraph"


Outputs will vary greatly depending on your model; for 
demonstration purposes, the output will look something like: 


.. code-block:: python 

   array([[ 1.312082 , -1.6729496,  4.2079577,  1.4012241, -3.5463796,
            2.3433776,  1.7799224, -1.6155214,  0.0777044, -4.2944093]],
         dtype=float32)
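
The ten numbers are raw class scores, one per CIFAR10 category, so the predicted 
class is simply the index of the largest score. A small pure-Python sketch using 
the demo values above:

```python
# Raw class scores from the demo output above, one per CIFAR10 category.
logits = [1.312082, -1.6729496, 4.2079577, 1.4012241, -3.5463796,
          2.3433776, 1.7799224, -1.6155214, 0.0777044, -4.2944093]

# The predicted class is the index of the largest score.
predicted = max(range(len(logits)), key=logits.__getitem__)
print(predicted)  # -> 2
```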



.. Importing models from NNVM
   ---------------------------

.. if you work on NNVM you can add this instruction here. 



.. Importing models serialized with XLA
   -------------------------------------

.. if you work on XLA you can add this instruction here.


.. etc, eof 


.. _ngraph-onnx: https://github.com/NervanaSystems/ngraph-onnx#ngraph
.. _ONNX: http://onnx.ai
.. _tutorials from ONNX: https://github.com/onnx/tutorials
.. _CNTK: https://www.microsoft.com/en-us/cognitive-toolkit/features/model-gallery/