.. import.rst:

###############
Import a model
###############

:ref:`from_onnx`

nGraph APIs can be used to run inference on a model that has been *exported* 
from a Deep Learning framework. An export produces a file with a serialized 
model that can be loaded and passed to one of the nGraph backends.  


.. _from_onnx:

Importing a model from ONNX
============================

The most widely supported :term:`export` format available today is `ONNX`_.
Models that have been serialized to ONNX are easy to identify; they are 
usually named ``<some_model>.onnx`` or ``<some_model>.onnx.pb``. These 
`tutorials from ONNX`_ describe how to turn trained models into an 
``.onnx`` export.

.. important:: If you landed on this page and you already have an ``.onnx`` or 
   an ``.onnx.pb`` formatted file, you should be able to run inference without 
   needing to dig into anything from the "Frameworks" sections. You will, however, 
   need to have completed the steps outlined in our :doc:`../../buildlb` guide.

To demonstrate functionality, we'll use an already-serialized CIFAR10 model 
trained via ResNet20. Remember that this model has already been trained and 
exported from a framework such as Caffe2, PyTorch or CNTK; we are simply going 
to build an nGraph representation of the model, execute it, and produce some 
outputs.

Installing ``ngraph_onnx`` with nGraph from scratch
====================================================

See the `building nGraph and nGraph-ONNX`_ documentation for the latest 
instructions.

.. _import_model:

Importing a serialized model
=============================

After building and installing ``ngraph_onnx``, we can import a model that has 
been serialized by ONNX, interact locally with the model by running 
Python code, create and load objects, and run inference.

This section assumes that you have your own ONNX model. Using this 
example model from Microsoft\*'s Deep Learning framework, `CNTK`_, 
we outline the procedure for running ResNet on a model that has been 
trained on the CIFAR10 data set and serialized with ONNX.


(Optional) Localize your export to the virtual environment 
----------------------------------------------------------

For this example, let's say that our serialized file was output under our $HOME 
directory, say at ``~/onnx_conversions/trained_model.onnx``. To make loading this 
file easier, run the example below from your Venv in that directory.

.. important:: If you invoke your Python interpreter in a directory other than 
   where you outputted your trained model, you will need to specify the 
   **relative** path to the location of the ``.onnx`` file.


.. code-block:: console 

   (onnx) $ cd ~/onnx_conversions 
   (onnx) $ python3 
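
Alternatively, you can resolve the model's location programmatically rather than
changing directories. A minimal sketch, assuming the
``~/onnx_conversions/trained_model.onnx`` path used above:

```python
from pathlib import Path

# Hypothetical location of the serialized model, following the example above.
model_path = Path("~/onnx_conversions/trained_model.onnx").expanduser()

# An absolute path works no matter where the interpreter was started.
print(model_path.is_absolute())
print(model_path.name)
```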


Enable ONNX and load an ONNX file from disk
--------------------------------------------

.. literalinclude:: ../../../../examples/onnx/onnx_example.py
   :language: python
   :lines: 17-19

 
Convert an ONNX model to an nGraph model
-------------------------------------------

.. literalinclude:: ../../../../examples/onnx/onnx_example.py
   :language: python
   :lines: 22-23


The importer returns a list of nGraph models, one for every ONNX graph 
output:


.. code-block:: python

   print(ng_models)
   [{
       'name': 'Plus5475_Output_0',
       'output': <Add: 'Add_1972' ([1, 10])>,
       'inputs': [<Parameter: 'Parameter_1104' ([1, 3, 32, 32], float)>]
    }]

The ``output`` field contains the nGraph node corresponding to the output node 
in the imported ONNX computational graph. The ``inputs`` list contains all 
input parameters for the computation which generates the output.
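
To make the shape of this structure concrete, here is a small sketch that walks
a stand-in for ``ng_models``. The field values mirror the printed example above;
the string placeholders stand in for the actual nGraph node objects:

```python
# Stand-in for the list returned by the importer; in real use,
# 'output' and 'inputs' hold nGraph node objects, not strings.
ng_models = [{
    'name': 'Plus5475_Output_0',
    'output': '<Add node>',
    'inputs': ['<Parameter node>'],
}]

# Typically the first (often only) graph output is the one we want.
ng_model = ng_models[0]
print(ng_model['name'])         # Plus5475_Output_0
print(len(ng_model['inputs']))  # 1
```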


Using ``ngraph_api``, create a callable computation object
------------------------------------------------------------

.. literalinclude:: ../../../../examples/onnx/onnx_example.py
   :language: python
   :lines: 27-29


Load or create an image
------------------------

.. literalinclude:: ../../../../examples/onnx/onnx_example.py
   :language: python
   :lines: 32-33
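
The model's input parameter printed earlier reported a shape of
``[1, 3, 32, 32]`` (one CIFAR10 image: 3 color channels, 32x32 pixels). As a
sketch using only the standard library, here is how you could build a dummy
input of that shape to sanity-check the pipeline; in practice you would load a
real image and apply the same preprocessing used during training:

```python
import random

# Build a nested list with shape [1, 3, 32, 32]: one image,
# three color channels, 32x32 pixels of random floats in [0, 1).
dummy_input = [[[[random.random() for _ in range(32)]
                 for _ in range(32)]
                for _ in range(3)]]

# Sanity-check the dimensions before handing the data to the model.
print(len(dummy_input), len(dummy_input[0]),
      len(dummy_input[0][0]), len(dummy_input[0][0][0]))  # 1 3 32 32
```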

Run ResNet inference on a picture
----------------------------------

.. literalinclude:: ../../../../examples/onnx/onnx_example.py
   :language: python
   :lines: 36-37

Put it all together
===================

.. literalinclude:: ../../../../examples/onnx/onnx_example.py
   :language: python
   :lines: 17-37
   :caption: "Demo sample code to run inference with nGraph"


Outputs will vary greatly depending on your model; for this 
demonstration, the output will look something like:


.. code-block:: python 

   array([[ 1.312082 , -1.6729496,  4.2079577,  1.4012241, -3.5463796,
            2.3433776,  1.7799224, -1.6155214,  0.0777044, -4.2944093]],
         dtype=float32)
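
The ten values are the network's raw scores for the ten CIFAR10 classes; the
predicted class is the index of the highest score. A small post-processing
sketch, assuming the standard CIFAR10 label ordering (an assumption; verify it
against your model's training setup):

```python
# Raw scores copied from the sample output above.
scores = [1.312082, -1.6729496, 4.2079577, 1.4012241, -3.5463796,
          2.3433776, 1.7799224, -1.6155214, 0.0777044, -4.2944093]

# Standard CIFAR10 label order (an assumption; check your training setup).
labels = ['airplane', 'automobile', 'bird', 'cat', 'deer',
          'dog', 'frog', 'horse', 'ship', 'truck']

# The predicted class is the index of the highest score.
best = max(range(len(scores)), key=scores.__getitem__)
print(labels[best])  # bird
```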


.. _building nGraph and nGraph-ONNX: https://github.com/NervanaSystems/ngraph-onnx/blob/master/BUILDING.md
.. _ngraph-onnx: https://github.com/NervanaSystems/ngraph-onnx#ngraph
.. _ONNX: http://onnx.ai
.. _tutorials from ONNX: https://github.com/onnx/tutorials
.. _CNTK: https://www.microsoft.com/en-us/cognitive-toolkit/features/model-gallery/