Commit 0bf21af9 authored by L.S. Cook, committed by Scott Cyphers

Amazon codeshare (#429)

* WIP on finding a good format for op docs in RST

* A few more scribbles

* fix up branch for Amazon code share

* add conf.py configuration details from aproctor's branch for doxy-breathe integration

* update section on how to build the documentation with breathe install details

* Remove empty file on training, update framework integration notes

* Add CentOS stub, fix spelling, core op definition, add to glossary.

* more documentation cleanup on README and installation and testing

* more cleanup of docs for TensorFlow

* Simplify Dot Autodiff (#412)

* Simplify Dot Autodiff

* remove commented code

* Remove TupleType, ValueType (#411)

* Remove TupleType, ValueType

* Fix compile error.

* Change convolution reference to work with f32 (#409)

* Drwebb/gpu backend dot op (#413)

* Drwebb/gpu backend dot op (#387)

* GPU Dot prod emitter switch statement

* cuBLAS dot kernel call

* Flesh out arg substitution into gpu dot kernel call

* Drwebb/gpu backend dot op (#392)

* Take in CodeWriter into gpu op emitters

* Introduce GPU function gen based on pass functions

* Additional gpu emitter stubs

* link cublas in to unit test and ngraph

* Use static code gen methods for GPU, add new GPU op stubs

* use pass manager to declare functions / cublas Updates

* Prune down gpu_external_function wip

* Switch back to GPU tensor views in GPU backend

* Pass in cublas handle to GPU external function

* cuMalloc memory in gpu tensor view

* Use cuda runtime malloc and free for tensor view management

* change GPU tensor view init, and use GPU tensor view for GPU call frame

* include headers as system dirs

* GPU tensor printing utility function

* cublasSetPointer to device mode / Fix copyright notification lowercasing

* Passing GPU dot product test using cuBLAS

Clean up

* Changes from review

* Add an overview.

* Intro for building graphs.

* Refactor docs so that Doxygen and Sphinx are integrated (Sphinx depends on Doxygen with the docstrings stuff)

Still need to resolve a lingering assumption that the build dir is contained in private-ngraph-cpp. It's proving to be surprisingly tricky.

* Added the TensorFlow XLA build information and example of how to run MNIST MLP with TF/nGraph

* Updated TF integration guide for clarity. Added files from cyphers-amazon branch. Add minor changes to sphinx-doxy to test apis

* Small revision of overview and add graphic from arXiv paper

* WIP more editing, picking up from where I left off last week

* Fix garbled sentence edit

* WIP Edit for readability and such

* Better font rendering on all architectures included with our custom theme

* Cleanup current version of documentation.  Add NeoSans font binaries to make local font rendering of h1 h2 etc

* Missed merge conflict

* Add something on functions, don't forward-reference parameters

* What we have so far into a PR for review

* Need file for cmake

* Missing header

* Remove duplicate file

* added breathe package to contrib/docker/Dockerfile.ngraph_cpp
parent b408a08e
# Copyright 2018 Nervana Systems Inc.
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
@@ -19,5 +19,6 @@ RUN pip install --upgrade pip
# installed sphinx with pip to get the updated version 1.6.5
# allows for make html build under the doc/source directory as an interim build process
RUN pip install sphinx
RUN pip install breathe

WORKDIR /home
@@ -10,25 +10,20 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

if ("${NGRAPH_BUILD_DOCS}" MATCHES "^ON$")
    add_custom_target( docs
        COMMENT "Build all of the documentation types selected during CMake configuration."
    )
    add_subdirectory( doxygen )
    add_subdirectory( sphinx )
else()
    add_custom_target( docs
        COMMAND echo
        COMMAND echo "The 'docs' target is disabled. To enable the building of documentation, re-run cmake with the option -DNGRAPH_BUILD_DOCS=ON."
        COMMAND echo
        COMMAND false
        VERBATIM
    )
endif()
@@ -11,38 +11,41 @@
# See the License for the specific language governing permissions and
# limitations under the License.

find_package(Doxygen REQUIRED)

if ("${NGRAPH_DOXYGEN_WARN_IF_UNDOCUMENTED}" MATCHES "^ON$")
    set(DOXYGEN_WARN_IF_UNDOCUMENTED YES)
else()
    set(DOXYGEN_WARN_IF_UNDOCUMENTED NO)
endif()

if ("${NGRAPH_DOXYGEN_QUIET}" MATCHES "^ON$")
    set(DOXYGEN_QUIET YES)
else()
    set(DOXYGEN_QUIET NO)
endif()
set(DOXYGEN_IN "${CMAKE_CURRENT_SOURCE_DIR}/Doxyfile.in")
set(DOXYGEN_OUT "${CMAKE_CURRENT_BINARY_DIR}/Doxyfile")
configure_file("${DOXYGEN_IN}" "${DOXYGEN_OUT}" @ONLY)
add_custom_target(doxygen-docs
ALL
COMMAND "${DOXYGEN_EXECUTABLE}" "${DOXYGEN_OUT}"
WORKING_DIRECTORY "${CMAKE_CURRENT_BINARY_DIR}"
COMMENT "Generating documentation with Doxygen"
VERBATIM )
add_dependencies( docs doxygen-docs )
install(
DIRECTORY "${CMAKE_CURRENT_BINARY_DIR}/html/"
DESTINATION "${NGRAPH_INSTALL_DOC}/api-reference/html"
OPTIONAL
)
install(
DIRECTORY "${CMAKE_CURRENT_BINARY_DIR}/latex/"
DESTINATION "${NGRAPH_INSTALL_DOC}/api-reference/latex"
OPTIONAL
)
PROJECT_NAME           = "Intel® nGraph™ library"
PROJECT_BRIEF          = "Intel® nGraph™ library"
OUTPUT_DIRECTORY       = @CMAKE_CURRENT_BINARY_DIR@
INPUT                  = @CMAKE_SOURCE_DIR@/src
RECURSIVE              = YES
EXCLUDE_PATTERNS       = json.hpp
USE_MATHJAX            = YES
GENERATE_XML = YES
WARN_IF_UNDOCUMENTED = @DOXYGEN_WARN_IF_UNDOCUMENTED@
QUIET = @DOXYGEN_QUIET@
# Robust Makefile for Sphinx documentation
#
# You can set these variables from the command line.
@@ -17,4 +17,113 @@ help:
# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
	@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
doxy-code:
	$(Q)(cat ngraph.doxyfile ; echo "STRIP_FROM_PATH=${NGRAPH_BASE}" ) | doxygen - 2>&1 | tee doc.log

doxy: doxy-code

clean:
	@rm -rf $(BUILDDIR)/*
	@rm -rf html
	@rm -rf xml
	@rm -rf doxygen
	@rm -rf latex

htmldocs: doxy html

pickle:
	$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
	@echo
	@echo "Build finished; now you can process the pickle files."

json: prep
	$(SPHINXBUILD) -t $(DOC_TAG) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
	@rm -rf samples
	@rm -rf boards
	@echo
	@echo "Build finished; now you can process the JSON files."

applehelp:
	$(SPHINXBUILD) -b applehelp $(ALLSPHINXOPTS) $(BUILDDIR)/applehelp
	@echo
	@echo "Build finished. The help book is in $(BUILDDIR)/applehelp."
	@echo "N.B. You won't be able to view it unless you put it in" \
	      "~/Library/Documentation/Help or install it in your application" \
	      "bundle."

devhelp:
	$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
	@echo
	@echo "Build finished."
	@echo "To view the help file:"
	@echo "# mkdir -p $$HOME/.local/share/devhelp/ngraph"
	@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/ngraph"
	@echo "# devhelp"

epub:
	$(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
	@echo
	@echo "Build finished. The epub file is in $(BUILDDIR)/epub."

latex:
	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
	@echo
	@echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
	@echo "Run \`make' in that directory to run these through (pdf)latex" \
	      "(use \`make latexpdf' here to do that automatically)."

latexpdf:
	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
	@echo "Running LaTeX files through pdflatex..."
	$(MAKE) -C $(BUILDDIR)/latex all-pdf
	@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."

latexpdfja:
	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
	@echo "Running LaTeX files through platex and dvipdfmx..."
	$(MAKE) -C $(BUILDDIR)/latex all-pdf-ja
	@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."

text:
	$(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
	@echo
	@echo "Build finished. The text files are in $(BUILDDIR)/text."

man:
	$(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man
	@echo
	@echo "Build finished. The manual pages are in $(BUILDDIR)/man."

texinfo:
	$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
	@echo
	@echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo."
	@echo "Run \`make' in that directory to run these through makeinfo" \
	      "(use \`make info' here to do that automatically)."

info:
	$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
	@echo "Running Texinfo files through makeinfo..."
	$(MAKE) -C $(BUILDDIR)/texinfo info
	@echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo."

gettext:
	$(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale
	@echo
	@echo "Build finished. The message catalogs are in $(BUILDDIR)/locale."

changes:
	$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
	@echo
	@echo "The overview file is in $(BUILDDIR)/changes."

coverage:
	$(SPHINXBUILD) -b coverage $(ALLSPHINXOPTS) $(BUILDDIR)/coverage
	@echo "Testing of coverage in the sources finished, look at the " \
	      "results in $(BUILDDIR)/coverage/python.txt."
# Maintainer: Gavin Lloyd
# https://github.intel.com/gavinllo/ttf-neo-sans-intel
pkgname=ttf-neo-sans-intel
pkgver=1.00
pkgrel=3
pkgdesc='Versatile, futuristic typeface for Intel-branded material'
arch=('any')
depends=('fontconfig' 'xorg-font-utils')
source=('NeoSansIntel-Italic.ttf'
'NeoSansIntel-LightItalic.ttf'
'NeoSansIntel-Light.ttf'
'NeoSansIntel-MediumItalic.ttf'
'NeoSansIntel-Medium.ttf'
'NeoSansIntel.ttf')
sha256sums=('be2f036d58320bd0fab7cca7327b806840ddfedfdc4e44a520a85bd53a1ed7b3'
'ce45deb38ad2749ba25cbb76084955e34a86f627043f1f0f8f8073720115545c'
'd522c9c3905532680f8bb8068fa340200d2e5e45376ea89d97bcc8edbce8eff8'
'61b3ce0ed96b6f343c8ac0a94471ed504708782bee7d9df88fadc564640ffbba'
'6cd878034142c390eeb98d2a17ee1b949c2f8ded0a8684d3b17e0fe4203a8fd8'
'303bc44874e23a563775e5d463a6ec3dd7bdfc7948fa95d65a45fa965bf5ee28')
package() {
install -d $pkgdir/usr/share/fonts/TTF/
install -m644 *.ttf $pkgdir/usr/share/fonts/TTF/
}
.. _about:

About
=====

Welcome to the Intel nGraph project, an open source C++ library for developers
of :abbr:`Deep Learning (DL)` systems and frameworks. Here you will find
a suite of components, documentation, and APIs that can be used with
:abbr:`Deep Neural Network (DNN)` models defined in a variety of frameworks.

The nGraph library translates a framework's representation of computations into
a neutral :abbr:`Intermediate Representation (IR)` designed to promote
computational efficiency on target hardware; it works on Intel and non-Intel
platforms.
.. figure:: graphics/fig.jpeg
The *nGraph core* uses a strongly-typed and platform-neutral stateless graph
representation for computations. Each node, or *op*, in the graph corresponds
to one step in a computation, where each step produces zero or more tensor
outputs from zero or more tensor inputs.
There is a *framework bridge* for each supported framework which acts as
an intermediary between the *nGraph core* and the framework. A *transformer*
plays a similar role between the nGraph core and the various execution
platforms.
Transformers compile the graph using a combination of generic and
platform-specific graph transformations. The result is a function that
can be executed from the framework bridge. Transformers also allocate
and deallocate, as well as read and write, tensors under direction of the
bridge.
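The stateless graph and the transformer's compile step described above can be illustrated with a toy sketch. This is plain Python with invented names, not the nGraph API: each node produces an output from zero or more inputs, and a miniature "transformer" turns the graph into an executable function.

```python
# Toy illustration of a stateless op graph. NOT the nGraph API;
# all class and function names here are invented for illustration.

class Node:
    def __init__(self, op, inputs=()):
        self.op = op          # operation name, e.g. "const", "add", "mul"
        self.inputs = inputs  # upstream Node objects

def const(value):
    n = Node("const")
    n.value = value
    return n

def add(a, b): return Node("add", (a, b))
def mul(a, b): return Node("mul", (a, b))

def compile_graph(result):
    """A 'transformer' in miniature: return a callable that evaluates
    the graph ending at `result` (here by simple recursion)."""
    def evaluate(node):
        if node.op == "const":
            return node.value
        x, y = (evaluate(i) for i in node.inputs)
        return x + y if node.op == "add" else x * y
    return lambda: evaluate(result)

# Build (a + b) * c and "compile" it to an executable function.
a, b, c = const(2.0), const(3.0), const(4.0)
f = compile_graph(mul(add(a, b), c))
print(f())  # prints 20.0
```

A real transformer would additionally select backend kernels and manage tensor allocation; the sketch only captures the graph-in, callable-out shape of the interface.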
For this early |release| release, we provide framework integration guides for:

* :ref:`mxnet_intg`,
* :ref:`tensorflow_intg`, and
* the neon™ `frontend`_ framework, for training GPU-performant models.
Integration guides for each of these other frameworks are tentatively
forthcoming and/or open to the community for contributions and sample
documentation:
* `Chainer`_,
* `PyTorch`_,
* `Caffe2`_, and
* Frameworks not yet written (for algorithms that do not yet exist).
.. _Caffe2: https://github.com/caffe2/
.. _PyTorch: http://pytorch.org/
.. _Chainer: https://chainer.org/
.. _frontend: http://neon.nervanasys.com/index.html/
@@ -3,4 +3,8 @@
API
###

.. TODO don't add Python APIs that will break the build.

Sections
********
@@ -22,10 +22,11 @@ This script does *not* modify the source code.
Core Ops
--------

Our design philosophy is that the graph is not a script for running kernels; rather,
the graph should describe the computation in terms of ops that are building blocks,
and compilation should match these ops to appropriate kernels for the backend(s) in use.
Thus, we expect that adding core ops should be infrequent. Instead, functionality should
be added by adding functions that build sub-graphs from existing core ops.
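That philosophy, composing new functionality out of existing core ops rather than adding new ones, can be sketched in miniature. This is plain Python with invented names, not the nGraph API: `sigmoid` is not a new core op, just a function that builds a sub-graph from a handful of core ops.

```python
# Toy sketch: 'sigmoid' is a function that builds a sub-graph out of
# existing core ops (exp, add, div, neg). NOT the nGraph API.

import math

class Node:
    def __init__(self, op, inputs=()):
        self.op, self.inputs = op, inputs

def leaf(v):
    n = Node("const")
    n.value = v
    return n

def build(op, *args):
    return Node(op, args)

def sigmoid(x):
    # sigmoid(x) = 1 / (1 + exp(-x)), expressed as a sub-graph of core ops
    return build("div", leaf(1.0),
                 build("add", leaf(1.0), build("exp", build("neg", x))))

# Kernels that a backend might match each core op against.
OPS = {"add": lambda a, b: a + b,
       "div": lambda a, b: a / b,
       "neg": lambda a: -a,
       "exp": math.exp}

def evaluate(n):
    if n.op == "const":
        return n.value
    return OPS[n.op](*(evaluate(i) for i in n.inputs))

print(evaluate(sigmoid(leaf(0.0))))  # prints 0.5
```

The point of the design is that a backend only ever has to supply kernels for the small core set; `sigmoid` and similar conveniences come along for free.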
Coding style
@@ -34,7 +34,8 @@ needs_sphinx = '1.6.5'
extensions = ['sphinx.ext.mathjax',
              'sphinx.ext.ifconfig',
              'sphinx.ext.viewcode',
              'sphinx.ext.autodoc',
              'breathe'
             ]
# Add any paths that contain templates here, relative to this directory.
@@ -62,9 +63,9 @@ author = 'Intel Corporation'
# built documents.
#
# The short X.Y version.
version = 'alpha'
# The full version, including alpha/beta/rc tags.
release = 'alpha'

# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
@@ -189,6 +190,16 @@ texinfo_documents = [
html_add_permalinks = ""
breathe_projects = {
    "nGraph": "xml"
}
breathe_default_project = "nGraph"
rst_epilog = u"""
.. |codename| replace:: Intel nGraph
@@ -27,7 +27,7 @@ with respect to additions or feature requests.

If you prefer to use a containerized application, like Jupyter\* notebooks,
Google Docs\*, or MS Word\* to write and share documentation contributions,
you can convert the ``doc/sphinx/source/*.rst`` files to another format with a tool
like ``pypandoc`` and share a link to your docs on our `wiki`_.

Another option is to fork the `ngraph repo`_, essentially snapshotting it at
@@ -38,8 +38,7 @@ our wiki.
.. note:: Please do not submit Jupyter* notebook code to the Intel nGraph library
   repos; best practice is to maintain any project-specific examples, tests, or
   walk-throughs separately. Alternatively, you may wish to upstream documentation
   contributions directly to whatever frontend framework supports your example.
@@ -126,21 +125,26 @@ Build the Documentation

Right now the minimal version of Sphinx needed to build the documentation is
Sphinx v. 1.6.5. This can be installed with ``pip3``, either to a virtual
environment, or to your base system if you plan to contribute much to docs.
`Breathe`_ can also be installed to build C++ API documentation (currently WIP).

To build documentation locally, run:

.. code-block:: console

   $ pip3 install [-I] Sphinx==1.6.5 [--user]
   $ pip3 install [-I] breathe [--user]
   $ cd doc/sphinx/
   $ make html
For tips similar to this, see the `sphinx`_ stable reST documentation.

.. _ngraph repo: https://github.com/NervanaSystems/ngraph-cpp/
.. _documentation repo: https://github.com/NervanaSystems/ngraph/tree/master/doc
.. _sphinx: http://www.sphinx-doc.org/en/stable/rest.html
.. _wiki: https://github.com/NervanaSystems/ngraph/wiki/
.. _Breathe: https://breathe.readthedocs.io/en/latest/
@@ -5,14 +5,35 @@ Glossary

.. glossary::
   function graph
      The Intel nGraph library uses a function graph to represent an ``op``'s
      parameters and results.

   op
      An op represents an operation. Ops are stateless and have zero or more
      inputs and zero or more outputs. Some ops have additional constant
      attributes. Every output of an op corresponds to a tensor and has an
      element type and a shape. The element types and shapes of the outputs of
      an op are determined by the inputs and attributes of the op.

   parameter
      In the context of a function graph, a "parameter" refers to what "stands
      in" for an argument in an ``op`` definition.

   result
      In the context of a function graph, the term "result" refers to what
      stands in for the returned value.

   shape
      The shape of a tensor is a tuple of non-negative integers that represents
      an exclusive upper bound for coordinate values.

   step
      An abstract "action" that produces zero or more tensor outputs from zero
      or more tensor inputs. Steps correspond to *ops* that connect *nodes*.

   tensor
      Tensors are maps from *coordinates* to scalar values, all of the same
      type, called the *element type* of the tensor.
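The glossary's tensor, shape, and coordinate terms fit together in a small sketch. This is plain Python for illustration only (not nGraph code); the row-major `index` helper and the sample values are invented:

```python
# Toy sketch of the glossary terms: a tensor maps coordinates to scalar
# values, and the shape is an exclusive upper bound on each coordinate.

from itertools import product

shape = (2, 3)  # valid coordinates range over (0..1, 0..2)

# A dense row-major buffer standing in for tensor storage.
data = [10, 11, 12, 20, 21, 22]

def index(coord, shape):
    """Row-major offset of a coordinate within a given shape."""
    offset = 0
    for c, s in zip(coord, shape):
        assert 0 <= c < s, "coordinate must be below the shape bound"
        offset = offset * s + c
    return offset

# The tensor viewed as a map from coordinates to scalars:
tensor = {coord: data[index(coord, shape)]
          for coord in product(*map(range, shape))}
print(tensor[(1, 2)])  # prints 22
```

The exclusive-upper-bound rule is what the `assert` enforces: a coordinate component must be strictly less than the corresponding shape component.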
.. ---------------------------------------------------------------------------
.. Copyright 2018 Intel Corporation
.. Licensed under the Apache License, Version 2.0 (the "License");
.. you may not use this file except in compliance with the License.
.. You may obtain a copy of the License at
@@ -13,10 +13,23 @@
.. limitations under the License.
.. ---------------------------------------------------------------------------
#############################
Intel nGraph library project
#############################

Welcome to the Intel nGraph project, an open source C++ library for developers
of :abbr:`Deep Learning (DL)` systems and frameworks. Here you will find
a suite of components, documentation, and APIs that can be used with
:abbr:`Deep Neural Network (DNN)` models defined in a variety of frameworks.

The nGraph library translates a framework's representation of computations into
a neutral :abbr:`Intermediate Representation (IR)` designed to promote
computational efficiency on target hardware; it works on Intel and non-Intel
platforms.

For further overview details, see the :doc:`about` page.
.. toctree::
   :maxdepth: 1

@@ -26,15 +39,12 @@ Intel nGraph library

   installation.rst
   testing-libngraph.rst
   framework-integration-guides.rst
   graph-basics.rst

.. toctree::
   :maxdepth: 1
   :caption: Algorithms
.. toctree::
   :maxdepth: 2

@@ -48,10 +58,17 @@ Intel nGraph library

   autodiff.rst
   glossary.rst
.. toctree::
   :maxdepth: 1
   :caption: Ops

   ops/convolution.rst
.. toctree::
   :maxdepth: 1
   :caption: Project Docs

   about.rst
   release-notes.rst
   code-contributor-README.rst
Indices and tables
==================

* :ref:`search`
* :ref:`genindex`
.. _installation:

###################################
Install the Intel® nGraph™ library
###################################

Build Environments
==================
@@ -15,33 +16,28 @@ packages and prerequisites:

   :widths: 25, 15, 25, 20, 25
   :escape: ~
   CentOS 7.4 64-bit, Clang 3.4, GCC 4.8 + CMake 2.8, supported, ``patch diffutils zlib1g-dev libtinfo-dev``
   Ubuntu 16.04 (LTS) 64-bit, Clang 3.9, CMake 3.5.1 + GNU Make, supported, ``build-essential cmake clang-3.9 git libtinfo-dev``
   Ubuntu 16.04 (LTS) 64-bit, Clang 4.0, CMake 3.5.1 + GNU Make, officially unsupported, ``build-essential cmake clang-4.0 git libtinfo-dev``
   Clear Linux\* OS for Intel Architecture, Clang 5.0.1, CMake 3.10.2, experimental, bundles ``machine-learning-basic dev-utils python3-basic python-basic-dev``

macOS support is limited; see the macOS development prerequisites section
at the end of this page for details.
Installation Steps
==================
To build |nGl| on one of the supported systems, the CMake procedure will
install ``ngraph_dist`` to the installing user's ``$HOME`` directory as
the default location. See the :file:`CMakeLists.txt` file for more
information about how to change or customize this location.

#. (Optional) Since most of a developer's interaction with a frontend
   framework will take place locally through Pythonic APIs to the C++
   library, you can set a reference placeholder for the documented source
   cloned from the repo. Create something like ``/opt/local`` and (with
   sudo permissions) give ownership of that local directory to your user.
   .. code-block:: console

@@ -83,15 +79,16 @@ or to set up user directories and permissions however you like.
#. (Optional, requires `Sphinx`_.) Run ``make html`` inside the
   ``doc/sphinx`` directory to build HTML docs for the nGraph library.

#. (COMING SOON -- Generate API docs. Optional, requires `doxygen`_.) TBD
.. macOS Development Prerequisites:
macOS Development Prerequisites
-------------------------------
.. note:: If you are developing |nGl| projects on macOS*\, please be
   aware that this platform is officially unsupported.
The repository includes two scripts (``maint/check-code-format.sh`` and
``maint/apply-code-format.sh``) that are used respectively to check adherence
to `libngraph` code formatting conventions, and to automatically reformat code
@@ -106,14 +103,6 @@ according to those conventions. These scripts require the command
   $ ln -s /usr/local/opt/llvm@3.9/bin/clang-format $HOME/bin/clang-format-3.9
   $ echo 'export PATH=$HOME/bin:$PATH' >> $HOME/.bash_profile
.. _doxygen: https://www.stack.nl/~dimitri/doxygen/
.. _Sphinx: http://www.sphinx-doc.org/en/stable/
.. _NervanaSystems: https://github.com/NervanaSystems/private-ngraph-cpp/blob/master/README.md
.. training:
Training
########