Commit 0bf21af9 authored by L.S. Cook, committed by Scott Cyphers

Amazon codeshare (#429)

* WIP on finding a good format for op docs in RST

* A few more scribbles

* fix up branch for Amazon code share

* add conf.py configuration details from aproctor's branch for doxy-breathe integration

* update section on how to build the documentation with breathe install details

* Remove empty file on training, update framework integration notes

* Add CentOS stub, fix spelling, core op definition, add to glossary.

* more documentation cleanup on README and installation and testing

* more cleanup of docs for TensorFlow

* Simplify Dot Autodiff (#412)

* Simplify Dot Autodiff

* remove commented code

* Remove TupleType, ValueType (#411)

* Remove TupleType, ValueType

* Fix compile error.

* Change convolution reference to work with f32 (#409)

* Drwebb/gpu backend dot op (#413)

* Drwebb/gpu backend dot op (#387)

* GPU Dot prod emitter switch statement

* cuBLAS dot kernel call

* Flesh out arg substitution into gpu dot kernel call

* Drwebb/gpu backend dot op (#392)

* Take in CodeWriter into gpu op emitters

* Introduce GPU function gen based on pass functions

* Additional gpu emitter stubs

* link cublas in to unit test and ngraph

* Use static code gen methods for GPU, add new GPU op stubs

* use pass manager to declare functions / cublas Updates

* Prune down gpu_external_function wip

* Switch back to GPU tensor views in GPU backend

* Pass in cublas handle to GPU external function

* cuMalloc memory in gpu tensor view

* Use cuda runtime malloc and free for tensor view management

* change GPU tensor view init, and use GPU tensor view for GPU call frame

* include headers as system dirs

* GPU tensor printing utility function

* cublasSetPointer to device mode / Fix copyright notification lowercasing

* Passing GPU dot product test using cuBLAS

Clean up

* Changes from review

* Add an overview.

* Intro for building graphs.

* Refactor docs so that Doxygen and Sphinx are integrated (Sphinx depends on Doxygen for docstring content)

Still need to resolve a lingering assumption that the build dir is contained in private-ngraph-cpp. It's proving to be surprisingly tricky.

* Added the TensorFlow XLA build information and example of how to run MNIST MLP with TF/nGraph

* Updated TF integration guide for clarity. Added files from cyphers-amazon branch. Add minor changes to sphinx-doxy to test apis

* Small revision of overview and add graphic from arXiv paper

* WIP more editing, picking up from where I left off last week

* Fix garbled sentence edit

* WIP Edit for readability and such

* Better font rendering on all architectures included with our custom theme

* Clean up current version of documentation. Add NeoSans font binaries to enable local font rendering of h1, h2, etc.

* Missed merge conflict

* Add something on functions, don't forward-reference parameters

* What we have so far into a PR for review

* Need file for cmake

* Missing header

* Remove duplicate file

* added breathe package to contrib/docker/Dockerfile.ngraph_cpp
Parent commit: b408a08e
# Copyright 2018 Nervana Systems Inc.
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
@@ -19,5 +19,6 @@ RUN pip install --upgrade pip
# installed sphinx with pip to get the updated version 1.6.5
# allows for make html build under the doc/source directory as an interim build process
RUN pip install sphinx
RUN pip install breathe
WORKDIR /home
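The commits above install `sphinx` and `breathe` to bridge Doxygen's output into Sphinx. As a rough sketch of how that wiring typically looks in a Sphinx `conf.py` (the project name and XML path here are assumptions for illustration, not the repository's actual values):

```python
# Sketch of a Sphinx conf.py fragment for Doxygen/Breathe integration.
# The project name "ngraph" and the XML directory are assumed examples.
extensions = ["breathe"]

# Breathe maps a project name to the directory containing Doxygen's XML
# output (produced when the Doxyfile sets GENERATE_XML = YES).
breathe_projects = {"ngraph": "../build/doc/doxygen/xml"}
breathe_default_project = "ngraph"
```

With this in place, reStructuredText pages can pull in API docs via directives such as `.. doxygenclass::`, which read the Doxygen XML through Breathe.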
@@ -10,25 +10,20 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
if ("${NGRAPH_BUILD_DOCS}" MATCHES "^ON$")
    add_custom_target( docs
        COMMENT "Build all of the documentation types selected during CMake configuration."
        )
    add_subdirectory( doxygen )
    add_subdirectory( sphinx )
else()
    add_custom_target( docs
        COMMAND echo
        COMMAND echo "The 'docs' target is disabled. To enable the building of documentation, re-run cmake with the option -DNGRAPH_BUILD_DOCS=ON."
        COMMAND echo
        COMMAND false
        VERBATIM
        )
endif()
@@ -11,38 +11,41 @@
# See the License for the specific language governing permissions and
# limitations under the License.
find_package(Doxygen REQUIRED)

if ("${NGRAPH_DOXYGEN_WARN_IF_UNDOCUMENTED}" MATCHES "^ON$")
    set(DOXYGEN_WARN_IF_UNDOCUMENTED YES)
else()
    set(DOXYGEN_WARN_IF_UNDOCUMENTED NO)
endif()

if ("${NGRAPH_DOXYGEN_QUIET}" MATCHES "^ON$")
    set(DOXYGEN_QUIET YES)
else()
    set(DOXYGEN_QUIET NO)
endif()

set(DOXYGEN_IN "${CMAKE_CURRENT_SOURCE_DIR}/Doxyfile.in")
set(DOXYGEN_OUT "${CMAKE_CURRENT_BINARY_DIR}/Doxyfile")
configure_file("${DOXYGEN_IN}" "${DOXYGEN_OUT}" @ONLY)

add_custom_target(doxygen-docs
    ALL
    COMMAND "${DOXYGEN_EXECUTABLE}" "${DOXYGEN_OUT}"
    WORKING_DIRECTORY "${CMAKE_CURRENT_BINARY_DIR}"
    COMMENT "Generating documentation with Doxygen"
    VERBATIM )

add_dependencies( docs doxygen-docs )

install(
    DIRECTORY "${CMAKE_CURRENT_BINARY_DIR}/html/"
    DESTINATION "${NGRAPH_INSTALL_DOC}/api-reference/html"
    OPTIONAL
    )

install(
    DIRECTORY "${CMAKE_CURRENT_BINARY_DIR}/latex/"
    DESTINATION "${NGRAPH_INSTALL_DOC}/api-reference/latex"
    OPTIONAL
    )
PROJECT_NAME = "Intel® nGraph™ library"
PROJECT_BRIEF = "Intel® nGraph™ library"
OUTPUT_DIRECTORY = @CMAKE_CURRENT_BINARY_DIR@
INPUT = @CMAKE_SOURCE_DIR@/src
RECURSIVE = YES
EXCLUDE_PATTERNS = json.hpp
USE_MATHJAX = YES
GENERATE_XML = YES
WARN_IF_UNDOCUMENTED = @DOXYGEN_WARN_IF_UNDOCUMENTED@
QUIET = @DOXYGEN_QUIET@
# Robust Makefile for Sphinx documentation
#
# You can set these variables from the command line.
@@ -17,4 +17,113 @@ help:
# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
	@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
doxy-code:
	$(Q)(cat ngraph.doxyfile ; echo "STRIP_FROM_PATH=${NGRAPH_BASE}" ) | doxygen - 2>&1 | tee doc.log

doxy: doxy-code

clean:
	@rm -rf $(BUILDDIR)/*
	@rm -rf html
	@rm -rf xml
	@rm -rf doxygen
	@rm -rf latex

htmldocs: doxy html

pickle:
	$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
	@echo
	@echo "Build finished; now you can process the pickle files."

json: prep
	$(SPHINXBUILD) -t $(DOC_TAG) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
	@rm -rf samples
	@rm -rf boards
	@echo
	@echo "Build finished; now you can process the JSON files."

applehelp:
	$(SPHINXBUILD) -b applehelp $(ALLSPHINXOPTS) $(BUILDDIR)/applehelp
	@echo
	@echo "Build finished. The help book is in $(BUILDDIR)/applehelp."
	@echo "N.B. You won't be able to view it unless you put it in" \
	      "~/Library/Documentation/Help or install it in your application" \
	      "bundle."

devhelp:
	$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
	@echo
	@echo "Build finished."
	@echo "To view the help file:"
	@echo "# mkdir -p $$HOME/.local/share/devhelp/ngraph"
	@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/ngraph"
	@echo "# devhelp"

epub:
	$(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
	@echo
	@echo "Build finished. The epub file is in $(BUILDDIR)/epub."

latex:
	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
	@echo
	@echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
	@echo "Run \`make' in that directory to run these through (pdf)latex" \
	      "(use \`make latexpdf' here to do that automatically)."

latexpdf:
	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
	@echo "Running LaTeX files through pdflatex..."
	$(MAKE) -C $(BUILDDIR)/latex all-pdf
	@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."

latexpdfja:
	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
	@echo "Running LaTeX files through platex and dvipdfmx..."
	$(MAKE) -C $(BUILDDIR)/latex all-pdf-ja
	@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."

text:
	$(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
	@echo
	@echo "Build finished. The text files are in $(BUILDDIR)/text."

man:
	$(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man
	@echo
	@echo "Build finished. The manual pages are in $(BUILDDIR)/man."

texinfo:
	$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
	@echo
	@echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo."
	@echo "Run \`make' in that directory to run these through makeinfo" \
	      "(use \`make info' here to do that automatically)."

info:
	$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
	@echo "Running Texinfo files through makeinfo..."
	make -C $(BUILDDIR)/texinfo info
	@echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo."

gettext:
	$(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale
	@echo
	@echo "Build finished. The message catalogs are in $(BUILDDIR)/locale."

changes:
	$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
	@echo
	@echo "The overview file is in $(BUILDDIR)/changes."

coverage:
	$(SPHINXBUILD) -b coverage $(ALLSPHINXOPTS) $(BUILDDIR)/coverage
	@echo "Testing of coverage in the sources finished, look at the " \
	      "results in $(BUILDDIR)/coverage/python.txt."
@@ -347,33 +347,30 @@ big, small {
clear: both;
}
/*!
* Font Awesome 4.2.0 by @davegandy - http://fontawesome.io - @fontawesome
* License - http://fontawesome.io/license (Font: SIL OFL 1.1, CSS: MIT License)
*/
/* NeoSansIntel FONT
 * -------------------------- */
@font-face {
  font-family: 'NeoSansIntel';
  /* The original src URL was truncated; format("truetype") matches the
     "ttf" format named in the source. */
  src: url("../fonts/NeoSansIntel") format("truetype");
  font-weight: normal;
  font-style: normal;
}
.fa, .rst-content .admonition-title, .rst-content h1 .headerlink, .rst-content h2 .headerlink, .rst-content h3 .headerlink, .rst-content h4 .headerlink, .rst-content h5 .headerlink, .rst-content h6 .headerlink, .rst-content dl dt .headerlink, .rst-content p.caption .headerlink, .rst-content tt.download span:first-child, .rst-content code.download span:first-child, .icon, .wy-menu-vertical li span.toctree-expand, .wy-menu-vertical li.on a span.toctree-expand, .wy-menu-vertical li.current > a span.toctree-expand {
display: inline-block;
font: normal normal normal 1.011em/1 NeoSansIntel;
font-size: inherit;
letter-spacing: -0.41em;
text-rendering: auto;
-webkit-font-smoothing: antialiased;
-moz-osx-font-smoothing: grayscale;
}
/* makes the font 25% larger relative to the icon container */
.fa-lg {
font-size: 1.25em;
line-height: 0.75em;
vertical-align: -11%;
}
.fa-2x {
@@ -534,2000 +531,8 @@ big, small {
color: #fff;
}
/* Font Awesome uses the Unicode Private Use Area (PUA) to ensure screen
readers do not read off random characters that represent icons */
/* Several hundred Font Awesome icon rules (.fa-glass through .fa-meanpath,
   each of the form `.fa-*:before { content: "…"; }`) followed here; the
   PUA codepoints in their content strings were lost in extraction. */
.fa, .rst-content .admonition-title, .rst-content h1 .headerlink, .rst-content h2 .headerlink, .rst-content h3 .headerlink, .rst-content h4 .headerlink, .rst-content h5 .headerlink, .rst-content h6 .headerlink, .rst-content dl dt .headerlink, .rst-content p.caption .headerlink, .rst-content tt.download span:first-child, .rst-content code.download span:first-child, .icon, .wy-menu-vertical li span.toctree-expand, .wy-menu-vertical li.on a span.toctree-expand, .wy-menu-vertical li.current > a span.toctree-expand, .wy-dropdown .caret, .wy-inline-validate.wy-inline-validate-success .wy-input-context, .wy-inline-validate.wy-inline-validate-danger .wy-input-context, .wy-inline-validate.wy-inline-validate-warning .wy-input-context, .wy-inline-validate.wy-inline-validate-info .wy-input-context {
font-family: inherit;
}
.fa:before, .rst-content .admonition-title:before, .rst-content h1 .headerlink:before, .rst-content h2 .headerlink:before, .rst-content h3 .headerlink:before, .rst-content h4 .headerlink:before, .rst-content h5 .headerlink:before, .rst-content h6 .headerlink:before, .rst-content dl dt .headerlink:before, .rst-content p.caption .headerlink:before, .rst-content tt.download span:first-child:before, .rst-content code.download span:first-child:before, .icon:before, .wy-menu-vertical li span.toctree-expand:before, .wy-menu-vertical li.on a span.toctree-expand:before, .wy-menu-vertical li.current > a span.toctree-expand:before, .wy-dropdown .caret:before, .wy-inline-validate.wy-inline-validate-success .wy-input-context:before, .wy-inline-validate.wy-inline-validate-danger .wy-input-context:before, .wy-inline-validate.wy-inline-validate-warning .wy-input-context:before, .wy-inline-validate.wy-inline-validate-info .wy-input-context:before {
font-family: "FontAwesome";
font-family: "NeoSansIntel";
display: inline-block;
font-style: normal;
font-weight: normal;
@@ -2712,7 +717,7 @@ button[disabled] {
background-color: #27AE60;
text-decoration: none;
font-weight: normal;
font-family: "Lato", "proxima-nova", "Helvetica Neue", Arial, sans-serif;
font-family: "NeoSansIntel", "Lato", "Helvetica Neue", Arial, sans-serif;
box-shadow: 0px 1px 2px -1px rgba(255, 255, 255, 0.5) inset, 0px -2px 0px 0px rgba(0, 0, 0, 0.1) inset;
outline-none: false;
vertical-align: middle;
@@ -3150,7 +1155,7 @@ input {
input[type="button"], input[type="reset"], input[type="submit"] {
-webkit-appearance: button;
cursor: pointer;
font-family: "Lato", "proxima-nova", "Helvetica Neue", Arial, sans-serif;
font-family: "NeoSansIntel", "Lato", "Helvetica Neue", Arial, sans-serif;
*overflow: visible;
}
input[type="text"], input[type="password"], input[type="email"], input[type="url"], input[type="date"], input[type="month"], input[type="time"], input[type="datetime"], input[type="datetime-local"], input[type="week"], input[type="number"], input[type="search"], input[type="tel"], input[type="color"] {
@@ -3159,7 +1164,7 @@ input[type="text"], input[type="password"], input[type="email"], input[type="url
display: inline-block;
border: 1px solid #ccc;
font-size: 80%;
font-family: "Lato", "proxima-nova", "Helvetica Neue", Arial, sans-serif;
font-family: "NeoSansIntel", "Lato", "Helvetica Neue", Arial, sans-serif;
box-shadow: inset 0 1px 3px #ddd;
border-radius: 0;
-webkit-transition: border 0.3s linear;
@@ -3228,7 +1233,7 @@ textarea {
overflow: auto;
vertical-align: top;
width: 100%;
font-family: "Lato", "proxima-nova", "Helvetica Neue", Arial, sans-serif;
font-family: "NeoSansIntel", "Lato", "Helvetica Neue", Arial, sans-serif;
}
select, textarea {
@@ -3632,7 +1637,7 @@ html {
}
body {
font-family: "Lato", "proxima-nova", "Helvetica Neue", Arial, sans-serif;
font-family: "NeoSansIntel", "Lato", "Helvetica Neue", Arial, sans-serif;
font-weight: normal;
color: #404040;
min-height: 100%;
@@ -3711,7 +1716,7 @@ a.wy-text-neutral:hover {
h1, h2, .rst-content .toctree-wrapper p.caption, h3, h4, h5, h6, legend {
margin-top: 0;
font-weight: 700;
font-family: "Lato", "Roboto Slab", "ff-tisa-web-pro", "Georgia", Arial, sans-serif;
font-family: "NeoSansIntel", "Roboto Slab", "ff-tisa-web-pro", "Georgia", Arial, sans-serif;
}
p {
@@ -3722,15 +1727,15 @@ p {
}
h1 {
font-size: 185%;
font-size: 153%;
}
h2, .rst-content .toctree-wrapper p.caption {
font-size: 140%;
font-size: 141%;
}
h3 {
font-size: 105%;
font-size: 127%;
}
h4 {
@@ -3834,7 +1839,7 @@ div[class^='highlight'] td.code {
}
code, p.caption, .caption-text {
font-family: sans-serif, monospace;
font-family: "Roboto Slab", sans-serif, monospace;
color: #A79992;
font-size: 0.95em;
line-height: 1.11em;
@@ -4201,7 +2206,7 @@ div[class^='highlight'] pre {
color: #fcfcfc;
background: #1f1d1d;
border-top: solid 10px #5f5f5f;
font-family: "Lato", "proxima-nova", "Helvetica Neue", Arial, sans-serif;
font-family: "NeoSansIntel", "Lato", "Helvetica Neue", Arial, sans-serif;
z-index: 400;
}
.rst-versions a {
@@ -4390,7 +2395,7 @@ div[class^='highlight'] pre {
.rst-content h1 .headerlink:after, .rst-content h2 .headerlink:after, .rst-content .toctree-wrapper p.caption .headerlink:after, .rst-content h3 .headerlink:after, .rst-content h4 .headerlink:after, .rst-content h5 .headerlink:after, .rst-content h6 .headerlink:after, .rst-content dl dt .headerlink:after, .rst-content p.caption .headerlink:after {
visibility: visible;
content: "";
font-family: FontAwesome;
font-family: NeoSansIntel;
display: inline-block;
}
.rst-content h1:hover .headerlink, .rst-content h2:hover .headerlink, .rst-content .toctree-wrapper p.caption:hover .headerlink, .rst-content h3:hover .headerlink, .rst-content h4:hover .headerlink, .rst-content h5:hover .headerlink, .rst-content h6:hover .headerlink, .rst-content dl dt:hover .headerlink, .rst-content p.caption:hover .headerlink {
@@ -4413,7 +2418,7 @@ div[class^='highlight'] pre {
}
.rst-content .sidebar .sidebar-title {
display: block;
font-family: "Lato", "Roboto Slab", "ff-tisa-web-pro", "Georgia", Arial, sans-serif;
font-family: "NeoSansIntel", "Roboto Slab", "ff-tisa-web-pro", "Georgia", Arial, sans-serif;
font-weight: bold;
background: #e1e4e5;
padding: 6px 12px;
@@ -4625,16 +2630,16 @@ span[id*='MathJax-Span'] {
src: local("Inconsolata Bold"), local("Inconsolata-Bold"), url(../fonts/Inconsolata-Bold.ttf) format("truetype");
}
@font-face {
font-family: "Lato";
font-family: "NeoSansIntel";
font-style: normal;
font-weight: 400;
src: local("Lato Regular"), local("Lato-Regular"), url(../fonts/Lato-Regular.ttf) format("truetype");
src: local("NeoSansIntel Regular"), local("NeoSansIntel-Regular"), url(../fonts/NeoSansIntel-Regular.ttf) format("truetype");
}
@font-face {
font-family: "Lato";
font-family: "NeoSansIntel";
font-style: normal;
font-weight: 700;
src: local("Lato Bold"), local("Lato-Bold"), url(../fonts/Lato-Bold.ttf) format("truetype");
src: local("NeoSansIntel Bold"), local("NeoSansIntel-Bold"), url(../fonts/NeoSansIntel-Bold.ttf) format("truetype");
}
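The two @font-face rules above register the regular (400) and bold (700) cuts under a single family name, so any rule that asks for "NeoSansIntel" resolves to whichever file's font-weight descriptor matches the text. A minimal usage sketch (the selectors here are illustrative, not part of the theme):

```css
/* Both selectors name the same family; the browser matches the text's
   font-weight against each @font-face descriptor to pick the file. */
body   { font-family: "NeoSansIntel", Arial, sans-serif; font-weight: 400; } /* Regular cut */
h1, h2 { font-family: "NeoSansIntel", Arial, sans-serif; font-weight: 700; } /* Bold cut */
```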
@font-face {
font-family: "Roboto Slab";
......
@@ -347,33 +347,30 @@ big, small {
clear: both;
}
/*!
* Font Awesome 4.2.0 by @davegandy - http://fontawesome.io - @fontawesome
* License - http://fontawesome.io/license (Font: SIL OFL 1.1, CSS: MIT License)
*/
/* FONT PATH
* -------------------------- */
/* -------------------------- */
/* NeoSansIntel FONT          */
/* -------------------------- */
@font-face {
font-family: 'FontAwesome';
src: url("../fonts/fontawesome-webfont.eot?v=4.2.0");
src: url("../fonts/fontawesome-webfont.eot?#iefix&v=4.2.0") format("embedded-opentype"), url("../fonts/fontawesome-webfont.woff?v=4.2.0") format("woff"), url("../fonts/fontawesome-webfont.ttf?v=4.2.0") format("truetype"), url("../fonts/fontawesome-webfont.svg?v=4.2.0#fontawesomeregular") format("svg");
font-family: 'NeoSansIntel';
src: url("../fonts/NeoSansIntel.ttf") format("truetype"), url("../fonts/NeoSansIntel.svg#NeoSansIntel") format("svg");
font-weight: normal;
font-style: normal;
}
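For comparison, a well-formed multi-source `src:` descriptor lists each URL with a `format()` hint so the browser downloads only the first format it supports. A sketch assuming the NeoSansIntel files sit beside the other faces in `../fonts/` (the file extensions are assumed; only the `../fonts/NeoSansIntel` stem appears in the diff):

```css
@font-face {
  font-family: 'NeoSansIntel';
  /* assumed file names: the diff gives only the ../fonts/NeoSansIntel stem */
  src: local("NeoSansIntel"),
       url("../fonts/NeoSansIntel.woff") format("woff"),
       url("../fonts/NeoSansIntel.ttf") format("truetype"),
       url("../fonts/NeoSansIntel.svg#NeoSansIntel") format("svg");
  font-weight: normal;
  font-style: normal;
}
```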
.fa, .rst-content .admonition-title, .rst-content h1 .headerlink, .rst-content h2 .headerlink, .rst-content h3 .headerlink, .rst-content h4 .headerlink, .rst-content h5 .headerlink, .rst-content h6 .headerlink, .rst-content dl dt .headerlink, .rst-content p.caption .headerlink, .rst-content tt.download span:first-child, .rst-content code.download span:first-child, .icon, .wy-menu-vertical li span.toctree-expand, .wy-menu-vertical li.on a span.toctree-expand, .wy-menu-vertical li.current > a span.toctree-expand {
display: inline-block;
font: normal normal normal 14px/1 FontAwesome;
font: normal normal normal 1.011em/1 NeoSansIntel;
font-size: inherit;
letter-spacing: -0.41em;
text-rendering: auto;
-webkit-font-smoothing: antialiased;
-moz-osx-font-smoothing: grayscale;
}
/* makes the font 25% larger relative to the icon container */
.fa-lg {
font-size: 1.33333em;
font-size: 1.25em;
line-height: 0.75em;
vertical-align: -15%;
vertical-align: -11%;
}
.fa-2x {
@@ -534,2000 +531,8 @@ big, small {
color: #fff;
}
/* Font Awesome uses the Unicode Private Use Area (PUA) to ensure screen
readers do not read off random characters that represent icons */
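As the note above explains, the `content` values in the icon rules below are Private Use Area glyphs; they do not survive plain-text rendering, which is why they appear here as empty strings. In the shipped Font Awesome 4.2 stylesheet each rule carries a CSS escape of the codepoint, e.g.:

```css
/* CSS escapes encode the PUA codepoints in plain ASCII */
.fa-glass:before {
  content: "\f000";
}
.fa-music:before {
  content: "\f001";
}
```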
.fa-glass:before {
content: "";
}
.fa-music:before {
content: "";
}
.fa-search:before, .icon-search:before {
content: "";
}
.fa-envelope-o:before {
content: "";
}
.fa-heart:before {
content: "";
}
.fa-star:before {
content: "";
}
.fa-star-o:before {
content: "";
}
.fa-user:before {
content: "";
}
.fa-film:before {
content: "";
}
.fa-th-large:before {
content: "";
}
.fa-th:before {
content: "";
}
.fa-th-list:before {
content: "";
}
.fa-check:before {
content: "";
}
.fa-remove:before,
.fa-close:before,
.fa-times:before {
content: "";
}
.fa-search-plus:before {
content: "";
}
.fa-search-minus:before {
content: "";
}
.fa-power-off:before {
content: "";
}
.fa-signal:before {
content: "";
}
.fa-gear:before,
.fa-cog:before {
content: "";
}
.fa-trash-o:before {
content: "";
}
.fa-home:before, .icon-home:before {
content: "";
}
.fa-file-o:before {
content: "";
}
.fa-clock-o:before {
content: "";
}
.fa-road:before {
content: "";
}
.fa-download:before, .rst-content tt.download span:first-child:before, .rst-content code.download span:first-child:before {
content: "";
}
.fa-arrow-circle-o-down:before {
content: "";
}
.fa-arrow-circle-o-up:before {
content: "";
}
.fa-inbox:before {
content: "";
}
.fa-play-circle-o:before {
content: "";
}
.fa-rotate-right:before,
.fa-repeat:before {
content: "";
}
.fa-refresh:before {
content: "";
}
.fa-list-alt:before {
content: "";
}
.fa-lock:before {
content: "";
}
.fa-flag:before {
content: "";
}
.fa-headphones:before {
content: "";
}
.fa-volume-off:before {
content: "";
}
.fa-volume-down:before {
content: "";
}
.fa-volume-up:before {
content: "";
}
.fa-qrcode:before {
content: "";
}
.fa-barcode:before {
content: "";
}
.fa-tag:before {
content: "";
}
.fa-tags:before {
content: "";
}
.fa-book:before, .icon-book:before {
content: "";
}
.fa-bookmark:before {
content: "";
}
.fa-print:before {
content: "";
}
.fa-camera:before {
content: "";
}
.fa-font:before {
content: "";
}
.fa-bold:before {
content: "";
}
.fa-italic:before {
content: "";
}
.fa-text-height:before {
content: "";
}
.fa-text-width:before {
content: "";
}
.fa-align-left:before {
content: "";
}
.fa-align-center:before {
content: "";
}
.fa-align-right:before {
content: "";
}
.fa-align-justify:before {
content: "";
}
.fa-list:before {
content: "";
}
.fa-dedent:before,
.fa-outdent:before {
content: "";
}
.fa-indent:before {
content: "";
}
.fa-video-camera:before {
content: "";
}
.fa-photo:before,
.fa-image:before,
.fa-picture-o:before {
content: "";
}
.fa-pencil:before {
content: "";
}
.fa-map-marker:before {
content: "";
}
.fa-adjust:before {
content: "";
}
.fa-tint:before {
content: "";
}
.fa-edit:before,
.fa-pencil-square-o:before {
content: "";
}
.fa-share-square-o:before {
content: "";
}
.fa-check-square-o:before {
content: "";
}
.fa-arrows:before {
content: "";
}
.fa-step-backward:before {
content: "";
}
.fa-fast-backward:before {
content: "";
}
.fa-backward:before {
content: "";
}
.fa-play:before {
content: "";
}
.fa-pause:before {
content: "";
}
.fa-stop:before {
content: "";
}
.fa-forward:before {
content: "";
}
.fa-fast-forward:before {
content: "";
}
.fa-step-forward:before {
content: "";
}
.fa-eject:before {
content: "";
}
.fa-chevron-left:before {
content: "";
}
.fa-chevron-right:before {
content: "";
}
.fa-plus-circle:before {
content: "";
}
.fa-minus-circle:before {
content: "";
}
.fa-times-circle:before, .wy-inline-validate.wy-inline-validate-danger .wy-input-context:before {
content: "";
}
.fa-check-circle:before, .wy-inline-validate.wy-inline-validate-success .wy-input-context:before {
content: "";
}
.fa-question-circle:before {
content: "";
}
.fa-info-circle:before {
content: "";
}
.fa-crosshairs:before {
content: "";
}
.fa-times-circle-o:before {
content: "";
}
.fa-check-circle-o:before {
content: "";
}
.fa-ban:before {
content: "";
}
.fa-arrow-left:before {
content: "";
}
.fa-arrow-right:before {
content: "";
}
.fa-arrow-up:before {
content: "";
}
.fa-arrow-down:before {
content: "";
}
.fa-mail-forward:before,
.fa-share:before {
content: "";
}
.fa-expand:before {
content: "";
}
.fa-compress:before {
content: "";
}
.fa-plus:before {
content: "";
}
.fa-minus:before {
content: "";
}
.fa-asterisk:before {
content: "";
}
.fa-exclamation-circle:before, .wy-inline-validate.wy-inline-validate-warning .wy-input-context:before, .wy-inline-validate.wy-inline-validate-info .wy-input-context:before, .rst-content .admonition-title:before {
content: "";
}
.fa-gift:before {
content: "";
}
.fa-leaf:before {
content: "";
}
.fa-fire:before, .icon-fire:before {
content: "";
}
.fa-eye:before {
content: "";
}
.fa-eye-slash:before {
content: "";
}
.fa-warning:before,
.fa-exclamation-triangle:before {
content: "";
}
.fa-plane:before {
content: "";
}
.fa-calendar:before {
content: "";
}
.fa-random:before {
content: "";
}
.fa-comment:before {
content: "";
}
.fa-magnet:before {
content: "";
}
.fa-chevron-up:before {
content: "";
}
.fa-chevron-down:before {
content: "";
}
.fa-retweet:before {
content: "";
}
.fa-shopping-cart:before {
content: "";
}
.fa-folder:before {
content: "";
}
.fa-folder-open:before {
content: "";
}
.fa-arrows-v:before {
content: "";
}
.fa-arrows-h:before {
content: "";
}
.fa-bar-chart-o:before,
.fa-bar-chart:before {
content: "";
}
.fa-twitter-square:before {
content: "";
}
.fa-facebook-square:before {
content: "";
}
.fa-camera-retro:before {
content: "";
}
.fa-key:before {
content: "";
}
.fa-gears:before,
.fa-cogs:before {
content: "";
}
.fa-comments:before {
content: "";
}
.fa-thumbs-o-up:before {
content: "";
}
.fa-thumbs-o-down:before {
content: "";
}
.fa-star-half:before {
content: "";
}
.fa-heart-o:before {
content: "";
}
.fa-sign-out:before {
content: "";
}
.fa-linkedin-square:before {
content: "";
}
.fa-thumb-tack:before {
content: "";
}
.fa-external-link:before {
content: "";
}
.fa-sign-in:before {
content: "";
}
.fa-trophy:before {
content: "";
}
.fa-github-square:before {
content: "";
}
.fa-upload:before {
content: "";
}
.fa-lemon-o:before {
content: "";
}
.fa-phone:before {
content: "";
}
.fa-square-o:before {
content: "";
}
.fa-bookmark-o:before {
content: "";
}
.fa-phone-square:before {
content: "";
}
.fa-twitter:before {
content: "";
}
.fa-facebook:before {
content: "";
}
.fa-github:before, .icon-github:before {
content: "";
}
.fa-unlock:before {
content: "";
}
.fa-credit-card:before {
content: "";
}
.fa-rss:before {
content: "";
}
.fa-hdd-o:before {
content: "";
}
.fa-bullhorn:before {
content: "";
}
.fa-bell:before {
content: "";
}
.fa-certificate:before {
content: "";
}
.fa-hand-o-right:before {
content: "";
}
.fa-hand-o-left:before {
content: "";
}
.fa-hand-o-up:before {
content: "";
}
.fa-hand-o-down:before {
content: "";
}
.fa-arrow-circle-left:before, .icon-circle-arrow-left:before {
content: "";
}
.fa-arrow-circle-right:before, .icon-circle-arrow-right:before {
content: "";
}
.fa-arrow-circle-up:before {
content: "";
}
.fa-arrow-circle-down:before {
content: "";
}
.fa-globe:before {
content: "";
}
.fa-wrench:before {
content: "";
}
.fa-tasks:before {
content: "";
}
.fa-filter:before {
content: "";
}
.fa-briefcase:before {
content: "";
}
.fa-arrows-alt:before {
content: "";
}
.fa-group:before,
.fa-users:before {
content: "";
}
.fa-chain:before,
.fa-link:before,
.icon-link:before {
content: "";
}
.fa-cloud:before {
content: "";
}
.fa-flask:before {
content: "";
}
.fa-cut:before,
.fa-scissors:before {
content: "";
}
.fa-copy:before,
.fa-files-o:before {
content: "";
}
.fa-paperclip:before {
content: "";
}
.fa-save:before,
.fa-floppy-o:before {
content: "";
}
.fa-square:before {
content: "";
}
.fa-navicon:before,
.fa-reorder:before,
.fa-bars:before {
content: "";
}
.fa-list-ul:before {
content: "";
}
.fa-list-ol:before {
content: "";
}
.fa-strikethrough:before {
content: "";
}
.fa-underline:before {
content: "";
}
.fa-table:before {
content: "";
}
.fa-magic:before {
content: "";
}
.fa-truck:before {
content: "";
}
.fa-pinterest:before {
content: "";
}
.fa-pinterest-square:before {
content: "";
}
.fa-google-plus-square:before {
content: "";
}
.fa-google-plus:before {
content: "";
}
.fa-money:before {
content: "";
}
.fa-caret-down:before, .wy-dropdown .caret:before, .icon-caret-down:before {
content: "";
}
.fa-caret-up:before {
content: "";
}
.fa-caret-left:before {
content: "";
}
.fa-caret-right:before {
content: "";
}
.fa-columns:before {
content: "";
}
.fa-unsorted:before,
.fa-sort:before {
content: "";
}
.fa-sort-down:before,
.fa-sort-desc:before {
content: "";
}
.fa-sort-up:before,
.fa-sort-asc:before {
content: "";
}
.fa-envelope:before {
content: "";
}
.fa-linkedin:before {
content: "";
}
.fa-rotate-left:before,
.fa-undo:before {
content: "";
}
.fa-legal:before,
.fa-gavel:before {
content: "";
}
.fa-dashboard:before,
.fa-tachometer:before {
content: "";
}
.fa-comment-o:before {
content: "";
}
.fa-comments-o:before {
content: "";
}
.fa-flash:before,
.fa-bolt:before {
content: "";
}
.fa-sitemap:before {
content: "";
}
.fa-umbrella:before {
content: "";
}
.fa-paste:before,
.fa-clipboard:before {
content: "";
}
.fa-lightbulb-o:before {
content: "";
}
.fa-exchange:before {
content: "";
}
.fa-cloud-download:before {
content: "";
}
.fa-cloud-upload:before {
content: "";
}
.fa-user-md:before {
content: "";
}
.fa-stethoscope:before {
content: "";
}
.fa-suitcase:before {
content: "";
}
.fa-bell-o:before {
content: "";
}
.fa-coffee:before {
content: "";
}
.fa-cutlery:before {
content: "";
}
.fa-file-text-o:before {
content: "";
}
.fa-building-o:before {
content: "";
}
.fa-hospital-o:before {
content: "";
}
.fa-ambulance:before {
content: "";
}
.fa-medkit:before {
content: "";
}
.fa-fighter-jet:before {
content: "";
}
.fa-beer:before {
content: "";
}
.fa-h-square:before {
content: "";
}
.fa-plus-square:before {
content: "";
}
.fa-angle-double-left:before {
content: "";
}
.fa-angle-double-right:before {
content: "";
}
.fa-angle-double-up:before {
content: "";
}
.fa-angle-double-down:before {
content: "";
}
.fa-angle-left:before {
content: "";
}
.fa-angle-right:before {
content: "";
}
.fa-angle-up:before {
content: "";
}
.fa-angle-down:before {
content: "";
}
.fa-desktop:before {
content: "";
}
.fa-laptop:before {
content: "";
}
.fa-tablet:before {
content: "";
}
.fa-mobile-phone:before,
.fa-mobile:before {
content: "";
}
.fa-circle-o:before {
content: "";
}
.fa-quote-left:before {
content: "";
}
.fa-quote-right:before {
content: "";
}
.fa-spinner:before {
content: "";
}
.fa-circle:before {
content: "";
}
.fa-mail-reply:before,
.fa-reply:before {
content: "";
}
.fa-github-alt:before {
content: "";
}
.fa-folder-o:before {
content: "";
}
.fa-folder-open-o:before {
content: "";
}
.fa-smile-o:before {
content: "";
}
.fa-frown-o:before {
content: "";
}
.fa-meh-o:before {
content: "";
}
.fa-gamepad:before {
content: "";
}
.fa-keyboard-o:before {
content: "";
}
.fa-flag-o:before {
content: "";
}
.fa-flag-checkered:before {
content: "";
}
.fa-terminal:before {
content: "";
}
.fa-code:before {
content: "";
}
.fa-mail-reply-all:before,
.fa-reply-all:before {
content: "";
}
.fa-star-half-empty:before,
.fa-star-half-full:before,
.fa-star-half-o:before {
content: "";
}
.fa-location-arrow:before {
content: "";
}
.fa-crop:before {
content: "";
}
.fa-code-fork:before {
content: "";
}
.fa-unlink:before,
.fa-chain-broken:before {
content: "";
}
.fa-question:before {
content: "";
}
.fa-info:before {
content: "";
}
.fa-exclamation:before {
content: "";
}
.fa-superscript:before {
content: "";
}
.fa-subscript:before {
content: "";
}
.fa-eraser:before {
content: "";
}
.fa-puzzle-piece:before {
content: "";
}
.fa-microphone:before {
content: "";
}
.fa-microphone-slash:before {
content: "";
}
.fa-shield:before {
content: "";
}
.fa-calendar-o:before {
content: "";
}
.fa-fire-extinguisher:before {
content: "";
}
.fa-rocket:before {
content: "";
}
.fa-maxcdn:before {
content: "";
}
.fa-chevron-circle-left:before {
content: "";
}
.fa-chevron-circle-right:before {
content: "";
}
.fa-chevron-circle-up:before {
content: "";
}
.fa-chevron-circle-down:before {
content: "";
}
.fa-html5:before {
content: "";
}
.fa-css3:before {
content: "";
}
.fa-anchor:before {
content: "";
}
.fa-unlock-alt:before {
content: "";
}
.fa-bullseye:before {
content: "";
}
.fa-ellipsis-h:before {
content: "";
}
.fa-ellipsis-v:before {
content: "";
}
.fa-rss-square:before {
content: "";
}
.fa-play-circle:before {
content: "";
}
.fa-ticket:before {
content: "";
}
.fa-minus-square:before {
content: "";
}
.fa-minus-square-o:before, .wy-menu-vertical li.on a span.toctree-expand:before, .wy-menu-vertical li.current > a span.toctree-expand:before {
content: "";
}
.fa-level-up:before {
content: "";
}
.fa-level-down:before {
content: "";
}
.fa-check-square:before {
content: "";
}
.fa-pencil-square:before {
content: "";
}
.fa-external-link-square:before {
content: "";
}
.fa-share-square:before {
content: "";
}
.fa-compass:before {
content: "";
}
.fa-toggle-down:before,
.fa-caret-square-o-down:before {
content: "";
}
.fa-toggle-up:before,
.fa-caret-square-o-up:before {
content: "";
}
.fa-toggle-right:before,
.fa-caret-square-o-right:before {
content: "";
}
.fa-euro:before,
.fa-eur:before {
content: "";
}
.fa-gbp:before {
content: "";
}
.fa-dollar:before,
.fa-usd:before {
content: "";
}
.fa-rupee:before,
.fa-inr:before {
content: "";
}
.fa-cny:before,
.fa-rmb:before,
.fa-yen:before,
.fa-jpy:before {
content: "";
}
.fa-ruble:before,
.fa-rouble:before,
.fa-rub:before {
content: "";
}
.fa-won:before,
.fa-krw:before {
content: "";
}
.fa-bitcoin:before,
.fa-btc:before {
content: "";
}
.fa-file:before {
content: "";
}
.fa-file-text:before {
content: "";
}
.fa-sort-alpha-asc:before {
content: "";
}
.fa-sort-alpha-desc:before {
content: "";
}
.fa-sort-amount-asc:before {
content: "";
}
.fa-sort-amount-desc:before {
content: "";
}
.fa-sort-numeric-asc:before {
content: "";
}
.fa-sort-numeric-desc:before {
content: "";
}
.fa-thumbs-up:before {
content: "";
}
.fa-thumbs-down:before {
content: "";
}
.fa-youtube-square:before {
content: "";
}
.fa-youtube:before {
content: "";
}
.fa-xing:before {
content: "";
}
.fa-xing-square:before {
content: "";
}
.fa-youtube-play:before {
content: "";
}
.fa-dropbox:before {
content: "";
}
.fa-stack-overflow:before {
content: "";
}
.fa-instagram:before {
content: "";
}
.fa-flickr:before {
content: "";
}
.fa-adn:before {
content: "";
}
.fa-bitbucket:before, .icon-bitbucket:before {
content: "";
}
.fa-bitbucket-square:before {
content: "";
}
.fa-tumblr:before {
content: "";
}
.fa-tumblr-square:before {
content: "";
}
.fa-long-arrow-down:before {
content: "";
}
.fa-long-arrow-up:before {
content: "";
}
.fa-long-arrow-left:before {
content: "";
}
.fa-long-arrow-right:before {
content: "";
}
.fa-apple:before {
content: "";
}
.fa-windows:before {
content: "";
}
.fa-android:before {
content: "";
}
.fa-linux:before {
content: "";
}
.fa-dribbble:before {
content: "";
}
.fa-skype:before {
content: "";
}
.fa-foursquare:before {
content: "";
}
.fa-trello:before {
content: "";
}
.fa-female:before {
content: "";
}
.fa-male:before {
content: "";
}
.fa-gittip:before {
content: "";
}
.fa-sun-o:before {
content: "";
}
.fa-moon-o:before {
content: "";
}
.fa-archive:before {
content: "";
}
.fa-bug:before {
content: "";
}
.fa-vk:before {
content: "";
}
.fa-weibo:before {
content: "";
}
.fa-renren:before {
content: "";
}
.fa-pagelines:before {
content: "";
}
.fa-stack-exchange:before {
content: "";
}
.fa-arrow-circle-o-right:before {
content: "";
}
.fa-arrow-circle-o-left:before {
content: "";
}
.fa-toggle-left:before,
.fa-caret-square-o-left:before {
content: "";
}
.fa-dot-circle-o:before {
content: "";
}
.fa-wheelchair:before {
content: "";
}
.fa-vimeo-square:before {
content: "";
}
.fa-turkish-lira:before,
.fa-try:before {
content: "";
}
.fa-plus-square-o:before, .wy-menu-vertical li span.toctree-expand:before {
content: "";
}
.fa-space-shuttle:before {
content: "";
}
.fa-slack:before {
content: "";
}
.fa-envelope-square:before {
content: "";
}
.fa-wordpress:before {
content: "";
}
.fa-openid:before {
content: "";
}
.fa-institution:before,
.fa-bank:before,
.fa-university:before {
content: "";
}
.fa-mortar-board:before,
.fa-graduation-cap:before {
content: "";
}
.fa-yahoo:before {
content: "";
}
.fa-google:before {
content: "";
}
.fa-reddit:before {
content: "";
}
.fa-reddit-square:before {
content: "";
}
.fa-stumbleupon-circle:before {
content: "";
}
.fa-stumbleupon:before {
content: "";
}
.fa-delicious:before {
content: "";
}
.fa-digg:before {
content: "";
}
.fa-pied-piper:before {
content: "";
}
.fa-pied-piper-alt:before {
content: "";
}
.fa-drupal:before {
content: "";
}
.fa-joomla:before {
content: "";
}
.fa-language:before {
content: "";
}
.fa-fax:before {
content: "";
}
.fa-building:before {
content: "";
}
.fa-child:before {
content: "";
}
.fa-paw:before {
content: "";
}
......@@ -3159,7 +1164,7 @@ input[type="text"], input[type="password"], input[type="email"], input[type="url
display: inline-block;
border: 1px solid #ccc;
font-size: 80%;
font-family: "Lato", "proxima-nova", "Helvetica Neue", Arial, sans-serif;
font-family: "NeoSansIntel", "Lato", "Helvetica Neue", Arial, sans-serif;
box-shadow: inset 0 1px 3px #ddd;
border-radius: 0;
-webkit-transition: border 0.3s linear;
......@@ -3228,7 +1233,7 @@ textarea {
overflow: auto;
vertical-align: top;
width: 100%;
font-family: "Lato", "proxima-nova", "Helvetica Neue", Arial, sans-serif;
font-family: "NeoSansIntel", "Lato", "Helvetica Neue", Arial, sans-serif;
}
select, textarea {
......@@ -3632,7 +1637,7 @@ html {
}
body {
font-family: "Lato", "proxima-nova", "Helvetica Neue", Arial, sans-serif;
font-family: "NeoSansIntel", "Lato", "Helvetica Neue", Arial, sans-serif;
font-weight: normal;
color: #404040;
min-height: 100%;
......@@ -3711,7 +1716,7 @@ a.wy-text-neutral:hover {
h1, h2, .rst-content .toctree-wrapper p.caption, h3, h4, h5, h6, legend {
margin-top: 0;
font-weight: 700;
font-family: "Lato", "Roboto Slab", "ff-tisa-web-pro", "Georgia", Arial, sans-serif;
font-family: "NeoSansIntel", "Roboto Slab", "ff-tisa-web-pro", "Georgia", Arial, sans-serif;
}
p {
......@@ -3722,15 +1727,15 @@ p {
}
h1 {
font-size: 185%;
font-size: 153%;
}
h2, .rst-content .toctree-wrapper p.caption {
font-size: 140%;
font-size: 141%;
}
h3 {
font-size: 105%;
font-size: 127%;
}
h4 {
......@@ -3813,11 +1818,11 @@ code.code-large, .rst-content tt.code-large {
pre, .codeblock, pre.literal-block, .rst-content .literal-block, .rst-content pre.literal-block, div[class^='highlight'] {
border: 0.05em solid #dfeef1;
padding: 0.0em;
padding: 0.01em;
caption-side: bottom;
overflow-x: auto;
font-size: 1.001em;
line-height: 1.01337em;
font-size: 0.97em;
line-height: 1.013em;
background: #efeeed;
margin: 1px 0 20px 0;
}
......@@ -3834,7 +1839,7 @@ div[class^='highlight'] td.code {
}
code, p.caption, caption-text {
font-family: sans-serif, monospace;
font-family: RobotoSlab, sans-serif, monospace;
color: #A79992;
font-size: 0.95em;
line-height: 1.11em;
......@@ -4201,7 +2206,7 @@ div[class^='highlight'] pre {
color: #fcfcfc;
background: #1f1d1d;
border-top: solid 10px #5f5f5f;
font-family: "Lato", "proxima-nova", "Helvetica Neue", Arial, sans-serif;
font-family: "NeoSansIntel", "Lato", "Helvetica Neue", Arial, sans-serif;
z-index: 400;
}
.rst-versions a {
......@@ -4390,7 +2395,7 @@ div[class^='highlight'] pre {
.rst-content h1 .headerlink:after, .rst-content h2 .headerlink:after, .rst-content .toctree-wrapper p.caption .headerlink:after, .rst-content h3 .headerlink:after, .rst-content h4 .headerlink:after, .rst-content h5 .headerlink:after, .rst-content h6 .headerlink:after, .rst-content dl dt .headerlink:after, .rst-content p.caption .headerlink:after {
visibility: visible;
content: "";
font-family: FontAwesome;
font-family: NeoSansIntel;
display: inline-block;
}
.rst-content h1:hover .headerlink, .rst-content h2:hover .headerlink, .rst-content .toctree-wrapper p.caption:hover .headerlink, .rst-content h3:hover .headerlink, .rst-content h4:hover .headerlink, .rst-content h5:hover .headerlink, .rst-content h6:hover .headerlink, .rst-content dl dt:hover .headerlink, .rst-content p.caption:hover .headerlink {
......@@ -4413,7 +2418,7 @@ div[class^='highlight'] pre {
}
.rst-content .sidebar .sidebar-title {
display: block;
font-family: "Lato", "Roboto Slab", "ff-tisa-web-pro", "Georgia", Arial, sans-serif;
font-family: "NeoSansIntel", "Roboto Slab", "ff-tisa-web-pro", "Georgia", Arial, sans-serif;
font-weight: bold;
background: #e1e4e5;
padding: 6px 12px;
......@@ -4625,16 +2630,16 @@ span[id*='MathJax-Span'] {
src: local("Inconsolata Bold"), local("Inconsolata-Bold"), url(../fonts/Inconsolata-Bold.ttf) format("truetype");
}
@font-face {
font-family: "Lato";
font-family: "NeoSansIntel";
font-style: normal;
font-weight: 400;
src: local("Lato Regular"), local("Lato-Regular"), url(../fonts/Lato-Regular.ttf) format("truetype");
src: local("NeoSansIntel Regular"), local("NeoSansIntel-Regular"), url(../fonts/NeoSansIntel-Regular.ttf) format("truetype");
}
@font-face {
font-family: "Lato";
font-family: "NeoSansIntel";
font-style: normal;
font-weight: 700;
src: local("Lato Bold"), local("Lato-Bold"), url(../fonts/Lato-Bold.ttf) format("truetype");
src: local("NeoSansIntel Bold"), local("NeoSansIntel-Bold"), url(../fonts/NeoSansIntel-Bold.ttf) format("truetype");
}
@font-face {
font-family: "Roboto Slab";
......
# Maintainer: Gavin Lloyd
# https://github.intel.com/gavinllo/ttf-neo-sans-intel
pkgname=ttf-neo-sans-intel
pkgver=1.00
pkgrel=3
pkgdesc='Versatile, futuristic typeface for Intel-branded material'
arch=('any')
depends=('fontconfig' 'xorg-font-utils')
source=('NeoSansIntel-Italic.ttf'
'NeoSansIntel-LightItalic.ttf'
'NeoSansIntel-Light.ttf'
'NeoSansIntel-MediumItalic.ttf'
'NeoSansIntel-Medium.ttf'
'NeoSansIntel.ttf')
sha256sums=('be2f036d58320bd0fab7cca7327b806840ddfedfdc4e44a520a85bd53a1ed7b3'
'ce45deb38ad2749ba25cbb76084955e34a86f627043f1f0f8f8073720115545c'
'd522c9c3905532680f8bb8068fa340200d2e5e45376ea89d97bcc8edbce8eff8'
'61b3ce0ed96b6f343c8ac0a94471ed504708782bee7d9df88fadc564640ffbba'
'6cd878034142c390eeb98d2a17ee1b949c2f8ded0a8684d3b17e0fe4203a8fd8'
'303bc44874e23a563775e5d463a6ec3dd7bdfc7948fa95d65a45fa965bf5ee28')
package() {
install -d $pkgdir/usr/share/fonts/TTF/
install -m644 *.ttf $pkgdir/usr/share/fonts/TTF/
}
.. about:
About
=====
Welcome to the Intel nGraph project, an open source C++ library for developers
of :abbr:`Deep Learning (DL)` systems and frameworks. Here you will find
a suite of components, documentation, and APIs that can be used with
:abbr:`Deep Neural Network (DNN)` models defined in a variety of frameworks.
The nGraph library translates a framework’s representation of computations into
a neutral :abbr:`Intermediate Representation (IR)` designed to promote
computational efficiency on target hardware; it works on Intel and non-Intel
platforms.
.. figure:: graphics/fig.jpeg
The *nGraph core* uses a strongly-typed and platform-neutral stateless graph
representation for computations. Each node, or *op*, in the graph corresponds
to one step in a computation, where each step produces zero or more tensor
outputs from zero or more tensor inputs.
There is a *framework bridge* for each supported framework which acts as
an intermediary between the *nGraph core* and the framework. A *transformer*
plays a similar role between the nGraph core and the various execution
platforms.
Transformers compile the graph using a combination of generic and
platform-specific graph transformations. The result is a function that
can be executed from the framework bridge. Transformers also allocate
and deallocate, as well as read and write, tensors under direction of the
bridge.
For this early |release| release, we provide framework integration guides
to
* :ref:`mxnet_intg`,
* :ref:`tensorflow_intg`, and
* Try neon™ `frontend`_ framework for training GPU-performant models.
Integration guides for each of these other frameworks are tentatively
forthcoming and/or open to the community for contributions and sample
documentation:
* `Chainer`_,
* `PyTorch`_,
* `Caffe2`_, and
* Frameworks not yet written (for algorithms that do not yet exist).
.. _Caffe2: https://github.com/caffe2/
.. _PyTorch: http://pytorch.org/
.. _Chainer: https://chainer.org/
.. _frontend: http://neon.nervanasys.com/index.html/
......@@ -3,4 +3,8 @@
API
###
.. TODO don't add Python APIs that will break the build.
\ No newline at end of file
.. TODO don't add Python APIs that will break the build.
Sections
********
......@@ -22,10 +22,11 @@ This script does *not* modify the source code.
Core Ops
--------
We have a set of core ops. Other ops may be added to the core when they
have sufficient documentation and examples of those ops in practice, or
potentially-practical use cases.
Our design philosophy is that the graph is not a script for running kernels, but, rather,
that the graph should describe the computation in terms of ops that are building blocks,
and compilation should match these ops to appropriate kernels for the backend(s) in use.
Thus, we expect that adding core ops should be infrequent. Instead, functionality should
be added by adding functions that build sub-graphs from existing core ops.
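The composition idea above can be sketched in a few lines. This is an illustrative model only, not the actual nGraph API: the `Node` class and the op constructors (`add`, `exp`, `negative`, `divide`, `constant`) are hypothetical stand-ins for core ops, and `sigmoid` shows how new functionality can be a sub-graph builder rather than a new core op.

```python
# Hypothetical sketch: composing "core" ops into a sub-graph instead of
# adding a new core op. Names (Node, add, exp, ...) are illustrative,
# not the real nGraph API.

class Node:
    """A graph node: an op name plus its already-constructed input nodes."""
    def __init__(self, op, *inputs):
        self.op = op
        self.inputs = inputs

    def __repr__(self):
        if not self.inputs:
            return self.op
        return f"{self.op}({', '.join(map(repr, self.inputs))})"

# A handful of core ops as node constructors.
def constant(value):  return Node(f"Constant({value})")
def add(a, b):        return Node("Add", a, b)
def negative(a):      return Node("Negative", a)
def exp(a):           return Node("Exp", a)
def divide(a, b):     return Node("Divide", a, b)

# "Sigmoid" need not be a core op: build it from existing core ops.
def sigmoid(x):
    one = constant(1)
    return divide(one, add(one, exp(negative(x))))

x = Node("Parameter")
print(sigmoid(x))
# Divide(Constant(1), Add(Constant(1), Exp(Negative(Parameter))))
```

Compilation can then match each of the constituent core ops to backend kernels, which is the point of the design philosophy: no kernel needs to know about "sigmoid" at all.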
Coding style
......
......@@ -34,7 +34,8 @@ needs_sphinx = '1.6.5'
extensions = ['sphinx.ext.mathjax',
'sphinx.ext.ifconfig',
'sphinx.ext.viewcode',
'sphinx.ext.autodoc'
'sphinx.ext.autodoc',
'breathe'
]
# Add any paths that contain templates here, relative to this directory.
......@@ -62,9 +63,9 @@ author = 'Intel Corporation'
# built documents.
#
# The short X.Y version.
version = '0.5.1'
version = 'alpha'
# The full version, including alpha/beta/rc tags.
release = '0.5.1'
release = 'alpha'
# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
......@@ -189,6 +190,16 @@ texinfo_documents = [
html_add_permalinks = ""
breathe_projects = {
"nGraph": "../../../build/doc/doxygen/xml",
}
breathe_default_project = "nGraph"
breathe_projects = {
"nGraph": "xml"
}
rst_epilog = u"""
.. |codename| replace:: Intel nGraph
......
......@@ -27,7 +27,7 @@ with respect to additions or feature requests.
If you prefer to use a containerized application, like Jupyter\* notebooks,
Google Docs\*, or MS Word\* to write and share documentation contributions,
you can convert the ``doc/source/.rst`` files to another format with a tool
you can convert the ``doc/sphinx/source/*.rst`` files to another format with a tool
like ``pypandoc`` and share a link to your docs on our `wiki`_.
Another option is to fork the `ngraph repo`_, essentially snapshotting it at
......@@ -38,8 +38,7 @@ our wiki.
.. note:: Please do not submit Jupyter* notebook code to the Intel nGraph library
repos; best practice is to maintain any project-specific examples, tests, or
walk-throughs separately. Alternatively, you may wish to upstream documentation
contributions directly to whatever frontend framework supports the rendering and
reproducibility of your example.
contributions directly to whatever frontend framework supports your example.
......@@ -126,21 +125,26 @@ Build the Documentation
Right now the minimal version of Sphinx needed to build the documentation is
Sphinx v. 1.6.5. This can be installed with `pip3` either to a virtual environment, or
to your base system if you plan to contribute much to docs.
Sphinx v. 1.6.5. This can be installed with `pip3`, either to a virtual
environment, or to your base system if you plan to contribute much to docs.
`Breathe`_ can also be installed to build C++ API documentation (currently WIP).
To build documentation locally, run:
.. code-block:: console
$ pip3 install [-I] Sphinx==1.6.5 [--user]
$ pip3 install [-I] breathe [--user]
$ cd doc/sphinx/
$ make html
For tips similar to this, see the `sphinx`_ stable reST documentation.
.. _ngraph repo: https://github.com/NervanaSystems/ngraph/
.. _ngraph repo: https://github.com/NervanaSystems/ngraph-cpp/
.. _documentation repo: https://github.com/NervanaSystems/ngraph/tree/master/doc
.. _sphinx: http://www.sphinx-doc.org/en/stable/rest.html
.. _wiki: https://github.com/NervanaSystems/ngraph/wiki/
.. _Breathe: https://breathe.readthedocs.io/en/latest/
.. framework-integration-guides:
.. framework-integration-guides:
#############################
Framework Integration Guides
############################
#############################
.. contents::
Compile MXNet with ``libngraph``
================================
.. _mxnet_intg:
#. Add the `MXNet`_ prerequisites to your system, if the system doesn't have them
already:
Compile MXNet\* with ``libngraph``
==================================
.. code-block:: console
.. important:: These instructions pick up from where the :doc:`installation`
installation instructions left off, so they presume that your system already
has the library installed at ``$HOME/ngraph_dist`` as the default location.
If the |nGl| code has not yet been installed to your system, please go back
to complete those steps, and return here to finish compiling MXNet with
``libngraph``.
$ sudo apt-get install -y libopencv-dev curl libatlas-base-dev python
python-pip python-dev python-opencv graphviz python-scipy python-sklearn
libopenblas-dev
#. Set the ``LD_LIBRARY_PATH`` path to the location where we built the libraries:
#. Set the ``LD_LIBRARY_PATH`` path to the location where we built the nGraph
libraries:
.. code-block:: bash
export LD_LIBRARY_PATH=$HOME/ngraph_dist/lib/
#. Clone the ``ngraph-mxnet`` repository recursively and checkout the
``ngraph-integration-dev branch``:
#. Add the `MXNet`_ prerequisites to your system, if the system doesn't have them
already. These requirements are Ubuntu\*-specific.
.. code-block:: console
$ sudo apt-get install -y libopencv-dev curl libatlas-base-dev python
python-pip python-dev python-opencv graphviz python-scipy python-sklearn
libopenblas-dev
#. Clone the ``ngraph-mxnet`` repository recursively and checkout the
``ngraph-integration-dev`` branch:
.. code-block:: console
$ git clone --recursive git@github.com:NervanaSystems/ngraph-mxnet.git
$ cd ngraph-mxnet && git checkout ngraph-integration-dev
#. Edit the ``make/config.mk`` file from the repo we just checked out to set
the ``USE_NGRAPH`` option to true with `1` and set :envvar:`NGRAPH_DIR`
to point to the installation:
#. Edit the ``make/config.mk`` file from the repo we just checked out to set
the ``USE_NGRAPH`` option (line ``80``) to true with `1` and set the :envvar:`NGRAPH_DIR`
(line ``81``) to point to the installation location target where the |nGl|
was installed:
.. code-block:: bash
USE_NGRAPH = 1
NGRAPH_DIR = $(HOME)/ngraph_dist
#. Ensure that the config file has disabled nnpack and mklml.
#. Ensure that settings on the config file are disabled for ``USE_MKL2017``
(line ``93``) and ``USE_NNPACK`` (line ``100``).
.. code-block:: bash
# whether use MKL2017 library
USE_MKL2017 = 0
# whether use MKL2017 experimental feature for high performance
# Prerequisite USE_MKL2017=1
USE_MKL2017_EXPERIMENTAL = 0
# whether use NNPACK library
USE_NNPACK = 0
#. Finally, compile MXNet:
#. Finally, compile MXNet with |InG|:
.. code-block:: console
$ make -j $(nproc)
#. After successfully running ``make``, install the python integration packages
your MXNet build needs to run a training example.
#. After successfully running ``make``, install the Python integration packages
that your MXNet build needs to run a training example.
.. code-block:: console
$ cd python && pip install -e . && cd ../
#. Confirm a successful integration by running the MNIST training example:
#. Confirm a successful integration by running the MNIST training example:
.. code-block:: console
$ python example/image-classification/train_mnist.py
Using ``libngraph`` from Tensorflow as XLA plugin
=================================================
.. _tensorflow_intg:
Build TensorFlow\* with an XLA plugin to ``libngraph``
======================================================
.. important:: These instructions pick up where the :doc:`installation`
installation instructions left off, so they presume that your system already
has the |nGl| installed. If the |nGl| code has not yet been installed to
your system, please go back to complete those steps, and return here when
you are ready to build TensorFlow\*.
.. TODO: add Avijit's presentation info and process here
.. warning:: Section below is a Work in Progress.
#. Set the ``LD_LIBRARY_PATH`` path to the location where we built the nGraph
libraries:
#. Get the `ngraph` fork of TensorFlow from this repo: ``git@github.com:NervanaSystems/ngraph-tensorflow.git``
#. Etc.
#. Go to the end near the following snippet
.. code-block:: bash
export LD_LIBRARY_PATH=$HOME/ngraph_dist/lib/
::
#. To prepare to build TensorFlow with an XLA plugin capable of running |nGl|,
use TensorFlow's standard build system, Bazel. These
instructions were tested with `bazel version 0.5.4`_.
native.new_local_repository(
name = "ngraph_external",
path = "/your/home/directory/where/ngraph_is_installed",
build_file = str(Label("//tensorflow/compiler/plugin/ngraph:ngraph.BUILD")),
)
.. code-block:: console
and modify the following line in the :file:`tensorflow/workspace.bzl` file to
provide an absolute path to ``~/ngraph_dist``
$ wget https://github.com/bazelbuild/bazel/releases/download/0.5.4/bazel-0.5.4-installer-linux-x86_64.sh
$ chmod +x bazel-0.5.4-installer-linux-x86_64.sh
$ ./bazel-0.5.4-installer-linux-x86_64.sh --user
#. Add and source the ``bin`` path to your ``~/.bashrc`` file in order to be
able to call bazel from the user's installation we set up:
.. code-block:: bash
::
path = "/directory/where/ngraph_is_installed"
export PATH=$PATH:~/bin
.. code-block:: console
$ source ~/.bashrc
#. Ensure that all the TensorFlow 1.3 dependencies are installed, as per the
TensorFlow `1.3 installation guide`_:
.. note:: You do not need CUDA in order to use the nGraph XLA plugin.
#. Once TensorFlow's dependencies are installed, clone the source of the
`ngraph-tensorflow`_ repo to your machine; this is the required fork for
this integration:
.. code-block:: console
$ git clone git@github.com:NervanaSystems/ngraph-tensorflow.git
$ cd ngraph-tensorflow
#. Now run :command:`configure` and choose `y` when prompted to build TensorFlow
with XLA just-in-time compiler.
.. code-block:: console
:emphasize-lines: 5-6
. . .
Do you wish to build TensorFlow with Hadoop File System support? [y/N]
No Hadoop File System support will be enabled for TensorFlow
Do you wish to build TensorFlow with the XLA just-in-time compiler (experimental)? [y/N] y
XLA JIT support will be enabled for TensorFlow
Do you wish to build TensorFlow with VERBS support? [y/N]
No VERBS support will be enabled for TensorFlow
Do you wish to build TensorFlow with OpenCL support? [y/N]
. . .
#. Next build the pip package
.. code-block:: console
$ bazel build --config=opt //tensorflow/tools/pip_package:build_pip_package
$ bazel-bin/tensorflow/tools/pip_package/build_pip_package /tmp/tensorflow_pkg
#. Finally install the pip package
.. code-block:: console
$ pip install /tmp/tensorflow_pkg/tensorflow-1.3.0-cp27-cp27mu-linux_x86_64.whl
Run MNIST MLP through the TensorFlow / XLA plugin to nGraph
------------------------------------------------------------
To test an example through the TensorFlow / XLA plugin to nGraph, you can use
the MNIST softmax regression example script named ``mnist_softmax_ngraph.py``
that is available in the `/examples/mnist`_ directory.
This script was modified from the example explained in the TensorFlow\* tutorial;
the following changes were made from the original script:
.. code-block:: python
def main(_):
with tf.device('/device:XLA_NGRAPH:0'):
run_mnist(_)
#. Now run :command:`configure` and follow the rest of the TF build process.
def run_mnist(_):
# Import data
mnist = input_data.read_data_sets(FLAGS.data_dir, one_hot=True)
...
To test everything together, set the configuration options:
.. code-block:: bash
Maintaining ``libngraph``
=========================
TBD
export OMP_NUM_THREADS=4
export KMP_AFFINITY=granularity=fine,scatter
And run the script as follows from within the `/examples/mnist`_ directory of
your cloned version of `ngraph-tensorflow`_:
.. code-block:: console
.. _MXNet: http://mxnet.incubator.apache.org/
$ python mnist_softmax_ngraph.py
.. _MXNet: http://mxnet.incubator.apache.org
.. _bazel version 0.5.4: https://github.com/bazelbuild/bazel/releases/tag/0.5.4
.. _1.3 installation guide: https://www.tensorflow.org/versions/r1.3/install/install_sources#prepare_environment_for_linux
.. _ngraph-tensorflow: https://github.com/NervanaSystems/ngraph-tensorflow
.. _/examples/mnist: https://github.com/NervanaSystems/ngraph-tensorflow/tree/develop/tensorflow/compiler/plugin/ngraph/examples/mnist
......@@ -5,14 +5,35 @@ Glossary
.. glossary::
function graph
The Intel nGraph library uses a function graph to represent an ``op``'s
parameters and results.
op
An op represents an operation. Ops are stateless and have zero or more
inputs and zero or more outputs. Some ops have additional constant
attributes. Every output of an op corresponds to a tensor and has an
element type and a shape. The element types and shapes of the outputs of
an op are determined by the inputs and attributes of the op.
tensors
Tensors are maps from *coordinates* to scalar values, all of the same type,
called the *element type* of the tensor.
parameter
In the context of a function graph, a "paramater" refers
to what "stands in" for an argument in an ``op`` definition.
In the context of a function graph, a "parameter" refers to what "stands
in" for an argument in an ``op`` definition.
result
In the context of a function graph, the term "result" refers to what
stands in for the returned *value*.
stands in for the returned value.
shape
The shape of a tensor is a tuple of non-negative integers that represents an
exclusive upper bound for coordinate values.
step
An abstract "action" that produces zero or more tensor outputs from zero or more tensor
inputs. Steps correspond to *ops* that connect *nodes*.
function graph
The Intel nGraph library uses a function graph to represent an ``op``'s
parameters and results.
.. build-a-functiongraph:
Defining a function graph on the nGraph library
###############################################
.. graph-basics:
Graph Basics
============
To build a function graph with the nGraph library, first understand the ways
that the library will handle graph values before and during compilation. Since
it can be fairly easy to confuse C++ terms with their counterparts in the
``ngraph`` function (and with the lower-level C++ representations of those
counterparts), we provide this reference.
Descriptions of ngraph values
-----------------------------
- *Element values* are integers, floats, etc.
- Each ``type`` of element value is described by an ``ElementType``.
- A C++ :cpp:type:`type` is required for referencing literals during
compilation.
- The :cpp:type:`type`'s ``value`` may be represented differently in the
compiled computation. For example, a 32-bit float can hold a 16-bit float.
- A *value* in a graph is either a tensor view or a tuple.
- A **tensor view** is an indexed collection of element values, all of
the same element type. An element value is not a graph value; a 0-rank
tensor holds one element value and serves the same purpose.
- A **tuple** is 0 or more values, which can consist of tuples and
tensor views.
- Analogous to the value are "value types", also defined recursively.
- **Tensor view types** These types describe indexed collections of
primitive types. They are specified by a shape and a primitive
type for the elements.
.. TODO add Doxy links corresponding to these tensor view types'
APIs or use the literalinclude better
- **Tuple types** These are cartesian product types for tuples of
tuples and tensors, described by a sequence of tuple types and
tensor view types.
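The recursive structure of value types described above can be modeled in a few lines. This is an illustrative sketch, not the library's C++ API: `TensorViewType` and `TupleType` here are hypothetical Python stand-ins for the concepts just described.

```python
# Illustrative model (not the real nGraph API) of the recursive value
# types described above: a value type is either a tensor view type
# (element type + shape) or a tuple type (a sequence of value types,
# which may themselves be tuple types).

class TensorViewType:
    def __init__(self, element_type, shape):
        self.element_type = element_type
        self.shape = tuple(shape)

    def __eq__(self, other):
        return (isinstance(other, TensorViewType)
                and self.element_type == other.element_type
                and self.shape == other.shape)

class TupleType:
    def __init__(self, *element_types):
        # Element types are value types, recursively.
        self.element_types = element_types

f32_matrix = TensorViewType("f32", (2, 3))
f32_vector = TensorViewType("f32", (3,))
pair = TupleType(f32_matrix, TupleType(f32_vector))  # tuples can nest
```

The key property is the mutual recursion: a tuple type's elements are themselves value types, so tuples of tuples of tensor views are well-formed.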
Tensors
-------
.. TODO add basic semantics
*Tensors* are maps from coordinates to scalar values, all of the same type,
called the *element type* of the tensor. Coordinates are tuples of non-negative
integers; all the coordinates for a tensor have the same length, called the
*rank* of the tensor. We often use :math:`n`-tensor for tensors with rank
:math:`n`. An :math:`n`-dimensional array is a common implementation of a
tensor, and the two terms are often used interchangeably. However, a tensor
could just as easily be a function that returns 0 for every coordinate.
The :term:`shape` of a tensor is a tuple of non-negative integers that
represents an exclusive upper bound for coordinate values. A tensor has an
element for every coordinate less than the shape, so the *size* of the tensor
is the product of the values in the shape.
An :math:`n`-dimensional array is a common implementation of a tensor, and the
two terms are often used interchangeably, but a tensor could just as easily be
a function that returns 0 for every coordinate.
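The definitions above — rank as the length of the shape, size as the product of its entries, and a tensor as any map from coordinates to scalars — can be checked with a short sketch (illustrative only, not library code):

```python
from itertools import product
from math import prod

# The definitions above in miniature: rank and size derive from the shape.
shape = (2, 3)
rank = len(shape)            # rank: length of the shape tuple
size = prod(shape)           # size: product of the shape entries

# Every coordinate strictly less than the shape, component-wise:
coords = list(product(*(range(d) for d in shape)))

# A tensor need not be an array -- any map from coordinates to scalar
# values of one element type will do, e.g. one that returns 0 everywhere:
zero_tensor = lambda coord: 0

print(rank, size, len(coords))   # 2 6 6
```

Note that the number of valid coordinates equals the size, as the text states: the tensor has exactly one element per coordinate below the shape.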
In the graph, every op input must be associated with an op output, and every op
output must have a constant element type and shape that will correspond to the
tensors used in the computation.
Ops
---
The graph is a composition of tensor computations, called ``ops``, which are
nodes in the graph. In the graph, every :term:`op` *input* must be associated
with an op *output*, and every op output must have a constant element type and
shape to correspond with the tensors used in the computation. Every op has:
* zero or more inputs, and
* zero or more outputs;
these represent tensors that will be provided during execution. Ops may also
have additional attributes that do not change during execution.
Graph function
---------------
Function definition begins with creating one or more ``Parameter`` ops,
which represent the tensors that will be supplied as arguments to the function.
Parameters have no inputs and attributes for the element type and shape of the
tensor that will be provided as an argument. The unique output of the
``Parameter`` will have the provided element type and shape.
Constructed ops have element types and shapes for each of their outputs, which
are determined during op construction from the element types and shapes
associated with the inputs, as well as additional attributes of the ops. For
example, tensor addition is defined for two tensors of the same shape and size
and results in a tensor with the same element type and shape:
.. math::
(A+B)_I = A_I + B_I
Here, :math:`X_I` means the value of coordinate :math:`I` for the tensor
:math:`X`. So the sum of two tensors is a tensor whose value at a
coordinate is the sum of the elements at that coordinate for the two inputs.
Unlike many frameworks, this definition says nothing about storage or arrays.
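The formula :math:`(A+B)_I = A_I + B_I` is easy to render executable while keeping the "no commitment to storage" point: below, tensors are plain maps from coordinates to values (an illustrative sketch, not library code).

```python
from itertools import product

# (A+B)_I = A_I + B_I in executable form: tensors as maps from
# coordinates to scalar values, with no commitment to array storage.
shape = (2, 2)
coords = list(product(*(range(d) for d in shape)))

A = {c: 1.0 for c in coords}               # the all-ones 2x2 tensor
B = {(i, j): i + j for (i, j) in coords}   # B_(i,j) = i + j

# One addition per coordinate -- exactly the formula above.
A_plus_B = {I: A[I] + B[I] for I in coords}

print(A_plus_B[(1, 1)])   # 3.0
```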
An ``Add`` op is used to represent a tensor sum. To construct an Add op, each of
the two inputs of the ``Add`` must be associated with some output of some
already-created op. All outputs of constructed ops have element types and shapes,
so when the Add is constructed, it verifies that the two outputs associated with
its two inputs have the same element type and shape and sets its output to have
the same element type and shape.
Since all nodes supplying outputs for inputs to a new node must exist before the
new node can be created, it is impossible to construct a cyclic graph.
Furthermore, type-checking can be performed as the ops are constructed.
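A minimal sketch of this construction-time checking (class names are illustrative, not the nGraph C++ API): an `Add` node verifies at construction that the outputs feeding its two inputs agree on element type and shape, then adopts them for its own output. Because inputs must reference already-constructed ops, a cycle cannot be built.

```python
# Hypothetical sketch of construction-time type-checking, not real API.

class Op:
    def __init__(self, element_type, shape, inputs=()):
        self.element_type = element_type
        self.shape = shape
        self.inputs = inputs   # already-constructed ops only => no cycles

class Parameter(Op):
    """Zero inputs; element type and shape are attributes."""
    def __init__(self, element_type, shape):
        super().__init__(element_type, shape)

class Add(Op):
    def __init__(self, a, b):
        # Type-checking happens as the op is constructed: the two
        # inputs' outputs must match, and the output adopts their type.
        if (a.element_type, a.shape) != (b.element_type, b.shape):
            raise TypeError("Add inputs must match in element type and shape")
        super().__init__(a.element_type, a.shape, inputs=(a, b))

x = Parameter("f32", (2, 3))
y = Parameter("f32", (2, 3))
s = Add(x, y)   # output: element type "f32", shape (2, 3)
```

A mismatched pair (say, ``"f32"`` against ``"i32"``) fails immediately at construction rather than at execution time, which is the point of checking as ops are built.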
Functions
---------
Ops are grouped together in an ``ExternalFunction``, which describes a
computation that can be invoked on tensor arguments to compute tensor
results. The caller provides tensors in the form of row-major arrays
for each argument and each computed result. The same array can be used
for more than one argument, but each result must use a distinct array,
and argument arrays cannot be used as result arrays.
The ``ExternalFunction`` has ``Parameter``, a vector of ``Parameter`` ops,
where no ``Parameter`` op may appear more than once in the vector.
Each ``Parameter`` op has attributes for its shape and element type;
arrays passed to the function must have the same shape and element type.
The ``ExternalFunction`` also has ``Nodes``, a vector of ops that
are the results being computed (Note: We may require the results to
be ``Result`` ops in the future. A ``Result`` op would have a single
input and no outputs, and complement the zero input single output
``Parameter`` op.)
During execution, the output of the nth ``Parameter`` op will be the tensor
corresponding to the array provided as the nth argument, and the outputs
of all result ops will be written into the result arrays in row-major
order.
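Since arguments and results cross the ``ExternalFunction`` boundary as row-major arrays, the coordinate-to-offset rule is worth spelling out. A small sketch (illustrative, not library code) of the standard row-major formula, where the last coordinate varies fastest:

```python
# Row-major layout: for shape (d0, d1, d2, ...) the offset of
# coordinate (i0, i1, i2, ...) is ((i0 * d1 + i1) * d2 + i2) * ...
def row_major_offset(coord, shape):
    offset = 0
    for i, d in zip(coord, shape):
        offset = offset * d + i
    return offset

shape = (2, 3)
flat = [0.0] * 6   # one distinct array per result, as described above
flat[row_major_offset((1, 2), shape)] = 7.0

print(row_major_offset((1, 2), shape))   # 5  (last row, last column)
```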
.. important:: During graph building, most of the storage associated
with values is *implicit*. During compilation, *explicit* storage
......@@ -84,20 +140,16 @@ sources: *literals*, *calls* to ops (built-in ops or user-defined ops AKA
zero or more run-time parameters of *arbitrary* value types and a result
whose type is the tuple type of the types of the parameters.
- *functions* are user-defined ops.
- A user-defined function is "external" if it can be called externally.
- The result is a graph node that depends only on parameters.
- The result's ``type`` of call to a function is determined from the
types of the arguments.
- Any external function interacting with the graph at the level of
user-defined ``op`` must specify a type for each of its parameters.
#. **Functions** are user-defined ops.
- A user-defined function is "external" if it can be called externally.
- The result is a graph node that depends only on parameters.
- The result type of a call to a function is determined from the types of the arguments.
- Any external function interacting with the graph at the level of user-defined op must specify a type for each of its parameters.
#. *Parameters* of user-defined *functions* may also be a source of a graph's
values. Externally-callable functions must specify a type for each parameter.
Building a Graph
================
......
.. ---------------------------------------------------------------------------
.. Copyright 2017 Intel Corporation
.. Copyright 2018 Intel Corporation
.. Licensed under the Apache License, Version 2.0 (the "License");
.. you may not use this file except in compliance with the License.
.. You may obtain a copy of the License at
......@@ -13,10 +13,23 @@
.. limitations under the License.
.. ---------------------------------------------------------------------------
.. Intel nGraph library core documentation master file, created on Mon Dec 25 13:04:12 2017.
#############################
Intel nGraph library project
#############################
Intel nGraph library
====================
Welcome to the Intel nGraph project, an open source C++ library for developers
of :abbr:`Deep Learning (DL)` systems and frameworks. Here you will find
a suite of components, documentation, and APIs that can be used with
:abbr:`Deep Neural Network (DNN)` models defined in a variety of frameworks.
The nGraph library translates a framework’s representation of computations into
a neutral :abbr:`Intermediate Representation (IR)` designed to promote
computational efficiency on target hardware; it works on Intel and non-Intel
platforms.
For further overview details, see the :doc:`about` page.
.. toctree::
:maxdepth: 1
......@@ -26,15 +39,12 @@ Intel nGraph library
installation.rst
testing-libngraph.rst
framework-integration-guides.rst
graph-basics.rst
.. toctree::
:maxdepth: 1
   :caption: Algorithms

   model-phases.rst
.. toctree::
:maxdepth: 2
......@@ -48,10 +58,17 @@ Intel nGraph library
autodiff.rst
glossary.rst
.. toctree::
:maxdepth: 1
:caption: Ops
ops/convolution.rst
.. toctree::
:maxdepth: 1
:caption: Project Docs
about.rst
release-notes.rst
code-contributor-README.rst
......@@ -68,3 +85,4 @@ Indices and tables
==================
* :ref:`search`
* :ref:`genindex`
.. installation:
###################################
Install the Intel® nGraph™ library
###################################
Build Environments
==================
......@@ -15,33 +16,28 @@ packages and prerequisites:
:widths: 25, 15, 25, 20, 25
:escape: ~
CentOS 7.4 64-bit, Clang 3.4, GCC 4.8 + CMake 2.8, supported, ``patch diffutils zlib1g-dev libtinfo-dev``
Ubuntu 16.04 (LTS) 64-bit, Clang 3.9, CMake 3.5.1 + GNU Make, supported, ``build-essential cmake clang-3.9 git libtinfo-dev``
Ubuntu 16.04 (LTS) 64-bit, Clang 4.0, CMake 3.5.1 + GNU Make, officially unsupported, ``build-essential cmake clang-4.0 git libtinfo-dev``
Clear Linux\* OS for Intel Architecture, Clang 5.0.1, CMake 3.10.2, experimental, bundles ``machine-learning-basic dev-utils python3-basic python-basic-dev``
macOS support is limited; see the macOS development prerequisites section
at the end of this page for details.
Installation Steps
==================
.. note:: If you are developing |nGl| projects on macOS\*, please be
   aware that this platform is officially unsupported; see the section
   `macOS Development Prerequisites`_ below.
This guide provides one possible configuration that does not rely on a
virtual environment. You are, of course, free to use a virtual environment,
or to set up user directories and permissions however you like.
To build |nGl| on one of the supported systems, the CMake procedure will
install ``ngraph_dist`` to the installing user's ``$HOME`` directory as
the default location. See the :file:`CMakeLists.txt` file for more
information about how to change or customize this location.
#. (Optional) Since most of a developer's interaction with a frontend
framework will take place locally through Pythonic APIs to the C++
library, you can set a reference placeholder for the documented source
cloned from the repo. Create something like ``/opt/local`` and (with sudo
permissions), give ownership of that local directory to your user.
.. code-block:: console
......@@ -83,15 +79,16 @@ or to set up user directories and permissions however you like.
#. (Optional, requires `Sphinx`_.) Run ``make html`` inside the
``doc/sphinx`` directory to build HTML docs for the nGraph library.
#. (COMING SOON -- Generate API docs. Optional, requires `doxygen`_.) TBD
.. _macOS Development Prerequisites:
macOS Development Prerequisites
-------------------------------
The repository includes two scripts (``maint/check-code-format.sh`` and
``maint/apply-code-format.sh``) that are used respectively to check adherence
to `libngraph` code formatting conventions, and to automatically reformat code
......@@ -106,14 +103,6 @@ according to those conventions. These scripts require the command
$ ln -s /usr/local/opt/llvm@3.9/bin/clang-format $HOME/bin/clang-format-3.9
$ echo 'export PATH=$HOME/bin:$PATH' >> $HOME/.bash_profile
External library requirements
==============================
TBD
.. _doxygen: https://www.stack.nl/~dimitri/doxygen/
.. _Sphinx: http://www.sphinx-doc.org/en/stable/
.. _NervanaSystems: https://github.com/NervanaSystems/private-ngraph-cpp/blob/master/README.md
......
.. model-phases:
.. NOTE this is mostly just placeholder text designed to start a discussion around
the ways we can highlight something other than "run MNIST models" for training
as a feature of the nGraph library.
Phases
======
With the optimizations built into the Intel nGraph library core, you can
train a model and quickly iterate on what you learn from your original
dataset. Once a model has been trained with the nGraph library, it is
essentially "freed" from the framework you originally wrangled it into,
and you can apply different kinds of operations and tests to refine it
further toward your data-science goals.
.. For example, let's say that you notice the `MNIST` MLP dataset running
with MXNet on nGraph trains itself to 0.997345 or 1.00000 accuracy after
only 10 Epochs. The original model was written to train the dataset for
20 Epochs. This means that there are potentially 10 wasted cycles of
compute power that can be used elsewhere.
.. convolution.rst:
###########
Convolution
###########
A batched convolution operation.
Basic Operation
===============
In the simplest case, each filter is slid across the image batch, and at every
position the elementwise product of the filter with the image window beneath it
is summed to produce one output element.
+-----------------+-------------------------+------------------------------------+
| Input Name      | Element Type            | Shape                              |
+=================+=========================+====================================+
| ``image_batch`` | Any                     | ``(N, C_in, d_1, ..., d_n)``       |
+-----------------+-------------------------+------------------------------------+
| ``filters``     | Same as ``image_batch`` | ``(C_out, C_in, df_1, ..., df_n)`` |
+-----------------+-------------------------+------------------------------------+

+------------------+-------------------------+-----------------------------------------------------+
| Output Name      | Element Type            | Shape                                               |
+==================+=========================+=====================================================+
| ``features_out`` | Same as ``image_batch`` | ``(N, C_out, d_1 - df_1 + 1, ..., d_n - df_n + 1)`` |
+------------------+-------------------------+-----------------------------------------------------+
It must be the case that after dilation and padding are applied, the filter fits within the image.
(TODO: pictorial example of basic convolution.)
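In the meantime, the basic (unpadded, unit-stride) case can be sketched with a naive pure-Python implementation. This is an illustrative model of the semantics, *not* the nGraph implementation; it assumes the cross-correlation convention (no filter flipping) and specializes the shape tables above to one spatial dimension:

```python
# Illustrative sketch: naive batched 1-D convolution matching the shape
# tables above. image_batch has shape (N, C_in, d_1) and filters has
# shape (C_out, C_in, df_1); the output has shape (N, C_out, d_1 - df_1 + 1).

def conv1d_batched(image_batch, filters):
    N = len(image_batch)
    C_in = len(image_batch[0])
    d1 = len(image_batch[0][0])
    C_out = len(filters)
    df1 = len(filters[0][0])
    out_d1 = d1 - df1 + 1  # the filter must fit within the image
    out = [[[0.0] * out_d1 for _ in range(C_out)] for _ in range(N)]
    for n in range(N):
        for co in range(C_out):
            for i in range(out_d1):
                # Sum the elementwise product of the filter and the
                # image window, across all input channels.
                acc = 0.0
                for ci in range(C_in):
                    for j in range(df1):
                        acc += image_batch[n][ci][i + j] * filters[co][ci][j]
                out[n][co][i] = acc
    return out
```

For example, ``conv1d_batched([[[1, 2, 3, 4]]], [[[1, 1]]])`` produces ``[[[3.0, 5.0, 7.0]]]``: each output element is the sum of two adjacent image elements.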
Window Parameters
=================
Two optional parameters control how the filter window moves across, and is
dilated over, the image batch.
+-----------------------------+-----------------------------+------------------------------------+
| Parameter Name              | Type                        | Meaning                            |
+=============================+=============================+====================================+
| ``window_movement_strides`` | ``Strides`` of length ``n`` | How far to slide the window along  |
|                             |                             | each axis at each step.            |
+-----------------------------+                             +------------------------------------+
| ``window_dilation_strides`` |                             | Per-axis dilation to apply to the  |
|                             |                             | filters.                           |
+-----------------------------+-----------------------------+------------------------------------+
(TODO: pictorial example of the effect of window movement stride.)
(TODO: pictorial example of window before and after dilation.)
Image Batch Parameters
======================
Three optional parameters control padding and dilation of the image batch.
+----------------------------+-----------------------------+---------------------------------------+
| Parameter Name             | Type                        | Meaning                               |
+============================+=============================+=======================================+
| ``padding_below``          | ``Padding`` of length ``n`` | How many padding elements to add      |
|                            |                             | below the 0-coordinate on each axis.  |
+----------------------------+                             +---------------------------------------+
| ``padding_above``          |                             | How many padding elements to add      |
|                            |                             | above the max-coordinate on each axis.|
+----------------------------+-----------------------------+---------------------------------------+
| ``image_dilation_strides`` | ``Strides`` of length ``n`` | Per-axis dilation to apply to the     |
|                            |                             | image batch.                          |
+----------------------------+-----------------------------+---------------------------------------+
(TODO: pictorial examples of the above)
Mathematical Definition
=======================
Padding
-------
Let :math:`p` (the padding below) and :math:`q` (the padding above) be sequences of :math:`n`
integers, and :math:`T` be a tensor of shape :math:`(d_1,\dots,d_n)`, such that for all :math:`i`,
:math:`p_i + d_i + q_i \ge 0`. Then :math:`\mathit{Pad}[p,q](T)` is the tensor of shape
:math:`(p_1 + d_1 + q_1,\dots,p_n + d_n + q_n)` such that
.. math::
\mathit{Pad}[p,q](T)_{i_1,\dots,i_n} \triangleq \begin{cases}
T_{i_1 - p_1,\dots,i_n - p_n} &\mbox{if for all }j, i_j \ge p_j\mbox{ and }i_j < p_j + d_j \\
0 &\mbox{otherwise.}
\end{cases}
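As a concrete illustration of the definition (plain Python, rank 1, nonnegative padding only; this is a sketch, not the nGraph implementation):

```python
def pad1d(t, p, q):
    # Rank-1 sketch of Pad[p, q]: output index i takes t[i - p] when
    # p <= i < p + len(t), and the padding value 0 otherwise.
    # Assumes nonnegative p and q for simplicity.
    d = len(t)
    return [t[i - p] if p <= i < p + d else 0 for i in range(p + d + q)]
```

For example, ``pad1d([1, 2, 3], 1, 2)`` yields ``[0, 1, 2, 3, 0, 0]``, a tensor of shape :math:`p + d + q = 6`.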
Dilation
--------
Let :math:`l` (the dilation strides) be a sequence of :math:`n` positive integers, and :math:`T`
be a tensor of shape :math:`(d_1,\dots,d_n)`. Then :math:`\mathit{Dilate}[l](T)` is the tensor of
shape :math:`(d'_1,\dots,d'_n)` where :math:`d'_i = \mathit{max}(0,l_i(d_i - 1) + 1)` such that
.. math::
\mathit{Dilate}[l](T)_{i_1,\dots,i_n} \triangleq \begin{cases}
T_{i_1/l_1,\dots,i_n/l_n} &\mbox{if for all }j, i_j\mbox{ is a multiple of }l_j \\
0 &\mbox{otherwise.}
\end{cases}
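The dilation formula can likewise be sketched in a few lines of plain Python (rank 1 only; an illustration of the definition, not the nGraph implementation):

```python
def dilate1d(t, l):
    # Rank-1 sketch of Dilate[l]: output index i takes t[i // l] when i
    # is a multiple of l, and 0 otherwise. The output length is
    # d' = max(0, l * (d - 1) + 1).
    d = len(t)
    return [t[i // l] if i % l == 0 else 0
            for i in range(max(0, l * (d - 1) + 1))]
```

For example, ``dilate1d([1, 2, 3], 2)`` yields ``[1, 0, 2, 0, 3]``: the original elements are spread apart with zeros in between.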
Striding
--------
Let :math:`s` (the strides) be a sequence of :math:`n` positive integers, and :math:`T` be a
tensor of shape :math:`(d_1,\dots,d_n)`. Then :math:`\mathit{Stride}[s](T)` is the tensor of
shape :math:`(d'_1,\dots,d'_n)` where :math:`d'_i = \left\lceil \frac{d_i}{s_i} \right\rceil`
such that
.. math::
\mathit{Stride}[s](T)_{i_1,\dots,i_n} \triangleq T_{s_1i_1,\dots,s_ni_n}
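The striding formula reads directly as a one-line sketch in plain Python (rank 1 only; an illustration of the definition, not the nGraph implementation):

```python
import math

def stride1d(t, s):
    # Rank-1 sketch of Stride[s]: output index i maps to input index
    # s * i; the output length is ceil(d / s).
    return [t[s * i] for i in range(math.ceil(len(t) / s))]
```

For example, ``stride1d([1, 2, 3, 4, 5], 2)`` yields ``[1, 3, 5]``, a tensor of shape :math:`\lceil 5/2 \rceil = 3`.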
Convolution
-----------
TODO.
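Pending the full write-up, the unpadded, unstrided case can be sketched as follows. This is the standard cross-correlation-style definition implied by the shape tables above, given here as a placeholder: let :math:`T_\mathit{image}` be a tensor of shape :math:`(d_1,\dots,d_n)` and :math:`T_\mathit{filter}` a tensor of shape :math:`(df_1,\dots,df_n)` with :math:`df_i \le d_i` for all :math:`i`. Then :math:`\mathit{Conv}(T_\mathit{image},T_\mathit{filter})` is the tensor of shape :math:`(d_1 - df_1 + 1,\dots,d_n - df_n + 1)` such that

.. math::

   \mathit{Conv}(T_\mathit{image},T_\mathit{filter})_{i_1,\dots,i_n} \triangleq
       \sum_{j_1=0}^{df_1 - 1} \cdots \sum_{j_n=0}^{df_n - 1}
       (T_\mathit{image})_{i_1 + j_1,\dots,i_n + j_n} \, (T_\mathit{filter})_{j_1,\dots,j_n}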
Padded, Dilated, Strided Convolution
------------------------------------
.. math::
   \mathit{PDSConv}[g,p,q,l,s](T_\mathit{image},T_\mathit{filter}) \triangleq \mathit{Stride}[s](\mathit{Conv}(\mathit{Pad}[p,q](\mathit{Dilate}[g](T_\mathit{image})),\mathit{Dilate}[l](T_\mathit{filter})))
Batched, Padded, Dilated, Strided Convolution
---------------------------------------------
TODO.
C++ Interface
=============
.. doxygenclass:: ngraph::op::Convolution
:members:
Python Interface
================
The Python API is not merged yet; its documentation will be added here once
it lands.
.. testing-libngraph:
##########################
Testing the nGraph library
##########################
......@@ -20,29 +21,28 @@ To perform the unit tests:
$ cd build/
$ make check
#. To build the full Google test suite (required to compile with MXNet):
.. code-block:: console
$ git clone git@github.com:google/googletest.git
$ cd googletest/ && cmake . && make -j$(nproc) && sudo make install
Compiling a framework with ``libngraph``
========================================
After building and installing the nGraph library to your system, the next
logical step is to compile a framework that you can use to run a
training/inference model with one of the backends that are now enabled.
For this early release, we provide integration guides for
* `MXNet`_,
* `TensorFlow`_, and
* neon™ `frontend framework`_
Integration guides for the following frameworks are tentatively forthcoming,
and open to community contributions and sample documentation:
* `Chainer`_,
* `PyTorch`_,
* `Caffe2`_, and
* Frameworks not yet written (for algorithms that do not yet exist).
.. _GTest framework: https://github.com/google/googletest.git
.. _MXNet: http://mxnet.incubator.apache.org/
......@@ -50,4 +50,6 @@ each of the other frameworks is forthcoming.
.. _Caffe2: https://github.com/caffe2/
.. _PyTorch: http://pytorch.org/
.. _Chainer: https://chainer.org/
.. _neon: http://neon.nervanasys.com/index.html/
.. _frontend framework: http://neon.nervanasys.com/index.html/
.. training:
Training
########
......@@ -16,6 +16,7 @@
#include <algorithm>
#include <cmath>
#include <numeric>
#include <vector>
#include "ngraph/common.hpp"
......