This is an automated email from the ASF dual-hosted git repository.

nswamy pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
     new f6317c9  fix broken links and reorganize build from source page 
(#12962)
f6317c9 is described below

commit f6317c9a92fe2b7dddaecf7891f5ff7f215ae329
Author: Aaron Markham <[email protected]>
AuthorDate: Wed Nov 21 09:02:43 2018 -0800

    fix broken links and reorganize build from source page (#12962)
---
 docs/install/build_from_source.md           | 231 +++++++++++++++++-----------
 docs/tutorials/unsupervised_learning/gan.md |   7 +-
 2 files changed, 145 insertions(+), 93 deletions(-)

diff --git a/docs/install/build_from_source.md 
b/docs/install/build_from_source.md
index eff6666..b28fca3 100644
--- a/docs/install/build_from_source.md
+++ b/docs/install/build_from_source.md
@@ -1,93 +1,83 @@
 # Build MXNet from Source
 
-This document explains how to build MXNet from source code. Building MXNet 
from source is a two step process.
-
-1. Build the MXNet shared library, `libmxnet.so`, from [C++ source 
files](#build-the-shared-library)
-2. Install the [language bindings](#installing-mxnet-language-bindings) for 
MXNet. MXNet supports the following languages:
-    - Python
-    - C++
-    - Clojure
-    - Julia
-    - Perl
-    - R
-    - Scala
+This document explains how to build MXNet from source code.
+
+
+## Overview
+
+Building from source follows a two-step flow: build the MXNet shared library, then install your preferred language binding. Use the following links to jump to the different sections of this guide.
+
+1. Build the MXNet shared library, `libmxnet.so`.
+    * [Clone the repository](#clone-the-mxnet-project)
+    * [Prerequisites](#prerequisites)
+        * [Math library selection](#math-library-selection)
+        * [Install GPU software](#install-gpu-software)
+        * [Install optional software](#install-optional-software)
+    * [Adjust your build configuration](#build-configurations)
+    * [Build MXNet](#build-mxnet)
+        * [with NCCL](#build-mxnet-with-nccl) (optional)
+        * [for C++](#build-mxnet-with-c++) (optional)
+        * [Usage Examples](#usage-examples)
+            * [systems with GPUs and Intel CPUs](#recommended-for-systems-with-nvidia-gpus-and-intel-cpus)
+            * [GPUs with non-Intel CPUs](#recommended-for-systems-with-nvidia-gpus)
+            * [Intel CPUs](#recommended-for-systems-with-intel-cpus)
+            * [non-Intel CPUs](#recommended-for-systems-with-non-intel-cpus)
+2. [Install the language API binding(s)](#installing-mxnet-language-bindings) 
you would like to use for MXNet.
+MXNet's newest and most popular API is Gluon, which is built into the Python binding. If Python isn't your preference, MXNet supports several other language APIs:
+    - [Python (includes Gluon)](../api/python/index.html)
+    - [C++](../api/c++/index.html)
+    - [Clojure](../api/clojure/index.html)
+    - Java (coming soon)
+    - [Julia](../api/julia/index.html)
+    - [Perl](../api/perl/index.html)
+    - [R](../api/r/index.html)
+    - [Scala](../api/scala/index.html)
+
+<hr>
 
-## Prerequisites
-
-You need C++ build tools and a BLAS library to build the MXNet shared library. 
If you want to run MXNet with GPUs, you will need to install [NVDIA CUDA and 
cuDNN](https://developer.nvidia.com/cuda-downloads) first.
+## Build Instructions by Operating System
 
-You may use [GNU Make](https://www.gnu.org/software/make/) to build the 
library but [cmake](https://cmake.org/) is required when building with MKLDNN
+Detailed instructions are provided per operating system. Each of these guides 
also covers how to install the specific [Language 
Bindings](#installing-mxnet-language-bindings) you require.
+You may jump to those, but it is recommended that you continue reading to 
understand more general "build from source" options.
 
+* [Amazon Linux / CentOS / RHEL](centos_setup.html)
+* [macOS](osx_setup.html)
+* [Raspbian](raspian_setup.html)
+* [TX2](tx2_setup.html)
+* [Ubuntu](ubuntu_setup.html)
+* [Windows](windows_setup.html)
 
-### C++ build tools
 
-1. A C++ compiler that supports C++ 11.
-[G++ (4.8 or later)](https://gcc.gnu.org/gcc-4.8/) or
-[Clang](http://clang.llvm.org/) is required.
+<hr>
 
-2. [Git](https://git-scm.com/downloads) for downloading the sources from 
Github repository.
+## Clone the MXNet Project
 
+1. Clone or fork the MXNet project.
+```bash
+git clone --recursive https://github.com/apache/incubator-mxnet mxnet
+cd mxnet
+```
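If the repository was cloned without `--recursive`, the bundled third-party submodules can still be fetched afterwards; a minimal sketch, assuming you are inside the cloned `mxnet` directory:

```shell
# fetch MXNet's bundled third-party dependencies (e.g. 3rdparty/mkldnn)
# if the clone was made without --recursive
git submodule update --init --recursive
```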
 
+<hr>
 
+## Prerequisites
 
-### BLAS library
+The following sections will help you decide which specific prerequisites you 
need to install.
 
+#### Math Library Selection
+It is useful to consider your math library selection before installing your other prerequisites.
 MXNet relies on the
 [BLAS](https://en.wikipedia.org/wiki/Basic_Linear_Algebra_Subprograms) (Basic
 Linear Algebra Subprograms) library for numerical computations.
 Those can be extended with [LAPACK (Linear Algebra 
Package)](https://github.com/Reference-LAPACK/lapack), an additional set of 
mathematical functions.
 
 MXNet supports multiple mathematical backends for computations on the CPU:
-
 * [Apple Accelerate](https://developer.apple.com/documentation/accelerate)
 * [ATLAS](http://math-atlas.sourceforge.net/)
 * [MKL](https://software.intel.com/en-us/intel-mkl) (MKL, MKLML)
 * [MKL-DNN](https://github.com/intel/mkl-dnn)
 * [OpenBLAS](http://www.openblas.net/)
 
-Usage of these are covered in more detail in the [build 
configurations](#build-configurations) section.
-
-
-### Optional
-
-These might be optional, but they're typically desirable.
-
-* [OpenCV](http://opencv.org/) for Image Loading and Augmentation
-* [NVDIA CUDA and cuDNN](https://developer.nvidia.com/cuda-downloads) for 
running MXNet with GPUs
-
-
-## Build Instructions by Operating System
-
-Detailed instructions are provided per operating system.
-You may jump to those, but it is recommended that you continue reading to 
understand more general build from source options.
-
-| | | | |
-|---|---|---|---|
-| [macOS](osx_setup.html) | [Ubuntu](ubuntu_setup.html) | 
[CentOS/*unix](centos_setup.html) | [Windows](windows_setup.html) |
-| [raspbian](raspian_setup.html) | [tx2](tx2_setup.html) | | |
-
-
-
-## Build
-
-1. Clone the MXNet project.
-```bash
-git clone --recursive https://github.com/apache/incubator-mxnet mxnet
-cd mxnet
-```
-
-There is a configuration file for make,
-[`make/config.mk`](https://github.com/apache/incubator-mxnet/blob/master/make/config.mk),
 that contains all the compilation options. You can edit it and then run `make`.
-
-
-## Build Configurations
-
-`cmake` is recommended for building MXNet (and is required to build with 
MKLDNN), however you may use `make` instead.
-
-
-### Math Library Selection
-It is useful to consider your math library selection first.
-
 If multiple libraries are found, the default order of choice runs from the most performant (recommended) to the least performant backend.
 (recommended) to less performant backends.
 The following lists show this order by library and `cmake` switch.
@@ -122,21 +112,21 @@ https://software.intel.com/en-us/mkl
 It has following flavors:
 
 * MKL is a complete math library, containing all the functionality found in 
ATLAS, OpenBlas and LAPACK. It is free under
-  community support licensing 
(https://software.intel.com/en-us/articles/free-mkl),
-  but needs to be downloaded and installed manually.
+community support licensing 
(https://software.intel.com/en-us/articles/free-mkl),
+but needs to be downloaded and installed manually.
 
 * MKLML is a subset of MKL. It contains a smaller number of functions to 
reduce the
-  size of the download and reduce the number of dynamic libraries user needs.
+size of the download and reduce the number of dynamic libraries a user needs.
 
-  <!-- [Removed until #11148 is merged.] This is the most effective option 
since it can be downloaded and installed automatically
-  by the cmake script (see cmake/DownloadMKLML.cmake).-->
+<!-- [Removed until #11148 is merged.] This is the most effective option since 
it can be downloaded and installed automatically
+by the cmake script (see cmake/DownloadMKLML.cmake).-->
 
 * MKL-DNN is a separate open-source library, it can be used separately from 
MKL or MKLML. It is
-  shipped as a subrepo with MXNet source code (see 3rdparty/mkldnn or the 
[MKL-DNN project](https://github.com/intel/mkl-dnn))
+shipped as a subrepo with MXNet source code (see 3rdparty/mkldnn or the 
[MKL-DNN project](https://github.com/intel/mkl-dnn))
 
 Since the full MKL library is almost always faster than any other BLAS library 
it's turned on by default,
 however it needs to be downloaded and installed manually before doing `cmake` 
configuration.
-Register and download on the [Intel performance libraries 
website](https://software.seek.intel.com/performance-libraries).
+Register and download on the [Intel performance libraries 
website](https://software.intel.com/en-us/performance-libraries).
 
 Note: MKL is supported only for desktop builds and the framework itself 
supports the following
 hardware:
@@ -150,6 +140,32 @@ If you have a different processor you can still try to use 
MKL, but performance
 unpredictable.
 
 
+#### Install GPU Software
+
+If you want to run MXNet with GPUs, you must first install [NVIDIA CUDA and cuDNN](https://developer.nvidia.com/cuda-downloads).
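Before configuring a GPU build, it can help to confirm the CUDA toolkit and driver are actually visible; a quick sanity check (assuming CUDA was installed to its default location and added to `PATH`):

```shell
# report the CUDA compiler version; fails if the toolkit is not on PATH
nvcc --version
# list visible NVIDIA GPUs and the installed driver version
nvidia-smi
```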
+
+
+#### Install Optional Software
+
+These are optional, but they are typically desirable as they extend or enhance MXNet's functionality.
+
+* [OpenCV](http://opencv.org/) - Image Loading and Augmentation. Each 
operating system has different packages and build from source options for 
OpenCV. Refer to your OS's link in the [Build Instructions by Operating 
System](#build-instructions-by-operating-system) section for further 
instructions.
+* [NCCL](https://developer.nvidia.com/nccl) - NVIDIA's Collective 
Communications Library. Instructions for installing NCCL are found in the 
following [Build MXNet with NCCL](#build-mxnet-with-nccl) section.
+
+More information on turning these features on or off is found in the following [build configurations](#build-configurations) section.
+
+
+<hr>
+
+## Build Configurations
+
+There is a configuration file for `make`,
+[`make/config.mk`](https://github.com/apache/incubator-mxnet/blob/master/make/config.mk), that contains all the compilation options. You can edit it and then run `make`. `cmake` is recommended for building MXNet (and is required to build with MKLDNN); however, you may use `make` instead.
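As a sketch of the `make` route: instead of editing `make/config.mk`, its options can also be overridden directly on the command line (flag values shown here are illustrative):

```shell
# override config.mk options for a single build invocation
make -j$(nproc) USE_BLAS=openblas USE_OPENCV=1
```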
+
+<hr>
+
+## Build MXNet
+
 ### Build MXNet with NCCL
 - Download and install the latest NCCL library from NVIDIA.
 - Note the directory path in which NCCL libraries and header files are 
installed.
@@ -183,53 +199,88 @@ nosetests --verbose tests/python/gpu/test_nccl.py
 **Recommendation to get the best performance out of NCCL:**
 It is recommended to set environment variable NCCL_LAUNCH_MODE to PARALLEL 
when using NCCL version 2.1 or newer.
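In practice, the recommendation above amounts to exporting the variable in the shell before launching training:

```shell
# NCCL 2.1+ : parallel launch mode gives the best multi-GPU performance
export NCCL_LAUNCH_MODE=PARALLEL
```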
 
+<hr>
+
+### Build MXNet with C++
 
-### Build MXNet with Language Packages
 * To enable the C++ package, add `USE_CPP_PACKAGE=1` when you run `make` (or `-DUSE_CPP_PACKAGE=1` with `cmake`).
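For example, a GNU Make invocation with the C++ package enabled might look like the following sketch (the flag is simply appended to your usual build command):

```shell
# build the shared library together with the C++ package
make -j$(nproc) USE_CPP_PACKAGE=1
```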
 
+<hr>
 
 ### Usage Examples
-* `-j` runs multiple jobs against multi-core CPUs. Example using all cores on 
Linux:
+
+* `-j` runs multiple build jobs in parallel on multi-core CPUs.
+
+For example, you can use all available cores on Linux as follows:
 
 ```bash
-make -j$(nproc)
+# with CMake 3.12 or newer, -j forwards parallel jobs to the native build tool
+cmake --build . -j $(nproc)
 ```
 
-* Build without using OpenCV:
+
+#### Recommended for Systems with NVIDIA GPUs and Intel CPUs
+* Build MXNet with `cmake` and install with MKL DNN, GPU, and OpenCV support:
 
 ```bash
-make USE_OPENCV=0
+mkdir -p build && cd build
+cmake -DUSE_CUDA=1 -DUSE_CUDA_PATH=/usr/local/cuda -DUSE_CUDNN=1 -DUSE_MKLDNN=1 ..
+make -j$(nproc)
 ```
 
+#### Recommended for Systems with NVIDIA GPUs
 * Build with both OpenBLAS, GPU, and OpenCV support:
 
 ```bash
-make -j USE_BLAS=openblas USE_CUDA=1 USE_CUDA_PATH=/usr/local/cuda USE_CUDNN=1
+mkdir -p build && cd build
+cmake -DBLAS=open -DUSE_CUDA=1 -DUSE_CUDA_PATH=/usr/local/cuda -DUSE_CUDNN=1 ..
+make -j$(nproc)
+```
+
+#### Recommended for Systems with Intel CPUs
+* Build MXNet with `cmake` and install with MKL DNN, and OpenCV support:
+
+```bash
+mkdir -p build && cd build
+cmake -DUSE_CUDA=0 -DUSE_MKLDNN=1 ..
+make -j$(nproc)
+```
+
+#### Recommended for Systems with non-Intel CPUs
+* Build MXNet with `cmake` and install with OpenBLAS and OpenCV support:
+
+```bash
+mkdir -p build && cd build
+cmake -DUSE_CUDA=0 -DBLAS=open ..
+make -j$(nproc)
+```
+
+#### Other Examples
+
+* Build without using OpenCV:
+
+```bash
+mkdir -p build && cd build
+cmake -DUSE_OPENCV=0 ..
+make -j$(nproc)
 ```
 
 * Build on **macOS** with the default BLAS library (Apple Accelerate) and 
Clang installed with `xcode` (OPENMP is disabled because it is not supported by 
the Apple version of Clang):
 
 ```bash
-make -j USE_BLAS=apple USE_OPENCV=0 USE_OPENMP=0
+mkdir -p build && cd build
+cmake -DBLAS=apple -DUSE_OPENCV=0 -DUSE_OPENMP=0 ..
+make -j$(sysctl -n hw.ncpu)
 ```
 
 * To use OpenMP on **macOS** you need to install the Clang compiler, `llvm` 
(the one provided by Apple does not support OpenMP):
 
 ```bash
 brew install llvm
-make -j USE_BLAS=apple USE_OPENMP=1
+mkdir -p build && cd build
+cmake -DBLAS=apple -DUSE_OPENMP=1 ..
+make -j$(sysctl -n hw.ncpu)
 ```
 
+<hr>
+
 ## Installing MXNet Language Bindings
-After building MXNet's shared library, you can install other language 
bindings. (Except for C++. You need to build this when you build MXNet from 
source.)
+After building MXNet's shared library, you can install other language bindings.
+
+**NOTE:** The C++ API binding must be built when you build MXNet from source. 
See [Build MXNet with C++](#build-mxnet-with-c++).
 
 The following table provides links to each language binding by operating 
system:
-|   | Linux | macOS | Windows |
-|---|---|---|---|
-| Python | [Linux](ubuntu_setup.html#install-mxnet-for-python) | 
[macOS](osx_setup.html) | 
[Windows](windows_setup.html#install-mxnet-for-python) |
-| C++ | [Linux](c_plus_plus.html) | [macOS](c_plus_plus.html) | 
[Windows](c_plus_plus.html) |
-| Clojure | 
[Linux](https://github.com/apache/incubator-mxnet/tree/master/contrib/clojure-package)
 | 
[macOS](https://github.com/apache/incubator-mxnet/tree/master/contrib/clojure-package)
 | n/a |
-| Julia | [Linux](ubuntu_setup.html#install-the-mxnet-package-for-julia) | 
[macOS](osx_setup.html#install-the-mxnet-package-for-julia) | 
[Windows](windows_setup.html#install-the-mxnet-package-for-julia) |
-| Perl | [Linux](ubuntu_setup.html#install-the-mxnet-package-for-perl) | 
[macOS](osx_setup.html#install-the-mxnet-package-for-perl) | [Windows](n/a) |
-| R | [Linux](ubuntu_setup.html#install-the-mxnet-package-for-r) | 
[macOS](osx_setup.html#install-the-mxnet-package-for-r) | 
[Windows](windows_setup.html#install-the-mxnet-package-for-r) |
-| Scala | [Linux](scala_setup.html) | [macOS](scala_setup.html) | n/a |
+| | [Ubuntu](ubuntu_setup.html) | [macOS](osx_setup.html) | 
[Windows](windows_setup.html) |
+| --- | --- | --- | --- |
+| Python | [Ubuntu guide](ubuntu_setup.html#install-mxnet-for-python) | [OSX 
guide](osx_setup.html) | [Windows 
guide](windows_setup.html#install-mxnet-for-python) |
+| C++ | [C++ guide](c_plus_plus.html) | [C++ guide](c_plus_plus.html) | [C++ 
guide](c_plus_plus.html) |
+| Clojure | [Clojure 
guide](https://github.com/apache/incubator-mxnet/tree/master/contrib/clojure-package)
 | [Clojure 
guide](https://github.com/apache/incubator-mxnet/tree/master/contrib/clojure-package)
 | n/a |
+| Julia | [Ubuntu 
guide](ubuntu_setup.html#install-the-mxnet-package-for-julia) | [OSX 
guide](osx_setup.html#install-the-mxnet-package-for-julia) | [Windows 
guide](windows_setup.html#install-the-mxnet-package-for-julia) |
+| Perl | [Ubuntu guide](ubuntu_setup.html#install-the-mxnet-package-for-perl) 
| [OSX guide](osx_setup.html#install-the-mxnet-package-for-perl) | n/a |
+| R | [Ubuntu guide](ubuntu_setup.html#install-the-mxnet-package-for-r) | [OSX 
guide](osx_setup.html#install-the-mxnet-package-for-r) | [Windows 
guide](windows_setup.html#install-the-mxnet-package-for-r) |
+| Scala | [Scala guide](scala_setup.html) | [Scala guide](scala_setup.html) | 
n/a |
diff --git a/docs/tutorials/unsupervised_learning/gan.md 
b/docs/tutorials/unsupervised_learning/gan.md
index f436a15..0efdc55 100644
--- a/docs/tutorials/unsupervised_learning/gan.md
+++ b/docs/tutorials/unsupervised_learning/gan.md
@@ -63,7 +63,7 @@ The MNIST dataset contains 70,000 images of handwritten 
digits. Each image is 28
 
 ### 1. Preparing the MNIST dataset
 
-Let us start by preparing the handwritten digits from the MNIST dataset. 
+Let us start by preparing the handwritten digits from the MNIST dataset.
 ```python
 import mxnet as mx
 import numpy as np
@@ -75,7 +75,7 @@ mnist_test = mx.gluon.data.vision.datasets.MNIST(train=False)
 ```python
 # The downloaded data is of type `Dataset` which are
 # Well suited to work with the new Gluon interface but less
-# With the older symbol API, used in this tutorial. 
+# With the older symbol API, used in this tutorial.
 # Therefore we convert them to numpy array first
 X = np.zeros((70000, 28, 28))
 for i, (data, label) in enumerate(mnist_train):
@@ -394,7 +394,8 @@ As a result, we have created two neural nets: a Generator, 
which is able to crea
 Along the way, we have learned how to do the image manipulation and 
visualization that is associated with the training of deep neural nets. We have 
also learned how to use MXNet's Module APIs to perform advanced model training 
functionality to fit the model.
 
 ## Acknowledgements
+
 This tutorial is based on [MXNet DCGAN 
codebase](https://github.com/apache/incubator-mxnet/blob/master/example/gluon/dc_gan/dcgan.py),
 [The original paper on GANs](https://arxiv.org/abs/1406.2661), as well as 
[this paper on deep convolutional GANs](https://arxiv.org/abs/1511.06434).
 
-<!-- INSERT SOURCE DOWNLOAD BUTTONS -->
\ No newline at end of file
+<!-- INSERT SOURCE DOWNLOAD BUTTONS -->
