szha closed pull request #10622: [MXNET-341] Added General Info to API Docs for Python and Gluon
URL: https://github.com/apache/incubator-mxnet/pull/10622
 
 
   

This is a PR merged from a forked repository. As GitHub hides the original diff on merge, it is displayed below for the sake of provenance:

diff --git a/docs/api/python/gluon/gluon.md b/docs/api/python/gluon/gluon.md
index f523e649a45..9bf866d21a1 100644
--- a/docs/api/python/gluon/gluon.md
+++ b/docs/api/python/gluon/gluon.md
@@ -9,10 +9,79 @@
 
 ## Overview
 
-Gluon package is a high-level interface for MXNet designed to be easy to use while
-keeping most of the flexibility of low level API. Gluon supports both imperative
-and symbolic programming, making it easy to train complex models imperatively
-in Python and then deploy with symbolic graph in C++ and Scala.
+The Gluon package is a high-level interface for MXNet designed to be easy to use, while keeping most of the flexibility of a low-level API. Gluon supports both imperative and symbolic programming, making it easy to train complex models imperatively in Python and then deploy them with a symbolic graph in C++ and Scala.
+
+Based on the [Gluon API specification](https://github.com/gluon-api/gluon-api), the Gluon API in Apache MXNet provides a clear, concise, and simple API for deep learning. It makes it easy to prototype, build, and train deep learning models without sacrificing training speed.
+
+**Advantages**
+
+1. Simple, Easy-to-Understand Code: Gluon offers a full set of plug-and-play neural network building blocks, including predefined layers, optimizers, and initializers.
+2. Flexible, Imperative Structure: Gluon does not require the neural network model to be rigidly defined, but rather brings the training algorithm and model closer together to provide flexibility in the development process.
+3. Dynamic Graphs: Gluon enables developers to define neural network models that are dynamic, meaning they can be built on the fly, with any structure, and using any of Python’s native control flow.
+4. High Performance: Gluon provides all of the above benefits without impacting the training speed that the underlying engine provides.
+
+**Examples**
+
+*Simple, Easy-to-Understand Code*
+
+Use plug-and-play neural network building blocks, including predefined layers, optimizers, and initializers:
+
+```python
+from mxnet import gluon
+
+num_outputs = 10  # e.g. the number of output classes (placeholder value)
+
+net = gluon.nn.Sequential()
+# When instantiated, Sequential stores a chain of neural network layers.
+# Once presented with data, Sequential executes each layer in turn, using
+# the output of one layer as the input for the next
+with net.name_scope():
+    net.add(gluon.nn.Dense(256, activation="relu")) # 1st hidden layer (256 nodes)
+    net.add(gluon.nn.Dense(256, activation="relu")) # 2nd hidden layer
+    net.add(gluon.nn.Dense(num_outputs))            # output layer
+```
+
+*Flexible, Imperative Structure*
+
+Prototype, build, and train neural networks in a fully imperative manner using the MXNet *autograd* package and the Gluon *Trainer*:
+
+```python
+from mxnet import autograd, gluon
+
+# Assumes `net` (already initialized) and a DataLoader `train_data` exist,
+# e.g. as defined in the previous example
+softmax_cross_entropy = gluon.loss.SoftmaxCrossEntropyLoss()
+trainer = gluon.Trainer(net.collect_params(), 'sgd', {'learning_rate': 0.1})
+
+epochs = 10
+
+for e in range(epochs):
+    for i, (data, label) in enumerate(train_data):
+        with autograd.record():
+            output = net(data) # the forward iteration
+            loss = softmax_cross_entropy(output, label)
+        loss.backward()        # compute gradients outside the recording scope
+        trainer.step(data.shape[0])
+```
+
+*Dynamic Graphs*
+
+Build neural networks on the fly for use cases where neural networks must change in size and shape during model training:
+
+```python
+def forward(self, F, inputs, tree):
+    children_outputs = [self.forward(F, inputs, child)
+                        for child in tree.children]
+    # Recursively builds the neural network based on each input sentence’s
+    # syntactic structure during the model definition and training process
+    ...
+```
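+
+To make the idea concrete, here is a hypothetical, self-contained sketch (the `DynamicNet` class is illustrative, not part of the tree-structured model above) of a Block whose depth is decided by Python control flow at run time:
+
+```python
+import mxnet as mx
+from mxnet import gluon
+
+class DynamicNet(gluon.Block):
+    def __init__(self, **kwargs):
+        super(DynamicNet, self).__init__(**kwargs)
+        with self.name_scope():
+            self.dense = gluon.nn.Dense(16)
+
+    def forward(self, x):
+        # Ordinary Python control flow decides the graph structure on every call
+        steps = int(x.norm().asscalar()) % 3 + 1
+        for _ in range(steps):
+            x = mx.nd.relu(self.dense(x))
+        return x
+
+net = DynamicNet()
+net.initialize()
+print(net(mx.nd.random.uniform(shape=(4, 16))))
+```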
+
+*High Performance*
+
+Easily cache the neural network as a symbolic graph to achieve high performance by defining your neural network with *HybridSequential* and calling the *hybridize* method:
+
+```python
+from mxnet.gluon import nn
+
+net = nn.HybridSequential()
+with net.name_scope():
+    net.add(nn.Dense(256, activation="relu"))
+    net.add(nn.Dense(128, activation="relu"))
+    net.add(nn.Dense(2))
+
+net.hybridize()
+```
+
+
+## Contents
 
 ```eval_rst
 .. toctree::
diff --git a/docs/api/python/index.md b/docs/api/python/index.md
index b097e2045b1..88e8031cfc4 100644
--- a/docs/api/python/index.md
+++ b/docs/api/python/index.md
@@ -1,10 +1,14 @@
 # MXNet - Python API
 
-MXNet provides a rich Python API to serve a broad community of Python developers.
-In this section, we provide an in-depth discussion of the functionality provided by
-various MXNet Python packages. We have included code samples for most of the APIs
-for improved clarity. These code samples will run as-is as long as MXNet is first
-imported by running:
+MXNet provides a comprehensive and flexible Python API to serve a broad community of developers with different levels of experience and wide-ranging requirements. In this section, we provide an in-depth discussion of the functionality provided by various MXNet Python packages.
+
+MXNet's Python API has two primary high-level packages*: the Gluon API and the Module API. We recommend that new users start with the Gluon API as it's more flexible and easier to debug. Underlying these high-level packages are the core packages of NDArray and Symbol.
+
+NDArray works with arrays in an imperative fashion, i.e. you define how arrays will be transformed to get to an end result. Symbol works with arrays in a declarative fashion, i.e. you define the end result that is required (via a symbolic graph) and the MXNet engine will use various optimizations to determine the steps required to obtain it. With NDArray you have a great deal of flexibility when composing operations (as you can use Python control flow), and you can easily step through your code and inspect the values of arrays, which helps with debugging. Unfortunately, this comes at a performance cost when compared to Symbol, which can perform optimizations on the symbolic graph.
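+
+As an illustration, here is a minimal sketch of the same element-wise addition expressed both ways (the Symbol version is only computed when explicitly evaluated):
+
+```python
+import mxnet as mx
+
+# Imperative (NDArray): each operation executes immediately,
+# so intermediate values can be inspected at any point
+a = mx.nd.ones((2, 3))
+b = mx.nd.ones((2, 3))
+c = a + b
+print(c.asnumpy())  # values are available right away
+
+# Declarative (Symbol): operations only build a graph; nothing
+# is computed until the graph is bound and executed
+x = mx.sym.Variable('x')
+y = mx.sym.Variable('y')
+z = x + y  # `z` is a node in a symbolic graph, not a value
+result = z.eval(ctx=mx.cpu(), x=mx.nd.ones((2, 3)), y=mx.nd.ones((2, 3)))
+print(result[0].asnumpy())
+```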
+
+The Module API is backed by Symbol, so, although it's very performant, it's also a little more restrictive. With the Gluon API, you can get the best of both worlds. You can develop and test your model imperatively using NDArray, and then switch to Symbol for faster model training and inference (if Symbol equivalents exist for your operations). A minimal sketch of this imperative-first workflow is shown below.
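+
+In this sketch the model and layer sizes are arbitrary, chosen only to illustrate the switch:
+
+```python
+import mxnet as mx
+from mxnet.gluon import nn
+
+net = nn.HybridSequential()
+with net.name_scope():
+    net.add(nn.Dense(128, activation="relu"))
+    net.add(nn.Dense(2))
+net.initialize()
+
+x = mx.nd.random.uniform(shape=(4, 10))
+print(net(x))      # imperative: easy to step through and debug
+
+net.hybridize()    # switch to a cached symbolic graph under the hood
+print(net(x))      # same call, now runs through the compiled graph
+```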
+
+Code examples are placed throughout the API documentation and these can be run after importing MXNet as follows:
 
 ```python
 >>> import mxnet as mx
@@ -12,13 +16,15 @@ imported by running:
 
 ```eval_rst
 
-.. note:: A convenient way to execute examples is the ``%doctest_mode`` mode of
+.. note:: A convenient way to execute code examples is to use the ``%doctest_mode`` mode of
    Jupyter notebook, which allows for pasting multi-line examples containing
    ``>>>`` while preserving indentation. Run ``%doctest_mode?`` in Jupyter notebook
    for more details.
 
 ```
 
+\* Some old references to the Model API may exist, but this API has been deprecated.
+
 ## NDArray API
 
 ```eval_rst


 

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services
