Repository: incubator-systemml
Updated Branches:
  refs/heads/gh-pages 1efaa5520 -> 0a235e647


[MINOR] Updated documentation and improved log messages

- Also, BLAS is disabled by default. We can enable it after more rigorous
  testing.


Project: http://git-wip-us.apache.org/repos/asf/incubator-systemml/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-systemml/commit/0a235e64
Tree: http://git-wip-us.apache.org/repos/asf/incubator-systemml/tree/0a235e64
Diff: http://git-wip-us.apache.org/repos/asf/incubator-systemml/diff/0a235e64

Branch: refs/heads/gh-pages
Commit: 0a235e647d7e1ada3a00d4d20afc7c783527e12c
Parents: 1efaa55
Author: Niketan Pansare <[email protected]>
Authored: Tue May 2 15:48:49 2017 -0800
Committer: Niketan Pansare <[email protected]>
Committed: Tue May 2 16:48:49 2017 -0700

----------------------------------------------------------------------
 beginners-guide-caffe2dml.md | 34 ++++++++++++++++++++++++++++++++++
 index.md                     |  1 +
 native-backend.md            | 35 +++++++++++++++++++++++------------
 3 files changed, 58 insertions(+), 12 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/incubator-systemml/blob/0a235e64/beginners-guide-caffe2dml.md
----------------------------------------------------------------------
diff --git a/beginners-guide-caffe2dml.md b/beginners-guide-caffe2dml.md
index cfcc0cb..dea53fd 100644
--- a/beginners-guide-caffe2dml.md
+++ b/beginners-guide-caffe2dml.md
@@ -31,6 +31,40 @@ limitations under the License.
 
 Caffe2DML is an experimental API that converts a Caffe specification to DML.
 
+## Example: Train Lenet
+
+1. Install the `mlxtend` package to get the MNIST data: `pip install mlxtend`.
+2. (Optional but recommended) Follow the steps mentioned in [the user guide of the native backend](http://apache.github.io/incubator-systemml/native-backend) and install Intel MKL.
+3. Install [SystemML](http://apache.github.io/incubator-systemml/beginners-guide-python#install-systemml).
+4. Invoke the PySpark shell: `pyspark --conf spark.executorEnv.LD_LIBRARY_PATH=/path/to/blas-n-other-dependencies`.
+
+```python
+# Load the MNIST dataset bundled with mlxtend
+from mlxtend.data import mnist_data
+import numpy as np
+from sklearn.utils import shuffle
+X, y = mnist_data()
+X, y = shuffle(X, y)
+
+# Split the data into training and test
+n_samples = len(X)
+X_train = X[:int(.9 * n_samples)]
+y_train = y[:int(.9 * n_samples)]
+X_test = X[int(.9 * n_samples):]
+y_test = y[int(.9 * n_samples):]
+
+# Download the Lenet network
+import urllib
+urllib.urlretrieve('https://raw.githubusercontent.com/niketanpansare/model_zoo/master/caffe/vision/lenet/mnist/lenet.proto', 'lenet.proto')
+urllib.urlretrieve('https://raw.githubusercontent.com/niketanpansare/model_zoo/master/caffe/vision/lenet/mnist/lenet_solver.proto', 'lenet_solver.proto')
+
+# Train Lenet on MNIST using a scikit-learn-like API
+from systemml.mllearn import Caffe2DML
+lenet = Caffe2DML(sqlCtx, solver='lenet_solver.proto', input_shape=(1, 28, 28)).set(debug=True).setStatistics(True)
+lenet.fit(X_train, y_train)
+y_predicted = lenet.predict(X_test)
+```
+
 ## Frequently asked questions
 
 - How to set batch size?
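As a follow-up to the Lenet example added above, here is a minimal evaluation sketch; it assumes `lenet.predict` returns one predicted label per row of `X_test` and uses scikit-learn's `accuracy_score` purely for illustration.

```python
# Evaluate the Lenet predictions on the held-out split from the example above.
# Assumes X_test, y_test and y_predicted are already defined as shown.
from sklearn.metrics import accuracy_score

print('Test accuracy: %f' % accuracy_score(y_test, y_predicted))

# On Python 3, the downloads in the example can be done with urllib.request:
#   from urllib.request import urlretrieve
#   urlretrieve(url, filename)
```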

http://git-wip-us.apache.org/repos/asf/incubator-systemml/blob/0a235e64/index.md
----------------------------------------------------------------------
diff --git a/index.md b/index.md
index c84e7b7..080dfd2 100644
--- a/index.md
+++ b/index.md
@@ -57,6 +57,7 @@ machine in R-like and Python-like declarative languages.
   in Standalone Mode.
 * [JMLC](jmlc) - Java Machine Learning Connector.
   * See [Java Machine Learning Connector (JMLC)](jmlc) for more information.
+* *Experimental* [Caffe2DML API](http://apache.github.io/incubator-systemml/beginners-guide-caffe2dml) for Deep Learning.
 
 ## Language Guides
 

http://git-wip-us.apache.org/repos/asf/incubator-systemml/blob/0a235e64/native-backend.md
----------------------------------------------------------------------
diff --git a/native-backend.md b/native-backend.md
index 86a1340..c64c2fe 100644
--- a/native-backend.md
+++ b/native-backend.md
@@ -1,3 +1,8 @@
+---
+layout: global
+title: Using SystemML with Native BLAS support
+description: Using SystemML with Native BLAS support
+---
 <!--
 {% comment %}
 Licensed to the Apache Software Foundation (ASF) under one or more
@@ -17,6 +22,11 @@ limitations under the License.
 {% endcomment %}
 -->
 
+* This will become a table of contents (this text will be scraped).
+{:toc}
+
+<br/>
+
 # User Guide
 
 By default, SystemML implements all its matrix operations in Java.
@@ -25,16 +35,16 @@ This simplifies deployment especially in a distributed environment.
 In some cases (such as deep learning), the user might want to use native BLAS
 rather than SystemML's internal Java library for performing single-node
 operations such as matrix multiplication, convolution, etc.
+
+To allow SystemML to use native BLAS rather than its internal Java library,
+please set the configuration property `native.blas` to `true`.
+
 By default, SystemML will first attempt to use Intel MKL (if installed)
 and then OpenBLAS (if installed).
 If neither Intel MKL nor OpenBLAS is available, SystemML
 falls back to its internal Java library.
 
-To force SystemML to use internal Java library rather than native BLAS,
-please set the configuration property `native.blas` to `false`.
-
-The current version of SystemML only supports BLAS on Linux machines.
-
+The current version of SystemML only supports BLAS on **Linux** machines.
 
 ## Step 1: Install BLAS
 
@@ -95,19 +105,20 @@ sudo ln -s /lib64/libgomp.so.1 /lib64/libgomp.so
        
 ## Step 3: Provide the location of the native libraries
 
-1. Add the location of the native libraries (i.e. BLAS and other dependencies) 
+1. Pass the location of the native libraries using command-line options:
+
+- [Spark](http://spark.apache.org/docs/latest/configuration.html): `--conf spark.executorEnv.LD_LIBRARY_PATH=/path/to/blas-n-other-dependencies`
+- Java: `-Djava.library.path=/path/to/blas-n-other-dependencies`
+
+2. Alternatively, you can add the location of the native libraries (i.e. BLAS and other dependencies) 
 to the environment variable `LD_LIBRARY_PATH` (on Linux). 
-If you want to use SystemML with Spark, please add the following line to `spark-env.sh`
+If you want to use SystemML with Spark, please add the following line to `spark-env.sh` 
+(or to the bash profile).
 
        ```bash
        export LD_LIBRARY_PATH=/path/to/blas-n-other-dependencies
-       # Or export SPARK_LIBRARY_PATH=/path/to/blas-n-other-dependencies
        ```
 
-2. Alternatively, you can pass the location of the native libraries using command-line options:
-
-- Java: `-Djava.library.path=/path/to/blas-n-other-dependencies`
-- [Spark](http://spark.apache.org/docs/latest/configuration.html): `--driver-library-path`
 
 ## Common issues on Linux
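Tying the two options in Step 3 together, a minimal launch sketch is shown below; the MKL directory is only a placeholder for wherever the BLAS shared libraries and their dependencies are installed on your machine. Note that native BLAS is disabled by default, so the configuration property `native.blas` still has to be set to `true` for SystemML to use it.

```bash
# Placeholder path: point this at the directory that contains libmkl_rt.so
# (or libopenblas.so) and its dependencies.
BLAS_DIR=/opt/intel/mkl/lib/intel64

# Option 2: make the libraries visible through the environment.
export LD_LIBRARY_PATH=$BLAS_DIR:$LD_LIBRARY_PATH

# Option 1: pass the location explicitly to the Spark executors and the JVM.
pyspark --conf spark.executorEnv.LD_LIBRARY_PATH=$BLAS_DIR \
        --driver-java-options "-Djava.library.path=$BLAS_DIR"

# Remember: native BLAS is disabled by default; set the configuration
# property native.blas to true to enable it.
```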
 
