Repository: systemml
Updated Branches:
  refs/heads/gh-pages c44f6c022 -> ddeb11205


[SYSTEMML-1669] Migrate incubator-systemml to systemml in docs

Update 'incubator-systemml' references in docs and README to 'systemml'.
Update dist dev/incubator/systemml to dev/systemml in release process.
Remove 'incubating' from examples and update version numbers.
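The rename described above is mechanical, so it can be sketched as a bulk substitution. The following is a hypothetical illustration only — the scratch directory, file name, and sed patterns are illustrative, not the actual commands used to produce this commit:

```shell
# Hypothetical sketch of the bulk rename this commit performs:
# rewrite 'incubator-systemml' to 'systemml' and drop the
# '-incubating' version suffix in a sample markdown doc.
set -e
demo=$(mktemp -d)
cat > "$demo/example.md" <<'EOF'
See https://github.com/apache/incubator-systemml for the code.
Download systemml-0.13.0-incubating-bin.tgz to get started.
EOF
# -i.bak keeps a backup and works with both GNU and BSD sed
sed -i.bak \
    -e 's#incubator-systemml#systemml#g' \
    -e 's#-incubating##g' \
    "$demo/example.md"
cat "$demo/example.md"
# prints:
#   See https://github.com/apache/systemml for the code.
#   Download systemml-0.13.0-bin.tgz to get started.
```

Running the same two substitutions across the docs tree (e.g. via `find ... -exec sed`) would cover the URL and version changes seen in the hunks below.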


Project: http://git-wip-us.apache.org/repos/asf/systemml/repo
Commit: http://git-wip-us.apache.org/repos/asf/systemml/commit/ddeb1120
Tree: http://git-wip-us.apache.org/repos/asf/systemml/tree/ddeb1120
Diff: http://git-wip-us.apache.org/repos/asf/systemml/diff/ddeb1120

Branch: refs/heads/gh-pages
Commit: ddeb112051a1c3f65bb9fa3456089b96239d606f
Parents: c44f6c0
Author: Deron Eriksson <de...@us.ibm.com>
Authored: Thu Jun 8 11:04:54 2017 -0700
Committer: Deron Eriksson <de...@us.ibm.com>
Committed: Thu Jun 8 11:04:54 2017 -0700

----------------------------------------------------------------------
 _layouts/global.html                 |  4 +-
 algorithms-classification.md         |  8 +--
 algorithms-regression.md             |  2 +-
 beginners-guide-caffe2dml.md         | 10 ++--
 beginners-guide-python.md            | 28 ++++-----
 beginners-guide-to-dml-and-pydml.md  |  2 +-
 contributing-to-systemml.md          | 18 +++---
 devdocs/gpu-backend.md               |  2 +-
 hadoop-batch-mode.md                 | 12 ++--
 index.md                             |  2 +-
 python-reference.md                  | 14 ++---
 release-process.md                   | 99 +++++++++++++++----------------
 spark-batch-mode.md                  |  6 +-
 spark-mlcontext-programming-guide.md | 36 +++++------
 standalone-guide.md                  |  4 +-
 troubleshooting-guide.md             |  2 +-
 16 files changed, 124 insertions(+), 125 deletions(-)
----------------------------------------------------------------------
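With sixteen files touched, the usual way to confirm a migration like this is complete is a recursive grep for the old name over the checkout. A hypothetical check (the scratch directory here stands in for the gh-pages working tree):

```shell
# Hypothetical post-migration check: report whether any doc still
# mentions the old 'incubator-systemml' repository path.
set -e
docs=$(mktemp -d)
echo 'See https://github.com/apache/systemml' > "$docs/index.md"
if grep -rn 'incubator-systemml' "$docs"; then
    echo 'stale references found'
    exit 1
else
    echo 'clean'
fi
# prints: clean
```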


http://git-wip-us.apache.org/repos/asf/systemml/blob/ddeb1120/_layouts/global.html
----------------------------------------------------------------------
diff --git a/_layouts/global.html b/_layouts/global.html
index 815f108..5e84276 100644
--- a/_layouts/global.html
+++ b/_layouts/global.html
@@ -41,12 +41,12 @@
                 <nav class="navbar-collapse collapse">
                     <ul class="nav navbar-nav navbar-right">
                         <li><a href="index.html">Overview</a></li>
-                        <li><a href="https://github.com/apache/incubator-systemml">GitHub</a></li>
+                        <li><a href="https://github.com/apache/systemml">GitHub</a></li>
                         <li class="dropdown">
                             <a href="#" class="dropdown-toggle" data-toggle="dropdown">Documentation<b class="caret"></b></a>
                             <ul class="dropdown-menu" role="menu">
                                 <li><b>Running SystemML:</b></li>
-                                <li><a href="https://github.com/apache/incubator-systemml">SystemML GitHub README</a></li>
+                                <li><a href="https://github.com/apache/systemml">SystemML GitHub README</a></li>
                                 <li><a href="spark-mlcontext-programming-guide.html">Spark MLContext</a></li>
                                 <li><a href="spark-batch-mode.html">Spark Batch Mode</a>
                                 <li><a href="hadoop-batch-mode.html">Hadoop Batch Mode</a>

http://git-wip-us.apache.org/repos/asf/systemml/blob/ddeb1120/algorithms-classification.md
----------------------------------------------------------------------
diff --git a/algorithms-classification.md b/algorithms-classification.md
index 04c5eb8..1895103 100644
--- a/algorithms-classification.md
+++ b/algorithms-classification.md
@@ -229,7 +229,7 @@ if no maximum limit provided
 `mm`, or `csv`; see read/write functions in
 SystemML Language Reference for details.
 
-Please see [mllearn documentation](https://apache.github.io/incubator-systemml/python-reference#mllearn-api) for
+Please see [mllearn documentation](https://apache.github.io/systemml/python-reference#mllearn-api) for
 more details on the Python API. 
 
 ### Examples
@@ -637,7 +637,7 @@ held-out test set. Note that this is an optional argument.
 **confusion**: Location (on HDFS) to store the confusion matrix computed
 using a held-out test set. Note that this is an optional argument.
 
-Please see [mllearn documentation](https://apache.github.io/incubator-systemml/python-reference#mllearn-api) for
+Please see [mllearn documentation](https://apache.github.io/systemml/python-reference#mllearn-api) for
 more details on the Python API. 
 
 #### Examples
@@ -908,7 +908,7 @@ SystemML Language Reference for details.
 **confusion**: Location (on HDFS) to store the confusion matrix computed
     using a held-out test set. Note that this is an optional argument.
 
-Please see [mllearn documentation](https://apache.github.io/incubator-systemml/python-reference#mllearn-api) for
+Please see [mllearn documentation](https://apache.github.io/systemml/python-reference#mllearn-api) for
 more details on the Python API. 
 
 #### Examples
@@ -1246,7 +1246,7 @@ SystemML Language Reference for details.
 **confusion**: Location (on HDFS) to store the confusion matrix computed
     using a held-out test set. Note that this is an optional argument.
 
-Please see [mllearn documentation](https://apache.github.io/incubator-systemml/python-reference#mllearn-api) for
+Please see [mllearn documentation](https://apache.github.io/systemml/python-reference#mllearn-api) for
 more details on the Python API. 
 
 ### Examples

http://git-wip-us.apache.org/repos/asf/systemml/blob/ddeb1120/algorithms-regression.md
----------------------------------------------------------------------
diff --git a/algorithms-regression.md b/algorithms-regression.md
index 13f6cff..df2ad3e 100644
--- a/algorithms-regression.md
+++ b/algorithms-regression.md
@@ -212,7 +212,7 @@ gradient iterations, or `0` if no maximum limit provided
 `mm`, or `csv`; see read/write functions in
 SystemML Language Reference for details.
 
-Please see [mllearn documentation](https://apache.github.io/incubator-systemml/python-reference#mllearn-api) for
+Please see [mllearn documentation](https://apache.github.io/systemml/python-reference#mllearn-api) for
 more details on the Python API. 
 
 ### Examples

http://git-wip-us.apache.org/repos/asf/systemml/blob/ddeb1120/beginners-guide-caffe2dml.md
----------------------------------------------------------------------
diff --git a/beginners-guide-caffe2dml.md b/beginners-guide-caffe2dml.md
index 6d48c61..b44b113 100644
--- a/beginners-guide-caffe2dml.md
+++ b/beginners-guide-caffe2dml.md
@@ -51,7 +51,7 @@ Lenet is a simple convolutional neural network, proposed by Yann LeCun in 1998.
 Similar to Caffe, the network has been modified to add dropout. 
 For more detail, please see http://yann.lecun.com/exdb/lenet/
 
-The [solver specification](https://raw.githubusercontent.com/apache/incubator-systemml/master/scripts/nn/examples/caffe2dml/models/mnist_lenet/lenet_solver.proto)
+The [solver specification](https://raw.githubusercontent.com/apache/systemml/master/scripts/nn/examples/caffe2dml/models/mnist_lenet/lenet_solver.proto)
 specifies to Caffe2DML to use following configuration when generating the training DML script:  
 - `type: "SGD", momentum: 0.9`: Stochastic Gradient Descent with momentum optimizer with `momentum=0.9`.
 - `lr_policy: "exp", gamma: 0.95, base_lr: 0.01`: Use exponential decay learning rate policy (`base_lr * gamma ^ iter`).
@@ -79,8 +79,8 @@ X_test = X[int(.9 * n_samples):]
 y_test = y[int(.9 * n_samples):]
 
 # Download the Lenet network
-urllib.urlretrieve('https://raw.githubusercontent.com/apache/incubator-systemml/master/scripts/nn/examples/caffe2dml/models/mnist_lenet/lenet.proto', 'lenet.proto')
-urllib.urlretrieve('https://raw.githubusercontent.com/apache/incubator-systemml/master/scripts/nn/examples/caffe2dml/models/mnist_lenet/lenet_solver.proto', 'lenet_solver.proto')
+urllib.urlretrieve('https://raw.githubusercontent.com/apache/systemml/master/scripts/nn/examples/caffe2dml/models/mnist_lenet/lenet.proto', 'lenet.proto')
+urllib.urlretrieve('https://raw.githubusercontent.com/apache/systemml/master/scripts/nn/examples/caffe2dml/models/mnist_lenet/lenet_solver.proto', 'lenet_solver.proto')
 
 # Train Lenet On MNIST using scikit-learn like API
 # MNIST dataset contains 28 X 28 gray-scale (number of channel=1).
@@ -106,7 +106,7 @@ lenet.fit(X_train, y_train)
 lenet.predict(X_test)
 ```
 
-For more detail on enabling native BLAS, please see the documentation for the [native backend](http://apache.github.io/incubator-systemml/native-backend).
+For more detail on enabling native BLAS, please see the documentation for the [native backend](http://apache.github.io/systemml/native-backend).
 
 ## Frequently asked questions
 
@@ -137,7 +137,7 @@ we include their licenses in our jar files.
 
 - Enable native BLAS to improve the performance of CP convolution and matrix multiplication operators.
 If you are using OpenBLAS, please ensure that it was built with `USE_OPENMP` flag turned on.
-For more detail see http://apache.github.io/incubator-systemml/native-backend
+For more detail see http://apache.github.io/systemml/native-backend
 
 ```python
 caffe2dmlObject.setConfigProperty("native.blas", "auto")

http://git-wip-us.apache.org/repos/asf/systemml/blob/ddeb1120/beginners-guide-python.md
----------------------------------------------------------------------
diff --git a/beginners-guide-python.md b/beginners-guide-python.md
index b75e73c..266d50f 100644
--- a/beginners-guide-python.md
+++ b/beginners-guide-python.md
@@ -35,7 +35,7 @@ one with an R-like syntax (DML) and one with a Python-like syntax (PyDML).
 Algorithm scripts written in DML and PyDML can be run on Hadoop, on Spark, or in Standalone mode. 
 No script modifications are required to change between modes. SystemML automatically performs advanced optimizations 
 based on data and cluster characteristics, so much of the need to manually tweak algorithms is largely reduced or eliminated.
-To understand more about DML and PyDML, we recommend that you read [Beginner's Guide to DML and PyDML](https://apache.github.io/incubator-systemml/beginners-guide-to-dml-and-pydml.html).
+To understand more about DML and PyDML, we recommend that you read [Beginner's Guide to DML and PyDML](https://apache.github.io/systemml/beginners-guide-to-dml-and-pydml.html).
 
 For convenience of Python users, SystemML exposes several language-level APIs that allow Python users to use SystemML
 and its algorithms without the need to know DML or PyDML. We explain these APIs in the below sections with example usecases.
@@ -92,18 +92,18 @@ If you want to try out the bleeding edge version, please use following commands:
 <div class="codetabs">
 <div data-lang="Python 2" markdown="1">
 ```bash
-git checkout https://github.com/apache/incubator-systemml.git
-cd incubator-systemml
+git checkout https://github.com/apache/systemml.git
+cd systemml
 mvn clean package -P distribution
-pip install target/systemml-0.12.0-incubating-SNAPSHOT-python.tgz
+pip install target/systemml-1.0.0-SNAPSHOT-python.tgz
 ```
 </div>
 <div data-lang="Python 3" markdown="1">
 ```bash
-git checkout https://github.com/apache/incubator-systemml.git
-cd incubator-systemml
+git checkout https://github.com/apache/systemml.git
+cd systemml
 mvn clean package -P distribution
-pip3 install target/systemml-0.12.0-incubating-SNAPSHOT-python.tgz
+pip3 install target/systemml-1.0.0-SNAPSHOT-python.tgz
 ```
 </div>
 </div>
@@ -163,7 +163,7 @@ array([[-60.],
        [-60.]])
 ```
 
-Let us now write a simple script to train [linear regression](https://apache.github.io/incubator-systemml/algorithms-regression.html#linear-regression) 
+Let us now write a simple script to train [linear regression](https://apache.github.io/systemml/algorithms-regression.html#linear-regression) 
 model: $ \beta = solve(X^T X, X^T y) $. For simplicity, we will use direct-solve method and ignore
 regularization parameter as well as intercept. 
 
@@ -204,12 +204,12 @@ will use `mllearn` API described in the next section.
 
 ## Invoke SystemML's algorithms
 
-SystemML also exposes a subpackage [mllearn](https://apache.github.io/incubator-systemml/python-reference#mllearn-api). This subpackage allows Python users to invoke SystemML algorithms
+SystemML also exposes a subpackage [mllearn](https://apache.github.io/systemml/python-reference#mllearn-api). This subpackage allows Python users to invoke SystemML algorithms
 using Scikit-learn or MLPipeline API.  
 
 ### Scikit-learn interface
 
-In the below example, we invoke SystemML's [Linear Regression](https://apache.github.io/incubator-systemml/algorithms-regression.html#linear-regression)
+In the below example, we invoke SystemML's [Linear Regression](https://apache.github.io/systemml/algorithms-regression.html#linear-regression)
 algorithm.
  
 ```python
@@ -242,7 +242,7 @@ Residual sum of squares: 6991.17
 
 As expected, by adding intercept and regularizer the residual error drops significantly.
 
-Here is another example where we invoke SystemML's [Logistic Regression](https://apache.github.io/incubator-systemml/algorithms-classification.html#multinomial-logistic-regression)
+Here is another example where we invoke SystemML's [Logistic Regression](https://apache.github.io/systemml/algorithms-classification.html#multinomial-logistic-regression)
 algorithm on digits datasets.
 
 ```python
@@ -363,8 +363,8 @@ Output:
 
 ## Invoking DML/PyDML scripts using MLContext
 
-The below example demonstrates how to invoke the algorithm [scripts/algorithms/MultiLogReg.dml](https://github.com/apache/incubator-systemml/blob/master/scripts/algorithms/MultiLogReg.dml)
-using Python [MLContext API](https://apache.github.io/incubator-systemml/spark-mlcontext-programming-guide).
+The below example demonstrates how to invoke the algorithm [scripts/algorithms/MultiLogReg.dml](https://github.com/apache/systemml/blob/master/scripts/algorithms/MultiLogReg.dml)
+using Python [MLContext API](https://apache.github.io/systemml/spark-mlcontext-programming-guide).
 
 ```python
 from sklearn import datasets
@@ -380,7 +380,7 @@ X_df = sqlCtx.createDataFrame(pd.DataFrame(X_digits[:int(.9 * n_samples)]))
 y_df = sqlCtx.createDataFrame(pd.DataFrame(y_digits[:int(.9 * n_samples)]))
 ml = sml.MLContext(sc)
 # Run the MultiLogReg.dml script at the given URL
-scriptUrl = "https://raw.githubusercontent.com/apache/incubator-systemml/master/scripts/algorithms/MultiLogReg.dml"
+scriptUrl = "https://raw.githubusercontent.com/apache/systemml/master/scripts/algorithms/MultiLogReg.dml"
 script = sml.dml(scriptUrl).input(X=X_df, Y_vec=y_df).output("B_out")
 beta = ml.execute(script).get('B_out').toNumPy()
 ```

http://git-wip-us.apache.org/repos/asf/systemml/blob/ddeb1120/beginners-guide-to-dml-and-pydml.md
----------------------------------------------------------------------
diff --git a/beginners-guide-to-dml-and-pydml.md b/beginners-guide-to-dml-and-pydml.md
index 9d19cc8..442c07b 100644
--- a/beginners-guide-to-dml-and-pydml.md
+++ b/beginners-guide-to-dml-and-pydml.md
@@ -884,5 +884,5 @@ for (i in 0:numRowsToPrint-1):
 
 The [Language Reference](dml-language-reference.html) contains highly detailed information regarding DML.
 
-In addition, many excellent examples can be found in the [`scripts`](https://github.com/apache/incubator-systemml/tree/master/scripts) directory.
+In addition, many excellent examples can be found in the [`scripts`](https://github.com/apache/systemml/tree/master/scripts) directory.
 

http://git-wip-us.apache.org/repos/asf/systemml/blob/ddeb1120/contributing-to-systemml.md
----------------------------------------------------------------------
diff --git a/contributing-to-systemml.md b/contributing-to-systemml.md
index b7d1d98..97c2654 100644
--- a/contributing-to-systemml.md
+++ b/contributing-to-systemml.md
@@ -74,12 +74,12 @@ fashion, please contact us on the dev mailing list and we will be happy to help
 
 Once you have an issue to work on, how do you go about doing your work? The first thing you need is a GitHub
 account. Once you have a GitHub account, go to the Apache SystemML GitHub site at
-[https://github.com/apache/incubator-systemml](https://github.com/apache/incubator-systemml) and
+[https://github.com/apache/systemml](https://github.com/apache/systemml) and
 click the Fork button to fork a personal remote copy of the SystemML repository to your GitHub account.
 
 The next step is to clone your SystemML fork to your local machine.
 
-       $ git clone https://github.com/YOUR_GITHUB_NAME/incubator-systemml.git
+       $ git clone https://github.com/YOUR_GITHUB_NAME/systemml.git
 
 Following this, it's a good idea to set your git user name and email address. In addition, you may want
 to set the `push.default` property to `simple`. You only need to execute these commands once.
@@ -91,17 +91,17 @@ to set the `push.default` property to `simple`. You only need to execute these c
 
 Next, reference the main SystemML repository as a remote repository. By convention, you can
 call this `upstream`. You only need to add the remote `upstream` repository once.
 
-       $ git remote add upstream https://github.com/apache/incubator-systemml.git
+       $ git remote add upstream https://github.com/apache/systemml.git
 
 After this, you should have an `origin` repository, which references your personal forked SystemML
 repository on GitHub, and the `upstream` repository, which references the main SystemML repository
 on GitHub.
 
        $ git remote -v
-       origin   https://github.com/YOUR_GITHUB_NAME/incubator-systemml.git (fetch)
-       origin   https://github.com/YOUR_GITHUB_NAME/incubator-systemml.git (push)
-       upstream https://github.com/apache/incubator-systemml.git (fetch)
-       upstream https://github.com/apache/incubator-systemml.git (push)
+       origin   https://github.com/YOUR_GITHUB_NAME/systemml.git (fetch)
+       origin   https://github.com/YOUR_GITHUB_NAME/systemml.git (push)
+       upstream https://github.com/apache/systemml.git (fetch)
+       upstream https://github.com/apache/systemml.git (push)
 
 The main code branch by convention is the `master` branch. You can check out the `master` branch
 using the `checkout` command:
@@ -150,7 +150,7 @@ that you did on this branch. A Pull Request is a request for project committers
 write access to Apache SystemML) to review your code and integrate your code into the project.
 Typically, you will see a green button to allow you to file a Pull Request.
 
-Once your Pull Request is opened at [SystemML Pull Requests](https://github.com/apache/incubator-systemml/pulls),
+Once your Pull Request is opened at [SystemML Pull Requests](https://github.com/apache/systemml/pulls),
 typically Jenkins will automatically build the project to see
 if all tests pass when run for your particular branch. These automatic builds
 can be seen [here](https://sparktc.ibmcloud.com/jenkins/job/SystemML-PullRequestBuilder/).
@@ -195,7 +195,7 @@ You can allow others to preview your documentation updates on GitHub by pushing
 For instance, if you have filed a Pull Request for a documentation update on a regular branch,
 you could additionally push the `docs` subtree to the remote `gh-pages` branch. In the Pull Request
 conversation, you could include a link to the documentation that was automatically generated
-when you pushed to `gh-pages`. The URL is http://&lt;YOUR_NAME&gt;.github.io/incubator-systemml/.
+when you pushed to `gh-pages`. The URL is http://&lt;YOUR_NAME&gt;.github.io/systemml/.
 
 If you experience issues pushing the `docs` subtree to the `gh-pages` branch because you've
 previously pushed from a different branch, one simple solution is to delete the remote `gh-pages`

http://git-wip-us.apache.org/repos/asf/systemml/blob/ddeb1120/devdocs/gpu-backend.md
----------------------------------------------------------------------
diff --git a/devdocs/gpu-backend.md b/devdocs/gpu-backend.md
index 40311c7..63da844 100644
--- a/devdocs/gpu-backend.md
+++ b/devdocs/gpu-backend.md
@@ -57,5 +57,5 @@ To use SystemML's GPU backend when using the jar or uber-jar
 
 For example: to use GPU backend in standalone mode:
 ```bash
-java -classpath $JAR_PATH:systemml-0.14.0-incubating-SNAPSHOT-standalone.jar org.apache.sysml.api.DMLScript -f MyDML.dml -gpu -exec singlenode ... 
+java -classpath $JAR_PATH:systemml-1.0.0-SNAPSHOT-standalone.jar org.apache.sysml.api.DMLScript -f MyDML.dml -gpu -exec singlenode ... 
 ```

http://git-wip-us.apache.org/repos/asf/systemml/blob/ddeb1120/hadoop-batch-mode.md
----------------------------------------------------------------------
diff --git a/hadoop-batch-mode.md b/hadoop-batch-mode.md
index 37df064..9b29d29 100644
--- a/hadoop-batch-mode.md
+++ b/hadoop-batch-mode.md
@@ -140,15 +140,15 @@ Following this, I unpacked it.
 
 **Alternatively**, we could have built the SystemML distributed release using [Apache Maven](http://maven.apache.org) and unpacked it.
 
-       [hadoop@host1 ~]$ git clone https://github.com/apache/incubator-systemml.git
-       [hadoop@host1 ~]$ cd incubator-systemml
-       [hadoop@host1 incubator-systemml]$ mvn clean package -P distribution
-       [hadoop@host1 incubator-systemml]$ tar -xvzf target/systemml-{{site.SYSTEMML_VERSION}}.tar.gz -C ..
+       [hadoop@host1 ~]$ git clone https://github.com/apache/systemml.git
+       [hadoop@host1 ~]$ cd systemml
+       [hadoop@host1 systemml]$ mvn clean package -P distribution
+       [hadoop@host1 systemml]$ tar -xvzf target/systemml-{{site.SYSTEMML_VERSION}}.tar.gz -C ..
        [hadoop@host1 ~]$ cd ..
 
 I downloaded the `genLinearRegressionData.dml` script that is used in the SystemML README example.
 
-       [hadoop@host1 ~]$ wget https://raw.githubusercontent.com/apache/incubator-systemml/master/scripts/datagen/genLinearRegressionData.dml
+       [hadoop@host1 ~]$ wget https://raw.githubusercontent.com/apache/systemml/master/scripts/datagen/genLinearRegressionData.dml
 
 Next, I invoked the `genLinearRegressionData.dml` DML script in Hadoop Batch mode.
 Hadoop was executed with the `SystemML.jar` file specified by the hadoop `jar` option.
@@ -853,7 +853,7 @@ The `numreducers` property specifies the number of reduce tasks per MR job.
 
 To begin, I'll download the `genRandData4Kmeans.dml` script that I'll use to generate a set of data.
 
-       [hadoop@host1 ~]$ wget https://raw.githubusercontent.com/apache/incubator-systemml/master/scripts/datagen/genRandData4Kmeans.dml
+       [hadoop@host1 ~]$ wget https://raw.githubusercontent.com/apache/systemml/master/scripts/datagen/genRandData4Kmeans.dml
 
 A description of the named arguments that can be passed in to this script can be found in the comment section at the top of the
 `genRandData4Kmeans.dml` file. For data, I'll generate a matrix `X.mtx` consisting of 1 million rows and 100 features. I'll explicitly reference my `SystemML-config.xml` file, since I'm

http://git-wip-us.apache.org/repos/asf/systemml/blob/ddeb1120/index.md
----------------------------------------------------------------------
diff --git a/index.md b/index.md
index 6b49e68..96b6b2a 100644
--- a/index.md
+++ b/index.md
@@ -30,7 +30,7 @@ SystemML's distinguishing characteristics are:
   2. **Multiple execution modes**, including Spark MLContext, Spark Batch, Hadoop Batch, Standalone, and JMLC.
   3. **Automatic optimization** based on data and cluster characteristics to ensure both efficiency and scalability.
 
-The [SystemML GitHub README](https://github.com/apache/incubator-systemml) describes
+The [SystemML GitHub README](https://github.com/apache/systemml) describes
 building, testing, and running SystemML. Please read [Contributing to SystemML](contributing-to-systemml)
 to find out how to help make SystemML even better!
 

http://git-wip-us.apache.org/repos/asf/systemml/blob/ddeb1120/python-reference.md
----------------------------------------------------------------------
diff --git a/python-reference.md b/python-reference.md
index 4519bc1..7de3fb0 100644
--- a/python-reference.md
+++ b/python-reference.md
@@ -35,7 +35,7 @@ one with an R-like syntax (DML) and one with a Python-like syntax (PyDML).
 Algorithm scripts written in DML and PyDML can be run on Hadoop, on Spark, or in Standalone mode. 
 No script modifications are required to change between modes. SystemML automatically performs advanced optimizations 
 based on data and cluster characteristics, so much of the need to manually tweak algorithms is largely reduced or eliminated.
-To understand more about DML and PyDML, we recommend that you read [Beginner's Guide to DML and PyDML](https://apache.github.io/incubator-systemml/beginners-guide-to-dml-and-pydml.html).
+To understand more about DML and PyDML, we recommend that you read [Beginner's Guide to DML and PyDML](https://apache.github.io/systemml/beginners-guide-to-dml-and-pydml.html).
 
 For convenience of Python users, SystemML exposes several language-level APIs that allow Python users to use SystemML
 and its algorithms without the need to know DML or PyDML. We explain these APIs in the below sections.
@@ -108,7 +108,7 @@ save(mVar4, " ")
 
 Since matrix is backed by lazy evaluation and uses a recursive Depth First Search (DFS),
 you may run into `RuntimeError: maximum recursion depth exceeded`. 
-Please see below [troubleshooting steps](http://apache.github.io/incubator-systemml/python-reference#maximum-recursion-depth-exceeded)
+Please see below [troubleshooting steps](http://apache.github.io/systemml/python-reference#maximum-recursion-depth-exceeded)
 
 ### Dealing with the loops
 
@@ -118,7 +118,7 @@ This can lead to two issues:
 
 1. Since matrix is backed by lazy evaluation and uses a recursive Depth First Search (DFS),
 you may run into `RuntimeError: maximum recursion depth exceeded`. 
-Please see below [troubleshooting steps](http://apache.github.io/incubator-systemml/python-reference#maximum-recursion-depth-exceeded)
+Please see below [troubleshooting steps](http://apache.github.io/systemml/python-reference#maximum-recursion-depth-exceeded)
 
 2. Significant parsing/compilation overhead of potentially large unrolled DML script.
 
@@ -340,8 +340,8 @@ As a result, it offers a convenient way to interact with SystemML from the Spark
 
 ### Usage
 
-The below example demonstrates how to invoke the algorithm [scripts/algorithms/MultiLogReg.dml](https://github.com/apache/incubator-systemml/blob/master/scripts/algorithms/MultiLogReg.dml)
-using Python [MLContext API](https://apache.github.io/incubator-systemml/spark-mlcontext-programming-guide).
+The below example demonstrates how to invoke the algorithm [scripts/algorithms/MultiLogReg.dml](https://github.com/apache/systemml/blob/master/scripts/algorithms/MultiLogReg.dml)
+using Python [MLContext API](https://apache.github.io/systemml/spark-mlcontext-programming-guide).
 
 ```python
 from sklearn import datasets, neighbors
@@ -369,7 +369,7 @@ beta = ml.execute(script).get('B_out').toNumPy()
 
 mllearn API is designed to be compatible with scikit-learn and MLLib.
 The classes that are part of mllearn API are LogisticRegression, LinearRegression, SVM, NaiveBayes 
-and [Caffe2DML](http://apache.github.io/incubator-systemml/beginners-guide-caffe2dml).
+and [Caffe2DML](http://apache.github.io/systemml/beginners-guide-caffe2dml).
 
 The below code describes how to use mllearn API for training:
 
@@ -423,7 +423,7 @@ The table below describes the parameter available for mllearn algorithms:
 | is_multi_class | Specifies whether to use binary-class or multi-class classifier (default: False) | - | - | X | - |
 | laplace | Laplace smoothing specified by the user to avoid creation of 0 probabilities (default: 1.0) | - | - | - | X |
 
-In the below example, we invoke SystemML's [Logistic Regression](https://apache.github.io/incubator-systemml/algorithms-classification.html#multinomial-logistic-regression)
+In the below example, we invoke SystemML's [Logistic Regression](https://apache.github.io/systemml/algorithms-classification.html#multinomial-logistic-regression)
 algorithm on digits datasets.
 
 ```python

http://git-wip-us.apache.org/repos/asf/systemml/blob/ddeb1120/release-process.md
----------------------------------------------------------------------
diff --git a/release-process.md b/release-process.md
index 9cf9821..f41c7c8 100644
--- a/release-process.md
+++ b/release-process.md
@@ -39,7 +39,7 @@ the release candidate is deployed to servers for review.)
 
 <a href="#release-candidate-checklist">Up to Checklist</a>
 
-Verify that each expected artifact is present at [https://dist.apache.org/repos/dist/dev/incubator/systemml/](https://dist.apache.org/repos/dist/dev/incubator/systemml/) and that each artifact has accompanying
+Verify that each expected artifact is present at [https://dist.apache.org/repos/dist/dev/systemml/](https://dist.apache.org/repos/dist/dev/systemml/) and that each artifact has accompanying
 checksums (such as .asc and .md5).
 
 
@@ -57,10 +57,10 @@ with an empty local Maven repository.
 
 Here is an example:
 
-       $ git clone https://github.com/apache/incubator-systemml.git
-       $ cd incubator-systemml
+       $ git clone https://github.com/apache/systemml.git
+       $ cd systemml
        $ git tag -l
-       $ git checkout tags/0.11.0-incubating-rc1 -b 0.11.0-incubating-rc1
+       $ git checkout tags/1.0.0-rc1 -b 1.0.0-rc1
       $ mvn -Dmaven.repo.local=$HOME/.m2/temp-repo clean package -P distribution
 
 
@@ -81,43 +81,43 @@ The test suite can be run using:
 Validate that all of the binary artifacts can execute, including those artifacts packaged
 in other artifacts (in the tgz and zip artifacts).
 
-The build artifacts should be downloaded from [https://dist.apache.org/repos/dist/dev/incubator/systemml/](https://dist.apache.org/repos/dist/dev/incubator/systemml/) and these artifacts should be tested, as in
+The build artifacts should be downloaded from [https://dist.apache.org/repos/dist/dev/systemml/](https://dist.apache.org/repos/dist/dev/systemml/) and these artifacts should be tested, as in
 this OS X example.
 
        # download artifacts
-       wget -r -nH -nd -np -R 'index.html*' https://dist.apache.org/repos/dist/dev/incubator/systemml/0.13.0-incubating-rc1/
+       wget -r -nH -nd -np -R 'index.html*' https://dist.apache.org/repos/dist/dev/systemml/1.0.0-rc1/
 
        # verify standalone tgz works
-       tar -xvzf systemml-0.13.0-incubating-bin.tgz
-       cd systemml-0.13.0-incubating-bin
+       tar -xvzf systemml-1.0.0-bin.tgz
+       cd systemml-1.0.0-bin
        echo "print('hello world');" > hello.dml
        ./runStandaloneSystemML.sh hello.dml
        cd ..
 
       # verify standalone zip works
-       rm -rf systemml-0.13.0-incubating-bin
-       unzip systemml-0.13.0-incubating-bin.zip
-       cd systemml-0.13.0-incubating-bin
+       rm -rf systemml-1.0.0-bin
+       unzip systemml-1.0.0-bin.zip
+       cd systemml-1.0.0-bin
        echo "print('hello world');" > hello.dml
        ./runStandaloneSystemML.sh hello.dml
        cd ..
 
        # verify src works
-       tar -xvzf systemml-0.13.0-incubating-src.tgz
-       cd systemml-0.13.0-incubating-src
+       tar -xvzf systemml-1.0.0-src.tgz
+       cd systemml-1.0.0-src
        mvn clean package -P distribution
        cd target/
-       java -cp "./lib/*:systemml-0.13.0-incubating.jar" org.apache.sysml.api.DMLScript -s "print('hello world');"
+       java -cp "./lib/*:systemml-1.0.0.jar" org.apache.sysml.api.DMLScript -s "print('hello world');"
        java -cp "./lib/*:SystemML.jar" org.apache.sysml.api.DMLScript -s "print('hello world');"
        cd ../..
 
        # verify spark batch mode
        export SPARK_HOME=~/spark-2.1.0-bin-hadoop2.7
-       cd systemml-0.13.0-incubating-bin/target/lib
-       $SPARK_HOME/bin/spark-submit systemml-0.13.0-incubating.jar -s 
"print('hello world');" -exec hybrid_spark
+       cd systemml-1.0.0-bin/target/lib
+       $SPARK_HOME/bin/spark-submit systemml-1.0.0.jar -s "print('hello 
world');" -exec hybrid_spark
 
        # verify hadoop batch mode
-       hadoop jar systemml-0.13.0-incubating.jar -s "print('hello world');"
+       hadoop jar systemml-1.0.0.jar -s "print('hello world');"
 
 
        # verify python artifact
@@ -127,8 +127,8 @@ this OS X example.
        pip install scipy
        export SPARK_HOME=~/spark-2.1.0-bin-hadoop2.7
        # get into the pyspark prompt
-       cd systemml-0.13.0
-       $SPARK_HOME/bin/pyspark --driver-class-path 
systemml-java/systemml-0.13.0-incubating.jar
+       cd systemml-1.0.0
+       $SPARK_HOME/bin/pyspark --driver-class-path 
systemml-java/systemml-1.0.0.jar
        # Use this program at the prompt:
        import systemml as sml
        import numpy as np
@@ -188,8 +188,8 @@ The project should be built using the `src` (tgz and zip) 
artifacts.
 In addition, the test suite should be run using an `src` artifact and
 the tests should pass.
 
-       tar -xvzf systemml-0.13.0-incubating-src.tgz
-       cd systemml-0.13.0-incubating-src
+       tar -xvzf systemml-1.0.0-src.tgz
+       cd systemml-1.0.0-src
        mvn clean package -P distribution
        mvn verify
 
@@ -202,11 +202,11 @@ The standalone tgz and zip artifacts contain 
`runStandaloneSystemML.sh` and `run
 files. Verify that one or more algorithms can be run on a single node using 
these
 standalone distributions.
 
-Here is an example based on the [Standalone 
Guide](http://apache.github.io/incubator-systemml/standalone-guide.html)
+Here is an example based on the [Standalone 
Guide](http://apache.github.io/systemml/standalone-guide.html)
 demonstrating the execution of an algorithm (on OS X).
 
-       tar -xvzf systemml-0.13.0-incubating-bin.tgz
-       cd systemml-0.13.0-incubating-bin
+       tar -xvzf systemml-1.0.0-bin.tgz
+       cd systemml-1.0.0-bin
        wget -P data/ 
http://archive.ics.uci.edu/ml/machine-learning-databases/haberman/haberman.data
        echo '{"rows": 306, "cols": 4, "format": "csv"}' > 
data/haberman.data.mtd
        echo '1,1,1,2' > data/types.csv
@@ -223,12 +223,12 @@ Verify that SystemML runs algorithms on Spark locally.
 
 Here is an example of running the `Univar-Stats.dml` algorithm on randomly
generated data.
 
-       cd systemml-0.13.0-incubating-bin/lib
+       cd systemml-1.0.0-bin/lib
        export SPARK_HOME=~/spark-2.1.0-bin-hadoop2.7
-       $SPARK_HOME/bin/spark-submit systemml-0.13.0-incubating.jar -f 
../scripts/datagen/genRandData4Univariate.dml -exec hybrid_spark -args 1000000 
100 10 1 2 3 4 uni.mtx
+       $SPARK_HOME/bin/spark-submit systemml-1.0.0.jar -f 
../scripts/datagen/genRandData4Univariate.dml -exec hybrid_spark -args 1000000 
100 10 1 2 3 4 uni.mtx
        echo '1' > uni-types.csv
        echo '{"rows": 1, "cols": 1, "format": "csv"}' > uni-types.csv.mtd
-       $SPARK_HOME/bin/spark-submit systemml-0.13.0-incubating.jar -f 
../scripts/algorithms/Univar-Stats.dml -exec hybrid_spark -nvargs X=uni.mtx 
TYPES=uni-types.csv STATS=uni-stats.txt CONSOLE_OUTPUT=TRUE
+       $SPARK_HOME/bin/spark-submit systemml-1.0.0.jar -f 
../scripts/algorithms/Univar-Stats.dml -exec hybrid_spark -nvargs X=uni.mtx 
TYPES=uni-types.csv STATS=uni-stats.txt CONSOLE_OUTPUT=TRUE
        cd ..
 
 
@@ -240,8 +240,8 @@ Verify that SystemML runs algorithms on Hadoop locally.
 
 Based on the "Single-Node Spark" setup above, the `Univar-Stats.dml` algorithm 
could be run as follows:
 
-       cd systemml-0.13.0-incubating-bin/lib
-       hadoop jar systemml-0.13.0-incubating.jar -f 
../scripts/algorithms/Univar-Stats.dml -nvargs X=uni.mtx TYPES=uni-types.csv 
STATS=uni-stats.txt CONSOLE_OUTPUT=TRUE
+       cd systemml-1.0.0-bin/lib
+       hadoop jar systemml-1.0.0.jar -f ../scripts/algorithms/Univar-Stats.dml 
-nvargs X=uni.mtx TYPES=uni-types.csv STATS=uni-stats.txt CONSOLE_OUTPUT=TRUE
 
 
 ## Notebooks
@@ -249,7 +249,7 @@ Based on the "Single-Node Spark" setup above, the 
`Univar-Stats.dml` algorithm c
 <a href="#release-candidate-checklist">Up to Checklist</a>
 
 Verify that SystemML can be executed from Jupyter and Zeppelin notebooks.
-For examples, see the [Spark MLContext Programming 
Guide](http://apache.github.io/incubator-systemml/spark-mlcontext-programming-guide.html).
+For examples, see the [Spark MLContext Programming 
Guide](http://apache.github.io/systemml/spark-mlcontext-programming-guide.html).
 
 
 ## Performance Suite
@@ -263,7 +263,6 @@ include 80MB, 800MB, 8GB, and 80GB data sizes.
 # Voting
 
 Following a successful release candidate vote by SystemML PMC members on the 
SystemML mailing list, the release candidate
-is voted on by Incubator PMC members on the general incubator mailing list. If 
this vote succeeds, the release candidate
 has been approved.
 
 
@@ -282,39 +281,39 @@ This section describes how to deploy versioned project 
documentation to the main
 Note that versioned project documentation is committed directly to the `svn` 
project's `docs` folder.
 The versioned project documentation is not committed to the website's `git` 
project.
 
-Checkout branch in main project (`incubator-systemml`).
+Checkout branch in main project (`systemml`).
 
-       $ git checkout branch-0.13.0
+       $ git checkout branch-1.0.0
 
-In `incubator-systemml/docs/_config.yml`, set:
+In `systemml/docs/_config.yml`, set:
 
-* `SYSTEMML_VERSION` to project version (0.13.0)
+* `SYSTEMML_VERSION` to project version (1.0.0)
 * `FEEDBACK_LINKS` to `false` (only have feedback links on `LATEST` docs)
 * `API_DOCS_MENU` to `true` (adds `API Docs` menu to get to project javadocs)
 
-Generate `docs/_site` by running `bundle exec jekyll serve` in 
`incubator-systemml/docs`.
+Generate `docs/_site` by running `bundle exec jekyll serve` in `systemml/docs`.
 
        $ bundle exec jekyll serve
 
 Verify documentation site looks correct.
 
-In website `svn` project, create `incubator-systemml-website-site/docs/0.13.0` 
folder.
+In website `svn` project, create `systemml-website-site/docs/1.0.0` folder.
 
-Copy contents of `incubator-systemml/docs/_site` to 
`incubator-systemml-website-site/docs/0.13.0`.
+Copy contents of `systemml/docs/_site` to `systemml-website-site/docs/1.0.0`.
 
 Delete any unnecessary files (`Gemfile`, `Gemfile.lock`).
 
-Create `incubator-systemml-website-site/docs/0.13.0/api/java` folder for 
javadocs.
+Create `systemml-website-site/docs/1.0.0/api/java` folder for javadocs.
 
-Update `incubator-systemml/pom.xml` project version to what should be 
displayed in javadocs (such as `0.13.0`).
+Update `systemml/pom.xml` project version to what should be displayed in 
javadocs (such as `1.0.0`).
 
 Build project (which generates javadocs).
 
        $ mvn clean package -P distribution
 
-Copy contents of `incubator-systemml/target/apidocs` to 
`incubator-systemml-website-site/docs/0.13.0/api/java`.
+Copy contents of `systemml/target/apidocs` to 
`systemml-website-site/docs/1.0.0/api/java`.
 
-Open up `file:///.../incubator-systemml-website-site/docs/0.13.0/index.html` 
and verify `API Docs` &rarr; `Javadoc` link works and that the correct Javadoc 
version is displayed. Verify feedback links under `Issues` menu are not present.
+Open up `file:///.../systemml-website-site/docs/1.0.0/index.html` and verify 
`API Docs` &rarr; `Javadoc` link works and that the correct Javadoc version is 
displayed. Verify feedback links under `Issues` menu are not present.
 
 Clean up any unnecessary files (such as deleting `.DS_Store` files on OS X).
 
@@ -323,31 +322,31 @@ Clean up any unnecessary files (such as deleting 
`.DS_Store` files on OS X).
 Commit the versioned project documentation to `svn`:
 
        $ svn status
-       $ svn add docs/0.13.0
-       $ svn commit -m "Add 0.13.0 docs to website"
+       $ svn add docs/1.0.0
+       $ svn commit -m "Add 1.0.0 docs to website"
 
-Update `incubator-systemml-website/_src/documentation.html` to include 0.13.0 
link.
+Update `systemml-website/_src/documentation.html` to include 1.0.0 link.
 
-Start main website site by running `gulp` in `incubator-systemml-website`:
+Start main website site by running `gulp` in `systemml-website`:
 
        $ gulp
 
 Commit and push the update to `git` project.
 
        $ git add -u
-       $ git commit -m "Add 0.13.0 link to documentation page"
+       $ git commit -m "Add 1.0.0 link to documentation page"
        $ git push
        $ git push apache master
 
-Copy contents of `incubator-systemml-website/_site` (generated by `gulp`) to 
`incubator-systemml-website-site`.
-After doing so, we should see that 
`incubator-systemml-website-site/documentation.html` has been updated.
+Copy contents of `systemml-website/_site` (generated by `gulp`) to 
`systemml-website-site`.
+After doing so, we should see that `systemml-website-site/documentation.html` 
has been updated.
 
        $ svn status
        $ svn diff
 
 Commit the update to `documentation.html` to publish the website update.
 
-       $ svn commit -m "Add 0.13.0 link to documentation page"
+       $ svn commit -m "Add 1.0.0 link to documentation page"
 
 The versioned project documentation is now deployed to the main website, and 
the
 [Documentation Page](http://systemml.apache.org/documentation) contains a link 
to the versioned documentation.
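The standalone tgz verification steps above follow a pack-extract-run pattern. As a minimal local sketch of that pattern, using a stand-in archive built on the spot rather than a real release artifact (all paths here are illustrative):

```shell
# Build a throwaway "release" archive, then verify it the same way the
# checklist verifies the real one: extract, enter, run a hello-world DML file.
workdir=$(mktemp -d)
cd "$workdir"
mkdir systemml-1.0.0-bin
echo "print('hello world');" > systemml-1.0.0-bin/hello.dml
tar -czf systemml-1.0.0-bin.tgz systemml-1.0.0-bin

# Simulate a fresh download: remove the source tree, extract from the tgz
rm -rf systemml-1.0.0-bin
tar -xzf systemml-1.0.0-bin.tgz
cat systemml-1.0.0-bin/hello.dml
# → print('hello world');
```

The real checklist then runs `./runStandaloneSystemML.sh hello.dml` inside the extracted directory; the round-trip above only checks that the archive extracts to the expected layout.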

http://git-wip-us.apache.org/repos/asf/systemml/blob/ddeb1120/spark-batch-mode.md
----------------------------------------------------------------------
diff --git a/spark-batch-mode.md b/spark-batch-mode.md
index 39bcd3e..353dff6 100644
--- a/spark-batch-mode.md
+++ b/spark-batch-mode.md
@@ -77,8 +77,8 @@ For best performance, we recommend setting the following 
flags when running Syst
 # Examples
 
 Please see the MNIST examples in the included
-[SystemML-NN](https://github.com/apache/incubator-systemml/tree/master/scripts/staging/SystemML-NN)
+[SystemML-NN](https://github.com/apache/systemml/tree/master/scripts/staging/SystemML-NN)
 library for examples of Spark Batch mode execution with SystemML to train 
MNIST classifiers:
 
-  * [MNIST Softmax 
Classifier](https://github.com/apache/incubator-systemml/blob/master/scripts/staging/SystemML-NN/examples/mnist_softmax-train.dml)
-  * [MNIST LeNet 
ConvNet](https://github.com/apache/incubator-systemml/blob/master/scripts/staging/SystemML-NN/examples/mnist_lenet-train.dml)
+  * [MNIST Softmax 
Classifier](https://github.com/apache/systemml/blob/master/scripts/staging/SystemML-NN/examples/mnist_softmax-train.dml)
+  * [MNIST LeNet 
ConvNet](https://github.com/apache/systemml/blob/master/scripts/staging/SystemML-NN/examples/mnist_lenet-train.dml)

http://git-wip-us.apache.org/repos/asf/systemml/blob/ddeb1120/spark-mlcontext-programming-guide.md
----------------------------------------------------------------------
diff --git a/spark-mlcontext-programming-guide.md 
b/spark-mlcontext-programming-guide.md
index ddccde1..8123a89 100644
--- a/spark-mlcontext-programming-guide.md
+++ b/spark-mlcontext-programming-guide.md
@@ -533,7 +533,7 @@ We'll pull the data from a URL and convert it to an RDD, 
`habermanRDD`. Next, we
 stating that the matrix consists of 306 rows and 4 columns.
 
 As we can see from the comments in the script
-[here](https://raw.githubusercontent.com/apache/incubator-systemml/master/scripts/algorithms/Univar-Stats.dml),
 the
+[here](https://raw.githubusercontent.com/apache/systemml/master/scripts/algorithms/Univar-Stats.dml),
 the
 script requires a 'TYPES' input matrix that lists the types of the features (1 
for scale, 2 for nominal, 3 for
 ordinal), so we create a `typesRDD` matrix consisting of 1 row and 4 columns, 
with corresponding metadata, `typesMetadata`.
 
@@ -554,7 +554,7 @@ val habermanRDD = sc.parallelize(habermanList)
 val habermanMetadata = new MatrixMetadata(306, 4)
 val typesRDD = sc.parallelize(Array("1.0,1.0,1.0,2.0"))
 val typesMetadata = new MatrixMetadata(1, 4)
-val scriptUrl = 
"https://raw.githubusercontent.com/apache/incubator-systemml/master/scripts/algorithms/Univar-Stats.dml";
+val scriptUrl = 
"https://raw.githubusercontent.com/apache/systemml/master/scripts/algorithms/Univar-Stats.dml";
 val uni = dmlFromUrl(scriptUrl).in("A", habermanRDD, habermanMetadata).in("K", 
typesRDD, typesMetadata).in("$CONSOLE_OUTPUT", true)
 ml.execute(uni)
 
@@ -580,8 +580,8 @@ typesRDD: org.apache.spark.rdd.RDD[String] = 
ParallelCollectionRDD[160] at paral
 scala> val typesMetadata = new MatrixMetadata(1, 4)
 typesMetadata: org.apache.sysml.api.mlcontext.MatrixMetadata = rows: 1, 
columns: 4, non-zeros: None, rows per block: None, columns per block: None
 
-scala> val scriptUrl = 
"https://raw.githubusercontent.com/apache/incubator-systemml/master/scripts/algorithms/Univar-Stats.dml";
-scriptUrl: String = 
https://raw.githubusercontent.com/apache/incubator-systemml/master/scripts/algorithms/Univar-Stats.dml
+scala> val scriptUrl = 
"https://raw.githubusercontent.com/apache/systemml/master/scripts/algorithms/Univar-Stats.dml";
+scriptUrl: String = 
https://raw.githubusercontent.com/apache/systemml/master/scripts/algorithms/Univar-Stats.dml
 
 scala> val uni = dmlFromUrl(scriptUrl).in("A", habermanRDD, 
habermanMetadata).in("K", typesRDD, typesMetadata).in("$CONSOLE_OUTPUT", true)
 uni: org.apache.sysml.api.mlcontext.Script =
@@ -667,7 +667,7 @@ format, metadata needs to be supplied for the matrix.
 {% highlight scala %}
 val habermanUrl = 
"http://archive.ics.uci.edu/ml/machine-learning-databases/haberman/haberman.data";
 val typesRDD = sc.parallelize(Array("1.0,1.0,1.0,2.0"))
-val scriptUrl = 
"https://raw.githubusercontent.com/apache/incubator-systemml/master/scripts/algorithms/Univar-Stats.dml";
+val scriptUrl = 
"https://raw.githubusercontent.com/apache/systemml/master/scripts/algorithms/Univar-Stats.dml";
 val uni = dmlFromUrl(scriptUrl).in("A", new java.net.URL(habermanUrl)).in("K", 
typesRDD).in("$CONSOLE_OUTPUT", true)
 ml.execute(uni)
 {% endhighlight %}
@@ -681,8 +681,8 @@ habermanUrl: String = 
http://archive.ics.uci.edu/ml/machine-learning-databases/h
 scala> val typesRDD = sc.parallelize(Array("1.0,1.0,1.0,2.0"))
 typesRDD: org.apache.spark.rdd.RDD[String] = ParallelCollectionRDD[50] at 
parallelize at <console>:33
 
-scala> val scriptUrl = 
"https://raw.githubusercontent.com/apache/incubator-systemml/master/scripts/algorithms/Univar-Stats.dml";
-scriptUrl: String = 
https://raw.githubusercontent.com/apache/incubator-systemml/master/scripts/algorithms/Univar-Stats.dml
+scala> val scriptUrl = 
"https://raw.githubusercontent.com/apache/systemml/master/scripts/algorithms/Univar-Stats.dml";
+scriptUrl: String = 
https://raw.githubusercontent.com/apache/systemml/master/scripts/algorithms/Univar-Stats.dml
 
 scala> val uni = dmlFromUrl(scriptUrl).in("A", new 
java.net.URL(habermanUrl)).in("K", typesRDD).in("$CONSOLE_OUTPUT", true)
 uni: org.apache.sysml.api.mlcontext.Script =
@@ -762,7 +762,7 @@ None
 ### Input Variables vs Input Parameters
 
 If we examine the
-[`Univar-Stats.dml`](https://raw.githubusercontent.com/apache/incubator-systemml/master/scripts/algorithms/Univar-Stats.dml)
+[`Univar-Stats.dml`](https://raw.githubusercontent.com/apache/systemml/master/scripts/algorithms/Univar-Stats.dml)
 file, we see in the comments that it can take 4 input
 parameters, `$X`, `$TYPES`, `$CONSOLE_OUTPUT`, and `$STATS`. Input parameters 
are typically useful when
 executing SystemML in Standalone mode, Spark batch mode, or Hadoop batch mode. 
For example, `$X` specifies
@@ -1338,7 +1338,7 @@ ScriptFactory can create a script object from a String, 
File, URL, or InputStrea
 Here we create Script object `s1` by reading `Univar-Stats.dml` from a URL.
 
 {% highlight scala %}
-val uniUrl = 
"https://raw.githubusercontent.com/apache/incubator-systemml/master/scripts/algorithms/Univar-Stats.dml";
+val uniUrl = 
"https://raw.githubusercontent.com/apache/systemml/master/scripts/algorithms/Univar-Stats.dml";
 val s1 = ScriptFactory.dmlFromUrl(scriptUrl)
 {% endhighlight %}
 
@@ -1350,7 +1350,7 @@ Both methods perform the same action. This example reads 
an algorithm at a URL t
 creates two script objects based on this String.
 
 {% highlight scala %}
-val uniUrl = 
"https://raw.githubusercontent.com/apache/incubator-systemml/master/scripts/algorithms/Univar-Stats.dml";
+val uniUrl = 
"https://raw.githubusercontent.com/apache/systemml/master/scripts/algorithms/Univar-Stats.dml";
 val uniString = scala.io.Source.fromURL(uniUrl).mkString
 val s2 = ScriptFactory.dml(uniString)
 val s3 = ScriptFactory.dmlFromString(uniString)
@@ -1363,7 +1363,7 @@ We create Script object `s4` based on a path to a file 
using ScriptFactory's `dm
 reads a URL to a String, writes this String to a file, and then uses the path 
to the file to create a Script object.
 
 {% highlight scala %}
-val uniUrl = 
"https://raw.githubusercontent.com/apache/incubator-systemml/master/scripts/algorithms/Univar-Stats.dml";
+val uniUrl = 
"https://raw.githubusercontent.com/apache/systemml/master/scripts/algorithms/Univar-Stats.dml";
 val uniString = scala.io.Source.fromURL(uniUrl).mkString
 scala.tools.nsc.io.File("uni.dml").writeAll(uniString)
 val s4 = ScriptFactory.dmlFromFile("uni.dml")
@@ -1738,24 +1738,24 @@ print(ml.info.property("Main-Class"))
 <div data-lang="Spark Shell" markdown="1">
 {% highlight scala %}
 scala> print(ml.version)
-0.13.0-incubating-SNAPSHOT
+1.0.0-SNAPSHOT
 scala> print(ml.buildTime)
-2017-02-03 22:32:43 UTC
+2017-06-08 17:51:11 UTC
 scala> print(ml.info)
 Archiver-Version: Plexus Archiver
 Artifact-Id: systemml
 Build-Jdk: 1.8.0_60
-Build-Time: 2017-02-03 22:32:43 UTC
+Build-Time: 2017-06-08 17:51:11 UTC
 Built-By: sparkuser
 Created-By: Apache Maven 3.3.9
 Group-Id: org.apache.systemml
 Main-Class: org.apache.sysml.api.DMLScript
 Manifest-Version: 1.0
-Version: 0.13.0-incubating-SNAPSHOT
+Minimum-Recommended-Spark-Version: 2.1.0
+Version: 1.0.0-SNAPSHOT
 
 scala> print(ml.info.property("Main-Class"))
 org.apache.sysml.api.DMLScript
-
 {% endhighlight %}
 </div>
 
@@ -1771,8 +1771,8 @@ Similar to the Scala API, SystemML also provides a Python 
MLContext API.  Before
 
 Here, we'll explore the use of SystemML via PySpark in a [Jupyter 
notebook](http://jupyter.org/).
 This Jupyter notebook example can be nicely viewed in a rendered state
-[on 
GitHub](https://github.com/apache/incubator-systemml/blob/master/samples/jupyter-notebooks/SystemML-PySpark-Recommendation-Demo.ipynb),
-and can be [downloaded 
here](https://raw.githubusercontent.com/apache/incubator-systemml/master/samples/jupyter-notebooks/SystemML-PySpark-Recommendation-Demo.ipynb)
 to a directory of your choice.
+[on 
GitHub](https://github.com/apache/systemml/blob/master/samples/jupyter-notebooks/SystemML-PySpark-Recommendation-Demo.ipynb),
+and can be [downloaded 
here](https://raw.githubusercontent.com/apache/systemml/master/samples/jupyter-notebooks/SystemML-PySpark-Recommendation-Demo.ipynb)
 to a directory of your choice.
 
 From the directory with the downloaded notebook, start Jupyter with PySpark:
 

http://git-wip-us.apache.org/repos/asf/systemml/blob/ddeb1120/standalone-guide.md
----------------------------------------------------------------------
diff --git a/standalone-guide.md b/standalone-guide.md
index a2a95d4..4f901c1 100644
--- a/standalone-guide.md
+++ b/standalone-guide.md
@@ -50,10 +50,10 @@ algorithms can be found in the [Algorithms 
Reference](algorithms-reference.html)
 
 # Download SystemML
 
-Apache incubator releases of SystemML are available from the 
[Downloads](http://systemml.apache.org/download.html) page.
+Apache SystemML releases are available from the 
[Downloads](http://systemml.apache.org/download.html) page.
 
 SystemML can also be downloaded from GitHub and built with Maven.
-The SystemML project is available on GitHub at 
[https://github.com/apache/incubator-systemml](https://github.com/apache/incubator-systemml).
+The SystemML project is available on GitHub at 
[https://github.com/apache/systemml](https://github.com/apache/systemml).
 Instructions to build SystemML can be found in the <a 
href="engine-dev-guide.html">Engine Developer Guide</a>.
 
 # Standalone vs Distributed Execution Mode

http://git-wip-us.apache.org/repos/asf/systemml/blob/ddeb1120/troubleshooting-guide.md
----------------------------------------------------------------------
diff --git a/troubleshooting-guide.md b/troubleshooting-guide.md
index e80cad4..b4eac52 100644
--- a/troubleshooting-guide.md
+++ b/troubleshooting-guide.md
@@ -139,4 +139,4 @@ We would highly appreciate if you file a bug report on our 
[issue tracker](https
 
 ## Native BLAS errors
 
-Please see [the user guide of native 
backend](http://apache.github.io/incubator-systemml/native-backend).
\ No newline at end of file
+Please see [the user guide of native 
backend](http://apache.github.io/systemml/native-backend).
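The migration in this commit reduces to two textual substitutions across the docs: `incubator-systemml` becomes `systemml`, and the `-incubating` version suffix is dropped. A hedged sketch of how such a bulk edit could be scripted (the file name and temp directory are illustrative, not taken from the commit):

```shell
# Demonstrate the two substitutions on a sample line containing both the
# old repo path and an -incubating version string.
workdir=$(mktemp -d)
cd "$workdir"
printf '%s\n' "See https://github.com/apache/incubator-systemml and systemml-0.13.0-incubating-bin.tgz" > example.md

# -i.bak edits in place and keeps a backup (portable across GNU and BSD sed)
sed -i.bak -e 's#incubator-systemml#systemml#g' -e 's#-incubating##g' example.md
cat example.md
# → See https://github.com/apache/systemml and systemml-0.13.0-bin.tgz
```

Note the actual commit also bumps version numbers (0.13.0 to 1.0.0), which is a separate substitution not shown here.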
