Repository: spark
Updated Branches:
  refs/heads/branch-1.6 696d4a52d -> 0bc813b45


[SPARK-11476][DOCS] Incorrect function referred to in MLlib Random data generation documentation

Fix Python example to use normalRDD as advertised

Author: Sean Owen <so...@cloudera.com>

Closes #9529 from srowen/SPARK-11476.

(cherry picked from commit d981902101767b32dc83a5a639311e197f5cbcc1)
Signed-off-by: Sean Owen <so...@cloudera.com>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/0bc813b4
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/0bc813b4
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/0bc813b4

Branch: refs/heads/branch-1.6
Commit: 0bc813b45e1ca42c64239c63127243eaee051d1f
Parents: 696d4a5
Author: Sean Owen <so...@cloudera.com>
Authored: Sun Nov 8 11:15:58 2015 +0000
Committer: Sean Owen <so...@cloudera.com>
Committed: Sun Nov 8 11:16:14 2015 +0000

----------------------------------------------------------------------
 docs/mllib-statistics.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/0bc813b4/docs/mllib-statistics.md
----------------------------------------------------------------------
diff --git a/docs/mllib-statistics.md b/docs/mllib-statistics.md
index 2c7c9ed..ade5b07 100644
--- a/docs/mllib-statistics.md
+++ b/docs/mllib-statistics.md
@@ -594,7 +594,7 @@ sc = ... # SparkContext
 
 # Generate a random double RDD that contains 1 million i.i.d. values drawn from the
 # standard normal distribution `N(0, 1)`, evenly distributed in 10 partitions.
-u = RandomRDDs.uniformRDD(sc, 1000000L, 10)
+u = RandomRDDs.normalRDD(sc, 1000000L, 10)
 # Apply a transform to get a random double RDD following `N(1, 4)`.
 v = u.map(lambda x: 1.0 + 2.0 * x)
 {% endhighlight %}
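The fix above works because the doc's transform relies on a standard property of the normal distribution: if X ~ N(0, 1), then 1.0 + 2.0 * X ~ N(1, 4) (mean 1, standard deviation 2, variance 4). A minimal plain-Python sketch of the same idea, using the standard library instead of Spark's `RandomRDDs` so it runs standalone:

```python
import random

random.seed(42)  # fixed seed so the illustration is repeatable

# Draw i.i.d. samples from the standard normal N(0, 1),
# standing in for RandomRDDs.normalRDD(sc, ...).
u = [random.gauss(0.0, 1.0) for _ in range(100000)]

# Apply the same affine transform as the docs example: x -> 1 + 2x.
# If X ~ N(0, 1), then 1 + 2X ~ N(1, 4).
v = [1.0 + 2.0 * x for x in u]

mean = sum(v) / len(v)
var = sum((x - mean) ** 2 for x in v) / len(v)
print(mean, var)  # roughly 1.0 and 4.0
```

Note that `uniformRDD`, which the old example called, samples from U[0, 1], so the `1.0 + 2.0 * x` transform would have produced U[1, 3] rather than the advertised N(1, 4); that is the mismatch this commit corrects.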

