Repository: spark
Updated Branches:
  refs/heads/branch-2.1 db691f05c -> cff7a70b5


[SPARK-11496][GRAPHX][FOLLOWUP] Add param checking for runParallelPersonalizedPageRank

## What changes were proposed in this pull request?
Add parameter checking to runParallelPersonalizedPageRank to keep it in line with the other algorithms.

## How was this patch tested?
Existing tests.

Author: Zheng RuiFeng <ruife...@foxmail.com>

Closes #15876 from zhengruifeng/param_check_runParallelPersonalizedPageRank.

(cherry picked from commit 75934457d75996be71ffd0d4b448497d656c0d40)
Signed-off-by: DB Tsai <dbt...@dbtsai.com>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/cff7a70b
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/cff7a70b
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/cff7a70b

Branch: refs/heads/branch-2.1
Commit: cff7a70b59c3ac2cb1fab2216e9e6dcf2a6ac89a
Parents: db691f0
Author: Zheng RuiFeng <ruife...@foxmail.com>
Authored: Mon Nov 14 19:42:00 2016 +0000
Committer: DB Tsai <dbt...@dbtsai.com>
Committed: Mon Nov 14 19:56:51 2016 +0000

----------------------------------------------------------------------
 .../src/main/scala/org/apache/spark/graphx/lib/PageRank.scala | 7 +++++++
 1 file changed, 7 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/cff7a70b/graphx/src/main/scala/org/apache/spark/graphx/lib/PageRank.scala
----------------------------------------------------------------------
diff --git a/graphx/src/main/scala/org/apache/spark/graphx/lib/PageRank.scala b/graphx/src/main/scala/org/apache/spark/graphx/lib/PageRank.scala
index f4b0075..c0c3c73 100644
--- a/graphx/src/main/scala/org/apache/spark/graphx/lib/PageRank.scala
+++ b/graphx/src/main/scala/org/apache/spark/graphx/lib/PageRank.scala
@@ -185,6 +185,13 @@ object PageRank extends Logging {
   def runParallelPersonalizedPageRank[VD: ClassTag, ED: ClassTag](graph: Graph[VD, ED],
     numIter: Int, resetProb: Double = 0.15,
     sources: Array[VertexId]): Graph[Vector, Double] = {
+    require(numIter > 0, s"Number of iterations must be greater than 0," +
+      s" but got ${numIter}")
+    require(resetProb >= 0 && resetProb <= 1, s"Random reset probability must belong" +
+      s" to [0, 1], but got ${resetProb}")
+    require(sources.nonEmpty, s"The list of sources must be non-empty," +
+      s" but got ${sources.mkString("[", ",", "]")}")
+
     // TODO if one sources vertex id is outside of the int range
     // we won't be able to store its activations in a sparse vector
     val zero = Vectors.sparse(sources.size, List()).asBreeze
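
For context, here is a minimal usage sketch (not part of the commit) of how the
new checks surface to callers. It assumes a spark-shell style session where `sc`
is an existing SparkContext, and uses a made-up two-vertex graph:

  import scala.util.Try
  import org.apache.spark.graphx.{Edge, Graph}
  import org.apache.spark.graphx.lib.PageRank

  // A tiny example graph: one edge from vertex 1 to vertex 2.
  val edges = sc.parallelize(Seq(Edge(1L, 2L, 1.0)))
  val graph = Graph.fromEdges(edges, defaultValue = 1)

  // Valid call: positive iteration count, default resetProb, non-empty sources.
  val ranks = PageRank.runParallelPersonalizedPageRank(graph, numIter = 10,
    sources = Array(1L))

  // With this patch, an out-of-range reset probability fails fast in the driver
  // with an IllegalArgumentException from require(), instead of surfacing later
  // in the computation.
  val failed = Try(PageRank.runParallelPersonalizedPageRank(graph, numIter = 10,
    resetProb = 1.5, sources = Array(1L)))
  assert(failed.isFailure)

Failing fast with IllegalArgumentException is the "keep it in line with the other
algorithms" behavior described above.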


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
