Github user dbtsai commented on a diff in the pull request:
https://github.com/apache/spark/pull/9457#discussion_r43928660
--- Diff: graphx/src/main/scala/org/apache/spark/graphx/lib/PageRank.scala ---
@@ -159,6 +161,83 @@ object PageRank extends Logging {
}
/**
+ * Run Personalized PageRank for a fixed number of iterations, for a
+ * set of starting nodes in parallel. Returns a graph with vertex attributes
+ * containing the pagerank relative to all starting nodes (as a sparse vector) and
+ * edge attributes the normalized edge weight
+ *
+ * @tparam VD The original vertex attribute (not used)
+ * @tparam ED The original edge attribute (not used)
+ *
+ * @param graph The graph on which to compute personalized pagerank
+ * @param numIter The number of iterations to run
+ * @param resetProb The random reset probability
+ * @param sources The list of sources to compute personalized pagerank from
+ * @return the graph with vertex attributes
+ * containing the pagerank relative to all starting nodes (as a sparse vector) and
+ * edge attributes the normalized edge weight
+ */
+ def runParallelPersonalizedPageRank[VD: ClassTag, ED: ClassTag](graph: Graph[VD, ED],
+   numIter: Int, resetProb: Double = 0.15,
+   sources : Array[VertexId]): Graph[SparseVector[Double], Double] =
+ {
+   // TODO if one sources vertex id is outside of the int range
+   // we won't be able to store its activations in a sparse vector
+   val zero = new SparseVector[Double](Array(), Array(), sources.size)
+   val sourcesInitMap = sources.zipWithIndex.map{case (vid, i) => {
--- End diff --
The nested braces aren't needed, and there should be spaces around the braces; this should be
```scala
val sourcesInitMap = sources.zipWithIndex.map { case (vid, i) =>
val v = new SparseVector[Double](Array(i), Array(resetProb), sources.size)
(vid, v)
}.toMap
```
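
Not part of the diff, just for context: a minimal, self-contained usage sketch of the method under review. It assumes the signature quoted above lands as-is; the edge-list path, source vertex ids, app name, and local master are placeholders rather than anything from this PR.

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.graphx.{GraphLoader, VertexId}
import org.apache.spark.graphx.lib.PageRank

object ParallelPersonalizedPageRankSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("ppr-sketch").setMaster("local[*]"))

    // Placeholder edge list: one "srcId dstId" pair per line.
    val graph = GraphLoader.edgeListFile(sc, "data/edges.txt")
    val sources: Array[VertexId] = Array(1L, 2L, 3L)

    // Each vertex attribute in the result is a sparse vector of length
    // sources.size, where entry i is that vertex's rank relative to sources(i).
    val ranks = PageRank.runParallelPersonalizedPageRank(
      graph, numIter = 10, resetProb = 0.15, sources = sources)

    ranks.vertices.take(5).foreach { case (vid, v) => println(s"$vid -> $v") }
    sc.stop()
  }
}
```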