But groupByKey() gives me an error saying that it is not a member of org.apache.spark.rdd.RDD[(Double, org.apache.spark.graphx.VertexId)] when I compile in the graphx directory of spark-1.0.0. This error does not occur when I run the same code in the interactive shell.
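For what it's worth, a likely cause is a missing implicit conversion rather than the RDD itself: in Spark 1.0.x, groupByKey() lives in PairRDDFunctions, and compiled code only picks up the implicit conversion to it after `import org.apache.spark.SparkContext._` (spark-shell adds that import automatically, which would explain why it works interactively). A minimal sketch, assuming that is the problem (the object name and sample data are illustrative):

```scala
import org.apache.spark.{SparkConf, SparkContext}
// Brings the implicit RDD[(K, V)] => PairRDDFunctions conversion into scope
// (needed in compiled Spark 1.0.x code; spark-shell imports it for you).
import org.apache.spark.SparkContext._

object GroupByKeyExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("example").setMaster("local"))

    // A pair RDD of (Double, VertexId); VertexId is an alias for Long.
    val a = sc.parallelize(Seq((1.0, 10L), (1.0, 20L), (2.0, 30L)))

    // Compiles once the implicit conversion above is in scope.
    val grouped = a.groupByKey()
    grouped.collect().foreach(println)

    sc.stop()
  }
}
```

Since VertexRDD extends RDD[(VertexId, VD)], the same import should make groupByKey() available on graph.vertices as well.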
On Wed, Dec 3, 2014 at 3:49 PM, Ankur Dave <[email protected]> wrote:

> At 2014-12-03 02:13:49 -0800, Deep Pradhan <[email protected]> wrote:
> > We cannot do sc.parallelize(List(VertexRDD)), can we?
>
> There's no need to do this, because every VertexRDD is also a pair RDD:
>
>     class VertexRDD[VD] extends RDD[(VertexId, VD)]
>
> You can simply use graph.vertices in place of `a` in my example.
>
> Ankur
