You can also do rdd.toJavaRDD(). Please check the API docs.

Regards,
Sab

On 18-Nov-2015 3:12 am, "Bryan Cutler" <[email protected]> wrote:
> Hi Ivan,
>
> Since Spark 1.4.1 there is a Java-friendly function in LDAModel to get the
> topic distributions, called javaTopicDistributions(), that returns a
> JavaPairRDD. If you aren't able to upgrade, you can check out the
> conversion used here:
> https://github.com/apache/spark/blob/v1.4.1/mllib/src/main/scala/org/apache/spark/mllib/clustering/LDAModel.scala#L350
>
> -bryan
>
> On Tue, Nov 17, 2015 at 3:06 AM, frula00 <[email protected]> wrote:
>
>> Hi,
>> I'm working in Java with Spark 1.3.1. I am trying to extract data from
>> the RDD returned by
>> org.apache.spark.mllib.clustering.DistributedLDAModel.topicDistributions()
>> (return type is RDD<Tuple2<Object, Vector>>). How do I work with it from
>> within Java? I can't seem to cast it to JavaPairRDD or JavaRDD, and if I
>> try to collect it, it simply returns an Object.
>>
>> Thank you in advance for your help!
>>
>> Ivan
>>
>> --
>> View this message in context:
>> http://apache-spark-user-list.1001560.n3.nabble.com/Working-with-RDD-from-Java-tp25399.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
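For reference, the two approaches suggested in the thread can be sketched roughly as below. This is an untested sketch, not a verified implementation: it assumes a SparkContext is already running and that `model` is a trained DistributedLDAModel; the method name `inspectTopicDistributions` is made up for illustration.

```java
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.mllib.clustering.DistributedLDAModel;
import org.apache.spark.mllib.linalg.Vector;
import scala.Tuple2;

public class TopicDistributionsExample {

    // Hypothetical helper; `model` is assumed to be already trained.
    static void inspectTopicDistributions(DistributedLDAModel model) {

        // Spark 1.4.1+: the Java-friendly accessor mentioned above
        // returns a JavaPairRDD directly, keyed by document id.
        JavaPairRDDjava.lang.Long, Vector> dists =
            model.javaTopicDistributions();

        // Spark 1.3.x: wrap the Scala RDD yourself, as Sab suggested.
        // topicDistributions() returns RDD<Tuple2<Object, Vector>>;
        // toJavaRDD() converts it without any unsafe casting.
        JavaRDD<Tuple2<Object, Vector>> javaRdd =
            model.topicDistributions().toJavaRDD();

        // The Object key is a boxed Long document id at runtime,
        // so it can be cast per element once you have a Tuple2.
        for (Tuple2<Object, Vector> t : javaRdd.collect()) {
            Long docId = (Long) t._1();
            Vector topicDist = t._2();
            System.out.println(docId + " -> " + topicDist);
        }
    }
}
```

The key point is that `toJavaRDD()` is defined on `org.apache.spark.rdd.RDD` itself, so no cast of the RDD object is needed; only the per-element `Object` key needs casting to `Long`.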
