I'm guessing the other result was wrong, or was just never actually evaluated. Because RDD transforms are lazy, the nested expression may have been accepted when it was defined, but it would fail once an action forced it to run: nested RDDs are not supported.
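A minimal sketch of the usual workaround, assuming a SparkContext `sc` and the poster's `dataByRow` and `arrCorr` from the quoted message below: collect the (presumably small enough) data to the driver and broadcast it, so the inner loop runs over a plain local array instead of a second RDD.

```scala
// Sketch only: avoids the unsupported nested-RDD pattern by broadcasting
// a local copy of the rows. Assumes dataByRow: RDD[(Int, Array[Double])]
// and arrCorr: (Array[Double], Array[Double]) => Double as in the thread.
val localRows = dataByRow.collect()        // Array[(Int, Array[Double])] on the driver
val bcRows = sc.broadcast(localRows)       // shipped once to each executor

val simi = dataByRow.map { point =>
  // inner map is over a local array, not an RDD, so this is legal in a closure
  val corrs = bcRows.value.map(x => arrCorr(point._2, x._2))
  (point._1, corrs)
}
```

If the dataset is too large to collect, `dataByRow.cartesian(dataByRow)` is the distributed alternative for all-pairs computations, at the cost of a much larger shuffle.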
On Mon, Mar 17, 2014 at 4:01 PM, anny9699 <anny9...@gmail.com> wrote:
> Hi Andrew,
>
> Thanks for the reply. However I did almost the same thing in another
> closure:
>
> val simi = dataByRow.map(point => {
>   val corrs = dataByRow.map(x => arrCorr(point._2, x._2))
>   (point._1, corrs)
> })
>
> here dataByRow is of format RDD[(Int, Array[Double])] and arrCorr is a
> function that I wrote to compute correlation between two Scala arrays.
>
> and it worked. So I am a little confused why it worked here and not in
> other places.
>
> Thanks!
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/java-lang-NullPointerException-met-when-computing-new-RDD-or-use-count-tp2766p2779.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.