Hi,
I am no expert, but my best guess is that it's a 'closure' problem. Spark internally computes the closure of a map operation, i.e. all the variables outside its scope that the operation uses, and runs a serialization check on the map task. Since class scala.util.Random is not serializable, it throws a serialization error (Task not serializable).
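For what it's worth, a common workaround is to create the non-serializable object inside the task rather than capturing it from the driver. This is only a sketch (the names ClosureSketch and addNoise are mine, not from this thread):

  import org.apache.spark.rdd.RDD

  object ClosureSketch {
    // The non-serializable object is built per partition on the executor,
    // so it never has to cross the driver/executor boundary.
    def addNoise(rdd: RDD[Double]): RDD[Double] =
      rdd.mapPartitions { iter =>
        val rng = new scala.util.Random()
        iter.map(x => x + rng.nextDouble())
      }
  }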
The code base is huge, but I am sharing a snapshot of it which I think might give you some idea. Here is my class Player, which is supposed to be my vertex attribute:
  class Player(var RvalRdd: RDD[((Int, Int), Double)], Slope_m: Double)
    extends Serializable {
    // Some code here
  }
As you can see, this class takes an RDD as a constructor parameter, yet it already extends Serializable.
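One workaround I am considering, purely a sketch on my part (the names rvals and fromRdd are placeholders, and it assumes the RDD's data fits in driver memory), is to materialise the RDD into a plain serializable structure before it ever becomes a vertex attribute:

  import org.apache.spark.rdd.RDD

  class Player(val rvals: Map[(Int, Int), Double], val slopeM: Double)
    extends Serializable {
    // Some code here
  }

  object Player {
    // Collect on the driver once; the resulting Map serializes cleanly
    // when the Player travels inside the graph's vertex RDD.
    def fromRdd(rvalRdd: RDD[((Int, Int), Double)], slopeM: Double): Player =
      new Player(rvalRdd.collect().toMap, slopeM)
  }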
Hi,
I have a scenario where I have a class X whose constructor parameters are (RDD, Double). When I initialize the class object with the corresponding RDD and Double value (of name, say, x1) and *put it as a vertex attribute in the graph*, I am losing my RDD value. The Double value remains intact.
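To make the setup concrete, here is a minimal sketch of what I mean (X, build, and the literal values are all placeholders, not my real code):

  import org.apache.spark.SparkContext
  import org.apache.spark.graphx.{Edge, Graph}
  import org.apache.spark.rdd.RDD

  object ReproSketch {
    class X(var rvalRdd: RDD[Double], val dbl: Double) extends Serializable

    def build(sc: SparkContext): Graph[X, Int] = {
      val x1 = new X(sc.parallelize(Seq(1.0, 2.0)), 0.5)
      val vertices: RDD[(Long, X)] = sc.parallelize(Seq((1L, x1)))
      val edges: RDD[Edge[Int]] = sc.parallelize(Seq(Edge(1L, 1L, 0)))
      // After Graph materialises its internal RDDs, x1.dbl is still 0.5 on
      // the vertices, but x1.rvalRdd is only a driver-side handle and is
      // effectively lost on the executors.
      Graph(vertices, edges)
    }
  }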