Hi,

I am no expert, but my best guess is that it's a 'closure' problem. Spark
internally builds a closure over all the variables outside the map
function's scope that the map operation uses, and it runs a serialization
check on the resulting map task. Since class scala.util.Random is not
serializable, that check throws the exception. The best way to avoid this
is to create a wrapper object that is serializable and create the Random
instance inside it. Then you can use that instance, as part of the
serializable object, without an issue.
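
For example, something along these lines (a minimal, untested sketch; the
class and variable names are just illustrative, and marking the Random
@transient lazy is one common way to keep it out of the serialized
closure):

    import org.apache.spark.{SparkConf, SparkContext}

    // Serializable wrapper; the Random itself is @transient lazy, so it is
    // never shipped with the closure but re-created on each executor.
    class RandomHolder extends Serializable {
      @transient lazy val rng = new scala.util.Random()
    }

    object RandomExample {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(
          new SparkConf().setAppName("random-example").setMaster("local[*]"))
        val holder = new RandomHolder // serializable, safe to capture
        // The closure captures 'holder'; rng is rebuilt lazily on each worker
        val noisy = sc.parallelize(1 to 100).map(x => x + holder.rng.nextInt(10))
        println(noisy.take(5).mkString(", "))
        sc.stop()
      }
    }

An alternative with much the same effect is to create the Random inside
mapPartitions, so it never has to leave the executor at all.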




