Hi Sean, thanks for the reply. We upgraded our Spark cluster from 1.1.0 to 1.2.0. We also suspected that this issue might be caused by mismatched Spark jar versions, but we double-checked and reinstalled our app completely on a fresh system with the spark-1.2.0 distro, and we are still facing the same problem.
This does not happen when master is set to 'local[*]'.