Hello forum,

We are using a Spark distribution built from source at the latest 1.2.0 tag,
and we are hitting the issue below while trying to act on a JavaRDD instance;
the stack trace follows.
Can anyone please let me know what might be wrong here?

java.lang.ClassCastException: [B cannot be cast to org.apache.spark.SerializableWritable
        at org.apache.spark.rdd.HadoopRDD.getJobConf(HadoopRDD.scala:138)
        at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:194)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:205)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:203)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.rdd.RDD.partitions(RDD.scala:203)
        at org.apache.spark.rdd.MappedRDD.getPartitions(MappedRDD.scala:28)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:205)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:203)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.rdd.RDD.partitions(RDD.scala:203)
        at org.apache.spark.rdd.MappedRDD.getPartitions(MappedRDD.scala:28)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:205)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:203)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.rdd.RDD.partitions(RDD.scala:203)
        at org.apache.spark.rdd.RDD.take(RDD.scala:1060)
        at org.apache.spark.api.java.JavaRDDLike$class.take(JavaRDDLike.scala:419)
        at org.apache.spark.api.java.JavaRDD.take(JavaRDD.scala:32)
        at com.dataken.common.chores.InformationDataLoadChore.run(InformationDataLoadChore.java:69)
        at com.dataken.common.pipeline.DatakenTask.start(DatakenTask.java:110)
        at com.dataken.tasks.objectcentricprocessor.ObjectCentricProcessTask.execute(ObjectCentricProcessTask.java:99)
        at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
        at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
2014-11-26 08:07:38,454 ERROR [DefaultQuartzScheduler_Worker-2] org.quartz.core.ErrorLogger Job (report_report.report_report threw an exception.

org.quartz.SchedulerException: Job threw an unhandled exception. [See nested exception: java.lang.ClassCastException: [B cannot be cast to org.apache.spark.SerializableWritable]
        at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
        at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
Caused by: java.lang.ClassCastException: [B cannot be cast to org.apache.spark.SerializableWritable
        at org.apache.spark.rdd.HadoopRDD.getJobConf(HadoopRDD.scala:138)
        at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:194)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:205)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:203)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.rdd.RDD.partitions(RDD.scala:203)
        at org.apache.spark.rdd.MappedRDD.getPartitions(MappedRDD.scala:28)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:205)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:203)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.rdd.RDD.partitions(RDD.scala:203)
        at org.apache.spark.rdd.MappedRDD.getPartitions(MappedRDD.scala:28)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:205)
        at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:203)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.rdd.RDD.partitions(RDD.scala:203)
        at org.apache.spark.rdd.RDD.take(RDD.scala:1060)
        at org.apache.spark.api.java.JavaRDDLike$class.take(JavaRDDLike.scala:419)
        at org.apache.spark.api.java.JavaRDD.take(JavaRDD.scala:32)
        at com.dataken.common.chores.InformationDataLoadChore.run(InformationDataLoadChore.java:69)
        at com.dataken.common.pipeline.DatakenTask.start(DatakenTask.java:110)
        at com.dataken.tasks.objectcentricprocessor.ObjectCentricProcessTask.execute(ObjectCentricProcessTask.java:99)
        at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
        ... 1 more
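
For reference, the failing call is roughly the following. This is only a minimal
sketch: the class name, input path, and map function are placeholders rather than
our actual chore code. The RDD is read from a Hadoop text input (so it is backed
by a HadoopRDD), a map() produces the MappedRDD seen in the trace, and take()
is where the ClassCastException surfaces in HadoopRDD.getJobConf:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function;

import java.util.List;

public class TakeRepro {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("take-repro");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // textFile() is backed by a HadoopRDD; the path below is a placeholder.
        JavaRDD<String> lines = sc.textFile("hdfs:///path/to/input");

        // map() adds the MappedRDD seen in the stack trace.
        JavaRDD<Integer> lengths = lines.map(new Function<String, Integer>() {
            @Override
            public Integer call(String s) {
                return s.length();
            }
        });

        // take() forces partition computation; this is where the
        // ClassCastException is thrown from HadoopRDD.getJobConf.
        List<Integer> sample = lengths.take(10);
        System.out.println(sample);

        sc.stop();
    }
}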
