Re: Job failed: java.io.NotSerializableException: org.apache.spark.SparkContext

2014-05-16 Thread Nathan Kronenfeld
> sc: sparkContext
>
> val kk: RDD[(Int, List[Double])] =
>   series.map(t => (t._1, new DWTsample().computeDwt(sc, t._2)))
>
> Error:
> org.apache.spark.SparkException: Job failed:
> java.io.NotSerializableException: org.apache.spark.SparkContext

Re: Job failed: java.io.NotSerializableException: org.apache.spark.SparkContext

2014-05-15 Thread Shivani Rao
> …ation is a class having the computeDwt function.
> sc: sparkContext
>
> val kk: RDD[(Int, List[Double])] =
>   series.map(t => (t._1, new DWTsample().computeDwt(sc, t._2)))
>
> Error:
> org.apache.spark.SparkException: Job failed:
> java.io.NotSerializableException:

Job failed: java.io.NotSerializableException: org.apache.spark.SparkContext

2014-05-12 Thread yh18190
…rror: org.apache.spark.SparkException: Job failed: java.io.NotSerializableException: org.apache.spark.SparkContext
org.apache.spark.SparkException: Job failed: java.io.NotSerializableException: org.apache.spark.SparkContext
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:
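The error in this thread arises because the closure passed to `map` captures `sc` (the `SparkContext`), and Spark must serialize the whole closure to ship it to executors; `SparkContext` is deliberately not serializable, since it only exists on the driver. The usual fix is to remove the `SparkContext` from anything referenced inside the transformation. A minimal sketch of that restructuring, assuming a hypothetical `DWTsample.computeDwt` that needs only the data itself (the original signature taking `sc` is from the thread; the `sc`-free variant below is an illustrative assumption):

```scala
import org.apache.spark.rdd.RDD

// Hypothetical worker-side class: note it is Serializable and holds
// no reference to SparkContext, so the closure below can be shipped
// to executors without hitting NotSerializableException.
class DWTsample extends Serializable {
  // Assumed sc-free signature: operates on the data alone.
  def computeDwt(values: List[Double]): List[Double] = {
    // ... actual DWT computation over `values` ...
    values
  }
}

// Driver-side code. `sc` is used only to build/collect RDDs,
// never captured inside the map closure.
def transform(series: RDD[(Int, List[Double])]): RDD[(Int, List[Double])] =
  series.map { case (key, values) =>
    (key, new DWTsample().computeDwt(values))
  }
```

If `computeDwt` genuinely needs cluster resources (e.g. to create another RDD), that work cannot run inside a transformation at all: nested RDD operations on executors are unsupported, and the computation has to be restructured as driver-side RDD operations instead.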