Yes, I have - I compiled both Spark and my software from source. The whole processing pipeline actually runs fine; only saving the results fails.
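For reference, the save step itself is nothing exotic - here is a minimal sketch of the pattern, with placeholder data, app name and output path rather than my actual job:

import org.apache.spark.{SparkConf, SparkContext}

object SaveResultsSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("SaveResultsSketch"))

    // Placeholder data - in the real job this RDD is produced by the full
    // pipeline, which computes without any errors.
    val results = sc.parallelize(Seq("record1", "record2", "record3"))

    // This last step is what blows up: the executors log
    // ClassNotFoundException: org.apache.spark.rdd.RDD$$anonfun$saveAsTextFile$1
    results.saveAsTextFile("hdfs:///tmp/results-sketch")   // placeholder path

    sc.stop()
  }
}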
2014-06-03 21:01 GMT+02:00 Gerard Maas <gerard.m...@gmail.com>:

> Have you tried re-compiling your job against the 1.0 release?
>
>
> On Tue, Jun 3, 2014 at 8:46 PM, Marek Wiewiorka <marek.wiewio...@gmail.com>
> wrote:
>
>> Hi All,
>> I've been experiencing a very strange error after upgrading from Spark 0.9
>> to 1.0 - it seems that the saveAsTextFile function is throwing a
>> java.lang.UnsupportedOperationException that I have never seen before.
>> Any hints appreciated.
>>
>> scheduler.TaskSetManager: Loss was due to java.lang.ClassNotFoundException:
>> org.apache.spark.rdd.RDD$$anonfun$saveAsTextFile$1 [duplicate 45]
>> 14/06/03 16:46:23 ERROR actor.OneForOneStrategy: java.lang.UnsupportedOperationException
>>         at org.apache.spark.scheduler.SchedulerBackend$class.killTask(SchedulerBackend.scala:32)
>>         at org.apache.spark.scheduler.cluster.mesos.MesosSchedulerBackend.killTask(MesosSchedulerBackend.scala:41)
>>         at org.apache.spark.scheduler.TaskSchedulerImpl$$anonfun$cancelTasks$3$$anonfun$apply$1.apply$mcVJ$sp(TaskSchedulerImpl.scala:185)
>>         at org.apache.spark.scheduler.TaskSchedulerImpl$$anonfun$cancelTasks$3$$anonfun$apply$1.apply(TaskSchedulerImpl.scala:183)
>>         at org.apache.spark.scheduler.TaskSchedulerImpl$$anonfun$cancelTasks$3$$anonfun$apply$1.apply(TaskSchedulerImpl.scala:183)
>>         at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
>>         at org.apache.spark.scheduler.TaskSchedulerImpl$$anonfun$cancelTasks$3.apply(TaskSchedulerImpl.scala:183)
>>         at org.apache.spark.scheduler.TaskSchedulerImpl$$anonfun$cancelTasks$3.apply(TaskSchedulerImpl.scala:176)
>>         at scala.Option.foreach(Option.scala:236)
>>         at org.apache.spark.scheduler.TaskSchedulerImpl.cancelTasks(TaskSchedulerImpl.scala:176)
>>         at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages$1.apply$mcVI$sp(DAGScheduler.scala:1058)
>>         at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages$1.apply(DAGScheduler.scala:1045)
>>         at org.apache.spark.scheduler.DAGScheduler$$anonfun$org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages$1.apply(DAGScheduler.scala:1045)
>>         at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
>>         at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1045)
>>         at org.apache.spark.scheduler.DAGScheduler.handleJobCancellation(DAGScheduler.scala:998)
>>         at org.apache.spark.scheduler.DAGScheduler$$anonfun$doCancelAllJobs$1.apply$mcVI$sp(DAGScheduler.scala:499)
>>         at org.apache.spark.scheduler.DAGScheduler$$anonfun$doCancelAllJobs$1.apply(DAGScheduler.scala:499)
>>         at org.apache.spark.scheduler.DAGScheduler$$anonfun$doCancelAllJobs$1.apply(DAGScheduler.scala:499)
>>         at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
>>         at org.apache.spark.scheduler.DAGScheduler.doCancelAllJobs(DAGScheduler.scala:499)
>>         at org.apache.spark.scheduler.DAGSchedulerActorSupervisor$$anonfun$2.applyOrElse(DAGScheduler.scala:1151)
>>         at org.apache.spark.scheduler.DAGSchedulerActorSupervisor$$anonfun$2.applyOrElse(DAGScheduler.scala:1147)
>>         at akka.actor.SupervisorStrategy.handleFailure(FaultHandling.scala:295)
>>         at akka.actor.dungeon.FaultHandling$class.handleFailure(FaultHandling.scala:253)
>>         at akka.actor.ActorCell.handleFailure(ActorCell.scala:338)
>>         at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:423)
>>         at akka.actor.ActorCell.systemInvoke(ActorCell.scala:447)
>>         at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:262)
>>         at akka.dispatch.Mailbox.run(Mailbox.scala:218)
>>         at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
>>         at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>>         at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
>>         at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>>         at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
>>
>> Thanks,
>> Marek
>>
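To make the re-compile point above concrete: the job's build pins the 1.0 release as a provided dependency, roughly like the sketch below. The project name and exact versions here are placeholders for illustration; the real build uses the locally compiled Spark artifacts.

// build.sbt (sketch)
name := "my-spark-job"            // placeholder project name

scalaVersion := "2.10.4"          // Spark 1.0 is built for Scala 2.10

// Depend on the same Spark release the cluster runs; "provided" keeps the
// Spark classes out of the job jar so they cannot clash with the cluster's.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0" % "provided"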