This error happens when you use Eclipse to compile the .java file; it
is fixed in 0.9.0 (in git).
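
For context: the AbstractMethodError names the erased signature
call(Ljava/lang/Object;)Ljava/lang/Object;, which is the synthetic
bridge method javac normally generates for a generic anonymous
subclass, and which (judging from this trace) the Eclipse compiler did
not emit here. Below is a minimal, self-contained sketch, using a
hypothetical Wrapped class rather than Spark's WrappedFunction1, of
how that bridge makes erased dispatch work:

// Hypothetical stand-in for Spark's WrappedFunction1: subclasses
// implement the typed call(T), while callers may reach it through the
// erased signature call(Object) at runtime.
abstract class Wrapped<T, R> {
    public abstract R call(T t) throws Exception;
}

public class BridgeDemo {
    public static void main(String[] args) throws Exception {
        Wrapped<String, Integer> f = new Wrapped<String, Integer>() {
            @Override
            public Integer call(String s) {
                return s.length();
            }
        };

        // Through the raw (erased) type, this call site resolves to
        // call(Object). javac satisfies it with a synthetic bridge in
        // the anonymous class that casts and delegates to call(String);
        // if that bridge is missing from the bytecode, the JVM throws
        // AbstractMethodError at exactly this point.
        Wrapped raw = f;
        System.out.println(raw.call("hello")); // prints 5
    }
}

Until you are on 0.9.0, building the jar with javac (for example via
Maven or sbt on the command line) rather than the Eclipse builder
should avoid the problem.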

2013/12/19 [email protected] <[email protected]>

>  I wrote an example, MyWordCount, that just sets spark.akka.frameSize
> larger than the default. But when I run this jar, there is a problem:
>
>  13/12/19 18:53:48 INFO ClusterTaskSetManager: Lost TID 0 (task 0.0:0)
> 13/12/19 18:53:48 INFO ClusterTaskSetManager: Loss was due to java.lang.AbstractMethodError
> java.lang.AbstractMethodError: org.apache.spark.api.java.function.WrappedFunction1.call(Ljava/lang/Object;)Ljava/lang/Object;
>         at org.apache.spark.api.java.function.WrappedFunction1.apply(WrappedFunction1.scala:31)
>         at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$1$1.apply(JavaRDDLike.scala:90)
>         at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$1$1.apply(JavaRDDLike.scala:90)
>         at scala.collection.Iterator$$anon$21.hasNext(Iterator.scala:440)
>         at scala.collection.Iterator$class.foreach(Iterator.scala:772)
>         at scala.collection.Iterator$$anon$21.foreach(Iterator.scala:437)
>         at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:48)
>         at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:102)
>         at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:250)
>         at scala.collection.Iterator$$anon$21.toBuffer(Iterator.scala:437)
>         at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:237)
>         at scala.collection.Iterator$$anon$21.toArray(Iterator.scala:437)
>         at org.apache.spark.rdd.RDD$$anonfun$1.apply(RDD.scala:560)
>         at org.apache.spark.rdd.RDD$$anonfun$1.apply(RDD.scala:560)
>         at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:758)
>         at org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:758)
>
> It is caused by this code:
>  JavaRDD<String> words = lines.flatMap(new FlatMapFunction<String, String>() {
>      public Iterable<String> call(String s) {
>          return Arrays.asList(s.split(" "));
>      }
>  });
>
> Here is the parent class:
>
>  private[spark] abstract class WrappedFunction1[T, R] extends AbstractFunction1[T, R] {
>    @throws(classOf[Exception])
>    def call(t: T): R
>
>    final def apply(t: T): R = call(t)
>  }
>
> The code is the same as in JavaWordCount; I don't know what the error is.
>
> Thanks
>
> Leo
>
