Hello, I'm writing a Play Framework application which uses Spark; however,
I'm getting the exception below:

java.io.InvalidClassException: scala.Option; local class incompatible:
stream classdesc serialVersionUID = -114498752079829388, local class
serialVersionUID = 5081326844987135632
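For what it's worth, a "local class incompatible" error means the JVM that wrote the stream and the JVM reading it computed different serialVersionUIDs for the same class (here scala.Option), which usually points at mismatched Scala library versions between the driver and the executors. A minimal sketch, not specific to this setup, for printing the UID your local classpath yields:

```scala
import java.io.ObjectStreamClass

object CheckUid {
  def main(args: Array[String]): Unit = {
    // Serialization descriptor for scala.Option as computed from
    // this JVM's classpath.
    val desc = ObjectStreamClass.lookup(classOf[Option[_]])
    // If this value differs between driver and executors, deserialization
    // fails with exactly the "local class incompatible" error quoted above.
    println(s"scala.Option serialVersionUID = ${desc.getSerialVersionUID}")
  }
}
```

Running this once against the driver's classpath and once against the executors' classpath should show whether the two sides disagree.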
I get the following exception. I am using Spark 2.0.1 and Scala 2.11.8.

org.apache.spark.SparkException: Job aborted due to stage failure: Task 0
in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage
0.0 (TID 13, 172.31.20.212): java.io.InvalidClassException:
org.apache.spark.executor.TaskMetrics; local class incompatible: stream
classdesc ser
2015-10-13 03:42:54,843 [task-result-getter-2] WARN
org.apache.spark.scheduler.TaskSetManager - Lost task 173.0 in stage 0.0
(TID 173, 9.37.251.65): java.io.InvalidClassException:
scala.reflect.ClassTag$$anon$1; local class incompatible: stream classdesc
serialVersionUID = -4937928798201944954
java.io.InvalidClassException:
$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$Nomatch$;
no valid constructor
        at java.io.ObjectStreamClass$ExceptionInfo.newInvalidClassException(ObjectStreamClass.java:150)
        at java.io.ObjectStreamClass.checkDeserialize
I would certainly try to mark the Validator class as Serializable... If that
doesn't do it, you can also try this flag and see if it sheds more light:
-Dsun.io.serialization.extendedDebugInfo=true
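For reference, making a class cooperate with Java serialization in Scala is just a mixin; the `Validator` name below stands in for the hypothetical class from this thread, and the round-trip uses plain `java.io` streams rather than Spark's task serializer:

```scala
import java.io._

// Hypothetical Validator from this thread; the fix is simply
// mixing in Serializable.
class Validator(val rule: String) extends Serializable

object ValidatorRoundTrip {
  def main(args: Array[String]): Unit = {
    // Write the instance out and read it back, the way Spark's
    // task serializer would when shipping it to an executor.
    val buf = new ByteArrayOutputStream()
    val out = new ObjectOutputStream(buf)
    out.writeObject(new Validator("non-empty"))
    out.close()
    val in = new ObjectInputStream(new ByteArrayInputStream(buf.toByteArray))
    val back = in.readObject().asInstanceOf[Validator]
    println(back.rule) // prints "non-empty"
  }
}
```

To pass the debug flag through Spark rather than a bare JVM, `spark.driver.extraJavaOptions` and `spark.executor.extraJavaOptions` are the usual places to set `-Dsun.io.serialization.extendedDebugInfo=true`.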
By programming guide I mean this:
https://spark.apache.org/docs/latest/programming
From: Yana Kadiyska [mailto:yana.kadiy...@gmail.com]
Sent: Monday, July 13, 2015 2:16 PM
To: Ellafi, Saif A.
Cc: user@spark.apache.org
Subject: Re: java.io.InvalidClassException

It's a bit hard to tell from the snippets of code, but it's likely related to
the fact that when you serialize instances, the enclosing class, if any, also
gets serialized.
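The enclosing-class capture described here is the classic Spark closure pitfall: a function defined on a non-serializable outer class drags the whole outer instance into the serialized task. A minimal sketch of the pattern and the usual fix (copy the needed field into a local val first); class and field names are illustrative, and plain Java serialization stands in for Spark's task serializer:

```scala
import java.io._

class Outer(val factor: Int) { // note: NOT Serializable
  // BAD: the closure body reads this.factor, so it captures `this`,
  // and serializing the function tries to serialize Outer too.
  def badFn: Int => Int = x => x * factor

  // GOOD: copy the field into a local val; the closure then captures
  // only an Int, which serializes fine.
  def goodFn: Int => Int = {
    val f = factor
    x => x * f
  }
}

object ClosureDemo {
  def serialize(obj: AnyRef): Unit = {
    val out = new ObjectOutputStream(new ByteArrayOutputStream())
    out.writeObject(obj)
    out.close()
  }

  def main(args: Array[String]): Unit = {
    val outer = new Outer(3)
    serialize(outer.goodFn) // fine: closure is self-contained
    try serialize(outer.badFn) // throws NotSerializableException: Outer
    catch { case e: NotSerializableException => println(s"caught: $e") }
  }
}
```

The same fix applies inside an RDD transformation: bind the needed value to a local val before the `map`/`filter` lambda so the enclosing class stays on the driver.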
evaluate_paths(some_row, validators).
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in
stage 125.0 failed 1 times, most recent failure: Lost task 0.0 in stage 125.0
(TID 830, localhost): java.io.InvalidClassException:
$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC
to stage failure: Task 0 in stage 3.0 failed 4 times, most recent failure:
Lost task 0.3 in stage 3.0 (TID 346, datanode02):
java.io.InvalidClassException: org.apache.spark.rdd.PairRDDFunctions; local
class incompatible: stream classdesc serialVersionUID =
8789839749593513237, local class serialVersionUID
The workaround was to wrap the map returned by spark libraries into HashMap
and then broadcast them.
Could anyone please let me know if there is any issue open?
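In code terms, the workaround looks something like the sketch below: `collectAsMap()` on the Java API yields a `SerializableMapWrapper`, which (per the stack trace) has no valid constructor for deserialization, so copying it into a plain `java.util.HashMap` gives the broadcast a type that round-trips cleanly. No live SparkContext here, and the helper name is illustrative:

```scala
import java.util.{HashMap => JHashMap, Map => JMap}

object BroadcastWorkaround {
  // Copy a map returned by the Spark Java API (e.g. the result of
  // JavaPairRDD.collectAsMap(), a SerializableMapWrapper) into a plain
  // HashMap, which deserializes without the no-valid-constructor problem.
  def toBroadcastable[K, V](collected: JMap[K, V]): JHashMap[K, V] =
    new JHashMap[K, V](collected)
}
```

On the driver this would be used as something like `sc.broadcast(BroadcastWorkaround.toBroadcastable(pairRdd.collectAsMap()))`.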
SerializableMapWrapper was added in
https://issues.apache.org/jira/browse/SPARK-3926; do you mind opening a new
JIRA and linking it to that one?
On Mon, Dec 1, 2014 at 12:17 AM, lokeshkumar lok...@dataken.net wrote:

> The workaround was to wrap the map returned by spark libraries into HashMap
> and then broadcast them.
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in
stage 815081.0 failed 4 times, most recent failure: Lost task 0.3 in stage
815081.0 (TID 4751, ns2.x.net): java.io.InvalidClassException:
org.apache.spark.api.java.JavaUtils$SerializableMapWrapper; no valid
constructor
        at java.io.ObjectStreamClass
After wasting a lot of time, I've found the problem. Although I haven't used
hadoop/hdfs in my application, the hadoop client still matters. The problem
was the hadoop-client version: it was different from the version of Hadoop
that Spark was built for. Spark's Hadoop version was 1.2.1, but in my
application that was
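If the build is sbt-based, the fix amounts to pinning hadoop-client to the Hadoop version the Spark distribution was built against; the coordinates below are illustrative, with 1.2.1 taken from this message:

```scala
// build.sbt (illustrative): align hadoop-client with the Hadoop version
// your Spark build expects (1.2.1 in this thread).
libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "1.2.1"
```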