Re: ClassLoader problem - java.io.InvalidClassException: scala.Option; local class incompatible

2017-02-20 Thread Kohki Nishio
…which uses Spark; however, I'm getting the following: java.io.InvalidClassException: scala.Option; local class incompatible: stream classdesc serialVersionUID = -114498752079829388, local class serialVersionUID = 5081326844987135632 at java.io.ObjectStreamClass.initNonProxy

ClassLoader problem - java.io.InvalidClassException: scala.Option; local class incompatible

2017-02-20 Thread Kohki Nishio
Hello, I'm writing a Play Framework application which uses Spark; however, I'm getting the following: java.io.InvalidClassException: scala.Option; local class incompatible: stream classdesc serialVersionUID = -114498752079829388, local class serialVersionUID = 5081326844987135632
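
Two different serialVersionUIDs for scala.Option usually mean the scala-library on the driver's classpath (the Play application here) is not the same version as the one the Spark executors run. A minimal diagnostic sketch, not from the original post, that prints the locally visible values so they can be compared against the "stream classdesc" number in the exception:

    import java.io.ObjectStreamClass

    object OptionUidCheck {
      def main(args: Array[String]): Unit = {
        // serialVersionUID of scala.Option as computed from the local classpath;
        // compare it with the values reported in the InvalidClassException.
        val desc = ObjectStreamClass.lookup(classOf[Option[_]])
        println(s"local scala.Option serialVersionUID = ${desc.getSerialVersionUID}")
        // the scala-library version actually on this classpath
        println(s"scala-library = ${scala.util.Properties.versionString}")
      }
    }

Running this on both the driver and an executor node shows which side carries the mismatched scala-library.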

Re: java.io.InvalidClassException: org.apache.spark.executor.TaskMetrics

2017-01-20 Thread kant kodali
…SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 13, 172.31.20.212): java.io.InvalidClassException: org.apache.spark.executor.TaskMetrics; local class incompatible: stream classdesc serialVersionUID…

Re: java.io.InvalidClassException: org.apache.spark.executor.TaskMetrics

2017-01-20 Thread kant kodali
…Scala 2.11.8. org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 13, 172.31.20.212): java.io.InvalidClassException: org.apache.spark.executor.TaskMetrics; local class incompatible: stream…

java.io.InvalidClassException: org.apache.spark.executor.TaskMetrics

2017-01-20 Thread kant kodali
I get the following exception. I am using Spark 2.0.1 and Scala 2.11.8. org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 13, 172.31.20.212): java.io.InvalidClassException

java.io.InvalidClassException using spark1.4.1 for Terasort

2015-10-14 Thread Shreeharsha G Neelakantachar
) 2015-10-13 03:42:54,843 [task-result-getter-2] WARN org.apache.spark.scheduler.TaskSetManager - Lost task 173.0 in stage 0.0 (TID 173, 9.37.251.65): java.io.InvalidClassException: scala.reflect.ClassTag$$anon$1; local class incompatible: stream classdesc serialVersionUID = -4937928798201944954

Re: java.io.InvalidClassException using spark1.4.1 for Terasort

2015-10-14 Thread Sonal Goyal
2015-10-13 03:42:54,843 [task-result-getter-2] WARN org.apache.spark.scheduler.TaskSetManager - Lost task 173.0 in stage 0.0 (TID 173, 9.37.251.65): java.io.InvalidClassException: scala.reflect.ClassTag$$anon$1; local class incompatible: stream classdesc serialVersionUID = -4937928798201944954…

Re: java.io.InvalidClassException

2015-07-13 Thread Yana Kadiyska
): java.io.InvalidClassException: $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$Nomatch$; no valid constructor at java.io.ObjectStreamClass$ExceptionInfo.newInvalidClassException(ObjectStreamClass.java:150) at java.io.ObjectStreamClass.checkDeserialize

RE: java.io.InvalidClassException

2015-07-13 Thread Saif.A.Ellafi
I would certainly try to mark the Validator class as Serializable. If that doesn't do it, you can also try this flag and see if it sheds more light: -Dsun.io.serialization.extendedDebugInfo=true. By programming guide I mean this: https://spark.apache.org/docs/latest/programming
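
A small sketch of the first suggestion above; the fields and method of Validator are invented here purely for illustration, the relevant change is the "extends Serializable":

    // Hypothetical stand-in for the poster's Validator class.
    class Validator(val column: String) extends Serializable {
      def matches(value: String): Boolean = value != null && value.nonEmpty
    }

The debug flag is a plain JVM system property, so for a cluster job it has to reach both sides, e.g. via spark-submit's --driver-java-options and the spark.executor.extraJavaOptions configuration.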

RE: java.io.InvalidClassException

2015-07-13 Thread Saif.A.Ellafi
It's a bit hard to tell from the snippets of code, but it's likely related to the fact that when you serialize instances, the enclosing class, if any, also gets serialized
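
An illustration (not the poster's code) of the enclosing-class capture described here: calling an instance method inside a closure captures `this`, so the whole outer object, in the spark-shell the generated $iwC wrapper, has to be serialized along with the task. A sketch with a hypothetical Pipeline class:

    import org.apache.spark.rdd.RDD

    class Pipeline(threshold: Int) {   // note: not Serializable
      def isValid(x: Int): Boolean = x > threshold

      // Problematic: the closure calls this.isValid, so Spark must serialize
      // the whole Pipeline instance with the task.
      def badFilter(rdd: RDD[Int]): RDD[Int] = rdd.filter(x => isValid(x))

      // Safer: copy what the closure needs into a local val so only an Int
      // is captured, not the enclosing Pipeline.
      def goodFilter(rdd: RDD[Int]): RDD[Int] = {
        val t = threshold
        rdd.filter(x => x > t)
      }
    }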

Re: java.io.InvalidClassException

2015-07-13 Thread Yana Kadiyska
…It's a bit hard to tell from the snippets of code, but it's likely related to the fact that when you serialize…

java.io.InvalidClassException

2015-07-13 Thread Saif.A.Ellafi
evaluate_paths(some_row, validators). org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 125.0 failed 1 times, most recent failure: Lost task 0.0 in stage 125.0 (TID 830, localhost): java.io.InvalidClassException: $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC

java.io.InvalidClassException: org.apache.spark.rdd.PairRDDFunctions; local class incompatible: stream classdesc

2015-03-10 Thread Manas Kar
to stage failure: Task 0 in stage 3.0 failed 4 times, most recent failure: Lost task 0.3 in stage 3.0 (TID 346, datanode02): java.io.InvalidClassException: org.apache.spark.rdd.PairRDDFunctions; local class incompatible: stream classdesc serialVersionUID = 8789839749593513237, local class serialVersionUID

Re: java.io.InvalidClassException: org.apache.spark.api.java.JavaUtils$SerializableMapWrapper; no valid constructor

2014-12-01 Thread lokeshkumar
The workaround was to wrap the map returned by the Spark libraries in a HashMap and then broadcast it. Could anyone please let me know if there is an issue open for this?
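
A hedged sketch of that workaround. It assumes the map came from a Java-API call such as JavaPairRDD.collectAsMap(), which is where SerializableMapWrapper shows up; the object and method names here are illustrative:

    import java.util.{HashMap => JHashMap, Map => JMap}
    import org.apache.spark.api.java.{JavaPairRDD, JavaSparkContext}
    import org.apache.spark.broadcast.Broadcast

    object MapBroadcast {
      // Copy the wrapper into a plain java.util.HashMap before broadcasting,
      // so executors never have to deserialize SerializableMapWrapper itself.
      def broadcastAsHashMap[K, V](jsc: JavaSparkContext,
                                   pairs: JavaPairRDD[K, V]): Broadcast[JHashMap[K, V]] = {
        val wrapped: JMap[K, V] = pairs.collectAsMap()
        jsc.broadcast(new JHashMap[K, V](wrapped))
      }
    }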

Re: java.io.InvalidClassException: org.apache.spark.api.java.JavaUtils$SerializableMapWrapper; no valid constructor

2014-12-01 Thread Josh Rosen
SerializableMapWrapper was added in https://issues.apache.org/jira/browse/SPARK-3926; do you mind opening a new JIRA and linking it to that one?

java.io.InvalidClassException: org.apache.spark.api.java.JavaUtils$SerializableMapWrapper; no valid constructor

2014-11-29 Thread lokeshkumar
to stage failure: Task 0 in stage 815081.0 failed 4 times, most recent failure: Lost task 0.3 in stage 815081.0 (TID 4751, ns2.dataken.net): java.io.InvalidClassException: org.apache.spark.api.java.JavaUtils$SerializableMapWrapper; no valid constructor at java.io.ObjectStreamClass

Re: Standalone spark cluster. Can't submit job programmatically - java.io.InvalidClassException

2014-09-08 Thread DrKhu
After wasting a lot of time, I've found the problem. Even though I don't use Hadoop/HDFS in my application, the Hadoop client still matters. The problem was the hadoop-client version: it was different from the Hadoop version Spark was built against. Spark's Hadoop version was 1.2.1, but in my application it was…
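
A hedged sbt sketch of that fix: pin hadoop-client to the Hadoop version the Spark artifacts were built against (1.2.1 in this post). The spark-core version shown is an example, not taken from the post:

    // build.sbt (sketch)
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "1.0.2",
      // must match the Hadoop version of the Spark build/distribution in use
      "org.apache.hadoop" % "hadoop-client" % "1.2.1"
    )

The same applies to a Maven pom: keep hadoop-client at the cluster's Hadoop version rather than whatever a transitive dependency pulls in.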