Re: ClassLoader problem - java.io.InvalidClassException: scala.Option; local class incompatible

2017-02-20 Thread Kohki Nishio
I'm getting below: java.io.InvalidClassException: scala.Option; local class incompatible: stream classdesc serialVersionUID = -114498752079829388, local class serialVersionUID = 5081326844987135632 at java.io.ObjectStreamClass.initNonProxy(

ClassLoader problem - java.io.InvalidClassException: scala.Option; local class incompatible

2017-02-20 Thread Kohki Nishio
Hello, I'm writing a Play Framework application that uses Spark; however, I'm getting the following: java.io.InvalidClassException: scala.Option; local class incompatible: stream classdesc serialVersionUID = -114498752079829388, local class serialVersionUID = 5081326844987135
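
A serialVersionUID mismatch on a core class such as scala.Option usually means the driver and the executors are loading two different Scala builds, e.g. the Play application bundling one Scala minor version while the Spark installation ships another. A minimal build.sbt sketch of the usual remedy, assuming an sbt-based project; the version numbers below are placeholders, not taken from this thread:

    // Pin the application's Scala to the build your Spark distribution was
    // compiled against, and let the cluster supply Spark itself at runtime.
    scalaVersion := "2.11.8"

    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "2.1.0" % "provided",
      "org.apache.spark" %% "spark-sql"  % "2.1.0" % "provided"
    )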

Re: java.io.InvalidClassException: org.apache.spark.executor.TaskMetrics

2017-01-20 Thread kant kodali
Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 13, 172.31.20.212): java.io.InvalidClassException: org.apache.spark.executor.TaskMetrics; local class incompatible: stream classdesc serialVersionUID = -2231953621568687904, local class serialVersionUID = -6966587383730940799

Re: java.io.InvalidClassException: org.apache.spark.executor.TaskMetrics

2017-01-20 Thread kant kodali
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 13, 172.31.20.212): java.io.InvalidClassException: org.apache.spark.executor.TaskMetrics; local class incompatible: stream classdesc serialVersionUID = -

java.io.InvalidClassException: org.apache.spark.executor.TaskMetrics

2017-01-20 Thread kant kodali
I get the following exception. I am using Spark 2.0.1 and Scala 2.11.8. org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 13, 172.31.20.212): java.io.InvalidClassException
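
A mismatch on org.apache.spark.executor.TaskMetrics is typically a sign that the jars the driver was built with come from a different Spark release than the one the executors run. One quick check (a sketch, not from the original thread) is to print the version the driver actually sees and compare it with what the cluster's web UI reports:

    import org.apache.spark.sql.SparkSession

    object VersionCheck {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("version-check").getOrCreate()
        // This should match the Spark version reported by the master and executors.
        println(s"Driver is running Spark ${spark.version}")
        spark.stop()
      }
    }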

Re: java.io.InvalidClassException using spark1.4.1 for Terasort

2015-10-14 Thread Sonal Goyal
2015-10-13 03:42:54,843 [task-result-getter-2] WARN org.apache.spark.scheduler.TaskSetManager - Lost task 173.0 in stage 0.0 (TID 173, 9.37.251.65): java.io.InvalidClassException: scala.reflect.ClassTag$$anon$1; local class incompatible: stream classdesc serialVersionUID = -4937928798201944954, lo

java.io.InvalidClassException using spark1.4.1 for Terasort

2015-10-14 Thread Shreeharsha G Neelakantachar
2015-10-13 03:42:54,843 [task-result-getter-2] WARN org.apache.spark.scheduler.TaskSetManager - Lost task 173.0 in stage 0.0 (TID 173, 9.37.251.65): java.io.InvalidClassException: scala.reflect.ClassTag$$anon$1; local class incompatible: stream classdesc serialVersionUID = -4937928798201944954

RE: java.io.InvalidClassException

2015-07-13 Thread Saif.A.Ellafi
I would certainly try to mark the Validator class as Serializable... If that doesn't do it, you can also try and see if this flag sheds more light: -Dsun.io.serialization.extendedDebugInfo=true. By programming guide I mean this: https://spark.apache.org/docs/latest/progra
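
For concreteness, marking the class Serializable might look like the sketch below; the Validator shape shown here is illustrative, since the full class never appears in the thread. The debug flag can be passed to the executors through spark.executor.extraJavaOptions (and to the driver via spark.driver.extraJavaOptions).

    import org.apache.spark.sql.Row

    // Illustrative shape only: extending Serializable lets Spark ship
    // Validator instances inside task closures to the executors.
    trait Validator extends Serializable {
      def evaluate(input: Row): Validator
    }

    // spark-submit \
    //   --conf "spark.executor.extraJavaOptions=-Dsun.io.serialization.extendedDebugInfo=true" \
    //   ...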

Re: java.io.InvalidClassException

2015-07-13 Thread Yana Kadiyska
t: Row): Validator = {
    var check1: Boolean = if (input.getDouble(shortsale_in_pos) > 140.0) true else false
    if (check1) this else Nomatch
  }
}

Saif

RE: java.io.InvalidClassException

2015-07-13 Thread Saif.A.Ellafi
It's a bit hard to tell from the snippets of code, but it's likely related to the fact that when you serialize instances, the enclosing class, if any, also
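
When a method defined inside a class (or inside the shell's $iwC wrappers) is used in an RDD closure, the enclosing instance gets pulled into the serialized closure, and that instance may not be serializable or may not match what the executors have on their classpath. A common way around it, sketched below on the assumption that the check from the quoted code is the relevant logic (names are illustrative):

    import org.apache.spark.sql.Row

    // Keeping the row-level check in a top-level object means the closure only
    // captures this singleton, not an enclosing class or REPL wrapper.
    object ShortSaleCheck {
      def passes(input: Row, shortsaleInPos: Int): Boolean =
        input.getDouble(shortsaleInPos) > 140.0
    }

    // usage inside a job:
    // val matches = rows.filter(row => ShortSaleCheck.passes(row, shortsaleInPos))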

Re: java.io.InvalidClassException

2015-07-13 Thread Yana Kadiyska
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 125.0 failed 1 times, most recent failure: Lost task 0.0 in stage 125.0 (TID 830, localhost): java.io.InvalidClassException: $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iw

java.io.InvalidClassException

2015-07-13 Thread Saif.A.Ellafi
ion works properly, when calling evaluate_paths(some_row, validators). org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 125.0 failed 1 times, most recent failure: Lost task 0.0 in stage 125.0 (TID 830, localhost): java.io.InvalidClassException: $iwC$$iwC$$iwC$$iwC

java.io.InvalidClassException: org.apache.spark.rdd.PairRDDFunctions; local class incompatible: stream classdesc

2015-03-10 Thread Manas Kar
Job aborted due to stage failure: Task 0 in stage 3.0 failed 4 times, most recent failure: Lost task 0.3 in stage 3.0 (TID 346, datanode02): java.io.InvalidClassException: org.apache.spark.rdd.PairRDDFunctions; local class incompatible: stream classdesc serialVersionUID = 8789839749593513237, l

Re: Standalone spark cluster. Can't submit job programmatically -> java.io.InvalidClassException

2014-12-11 Thread sivarani
Not able to get it, how did you exactly fix it? I am using a Maven build: I downloaded Spark 1.1.1 and then packaged it with mvn -Dhadoop.version=1.2.1 -DskipTests clean package, but I keep getting invalid class exceptions.

Re: java.io.InvalidClassException: org.apache.spark.api.java.JavaUtils$SerializableMapWrapper; no valid constructor

2014-12-01 Thread Josh Rosen
SerializableMapWrapper was added in https://issues.apache.org/jira/browse/SPARK-3926; do you mind opening a new JIRA and linking it to that one? On Mon, Dec 1, 2014 at 12:17 AM, lokeshkumar wrote: The workaround was to wrap the map returned by spark libraries into HashMap and then broadcast

Re: java.io.InvalidClassException: org.apache.spark.api.java.JavaUtils$SerializableMapWrapper; no valid constructor

2014-12-01 Thread lokeshkumar
The workaround was to wrap the maps returned by the Spark libraries into a HashMap and then broadcast them. Could anyone please let me know if there is an issue open?
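
The workaround described above amounts to copying the returned map into a plain java.util.HashMap before broadcasting, so the broadcast never carries the SerializableMapWrapper itself. A small sketch with illustrative names:

    import java.util.{HashMap => JHashMap}
    import org.apache.spark.api.java.JavaSparkContext

    // Copy the map returned by a Spark Java API call (possibly a
    // SerializableMapWrapper) into a plain HashMap, then broadcast the copy.
    def broadcastCopy[K, V](jsc: JavaSparkContext, m: java.util.Map[K, V]) = {
      val copy = new JHashMap[K, V](m)
      jsc.broadcast(copy)
    }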

java.io.InvalidClassException: org.apache.spark.api.java.JavaUtils$SerializableMapWrapper; no valid constructor

2014-11-29 Thread lokeshkumar
Job aborted due to stage failure: Task 0 in stage 815081.0 failed 4 times, most recent failure: Lost task 0.3 in stage 815081.0 (TID 4751, ns2.dataken.net): java.io.InvalidClassException: org.apache.spark.api.java.JavaUtils$SerializableMapWrapper; no valid constructor at java.io.ObjectStreamClass

Re: Standalone spark cluster. Can't submit job programmatically -> java.io.InvalidClassException

2014-09-08 Thread DrKhu
After wasting a lot of time, I've found the problem. Although I don't use Hadoop/HDFS in my application, the Hadoop client still matters. The problem was the hadoop-client version: it was different from the Hadoop version Spark was built for. Spark's Hadoop version was 1.2.1, but in my application that was
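
In build terms, the fix described here comes down to making the application's hadoop-client dependency match the Hadoop version the Spark distribution was built for. A hedged build.sbt sketch; the version numbers are placeholders apart from the 1.2.1 mentioned in the message:

    libraryDependencies ++= Seq(
      // Use the cluster's Spark at runtime rather than bundling your own copy.
      "org.apache.spark"  %% "spark-core"    % "1.1.0" % "provided",
      // Must match the Hadoop version Spark was built against (1.2.1 here).
      "org.apache.hadoop" %  "hadoop-client" % "1.2.1"
    )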