task not serializable on simple operations

2017-10-16 Thread Imran Rajjad
Is there a way around implementing a separate Java class with the Serializable interface even for small, petty arithmetic operations? Below is code from a simple decision tree example: Double testMSE = predictionAndLabel.map(new Function<Tuple2<Double, Double>, Double>() { @Override public Double call(Tupl

RE: Spark sql with Zeppelin, Task not serializable error when I try to cache the spark sql table

2017-05-31 Thread Mahesh Sawaiker
serializable error when I try to cache the spark sql table Hello all, I am using Zeppelin 0.7.1 with Spark 2.1.0 I am getting org.apache.spark.SparkException: Task not serializable error when I try to cache the spark sql table. I am using a UDF on a column of table and want to cache the resultant table

Spark sql with Zeppelin, Task not serializable error when I try to cache the spark sql table

2017-05-31 Thread shyla deshpande
Hello all, I am using Zeppelin 0.7.1 with Spark 2.1.0. I am getting an org.apache.spark.SparkException: Task not serializable error when I try to cache the spark sql table. I am using a UDF on a column of the table and want to cache the resultant table. I can execute the paragraph successfully when

Re: org.apache.spark.SparkException: Task not serializable

2017-03-13 Thread Yong Zhang
tava; user@spark.apache.org Subject: Re: org.apache.spark.SparkException: Task not serializable For scala, make your class Serializable, like this ``` class YourClass extends Serializable { } ``` On Sat, Mar 11, 2017 at 3:51 PM, 萝卜丝炒饭 <1427357...@qq.com> wrote: hi min
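The fix in this reply, `class YourClass extends Serializable`, works because Spark ships task closures through plain Java serialization by default. A minimal standalone sketch of that mechanism, with no Spark dependency (the class names `Plain` and `Fixed` are invented for illustration):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// Hypothetical stand-ins: Plain mimics a class referenced inside a closure,
// Fixed applies the `extends Serializable` fix suggested in the reply.
class Plain { int x = 1; }
class Fixed implements Serializable { int x = 1; }

public class SerializableFix {
    // Attempt the same round-trip Spark's JavaSerializer performs on a task closure.
    static boolean serializes(Object o) {
        try (ObjectOutputStream oos = new ObjectOutputStream(new ByteArrayOutputStream())) {
            oos.writeObject(o);
            return true;
        } catch (NotSerializableException e) {
            return false;
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println("Plain serializes: " + serializes(new Plain())); // false
        System.out.println("Fixed serializes: " + serializes(new Fixed())); // true
    }
}
```

Anything a closure references must pass `writeObject` like this, which is why the exception names classes the user never asked Spark to ship.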

Re: org.apache.spark.SparkException: Task not serializable

2017-03-11 Thread Yan Facai
's idea. > > thanks > Robin > > ---Original--- > *From:* "Mina Aslani" > *Date:* 2017/3/7 05:32:10 > *To:* "Ankur Srivastava"; > *Cc:* "user@spark.apache.org"; > *Subject:* Re: org.apache.spark.SparkException: Task not serializable

Re: org.apache.spark.SparkException: Task not serializable

2017-03-10 Thread 萝卜丝炒饭
hi mina, can you paste your new code here please? I meet this issue too but do not get Ankur's idea. thanks Robin ---Original--- From: "Mina Aslani" Date: 2017/3/7 05:32:10 To: "Ankur Srivastava"; Cc: "user@spark.apache.org"; Subject: Re: org.apache.Sp

Re: org.apache.spark.SparkException: Task not serializable

2017-03-06 Thread Mina Aslani
f lines of a text file in my >> mac, however I get >> >> org.apache.spark.SparkException: Task not serializable error on >> >> JavaRDD logData = javaCtx.textFile(file); >> >> Please see below for the sample of code and the stackTrace. >> >> Any idea why th

Re: org.apache.spark.SparkException: Task not serializable

2017-03-06 Thread Ankur Srivastava
ith spark and get number of lines of a text file in my > mac, however I get > > org.apache.spark.SparkException: Task not serializable error on > > JavaRDD logData = javaCtx.textFile(file); > > Please see below for the sample of code and the stackTrace. > > Any idea why th

org.apache.spark.SparkException: Task not serializable

2017-03-06 Thread Mina Aslani
Hi, I am trying to start with spark and get number of lines of a text file in my mac, however I get org.apache.spark.SparkException: Task not serializable error on JavaRDD logData = javaCtx.textFile(file); Please see below for the sample of code and the stackTrace. Any idea why this error is

task not serializable in case of groupByKey() + mapGroups + map?

2016-10-31 Thread Yang
map(xx=>{ val simpley = yyy.value 1 }) I'm seeing error: org.apache.spark.SparkException: Task not serializable at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:298) at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$Closure

Re: Spark 2.0 Structured Streaming: sc.parallelize in foreach sink cause Task not serializable error

2016-09-26 Thread Michael Armbrust
"sstest") > println(v_str(0),v_str(1),v_str(2),v_str(3))} > override def close(errorOrNull: Throwable) = () > } > > val query = > line_count.writeStream.outputMode("complete").foreach(writer).start() > > query.awaitTermination()

Spark 2.0 Structured Streaming: sc.parallelize in foreach sink cause Task not serializable error

2016-09-25 Thread Jianshi
problem? Or is there another way to save the result using foreach sink? Thanks very much. Best, Jianshi -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-2-0-Structured-Streaming-sc-parallelize-in-foreach-sink-cause-Task-not-serializable-error-tp27791.html Sent from the Apache Spark User List mailing list archive at Nabble.com. - To unsubscribe e-mail: user-unsubscr...@spark.apache.org

Re: Task not serializable: java.io.NotSerializableException: org.json4s.Serialization$$anon$1

2016-07-19 Thread RK Aduri
Did you check this: case class Example(name : String, age ; Int) has a semicolon; it should have been (age : Int). -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Task-not-serializable-java-io-NotSerializableException-org-json4s-Serialization-anon-1

Re: Task not serializable: java.io.NotSerializableException: org.json4s.Serialization$$anon$1

2016-07-19 Thread joshuata
r-list.1001560.n3.nabble.com/Task-not-serializable-java-io-NotSerializableException-org-json4s-Serialization-anon-1-tp8233p27359.html

Spark Task not serializable with lag Window function

2016-05-18 Thread luca_guerra
I've noticed that after I use a Window function over a DataFrame if I call a map() with a function, Spark returns a "Task not serializable" Exception This is my code: val hc = new org.apache.spark.sql.hive.HiveContext(sc) import hc.implicits._ import org.apache.spark.sql.expression

Re: Renaming sc variable in sparkcontext throws task not serializable

2016-03-02 Thread Prashant Sharma
text variable (sc) to a new variable and >>> reference >>> another variable in an RDD lambda expression we get a task not >>> serializable exception. >>> >>> The following three lines of code illustrate this : >>> >>> val temp = 10 >>

Re: Renaming sc variable in sparkcontext throws task not serializable

2016-03-02 Thread Rahul Palamuttam
zeppelin. >> If we assign the sparkcontext variable (sc) to a new variable and >> reference >> another variable in an RDD lambda expression we get a task not >> serializable exception. >> >> The following three lines of code illustrate this: >> >

Re: Renaming sc variable in sparkcontext throws task not serializable

2016-03-02 Thread Jeff Zhang
sc) to a new variable and reference > another variable in an RDD lambda expression we get a task not > serializable exception. > > The following three lines of code illustrate this: > > val temp = 10 > val newSC = sc > val newRDD = newSC.parallelize(0 to 100).map(p => p

Renaming sc variable in sparkcontext throws task not serializable

2016-03-02 Thread Rahul Palamuttam
Hi All, We recently came across this issue when using the spark-shell and zeppelin. If we assign the sparkcontext variable (sc) to a new variable and reference another variable in an RDD lambda expression we get a task not serializable exception. The following three lines of code illustrate this
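A plausible reading of this report: the shell compiles every top-level `val` into fields of one generated wrapper object, so a closure that mentions `temp` also drags along the sibling field now holding the SparkContext (the shell's own `sc` is, as far as I recall, marked transient, which is why the unrenamed version works). A plain-Java sketch of that mechanism, with no Spark dependency (all names hypothetical; a bare `Object` stands in for the non-serializable SparkContext):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// Serializable function type, standing in for an RDD lambda.
interface ShellFn extends Serializable { int apply(int x); }

// Stand-in for the wrapper object the shell generates around top-level vals.
class ShellWrapper implements Serializable {
    int temp = 10;
    Object newSC = new Object();      // non-serializable, like a SparkContext

    ShellFn closure() {
        return x -> x + temp;         // reads a field, so it captures `this`
    }
}

// Same wrapper, but the context field is transient, so serializing the closure skips it.
class FixedWrapper implements Serializable {
    int temp = 10;
    transient Object newSC = new Object();

    ShellFn closure() {
        return x -> x + temp;
    }
}

public class RenamedScDemo {
    static String trySerialize(ShellFn fn) {
        try (ObjectOutputStream oos = new ObjectOutputStream(new ByteArrayOutputStream())) {
            oos.writeObject(fn);
            return "ok";
        } catch (NotSerializableException e) {
            return "NotSerializableException: " + e.getMessage();
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println("plain field:     " + trySerialize(new ShellWrapper().closure()));
        System.out.println("transient field: " + trySerialize(new FixedWrapper().closure()));
    }
}
```

The closure only mentions `temp`, yet the plain wrapper fails, because the lambda's hidden capture of `this` pulls in every field of the wrapper.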

Re: Troubleshooting "Task not serializable" in Spark/Scala environments

2015-09-22 Thread Huy Banh
>>> val header = logData.first >>> // filter out header >>> val sample = logData.filter(!_.contains(header)).map { >>> line => line.replaceAll("['\"]","").substring(0,line.length()-1) >>> }.takeSampl

Re: Troubleshooting "Task not serializable" in Spark/Scala environments

2015-09-21 Thread Alexis Gillain
>> >> Code: >> >> //get file >> val logFile = "s3n://file" >> val logData = sc.textFile(logFile) >> // header >> val header = logData.first >> // filter out header >> val sample = logData.filter(!_.contains(header)).map {

Re: Troubleshooting "Task not serializable" in Spark/Scala environments

2015-09-21 Thread Igor Berman
quot;").substring(0,line.length()-1) > }.takeSample(false,100,12L) > > Stack Trace: > > org.apache.spark.SparkException: Task not serializable > > org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:315) > > or

Re: Troubleshooting "Task not serializable" in Spark/Scala environments

2015-09-21 Thread Ted Yu
er(!_.contains(header)).map { > line => line.replaceAll("['\"]","").substring(0,line.length()-1) > }.takeSample(false,100,12L) > > Stack Trace: > > org.apache.spark.SparkException: Task not serializable > > org.apache.spark.util.Clos

Troubleshooting "Task not serializable" in Spark/Scala environments

2015-09-21 Thread Balaji Vijayan
/ filter out header val sample = logData.filter(!_.contains(header)).map { line => line.replaceAll("['\"]","").substring(0,line.length()-1) }.takeSample(false,100,12L) Stack Trace: org.apache.spark.SparkException: Task not serializable org.apache.spark.u

Re: Job aborted due to stage failure: Task not serializable:

2015-07-16 Thread Akhil Das
> > I am using the below code and using the kryo serializer. When I run this code > I get this error: Task not serializable at the commented line > 2) how are broadcast variables treated in the executor? are they local variables > or can they be used in any function defined as global variables. >

Job aborted due to stage failure: Task not serializable:

2015-07-15 Thread Naveen Dabas
I am using the below code and using the kryo serializer. When I run this code I get this error: Task not serializable at the commented line. 2) How are broadcast variables treated in the executor? Are they local variables, or can they be used in any function defined as global variables? object

Re: Spark stream test throw org.apache.spark.SparkException: Task not serializable when execute in spark shell

2015-06-24 Thread Yana Kadiyska
I can't tell immediately, but you might be able to get more info with the hint provided here: http://stackoverflow.com/questions/27980781/spark-task-not-serializable-with-simple-accumulator (short version, set -Dsun.io.serialization.extendedDebugInfo=true) Also, unless you're simpli

Spark stream test throw org.apache.spark.SparkException: Task not serializable when execute in spark shell

2015-06-24 Thread yuemeng (A)
hi all, there are two examples: one throws Task not serializable when executed in the spark shell, the other one is ok. I am very puzzled; can anyone explain what's different about these two pieces of code and why the other one is ok? 1. The one which throws Task not serializable: import org.apache.spark._ i

Re: Wired Problem: Task not serializable[Spark Streaming]

2015-06-08 Thread Michael Albert
not on the workers. So on the workers, there is nowhere to which the "return" can jump. Hence it is not serializable. Good luck. -Mike From: "bit1...@163.com" To: user Sent: Monday, June 8, 2015 10:01 PM Subject: Re: Wired Problem: Task not serializable[Spark S

Re: Wired Problem: Task not serializable[Spark Streaming]

2015-06-08 Thread bit1...@163.com
Could someone help explain what happens that leads to the Task not serializable issue? Thanks. bit1...@163.com From: bit1...@163.com Date: 2015-06-08 19:08 To: user Subject: Wired Problem: Task not serializable[Spark Streaming] Hi, With the following simple code, I got an exception that

[SQL][1.3.1][JAVA] UDF in java cause Task not serializable

2015-04-27 Thread Shuai Zheng
Hi All, Basically I try to define a simple UDF and use it in the query, but it gives me "Task not serializable" public void test() { RiskGroupModelDefinition model = registeredRiskGroupMap.get(this.modelId); RiskGroupModelDefinition edm = this

Re: Task not Serializable: Graph is unexpectedly null when DStream is being serialized

2015-04-22 Thread Tathagata Das
moving the context bounds solve the >>>>>> problem... What does this exception mean in general? >>>>>> >>>>>> On Mon, Apr 20, 2015 at 1:33 PM, Tathagata Das >>>>>> wrote: >>>>>> >>>>>>> When are

Re: Task not Serializable: Graph is unexpectedly null when DStream is being serialized

2015-04-22 Thread Jean-Pascal Billaud
PM, Tathagata Das >>>>> wrote: >>>>> >>>>>> When are you getting this exception? After starting the context? >>>>>> >>>>>> TD >>>>>> >>>>>> On Mon, Apr 20, 2015 at 10:44

Re: Task not Serializable: Graph is unexpectedly null when DStream is being serialized

2015-04-21 Thread Tathagata Das
>>> j...@tellapart.com> wrote: >>>>> >>>>>> Hi, >>>>>> >>>>>> I am getting this serialization exception and I am not too sure what >>>>>> "Graph is unexpectedly null when DStream is bein

Re: Task not Serializable: Graph is unexpectedly null when DStream is being serialized

2015-04-21 Thread Jean-Pascal Billaud
>>>> >>>>> I am getting this serialization exception and I am not too sure what >>>>> "Graph is unexpectedly null when DStream is being serialized" means? >>>>> >>>>> 15/04/20 06:12:38 INFO yarn.ApplicationMaster: Final a

Re: Task not Serializable: Graph is unexpectedly null when DStream is being serialized

2015-04-21 Thread Tathagata Das
>>>> "Graph is unexpectedly null when DStream is being serialized" means? >>>> >>>> 15/04/20 06:12:38 INFO yarn.ApplicationMaster: Final app status: >>>> FAILED, exitCode: 15, (reason: User class threw exception: T

Re: Task not Serializable: Graph is unexpectedly null when DStream is being serialized

2015-04-21 Thread Jean-Pascal Billaud
>> "Graph is unexpectedly null when DStream is being serialized" means? >>> >>> 15/04/20 06:12:38 INFO yarn.ApplicationMaster: Final app status: FAILED, >>> exitCode: 15, (reason: User class threw exception: Task not serializable) >>> Excep

Re: Task not Serializable: Graph is unexpectedly null when DStream is being serialized

2015-04-20 Thread Jean-Pascal Billaud
> "Graph is unexpectedly null when DStream is being serialized" means? >> >> 15/04/20 06:12:38 INFO yarn.ApplicationMaster: Final app status: FAILED, >> exitCode: 15, (reason: User class threw exception: Task not serializable) >> Exception in thread &

Re: Task not Serializable: Graph is unexpectedly null when DStream is being serialized

2015-04-20 Thread Tathagata Das
means? > > 15/04/20 06:12:38 INFO yarn.ApplicationMaster: Final app status: FAILED, > exitCode: 15, (reason: User class threw exception: Task not serializable) > Exception in thread "Driver" org.apache.spark.SparkException: Task not > serializable > at org.ap

Task not Serializable: Graph is unexpectedly null when DStream is being serialized

2015-04-20 Thread Jean-Pascal Billaud
Hi, I am getting this serialization exception and I am not too sure what "Graph is unexpectedly null when DStream is being serialized" means? 15/04/20 06:12:38 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: Task not se

Task not serializable exception

2015-02-24 Thread Kartheek.R
Hi, I run into a Task not Serializable exception with the following code. When I remove the threads and run, it works, but with threads I run into the Task not serializable exception. object SparkKart extends Serializable{ def parseVector(line: String): Vector[Double] = { DenseVector(line.split

Re: Task not serializable exception

2015-02-23 Thread Kartheek.R
} }) thread1.start } } -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Task-not-serializable-exception-tp21776p21778.html

Task not serializable exception

2015-02-23 Thread Kartheek.R
() { val dist1 =data.map(x => squaredDistance(x,kPoints(0))) } }) thread1.start I am facing Task not serializable exception: Exception in thread "Thread-32" org.apache.spark.SparkException: Task not serializable at org.apache.spark.util.C

Re: SparkException: Task not serializable - Jackson Json

2015-02-14 Thread mickdelaney
er node but only per partition and not for every row like above. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/SparkException-Task-not-serializable-Jackson-Json-tp21347p21655.html

Re: SparkException: Task not serializable - Jackson Json

2015-02-13 Thread jamckelvey
I'm having the same problem with the same sample code. Any progress on this? -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/SparkException-Task-not-serializable-Jackson-Json-tp21347p21651.html

Re: Task not serializable problem in the multi-thread SQL query

2015-02-12 Thread lihu
>> >> } >> }) >> } >> >> this will throw a Task serializable Exception; if I do not use >> multi-threading, it works well. >> No object here is non-serializable, so what is the problem? >> >> >> java.

Re: Task not serializable problem in the multi-thread SQL query

2015-02-12 Thread Michael Armbrust
//some other query > } > > } > }) > } > > this will throw a Task serializable Exception; if I do not use > multi-threading, it works well. > No object here is non-serializable, so what is the problem? > > > java.lang.Error: org.ap

Task not serializable problem in the multi-thread SQL query

2015-02-12 Thread lihu
//some other query } } }) } this will throw a Task serializable Exception; if I do not use multi-threading, it works well. No object here is non-serializable, so what is the problem? java.lang.Error: org.apache.spark.SparkEx

SparkException: Task not serializable - Jackson Json

2015-01-24 Thread mickdelaney
ny])) } catch { case e: Exception => None } }) result.map(mapper.writeValueAsString(_)).saveAsTextFile(outputFile) } }/ Exception in thread "main" org.apache.spark.SparkException: Task not serializable at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleane

Re: Getting exception on JavaSchemaRDD; org.apache.spark.SparkException: Task not serializable

2014-11-22 Thread vdiwakar.malladi
Thanks. After writing it as a static inner class, that exception is gone, but now I am getting a snappy-related exception. I can see the corresponding dependency in the spark assembly jar, yet I still get the exception. Any quick suggestion on this? Here is the stack trace. java.lang.UnsatisfiedLinkErr

Re: Getting exception on JavaSchemaRDD; org.apache.spark.SparkException: Task not serializable

2014-11-22 Thread Sean Owen
You are declaring an anonymous inner class here. It has a reference to the containing class even if you don't use it. If the closure cleaner can't determine it isn't used, this reference will cause everything in the outer class to serialize. Try rewriting this as a named static inner class. On Nov
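The point in this reply can be reproduced without Spark: an anonymous inner class carries a hidden `this$0` reference to its enclosing instance, while a named static nested class does not. A hedged Java sketch (the interface and class names are invented for the demo):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// A serializable function interface, standing in for Spark's Function types.
interface SerFn extends Serializable { int apply(int x); }

public class CaptureDemo {
    private final int offset = 1;   // instance state the anonymous class reads

    // Anonymous inner class: it reads `offset`, so it holds a hidden this$0
    // field pointing at the enclosing CaptureDemo. Serializing the function
    // then tries to serialize the whole (non-serializable) outer object.
    SerFn anonymous() {
        return new SerFn() { public int apply(int x) { return x + offset; } };
    }

    // Named static nested class: no hidden outer reference, serializes cleanly.
    static class AddOne implements SerFn {
        public int apply(int x) { return x + 1; }
    }

    static String trySerialize(SerFn fn) {
        try (ObjectOutputStream oos = new ObjectOutputStream(new ByteArrayOutputStream())) {
            oos.writeObject(fn);
            return "ok";
        } catch (NotSerializableException e) {
            return "NotSerializableException: " + e.getMessage();
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println("anonymous:     " + trySerialize(new CaptureDemo().anonymous()));
        System.out.println("static nested: " + trySerialize(new AddOne()));
    }
}
```

The exception names `CaptureDemo`, not the function itself, which matches the common confusion in these threads: the closure is serializable, its silent captive is not.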

Re: Getting exception on JavaSchemaRDD; org.apache.spark.SparkException: Task not serializable

2014-11-22 Thread vdiwakar.malladi
Thanks for your prompt response. I'm not using anything in my map function; please see the code below. For sample purposes, I would like to use 'select * from '. This code worked for me in standalone mode, but when I integrated it with my web application, it throws the specified exception.

Re: Getting exception on JavaSchemaRDD; org.apache.spark.SparkException: Task not serializable

2014-11-22 Thread Akhil Das
'm using this code implements Serializable class. Could > anyone let me know the cause. > > > org.apache.spark.SparkException: Task not serializable > > Caused by: org.apache.spark.SparkException: Task not serializable > at > > org.apache.spark.util.C

Getting exception on JavaSchemaRDD; org.apache.spark.SparkException: Task not serializable

2014-11-22 Thread vdiwakar.malladi
ializable class. Could anyone let me know the cause. org.apache.spark.SparkException: Task not serializable Caused by: org.apache.spark.SparkException: Task not serializable at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:166) at org.apache.spark.util

Re: Accumulators : Task not serializable: java.io.NotSerializableException: org.apache.spark.SparkContext

2014-10-27 Thread Akhil Das
is works for you also. > > > > > > -- > View this message in context: > http://apache-spark-user-list.1001560.n3.nabble.com/Accumulators-Task-not-serializable-java-io-NotSerializableException-org-apache-spark-SparkContext-tp17262p17287.html

Re: Accumulators : Task not serializable: java.io.NotSerializableException: org.apache.spark.SparkContext

2014-10-26 Thread octavian.ganea
Hi Akhil, Please see this related message. http://apache-spark-user-list.1001560.n3.nabble.com/Bug-in-Accumulators-td17263.html I am curious if this works for you also. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Accumulators-Task-not-serializable

Re: Accumulators : Task not serializable: java.io.NotSerializableException: org.apache.spark.SparkContext

2014-10-26 Thread Akhil Das
= 1; foo(iter)}} > .reduce(_ + _) > println(accum.value) > > Now, if I remove the 'accum += 1', everything works fine. If I keep it, I > get this weird error: > > Exception in thread "main" 14/10/25 21:58:56 INFO TaskSchedulerImpl:

Accumulators : Task not serializable: java.io.NotSerializableException: org.apache.spark.SparkContext

2014-10-25 Thread octavian.ganea
"main" 14/10/25 21:58:56 INFO TaskSchedulerImpl: Cancelling stage 0 org.apache.spark.SparkException: Job aborted due to stage failure: Task not serializable: java.io.NotSerializableException: org.apache.spark.SparkContext at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGSc

Re: What's wrong with my spark filter? I get "org.apache.spark.SparkException: Task not serializable"

2014-10-19 Thread Ilya Ganelin
ong with > my filter, > I get "org.apache.spark.SparkException: Task not serializable" exception. > > here is my filter function: > object OBJ { >def f1(): Boolean = { > var i = 1; > for (j<-1 to 10) i = i +1; > true; >} >

Re: What's wrong with my spark filter? I get "org.apache.spark.SparkException: Task not serializable"

2014-10-17 Thread Sourav Chandra
Hi, > > Probably I am missing a very simple principle, but something is wrong with > my filter; > I get "org.apache.spark.SparkException: Task not serializable" exception. > > here is my filter function: > object OBJ { >def f1(): Boolean = { > var i =

What's wrong with my spark filter? I get "org.apache.spark.SparkException: Task not serializable"

2014-10-17 Thread shahab
Hi, Probably I am missing a very simple principle, but something is wrong with my filter; I get an "org.apache.spark.SparkException: Task not serializable" exception. Here is my filter function: object OBJ { def f1(): Boolean = { var i = 1; for (j<-1 to

Re: Task not serializable

2014-09-10 Thread Marcelo Vanzin
, you create the object on the driver and try to >> >> serialize and copy it to workers. In the second, you're creating >> >> SomeUnserializableManagerClass in the function and therefore on the >> >> worker. >> >> >> >> mapPartitions is better if t

Re: Task not serializable

2014-09-10 Thread Sarath Chandra
creation is expensive. > >> > >> On Fri, Sep 5, 2014 at 3:06 PM, Sarath Chandra > >> wrote: > >> > Hi, > >> > > >> > I'm trying to migrate a map-reduce program to work with spark. I > >> > migrated >

Re: Task not serializable

2014-09-10 Thread Sean Owen
ing this FileSystem instance I read those reference files > and use that data in my processing logic. > > This is throwing task not serializable exceptions for 'UserGroupInformation' > and 'FileSystem' classes. I also tried using 'SparkHadoopUtil' instead

Re: Task not serializable

2014-09-10 Thread Sarath Chandra
again available in HDFS. So inside the map method I'm trying to instantiate UserGroupInformation to get an instance of FileSystem. Then using this FileSystem instance I read those reference files and use that data in my processing logic. This is throwing task not serializable exception

Re: Task not serializable

2014-09-06 Thread Sarath Chandra
ch line in the file it applies several transformation > > functions available in various external libraries. > > > > When I execute this over spark, it is throwing me "Task not serializable" > > exceptions for each and every class being used from these exter

Re: Task not serializable

2014-09-06 Thread Sean Owen
k. I migrated > the program from Java to Scala. The map-reduce program basically loads a > HDFS file and for each line in the file it applies several transformation > functions available in various external libraries. > > When I execute this over spark, it is throwing me "Task not se

Re: Task not serializable

2014-09-05 Thread Sarath Chandra
lies several transformation >> functions available in various external libraries. >> >> When I execute this over spark, it is throwing me "Task not serializable" >> exceptions for each and every class being used from these external >> libraries. I in

Re: Task not serializable

2014-09-05 Thread Akhil Das
xternal libraries. > > When I execute this over spark, it is throwing me "Task not serializable" > exceptions for each and every class being used from these external > libraries. I included serialization for a few classes which are in my scope, > but there ar

Task not serializable

2014-09-05 Thread Sarath Chandra
this over spark, it is throwing me "Task not serializable" exceptions for each and every class being used from these external libraries. I included serialization for a few classes which are in my scope, but there are several other classes which are out of my scope, like org.apache

Re: Debugging "Task not serializable"

2014-08-15 Thread Juan Rodríguez Hortalá
wrote: >>> >>> A quick fix would be to implement java.io.Serializable in those >>>> classes which are causing this exception. >>>> >>>> >>>> Thanks >>>> Best Regards >>>> >>>> >

Spark streaming error - Task not serializable

2014-08-11 Thread Xuri Nagarin
GScheduler: Failed to run foreach at :31 14/08/12 00:47:25 ERROR JobScheduler: Error running job streaming job 1407804445000 ms.0 org.apache.spark.SparkException: Job aborted due to stage failure: Task not serializable: java.io.NotSerializableException: org.apache.spark.streaming.StreamingContext at org.apa

Re: Debugging "Task not serializable"

2014-07-30 Thread Sourav Chandra
;>> >>> >>> On Mon, Jul 28, 2014 at 9:21 PM, Juan Rodríguez Hortalá < >>> juan.rodriguez.hort...@gmail.com> wrote: >>> >>>> Hi all, >>>> >>>> I was wondering if someone has conceived a method for debugging "Task >&

Re: Debugging "Task not serializable"

2014-07-30 Thread Juan Rodríguez Hortalá
, Jul 28, 2014 at 9:21 PM, Juan Rodríguez Hortalá < >> juan.rodriguez.hort...@gmail.com> wrote: >> >>> Hi all, >>> >>> I was wondering if someone has conceived a method for debugging "Task >>> not serializable: java.io.NotSerializableExcepti

Re: Debugging "Task not serializable"

2014-07-28 Thread andy petrella
ausing this exception. > > > > Thanks > Best Regards > > > On Mon, Jul 28, 2014 at 9:21 PM, Juan Rodríguez Hortalá < > juan.rodriguez.hort...@gmail.com> wrote: > >> Hi all, >> >> I was wondering if someone has conceived a method for debugging "

Re: Debugging "Task not serializable"

2014-07-28 Thread Akhil Das
ed a method for debugging "Task not > serializable: java.io.NotSerializableException" errors, apart from > commenting and uncommenting parts of the program, or just turning > everything into Serializable. I find this kind of error very hard to debug, > as these are originated

Debugging "Task not serializable"

2014-07-28 Thread Juan Rodríguez Hortalá
Hi all, I was wondering if someone has conceived a method for debugging "Task not serializable: java.io.NotSerializableException" errors, apart from commenting and uncommenting parts of the program, or just turning everything into Serializable. I find this kind of error very hard to

Re: Task not serializable: java.io.NotSerializableException: org.apache.spark.SparkContext

2014-07-24 Thread Tathagata Das
is code is in the MLLib, I just try to broadcast the >>> centerArrays ] >>> >>> it can success in the redeceBykey operation, but failed at the >>> collect operation, this confused me. >>> >>> >>> INFO DAGScheduler: Fai

Re: Task not serializable: java.io.NotSerializableException: org.apache.spark.SparkContext

2014-07-24 Thread lihu
rrays ] >> >> it can success in the redeceBykey operation, but failed at the >> collect operation, this confused me. >> >> >> INFO DAGScheduler: Failed to run collect at KMeans.scala:235 >> [error] (run-main-0) org.apache.spark.SparkException: Job aborted:

Re: Task not serializable: java.io.NotSerializableException: org.apache.spark.SparkContext

2014-07-21 Thread hsy...@gmail.com
eans.scala:235 > [error] (run-main-0) org.apache.spark.SparkException: Job aborted: Task > not serializable: java.io.NotSerializableException: > org.apache.spark.SparkContext > org.apache.spark.SparkException: Job aborted: Task not serializable: > java.io.NotSerial

Task not serializable: java.io.NotSerializableException: org.apache.spark.SparkContext

2014-07-19 Thread lihu
centerArrays ] it can success in the redeceBykey operation, but failed at the collect operation, this confused me. INFO DAGScheduler: Failed to run collect at KMeans.scala:235 [error] (run-main-0) org.apache.spark.SparkException: Job aborted: Task not serializable: java.io.NotSerializableException

Nested method in a class: Task not serializable?

2014-05-16 Thread Pierre B
Hi! I understand the usual "Task not serializable" issue that arises when accessing a field or a method that is out of scope of a closure. To fix it, I usually define a local copy of these fields/methods, which avoids the need to serialize the whole class: class MyClass(val my
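The local-copy trick described here has a direct analogue for Java's serializable lambdas: reading a field inside the lambda captures `this` (the whole enclosing object), while copying the field to a local first captures only the value. A small illustrative sketch (all names hypothetical, no Spark dependency):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// Serializable function type, standing in for a closure passed to an RDD.
interface LocalFn extends Serializable { int apply(int x); }

// Hypothetical class holding state the closures want to use. Note: NOT Serializable.
class MyClass {
    private final int myField = 42;

    // Reads the field directly: the lambda captures `this`, so the whole
    // (non-serializable) MyClass would have to ship with the task.
    LocalFn direct() {
        return x -> x + myField;
    }

    // Local copy first: the lambda captures only a primitive int.
    LocalFn viaLocal() {
        final int local = myField;
        return x -> x + local;
    }
}

public class LocalCopyDemo {
    static String trySerialize(LocalFn fn) {
        try (ObjectOutputStream oos = new ObjectOutputStream(new ByteArrayOutputStream())) {
            oos.writeObject(fn);
            return "ok";
        } catch (NotSerializableException e) {
            return "NotSerializableException: " + e.getMessage();
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println("field access: " + trySerialize(new MyClass().direct()));
        System.out.println("local copy:   " + trySerialize(new MyClass().viaLocal()));
    }
}
```

This is the same reasoning as the Scala local-copy idiom: shrink the closure's capture set to serializable values instead of making the whole class serializable.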

Re: Task not serializable?

2014-05-15 Thread pedro
-list.1001560.n3.nabble.com/Re-Task-not-serializable-tp3507p5506.html

Re: Task not serializable: collect, take

2014-05-02 Thread SK
Thank you very much. Making the trait serializable worked. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Task-not-serializable-collect-take-tp5193p5236.html

Re: Task not serializable: collect, take

2014-05-01 Thread Marcelo Vanzin
Have you tried making A extend Serializable? On Thu, May 1, 2014 at 3:47 PM, SK wrote: > Hi, > > I have the following code structure. I compiles ok, but at runtime it aborts > with the error: > Exception in thread "main" org.apache.spark.SparkException: Job aborted

Task not serializable: collect, take

2014-05-01 Thread SK
Hi, I have the following code structure. It compiles ok, but at runtime it aborts with the error: Exception in thread "main" org.apache.spark.SparkException: Job aborted: Task not serializable: java.io.NotSerializableException: I am running in local (standalone) mode. trait A{

Re: Task not serializable?

2014-03-31 Thread Daniel Liu
Hi, I am new to Spark and I encountered this error when I try to map RDD[A] => RDD[Array[Double]] and then collect the results. A is a custom class that extends Serializable. (Actually it's just a wrapper class which wraps a few variables that are all serializable.) I also tried KryoSerializer according t