I had issues around embedded functions too; here's what I have figured out. Every
inner class contains a hidden field referencing the outer class. An
anonymous class in particular has a this$0 field referencing the outer class,
which is why Spark ends up trying to serialize the outer class.
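You can see this hidden field yourself by reflecting over an anonymous class. This is just an illustration; the Outer class and makeTask method below are made-up names, not anything from Spark:

```java
import java.lang.reflect.Field;

public class Outer {
    // Outer itself is not Serializable, so any attempt to serialize
    // the anonymous class below would drag Outer along and fail.
    public Runnable makeTask() {
        // javac adds a hidden synthetic field this$0 to this anonymous
        // class, pointing back at the enclosing Outer instance.
        return new Runnable() {
            @Override public void run() { System.out.println("running"); }
        };
    }

    public static void main(String[] args) {
        Runnable task = new Outer().makeTask();
        // The only declared field is the compiler-generated this$0.
        for (Field f : task.getClass().getDeclaredFields()) {
            System.out.println(f.getName()); // prints "this$0"
        }
    }
}
```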

In the Scala API, a closure (which is really just implemented as an
anonymous class) has a field called "$outer", and Spark uses a
"closure cleaner" that goes into the anonymous class and removes the $outer
field if it is not used in the closure itself. In Java, the compiler
generates a field called "this$0" instead, so the closure cleaner doesn't
find it and can't "clean" the closure properly.
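A common workaround is to define the function as a static nested class instead of an anonymous one, since a static nested class has no this$0 field and so nothing from the enclosing class gets serialized. A minimal sketch (the class names are made up, and a plain call method stands in for Spark's Function interface):

```java
import java.io.Serializable;

public class WordLengths {
    // A static nested class has no this$0 field: only this small,
    // self-contained object is serialized, never the enclosing class.
    static class LengthFn implements Serializable {
        public Integer call(String s) {
            return s.length();
        }
    }

    public static void main(String[] args) {
        LengthFn fn = new LengthFn();
        // No compiler-generated fields at all.
        System.out.println(fn.getClass().getDeclaredFields().length); // 0
        System.out.println(fn.call("spark")); // 5
    }
}
```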



Mayur Rustagi
Ph: +1 (760) 203 3257
http://www.sigmoidanalytics.com
@mayur_rustagi <https://twitter.com/mayur_rustagi>



On Wed, Jun 4, 2014 at 4:18 PM, nilmish <nilmish....@gmail.com> wrote:

> The error is resolved. I was using a comparator which was not serializable,
> which is why it was throwing the error.
>
> I have now switched to the Kryo serializer as it is faster than the Java serializer.
> I have set the required config
>
> conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer");
> conf.set("spark.kryo.registrator", "MyRegistrator");
>
> and also in MyRegistrator class I have registered all the classes I am
> serialising.
>
> How can I confirm that my code is actually using the Kryo serializer and not
> the Java serializer now?
>
> PS: It seems like my code is still not using the Kryo serializer.
>
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Error-related-to-serialisation-in-spark-streaming-tp6801p6904.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>