Re: debugging NotSerializableException while using Kryo

2013-12-24 Thread Ameet Kini
Hi Michael, I re-ran this on another machine which is on Spark's master branch 0.9.0-SNAPSHOT from Dec 14 (right after the Scala 2.10 branch was merged back into master) and recreated the NPE (shown towards the end of this message). I can't tell looking at the relevant code what may have caused the

Re: debugging NotSerializableException while using Kryo

2013-12-24 Thread Ameet Kini
If Java serialization is the only one that properly works for closures, then I shouldn't be setting spark.closure.serializer to org.apache.spark.serializer.KryoSerializer, and my only hope for getting lookup (and other such methods that still use closure serializers) to work is to either a) use
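
A minimal sketch of the setup being weighed here, in the System-property style used with Spark 0.8/0.9 (application and object names are placeholders): Kryo for data serialization, with spark.closure.serializer left at its Java default.

    import org.apache.spark.SparkContext

    object KryoDataOnly {
      def main(args: Array[String]): Unit = {
        // Use Kryo for RDD data (shuffles, caching)...
        System.setProperty("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
        // ...but leave spark.closure.serializer unset, so closures used by
        // map, filter, lookup, etc. keep going through Java serialization.
        val sc = new SparkContext("local", "kryo-data-only")
        val doubled = sc.parallelize(1 to 10).map(_ * 2)
        println(doubled.collect().mkString(", "))
        sc.stop()
      }
    }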

Re: debugging NotSerializableException while using Kryo

2013-12-24 Thread Eugen Cepoi
In Scala, case classes are serializable by default, so your TileIdWritable should be a case class. I usually enable Kryo serialization for objects and keep the default serializer for closures; this works pretty well. Eugen 2013/12/24 Ameet Kini ameetk...@gmail.com If Java serialization is the only one that properly
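
A sketch of Eugen's suggestion. The real fields of TileIdWritable are not shown in the thread, so a single Long id is assumed here; a case class is Serializable for the Java closure serializer, and the same class can still be registered with Kryo for data serialization.

    import com.esotericsoftware.kryo.Kryo
    import org.apache.spark.serializer.KryoRegistrator

    // Case classes extend Serializable, so the default (Java) closure
    // serializer can handle them; the field is a stand-in for the real ones.
    case class TileIdWritable(id: Long)

    // Register the same class with Kryo for data serialization
    // (wired up via the spark.kryo.registrator property).
    class MyRegistrator extends KryoRegistrator {
      override def registerClasses(kryo: Kryo): Unit = {
        kryo.register(classOf[TileIdWritable])
      }
    }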

Re: debugging NotSerializableException while using Kryo

2013-12-24 Thread Dmitriy Lyubimov
On Tue, Dec 24, 2013 at 7:29 AM, Ameet Kini ameetk...@gmail.com wrote: If Java serialization is the only one that properly works for closures, then I shouldn't be setting spark.closure.serializer to org.apache.spark.serializer.KryoSerializer, My understanding is that it's not that Kryo

Re: debugging NotSerializableException while using Kryo

2013-12-23 Thread Ameet Kini
Thanks Imran. I tried setting spark.closure.serializer to org.apache.spark.serializer.KryoSerializer and now end up seeing a NullPointerException when the executor starts up. This is a snippet of the executor's log. Notice how "registered TileIdWritable" and "registered ArgWritable" show up, so I see
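
For reference, the configuration described above, expressed as Spark 0.8/0.9-style System properties (the registrator class name is a placeholder for whatever registers TileIdWritable and ArgWritable):

    // Data serializer: Kryo
    System.setProperty("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
    // Closure serializer switched to Kryo as well -- the step that preceded the NPE
    System.setProperty("spark.closure.serializer", "org.apache.spark.serializer.KryoSerializer")
    // Registrator that registers TileIdWritable and ArgWritable (placeholder name)
    System.setProperty("spark.kryo.registrator", "myapp.MyRegistrator")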

Re: debugging NotSerializableException while using Kryo

2013-12-23 Thread Jie Deng
Maybe try making your class implement Serializable... 2013/12/23 Ameet Kini ameetk...@gmail.com Thanks Imran. I tried setting spark.closure.serializer to org.apache.spark.serializer.KryoSerializer and now end up seeing NullPointerException when the executor starts up. This is a snippet of
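
Jie's suggestion in code form, for the case where the class cannot simply become a case class (the id field is a placeholder for the real contents):

    // Marking the class Serializable lets the default Java closure serializer
    // handle instances that get pulled into a closure.
    class TileIdWritable(val id: Long) extends Serializable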

Re: debugging NotSerializableException while using Kryo

2013-12-23 Thread Ameet Kini
Using Java serialization would make the NPE go away, but it would be a less preferable solution. My application is network-intensive, and serialization cost is significant. In other words, these objects are ideal candidates for Kryo. On Mon, Dec 23, 2013 at 3:41 PM, Jie Deng

Re: debugging NotSerializableException while using Kryo

2013-12-23 Thread Michael (Bach) Bui
What Spark version are you using? By looking at the code at Executor.scala line 195, you will at least know what caused the NPE. We can start from there. On Dec 23, 2013, at 10:21 AM, Ameet Kini ameetk...@gmail.com wrote: Thanks Imran. I tried setting spark.closure.serializer to

Re: debugging NotSerializableException while using Kryo

2013-12-20 Thread Imran Rashid
There is a separate setting for serializing closures, spark.closure.serializer (listed here: http://spark.incubator.apache.org/docs/latest/configuration.html), that is used to serialize whatever is used by all the functions on an RDD, e.g., map, filter, and lookup. Those closures include referenced
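
A small sketch of that point: the key handed to lookup is referenced by the closure Spark ships to executors, so it travels through whatever spark.closure.serializer is configured to (the TileId class and values below are made up for illustration).

    import org.apache.spark.SparkContext
    import org.apache.spark.SparkContext._   // brings in lookup() for pair RDDs

    object ClosureCapture {
      // Hypothetical key type; a case class is Serializable, so the default
      // Java closure serializer can ship it without trouble.
      case class TileId(col: Int, row: Int)

      def main(args: Array[String]): Unit = {
        val sc = new SparkContext("local", "closure-capture")
        val pairs = sc.parallelize(Seq(TileId(0, 0) -> "a", TileId(0, 1) -> "b"))
        val key = TileId(0, 1)
        // The closure built inside lookup() references key, so key goes through
        // spark.closure.serializer; a key class that serializer cannot handle is
        // where a NotSerializableException would surface.
        println(pairs.lookup(key))   // prints the value(s) stored under key
        sc.stop()
      }
    }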