Awesome. Thanks so much for sharing the solution! It's much appreciated.
On Apr 29, 2015 6:14 AM, "madhvi" wrote:
Hi Josh,
Classes that do not implement Serializable are serialized automatically by
Kryo. It is enabled as follows when building the SparkConf object:
val conf = new SparkConf()
  .setAppName(detail)
  .set("spark.serializer",
    "org.apache.spark.serializer.KryoSerializer")
  .set("spark.dr
Thanks for the report back, Vaibhav.
To clarify, did you have to write code to do the serialization with Kryo,
or was this something that Kryo could do automatically?
If you could expand on the steps you took, that'd be awesome, as I assume
it will be extremely helpful in the mail archives for others.
Hi Josh,
We solved it by using the Kryo serializer library to serialize the Key class.
Thanks
vaibhav
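If custom code was involved, one possible shape is a KryoRegistrator that
registers the Accumulo classes. The registrator class name here is made up
for illustration, and the follow-up earlier in the thread shows a
configuration-only variant:

import com.esotericsoftware.kryo.Kryo
import org.apache.accumulo.core.data.{Key, Value}
import org.apache.spark.serializer.KryoRegistrator

// Hypothetical registrator that tells Kryo about the Accumulo types.
class AccumuloRegistrator extends KryoRegistrator {
  override def registerClasses(kryo: Kryo): Unit = {
    kryo.register(classOf[Key])
    kryo.register(classOf[Value])
  }
}

// Wired up via configuration:
// conf.set("spark.kryo.registrator", classOf[AccumuloRegistrator].getName)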
On 28-Apr-2015 11:14 pm, "Josh Elser" wrote:
Hi Madhvi,
Thanks for posting this. I'm not super familiar, but my hunch is that
Spark requires objects that it works with to implement the Java
Serializable interface.
Accumulo deals with Key (and Value) through Hadoop's Writable interface
(technically WritableComparable, but that still stems from Writable).
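If Kryo isn't an option, another workaround is to map the Writable types to
plain Serializable values right after reading, before any shuffle or
collect. A minimal sketch, assuming an RDD of (Key, Value) pairs (the
helper name and the choice of extracted fields are illustrative):

import org.apache.accumulo.core.data.{Key, Value}
import org.apache.spark.rdd.RDD

// Convert each (Key, Value) pair to Serializable types. Key.getRow
// returns only the row portion, so extract whatever fields you need.
def toSerializable(rdd: RDD[(Key, Value)]): RDD[(String, Array[Byte])] =
  rdd.map { case (key, value) =>
    (key.getRow.toString, value.get)
  }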
Hi,
While connecting to Accumulo from Spark by creating a Spark RDD, I am
getting the following error:
object not serializable (class: org.apache.accumulo.core.data.Key)
This is due to Accumulo's Key class, which does not implement the
Serializable interface. How can this be solved so that Accumulo data can
still be accessed through Spark?
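For context, a minimal sketch of the kind of code that triggers this error
(the app name is illustrative, and the AccumuloInputFormat connector and
table configuration are omitted):

import org.apache.accumulo.core.client.mapreduce.AccumuloInputFormat
import org.apache.accumulo.core.data.{Key, Value}
import org.apache.hadoop.mapreduce.Job
import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(new SparkConf().setAppName("accumulo-read"))
val job = Job.getInstance()
// AccumuloInputFormat connector/table configuration omitted here.
val rdd = sc.newAPIHadoopRDD(
  job.getConfiguration,
  classOf[AccumuloInputFormat],
  classOf[Key],
  classOf[Value])

// With the default Java serializer, any action that ships records off
// the executors, e.g. rdd.collect(), fails with
// "object not serializable (class: org.apache.accumulo.core.data.Key)".
rdd.collect()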