Re: Kryo serialization failed: Buffer overflow : Broadcast Join

2018-02-02 Thread Pralabh Kumar
I am using spark 2.1.0 On Fri, Feb 2, 2018 at 5:08 PM, Pralabh Kumar wrote: > Hi > > I am performing a broadcast join where my small table is 1 GB. I am > getting the following error. > > I am using > > > org.apache.spark.SparkException: > . Available: 0, required: 28869232. To avoid this, increase
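The "Available: 0, required: 28869232" message means the Kryo serialization buffer ran out of room; the truncated error goes on to name the property to raise. A minimal config sketch (the values are illustrative, not prescriptive — size them to your largest serialized record):

```
# initial per-core buffer (small is fine; it grows on demand)
spark.kryoserializer.buffer=64k
# hard ceiling the buffer may grow to; must exceed the "required" size
spark.kryoserializer.buffer.max=1g
```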

Re: Kryo not registered class

2017-11-20 Thread Vadim Semenov
Try: Class.forName("[Lorg.apache.spark.sql.execution.datasources.PartitioningAwareFileIndex$SerializableFileStatus$SerializableBlockLocation;") On Sun, Nov 19, 2017 at 3:24 PM, Angel Francisco Orta < angel.francisco.o...@gmail.com> wrote: > Hello, I'm with spark 2.1.0 with scala and I'm register
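The `Class.forName` trick above works because the JVM names an array of class `Foo` as `[Lpackage.Foo;`, with nested classes joined by `$`. A self-contained sketch of the same pattern using a JDK nested interface (`java.util.Map.Entry`) as a stand-in for the Spark-internal class:

```java
public class ArrayClassLookup {
    public static void main(String[] args) throws ClassNotFoundException {
        // Outer and nested names are joined by '$', and the array form
        // is wrapped in "[L...;" — exactly the spelling passed above for
        // PartitioningAwareFileIndex$SerializableFileStatus$... classes.
        Class<?> entryArray = Class.forName("[Ljava.util.Map$Entry;");
        System.out.println(entryArray == java.util.Map.Entry[].class); // true
    }
}
```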

Re: Kryo On Spark 1.6.0

2017-01-14 Thread Yan Facai
For Scala, you could fix it by using: conf.registerKryoClasses(Array(Class.forName("scala.collection.mutable.WrappedArray$ofRef"))) By the way, if the class is an array of a primitive Java type, say byte[], then use: Class.forName("[B") If it is an array of another class, say scala.collection.muta
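The JVM's internal array names mentioned above can be checked directly; this sketch is plain JDK code, independent of Spark or Kryo:

```java
public class PrimitiveArrayNames {
    public static void main(String[] args) throws ClassNotFoundException {
        // "[B" is byte[], "[I" is int[], "[J" is long[]
        System.out.println(Class.forName("[B") == byte[].class); // true
        System.out.println(Class.forName("[I") == int[].class);  // true
        // Object arrays use the "[L...;" form
        System.out.println(
            Class.forName("[Ljava.lang.String;") == String[].class); // true
    }
}
```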

RE: Kryo On Spark 1.6.0 [Solution in this email]

2017-01-11 Thread Enrico DUrso
January 2017 15:12 To: Enrico DUrso Cc: user@spark.apache.org Subject: Re: Kryo On Spark 1.6.0 If you don’t mind, could you please share the Scala solution? I tried to use Kryo but it seemed not to work at all. I hope to get a practical example. Thanks. On 10 January 2017, at 19:10, Enrico DUrso

Re: Kryo On Spark 1.6.0

2017-01-10 Thread Yang Cao
If you don’t mind, could you please share the Scala solution? I tried to use Kryo but it seemed not to work at all. I hope to get a practical example. Thanks. > On 10 January 2017, at 19:10, Enrico DUrso wrote: > > Hi, > > I am trying to use Kryo on Spark 1.6.0. > I am able to register my own classes a

RE: Kryo On Spark 1.6.0

2017-01-10 Thread Enrico DUrso
according to how Spark works. How can I register all those classes? cheers, From: Richard Startin [mailto:richardstar...@outlook.com] Sent: 10 January 2017 11:18 To: Enrico DUrso; user@spark.apache.org Subject: Re: Kryo On Spark 1.6.0 Hi Enrico, Only set spark.kryo.registrationRequired if you want

Re: Kryo On Spark 1.6.0

2017-01-10 Thread Richard Startin
Hi Enrico, Only set spark.kryo.registrationRequired if you want to forbid any classes you have not explicitly registered - see http://spark.apache.org/docs/latest/configuration.html.

Re: Kryo serializer slower than Java serializer for Spark Streaming

2016-10-06 Thread Rajkiran Rajkumar
Oops, realized that I didn't reply to all. Pasting the snippet again. Hi Sean, Thanks for the reply. I've already forced registration of classes with the Kryo serializer; the observation is from that scenario. To give a sense of the data, they are records which are serialized using thrift and re

Re: Kryo serializer slower than Java serializer for Spark Streaming

2016-10-06 Thread Sean Owen
It depends a lot on your data. If it's a lot of custom types then Kryo doesn't have a lot of advantage, although you want to make sure to register all your classes with Kryo (and consider setting the flag that requires Kryo registration to ensure it), because that can let Kryo avoid writing a bunch

Re: Kryo ClassCastException during Serialization/deserialization in Spark Streaming

2016-06-23 Thread swetha kasireddy
sampleMap is populated from inside a method that is getting called from updateStateByKey On Thu, Jun 23, 2016 at 1:13 PM, Ted Yu wrote: > Can you illustrate how sampleMap is populated ? > > Thanks > > On Thu, Jun 23, 2016 at 12:34 PM, SRK wrote: > >> Hi, >> >> I keep getting the following error

Re: Kryo ClassCastException during Serialization/deserialization in Spark Streaming

2016-06-23 Thread Ted Yu
Can you illustrate how sampleMap is populated ? Thanks On Thu, Jun 23, 2016 at 12:34 PM, SRK wrote: > Hi, > > I keep getting the following error in my Spark Streaming every now and then > after the job runs for say around 10 hours. I have those 2 classes > registered in kryo as shown below. s

Re: kryo

2016-05-12 Thread Ted Yu
e.DateTimeZone.convertUTCToLocal(DateTimeZone.java:925) > > > > > > Any ideas? > > > > Thanks > > > > > > *From:* Ted Yu [mailto:yuzhih...@gmail.com] > *Sent:* May-11-16 5:32 PM > *To:* Younes Naguib > *Cc:* user@spark.apache.org > *Subject:* Re: kr

RE: kryo

2016-05-12 Thread Younes Naguib
org.joda.time.DateTimeZone.convertUTCToLocal(DateTimeZone.java:925) Any ideas? Thanks From: Ted Yu [mailto:yuzhih...@gmail.com] Sent: May-11-16 5:32 PM To: Younes Naguib Cc: user@spark.apache.org Subject: Re: kryo Have you seen this thread ? http://search-hadoop.com/m/q3RTtpO0qI3cp06/JodaDateTimeSerializer+spark

Re: kryo

2016-05-11 Thread Ted Yu
Have you seen this thread ? http://search-hadoop.com/m/q3RTtpO0qI3cp06/JodaDateTimeSerializer+spark&subj=Re+NPE+when+using+Joda+DateTime On Wed, May 11, 2016 at 2:18 PM, Younes Naguib < younes.nag...@tritondigital.com> wrote: > Hi all, > > I'm trying to get to use spark.serializer. > I set it in

Re: Kryo serialization mismatch in spark sql windowing function

2016-04-06 Thread Soam Acharya
Hi Josh, Appreciate the response! Also, Steve - we meet again :) At any rate, here's the output (a lot of it anyway) of running spark-sql with the verbose option so that you can get a sense of the settings and the classpath. Does anything stand out? Using properties file: /opt/spark/conf/spark-de

Re: Kryo serialization mismatch in spark sql windowing function

2016-04-06 Thread Josh Rosen
Spark is compiled against a custom fork of Hive 1.2.1 which added shading of Protobuf and removed shading of Kryo. What I think is happening here is that stock Hive 1.2.1 is taking precedence, so the Kryo instance that it's returning is an instance of the shaded/relocated Hive version rather th

Re: Kryo serializer Exception during serialization: java.io.IOException: java.lang.IllegalArgumentException:

2016-01-08 Thread Shixiong(Ryan) Zhu
Could you disable `spark.kryo.registrationRequired`? Some classes may not be registered but they work well with Kryo's default serializer. On Fri, Jan 8, 2016 at 8:58 AM, Ted Yu wrote: > bq. try adding scala.collection.mutable.WrappedArray > > But the hint said registering > scala.collection.mu
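The suggestion above corresponds to the following properties (a sketch — with registration not required, unregistered classes fall back to Kryo's default field serializer at the cost of writing full class names):

```
spark.serializer=org.apache.spark.serializer.KryoSerializer
spark.kryo.registrationRequired=false
```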

Re: Kryo serializer Exception during serialization: java.io.IOException: java.lang.IllegalArgumentException:

2016-01-08 Thread Ted Yu
bq. try adding scala.collection.mutable.WrappedArray But the hint said registering scala.collection.mutable.WrappedArray$ofRef.class , right ? On Fri, Jan 8, 2016 at 8:52 AM, jiml wrote: > (point of post is to see if anyone has ideas about errors at end of post) > > In addition, the real way to

Re: Kryo serializer Exception during serialization: java.io.IOException: java.lang.IllegalArgumentException:

2016-01-08 Thread jiml
(point of post is to see if anyone has ideas about errors at end of post) In addition, the real way to test if it's working is to force serialization. In Java: create an array of all your classes: // for the Kryo serializer it wants to register all classes that need to be serialized Class[] kryoClassAr
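The "force serialization" idea above — round-tripping an object up front instead of waiting for a running job to fail — can be sketched with stdlib Java serialization. Kryo's API differs (`Kryo.writeObject`/`readObject` against `Output`/`Input`), but the test shape is the same:

```java
import java.io.*;

public class RoundTrip {
    // Serialize then deserialize, returning the copy; any non-serializable
    // field fails here rather than deep inside a running job.
    static Object roundTrip(Serializable obj)
            throws IOException, ClassNotFoundException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bos)) {
            out.writeObject(obj);
        }
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bos.toByteArray()))) {
            return in.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(roundTrip("hello")); // hello
    }
}
```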

Re: Kryo serialization fails when using SparkSQL and HiveContext

2015-12-14 Thread Michael Armbrust
You'll need to either turn off registration (spark.kryo.registrationRequired) or create a custom registrator (spark.kryo.registrator). http://spark.apache.org/docs/latest/configuration.html#compression-and-serialization On Mon, Dec 14, 2015 at 2:17 AM, Linh M. Tran wrote: > Hi everyone, > I'm using H
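The two options above as properties (the registrator class name is a hypothetical example — it would be your own subclass of `org.apache.spark.serializer.KryoRegistrator`):

```
# Option 1: allow unregistered classes
spark.kryo.registrationRequired=false

# Option 2: keep registration required and supply a registrator
spark.kryo.registrator=com.example.MyKryoRegistrator
```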

Re: Kryo Serialization in Spark

2015-12-10 Thread manasdebashiskar
Are you sure you are using Kryo serialization? You are getting a Java serialization error. Are you setting up your SparkContext with Kryo serialization enabled? -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Kryo-Serialization-in-Spark-tp25628p25678.html
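The point above is that Kryo is opt-in: Spark's default is `org.apache.spark.serializer.JavaSerializer`, so a Java serialization stack trace usually means this single property was never set. A minimal sketch:

```
spark.serializer=org.apache.spark.serializer.KryoSerializer
```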

Re: Kryo Serializer on Worker doesn't work by default.

2015-07-08 Thread Eugene Morozov
What I don’t seem to get is how my code ends up on the Worker node. My understanding was that the jar file which I use to start the job should automatically be copied to Worker nodes and added to the classpath. That seems not to be the case. But if my jar is not copied to Worker nodes, then how

Re: Kryo fails to serialise output

2015-07-03 Thread Will Briggs
Kryo serialization is used internally by Spark for spilling or shuffling intermediate results, not for writing out an RDD as an action. Look at Sandy Ryza's examples for some hints on how to do this: https://github.com/sryza/simplesparkavroapp Regards, Will On July 3, 2015, at 2:45 AM, Dominik

Re: Kryo serialization of classes in additional jars

2015-06-26 Thread patcharee
Hi, I am having this problem on Spark 1.4. Do you have any idea how to solve it? I tried to use spark.executor.extraClassPath, but it did not help. BR, Patcharee On 04. mai 2015 23:47, Imran Rashid wrote: Oh, this seems like a real pain. You should file a jira, I didn't see an open issue --

Re: Kryo serialization of classes in additional jars

2015-05-13 Thread Akshat Aranya
I cherry-picked this commit into my local 1.2 branch. It fixed the problem with setting spark.serializer, but I get a similar problem with spark.closure.serializer org.apache.spark.SparkException: Failed to register classes with Kryo at org.apache.spark.serializer.KryoSerializer.newKryo(Kry

Re: Kryo serialization of classes in additional jars

2015-05-04 Thread Akshat Aranya
Actually, after some digging, I did find a JIRA for it: SPARK-5470. The fix for this has gone into master, but it isn't in 1.2. On Mon, May 4, 2015 at 2:47 PM, Imran Rashid wrote: > Oh, this seems like a real pain. You should file a jira, I didn't see an > open issue -- if nothing else just to d

Re: Kryo serialization of classes in additional jars

2015-05-04 Thread Imran Rashid
Oh, this seems like a real pain. You should file a jira, I didn't see an open issue -- if nothing else just to document the issue. As you've noted, the problem is that the serializer is created immediately in the executors, right when the SparkEnv is created, but the other jars aren't downloaded

Re: Kryo exception : Encountered unregistered class ID: 13994

2015-04-13 Thread ๏̯͡๏
> Thank you for your answer. > > I’m already registering my classes as you’re suggesting… > > Regards > > De : tsingfu [via Apache Spark User List]

RE: Kryo exception : Encountered unregistered class ID: 13994

2015-04-13 Thread mehdisinger
Hello, Thank you for your answer. I'm already registering my classes as you're suggesting... Regards From: tsingfu [via Apache Spark User List] [mailto:ml-node+s1001560n22468...@n3.nabble.com] Sent: Monday, 13 April 2015 03:48 To: Mehdi Singer Subject: Re: Kryo exception : E

Re: Kryo exception : Encountered unregistered class ID: 13994

2015-04-09 Thread Guillaume Pitel
Hi, From my experience, those errors happen under very high memory pressure, and/or with machines with bad hardware (memory, network card,..) I have had a few of them, as well as Snappy uncompress errors, on a machine with a slightly failing memory stick. Given the large amount of data trans

Re: Kryo exception : Encountered unregistered class ID: 13994

2015-04-09 Thread Ted Yu
Is there custom class involved in your application ? I assume you have called sparkConf.registerKryoClasses() for such class(es). Cheers On Thu, Apr 9, 2015 at 7:15 AM, mehdisinger wrote: > Hi, > > I'm facing an issue when I try to run my Spark application. I keep getting > the following excep

Re: Kryo NPE with Array

2014-12-02 Thread Simone Franzini
I finally solved this issue. The problem was that: 1. I defined a case class with a Buffer[MyType] field. 2. I instantiated the class with the field set to the value given by an implicit conversion from a Java list, which is supposedly a Buffer. 3. However, the underlying type of that field was ins

RE: Kryo exception for CassandraSQLRow

2014-12-01 Thread Ashic Mahtab
Don't know if this'll solve it, but if you're on Spark 1.1, the Cassandra Connector version 1.1.0 final fixed the guava back compat issue. Maybe taking the guava exclusions might help? Date: Mon, 1 Dec 2014 10:48:25 +0100 Subject: Kryo exception for CassandraSQLRow From: shahab.mok...@gmail.com

Re: Kryo NPE with Array

2014-11-26 Thread Simone Franzini
I guess I already have the answer of what I have to do here, which is to configure the kryo object with the strategy as above. Now the question becomes: how can I pass this custom kryo configuration to the spark kryo serializer / kryo registrator? I've had a look at the code but I am still fairly n

Re: Kryo UnsupportedOperationException

2014-09-25 Thread Ian O'Connell
I would guess the field serializer is having issues reconstructing the class; it's pretty much best effort. Is this an intermediate type? On Thu, Sep 25, 2014 at 2:12 PM, Sandy Ryza wrote: > We're running into an error (below) when trying to read spilled shuffle > data back in.

Re: Kryo fails with avro having Arrays and unions, but succeeds with simple avro.

2014-09-19 Thread Frank Austin Nothaft
Hi Mohan, It’s a bit convoluted to follow in their source, but they essentially typedef KSerializer as being a KryoSerializer, and then their serializers all extend KSerializer. Spark should identify them properly as Kryo Serializers, but I haven’t tried it myself. Regards, Frank Austin Notha

Re: Kryo fails with avro having Arrays and unions, but succeeds with simple avro.

2014-09-19 Thread mohan.gadm
Thanks for the info, Frank. Twitter's chill-avro serializer looks great. But how does Spark identify it as a serializer, as it's not extending KryoSerializer? (Sorry, Scala is an alien language for me.) - Thanks & Regards, Mohan -- View this message in context: http://apache-spark-user-list.

Re: Kryo fails with avro having Arrays and unions, but succeeds with simple avro.

2014-09-18 Thread Frank Austin Nothaft
Mohan, You’ll need to register it; we register our serializer in lines 69 to 76 in https://github.com/bigdatagenomics/adam/blob/master/adam-core/src/main/scala/org/bdgenomics/adam/serialization/ADAMKryoRegistrator.scala. Our serializer implementation falls back on the default Avro serializer; yo

Re: Kryo fails with avro having Arrays and unions, but succeeds with simple avro.

2014-09-18 Thread mohan.gadm
Thanks for the info, Frank. So your suggestion is to use the Avro serializer; do I just have to configure it like Kryo for the same property? And is there any registration process for this, or do I just specify the serializer? Also, does it affect performance, and what measures should be taken to avoid that? (I'm using Kryo

Re: Kryo fails with avro having Arrays and unions, but succeeds with simple avro.

2014-09-18 Thread Frank Austin Nothaft
Hi Mohan, It’s been a while since I’ve looked at this specifically, but I don’t think the default Kryo serializer will properly serialize Avro. IIRC, there are complications around the way that Avro handles nullable fields, which would be consistent with the NPE you’re encountering here. That’s

Re: Kryo fails with avro having Arrays and unions, but succeeds with simple avro.

2014-09-18 Thread mohan.gadm
Hi Frank, thanks for the info, that's great. But I'm not saying the Avro serializer is failing. Kryo is failing, but I'm using the Kryo serializer and registering Avro-generated classes with Kryo. sparkConf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer"); sparkConf.set("spark.kr

Re: Kryo fails with avro having Arrays and unions, but succeeds with simple avro.

2014-09-18 Thread Frank Austin Nothaft
Mohan, I don’t think this is a Spark issue, rather, I think the issue is coming from your serializer. Can you point us to the serializer that you are using? We have no problems serializing complex Avro (nested schemas with unions and arrays) when using this serializer. You may also want to look

Re: Kryo fails with avro having Arrays and unions, but succeeds with simple avro.

2014-09-18 Thread mohan.gadm
Added some more info on this issue in the tracker Spark-3447 https://issues.apache.org/jira/browse/SPARK-3447 - Thanks & Regards, Mohan -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Kryo-fails-with-avro-having-Arrays-and-unions-but-succeeds-with-simple

Re: Kryo Issue on Spark 1.0.1, Mesos 0.18.2

2014-07-25 Thread Gary Malouf
Maybe this is me misunderstanding the Spark system property behavior, but I'm not clear why the class being loaded ends up having '/' rather than '.' in its fully qualified name. When I tested this out locally, the '/' were preventing the class from being loaded. On Fri, Jul 25, 2014 at 2:27 PM

RE: Kryo is slower, and the size saving is minimal

2014-07-09 Thread innowireless TaeYun Kim
ginal Message- From: wxhsdp [mailto:wxh...@gmail.com] Sent: Wednesday, July 09, 2014 5:47 PM To: u...@spark.incubator.apache.org Subject: Re: Kryo is slower, and the size saving is minimal I'm not familiar with Kryo and my opinion may not be right. In my case, Kryo only saves about 5% of th

Re: Kryo is slower, and the size saving is minimal

2014-07-09 Thread wxhsdp
I'm not familiar with Kryo and my opinion may not be right. In my case, Kryo only saves about 5% of the original size when dealing with primitive types such as arrays. I'm not sure whether this is the common case. -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.

Re: Kryo serialization does not compress

2014-03-07 Thread pradeeps8
Hi Patrick, Thanks for your reply. I am guessing even an array type will be registered automatically. Is this correct? Thanks, Pradeep -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Kryo-serialization-does-not-compress-tp2042p2400.html Sent from the Apac

Re: Kryo serialization does not compress

2014-03-06 Thread Patrick Wendell
Hey There, This is interesting... thanks for sharing this. If you are storing in MEMORY_ONLY then you are just directly storing Java objects in the JVM. So they can't be compressed, because they aren't really stored in a known format; it's just left up to the JVM. To answer your other question, it's

Re: Kryo serialization does not compress

2014-03-06 Thread pradeeps8
We are trying to use kryo serialization, but with kryo serialization ON the memory consumption does not change. We have tried this on multiple sets of data. We have also checked the logs of Kryo serialization and have confirmed that Kryo is being used. Can somebody please help us with this? The s

Re: Kryo Registration, class is not registered, but Log.TRACE() says otherwise

2014-02-28 Thread pondwater
Has no one ever registered generic classes in scala? Is it possible? -- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Kryo-Registration-class-is-not-registered-but-Log-TRACE-says-otherwise-tp2077p2182.html Sent from the Apache Spark User List mailing list arc