[jira] [Commented] (SPARK-24302) when using spark persist(),"KryoException:IndexOutOfBoundsException" happens
[ https://issues.apache.org/jira/browse/SPARK-24302?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16482330#comment-16482330 ]

yijukang commented on SPARK-24302:
----------------------------------

[~hyukjin.kwon] OK, thanks.

> when using spark persist(),"KryoException:IndexOutOfBoundsException" happens
>
>                 Key: SPARK-24302
>                 URL: https://issues.apache.org/jira/browse/SPARK-24302
>             Project: Spark
>          Issue Type: Bug
>          Components: Input/Output
>    Affects Versions: 1.6.0
>            Reporter: yijukang
>            Priority: Major
>              Labels: apache-spark
>
> I am using Spark to insert RDD data into HBase like this:
>
> {code}
> localData.persist()
> localData.saveAsNewAPIHadoopDataset(jobConf.getConfiguration)
> {code}
>
> This throws the following exception:
>
> {noformat}
> com.esotericsoftware.kryo.KryoException: java.lang.IndexOutOfBoundsException: Index: 99, Size: 6
> Serialization trace:
> familyMap (org.apache.hadoop.hbase.client.Put)
>     at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:221)
>     at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:729)
>     at com.twitter.chill.Tuple2Serializer.read(TupleSerializers.scala:42)
>     at com.twitter.chill.Tuple2Serializer.read(TupleSerializers.scala:33)
>     at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:729)
> {noformat}
>
> When I run only:
>
> {code}
> localData.saveAsNewAPIHadoopDataset(jobConf.getConfiguration)
> {code}
>
> it works well. What does the persist() method do?

--
This message was sent by Atlassian JIRA (v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Updated] (SPARK-24302) when using spark persist(),"KryoException:IndexOutOfBoundsException" happens
[ https://issues.apache.org/jira/browse/SPARK-24302?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

yijukang updated SPARK-24302:
-----------------------------
    Labels: apache-spark  (was: )
[jira] [Updated] (SPARK-24302) when using spark persist(),"KryoException:IndexOutOfBoundsException" happens
[ https://issues.apache.org/jira/browse/SPARK-24302?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

yijukang updated SPARK-24302:
-----------------------------
    Description:

I am using Spark to insert RDD data into HBase like this:

{code}
localData.persist()
localData.saveAsNewAPIHadoopDataset(jobConf.getConfiguration)
{code}

This throws the following exception:

{noformat}
com.esotericsoftware.kryo.KryoException: java.lang.IndexOutOfBoundsException: Index: 99, Size: 6
Serialization trace:
familyMap (org.apache.hadoop.hbase.client.Put)
    at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:221)
    at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:729)
    at com.twitter.chill.Tuple2Serializer.read(TupleSerializers.scala:42)
    at com.twitter.chill.Tuple2Serializer.read(TupleSerializers.scala:33)
    at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:729)
{noformat}

When I run only:

{code}
localData.saveAsNewAPIHadoopDataset(jobConf.getConfiguration)
{code}

it works well. What does the persist() method do?

  was: (the same description, except the stack trace lacked the FieldSerializer.read frame)
[jira] [Updated] (SPARK-24302) when using spark persist(),"KryoException:IndexOutOfBoundsException" happens
[ https://issues.apache.org/jira/browse/SPARK-24302?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

yijukang updated SPARK-24302:
-----------------------------
    Description:

I am using Spark to insert RDD data into HBase like this:

{code}
localData.persist()
localData.saveAsNewAPIHadoopDataset(jobConf.getConfiguration)
{code}

This throws the following exception:

{noformat}
com.esotericsoftware.kryo.KryoException: java.lang.IndexOutOfBoundsException: Index: 99, Size: 6
Serialization trace:
familyMap (org.apache.hadoop.hbase.client.Put)
    at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:729)
    at com.twitter.chill.Tuple2Serializer.read(TupleSerializers.scala:42)
    at com.twitter.chill.Tuple2Serializer.read(TupleSerializers.scala:33)
    at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:729)
{noformat}

When I run only:

{code}
localData.saveAsNewAPIHadoopDataset(jobConf.getConfiguration)
{code}

it works well. What does the persist() method do?

  was: (the same description, except the exception was attached as a screenshot, !image-2018-05-17-10-24-00-392.png!, rather than pasted as text)
[jira] [Created] (SPARK-24302) when using spark persist(),"KryoException:IndexOutOfBoundsException" happens
yijukang created SPARK-24302:
--------------------------------

             Summary: when using spark persist(),"KryoException:IndexOutOfBoundsException" happens
                 Key: SPARK-24302
                 URL: https://issues.apache.org/jira/browse/SPARK-24302
             Project: Spark
          Issue Type: Bug
          Components: Input/Output
    Affects Versions: 1.6.0
            Reporter: yijukang

I am using Spark to insert RDD data into HBase like this:

{code}
localData.persist()
localData.saveAsNewAPIHadoopDataset(jobConf.getConfiguration)
{code}

This throws the exception shown in the attached screenshot:

!image-2018-05-17-10-24-00-392.png!

When I run only:

{code}
localData.saveAsNewAPIHadoopDataset(jobConf.getConfiguration)
{code}

it works well. What does the persist() method do?
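A note on what the stack trace suggests (this is commentary, not part of the ticket): persist() caches the RDD, and when the cached blocks are serialized (e.g. a SER storage level, or spilling with spark.serializer set to Kryo), every element must round-trip through Kryo. The trace shows the failure inside the familyMap field of org.apache.hadoop.hbase.client.Put, a class that Kryo's default FieldSerializer does not handle reliably. A common pattern is to persist only plain, Kryo-friendly records and build the Put objects in the final map before the Hadoop write, so no Put is ever cached. The sketch below assumes hypothetical names (rows, writeToHBase, column family "cf") and a jobConf prepared as in the ticket:

```scala
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.hbase.client.Put
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.util.Bytes
import org.apache.spark.rdd.RDD
import org.apache.spark.storage.StorageLevel

// Sketch under the assumptions above: persist the plain (rowKey, value)
// pairs, which Kryo serializes without trouble, and create the Put
// objects only inside the last transformation before the write.
def writeToHBase(rows: RDD[(String, String)], hadoopConf: Configuration): Unit = {
  // Cache the simple tuples, not the HBase Put objects.
  rows.persist(StorageLevel.MEMORY_AND_DISK)

  val puts = rows.map { case (rowKey, value) =>
    val put = new Put(Bytes.toBytes(rowKey))
    put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("col"), Bytes.toBytes(value))
    // TableOutputFormat expects (ImmutableBytesWritable, Put) pairs.
    (new ImmutableBytesWritable(Bytes.toBytes(rowKey)), put)
  }
  puts.saveAsNewAPIHadoopDataset(hadoopConf)
}
```

This keeps the caching benefit of persist() while ensuring the Kryo-unfriendly Put instances exist only transiently during the save stage; alternatively, one can register a dedicated Kryo serializer for Put, but restructuring as above avoids the problem entirely.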