[ https://issues.apache.org/jira/browse/SPARK-24302?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-24302.
----------------------------------
    Resolution: Invalid

This sounds like a question, and it's not clear that it's a Spark issue. Let's 
ask on the dev mailing list and leave this resolved for now, until it's clear 
whether it's a bug in Spark.

BTW, 1.6.0 is too old.

> When using Spark persist(), "KryoException: IndexOutOfBoundsException" happens
> ------------------------------------------------------------------------------
>
>                 Key: SPARK-24302
>                 URL: https://issues.apache.org/jira/browse/SPARK-24302
>             Project: Spark
>          Issue Type: Bug
>          Components: Input/Output
>    Affects Versions: 1.6.0
>            Reporter: yijukang
>            Priority: Major
>              Labels: apache-spark
>
> My operation uses Spark to insert RDD data into HBase like this:
> ------------------------------
> localData.persist()
> localData.saveAsNewAPIHadoopDataset(jobConf.getConfiguration)
> ------------------------------
> This throws an exception:
>    com.esotericsoftware.kryo.KryoException: java.lang.IndexOutOfBoundsException: Index: 99, Size: 6
> Serialization trace:
>     familyMap (org.apache.hadoop.hbase.client.Put)
>    at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:221)
>    at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:729)
>    at com.twitter.chill.Tuple2Serializer.read(TupleSerializers.scala:42)
>    at com.twitter.chill.Tuple2Serializer.read(TupleSerializers.scala:33)
>    at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:729)
>  
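For context, here is a minimal, self-contained sketch (in Scala) of the kind of
pipeline described above. This is a hedged reconstruction, not the reporter's
actual code: the table name "my_table", the column family "cf", and the shape
of localData (an RDD[(ImmutableBytesWritable, Put)]) are all illustrative
assumptions.

--------------------------------------
import org.apache.hadoop.hbase.client.Put
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.TableOutputFormat
import org.apache.hadoop.hbase.util.Bytes
import org.apache.hadoop.mapreduce.Job
import org.apache.spark.{SparkConf, SparkContext}

object HBaseInsertSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("hbase-insert-sketch"))

    // Illustrative source data: (rowKey, value) pairs.
    val rows = sc.parallelize(Seq(("row1", "v1"), ("row2", "v2")))

    // Build the (ImmutableBytesWritable, Put) pairs that TableOutputFormat expects.
    val localData = rows.map { case (rowKey, value) =>
      val put = new Put(Bytes.toBytes(rowKey))
      put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("col"), Bytes.toBytes(value))
      (new ImmutableBytesWritable(Bytes.toBytes(rowKey)), put)
    }

    // Point the output format at the (hypothetical) target table.
    val job = Job.getInstance(sc.hadoopConfiguration)
    job.getConfiguration.set(TableOutputFormat.OUTPUT_TABLE, "my_table")
    job.setOutputFormatClass(classOf[TableOutputFormat[ImmutableBytesWritable]])

    // The failing combination from the report: cache the RDD of Put objects,
    // then write it out. Whenever cached blocks are stored or moved in
    // serialized form, Kryo has to round-trip Put, including its familyMap.
    localData.persist()
    localData.saveAsNewAPIHadoopDataset(job.getConfiguration)
  }
}
--------------------------------------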
> When I do this instead:
> ------------------------------
> localData.saveAsNewAPIHadoopDataset(jobConf.getConfiguration)
> ------------------------------
> it works well. What does the persist() method do here?
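Not an authoritative diagnosis, but: persist() marks the RDD to be cached, and
whenever the cached (key, Put) pairs are stored or read back in serialized
form, they go through Kryo; the trace above shows Kryo's FieldSerializer
failing while reading back Put's familyMap. Without persist(), the Put objects
are written straight to HBase and never serialized for the cache, which would
explain why that path works. One possible workaround, reusing the hypothetical
rows and job names from the sketch above, is to cache the plain source data
and build the Put objects only at write time:

--------------------------------------
// Cache the simple, Kryo-friendly source data instead of the Put objects.
val cached = rows.persist()

// Build Put objects only in the final transformation before the save, so
// they are never stored in (or read back from) the block cache.
val toWrite = cached.map { case (rowKey, value) =>
  val put = new Put(Bytes.toBytes(rowKey))
  put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("col"), Bytes.toBytes(value))
  (new ImmutableBytesWritable(Bytes.toBytes(rowKey)), put)
}

toWrite.saveAsNewAPIHadoopDataset(job.getConfiguration)
--------------------------------------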


