Does anyone have an idea about this error? It occurs when loading a Hive table; the error is as follows:

L4J [2015-12-24 13:54:39,354][INFO][org.apache.hadoop.mapreduce.Job] - Task
Id : attempt_1450856278246_0003_m_000000_0, Status : FAILED
Error: java.io.IOException: Deserialization error: invalid stream header:
926A2769
        at 
org.apache.hive.hcatalog.common.HCatUtil.deserialize(HCatUtil.java:120)
        at
org.apache.hive.hcatalog.mapreduce.HCatSplit.readFields(HCatSplit.java:132)
        at
org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:71)
        at
org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:42)
        at org.apache.hadoop.mapred.MapTask.getSplitDetails(MapTask.java:372)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:754)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.io.StreamCorruptedException: invalid stream header: 926A2769
        at 
java.io.ObjectInputStream.readStreamHeader(ObjectInputStream.java:804)
        at java.io.ObjectInputStream.<init>(ObjectInputStream.java:299)
        at 
org.apache.hive.hcatalog.common.HCatUtil.deserialize(HCatUtil.java:117)
        ... 11 more

L4J [2015-12-24 13:54:41,105][INFO][org.apache.hadoop.mapreduce.Job] -  map
0% reduce 0%
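For context on what the message means: java.io.ObjectInputStream throws StreamCorruptedException whenever the first bytes of the stream are not the Java serialization magic (0xAC 0xED 0x00 0x05), and it reports the offending bytes in hex. So the "926A2769" in the trace is simply the first four bytes HCatUtil.deserialize actually received instead of Java-serialized data. A minimal sketch reproducing the same message (the class and method names here are made up for illustration):

```java
import java.io.ByteArrayInputStream;
import java.io.ObjectInputStream;
import java.io.StreamCorruptedException;

public class StreamHeaderDemo {
    // Try to open an ObjectInputStream over arbitrary bytes and return the
    // error message when they lack the serialization magic 0xAC 0xED 0x00 0x05.
    static String headerError(byte[] data) throws Exception {
        try (ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(data))) {
            return "no error";
        } catch (StreamCorruptedException e) {
            return e.getMessage();
        }
    }

    public static void main(String[] args) throws Exception {
        // The same first four bytes reported in the stack trace above
        byte[] notSerialized = new byte[] {(byte) 0x92, 0x6A, 0x27, 0x69};
        System.out.println(headerError(notSerialized));
        // prints: invalid stream header: 926A2769
    }
}
```

Since the bytes are not random garbage but a consistent non-serialized payload, this usually points to the split info being produced and consumed by incompatible code (for example, mismatched HCatalog/Hive jar versions between the client and the task JVMs), rather than to corruption in transit.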

--
View this message in context: 
http://apache-kylin.74782.x6.nabble.com/encouter-Deserialization-error-when-load-hive-table-tp2888.html
Sent from the Apache Kylin mailing list archive at Nabble.com.