Interesting. Could you run $KYLIN_HOME/bin/find-hive-dependency.sh and
then copy the output here? If there is an entry like hive/lib/*, please
also copy the listing of that folder; that might help identify the
issue.
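In case it helps, something like the following would capture both pieces of information in one go (the hive lib paths below are only a guess at common CDH defaults; adjust them for your layout):

```shell
# Capture the dependency list Kylin computes for Hive.
$KYLIN_HOME/bin/find-hive-dependency.sh 2>&1 | tee hive-dependency.txt

# Inventory the candidate lib folders with checksums, so duplicate or
# conflicting hive-metastore jars stand out at a glance.
find /usr/lib/hive/lib /opt/cloudera -name 'hive-metastore*.jar' 2>/dev/null \
  | xargs -r md5sum
```

Two jars with the same name but different checksums in that listing would point straight at the conflict.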

On 5/28/15, 5:01 PM, "Mohit Bhatnagar" <[email protected]> wrote:

>Hi,
>
>We have two jars, hive-metastore-0.13.1-cdh5.3.3.jar and hive-metastore.jar.
>Both of them contain the org.apache.hadoop.hive.metastore.api.Table class,
>but their checksums are the same, i.e. they are the same version. Could
>there be any other issue?
>
>
>
>Thanks,
>Mohit 
>
>-----Original Message-----
>From: ShaoFeng Shi [mailto:[email protected]]
>Sent: Thursday, May 28, 2015 2:03 PM
>To: [email protected]
>Subject: Re: Issue while building CUBE: java.io.IOException:
>Deserialization error:
>
>This error indicates there may be jars of different versions in your
>environment. Since the class is packaged in hive-metastore-*.jar, please
>search your machine to see whether different versions are present, and
>remove the redundant one.
>
>2015-05-28 15:33 GMT+08:00 Mohit Bhatnagar <[email protected]>:
>
>> Hi,
>> Can someone help me out with this? I am getting the error below while
>> building the cube in stage 2. Here are the logs:
>>
>> 2015-05-27 19:04:58,108 FATAL [IPC Server handler 1 on 44765]
>> org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task:
>> attempt_1432706588203_2125_m_000000_0 - exited : java.io.IOException:
>> Deserialization error: org.apache.hadoop.hive.metastore.api.Table;
>> local class incompatible: stream classdesc serialVersionUID =
>> 398473631015277182, local class serialVersionUID = -946662244473213550
>>         at org.apache.hive.hcatalog.common.HCatUtil.deserialize(HCatUtil.java:117)
>>         at org.apache.hive.hcatalog.mapreduce.HCatSplit.readFields(HCatSplit.java:139)
>>         at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:71)
>>         at org.apache.hadoop.io.serializer.WritableSerialization$WritableDeserializer.deserialize(WritableSerialization.java:42)
>>         at org.apache.hadoop.mapred.MapTask.getSplitDetails(MapTask.java:372)
>>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:751)
>>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
>>         at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at javax.security.auth.Subject.doAs(Subject.java:415)
>>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)
>>         at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
>> Caused by: java.io.InvalidClassException:
>> org.apache.hadoop.hive.metastore.api.Table; local class incompatible:
>> stream classdesc serialVersionUID = 398473631015277182, local class
>> serialVersionUID = -946662244473213550
>>         at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:617)
>>         at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1622)
>>         at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
>>         at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
>>         at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>>         at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
>>         at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
>>         at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>>         at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>>         at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
>>         at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
>>         at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>>         at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>>         at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>>         at org.apache.hive.hcatalog.common.HCatUtil.deserialize(HCatUtil.java:115)
>>         ... 11 more
>>
>> 2015-05-27 19:04:58,108 INFO [IPC Server handler 1 on 44765]
>> org.apache.hadoop.mapred.TaskAttemptListenerImpl: Diagnostics report
>> from attempt_1432706588203_2125_m_000000_0: Error: java.io.IOException:
>> Deserialization error: org.apache.hadoop.hive.metastore.api.Table;
>> local class incompatible: stream classdesc serialVersionUID =
>> 398473631015277182, local class serialVersionUID = -946662244473213550
>>         [same stack trace as above]
>>
>> 2015-05-27 19:04:58,110 INFO [AsyncDispatcher event handler]
>> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
>> Diagnostics report from attempt_1432706588203_2125_m_000000_0: Error:
>> java.io.IOException: Deserialization error:
>> org.apache.hadoop.hive.metastore.api.Table; local class incompatible:
>> stream classdesc serialVersionUID = 398473631015277182, local class
>> serialVersionUID = -946662244473213550
>>         [same stack trace as above]
>>
>> 2015-05-27 19:04:58,112 INFO [AsyncDispatcher event handler]
>> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
>> attempt_1432706588203_2125_m_000000_0 TaskAttempt Transitioned from
>> RUNNING to FAIL_CONTAINER_CLEANUP
>> 2015-05-27 19:04:58,114 INFO [ContainerLauncher #1]
>> org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
>> Processing the event EventType: CONTAINER_REMOTE_CLEANUP for container
>> container_e38_1432706588203_2125_01_000002 taskAttempt
>> attempt_1432706588203_2125_m_000000_0
>> 2015-05-27 19:04:58,114 INFO [ContainerLauncher #1]
>> org.apache.hadoop.mapreduce.v2.app.launcher.ContainerLauncherImpl:
>> KILLING
>> attempt_1432706588203_2125_m_000000_0
>> 2015-05-27 19:04:58,115 INFO [ContainerLauncher #1]
>> org.apache.hadoop.yarn.client.api.impl.ContainerManagementProtocolProxy:
>> Opening proxy : sj1dra054.corp.adobe.com:8041
>> 2015-05-27 19:04:58,138 INFO [AsyncDispatcher event handler]
>> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
>> attempt_1432706588203_2125_m_000000_0 TaskAttempt Transitioned from
>> FAIL_CONTAINER_CLEANUP to FAIL_TASK_CLEANUP
>> 2015-05-27 19:04:58,139 INFO [CommitterEvent Processor #1]
>> org.apache.hadoop.mapreduce.v2.app.commit.CommitterEventHandler:
>> Processing the event EventType: TASK_ABORT
>> 2015-05-27 19:04:58,148 WARN [CommitterEvent Processor #1]
>> org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: Could not
>> delete hdfs://nameservice1/tmp/kylin-ffb0e3f9-4720-418a-a8d9-924f47fe1eea/mmmmmm/fact_distinct_columns/_temporary/1/_temporary/attempt_1432706588203_2125_m_000000_0
>> 2015-05-27 19:04:58,150 INFO [AsyncDispatcher event handler]
>> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
>> attempt_1432706588203_2125_m_000000_0 TaskAttempt Transitioned from
>> FAIL_TASK_CLEANUP to FAILED
>> 2015-05-27 19:04:58,161 INFO [AsyncDispatcher event handler]
>> org.apache.hadoop.yarn.util.RackResolver: Resolved
>> sj1dra054.corp.adobe.com to /sb
>> 2015-05-27 19:04:58,161 INFO [AsyncDispatcher event handler]
>> org.apache.hadoop.yarn.util.RackResolver: Resolved
>> sj1dra151.corp.adobe.com to /sa
>> 2015-05-27 19:04:58,161 INFO [Thread-51]
>> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor: 1 failures
>> on node sj1dra054.corp.adobe.com
>> 2015-05-27 19:04:58,164 INFO [AsyncDispatcher event handler]
>> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
>> attempt_1432706588203_2125_m_000000_1 TaskAttempt Transitioned from
>> NEW to UNASSIGNED
>> 2015-05-27 19:04:58,165 INFO [Thread-51]
>> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Added
>> attempt_1432706588203_2125_m_000000_1 to list of failed maps
>> 2015-05-27 19:04:58,365 INFO [RMCommunicator Allocator]
>> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Before
>> Scheduling: PendingReds:1 ScheduledMaps:1 ScheduledReds:0
>> AssignedMaps:1
>> AssignedReds:0 CompletedMaps:0 CompletedReds:0 ContAlloc:1 ContRel:0
>> HostLocal:1 RackLocal:0
>> 2015-05-27 19:04:58,368 INFO [RMCommunicator Allocator]
>> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerRequestor:
>> getResources() for application_1432706588203_2125: ask=1 release= 0
>> newContainers=0
>> finishedContainers=0 resourcelimit=<memory:321024, vCores:262>
>> knownNMs=10
>> 2015-05-27 19:04:58,368 INFO [RMCommunicator Allocator]
>> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator:
>> Recalculating schedule, headroom=<memory:321024, vCores:262>
>> 2015-05-27 19:04:58,368 INFO [RMCommunicator Allocator]
>> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Reduce
>> slow start threshold not met. completedMapsForReduceSlowstart 1
>> 2015-05-27 19:04:59,378 INFO [RMCommunicator Allocator]
>> org.apache.hadoop.mapreduce.v2.app.rm.RMContainerAllocator: Received
>> completed container container_e38_1432706588203_2125_01_000002
>> 2015-05-27 19:04:59,379 INFO [AsyncDispatcher event handler]
>> org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl:
>> Diagnostics report from attempt_1432706588203_2125_m_000000_0:
>> Container killed by the ApplicationMaster.
>> Container killed on request. Exit code is 143
>>
>>
>>
>>
>> Thanks,
>> Mohit
>>
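One more way to narrow this down: the exception prints both serialVersionUIDs, so each candidate jar can be checked directly with the JDK's serialver tool. A minimal sketch (the jar location below is illustrative; point it at wherever your hive-metastore jars actually live, including on the worker nodes):

```shell
# Print the serialVersionUID of the Table class as packaged in each jar.
# The jar whose UID matches the "stream classdesc" value from the error
# (398473631015277182) is the version that wrote the split; the one
# matching the "local class" value (-946662244473213550) is what the
# map task loaded.
for jar in /usr/lib/hive/lib/hive-metastore*.jar; do
  [ -f "$jar" ] || continue   # skip when the glob matches nothing
  echo "== $jar"
  serialver -classpath "$jar" org.apache.hadoop.hive.metastore.api.Table
done
```

If the two jars on this machine really are identical, running this across the other nodes (and against the jars the job ships) should reveal where the mismatched version lives.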
