[ https://issues.apache.org/jira/browse/HIVE-7066?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14003576#comment-14003576 ]

Xuefu Zhang commented on HIVE-7066:
-----------------------------------

[~davidzchen] I ran your mvn command and built a tarball. When I unzipped it, I 
found a jar named avro-1.7.5.jar in the lib directory. Is this what you need? I 
thought all jars under /lib would be shipped to the cluster. BTW, I was able to 
run an Avro query from this build, though my cluster may already have the Avro 
jar.

Your change seems to shade the Avro jar into the hive-exec jar, but the JIRA's 
title mentions avro-mapred. I'm a little confused.
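
For reference, a quick way to check whether the Avro classes actually ended up 
inside the built hive-exec jar (rather than only as the separate avro-1.7.5.jar 
under lib) is to list the jar's entries. A minimal sketch, assuming the path to 
the built hive-exec jar is passed as the first argument (the class name here is 
just illustrative, not part of the patch):

{code}
import java.util.jar.JarFile;

// Minimal sketch: print any org.apache.avro classes packaged inside a jar.
// args[0] is assumed to be the path to the built hive-exec jar.
public class ListAvroEntries {
    public static void main(String[] args) throws Exception {
        try (JarFile jar = new JarFile(args[0])) {
            jar.stream()
               .map(entry -> entry.getName())
               .filter(name -> name.startsWith("org/apache/avro/"))
               .forEach(System.out::println);
        }
    }
}
{code}

If nothing is printed, the Avro core classes are not shaded into that jar and 
would have to come from somewhere else on the cluster classpath.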

> hive-exec jar is missing avro-mapred
> ------------------------------------
>
>                 Key: HIVE-7066
>                 URL: https://issues.apache.org/jira/browse/HIVE-7066
>             Project: Hive
>          Issue Type: Bug
>            Reporter: David Chen
>            Assignee: David Chen
>         Attachments: HIVE-7066.1.patch
>
>
> Running a simple query that reads an Avro table caused the following 
> exception to be thrown on the cluster side:
> {code}
> java.lang.RuntimeException: org.apache.hive.com.esotericsoftware.kryo.KryoException: java.lang.IllegalArgumentException: Unable to create serializer "org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer" for class: org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat
> Serialization trace:
> outputFileFormatClass (org.apache.hadoop.hive.ql.plan.PartitionDesc)
> aliasToPartnInfo (org.apache.hadoop.hive.ql.plan.MapWork)
>       at org.apache.hadoop.hive.ql.exec.Utilities.getBaseWork(Utilities.java:365)
>       at org.apache.hadoop.hive.ql.exec.Utilities.getMapWork(Utilities.java:276)
>       at org.apache.hadoop.hive.ql.io.HiveInputFormat.init(HiveInputFormat.java:254)
>       at org.apache.hadoop.hive.ql.io.HiveInputFormat.pushProjectionsAndFilters(HiveInputFormat.java:445)
>       at org.apache.hadoop.hive.ql.io.HiveInputFormat.pushProjectionsAndFilters(HiveInputFormat.java:438)
>       at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getRecordReader(CombineHiveInputFormat.java:587)
>       at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.<init>(MapTask.java:191)
>       at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:412)
>       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:366)
>       at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Subject.java:394)
>       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
>       at org.apache.hadoop.mapred.Child.main(Child.java:249)
> Caused by: org.apache.hive.com.esotericsoftware.kryo.KryoException: java.lang.IllegalArgumentException: Unable to create serializer "org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer" for class: org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat
> Serialization trace:
> outputFileFormatClass (org.apache.hadoop.hive.ql.plan.PartitionDesc)
> aliasToPartnInfo (org.apache.hadoop.hive.ql.plan.MapWork)
>       at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
>       at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
>       at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:776)
>       at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:139)
>       at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:17)
>       at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:694)
>       at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
>       at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
>       at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:672)
>       at org.apache.hadoop.hive.ql.exec.Utilities.deserializeObjectByKryo(Utilities.java:942)
>       at org.apache.hadoop.hive.ql.exec.Utilities.deserializePlan(Utilities.java:850)
>       at org.apache.hadoop.hive.ql.exec.Utilities.deserializePlan(Utilities.java:864)
>       at org.apache.hadoop.hive.ql.exec.Utilities.getBaseWork(Utilities.java:334)
>       ... 13 more
> Caused by: java.lang.IllegalArgumentException: Unable to create serializer "org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer" for class: org.apache.hadoop.hive.ql.io.avro.AvroContainerOutputFormat
>       at org.apache.hive.com.esotericsoftware.kryo.factories.ReflectionSerializerFactory.makeSerializer(ReflectionSerializerFactory.java:45)
>       at org.apache.hive.com.esotericsoftware.kryo.factories.ReflectionSerializerFactory.makeSerializer(ReflectionSerializerFactory.java:26)
>       at org.apache.hive.com.esotericsoftware.kryo.Kryo.newDefaultSerializer(Kryo.java:343)
>       at org.apache.hive.com.esotericsoftware.kryo.Kryo.getDefaultSerializer(Kryo.java:336)
>       at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.registerImplicit(DefaultClassResolver.java:56)
>       at org.apache.hive.com.esotericsoftware.kryo.Kryo.getRegistration(Kryo.java:476)
>       at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:148)
>       at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:115)
>       at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:656)
>       at org.apache.hive.com.esotericsoftware.kryo.serializers.DefaultSerializers$ClassSerializer.read(DefaultSerializers.java:238)
>       at org.apache.hive.com.esotericsoftware.kryo.serializers.DefaultSerializers$ClassSerializer.read(DefaultSerializers.java:226)
>       at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObjectOrNull(Kryo.java:745)
>       at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:113)
>       ... 25 more
> Caused by: java.lang.reflect.InvocationTargetException
>       at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>       at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>       at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>       at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>       at org.apache.hive.com.esotericsoftware.kryo.factories.ReflectionSerializerFactory.makeSerializer(ReflectionSerializerFactory.java:32)
>       ... 37 more
> Caused by: java.lang.NoClassDefFoundError: org/apache/avro/io/DatumWriter
>       at java.lang.Class.getDeclaredFields0(Native Method)
>       at java.lang.Class.privateGetDeclaredFields(Class.java:2348)
>       at java.lang.Class.getDeclaredFields(Class.java:1779)
>       at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.rebuildCachedFields(FieldSerializer.java:150)
>       at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.<init>(FieldSerializer.java:109)
>       ... 42 more
> Caused by: java.lang.ClassNotFoundException: org.apache.avro.io.DatumWriter
>       at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>       at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>       ... 47 more
> {code}
> I took a look at the hive-exec jar and found that the Avro core jar was not 
> included, though avro-mapred is included.
> I confirmed that Avro core was included in the Hive 0.12 hive-exec jar. Was 
> there a reason why this was removed in trunk? It seems that this would break 
> the AvroSerDe.
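
The root cause in the trace above is the ClassNotFoundException for 
org.apache.avro.io.DatumWriter on the cluster side. For anyone reproducing this, 
a minimal sketch (illustrative only, not part of the attached patch) that 
performs the same lookup the task's classloader fails on:

{code}
// Minimal sketch: check whether org.apache.avro.io.DatumWriter, the class the
// trace reports as missing, is visible on the current classpath.
public class CheckDatumWriter {
    public static void main(String[] args) {
        try {
            Class<?> c = Class.forName("org.apache.avro.io.DatumWriter");
            System.out.println("Found " + c.getName() + " via " + c.getClassLoader());
        } catch (ClassNotFoundException e) {
            System.out.println("org.apache.avro.io.DatumWriter is not on the classpath");
        }
    }
}
{code}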



--
This message was sent by Atlassian JIRA
(v6.2#6252)
