[ https://issues.apache.org/jira/browse/HIVE-11425?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14649567#comment-14649567 ]

Prasanth Jayachandran commented on HIVE-11425:
----------------------------------------------

+1

> submitting a query via CLI against a running cluster fails with 
> ClassNotFoundException: org.apache.hadoop.hive.common.type.HiveDecimal
> --------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: HIVE-11425
>                 URL: https://issues.apache.org/jira/browse/HIVE-11425
>             Project: Hive
>          Issue Type: Bug
>          Components: Hive
>    Affects Versions: 2.0.0
>            Reporter: Eugene Koifman
>            Assignee: Eugene Koifman
>         Attachments: HIVE-11425.patch
>
>
> Submitting a query via the CLI against a running cluster fails. This is a side
> effect of the new storage-api module, which is not included in hive-exec.jar
> (a possible session-level workaround is sketched after the log below).
> {noformat}
> hive> insert into orders values(1,2);
> Query ID = ekoifman_20150730182807_a24eee8c-6f59-42dc-9713-ae722916c82e
> Total jobs = 1
> Launching Job 1 out of 1
> Number of reduce tasks determined at compile time: 1
> In order to change the average load for a reducer (in bytes):
>   set hive.exec.reducers.bytes.per.reducer=<number>
> In order to limit the maximum number of reducers:
>   set hive.exec.reducers.max=<number>
> In order to set a constant number of reducers:
>   set mapreduce.job.reduces=<number>
> Starting Job = job_1438305627853_0002, Tracking URL = http://localhost:8088/proxy/application_1438305627853_0002/
> Kill Command = /Users/ekoifman/dev/hwxhadoop/hadoop-dist/target/hadoop-2.7.1-SNAPSHOT/bin/hadoop job  -kill job_1438305627853_0002
> Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 1
> 2015-07-30 18:28:16,330 Stage-1 map = 0%,  reduce = 0%
> 2015-07-30 18:28:33,929 Stage-1 map = 100%,  reduce = 100%
> Ended Job = job_1438305627853_0002 with errors
> Error during job, obtaining debugging information...
> Job Tracking URL: http://localhost:8088/proxy/application_1438305627853_0002/
> Examining task ID: task_1438305627853_0002_m_000000 (and more) from job job_1438305627853_0002
> Task with the most failures(4): 
> -----
> Task ID:
>   task_1438305627853_0002_m_000000
> URL:
>   http://localhost:8088/taskdetails.jsp?jobid=job_1438305627853_0002&tipid=task_1438305627853_0002_m_000000
> -----
> Diagnostic Messages for this Task:
> Error: java.lang.RuntimeException: Error in configuring object
>       at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:112)
>       at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:78)
>       at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
>       at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:449)
>       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
>       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at javax.security.auth.Subject.doAs(Subject.java:415)
>       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
>       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
> Caused by: java.lang.reflect.InvocationTargetException
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:606)
>       at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
>       ... 9 more
> Caused by: java.lang.RuntimeException: Error in configuring object
>       at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:112)
>       at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:78)
>       at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
>       at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:38)
>       ... 14 more
> Caused by: java.lang.reflect.InvocationTargetException
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:606)
>       at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
>       ... 17 more
> Caused by: java.lang.RuntimeException: Map operator initialization failed
>       at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.configure(ExecMapper.java:140)
>       ... 22 more
> Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/hive/common/type/HiveDecimal
>       at org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorUtils.<clinit>(PrimitiveObjectInspectorUtils.java:234)
>       at org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils$TypeInfoParser.expect(TypeInfoUtils.java:341)
>       at org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils$TypeInfoParser.expect(TypeInfoUtils.java:331)
>       at org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils$TypeInfoParser.parseType(TypeInfoUtils.java:392)
>       at org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils$TypeInfoParser.parseTypeInfos(TypeInfoUtils.java:305)
>       at org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils.getTypeInfosFromTypeString(TypeInfoUtils.java:765)
>       at org.apache.hadoop.hive.serde2.lazy.LazySerDeParameters.extractColumnInfo(LazySerDeParameters.java:142)
>       at org.apache.hadoop.hive.serde2.lazy.LazySerDeParameters.<init>(LazySerDeParameters.java:85)
>       at org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.initialize(LazySimpleSerDe.java:125)
>       at org.apache.hadoop.hive.serde2.AbstractSerDe.initialize(AbstractSerDe.java:53)
>       at org.apache.hadoop.hive.serde2.SerDeUtils.initializeSerDe(SerDeUtils.java:533)
>       at org.apache.hadoop.hive.ql.plan.PartitionDesc.getDeserializer(PartitionDesc.java:166)
>       at org.apache.hadoop.hive.ql.exec.MapOperator.getConvertedOI(MapOperator.java:302)
>       at org.apache.hadoop.hive.ql.exec.MapOperator.setChildren(MapOperator.java:338)
>       at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.configure(ExecMapper.java:109)
>       ... 22 more
> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.common.type.HiveDecimal
>       at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>       at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
>       at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
>       ... 37 more
> FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
> MapReduce Jobs Launched: 
> Stage-Stage-1: Map: 1  Reduce: 1   HDFS Read: 0 HDFS Write: 0 FAIL
> Total MapReduce CPU Time Spent: 0 msec
> (2,FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask,08S01)
> {noformat}
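
A minimal workaround sketch, for illustration only: until hive-exec.jar again bundles the storage-api classes (the root cause described above), the missing class can presumably be supplied per session with ADD JAR, which also ships the jar with the MapReduce job. The jar path and file name (hive-storage-api-2.0.0-SNAPSHOT.jar) are assumptions for the sketch, not taken from the attached patch.

{noformat}
-- Hypothetical workaround, not the committed fix; the jar path/name below are assumed.
-- ADD JAR puts the jar on the session classpath and distributes it with the job,
-- so the task JVM should be able to resolve
-- org.apache.hadoop.hive.common.type.HiveDecimal during map-side SerDe init.
ADD JAR /path/to/hive/storage-api/target/hive-storage-api-2.0.0-SNAPSHOT.jar;
INSERT INTO orders VALUES (1, 2);
{noformat}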



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
