[ https://issues.apache.org/jira/browse/HIVE-13180?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Rajesh Balamohan resolved HIVE-13180.
-------------------------------------
    Resolution: Fixed

> WindowingTableFunction fails with ClassCastException: org.apache.hadoop.io.DoubleWritable
> -----------------------------------------------------------------------------------------
>
>                 Key: HIVE-13180
>                 URL: https://issues.apache.org/jira/browse/HIVE-13180
>             Project: Hive
>          Issue Type: Bug
>          Components: Hive
>            Reporter: Rajesh Balamohan
>
> This occurs on the hive-master branch with the tpcds-51 query.
> {noformat}
> ], TaskAttempt 3 failed, info=[Error: Failure while running task: attempt_1455662455106_2317_27_02_000284_3:java.lang.RuntimeException: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{"_col0":63443,"_col1":"2000-01-04"},"value":{"_col0":10.75}}
>         at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:195)
>         at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.run(TezProcessor.java:160)
>         at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:354)
>         at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:71)
>         at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:59)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:422)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
>         at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:59)
>         at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:36)
>         at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
>         at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>         at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{"_col0":63443,"_col1":"2000-01-04"},"value":{"_col0":10.75}}
>         at org.apache.hadoop.hive.ql.exec.tez.ReduceRecordSource.pushRecord(ReduceRecordSource.java:288)
>         at org.apache.hadoop.hive.ql.exec.tez.ReduceRecordProcessor.run(ReduceRecordProcessor.java:263)
>         at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.initializeAndRunProcessor(TezProcessor.java:172)
>         ... 14 more
> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row (tag=0) {"key":{"_col0":63443,"_col1":"2000-01-04"},"value":{"_col0":10.75}}
>         at org.apache.hadoop.hive.ql.exec.tez.ReduceRecordSource$GroupIterator.next(ReduceRecordSource.java:356)
>         at org.apache.hadoop.hive.ql.exec.tez.ReduceRecordSource.pushRecord(ReduceRecordSource.java:278)
>         ... 16 more
> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.ClassCastException: java.lang.Object cannot be cast to org.apache.hadoop.io.DoubleWritable
>         at org.apache.hadoop.hive.ql.exec.GroupByOperator.process(GroupByOperator.java:775)
>         at org.apache.hadoop.hive.ql.exec.tez.ReduceRecordSource$GroupIterator.next(ReduceRecordSource.java:347)
>         ... 17 more
> Caused by: java.lang.ClassCastException: java.lang.Object cannot be cast to org.apache.hadoop.io.DoubleWritable
>         at org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableDoubleObjectInspector.get(WritableDoubleObjectInspector.java:36)
>         at org.apache.hadoop.hive.serde2.objectinspector.primitive.WritableDoubleObjectInspector.copyObject(WritableDoubleObjectInspector.java:41)
>         at org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorUtils.copyToStandardObject(ObjectInspectorUtils.java:380)
>         at org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorUtils.copyToStandardObject(ObjectInspectorUtils.java:324)
>         at org.apache.hadoop.hive.ql.udf.ptf.WindowingTableFunction$WindowingIterator.next(WindowingTableFunction.java:1416)
>         at org.apache.hadoop.hive.ql.exec.PTFOperator$PTFInvocation.finishPartition(PTFOperator.java:374)
>         at org.apache.hadoop.hive.ql.exec.PTFOperator.process(PTFOperator.java:123)
>         at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:837)
>         at org.apache.hadoop.hive.ql.exec.GroupByOperator.forward(GroupByOperator.java:1025)
>         at org.apache.hadoop.hive.ql.exec.GroupByOperator.processAggr(GroupByOperator.java:830)
>         at org.apache.hadoop.hive.ql.exec.GroupByOperator.processKey(GroupByOperator.java:704)
>         at org.apache.hadoop.hive.ql.exec.GroupByOperator.process(GroupByOperator.java:770)
>         ... 18 more
> ]]
> {noformat}
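
For reference, the innermost frames show WritableDoubleObjectInspector.get being handed a bare java.lang.Object where it expects a DoubleWritable, so the unconditional cast inside the inspector fails. Below is a minimal, self-contained sketch of that failure pattern only; it is not the Hive code path. The CastRepro class and inspectDouble method are invented for illustration, and the sketch assumes hadoop-common is on the classpath for org.apache.hadoop.io.DoubleWritable.

{code:java}
// Minimal sketch of the failure pattern from the innermost frames above.
// Illustrative only: CastRepro and inspectDouble are invented names, not Hive code.
import org.apache.hadoop.io.DoubleWritable;

public class CastRepro {

  // Mirrors what the inspector's get() effectively does: it assumes the value
  // it receives is already a DoubleWritable and casts unconditionally.
  static double inspectDouble(Object value) {
    return ((DoubleWritable) value).get();
  }

  public static void main(String[] args) {
    // A properly wrapped value works as expected.
    System.out.println(inspectDouble(new DoubleWritable(10.75)));

    // A bare java.lang.Object in place of a DoubleWritable reproduces the same
    // kind of ClassCastException seen in the stack trace above.
    try {
      inspectDouble(new Object());
    } catch (ClassCastException cce) {
      System.out.println("ClassCastException: " + cce.getMessage());
    }
  }
}
{code}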



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
