Robert Hou created DRILL-6567:
---------------------------------

             Summary: Jenkins Regression: TPCDS query 93 fails with INTERNAL_ERROR ERROR: java.lang.reflect.UndeclaredThrowableException.
                 Key: DRILL-6567
                 URL: https://issues.apache.org/jira/browse/DRILL-6567
             Project: Apache Drill
          Issue Type: Bug
          Components: Execution - Relational Operators
    Affects Versions: 1.14.0
            Reporter: Robert Hou
            Assignee: Pritesh Maker
             Fix For: 1.14.0


This is TPCDS Query 93.

Query: 
/root/drillAutomation/framework-master/framework/resources/Advanced/tpcds/tpcds_sf100/hive/parquet/query93.sql

SELECT ss_customer_sk,
       Sum(act_sales) sumsales
FROM   (SELECT ss_item_sk,
               ss_ticket_number,
               ss_customer_sk,
               CASE
                 WHEN sr_return_quantity IS NOT NULL THEN
                   ( ss_quantity - sr_return_quantity ) * ss_sales_price
                 ELSE ( ss_quantity * ss_sales_price )
               END act_sales
        FROM   store_sales
               LEFT OUTER JOIN store_returns
                 ON ( sr_item_sk = ss_item_sk
                      AND sr_ticket_number = ss_ticket_number ),
               reason
        WHERE  sr_reason_sk = r_reason_sk
               AND r_reason_desc = 'reason 38') t
GROUP  BY ss_customer_sk
ORDER  BY sumsales,
          ss_customer_sk
LIMIT 100;

Here is the stack trace:
2018-06-29 07:00:32 INFO  DrillTestLogger:348 - 
Exception:

java.sql.SQLException: INTERNAL_ERROR ERROR: java.lang.reflect.UndeclaredThrowableException

Setup failed for null
Fragment 4:56

[Error Id: 3c72c14d-9362-4a9b-affb-5cf937bed89e on atsqa6c82.qa.lab:31010]

  (org.apache.drill.common.exceptions.ExecutionSetupException) java.lang.reflect.UndeclaredThrowableException
    
org.apache.drill.common.exceptions.ExecutionSetupException.fromThrowable():30
    org.apache.drill.exec.store.hive.readers.HiveAbstractReader.setup():327
    org.apache.drill.exec.physical.impl.ScanBatch.getNextReaderIfHas():245
    org.apache.drill.exec.physical.impl.ScanBatch.next():164
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    
org.apache.drill.exec.physical.impl.join.HashJoinBatch.sniffNonEmptyBatch():276
    
org.apache.drill.exec.physical.impl.join.HashJoinBatch.prefetchFirstBatchFromBothSides():238
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.buildSchema():218
    org.apache.drill.exec.record.AbstractRecordBatch.next():152
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext():63
    
org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext():147
    org.apache.drill.exec.record.AbstractRecordBatch.next():172
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    
org.apache.drill.exec.physical.impl.join.HashJoinBatch.sniffNonEmptyBatch():276
    
org.apache.drill.exec.physical.impl.join.HashJoinBatch.prefetchFirstBatchFromBothSides():238
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.buildSchema():218
    org.apache.drill.exec.record.AbstractRecordBatch.next():152
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext():63
    
org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext():147
    org.apache.drill.exec.record.AbstractRecordBatch.next():172
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.physical.impl.aggregate.HashAggBatch.buildSchema():118
    org.apache.drill.exec.record.AbstractRecordBatch.next():152
    org.apache.drill.exec.physical.impl.BaseRootExec.next():103
    
org.apache.drill.exec.physical.impl.partitionsender.PartitionSenderRootExec.innerNext():152
    org.apache.drill.exec.physical.impl.BaseRootExec.next():93
    org.apache.drill.exec.work.fragment.FragmentExecutor$1.run():294
    org.apache.drill.exec.work.fragment.FragmentExecutor$1.run():281
    java.security.AccessController.doPrivileged():-2
    javax.security.auth.Subject.doAs():422
    org.apache.hadoop.security.UserGroupInformation.doAs():1595
    org.apache.drill.exec.work.fragment.FragmentExecutor.run():281
    org.apache.drill.common.SelfCleaningRunnable.run():38
    java.util.concurrent.ThreadPoolExecutor.runWorker():1149
    java.util.concurrent.ThreadPoolExecutor$Worker.run():624
    java.lang.Thread.run():748
  Caused By (java.util.concurrent.ExecutionException) java.lang.reflect.UndeclaredThrowableException
    java.util.concurrent.FutureTask.report():122
    java.util.concurrent.FutureTask.get():192
    org.apache.drill.exec.store.hive.readers.HiveAbstractReader.setup():320
    org.apache.drill.exec.physical.impl.ScanBatch.getNextReaderIfHas():245
    org.apache.drill.exec.physical.impl.ScanBatch.next():164
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    
org.apache.drill.exec.physical.impl.join.HashJoinBatch.sniffNonEmptyBatch():276
    
org.apache.drill.exec.physical.impl.join.HashJoinBatch.prefetchFirstBatchFromBothSides():238
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.buildSchema():218
    org.apache.drill.exec.record.AbstractRecordBatch.next():152
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext():63
    
org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext():147
    org.apache.drill.exec.record.AbstractRecordBatch.next():172
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    
org.apache.drill.exec.physical.impl.join.HashJoinBatch.sniffNonEmptyBatch():276
    
org.apache.drill.exec.physical.impl.join.HashJoinBatch.prefetchFirstBatchFromBothSides():238
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.buildSchema():218
    org.apache.drill.exec.record.AbstractRecordBatch.next():152
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext():63
    
org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext():147
    org.apache.drill.exec.record.AbstractRecordBatch.next():172
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.physical.impl.aggregate.HashAggBatch.buildSchema():118
    org.apache.drill.exec.record.AbstractRecordBatch.next():152
    org.apache.drill.exec.physical.impl.BaseRootExec.next():103
    
org.apache.drill.exec.physical.impl.partitionsender.PartitionSenderRootExec.innerNext():152
    org.apache.drill.exec.physical.impl.BaseRootExec.next():93
    org.apache.drill.exec.work.fragment.FragmentExecutor$1.run():294
    org.apache.drill.exec.work.fragment.FragmentExecutor$1.run():281
    java.security.AccessController.doPrivileged():-2
    javax.security.auth.Subject.doAs():422
    org.apache.hadoop.security.UserGroupInformation.doAs():1595
    org.apache.drill.exec.work.fragment.FragmentExecutor.run():281
    org.apache.drill.common.SelfCleaningRunnable.run():38
    java.util.concurrent.ThreadPoolExecutor.runWorker():1149
    java.util.concurrent.ThreadPoolExecutor$Worker.run():624
    java.lang.Thread.run():748
  Caused By (java.lang.reflect.UndeclaredThrowableException) null
    org.apache.hadoop.security.UserGroupInformation.doAs():1610
    org.apache.drill.exec.ops.OperatorContextImpl$1.call():101
    java.util.concurrent.FutureTask.run():266
    java.util.concurrent.ThreadPoolExecutor.runWorker():1149
    java.util.concurrent.ThreadPoolExecutor$Worker.run():624
    java.lang.Thread.run():748
  Caused By (org.apache.drill.common.exceptions.ExecutionSetupException) Failed to get o.a.hadoop.mapred.RecordReader from Hive InputFormat
    
org.apache.drill.exec.store.hive.readers.HiveAbstractReader.initNextReader():279
    org.apache.drill.exec.store.hive.readers.HiveAbstractReader.init():257
    org.apache.drill.exec.store.hive.readers.HiveAbstractReader.access$000():71
    org.apache.drill.exec.store.hive.readers.HiveAbstractReader$1.call():313
    org.apache.drill.exec.store.hive.readers.HiveAbstractReader$1.call():310
    org.apache.drill.exec.ops.OperatorContextImpl$1$1.run():104
    java.security.AccessController.doPrivileged():-2
    javax.security.auth.Subject.doAs():422
    org.apache.hadoop.security.UserGroupInformation.doAs():1595
    org.apache.drill.exec.ops.OperatorContextImpl$1.call():101
    java.util.concurrent.FutureTask.run():266
    java.util.concurrent.ThreadPoolExecutor.runWorker():1149
    java.util.concurrent.ThreadPoolExecutor$Worker.run():624
    java.lang.Thread.run():748
  Caused By (hive.org.apache.parquet.io.ParquetDecodingException) Can not read value at 1 in block 0 in file maprfs:///drill/testdata/tpcds_sf100/parquet/store_sales/1_12_1.parquet
    
hive.org.apache.parquet.hadoop.InternalParquetRecordReader.nextKeyValue():243
    hive.org.apache.parquet.hadoop.ParquetRecordReader.nextKeyValue():227
    
org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.<init>():117
    
org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.<init>():80
    
org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat.getRecordReader():72
    
org.apache.drill.exec.store.hive.readers.HiveAbstractReader.initNextReader():276
    org.apache.drill.exec.store.hive.readers.HiveAbstractReader.init():257
    org.apache.drill.exec.store.hive.readers.HiveAbstractReader.access$000():71
    org.apache.drill.exec.store.hive.readers.HiveAbstractReader$1.call():313
    org.apache.drill.exec.store.hive.readers.HiveAbstractReader$1.call():310
    org.apache.drill.exec.ops.OperatorContextImpl$1$1.run():104
    java.security.AccessController.doPrivileged():-2
    javax.security.auth.Subject.doAs():422
    org.apache.hadoop.security.UserGroupInformation.doAs():1595
    org.apache.drill.exec.ops.OperatorContextImpl$1.call():101
    java.util.concurrent.FutureTask.run():266
    java.util.concurrent.ThreadPoolExecutor.runWorker():1149
    java.util.concurrent.ThreadPoolExecutor$Worker.run():624
    java.lang.Thread.run():748
  Caused By (java.lang.UnsupportedOperationException) org.apache.hadoop.hive.ql.io.parquet.convert.ETypeConverter$8$1
    hive.org.apache.parquet.io.api.PrimitiveConverter.addInt():101
    hive.org.apache.parquet.column.impl.ColumnReaderImpl$2$3.writeValue():254
    
hive.org.apache.parquet.column.impl.ColumnReaderImpl.writeCurrentValueToConverter():371
    hive.org.apache.parquet.io.RecordReaderImplementation.read():405
    
hive.org.apache.parquet.hadoop.InternalParquetRecordReader.nextKeyValue():218
    hive.org.apache.parquet.hadoop.ParquetRecordReader.nextKeyValue():227
    
org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.<init>():117
    
org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.<init>():80
    
org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat.getRecordReader():72
    
org.apache.drill.exec.store.hive.readers.HiveAbstractReader.initNextReader():276
    org.apache.drill.exec.store.hive.readers.HiveAbstractReader.init():257
    org.apache.drill.exec.store.hive.readers.HiveAbstractReader.access$000():71
    org.apache.drill.exec.store.hive.readers.HiveAbstractReader$1.call():313
    org.apache.drill.exec.store.hive.readers.HiveAbstractReader$1.call():310
    org.apache.drill.exec.ops.OperatorContextImpl$1$1.run():104
    java.security.AccessController.doPrivileged():-2
    javax.security.auth.Subject.doAs():422
    org.apache.hadoop.security.UserGroupInformation.doAs():1595
    org.apache.drill.exec.ops.OperatorContextImpl$1.call():101
    java.util.concurrent.FutureTask.run():266
    java.util.concurrent.ThreadPoolExecutor.runWorker():1149
    java.util.concurrent.ThreadPoolExecutor$Worker.run():624
    java.lang.Thread.run():748
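The innermost cause carries only a class name as its message because parquet-mr's `PrimitiveConverter` base class implements every `add*` method as `throw new UnsupportedOperationException(getClass().getName())`; the anonymous converter Hive chose (`ETypeConverter$8$1`) evidently does not override `addInt`, yet the column being read is physically INT32. A minimal sketch of that failure mode, with hypothetical class names mirroring the parquet-mr default behavior (not the actual Drill/Hive code):

```java
// Sketch: parquet-mr's PrimitiveConverter throws
// UnsupportedOperationException(getClass().getName()) for any add*
// method a concrete converter does not override.
abstract class PrimitiveConverterSketch {
    public void addInt(int value) {
        // Default: unsupported unless the subclass overrides it,
        // message is just the converter's class name.
        throw new UnsupportedOperationException(getClass().getName());
    }
    public void addDouble(double value) {
        throw new UnsupportedOperationException(getClass().getName());
    }
}

// Hypothetical stand-in for the anonymous converter in the trace:
// it only handles doubles, so an INT32 value reaches the default addInt.
class DoubleOnlyConverter extends PrimitiveConverterSketch {
    double last;
    @Override
    public void addDouble(double value) { last = value; }
}

public class ConverterDemo {
    public static void main(String[] args) {
        PrimitiveConverterSketch c = new DoubleOnlyConverter();
        try {
            c.addInt(42); // physical INT32 value hits the unsupported path
        } catch (UnsupportedOperationException e) {
            // prints the converter's class name, matching the trace message
            System.out.println(e.getMessage());
        }
    }
}
```

This matches the trace: the schema-driven converter selection and the file's physical type disagree, so the mismatch only surfaces when the first value is decoded ("Can not read value at 1 in block 0").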

        at 
org.apache.drill.jdbc.impl.DrillCursor.nextRowInternally(DrillCursor.java:528)
        at 
org.apache.drill.jdbc.impl.DrillCursor.loadInitialSchema(DrillCursor.java:600)
        at 
org.apache.drill.jdbc.impl.DrillResultSetImpl.execute(DrillResultSetImpl.java:1904)
        at 
org.apache.drill.jdbc.impl.DrillResultSetImpl.execute(DrillResultSetImpl.java:64)
        at 
oadd.org.apache.calcite.avatica.AvaticaConnection$1.execute(AvaticaConnection.java:630)
        at 
org.apache.drill.jdbc.impl.DrillMetaImpl.prepareAndExecute(DrillMetaImpl.java:1109)
        at 
org.apache.drill.jdbc.impl.DrillMetaImpl.prepareAndExecute(DrillMetaImpl.java:1120)
        at 
oadd.org.apache.calcite.avatica.AvaticaConnection.prepareAndExecuteInternal(AvaticaConnection.java:638)
        at 
org.apache.drill.jdbc.impl.DrillConnectionImpl.prepareAndExecuteInternal(DrillConnectionImpl.java:200)
        at 
oadd.org.apache.calcite.avatica.AvaticaStatement.executeInternal(AvaticaStatement.java:149)
        at 
oadd.org.apache.calcite.avatica.AvaticaStatement.executeQuery(AvaticaStatement.java:218)
        at 
org.apache.drill.jdbc.impl.DrillStatementImpl.executeQuery(DrillStatementImpl.java:110)
        at 
org.apache.drill.test.framework.DrillTestJdbc.executeQuery(DrillTestJdbc.java:210)
        at 
org.apache.drill.test.framework.DrillTestJdbc.run(DrillTestJdbc.java:115)
        at 
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: oadd.org.apache.drill.common.exceptions.UserRemoteException: INTERNAL_ERROR ERROR: java.lang.reflect.UndeclaredThrowableException

Setup failed for null
Fragment 4:56

[Error Id: 3c72c14d-9362-4a9b-affb-5cf937bed89e on atsqa6c82.qa.lab:31010]

  (org.apache.drill.common.exceptions.ExecutionSetupException) java.lang.reflect.UndeclaredThrowableException
    
org.apache.drill.common.exceptions.ExecutionSetupException.fromThrowable():30
    org.apache.drill.exec.store.hive.readers.HiveAbstractReader.setup():327
    org.apache.drill.exec.physical.impl.ScanBatch.getNextReaderIfHas():245
    org.apache.drill.exec.physical.impl.ScanBatch.next():164
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    
org.apache.drill.exec.physical.impl.join.HashJoinBatch.sniffNonEmptyBatch():276
    
org.apache.drill.exec.physical.impl.join.HashJoinBatch.prefetchFirstBatchFromBothSides():238
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.buildSchema():218
    org.apache.drill.exec.record.AbstractRecordBatch.next():152
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext():63
    
org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext():147
    org.apache.drill.exec.record.AbstractRecordBatch.next():172
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    
org.apache.drill.exec.physical.impl.join.HashJoinBatch.sniffNonEmptyBatch():276
    
org.apache.drill.exec.physical.impl.join.HashJoinBatch.prefetchFirstBatchFromBothSides():238
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.buildSchema():218
    org.apache.drill.exec.record.AbstractRecordBatch.next():152
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext():63
    
org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext():147
    org.apache.drill.exec.record.AbstractRecordBatch.next():172
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.physical.impl.aggregate.HashAggBatch.buildSchema():118
    org.apache.drill.exec.record.AbstractRecordBatch.next():152
    org.apache.drill.exec.physical.impl.BaseRootExec.next():103
    
org.apache.drill.exec.physical.impl.partitionsender.PartitionSenderRootExec.innerNext():152
    org.apache.drill.exec.physical.impl.BaseRootExec.next():93
    org.apache.drill.exec.work.fragment.FragmentExecutor$1.run():294
    org.apache.drill.exec.work.fragment.FragmentExecutor$1.run():281
    java.security.AccessController.doPrivileged():-2
    javax.security.auth.Subject.doAs():422
    org.apache.hadoop.security.UserGroupInformation.doAs():1595
    org.apache.drill.exec.work.fragment.FragmentExecutor.run():281
    org.apache.drill.common.SelfCleaningRunnable.run():38
    java.util.concurrent.ThreadPoolExecutor.runWorker():1149
    java.util.concurrent.ThreadPoolExecutor$Worker.run():624
    java.lang.Thread.run():748
  Caused By (java.util.concurrent.ExecutionException) java.lang.reflect.UndeclaredThrowableException
    java.util.concurrent.FutureTask.report():122
    java.util.concurrent.FutureTask.get():192
    org.apache.drill.exec.store.hive.readers.HiveAbstractReader.setup():320
    org.apache.drill.exec.physical.impl.ScanBatch.getNextReaderIfHas():245
    org.apache.drill.exec.physical.impl.ScanBatch.next():164
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    
org.apache.drill.exec.physical.impl.join.HashJoinBatch.sniffNonEmptyBatch():276
    
org.apache.drill.exec.physical.impl.join.HashJoinBatch.prefetchFirstBatchFromBothSides():238
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.buildSchema():218
    org.apache.drill.exec.record.AbstractRecordBatch.next():152
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext():63
    
org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext():147
    org.apache.drill.exec.record.AbstractRecordBatch.next():172
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    
org.apache.drill.exec.physical.impl.join.HashJoinBatch.sniffNonEmptyBatch():276
    
org.apache.drill.exec.physical.impl.join.HashJoinBatch.prefetchFirstBatchFromBothSides():238
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.buildSchema():218
    org.apache.drill.exec.record.AbstractRecordBatch.next():152
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext():63
    
org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext():147
    org.apache.drill.exec.record.AbstractRecordBatch.next():172
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.physical.impl.aggregate.HashAggBatch.buildSchema():118
    org.apache.drill.exec.record.AbstractRecordBatch.next():152
    org.apache.drill.exec.physical.impl.BaseRootExec.next():103
    
org.apache.drill.exec.physical.impl.partitionsender.PartitionSenderRootExec.innerNext():152
    org.apache.drill.exec.physical.impl.BaseRootExec.next():93
    org.apache.drill.exec.work.fragment.FragmentExecutor$1.run():294
    org.apache.drill.exec.work.fragment.FragmentExecutor$1.run():281
    java.security.AccessController.doPrivileged():-2
    javax.security.auth.Subject.doAs():422
    org.apache.hadoop.security.UserGroupInformation.doAs():1595
    org.apache.drill.exec.work.fragment.FragmentExecutor.run():281
    org.apache.drill.common.SelfCleaningRunnable.run():38
    java.util.concurrent.ThreadPoolExecutor.runWorker():1149
    java.util.concurrent.ThreadPoolExecutor$Worker.run():624
    java.lang.Thread.run():748
  Caused By (java.lang.reflect.UndeclaredThrowableException) null
    org.apache.hadoop.security.UserGroupInformation.doAs():1610
    org.apache.drill.exec.ops.OperatorContextImpl$1.call():101
    java.util.concurrent.FutureTask.run():266
    java.util.concurrent.ThreadPoolExecutor.runWorker():1149
    java.util.concurrent.ThreadPoolExecutor$Worker.run():624
    java.lang.Thread.run():748
  Caused By (org.apache.drill.common.exceptions.ExecutionSetupException) Failed to get o.a.hadoop.mapred.RecordReader from Hive InputFormat
    
org.apache.drill.exec.store.hive.readers.HiveAbstractReader.initNextReader():279
    org.apache.drill.exec.store.hive.readers.HiveAbstractReader.init():257
    org.apache.drill.exec.store.hive.readers.HiveAbstractReader.access$000():71
    org.apache.drill.exec.store.hive.readers.HiveAbstractReader$1.call():313
    org.apache.drill.exec.store.hive.readers.HiveAbstractReader$1.call():310
    org.apache.drill.exec.ops.OperatorContextImpl$1$1.run():104
    java.security.AccessController.doPrivileged():-2
    javax.security.auth.Subject.doAs():422
    org.apache.hadoop.security.UserGroupInformation.doAs():1595
    org.apache.drill.exec.ops.OperatorContextImpl$1.call():101
    java.util.concurrent.FutureTask.run():266
    java.util.concurrent.ThreadPoolExecutor.runWorker():1149
    java.util.concurrent.ThreadPoolExecutor$Worker.run():624
    java.lang.Thread.run():748
  Caused By (hive.org.apache.parquet.io.ParquetDecodingException) Can not read value at 1 in block 0 in file maprfs:///drill/testdata/tpcds_sf100/parquet/store_sales/1_12_1.parquet
    
hive.org.apache.parquet.hadoop.InternalParquetRecordReader.nextKeyValue():243
    hive.org.apache.parquet.hadoop.ParquetRecordReader.nextKeyValue():227
    
org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.<init>():117
    
org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.<init>():80
    
org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat.getRecordReader():72
    
org.apache.drill.exec.store.hive.readers.HiveAbstractReader.initNextReader():276
    org.apache.drill.exec.store.hive.readers.HiveAbstractReader.init():257
    org.apache.drill.exec.store.hive.readers.HiveAbstractReader.access$000():71
    org.apache.drill.exec.store.hive.readers.HiveAbstractReader$1.call():313
    org.apache.drill.exec.store.hive.readers.HiveAbstractReader$1.call():310
    org.apache.drill.exec.ops.OperatorContextImpl$1$1.run():104
    java.security.AccessController.doPrivileged():-2
    javax.security.auth.Subject.doAs():422
    org.apache.hadoop.security.UserGroupInformation.doAs():1595
    org.apache.drill.exec.ops.OperatorContextImpl$1.call():101
    java.util.concurrent.FutureTask.run():266
    java.util.concurrent.ThreadPoolExecutor.runWorker():1149
    java.util.concurrent.ThreadPoolExecutor$Worker.run():624
    java.lang.Thread.run():748
  Caused By (java.lang.UnsupportedOperationException) org.apache.hadoop.hive.ql.io.parquet.convert.ETypeConverter$8$1
    hive.org.apache.parquet.io.api.PrimitiveConverter.addInt():101
    hive.org.apache.parquet.column.impl.ColumnReaderImpl$2$3.writeValue():254
    
hive.org.apache.parquet.column.impl.ColumnReaderImpl.writeCurrentValueToConverter():371
    hive.org.apache.parquet.io.RecordReaderImplementation.read():405
    
hive.org.apache.parquet.hadoop.InternalParquetRecordReader.nextKeyValue():218
    hive.org.apache.parquet.hadoop.ParquetRecordReader.nextKeyValue():227
    
org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.<init>():117
    
org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.<init>():80
    
org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat.getRecordReader():72
    
org.apache.drill.exec.store.hive.readers.HiveAbstractReader.initNextReader():276
    org.apache.drill.exec.store.hive.readers.HiveAbstractReader.init():257
    org.apache.drill.exec.store.hive.readers.HiveAbstractReader.access$000():71
    org.apache.drill.exec.store.hive.readers.HiveAbstractReader$1.call():313
    org.apache.drill.exec.store.hive.readers.HiveAbstractReader$1.call():310
    org.apache.drill.exec.ops.OperatorContextImpl$1$1.run():104
    java.security.AccessController.doPrivileged():-2
    javax.security.auth.Subject.doAs():422
    org.apache.hadoop.security.UserGroupInformation.doAs():1595
    org.apache.drill.exec.ops.OperatorContextImpl$1.call():101
    java.util.concurrent.FutureTask.run():266
    java.util.concurrent.ThreadPoolExecutor.runWorker():1149
    java.util.concurrent.ThreadPoolExecutor$Worker.run():624
    java.lang.Thread.run():748

        at 
oadd.org.apache.drill.exec.rpc.user.QueryResultHandler.resultArrived(QueryResultHandler.java:123)
        at 
oadd.org.apache.drill.exec.rpc.user.UserClient.handle(UserClient.java:422)
        at 
oadd.org.apache.drill.exec.rpc.user.UserClient.handle(UserClient.java:96)
        at 
oadd.org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:274)
        at 
oadd.org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:244)
        at 
oadd.io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:88)
        at 
oadd.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
        at 
oadd.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
        at 
oadd.io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
        at 
oadd.io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:287)
        at 
oadd.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
        at 
oadd.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
        at 
oadd.io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
        at 
oadd.io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
        at 
oadd.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
        at 
oadd.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
        at 
oadd.io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
        at 
oadd.io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:312)
        at 
oadd.io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:286)
        at 
oadd.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
        at 
oadd.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
        at 
oadd.io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
        at 
oadd.io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86)
        at 
oadd.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
        at 
oadd.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
        at 
oadd.io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
        at 
oadd.io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
        at 
oadd.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
        at 
oadd.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
        at 
oadd.io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
        at 
oadd.io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
        at 
oadd.io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)
        at 
oadd.io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
        at 
oadd.io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
        at oadd.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
        at 
oadd.io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
        ... 1 more




