[jira] [Commented] (HIVE-19098) Hive: impossible to insert data in a parquet's table with "union all" in the select query
[ https://issues.apache.org/jira/browse/HIVE-19098?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16752407#comment-16752407 ]

ACOSS commented on HIVE-19098:
------------------------------

Hello,

I tested with Hive 3.1 (and not 3.0). In this version, the problem is fixed.

Best regards

> Hive: impossible to insert data in a parquet's table with "union all" in the select query
> ------------------------------------------------------------------------------------------
>
>                 Key: HIVE-19098
>                 URL: https://issues.apache.org/jira/browse/HIVE-19098
>             Project: Hive
>          Issue Type: Bug
>          Components: File Formats, Hive
>    Affects Versions: 2.3.2
>            Reporter: ACOSS
>            Assignee: Peter Vary
>            Priority: Minor
>
> Hello,
>
> We have a Parquet table and want to insert data into it with a query like this:
>
> insert into my_table select * from my_select_table_1 union all select * from my_select_table_2
>
> It fails with the error:
>
> 2018-04-03 15:49:28,898 FATAL [IPC Server handler 2 on 38465] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Task: attempt_1522749003448_0028_m_00_0 - exited : java.io.IOException: java.lang.reflect.InvocationTargetException
>     at org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderCreationException(HiveIOExceptionHandlerChain.java:97)
>     at org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderCreationException(HiveIOExceptionHandlerUtil.java:57)
>     at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.initNextRecordReader(HadoopShimsSecure.java:271)
>     at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.<init>(HadoopShimsSecure.java:217)
>     at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileInputFormatShim.getRecordReader(HadoopShimsSecure.java:345)
>     at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getRecordReader(CombineHiveInputFormat.java:695)
>     at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.<init>(MapTask.java:169)
>     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:432)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
>     at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:175)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:422)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
>     at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:169)
> Caused by: java.lang.reflect.InvocationTargetException
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>     at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
>     at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.initNextRecordReader(HadoopShimsSecure.java:257)
>     ... 11 more
> Caused by: java.lang.NullPointerException
>     at java.util.AbstractCollection.addAll(AbstractCollection.java:343)
>     at org.apache.hadoop.hive.ql.io.parquet.ProjectionPusher.pushProjectionsAndFilters(ProjectionPusher.java:118)
>     at org.apache.hadoop.hive.ql.io.parquet.ProjectionPusher.pushProjectionsAndFilters(ProjectionPusher.java:189)
>     at org.apache.hadoop.hive.ql.io.parquet.ParquetRecordReaderBase.getSplit(ParquetRecordReaderBase.java:75)
>     at org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.<init>(ParquetRecordReaderWrapper.java:75)
>     at org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.<init>(ParquetRecordReaderWrapper.java:60)
>     at org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat.getRecordReader(MapredParquetInputFormat.java:75)
>     at org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.<init>(CombineHiveRecordReader.java:99)
>     ... 16 more
>
> Scenario:
>
> create table t1 (col1 string);
> create table t2 (col1 string);
> insert into t2 values ('2017');
> insert into t1 values ('2017');
> create table t3 (col1 string) STORED AS PARQUETFILE;
> INSERT into t3 select col1 from t1 union all select col1 from t2;

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
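The stack trace shows the NullPointerException arising while CombineHiveInputFormat builds its record reader (CombineHiveRecordReader → ProjectionPusher). One untested mitigation sketch, not confirmed anywhere in this thread, is to bypass the combining input format for the failing statement; hive.input.format and HiveInputFormat are real Hive settings/classes, but their effect on this specific bug is an assumption:

```sql
-- Untested sketch: avoid CombineHiveInputFormat, whose record-reader
-- construction is where the NullPointerException above is thrown.
-- Whether this sidesteps the bug is an assumption, not a confirmed fix.
set hive.input.format=org.apache.hadoop.hive.ql.io.HiveInputFormat;

INSERT into t3 select col1 from t1 union all select col1 from t2;
```

Upgrading to a release where the issue is reported fixed (Hive 3.1, per the comment above) remains the reliable path.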
[jira] [Commented] (HIVE-19098) Hive: impossible to insert data in a parquet's table with "union all" in the select query
[ https://issues.apache.org/jira/browse/HIVE-19098?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16694476#comment-16694476 ]

Ke Zhang commented on HIVE-19098:
----------------------------------

This bug may be related to HIVE-16958, which shows similar failure behavior (in Hive 2.2 and 2.3) and may be fixed in Hive 3.0.
[jira] [Commented] (HIVE-19098) Hive: impossible to insert data in a parquet's table with "union all" in the select query
[ https://issues.apache.org/jira/browse/HIVE-19098?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16439367#comment-16439367 ]

ACOSS commented on HIVE-19098:
------------------------------

Hello,

I have tested in a single-node configuration and had the same error (under Fedora release 23 with Java 1.8.0_60-b27):

- with Hadoop 2.8.3 and Hive 2.3.2 or Hive 2.3.3
- with Hadoop 2.7.4 and Hive 2.3.2 or Hive 2.3.3

It works fine with Hadoop 2.7.4 and Hive 2.1.1.
[jira] [Commented] (HIVE-19098) Hive: impossible to insert data in a parquet's table with "union all" in the select query
[ https://issues.apache.org/jira/browse/HIVE-19098?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16435469#comment-16435469 ]

ACOSS commented on HIVE-19098:
------------------------------

Hello,

We use Hadoop 2.8.3 and Hive 2.3.2.

Best regards
[jira] [Commented] (HIVE-19098) Hive: impossible to insert data in a parquet's table with "union all" in the select query
[ https://issues.apache.org/jira/browse/HIVE-19098?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=16434616#comment-16434616 ]

Janaki Lahorani commented on HIVE-19098:
-----------------------------------------

[~SDAT/SACT/IET] The following test works in the master branch:

create table t1 (col1 string);
create table t2 (col1 string);
insert into t2 values ('2017');
insert into t1 values ('2017');
create table t3 (col1 string) STORED AS PARQUETFILE;
INSERT into t3 select col1 from t1 union all select col1 from t2;
select * from t3;

Please confirm the version you tried this on.