[
https://issues.apache.org/jira/browse/HIVE-11217?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14902595#comment-14902595
]
Yongzhi Chen commented on HIVE-11217:
-------------------------------------
[~prasanth_j], thanks for reviewing the change.
I think the error should be thrown only when void will actually be used as a
column field schema in the CTAS. If the NullExprProcessor is used in a subquery
and the void column is not part of the CTAS output, the query should succeed.
The place I am changing makes sure the void type will actually be used as a
field schema.
For example:
{noformat}
create table foo3 as select b.code from (select null, code from jsmall) b;
create table foo4 as select null, code from jsmall;
{noformat}
The first one should succeed while the second should fail, yet both use
NullExprProcessor and both are CTAS statements.
What I am trying to say is that TypeCheckProcFactory may sometimes have
difficulty figuring out the context of the expression.
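As a side note for anyone hitting this on released versions: one workaround (sketched here, not part of the patch; table and column names are just for illustration) is to give the NULL an explicit type with a CAST, so the CTAS column never ends up typed as void:
{noformat}
-- cast null to a concrete type so the ORC SerDe sees int instead of void
CREATE TABLE orc_table_with_null
STORED AS ORC
AS
SELECT x, CAST(null AS int) AS y
FROM empty;
{noformat}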
> CTAS statements throws error, when the table is stored as ORC File format and
> select clause has NULL/VOID type column
> ----------------------------------------------------------------------------------------------------------------------
>
> Key: HIVE-11217
> URL: https://issues.apache.org/jira/browse/HIVE-11217
> Project: Hive
> Issue Type: Bug
> Components: File Formats
> Affects Versions: 0.13.1
> Reporter: Gaurav Kohli
> Assignee: Yongzhi Chen
> Priority: Minor
>         Attachments: HIVE-11217.1.patch, HIVE-11217.2.patch, HIVE-11217.3.patch, HIVE-11217.4.patch, HIVE-11271.5.patch
>
>
> If you use a create-table-as-select (CTAS) statement to create an ORC
> file format based table, you can't use NULL as a column value in the select
> clause:
> {noformat}
> CREATE TABLE empty (x int);
> CREATE TABLE orc_table_with_null
> STORED AS ORC
> AS
> SELECT
>   x,
>   null
> FROM empty;
> {noformat}
> Error:
> {quote}
> 347084 [main] ERROR hive.ql.exec.DDLTask - org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.IllegalArgumentException: Unknown primitive type VOID
> at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:643)
> at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4242)
> at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:285)
> at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
> at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
> at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1554)
> at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1321)
> at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1139)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:962)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:952)
> at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:269)
> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:221)
> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:431)
> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:367)
> at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:464)
> at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:474)
> at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:756)
> at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:694)
> at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:633)
> at org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:323)
> at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:284)
> at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:39)
> at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:66)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:227)
> at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
> at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:450)
> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
> at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:415)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1642)
> at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
> Caused by: java.lang.IllegalArgumentException: Unknown primitive type VOID
> at org.apache.hadoop.hive.ql.io.orc.OrcStruct.createObjectInspector(OrcStruct.java:530)
> at org.apache.hadoop.hive.ql.io.orc.OrcStruct$OrcStructInspector.<init>(OrcStruct.java:195)
> at org.apache.hadoop.hive.ql.io.orc.OrcStruct.createObjectInspector(OrcStruct.java:534)
> at org.apache.hadoop.hive.ql.io.orc.OrcSerde.initialize(OrcSerde.java:106)
> at org.apache.hadoop.hive.serde2.SerDeUtils.initializeSerDe(SerDeUtils.java:519)
> at org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:345)
> at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:292)
> at org.apache.hadoop.hive.ql.metadata.Table.checkValidity(Table.java:194)
> at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:621)
> ... 35 more
> {quote}
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)