[
https://issues.apache.org/jira/browse/HIVE-20524?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Matt McCline updated HIVE-20524:
--------------------------------
Description:
Issue that started this JIRA:
{code}
create external table varchar_decimal (c1 varchar(25));
alter table varchar_decimal change c1 c1 decimal(31,0);
ERROR : FAILED: Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.DDLTask. Unable to alter table. The following
columns have types incompatible with the existing columns in their respective
positions :
c1
{code}
There appear to be 2 issues here:
1) When hive.metastore.disallow.incompatible.col.type.changes is true (the
default), we only allow StringFamily (STRING, CHAR, VARCHAR) columns to be
converted to a numeric type that can hold the largest values.
2) In the new org.apache.hadoop.hive.metastore.ColumnType class (Hive version 3,
hive-standalone-metastore), the checkColTypeChangeCompatible method lost a
version 2 series bug fix that drops the CHAR/VARCHAR (and, I think, DECIMAL)
type decorations when checking Schema Evolution compatibility.
The Hive 1 code called undecoratedTypeName(oldType), and the Hive 2 code
performed the logic in TypeInfoUtils.implicitConvertible on the
PrimitiveCategory rather than on the raw type string.
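To illustrate issue 2, here is a minimal, self-contained sketch. This is not
Hive's actual code: the class, the undecorated() helper, and the tiny
compatibility table below are invented for the example; the real logic lives in
ColumnType.checkColTypeChangeCompatible and TypeInfoUtils.implicitConvertible.
It only shows why the decorations must be stripped (or, equivalently, why the
check must work on the PrimitiveCategory) before the StringFamily-to-numeric
rule can match:
{code}
// Sketch only -- illustrates the effect of dropping type decorations before
// the compatibility check. Not Hive code; names are hypothetical.
public class UndecoratedTypeSketch {

  // Strip "(...)" decorations: "varchar(25)" -> "varchar", "decimal(31,0)" -> "decimal".
  static String undecorated(String typeName) {
    int paren = typeName.indexOf('(');
    return paren < 0 ? typeName : typeName.substring(0, paren);
  }

  // Toy stand-in for the rule described above: StringFamily columns may be
  // converted to a numeric type that can hold the largest values.
  static boolean stringFamilyToWideNumeric(String oldType, String newType) {
    boolean oldIsStringFamily =
        oldType.equals("string") || oldType.equals("char") || oldType.equals("varchar");
    boolean newIsWideNumeric = newType.equals("double") || newType.equals("decimal");
    return oldIsStringFamily && newIsWideNumeric;
  }

  public static void main(String[] args) {
    String oldCol = "varchar(25)";
    String newCol = "decimal(31,0)";

    // Raw decorated strings: "varchar(25)" is not literally "varchar", so the
    // rule never matches and the ALTER TABLE is rejected (the symptom above).
    System.out.println(stringFamilyToWideNumeric(oldCol, newCol));

    // Undecorated names: the rule matches and the change is allowed.
    System.out.println(stringFamilyToWideNumeric(undecorated(oldCol), undecorated(newCol)));
  }
}
{code}
With the decorations left in place the old column type never matches the
string-family rule, which is why the ALTER TABLE above fails the compatibility
check.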
was:
Issue that started this JIRA:
{code}
create external table varchar_decimal (c1 varchar(25));
alter table varchar_decimal change c1 c1 decimal(31,0);
ERROR : FAILED: Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.DDLTask. Unable to alter table. The following
columns have types incompatible with the existing columns in their respective
positions :
c1
{code}
In the new org.apache.hadoop.hive.metastore.ColumnType class (Hive version 3,
hive-standalone-metastore), the checkColTypeChangeCompatible method lost a
version 2 series bug fix that drops the CHAR/VARCHAR (and, I think, DECIMAL)
type decorations when checking Schema Evolution compatibility.
The Hive 1 code called undecoratedTypeName(oldType), and the Hive 2 code
performed the logic in TypeInfoUtils.implicitConvertible on the
PrimitiveCategory rather than on the raw type string.
> Schema Evolution checking is broken in going from ver 2 to ver 3 for ALTER
> TABLE VARCHAR to DECIMAL
> ---------------------------------------------------------------------------------------------------
>
> Key: HIVE-20524
> URL: https://issues.apache.org/jira/browse/HIVE-20524
> Project: Hive
> Issue Type: Bug
> Components: Hive
> Reporter: Matt McCline
> Assignee: Matt McCline
> Priority: Critical
> Attachments: HIVE-20524.01.patch
>
>
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)