[jira] [Updated] (HIVE-20524) Schema Evolution checking is broken in going from Hive version 2 to version 3 for ALTER TABLE VARCHAR to DECIMAL

2018-09-15 Matt McCline (JIRA)


 [ 
https://issues.apache.org/jira/browse/HIVE-20524?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Matt McCline updated HIVE-20524:

Resolution: Fixed
Status: Resolved  (was: Patch Available)

> Schema Evolution checking is broken in going from Hive version 2 to version 3 
> for ALTER TABLE VARCHAR to DECIMAL
> 
>
> Key: HIVE-20524
> URL: https://issues.apache.org/jira/browse/HIVE-20524
> Project: Hive
>  Issue Type: Bug
>  Components: Hive
>Reporter: Matt McCline
>Assignee: Matt McCline
>Priority: Critical
> Fix For: 4.0.0
>
> Attachments: HIVE-20524.01.patch, HIVE-20524.02.patch, 
> HIVE-20524.03.patch, HIVE-20524.04.patch
>
>
> Issue that started this JIRA:
> {code}
> create external table varchar_decimal (c1 varchar(25));
> alter table varchar_decimal change c1 c1 decimal(31,0);
> ERROR : FAILED: Execution Error, return code 1 from 
> org.apache.hadoop.hive.ql.exec.DDLTask. Unable to alter table. The following 
> columns have types incompatible with the existing columns in their respective 
> positions :
> c1
> {code}
> There appear to be 2 issues here:
> 1) When hive.metastore.disallow.incompatible.col.type.changes is true (the 
> default) we only allow StringFamily (STRING, CHAR, VARCHAR) conversion to a 
> number that can hold the largest values.  The theory is that we don't want the 
> data loss you would get by converting the StringFamily field into integers, 
> etc.  In Hive version 2 the hierarchy of numbers had DECIMAL at the top.  At 
> some point during Hive version 2 we realized this was incorrect and put 
> DOUBLE at the top.
> However, the Hive version 2 TypeInfoUtils.implicitConvertible method allows 
> StringFamily conversion to either DOUBLE or DECIMAL.
> The new org.apache.hadoop.hive.metastore.ColumnType class, under the Hive 
> version 3 hive-standalone-metadata-server module, has a 
> checkColTypeChangeCompatible method that only allows DOUBLE.
> This JIRA fixes that problem.
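> After the fix, the intended rule can be sketched roughly as follows (a 
> minimal, self-contained illustration; the class and method names here are 
> hypothetical stand-ins, not the actual ColumnType code):
> {code}
> import java.util.Arrays;
> import java.util.HashSet;
> import java.util.Set;
>
> public class StringFamilySketch {
>   // Hypothetical stand-ins for the metastore's base type names.
>   private static final Set<String> STRING_FAMILY =
>       new HashSet<>(Arrays.asList("string", "char", "varchar"));
>
>   // StringFamily may widen to DOUBLE or DECIMAL (either can represent the
>   // largest values), mirroring Hive version 2's implicitConvertible behavior.
>   static boolean compatibleStringToNumber(String oldBase, String newBase) {
>     return STRING_FAMILY.contains(oldBase)
>         && ("double".equals(newBase) || "decimal".equals(newBase));
>   }
>
>   public static void main(String[] args) {
>     System.out.println(compatibleStringToNumber("varchar", "decimal")); // true
>     System.out.println(compatibleStringToNumber("varchar", "int"));     // false: lossy
>   }
> }
> {code}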
> 2) Also, the checkColTypeChangeCompatible method lost a version 2 series bug 
> fix that drops CHAR/VARCHAR (and, I think, DECIMAL) type decorations when 
> checking for Schema Evolution compatibility.  So, when that code checks 
> whether the data type "varchar(25)" is StringFamily, it fails because the 
> "(25)" was not removed properly.
> This JIRA fixes issue #2 also.
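> The kind of decoration stripping the lost fix performed can be sketched like 
> this (an illustrative helper under my own naming, not the real 
> undecoratedTypeName implementation):
> {code}
> public class UndecorateSketch {
>   // Drop a trailing decoration such as "(25)" or "(31,0)" so that
>   // "varchar(25)" compares as the base type "varchar".
>   static String undecorated(String typeName) {
>     int paren = typeName.indexOf('(');
>     return paren < 0 ? typeName : typeName.substring(0, paren).trim();
>   }
>
>   public static void main(String[] args) {
>     System.out.println(undecorated("varchar(25)"));   // varchar
>     System.out.println(undecorated("decimal(31,0)")); // decimal
>   }
> }
> {code}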
> NOTE: Hive version 1 did undecoratedTypeName(oldType), while Hive version 2 
> performed the logic in TypeInfoUtils.implicitConvertible on the 
> PrimitiveCategory, not the raw type string.
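> For comparison, checking on the PrimitiveCategory sidesteps the decoration 
> problem entirely, since the parsed type carries no "(25)" to trip over. A 
> sketch, assuming the Hive 2 serde2 APIs on the classpath:
> {code}
> import org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector.PrimitiveCategory;
> import org.apache.hadoop.hive.serde2.typeinfo.PrimitiveTypeInfo;
> import org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils;
>
> public class PrimitiveCategorySketch {
>   public static void main(String[] args) {
>     // Parsing the decorated strings yields PrimitiveTypeInfo objects whose
>     // categories (VARCHAR, DECIMAL) carry no length/precision decorations.
>     PrimitiveTypeInfo oldType =
>         (PrimitiveTypeInfo) TypeInfoUtils.getTypeInfoFromTypeString("varchar(25)");
>     PrimitiveTypeInfo newType =
>         (PrimitiveTypeInfo) TypeInfoUtils.getTypeInfoFromTypeString("decimal(31,0)");
>     PrimitiveCategory from = oldType.getPrimitiveCategory(); // VARCHAR
>     PrimitiveCategory to = newType.getPrimitiveCategory();   // DECIMAL
>     // true in Hive version 2: StringFamily -> DECIMAL is implicitly convertible.
>     System.out.println(TypeInfoUtils.implicitConvertible(from, to));
>   }
> }
> {code}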





[jira] [Updated] (HIVE-20524) Schema Evolution checking is broken in going from Hive version 2 to version 3 for ALTER TABLE VARCHAR to DECIMAL

2018-09-15 Matt McCline (JIRA)



Matt McCline updated HIVE-20524:

Fix Version/s: 4.0.0



[jira] [Updated] (HIVE-20524) Schema Evolution checking is broken in going from Hive version 2 to version 3 for ALTER TABLE VARCHAR to DECIMAL

2018-09-14 Matt McCline (JIRA)



Matt McCline updated HIVE-20524:

Status: In Progress  (was: Patch Available)



[jira] [Updated] (HIVE-20524) Schema Evolution checking is broken in going from Hive version 2 to version 3 for ALTER TABLE VARCHAR to DECIMAL

2018-09-14 Matt McCline (JIRA)



Matt McCline updated HIVE-20524:

Status: Patch Available  (was: In Progress)

Again.



[jira] [Updated] (HIVE-20524) Schema Evolution checking is broken in going from Hive version 2 to version 3 for ALTER TABLE VARCHAR to DECIMAL

2018-09-14 Matt McCline (JIRA)



Matt McCline updated HIVE-20524:

Attachment: HIVE-20524.04.patch



[jira] [Updated] (HIVE-20524) Schema Evolution checking is broken in going from Hive version 2 to version 3 for ALTER TABLE VARCHAR to DECIMAL

2018-09-13 Matt McCline (JIRA)



Matt McCline updated HIVE-20524:

Status: Patch Available  (was: In Progress)



[jira] [Updated] (HIVE-20524) Schema Evolution checking is broken in going from Hive version 2 to version 3 for ALTER TABLE VARCHAR to DECIMAL

2018-09-13 Matt McCline (JIRA)



Matt McCline updated HIVE-20524:

Attachment: HIVE-20524.03.patch



[jira] [Updated] (HIVE-20524) Schema Evolution checking is broken in going from Hive version 2 to version 3 for ALTER TABLE VARCHAR to DECIMAL

2018-09-13 Matt McCline (JIRA)



Matt McCline updated HIVE-20524:

Status: In Progress  (was: Patch Available)



[jira] [Updated] (HIVE-20524) Schema Evolution checking is broken in going from Hive version 2 to version 3 for ALTER TABLE VARCHAR to DECIMAL

2018-09-12 Matt McCline (JIRA)



Matt McCline updated HIVE-20524:

Status: In Progress  (was: Patch Available)



[jira] [Updated] (HIVE-20524) Schema Evolution checking is broken in going from Hive version 2 to version 3 for ALTER TABLE VARCHAR to DECIMAL

2018-09-12 Matt McCline (JIRA)



Matt McCline updated HIVE-20524:

Status: Patch Available  (was: In Progress)



[jira] [Updated] (HIVE-20524) Schema Evolution checking is broken in going from Hive version 2 to version 3 for ALTER TABLE VARCHAR to DECIMAL

2018-09-12 Matt McCline (JIRA)



Matt McCline updated HIVE-20524:

Attachment: HIVE-20524.02.patch



[jira] [Updated] (HIVE-20524) Schema Evolution checking is broken in going from Hive version 2 to version 3 for ALTER TABLE VARCHAR to DECIMAL

2018-09-09 Matt McCline (JIRA)



Matt McCline updated HIVE-20524:

Description: 
Issue that started this JIRA:

{code}
create external table varchar_decimal (c1 varchar(25));
alter table varchar_decimal change c1 c1 decimal(31,0);
ERROR : FAILED: Execution Error, return code 1 from 
org.apache.hadoop.hive.ql.exec.DDLTask. Unable to alter table. The following 
columns have types incompatible with the existing columns in their respective 
positions :
c1
{code}

There appear to be 2 issues here:

1) When hive.metastore.disallow.incompatible.col.type.changes is true (the 
default) we only allow StringFamily (STRING, CHAR, VARCHAR) conversion to a 
number that can hold the largest values.  The theory is that we don't want the 
data loss you would get by converting the StringFamily field into integers, etc.  
In Hive version 2 the hierarchy of numbers had DECIMAL at the top.  At some point 
during Hive version 2 we realized this was incorrect and put DOUBLE at the top.

However, the Hive version 2 TypeInfoUtils.implicitConvertible method allows 
StringFamily conversion to either DOUBLE or DECIMAL.

The new org.apache.hadoop.hive.metastore.ColumnType class, under the Hive 
version 3 hive-standalone-metadata-server module, has a 
checkColTypeChangeCompatible method that only allows DOUBLE.

This JIRA fixes that problem.

2) Also, the checkColTypeChangeCompatible method lost a version 2 series bug 
fix that drops CHAR/VARCHAR (and, I think, DECIMAL) type decorations when 
checking for Schema Evolution compatibility.  So, when that code checks 
whether the data type "varchar(25)" is StringFamily, it fails because the 
"(25)" was not removed properly.

This JIRA fixes issue #2 also.


NOTE: Hive version 1 did undecoratedTypeName(oldType), while Hive version 2 
performed the logic in TypeInfoUtils.implicitConvertible on the 
PrimitiveCategory, not the raw type string.

  was:
Issue that started this JIRA:

{code}
create external table varchar_decimal (c1 varchar(25));
alter table varchar_decimal change c1 c1 decimal(31,0);
ERROR : FAILED: Execution Error, return code 1 from 
org.apache.hadoop.hive.ql.exec.DDLTask. Unable to alter table. The following 
columns have types incompatible with the existing columns in their respective 
positions :
c1
{code}

There appear to be 2 issues here:

1) When hive.metastore.disallow.incompatible.col.type.changes is true (the 
default) we only allow StringFamily (STRING, CHAR, VARCHAR) conversion to a 
number that can hold the largest numbers.  The theory being we don't want data 
loss you would get by converting the StringFamily field into integers, etc.  In 
Hive version 2 the hierarchy of numbers had DECIMAL at the top.  At some point 
during Hive version 2 we realized this was incorrect and put DOUBLE the top.

However, the Hive2 TypeInfoUtils.implicitConversion method allows StringFamily 
to either DOUBLE or DECIMAL conversion.

2) The new org.apache.hadoop.hive.metastore.ColumnType class under Hive version 
3 hive-standalone-metadata-server method checkColTypeChangeCompatible lost a 
version 2 series bug fix that drops CHAR/VARCHAR (and DECIMAL I think) type 
decorations when checking for Schema Evolution compatibility.  So, when that 
code is checking if a data type "varchar(25)" is StringFamily it fails because 
the "(25)" didn't get removed properly.


NOTE: Hive1 version 2 did undecoratedTypeName(oldType) and Hive2 version 
performed the logic in TypeInfoUtils.implicitConvertible on the 
PrimitiveCategory not the raw type string.



[jira] [Updated] (HIVE-20524) Schema Evolution checking is broken in going from Hive version 2 to version 3 for ALTER TABLE VARCHAR to DECIMAL

2018-09-09 Matt McCline (JIRA)



Matt McCline updated HIVE-20524:

Summary: Schema Evolution checking is broken in going from Hive version 2 
to version 3 for ALTER TABLE VARCHAR to DECIMAL  (was: Schema Evolution 
checking is broken in going from Hive ver 2 to ver 3 for ALTER TABLE VARCHAR to 
DECIMAL)




--
This message was sent by Atlassian JIRA
(v7.6.3#76005)