[ https://issues.apache.org/jira/browse/HIVE-12475?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15035327#comment-15035327 ]
Hive QA commented on HIVE-12475:
--------------------------------
Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12775067/HIVE-12475.1.patch
{color:green}SUCCESS:{color} +1 due to 1 test(s) being added or modified.
{color:red}ERROR:{color} -1 due to 8 failed/errored test(s), 9869 tests executed
*Failed tests:*
{noformat}
TestHWISessionManager - did not produce a TEST-*.xml file
org.apache.hadoop.hive.cli.TestMiniLlapCliDriver.testCliDriver_dynamic_partition_pruning
org.apache.hadoop.hive.cli.TestMiniLlapCliDriver.testCliDriver_vectorized_dynamic_partition_pruning
org.apache.hadoop.hive.cli.TestMiniTezCliDriver.testCliDriver_mergejoin
org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_authorization_uri_import
org.apache.hadoop.hive.metastore.TestHiveMetaStorePartitionSpecs.testFetchingPartitionsWithDifferentSchemas
org.apache.hadoop.hive.metastore.TestHiveMetaStorePartitionSpecs.testGetPartitionSpecs_WithAndWithoutPartitionGrouping
org.apache.hive.jdbc.TestSSL.testSSLVersion
{noformat}
Test results:
http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/6192/testReport
Console output:
http://ec2-174-129-184-35.compute-1.amazonaws.com/jenkins/job/PreCommit-HIVE-TRUNK-Build/6192/console
Test logs:
http://ec2-174-129-184-35.compute-1.amazonaws.com/logs/PreCommit-HIVE-TRUNK-Build-6192/
Messages:
{noformat}
Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 8 tests failed
{noformat}
This message is automatically generated.
ATTACHMENT ID: 12775067 - PreCommit-HIVE-TRUNK-Build
> Parquet schema evolution within array<struct<>> doesn't work
> ------------------------------------------------------------
>
> Key: HIVE-12475
> URL: https://issues.apache.org/jira/browse/HIVE-12475
> Project: Hive
> Issue Type: Bug
> Components: File Formats
> Affects Versions: 1.1.0
> Reporter: Mohammad Kamrul Islam
> Assignee: Mohammad Kamrul Islam
> Attachments: HIVE-12475.1.patch
>
>
> If we create a table with type array<struct<>> and later add a field to
> the struct, we get the following exception.
> The following SQL statements reproduce the error:
> {quote}
> CREATE TABLE pq_test (f1 array<struct<c1:int,c2:int>>) STORED AS PARQUET;
> INSERT INTO TABLE pq_test SELECT array(named_struct("c1",1,"c2",2)) FROM tmp LIMIT 2;
> SELECT * FROM pq_test;
> ALTER TABLE pq_test REPLACE COLUMNS (f1 array<struct<c1:int,c2:int,cccccc:int>>); -- note the added column cccccc
> SELECT * FROM pq_test;
> {quote}
> Exception:
> {quote}
> Caused by: java.lang.ArrayIndexOutOfBoundsException: 2
>   at org.apache.hadoop.hive.ql.io.parquet.serde.ArrayWritableObjectInspector.getStructFieldData(ArrayWritableObjectInspector.java:142)
>   at org.apache.hadoop.hive.serde2.SerDeUtils.buildJSONString(SerDeUtils.java:363)
>   at org.apache.hadoop.hive.serde2.SerDeUtils.buildJSONString(SerDeUtils.java:316)
>   at org.apache.hadoop.hive.serde2.SerDeUtils.getJSONString(SerDeUtils.java:199)
>   at org.apache.hadoop.hive.serde2.DelimitedJSONSerDe.serializeField(DelimitedJSONSerDe.java:61)
>   at org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.doSerialize(LazySimpleSerDe.java:236)
>   at org.apache.hadoop.hive.serde2.AbstractEncodingAwareSerDe.serialize(AbstractEncodingAwareSerDe.java:55)
>   at org.apache.hadoop.hive.ql.exec.DefaultFetchFormatter.convert(DefaultFetchFormatter.java:71)
>   at org.apache.hadoop.hive.ql.exec.DefaultFetchFormatter.convert(DefaultFetchFormatter.java:40)
>   at org.apache.hadoop.hive.ql.exec.ListSinkOperator.process(ListSinkOperator.java:89)
> {quote}
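The stack trace points at ArrayWritableObjectInspector.getStructFieldData indexing the row's ArrayWritable by the field position taken from the evolved schema, while rows written before the ALTER only carry the original two struct values, so field index 2 falls past the end of the array. Below is a minimal sketch of the kind of bounds check that would avoid the ArrayIndexOutOfBoundsException for such rows; the class name, method shape, and field-index handling are assumptions for illustration, not the contents of HIVE-12475.1.patch.
{noformat}
import org.apache.hadoop.io.ArrayWritable;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Writable;

// Illustrative sketch only: demonstrates the defensive check described above,
// not the actual ArrayWritableObjectInspector change shipped in the patch.
public class StructFieldAccessSketch {

  // fieldIndex comes from the evolved table schema (struct<c1,c2,cccccc> -> 0..2);
  // the ArrayWritable holds only the values present when the row was written.
  static Writable getStructFieldData(ArrayWritable row, int fieldIndex) {
    if (row == null) {
      return null;
    }
    Writable[] values = row.get();
    // Rows written with the old struct<c1:int,c2:int> schema store only two values,
    // so treat the missing trailing field as NULL instead of indexing past the end.
    if (fieldIndex < 0 || fieldIndex >= values.length) {
      return null;
    }
    return values[fieldIndex];
  }

  public static void main(String[] args) {
    // A row written before ALTER TABLE ... REPLACE COLUMNS: only c1 and c2 are stored.
    ArrayWritable oldRow = new ArrayWritable(Writable.class,
        new Writable[] { new IntWritable(1), new IntWritable(2) });
    // Field index 2 corresponds to the newly added column "cccccc".
    System.out.println(getStructFieldData(oldRow, 2)); // prints "null", no AIOOBE
  }
}
{noformat}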